GPU Pro 360
Guide to Mobile Devices

Edited by Wolfgang Engel


CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2019 by Taylor & Francis Group, LLC


CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Printed on acid-free paper

International Standard Book Number-13: 978-0-8153-5281-5 (Paperback)


International Standard Book Number-13: 978-0-8153-5283-9 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have
been made to publish reliable data and information, but the author and publisher cannot assume responsibility
for the validity of all materials or the consequences of their use. The authors and publishers have attempted to
trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if
permission to publish in this form has not been obtained. If any copyright material has not been acknowledged
please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted,
or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented,
including photocopying, microfilming, and recording, or in any information storage or retrieval system, without
written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com
(http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive,
Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration
for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate
system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used
only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data

Names: Engel, Wolfgang F., editor.


Title: GPU pro 360 guide to mobile devices / edited by Wolfgang Engel.
Description: Boca Raton : Taylor & Francis, CRC Press, [2018]
Identifiers: LCCN 2017060054| ISBN 9780815352815 (pbk. : alk. paper)
| ISBN 9780815352839 (hardback : alk. paper)
Subjects: LCSH: Computer graphics. | Graphics processing units--Programming.
| Pocket computers--Programming. | Mobile computing. | Rendering (Computer graphics)
Classification: LCC T385 .G68885 2018 | DDC 621.39/96--dc23
LC record available at https://lccn.loc.gov/2017060054

Visit the eResources: www.crcpress.com/9780815352815

Visit the Taylor & Francis Web site at


http://www.taylorandfrancis.com

and the CRC Press Web site at


http://www.crcpress.com
Contents

Introduction xi

Web Materials xv

1 Touchscreen-Based User Interaction 1


Andrea Bizzotto
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Motion Estimation . . . . . . . . . . . . . . . . . . . . . . . . 1
1.3 Position Prediction . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Application: Controlling a Camera in a Spherical Coordinate
System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.5 Algorithm Overview . . . . . . . . . . . . . . . . . . . . . . . . 7
1.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.7 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . 9

2 Optimizing a 3D UI Engine for Mobile Devices 11


Hyunwoo Ki
2.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2 Optimization Methods . . . . . . . . . . . . . . . . . . . . . . 13
2.3 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

3 A Shader-Based eBook Renderer 27


Andrea Bizzotto
3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.2 Page-Peeling Effect . . . . . . . . . . . . . . . . . . . . . . . . 28
3.3 Enabling Two Pages Side-by-Side . . . . . . . . . . . . . . . . 33
3.4 Improving the Look and Antialiasing Edges . . . . . . . . . . 35
3.5 Direction-Aligned Triangle Strip . . . . . . . . . . . . . . . . . 36
3.6 Performance Optimizations and Power Consumption . . . . . 37
3.7 Putting it Together . . . . . . . . . . . . . . . . . . . . . . . . 37
3.8 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38


3.9 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
3.10 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . 39
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39

4 Post-Processing Effects on Mobile Devices 41


Marco Weber and Peter Quayle
4.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
4.2 Technical Details . . . . . . . . . . . . . . . . . . . . . . . . . 44
4.3 Case Study: Bloom . . . . . . . . . . . . . . . . . . . . . . . . 46
4.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 48
4.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55

5 Shader-Based Water Effects 57


Joe Davis and Ken Catterall
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
5.2 Techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
5.3 Optimizations . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.4 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75

6 Realistic Real-Time Skin Rendering on Mobile 77


Renaldas Zioma and Ole Ciliox
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
6.2 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
6.3 Power of Mobile GPU . . . . . . . . . . . . . . . . . . . . . . . 79
6.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 80
6.5 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
6.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

7 Deferred Rendering Techniques on Mobile Devices 87


Ashley Vaughan Smith
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
7.2 Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
7.3 Overview of Techniques . . . . . . . . . . . . . . . . . . . . . . 88
7.4 OpenGL ES Extensions . . . . . . . . . . . . . . . . . . . . . . 94
7.5 Conclusion and Future Work . . . . . . . . . . . . . . . . . . . 96
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96

8 Bandwidth Efficient Graphics with the ARM Mali GPUs 99


Marius Bjørge
8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
8.2 Shader Framebuffer Fetch Extensions . . . . . . . . . . . . . . 99
8.3 Shader Pixel Local Storage . . . . . . . . . . . . . . . . . . . . 103
8.4 Deferred Shading Example . . . . . . . . . . . . . . . . . . . . 107
8.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112

9 Efficient Morph Target Animation Using OpenGL ES 3.0 113


James L. Jones
9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
9.2 Previous Work . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
9.3 Morph Targets . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
9.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 115
9.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
9.6 Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . 119
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119

10 Tiled Deferred Blending 121


Ramses Ladlani
10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
10.2 Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
10.3 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 124
10.4 Optimizations . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
10.5 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
10.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134

11 Adaptive Scalable Texture Compression 137


Stacy Smith
11.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
11.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
11.3 Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
11.4 Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . 140
11.5 Using ASTC Textures . . . . . . . . . . . . . . . . . . . . . . . 141
11.6 Quality Settings . . . . . . . . . . . . . . . . . . . . . . . . . . 142
11.7 Other Color Formats . . . . . . . . . . . . . . . . . . . . . . . 147
11.8 3D Textures . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
11.9 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150

12 Optimizing OpenCL Kernels for the ARM Mali-T600 GPUs 151


Johan Gronqvist and Anton Lokhmotov
12.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
12.2 Overview of the OpenCL Programming Model . . . . . . . . . 152
12.3 ARM Mali-T600 GPU Series . . . . . . . . . . . . . . . . . . . 152
12.4 Optimizing the Sobel Image Filter . . . . . . . . . . . . . . . . 155
12.5 Optimizing the General Matrix Multiplication . . . . . . . . . 163
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181

13 Hybrid Ray Tracing on a PowerVR GPU 183


Gareth Morgan
13.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
13.2 Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
13.3 Combining Ray Tracing with Rasterization . . . . . . . . . . . 187
13.4 Hard Shadows . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
13.5 Soft Shadows . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
13.6 Reflections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
13.7 Transparency . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
13.8 Performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
13.9 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
13.10 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200

14 Implementing a GPU-Only Particle-Collision System with ASTC
3D Textures and OpenGL ES 3.0 201
Daniele Di Donato
14.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
14.2 GPU-Only Particle System . . . . . . . . . . . . . . . . . . . . 202
14.3 Physics Simulation . . . . . . . . . . . . . . . . . . . . . . . . 204
14.4 Rendering the Particles . . . . . . . . . . . . . . . . . . . . . . 211
14.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217

15 Animated Characters with Shell Fur for Mobile Devices 219


Andrew Girdler and James L. Jones
15.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
15.2 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
15.3 Creating a Shell Fur Texture . . . . . . . . . . . . . . . . . . . 220
15.4 Bone Batches or Single Pass? . . . . . . . . . . . . . . . . . . 221
15.5 Model Data and Setup . . . . . . . . . . . . . . . . . . . . . . 221
15.6 Animation with TF . . . . . . . . . . . . . . . . . . . . . . . . 223
15.7 Instancing for Fur Shells . . . . . . . . . . . . . . . . . . . . . 223

15.8 Lighting and Other Effects . . . . . . . . . . . . . . . . . . . . 225


15.9 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227

16 High Dynamic Range Computational Photography on Mobile GPUs 229


Simon McIntosh-Smith, Amir Chohan, Dan Curran, and Anton Lokhmotov
16.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
16.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
16.3 Tone-Mapping Operators . . . . . . . . . . . . . . . . . . . . . 233
16.4 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
16.5 GPGPU Using OpenCL . . . . . . . . . . . . . . . . . . . . . 239
16.6 OpenGL ES and Android . . . . . . . . . . . . . . . . . . . . . 243
16.7 Implementing an HDR Pipeline Using OpenCL and OpenGL ES 244
16.8 Android Implementation . . . . . . . . . . . . . . . . . . . . . 253
16.9 Performance of Our HDR Effects . . . . . . . . . . . . . . . . 258
16.10 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262

17 Efficient Soft Shadows Based on Static Local Cubemap 265


Sylwester Bala and Roberto Lopez Mendez
17.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
17.2 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
17.3 Algorithm Overview . . . . . . . . . . . . . . . . . . . . . . . . 266
17.4 What Is a Local Cubemap? . . . . . . . . . . . . . . . . . . . 267
17.5 Creating a Shadow Cubemap . . . . . . . . . . . . . . . . . . 268
17.6 Applying Shadows . . . . . . . . . . . . . . . . . . . . . . . . . 269
17.7 Smoothness . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
17.8 Combining the Shadow Technique with Others . . . . . . . . . 273
17.9 Performance and Quality . . . . . . . . . . . . . . . . . . . . . 274
17.10 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
17.11 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 276

18 Physically Based Deferred Shading on Mobile 277


Ashley Vaughan Smith and Mathieu Einig
18.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
18.2 Physically Based Shading . . . . . . . . . . . . . . . . . . . . . 277
18.3 An Efficient Physically Based Deferred Renderer . . . . . . . . 280
18.4 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . 285
18.5 Conclusion and Future Work . . . . . . . . . . . . . . . . . . . 285
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288

Contributors 289
Introduction
With all devices now touch-driven and offering few physical keys, the effective
usage of touch screens is critical. The chapter, “Touchscreen-Based User Inter-
action,” by Andrea Bizzotto, provides basic insight into the complexities and
solutions required to achieve the best possible user-interaction and experience
through a touch-based interface.
“Optimizing a 3D UI Engine for Mobile Devices,” by Hyunwoo Ki, offers
early insights into optimizing a user interface for mobile devices, looking at the
importance of font rendering engines and “dirty region”-based optimizations to
avoid rendering more than is actually required.
The chapter, “A Shader-Based eBook Renderer,” by Andrea Bizzotto illus-
trates a vertex-shader-based implementation of the page-peeling effect of a basic
eBook renderer. It covers high-quality procedural antialiasing of the page edges,
as well as some tricks that achieve a polished look. Two pages can be combined
side-by-side to simulate a real book, and additional techniques are introduced
to illustrate how to satisfy additional constraints and meet power-consumption
requirements.
The next chapter, “Post-Processing Effects on Mobile Devices,” by Marco We-
ber and Peter Quayle describes a general approach to implement post-processing
on handheld devices by showing how to implement a bloom effect with efficient
convolution.
Joe Davis and Ken Catterall show in “Shader-Based Water Effects” how to
render high-quality water effects at a low computational cost. Although there are
many examples of water effects using shaders that are readily available, they are
designed mainly for high-performance graphics chips on desktop platforms. This
chapter shows how to tailor a technique discussed by Kurt Pelzer (in ShaderX2,
“Advanced Water Effects,” 2004) to mobile platforms.
In “Realistic Real-Time Skin Rendering on Mobile,” Renaldas Zioma and Ole
Ciliox share how they implemented skin rendering using physically based shading
models for the Unity “The Chase” demo. Their method makes use of approxi-
mations and lookup textures in order to balance the arithmetic and texture load.
In “Deferred Rendering Techniques on Mobile Devices,” Ashley Vaughan
Smith explores various techniques for doing deferred rendering on mobile devices.
This chapter steps through deferred shading, light pre-pass rendering, and light
indexed rendering, and also details extensions that allow efficient read-access to
individual render targets to further improve performance.
In “Bandwidth Efficient Graphics with the ARM Mali GPUs,” Marius Bjørge
presents new ARM Mali GPU extensions that allow applications to efficiently
read and write data to the on-chip tile buffer. Applications can read the current
color, depth, and stencil values as well as treat the tile buffer as a local storage
with full read and write access. The chapter also contains example use-cases such
as soft particles and deferred shading.
In “Efficient Morph Target Animation Using OpenGL ES 3.0,” James L. Jones
shows how OpenGL ES 3.0 can be used to do morph target animation efficiently
on mobile devices. Transform feedback is used to blend a set of poses, and the
chapter also describes how to batch blend multiple poses to reduce the number
of passes.
In “Tiled Deferred Blending,” Ramses Ladlani describes a method for doing
deferred blending. Blended primitives are first tiled to screen-space tiles, and then
each tile is rendered while blending the primitives in a single fragment shader pass.
The method has proven efficient on immediate mode renderers where blending
involves an expensive read-modify-write operation with the framebuffer.
In “Adaptive Scalable Texture Compression,” Stacy Smith presents ASTC, a
new texture compression format that has been accepted as a Khronos standard.
ASTC is positioned to replace virtually all existing compressed texture formats. It
supports bit rates ranging from 0.89 bpp up to 8 bpp for both LDR and HDR
textures. This chapter explains how it works, how to use it, and how to get the
most out of it.
In “Optimizing OpenCL Kernels for the ARM Mali-T600 GPUs,” Johan
Gronqvist and Anton Lokhmotov go into the details of writing efficient OpenCL
kernels for the ARM Mali-T600 GPUs. This chapter introduces the ARM Mali-
T600 GPU series and goes into a deep discussion of the performance character-
istics of various OpenCL kernels.
“Hybrid Ray Tracing on a PowerVR GPU” by Gareth Morgan describes how
an existing raster-based graphics engine can use ray tracing to add high-quality
effects like hard and soft shadows, reflection, and refraction while continuing to
use rasterization as the primary rendering method. The chapter also gives an
introduction to the OpenRL API.
“Implementing a GPU-Only Particle-Collision System with ASTC 3D Tex-
tures and OpenGL ES 3.0” by Daniele Di Donato shares how the author used
OpenGL ES 3.0 and ASTC 3D textures to do bandwidth-friendly collision detec-
tion of particles on the GPU. The 3D texture stores a voxel representation of the
scene, which is used to do direct collision tests as well as to look up the nearest
surface.
ARM is a registered trademark of ARM Limited (or its subsidiaries) in the EU and/or
elsewhere. Mali is a trademark of ARM Limited (or its subsidiaries) in the EU and/or elsewhere.
All rights reserved.

“Animated Characters with Shell Fur for Mobile Devices” by Andrew Girdler
and James L. Jones presents how the authors were able to optimize a high-quality
animation system to run efficiently on mobile devices. With OpenGL ES 3.0, they
made use of transform feedback and instancing in order to reach the performance
target.
“High Dynamic Range Computational Photography on Mobile GPUs” by Si-
mon McIntosh-Smith, Amir Chohan, Dan Curran, and Anton Lokhmotov ex-
plores HDR computational photography on mobile GPUs using OpenCL and
shares some very interesting results.
In “Efficient Soft Shadows Based on Static Local Cubemap,” Sylwester Bala
and Roberto Lopez Mendez introduce a novel soft shadow technique that makes
use of local cubemaps. The technique produces smooth, natural-looking shadows
at minimal performance cost.
In “Physically Based Deferred Shading on Mobile,” Ashley Vaughan Smith
and Mathieu Einig describe how to implement physically based deferred shading
on a power-constrained mobile device using extensions such as pixel local storage
and framebuffer fetch. The chapter also explains how these extensions can be
used to implement deferred decal rendering very easily on mobile GPUs.
Lastly, I would like to thank all the contributors in this book for their great
work and excellent chapters.
Web Materials
Example programs and source code to accompany some of the chapters are avail-
able on the CRC Press website: go to https://www.crcpress.com/9780815352815
and click on the “Downloads” tab.
The directory structure follows the book structure by using the chapter num-
bers as the name of the subdirectory.

General System Requirements


The material presented in this book was originally published between 2010 and
2016, and the most recent developments have the following system requirements:
• The DirectX June 2010 SDK (the latest SDK is installed with Visual Studio
2012).
• DirectX 11 or DirectX 12 capable GPUs are required to run the examples.
The chapter will mention the exact requirement.
• The OS should be Microsoft Windows 10, following the requirement of
DirectX 11 or 12 capable GPUs.
• Visual Studio C++ 2012 (some examples might require older versions).
• 2GB RAM or more.

• The latest GPU driver.

1

Touchscreen-Based
User Interaction
Andrea Bizzotto

1.1 Introduction
The touchscreen plays a major role in user interaction on mobile devices. Al-
though some of these systems come with high-level APIs that could be used by
the programmer to detect gestures, others provide just raw access to the samples
read from the hardware. This chapter illustrates a mathematical framework that
can be used to estimate the motion and position of the input pointer on screen
(see Figure 1.1). An application for controlling the position of a camera in a
three-dimensional spherical coordinate system is presented as a usage example.

1.2 Motion Estimation


Let us approach the problem of estimating the motion on screen, given a set of
samples described by three values:

• The time when the sample was read.

• The position of the sample along the x axis.

• The position of the sample along the y axis.

The context of this problem is a main-loop-based graphics application. In
this respect, the platform-specific touchscreen implementation might provide one
sample per frame, or multiple samples (as is the case if the touchscreen subsystem
is independent from the main loop).


Figure 1.1. Touchscreen input: while the user drags the pointer on screen, the new
samples become available and at the beginning of each frame they can be used to
estimate the motion and position.

A very simple approach would be to calculate the motion as the velocity
between two consecutive samples as in Equation (1.1), where ti is the time of
sample i, s(ti) the corresponding position, and v(ti) the velocity:

\[ v(t_n) = \frac{s(t_n) - s(t_{n-1})}{t_n - t_{n-1}}. \tag{1.1} \]
Given the small screen size of mobile devices and their relatively high reso-
lution, reading consecutive samples can result in different x- and y-coordinates,
even if the pointer is not moving on screen. This effect was observed on OMAP
Zoom platforms, where a set of samples with slightly different positions was re-
turned even when the input position was fixed.
We can take into account this behavior in our model by assuming that the
samples are affected by Gaussian noise, whose variance is platform-dependent. If
on a particular device this side effect does not exist, our model is still valid as it
means that the variance of the noise is zero. Since we want our motion estimation
algorithm to be virtually unaffected by the presence of noise, we introduce a
method for noise reduction followed by a thresholding operation.
A very common technique for noise reduction is to apply an average filter on
the set of available samples, thereby reducing the noise variance by a factor of N
(where N is the size of the set).
If the user drags the pointer on screen and we want to estimate the motion
of such interaction, we can use a more effective approach that reduces the noise
and makes use of all the samples in a given time interval:

1. Choose a “time window” that contains all the most recent samples. To do
so, we store all the samples that we want to process in a circular queue.
At each frame we update our queue by adding all the samples that were
read since the previous frame, and discarding the ones that are too old. If
no samples are read for some time, the queue will quickly empty and we
can set the motion to 0.
2. Calculate all the velocities between consecutive samples in the queue.
3. Obtain the average on those velocities.
This method is much more reliable than the previous one and introduces a
trade-off between accuracy and responsiveness of the filter:
Bigger window. More samples are stored: the accuracy increases but the re-
sponsiveness of the filter decreases. Since we are using a causal filter, if ∆T
is the length of the window, we are introducing a delay equal to ∆T /2.
Smaller window. Fewer samples are stored: the accuracy decreases but the
responsiveness increases.

void update(Sample* pSamples, int num, float& mx, float& my) {
    // Add the samples read in the last frame.
    queue.add(pSamples, num);
    // Delete samples older than the time window.
    queue.remove(getTime() - timeWindow);
    if (queue.size() < 2) {
        mx = my = 0.0f;
        return;
    }
    int i = 0;
    Iterator iter = queue.begin();
    Sample curr, prev = iter.data();
    while (iter.next()) {
        curr = iter.data();
        // 'motion' is an array that stores the motions of consecutive samples.
        motion[i].dx = (curr.x - prev.x) / (curr.t - prev.t);
        motion[i].dy = (curr.y - prev.y) / (curr.t - prev.t);
        prev = curr;
        i++;
    }
    CalculateAverage(motion, i, mx, my);
    // Apply thresholds to the magnitude of the estimated motion.
    if (fabsf(mx) < thresholdX) mx = 0.0f;
    if (fabsf(my) < thresholdY) my = 0.0f;
}

Listing 1.1. Motion estimation routine.



The method can be made even more robust: we can define a platform-dependent
threshold that is proportional to the variance of the noise, and if the magnitude
of the estimated motion is smaller than this threshold, we set it to zero. This
guarantees that no motion is detected if the input position does not change (taking
noise into account).
The complete method for updating the motion is summarized in Listing 1.1.
Such an approach has been successfully tested on several platforms. For the
purpose of motion estimation, choosing a time interval of 0.2 seconds gives good
results on an OMAP Zoom platform, while on the Samsung Omnia the read
input coordinates are constant if the pointer doesn’t move on screen, and the
time interval and threshold can be smaller as no noise needs to be removed. The
same applies on Windows emulation, where the mouse is used as a pointing device
to simulate the touchscreen.

1.3 Position Prediction


Position estimation seems to be an easy task as the position of the last sample
can be used directly. Some considerations can be made nevertheless:

• The instant when the input is processed (beginning of the new frame) is
not necessarily the same as the time of the last read sample.

• The graphics pipeline takes some time to render the frame and swap the
buffers, so if the input position is used to draw a target on screen (for
example, a cursor), it may be worth adjusting this position according to
the time when the frame will be actually rendered.

These two contributions are put together in Equation (1.2), where Δread is the
difference between the current time and the time of the last sample, and Δframe
is the time to render the current frame, assumed equal to the time taken by the
previous frame (we can assume that the frame rate is locally constant):

\[ \Delta_{\text{total}} = \Delta_{\text{read}} + \Delta_{\text{frame}}. \tag{1.2} \]

The predicted position is then given by Equation (1.3), where v(tlast) is the motion
estimated as described above and tlast is the time of the last read sample:

\[ s(t_{\text{last}} + \Delta_{\text{total}}) = s(t_{\text{last}}) + v(t_{\text{last}}) \cdot \Delta_{\text{total}}. \tag{1.3} \]



1.4 Application: Controlling a Camera in a Spherical Coordinate System

A possible application is to use our new motion primitives to move a camera
around an object in the center of the scene. The target of the camera is therefore
fixed to the origin of the world coordinate system. If required, this constraint
can be relaxed by setting the target on an arbitrary point in three-dimensional
space, as with a simple translation this case can be included in the former one.
The position of the camera is determined by a vector (x, y, z) of coordinates,
while only the motion in the x- and y-directions of the screen is available. As it is
not intuitive to map these directions to a point in a Cartesian coordinate system,
we can use an alternative representation for the position of the camera:

Yaw. Rotation around the y-axis.

Pitch. Rotation around the x-axis.

Radius. Distance of the camera from the origin.

This allows us to use the x- and y-motion on the touchscreen to update the
yaw and pitch angles of the camera, while leaving the radius fixed or adjustable
through other input controls.
Figure 1.2 shows the concept. The camera position is initially set to $(0, 0, R)^T$.
At each frame the yaw and pitch angles are updated according to the motion on
the touchscreen, and the position is updated.

Figure 1.2. Three-dimensional spherical coordinate system: upon application of the


yaw and pitch angles the camera moves from the initial position (marked in blue) to
the final one (green). The camera defines a coordinate system given by its position
and the three orthonormal vectors (Vx, Vy, Vz).

To render our scene from the viewpoint of the camera we need to calculate
the roto-translation matrix that transforms a point from world space coordinates
to camera space coordinates. In OpenGL such a matrix is known as the model-view
matrix and can be computed once the position, target and up direction of the
camera are known; therefore, we want to find these values first.
The spherical coordinate system is useful to update the yaw and pitch angles,
but to find the model-view matrix the position needs to be expressed again in the
canonical coordinate system. This is done by applying the rotation matrices that
correspond to the two angles to the initial position, as in Equations (1.4), (1.5)
and (1.6):

\[
R_y(\alpha) = \begin{pmatrix} \cos\alpha & 0 & -\sin\alpha \\ 0 & 1 & 0 \\ \sin\alpha & 0 & \cos\alpha \end{pmatrix},
\qquad
R_x(\alpha) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{pmatrix}.
\tag{1.4}
\]

\[ M_{\text{rot}} = R_y(\text{yaw}) \cdot R_x(\text{pitch}). \tag{1.5} \]

\[ V_{\text{pos}} = M_{\text{rot}} \cdot \begin{pmatrix} 0 \\ 0 \\ R \end{pmatrix}. \tag{1.6} \]
The target of our camera is fixed at the origin, as Equation (1.7) states, and
the up direction (Vy ) is determined by Equation (1.8).

\[ V_{\text{target}} = (0, 0, 0)^T. \tag{1.7} \]

\[ V_y = M_{\text{rot}} \cdot \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}. \tag{1.8} \]
The model-view matrix is composed of a translation that shifts the coordinate
system to the position of the camera and a rotation that accounts for its relative
orientation. Such a rotation can be expressed as a matrix formed by three
orthonormal vectors that can be immediately derived from the position, target
and up direction of the camera. Equations (1.9), (1.10) and (1.11) illustrate the
concept:

\[ V_z = -\frac{V_{\text{target}} - V_{\text{pos}}}{\lVert V_{\text{target}} - V_{\text{pos}} \rVert}. \tag{1.9} \]

\[ V_x = V_y \times V_z. \tag{1.10} \]

\[
M_{\text{view}} =
\begin{pmatrix} V_x^T & 0 \\ V_y^T & 0 \\ V_z^T & 0 \\ 0^T & 1 \end{pmatrix}
\cdot
\begin{pmatrix} I_{3\times 3} & -V_{\text{pos}} \\ 0^T & 1 \end{pmatrix}.
\tag{1.11}
\]
Note that the vectors Vx, Vy, and Vz represent the axes of a right-handed
reference frame. For this reason the viewing direction of the camera is opposite
to the vector Vz.
This mathematical framework allows the setup of an interactive system where
the user can spin the camera around the observed object. The yaw and pitch
angles can be updated at each frame by adding the motion on the x- and y-axes,
but once the user releases the pointer, the camera movement immediately stops.
It would be more realistic if the camera had a residual inertial motion, and the
next section illustrates how to implement this.

1.4.1 Adding Inertia


The desired effect consists of letting the camera spin for a short amount of time,
until the residual motion is zero. Such an effect is best modeled by a decreasing
exponential function like the one in Equation (1.12):

\[ f(t) = \begin{cases} 0, & t < 0, \\ e^{-\tau t}, & t \geq 0. \end{cases} \tag{1.12} \]
If tr is the time when the input is released and Mx the corresponding motion
on the x-axis, the residual speed at any instant in time is given by x(t) = Mx ·
f (t − tr ).
This value can be integrated over time to update the yaw angle (the same
procedure is valid for the y-axis / pitch angle). The integration is a discrete
operation as it can be performed at each frame, and Equation (1.13) provides a
method to update the angle frame by frame:

\[ x(t_n) = \sum_{i=0}^{n} M_x \, f(t_i - t_r)\,(t_i - t_{i-1}) = x(t_{n-1}) + M_x \, f(t_n - t_r)\,(t_n - t_{n-1}). \tag{1.13} \]
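A minimal sketch of this decay function and of the per-frame update of Equation
(1.13) might look as follows; the value of tau and the function names are illustrative
choices, not taken from the chapter. Note that Listing 1.2 in the next section folds
the frame-duration factor into an overall scaling factor instead of passing it explicitly.

#include <math.h>

// Decaying inertia profile of Equation (1.12); tau controls how quickly
// the residual motion dies out (an illustrative value, not from the chapter).
float f(float t, float tau)
{
    return (t < 0.0f) ? 0.0f : expf(-tau * t);
}

// One step of the discrete integration of Equation (1.13) for the yaw angle.
// mx is the motion at release time, tr the release time, dt the frame duration.
float updateYaw(float yaw, float mx, float t, float tr, float dt)
{
    return yaw + mx * f(t - tr, 4.0f) * dt;
}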

1.5 Algorithm Overview


The general algorithm for updating the camera position within the main loop is
illustrated in Listing 1.2.
The motion is estimated using the discussed technique and used to update
the yaw and pitch angles. When the input is released the current time and the
motion are saved and later used to add the inertial contribution to the angles

t = getTime();
touchscreen.GetMotion(motionX, motionY);
if (touchscreen.Released()) {
    releaseMotionX = motionX;
    releaseMotionY = motionY;
    tr = t;
}
yaw   = yaw   + motionX + releaseMotionX * f(t - tr);
pitch = pitch + motionY + releaseMotionY * f(t - tr);
MView = CalculateMatrix(yaw, pitch, radius);

RenderGeometry(MView);

Listing 1.2. Render loop.

(appropriate scaling factors can be used when updating the angles in order to
calibrate the speed of the camera position as desired). Finally, the model-view
matrix can be calculated with the technique previously described and the geometry
can be rendered using that matrix.
Figure 1.3 illustrates how the presented techniques have been used in an
OpenGL ES 2.0 demo developed at Imagination Technologies.

Figure 1.3. Flowers demo. This demo shows procedural plant growth with detailed
flowers and makes use of the described touchscreen implementation to spin the camera
around the growing plant.

1.6 Conclusion
This chapter showed how to approach motion estimation by means of a robust
algorithm whose parameters can be adapted on different platforms to get optimal
results. The length of the time window and the threshold can be tweaked as
necessary according to the requirements of the final application.
A simple OpenGL ES 2.0 demo that allows the user to control the camera in the
spherical coordinate system previously described is available in the book’s web
materials on the CRC Press website, together with full source code for OMAP
Zoom1, Zoom2 and Windows emulation, where the mouse is used to simulate the
touchscreen interaction.

1.7 Acknowledgments
The author would like to thank Kristof Beets and the whole BizDev and DevTech
teams for their work, contribution, and feedback.
2

Optimizing a 3D UI Engine
for Mobile Devices
Hyunwoo Ki

A graphical user interface (GUI) is the preferred type of user interface (UI). It
provides more intuitive visual components with images or animations, as opposed
to a text-based interface. Such a GUI is essential for both graphics applications
(for example, games) and non-graphics applications (for example, a phonebook).
It should offer convenience as well as beauty. With the advancement of embedded
systems, UI design shows a trend toward presenting three-dimensional graphical
looks. Because the UI is a fundamental component for all applications, a UI en-
gine should provide high performance and low memory consumption. The major
mission of our UX development group at INNOACE Co., Ltd., is developing an
optimized three-dimensional UI engine and an authoring tool for various mobile
devices to maximize user experience. Our core engine supports two-dimensional
and three-dimensional widgets, scene management, resource management, TrueType
font rendering, three-dimensional transitions, visibility culling, occlusion culling,
widget caching, partial rendering, and more. Based on this core engine, we
migrated our UI engine to various platforms, for example Linux, Windows Mobile,
Android, and GPOS. We developed a hardware renderer based on OpenGL
ES 2.0 for high-end smartphones and PMPs, and a software renderer for common
mobile phones. This chapter presents practical optimization methods for a
three-dimensional UI engine based on OpenGL ES 2.0, drawing on our development
experience (see Figure 2.1). However, we expect some features of our core engine
would also be useful for OpenGL ES 1.1 or software renderers.

Figure 2.1. Examples using a three-dimensional UI engine based on OpenGL ES 2.0.

2.1 Overview
Optimization methods presented in this chapter include rendering pipeline op-
timization, TrueType font rendering, widget caches with partial update, and
resource management.

1. Rendering pipeline optimization. We describe fundamental development


guidelines of OpenGL ES 2.0 for a three-dimensional UI engine. We show
examples of our usage of render states, shaders, textures, draw calls, and so on.

2. TrueType font rendering. Text is a very important component of the


UI, and rendering text is more expensive than rendering other widgets.
In particular, processing Asian languages (e.g., Korean and Chinese) lowers
engine efficiency because the engine must handle many font glyphs. We present our
text renderer’s design and implementation issues to minimize draw calls
and texture changes with low memory consumption.

3. Widget caches with partial update. We render a composite widget to a


render texture, and consider this texture as a widget cache until a sub-
widget of the composite widget is changed. This widget cache accelerates
rendering a composite widget. Furthermore, according to interaction and
logic, we partially update this widget cache to reduce update costs.

4. Resource management. We developed an efficient resource management


system based on that of DirectX, one with resource garbage collectors. For
example, if a managed texture has not been used for a long time, we release
it from memory, and the texture is automatically restored from storage when
it is used again. We describe design and implementation issues of our resource
management system (a simplified sketch of the idea is shown below).
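As a simplified sketch of the idea in item 4, the code below shows a managed
texture entry with a timeout-based garbage-collection pass. The types, the fixed
idle-timeout policy, and the LoadTextureFromFile/ReleaseTexture helpers are
illustrative assumptions, not our engine's actual implementation.

// Illustrative sketch of timeout-based texture eviction with automatic
// restore from storage; not the engine's actual resource manager.
#include <string>
#include <unordered_map>

struct ManagedTexture {
    std::string sourcePath;   // where the texture can be reloaded from
    unsigned    glHandle = 0; // 0 means "not resident in memory"
    double      lastUsed = 0.0;
};

class ResourceManager {
public:
    // Called whenever a texture is drawn; restores it from storage if it was
    // evicted. Entries are assumed to have been registered with their
    // sourcePath beforehand.
    unsigned Acquire(const std::string& name, double now) {
        ManagedTexture& tex = textures[name];
        if (tex.glHandle == 0)
            tex.glHandle = LoadTextureFromFile(tex.sourcePath); // assumed helper
        tex.lastUsed = now;
        return tex.glHandle;
    }

    // Garbage-collection pass: release textures idle for longer than the timeout.
    void CollectGarbage(double now, double idleTimeout) {
        for (auto& entry : textures) {
            ManagedTexture& tex = entry.second;
            if (tex.glHandle != 0 && now - tex.lastUsed > idleTimeout) {
                ReleaseTexture(tex.glHandle);  // assumed helper
                tex.glHandle = 0;              // will be restored on next use
            }
        }
    }

private:
    std::unordered_map<std::string, ManagedTexture> textures;
};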