UnityAVProVideo PDF
Version 1.5.18
1. Introduction
   1. Features
   2. Trial Version
   3. Media Credits
2. System Requirements
   1. Platforms not supported
3. Installation
   1. Trial Version & Watermark Notes
   2. Installing Multiple Platform Packages
4. Usage Notes
   1. Platform Notes
   2. Video File Locations
   3. Streaming Notes
   4. Augmented / Virtual Reality Notes
   5. Hap Codec Notes
   6. Transparency Notes
   7. Hardware Decoding
   8. Subtitle Notes
5. Quick Start Examples
   1. Quick Start: Fastest Start for Unity Experts
   2. Quick Start: Fullscreen Video Player using Prefabs
   3. Quick Start: 3D Mesh Video Player Example using Components
6. Usage
   1. Getting Started
   2. Unsupported Platform Fallback
   3. Components
   4. Scripting
7. Asset Files
   1. Demos
   2. Prefabs
   3. Scripts
8. Scripting Reference
9. Supported Media Formats
   1. Android
   2. iOS, tvOS and OS X
   3. Windows
   4. Windows Phone / UWP
   5. WebGL
10. Support
11. About RenderHeads Ltd
Appendix A - FAQ
Appendix B - Version History
Appendix C - Roadmap
1. Introduction
AVPro Video is the newest video playback plugin from RenderHeads. We previously
developed the AVPro QuickTime and AVPro Windows Media plugins for Unity. In this next
generation of plugins we aim to create an easy to use, cross-platform video playback system
that uses the native features of each platform.
1.1 Features
● Versions for iOS, tvOS, OS X, Android, WebGL, Windows, Windows Phone and
UWP
● One API for video playback on all supported platforms
● 8K video support (on supported hardware)
● VR support (mono, stereo, equirectangular and cubemap)
● Transparency support (native and packed)
● Fast flexible video playback
● Unity Pro 4.6.8 and 5.x supported
● Free watermarked trial version available (download here)
● Fast native Direct3D, OpenGL and Metal texture updates
● Linear and Gamma colour spaces supported
● Graceful fallback in editor
● Components for IMGUI, uGUI and NGUI
● Easy to use, drag and drop components
● Streaming video from URL (when supported by platform)
1.2 Trial Version
We offer an unlimited trial version of AVPro Video for download from our website at
http://renderheads.com/product/avpro-video/. The trial version has no missing features or
time restrictions but it does apply a watermark to the rendered output. The watermarking
does have a small performance impact which is only really noticeable on very high resolution
videos.
1.3 Media Credits
SampleSphere.mp4 - (c) copyright Daniel Arnett, https://vimeo.com/97887646
2. System Requirements
2.1 Platforms not supported
● WebPlayer
● Linux desktop
● Tizen
● Samsung TV
● Game Consoles (XBox*, PS4 etc)
* XBox One may be supported using UWP build option. We have not tested this though.
3. Installation
1. Open up a fresh Unity session (to clear any locked plugin files)
2. Import the unitypackage file into your Unity project. If prompted to upgrade some
scripts click Yes.
3.1.1 Watermark
If you are using a trial version of the plugin then you will see a watermark displayed over the
video. The watermark is in the form of a “RenderHeads” logo that animates around the
screen, or a thick horizontal bar that moves around the screen. On Windows, if the GPU
decoding path is used then the watermark isn’t displayed - instead every few seconds the
video size will scale down.
The full version of AVPro Video has no watermarks for any platforms. If you use one of the
platform specific packages (eg AVPro Video for iOS, or AVPro Video for Windows) then you
will not see the watermark on the platform you purchased for, but you will see the watermark
on the other platforms. For example if you purchased AVPro Video for iOS then you will still
see the watermark in the Unity editor as this is running on Windows/OS X, but the videos
played back when you deploy to your iOS device will be watermark-free.
If you are upgrading from the trial version, make sure you delete the old /Assets/Plugins
folder as this contains the trial plugin and could conflict. You may need to close Unity first,
delete the files manually and then restart Unity and re-import the package (because Unity
locks native plugin files once they are loaded).
You can check which version you have installed by adding a MediaPlayer component to
your scene and clicking on the ‘about’ button in the Inspector for that component. The
version number is displayed in this box.
3.2 Installing Multiple Platform Packages
If you are not using the full all-in-one AVPro Video package and instead have opted to
purchase multiple individual platform packages then the installation must be done carefully,
especially when upgrading to a new version.
1. Open a fresh Unity instance (this is important as otherwise Unity may have locked
the plugin files which prevents them from being upgraded)
2. Import the iOS package
3. Import the Android package, but make sure that you have the iOS native plugin file
unticked (so that it is not overwritten)
For reference, these are the native plugin files for each platform:
● Android
○ Plugins/Android/AVProVideo.jar
○ Plugins/Android/libs/armeabi-v7a/libAVProLocal.so
○ Plugins/Android/libs/x86/libAVProLocal.so
● macOS
○ Plugins/AVProVideo.bundle
● iOS
○ Plugins/iOS/libAVProVideoiOS.a
● tvOS
○ Plugins/tvOS/libAVProVideotvOS.a
● WebGL
○ Plugins/WebGL/AVProVideo.jslib
● Windows Phone / UWP
○ Plugins/WSA/PhoneSDK81/ARM/AVProVideo.dll
○ Plugins/WSA/PhoneSDK81/x86/AVProVideo.dll
○ Plugins/WSA/SDK81/ARM/AVProVideo.dll
○ Plugins/WSA/SDK81/x86/AVProVideo.dll
○ Plugins/WSA/SDK81/x86_64/AVProVideo.dll
○ Plugins/WSA/UWP/ARM/AVProVideo.dll
○ Plugins/WSA/UWP/x86/AVProVideo.dll
○ Plugins/WSA/UWP/x86_64/AVProVideo.dll
● Windows
○ Plugins/x86/AVProVideo.dll
○ Plugins/x86_64/AVProVideo.dll
4. Usage Notes
4.1 Platform Notes
4.1.1 Android
4.1.2 iOS and tvOS
● iOS and tvOS support requires Unity 5.4 or above (due to a breaking change in the
Unity API). If you require Unity 5.3 or below support for iOS then please contact us.
Otherwise you will have to upgrade to 5.4 or above.
● Under the hood we’re using the AVFoundation API
● If you want to support streaming you need to enable HTTP downloads explicitly. For
iOS this is an option in newer versions of Unity, but for Mac OS X and older versions
of Unity you have to do this explicitly by editing the plist file. There are notes below
on how to do this.
● For rendering we support OpenGL ES 2.0, OpenGL ES 3.0 and Metal
● Multi-threaded rendering is supported
4.1.3 Mac OS X
● Only 64-bit (x86_64) builds are supported (Apple dropped 32-bit support)
4.1.4 Windows
● Under the hood we’re using the Media Foundation and DirectShow API’s. Media
Foundation is used for Windows 8 and above while DirectShow is used as a fallback
for Windows 7 and below.
● For rendering we support Direct3D 9, Direct3D 11 and OpenGL Legacy
● Multi-threaded rendering is supported
4.1.6 WebGL
● The supported formats and features are dependent on the web browser capabilities.
4.2 Video File Locations
Video files can be played in almost any location, however we recommend placing video files
in the /Assets/StreamingAssets/ folder in your Unity project as this is the easiest folder to
get started with. StreamingAssets is a special folder that Unity copies to the build without
processing. Files copied elsewhere will require manual copying to the build location.
The MediaPlayer component allows you to browse for video files and specify them relative to
a parent folder:
For example if your file is stored in “Assets/StreamingAssets/video.mp4” you would set the
Location to “Relative To Streaming Assets Folder” and set the Video Path to “video.mp4”.
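For reference, at runtime the StreamingAssets folder maps to Unity's Application.streamingAssetsPath, so a relative path like the one above resolves roughly as in the sketch below (plain Unity API shown for illustration, not plugin code; note that on Android this path points inside the compressed APK):

using UnityEngine;

public class ShowVideoPath : MonoBehaviour
{
    void Start()
    {
        // "video.mp4" combined with the "Relative To Streaming Assets Folder" location
        // resolves to a path under Application.streamingAssetsPath
        string fullPath = System.IO.Path.Combine(Application.streamingAssetsPath, "video.mp4");
        Debug.Log("Video will be loaded from: " + fullPath);
    }
}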
You can also specify absolute paths, URLs or paths relative to other locations:
Relative to Streaming Assets Folder
This is the best and most common location for video files. This folder is located at
“Assets/StreamingAssets/” and you must create it if it doesn’t exist. Files copied to this
folder will not be imported or processed by Unity but they will be copied with the build
automatically.
Absolute Path or URL
Here you can specify a full URL or absolute path to the video file. A URL could be in the
form “http://myserver.com/myvideo.mp4” or “rtsp://myserver.com:8080/mystream.rtsp”
depending on the platform support and streaming service used.
● C:/MyFolder/AnotherFolder/MyVideo.mp4 (Windows)
● /Users/Mike/downloads/MyVideo.mp4 (Mac/Linux)
● /Storage/SD/Videos/MyVideo.mp4 (Android external SDCARD)
● /Storage/emulated/0/MyFolder/MyVideo.mp4 (Android local file system)
Using absolute paths can be useful for testing but isn’t useful when deploying to other
machines that don’t necessarily have the same file structure.
Relative to Project Folder
The project folder is the folder of your Unity project, so the folder containing the Assets,
Library and Project Settings sub-folders. Specifying files relative to the project folder can be
useful when you don’t want to include the video files in your Unity Assets folder but want to
keep them within the project folder structure. Often making a sub-folder called “Videos” is
useful. One possible problem of using this location is that when making a build your video
files will not be copied automatically to the build destination so they require manual copying.
For builds this folder should be located:
Relative to Data Folder
It isn’t that useful to put video files into this folder directly as they would then be processed
by Unity into MovieTextures and will bloat your project size. If you want to stop Unity
processing the video files simply rename the extension to something Unity doesn’t
understand, so “myvideo.mp4” could be renamed to “myvideo.mp4.bin”. Files within the
data folder (Assets folder in the editor) are not copied automatically to builds so you would
have to manually copy them.
4.3 Streaming Notes
When encoding MP4 videos for streaming make sure they are encoded with the video
header data at the beginning of the file. You normally do this by selecting “Fast Start” in the
QuickTime encoder, or by using the “-movflags faststart” option in FFMPEG. Other encoders
will have a similar option. To prepare an MP4 for streaming using FFMPEG you can use the
following command:
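For example, the following copies the existing streams without re-encoding and moves the header to the start of the file (the file names are placeholders):

ffmpeg -i myvideo.mp4 -c copy -movflags faststart myvideo-streaming.mp4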
If you are streaming videos from Vimeo as MP4 then you should note that you can replace
the “.mp4” part in the URL with “.m3u8” to instead make it an HLS stream. This may be
particularly useful if you are developing apps for Apple’s App Store as you would need to
use HLS streaming to pass certification (as of April 2016).
The Apple platforms (iOS, tvOS and OS X) support streaming of HLS streams, which typically
end with the m3u or m3u8 extension.
If you have an HTTPS URL it should work fine because Apple trusts the secure connection.
If you can only use HTTP then your app has to have a special flag set to let it use HTTP
connections (this is a security issue for Apple).
This setting is exposed in the Unity Player Settings here for iOS and tvOS:
If for some reason your version of Unity doesn’t expose this then you will have to add it
manually. In the Unity editor you need to edit "Unity.app/Contents/Info.plist" and in your built
application you would need to edit "your.app/Contents/Info.plist". These files need to have
these keys added:
<key>NSAppTransportSecurity</key>
<dict>
<key>NSAllowsArbitraryLoads</key>
<true/>
</dict>
We’ve also included a post process build script called “PostProcessBuild.cs” in the project
which edits the plist and adds this attribute. Currently it’s only set for iOS but you can edit
the #define at the top to allow Mac OS X too.
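For reference, a minimal sketch of this kind of post-process step is shown below. This is not the included PostProcessBuild.cs, just an illustration of the approach for an iOS/tvOS build using Unity's PlistDocument class (UnityEditor.iOS.Xcode namespace); place such a script in an Editor folder.

#if UNITY_IOS || UNITY_TVOS
using System.IO;
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode;

public static class AllowHttpPostProcess
{
    [PostProcessBuild]
    public static void OnPostProcessBuild(BuildTarget target, string pathToBuiltProject)
    {
        // Edit the Info.plist that Unity generated in the Xcode project
        string plistPath = Path.Combine(pathToBuiltProject, "Info.plist");
        PlistDocument plist = new PlistDocument();
        plist.ReadFromString(File.ReadAllText(plistPath));

        // Add NSAppTransportSecurity/NSAllowsArbitraryLoads so plain HTTP URLs are allowed
        PlistElementDict ats = plist.root.CreateDict("NSAppTransportSecurity");
        ats.SetBoolean("NSAllowsArbitraryLoads", true);

        File.WriteAllText(plistPath, plist.WriteToString());
    }
}
#endif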
We found these streams handy for testing (no guarantee that they’re still working):
● Streaming MP4
○ http://downloads.renderheads.com/2016/BigBuckBunny_360p30_Streaming.mp4
● HLS
○ http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8
○ http://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov/playlist.m3u8
● MPEG-Dash
○ http://www-itec.uni-klu.ac.at/ftp/datasets/DASHDataset2014/RedBullPlayStreets/10sec/RedBull_10_simple_2014_05_09.mpd
● RTSP
○ rtsp://rtmp.infomaniak.ch/livecast/latele
4.4 Augmented / Virtual Reality Notes
VR is still very new and you should always check for the latest recommended installation
instructions.
AVPro Video supports 4K MP4 playback for creating 360 degree experiences. Stereo 4K
videos in both top-bottom and side-by-side formats are also supported. If you’re using
Windows 10 and have an Nvidia Geforce 1000 series (eg 1070) you can display 8K H.265
videos!
See the FAQ for tips to achieve high resolution video playback for VR.
For software decoders, reducing the complexity of the encoded video will give the decoding
engine a much easier time and could result in higher frame rates and lower CPU/GPU
usage. Possible encoding tweaks include:
4.4.1 Stereo VR
AVPro Video supports stereoscopic videos in the top-bottom and left-right formats. You can
set the stereo packing format of your video in the Media Properties panel:
Now when using the InsideSphere shader on a mesh it will automatically map the right part
of the video to each eye. See the “Demo_360SphereVideo” scene for an example of how
this works.
Optionally you can manually set the material properties. The included shader
“InsideSphere.shader” allows you to easily set what format your video is in via a drop-down
in the material:
NOTE: Be sure to add the "UpdateStereoMaterial" component script to your scene when
using this material with Unity 5.3 or below, or with Unity 5.4 and above when the Single
Pass VR option is not enabled. Stereo VR often requires 2 cameras, each set to a different
layer mask, and 2 spheres also set to different masks. AVPro Video doesn’t require this and
just uses your normal single camera and single sphere.
4.4.2 VR Audio
Currently only the Windows plugin has support for audio manipulation in VR.
There are two main audio issues with VR in Unity:
1. Audio redirection
Some VR systems such as the Oculus Rift and HTC Vive have their own audio
output device and one needs to redirect audio to these devices instead of the system
default audio device. Unity does this automatically for its internal audio, but AVPro
Video renders to the system default audio device.
This issue can be solved by using the AudioOutput component (see below).
This component redirects the audio to be rendered by Unity and so it goes to the
correct device. AudioOutput requires that the Media Foundation video API be used,
so if you need to use the DirectShow API you can instead specify the name of the output
device manually in the field “Force Audio Output Device”:
2. Audio spatialisation
Audio needs to rotate as the user moves their head in the virtual world. This can be
achieved by using the AudioOutput component which redirects the audio from the
video into Unity (via Unity’s AudioListener).
AudioOutput requires that the video API is set to Media Foundation and that the “Use
Unity Audio” tickbox is selected.
For Android there is a special playback option called “Use Fast OES Path”. This option
caters especially for VR where users are trying to get the highest possible frame rate and
resolution out of the device (without it overheating at the same time). The option is
available in the Platform Specific section of the MediaPlayer component:
The OES path is not enabled by default because it requires some special care to be taken
and can be tricky for beginners. When this option is enabled the Android GPU returns
special OES textures (see EGL extension OES_EGL_image_external) that are hardware
specific. Unfortunately Unity isn’t able to use these textures directly, so you can’t just map
them to a material or UI. To use the texture a GLSL shader must be used. Unfortunately
Unity’s GLSL support isn’t as good as its CG shader support so again this makes things
more tricky. GLSL compilation only happens on the device (not inside Unity) so errors in
the shader can be difficult to debug.
We have included a version of the VR sphere shader that supports stereo videos as an
example. Hopefully in the future we can improve the integration of these shaders so they
aren’t such special cases. This playback path is much faster though, so is definitely worth
exploring.
4.5 Hap Codec Notes
The Hap video codec is natively supported by AVPro Video on certain platforms and has the
following benefits:
● Very low CPU usage
● GPU decompression
● Low memory usage
AVI and MOV containers can both be used however we recommend the MOV container.
Hap is only supported on Windows and Mac OS X platforms.
Hap, Hap Alpha and HapQ are supported. Hap currently requires the “DirectShow” video
API to be selected:
4.5.3 Encoding
You can download the QuickTime codec for Windows and macOS here:
https://github.com/Vidvox/hap-qt-codec/releases
This codec can be used with QuickTime Pro or any other software that supports QuickTime
codecs such as Adobe After Effects and Adobe Premiere.
Alternatively you can use a recent build of FFMPEG with the following command-lines:
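For example, recent FFMPEG builds include a Hap encoder (the -format option selects the Hap variant; availability depends on how your FFMPEG build was configured):

ffmpeg -i input.mov -c:v hap output-hap.mov
ffmpeg -i input.mov -c:v hap -format hap_alpha output-hapalpha.mov
ffmpeg -i input.mov -c:v hap -format hap_q output-hapq.mov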
4.6 Transparency Notes
Not many video codecs have native support for transparency / alpha channels. Formats
supported by some platforms of AVPro Video are:
● Hap Alpha
○ Great support on Windows and Mac OS X. Fast and low overhead format,
though file size can get large depending on the content. Currently this is the
format we recommend for transparent video.
● Uncompressed RGBA
● Uncompressed YUVA
○ Uncompressed isn’t ideal for file size or disk bandwidth but can still be used
as a fallback
● ProRes 4444
○ Best support is on Mac OS X. Files are huge.
● VP6
○ Legacy format. We support it only via 3rd party DirectShow plugins for
Windows (eg LAV Filters)
Alternatively you can encode your videos in video formats that don’t support an alpha
channel by packing the alpha channel into the same frame. You can double the width for a
left-right packing layout, or double the height for a top-bottom packing layout. This packing
could be created in software such as After Effects, or the command-line FFMPEG tool can be
used. The packing format is set in the “Media Properties” panel of the AVPro Video
MediaPlayer component:
Left-right (side-by-side) packing - doubles the frame width:
ffmpeg -i %1 -vf "split [a], pad=iw*2:ih [b], [a] alphaextract, [b] overlay=w" -y %1.mp4
Top-bottom packing - doubles the frame height:
ffmpeg -i %1 -vf "split [a], pad=iw:ih*2 [b], [a] alphaextract, [b] overlay=0:h" -y %1-tb.mp4
4.7 Hardware Decoding
AVPro Video supports hardware decoding on most platforms for optimal performance.
WebGL, Windows 8.1, Windows 10, macOS, tvOS, iOS and Android all default to hardware
decoding when possible. When playing back multiple videos simultaneously one must be
careful not to exceed the number of videos that the hardware can play back smoothly. We
have tried to collect information about the limits of specific hardware.
4.7.1 NVidia
NVIDIA GPUs use a technology called “PureVideo” or “NVDEC” to off-load decoding from the
CPU. More information can be found here:
https://en.wikipedia.org/wiki/Nvidia_PureVideo#Nvidia_VDPAU_Feature_Sets
(table adapted from presentation “HIGH PERFORMANCE VIDEO ENCODING WITH NVIDIA GPUS”)
4.7.2 AMD
AMD GPUs use a technology called UVD (Unified Video Decoder) to off-load decoding from
the CPU. More information can be found here:
https://en.wikipedia.org/wiki/Unified_Video_Decoder
4.8 Subtitle Notes
AVPro Video supports external subtitles in the SRT format. Subtitles internal to media files
are not yet supported. See the subtitle demo scene to see how to use subtitles.
5. Quick Start Examples
5.2 Quick Start: Fullscreen Video Player using Prefabs
AVPro Video includes a number of example prefabs you can use to easily add video
playback to your project. The following steps will create an application that plays back a
fullscreen video:
The DisplayIMGUI component script is just one of the components for displaying video. It
uses the legacy Unity IMGUI system which always renders on top of everything else. Try
using the DisplayBackground or DisplayUGUI components for more control if you don’t want
your video to be on top.
5.3 Quick Start: 3D Mesh Video Player Example using Components
AVPro Video includes a number of easy to use script components you can add to your
scene. In this example we show how to use the components to play a video onto a material
which is applied to a 3D model in the scene.
6. Usage
6.1 Getting Started
The easiest way to get started is to look at the included demos and see what script
components have been used. For video playback you need 3 things in your scene:
Decide how and where you want your video file to appear. There are a number of
different display component scripts included for different usage scenarios. If you
want to display the video on top of everything in your scene just add the
DisplayIMGUI script to a GameObject in your scene and set the Media Player field
to your MediaPlayer component. Other display components work similarly.
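As a sketch, the same minimal setup can also be created entirely from script. The _mediaPlayer field name and the OpenVideoFromFile() parameters below are assumptions based on the shipped components, so check DisplayIMGUI.cs and MediaPlayer.cs if they don't match your version:

using UnityEngine;
using RenderHeads.Media.AVProVideo;

public class QuickStartFromScript : MonoBehaviour
{
    void Start()
    {
        // 1) Create a MediaPlayer to load and play the video
        MediaPlayer mediaPlayer = gameObject.AddComponent<MediaPlayer>();

        // 2) Create a display component and point it at the MediaPlayer
        //    (field name assumed from the shipped DisplayIMGUI component)
        DisplayIMGUI display = gameObject.AddComponent<DisplayIMGUI>();
        display._mediaPlayer = mediaPlayer;

        // 3) Load and play a video from the StreamingAssets folder
        mediaPlayer.OpenVideoFromFile(
            MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder,
            "video.mp4",
            true);
    }
}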
6.2 Unsupported Platform Fallback
AVPro Video is designed to still function even on platforms that aren’t natively supported.
Instead of displaying the actual video though, a dummy 10 second long “AVPro” visual is
shown. All of the video controls should still work. For example if you are running your editor
in Linux the dummy video player will appear in the editor and the real video will appear when
you deploy to supported platforms. If you deploy to an unsupported platform such as
Samsung TV you will also see the dummy video player. The code is easily extendable to
add custom video players for any unsupported platform.
6.3 Components
Included are a number of components to make this asset easy to use. The components are
located in the AVProVideo/Scripts/Components folder or you can add them from the
Components menu:
MediaPlayer
This is the core component for playing media. This component only handles the loading and
playback of media and doesn’t handle how it is displayed. Use the display script
components (described below) to display the video. Fields are:
● Video Location
○ Where to look for the file specified in the Video Path below. This can be an
absolute path/URL, or relative to one of the Unity folders. The
StreamingAssets folder is the easiest to use. Options are:
■ Absolute or URL
● This is an absolute path on your device, or an http URL
■ Relative to Project Folder
● The root folder is the folder above your Assets folder
■ Relative to Streaming Assets Folder
● The root folder is /Assets/StreamingAssets
■ Relative to Data Folder
● The root folder is /Assets
● Unity manual has more information:
http://docs.unity3d.com/ScriptReference/Application-dataPath.html
■ Relative to Persistent Data Folder
● Unity manual has more information:
http://docs.unity3d.com/ScriptReference/Application-persistentDataPath.html
● Video Path
○ The file path to the video, relative to the location selected above (e.g. myvideo.mp4 or
AndroidVideos/myvideo.mp4 if you want to use a subfolder)
● Auto Open
○ Whether to open the file when this component is enabled/starts
● Auto Start
○ Whether to play the video once a video is opened
● Loop
○ Whether to loop the video
● Playback Rate
○ Sets a multiplier that affects video playback speed
○ Not supported on Android
● Volume
○ 0..1 range for audio volume
● Muted
○ Whether the audio is muted
● Persistent
○ Applies DontDestroyOnLoad to the object so that it survives scene/level loads
● Debug Gui
○ Whether to display an overlay with statistics on the video playback - useful for
debugging
● Events
○ This event can be hooked up to scripting functions which will get called when
a non-looping video completes playback. See the Events section below for
more details and a scripting example
DisplayIMGUI
This is the most basic component for displaying the video. It uses the legacy Unity IMGUI
system to display the video to the screen. IMGUI is always rendered on top of everything
else in the scene, so if you require your video to be rendered in 3D space or as part of the
uGUI system it’s better to use the other components. Fields are:
● Media Player
○ The media player to display
● Display in Editor
○ Whether to display the rectangle in the editor - useful for debugging
● Scale Mode
○ How to fit the video to the screen
● Color
○ The color to tint the video, including alpha transparency
● Alpha Blend
○ Whether the video texture controls transparency. Leaving this off for opaque
videos is a minor optimisation
● Depth
○ The IMGUI depth to display at. Use this to change the order of rendering with
other IMGUI scripts
● Full Screen
○ Whether to ignore the X, Y, Width, Height values and just use the whole
screen
● X
○ The normalised (0..1) x position
● Y
○ The normalised (0..1) y position
DisplayUGUI
This component is used to display a video using Unity’s uGUI system. Fields are:
● Media Player
○ The media player to display
● Default Texture (optional)
○ A texture to display while the video isn’t playing (while it is buffering for
example).
● No Default Display
○ Will not show anything until there are frames available
● Color
○ The color to tint, including alpha transparency
● Material
○ Standard uGUI field, change this to one of the included materials when using
packed alpha or stereo videos
● UV Rect
○ Standard uGUI field
● Set Native Size
○ When the video loads will resize the RectTransform to the pixel dimensions of
the video
● Keep Aspect Ratio
○ Whether to keep the correct aspect ratio or stretch to fill
ApplyToMesh
This component takes the texture generated by the Media Player component and applies it
to a mesh (via its MeshRenderer) by setting the mainTexture of all of its materials. Fields are:
● Offset
○ The XY translation to apply to the texture
● Scale
○ The XY scale to apply to the texture
● Mesh
○ The mesh (renderer) to apply the texture to
● Media
○ The media player
● Default Texture (optional)
○ A texture to display while the video isn’t playing (while it is buffering for
example).
ApplyToMaterial
This component takes the texture generated by the Media Player component and assigns it
to a texture slot in a Material. This is useful for playing videos on 3D meshes. Fields are:
● Offset
○ The XY translation to apply to the texture
● Scale
○ The XY scale to apply to the texture
● Material
CubemapCube
This component generates a cube mesh that is suitable for 3:2 cubemap 360 VR videos.
Fields are:
● Material
○ The material to apply to the cube. This is usually just a standard unlit
material.
● Media Player
○ The media player that will have its video texture displayed on the cube
● Expansion_coeff
○ The value used (when the component is enabled) to pad the edges of the video to
prevent bilinear filtering bleed artifacts. Default is 1.01.
AudioOutput
This component currently only supports the Windows platforms (Windows 8 and above) and
is used to pass audio from the Media Player to Unity. This allows audio effects, 3D
placement and 360 VR spatialisers to be used. Fields are:
● Media Player
○ The media player that will have its audio outputted via Unity
6.4 Scripting
6.4.1 Namespace
All of the AVPro Video scripts are contained in the RenderHeads.Media.AVProVideo
namespace, so add a using statement for this namespace to the top of your own scripts.
Most scripting is likely to center around the MediaPlayer.cs script. This script handles the
loading, playback and updating of videos. The script exposes a number of interfaces related
to different use cases and can be found in Interfaces.cs
● Info interface
○ The IMediaInfo interface is exposed by the Info property
○ This interface is used to get information about the loaded media, eg:
MediaPlayer mp;
mp.Info.GetVideoWidth();
● Control Interface
○ The IMediaControl interface is exposed by the Control property
○ This interface is used to control playback, eg:
MediaPlayer mp;
mp.Control.Pause();
● TextureProducer interface
○ The IMediaProducer interface is exposed by the TextureProducer property
○ This interface is used to get information about how to display the current
texture and is used by the Display components, eg:
MediaPlayer mp;
Texture videoTexture = mp.TextureProducer.GetTexture();
The MediaPlayer script also has a number of methods for controlling loading of media:
● OpenVideoFromFile()
○ Loads the video specified. Useful if you need manual control over when the
video is loaded
● CloseVideo()
○ Closes the video, freeing memory
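A short sketch of manual loading and closing (the OpenVideoFromFile() parameter list shown here is an assumption based on the current MediaPlayer component, so check MediaPlayer.cs for the exact signature):

using UnityEngine;
using RenderHeads.Media.AVProVideo;

public class ManualLoadExample : MonoBehaviour
{
    public MediaPlayer mediaPlayer;

    void OnEnable()
    {
        // Load (and start playing) the video only when this object becomes active
        mediaPlayer.OpenVideoFromFile(
            MediaPlayer.FileLocation.RelativeToStreamingAssetsFolder,
            "video.mp4",
            true);
    }

    void OnDisable()
    {
        // Close the video and free its memory when we're done with it
        mediaPlayer.CloseVideo();
    }
}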
The MediaPlayer also exposes an Events property (a MediaPlayerEvent) which can be used
to get notified about changes in the playback state:
● MetaDataReady - Called when the width, height, duration etc data is available
● ReadyToPlay - Called when the video is loaded and ready to play
● Started - Called when the playback starts
● FirstFrameReady - Called when the first frame has been rendered
● FinishedPlaying - Called when a non-looping video has finished playing
● Closing - Called when the media is closing
● Error - Called when an error occurred, usually during loading
Scripting example:
// Add the event listener (can also do this via the editor GUI)
MediaPlayer mp;
mp.Events.AddListener(OnVideoEvent);
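A sketch of a matching handler is shown below. The handler parameter list is an assumption based on the MediaPlayerEvent class, so check the plugin source if it doesn't match; the event type names are those listed above.

// Continues the listener example above
private void OnVideoEvent(MediaPlayer mp, MediaPlayerEvent.EventType et, ErrorCode errorCode)
{
    switch (et)
    {
        case MediaPlayerEvent.EventType.ReadyToPlay:
            Debug.Log("Video is loaded and ready to play");
            break;
        case MediaPlayerEvent.EventType.FirstFrameReady:
            Debug.Log("First frame has been rendered");
            break;
        case MediaPlayerEvent.EventType.FinishedPlaying:
            Debug.Log("Non-looping video finished playing");
            break;
        case MediaPlayerEvent.EventType.Error:
            Debug.LogError("Video error: " + errorCode);
            break;
    }
}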
7. Asset Files
7.1 Demos
● Demo_360SphereVideo.unity
○ Demo contains a video player that plays a 360
degree video using equirectangular (lat-long)
mapping.
○ The video is applied to a sphere, inside of which
is the main camera.
○ If the target device has a gyroscope then moving
the device around will rotate the camera to view
the video from different angles. For platforms
without a gyroscope the mouse/touch can be used
to look around.
○ A special shader and script are used to allow a
single camera to render in stereo on a VR
headset. Click on the material to set whether it
should display the video as monoscopic, stereo
top-bottom or stereo left-right.
● Demo_360CubeVideo.unity
○ Same as the sphere demo above, but using a
cubemap 3x2 layout source video.
● Demo_BackgroundShader.unity
○ Basic demo that plays a video using the
background material which allows the video to
appear behind all content.
● Demo_FrameExtract.unity
○ Shows how to read frames out of the video for
saving to disk (jpg/png) or accessing pixel data.
● Demo_imGui.unity
○ Basic demo that plays a video and uses the legacy IMGUI display component
to draw the video to the screen.
○ Also has an audio clip to show audio-only media playback.
○ Also has 3 different streaming URLs to demonstrate streaming.
○ IMGUI is drawn on top of all other visual components.
● Demo_Mapping3D.unity
○ Demo containing a video player and a 3D scene
○ Some of the 3D models have the video mapped to them via the
ApplyToMaterial script
● Demo_Multiple.unity
○ This demo allows you to programmatically load multiple videos and test
multiple videos playing at once. Display is via the AVPro Video uGUI
component
● Demo_uGUI.unity
○ This demo shows how to display videos within the uGUI system. It uses the
7.2 Prefabs
● 360SphereVideo.prefab
○ Prefab containing a video player and mapping to a sphere. Useful for
playback of equirectangular 360 degree videos
● BackgroundVideo.prefab
○ Prefab containing a video player and a quad model with a special background
material applied. This material makes the quad get drawn before everything
else so it appears in the background.
● FullscreenVideo.prefab
○ Prefab containing a video player and the IMGUI display component for very
easy basic video playback setup
7.3 Scripts
● Components
○ ApplyToMaterial.cs
■ Applies the texture produced by the MediaPlayer component to a Unity
material texture slot
○ ApplyToMesh.cs
■ Applies the texture produced by the MediaPlayer component to a
Unity mesh (via MeshRenderer) by setting the mainTexture field of all
its materials
○ CubemapCube.cs
■ Generates a cube mesh that can be used for displaying a 3:2
cubemap packed video
○ DisplayBackground.cs
■ Displays the texture produced by the MediaPlayer component behind
all other content (not compatible with SkyBox)
○ DisplayIMGUI.cs
■ Displays the texture produced by the MediaPlayer component using
Unity’s legacy IMGUI system
○ DisplayUGUI.cs
■ Displays the texture produced by the MediaPlayer component using
Unity’s new uGUI system
○ MediaPlayer.cs
■ The main script for loading and controlling an instance of video
playback
○ UpdateStereoMaterial.cs
■ A helper script for VR stereo rendering to update the camera position
variable in a spherical material to help work out which eye to render
8. Scripting Reference
AVPro Video is designed to be used mainly with the supplied drag and drop components but
there are always times when a bit of scripting is needed. The asset includes sample scenes
which give some examples of how to use scripting to control video playback, apply video
textures to materials etc which are useful to learn from. The full class reference is available
online here:
http://downloads.renderheads.com/docs/AVProVideoClassReference/
MediaPlayer class
The MediaPlayer class is the main class for video playback and is where video files are
specified and controlled. This class is mainly set up via the Unity Inspector UI, while
scripting is done through the interface properties it exposes.
Properties
● Events
○ returns the MediaPlayerEvent class
● Info
○ returns the IMediaInfo interface
● Control
○ returns the IMediaControl interface
● TextureProducer
○ returns the IMediaProducer interface
Methods
All of these methods use the interfaces exposed above and are just handy shortcuts
IMediaInfo interface
Methods
● float GetDurationMs();
○ Returns the duration of the video in milliseconds
● int GetVideoWidth();
○ Returns the width of the video in pixels
● int GetVideoHeight();
○ Returns the height of the video in pixels
● float GetVideoFrameRate();
○ Returns the frame rate of the video in frames per second
● float GetVideoDisplayRate();
○ Returns the actual frame rate achieved by the video decoder
● bool HasVideo();
○ Returns whether the media has visual tracks
● bool HasAudio();
○ Returns whether the media has audio tracks
● int GetAudioTrackCount();
○ Returns the number of audio tracks
● string GetPlayerDescription();
○ Returns a string describing the internal playback mechanism
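For example, once the meta data is available the Info interface can be queried like this (an illustrative fragment in the same style as the snippets in the Scripting section):

MediaPlayer mp;   // assigned elsewhere, e.g. via the Inspector
if (mp.Info != null && mp.Info.HasVideo())
{
    Debug.Log(string.Format("Video: {0}x{1} @ {2}fps, duration {3}ms, player: {4}",
        mp.Info.GetVideoWidth(),
        mp.Info.GetVideoHeight(),
        mp.Info.GetVideoFrameRate(),
        mp.Info.GetDurationMs(),
        mp.Info.GetPlayerDescription()));
}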
IMediaControl interface
Methods
IMediaProducer interface
Methods
● Texture GetTexture();
○ Returns a Unity Texture object if there is a texture available otherwise null is
returned.
● int GetTextureFrameCount();
○ Returns the number of times the texture has been updated by the plugin.
This can be useful to know when the texture was updated as the value will
increment each time.
● long GetTextureTimeStamp();
○ Returns the presentation time stamp of the current texture in 100-nanosecond
units. This is useful for accurate frame syncing.
● bool RequiresVerticalFlip();
○ Some textures are decoded upside-down and need to be vertically flipped
when displayed. This method returns whether the texture needs to be flipped
during display.
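As an illustration of how a display component might use this interface, the fragment below applies the current texture to a material and flips it vertically when required by adjusting the texture scale and offset (a sketch only, not the shipped ApplyToMaterial code):

MediaPlayer mp;            // assigned elsewhere
Material targetMaterial;   // assigned elsewhere

Texture texture = mp.TextureProducer.GetTexture();
if (texture != null)
{
    targetMaterial.mainTexture = texture;
    if (mp.TextureProducer.RequiresVerticalFlip())
    {
        // Flip the V axis when the platform decodes the frame upside-down
        targetMaterial.mainTextureScale = new Vector2(1f, -1f);
        targetMaterial.mainTextureOffset = new Vector2(0f, 1f);
    }
    else
    {
        targetMaterial.mainTextureScale = Vector2.one;
        targetMaterial.mainTextureOffset = Vector2.zero;
    }
}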
9. Supported Media Formats
In general the most widely supported format is MP4 with H.264 encoding for video and AAC
encoding for audio. This format is supported across all platforms, though not necessarily at
all bit-rates and profiles.
Container support:
AVI Yes No No No
ASF/WMV Yes No No No
WAV Yes ? ? ?
WMA Yes No No No
MIDI Yes ? ? ?
MJPEG Yes No No No
WMV Yes No No No
DV Yes Yes (in Yosemite and above) No No
● * Yes, only in Windows 10 and only 4:2:0. If you use DirectShow and a 3rd party filter
then 4:4:4 can be supported
● ** Apparently Apple will release support soon
● *** Requires option “Force DirectShow” to be ticked
● **** Older versions of Windows (Vista and XP) do not have support for H.264
decoding
● ***** iOS seems to often have trouble with MP3 audio tracks in a video file, so best to
use AAC instead
Cells with “?” are ones we’re not sure about. We will do more testing and continue to
update this table. For more details on which codecs and what type of encoding is best, see
the per-platform details below.
9.1 Android
Android supports many media formats. For a complete list check the Android documentation
here: https://developer.android.com/guide/appendix/media-formats.html
HEVC support was officially added in Android 5.0 (Lollipop) but mostly as a software
decoding implementation.
A list of media-player related Android chipsets and which formats they support for hardware
decoding: http://kodi.wiki/view/Android_hardware
9.2 iOS, tvOS and OS X
Many media formats are supported on iOS including H.264. For a complete list check the
iOS documentation here:
https://developer.apple.com/library/ios/documentation/Miscellaneous/Conceptual/iPhoneOSTechOverview/MediaLayer/MediaLayer.html
The iPhone5C is a low-powered mobile device and has a limitation that the videos must be
less than 2 megapixels in resolution.
iOS seems to often have trouble with MP3 audio tracks in a video file, so best to use AAC
instead.
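If your source file has an MP3 audio track you can re-encode just the audio to AAC with FFMPEG, for example (file names are placeholders):

ffmpeg -i myvideo.mp4 -c:v copy -c:a aac -b:a 192k myvideo-aac.mp4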
OS X additionally has support for formats including:
● DV
● Uncompressed R10k
● Uncompressed v210
● Uncompressed 2vuy
9.3 Windows
The H.264 decoder supports up to level L5.1, but Windows 10 adds support for levels above L5.1:
https://msdn.microsoft.com/en-us/library/windows/desktop/dd797815(v=vs.85).aspx
9.4 Windows Phone / UWP
Details on the media formats supported by these platforms can be found here:
https://msdn.microsoft.com/library/windows/apps/ff462087(v=vs.105).aspx
https://msdn.microsoft.com/en-us/windows/uwp/audio-video-camera/supported-codecs
9.5 WebGL
Support for the WebGL platform is still varied and depends on the platform and browser support.
Some formats such as the AVI file container are not supported at all. As with all other platforms,
H.264 video in an MP4 container is the most widely supported format.
Adaptive streaming (such as HLS) is still not supported natively by most browsers, but we
have seen it working in the Microsoft Edge and Safari browsers.
https://developer.mozilla.org/en-US/docs/Web/HTML/Supported_media_formats
https://en.wikipedia.org/wiki/HTML5_video#Browser_support
http://www.encoding.com/html5/
10. Support
If you are in need of support or have any comments/suggestions regarding this product
please contact us.
Email: [email protected]
Website: http://renderheads.com/product/avpro-video/
Unity Forum:
http://forum.unity3d.com/threads/released-avpro-video-complete-video-playback-solution.385611/
If you are reporting a bug, please include any relevant files and details so that we may
remedy the problem as fast as possible.
Essential details:
● Error message
○ The exact error message
○ The console/output log if possible
○ If it’s an Android build then an “adb logcat” capture
● Hardware
○ Phone / tablet / device type and OS version
● Development environment
○ Unity version
○ Development OS version
○ AVPro Video plugin version
● Video details:
○ Resolution
○ Codec
○ Frame rate
○ Better still, include a link to the video file
11. About RenderHeads Ltd
RenderHeads Ltd is an award winning creative and technical company that has been
designing and building cutting edge technology solutions since its formation in 2006. We
specialise in creating interactive audio-visual software for installations at auto shows,
museums, shows and expos.
11.1 Services
Many of the apps and projects we develop require features that Unity doesn’t yet provide, so
we have created several tools and plugins to extend Unity which are now available on the
Unity Asset Store. They all include a free trial or demo version that you can download
directly from the website here:
http://renderheads.com/product-category/for-developers/
Powerful cross-platform video playback solution for Unity, featuring support for
Windows, OS X, iOS, Android and tvOS. This is our newest plugin.
Video capture to AVI files direct from the GPU and encoded to files using DirectShow
codecs. Features include 4K captures, lat-long (equirectangular) 360 degree captures,
off-line rendering and more. Windows only.
Integrates DeckLink capture card functionality into Unity, allowing users to send and receive
high-definition uncompressed video data to and from these capture cards.
Highly productive tool allowing in-game and in-editor annotation of screenshots which can
then be shared with your team via FTP, Email, Slack or Teamwork.com with the click of the
mouse. Cross-platform.
Appendix A - FAQ
The ability to play high resolution videos depends on the version of the Windows
operating system and which video codecs you have installed.
By default AVPro Video will try to use the standard Microsoft codecs that come with
Windows. These have limitations, for example:
If you want to use 3rd party codecs you can install them to improve the range of
resolution and format options available. These come in the form of Media
Foundation or DirectShow codecs. The LAV Filters are a great example of
DirectShow codecs that will allow for higher resolution video decoding.
Yes and no. If you enter a YouTube URL into AVPro Video it will not be able to play
it because this is the URL to the website and not the video. It is possible to gain
access to the underlying MP4 file or HLS stream URL which will work with AVPro
Video. This may be against the terms and conditions of YouTube though.
This is usually caused by the video decoding taking too much CPU time. Some
videos (especially highly compressed ones) require a lot of CPU time to decode each
frame.
Try enabling the hardware decoding option in the “Platform Specific” panel. This will
enable your application to use GPU decoding which is much faster. This option is
only supported on Windows 8.1 and above and when the D3D11 graphics API is used.
You could also try re-encoding your video using settings that are less demanding for
You could also try switching to another video codec that allows for less CPU
intensive decoding. Codecs such as H.264 / H.265 are generally very heavy on the
CPU. Hap is an extremely fast codec but will result in large files.
Yes it is on most platforms. Android, iOS, tvOS and macOS mostly use GPU
decoding - but it depends on the codec. For Windows, GPU decoding is enabled by
default but is only available for Windows 8.1 and above and when using the D3D11
graphics API. You can toggle whether GPU decoding is enabled via the Platform Specific
panel:
The way seeking works depends on the platform. Some platforms support frame
accurate seeking but most will only seek to the nearest key-frame in the video. In
general to improve seek accuracy and responsiveness you can re-encode your
videos with more frequent key-frames (smaller GOP). The downside of this is that
the size of the video file will increase (more key-frames = larger file). We have two
seek functions: Seek() and SeekFast(). SeekFast can be used when accuracy isn’t
important but whether it’s faster or not depends on support from the platform and can
If your stereo video is only decoding one of the views then this is because there is
stereo metadata encoded within the video which causes our Media Foundation
decoder to interpret it incorrectly. The only way to currently solve this is to remove the
metadata from the video. One way to do this is using FFMPEG:
ffmpeg -i %1 %1-video.nut
ffmpeg -i %1-video.nut -crf 20 %1-stripped.mp4
If your platform is Windows desktop then this is usually because the DirectShow
video API has not been selected. See the notes above about how to use the Hap
codec.
10. macOS - Publishing for Mac App Store and get error “Invalid Signature”?
We’re not sure why this happens but we’ve had reports that simply deleting all of the
.meta files inside the AVProVideo.bundle will resolve this issue.
Make sure that your video file doesn’t contain an MP3 audio track. In our experience
iOS doesn’t handle MP3 audio correctly and can often lead to the whole file not
loading/playing correctly. Use AAC for audio instead.
Yes, if the video is not content protected and you have its URL. URLs for items in
the video library take the following form:
ipod-library://item/item.m4v?id=1234567890123456789
You can get the URL of an item by using the MPMediaQuery class from the
MediaPlayer framework.
Yes. You need to use the UIImagePickerController to select the video. This will
result in the video being processed into your app's temp folder where you can access
it via the URL returned from the controller.
If you are not using Push notifications in your app, this Unity Asset does the job of
fixing this error: https://www.assetstore.unity3d.com/en/#!/content/59160. Just import
it, and build the xcode files again.
15. iOS - How can I fix the error message: "This app attempts to access
privacy-sensitive data without a usage description. The app's Info.plist must
contain an NSCameraUsageDescription key with a string value explaining to
the user how the app uses this data."?
16. Android - Why doesn’t my huge video file play from StreamingAssets folder?
Files in the StreamingAssets folder for Android are packed into a JAR file and so
before we can play them we must extract them to the apps persistent data folder.
Huge files can take a long time to extract and copy and sometimes they could even
cause the device to run out of storage space or memory. For really large files we
recommend placing the videos in other folders on the device and then
referencing the absolute path to that file. This also has the added benefit of not
having to copy huge files and wait ages when deploying builds to the Android device.
17. Android - Why doesn’t the video display when using Vuforia?
In our tests Vuforia doesn’t play well with AVPro Video when multi-threaded rendering
is enabled. Simply disable this option in the Player Settings screen.
18. Android - Why does my build fail with a message about DEX and zip classes?
This can happen when there is a conflict in the Java code. We include a JAR called
zip_file.jar which contains classes for handling zip files. If you have another plugin in
your project that also contains these classes then Unity will fail the build. Read the
exact Unity error message and if it is related to zip classes then you can try deleting
the zip_file.jar file.
20. iOS - Why do I get this message about privacy permission using camera?
This seems to be a bug in Unity 5.4.1 and has been resolved in Unity 5.4.2.
We have found that H.264 videos with GOP = 1 (all I-Frames / Key-frames) freeze on
Android. GOP interval of 2 seems to be fine.
Windows 10 seems to limit the stream resolution based on your screen resolution
and the monitor DPI scaling setting. To get 4K using HLS you need a 4K monitor
and your DPI scaling set to 100%. This issue now seems to be resolved in the latest
version of Windows 10.
24. Why does the video control react correctly in Windows 7, but not in Windows 8
or above?
If you try to call a video control function (eg Seek, Mute, SetVolume etc) just after you
open the video file, this can cause problems. Video loading can be synchronous or
asynchronous, so you need to wait until the video has completed loading before
issuing any control commands. In Windows 7 and below the DirectShow API is used
by default. Currently the open function works synchronously, so control functions will
work immediately after loading. In Windows 8 and above the Media Foundation API
is used by default, and this is asynchronous. The best approach is to use the event
system to wait for the video to be ready (contain meta data), or to poll the
MediaPlayer (Control.HasMetaData()) before issuing control functions.
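For example, a simple polling approach using a coroutine (a sketch; the Seek() time units and the exact IMediaControl signatures should be checked against Interfaces.cs, and the event system described in the Scripting section is an equally valid alternative):

using System.Collections;
using UnityEngine;
using RenderHeads.Media.AVProVideo;

public class SeekAfterLoad : MonoBehaviour
{
    public MediaPlayer mediaPlayer;

    IEnumerator Start()
    {
        // Wait until the media has loaded its meta data before issuing control commands
        while (mediaPlayer.Control == null || !mediaPlayer.Control.HasMetaData())
        {
            yield return null;
        }
        mediaPlayer.Control.Seek(5000f);   // assumed to take milliseconds
        mediaPlayer.Control.Play();
    }
}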
Some users have reported issues with creating a stream with FFMPEG where the
stream will take a long time to connect and will only send through frames
occasionally. We have found that you need to manually set the GOP size
parameter to a multiple of your frame-rate. So for example if your frame-rate is
25 then you should add “-g 25” to your FFMPEG command-line.
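For example, for a 25 fps source being pushed to an RTMP server the command line might look like this (the server URL is a placeholder and the rest of the options will depend on your streaming setup):

ffmpeg -re -i input.mp4 -c:v libx264 -g 25 -c:a aac -f flv rtmp://myserver.com/live/mystream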
26. Why does changing the Quality Settings level turn the video texture black?
This appears to be a bug in Unity 5.4.2, 5.4.3 and 5.5.0 related to changing the
Texture Quality field in the Quality Settings. The issue is not present in Unity 5.4.1.
We have reported this issue to Unity.
27. What’s the difference between your different Unity video playback plugins?
OS X Yes No Yes
Android Yes No No
iOS Yes No No
tvOS Yes No No
WebGL Yes No
4K H.264 Yes Only with 3rd party codec Yes but very slow
Works with VR Yes Yes, best with 3rd party codecs Not great
** Currently only using DirectShow path, adding Media Foundation path soon.
Appendix C - Roadmap
● Version 1.6.0
○ Add global audio volume
● Version X.X.X
○ Split out media definitions from player
○ Playmaker support
○ Loading from memory/resources
○ Fallback to using MovieTexture on some platforms?
○ 10-bit HEVC support?
● Version X
○ ← Your suggestions here, let us know :)