Sapera User's Manual
P/N: OC-SAPM-USER0
www.teledynedalsa.com
NOTICE
This document may not be reproduced nor transmitted in any form or by any means, either
electronic or mechanical, without the express written permission of TELEDYNE DALSA. Every effort
is made to ensure the information in this manual is accurate and reliable. Use of the products
described herein is understood to be at the user’s risk. TELEDYNE DALSA assumes no liability
whatsoever for the use of the products detailed in this document and reserves the right to make
changes in specifications at any time and without notice.
All other trademarks or intellectual property mentioned herein belong to their respective owners.
This manual exists in Windows Help and Adobe Acrobat® (PDF) formats (printed manuals are
available as special orders). The Help and PDF formats make full use of hypertext cross-references.
The Teledyne DALSA home page on the Internet, located at
http://www.teledynedalsa.com/imaging, contains documents, software updates, demos, errata,
utilities, and more.
Teledyne Digital Imaging offers the widest range of machine vision components in the world, from
industry-leading image sensors through powerful and sophisticated cameras, frame grabbers,
vision processors, and software, to easy-to-use vision appliances and custom vision modules.
Contents

SAPERA LT ARCHITECTURE
APPLICATION ARCHITECTURE
Library Architecture
DEFINITION OF TERMS
SAPERA LT ++ AND SAPERA LT .NET CLASSES
Sapera LT ++ Basic Classes by Subject
Sapera LT .NET Basic Classes by Subject
Sapera LT ++ and Sapera LT .NET Class Descriptions
Library Architecture
The typical machine vision application requires configuration of acquisition resources, image
capture, and transfer to memory buffers. These image buffers can then be processed, displayed,
or analyzed, with the results determining subsequent processes. Events can also be monitored to
trigger appropriate responses. The Sapera LT library architecture is organized around these basic
machine vision functional blocks.
The following block diagram, while not exhaustive of all the classes available in Sapera LT,
illustrates the major functional blocks with the corresponding classes.
It is always recommended to use the source code provided with the demos and
examples as both a learning tool and a starting point for your applications. For a
complete list and description of the demos and examples included with Sapera LT
see the Sapera LT Getting Started Manual for Frame Grabbers and Sapera LT
Getting Started Manual for GigE Vision Cameras & 3D Sensors.
What is a module?
A module is a set of functions used to access and/or control a static or a dynamic resource. The
complete Sapera LT Standard C API is composed of a series of modules organized in a particular
architecture.
[Block diagram: Sapera LT ++ basic classes by subject. General classes: SapManager, SapLocation,
SapManCallbackInfo, SapData. Data classes: SapDataXXX. Acquisition classes (frame-grabber
specific, camera specific, and general), including SapFeature, SapColorConversion, and SapMetadata.]
[Block diagram: Sapera LT .NET basic classes by subject. General classes: SapData, SapLocation,
SapManager, SapServerFileNotifyEventArgs, SapServerNotifyEventArgs. Data classes: SapDataXXX,
SapResetEventArgs, SapException. Acquisition classes (frame-grabber specific, camera specific, and
general), including SapAcquisition, SapAcqDevice, SapLut, SapMetadata, SapDisplayDoneEventArgs,
SapPerformance, SapDisplay, and SapGraphic.]
SapAcqToBuf, SapAcqDeviceToBuf, SapBufToBuf, SapMultiAcqToBuf: These specialized transfer
classes are a set derived from SapTransfer that allow easy creation of the most commonly used
transfer configurations. For example, setting up a transfer configuration from a SapAcquisition
object (frame grabber) to a SapBuffer object normally requires many lines of code which call
various functions in the SapTransfer class. Using the specialized class SapAcqToBuf instead
reduces this to just one line of code.

SapData and SapDataXxx: SapData and its derived classes act as wrappers for Sapera LT data
types, where each class encapsulates one data element of a specific type. They are used as
property values, method arguments, or return values in various Sapera LT ++ and Sapera LT
.NET classes.

SapFeature: The SapFeature class includes the functionality to retrieve the feature information
from the SapAcqDevice class. Each feature supported by the SapAcqDevice class provides a set
of properties such as name, type, access mode, and so forth, that can be obtained through the
feature module.

SapGio: The purpose of the SapGio class is to control a block of general inputs and outputs—a
group of I/Os that may be read and/or written all at once.

SapManager: The SapManager class includes methods for describing the Sapera resources
present on the system. It also includes error management capabilities.
SapProcessing: The SapProcessing class allows you to implement your own processing through a
derived class.

SapView: The SapView class includes the functionality to show the resources of a SapBuffer
object in a window through a SapDisplay object. An ‘auto empty’ mechanism allows
synchronization between SapView and SapTransfer objects in order to show buffers in real time
without missing any data.

SapXferFrameRateInfo: The SapXferFrameRateInfo class provides frame rate statistics for the
associated SapTransfer object. It is created automatically when a SapTransfer object is
constructed.

SapXferNode: The SapXferNode class is the base class used to represent a source or destination
transfer node involved in a transfer task managed by the SapTransfer class. The actual class for
the node can be SapAcqDevice, SapAcquisition, or SapBuffer.

SapXferPair: The SapXferPair class describes a pair of source and destination nodes for the
SapTransfer class.
Image Processing Unit (IPU): The IPU performs real-time embedded image processing. The
capabilities of the IPU vary based on the price/performance criteria targeted for the acquisition
hardware. The embedded processing varies in complexity, from color space conversion in simple
frame grabbers and cameras, to image analysis and control of external devices on vision processors.
With the Teledyne DALSA image acquisition device functional architecture in mind, let us take a
closer look at the T2IR framework to understand what it is, its principal building blocks, and how it
helps reduce costs.
The reliability of a vision system is reflected by its ability to handle both predictable and
unpredictable trigger signals. The parts of a vision system – image acquisition and control – must
operate in harmony to achieve this reliability. A controlled response to system events is directly
related to the quality of information needed to produce products with consistent quality. This helps
lower costs by increasing system uptime and yield.
T2IR is a combination of hardware and software features that work together to improve the
reliability of your vision system. T2IR features deliver full system level monitoring, control, and
diagnostics capability. It lets you reach inside your vision system to audit and debug image flow.
You can trace the flow of data from image capture right through transfer to host memory. You can
even store images temporarily in the onboard memory to overcome unexpected transfer
bottlenecks. That means no lost data, no false data and a clear source to identify and track any
errors. Sapera T2IR features accomplish these tasks in a non-intrusive manner that does not
interfere with the applications.
[Block diagram: image flow through a typical vision system. A trigger input starts acquisition at the
camera; the image is transferred to the Teledyne DALSA frame grabber, then to host computer
memory; the application performs image analysis / program logic; and a result trigger is output
from the frame grabber or camera to other devices.]
T2IR aims to handle the common breakdown points in this chain such that corrective or
preventative action can be taken, and to eliminate the possibility of unknown faults/application
failure.
The right target image acquisition: Acquires the best quality images with the object details critical
to making correct decisions.
Managing External Triggers: Ensures synchronization between image acquisition and object
motion, reduces image artifacts due to motion, and provides a controlled response to expected and
unexpected external events.
Tracking and Tracing Images: Continuous coverage of the entire image flow reduces waste and
improves uptime.
Monitoring the Acquisition and Transfer Process: Enables preventive action if resource usage
exceeds a predetermined threshold, selectively keeping or discarding images to sustain processing
speed.
Overcoming Too Much Data: Handles peak loads to avoid data loss and ensure smooth operations.
Ensuring Data Quality: Helps increase uptime and reduce waste.
Advanced Diagnostics: Rapid pinpointing of errors for speedy diagnostic and preventive actions.
Teledyne DALSA camera and frame grabber products incorporate various levels of control functions
for automating imaging applications. A good starting example is the integration of the trigger and
strobe control functions into onboard hardware.
This sounds simple enough: a trigger input generates a strobe output for lighting control and
camera exposure. However, there are circumstances in which a delay between the trigger input
and the strobe output is required; for example, if the camera and lighting units are not in the same
position on a conveyor as the trigger sensor. Coordinating these two events through software is
almost impossible and certainly not reliable (especially given the variations in command execution
of the Windows operating system). To solve this problem Teledyne DALSA has incorporated
programmable delay timers between these two signals.
The delay timers give developers a mechanism for establishing a precise delay between the trigger
input and firing of the lighting and camera exposure. However, this amount of programmed delay
is calculated based on the theoretical speed of the production line. If the actual speed is not
constant (a common occurrence), the position of the object in the resulting image may not be
suitable for analysis. Therefore, for reliable image acquisition the delay has to be linked to the
speed of the object. This is done using the pulse output from an encoder attached to a rotating
part of the conveyor system. Expressing the delay in terms of encoder ticks synchronizes it with
the actual speed of the production line. As a result, the object is always at the same location in the
image regardless of the speed of production line.
The easiest way to program trigger parameters is to use Sapera CamExpert. Sapera CamExpert is
a camera configuration tool that offers an intuitive graphical user interface and live image display
for Teledyne DALSA acquisition devices.
For example, the External Trigger parameters are all grouped in one category in the Parameters
panel (shown here for the Xtium-CL PX4 frame grabber):
When you are satisfied with all the parameter settings, they can be saved in a configuration file
and later retrieved by the application at run time.
The example below shows how to access a previously stored camera configuration file for the Xtium
frame grabber in C++:
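Such an example might look like the following minimal sketch; the server name "Xtium-CL_PX4_1"
and the file name "MyCamera.ccf" are placeholders for your own board and configuration file, and
error handling is kept to a minimum:

// Sketch only: create an acquisition object from a previously saved CCF file
#include "SapClassBasic.h"

int main()
{
   // The CCF file holds the camera and trigger parameters saved from CamExpert
   SapAcquisition acq(SapLocation("Xtium-CL_PX4_1", 0), "MyCamera.ccf");

   // Create() allocates the low-level resources and programs the hardware
   // with the parameters contained in the configuration file
   if (!acq.Create())
      return -1;

   // ... create buffer and transfer objects here, then grab ...

   acq.Destroy();
   return 0;
}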
This synchronization achieves the first goal of Trigger-to-Image Reliability: the camera is properly
controlled to capture the image of the target being inspected. Of course, these hardware features
are under software control, but, once initialized, they act independently of any software execution,
leading to predictable results.
System designers want to build systems that offer scalable performance while minimizing costs. In
some cases it might be more economical to combine multiple lower-resolution cameras and optics
to construct higher-resolution images, while in others it might be necessary to distribute very
high speed images across multiple computers to minimize image processing and analysis time.
In all cases when multiple acquisition devices are used, it is important that all devices operate
synchronously to produce images that are error free and ready to use. T2IR framework capabilities
permit this by incorporating critical features to achieve image acquisition synchronization in
hardware and software, without the need for external synchronization and data replicating devices.
This T2IR synchronization feature also permits implementation of different image processing setups
to achieve a target processing time. Let us closely look at some of the commonly used system
configurations.
In cases where images from different devices must be combined in one buffer, Teledyne DALSA
GigE Vision cameras (such as Genie Nano and Linea GigE) and frame grabbers (such as Xtium-CL
MX4) incorporate the necessary hardware to work under Sapera LT to capture images in one
seamless Sapera buffer. Teledyne DALSA’s Xcelera and Xtium series frame grabbers, for example,
offer dedicated hardware signals to synchronize multiple boards and cameras together. The trigger
source can be easily set using CamExpert. Sapera LT SDK also provides dedicated demo
applications with source code to jump start the development efforts.
Similarly, the Genie Nano and Linea GigE camera series are also capable of accepting external input
signals that can be distributed to other cameras for synchronization.
The Xtium-CLHS series includes a dedicated image data forwarding port, allowing images to be
distributed across multiple frame grabbers (on the same PC or not). The frame grabber that
controls the camera receives the image; an exact copy is forwarded through the data port to a
slave frame grabber, which can in turn forward it to another frame grabber, and so on (up to 5
slaves). All frame grabbers receive the same image. It is up to the user to decide how to process
the image. For instance, using two frame grabbers, the first one could send half of the image to its
host, while the other sends the other half to its host. This approach can be used when camera
bandwidth exceeds the 2.1 GB/s limit of the CLHS cable.
A first criterion for a valid trigger is that a trigger has to represent an actual “part-in place” for
inspection. A false trigger is a signal that is not associated with a part in place. False triggers can
be caused by jitter resulting from electrical noise or glitches associated with mechanical actuators
and motors. T2IR capabilities offer an effective way to reduce faulty triggers by ensuring that the
signal remains active for a minimum duration before it can be considered as valid for the
acquisition. For added flexibility Teledyne DALSA products offer this T2IR feature as a user
programmable parameter.
[Diagram: trigger filtering. A PLC provides trigger inputs and a strobe light output; valid triggers
generate a response event in the Sapera application, while invalid triggers are rejected.]
After the probability of spurious triggers is minimized, user applications can be programmed to
handle the other extreme, appropriately called “over-trigger” conditions. An over-trigger condition
occurs when the camera receives a trigger but is busy acquiring the previous image. Care must be
given to the fact that, in some cases, sending a trigger while grabbing the previous line or frame is
desirable to minimize the dead time between frames or lines (in the case of line scan cameras).
Typical causes for an over-trigger state can be that the image generated from the previous trigger
is still being processed, or the sensor is currently being read out or exposed for the next image
(note that some cameras support exposing the sensor during readout, which allows for a higher
frame rate than otherwise possible).
The T2IR capabilities allow applications to tolerate over-trigger situations and track them if a
system starts to lose images. When frames are lost, T2IR capabilities notify Sapera-based user
applications with event messages for remedial actions. T2IR framework helps applications to
maintain control despite timing fluctuations in trigger generation.
More advanced applications may require inspection from multiple views. Continuing our previous
example, let us assume the object has to be inspected on each side, each with different lighting, at
the same frame rate. Now the constraints evolve from inspecting 3600 parts per minute to
handling 14,400 images per minute. In this scenario, the imaging system must correlate four
different acquisitions before making the final decision to accept, reject or re-inspect the object.
With synchronized acquisition timestamps, the 4 images for each item are:
Image 1 timestamp = T
Image 2 timestamp = T + x ticks
Image 3 timestamp = T + y ticks
Image 4 timestamp = T + z ticks
where T is the timestamp of the first acquisition and x, y, and z are the expected intervals between
acquisitions.
The image tags (timestamps) are generated either from an onboard hardware clock, the PC clock,
or incremented using an external signal (a trigger, encoder tick, or another pulse input) at the
time of image acquisition and/or image transfer to the host. For example, the Xtium-CL MX4
provides the following hardware timestamps:
Since there is a time lag between image capture and analysis, the image timestamps can be used
to ensure that the system acts on the correct object. Timestamps can also be used to precisely
measure the acquisition or processing rates, or to determine whether any loss of data has occurred
by comparing the time lapse between successive frames.
In C++, callback functions are used to access the timestamps; whenever registered events occur,
the associated callback function is executed.
For .NET, a similar mechanism uses the EnableEvent method and AcqDeviceNotify event to call the
associated event handler.
Trigger-to-Image Reliability framework includes a set of software tools to ensure that all required
images were captured accurately into onboard memory. While it is possible to continuously check
the status to monitor system operations, in practice it comes at the expense of system
performance. T2IR uses the concept of events that are issued by the acquisition devices to notify
the application if certain status flags have changed. This allows the application to operate more
optimally, as it gets interrupted from its main processing task only when an event has occurred.
Since these notifications are handled at the user application level, the applications have complete
freedom to decide how best to handle them.
The table below summarizes the Sapera events associated with image capture and transfer
sequences into the host memory.
Sapera Events

Event: Description
EndOfEven: End of even field
EndOfField: End of field (odd or even)
EndOfFrame: End of frame
EndOfLine: After a specific line number (eventType = EndOfLine | lineNum)
EndOfNLines: After a specific line number (linescan cameras only; eventType = EndOfNLines | numLines)
EndOfOdd: End of odd field
EndOfTransfer: End of transfer, that is, after all frames have been transferred following calls to
SapTransfer.Snap or SapTransfer.Grab/SapTransfer.Freeze.
FieldUnderrun: The number of active lines per field received from a video source is less than it should be.
LineUnderrun: The number of active pixels per line received from a video source is less than it should be.
StartOfEven: Start of even field
StartOfField: Start of field (odd or even)
StartOfFrame: Start of frame
StartOfOdd: Start of odd field
In addition to these events, the status of the following acquisition signals can be monitored in the
host application. Note that the availability of status signals varies with the hardware used and this
availability can be verified programmatically. The SapAcquisition::GetSignalStatus(…) function can
be used to monitor these signals.
SapAcquisition::SignalNone: No signal
SapAcquisition::SignalHSyncPresent: Horizontal sync signal (analog video source) or line valid
(digital video source)
SapAcquisition::SignalVSyncPresent: Vertical sync signal (analog video source) or frame valid
(digital video source)
SapAcquisition::SignalPixelClkPresent: Pixel clock signal. For Camera Link devices, this status
returns true if a clock signal is detected on the base cable.
SapAcquisition::SignalPixelClk1Present, SapAcquisition::SignalPixelClk2Present: Pixel clock signal.
For Camera Link devices, this status returns true if a clock signal is detected on the medium cable.
SapAcquisition::SignalPixelClk3Present: Pixel clock signal. For Camera Link devices, this status
returns true if a clock signal is detected on the full cable.
SapAcquisition::SignalPixelClkAllPresent: Pixel clock signal. For Camera Link devices, true if all
required pixel clock signals have been detected by the acquisition device based on the Camera Link
configuration selected.
SapAcquisition::SignalHSyncLock: Successful lock to a horizontal sync signal, for an analog video source
SapAcquisition::SignalVSyncLock: Successful lock to a vertical sync signal, for an analog video source
SapAcquisition::SignalPowerPresent: Power is available for a camera. This does not necessarily
mean that power is used by the camera; it only indicates that power is available at the camera
connector, where it might be supplied from the board PCI bus or from the board PC power
connector. The returned value is FALSE if the circuit fuse is blown, in which case power cannot be
supplied to any connected camera.
SapAcquisition::SignalPixelLinkLock: Lane lock signal. For HSLink and CLHS devices, true if all
required lane lock signals have been detected by the acquisition device based on the HSLink or
CLHS configuration selected.
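For example, the signal status can be polled in Sapera LT ++ along the following lines; this is a
sketch only, assuming the GetSignalStatus variant that returns the status mask by reference
(pAcq is an existing SapAcquisition object, and the exact overloads are listed in the Sapera LT ++
Programmer's Manual):

// Sketch: poll the acquisition signal status and test individual status bits
SapAcquisition::SignalStatus signalStatus;
if (pAcq->GetSignalStatus(signalStatus))
{
   if (signalStatus & SapAcquisition::SignalPixelClkPresent)
      printf("Pixel clock detected\n");

   if (!(signalStatus & SapAcquisition::SignalPowerPresent))
      printf("No camera power available at the connector\n");
}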
The scalable nature of T2IR framework has allowed Teledyne DALSA to add sophisticated
parameter switching capability in its hardware products that are well suited for use with circular
buffers. Teledyne DALSA Genie cameras, for example, allow users to change trigger delay, strobe
outputs, exposure delay and duration, gain, LUTs and FFCs (flat-field coefficients) on a frame by
frame basis. Similarly, the Xtium-CL MX4 frame grabber allows users to switch flat-field and LUTs
on a frame by frame basis. When activated, these advanced switching features operate entirely in
the acquisition device without using the host CPU resources. Furthermore, the images generated
while switching parameters can be saved as a sequence of images.
T2IR provides a broad range of options to handle situations involving too much data. It provides
users with necessary information to discard images safely while preserving the accuracy of results
from images that were processed. When every image counts, discarding images inevitably leads to
reduced throughput. Thus, even when discarding images, care must be given to minimize the
impact on throughput. The T2IR framework allows applications to discard images early in the
acquisition pipeline if it is determined that the system won’t be able to handle the images
subsequently. The T2IR framework uses a concept of “trash” buffers to discard incoming images
efficiently. When a system is not able to handle the incoming data, the acquired images are
transferred into the “trash buffer”.
• Sapera Monitor
• External LEDs
• Sapera LogViewer
• Sapera PCI Diagnostic Tool
• Sapera Networking Tool
• Sapera Configuration
• Xtium Diagnostic Tool
Sapera Monitor
As part of the Trigger-to-Image-Reliability (T2IR) framework, the Sapera Monitor Tool allows users
to view the acquisition and transfer events generated by an acquisition device in real time. Sapera
Monitor is a standalone application that runs concurrently with CamExpert or with a user
application, and can therefore be useful for debugging applications and identifying problems
without having to code event handlers. See the Sapera Getting Started manual for more info.
External LED status indicators:
Red: power connected
Flashing Red: initialization
Flashing Blue: waiting for IP
Blue: IP assigned
Green: application connected
Sapera Log Viewer runs transparently in the background without impacting the application
performance, and stores entire message communications and results. This allows analysis of the
log even after the error has occurred. Configuration options allow users to set the type of results to
log, such as ignoring info messages and logging only warning or error messages. Messages in the
viewer can be dynamically filtered and/or searched for key terms.
Refer to the utility's online help for more information on using the Log Viewer.
One important parameter is the PCI Express bus bit transfer rate supported by the host computer,
which defines the maximum data rate possible in the computer.
Another important parameter is the internal Xtium FPGA temperature, which, if excessive, may
explain erratic acquisitions due to poor computer ventilation.
The closure (collapse or horizontal shortening) of the eye surface would indicate problems such as
a poor signal-to-noise ratio, high cable capacitance, or multipath interference, among many
possible digital transmission faults.
if (pBuffer->Create())
{
// Buffer object is correctly initialized
}
if (buffer.Create())
{
// Buffer object is correctly initialized
// Destroy the buffer resources
buffer.Destroy();
}
if (pBuffer->Create())
{
pBuffer->Destroy();
}
Sapera LT ++ objects that do not encapsulate management of Standard API resources are
correctly initialized as soon as their constructor has been called.
SapDataMono data(123);
if (buffer.Create())
{
// Buffer object is correctly initialized
}
if (pBuffer->Create())
{
// Buffer object is correctly initialized
}
Monitoring Errors
No matter which reporting mode is currently active, it is always possible to retrieve the latest error
message. If the error happened when Sapera LT ++ called a Standard API function, then a related
numeric code is also available. To retrieve this information, call the SapManager::GetLastStatus
method as follows:
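A sketch of such a call is shown below; the string return form of GetLastStatus used here is an
assumption, and the exact prototype is given in the Sapera LT ++ Programmer's Manual:

// Sketch only: report the latest Sapera LT status after a failed call
if (!pBuffer->Create())
{
   // GetLastStatus is assumed here to return the latest status text
   printf("Buffer creation failed: %s\n", SapManager::GetLastStatus());
}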
In addition, the Sapera Log Viewer utility program, included with Sapera LT, provides an easy way
to view error messages. It includes a list box that stores these messages as soon as the errors
happen. Available options allow you to modify the different fields for display.
During development it is recommended to start the Log Viewer before your application and then let
it run so it can be referred to any time a detailed error description is required. However, errors are
actually stored by the Sapera Log Server (running in the background), even if the utility is not
running. Therefore it is possible to start the Log Viewer only when a problem occurs with your
application.
try
{
// Code that possibly generates an error
}
catch (SapLibraryException exception)
{
// Exception handling code
}
try
{
// Code that possibly generates an error
}
catch (SapLibraryException^ exception)
{
// Exception handling code
}
Monitoring Errors
No matter which reporting mode is currently active, it is always possible to retrieve the latest error
message. If the error happened when Sapera LT .NET called a Standard API function, then a
related numeric code is also available. To retrieve this information, use the LastStatusMessage and
LastStatusCode properties of the SapManager class.
In addition, the Sapera Log Viewer utility program included with Sapera LT provides an easy way to
view error messages. It includes a list box that stores these messages as soon as the errors
happen. Available options allow you to modify the different fields for display.
During development it is recommended to start the Log Viewer before your application and then let
it run so it can be referred to any time a detailed error description is required. However, errors are
actually stored by the Sapera Log Server (running in the background), even if the utility is not
running. Therefore it is possible to start the Log Viewer only when a problem occurs with your
application.
What is a Capability?
A capability, as its name implies, is a value or set of values that describes what a resource can do.
Capabilities are used to determine the possible valid values that can be applied to a resource's
parameters. They are read-only.
A capability can be obtained from a resource by using the GetCapability method in the
corresponding class. See the Sapera LT ++ Programmer’s Manual or the Sapera LT .NET
Programmer’s Manual for details.
What is a Parameter?
A parameter describes a current characteristic of a resource. It can be read/write or read-only.
A parameter for a resource can be obtained or set by using the GetParameter and SetParameter
methods in the corresponding class. See the Sapera LT ++ Programmer’s Manual or the Sapera LT
.NET Programmer’s Manual for details.
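For example, in Sapera LT ++ a capability can be checked before accessing the related parameter;
the sketch below assumes the capability constant CORACQ_CAP_EXT_TRIGGER and the argument
forms shown for GetCapability and GetParameter (pAcq is an existing SapAcquisition object, and
CORACQ_PRM_EXT_TRIGGER_ENABLE / CORACQ_VAL_EXT_TRIGGER_ON are the constants used
elsewhere in this manual):

// Sketch: check the external trigger capability, then read and write the related parameter
BOOL extTriggerAvailable = FALSE;
if (pAcq->GetCapability(CORACQ_CAP_EXT_TRIGGER, &extTriggerAvailable) && extTriggerAvailable)
{
   UINT32 triggerEnable = 0;
   pAcq->GetParameter(CORACQ_PRM_EXT_TRIGGER_ENABLE, &triggerEnable);            // read current value
   pAcq->SetParameter(CORACQ_PRM_EXT_TRIGGER_ENABLE, CORACQ_VAL_EXT_TRIGGER_ON); // write new value
}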
• SapAcquisition or SapAcqDevice: Use the SapAcquisition class if you are using a frame
grabber. Use the SapAcqDevice class if you are using a camera directly connected to your
PC, such as a Teledyne DALSA Genie camera.
• SapBuffer: Used to store the acquired data. Should be created using the ScatterGather
(preferable) or Contiguous buffer type to enable the transfer. See section Working with
Buffers for more information about contiguous memory and scatter-gather.
• SapTransfer: Used to link the acquisition device to the buffer and to synchronize the
acquisition operations.
Acquiring an image requires one file (the CCF file) to configure the acquisition hardware. It defines
both the characteristics of the camera and how it will be used with the acquisition hardware. Use
CamExpert to generate this file. Resource parameters can also be accessed individually.
After the acquisition module is initialized using the CCF file, a compatible buffer can be created
using settings taken directly from the acquisition.
Before starting the actual transfer, you must create a transfer object to link the acquisition and the
buffer objects. Furthermore when stopping a transfer, you must call the SapTransfer::Wait method
to wait for the transfer process to terminate.
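A minimal sketch of this sequence is shown below; the board name "Xtium-CL_PX4_1" and the
configuration file "MyCamera.ccf" are placeholders, and error handling is omitted:

// Sketch only: frame grabber acquisition into a compatible buffer
#include "SapClassBasic.h"

int main()
{
   // Allocate and create the acquisition object from a CCF file
   SapAcquisition acq(SapLocation("Xtium-CL_PX4_1", 0), "MyCamera.ccf");
   BOOL success = acq.Create();

   // Create a buffer compatible with the acquisition settings
   SapBuffer buffer(2, &acq);
   success = buffer.Create();

   // Link the acquisition to the buffer with a specialized transfer object
   SapAcqToBuf transfer(&acq, &buffer);
   success = transfer.Create();

   // Acquire one frame, then wait for the transfer process to terminate
   success = transfer.Snap();
   success = transfer.Wait(5000);

   // Release resources in reverse order of creation
   transfer.Destroy();
   buffer.Destroy();
   acq.Destroy();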
return 0;
}
For more details, see the Sapera LT ++ Programmer’s Manual and the source code for the demos
and examples included with Sapera LT.
Acquiring an image requires one file (the CCF file) to configure the acquisition hardware. It defines
both the characteristics of the camera and how it will be used with the acquisition hardware. Use
CamExpert to generate this file. Resource parameters can also be accessed individually.
After the acquisition module is initialized using the CCF file, a compatible buffer can be created
using settings taken directly from the acquisition.
Before initiating the actual transfer you must create a transfer object to link the acquisition and the
buffer objects. Furthermore, when stopping a transfer, you must call the Wait method in the
SapTransfer class to wait for the transfer process to terminate.
transfer.Pairs[0].EventType = SapXferPair.XferEventType.EndOfFrame;
transfer.XferNotify += new SapXferNotifyHandler(SapTransfer_XferNotify);
transfer.XferNotifyContext = view;
pTransfer->Pairs[0]->EventType = SapXferPair::XferEventType::EndOfFrame;
pTransfer->XferNotify += gcnew SapXferNotifyHandler(SapTransfer_XferNotify);
pTransfer->XferNotifyContext = pView;
For detailed information see the source code for the Sapera LT .NET demos and examples included
with Sapera LT.
pAcq->SetParameter(CORACQ_PRM_EXT_TRIGGER_LEVEL, CORACQ_VAL_LEVEL_TTL);
pAcq->SetParameter(CORACQ_PRM_EXT_TRIGGER_ENABLE, CORACQ_VAL_EXT_TRIGGER_ON);
pAcq->SetParameter(CORACQ_PRM_EXT_TRIGGER_DETECTION, CORACQ_VAL_RISING_EDGE);
You may then retrieve it using the SapAcquisition::GetLut method, manipulate it using the
methods in the SapLut Class, and reprogram it using the SapAcquisition::ApplyLut method.
The internal SapLut object is automatically destroyed when you call the SapAcquisition::Destroy
method. The following code is an example of these steps.
Sample Code
// Allocate and create resources for acquisition object
SapAcquisition *pAcq =
new SapAcquisition(SapLocation("X64-CL_1", 0), "MyCamera.ccf");
BOOL success = pAcq->Create();
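Continuing this example, the LUT steps described above could be sketched as follows; the pointer
return form of GetLut and the Reverse manipulation are assumptions (any SapLut method can be
used at that point):

// Sketch: retrieve the acquisition LUT, modify it, then program it back to the hardware
SapLut *pLut = pAcq->GetLut();
if (pLut != NULL)
{
   pLut->Reverse();              // example manipulation (assumed SapLut method)
   success = pAcq->ApplyLut();   // reprogram the hardware LUT
}

// The internal SapLut object is released automatically when calling pAcq->Destroy()
success = pAcq->Destroy();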
You may then retrieve it using the Lut property of SapAcquisition, manipulate it using the methods
in SapLut, and reprogram it using the ApplyLut method of SapAcquisition.
The internal SapLut object is automatically destroyed when you call the Destroy method of
SapAcquisition.
After the SapAcqDevice class is initialized (with or without using a configuration file), certain
parameters are retrieved from it (acquisition width, height, and format) to create a compatible
buffer.
Before starting the transfer, you must create a transfer path between the SapAcqDevice class and
the SapBuffer class using one of the SapTransfer’s derived classes (SapAcqDeviceToBuf in this
case). Furthermore, when requesting a transfer stop, you must call SapTransfer::Wait to wait for
the transfer process to terminate completely.
// Example program
//
int main()
{
// Allocate acquisition object
SapAcqDevice *pAcq =
new SapAcqDevice("Genie_M640_1", FALSE); // uses camera default settings
//new SapAcqDevice("Genie_M640", "MyCamera.ccf"); // loads configuration file
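BOOL success = pAcq->Create();

// The remaining steps below are a minimal sketch (assumed typical usage): create a
// compatible buffer and a specialized transfer object, grab, then release resources.
SapBuffer *pBuffer = new SapBuffer(2, pAcq);
success = pBuffer->Create();

SapAcqDeviceToBuf *pTransfer = new SapAcqDeviceToBuf(pAcq, pBuffer);
success = pTransfer->Create();

success = pTransfer->Grab();      // start continuous acquisition
// ... process or display images here ...
success = pTransfer->Freeze();    // request a transfer stop
success = pTransfer->Wait(5000);  // wait for the transfer to terminate

pTransfer->Destroy(); delete pTransfer;
pBuffer->Destroy();   delete pBuffer;
pAcq->Destroy();      delete pAcq;

return 0;
}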
After the SapAcqDevice object is initialized (with or without a configuration file), certain
parameters are retrieved from it (acquisition width, height, and format) to create a compatible
buffer.
Before starting the transfer, you must create a transfer path between the SapAcqDevice and the
SapBuffer objects using the SapTransfer class or one of the specialized transfer classes
(SapAcqDeviceToBuf in this case). Furthermore when requesting a transfer stop, you must call the
Wait method of SapTransfer to wait for the transfer process to terminate completely.
transfer.Pairs[0].EventType = SapXferPair.XferEventType.EndOfFrame;
transfer.XferNotify += new SapXferNotifyHandler(SapTransfer_XferNotify);
transfer.XferNotifyContext = view;
pTransfer->Pairs[0]->EventType = SapXferPair::XferEventType::EndOfFrame;
pTransfer->XferNotify +=
gcnew SapXferNotifyHandler(SapTransfer_XferNotify);
pTransfer->XferNotifyContext = pView;
return 0;
}
In some circumstances, a set of feature values is tightly coupled and must be written to the camera
at the same time. The next section shows how to proceed in such a case.
//
// Main Program
//
int main()
{
BOOL status;
//
// Feature is a boolean
case SapFeature::TypeBool:
{
BOOL value;
if (featureAccessMode == SapFeature::AccessRW)
{
status = camera.GetFeatureValue(featureIndex, &value);
value = !value;
status = camera.SetFeatureValue(featureIndex, value);
}
}
break;
//
// Example 2 : Access specific feature (integer example)
//
// Get feature object
status = camera.GetFeatureInfo("Gain", &feature);
//
// Example 3 : Access specific feature (enumeration example)
//
// Example 4 : Access specific feature (LUT example)
//
// Select a LUT and retrieve its size and format
UINT32 lutNEntries, lutFormat;
//
// Example 5 : Callback management
//
// Browse event list
int numEvents;
status = camera.GetEventCount(&numEvents);
int eventIndex;
for (eventIndex = 0; eventIndex < numEvents; eventIndex++)
{
char eventName[64];
status = camera.GetEventNameByIndex(eventIndex, eventName, sizeof(eventName));
}
// Write features to device (by reading values from the internal cache)
success = pAcq->UpdateFeaturesToDevice();
//
// Example 1 : Browse through the feature list
//
for (int featureIndex = 0; featureIndex < featureCount; featureIndex++)
{
// Get information from current feature
// Get feature object
success = acqDevice.GetFeatureInfo(featureIndex, feature);
// Feature is a boolean
case SapFeature.Type.Bool:
{
bool localFeatureValue;
if (featureAccessMode == SapFeature.AccessMode.RW)
{
success = acqDevice.GetFeatureValue(
featureIndex, out localFeatureValue);
localFeatureValue = !localFeatureValue;
success = acqDevice.SetFeatureValue(
featureIndex, localFeatureValue);
}
}
break;
//
// Example 2 : Access specific feature (integer example)
//
// Get feature object
success = acqDevice.GetFeatureInfo("Gain", feature);
//
// Example 3 : Access specific feature (enumeration example)
//
// Get feature object
success = acqDevice.GetFeatureInfo("ExposureMode", feature);
//
// Example 4 : Access specific feature (LUT example)
//
// Select a LUT and retrieve its size and format
int numLutEntries;
int lutFormat;
// This cast is OK, because the "LUTFormat" feature uses the same values
// as the SapFormat enumeration
SapFormat saperaLutFormat = (SapFormat)lutFormat;
//
// Example 5 : Callback management
//
// Get all event names
string[] eventNames = acqDevice.EventNames;
// Example program
int main(array<String ^>^ args)
{
// Allocate acquisition object using default camera settings,
// and create resources
SapAcqDevice^ pAcqDevice =
gcnew SapAcqDevice(gcnew SapLocation("Genie_M640_1", 0));
bool success = pAcqDevice->Create();
// Feature is a boolean
success = pFeature->GetValueMin(featureValueMin);
success = pFeature->GetValueMax(featureValueMax);
success = pFeature->GetValueIncrement(featureValueIncrement);
// This cast is OK, because the "LUTFormat" feature uses the same values
// as the SapFormat enumeration
SapFormat saperaLutFormat = static_cast<SapFormat>(lutFormat);
//
// Example 5 : Callback management
//
// Get all event names
array<String^>^ eventNames = pAcqDevice->EventNames;
// Write features to device (by reading values from the internal cache)
success = acqDevice.UpdateFeaturesToDevice();
// Write features to device (by reading values from the internal cache)
success = pAcqDevice->UpdateFeaturesToDevice();
Display Examples
The example below illustrates how to display an image contained within a system buffer on the
computer VGA card. The buffer is transferred to the Windows Desktop using the DIB mode
(automatically detected by the SapView Class). When using this mode, a Windows Device-
Independent Bitmap (DIB) is first created before being sent to VGA memory.
For more details, see the Sapera LT ++ Programmer’s Manual or the Sapera LT .NET Programmer’s
Manual.
// Allocate and create view object, images will be displayed directly on the desktop
SapView *pView = new SapView(pBuffer, SapHwndDesktop);
success = pView->Create();
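Once the view object is created, the buffer content can be drawn by calling its Show method (a
minimal usage sketch):

// Draw the current buffer content in the view (here, directly on the desktop)
pView->Show();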
However, these methods often do not offer a sufficient level of management (especially for scroll
bars) following changes to the display area. It is therefore recommended that you use the methods
with the same names in the CImageWnd Class instead. Note that this class is part of the GUI
classes. Below is a partial listing of a dialog-based Windows application.
CSaperaAppDlg::CSaperaAppDlg()
{
m_pBuffer = NULL;
m_pView = NULL;
m_pImageWnd = NULL;
// Other initialization
...
}
BOOL CSaperaAppDlg::OnInitDialog()
{
// Call default handler
CDialog::OnInitDialog();
// Other initialization
...
// Allocate and create view object, images will be displayed in the MFC CWnd
// object identified by m_ViewWnd
m_pView = new SapView(m_pBuffer, m_ViewWnd.GetSafeHwnd());
success = m_pView->Create();
return TRUE;
}
void CSaperaAppDlg::OnDestroy()
{
// Release resources for all objects
BOOL success = m_pView->Destroy();
success = m_pBuffer->Destroy();
}
void CSaperaAppDlg::OnPaint()
{
if (IsIconic())
{
...
}
else
{
// Call the default handler to paint a background
CDialog::OnPaint();
// Display last acquired image
m_pImageWnd->OnPaint();
}
}
For more details, see the Sapera LT ++ Programmer’s Manual and the source code for the demos
and examples included with Sapera LT.
However, these methods often do not offer a sufficient level of management (especially for scroll
bars) following changes to the display area. The Sapera LT .NET demos include an ImageBox class
(with source code included) for this purpose. Below are partial listings that use the ImageBox class.
// Allocate and create view object, images will be displayed in the form
SapView view = new SapView(buffer, form);
success = view.Create();
// Example program
int main(array<String ^>^ args)
{
// Allocate and create a 640x480x8 buffer object
SapBuffer^ pBuffer = gcnew SapBuffer(1, 640, 480, SapFormat::Mono8,
SapBuffer::MemoryType::ScatterGather);
bool success = pBuffer->Create();
// Allocate and create view object, images will be displayed in the form
SapView^ pView = gcnew SapView(pBuffer, pForm);
success = pView->Create();
return 0;
}
For more details, see the Sapera LT .NET Programmer’s Manual and the source code for the demos
and examples included with Sapera LT.
A SapBufferRoi object shares the same memory space as its parent, and it defines an adjustable
rectangular region of interest. A child may be used by acquisition to reduce bandwidth
requirements, or by a processing function in order to process a specific region.
// Use buffers
...
You may modify the origin and dimensions of the region of interest for a child buffer object before
calling its Create method. The following example demonstrates this concept.
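For instance, the two child objects used below might be allocated as follows; this is a sketch in
which pBuffer is an existing 640x480 parent SapBuffer object and the SapBufferRoi constructor
argument order (parent, x, y, width, height) is an assumption to verify against the class reference:

// Sketch: create two child ROI objects covering parts of the parent buffer
SapBufferRoi *pChild1 = new SapBufferRoi(pBuffer, 0, 0, 320, 240);
SapBufferRoi *pChild2 = new SapBufferRoi(pBuffer, 320, 0, 320, 240);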
// Swap left and right children, and make their height the same as the parent
success = pChild1->SetRoi(320, 0, 320, 480);
success = pChild2->SetRoi(0, 0, 320, 480);
// Use buffers
// ...
//*******************************************************
//
// You may modify the origin and dimensions of the region of interest for a child
// buffer object before calling its Create method. The following C# example
// demonstrates this concept.
// Swap left and right children, and make their height the same as the parent
success = child1.SetRoi(320, 0, 320, 480);
success = child2.SetRoi(0, 0, 320, 480);
// Use buffers
// ...
//*******************************************************
//
// You may modify the origin and dimensions of the region of interest for a child
// buffer object before calling its Create method. The following C++ example
// demonstrates this concept.
// Swap left and right children, and make their height the same as the parent
success = pChild1->SetRoi(320, 0, 320, 480);
success = pChild2->SetRoi(0, 0, 320, 480);
The SapBuffer.Copy or SapBuffer.SplitComponents methods can be used to extract either the RGB
or monochrome (IR) component into a separate buffer. The SapBuffer.MergeComponents method
can be used to merge separate components into the multiformat buffer type.
When using the SapBuffer.ReadElement method to access the buffer, the data is extracted as a
SapDataRGBA object, with the A component representing the monochrome (IR) portion.
If accessing the memory directly, for each line in the buffer, the first ¾ (left side) represents the
RGB data and the last ¼ (right side) represents the monochrome (IR) data.
Multiformat buffers use 2 pages: one page for the RGB component and one page for the monochrome
(IR) component. When displaying multiformat buffers with the SapView class, use the AllPage and
Page properties to manage the current (active) page of the buffer (RGB or monochrome) to
display. The active page only applies when choosing which format to display when calling the
SapView.Show function.
For load and save operations, multiformat buffers only support the CRC and RAW formats.
The fastest way to access buffer data is to obtain direct access through a pointer. The GetAddress
and ReleaseAddress methods initiate and end direct data access, respectively. The drawback of
this method is that you need to know the buffer dimensions, format, and pitch in order to correctly
access the data. The following code illustrates this.
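A sketch of such direct access in Sapera LT ++ is shown below (pBuffer is an existing Mono8
SapBuffer object; the GetPitch and GetHeight accessors used are assumed to return the values as
integers):

// Sketch: access the buffer data directly through a pointer
void *pData = NULL;
if (pBuffer->GetAddress(&pData))
{
   int pitch  = pBuffer->GetPitch();    // number of bytes between two consecutive lines
   int height = pBuffer->GetHeight();

   BYTE *pLine = (BYTE *)pData;
   for (int y = 0; y < height; y++)
   {
      // Read or write the pixels of the current line through pLine[] here
      pLine += pitch;
   }

   // Always release the address once direct access is no longer needed
   pBuffer->ReleaseAddress(pData);
}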
For more information on buffer data access functionality, see the Sapera LT ++ Programmer’s
Manual.
Example Code in C#
// Allocate and create a 640x480x8 buffer object
SapBuffer buffer = new SapBuffer(1, 640, 480, SapFormat.Mono8,
SapBuffer.MemoryType.ScatterGather);
bool success = buffer.Create();
The fastest way to access buffer data is to obtain direct access through a pointer. The GetAddress
and ReleaseAddress methods initiate and end direct data access, respectively. The drawback of this
method is that you need to know the buffer dimensions, format, and pitch in order to correctly
access the data.
Example Code in C#
// Allocate and create a 640x480 RGB 5-6-5 buffer object
SapBuffer buffer = new SapBuffer(1, 640, 480, SapFormat.RGB565,
SapBuffer.MemoryType.ScatterGather);
bool success = buffer.Create();
unsafe
{
    // Access the buffer data directly through a pointer inside this unsafe block
}
For more information on buffer data access functionality, see the Sapera LT .NET Programmer’s
Manual.
However, this only allows for processing of all acquired images when the average processing time
is less than the time required to acquire one image. When processing cannot keep up with the
acquisition frame rate, it is often useful to have a special buffer, not part of the circular list, for
throwing away images that cannot be processed. In Sapera LT, this is called the trash buffer.
• Use one instance of the SapBuffer or SapBufferWithTrash classes, both of which can be
given a buffer count
API support for linking buffers together:
• Use one instance of the SapTransfer class, or one of its derived classes, for example,
SapAcqToBuf or SapAcqDeviceToBuf.
Auto-Empty Mechanism
Refers to an application-configurable mechanism by which the buffer state is automatically set to
empty.
• (Case 1) If the next buffer is empty, then transfer to the next buffer
• (Case 2) If the next buffer is full, then transfer to the current buffer
• Ignore the presence of a trash buffer
Example of case 2:
• (Case 1) If the next buffer is empty, then transfer to the next buffer
• (Case 2) If the next buffer is full, transfer to next empty buffer in the list
• (Case 3) If all buffers are full, then transfer to the current buffer
• Ignore the presence of a trash buffer
Example of case 1:
Example of case 2:
• (Case 1) If the next buffer is empty, then transfer to the next buffer
• (Case 2) If the next buffer is full, then transfer to the trash buffer
• (Case 3) Repeat transferring to the trash buffer as long as the next buffer is full
• Buffer state is irrelevant for the trash buffer
Example of case 1:
Example of case 2:
Example of case 3:
• (Case 1) If the next buffer is empty, then transfer to the next buffer
• (Case 2) If the next buffer is full, transfer to next empty buffer in the list
• (Case 3) If all buffers are full, then transfer to trash buffer
• (Case 4) Repeat transferring to the trash buffer as long as all buffers are full
• Buffer state is irrelevant for the trash buffer
Example of case 1:
Example of case 2:
Example of case 3:
• (Sapera LT ++) Use the GetCycleMode and SetCycleMode methods in the SapXferPair
class
• (.NET) Use the Cycle property in the SapXferPair class
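For example, in Sapera LT ++ the cycle mode of the first transfer pair could be set as follows; this
is a sketch, and the enumeration value name CycleSynchronousWithTrash is an assumption (the
available cycle mode values are listed in the SapXferPair class reference):

// Sketch: select the buffer cycling behaviour on the first transfer pair
SapXferPair *pPair = pTransfer->GetPair(0);
if (pPair != NULL)
   pPair->SetCycleMode(SapXferPair::CycleSynchronousWithTrash);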
Example 1: The application only needs to read timestamp information of acquired images
Example 2: The application only needs to process acquired images (no display)
Example 3: The application needs to process acquired images before displaying the resulting
processed images
Flat field correction uses 2 coefficients (offset and gain) per pixel to compensate for fixed pattern
noise (FPN) and photo response non-uniformity (PRNU).
• FPN is the variation in pixel response without incident light (also known as dark current). It
is a noise signal generated by the background voltage present in the sensor. The flat field
offset coefficients are used to correct for this noise. To perform FPN calibration using
SapFlatField::ComputeOffset, a number of dark images are averaged (i.e., all light is
blocked from entering the sensor using the lens cap). The percentage of zero pixels allowed
in the averaged images can be set using the SapFlatField::SetBlackPixelPercentage method (too
many zero pixels indicates the camera’s black level is too high and information is being
clipped; adjust the camera settings accordingly).
• PRNU is the variation in pixel response to a uniform amount of light. The flat field gain
coefficients are used to correct for this response non-uniformity such that all pixels output
the same value when exposed to the same incident light. To perform PRNU calibration using
SapFlatField::ComputeGain, a number of white images are averaged, such that the
camera is close to saturation, but not saturated. The gain coefficient is calculated for each
pixel such that it reaches a specified target value below saturation.
For both FPN and PRNU calibration, the greater the number of images averaged, the lesser the
effects of random noise.
[Block diagram: flat field correction path. The offset coefficient is subtracted from the input video
and the result is multiplied by the gain coefficient; the system offset and gain are then applied to
produce the output video.]
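In simplified form (ignoring the subsequent system offset and gain), the per-pixel correction
applied by flat field processing can be written as:

Output pixel (Xn, Yn) = (Input pixel (Xn, Yn) - offset coefficient (Xn, Yn)) x gain coefficient (Xn, Yn)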
The ComputeOffset function must be called before the ComputeGain function. To apply the
software flat field correction on an image, use the SapFlatField::Execute function. For hardware
flat field correction, the flat field correction file is loaded to the device and enabled on the
hardware; refer to the device documentation for more information.
The system offset and gain applied after the flat field correction are typically used to maximize the
image dynamic range for the typical image scene for the application.
The 8-bit or 16-bit format is determined by the format of the buffer passed to the
SapFlatField::ComputeOffset / ComputeGain functions. 16-bit files are used for 10, 12, 14, or
16-bit output format. In general, the sensor’s highest output format should be used to calibrate the
flat field coefficients. A 16-bit flat field coefficient file can be used with lower output formats by
setting an offset factor (SapFlatField::SetOffsetFactor).
Gain Divisor
For 8-bit gain coefficients, the gain divisor is typically equal to 128, so that a stored gain value
between 0 and 255 becomes an effective gain between 0 and 2; for example, a stored value of 192
with a divisor of 128 corresponds to a gain of 192/128 = 1.5. It is then set to the acquisition device
gain divisor value when calling the Create method (the SapFlatField::SetGainDivisor method is only
used when operating without hardware support). The gain divisor and gain base are used to convert
a floating point gain value to an integer value that can be saved in a .TIFF image.
Gain Base
The gain base, if supported by the acquisition device (for example, the Genie TS), is retrieved
from the device after calling the Create method. For all other acquisition devices, and for software-
based flat field correction, the initial value of this attribute is 0, and the application code can call
SetGainBase if required.
Offset coefficient (Xn, Yn) = average pixel value (Xn, Yn) - DN value of ~3σ
This method preserves the dynamic range and reduces the number of pixels that are clipped at
zero (which results in loss of image data, even if offset and gain are subsequently applied to adjust
the black threshold).
The SapFlatField::SetOffsetMinMax method can be used to limit the possible offset values. If pixels
reach this limit, they are flagged as defective when SapFlatField::EnableClippedGainOffsetDefects =
TRUE (default).
Gain Coefficients
Gain coefficients are calculated after offset coefficients are applied. Gain coefficients are calculated
such that all pixels reach the specified target value (or the maximum pixel value in the white
image). The SapFlatField::SetGainMinMax can be used to limit the possible gain values. If pixels
reach this limit, they are flagged as defective when
SapFlatField::EnableClippedGainOffsetDefects = TRUE (default).
Pixels that deviate from the average by more than the maximum deviation are considered defective
pixels. By default, the maximum deviation is 0.25 x the maximum pixel value (for example, for
8-bit images the maximum deviation is 63).
The maximum deviations for the black and white images are set using the
SapFlatField::SetDeviationMaxBlack / SetDeviationMaxWhite functions.
Pixel replacement is enabled/disabled using SapFlatField::EnablePixelReplacement. Pixels are
replaced using the pixel to its immediate left, other than the first pixel of a line, which uses the
pixel to the right.
2. The lens should be at the required magnification and aperture and slightly unfocused to avoid
introducing granularity or details in the reference image (when calibration is complete, refocus
the lens).
3. As the white reference is located at the object plane, any markings or contaminants on its
surface (that is, dust, scratches, smudges) will end up in the calibration profile of the camera.
To avoid this, use a clean white plastic or ceramic material rather than trying to rely on a paper
reference. (Ideally, the white object will be moving during the calibration process, as the
averaging process of the camera will diminish the effects of any small variation in the white
reference.)
4. Adjust the system gain until the peak intensity is at the desired DN level and then calibrate the
fixed pattern noise (FPN) using the SapFlatField::ComputeOffset function. Use a lens cap to
ensure that no light reaches the sensor.
5. Once complete, remove the lens cap and perform a photo response non-uniformity (PRNU)
calibration using SapFlatField::ComputeGain with the desired target value (in DN). You want
all the pixels to match. This target value should be higher than the peak values you saw while
first setting up the camera.
// Rely on the SapFlatField class to automatically create the offset and gain buffers with the
// correct dimensions and format, but perform the calibration manually
// pAcquisition is an existing SapAcquisition object
// pBuffer is an existing SapBuffer object containing an acquired image.
BYTE* pBufData;
success = pBuffer->GetAddress(&pBufData);
BYTE* pOffsetData;
success = pBufferOffset->GetAddress(&pOffsetData);
BYTE* pGainData;
success = pBufferGain->GetAddress(&pGainData);
success = pFlatField->Destroy();
delete pFlatField;
Runtime Installations
Two types of Sapera LT runtime installations are available when deploying your application:
In the case of frame grabbers, the appropriate device driver must be installed along with the
installation of the Sapera LT runtimes.
In the case of frame grabbers, the required device drivers must be downloaded from the Teledyne
DALSA website and installed separately.
Teledyne DALSA's installation programs for Sapera LT can be started in two ways:
• Normal Mode. This is the interactive mode provided by default. It is initiated by invoking one
of the Sapera LT Setup.exe programs. The installation proceeds normally as if it were started
from Windows Explorer or the Windows command line.
Frame grabbers: Regardless of installation mode, you must reboot after the installation
of Sapera LT. However, to streamline the installation process with frame grabbers, you
may install Sapera LT and the required device drivers before rebooting.
During driver installation, Windows Digital Signature and Logo Testing warnings, if any,
can be safely ignored.
• Running a silent mode installation by invoking the Sapera LT installation program with
command options to use the response file.
As soon as you upgrade your Sapera LT version, you must recreate your response files.
As an example, to create the setup.iss response file in the \Windows folder, use the following
command line:
SaperaLTRuntimeSetup.exe -r
To then install the Sapera LT runtime in silent mode using a response file, use the following
command line:
SaperaLTRuntimeSetup.exe -s -f1".\setup.iss"
where the –s switch specifies the silent mode and the –f1 switch specifies the name and location
of the response file. In this example, the setup.iss file is in the same folder as the Sapera LT
installer.
As an example, to create the setup_uninstall.iss response file in the same directory as the
installation program of the Sapera LT runtime, use the following command line:
SaperaLTRuntimeSetup.exe –r –f1".\setup_uninstall.iss"
To uninstall the Sapera LT runtime in silent mode, use the following command line:
SaperaLTRuntimeSetup.exe -s -f1".\setup_uninstall.iss"
where the –s switch specifies the silent mode and the –f1 switch specifies the name and location
of the response file. In this example, the setup_uninstall.iss file is in the same folder as the
Sapera LT installer.
The diagram below shows the runtime architecture dependency. Compiler Runtime X provides
support for the calls made by your application to the standard compiler libraries (direct calls), while
Compiler Runtime Y provides support for the calls made by the Sapera LT library to the standard
compiler libraries (indirect calls). Several compiler versions can coexist in the same target system.
The following sections provide sales and technical support contact information.
Sales Information
Visit our web site: www.teledynedalsa.com
Email: [email protected]
Technical Support
Submit any support question or request via our web site:
When encountering hardware or software problems, please have the following documents included
in your support request:
The Sapera Log Viewer and PCI Diagnostics tools are available from the Windows Start menu
under Teledyne DALSA Sapera LT.
The Device Manager utility is available as part of the driver installation for your Teledyne
DALSA device and is available from the Windows Start menu under Teledyne DALSA >
Device Name > Device Manager.