KST VT
KUKA.VisionTech 4.0
Issued: 11.12.2017
© Copyright 2017
KUKA Roboter GmbH
Zugspitzstraße 140
D-86165 Augsburg
Germany
This documentation or excerpts therefrom may not be reproduced or disclosed to third parties without
the express permission of KUKA Roboter GmbH.
Other functions not described in this documentation may be operable in the controller. The user has
no claims to these functions, however, in the case of a replacement or service work.
We have checked the content of this documentation for conformity with the hardware and software described. Nevertheless, discrepancies cannot be precluded, for which reason we are not able to guarantee total conformity. The information in this documentation is checked on a regular basis, however, and necessary corrections will be incorporated in the subsequent edition.
Subject to technical alterations without an effect on the function.
KIM-PS5-DOC
Translation of the original documentation
Contents
1 Introduction .................................................................................................. 11
1.1 Industrial robot documentation ................................................................................... 11
1.2 Representation of warnings and notes ...................................................................... 11
1.3 Trademarks ................................................................................................................ 11
1.4 Licenses ..................................................................................................................... 12
1.5 Terms used ................................................................................................................ 12
2 Purpose ........................................................................................................ 13
2.1 Target group .............................................................................................................. 13
2.2 Intended use .............................................................................................................. 13
5 Safety ............................................................................................................ 39
5.1 General safety measures ........................................................................................... 39
5.2 Standards and regulations ......................................................................................... 39
6 Planning ........................................................................................................ 41
6.1 Connecting cables and interfaces ............................................................................. 41
7 Transportation ............................................................................................. 49
7.1 Transportation ........................................................................................................... 49
9 Operation ...................................................................................................... 59
9.1 Menus ........................................................................................................................ 59
10 Start-up ......................................................................................................... 61
10.1 Networking ................................................................................................................. 61
10.1.1 Networking KR C4 with interface X64, stationary ................................................. 61
10.1.2 Networking KR C4 with cable inlet, stationary ...................................................... 61
10.1.3 Networking KR C4 with KUKA IPC and cameras, stationary ............................... 62
10.1.4 Networking KR C4 with interface X64, robot-guided ............................................ 62
10.1.5 Networking KR C4 with cable inlet, robot-guided ................................................. 63
10.1.6 Networking KR C4 with KUKA IPC and cameras, robot-guided ........................... 63
10.1.7 Networking KR C4 compact, stationary ................................................................ 63
10.1.8 Networking KR C4 compact with KUKA IPC and cameras, stationary ................. 64
10.2 Putting cameras into operation – overview ................................................................ 64
10.3 Configuring the KUKA IPC ........................................................................................ 65
10.4 Setting the IP address of the KUKA IPC in WorkVisual ............................................ 65
10.5 Configuring Ethernet KRL .......................................................................................... 66
10.6 Configuring cameras (smartHMI) .............................................................................. 66
10.7 Configuring cameras online (WorkVisual) ................................................................. 68
10.8 Configuring cameras offline (WorkVisual) ................................................................. 69
10.9 Aligning cameras (smartHMI) .................................................................................... 70
10.10 Aligning cameras (WorkVisual) ................................................................................. 70
10.11 Setting the exposure time (smartHMI) ....................................................................... 71
10.12 Setting the exposure time (WorkVisual) .................................................................... 71
10.13 Focusing the lens (smartHMI) ................................................................................... 72
10.14 Focusing the lens (WorkVisual) ................................................................................. 72
11 Configuration ............................................................................................... 81
11.1 Configuring a measurement task – overview ............................................................. 81
11.2 Creating a task and taking images (stationary) .......................................................... 81
11.2.1 Relative and absolute position data ...................................................................... 82
11.3 Creating a task and taking images (moving) .............................................................. 83
11.4 Acquiring images via WorkVisual ............................................................................... 84
11.5 Setting up an image processing task in WorkVisual .................................................. 84
11.5.1 Creating a tool block file in WorkVisual ................................................................ 85
11.5.2 Loading tool block files from the robot controller or KUKA IPC ............................ 86
11.5.3 Transferring a tool block file to the robot controller or KUKA IPC ......................... 87
11.6 Configuring a 2D task ................................................................................................ 87
11.7 Generating a 2D model .............................................................................................. 90
11.7.1 2D model with a stationary camera ...................................................................... 90
11.7.2 2D model with a moving camera .......................................................................... 90
11.8 Testing a 2D task ....................................................................................................... 91
11.9 Configuring a 3D task ................................................................................................ 92
11.10 Generating a 3D model .............................................................................................. 95
11.11 Testing a 3D task ....................................................................................................... 95
1 Introduction
Notices These notices serve to make your work easier or contain references to further
information.
1.3 Trademarks
1.4 Licenses
Product License
ImageProcessor 2.3.3 Apache_License_v2.0.txt
Term Description
BLOB Binary Large Object: contiguous area for which one attribute (e.g. brightness value) differs from that of the surroundings
Depalletizing Removing components from pallets
Deracking Removing components from racks
EKI Ethernet KRL interface
GenICam Generic Interface for Cameras: generic programming interface for cameras used in industrial image processing
GigE Gigabit Ethernet
Calibration pose Robot pose during calibration of a robot-guided camera
KLI KUKA Line Interface: connection to Ethernet network
KONI KUKA Option Network Interface
KUKA smartHMI User interface of the KUKA System Software (KUKA smart Human-Machine Interface)
PoE Power over Ethernet: power supply via network
Reference pose Robot pose for image acquisition
Reference position Position of a reference feature for verification of the calibration
Tool block Image processing task created using WorkVisual
2 Purpose
This documentation is aimed at users with the following knowledge and skills:
Expert knowledge of KRL programming
Advanced knowledge of the robot controller system
Advanced knowledge of network connections
Knowledge of the VisionPro software from Cognex (training)
Use KUKA.VisionTech is used to determine and correct the position of a robot relative to the position of a component with the aid of one or more cameras. Up to 3 cameras may be operated simultaneously with the system. Only hardware components approved by KUKA Roboter GmbH may be used. The hardware components must only be operated under the specified environmental conditions.
Misuse Any use or application deviating from the intended use is deemed to be impermissible misuse. This includes e.g.:
Operation outside the permissible operating parameters
Use in potentially explosive environments
Use in the vicinity of welding applications
Networking the camera network with a company network
3 Product description
3.1 Overview of VisionTech
The VisionTech option package comprises an image processing package and a plug-in for the KUKA smartHMI. The acquisition and processing of images is used to calculate a base correction. The base correction can be used to correct the position of the robot relative to the position of a component.
VisionTech can be used with both stationary and moving cameras. A stationary camera is fixed in its position, e.g. mounted on a stand or on the ceiling. A moving camera is mounted on the robot flange.
Communication The robot controller communicates with one or more GigE cameras via the image processing system. The connection between the kernel system and the image processing system is established via the Ethernet KRL interface.
1 KR C4 compact 3 Switch
2 Connecting cable 4 Cameras
Item Description
1 Network card for PoE device (connection: CH1)
2 Network card for PoE device (connection: CH2)
Item Description
3 Network card for PoE device (connection: CH3)
4 Network card for PoE device (connection: CH4)
5 Network card for KLI (connection: LAN1)
6 Network card for KSB and VLAN10 (connection: LAN2)
7 Power key
8 Power supply (19 V~24 V DC INPUT)
Overview There are 5 RJ45 connections on the KUKA GigE switch. Ports 1 to 4 are PoE-capable. KUKA MXG20 cameras can be connected to ports 1 to 3. Port 5 can be used to connect the KR C4 compact; this port is not PoE-capable. The switch must be integrated into a housing with a protection rating of IP 54 and supplied with a direct voltage of 12-36 V. The switch can be mounted on a TS 35 top-hat rail.
In the KR C4, the switch is installed and connected to the power supply of the KR C4. Up to 3 cameras can be connected to interfaces X64.1 to X64.3. Alternatively, the cameras can be connected directly to the switch in the KR C4 via a cable inlet.
DIP switches
Designation Description
DIP switch for “Power Alarm Relay Output” DIP switch 1: P1 error message; DIP switch 2: P2 error message (On: Activated, Off: Deactivated)
Changeover switch for port 6 (selection of SFP speed) DIP switches 1 and 2 on: 100 Mbps; DIP switches 1 and 2 off: 1000 Mbps
Overview The KUKA MXG20 camera conforms to the GigE standard and is PoE-capable. The image resolution is 2 megapixels. The camera can be powered via PoE or from an external power supply. A protective lens hood is mounted on the camera. C-mount lenses can be used (>>> 3.2.5 "KUKA lenses" Page 23).
A protective cap is mounted on the connection for the process interface / power supply. This may only be removed if a cable is connected to the interface.
1 LEDs
2 Process interface / power supply
3 Data/PoE interface
Assignment of the connections
LEDs
Overview The KUKA VCXG-25M camera conforms to the GigE standard and is PoE-capable. The image resolution is 2.3 megapixels. The camera can be powered via PoE or from an external power supply. C-mount lenses can be used (>>> 3.2.5 "KUKA lenses" Page 23).
The KUKA VCXG-25M camera must not be used for the robot-guided application.
Assignment of the connections
1 MX1+ 5 MX3-
2 MX1- 6 MX2-
3 MX2+ 7 MX4+
4 MX3+ 8 MX4-
Lenses in various sizes and of various focal lengths are available for the cameras. The lens required depends on the size of the component and the distance between the camera and the component.
The following must also be ensured when selecting the lens:
The lens fits on the camera
The lens matches the sensor
Only for the KUKA MXG20 camera: The lens fits into the protective lens hood (depth: 35 mm, can be enlarged using extension rings – 12 mm each)
The locking screws supplied for the iris and focus settings (2 Phillips screws) must be used. Otherwise, the settings might change, leading to errors in image processing.
1 Locking screw
Different connecting cables are required for the operation of one or more cameras. The connecting cables used conform to the CAT6 standard for network cables. The network cables have different connectors depending on the interface. The cables are available in various lengths.
Name Description
FindingOnePart Simple detection of a workpiece
PickZoneCheck Check that a zone is collision-free
PresenceCheck Check whether an object exists
FineLocate Fine search for a specific feature
Measurement Measurement of values (in a metric unit) on the input image
UserInputs Use of input parameters
CodeReading Reading of barcodes and data matrix codes
MultipleParts Detection of multiple workpieces
4 Technical data
Basic data
Power supply (external) Control voltage: 12 … 24 V DC ± 10%
Rated current: 176 … 252 mA
Power supply (PoE) Control voltage: 36 … 57 V DC
Rated current: 88 mA (with 48 V DC)
Energy consumption approx. 4.2 W
Digital input Voltage (min.): 0 … 4.5 V DC
Voltage (max.): 11 … 30 V DC
Rated current: 6.0 … 10 mA
Pulse length: min. 2.0 µs
Trigger delay outside of readout: 1 µs
Trigger delay during readout: 14 µs
Digital output Voltage: 5 … 30 V DC
Rated current: max. 50 mA
MTBF 335774 h @ 45 °C
Sensor size 1/1.8"
Protection rating IP 67
Weight 185 g
Conformity CE
Standards and guidelines
Name Definition
2004/108/EC EMC Directive: Directive 2004/108/EC of the European Parliament and of the Council of 15 December 2004 on the approximation of the laws of the Member States relating to electromagnetic compatibility and repealing Directive 89/336/EEC
EN 61000-6-2 Electromagnetic compatibility (EMC) – Part 6-2: Generic standards – Immunity for industrial environments
EN 61000-6-4 Electromagnetic compatibility (EMC) – Part 6-4: Generic standards – Emission standard for industrial environments
Standards and guidelines
Name Definition
2004/108/EC EMC Directive: Directive 2004/108/EC of the European Parliament and of the Council of 15 December 2004 on the approximation of the laws of the Member States relating to electromagnetic compatibility and repealing Directive 89/336/EEC
EN 61000-6-2 Electromagnetic compatibility (EMC) – Part 6-2: Generic standards – Immunity for industrial environments
EN 61000-6-4 Electromagnetic compatibility (EMC) – Part 6-4: Generic standards – Emission standard for industrial environments
Standards and guidelines
Name Definition
2004/108/EC EMC Directive: Directive 2004/108/EC of the European Parliament and of the Council of 15 December 2004 on the approximation of the laws of the Member States relating to electromagnetic compatibility and repealing Directive 89/336/EEC
EN 55022 Information technology equipment – Radio disturbance characteristics – Limits and methods of measurement
EN 55024 Information technology equipment – Immunity characteristics – Limits and methods of measurement
EN 61000-3-2 Electromagnetic compatibility (EMC) – Part 3-2: Limits – Limits for harmonic current emissions
EN 61000-3-3 Electromagnetic compatibility (EMC) – Part 3-3: Limits – Limitation of voltage changes, voltage fluctuations and flicker in public low-voltage supply systems, for equipment with rated current
For further information about the connecting cables, see (>>> 6.1 "Connecting cables and interfaces" Page 41).
1 Rear view
2 Side view
3 View from below
4 Front view
5 Side view with protective lens hood
1 Front view
2 Side view
Camera MXG20 The identification plate is already affixed to the MXG20 camera. The label must be selected and attached by the user; this depends on the interface to which the camera is connected:
Camera VCXG-25M The following plates and labels are attached to the VCXG-25M camera:
Switch The following plates and labels are attached to the switch:
5 Safety
The user must carry out a risk analysis and is responsible for ensuring correctly adapted lighting.
6 Planning
Overview The connecting cables comprise all the cables for transferring power and signals between the robot controller, KUKA IPC and cameras. The following connecting cables are available:
For the KR C4 with interface X64:
Connecting cable, KR C4 – linear unit
Connecting cable, linear unit – robot
Connecting cable, KR C4 – robot
Connecting cable, KR C4 – camera (MXG20)
For the KR C4 with cable inlet to interface A13 on the switch:
Connecting cable, switch – linear unit
Connecting cable, linear unit – robot
Connecting cable, switch – robot
Connecting cable, switch – camera (MXG20 or VCXG-25M)
For the KR C4 compact:
Connecting cable, KR C4 compact – switch
Connecting cable, switch – camera (MXG20 or VCXG-25M)
For the KUKA IPC:
Connecting cable, KUKA IPC – KR C4
Connecting cable, KUKA IPC – KR C4 compact
Connecting cable, KUKA IPC – camera (MXG20 or VCXG-25M)
Information about the energy supply system for axes 1 to 3 and axes 3 to 6
can be found in separate documentation.
Depending on the specific system configuration, connecting cables are required in different lengths. The following cable lengths are available:
The maximum length of the connecting cables must not exceed 45 m with moving (robot-guided) cameras and 35 m with stationary cameras. If the robot is operated on a linear unit that has its own energy supply chain, these cables must also be taken into account.
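The length check above is simple arithmetic: all segments of a cable run, including a linear unit's energy supply chain, count toward the limit. A minimal sketch, assuming illustrative segment lengths (the 45 m and 35 m limits are from this section; the helper function and example values are not from the manual):

```python
# Permissible total cable lengths from the planning section.
LIMITS_M = {"robot-guided": 45.0, "stationary": 35.0}

def cable_run_ok(segments_m, camera_type):
    """Sum all cable segments of one run and compare the total
    against the permissible length for the camera type."""
    total = sum(segments_m)
    return total <= LIMITS_M[camera_type], total

# Hypothetical run: KR C4 -> linear unit -> robot, robot-guided camera.
ok, total = cable_run_ok([25.0, 15.0, 10.0], "robot-guided")
print(ok, total)  # -> False 50.0 (50 m exceeds the 45 m limit)
```
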
The following points must be observed when planning and routing the connecting cables:
Protect cables against exposure to mechanical stress.
Route the cables without mechanical stress – no tensile forces on the connectors.
Cables are only to be installed indoors.
Route cables in such a way that they cannot be damaged by sharp edges, tools or other materials.
Route cables in such a way that they are located outside of the camera’s field of vision.
Observe the permissible temperature range (fixed installation) of 243 K (-30 °C) to 363 K (+90 °C).
Interfaces, KR C4 For connection of the connecting cables between the KR C4, linear unit, robot and cameras, the following connectors are available at the interfaces:
Connecting cable Connector designation Connections
KR C4 – linear unit X64.1 – X74.1.1; X64.2 – X74.2.1 PushPull V4 connector at both ends
Linear unit – robot X74.1.1 – X74.1; X74.2.1 – X74.2 PushPull V4 coupling – PushPull V4 connector
KR C4 – robot X64.1 – X74.1; X64.2 – X74.2 PushPull V4 connector at both ends
KR C4 – MXG20 camera X64.1 – B1; X64.2 – B2; X64.3 – B3 PushPull V4 connector – M12 male connector, 8-contact
If a cable inlet on the KR C4 is used instead of interface X64, the following connectors are available:
Connecting cable Connector designation Connections
Switch – linear unit A13.1 – X74.1.1; A13.2 – X74.2.1 RJ45 connector – PushPull V4 connector
Switch – robot A13.1 – X74.1; A13.2 – X74.2 RJ45 connector – PushPull V4 connector
Switch – MXG20 camera A13.1 – B1; A13.2 – B2; A13.3 – B3 RJ45 connector – M12 male connector, 8-contact
Switch – VCXG-25M camera A13.1 – B1; A13.2 – B2; A13.3 – B3 RJ45 connector at both ends
The RJ45 connectors that are connected to interfaces A13.1 to A13.3 have protection rating IP 20.
Interface X64 Interfaces X64.1 to X64.3 are situated on the connection panel of the KR C4.
Connecting cables, KR C4 The following connecting cables are available for a KR C4 with interface X64:
Fig. 6-2: Connecting cable, KR C4 – linear unit (suitable for use on robots)
Fig. 6-3: Connecting cable, linear unit – robot (suitable for use on robots)
The following connecting cables are available for a KR C4 with a cable inlet:
Fig. 6-6: Connecting cable, switch – linear unit (suitable for use on robots)
Fig. 6-7: Connecting cable, switch – robot (suitable for use on robots)
Fig. 6-8: Connecting cable, switch – MXG20 camera (suitable for use on robots)
Fig. 6-9: Connecting cable, switch – VCXG-25M camera (not suitable for use on robots)
Interfaces, KR C4 compact For connection of the connecting cables between the KR C4 compact, switch and cameras, the following connectors are available at the interfaces:
Connecting cable Connector designation Connections
KR C4 compact – switch KONI – No PoE5 RJ45 connector at both ends
Switch – MXG20 camera PoE 1 – B1; PoE 2 – B2; PoE 3 – B3 RJ45 connector – M12 male connector, 8-contact
Switch – VCXG-25M camera PoE 1 – B1; PoE 2 – B2; PoE 3 – B3 RJ45 connector at both ends
The RJ45 connectors that are connected to interfaces PoE 1 to PoE 3 have protection rating IP 20.
Connecting cables, KR C4 compact
Fig. 6-11: Connecting cable, switch – MXG20 camera (not suitable for use on robots)
Fig. 6-12: Connecting cable, switch – VCXG-25M camera (not suitable for use on robots)
KUKA IPC interfaces For connection of the connecting cables between the KUKA IPC and the cameras, the following connectors are available at the interfaces:
Connecting cable Connector designation Connections
KUKA IPC – KR C4 LAN1 – KLI RJ45 connector at both ends
KUKA IPC – KR C4 compact LAN1 – KONI RJ45 connector at both ends
KUKA IPC – MXG20 cameras CH1 – B1; CH2 – B2; CH3 – B3; CH4 – B4 RJ45 connector – M12 male connector, 8-contact
KUKA IPC – VCXG-25M cameras CH1 – B1; CH2 – B2; CH3 – B3; CH4 – B4 RJ45 connector at both ends
KUKA IPC connecting cables
Fig. 6-13: Connecting cable, KUKA IPC – KR C4 (not suitable for use on robots)
Fig. 6-14: Connecting cable, KUKA IPC – KR C4 compact (not suitable for use on robots)
Fig. 6-15: Connecting cable, MXG20 camera – KUKA IPC (not suitable for use on robots)
Fig. 6-16: Connecting cable, VCXG-25M camera – KUKA IPC (not suitable for use on robots)
7 Transportation
7.1 Transportation
Cameras Before transportation, the cameras must be removed. A protective cap must be fitted.
Switch The switch must be packaged for transportation in ESD protection foil. Care must be taken to ensure that the switch does not come into contact with humidity.
The option package can be installed on the robot controller either via the smartHMI or via WorkVisual.
Laptop/PC Software:
WorkVisual 5.0
The requirements for installation of WorkVisual are contained in the
WorkVisual documentation.
Option package KUKA.Ethernet KRL 3.0 (installed in WorkVisual)
Description The option package is installed in WorkVisual and added to the project. During project deployment, the option package is automatically installed on the robot controller.
In the case of an update, the previous version of the option package in WorkVisual must first be uninstalled.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
VisionTech can be configured in such a way that image processing tasks are
executed on the KUKA IPC. The following steps are required for this:
Step Description
1 Generate setup for the KUKA IPC.
(>>> 8.4.1 "Generating the setup for KUKA IPC" Page 54)
2 Install setup on the KUKA IPC.
(>>> 8.4.2 "Installing setup on the KUKA IPC" Page 54)
Precondition The option package is installed in WorkVisual and has been added to the
project.
It is not possible to acquire images with the license for the service laptop.
Description If hardware components are modified, added or exchanged, this may invalidate the license. A license becomes invalid if the current signature of the computer differs too greatly from the saved signature. In this case, the license can be repaired.
Description If the license is to be transferred from one system to another, it must be uninstalled and then reactivated on the other system. Following uninstallation, the image processing functions activated by the license are no longer available. Emergency licenses cannot be uninstalled. Uninstallation of a license is only possible via WorkVisual.
Precondition If the license on the robot controller or KUKA IPC is to be uninstalled: The connection to the robot controller or KUKA IPC has been established.
9 Operation
9.1 Menus
The following menus and commands are specific to this option package:
Main menu:
VisionTech
Sensor overview
Task configuration
Live picture
Calibration
Calibration management
Calibration verification
Licensing
Production screen
Settings
Menu sequence:
Commands > VisionTech
Trigger
WaitForResult
LoopResults
SetTargetBase
Direct
10 Start-up
10.1 Networking
Description Networking is carried out via Ethernet; the following variants are available:
Variant Description
KR C4 with interface X64, stationary cameras (>>> 10.1.1 "Networking KR C4 with interface X64, stationary" Page 61)
KR C4 with cable inlet, stationary cameras (>>> 10.1.2 "Networking KR C4 with cable inlet, stationary" Page 61)
KR C4 with KUKA IPC, stationary cameras (>>> 10.1.3 "Networking KR C4 with KUKA IPC and cameras, stationary" Page 62)
KR C4 with interface X64, robot-guided cameras (>>> 10.1.4 "Networking KR C4 with interface X64, robot-guided" Page 62)
KR C4 with cable inlet, robot-guided cameras (>>> 10.1.5 "Networking KR C4 with cable inlet, robot-guided" Page 63)
KR C4 with KUKA IPC, robot-guided cameras (>>> 10.1.6 "Networking KR C4 with KUKA IPC and cameras, robot-guided" Page 63)
KR C4 compact, stationary cameras (>>> 10.1.7 "Networking KR C4 compact, stationary" Page 63)
KR C4 compact with KUKA IPC, stationary cameras (>>> 10.1.8 "Networking KR C4 compact with KUKA IPC and cameras, stationary" Page 64)
1 KR C4
2 KUKA IPC
3 KUKA camera (MXG20 or VCXG-25M)
Fig. 10-8: Networking KR C4 compact with KUKA IPC and cameras, stationary
1 KR C4 compact
2 KUKA IPC
3 KUKA camera (MXG20 or VCXG-25M)
Step Description
1 Only if using the KUKA IPC: Configure the KUKA IPC.
(>>> 10.3 "Configuring the KUKA IPC" Page 65)
2 Only if using the KUKA IPC: Set the IP address of the KUKA IPC in WorkVisual.
(>>> 10.4 "Setting the IP address of the KUKA IPC in WorkVisual" Page 65)
3 Configure Ethernet KRL interface.
(>>> 10.5 "Configuring Ethernet KRL" Page 66)
4 Steps 5 to 8 can be carried out either on the KUKA smartHMI or in WorkVisual. If the steps are to be carried out in WorkVisual, the WorkVisual project must first be loaded from the robot controller.
Note: More information is contained in the WorkVisual documentation.
5 Configure the cameras.
On the KUKA smartHMI: (>>> 10.6 "Configuring cameras (smartHMI)" Page 66)
In WorkVisual: (>>> 10.7 "Configuring cameras online (WorkVisual)" Page 68)
Or: (>>> 10.8 "Configuring cameras offline (WorkVisual)" Page 69)
6 Align the cameras.
On the KUKA smartHMI: (>>> 10.9 "Aligning cameras (smartHMI)" Page 70)
In WorkVisual: (>>> 10.10 "Aligning cameras (WorkVisual)" Page 70)
7 Set the exposure time.
On the KUKA smartHMI: (>>> 10.11 "Setting the exposure time (smartHMI)" Page 71)
In WorkVisual: (>>> 10.12 "Setting the exposure time (WorkVisual)" Page 71)
8 Focus the lens.
On the KUKA smartHMI: (>>> 10.13 "Focusing the lens (smartHMI)" Page 72)
In WorkVisual: (>>> 10.14 "Focusing the lens (WorkVisual)" Page 72)
9 Only if the preceding steps have been carried out in WorkVisual: Transfer the WorkVisual project.
(>>> 10.15 "Transferring the WorkVisual project" Page 73)
10 Calibrate the cameras.
(>>> 10.16 "Calibrating a camera (stationary)" Page 73)
(>>> 10.17 "Calibrating cameras (moving)" Page 75)
11 Configure verification of the calibration.
(>>> 10.18 "Configuring verification of the calibration" Page 76)
Procedure 1. Assign a fixed IP address to the KUKA IPC in the Windows network settings.
2. Configure interfaces CH1 to CH4 in the Windows network settings as follows:
a. Assign fixed IP addresses to the interfaces. These addresses must be in the same address range as the IP addresses of the cameras.
b. Activate the check boxes for the following settings:
eBus Universal Pro Driver
Internet Protocol Version 4 (TCP/IPv4)
c. Deactivate the check boxes for all other settings.
3. Set the Jumbo Packet setting to 9014 bytes in the driver settings of interfaces CH1 to CH4.
4. Deactivate the Windows firewall for interfaces CH1 to CH4.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Under the Options node, right-click on the option package.
3. Select Vision server settings in the context menu. A window opens.
4. Select the option Vision server running on external system (IPC) under Connection settings.
5. Enter the IP address of the KUKA IPC in the IP address or computer name box.
6. Save the project to apply the settings.
Description In order to establish a connection between the kernel system and the image processing system, the Ethernet KRL option must be configured. The number of the flag that is to trigger execution of the interrupt program must be entered here. The interrupt program monitors the result of the image processing. It is triggered once the calculation of the image processing is completed.
This flag can be used to call an error treatment subprogram via an interrupt.
6. In the EKI alive flag box, enter the number of the flag that is set as soon as an EKI connection exists. The flag is reset if the connection is disconnected or interrupted.
7. Press Save and close the window.
Example
Device IP address Subnet mask
1st GigE camera 192.169.2.101 255.255.0.0
2nd GigE camera 192.169.2.102 255.255.0.0
3rd GigE camera 192.169.2.103 255.255.0.0
GigE network card in the robot controller 192.169.2.100 255.255.0.0
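With the subnet mask 255.255.0.0, all four addresses in the example lie in the same /16 network, which is what the configuration requires. A minimal sketch of this consistency check, using Python's standard ipaddress module with the example values from the table (not part of the manual's procedure):

```python
import ipaddress

# Example addresses from the table above.
cameras = ["192.169.2.101", "192.169.2.102", "192.169.2.103"]
controller_nic = "192.169.2.100"
subnet_mask = "255.255.0.0"

# Network that the controller's GigE card belongs to (192.169.0.0/16 here).
network = ipaddress.ip_network(f"{controller_nic}/{subnet_mask}", strict=False)

# Every camera address must fall inside the same network.
for cam in cameras:
    print(cam, ipaddress.ip_address(cam) in network)  # -> True for each
```
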
Description
Box             Description
Serial number   Serial number of the cameras
Description     Description of the cameras
Status          Green: The camera is ready for operation.
                Red: The camera is not ready for operation and can thus not be used.
If the configuration is carried out online, it is possible to search for the cameras in the network with WorkVisual and to insert them into the project. The cameras can then be configured.
Only if the cameras are connected to the KUKA IPC: The Windows
firewall must be deactivated for the network connections to which
cameras are connected.
Example
Device            IP address (address range)   Subnet mask
1st GigE camera   192.169.2.101                255.255.0.0
2nd GigE camera   192.169.2.102                255.255.0.0
3rd GigE camera   192.169.2.103                255.255.0.0
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Open the Options node and right-click on the option package.
3. Select Search for vision cameras in the context menu. A window opens.
4. Click on Search for cameras at the top right. The search is started. If cameras are found, they are listed under Cameras found.
5. Only if the cameras are connected to the KUKA IPC: If not all cameras
have been found, e.g. because they were connected during the runtime,
the service for image recognition on the KUKA IPC can be restarted:
a. Right-click on the option package and select Vision server settings.
A window opens.
b. Click on Reload camera hosts at the top right.
c. Confirm the request for confirmation with Yes.
The restart of the service for image recognition may cause a running
production system to stop.
After the restart, a search is carried out for network cards to which a camera is connected. The network cards are displayed under Camera hosts.
Settings that can have a detrimental effect on the system are highlighted
in red.
6. Click on Add to project (all) at the top right of the Search for cameras
window. All cameras in the list are added to the project.
7. Under Cameras found, click on the wrench icon next to the camera that
is to be configured. The camera settings window opens.
8. Optional: Modify the IP addresses of the cameras.
a. In the Device information area, click on Change IP configuration. A
window opens.
b. In the IP address and Subnet mask boxes, enter the IP address of
the camera and the subnet mask.
c. Click on Apply. The changes are saved on the camera.
9. Optional: Modify the settings in the General settings and I/O management areas.
Description
Item Description
1 Camera found
2 Opens the live image display of the camera
3 Opens the camera settings
Do not use the Generic camera in this step. This catalog element is used for cameras that are not supplied by KUKA. These cameras are automatically added to the project when the KUKA System Software is started or the cameras are updated in the sensor overview. When the project is transferred from the robot controller to WorkVisual, advanced settings can be made for these cameras.
To display the IP address and subnet mask of the camera here, the
project must be transferred to the robot controller, activated and then
transferred back to WorkVisual.
8. In the General settings area, select the mounting type for the camera.
9. Optional: Enter a name in the Description: box.
10. Optional: Carry out the desired settings in the I/O management area.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Click on the live image. The live image is displayed in enlarged form.
4. Enter the exposure time or set it using the plus/minus keys.
5. To save the exposure time, click on Set default exposure. The exposure
time is saved in the project and transferred to the robot controller or
KUKA IPC.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Click on the live image. The live image is displayed in enlarged form.
4. Select the desired calibration plane. All subsequent images are perspective-controlled. If no calibration plane is selected, the images are not perspective-controlled.
5. Enlarge the image section using the slider control, by scrolling or by clicking on the image.
6. Move the image section using either the arrows at the edge of the screen
or drag&drop. The feature of the measurement object that is to be sharply
in focus must be in this image section.
7. Only if using the KUKA MXG20 camera: Unscrew the protective lens hood.
8. Loosen the fastening screws of the camera in order to be able to turn the
aperture and lens.
9. Select the f-number, exposure time and lens setting so that the measurement object is sharply in focus in the image section.
10. Re-tighten the fastening screws.
11. Only if using the KUKA MXG20 camera: Screw the protective lens hood
back on.
Description If the KUKA IPC is not being used, the project only needs to be transferred to
the robot controller. The procedure is the same as that described in the
WorkVisual documentation. If the KUKA IPC is being used, all settings must
additionally be transferred to the KUKA IPC.
Procedure 1. Click on the Deploy... button in the menu bar. The WorkVisual Project Deployment window is opened. The virtual robot controller from the project and the virtual KUKA IPC are displayed on the left-hand side. The real robot controller and the real KUKA IPC are displayed on the right-hand side.
2. Activate the corresponding check boxes on the left-hand side for both the
robot controller and the KUKA IPC.
3. Perform all other steps as described in the WorkVisual documentation.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera
as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed.
For an adequate degree of accuracy, the result should be < 1 mm.
8. Select an existing calibration plane in the Calibration plane box or choose
Create new calibration plane... to create a new calibration plane:
a. Enter a name for the calibration plane in the Name box.
b. Optional: Enter a description of the calibration plane in the Description box.
An example reference feature is provided with the software. After the software
has been installed on the service laptop, this example can be found in the Start
menu. We recommend using this reference feature. Any feature can be used
as the reference feature, however.
Step   Description
1      Install reference feature.
       Note: The reference feature must be installed, if possible, in the calibration plane that is to be verified. It must also be located in the field of vision of the camera.
2      Configure the verification task.
       (>>> 10.18.1 "Configuring the verification task" Page 77)
3      Determine the reference position.
       (>>> 10.18.2 "Determining the reference position" Page 79)
Procedure 1. In the main menu, select VisionTech > Calibration verification. All created verification tasks are displayed in an overview.
2. Press New and select the 2D verification type. A new verification task is
created.
3. Press Import tool block.
4. Select the directory in which the tool block file is located.
5. Select the tool block file and confirm with OK. The file is imported into VisionTech and can then be used for all verification tasks.
6. Press the button in the box of the desired verification task.
7. Optional: Change the name of the verification task in the Name box.
8. Select the desired camera in the Available sensors box and press the
button.
9. In the Tool block: box, select the tool block file that was imported for the
camera selected in the Available sensors box.
All tool block files imported in VisionTech are available. These can
also include tool block files that are not suitable for verification tasks,
e.g. because they were imported for the configuration of 3D tasks.
10. Next to the Tool block: box, press the button. A window opens with
the live image display.
11. In the Calibration plane box, select the calibration plane that is to be verified. If possible, the reference feature should be in this plane.
12. Optional: Enter the exposure time or set it using the plus/minus keys or the
slider control.
13. Press Apply. The settings are saved.
Configuration of the verification task has been successfully completed
when the button is black and the button is active.
Description
Item Description
1 Back to Overview
2 Name of the verification task
The name is freely selectable.
3 List of all available cameras that can be used for this task
4 List of all imported tool block files
5 Switches to the live image display in which the following settings
are possible:
Set exposure time
0 … 200 ms
Select calibration plane
Take images from this camera
6 Deletes the selected camera from this task
7 Adds the selected camera to this task
8 Status of the verification task
Green: Reference position has been determined successfully.
Red: No reference position has yet been determined.
9 Perform verification task and determine reference position
Active: The reference position can be determined.
Inactive: The reference position cannot be determined. The
verification task is not configured.
Button Description
Save picture Acquires an image with the selected camera and
saves it
Save Saves the task configuration
Cancel Aborts the task configuration without saving
Description Once the reference position has been determined, the number of successful cycles is displayed above the table. The image processing task is executed once per cycle. Unsuccessful cycles indicate disruptive influences or an unstable setup of the image processing system. To ensure sufficient accuracy and stability, 50 successful cycles are strongly recommended.
The limit value defines the maximum permissible offset in the positive and
negative directions. If this limit value is exceeded, the verification fails. The
minimum limit value determined from the standard deviation (sigma) of the
corresponding component serves as an orientation value. The limit value
should be selected in such a way that a failure only occurs if the calibration
really has deteriorated.
Column                 Description
Mean value             Mean value of the position (X, Y and A) of the reference feature relative to the calibration coordinate system over 50 cycles
Sigma                  Standard deviation of the measured values from 50 cycles about the mean values of X, Y and A
                       Note: Large values can result in very large minimum limit values that are not suitable for verification. In this case, check the image processing and minimize disruptive influences as far as possible.
Min. limit value +/-   Minimum possible limit value of the corresponding component
Limit value +/-        Limit value for modification of the position of the reference feature in the positive and negative directions
Calculation The minimum limit value is calculated based on a type 1 study of a measurement systems analysis using the following formula:
Minimum limit value = (2 * s * Cg) / 0.2

Variable   Description
s          Standard deviation of the 50 current measurements
Cg         Measuring system analysis index
           In this application, the measuring system analysis index always has the value 1.33.
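The calculation and the associated pass/fail check can be reproduced in a few lines. This is an illustrative sketch, not part of the software; the sigma value in the example is hypothetical:

```python
# Minimum limit value per the type 1 study formula:
#   minimum limit value = (2 * s * Cg) / 0.2
# Cg is fixed at 1.33 in this application.

CG = 1.33

def minimum_limit_value(s: float) -> float:
    """Smallest usable limit value for a component with standard deviation s."""
    return (2 * s * CG) / 0.2

def verification_passes(offset: float, limit: float) -> bool:
    """Verification fails if the offset exceeds the limit in either direction."""
    return abs(offset) <= limit

# Hypothetical example: sigma of 0.05 mm over 50 cycles
limit = minimum_limit_value(0.05)
print(round(limit, 3))                   # 0.665
print(verification_passes(0.5, limit))   # True
print(verification_passes(-0.7, limit))  # False
```

As the note above suggests, a large sigma directly inflates the minimum limit value (here by a factor of 13.3), which is why disruptive influences should be minimized before choosing a limit.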
11 Configuration
Step Description
1 Teach reference pose above the workpiece.
2 Create task and take images.
Only for 2D measurement tasks: (>>> 11.2 "Creating a task
and taking images (stationary)" Page 81)
For 2D and 3D measurement tasks: (>>> 11.3 "Creating a
task and taking images (moving)" Page 83)
The acquisition of images is alternatively possible with
WorkVisual: (>>> 11.4 "Acquiring images via WorkVisual"
Page 84)
3 Set up an image processing task in WorkVisual.
(>>> 11.5 "Setting up an image processing task in WorkVi-
sual" Page 84)
4 Configure task.
(>>> 11.6 "Configuring a 2D task" Page 87)
(>>> 11.9 "Configuring a 3D task" Page 92)
5 Generate model.
(>>> 11.7 "Generating a 2D model" Page 90)
(>>> 11.10 "Generating a 3D model" Page 95)
6 Test task.
(>>> 11.8 "Testing a 2D task" Page 91)
(>>> 11.11 "Testing a 3D task" Page 95)
7 Create KRL program.
In the case of 2D tasks with stationary cameras, it is possible to work with relative or absolute position data from the image processing system.
Relative: All position data refer to the position of the workpiece at the time of model generation. In the case of relative position data, a model must be generated.
Absolute: All position data refer to the origin of the calibration coordinate system. In the case of absolute position data, it is not necessary to generate a model. All objects can be addressed directly in KRL without the need for teaching.
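The difference between the two modes can be sketched in a few lines. The pose type and values below are hypothetical illustrations, not part of the VisionTech API:

```python
from typing import NamedTuple

# A 2D pose in the calibration coordinate system: X, Y in mm, angle A in degrees.
class Pose2D(NamedTuple):
    x: float
    y: float
    a: float

def relative_offset(current: Pose2D, model: Pose2D) -> Pose2D:
    """Relative data: offset of the current workpiece position from the
    position at the time of model generation."""
    return Pose2D(current.x - model.x, current.y - model.y, current.a - model.a)

# Absolute data: the pose itself, referred to the origin of the
# calibration coordinate system -- no model pose is needed.
model_pose = Pose2D(100.0, 50.0, 0.0)    # pose at model generation (hypothetical)
current_pose = Pose2D(112.5, 47.0, 5.0)  # pose found by image processing

print(relative_offset(current_pose, model_pose))  # Pose2D(x=12.5, y=-3.0, a=5.0)
```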
Item Description
1 Fiducial mark
2 Origin of the calibration coordinate system
3 Position of the workpiece at the time of model generation
4 Current position of the workpiece
5 Relative position data
Item Description
1 Fiducial mark
2 Origin of the calibration coordinate system
3 Position of workpiece 1
4 Position of workpiece 2
5 Absolute position data
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. A window opens
with the live image display.
3. Optional: Select the desired calibration plane. All subsequent images are
perspective-controlled. If no calibration plane is selected, the images are
not perspective-controlled.
4. Optional: Click on Select folder and change the directory to which the images are saved. Confirm with OK.
5. Click on Save image. The image is saved as a PNG file. The file name is made up as follows: Serial-number_Calibration-plane_Year-Month-Day_Hour-Minute-Second-Millisecond.
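The file name convention can be reproduced as follows. This sketch assumes the components are joined exactly as listed above; the serial number and calibration plane name are hypothetical:

```python
from datetime import datetime

def image_filename(serial_number: str, calibration_plane: str, ts: datetime) -> str:
    """Builds the PNG file name:
    Serial-number_Calibration-plane_Year-Month-Day_Hour-Minute-Second-Millisecond
    """
    stamp = ts.strftime("%Y-%m-%d_%H-%M-%S") + f"-{ts.microsecond // 1000:03d}"
    return f"{serial_number}_{calibration_plane}_{stamp}.png"

ts = datetime(2017, 12, 11, 14, 30, 5, 123000)
print(image_filename("12345678", "Plane1", ts))
# 12345678_Plane1_2017-12-11_14-30-05-123.png
```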
Step Description
1 Only necessary if the images were acquired via the robot controller:
Save perspective-controlled images to a USB stick or make
them available via the network.
2 Create a tool block file in WorkVisual.
(>>> 11.5.1 "Creating a tool block file in WorkVisual"
Page 85)
3 Transfer the tool block file to the robot controller or KUKA
IPC.
(>>> 11.5.3 "Transferring a tool block file to the robot control-
ler or KUKA IPC" Page 87)
If the text size in Windows is set to greater than 100%, graphics, texts and input boxes may be shifted and/or overlap. This may restrict operation of the plug-in. In this case, set the text size in Windows to 100%.
4. Click on the button. The image(s) are inserted next to the inputs under
InputImage.
10. Click on the button and choose whether the tool should be saved completely or without images or results.
11. Select a directory and click on Save.
Description
1 Linking arrow
11.5.2 Loading tool block files from the robot controller or KUKA IPC
Procedure 1. Select the menu sequence Tool block > Download from... and select the
robot controller or KUKA IPC. A window with an overview opens. All tool
block files located on the robot controller or KUKA IPC are displayed in the
overview.
2. If a tool block file is to be opened after loading in the tool block editor: Select the file and click on Open. The file is loaded from the robot controller or KUKA IPC and then opened.
If one or more tool block files are to be downloaded from a directory:
a. Select the desired tool block files and click on Download. If existing tool block files are to be overwritten, activate the check box for Overwrite existing.
b. Select a directory and confirm with OK. The tool block files are loaded from the robot controller or KUKA IPC and saved in the selected directory.
c. Close the window.
11.5.3 Transferring a tool block file to the robot controller or KUKA IPC
Procedure Select the menu sequence Tool block > Upload to... and select the robot controller or KUKA IPC. The tool block file is transferred. If the transfer was successful, a corresponding message is displayed in the message window.
Description
Item Description
1 Back to Overview
2 Name of the task
The name is freely selectable.
3 Position data
(>>> 11.2.1 "Relative and absolute position data" Page 82)
Note: This box is only relevant for 2D tasks with a stationary camera.
4 List of all available cameras that can be used for this task
5 List of all imported tool block files
6 Switches to the live image display in which the following settings
are possible:
Set exposure time
0 … 200 ms
Select calibration plane
Take images from this camera
7 Deletes the selected camera from this task
8 Adds the selected camera to this task
9 Generate model
Check box active: Model has been successfully generated.
Check box not active: No model has yet been generated.
Inactive: Model is not relevant (with absolute position data)
10 Select the number of parts
Item Description
11 State of the task
Green: Task has been successfully configured.
Red: Task is not configured.
12 Test task
Active: The task can be tested.
Inactive: The task cannot be tested. No model has yet been
generated, or the task is not configured.
Button Description
Save picture(s) Acquires an image with the selected camera and
saves it
Save Saves the task configuration
Cancel Aborts the task configuration without saving
Description If multiple workpieces were detected in the test, individual workpieces can be selected from a list in the overview of the results. Areas detected by the cameras are indicated in green in the images.
Depending on whether relative or absolute position data have been used, different values are displayed in the table:
In the case of relative position data: difference between the reference position used for model generation and the calculated position
In the case of absolute position data: deviation of the position of the individual workpiece from the origin of the calibration coordinate system
If the selected number of parts is too low, this may lead to inconsistencies, e.g. if each camera detects different objects. It is advisable to enter the number of components that one expects to find.
Description
Item Description
1 Back to Overview
2 Name of the task
The name is freely selectable.
3 Generate model
Check box active: Model has been successfully generated.
Check box not active: No model has yet been generated.
4 List of all available cameras that can be used for this task
5 List of all imported tool block files
6 Switches to the live image display in which the following settings
are possible:
Set exposure time
0 … 200 ms
Take images from this camera
7 Deletes the selected camera from this task
8 Adds the selected camera to this task
9 Select the number of parts
10 State of the task
Green: Task has been successfully configured.
Red: Task is not configured.
11 Test task
Active: The task can be tested.
Inactive: The task cannot be tested. No model has yet been
generated, or the task is not configured.
Button Description
Save picture(s) Acquires an image with the selected camera and
saves it
Save Saves the task configuration
Cancel Aborts the task configuration without saving
Description If multiple workpieces were detected in the test, individual workpieces can be selected from a list in the overview of the results. Areas detected by the cameras are indicated in green in the images. The differences between the reference position and the calculated position are displayed in the table.
The value of Score is the mean square residual error after calculation of the object position relative to the nominal position. If the position of the object cannot be calculated, Score has the value -1.
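A mean square residual error of this kind can be illustrated with a small sketch. The point data are hypothetical, and the actual calculation inside the tool block may differ:

```python
def score(measured, nominal):
    """Mean square residual error between measured and nominal 2D points.
    Returns -1 if the position could not be calculated (no usable point pairs),
    mirroring the Score convention described above."""
    if not measured or len(measured) != len(nominal):
        return -1
    total = 0.0
    for (mx, my), (nx, ny) in zip(measured, nominal):
        total += (mx - nx) ** 2 + (my - ny) ** 2
    return total / len(measured)

nominal = [(0.0, 0.0), (10.0, 0.0)]
measured = [(0.1, 0.0), (10.0, 0.2)]
print(round(score(measured, nominal), 6))  # 0.025
print(score([], nominal))                  # -1
```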
12 Programming with inline forms and templates
Procedure Select the menu sequence Commands > VisionTech > Trigger.
Description This instruction requests image acquisition with the camera configured in the
specified task. The generated image is evaluated with the image processing
task. Inputs configured in the tool block are taken into consideration.
Item Description
1 Name of the task
All available tasks are displayed in the list. Verification tasks are
not displayed.
2 Input for the tool block
Touch the arrow to edit the data. The corresponding option window is opened.
(>>> 12.1.1 "Option window “Input”" Page 98)
Note: This box is only displayed if the input has been configured
in the tool block and corresponds to the naming convention.
Item Description
1 New value to be used by the tool block
Possible data types: Double, Boolean, String, Int32
2 Check box active: The value under item 1 is used by the tool block; the default value is overwritten.
Check box not active: The default value is used and displayed in item 1.
Procedure Select the menu sequence Commands > VisionTech > WaitForResult.
Description This instruction is used to wait for a result of the image processing task. As
soon as a valid result is present, the program execution is resumed.
Item Description
1 Name of the task
All available tasks are displayed in the list. Verification tasks are
not displayed.
Procedure Select the menu sequence Commands > VisionTech > SetTargetBase.
Description The instruction calculates the temporary workpiece base on the basis of the
calibration base and the result of the image processing task. In this way, the
original calibration data remain available in the calibration base.
It is advisable only to use the workpiece base for the offset points.
Item Description
1 Calibration base
2 Workpiece base
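Geometrically, the instruction links the calibration base with the correction frame from the image processing result, comparable to the KRL geometric operator ":". The following planar sketch illustrates the principle; the frame values are hypothetical:

```python
import math

# A planar frame: translation (x, y) in mm and rotation a in degrees.
def compose(f1, f2):
    """f1 : f2 -- apply f2 in the coordinate system defined by f1,
    analogous to the KRL geometric operator ':' (restricted to 2D here)."""
    x1, y1, a1 = f1
    x2, y2, a2 = f2
    rad = math.radians(a1)
    x = x1 + x2 * math.cos(rad) - y2 * math.sin(rad)
    y = y1 + x2 * math.sin(rad) + y2 * math.cos(rad)
    return (x, y, a1 + a2)

calibration_base = (500.0, 200.0, 90.0)  # hypothetical calibration base
correction = (10.0, 0.0, 0.0)            # hypothetical image processing result

# Temporary workpiece base; the calibration base itself stays untouched.
workpiece_base = compose(calibration_base, correction)
print(workpiece_base)  # (500.0, 210.0, 90.0)
```

Because the workpiece base is recalculated from the calibration base for every result, the original calibration data are never overwritten.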
Procedure 1. Select the menu sequence Commands > VisionTech > LoopResults.
2. Press Cmd OK to save the instruction.
3. Press Open/close fold.
4. Insert the desired instructions (e.g. SetTargetBase) into the fold.
Description The instruction executes the instructions contained in the fold for all results of the image processing task. In the case of an error or if no workpiece is found, the No success flag is briefly set to TRUE. This allows individual error evaluation to be carried out (e.g. using an interrupt).
Procedure Select the menu sequence Commands > VisionTech > Direct.
Description The instruction calculates the shortest path between two points. Status and Turn of the destination point are overwritten. The instruction only affects destination points that are addressed with the motion type PTP. Direct is executed in the advance run.
Item Description
1 Start point
2 Tool and base of the start point
Touch the arrow to edit the data. The corresponding option window is opened.
(>>> 12.5.1 "Option window “Tool/base for start point”" Page 100)
3 End point
4 Tool and base of the destination point
Touch the arrow to edit the data. The corresponding option window is opened.
(>>> 12.5.2 "Option window “Tool/base for destination point”" Page 101)
Item Description
1 Select tool of the start point.
This box is only active if manual input has been selected for item 3.
2 Select base of the start point.
This box is only active if manual input has been selected for item 3.
3 From .dat file: Tool and base are read from the data list.
Input: Enter the tool and base manually.
Item Description
1 Select tool of the destination point.
This box is only active if manual input has been selected for item 3.
2 Select base of the destination point.
This box is only active if manual input has been selected for item 3.
3 From .dat file: Tool and base are read from the data list.
Input: Enter the tool and base manually.
In the case of a task with a short runtime, it is possible that the program may stop in the line with the subprogram VT_WAITFORRESULT. The following options are available for avoiding this:
Program an advance run stop before execution of the task.
Program the VT_TASKTRIGGER subprogram instead of the VT_TASKTRIGGER_ADRUN subprogram (this also causes an advance run stop).
Do not program any motion instructions between the BCO run and the "Open the communication" fold.
Program ...
11 ;Define the task name here
12 TaskName[] = "Unknown"
13 ;---------------------------------------------------
14 ;Define the number of the calibration base here
15 CalibrationBaseNo = 0
16 ;---------------------------------------------------
17 ;Define the number of the part base here
18 TargetBaseNo = 0
19 ;---------------------------------------------------
20 ;Define the config file name here
21 ; -> Usually you dont have to change it
22 ConfigFile[] = "VisionTechConfig"
23
24 PTP HOME Vel=100 % DEFAULT
25 Open the communication
26
27 Set your inputs here
28 Trigger the task
29 Wait for result
30
31 ;---------------------------------------------------
32 ;This loop is used to handle each result of the task
33 FOR ObjectCounter = 1 TO ResultCounter
34
35 ;------------------------------------------------
36 ;The results of the toolblock are checked here
37 $FLAG[VTNoSuccessFlagNo] = NOT VT_CheckResult(Results[ObjectCounter])
38
39 ;------------------------------------------------
40 ;Switch between the result types; -1 is DEFAULT
41 SWITCH Results[ObjectCounter].TypeId
42 CASE -1
43 Read the result attributes here
44 IF NOT $FLAG[VTNoSuccessFlagNo] THEN
45 ;---------------------------------------------
46 ;The part base is adjusted here
47 BASE_DATA[TargetBaseNo] = BASE_DATA[CalibrationBaseNo] : VT_GetCorrectionFrame(Results[ObjectCounter])
48
49 ;---------------------------------------------
50 ;Insert your part handling here
51 ;Use base data of the TargetBaseNo
52
53
54 ELSE
55 ;---------------------------------------------
56 ;Do error handling here
57
58
59
60 $FLAG[VTNoSuccessFlagNo] = FALSE
61 ENDIF
62 ENDSWITCH
63 ENDFOR
64 ObjectCounter = 1
65
66 Clear buffer
Description
Line Description
12 Enter the name of the task.
15 Enter the number of the calibration base.
18 Enter the number of the workpiece base.
Note: It is advisable to use this base only as a workpiece base,
as it is cyclically overwritten.
22 Specify the configuration file to be used.
Note: The configuration file is predefined by default and only
needs to be changed if more than 31 workpieces are to be
detected.
25 Communication to the image processing system is established
in this fold.
26 Motions can be inserted here to ensure that the manipulator is
outside the image acquisition range of the camera.
27 Inputs for the tool block can be defined in this fold. The fold
contains an example for every data type.
Note: In order to be able to use the inputs, they must first be
configured in the tool block.
28 Image acquisition during approximate positioning is requested
in this fold.
29 This fold waits for the result of the image processing. The manipulator can be moved in the meantime, but these motions must not interfere with the image acquisition range of the camera.
33 If multiple workpieces have been detected, this loop executes instructions for all workpieces.
37 Here the system checks whether the image processing task was executed successfully.
41 If there are different types of workpiece, this instruction can be used to execute individual instructions for the specific workpiece type.
43 Object attributes from the tool block can be polled in this fold. The fold contains an example for every data type.
Note: In order to be able to use the object attributes, they must first be configured in the tool block.
47 If a workpiece has been detected successfully, the base is offset with this instruction.
52 Program here the instructions that are to be executed for the detected workpieces. The workpiece base (line 18) must be used for this.
57 Error treatment instructions can be programmed here.
66 In this fold, the memory is deleted.
67 Communication to the image processing system is disconnected in this fold.
Program ...
11 ;Define the task name here
12 TaskName[] = "Unknown"
13 ;---------------------------------------------------
14 ;Define the number of the calibration base here
15 CalibrationBaseNo = 0
16 ;---------------------------------------------------
17 ;Define the number of the part base here
18 TargetBaseNo = 0
19 ;---------------------------------------------------
20 ;Define the config file name here
21 ; -> Usually you dont have to change it
22 ConfigFile[] = "VisionTechConfig"
23
24 PTP HOME Vel=100 % DEFAULT
25 Open the communication
26
27 ;---------------------------------------------------
28 ;Insert the camera position here
29
30
31 Set your inputs here
32 Trigger the task
33
34 ;---------------------------------------------------
35 ;This loop is used to handle each result of the task
36 FOR ObjectCounter = 1 TO ResultCounter
37
38 ;------------------------------------------------
39 ;The results of the toolblock are checked here
40 $FLAG[VTNoSuccessFlagNo] = NOT VT_CheckResult(Results[ObjectCounter])
41
42 ;------------------------------------------------
43 ;Switch between the result types; -1 is DEFAULT
44 SWITCH Results[ObjectCounter].TypeId
45 CASE -1
46 Read the result attributes here
47 IF NOT $FLAG[VTNoSuccessFlagNo] THEN
48 ;---------------------------------------------
49 ;The part base is adjusted here
50 BASE_DATA[TargetBaseNo] = BASE_DATA[CalibrationBaseNo] : VT_GetCorrectionFrame(Results[ObjectCounter])
51
52 ;---------------------------------------------
53 ;Insert your part handling here
54 ;Use base data of the TargetBaseNo
55
56
57 ELSE
58 ;---------------------------------------------
59 ;Do error handling here
60
61
62
63 $FLAG[VTNoSuccessFlagNo] = FALSE
64 ENDIF
65 ENDSWITCH
66 ENDFOR
67 ObjectCounter = 1
68
69 Clear buffer
70 Close the communication
71 PTP HOME Vel=100 % DEFAULT
72
73 END
Description
Line Description
12 Enter the name of the task.
15 Enter the number of the calibration base.
18 Enter the number of the workpiece base.
Note: It is advisable to use this base only as a workpiece base,
as it is cyclically overwritten.
22 Specify the configuration file to be used.
Note: The configuration file is predefined by default and only
needs to be changed if more than 31 workpieces are to be
detected.
25 Communication to the image processing system is established
in this fold.
29 Program motions here to move the camera (or cameras in the
case of 3D) to the same position as during calibration.
31 Inputs for the tool block can be defined in this fold. The fold
contains an example for every data type.
Note: In order to be able to use the inputs, they must first be
configured in the tool block.
32 This fold requests an image acquisition and waits for the result
of the image processing.
36 If multiple workpieces have been detected, this loop executes
instructions for all workpieces.
40 Here the system checks whether the image processing task
was executed successfully.
44 If there are different types of workpiece, this instruction can be
used to execute individual instructions for the specific work-
piece type.
46 Object attributes from the tool block can be polled in this fold.
The fold contains an example for every data type.
Note: In order to be able to use the object attributes, they must
first be configured in the tool block.
50 If a workpiece has been detected successfully, the base is off-
set with this instruction.
55 Program here the instructions that are to be executed for the
detected workpieces. The workpiece base (line 18) must be
used for this.
60 Error treatment instructions can be programmed here.
69 In this fold, the memory is deleted.
70 Communication to the image processing system is disconnected in this fold.
Program ...
10 ;Define the verification task name here
11 TaskName[] = "Unknown"
12 ;---------------------------------------------------
13 ;Define the config file name here
14 ; -> Usually you don't have to change it
15 ConfigFile[] = "VisionTechConfig"
16
17 PTP HOME Vel=100 % DEFAULT
18 Open the communication
19
20 Trigger the task
21
22 ;------------------------------------------------
23 ;The results of the toolblock are checked here
24 $FLAG[VTNoSuccessFlagNo] = NOT VT_CheckVerifyResult(VerifyResult)
25
26 IF NOT $FLAG[VTNoSuccessFlagNo] THEN
27 IF NOT VerifyResult.IsInLimit THEN
28 ;---------------------------------------------
29 ;Do verification result not in range handling
30
31
32 ELSE
33 ;---------------------------------------------
34 ;Verification successful
35
36
37 ENDIF
38 ELSE
39 ;---------------------------------------------
40 ;Do error handling here
41
42
43 $FLAG[VTNoSuccessFlagNo] = FALSE
44 ENDIF
45
46 Clear buffer
47 Close the communication
48 PTP HOME Vel=100 % DEFAULT
49
50 END
Line Description
20 This fold requests an image acquisition and waits for the result
of the image processing.
24 Here the system checks whether the image processing task
was executed successfully.
30 If the reference feature was successfully detected during the image processing task and the result is not inside the configured limits, error treatment instructions can be programmed here.
35 Program here the instructions to be executed if the result is
inside the configured limits.
41 Error treatment instructions can be programmed here.
46 In this fold, the memory is deleted.
47 Communication to the image processing system is disconnected in this fold.
It must be ensured that the appropriate configuration file for the application is used:
In the case of 31 workpieces or fewer, it is advisable to use the configuration file “VisionTechConfig” in all subprograms. The configuration file “VisionTechConfigMaxParts” can also be used, but initialization of the EKI channel (VT_INIT) can take longer with this file.
In the case of more than 31 workpieces, the configuration file “VisionTechConfigMaxParts” must be used in all subprograms. Otherwise, no result is transferred.
Description The subprogram VT_INIT initializes the EKI channel. The configuration file is
transferred. VT_INIT is executed in the advance run.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Example VT_INIT("VisionTechConfig")
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Example VT_OPENCONNECTION("VisionTechConfig")
For this subprogram, the same base and the same tool data must be
used as for model generation. The result is calculated relative to the
base for model generation.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Image processing task Name of the image processing task to be executed
Type: CHAR (IN)
For this subprogram, the same base and the same tool data must be
used as for model generation. The result is calculated relative to the
base for model generation.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Image processing task Name of the image processing task to be executed
Type: CHAR (IN)
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Image processing task Name of the image processing task to be executed
Type: CHAR (IN)
Reference base Base frame relative to which the result (= deviation in this base) is to be output
Type: FRAME (IN)
Explanation of the syntax
Element Description
RET Correction frame that can be offset against the component base
Type: FRAME
Image processing result Image processing result from which the correction frame is extracted
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Return value
Type: BOOL
TRUE: The image processing task has been executed successfully.
FALSE: An error occurred during execution of the image processing task.
Image processing result The image processing result that is to be checked
Type: VTRESULT (IN)
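Grounded in the example programs later in this document, which pass a single entry of the Results[] array to VT_CheckResult, a minimal usage sketch (this assumes the table above describes VT_CheckResult; the flag number VTNoSuccessFlagNo and the Results[] array are taken from those examples):

```
;Check whether the image processing task was executed successfully
$FLAG[VTNoSuccessFlagNo] = NOT VT_CheckResult(Results[ObjectCounter])

IF NOT $FLAG[VTNoSuccessFlagNo] THEN
  ;Result is valid - continue with part handling
ELSE
  ;Do error handling here
  $FLAG[VTNoSuccessFlagNo] = FALSE
ENDIF
```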
The flag for which the program is to wait is the same flag that is set to
TRUE in the subprogram VT_GETTASKRESULTS.
Syntax VT_WAITFORRESULT("Flag")
Explanation of the syntax
Element Description
Flag Number of the flag for which the program is to wait
Type: INT (IN)
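A hedged call example following the syntax above (the flag number 998 is purely illustrative; use the flag that VT_GETTASKRESULTS sets to TRUE in your installation):

```
;Wait until the image processing result has been received
VT_WAITFORRESULT(998)
```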
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Example VT_CLOSECONNECTION("VisionTechConfig")
Description The subprogram VT_CLEAR ends the EKI channel. The configuration file is
transferred. VT_CLEAR is executed in the advance run.
Ensure that no data are called by a connection that has already been
terminated by VT_CLEAR.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Example VT_CLEAR("VisionTechConfig")
1 Base correction
2 Offset of point P1 due to the base correction
3 Radius (limit value)
Explanation of the syntax
Element Description
RET Return value
Type: BOOL
TRUE: The target robot pose lies within the defined
limit range.
FALSE: The limit value is exceeded.
Robot pose The robot pose to be checked
Type: FRAME (IN)
Element Description
Frame Frame returned by the function VT_GETCORRECTIONFRAME
Type: FRAME (IN)
Offset Maximum offset between the original robot pose and
the robot pose in the new workpiece base in mm
Type: REAL (IN)
Description The subprogram VT_CHECKPOSE checks whether user-defined points on the robot lie within a user-defined area. The points are defined relative to the robot flange. Up to 10 points can be defined.
(>>> 13.13.1 "Defining the points on the robot" Page 118)
It must be possible for all defined points to be addressed by the robot in the
NULLFRAME base and with the NULLFRAME tool. If a point is located outside
of the robot’s workspace, the robot stops.
Fig. 13-3: Example: Defined points within and outside of the defined area
Syntax RET = VT_CHECKPOSE("Base of the area", "Limit value X", "Limit value Y", "Limit
value Z", "Base of the gripping point", "Gripping point", "Tool")
Explanation of the syntax
Element Description
RET Return value
Type: BOOL
TRUE: All points defined by the user lie within the
defined limit values.
FALSE: The limit value of at least one point has
been exceeded.
Base of the area Number of the base within which the point to be
checked must lie
Type: INT (IN)
Limit value X Maximum limit value of the base in X direction
Type: INT (IN)
Limit value Y Maximum limit value of the base in Y direction
Type: INT (IN)
Limit value Z Maximum limit value of the base in Z direction
Type: INT (IN)
Base of the gripping point Number of the base in which the component must be gripped
Type: INT (IN)
Gripping point Point at which the component is gripped
Type: FRAME (IN)
Tool Number of the tool with which the component must be
gripped
Type: INT (IN)
All points (relative to the robot flange) in base 16 that lie within the following
values are valid:
X value: max. 700 mm
Y value: max. 950 mm
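The scenario above can be sketched as a call, following the syntax given for VT_CHECKPOSE (the Z limit of 500 mm, gripping base 5, gripping point XGrip and tool 1 are illustrative assumptions; the text only specifies base 16 and the X and Y limits):

```
DECL BOOL bPoseOk
;Check all defined points in base 16 against the limit values
bPoseOk = VT_CHECKPOSE(16, 700, 950, 500, 5, XGrip, 1)
```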
Procedure 1. In the main menu, select Variable > Overview > Display.
2. Mark a variable in the UserCheckPoints tab and press Edit.
3. Enter the X, Y and Z values and save with OK.
4. Repeat steps 2 to 3 for the further points used. Enter the points in an uninterrupted sequence from 1 to 10. If, for example, 5 points are desired, enter the values into the variables UserCheckPoint[1] to UserCheckPoint[5].
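The values entered via the variable overview can also be sketched as KRL aggregates, assuming the UserCheckPoint variables are frames relative to the robot flange (the coordinate values are illustrative):

```
;Two check points relative to the robot flange
UserCheckPoint[1] = {X 0.0, Y 0.0, Z 150.0, A 0.0, B 0.0, C 0.0}
UserCheckPoint[2] = {X 100.0, Y -50.0, Z 200.0, A 0.0, B 0.0, C 0.0}
```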
Description The subprogram VT_GETUSERDATA calls up optional user data from all configured tool blocks and makes them available to the KRL. VT_GETUSERDATA is executed in the advance run.
For the execution of the subprogram, 1 entry with user data from the
tool block is created for each camera.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
User data 2-dimensional array into which user data are written. The first dimension corresponds to the camera index, the second dimension contains the user data as a character string.
Type: CHAR (OUT)
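A hedged call sketch (the array dimensions are assumptions; the first dimension must cover the number of cameras and the second the longest expected user-data string):

```
DECL CHAR UserData[4,100]
;Read the optional user data of all configured tool blocks
VT_GETUSERDATA("VisionTechConfig", UserData[,])
```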
The flag that is to be set to TRUE is the same flag that is used in the
subprogram VT_WAITFORRESULT. The flag that is to be set to
FALSE is defined in the settings for configuration of Ethernet KRL
(>>> 10.5 "Configuring Ethernet KRL" Page 66).
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
RESULTS Results of image processing
Type: VTRESULT (OUT)
RESULTCOUNTER Contains the result of image processing after execution of the subprogram. If components are located, the value corresponds to the number of components. The value is always at least 1, even if no component is located.
Type: INT (OUT)
Flag True Number of the flag to be set to TRUE once the result
has been read
Type: INT (IN)
Flag False Number of the flag to be set to FALSE once the result
has been read
Type: INT (IN)
The flag that is to be set to TRUE is the same flag that is used in the
subprogram VT_WAITFORRESULT. The flag that is to be set to
FALSE is defined in the configuration file.
Explanation of the syntax
Element Description
CONFIGFILE Variable in which the name of the configuration file is
saved.
Type: CHAR (OUT)
RESULTS Results of image processing
Type: VTRESULT (OUT)
RESULTCOUNTER Contains the result of image processing after execution of the subprogram. If components are located, the value corresponds to the number of components. The value is always at least 1, even if no component is located.
Type: INT (OUT)
Flag True Number of the flag to be set to TRUE once the result
has been read
Type: INT (IN)
Flag False Number of the flag to be set to FALSE once the result
has been read
Type: INT (IN)
Description The subprogram VT_DIRECT calculates the status of axis 5 at the end (destination) point in such a way that axes 4 and 6 are moved as little as possible. The program requires the coordinates X, Y, Z, A, B and C of the start and end points as well as the status of the start point.
Start and end points can be taught with different tools and BASE coordinate
systems. In order for these to be calculated for the axis angles, the tool and
base must be transferred as a parameter for each start and end point.
If the status for the basic area or for axes 2 and 3 changes from the
start point to the end point, the program can no longer be used.
Syntax VT_DIRECT("Start point", "End point", "Base of the start point", "Base of the end
point", "Tool of the start point", "Tool of the end point")
Explanation of the syntax
Element Description
Start point Type: E6POS (IN)
End point Type: E6POS (OUT)
Base of the start point Type: FRAME (IN)
Base of the end point Type: FRAME (IN)
Tool of the start point Type: FRAME (IN)
Tool of the end point Type: FRAME (IN)
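A hedged call sketch following the parameter order of the syntax line (the point names XP1 and XP2 and the base and tool indices are illustrative assumptions):

```
;Recalculate the Status of the end point so that axes 4 and 6 move as little as possible
VT_DIRECT(XP1, XP2, BASE_DATA[3], BASE_DATA[2], TOOL_DATA[1], TOOL_DATA[1])
PTP XP2
```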
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Image processing task Name of the image processing task to be executed
Type: CHAR (IN)
The flag that is to be set to TRUE is the same flag that is used in the
subprogram VT_WAITFORRESULT. The flag that is to be set to
FALSE is defined in the configuration file.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Flag True Number of the flag input to be set to TRUE once the
program has been executed
Type: INT (IN)
Flag False Number of the flag input to be set to FALSE once the
program has been executed
Type: INT (IN)
The flag that is to be set to TRUE is the same flag that is used in the
subprogram VT_WAITFORRESULT. The flag that is to be set to
FALSE is defined in the configuration file.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
RESULT Result of the calibration verification
Type: VTVERIFYRESULT (OUT)
Flag True Number of the flag to be set to TRUE once the result
has been read
Type: INT (IN)
Flag False Number of the flag to be set to FALSE once the result
has been read
Type: INT (IN)
The subprogram reads in the result of the calibration verification and creates a result structure. The result is written to the transfer parameter RESULT. The flags are then set to TRUE and FALSE. The subprogram is executed in the advance run.
The flag that is to be set to TRUE is the same flag that is used in the
subprogram VT_WAITFORRESULT. The flag that is to be set to
FALSE is defined in the configuration file.
Explanation of the syntax
Element Description
CONFIGFILE Variable in which the name of the configuration file is
saved.
Type: CHAR (OUT)
RESULT Result of the calibration verification
Type: VTVERIFYRESULT (OUT)
Flag True Number of the flag to be set to TRUE once the result
has been read
Type: INT (IN)
Flag False Number of the flag to be set to FALSE once the result
has been read
Type: INT (IN)
Description The subprogram VT_CLEARBUFFER deletes data which have been received
but not yet retrieved from the memory. The configuration file is transferred.
VT_CLEARBUFFER is executed in the advance run.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Example VT_CLEARBUFFER("VisionTechConfig")
Description The subprogram VT_SETEXPOSURE enables modification via the KRL program of the exposure time configured in the task. The new exposure time is applied for all cameras configured in the task. VT_SETEXPOSURE is executed in the advance run.
The exposure time must be within the limits defined in the file VisionTechUser.DAT (variables VTMinExposureValue and VTMaxExposureValue). Otherwise, the exposure time configured in the task will be used. By default, the range of values is 0 to 200 ms.
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Exposure time Desired exposure time in ms
Type: REAL (IN)
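A hedged call example (the value 50.0 ms is illustrative and must lie within the limits defined in VisionTechUser.DAT):

```
;Set the exposure time of all cameras configured in the task to 50 ms
VT_SETEXPOSURE("VisionTechConfig", 50.0)
```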
Explanation of the syntax
Element Description
RET Return value
Type: BOOL
TRUE: The result structure contains the object attribute being searched for
FALSE: The object attribute being searched for was not found in the result structure
Object attribute Name of the object attribute to be searched for
Type: CHAR (IN)
Result structure Result structure in which the object attribute is to be
searched for
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Return value in the form of a 32-bit integer with sign. The number corresponds to the value of the object attribute.
Type: INT
-2^31 … 2^31-1
Note: If the object attribute was not found in the result
structure, the value "0" is returned.
Object attribute Name of the object attribute whose value is to be converted
Type: CHAR (IN)
Result structure Result structure in which the object attribute is to be
searched for
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Return value in the form of a floating-point number with sign. The number corresponds to the value of the object attribute.
Type: REAL
1.1*10^-38 … 3.4*10^38
Note: If the object attribute was not found in the result structure, the value "0.0" is returned.
Object attribute Name of the object attribute whose value is to be converted
Type: CHAR (IN)
Result structure Result structure in which the object attribute is to be
searched for
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Return value in the form of a character string. This corresponds to the value of the object attribute.
Type: CHAR
1 … 100
Note: If the object attribute was not found in the result structure, a character string of length 1 with the entry '#' is returned.
Object attribute Name of the object attribute whose value is to be converted
Type: CHAR (IN)
Result structure Result structure in which the object attribute is to be
searched for
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Value of object attribute
Type: BOOL
Object attribute Name of the object attribute whose value is to be converted
Type: CHAR (IN)
Result structure Result structure in which the object attribute is to be
searched for
Type: VTRESULT (IN)
Explanation of the syntax
Element Description
RET Return value
Type: BOOL
TRUE: A valid value has been assigned to the object attribute.
FALSE: No valid value has been assigned to the object attribute.
Value Value of the object attribute to be checked for validity
Type: CHAR (IN)
b_isValid = VT_IsAttributeValSet(Results[i].Attribute1.Value[])
Description The subprogram VT_StringToBool converts a Char array into a Bool value. No
distinction is made between uppercase and lowercase letters.
VT_StringToBool is executed in the advance run.
Explanation of the syntax
Element Description
RET Converted value
Type: BOOL
Char array Char array that is to be converted
Type: CHAR (IN)
b_isValid = VT_StringToBool("True")
Description The subprogram VT_StringToInt converts a Char array into an integer value.
VT_StringToInt is executed in the advance run.
Explanation of the syntax
Element Description
RET Converted value
Type: INT
Char array Char array that is to be converted
Type: CHAR (IN)
i_isNumber = VT_StringToInt("123456")
Description The subprogram VT_StringToReal converts a Char array into a real value.
VT_StringToReal is executed in the advance run.
Explanation of the syntax
Element Description
RET Converted value
Type: REAL
Char array Char array that is to be converted
Type: CHAR (IN)
1.17549*10^-38 … 3.40282*10^38
Note: A decimal point must be used as the decimal separator. The value can also be written exponentially.
r_isNumber = VT_StringToReal("123456.789")
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Input Name or number of the input to be set on the tool block
Type: ENUM VTINPUTS (#Input1 … #Input5)
Value Value of the input to be transferred to the tool block
Type: BOOL (IN)
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Input Name or number of the input to be set on the tool block
Type: ENUM VTINPUTS (#Input1 … #Input5)
Value Value of the input to be transferred to the tool block
-2147483648 … 2147483647
Type: INT (IN)
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Input Name or number of the input to be set on the tool block
Type: ENUM VTINPUTS (#Input1 … #Input5)
Value Value of the input to be transferred to the tool block
Type: REAL (IN)
Explanation of the syntax
Element Description
Configuration file Name of the configuration file
Type: CHAR (IN)
Input Name or number of the input to be set on the tool block
Type: ENUM VTINPUTS (#Input1 … #Input5)
Value Value of the input to be transferred to the tool block
Type: CHAR (IN)
1 … 100
Note: Special characters must not be used.
14 Example programs
Program 1 INI
2
3 PTP HOME Vel= 100% DEFAULT
4
5 PTP P1 Vel=100% PDAT1 Tool[1]: Gripper BASE[3]
6
7 VT.TRIGGER 2Dfix_relative
8
9 VT.WAITFORRESULT 2Dfix_relative
10
11 VT.SETTARGETBASE CalibrationBase=Base[1]:CalibBase
TargetBase=Base[2]:PartBase
12
13 VT.DIRECT Source: P1 Tool[1]:Gripper BASE[3] Target: P2
Tool[1]:Gripper BASE[2]:PartBase
14
15 PTP P2 Vel=100% PDAT2 Tool[1]:Gripper BASE[2]:PartBase
16
17 PTP HOME Vel= 100% DEFAULT
Description
Line Description
7 Requests execution of the task with the name
“2Dfix_relative”.
9 Waits for the result of the task “2Dfix_relative”.
11 Shifts the calibration base onto the workpiece base with the
vector of the result of the image processing task.
13 Recalculates Status and Turn if point P2 cannot be addressed
using the Status and Turn values from point P1.
15 Executes a PTP motion to the taught point. The offset base
from line 11 is used for this.
Program 1 INI
2
3 PTP HOME Vel= 100% DEFAULT
4
5 VT.TRIGGER 2Dfix_relative
6
7 VT.WAITFORRESULT 2Dfix_relative
8
9 VT.LOOPRESULTS
10 FOR ObjectCounter = 1 TO ResultCounter STEP 1
11 IF Results[ObjectCounter].Succeeded == FALSE THEN
12 $FLAG[VTNoSuccessFlagNo] = NOT VT_CheckResult
(Results[ObjectCounter])
13 ELSE
14 ;insert VT.SETTARGETBASE inline form here
15 VT.SETTARGETBASE CalibrationBase=Base[1]:CalibBase
TargetBase=Base[2]:PartBase
16
17 ;teach robot movement in the target base
Description
Line Description
5 Requests execution of the task with the name
“2Dfix_relative”.
7 Waits for the results of the task “2Dfix_relative”.
9 Executes the instructions contained in the fold (lines
10 … 22) for all results of the image processing task.
12 Describes the flag “NoSuccessFlag”. Notification messages
are generated for this result.
15 Shifts the calibration base onto the workpiece base with the
vector of the result of the image processing task.
18 Executes a PTP motion to the taught point. The shifted base
from line 15 is used for this. The results of the individually
detected workpieces are addressed in this way.
15 Messages
The “Messages” chapter contains selected messages. It does not cover all the
messages displayed in the message window.
15.2.1 VTH35004
Possible cause(s) Cause: The fiducial mark is not detected in all the images
(>>> Page 133)
Solution: Align the camera with the fiducial mark (>>> Page 134)
Description The fiducial mark is the cross at the center of the calibration plate; it must always be visible during calibration.
Description The camera must be aligned in such a way that the fiducial mark is visible.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.2 VTH35206
Possible cause(s) Cause: Task file has an invalid format (>>> Page 134)
Solution: Use valid task file (>>> Page 135)
Description The task file has an invalid format. As such, it cannot be loaded. The task file must meet the following requirements:
The task file must be of file type .task (Vision Tech Task).
The task file must not have been created using an older version of VisionTech. The task files are not compatible for use between different main versions of VisionTech.
15.2.3 VTH35901
Possible cause(s) Cause: Image acquisition not possible (>>> Page 135)
Solution: Connect camera (>>> Page 135)
Checking instructions Check whether the camera has been connected in accordance with the documentation.
Checking instructions 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
Checking instructions 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.4 VTH35902
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 138)
Solution: End the current process first (>>> Page 139)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.5 VTH35904
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 144)
Solution: Align the camera via the smartHMI (>>> Page 144)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.6 VTH35905
Possible cause(s) Cause: Camera not fully or correctly calibrated (>>> Page 148)
Solution: Calibrate camera (stationary) (>>> Page 148)
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera
as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed.
For an adequate degree of accuracy, the result should be < 1 mm.
8. Select an existing calibration plane in the Calibration plane box or choose
Create new calibration plane... to create a new calibration plane:
a. Enter a name for the calibration plane in the Name box.
b. Optional: Enter a description of the calibration plane in the Description box.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
15.2.7 VTH35907
Possible cause(s) Cause: Task file has an invalid format (>>> Page 151)
Solution: Use valid task file (>>> Page 151)
Description The task file has an invalid format. As such, it cannot be loaded. The task file must meet the following requirements:
The task file must be of file type .task (Vision Tech Task).
The task file must not have been created using an older version of VisionTech. The task files are not compatible for use between different main versions of VisionTech.
15.2.8 VTH35912
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 152)
Solution: Align the camera via the smartHMI (>>> Page 152)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions: Check whether the lighting is too bright or too dim.
15.2.9 VTH35913
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 156)
Solution: Align the camera via the smartHMI (>>> Page 156)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions: Check whether the lighting is too bright or too dim.
15.2.10 VTH35918
Possible cause(s) Cause: Task file has an invalid format (>>> Page 160)
Solution: Use valid task file (>>> Page 160)
Description The task file has an invalid format. As such, it cannot be loaded. The task file
must meet the following requirements:
The task file must be of file type .task (VisionTech task).
The task file must not have been created using an older version of VisionTech. Task files are not compatible between different main versions of VisionTech.
15.2.11 VTH35920
Possible cause(s) Cause: Cameras aligned with different components (>>> Page 161)
Solution: Align the camera via the smartHMI (>>> Page 161)
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Checking instructions: 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
All calibration plates from KUKA are available for selection. Each calibration plate has a different size, which is marked on the plate. The calibration plate used can be determined on the basis of its size.
3. Select the camera by tapping on the freeze-frame image.
4. Press Calibration Wizard.
5. Press Take picture. The fiducial mark (cross at the center of the calibration plate) must be visible.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera
as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed.
For an adequate degree of accuracy, the result should be < 1 mm.
8. Select an existing calibration plane in the Calibration plane box or choose
Create new calibration plane... to create a new calibration plane:
a. Enter a name for the calibration plane in the Name box.
b. Optional: Enter a description of the calibration plane in the Description box.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
15.2.12 VTH35923
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 165)
Solution: End the current process first (>>> Page 166)
Checking instructions: Check whether the camera is currently sending or receiving data (LEDs on the camera).
Checking instructions: Check whether the camera is currently sending or receiving data (LEDs on the camera).
Description Restart the camera. To do so, disconnect the camera from the power supply and then reconnect it.
Camera connections (figure legend): 1 LEDs; 2 Data/PoE interface; 3 Process interface / power supply.
Checking instructions: Check whether the camera is currently sending or receiving data (LEDs on the camera).
Checking instructions: Check whether the camera has been connected in accordance with the documentation.
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
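For reference, the interface assignments above can be collected into a simple lookup table. This is only a sketch; the controller labels used as keys are informal shorthand, not official designations.

```python
# Camera-to-interface assignments, as listed in the section above.
CAMERA_PORTS = {
    "KR C4 (X64)": {"B1": "X64.1", "B2": "X64.2", "B3": "X64.3"},
    "KR C4 (cable inlet)": {"B1": "A13.1", "B2": "A13.2", "B3": "A13.3"},
    "KR C4 compact": {"B1": "PoE1", "B2": "PoE2", "B3": "PoE3"},
    "KUKA IPC": {"B1": "CH1", "B2": "CH2", "B3": "CH3", "B4": "CH4"},
}

def interface_for(controller, camera):
    """Look up which interface a given camera must be connected to."""
    return CAMERA_PORTS[controller][camera]
```

Note that only the KUKA IPC provides a fourth camera connection (B4).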
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions: 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.13 VTH35935
Possible cause(s) Cause: The robot is in AUT or AUT EXT mode. (>>> Page 171)
Solution: Change operating mode (>>> Page 171)
Description Some functions are deactivated in AUT and AUT EXT mode. These include:
Live image display
Manual image acquisition
These functions are only available in modes T1 and T2.
Precondition: If the mode selector switch is the variant with a key, the key is inserted in the switch.
Procedure 1. Turn the mode selector switch on the smartPAD. The connection manager
is displayed.
2. Select the operating mode.
3. Return the mode selector switch to its original position.
The selected operating mode is displayed in the status bar of the smartPAD.
Operating mode | Use | Velocities
T1 | For test operation, programming and teaching | Program verification: Programmed velocity, maximum 250 mm/s. Jog mode: Jog velocity, maximum 250 mm/s.
T2 | For test operation | Program verification: Programmed velocity. Jog mode: Not possible.
AUT | For industrial robots without higher-level controllers | Program operation: Programmed velocity. Jog mode: Not possible.
AUT EXT | For industrial robots with higher-level controllers, e.g. PLC | Program operation: Programmed velocity. Jog mode: Not possible.
15.2.14 VTH35936
Possible cause(s) Cause: The robot is in AUT or AUT EXT mode. (>>> Page 172)
Solution: Change operating mode (>>> Page 173)
Description Some functions are deactivated in AUT and AUT EXT mode. These include:
Live image display
Manual image acquisition
These functions are only available in modes T1 and T2.
Procedure 1. Turn the mode selector switch on the smartPAD. The connection manager
is displayed.
2. Select the operating mode.
3. Return the mode selector switch to its original position.
The selected operating mode is displayed in the status bar of the smartPAD.
Operating mode | Use | Velocities
T1 | For test operation, programming and teaching | Program verification: Programmed velocity, maximum 250 mm/s. Jog mode: Jog velocity, maximum 250 mm/s.
T2 | For test operation | Program verification: Programmed velocity. Jog mode: Not possible.
AUT | For industrial robots without higher-level controllers | Program operation: Programmed velocity. Jog mode: Not possible.
AUT EXT | For industrial robots with higher-level controllers, e.g. PLC | Program operation: Programmed velocity. Jog mode: Not possible.
15.2.15 VTH35937
Possible cause(s) Cause: The robot is in AUT or AUT EXT mode. (>>> Page 173)
Solution: Change operating mode (>>> Page 174)
Description Some functions are deactivated in AUT and AUT EXT mode. These include:
Live image display
Procedure 1. Turn the mode selector switch on the smartPAD. The connection manager
is displayed.
2. Select the operating mode.
3. Return the mode selector switch to its original position.
The selected operating mode is displayed in the status bar of the smartPAD.
Operating mode | Use | Velocities
T1 | For test operation, programming and teaching | Program verification: Programmed velocity, maximum 250 mm/s. Jog mode: Jog velocity, maximum 250 mm/s.
T2 | For test operation | Program verification: Programmed velocity. Jog mode: Not possible.
AUT | For industrial robots without higher-level controllers | Program operation: Programmed velocity. Jog mode: Not possible.
AUT EXT | For industrial robots with higher-level controllers, e.g. PLC | Program operation: Programmed velocity. Jog mode: Not possible.
15.2.16 VTH35940
Possible cause(s) Cause: An error occurred during creation of the result graphics
(>>> Page 175)
Solution: Carry out cold restart (>>> Page 175)
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
15.2.17 VTH35941
Possible cause(s) Cause: An error occurred during creation of the result graphics
(>>> Page 176)
Solution: Carry out cold restart (>>> Page 176)
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
15.2.18 VTH35942
Possible cause(s) Cause: An error occurred during creation of the result graphics
(>>> Page 177)
Solution: Carry out cold restart (>>> Page 177)
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
15.2.19 VTH35943
Possible cause(s) Cause: An error occurred during creation of the result graphics
(>>> Page 178)
Solution: Carry out cold restart (>>> Page 178)
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
Description An error occurred during creation of the result graphics. The task results can
therefore not be distributed to the clients.
Checking instructions: Check in WorkVisual that correct and complete results are generated by the tool block.
Check whether a current tool block V2 is being used.
Check whether result graphics are generated in the tool block.
15.2.20 VTH35944
Possible cause(s) Cause: Network connection has been terminated (>>> Page 179)
Solution: Restore network connection (>>> Page 179)
Description The network connection has been disconnected or the client has been
switched off.
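Before restoring the connection, a basic TCP reachability probe from a PC in the same network can help narrow down whether the client is off or merely unreachable. This sketch is generic: the host and port are placeholders for your own setup, not part of any VisionTech API.

```python
import socket

def is_reachable(host, port, timeout=2.0):
    """Attempt a TCP connection; True if the peer accepts,
    False if the host is down, unreachable, or the port is closed."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

A False result does not distinguish between a switched-off client and a broken cable; it only confirms that the connection must be restored.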
15.2.21 VTH35945
Possible cause(s) Cause: Network connection has been terminated (>>> Page 179)
Solution: Restore network connection (>>> Page 179)
Description The network connection has been disconnected or the client has been
switched off.
15.2.22 VTH36001
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 179)
Solution: Carry out cold restart (>>> Page 180)
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
15.2.23 VTH36002
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 180)
Solution: Carry out cold restart (>>> Page 180)
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
15.2.24 VTH36003
Possible cause(s) Cause: A file required for execution of the option package is missing
(>>> Page 181)
Solution: Carry out cold restart (>>> Page 181)
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
Preconditions: No program is selected. Network connection between the PC and the robot controller.
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.25 VTH36004
Possible cause(s) Cause: A file required for execution of the option package is missing
(>>> Page 183)
Solution: Carry out cold restart (>>> Page 183)
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
2. Check whether the configuration in the project matches the real system
configuration. If not, correct the configuration.
3. Transfer the project back from WorkVisual to the robot controller and activate it.
Description A file that is required for execution of the option package was not found in the
file system. The file may have been manually deleted, renamed or moved in
the file system.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.26 VTH36005
Possible cause(s) Cause: Opening of the specified file failed (>>> Page 185)
Solution: Carry out cold restart (>>> Page 185)
Description Opening of the specified file failed. This may be a temporary problem.
Description Opening of the specified file failed. This may be a temporary problem.
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
Description Opening of the specified file failed. This may be a temporary problem.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.27 VTH36006
Possible cause(s) Cause: The specified KRL variable could not be found (>>> Page 187)
Solution: Carry out cold restart (>>> Page 187)
Cause: The specified KRL variable could not be found (>>> Page 187)
Solution: Check the WorkVisual project and transfer it again
(>>> Page 187)
Cause: The specified KRL variable could not be found (>>> Page 188)
Solution: Reinstall KUKA.VisionTech (>>> Page 188)
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
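As a sketch, the VARSTATE() check can be wired into a KRL routine before the variable is accessed. The variable name MY_POS and the surrounding routine are hypothetical; only the VARSTATE() call and its three return values come from the description above.

```
DEF CHECK_VAR()
  ; Query the state of a KRL variable before using it.
  ; MY_POS is a hypothetical variable from a data list (DAT file).
  DECL VAR_STATE STATE
  DECL BOOL IS_READY
  IS_READY = FALSE
  STATE = VARSTATE("MY_POS")
  SWITCH STATE
    CASE #INITIALIZED
      IS_READY = TRUE   ; declared and initialized: safe to use
    CASE #DECLARED
      IS_READY = FALSE  ; declared, but not initialized in the data list
    CASE #UNKNOWN
      IS_READY = FALSE  ; not declared: add the declaration to the DAT file
  ENDSWITCH
END
```

If the result is #DECLARED or #UNKNOWN, the declaration or initialization in the DAT file must be corrected before the variable is accessed.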
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.28 VTH36007
Possible cause(s) Cause: The specified KRL variable could not be found (>>> Page 189)
Solution: Carry out cold restart (>>> Page 189)
Cause: The specified KRL variable could not be found (>>> Page 189)
Solution: Check the WorkVisual project and transfer it again
(>>> Page 189)
Cause: The specified KRL variable could not be found (>>> Page 190)
Solution: Reinstall KUKA.VisionTech (>>> Page 190)
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
3. Transfer the project back from WorkVisual to the robot controller and activate it.
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions: Check the state of a variable with the VARSTATE() function. The VARSTATE() function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.29 VTH36010
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions: 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.30 VTH36012
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.31 VTH36014
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.32 VTH36016
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 195)
Solution: End the current process first (>>> Page 196)
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.33 VTH36019
Possible cause(s) Cause: Changes to the sensor configuration not permitted during cali-
bration (>>> Page 201)
Solution: End the calibration process (>>> Page 201)
Description Changes to the sensor configuration are not permitted during an ongoing calibration process. As a result, the sensor overview cannot be opened during the calibration process.
The calibration process takes approx. 5 to 10 minutes.
15.2.34 VTH36020
Possible cause(s) Cause: Changes to the sensor configuration not permitted during cali-
bration (>>> Page 202)
Solution: End the calibration process (>>> Page 202)
Description Changes to the sensor configuration are not permitted during an ongoing calibration process. As a result, the sensor overview cannot be opened during the calibration process.
The calibration process takes approx. 5 to 10 minutes.
15.2.35 VTH36022
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 202)
Solution: End the current process first (>>> Page 203)
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.36 VTH36023
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 208)
Solution: End the current process first (>>> Page 209)
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.37 VTH36024
Possible cause(s) Cause: Task file has an invalid format (>>> Page 214)
Solution: Use valid task file (>>> Page 214)
Description The task file has an invalid format. As such, it cannot be loaded. The task file
must meet the following requirements:
The task file must be of file type .task (VisionTech Task).
The task file must not have been created using an older version of VisionTech. The task files are not compatible for use between different main versions of VisionTech.
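The two format requirements above can be sketched as a small validation helper. This is an illustrative assumption, not part of VisionTech: the manual does not document the internal .task file layout, so the file's major version is passed in as a parameter rather than parsed from the file.

```python
from pathlib import Path

def check_task_file(path: str, file_major: int, current_major: int) -> None:
    """Reject task files that violate the two requirements above.

    file_major would have to be read from the task file itself; since the
    .task format is not documented here, it is supplied by the caller.
    Raises ValueError if the file cannot be loaded.
    """
    p = Path(path)
    # Requirement 1: the file must be of type .task
    if p.suffix.lower() != ".task":
        raise ValueError(f"{p.name}: not a VisionTech task file (.task)")
    # Requirement 2: task files are not compatible across main versions
    if file_major != current_major:
        raise ValueError(
            f"{p.name}: created with VisionTech main version {file_major}, "
            f"but this installation is main version {current_major}")
```

A file that passes both checks simply returns; any violation raises with a reason.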
15.2.38 VTH36027
Possible cause(s) Cause: Tool and/or base not assigned (>>> Page 214)
Solution: Select tool and base (>>> Page 215)
Description The base and tool have not yet been set after a reboot of the robot controller.
A program was started without selecting the base and tool.
The procedure for checking whether the tool and/or base have been assigned is as follows:
Checking instructions
1. Touch the Tool/base status indicator. The Cur. Tool/Base window is opened.
2. Check whether Unknown is shown next to the tool and/or base and a
question mark instead of the number.
Description One tool (TOOL coordinate system) and one base (BASE coordinate system)
must be selected for Cartesian jogging.
Procedure 1. Touch the status indicator Cur. tool/base. The Cur. tool/base window
opens.
2. Select the desired tool and base.
3. The window closes and the selection is applied.
15.2.39 VTH36031
Description The WorkVisual project must be checked and then transferred to the robot
controller again.
15.2.40 VTH36033
Description A camera model stored in the WorkVisual project does not match the camera
model that is actually connected. This can only occur if the cameras have been
configured offline, i.e. if the catalog element for the wrong camera model has
been dragged into the project manually.
The procedure for checking which camera models are configured is as
follows:
Checking instructions
1. Select the Hardware tab in the Project structure window.
2. Check which camera is inserted under the robot controller.
3. If the camera has been renamed, double-click on the camera.
4. On the Camera settings tab, check the Product entry.
15.2.41 VTH36037
Description There is a task in the task list. When attempting to delete the task, the error
message appears.
Description The task list must be updated in order to see if the task is still available.
15.2.42 VTH36038
Possible cause(s) Cause: The SensorCalibration directory or the calibration result within
is write-protected (>>> Page 218)
Solution: Remove write protection of directory (>>> Page 218)
Checking instructions
1. In the main menu, select Start-up > Service > Minimize HMI.
The smartHMI is minimized and the Windows interface is displayed.
2. Navigate to the directory.
3. Right-click and select Settings.
4. Check whether the check box Write-protected is deactivated.
Checking instructions
1. In the Navigator, navigate to the directory in which the directory should be located.
2. Check whether the directory is present.
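The two checks above (directory present, not write-protected) can be approximated programmatically. This is a hedged sketch using only the Python standard library, not a VisionTech API; note that on Windows, `os.access` reflects the read-only attribute the Settings dialog toggles only approximately:

```python
import os

def directory_is_usable(path: str) -> bool:
    """Mirror the manual checks: the directory exists and is writable.

    Illustrative helper only; the authoritative check remains the
    Write-protected check box described in the checking instructions.
    """
    return os.path.isdir(path) and os.access(path, os.W_OK)
```

If this returns False for the SensorCalibration directory, remove the write protection as described in the solution.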
15.2.43 VTH36039
Possible cause(s) Cause: Image acquisition not possible (>>> Page 219)
Solution: Connect camera (>>> Page 219)
Checking instructions
Check whether the camera has been connected in accordance with the documentation.
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.44 VTH36040
Description The KUKA.VisionTech option package is supplied with two licenses. One of the licenses has a smaller range of functions. This license is required for installation on a service laptop. The second license has the full range of functions. This license is required for installation on the controller. Installing the incorrect license on the controller results in an error message.
The procedure for checking whether the correct license is installed is as
follows:
The license key entered must match the license key under Vision license
key KRC.
Description The incorrect license must be uninstalled and the correct license installed.
Precondition If the license on the robot controller or KUKA IPC is to be uninstalled: The
connection to the robot controller or KUKA IPC has been established.
7. Click on ..., select the response file received and confirm with Open.
15.2.45 VTH36041
Possible cause(s) Cause: Vision server is not correctly installed (>>> Page 223)
Solution: Reinstall KUKA.VisionTech (>>> Page 223)
Description It was not possible to install the vision server correctly. Errors occurred during
the installation routine.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech op-
tion package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed
on the smartHMI. Answer this with Yes. The option package is installed
and the robot controller carries out a reboot.
15.2.46 VTH36042
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 224)
Solution: End the current process first (>>> Page 225)
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions
Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions
Check whether the camera has been connected in accordance with the documentation.
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.47 VTH36044
Possible cause(s) Cause: Tool block does not support the inputs or outputs used
(>>> Page 230)
Solution: Correcting the programming in the SRC file (>>> Page 230)
Cause: Tool block does not support the inputs or outputs used
(>>> Page 230)
Solution: Create new tool block in WorkVisual (>>> Page 230)
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool
block expects.
Checking instructions
Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool
block expects.
Checking instructions
Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
Select the menu sequence Editors > Options packages > VisionTech > Tool block editor.
4. Click on the button. The image(s) are inserted next to the inputs under
InputImage.
10. Click on the button and choose whether the tool should be saved completely or without images or results.
11. Select a directory and click on Save.
15.2.48 VTH36045
Possible cause(s) Cause: The number of captured pictures is not sufficient for performing
a calibration (>>> Page 231)
Solution: Increase number of images taken (>>> Page 232)
Cause: The number of captured pictures is not sufficient for performing a calibration
that have not yet been calibrated, the freeze-frame images have a red
frame.
2. Select the calibration plate used as the calibration body.
3. Select the cameras that are to be calibrated by clicking on the freeze-
frame images.
4. Click on Calibration Wizard.
The number of images already taken is displayed using the Calibration
Wizard button.
5. Check that at least 6 images have been taken.
Description If fewer than 6 images were taken, calibration is not possible. Further images
must be taken in order to complete the calibration. A maximum of 9 images
can be taken.
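The 6-to-9 image rule stated above can be sketched as a small check. The function below is illustrative only; the Calibration Wizard enforces these limits itself:

```python
# Illustrative sketch of the calibration image count rule described above:
# at least 6 images are required, and at most 9 can be taken.
MIN_IMAGES, MAX_IMAGES = 6, 9

def calibration_image_status(taken: int) -> str:
    """Report whether enough calibration images have been taken."""
    if taken < MIN_IMAGES:
        return f"take at least {MIN_IMAGES - taken} more image(s)"
    if taken > MAX_IMAGES:
        return "too many images"  # cannot occur via the wizard, guard only
    return "ok"
```

For example, with 4 images taken the check reports that at least 2 more are needed.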
15.2.49 VTH36046
Possible cause(s) Cause: The SensorCalibration directory or the calibration result within
is write-protected (>>> Page 233)
Solution: Remove write protection of directory (>>> Page 233)
Checking instructions
1. In the main menu, select Start-up > Service > Minimize HMI.
The smartHMI is minimized and the Windows interface is displayed.
2. Navigate to the directory.
3. Right-click and select Settings.
4. Check whether the check box Write-protected is deactivated.
Checking instructions
1. In the Navigator, navigate to the directory in which the directory should be located.
2. Check whether the directory is present.
15.2.50 VTH36047
Possible cause(s) Cause: File with calibration result damaged (>>> Page 234)
Solution: Delete file (>>> Page 234)
Description The file that contains the calibration result is damaged and cannot be deleted via the software interface. The file can be found in the SensorCalibration subdirectory.
15.2.51 VTH36048
Possible cause(s) Cause: The fiducial mark is not detected in all the images
(>>> Page 235)
Solution: Align the camera with the fiducial mark (>>> Page 235)
Description The fiducial mark is the cross at the center of the calibration plate; it must always be visible during calibration.
Description The camera must be aligned in such a way that the fiducial mark is visible.
Solution: Use calibration poses that lie further away from each other
Description Record at least 6 new calibration poses and ensure that they are an adequate
distance from each other.
15.2.52 VTH36051
Possible cause(s) Cause: Unable to establish connection to vision server (>>> Page 237)
Solution: Disconnect camera from power supply and reconnect
(>>> Page 237)
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions
1. In the main menu, select Start-up > Service > Minimize HMI.
The smartHMI is minimized and the Windows interface is displayed.
2. Navigate to the directory.
3. Right-click and select Settings.
4. Check whether the check box Write-protected is deactivated.
15.2.53 VTH36053
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.54 VTH36055
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 241)
Solution: Align the camera via the smartHMI (>>> Page 241)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Check whether the object can be seen in the live image.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions
Check whether the lighting is too bright or too dim.
15.2.55 VTH36056
Possible cause(s) Cause: Cameras aligned with different features (>>> Page 244)
Solution: Align the camera via the smartHMI (>>> Page 245)
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Only if the KUKA IPC is being used: The IP address of the KUKA IPC is
set in WorkVisual.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
15.2.56 VTH36058
Possible cause(s) Cause: More than 1 object is in the field of vision of the cameras
(>>> Page 246)
Solution: Only place 1 object in the field of vision of the cameras
(>>> Page 246)
Description More than 1 object is in the field of vision of the camera(s). There must be only one object in the field of vision of the camera(s).
Description Restart model generation and only place 1 object in the field of vision of the
cameras.
The Z axis of the workpiece base points in the same direction as the Z
axis of the calibration coordinate system.
15.2.57 VTH36059
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 247)
Solution: Align the camera via the smartHMI (>>> Page 248)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
15.2.58 VTH36060
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 250)
Solution: Align the camera via the smartHMI (>>> Page 250)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Checking instructions
1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Check whether the object can be seen in the live image.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
15.2.59 VTH36061
Possible cause(s) Cause: Cameras aligned with different features (>>> Page 252)
Solution: Align the camera via the smartHMI (>>> Page 252)
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.60 VTH36062
Possible cause(s) Cause: Cameras aligned with different features (>>> Page 254)
Solution: Align the camera via the smartHMI (>>> Page 254)
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.61 VTH36063
Possible cause(s) Cause: Cameras aligned with different features (>>> Page 256)
Solution: Align the camera via the smartHMI (>>> Page 256)
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.62 VTH36064
Possible cause(s) Cause: 3D model requires at least 3 point features (>>> Page 259)
Solution: Provide 3 point features for 3D model creation
(>>> Page 259)
Description Ensure that at least 3 point features exist for 3D reference model creation.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.63 VTH36065
Possible cause(s) Cause: More than 1 object is in the field of vision of the cameras
(>>> Page 261)
Solution: Only place 1 object in the field of vision of the cameras
(>>> Page 261)
Description More than 1 object is in the field of vision of the camera(s). There must be only
one object in the field of vision of the camera(s).
Description Restart model generation and only place 1 object in the field of vision of the
cameras.
15.2.64 VTH36066
Possible cause(s) Cause: More than 1 object is in the field of vision of the cameras
(>>> Page 262)
Solution: Only place 1 object in the field of vision of the cameras
(>>> Page 262)
Description More than 1 object is in the field of vision of the camera(s). There must be only
one object in the field of vision of the camera(s).
Description Restart model generation and only place 1 object in the field of vision of the
cameras.
15.2.65 VTH36069
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 263)
Solution: Align the camera via the smartHMI (>>> Page 263)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Checking instructions
1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Check whether the object can be seen in the live image.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
Description The tool block is unsuitable for the object, e.g. the permissible scaling has
been configured in such a way that no object can be detected.
4. Click on the button. The image(s) are inserted next to the inputs under
InputImage.
10. Click on the button and choose whether the tool should be saved com-
pletely or without images or results.
11. Select a directory and click on Save.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.66 VTH36077
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 266)
Solution: End the current process first (>>> Page 267)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
15.2.67 VTH36078
Possible cause(s) Cause: Camera model not compatible (>>> Page 270)
Solution: Use KUKA MXG20 or VCXG-25M camera (>>> Page 271)
Description The camera in use provides a video or image format that is not supported by
KUKA.VisionTech.
Description Use the camera model KUKA MXG20 or KUKA VCXG-25M that is intended
for use with KUKA.VisionTech.
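The compatibility condition above amounts to a membership test against the two supported camera models. A minimal sketch, assuming a hypothetical helper; VisionTech performs this check internally and the function name is illustrative only:

```python
# Supported camera models per the message text above. The is_supported
# helper is a hypothetical sketch, not part of the VisionTech API.
SUPPORTED_MODELS = {"KUKA MXG20", "KUKA VCXG-25M"}

def is_supported(model_name: str) -> bool:
    """Return True if the camera model is supported by KUKA.VisionTech."""
    return model_name.strip() in SUPPORTED_MODELS
```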
15.2.68 VTH36079
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 271)
Solution: End the current process first (>>> Page 272)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Description The KUKA.VisionTech option package is supplied with two licenses. One of
the licenses has a smaller range of functions. This license is required for in-
stallation on a service laptop. The second license has the full range of func-
tions. This license is required for installation on the controller. Installing the
incorrect license on the controller results in an error message.
The procedure for checking whether the correct license is installed is as
follows:
The license key entered must match the license key under Vision license
key KRC.
Description The incorrect license must be uninstalled and the correct license installed.
Precondition If the license on the robot controller or KUKA IPC is to be uninstalled: The
connection to the robot controller or KUKA IPC has been established.
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention
must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.69 VTH36080
Possible cause(s) Cause: Driver for network card missing (>>> Page 278)
Solution: Install driver for network card (>>> Page 278)
Description Drivers that are required for GigE Vision have not been enabled for the net-
work card or are not installed.
Checking instructions Check in the Control Panel whether the correct drivers or only standard
drivers have been installed for the network card.
15.2.70 VTH36081
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 279)
Solution: End the current process first (>>> Page 279)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
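The interface assignments above can be captured as a small lookup table, e.g. when documenting or scripting a setup check. A minimal sketch; the dictionary keys and the resolve_port helper are illustrative and not part of VisionTech:

```python
# Lookup table for the camera interface assignments listed above.
# Keys are controller variants; values map the physical port to the
# camera designation (B1..B4). Names here are illustrative only.
CAMERA_PORTS = {
    "KR C4 (X64)":   {"X64.1": "B1", "X64.2": "B2", "X64.3": "B3"},
    "KR C4 (A13)":   {"A13.1": "B1", "A13.2": "B2", "A13.3": "B3"},
    "KR C4 compact": {"PoE1": "B1", "PoE2": "B2", "PoE3": "B3"},
    "KUKA IPC":      {"CH1": "B1", "CH2": "B2", "CH3": "B3", "CH4": "B4"},
}

def resolve_port(controller: str, port: str) -> str:
    """Return the camera designation (B1..B4) for a physical port."""
    try:
        return CAMERA_PORTS[controller][port]
    except KeyError:
        raise ValueError(f"Unknown controller/port: {controller}/{port}")
```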
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention
must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.71 VTH36082
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 284)
Solution: End the current process first (>>> Page 285)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions Check whether the camera has been connected in accordance with the
documentation.
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions
1. Check whether the connectors are correctly connected. Particular attention
must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.72 VTH36083
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 290)
Solution: End the current process first (>>> Page 291)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
15.2.73 VTH36084
Description There is a task in the task list. When attempting to delete the task, the error
message appears.
Description The task list must be updated in order to see if the task is still available.
15.2.74 VTH36085
Description There is a task in the task list. When attempting to delete the task, the error
message appears.
Description The task list must be updated in order to see if the task is still available.
15.2.75 VTH36086
Description There is a task in the task list. When attempting to delete the task, the error
message appears.
Description The task list must be updated in order to see if the task is still available.
15.2.76 VTH36087
Possible cause(s) Cause: The selected task is not valid (>>> Page 297)
Solution: Configure the task completely (>>> Page 298)
Description The selected task is not valid and can therefore not be executed.
The procedure for checking whether the preconditions for a valid task
are met is as follows:
Checking instructions
Check whether all cameras configured in the task are actually connected.
Check whether the correct number of camera configurations is present in
the task.
Check whether all cameras are configured with the correct mounting type
(stationary or robot-guided).
Check whether each camera has been assigned an image processing
task.
Check whether the task contains a model.
Description Configure the task completely. Take the connected cameras into
consideration.
This activity must be carried out in accordance with the procedure de-
scribed in the operating and programming instructions.
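The validity conditions listed above can be sketched as a single predicate. All field names (cameras, mounting, model, etc.) are hypothetical; the actual configuration is carried out in WorkVisual and the smartHMI and is not exposed through such an API:

```python
# Hypothetical sketch of the validity checklist above. The task structure
# and field names are illustrative only.
def task_is_valid(task: dict, connected_cameras: set) -> bool:
    """Mirror the checking instructions: cameras connected, mounting type
    set, image processing task assigned, and a model present."""
    cams = task.get("cameras", [])
    return (
        len(cams) > 0
        # all configured cameras are actually connected
        and all(c["name"] in connected_cameras for c in cams)
        # each camera has a valid mounting type
        and all(c.get("mounting") in ("stationary", "robot-guided") for c in cams)
        # each camera has been assigned an image processing task
        and all(c.get("image_processing_task") for c in cams)
        # the task contains a model
        and bool(task.get("model"))
    )
```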
15.2.77 VTH36088
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 298)
Solution: Align the camera via the smartHMI (>>> Page 299)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Checking instructions
1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Check whether the object can be seen in the live image.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring
correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.78 VTH36089
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 303)
Solution: Align the camera via the smartHMI (>>> Page 303)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Checking instructions
1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Check whether the object can be seen in the live image.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in
the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition
of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden
and the display of the image is enlarged accordingly. To display the oper-
ator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so
that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object
so that the measurement object is visible in the live image of the cameras.
If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description In order to detect features, all features must be acquired by every camera.
Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same fea-
tures is as follows:
Checking instructions
1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object so that the measurement object is visible in the live image of the cameras. If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.79 VTH36091
Possible cause(s) Cause: Tool block does not support the inputs or outputs used (>>> Page 307)
Solution: Correcting the programming in the SRC file (>>> Page 307)
Cause: Tool block does not support the inputs or outputs used (>>> Page 307)
Solution: Create new tool block in WorkVisual (>>> Page 307)
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool block expects.
Checking instructions Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool block expects.
Checking instructions Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
4. Click on the button. The image(s) are inserted next to the inputs under InputImage.
10. Click on the button and choose whether the tool should be saved completely or without images or results.
11. Select a directory and click on Save.
15.2.80 VTH36094
Possible cause(s) Cause: Camera not fully or correctly calibrated (>>> Page 308)
Solution: Calibrate camera (stationary) (>>> Page 308)
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to determine an object position.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed. For an adequate degree of accuracy, the result should be < 1 mm.
8. Select an existing calibration plane in the Calibration plane box or choose Create new calibration plane... to create a new calibration plane:
a. Enter a name for the calibration plane in the Name box.
b. Optional: Enter a description of the calibration plane in the Description box.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to determine an object position.
15.2.81 VTH36095
Possible cause(s) Cause: Object located outside of the camera’s field of vision
(>>> Page 311)
Solution: Align the camera via the smartHMI (>>> Page 312)
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Description KUKA.VisionTech cannot detect any object. The object is located outside of
the camera’s field of vision.
The procedure for checking whether the object is located outside of the
camera’s field of vision is as follows:
Description In order to display an object or feature in the field of vision of the camera, the
camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object so that the measurement object is visible in the live image of the cameras. If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
The cameras have the same orientation, i.e. the component is in the same position in all the images.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
The user must carry out a risk analysis and is responsible for ensuring correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.82 VTH36096
Possible cause(s) Cause: Tool block does not support the inputs or outputs used (>>> Page 315)
Solution: Correcting the programming in the SRC file (>>> Page 315)
Cause: Tool block does not support the inputs or outputs used (>>> Page 316)
Solution: Create new tool block in WorkVisual (>>> Page 316)
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool block expects.
Checking instructions Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
Cause: Tool block does not support the inputs or outputs used
Description Inputs or outputs are used which do not correspond to the format that the tool block expects.
Checking instructions Check the tool block documentation to see which inputs and outputs are defined for tool block purposes and what their data type is.
4. Click on the button. The image(s) are inserted next to the inputs under InputImage.
10. Click on the button and choose whether the tool should be saved completely or without images or results.
11. Select a directory and click on Save.
15.2.83 VTH36097
Checking instructions 1. In the Navigator, navigate to the directory in which the directory should be located.
2. Check whether the directory is present.
15.2.84 VTH36098
Possible cause(s) Cause: No model generated for task (>>> Page 318)
Solution: Generate 3D model (>>> Page 318)
Description No model has been generated for the task. As such, the task cannot be carried out.
Description No model has been generated for the task. As such, the task cannot be carried out.
Generation of the model has been successfully completed when the Model button has a check mark. The position of the reference workpiece in the calibration base is now known; all position data of other workpieces are relative to this position.
4. Optional: Click on the image once. The image is now displayed in its original size. The image can be enlarged further by touching it. The image can be increased or decreased in size using the slide controller. It is possible to navigate within the image using the arrow keys or by dragging a finger or stylus over the image.
Description No model has been generated for the task. As such, the task cannot be carried out.
15.2.85 VTH36099
Possible cause(s) Cause: Cameras aligned with different components (>>> Page 320)
Solution: Align the camera via the smartHMI (>>> Page 320)
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object so that the measurement object is visible in the live image of the cameras. If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera
as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed.
For an adequate degree of accuracy, the result should be < 1 mm.
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
15.2.86 VTH36100
Possible cause(s) Cause: The fiducial mark is not detected in all the images
(>>> Page 325)
Solution: Align the camera with the fiducial mark (>>> Page 325)
Description The fiducial mark is the cross at the center of the calibration plate; it must always be visible during calibration.
Description The camera must be aligned in such a way that the fiducial mark is visible.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object so that the measurement object is visible in the live image of the cameras. If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
The user must carry out a risk analysis and is responsible for ensuring correctly adapted lighting.
Checking instructions Check whether the lighting is too bright or too dim.
15.2.87 VTH36101
Possible cause(s) Cause: Cameras aligned with different features (>>> Page 328)
Solution: Align the camera via the smartHMI (>>> Page 328)
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Description In order to detect features, all features must be acquired by every camera. Each of the cameras has detected different features.
The procedure for checking whether each camera detects the same features is as follows:
Checking instructions 1. In the main menu, select VisionTech > Live picture.
2. Press Start. A live image is generated for the camera.
3. Check which feature can be seen in the live image.
4. Repeat the procedure for each camera.
5. Compare whether each camera sees the same features.
Description In order to display an object or feature in the field of vision of the camera, the camera must be aligned.
Procedure 1. Expand the tree structure of the robot controller on the Hardware tab in the Project structure window.
2. Right-click on the camera and select Camera live image. The acquisition of live images by this camera starts automatically.
3. Optional: Click on the button. The operator control elements are hidden and the display of the image is enlarged accordingly. To display the operator control elements again, click on the button again.
4. Stationary camera: Position the camera over the measurement object so that the measurement object is visible in the live image of the camera.
Robot-guided cameras: Position the robot over the measurement object so that the measurement object is visible in the live image of the cameras. If necessary, align the cameras again.
5. Tighten and secure the camera fastening screws.
15.2.88 VTH36102
Possible cause(s) Cause: The specified KRL variable could not be found (>>> Page 329)
Solution: Carry out cold restart (>>> Page 330)
Cause: The specified KRL variable could not be found (>>> Page 330)
Solution: Check the WorkVisual project and transfer it again
(>>> Page 330)
Cause: The specified KRL variable could not be found (>>> Page 330)
Solution: Reinstall KUKA.VisionTech (>>> Page 331)
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
Checking instructions Check the state of a variable with the VARSTATE() function. The VARSTATE function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions Check the state of a variable with the VARSTATE() function. The VARSTATE function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Description The WorkVisual project must be checked and then transferred to the robot controller again.
Description The specified KRL variable could not be found in the data list (DAT file). The variable was not declared in the data list or has not been initialized.
The procedure for checking the state of a variable is as follows:
Checking instructions Check the state of a variable with the VARSTATE() function. The VARSTATE function supplies 3 return values: #DECLARED, #INITIALIZED or #UNKNOWN.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
15.2.89 VTH36103
Possible cause(s) Cause: License key is missing or invalid (>>> Page 331)
Solution: Activate KUKA.VisionTech license key (>>> Page 332)
Description KUKA.VisionTech requires a valid license key for the program to work. No license key has been saved or an invalid license key has been saved.
The procedure for checking whether a valid license key has been saved is as follows:
The license key entered must match the license key under Vision license key KRC.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.90 VTH36104
Possible cause(s) Cause: License key is missing or invalid (>>> Page 332)
Solution: Activate KUKA.VisionTech license key (>>> Page 333)
Description KUKA.VisionTech requires a valid license key for the program to work. No license key has been saved or an invalid license key has been saved.
The procedure for checking whether a valid license key has been saved is as follows:
The license key entered must match the license key under Vision license key KRC.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.91 VTH36105
Possible cause(s) Cause: License key is missing or invalid (>>> Page 333)
Solution: Activate KUKA.VisionTech license key (>>> Page 334)
Description KUKA.VisionTech requires a valid license key for the program to work. No license key has been saved or an invalid license key has been saved.
The procedure for checking whether a valid license key has been saved is as follows:
The license key entered must match the license key under Vision license key KRC.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.92 VTH36107
Possible cause(s) Cause: License key is missing or invalid (>>> Page 334)
Solution: Activate KUKA.VisionTech license key (>>> Page 335)
Description KUKA.VisionTech requires a valid license key for the program to work. No license key has been saved or an invalid license key has been saved.
The procedure for checking whether a valid license key has been saved is as follows:
The license key entered must match the license key under Vision license key KRC.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.93 VTH36108
Possible cause(s) Cause: License key is missing or invalid (>>> Page 335)
Solution: Activate KUKA.VisionTech license key (>>> Page 336)
Description KUKA.VisionTech requires a valid license key for the program to work. No license key has been saved or an invalid license key has been saved.
The procedure for checking whether a valid license key has been saved is as follows:
The license key entered must match the license key under Vision license key KRC.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.94 VTH36114
Possible cause(s) Cause: The calibration poses do not differ to an adequate extent
(>>> Page 336)
Solution: Use calibration poses that lie further away from each other
(>>> Page 337)
Solution: Use calibration poses that lie further away from each other
Description Record at least 6 new calibration poses and ensure that they are an adequate
distance from each other.
15.2.95 VTH36115
Possible cause(s) Cause: The calibration poses do not differ to an adequate extent
(>>> Page 337)
Solution: Use calibration poses that lie further away from each other
(>>> Page 338)
Solution: Use calibration poses that lie further away from each other
Description Record at least 6 new calibration poses and ensure that they are an adequate
distance from each other.
15.2.96 VTH36117
Possible cause(s) Cause: Camera model not compatible (>>> Page 338)
Solution: Use KUKA MXG20 or VCXG-25M camera (>>> Page 339)
Description The camera in use provides a video or image format that is not supported by
KUKA.VisionTech.
Description Use the camera model KUKA MXG20 or KUKA VCXG-25M that is intended
for use with KUKA.VisionTech.
15.2.97 VTH36118
Possible cause(s) Cause: Camera model not compatible (>>> Page 339)
Solution: Use KUKA MXG20 or VCXG-25M camera (>>> Page 339)
Description The camera in use provides a video or image format that is not supported by
KUKA.VisionTech.
Description Use the camera model KUKA MXG20 or KUKA VCXG-25M that is intended
for use with KUKA.VisionTech.
15.2.98 VTH36119
Possible cause(s) Cause: Camera model not compatible (>>> Page 339)
Solution: Use KUKA MXG20 or VCXG-25M camera (>>> Page 339)
Description The camera in use provides a video or image format that is not supported by
KUKA.VisionTech.
Description Use the camera model KUKA MXG20 or KUKA VCXG-25M that is intended
for use with KUKA.VisionTech.
15.2.99 VTH36121
Possible cause(s) Cause: No further emergency license available (>>> Page 340)
Solution: Contact KUKA Support (>>> Page 340)
Description All available emergency licenses have already been used up. A total of 5
emergency licenses are available. An emergency license is valid for 3 days.
On expiry, emergency licenses cannot be reactivated. For this reason, they
should only be used in an actual emergency.
Description All available emergency licenses have already been used up. A total of 5
emergency licenses are available. An emergency license is valid for 3 days.
On expiry, emergency licenses cannot be reactivated. For this reason, they
should only be used in an actual emergency.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.100 VTH36138
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 341)
Solution: End the current process first (>>> Page 342)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description The sensor cable that runs from the camera to the robot controller or to the
IPC is faulty or not connected correctly.
The cameras can be connected to the following interfaces of the KR C4 robot
controller:
X64.1 - B1
X64.2 - B2
X64.3 - B3
If a cable inlet on the KR C4 is used instead of interface X64:
A13.1 - B1
A13.2 - B2
A13.3 - B3
If the KR C4 compact robot controller is used:
PoE1 - B1
PoE2 - B2
PoE3 - B3
If a KUKA IPC is used:
CH1 - B1
CH2 - B2
CH3 - B3
CH4 - B4
The procedure for checking whether the sensor cable is defective is as
follows:
Checking instructions 1. Check whether the connectors are correctly connected. Particular attention must be paid to:
Pins pushed in
Corrosion
Scorched contacts
Connector insert pushed back
Socket pushed back
Connector on correct slot
2. Check whether the cable is mechanically damaged. Causes of squashed
cables or wires can include the following:
Cable straps too tight
Clips too tight
Trapped when closing a cover
Bend radius too tight
3. Check whether the cable still conducts electricity. Particular attention must
be paid to:
Cross-connection of individual wires
Short-circuit of individual wires with the ground conductor
Correct wiring in accordance with circuit diagram
The energy supply system ensures that the cables are guided with
minimum stress despite the high load on the sensor cable caused by
the robot motion.
15.2.101 VTH36140
Possible cause(s) Cause: No reference feature or more than 1 reference feature in the field of vision of the camera (>>> Page 347)
Solution: Place only 1 reference feature in the field of vision of the camera (>>> Page 347)
Cause: No reference feature or more than 1 reference feature in the field of vision of the camera
Description During execution of the verification task for verifying the calibration, there is no reference feature (marker) or more than 1 reference feature in the field of vision of the camera. There must be precisely 1 reference feature in the field of vision of the camera.
Solution: Place only 1 reference feature in the field of vision of the camera
Description Place only 1 reference feature (marker) in the field of vision of the camera and perform calibration verification again.
15.2.102 VTH36141
Possible cause(s) Cause: No reference feature or more than 1 reference feature in the field of vision of the camera (>>> Page 347)
Solution: Place only 1 reference feature in the field of vision of the camera (>>> Page 347)
Cause: No reference feature or more than 1 reference feature in the field of vision of the camera
Description During execution of the verification task for verifying the calibration, there is no reference feature (marker) or more than 1 reference feature in the field of vision of the camera. There must be precisely 1 reference feature in the field of vision of the camera.
Solution: Place only 1 reference feature in the field of vision of the camera
Description Place only 1 reference feature (marker) in the field of vision of the camera and perform calibration verification again.
15.2.103 VTH36144
Possible cause(s) Cause: No reference feature or more than 1 reference feature in the field of vision of the camera (>>> Page 348)
Solution: Place only 1 reference feature in the field of vision of the camera (>>> Page 348)
Cause: No reference feature or more than 1 reference feature in the field of vision of the camera
Description During execution of the verification task for verifying the calibration, there is no reference feature (marker) or more than 1 reference feature in the field of vision of the camera. There must be precisely 1 reference feature in the field of vision of the camera.
Solution: Place only 1 reference feature in the field of vision of the camera
Description Place only 1 reference feature (marker) in the field of vision of the camera and perform calibration verification again.
15.2.104 VTH36145
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 348)
Solution: End the current process first (>>> Page 349)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
15.2.105 VTH36146
Possible cause(s) Cause: Camera is tied up in another process (>>> Page 353)
Solution: End the current process first (>>> Page 354)
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
Description Restart the camera. To do so, disconnect the camera from the power supply and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
Checking instructions Check whether the camera is currently sending or receiving data.
LEDs
15.2.106 VTH36150
Checking instructions Check the status of the robot interpreter in the status bar.
Icon Color Description
Gray No program is selected.
15.2.107 VTH36156
Checking instructions 1. In the main menu, select Start-up > Service > Minimize HMI.
The smartHMI is minimized and the Windows interface is displayed.
2. Navigate to the directory.
3. Right-click and select Settings.
4. Check whether the check box Write-protected is deactivated.
15.2.108 VTH36157
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 359)
Solution: Carry out cold restart (>>> Page 359)
Description A runtime error has occurred in the software or in the connected hardware (e.g. camera) and the process cannot be executed.
Description A runtime error has occurred in the software or in the connected hardware (e.g. camera) and the process cannot be executed.
15.2.109 VTH36168
Possible cause(s) Cause: No further emergency license available (>>> Page 360)
Solution: Contact KUKA Support (>>> Page 360)
Description All available emergency licenses have already been used up. A total of 5
emergency licenses are available. An emergency license is valid for 3 days.
On expiry, emergency licenses cannot be reactivated. For this reason, they
should only be used in an actual emergency.
Description It was not possible to install the vision server correctly. Errors occurred during
the installation routine.
Procedure 1. Only for an update: Uninstall the previous version of the VisionTech option package in WorkVisual.
2. Install the VisionTech option package in WorkVisual.
3. Load the project from the robot controller.
4. Insert the VisionTech option package into the project.
5. Deploy the project from WorkVisual to the robot controller and activate it.
6. The request for confirmation Do you want to activate the project […]? is displayed on the smartHMI. The active project is overwritten during activation. If no relevant project will be overwritten: Answer the query with Yes.
7. An overview with the changes and a request for confirmation are displayed on the smartHMI. Answer this with Yes. The option package is installed and the robot controller carries out a reboot.
Description The correct license key must be activated for KUKA.VisionTech to work correctly.
15.2.110 VTH36169
Possible cause(s) Cause: No reference feature or more than 1 reference feature in the field of vision of the camera (>>> Page 362)
Solution: Place only 1 reference feature in the field of vision of the camera (>>> Page 362)
Cause: No reference feature or more than 1 reference feature in the field of vision of the camera
Description During execution of the verification task for verifying the calibration, there is no reference feature (marker) or more than 1 reference feature in the field of vision of the camera. There must be precisely 1 reference feature in the field of vision of the camera.
Solution: Place only 1 reference feature in the field of vision of the camera
Description Place only 1 reference feature (marker) in the field of vision of the camera and
perform calibration verification again.
15.2.111 VTH36170
Description The name of a task does not contain characters or only contains blank spaces.
The name of a task must contain at least one character that is not a blank
space. The characters \ / : * ? " < > | must not be used.
The procedure for checking whether the name of a task contains either no characters or only blank spaces is as follows:
Checking instructions
1. In the main menu, select VisionTech > Task configuration.
2. Check in the Tasks area to see if an invalid name has been used.
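The naming rule above can be sketched as a small check. This is a hypothetical helper for illustration only, not part of the VisionTech software:

```python
# Hypothetical validator for the task-name rule described above:
# at least one non-blank character, and none of the forbidden
# characters \ / : * ? " < > | may appear in the name.
FORBIDDEN_CHARS = set('\\/:*?"<>|')

def is_valid_task_name(name: str) -> bool:
    if not name.strip():  # empty, or contains only blank spaces
        return False
    return not any(ch in FORBIDDEN_CHARS for ch in name)
```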
15.2.112 VTH36171
Possible cause(s) Cause: Stationary and robot-guided camera selected (>>> Page 363)
Solution: Select suitable cameras (>>> Page 363)
Description A stationary camera and a robot-guided camera were selected for calibration. Stationary cameras and robot-guided cameras cannot be calibrated simultaneously.
15.2.113 VTH36172
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 363)
Solution: Carry out cold restart (>>> Page 364)
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
15.2.114 VTH36175
Possible cause(s) Cause: Camera not fully or correctly calibrated (>>> Page 364)
Solution: Calibrate camera (stationary) (>>> Page 364)
Description The camera is not fully or correctly calibrated. Therefore, it is not possible to
determine an object position.
Recommendations:
The camera should be directly above the calibration plate.
The calibration plate should fill as much of the image of the camera as possible.
6. Press Calibration.
7. Once the calibration process has been completed, the result is displayed. For an adequate degree of accuracy, the result should be < 1 mm.
8. Select an existing calibration plane in the Calibration plane box or choose Create new calibration plane... to create a new calibration plane:
a. Enter a name for the calibration plane in the Name box.
b. Optional: Enter a description of the calibration plane in the Description box.
15.2.115 VTH36177
Possible cause(s) Cause: The robot is in the overhead singularity (>>> Page 367)
Solution: Tilt axis 2 or axis 3 (>>> Page 368)
Description In the overhead singularity, the wrist root point (= center point of axis A5)
is located vertically above axis A1 of the robot.
The position of axis A1 cannot be determined unambiguously by means of
reverse transformation and can thus take any value.
Description Tilt axis 2 or axis 3 to move the robot out of the overhead singularity.
Procedure 1. Select Axes as the coordinate system for the jog keys.
2. Set jog override.
3. Hold down the enabling switch.
Axes A1 to A6 are displayed next to the jog keys.
4. Press the Plus or Minus jog key to move an axis in the positive or negative
direction.
The position of the robot during jogging can be displayed: select Display > Actual position in the main menu.
15.2.116 VTH36178
Possible cause(s) Cause: The robot is in the extended position singularity (α2 position)
(>>> Page 368)
Solution: Tilt axis 3 (>>> Page 369)
Description In the extended position, the wrist root point (= center point of axis A5) is
located in the extension of axes A2 and A3 of the robot.
The robot is at the limit of its work envelope.
Although reverse transformation does provide unambiguous axis angles,
low Cartesian velocities result in high axis velocities for axes A2 and A3.
Description Tilt axis 3 to move the robot out of the extended position singularity.
Procedure 1. Select Axes as the coordinate system for the jog keys.
2. Set jog override.
3. Hold down the enabling switch.
Axes A1 to A6 are displayed next to the jog keys.
4. Press the Plus or Minus jog key to move an axis in the positive or negative
direction.
The position of the robot during jogging can be displayed: select Display > Actual position in the main menu.
15.2.117 VTH36179
Possible cause(s) Cause: The robot is in wrist axis singularity (α5 position)
(>>> Page 369)
Solution: Tilt axis 5 (>>> Page 370)
Description In the wrist axis singularity position, the axes A4 and A6 are parallel to one
another and axis A5 is within the range ±0.01812°.
Description Tilt axis 5 to move the robot out of wrist axis singularity.
Procedure 1. Select Axes as the coordinate system for the jog keys.
2. Set jog override.
3. Hold down the enabling switch.
Axes A1 to A6 are displayed next to the jog keys.
4. Press the Plus or Minus jog key to move an axis in the positive or negative
direction.
The position of the robot during jogging can be displayed: select Display > Actual position in the main menu.
15.2.118 VTH37001
Possible cause(s) Cause: Unable to establish connection to vision server (>>> Page 371)
Solution: Disconnect camera from power supply and reconnect
(>>> Page 371)
Description Restart the camera. To do so, disconnect the camera from the power supply
and then reconnect it.
1 LEDs
2 Data/PoE interface
3 Process interface / power supply
15.2.119 VTH37002
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 372)
Solution: Carry out cold restart (>>> Page 372)
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
15.2.120 VTH37003
Possible cause(s) Cause: An unexpected runtime error occurred in the software or hardware (>>> Page 373)
Solution: Carry out cold restart (>>> Page 373)
Description A runtime error has occurred in the software or in the connected hardware
(e.g. camera) and the process cannot be executed.
15.2.121 VTH37004
Description No task is configured. In order to be able to open an inline form, a task must
be configured.
Description
Item Description
1 Back to Overview
2 Name of the task
The name is freely selectable.
3 Position data
(>>> 11.2.1 "Relative and absolute position data" Page 82)
Note: This box is only relevant for 2D tasks with a stationary camera.
4 List of all available cameras that can be used for this task
5 List of all imported tool block files
6 Switches to the live image display in which the following settings
are possible:
Set exposure time
0 … 200 ms
Select calibration plane
Take images from this camera
7 Deletes the selected camera from this task
8 Adds the selected camera to this task
9 Generate model
Check box active: Model has been successfully generated.
Check box not active: No model has yet been generated.
Inactive: Model is not relevant (with absolute position data)
10 Select the number of parts
11 State of the task
Green: Task has been successfully configured.
Red: Task is not configured.
12 Test task
Active: The task can be tested.
Inactive: The task cannot be tested. No model has yet been
generated, or the task is not configured.
Button Description
Save picture(s) Acquires an image with the selected camera and
saves it
Save Saves the task configuration
Cancel Aborts the task configuration without saving
If the selected number of parts is too low, this may lead to inconsistencies, e.g. if each camera detects different objects. It is advisable to enter the number of components that one expects to find.
Description
Item Description
1 Back to Overview
2 Name of the task
The name is freely selectable.
3 Generate model
Check box active: Model has been successfully generated.
Check box not active: No model has yet been generated.
4 List of all available cameras that can be used for this task
5 List of all imported tool block files
6 Switches to the live image display in which the following settings
are possible:
Set exposure time
0 … 200 ms
Take images from this camera
7 Deletes the selected camera from this task
8 Adds the selected camera to this task
9 Select the number of parts
10 State of the task
Green: Task has been successfully configured.
Red: Task is not configured.
11 Test task
Active: The task can be tested.
Inactive: The task cannot be tested. No model has yet been
generated, or the task is not configured.
Button Description
Save picture(s) Acquires an image with the selected camera and
saves it
Save Saves the task configuration
Cancel Aborts the task configuration without saving
16 Maintenance
Maintenance symbols The overview may contain maintenance symbols that are not relevant for the maintenance work on this product. The maintenance illustrations provide an overview of the relevant maintenance work.
Oil change
Tighten screw/nut
Clean component
Exchange battery
17 Repair
Precondition The robot controller is switched off and secured to prevent unauthorized
persons from switching it on again.
The power cable is de-energized.
18 Troubleshooting
Error Remedy
“Power” LED on the KUKA MXG20 camera does not light up: Check that the plugged and screwed connections along the connecting cable are fitted securely. Check the connecting cable for damage.
Neither LED “P1” nor LED “P2” lights up: Check whether voltage is present at the power connection (V1 or V2).
LED “P1” or LED “P2” lights up red on the switch: Check that the DIP switches are set correctly. Both switches must be in the “ON” position.
LED “Gigabit PoE Port” does not light up on the switch: The connected device does not support PoE.
19.1 Decommissioning
Procedure 1. Disconnect the cameras and switch from the power supply.
2. Unplug the connecting cables.
3. Prepare the cameras and switch for storage or transportation.
19.2 Storage
Precondition If the cameras and switch are to be put into long-term storage, the following
points must be observed:
The place of storage must be as dry and dust-free as possible.
Avoid temperature fluctuations.
Avoid condensation.
Observe and comply with the permissible temperature ranges for storage.
Select a storage location in which the packaging materials cannot be damaged.
Only store the cameras and switch indoors.
Procedure Cover the cameras and switch with ESD protection foil and seal it against
dust.
19.3 Disposal
When the cameras and switch reach the end of their useful life, dispose of
them as electrical scrap without disassembling.
20 Appendix
Various tool block templates are available for creating VisionTech applications
in WorkVisual. Most applications can be created using the templates.
3D templates
LocatePart3D
(>>> 20.1.5 "LocatePart3D" Page 397)
Features
(>>> 20.1.6 "Features" Page 397)
Utilities
CrspCollector
(>>> 20.1.7 "CrspCollector" Page 397)
GraphicCollector
(>>> 20.1.8 "GraphicCollector" Page 397)
StringCollector
(>>> 20.1.9 "StringCollector" Page 398)
ListOperators
(>>> 20.1.10 "ListOperators" Page 398)
Logic
(>>> 20.1.11 "Logic" Page 401)
The templates contain script code that defines the behavior of the corresponding tool block. Some of the templates have inputs/outputs or contain tools that are addressed by name by the script code. The names end with an underscore.
Do not delete or rename these inputs/outputs and tools, as this may result in errors during execution of the tool block.
20.1.1 LocatePartsKnownPosition
Description With this template, a number of parts are searched for in the field of view. The rough position of these parts must be known, however. The search area is limited to the known rough position.
The template contains a PatMax tool (Part_UseOneCopyPerPartInstance) for each part that is to be located. This tool must be configured and copied for each additional part. The search area of the copied parts must be adapted accordingly.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the search for the parts is to be carried out. This input must be linked to the corresponding input of the search tools.
Outputs
Name Type Description
PartResults_ List <CogTransform2DLinear> Positions of the parts located
Score_ List <double> Score with which the parts were located
Graphics_ CogGraphicCollection Result graphics
20.1.2 LocatePartsOneStage
Description This template is suitable for parts which are defined by relatively coarse structures, e.g. complete panels. The search area is the entire image. The PatMax tool provided must be configured in such a way that all expected parts can be located.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the search for the parts is to be carried out. This input must be linked to the corresponding input of the search tools.
NumberOfParts Int32 Maximum number of parts that are to be returned. If the PatMax tool finds more parts than desired, only the configured number of results is output.
Outputs
Name Type Description
PartResults_ List <CogTransform2DLinear> Positions of the parts located
Score_ List <double> Score with which the parts were located
Graphics_ CogGraphicCollection Result graphics
20.1.3 Snippets
These tool blocks are executed on the parts located using the 2D templates.
A loop is used for this and is executed once for each part that is located.
LoopLocatedParts
(>>> 20.1.3.1 "LoopLocatedParts" Page 391)
SampleLoops These tool blocks are derived from LoopLocatedParts and are executed in
the 2nd step of processing.
LoopFineLocateParts
(>>> 20.1.3.2 "LoopFineLocateParts" Page 392)
LoopPickZoneCheck
(>>> 20.1.3.3 "LoopPickZoneCheck" Page 393)
LoopPresenceCheck
(>>> 20.1.3.4 "LoopPresenceCheck" Page 394)
LoopComponents These tool blocks can be added to the tools ProcessFixturedParts and LoopLocatedParts in order to perform a fine search or a pick zone check.
LocateSingleCircle
(>>> 20.1.3.5 "LocateSingleCircle" Page 394)
LocateSinglePattern
(>>> 20.1.3.6 "LocateSinglePattern" Page 395)
PickZoneCheck
(>>> 20.1.3.7 "PickZoneCheck" Page 395)
20.1.3.1 LoopLocatedParts
Description LoopLocatedParts is the basis for all tool blocks located in the directory
1_SampleLoops.
If the input SelectedIndex_ < 0, the tools contained in this tool block are executed once for each element contained in PartResults. If a tool block to be executed has an input LoopIndex_, this is set with the current loop index before execution.
If the input SelectedIndex_ ≥ 0, the tool block is only executed once. The input SelectedIndex_ is forwarded to the inputs LoopIndex_ of the contained tool blocks.
In this way, the input SelectedIndex_ can be used to force execution of the tool block for 1 specific part. This can be useful during debugging.
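The SelectedIndex_ behavior described above can be modeled as follows. This is a simplified sketch, not the actual script code; execute stands in for the tools contained in the tool block:

```python
def loop_located_parts(part_results, selected_index, execute):
    """Model of the LoopLocatedParts control flow: SelectedIndex_ < 0
    runs the contained tools once per part, with LoopIndex_ set to the
    current loop index before each execution; SelectedIndex_ >= 0 runs
    them exactly once, for the selected part only."""
    if selected_index < 0:
        for loop_index in range(len(part_results)):
            execute(loop_index, part_results[loop_index])
    else:
        execute(selected_index, part_results[selected_index])
```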
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the parts are located in the 1st step.
PartResults List <CogTransform2DLinear> Positions of the parts located. If these values are used in the loop, they must be linked to the corresponding tools.
PartScores List <double> Score with which the parts were located. If these values are used in the loop, they must be linked to the corresponding tools.
SelectedIndex_ Int32 Selection index for 1 specific part. If the input < 0, this tool block is executed in the loop for all parts.
Outputs
Name Type Description
PartResults List <CogTransform2DLinear> Starting positions of the parts located. This output is optional, depending on what is executed in the loop.
Score List <double> Score with which the parts were located. The output is optional.
20.1.3.2 LoopFineLocateParts
Description This tool block carries out a fine search of the parts located in the 1st step. De-
pending on the input SelectedIndex_, this search is carried out for all located
parts (< 0) or for a single selected part (≥ 0).
Every time the loop is executed, all tools contained in the tool block are exe-
cuted:
Fixture
Defines the coordinate system for the part to be investigated. All subsequent tools use this coordinate system.
ProcessFixturedPart
Executes the loop component LocateSinglePattern (>>> 20.1.3.6 "LocateSinglePattern" Page 395). The position (PartResult) and the score of the fine search are output in a list. The position can be transferred either in the Fixture coordinate system of the current part or in the Pixel coordinate system.
ListCollectorPartResults
Collects the PartResults lists when the loop is executed and joins them together.
ListCollectorScores
Collects the Score lists when the loop is executed and joins them together.
GraphicCollector
Collects the result graphics when the loop is executed and joins them together.
The collected PartResults, Scores and Graphics are transferred to the corresponding outputs.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the parts are located in the 1st step.
PartResults List <CogTransform2DLinear> Positions of the parts located in the 1st step
PartScores List <double> Score with which the parts were located
SelectedIndex_ Int32 Selection index for 1 specific part. If the input < 0, this tool block is executed in the loop for all parts.
Outputs
Name Type Description
PartResults List <CogTransform2DLinear> Output positions of the parts located in the fine search
20.1.3.3 LoopPickZoneCheck
Description This tool block carries out a pick zone check of the parts located in the 1st step. Depending on the input SelectedIndex_, this check is carried out for all located parts (< 0) or for a single selected part (≥ 0).
Every time the loop is executed, all tools contained in the tool block are executed:
Fixture
Defines the coordinate system for the part to be investigated. All subsequent tools use this coordinate system.
ProcessFixturedPart
Executes the loop component PickZoneCheck (>>> 20.1.3.7 "PickZoneCheck" Page 395). The information about whether the part can be picked is returned to the output PickZoneClear.
ListFilterPartResults
SelectedIndex_ is used to select an element in PartResults. This is added to the output list if PickZoneClear has the value TRUE.
ListFilterScores
SelectedIndex_ is used to select an element in PartScores. This is added to the output list if PickZoneClear has the value TRUE.
GraphicCollector
Collects the result graphics when the loop is executed and joins them together. It also accepts graphics for which PickZoneClear has the value FALSE.
The collected PartResults, Scores and Graphics are transferred to the corresponding outputs.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the parts are located in the 1st step.
PartResults List <CogTransform2DLinear> Positions of the parts for which the pick zone check is to be carried out.
PartScores List <double> Scores that belong to the input positions
SelectedIndex_ Int32 Selection index for 1 specific part. If the input < 0, this tool block is executed in the loop for all parts.
Outputs
Name Type Description
PartResults List <CogTransform2DLinear> Filtered input positions for which the pick zone is free
Score List <double> Filtered input scores for which the pick zone is free
Graphics CogGraphicCollection Result graphics of the pick zone check
20.1.3.4 LoopPresenceCheck
Description This tool block carries out a check of the parts located in the 1st step. The
presence of certain features on the parts is checked. Depending on the input
SelectedIndex_, this check is carried out for all located parts (< 0) or for a sin-
gle selected part (≥ 0).
Every time the loop is executed, all tools contained in the tool block are exe-
cuted:
Fixture
Defines the coordinate system for the part to be investigated. All subsequent tools use this coordinate system.
ProcessFixturedPart
Executes the loop component LocateSinglePattern (>>> 20.1.3.6 "LocateSinglePattern" Page 395). The information about whether the feature is present is returned to the output Found.
ListCreatorPresenceFlags
Incrementally collects the Found flags in the output list.
GraphicCollector
Collects the result graphics when the loop is executed and joins them together. It also accepts graphics for which PickZoneClear has the value FALSE.
The collected PresenceFlags and Graphics are transferred to the corresponding outputs.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the parts are located in the 1st step.
PartResults List <CogTransform2DLinear> If SelectedIndex_ < 0, the number of positions contained in this input list determines the number of times the loop is executed.
PartScores List <double> The input is not used.
SelectedIndex_ Int32 Selection index for 1 specific part. If the input < 0, this tool block is executed in the loop for all parts.
Outputs
Name Type Description
PresenceFlags List <bool> List with flags. These flags specify whether the feature that is to be checked has been found for the part in question.
Graphics CogGraphicCollection Result graphics
20.1.3.5 LocateSingleCircle
Description This tool block executes a CogFindCircleTool and creates a position based on the calculated center of the circle. In this position, the angle has the value 0, while scaling and aspect each have the value 1. The position is calculated both in the coordinate system of the selected part and in the Pixel coordinate system. It is returned to the corresponding outputs. The score is always set to 1.
If no circle could be determined, the output Found_ is set to FALSE and the output lists (PartResult_, PartResultPixelSpace_ and Score_) are empty.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the search for a circle is to be carried out.
20.1.3.6 LocateSinglePattern
Description This tool block executes a PatMax tool. The position (PartResult) of the pattern
is calculated in part coordinates and Pixel coordinates. The coordinates and
the score are returned to the corresponding outputs. If the pattern being
searched for was not found (Found_ has the value FALSE), the output lists are
empty.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the search for a pattern is to be carried out.
Outputs
Name Type Description
PartResult_ List <CogTransform2DLinear> List containing the position of the pattern found, specified in the current coordinate system of the input image.
PartResultPixelSpace_ List <CogTransform2DLinear> List containing the position of the pattern found, specified in the Pixel coordinate system.
Score_ List <double> List containing the score. If the pattern has been found, Score has the value 1.
Found_ bool TRUE: Pattern has been found
Graphics CogGraphicCollection Result graphic
20.1.3.7 PickZoneCheck
Description This tool block performs a pick zone check in a search area. The check is carried out on the basis of the histogram inside the search area. The mean gray value (mean) and the contrast (standard deviation) are measured. A check is then made to see if these values are within a specified range.
Inputs
Name Type Description
InputImage CogImage8Grey Input image in which the pick zone check is to be carried out.
Outputs
Name Type Description
PickZoneClear_ Bool TRUE: The pick zone check was successful, i.e. the pick zone is free.
Graphics CogGraphicCollection Result graphic
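The check described above can be sketched as follows. This is a simplified model under stated assumptions: the parameter names and the representation of the ranges as (min, max) pairs are illustrative, not the tool block's actual configuration interface:

```python
from statistics import mean, pstdev

def pick_zone_clear(pixels, mean_range, contrast_range):
    """Model of the pick zone check: the mean grey value and the
    contrast (standard deviation) of the pixels inside the search
    area must both lie within their configured ranges."""
    grey_mean = mean(pixels)
    contrast = pstdev(pixels)  # population standard deviation
    return (mean_range[0] <= grey_mean <= mean_range[1]
            and contrast_range[0] <= contrast <= contrast_range[1])
```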
20.1.4.1 PoseCreator
This tool block uses the input values to create a 2D position that is output at
the output Pose_.
Inputs
Name Type Description
X_ double X translation value of the position to be created
Y_ double Y translation value of the position to be created
A_Rad_ double Rotational value of the position to be created (arc in radians). Default value: 0
Scaling_ double Scaling value of the position to be created. Default value: 1
Aspect_ double X/Y scaling ratio of the position to be created. Default value: 1
Outputs
Name Type Description
Pose_ CogTransform2DLinear 2D position created from the input values
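As an illustration of how these inputs can compose into a 2D pose, the sketch below builds a homogeneous transform from rotation, scaling, aspect and translation. The matrix layout, and the choice to apply the aspect ratio to the X axis, are assumptions for illustration; they are not the CogTransform2DLinear internals:

```python
import math

def pose_matrix(x, y, a_rad=0.0, scaling=1.0, aspect=1.0):
    """Assumed model of the PoseCreator inputs as a 3x3 homogeneous
    2D transform: rotate by a_rad, scale uniformly by scaling with an
    X/Y aspect ratio, then translate by (x, y)."""
    c, s = math.cos(a_rad), math.sin(a_rad)
    sx = scaling * aspect  # X scale
    sy = scaling           # Y scale
    return [[sx * c, -sy * s, x],
            [sx * s,  sy * c, y],
            [0.0,     0.0,    1.0]]
```

With the default values (A_Rad_ = 0, Scaling_ = 1, Aspect_ = 1), the result is a pure translation.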
20.1.4.2 PoseInspector
This tool block outputs the individual values of a position at the outputs.
Inputs
Name Type Description
Pose_ CogTransform2DLinear 2D position whose values are to be output.
Outputs
Name Type Description
X_ double X translation value of the position
Y_ double Y translation value of the position
20.1.5 LocatePart3D
The PatMax tool and the FindLine tool must not be renamed. The features must be evenly distributed over the component.
20.1.6 Features
These tool blocks are executed on the parts found using the tool block
LocatePart3D.
CircleFeature
Feature which describes a circle in 3D.
LineFeature
Line feature, without a start and end point.
LineSegFeature
Straight line feature; normally represents points on a straight edge of the
part that is to be located. Has a start point and end point, which are defined
by the search area of the FindLine tool.
PointFeaturePM
Point feature which is located with a PatMax tool.
20.1.7 CrspCollector
Collects the data from the FeatureCrsp outputs of 3D feature tool blocks and
makes these available at the output as a list of Crsps. In addition, the result
images of the Crsps are collected and forwarded in the Graphic output.
20.1.8 GraphicCollector
Inputs
Name Type Description
LoopIndex_ Int32 0: A new, empty output list is created.
any CogGraphicCollection Input lists to be joined together. It is possible to create any number of inputs of this type.
Outputs
Name Type Description
CollectedGraphics_ CogGraphicCollection Output list with the collected graphics
20.1.9 StringCollector
Description This tool block converts the values at its inputs to strings and joins them together in an output string. The individual strings are separated by a space. The output string is output at the output UserData. The data types Int32, Double, Bool and String are supported at the inputs. In the case of values of data type Double, 5 decimal places are generated; the decimal separator is a point.
Inputs
Name Type Description
any Int32, Double, Bool, String Input data that are to be converted into a string and joined together. It is possible to create any number of these inputs.
Outputs
Name Type Description
UserData String Output string with the collected values
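The formatting rules above can be sketched as follows. This is an illustrative model, not the tool block's script code; the rendering of Bool values follows Python here, which may differ in detail from the actual output:

```python
def string_collector(*values):
    """Model of the StringCollector formatting rules: every input is
    converted to a string, Double values get exactly 5 decimal places
    with a point as decimal separator, and the parts are joined by a
    single space."""
    def fmt(value):
        if isinstance(value, float):
            return f"{value:.5f}"  # 5 decimal places, point separator
        return str(value)
    return " ".join(fmt(v) for v in values)
```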
20.1.10 ListOperators
ListCollector
(>>> 20.1.10.1 "ListCollector" Page 399)
ListConnector
(>>> 20.1.10.2 "ListConnector" Page 399)
ListCount
(>>> 20.1.10.3 "ListCount" Page 399)
ListCreator
(>>> 20.1.10.4 "ListCreator" Page 399)
ListFilter
(>>> 20.1.10.5 "ListFilter" Page 400)
ListItemSelector
(>>> 20.1.10.6 "ListItemSelector" Page 400)
StringFromLists
(>>> 20.1.10.7 "StringFromLists" Page 401)
20.1.10.1 ListCollector
Description The tool block joins together lists received in sequence. The merged list is output at the output OutputList_.
Inputs
Name Type Description
LoopIndex_ Int32 0: The input list is adopted as the output list. >0: The elements of the input list are appended to the output list.
ListToAdd IList Input list whose elements are appended to the output list.
Outputs
Name Type Description
OutputList_ IList Merged output list
20.1.10.2 ListConnector
Description The tool block joins together lists received in parallel. The merged list is output at the output OutputList_. The input lists are copied into the output list in the order of the inputs. It is possible to create any number of lists as inputs.
The input lists must contain data of the same type, otherwise execution is canceled with an error.
Inputs
Name Type Description
any IList Input list whose elements are collected in the output list. It is possible to create any number of these inputs.
Outputs
Name Type Description
OutputList_ IList Merged output list
20.1.10.3 ListCount
Description The tool block outputs the number of elements contained in the input list.
Inputs
Name Type Description
any IList Input list
Note: Only one input is supported.
Outputs
Name Type Description
any Int32 Number of elements contained in the input list.
20.1.10.4 ListCreator
The tool block collects in a list, in sequence, the objects present at the input ItemToAdd_. This list is output at the output OutputList_.
Inputs
Name Type Description
ItemToAdd Object Element that is to be appended to the output list.
LoopIndex_ Int32 0: Before execution, a new, empty output list is generated. >0: The input element is appended to the existing output list.
20.1.10.5 ListFilter
In accordance with the input EnableAdd_, the tool block copies elements in sequence from the input list ListToFilter_ to the output list FilteredList_. Elements are only copied if the input EnableAdd_ has the value TRUE. Every time the tool block is executed, the input list element can be appended to the output list using the input LoopIndex_.
Inputs
Name Type Description
ListToFilter_ IList Input list to be filtered.
EnableAdd_ Bool TRUE: The input list element with the index LoopIndex_ is copied to the output list.
LoopIndex_ Int32 0: Before execution, a new, empty output list is generated. >0: The input element is appended to the existing output list (dependent on the input EnableAdd_).
Outputs
Name Type Description
FilteredList_ IList Filtered input list
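The net effect of ListFilter over a complete loop can be sketched as follows. The per-iteration LoopIndex_/EnableAdd_ mechanics are collapsed into a single pass here, which is a simplification of how the tool block is actually driven:

```python
def list_filter(list_to_filter, enable_flags):
    """Model of ListFilter over a whole loop: element i of
    ListToFilter_ is copied to the output list only when the
    corresponding EnableAdd_ flag is TRUE."""
    filtered_list = []
    for loop_index, item in enumerate(list_to_filter):
        if enable_flags[loop_index]:
            filtered_list.append(item)
    return filtered_list
```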
20.1.10.6 ListItemSelector
Description The tool block outputs a selected element from the input list InputList_. The index of the element is selected using the input SelectedIndex_. If this value is outside the permissible range (< 0 or greater than the length of the list), an error is generated.
Exactly 1 output must be created before this tool block is used. This
output must correspond to the type of the elements contained in the
list.
Inputs
Name Type Description
InputList_ IList Input list from which an element is to be output.
SelectedIndex_ Int32 Index of the input list element that is to be output.
Outputs
Name Type Description
any IList Selected input list element
20.1.10.7 StringFromLists
Description The tool block generates a string from the list elements present at the inputs. It is possible to create any number of inputs. All input lists must have the same number of elements.
First, the elements with the index 0 from all input lists are joined together, then the elements with the index 1, and so on. The individual partial strings are separated by a vertical line. The input lists may contain data of the types Bool, Int32, Double and String. In the case of values of type Double, 5 decimal places are generated; the decimal separator is a point.
Inputs
Name Type Description
any Bool, Int32, Input lists whose elements are to be
Double, String converted to a string. It is possible to
create any number of inputs.
Outputs
Name Type Description
OutputString_ String Output string
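The joining and formatting rules can be illustrated with a small sketch. This is illustrative Python; the function name and the exact grouping of the partial strings are assumptions — the tool block itself defines the authoritative output format, and the rendering of Bool values is likewise assumed here.

```python
def string_from_lists(*input_lists):
    """Sketch of StringFromLists: join the elements index by index,
    separating the partial strings with a vertical line. Doubles are
    rendered with 5 decimal places and a point as decimal separator."""
    n = len(input_lists[0])
    assert all(len(lst) == n for lst in input_lists), \
        "all input lists must have the same number of elements"

    def fmt(v):
        if isinstance(v, float):
            return f"{v:.5f}"        # 5 decimal places, point separator
        return str(v)                # Bool/Int32/String rendering assumed

    parts = []
    for i in range(n):               # index 0 from all lists, then index 1, ...
        for lst in input_lists:
            parts.append(fmt(lst[i]))
    return "|".join(parts)
```

For example, `string_from_lists([1, 2], [0.5, 1.5])` yields `"1|0.50000|2|1.50000"` under these assumptions.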
20.1.11 Logic
AND
(>>> 20.1.11.1 "AND" Page 401)
OR
(>>> 20.1.11.2 "OR" Page 402)
RangeChecker
(>>> 20.1.11.3 "RangeChecker" Page 402)
20.1.11.1 AND
Description The tool block performs logic ANDing of all input values. It is possible to link
any number of inputs with the tool block. Inputs of data types Bool and Int32
are supported.
Inputs
Name Type Description
any Int32, Bool Input data to be linked using the logic AND
operation. It is possible to create any number
of these inputs.
In the case of an input of type Int32:
≠0: TRUE
0: FALSE
Outputs
Name Type Description
And_ Bool Result of the AND operation on all inputs
NAnd_ Bool Inverted result of the AND operation on all
inputs
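The Int32-to-Boolean interpretation and the two outputs can be summarized in a short sketch (illustrative Python, not the tool block implementation; the function name is an assumption):

```python
def logic_and(*inputs):
    """Sketch of the AND tool block: a nonzero Int32 counts as TRUE,
    0 as FALSE; returns the pair (And_, NAnd_)."""
    result = all(bool(v) for v in inputs)   # bool(0) is False, bool(n != 0) is True
    return result, not result
```

The OR tool block in the next section behaves analogously with `any()` in place of `all()`.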
20.1.11.2 OR
Description The tool block performs logic ORing of all input values. It is possible to link any
number of inputs with the tool block. Inputs of data types Bool and Int32 are
supported.
Inputs
Name Type Description
any Int32, Bool Input data to be linked using the logic OR
operation. It is possible to create any number
of these inputs.
In the case of an input of type Int32:
≠0: TRUE
0: FALSE
Outputs
Name Type Description
Or_ Bool Result of the OR operation on all inputs
NOr_ Bool Inverted result of the OR operation on all
inputs
20.1.11.3 RangeChecker
Description The tool block checks whether the value of the input Value_ is within the range
defined by the inputs RangeMin_ and RangeMax_.
Inputs
Name Type Description
RangeMin_ Double Lower limit for the value of the input
The value can be created as a constant in the tool block or linked to an output of a different tool.
RangeMax_ Double Upper limit for the value of the input
The value can be created as a constant in the tool block or linked to an output of a different tool.
Value_ Double Input value that is compared with the limit values
Outputs
Name Type Description
InRange_ Bool TRUE: The value of the input Value_ is within the range defined by the inputs RangeMin_ and RangeMax_.
NotInRange_ Bool Inverted result of the output InRange_
The KUKA V2 tool block templates are a further development of the KUKA V1
tool block templates for 2D applications. These templates contain descriptions
of the functionality and inputs/outputs of the tool block as well as providing the
option of saving notes in the template (Comments tab).
The KUKA V2 templates group information about recognized objects together
to form VisionTechResult2D objects, while the KUKA V1 templates contain
multiple lists with individual pieces of information.
Basic Tools These templates can be used to recognize objects in an image with a single-
step search process. The results are saved in a data structure of type
VisionTechResultCollection2D and can thus be processed further by tools and
tool blocks.
BlobAlign
(>>> 20.2.3 "BlobAlign" Page 405)
PmAlign
(>>> 20.2.4 "PmAlign" Page 406)
Loops A loop is executed for every result from the data structure of type
VisionTechResultCollection2D. A loop always consists of the template Loop-
FixturedResult and at least one tool or an additional tool block template. This
tool or the tool block template must be integrated into the LoopFixturedResult
template. The tool block templates in the LoopComponents folder can be used
for this.
LoopFixturedResult
(>>> 20.2.5 "LoopFixturedResult" Page 407)
LoopComponents
BarCodeReading
(>>> 20.2.6 "BarCodeReading" Page 407)
PickZoneCheck
(>>> 20.2.7 "PickZoneCheck" Page 408)
FineLocate
(>>> 20.2.8 "FineLocate" Page 409)
Measurement
(>>> 20.2.9 "Measurement" Page 410)
PresenceCheck
(>>> 20.2.10 "PresenceCheck" Page 412)
ToolBlock
(>>> 20.2.16 "Tool block" Page 416)
Development
ResultViewer
(>>> 20.2.17 "ResultViewer" Page 416)
Logic
AND
(>>> 20.2.18 "AND" Page 417)
OR
(>>> 20.2.19 "OR" Page 417)
RangeChecker
(>>> 20.2.20 "RangeChecker" Page 418)
20.2.1 VisionTechResult2D
Description This data structure contains all the information relating to a found workpiece.
Properties
Name Type Description
Graphics CogGraphicCollection Contains all results graphics (type ICogGraphic) of the workpiece found, e.g. contours, search area.
ObjectAttributes ObjectAttributeCollection Collection of the object attributes (type ObjectAttribute)
The object attributes consist of a name (type String) and a value (type Object).
ObjectId Int32 Consecutive number of the workpiece
The number must be unique in conjunction with the type ID.
ObjectPosition CogTransform2DLinear Position of the workpiece
ObjectScore Double Accuracy with which the workpiece was detected
TypeId Int32 Type ID of the workpiece found
This can be used to distinguish between different workpiece types.
Methods
Name Type Description
Clone VisionTechResult2D Generates a copy of the workpiece
Dispose Void Releases the resources used
20.2.2 2DTopLevel
Description Tool block template of the top level for 2D tasks used by VisionTech. This tool
block makes the required structure available for communication with the robot
controller.
Inputs
Name Type Description
InputImage ICogImage Input image
NumberOfParts Int32 Number of components to be detected by means of the image processing
Input1 … 5 (optional) Bool, Int32, Float, Double, String Inputs 1 to 5
These inputs can be used in the KRL program or during testing of a task to set application-specific inputs in the tool block. These inputs can be created by the user.
Note: Subprograms can be used to transfer parameters to these inputs. Further information can be found in (>>> 13.25 "Subprograms for transferring input parameters" Page 128).
Outputs
Name Type Description
Results VisionTechResultCollection2D Results list
Error Exception Exceptions or errors that occur during execution of the image processing task
UserData String User-specific data forwarded to KRL by VisionTech
20.2.3 BlobAlign
Description This template saves the results of the tool Cognex.CogBlobTool in a data
structure of type VisionTechResultCollection2D which it makes available at an
output for further use. The tool Cognex.CogBlobTool can be configured as de-
scribed by Cognex.
The following parameters can be modified in the script of the template:
Name Description
TypeID Defines the type ID of the results found
PatternColor Color of the graphics to be generated
LineWidth Line thickness of the graphics to be generated
AddPointMarkerGraphics TRUE: Marking of the center of gravity of the BLOB is added to the graphics
AddBoundingBoxGraphics TRUE: Display of the rectangle surrounding the BLOB is added to the graphics
AddSearchRegionGraphics TRUE: Display of the search area in which BLOBs are searched for is added to the graphics
AddBoundaryGraphics TRUE: Display of the contours of the BLOB is added to the graphics
AddObjectIdentifierGraphics TRUE: Display of the object ID and type ID at the center of gravity of the BLOB is added to the graphics
AreaSizeAsAttribute TRUE: The size (in pixels) of the BLOBs found is added as an object attribute
KeyArea Name of the object attribute (size of the BLOBs found)
Inputs
Name Type Description
InputImage ICogImage Input image
NumberOfParts Int32 Number of components to be detected by means of the image processing
Outputs
Name Type Description
Results VisionTechResultCollection2D Results that have been found
The results contain positions of the workpieces found, result graphics, accuracies, etc.
20.2.4 PmAlign
Inputs
Name Type Description
InputImage CogImage8Grey Input image
NumberOfParts Int32 Number of components to be detected by means of the image processing
Outputs
Name Type Description
Results VisionTechResultCollection2D Results that have been found
The results contain positions of the workpieces found, result graphics, accuracies, etc.
20.2.5 LoopFixturedResult
Description This template is the basis for all tool blocks that are to be executed once for each result of a previously executed tool block. A coordinate system is defined that is based on the position of the individual result. The internal tool block ProcessFixtureResult is then executed. In this tool block, the result can be modified or new results can be generated.
If a tool block is to be executed for each result without first defining a coordinate system, the Fixture tool can be deleted. This tool is an integral part of the LoopFixturedResult template.
If additional results are to be generated, the internal tool block can provide
multiple outputs with the data structure VisionTechResult2D or
VisionTechResultCollection2D. The values of these outputs are added to the
list of results at the Results output.
Inputs
Name Type Description
InputImage ICogImage Input image
Results VisionTechResultCollection2D Results for which the internal tool block is to be executed
SelectedResultIndex Int32 Index of the result for which the internal tool block is to be executed
<0: The internal tool block is executed for every result in the input list.
≥0: The internal tool block is only executed for the result with the defined index.
Outputs
Name Type Description
Results VisionTechResultCollection2D Results that have been modified or generated. The value of this output is automatically generated and should not be linked to the inputs or outputs of other tools or tool blocks.
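The iteration rule controlled by SelectedResultIndex can be sketched as follows. This is illustrative Python only; coordinate fixturing and the VisionPro data types are omitted, and `process` stands in for the internal tool block.

```python
def loop_fixtured_result(results, selected_result_index, process):
    """Sketch of LoopFixturedResult's iteration rule: run the inner
    tool block ('process') for every result when the index is < 0,
    otherwise only for the result at the given index."""
    out = []
    if selected_result_index < 0:
        for r in results:                # loop over all results
            out.append(process(r))
    else:                                # single selected result
        out.append(process(results[selected_result_index]))
    return out
```

For example, with index -1 every result is processed once; with index 1 only the second result is processed.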
20.2.6 BarCodeReading
Description This template uses the tool Cognex.CogBarcodeTool and adds information as
an object attribute to a result of type VisionTechResult2D about whether it was
possible to read a barcode in the defined area. The read content of the bar-
code is also saved in an object attribute. The tool Cognex.CogBarcodeTool
can be configured as described by Cognex.
The following parameters can be modified in the script of the template:
Name Description
PatternColorGood Color of the search area if the search for a barcode is successful
PatternColorBad Color of the search area if the search for a barcode is not successful
AddSearchRegionGraphics TRUE: The search area is added to the results as a graphic
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Result to which information is to be added
KeyBarcodeStatus String Key to be used for storing the information about whether the barcode was readable
KeyBarcodeValue String Key to be used for storing the information that was read in the barcode
Outputs
Name Type Description
Result VisionTechResult2D Result to which the information was added
20.2.7 PickZoneCheck
Description This template uses the tool Cognex.CogHistogramTool and checks the pick zones next to a workpiece. The tool Cognex.CogHistogramTool can be configured as described by Cognex.
Normally, the average brightness of an area in which a workpiece is present,
or into which a workpiece protrudes, differs from a homogeneous background.
In these cases, a collision of the robot with foreign bodies or other workpieces
within a pick zone can be avoided by means of this check.
It is possible to arrange multiple instances of this template in sequence in or-
der to check several pick zones per workpiece. The overall pick zone of a
workpiece is only deemed to be free if all checks indicate a free pick zone.
The following parameters can be modified in the script of the template:
Name Description
PatternColorGood Color of the pick zone if it is free
PatternColorBad Color of the pick zone if it is not free
PatternColorOutsideImage Color of the pick zone if it is partially outside the image
FlagSearchRegion TRUE: The pick zone is added to the result as a graphic
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Result that is to be checked
KeyPickZoneClear String Name of the object attribute in which the information is saved regarding whether the pick zone was free
Outputs
Name Type Description
Result VisionTechResult2D Result, generated from the inputs, to which the information was added
20.2.8 FineLocate
Description In order to determine a feature in or on the workpiece that is important for pro-
cessing with the robot, the position of a previously detected workpiece can be
modified. The following templates are available for this application:
LocateSinglePattern_FL
This template uses the tool Cognex.CogPmAlignTool. Rotation and posi-
tion of the new pattern are saved in the result of type VisionTechResult2D.
LocateSingleCircle_FL
This template uses the tool Cognex.CogFineCircleTool to locate the cen-
ter of a circle. As a circle does not contain information about the rotational
position of the workpiece, only the translation of the workpiece is moved
to the center of the circle and the rotational information remains un-
changed.
LocateSingleBlob_FL
This template uses the tool Cognex.CogBlobTool to locate a contiguous
freeform object. The translational information is determined using the cen-
ter of gravity. The rotation is defined by the principal extension axis of the
BLOB.
The following parameters can be modified in the script of these templates:
Name Description
PatternColor Color of the graphics to be generated
LineWidth Line thickness of the graphics to be generated
AddObjectIdentifierGraphics TRUE: Display of the object ID and type ID is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the origin of the pattern is added to the graphics
AddTrainRegionGraphics TRUE: Display of the training range is added to the graphics
AddMatchFeaturesGraphics TRUE: Display of the contour of the pattern is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the center of the circle is added to the graphics
AddCircleGraphics TRUE: Display of the fitted circle is added to the graphics
AddCircleCaliperGraphics TRUE: Display of the individual caliper gauge graphics is added to the graphics
Name Description
AddPointMarkerGraphics TRUE: Marking of the center of gravity of the BLOB is added to the graphics
AddBoundingBoxGraphics TRUE: Display of the rectangle surrounding the BLOB is added to the graphics
AddBoundaryGraphics TRUE: Display of the contours of the BLOB is added to the graphics
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Result that is to be checked
Outputs
Name Type Description
Result VisionTechResult2D Result to which information was added
20.2.9 Measurement
Description The templates in the Measurement folder can be used to perform various cal-
culations.
FeatureDistance
(>>> 20.2.9.1 "FeatureDistance" Page 410)
LocateSingle templates
(>>> 20.2.9.2 "LocateSingle templates" Page 411)
PixelToMetricsConverter
(>>> 20.2.9.3 "PixelToMetricsConverter" Page 412)
20.2.9.1 FeatureDistance
Description This template can be used to determine the distance between 2 points in mm.
For this, the coordinates of the start and end points in pixels are required. The
result is saved as an object attribute of type VisionTechResult2D.
The following parameters can be modified in the script of the template:
Name Description
GraphicColor Color of the graphics to be generated
LineWidth Line thickness of the graphics to be added
InputsSpaceName Coordinate system in which the start and end points are specified
Pixel coordinate system: "#" (default)
Workpiece coordinate system: "."
AddPointMarkerGraphics TRUE: Marking of the start and end points of the measurement is added to the graphics
AddDistanceLineGraphics TRUE: Line connecting the points is added to the graphics
The input image must have the format PNG and be generated by
KUKA.VisionTech 3.1, as it will not otherwise have the required infor-
mation about the camera calibration.
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Current result
KeyFeatureDistance String Name of the object attribute in which the distance between the two points is saved
X1 Double X coordinate of the start point (in pixels)
Y1 Double Y coordinate of the start point (in pixels)
X2 Double X coordinate of the end point (in pixels)
Y2 Double Y coordinate of the end point (in pixels)
Outputs
Name Type Description
Result VisionTechResult2D Result to which the distance was added
20.2.9.2 LocateSingle templates
Description In order to determine the distance between 2 workpiece features in mm, the pixel coordinates of the origins of these features are required. There are 3 templates available for this; each template covers a different case.
LocateSinglePattern_M
This template uses the tool Cognex.CogPmAlignTool. The X/Y position of
the feature is made available at the outputs. The value of the result re-
mains unchanged.
LocateSingleCircle_M
This template uses the tool Cognex.CogFineCircleTool. The X/Y position
of the center of the circle is made available at the outputs. The value of the
result remains unchanged.
LocateSingleBlob_M
This template uses the tool Cognex.CogBlobTool. The X/Y position of the
center of gravity of the BLOB is made available at the outputs. The value
of the result remains unchanged.
The following parameters can be modified in the script of these templates:
Name Description
PatternColor Color of the graphics to be generated
LineWidth Line thickness of the graphics to be generated
AddObjectIdentifierGraphics TRUE: Display of the object ID and type ID is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the origin of the pattern is added to the graphics
AddTrainRegionGraphics TRUE: Display of the training range is added to the graphics
AddMatchFeaturesGraphics TRUE: Display of the contour of the pattern is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the center of the circle is added to the graphics
AddCircleGraphics TRUE: Display of the fitted circle is added to the graphics
AddCircleCaliperGraphics TRUE: Display of the individual caliper gauge graphics is added to the graphics
Name Description
AddPointMarkerGraphics TRUE: Marking of the center of gravity of the BLOB is added to the graphics
AddBoundingBoxGraphics TRUE: Display of the rectangle surrounding the BLOB is added to the graphics
AddBoundaryGraphics TRUE: Display of the contours of the BLOB is added to the graphics
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Current result
Outputs
Name Type Description
Result VisionTechResult2D Result that was not modified
XCenter Double X coordinate of the feature in pixels
YCenter Double Y coordinate of the feature in pixels
20.2.9.3 PixelToMetricsConverter
Description This template can be used to convert values specified in pixels to a metric unit.
The unit depends on the calibration plate used, but is generally mm.
The input image must have the format PNG and be generated by
KUKA.VisionTech 3.1, as it will not otherwise have the required infor-
mation about the camera calibration.
Inputs
Name Type Description
InputImage ICogImage Input image
PixelValue Double Pixel value to be converted
Outputs
Name Type Description
MetricValue Double Result of conversion to a metric unit
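Conceptually, the conversion is a multiplication by a calibration scale factor. A minimal sketch, assuming an externally supplied factor (in the real tool block the scale comes from the camera calibration embedded in the PNG input image, not from a parameter):

```python
def pixel_to_metric(pixel_value, mm_per_pixel):
    """Sketch of the conversion performed by PixelToMetricsConverter.
    mm_per_pixel is an assumed, externally supplied calibration factor."""
    return pixel_value * mm_per_pixel
```

For example, 100 pixels at an assumed scale of 0.25 mm/pixel correspond to 25 mm.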
20.2.10 PresenceCheck
Description This template is used to check whether a specific feature is present. The position of the workpiece is determined using a template from the Basic Tools folder. Relative to this position, the system checks whether the feature is present.
The following parameters can be modified in the script of these templates:
Name Description
PatternColor Color of the graphics to be generated
LineWidth Line thickness of the graphics to be generated
AddObjectIdentifierGraphics TRUE: Display of the object ID and type ID is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the origin of the pattern is added to the graphics
AddTrainRegionGraphics TRUE: Display of the training range is added to the graphics
AddMatchFeaturesGraphics TRUE: Display of the contour of the pattern is added to the graphics
Name Description
AddCenterPointMarkerGraphics TRUE: Marking of the center of the circle is added to the graphics
AddCircleGraphics TRUE: Display of the fitted circle is added to the graphics
AddCircleCaliperGraphics TRUE: Display of the individual caliper gauge graphics is added to the graphics
Name Description
AddPointMarkerGraphics TRUE: Marking of the center of gravity of the BLOB is added to the graphics
AddBoundingBoxGraphics TRUE: Display of the rectangle surrounding the BLOB is added to the graphics
AddBoundaryGraphics TRUE: Display of the contours of the BLOB is added to the graphics
Inputs
Name Type Description
InputImage ICogImage Input image
Result VisionTechResult2D Result that is to be checked
Outputs
Name Type Description
Result VisionTechResult2D Result to which information was added
20.2.11 AttributeAssembler
Description This tool block adds object attributes to a result with the data structure
VisionTechResult2D. Name and value of the object attributes are defined via
the inputs. Inputs can be generated automatically by dragging the output of a
tool or tool block onto this tool block. Alternatively, inputs can be generated via
the context menu. If a new input is added, this input is interpreted as the value
for a new object attribute and the corresponding input for the name of the ob-
ject attribute is automatically generated.
The object attributes are added to the attribute ObjectAttributes of the data
structure VisionTechResult2D. An object attribute with the serial number X is
only added if both NameX and ValueX exist as inputs.
If the object attributes are to be used in KRL, a maximum of 5 attributes can
be transferred. Each value is converted into a string and transferred. The
length of the name and value of the object attribute is limited to 100 characters.
Inputs
Name Type Description
Result VisionTechResult2D Current result to which object attributes are to be added
Name1 … N String Name of object attribute 1 … N
Value1 … N - Value of object attribute 1 … N
Outputs
Name Type Description
Result VisionTechResult2D Result to which object attributes have been added
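The pairing rule — attribute X is added only if both NameX and ValueX exist — and the KRL transfer behavior can be sketched like this. Illustrative Python with assumed names; this sketch additionally assumes consecutive numbering and mirrors the 100-character truncation, while the KRL limit of at most 5 transferable attributes is noted but not enforced.

```python
def assemble_attributes(result_attributes, **inputs):
    """Sketch of AttributeAssembler: pair NameX/ValueX inputs and add
    them as object attributes. Each value is converted to a string;
    name and value are truncated to 100 characters, as for the KRL
    transfer (which additionally allows at most 5 attributes)."""
    i = 1
    while f"Name{i}" in inputs and f"Value{i}" in inputs:
        name = str(inputs[f"Name{i}"])[:100]
        value = str(inputs[f"Value{i}"])[:100]   # transferred as a string
        result_attributes[name] = value
        i += 1
    return result_attributes
```

A Name3 input without a matching Value3 input would add no third attribute under this rule.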
20.2.12 AttributeValueExtractor
Description This tool block reads values from object attributes and automatically provides
them as an output. Any number of object attributes can be read. If an object
attribute that is to be read has been removed from this tool block, the output is
also deleted and vice versa.
Inputs
Name Type Description
Result VisionTechResult2D Result from which the object attributes are to be read
Outputs
Name Type Description
AttributeName1 … N - Output that is generated automatically
This output contains the value of the object attribute 1 … N that is to be read. The name of the output corresponds to the name of the object attribute.
20.2.13 PartSorter
Description This tool block sorts the results in the data structure
VisionTechResultCollection2D by the X or Y coordinate in ascending or de-
scending order. This enables the robot to clear and sort workpieces, for exam-
ple.
Inputs
Name Type Description
Results VisionTechResultCollection2D Unsorted collection of results
Orientation System.Windows.Forms.Orientation Horizontal: The results are sorted by the X coordinate
Vertical: The results are sorted by the Y coordinate
SortDirection System.ComponentModel.ListSortDirection Ascending: Sorting in ascending order
Descending: Sorting in descending order
Outputs
Name Type Description
Results VisionTechResultCollection2D Sorted collection of results
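The sorting behavior corresponds to a standard key-based sort. A minimal sketch, in which results are modeled as plain (x, y) tuples instead of VisionTechResult2D objects and the enum inputs are reduced to simple parameters:

```python
def sort_parts(results, orientation="Horizontal", descending=False):
    """Sketch of PartSorter: sort results by the X coordinate
    (Horizontal) or the Y coordinate (Vertical), in ascending or
    descending order."""
    key = (lambda r: r[0]) if orientation == "Horizontal" else (lambda r: r[1])
    return sorted(results, key=key, reverse=descending)
```

This ordering is what allows the robot to clear workpieces in a defined sequence, e.g. left to right.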
20.2.14 ResultCollector
Description This tool block groups multiple results lists together as a single results list. Re-
sults can be dragged onto the tool block. There can be any number of inputs
and their names can be chosen freely. The results lists are arranged consec-
utively. The grouped results list thus contains the results of the first results list
first, then the results of the second results list, and so on.
The results are displayed in a tree structure on the Results tab. Here it is pos-
sible to select which results are to be displayed as a graphic. This display of
the graphics has no effect on the generated results list.
Inputs
Name Type Description
InputImage ICogImage Input image
Results1 … N VisionTechResult2D or VisionTechResultCollection2D Results list 1 … N
Outputs
Name Type Description
Results VisionTechResultCollection2D Results list generated from the values of the inputs
20.2.15 ResultCreator
Description This tool block generates a results list with data structure
VisionTechResultCollection2D from the values of the inputs. This ensures
compatibility with older tool blocks created using the KUKA V1 templates. This
also makes it possible to expand the older tool blocks with new functionalities
from the KUKA V2 templates.
The number of positions of the workpieces found and the accuracy with which
the workpieces were detected must be identical. This number specifies the
number of results at the Results output. There can be any number of graphics.
The graphics are assigned to every result.
The following setting can be made on the Settings tab:
Type ID: ID for the results found by the tool. This ID can be overwritten us-
ing the input TypeID.
Inputs
Name Type Description
InputImage ICogImage Input image
TypeID Int32 Type ID for the results to be generated
Default value: -1
PartResults_ IEnumerable<CogTransform2DLinear> Workpiece positions for the results to be generated
Scores_ IEnumerable<Double> Accuracy with which the workpieces were detected for the results that are to be generated
Graphics_ CogGraphicCollection Graphics for the results to be generated
Outputs
Name Type Description
Results VisionTechResultCollection2D Results list generated from the values of the inputs
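The construction rule — position and score lists of identical length combined element by element into results — can be sketched as follows (illustrative Python; a plain dict stands in for VisionTechResult2D, and the graphics handling is omitted):

```python
def create_results(positions, scores, type_id=-1):
    """Sketch of ResultCreator: the (identical) length of the position
    and score lists determines the number of generated results."""
    if len(positions) != len(scores):
        raise ValueError("number of positions and scores must be identical")
    return [
        {"ObjectId": i, "TypeId": type_id,
         "ObjectPosition": pos, "ObjectScore": score}
        for i, (pos, score) in enumerate(zip(positions, scores))
    ]
```

Two positions and two scores thus yield exactly two results, each carrying the shared type ID.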
20.2.16 Tool block
Description This tool block is an empty standard tool block for user-specific use. There can be any number of inputs and outputs.
Unlike the standard tool block CogToolBlock, this tool block contains the Comments tab. Furthermore, when a script is generated, a reference to the file KukaRoboter.VisionTech.Cognex.VisionProExtensions.dll is automatically set. All data structures of the KUKA V2 templates are located in this file.
20.2.17 ResultViewer
This tool block is intended for development and enables simple error analysis. Use of this tool block during production is not recommended, as it can adversely affect processing performance.
Description This tool block displays results and result lists. Both can be dragged onto the
tool block to create matching inputs. There can be any number of inputs and
their names can be chosen freely. The name of the input is used in the display
of the grouped results.
The results are displayed in a tree structure on the Results tab. Here it is pos-
sible to select which results are to be displayed as a graphic.
Inputs
Name Type Description
InputImage ICogImage Input image
Results1 … N VisionTechResult2D or VisionTechResultCollection2D Results list 1 … N
20.2.18 AND
Description The tool block performs logic ANDing of all input values. It is possible to link
any number of inputs with the tool block. Inputs of data types Bool and Int32
are supported. Unconnected inputs are ignored.
If data types that are not supported are used for the inputs, the result
at the And output automatically has the value FALSE.
Inputs
Name Type Description
any Int32, Bool Input data to be linked using the logic AND
operation. It is possible to create any number
of these inputs.
In the case of an input of type Int32:
≠0: TRUE
0: FALSE
Outputs
Name Type Description
And Bool Result of the AND operation on all inputs
NAnd Bool Inverted result of the AND operation on all
inputs
20.2.19 OR
Description The tool block performs logic ORing of all input values. It is possible to link any
number of inputs with the tool block. Inputs of data types Bool and Int32 are
supported. Unconnected inputs are ignored.
Inputs
Name Type Description
any Int32, Bool Input data to be linked using the logic OR
operation. It is possible to create any number
of these inputs.
In the case of an input of type Int32:
≠0: TRUE
0: FALSE
Outputs
Name Type Description
Or Bool Result of the OR operation on all inputs
NOr Bool Inverted result of the OR operation on all
inputs
20.2.20 RangeChecker
Description The tool block checks whether the value of the input Value is within the range
defined by the inputs RangeMin and RangeMax.
Inputs
Name Type Description
RangeMin Double Lower limit for the value of the input
The value can be created as a constant in the
tool block or linked to an output of a different tool
block.
RangeMax Double Upper limit for the value of the input
The value can be created as a constant in the
tool block or linked to an output of a different tool
block.
Value Double Input value that is compared with the limit values
Outputs
Name Type Description
InRange Bool TRUE: The value of the input Value is within the range defined by the inputs RangeMin and RangeMax.
NotInRange Bool Inverted result of the output InRange
Procedure Select the desired signal source for each output in the I/O management
area.
Description of KUKA MXG20: There are 3 outputs at the process interface of the KUKA MXG20 camera (Line1 … Line3) that can be configured by the user. These outputs can be assigned internal signals via the software.
As standard, the outputs are assigned the following signals:
Line1: Timer1Active
Line2: Off
Line3: Off
Description of KUKA VCXG-25: There is one output at the process interface of the KUKA VCXG-25 camera (Line3) that can be configured by the user. This output can be assigned internal signals via the software.
As standard, the output is assigned the signal Timer1Active.
Procedure Under Timer configuration in the I/O management area, carry out the
desired settings for each timer.
Description of KUKA MXG20: A timer can be used to control the internal signals. The timers can be activated with the signal sources Timer1Active, Timer2Active and Timer3Active.
Name Description
Trigger source: Signal that starts the timer
Off: Trigger source is deactivated.
Action1: Selected action signal
ExposureEnd: Signal edge at end of exposure
ExposureStart: Signal edge at start of exposure
FrameEnd: Signal edge at end of reading of an image
FrameStart: Signal edge at start of reading of an image
Line0: Signal edge at input
Software: Software trigger
TriggerSkipped: TriggerSkipped signal
Trigger activation: Mode in which the trigger source starts the timer
AnyEdge: Falling or rising signal edge
FallingEdge: Falling signal edge
LevelHigh: Signal state "High"
LevelLow: Signal state "Low"
RisingEdge: Rising signal edge
Duration [ms]: Length of time the timer is active
Delay [ms]: Interval between input of the trigger signal and activation of the timer
Description of VCXG-25M: A timer can be used to control the internal signals. The timer can be activated with the signal source Timer1Active.
Name                Description
Trigger source:     Signal that starts the timer
                      Off: Trigger source is deactivated.
                      Action1: Selected action signal
                      ExposureEnd: Signal edge at end of exposure
                      ExposureStart: Signal edge at start of exposure
                      FrameTransferSkipped: FrameTransferSkipped signal
                      Line0: Signal edge at input
                      Software: Software trigger
                      TriggerSkipped: TriggerSkipped signal
Trigger activation: Mode in which the trigger source starts the timer
                      AnyEdge: Falling or rising signal edge
                      FallingEdge: Falling signal edge
                      RisingEdge: Rising signal edge
Duration [ms]:      Length of time the timer is active
Delay [ms]:         Interval between input of the trigger signal and activation of the timer
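As an illustration only, the timer settings listed above could be written through a GenICam-style node map. In the sketch below, `nodemap` is a plain dict standing in for a real camera node map, the `configure_timer` helper and its key scheme are hypothetical, and the feature names follow the GenICam SFNC convention (TimerSelector, TimerTriggerSource, TimerTriggerActivation, TimerDuration, TimerDelay); no camera is accessed.

```python
def configure_timer(nodemap, timer, trigger_source, trigger_activation,
                    duration_ms, delay_ms):
    """Record one timer configuration under SFNC-style feature names.

    SFNC expresses TimerDuration and TimerDelay in microseconds, so the
    millisecond values from the configuration table are converted here.
    """
    nodemap["TimerSelector"] = timer
    # With a real node map, the writes below would apply to the timer
    # chosen by TimerSelector; the dict stand-in keys them explicitly.
    nodemap[f"{timer}.TriggerSource"] = trigger_source
    nodemap[f"{timer}.TriggerActivation"] = trigger_activation
    nodemap[f"{timer}.Duration_us"] = duration_ms * 1000.0
    nodemap[f"{timer}.Delay_us"] = delay_ms * 1000.0
    return nodemap


# Example: start Timer1 on the rising edge at the start of exposure,
# active for 10 ms, with no delay.
settings = configure_timer({}, "Timer1", "ExposureStart", "RisingEdge",
                           duration_ms=10.0, delay_ms=0.0)
```

Only the trigger source and activation values listed in the tables above are valid for the respective camera; the sketch does not validate them.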
Tightening torques: The following tightening torques (Nm) are valid for screws and nuts where no other specifications are given.
The specified values apply to lightly oiled black (e.g. phosphated) and coated (e.g. mech. galv., zinc flake plating) screws and nuts.
          Strength class
Thread    8.8         10.9        12.9
M1.6      0.17 Nm     0.24 Nm     0.28 Nm
M2        0.35 Nm     0.48 Nm     0.56 Nm
M2.5      0.68 Nm     0.93 Nm     1.10 Nm
M3        1.2 Nm      1.6 Nm      2.0 Nm
M4        2.8 Nm      3.8 Nm      4.4 Nm
M5        5.6 Nm      7.5 Nm      9.0 Nm
M6        9.5 Nm      12.5 Nm     15.0 Nm
M8        23.0 Nm     31.0 Nm     36.0 Nm
M10       45.0 Nm     60.0 Nm     70.0 Nm
M12       78.0 Nm     104.0 Nm    125.0 Nm
M14       125.0 Nm    165.0 Nm    195.0 Nm
M16       195.0 Nm    250.0 Nm    305.0 Nm
M20       370.0 Nm    500.0 Nm    600.0 Nm
M24       640.0 Nm    860.0 Nm    1030.0 Nm
M30       1330.0 Nm   1700.0 Nm   2000.0 Nm
          Strength class
Thread    8.8                     10.9
          ISO 7991                ISO 7380, ISO 7381
          Allen screw             Fillister head screw
M3        0.8 Nm                  0.8 Nm
M4        1.9 Nm                  1.9 Nm
M5        3.8 Nm                  3.8 Nm
          Strength class
Thread    10.9
          ISO 7984
          Pan head screw
M4        2.8 Nm
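The main torque table above is a plain thread-size/strength-class lookup. Purely as an illustration, it can be expressed as a small data structure; the values below are copied from the table, and the `TIGHTENING_TORQUE_NM` name and `torque_nm` helper are made up for this sketch.

```python
# Tightening torques (Nm) by metric thread and strength class,
# copied from the table above.
TIGHTENING_TORQUE_NM = {
    "M1.6": {"8.8": 0.17,   "10.9": 0.24,   "12.9": 0.28},
    "M2":   {"8.8": 0.35,   "10.9": 0.48,   "12.9": 0.56},
    "M2.5": {"8.8": 0.68,   "10.9": 0.93,   "12.9": 1.10},
    "M3":   {"8.8": 1.2,    "10.9": 1.6,    "12.9": 2.0},
    "M4":   {"8.8": 2.8,    "10.9": 3.8,    "12.9": 4.4},
    "M5":   {"8.8": 5.6,    "10.9": 7.5,    "12.9": 9.0},
    "M6":   {"8.8": 9.5,    "10.9": 12.5,   "12.9": 15.0},
    "M8":   {"8.8": 23.0,   "10.9": 31.0,   "12.9": 36.0},
    "M10":  {"8.8": 45.0,   "10.9": 60.0,   "12.9": 70.0},
    "M12":  {"8.8": 78.0,   "10.9": 104.0,  "12.9": 125.0},
    "M14":  {"8.8": 125.0,  "10.9": 165.0,  "12.9": 195.0},
    "M16":  {"8.8": 195.0,  "10.9": 250.0,  "12.9": 305.0},
    "M20":  {"8.8": 370.0,  "10.9": 500.0,  "12.9": 600.0},
    "M24":  {"8.8": 640.0,  "10.9": 860.0,  "12.9": 1030.0},
    "M30":  {"8.8": 1330.0, "10.9": 1700.0, "12.9": 2000.0},
}


def torque_nm(thread, strength_class):
    """Return the tabulated tightening torque in Nm, e.g. for ("M8", "10.9")."""
    return TIGHTENING_TORQUE_NM[thread][strength_class]
```

The lookup applies only where no other torque is specified, as stated above.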
21 KUKA Service

Availability: KUKA Customer Support is available in many countries. Please do not hesitate to contact us if you have any questions.
Index
Numbers
2D model, generating 90
2D model, moving camera 90
2D model, stationary camera 90
2D task, configuring 87
2D task, testing 91
2DTopLevel, 2D template 404
3D model, generating 95
3D task, configuring 92
3D task, testing 95
89/336/EEC 27, 29, 30

A
AND, Utility 401, 417
Appendix 389
Application examples 24
Areas of application 15
AttributeAssembler, utility 414
AttributeValueExtractor, utility 414

B
BarCodeReading, 2D template 407
BLOB 12
BlobAlign, 2D template 405

C
Calibration plane, deleting 75
Calibration plates 24
Calibration pose 12
Camera, calibrating (stationary) 73
Camera, configuring timers 419
Camera, exchange 383
Camera, outputs, configuration 418
Cameras, aligning (smartHMI) 70
Cameras, aligning (WorkVisual) 70
Cameras, calibrating (moving) 75
Cameras, configuring (smartHMI) 66
Cameras, configuring offline (WorkVisual) 69
Cameras, configuring online (WorkVisual) 68
Cameras, I/O management 418
Cancel, program 358
Communication 15
Configuration 81
Connecting cables 24, 32, 41
Connecting cables, KR C4 32, 43
Connecting cables, KR C4 compact 33, 45
Connecting cables, KUKA IPC 33, 46
CrspCollector, Utility (3D) 397

D
Decommissioning 387
Defining the points on the robot 118
Depalletizing 12
Deracking 12
Dimensions, extension 35
Dimensions, MXG20 camera 34
Dimensions, VCXG-25M camera 35
Disposal 387
Documentation, industrial robot 11

E
EKI 12
EMC Directive 27, 29, 30
Emergency license, activating 56
EN 60204-1:2006/A1:2009 39
Ethernet KRL, configuring 66
Example programs 131
Exposure time, setting (smartHMI) 71
Exposure time, setting (WorkVisual) 71

F
FeatureDistance, 2D template 410
Features, 3D templates 397
FineLocate, 2D templates 409
Functions 15

G
General safety measures 39
GenICam 12
GigE 12
GraphicCollector, Utility 397

I
Image processing task, setting up 84
Images, acquisition (WorkVisual) 84
Images, taking (moving) 83
Images, taking (stationary) 81
Inline forms 97
Installation 51
Installation via smartHMI 51
Installation via WorkVisual 52
Installing via WorkVisual 52
Intended use 13
Interfaces 41
Interfaces, KR C4, connecting cables 42
Interfaces, KR C4 compact, connecting cables 45
Interfaces, KUKA IPC, connecting cables 46
Introduction 11

K
KLI 12
Knowledge, required 13
KONI 12
KUKA Customer Support 423
KUKA GigE switch 18
KUKA GigE switch, technical data 29
KUKA IPC, configuring 65
KUKA IPC, executing VisionTech 54
KUKA IPC, generating setup 54
KUKA lenses 23, 30
KUKA MXG20 camera 21
KUKA MXG20 camera, technical data 27
KUKA Service 423
KUKA VCXG-25M camera 22
KUKA VCXG-25M camera, technical data 28

T
Training 13
Transportation 49
Troubleshooting 385

U
Uninstalling via smartHMI 52
Uninstalling via WorkVisual 53
Updating via WorkVisual 52
Utilities, 2D templates 396

V
Verification task, configuring 77
Verification, configuring 76
VisionTechResult2D, data structure 404
VT_2DMoving, template 104
VT_2DStationary, template 101
VT_2DVerifyStationary, template 106
VT_3DMoving, template 104
VT_CHECKPOSE (subprogram) 116
VT_CHECKPOSITION (subprogram) 115
VT_CHECKRESULT (subprogram) 113
VT_CLEAR (subprogram) 114
VT_CLEARBUFFER (subprogram) 123
VT_CLOSECONNECTION (subprogram) 114
VT_ContainsAttribute (subprogram) 124
VT_DIRECT (subprogram) 120
VT_GetBoolFromAttrName (subprogram) 126
VT_GetCharFromAttrName (subprogram) 125
VT_GETCORRECTIONFRAME (subprogram) 113
VT_GetIntFromAttrName (subprogram) 124
VT_GetRealFromAttrName (subprogram) 125
VT_GETRESULTS (subprogram) 121
VT_GETTASKRESULTS (subprogram) 119
VT_GETTASKRESULTS_IR (subprogram) 119
VT_GETUSERDATA (subprogram) 118
VT_GETVERIFICATIONRESULT (subprogram) 122
VT_GETVERIFYRESULT_IR (subprogram) 122
VT_INIT (subprogram) 111
VT_IsAttributeValSet (subprogram) 126
VT_OPENCONNECTION (subprogram) 111
VT_SETEXPOSURE (subprogram) 123
VT_SetInputAsBool (subprogram) 128
VT_SetInputAsInt (subprogram) 128
VT_SetInputAsReal (subprogram) 129
VT_SetInputAsString (subprogram) 129
VT_StringToBool (subprogram) 127
VT_StringToInt (subprogram) 127
VT_StringToReal (subprogram) 128
VT_TASKTRIGGER (subprogram) 112
VT_TASKTRIGGER_ADRUN (subprogram) 112
VT_TASKTRIGGER_REFBASE (subprogram) 112
VT_VERIFICATIONTRIGGER (subprogram) 121
VT_WAITFORRESULT (subprogram) 114
VTRESULT, structure 109
VTRESULTATTRIBUTE, structure 111
VTVERIFYRESULT, structure 110

W
Warnings 11
Wrist root point 367, 368