SDI Administration Guide
SAP HANA Smart Data Integration and SAP HANA Smart Data Quality 2.0 SP03
Document Version: 1.0 – 2020-02-12
Administration Guide
© 2020 SAP SE or an SAP affiliate company. All rights reserved.
1 Administration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality. . . . . . . . . . 7
Manage Agent Nodes in an Agent Group. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Add Adapters to an Agent Group. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Configure Remote Sources in an Agent Group. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
4.3 Managing Remote Sources and Subscriptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Create a Remote Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .54
Suspend and Resume Remote Sources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Alter Remote Source Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Manage Remote Subscriptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Processing Remote Source or Remote Subscription Exceptions. . . . . . . . . . . . . . . . . . . . . . . . . 61
4.4 Managing Design Time Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Execute Flowgraphs and Replication Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Schedule Flowgraphs and Replication Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Stop Non-Realtime Flowgraph Executions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Start and Stop Data Provisioning Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Schedule Data Provisioning Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Executing Partitions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
4.5 Managing Enterprise Semantic Services . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Roles for Enterprise Semantic Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Enterprise Semantic Services Knowledge Graph and Publication Requests. . . . . . . . . . . . . . . . . 72
Publishing Artifacts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Monitor the Status of Publication Requests. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Manage Published Artifacts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Data Profiling. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Setting Configuration Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Troubleshooting Enterprise Semantic Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
6 Troubleshooting and Recovery Operations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
6.1 Troubleshooting Real-Time Replication Initial Queue Failures. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Resolve User Privilege Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Resolve Remote Source Parameter Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Resolve Improper Source Database Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
Resolve Improper Adapter Configurations on the Agent. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
Resolve Uncommitted Source Database Transactions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
Resolve Log Reader Instance Port Conflicts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .123
Resolve Data Provisioning Server Timeouts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .124
Load Clustered and Pooled Table Metadata into SAP HANA. . . . . . . . . . . . . . . . . . . . . . . . . . . 125
6.2 Recovering from Replication Failures. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Check for Log Reader Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
Recover from a Source Table DDL Schema Change. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Recover from a Truncated Source Table. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Recover from Source Table and Replication Task Recreation. . . . . . . . . . . . . . . . . . . . . . . . . . . 128
Recover from a Source and Target Data Mismatch. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
Recover from Data Inconsistencies. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
Recover from an Agent Communication Issue. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Resolve Stopped or Delayed Replication on Oracle. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .133
Resolve Locked SAP HANA Source Tables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .134
Reset the Remote Subscription. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
Clear Remote Subscription Exceptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
6.3 Recovering from Crashes and Unplanned System Outages. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
Recover from an Index Server Crash. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Recover from a Data Provisioning Server Crash. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
Recover from a Data Provisioning Agent JVM Crash. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
Recover from an Unplanned Source Database Outage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
Recover from an SAP ASE Adapter Factory Crash. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
6.4 Troubleshooting Data Provisioning Agent Issues. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
Data Provisioning Agent Log Files and Scripts. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
Clean an Agent Started by the Root User. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
Agent JVM Out of Memory. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
Adapter Prefetch Times Out. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
Agent Reports Errors When Stopping or Starting. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Uninstalled Agent Reports Alerts or Exceptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Create an Agent System Dump. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
Resolve Agent Parameters that Exceed JVM Capabilities. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
6.5 Troubleshooting Other Issues. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
Activate Additional Trace Logging for the Data Provisioning Server. . . . . . . . . . . . . . . . . . . . . . 148
Resolve a Source and Target Data Mismatch. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
Configuring the Operation Cache. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
Ensure Workload Management and Resource Consumption. . . . . . . . . . . . . . . . . . . . . . . . . . . 152
7 SQL and System Views Reference for Smart Data Integration and Smart Data Quality. . . . . . 154
7.1 SQL Statements. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
ALTER ADAPTER Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 156
ALTER AGENT Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
ALTER REMOTE SOURCE Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . 159
ALTER REMOTE SUBSCRIPTION Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . .163
CANCEL TASK Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .164
CREATE ADAPTER Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .166
CREATE AGENT Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .168
CREATE AGENT GROUP Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . 170
CREATE AUDIT POLICY Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . .171
CREATE REMOTE SOURCE Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . 173
CREATE REMOTE SUBSCRIPTION Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . .174
CREATE VIRTUAL PROCEDURE Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . 179
DROP ADAPTER Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
DROP AGENT Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
DROP AGENT GROUP Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . 183
DROP REMOTE SUBSCRIPTION Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . 184
GRANT Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
PROCESS REMOTE SUBSCRIPTION EXCEPTION Statement [Smart Data Integration]. . . . . . . . 187
SESSION_CONTEXT Function [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .188
START TASK Statement [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
7.2 System Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
ADAPTER_CAPABILITIES System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . 195
ADAPTER_LOCATIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . 196
ADAPTERS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
AGENT_CONFIGURATION System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . 197
AGENT_GROUPS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
AGENTS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
M_AGENTS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
M_REMOTE_SOURCES System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . 198
M_REMOTE_SUBSCRIPTION_COMPONENTS System View [Smart Data Integration]. . . . . . . . .199
M_REMOTE_SUBSCRIPTION_STATISTICS System View [Smart Data Integration]. . . . . . . . . . . 200
M_REMOTE_SUBSCRIPTIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . 200
M_SESSION_CONTEXT System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . 202
REMOTE_SOURCE_OBJECT_COLUMNS System View [Smart Data Integration]. . . . . . . . . . . . 203
REMOTE_SOURCE_OBJECT_DESCRIPTIONS System View [Smart Data Integration]. . . . . . . . 203
REMOTE_SOURCE_OBJECTS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . 204
REMOTE_SOURCES System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . .204
REMOTE_SUBSCRIPTION_EXCEPTIONS System View [Smart Data Integration]. . . . . . . . . . . . 205
REMOTE_SUBSCRIPTIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . 206
TASK_CLIENT_MAPPING System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . 206
TASK_COLUMN_DEFINITIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . 207
TASK_EXECUTIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . 207
TASK_LOCALIZATION System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . 208
TASK_OPERATIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . .209
TASK_OPERATIONS_EXECUTIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . 209
TASK_PARAMETERS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . 210
TASK_TABLE_DEFINITIONS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . .211
TASK_TABLE_RELATIONSHIPS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . 212
TASKS System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
VIRTUAL_COLUMN_PROPERTIES System View [Smart Data Integration]. . . . . . . . . . . . . . . . . 213
VIRTUAL_TABLE_PROPERTIES System View [Smart Data Integration]. . . . . . . . . . . . . . . . . . . 214
BEST_RECORD_GROUP_MASTER_STATISTICS System View [Smart Data Quality]. . . . . . . . . . 214
BEST_RECORD_RESULTS System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . 215
BEST_RECORD_STRATEGIES System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . 216
CLEANSE_ADDRESS_RECORD_INFO System View [Smart Data Quality]. . . . . . . . . . . . . . . . . .217
CLEANSE_CHANGE_INFO System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . 218
CLEANSE_COMPONENT_INFO System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . 219
CLEANSE_INFO_CODES System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . 220
CLEANSE_STATISTICS System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
GEOCODE_INFO_CODES System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . 222
GEOCODE_STATISTICS System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . 223
MATCH_GROUP_INFO System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
MATCH_RECORD_INFO System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . 224
MATCH_SOURCE_STATISTICS System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . 225
MATCH_STATISTICS System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
MATCH_TRACING System View [Smart Data Quality]. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
1 Administration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
This guide describes the common tasks and concepts necessary for the ongoing operation, administration,
and monitoring of SAP HANA smart data integration and SAP HANA smart data quality, including:
● Monitoring
● Administration and maintenance tasks
● Troubleshooting and recovery operations
For information about the initial installation and configuration of SAP HANA smart data integration and SAP
HANA smart data quality, refer to the Installation and Configuration Guide for SAP HANA Smart Data
Integration and SAP HANA Smart Data Quality.
For information about administration of the overall SAP HANA system, refer to the SAP HANA Administration
Guide.
Related Information
Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality
SAP HANA Administration Guide
2 Monitoring Data Provisioning in the SAP HANA Web-based Development Workbench
The Data Provisioning monitors are browser-based interfaces that let you monitor agents, remote sources and
subscriptions, design-time objects (flowgraphs and replication tasks), and tasks created in an SAP HANA
system.
2.1 Activate the dpStatistics Job

For some versions of SAP HANA, to enable statistics collection for the monitors, you must activate the
dpStatistics job.
Prerequisites
If you are using one of the following SAP HANA versions, enable statistics collection per the following
procedure. For more information, see the SAP HANA smart data integration Product Availability Matrix (PAM).
The user must have the following roles or privileges.
2.2 Access the Data Provisioning Monitors
In the SAP HANA Web-based Development Workbench Editor, there are several ways to access the Data
Provisioning monitors.
Prerequisites
● The user must have the following roles or privileges to use the data provisioning monitors.
● Your web browser must support the SAPUI5 library sap.m (for example, Microsoft Internet Explorer 9).
For more information about SAPUI5 browser support, see SAP Note 1716423 and the Product
Availability Matrix (PAM) for SAPUI5.
Context
● For activated design-time objects (flowgraphs or replication tasks), in the SAP HANA Web-based
Development Workbench: Editor, select the object and click the Launch Monitoring Console icon.
● From within a monitor, you can switch to another monitor by selecting it from the Navigate to drop-down
box.
● Enter the URL address of each monitor directly into a web browser.
Monitor: Data Provisioning Design Time Object Monitor
How to Access: <host name>:80<2 digit instance number>/sap/hana/im/dp/monitor/index.html?view=IMDesignTimeObjectMonitor
2.3 Monitor Controls
The Data Provisioning monitors use several shared controls and processes.
Control Description
Unit Select to change the unit for parameters such as byte size for memory statistics or time frames for duration statistics.
Refresh At the top, monitor level, select to refresh the data on all panes of the monitor, or you can refresh individual panes.
Auto Refresh: <x> seconds Select the check box to enable auto refresh. Selecting Auto Refresh at the top, monitor level also selects the Auto Refresh options for all panes in the monitor. You can change the frequency for how often to refresh the pane(s).
Clear Filter You can apply filters on most columns or objects by selecting the column heading or object and entering filter criteria. Clear Filter removes the filters applied to that pane, or the Clear Filter button at the top, monitor level clears all filters for all panes in the monitor.
Navigate to: <monitor> Switch to another monitor by selecting it from the Navigate to drop-down box.
Action History Select to display a list of recent user actions such as user name, timestamp, action, and execution status of the action. These actions are stored in the table "SAP_HANA_IM_DP"."sap.hana.im.dp.monitor.ds::DP_UI_ACTION_HISTORY". Users with the Operations role can truncate (clear) this table.
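For example, a user with the Operations role could clear the action history with a statement like the following sketch. TRUNCATE TABLE is standard SAP HANA SQL, and the table name is the one given above:

```sql
-- Clear the Data Provisioning monitor action history (requires the Operations role).
TRUNCATE TABLE "SAP_HANA_IM_DP"."sap.hana.im.dp.monitor.ds::DP_UI_ACTION_HISTORY";
```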
The remote source and remote subscription statistic Collect Time Interval displays on the toolbar. By default,
statistics are collected every 5 minutes (300 seconds). You can change this interval to any value greater than 60
seconds (any value less than 60 is treated as 60 seconds) by changing the dpserver.ini
collect_interval value, as in the following SQL statement. Refresh your browser after changing the setting
to display the new value.
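A minimal sketch of such a change uses SAP HANA's ALTER SYSTEM ALTER CONFIGURATION syntax. The section name 'framework' is an assumption; verify the section that actually contains collect_interval in your dpserver.ini before running it:

```sql
-- Set the statistics collection interval to 600 seconds (10 minutes).
-- NOTE: the section name 'framework' is an assumption; verify it in dpserver.ini.
ALTER SYSTEM ALTER CONFIGURATION ('dpserver.ini', 'SYSTEM')
  SET ('framework', 'collect_interval') = '600' WITH RECONFIGURE;
```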
For any given pane, right-click a column heading to display column view controls:
2.4 Monitoring Agents
The Data Provisioning Agent Monitor displays system information for all agents such as status, memory
details, adapter-agent mapping, and agent group information.
The Agent Monitor pane displays basic information such as the agent details, status, the last time it connected
to the Data Provisioning server, and memory usage details.
Control Description
Create Agent Creates a new agent; enter the following required parameters:
● Agent Name
● Host name
● Port
Alter Agent Lets you modify agent parameters (for example Host, Port, Enable SSL, Agent Group) for the selected agent.
Drop Agent Removes the selected agent from the SAP HANA system.
Update Adapters Refreshes all adapters registered for the selected agent so that any new capabilities can be used by SAP HANA.
The agent status can be one of the following values:
● CONNECTING
● DISCONNECTED
● CONNECTED
The Adapter Agent Mapping pane identifies the adapters that are associated with each agent instance. When
you first open the Data Provisioning Agent Monitor, it displays information for all agents and adapters.
Selecting an agent displays Adapter-Agent Mapping for that agent.
Control Description
Update Updates the selected adapter for the agent so that any new capabilities can be
used by SAP HANA.
The Agent Group pane lets you view, create, and manage agent groups.
Control Description
Related Information
Manage Agents from the Data Provisioning Agent Monitor [page 36]
Manage Adapters from the Data Provisioning Agent Monitor [page 38]
Create or Remove an Agent Group [page 46]
Manage Agent Nodes in an Agent Group [page 47]
2.5 Monitoring Remote Sources
The Data Provisioning Remote Source monitor lets you view each remote source's status and statistics.
The Remote Source Monitor pane displays basic information such as the adapter name, status, and
subscription information. Selecting a remote source displays the statistics for that source in the pane below.
The remote source status can be one of the following values:
● OK
● ERROR
● SUSPENDED
Select a remote source and choose Alter Remote Source to suspend or resume the capture or distribution of
the remote source.
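As an alternative to the monitor UI, capture or distribution can be suspended and resumed with the ALTER REMOTE SOURCE statement documented in the SQL reference chapter. A sketch; the remote source name is a placeholder:

```sql
-- Suspend change capture for a remote source (the name is a placeholder).
ALTER REMOTE SOURCE "MyRemoteSource" SUSPEND CAPTURE;
-- Later, resume capture for the same remote source.
ALTER REMOTE SOURCE "MyRemoteSource" RESUME CAPTURE;
```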
The Remote Source Statistics pane displays details such as aggregated counts of records processed by
receivers and appliers. For aggregated values, the collected time interval is 5 minutes; refer to the statistic's
time stamp.
Some statistics have corresponding graph views. Select the statistic and choose Show Graph. For example, say
the Count of all, data and meta, records processed by Applier statistic shows a Statistic Value of 83300.
Selecting Graph shows a chart with the count value at each 5-minute interval over a selected time frame, for
example 6 hours (Plot values for past: 6 hours). The summation of all these values is the Statistic Value.
You can select a duration (Plot values for past:) from 6 hours to 15 days, or change the Graph Type.
Statistic: Receiver delay (does not apply to trigger-based replication)
Adapter type: Oracle Log Reader
Description: Last received timestamp (dpserver, receiver) - Remote_Source_Processed_Row_Timestamp (dpagent, adapter)
2.6 Monitoring Remote Subscriptions
The Data Provisioning Remote Subscription Monitor provides information about how data is being replicated to
the Data Provisioning Server.
The Remote Subscription Monitor pane displays basic information such as the design-time information, status,
and subscription type. You can select a subscription name to view its description. Selecting a subscription
displays the statistics for that subscription in the pane below.
Control Description
Reset Resets the real-time process to start from the initial load again
Notifications Select to view or add email notifications (for example in the case of a warning or
error).
The Remote Subscription Statistics pane below displays the same information as in the Remote Source
Statistics pane.
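The Reset control corresponds to resetting the subscription in SQL with the ALTER REMOTE SUBSCRIPTION statement from the SQL reference chapter. A sketch; the subscription name is a placeholder:

```sql
-- Reset a remote subscription so replication restarts from the initial load.
-- The subscription name is a placeholder.
ALTER REMOTE SUBSCRIPTION "MySubscription" RESET;
```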
Related Information
Create Notifications [page 21]
You create remote subscriptions when you want to listen for real-time changes to data that you replicated into
the Data Provisioning Server. A remote subscription can have seven different statuses.
The following table describes the status names (in both the Data Provisioning Remote Subscription Monitor and
in the M_REMOTE_SUBSCRIPTIONS monitoring view) and their descriptions.
You can select a status entry to display the Replication Status Details window for information about queueing
and distribution statistics:
● Queue command: Adapter received begin marker from the Data Provisioning Server
● Queue command: Adapter applied begin marker to source
● Queue command: Adapter read begin marker from source and sent to the Data Provisioning Server
● Distribute command: Adapter received begin marker from the Data Provisioning Server
● Distribute command: Adapter applied begin marker to source
● Distribute command: Adapter read begin marker from source and sent to the Data Provisioning Server
Subscribed, request to queue (MAT_START_BEG_MARKER): The receiver is waiting for the begin marker that indicates the first changed data to queue while the initial load is running.
Request to stop queuing and start distribution (MAT_START_END_MARKER): The receiver queues the rows and is waiting for the end marker that indicates the last row of the initial load.
Queuing changes (MAT_COMP_BEG_MARKER): The receiver has received the begin marker, change data is being queued, and the initial load can now start.
Queuing closed, starting distribution (MAT_COMP_END_MARKER): The receiver queues the changed rows and is waiting for the end marker that indicates the last row of the initial load. The initial load has completed and the end marker is sent to the adapter. If the state doesn't change to AUTO_CORRECT_CHANGE_DATA, the adapter or source system is slow in capturing the changes.
Applying queued changes using auto-correct (AUTO_CORRECT_CHANGE_DATA): When the end marker is received, the applier loads the changed data captured (and queued during the initial load) to the target.
Replicating changes (APPLY_CHANGE_DATA): All of the changes captured while the initial load was running have completed and are now loaded to the target. The subscription is now applying changes as they happen in the source system.
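These states can also be inspected directly in the M_REMOTE_SUBSCRIPTIONS monitoring view mentioned above. A minimal sketch; the column names follow that view as documented in the SQL reference chapter, so verify them against your HANA revision:

```sql
-- Check the current replication state of all remote subscriptions.
SELECT SUBSCRIPTION_NAME, STATE
  FROM M_REMOTE_SUBSCRIPTIONS;
```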
2.7 Monitoring Design-Time Objects
The Data Provisioning Design Time Object Monitor provides information about your design-time objects
(flowgraphs and replication tasks). For example, you can see the duration of a task execution for a flowgraph
and how many records have been processed.
The Design Time Objects pane displays basic information such as the target schema name and whether the
object has a table type input or variables. Selecting a design-time object displays task and remote subscription
information for that object in the panes below.
Control Description
Notifications Select to view or add email notifications (for example in the case of a warning or
error).
The Task Monitor pane displays task duration and number of processed records.
Control Description
Remote Statements For a selected task, if the Remote Statements button is enabled, select it to view
the SQL remote statement string used for task execution.
Notifications Select to view or add email notifications (for example in the case of a warning or
error).
The Remote Subscription Monitor pane displays subscription status and other processing details. Select a
subscription name to launch the Data Provisioning Remote Subscription Monitor. Select a status entry to display
a Details dialog for more details, including marker information and exceptions.
2.8 Monitoring Tasks
The Data Provisioning Task Monitor provides you with information about your replication (hdbreptask) and
transformation (hdbflowgraph) tasks. For example, you can see the duration of a task execution and how many
records have been processed.
When you first open the Data Provisioning Task Monitor, it displays information for all tasks.
The Task Overview pane lists all tasks with information such as design time name (select to open the Design-
Time Objects monitor), create time, and memory size, for example. You can select a task to show information
for only that task in the panes below.
Control Description
Notifications Select to view or add email notifications (for example in the case of a warning or
error).
The Task Execution Monitor pane lists all of the recent executions of the task(s) and associated data such as
the schema name, start time, duration, status, and number of processed records.
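The execution data surfaced in this pane can also be queried from the TASK_EXECUTIONS view listed in the SQL reference chapter. A sketch; the column names follow that view, so verify them against your HANA revision:

```sql
-- Recent task executions with duration, status, and processed record counts.
SELECT TASK_NAME, START_TIME, DURATION, STATUS, PROCESSED_RECORDS
  FROM TASK_EXECUTIONS
 ORDER BY START_TIME DESC;
```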
Control Description
Remote Statements For a selected task, if the Remote Statements button is enabled, select it to view
the SQL remote statement string used for task execution.
Execute Remaining Partitions Execute partitions that didn't run or failed after executing a task.
The Task Operation Execution Monitor pane lists all of the recent operations of the task(s) and associated data
including the operation type, start time, duration, status, and number of processed records.
Related Information
Change Retention Period for Data Provisioning Task Monitor [page 19]
Start and Stop Data Provisioning Tasks [page 65]
Schedule Data Provisioning Tasks [page 66]
Executing Partitions [page 67]
Schedule Flowgraphs and Replication Tasks [page 63]
2.8.1 Change Retention Period for Data Provisioning Task Monitor
Change the retention period for task statistics tables if you want to retain the data in the Task Execution
Monitor and Task Operation Execution Monitor for longer than 90 days.
Context
The following parameters specify how long to keep the statistics data and when to delete them:
● The task_data_retention_period parameter specifies the period of time the data remains in the
statistics tables, in seconds. This period is calculated from the time the task reached the COMPLETED,
FAILED, or CANCELLED status. The default value is 7776000, or 90 days. A value of 0 (zero) or -1 means
never delete the data.
● The task_data_retention_period_check_interval parameter specifies how often the data is
deleted by the garbage collection thread. The default value is 300 seconds (5 minutes).
To change the default values of these parameters, you must add a new section named task_framework to each
of the indexserver.ini, scriptserver.ini, and the xsengine.ini files.
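Equivalently, the parameters can be set from a SQL console with ALTER SYSTEM ALTER CONFIGURATION. The sketch below is illustrative; the 180-day value is an example, and the statement must be repeated for each .ini file:

```sql
-- Keep task statistics for 180 days instead of the default 90 (values in seconds)
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('task_framework', 'task_data_retention_period') = '15552000',
      ('task_framework', 'task_data_retention_period_check_interval') = '300'
  WITH RECONFIGURE;
-- Repeat for scriptserver.ini and xsengine.ini
```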
Procedure
2. In the Systems view, right-click the name of your SAP HANA server and choose Configuration and Monitoring > Open Administration.
3. Click the Configuration tab.
4. Right-click indexserver.ini and choose Add Section.
5. On the Section Name screen of the Add Section Wizard, enter task_framework for the section name, and
click Next.
6. On the Scope Selection screen, select System from the Assign Values to dropdown list, and click Next.
7. On the Key Value Pairs screen, enter task_data_retention_period in the Key field, and enter the
number of seconds you want the statistics data to be kept in the statistics tables.
Note
If you set the retention period to 0 (zero) or -1, statistics data is not deleted.
8. Click Finish.
9. Select indexserver.ini, right-click, and choose Add Parameter.
10. On the Add New Parameter screen, enter task_data_retention_period_check_interval in the Key
field, and enter the time interval for the Task Data Cleanup Process to run.
Note
By creating user settings profiles, you can quickly switch between different monitor layouts. Settings profiles
contain information about visible columns, column order, column width, column filters, table visibility, and
slider positions.
Context
You can create, modify, or remove settings profiles in each Data Provisioning Monitor by clicking the Settings
button.
Procedure
2.10 Create Notifications
Create e-mail notifications for various task, remote subscription, and design-time object statuses.
Prerequisites
The user must have the following roles or privileges to create status notifications:
Context
FAILED
CANCELLED
Object Type Supported Statuses
WARNING
FAILED
CANCELLED
ERROR
WARNING
Procedure
1. In the Task Overview, Remote Subscription Monitor, or Design Time Objects table, select the object for
which you want to create a notification.
2. Click the Notifications button.
The list of notifications for the object is displayed.
3. Click the Add button to create a new notification.
4. Specify a name, status conditions, and recipient e-mail addresses for the notification.
5. If you want to enable the notification immediately, select Is active.
6. Click Create Notification.
The new notification is added to the list of notifications for the object. When the conditions for the
notification are met, users in the recipient list are sent an e-mail containing details about the event that
triggered the notification.
Related Information
● Agent availability
● Agent memory usage
● Remote subscription exception
● Data Quality reference data
● Long-running tasks
You can configure monitoring alerts from the SAP HANA cockpit. For more information, see the SAP HANA
Administration Guide.
Related Information
3 Monitoring Data Provisioning in SAP Web IDE
Monitors let you view status and other key performance indicators for agents, remote sources, and tasks, for example.
In the SAP HANA database explorer, you can view status and other monitoring information for agents, remote
sources and subscriptions, tasks, and design-time objects (flowgraphs and replication tasks) created in an SAP
HANA system.
Related Information
In SAP Web IDE, you can view agents, remote sources and subscriptions, and tasks created in an SAP HANA
system from the SAP HANA cockpit or from within the SAP HANA database explorer.
Prerequisites
Your web browser must support the SAPUI5 library sap.m (for example, Microsoft Internet Explorer 9).
For more information about SAPUI5 browser support, see SAP Note 1716423 and the Product Availability
Matrix (PAM) for SAPUI5.
Context
Procedure
Task overview: Monitoring Data Provisioning in SAP Web IDE [page 24]
Related Information
For SAP Web IDE, within the SAP HANA database explorer you can monitor basic system information of an
agent such as status, memory, and the time it last connected with the Data Provisioning Server.
You can sort and hide individual columns by right-clicking a row and selecting your display preferences.
Parent topic: Monitoring Data Provisioning in SAP Web IDE [page 24]
Related Information
To see an overview of all agents, in the database explorer expand the database's Catalog folder, right-click
Agents, and select Show Agents.
Agent Port Port that the agent uses to communicate with the Data Provisioning Server.
● CONNECTING
● DISCONNECTED
● CONNECTED
Since Last Connect Elapsed time since the last connection from the Data Provisioning Server to the Data Provisioning Agent.
Last Connect Time The last connect time from the Data Provisioning Server to the Data Provisioning Agent.
Protocol Type of network protocol used between the Data Provisioning Agent and the Data Provisioning Server.
● TCP
● HTTP
Used Swap Space Amount of swap space currently used by the agent.
Free Swap Space Amount of free swap space on the agent host.
Is SSL Enabled Specifies whether the agent listening on the TCP port uses SSL.
To see an overview of agent groups, in the database explorer expand the database's Catalog folder, right-click
Agent Groups, and select Show Agent Groups.
Within SAP Web IDE, the SAP HANA database explorer provides information about your remote sources.
Parent topic: Monitoring Data Provisioning in SAP Web IDE [page 24]
Related Information
3.3.1 Information Available on the Data Provisioning Remote Sources Monitor
To see an overview of all remote sources, in the database explorer expand the database's Catalog folder, right-click Remote Sources, and select Show Remote Sources.
Location Location of the remote source: a single agent or an agent group.
● OK
● ERROR
● SUSPENDED
To see details for a remote source, in the database explorer expand the database's Catalog folder and select Remote Sources. From the resulting list of remote sources in the pane below, select a remote source.
The detail page displays information in four sections: Remote Objects, Dictionary, Statistics, and Exceptions.
You can click Edit to edit the properties of the remote source.
Within SAP Web IDE, the SAP HANA database explorer provides information about your remote subscriptions.
Parent topic: Monitoring Data Provisioning in SAP Web IDE [page 24]
Related Information
3.4.1 Information Available on the Data Provisioning Remote Subscriptions Monitor
To see an overview of all remote subscriptions, in the database explorer expand the database's Catalog folder,
right-click Remote Subscriptions, and select Show Remote Subscriptions.
You can start an ALTER REMOTE SUBSCRIPTION statement with these commands: Queue, Distribute, and
Reset. You can drop the subscription with the DROP REMOTE SUBSCRIPTION statement. Select one or more
remote subscriptions, and click one of the buttons.
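The same commands can be issued from a SQL console; the subscription name below is a placeholder for illustration:

```sql
ALTER REMOTE SUBSCRIPTION "MY_SUBSCRIPTION" QUEUE;      -- begin queuing changed data
ALTER REMOTE SUBSCRIPTION "MY_SUBSCRIPTION" DISTRIBUTE; -- apply queued changes after the initial load
ALTER REMOTE SUBSCRIPTION "MY_SUBSCRIPTION" RESET;      -- reset the subscription state
DROP REMOTE SUBSCRIPTION "MY_SUBSCRIPTION";             -- remove the subscription
```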
Command Description
Queue Initiate real-time data processing. Typically, the initial load of data is preceded by the
Queue command.
Remote Source Name Name of the remote source for which this subscription is defined.
Subscription State Name of the state of the remote subscription. For more information, see Remote Subscription Statuses [page 16].
Last Processed Elapsed time since the last changed data was processed.
Last Processed Transaction Time Time the last changed data was processed.
● TABLE
● VIRTUAL TABLE
● TABLE
● VIRTUAL TABLE
To see details for a remote subscription, in the database explorer expand the database's Catalog folder and
select Remote Subscriptions. From the resulting list of subscriptions in the pane below, select a subscription.
Table 8: Information Available in Remote Subscription Statistics Table
Column Description
Schema Name Name of the schema (user name) in the remote source.
Received Count Total number of messages received by the Data Provisioning Server.
Received Size Total size of messages received by the Data Provisioning Server.
Since Last Message Received Time elapsed between now and when the last message was received.
Last Message Received Time Time the last message was received.
Since Last Message Applied Time elapsed between now and when the last message was applied.
Within SAP Web IDE, the SAP HANA database explorer provides information about your replication tasks and
transformation tasks.
You can sort and hide individual columns by right-clicking a row and selecting your display preferences. You can
also start and stop tasks.
Execute Tasks
From the Task Overview or the Task Details page, select a task that you want to run. You can select more than one task on the Task Overview page. Click Execute.
Cancel Tasks
From the Task Execution table in the Task Details page, select a running task. You can select multiple tasks from
the Task Execution table. Click Cancel Execution.
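Assuming the standard SAP HANA task statements, the same operations can be sketched in SQL; the schema, task name, and execution ID are placeholders:

```sql
-- Execute a task
START TASK "MYSCHEMA"."MY_TASK";
-- Cancel a running execution, using its execution ID from the M_TASKS monitoring view
CANCEL TASK 1234567;
```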
Parent topic: Monitoring Data Provisioning in SAP Web IDE [page 24]
Related Information
To see an overview of all tasks, in the database explorer expand the database's Catalog folder, right-click Tasks,
and select Show Tasks.
Has Table Type Input TRUE if the task is modeled with a table type as input. This means data would need to be
passed (pushed) at execution time.
Has SDQ TRUE if the task contains smart data quality (SDQ) functionality.
Is read-only TRUE if the task is read only (has only table type outputs), FALSE if it writes to non-table-type outputs.
Is valid TRUE if the task is in a valid state, FALSE if it has been invalidated by a dependency.
Procedure Name If the task was created with a procedure instead of a plan, this attribute contains the name
of the stored procedure.
Procedure Schema If the task was created with a procedure instead of a plan, this attribute contains the
schema name of the stored procedure.
Realtime Design Time Object TRUE if the task is a real-time design-time object, such as a replication task or flowgraph.
Schema Name Name of the schema in which the task was created.
SQL Security Security model for the task, either DEFINER or INVOKER.
To see details for a task, in the database explorer expand the database's Catalog folder and select Tasks. From
the resulting list of tasks in the pane below, select a task.
The detail page displays information in three sections: Task Executions, Task Partitions, and Task Operations. If
you select a task in the first table, the other tables show information for only that task.
Use the row-count display to change the number of task executions to display at a time in each table. The
default is 500. After selecting a different task, refresh the tables to see all corresponding information.
Schema Name Name of the schema in which the task was created.
Port Port number that the task uses to communicate with the Data Provisioning Server.
Partition Count Number of logical partitions used for parallel processing in the task.
Tip
When working in Web-based Development Workbench, you can click the partition count
to display additional information about the partitions used in the task execution.
Start Time Day, date, and time when the task started.
End Time Day, date, and time when the task ended.
Duration Total elapsed time from start to end for COMPLETED or FAILED tasks.
● STARTING
● RUNNING
● COMPLETED
● FAILED
● CANCELLED (or, CANCELLING)
Tip
When working in Web-based Development Workbench, you can click the status to display additional information about the task execution.
● TRUE
● FALSE
Has Remote Statements Indicates whether there are one or more remote statements.
HANA User Name of the user that started the execution of this task.
Schema Name Name of the schema in which the task was created.
Partition ID Identification number of the logical partition used for parallel processing within the task.
Start Time Day, date, and time when the task started.
End Time Day, date, and time when the task ended.
Duration Total elapsed time from start to end for COMPLETED or FAILED tasks.
● STARTING
● RUNNING
● COMPLETED
● FAILED
● CANCELLED (or, CANCELLING)
Tip
When working in Web-based Development Workbench, you can click the status to display additional information about the task execution.
Has Remote Statements Indicates whether there are one or more remote statements.
For a transformation task, the name of the operation is the name of the node that appears in
the flowgraph.
Schema Name Name of the schema (user name) in the remote source.
Partition ID Identification number of the logical partition used for parallel processing within the task.
Partition Name Name of the logical partition used for parallel processing within the task.
Operation Type Current type of operation. For example, the operation type can be Table Writer, Adapter,
Projection, and so forth.
Start Time Day, date, and time when the task started.
End Time Day, date, and time when the task ended.
● STARTING
● RUNNING
● COMPLETED
● FAILED
Side Effects Indicates whether or not this operation generates side effect statistics.
● TRUE
● FALSE
4 Administering Data Provisioning
This section describes common tasks related to the ongoing administration of SAP HANA smart data
integration and SAP HANA smart data quality.
Related Information
Manage Agents from the Data Provisioning Agent Monitor [page 36]
Use the Data Provisioning Agent Monitor to perform basic administration tasks such as registering,
altering, or dropping Data Provisioning Agents.
Manage Adapters from the Data Provisioning Agent Monitor [page 38]
Use the Data Provisioning Agent Monitor to perform basic administration tasks, such as adding
adapters to or removing adapters from a Data Provisioning Agent instance.
Back Up the Data Provisioning Agent Configuration [page 42]
You can back up your Data Provisioning Agent configuration by copying key static configuration files to
a secure location.
Related Information
Use the Data Provisioning Agent Monitor to perform basic administration tasks such as registering, altering, or
dropping Data Provisioning Agents.
Prerequisites
The user must have the following roles or privileges to manage agents:
Context
Use the following controls in the Agent Monitor table to perform an action.
Procedure
● Select Create Agent to register a new agent with the SAP HANA system.
a. Specify the name of the agent and relevant connection information.
b. If the agent uses a secure SSL connection, select Enable SSL.
c. If you want to assign the agent to an existing agent group, select the group under Agent Group.
d. Click Create Agent.
The agent is removed from the Agent Monitor table. If the agent was assigned to an agent group, it’s also
removed from the agent group.
Related Information
4.1.2 Manage Adapters from the Data Provisioning Agent Monitor
Use the Data Provisioning Agent Monitor to perform basic administration tasks, such as adding adapters to or
removing adapters from a Data Provisioning Agent instance.
Prerequisites
The user must have the following roles or privileges to manage adapters:
Context
Use the buttons in the Agent Monitor and Agent Adapter Mapping tables to perform an action.
Procedure
● To add adapters to an agent instance, select the agent and click Add Adapters in the Agent Monitor table.
a. Select the desired adapters from the list of adapters deployed on the agent instance.
b. Click Add Adapters.
All adapters registered for the selected agent are refreshed, and any new capabilities can be used by SAP
HANA.
● To update a single adapter, select the adapter and click Update in the Agent Adapter Mapping table.
The selected adapter is refreshed, and any new capabilities can be used by SAP HANA.
Related Information
Use the LATENCY MONITORING remote source capability to monitor latency between the Data Provisioning
Agent and a source database.
Restriction
Latency monitoring tracks only the general latency between the agent and the source database. There is
currently no capability for tracking specific rows from source to target.
When latency monitoring is enabled, a ticket is written to the source database and then detected and
processed by the adapter. Information about how and when each component processes the ticket is stored in
the M_REMOTE_SOURCE_LATENCY_HISTORY view in the target SAP HANA database.
1. At fixed intervals, SAP HANA sends a request to the adapter to create a latency ticket.
2. The adapter creates and records the ticket in the source database.
3. The adapter reads the ticket and sends it back to the server.
4. The server records that it received the returned ticket, and each server subcomponent timestamps the ticket until it reaches the applier.
The agent latency is the difference between the timestamps for when the ticket is written to the database
and when the ticket is executed at the primary database.
For example, consider whether the agent is falling behind or catching up. If it's catching up and latency is decreasing, performance may be sufficient and no additional tuning may be needed. If it's falling behind and latency is increasing, consider tuning your remote source.
You can create latency tickets either as a one-time operation or continuously at a set interval.
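Both modes can be sketched with the latency-monitoring clauses of ALTER REMOTE SOURCE; the remote source name, ticket name, and the TICKET_NAME column are placeholders and the exact clause syntax may vary by revision:

```sql
-- One-time latency ticket
ALTER REMOTE SOURCE "MySource" CREATE LATENCY TICKET 'my_ticket';
-- Continuous monitoring at a 60-second interval
ALTER REMOTE SOURCE "MySource" START LATENCY MONITORING 'my_ticket' INTERVAL 60;
ALTER REMOTE SOURCE "MySource" STOP LATENCY MONITORING 'my_ticket';
-- Review the recorded timestamps
SELECT * FROM "PUBLIC"."M_REMOTE_SOURCE_LATENCY_HISTORY" WHERE TICKET_NAME = 'my_ticket';
```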
● Create a single latency ticket:
Latency ticket timestamps are recorded following an order of operations that depends on whether the adapter
is a Log Reader adapter or a trigger-based adapter.
1 SDB and/or DDL Adapter SDB: Time when the ticket enters the database and is timestamped
2 LRI Adapter Time when the adapter's inbound picks up the ticket
3 LRO Adapter Time when the adapter's outbound picks up the ticket
4 SNDR Adapter Time when the Data Provisioning Agent sender forwards
the ticket to the Data Provisioning Server (receiver)
Operation Type Description
7 RECEIVER_RECEIVED Server A single ticket entering the Data Provisioning Server's receiver module
8 RECEIVER_SENT Server A single ticket exiting the Data Provisioning Server's receiver module
10 APPLIER_RECEIVED Server A single ticket entering the Data Provisioning Server's applier module
11 APPLIER_APPLIED Server A single ticket exiting the Data Provisioning Server's applier module
1 SDB and/or DDL Adapter SDB: Time when the ticket enters the database and is timestamped
2 SCANNER Adapter Time when the adapter's scanner picks up the ticket from the source database
3 SNDR Adapter Time when the Data Provisioning Agent sender forwards the ticket to the Data Provisioning Server (receiver)
6 RECEIVER_RECEIVED Server A single ticket entering the Data Provisioning Server's receiver module
7 RECEIVER_SENT Server A single ticket exiting the Data Provisioning Server's receiver module
9 APPLIER_RECEIVED Server A single ticket entering the Data Provisioning Server's applier module
10 APPLIER_APPLIED Server A single ticket exiting the Data Provisioning Server's applier module
4.1.4 Back Up the Data Provisioning Agent Configuration
You can back up your Data Provisioning Agent configuration by copying key static configuration files to a secure
location. You can use this backup to restore communication between the SAP HANA server and the Data
Provisioning Agent.
Note
This backup can be restored only to an agent host with the same fully qualified domain name as the original
agent. You can’t use the backup to transport configuration settings between agents with different fully
qualified domain names.
For example, you can’t use a backup from an agent on <host1>.mydomain.com to restore settings to an
agent on <host2>.mydomain.com.
Restriction
Changed-data capture status information for Log Reader adapters can’t be backed up and restored.
Unless specified, all files and directories that you need to back up are located under <DPAgent_root>:
● dpagent.ini
● dpagentconfig.ini
● sec
● secure_storage
● ssl/cacerts
● configuration/com.sap.hana.dp.adapterframework
● lib/
● camel/
● LogReader/config
● LogReader/sybfilter/system/<platform>/LogPath.cfg
Uninstall the Data Provisioning Agent from a host system using the uninstallation manager.
Context
The uninstallation manager supports graphical and command-line modes on Windows and Linux platforms.
Procedure
Tip
To ensure that all installation entries are removed correctly, use the same user and privileges as the
original installation owner.
For example, if sudo was used during the original installation, log in as the installation owner and
run sudo ./hdbuninst <...>.
Results
Next Steps
After uninstalling the agent, several files and directories generated by the agent during runtime are left in place.
If you choose, you can safely remove these remaining files and directories manually.
● configTool/
● configuration/
● install/
● log/
● LogReader/
● workspace/
Restriction
Failover is not supported for initial and batch load requests. Restart the initial load following a failure due to
agent unavailability.
Restriction
Load balancing is supported only for initial loads. It is not supported for changed-data capture (CDC)
operations.
Planning considerations
Before configuring agents in a group, review the following considerations and limitations:
● For real-time replication failover, each agent in a group must be installed on a different host system.
● All agents in a group must have identical adapter configurations.
● All agents in a group must use the same communication protocol. You cannot mix on-premise agents
(TCP) and cloud-based agents (HTTP) in a single group.
Parent topic: Administering Data Provisioning [page 35]
Related Information
When an agent node in an agent group is inaccessible for longer than the configured heartbeat interval, the
Data Provisioning Server chooses a new active agent within the group. It then resumes replication for any
remote subscriptions active on the original agent.
Initial and batch load requests to a remote source configured on the agent group are routed to the first
available agent in the group.
Restriction
Failover is not supported for initial and batch load requests. Restart the initial load following a failure due to
agent unavailability.
Although no user action is required for automatic failover within an agent group, you may choose to monitor
the current agent node information.
● To query the current master agent node name for a remote source:
Caution
If all nodes in an agent group are down, replication cannot continue and must be recovered after one or
more agent nodes are available.
Restarting nodes in an agent group does not impact active replication tasks.
For the master agent node, stopping or restarting the agent triggers the agent group failover behavior and a
new active master node is selected.
With multiple agents in an agent group, you can choose to have the agent for the initial loads selected
randomly, selected from the list of agents in a round-robin fashion, or not load balanced.
Note
Agent grouping provides load balancing for initial loads only. Load balancing is not supported for changed-data capture (CDC) operations.
You can create an agent group or remove an existing group in the Data Provisioning Agent Monitor.
Prerequisites
The user who creates or removes the agent group must have the following roles or privileges:
Context
Use the buttons in the Agent Group table to create or remove an agent group.
Procedure
Note
When you remove an agent group, any agent nodes for the group are removed from the group first.
Agents cannot be removed from the group if there are active remote subscriptions.
Any agent nodes are removed from the group, and the group is removed from the Agent Group table.
Related Information
You can manage the agent nodes that belong to an agent group in the Data Provisioning Agent Monitor.
Prerequisites
The user must have the following roles or privileges to manage agent nodes:
Table 18: Roles and Privileges
Action Role or Privilege
Context
Use the buttons in the Agent Monitor and Agent Group tables to perform the action.
Tip
Select an agent group in the Agent Group table to display its nodes in the Agent Monitor table.
Procedure
● To register a new agent with the SAP HANA system and add it to an existing agent group, click Create
Agent.
When specifying the parameters for the agent, select the agent group from the Agent Group list.
○ Select the new agent group from the Agent Group list.
If you are assigning the agent to a different group, select the empty entry for Enable SSL to avoid
connection issues when the group is changed.
○ To remove the agent from an agent group, select the empty entry from the Agent Group list.
The group for the agent is displayed in the Agent Monitor table.
● To add multiple existing agents to an agent group, select the group in the Agent Group table and click Add
Agents.
a. Select the agents that you want to add to the group.
b. Click Add Agents.
The selected agents are assigned to the agent group and all associated entries in the Agent Monitor and
Agent Group tables are updated.
Related Information
Before you can create remote sources in an agent group, you must add adapters to the group in the SAP HANA
Web-based Development Workbench.
Prerequisites
The user who adds an adapter must have the following roles or privileges:
Procedure
1. Open the SQL console in the SAP HANA Web-based Development Workbench.
2. If you do not know the agent names, query the system for a list of agents and agent groups.
4. Add the adapter to each additional agent node in the agent group.
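Steps 2 and 4 can be sketched in SQL as follows; the agent, group, and adapter names are placeholders for illustration:

```sql
-- List registered agents and their groups
SELECT AGENT_NAME, AGENT_GROUP_NAME FROM "SYS"."AGENTS";
SELECT * FROM "SYS"."AGENT_GROUPS";
-- Register the adapter on an additional agent node in the group
ALTER ADAPTER "MssqlLogReaderAdapter" ADD LOCATION AGENT "MyAgent2";
```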
Related Information
ALTER ADAPTER Statement [Smart Data Integration] [page 156]
To receive the benefits of failover from an agent group, you must configure your remote sources in the agent
group.
Related Information
Procedure
a. In the Catalog editor, under the Provisioning node, right-click the Remote Sources folder, and choose New Remote Source.
b. Enter the required configuration information for the remote source, including the adapter name.
c. In the Location dropdown, choose agent group, and select the agent group name.
d. Click Save.
● To add an existing remote source to an agent group:
a. In the Catalog editor, select the remote source in the Provisioning Remote Sources folder.
b. In the Location dropdown, choose agent group, and select the agent group name.
c. Click Save.
Related Information
Configure Remote Sources in the SQL Console
Procedure
1. Open the SQL console in the SAP HANA studio or Web-based Development Workbench.
2. Execute the CREATE or ALTER REMOTE SOURCE statement in the SQL console.
Note
If you are changing only the location for the remote source, you can omit the ADAPTER and
CONFIGURATION clauses:
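A minimal sketch of such a location-only change, assuming an agent group named MyGroup:

```sql
ALTER REMOTE SOURCE "MySource" AT LOCATION AGENT GROUP "MyGroup";
```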
Related Information
When you use ALTER REMOTE SOURCE to modify a remote source, you must specify the configuration and
credential details as XML strings.
Example Configuration Clause
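A generic sketch of a configuration clause follows; the adapter, agent, and property names and values are placeholders and vary by adapter:

```sql
ALTER REMOTE SOURCE "MySource" ADAPTER "MssqlLogReaderAdapter"
AT LOCATION AGENT "MyAgent" CONFIGURATION
'<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ConnectionProperties name="configurations">
  <PropertyGroup name="database">
    <PropertyEntry name="pds_server_name">myhost</PropertyEntry>
    <PropertyEntry name="pds_port_number">1433</PropertyEntry>
  </PropertyGroup>
</ConnectionProperties>'
WITH CREDENTIAL TYPE 'PASSWORD' USING
'<CredentialEntry name="credential">
  <user>myuser</user>
  <password>mypassword</password>
</CredentialEntry>';
```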
Note
You cannot change user names while the remote source is suspended.
Remote sources establish the connection between a data provisioning adapter and your source system.
Remote subscriptions monitor a remote source for real-time changes to data replicated into the Data
Provisioning Server.
Remote sources are generally created by an administrator, and can then be used for remote subscriptions in
replication tasks and flowgraphs created by a data provisioning modeler.
Related Information
4.3.1 Create a Remote Source
Using SAP HANA smart data integration, you set up an adapter that can connect to your source database, then
create a remote source to establish the connection.
Prerequisites
● The user who creates the remote source must have the following roles or privileges:
Context
Related Information
In SAP HANA smart data integration, you can create a remote source with the Web-based Development
Workbench user interface.
Prerequisites
The user who creates the remote source must have the following roles or privileges:
Table 21: Roles and Privileges
Action Role or Privilege
Procedure
1. In the Web-based Development Workbench Catalog editor, expand the Provisioning node.
2. Right-click the Remote Sources folder and choose New Remote Source.
3. Enter the required information including the adapter and Data Provisioning Agent names.
Regarding user credentials, observe the following requirements:
○ A remote source created with a secondary user can be used only for querying virtual tables.
○ If the remote source is used for designing a .hdbreptask or .hdbflowgraph enabled for real time, use a technical user.
○ If you create a remote subscription using the CREATE REMOTE SUBSCRIPTION SQL statement, use a technical user.
4. Select Save.
Related Information
In SAP HANA smart data integration, you can create a remote source using the SQL console.
Prerequisites
The user who creates the remote source must have the following roles or privileges:
Context
To create a remote source using the SQL console, you must know the connection information for your source.
For an existing remote source, the connection information is in an XML string in the CONFIGURATION
statement.
For your adapter, refer to the remote source configuration topic for that adapter in this guide to see its sample
SQL code. Change the variables to the correct values for your remote source.
The example at the end of this topic illustrates the basic CONFIGURATION connection information XML string
for a Microsoft SQL Server adapter.
● If you’ve recently updated the Data Provisioning Agent, the connection information XML string could also
have been updated for your adapter. Therefore, refresh the adapter to get up-to-date connection
information.
● To view the connection information for an existing remote source, execute SELECT * FROM
"PUBLIC"."REMOTE_SOURCES". In the resulting view, look in the CONNECTION_INFO column.
Tip
To ensure you can view the entire XML string in the CONNECTION_INFO column, in your SAP HANA
preferences enable the setting Enable zoom of LOB columns.
● To view all of the configuration parameters for a given adapter type, execute SELECT * FROM
"PUBLIC"."ADAPTERS". In the resulting view, look in the CONFIGURATION column. This information can
be useful if, for example, you want to determine the PropertyEntry name for a given parameter in the user
interface, shown as displayName:
Example
  </PropertyGroup>
  <PropertyGroup name="cdc" displayName="CDC Properties">
    <PropertyEntry name="pdb_dcmode" displayName="Database Data Capture Mode">MSCDC</PropertyEntry>
  </PropertyGroup>
  <PropertyGroup name="logreader" displayName="LogReader">
    <PropertyEntry name="skip_lr_errors" displayName="Ignore log record processing errors">false</PropertyEntry>
  </PropertyGroup>
</ConnectionProperties>'
WITH CREDENTIAL TYPE 'PASSWORD' USING
'<CredentialEntry name="credential">
  <user>myuser</user>
  <password>mypassword</password>
</CredentialEntry>'
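Assembled into a complete statement, the fragments above take the following shape. This is a hedged sketch, not the exact syntax for your environment: the source name MyMssqlSource, the agent name MyAgent, and the connection values are hypothetical, and the PropertyEntry names expected by your adapter version may differ, so copy the sample SQL from your adapter's configuration topic.

```sql
CREATE REMOTE SOURCE "MyMssqlSource" ADAPTER "MssqlLogReaderAdapter"
AT LOCATION AGENT "MyAgent"
CONFIGURATION '<?xml version="1.0" encoding="UTF-8"?>
<ConnectionProperties name="configurations">
  <PropertyGroup name="database" displayName="Database">
    <PropertyEntry name="pds_server_name" displayName="Host">myserver.example.com</PropertyEntry>
    <PropertyEntry name="pds_port_number" displayName="Port Number">1433</PropertyEntry>
    <PropertyEntry name="pds_database_name" displayName="Database Name">mydb</PropertyEntry>
  </PropertyGroup>
</ConnectionProperties>'
WITH CREDENTIAL TYPE 'PASSWORD' USING
'<CredentialEntry name="credential">
  <user>myuser</user>
  <password>mypassword</password>
</CredentialEntry>';
```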
Related Information
The syntax for creating secondary user credentials for SAP HANA smart data integration adapters is different
from the syntax for SAP HANA system adapters.
The syntax for creating secondary user credentials for SAP HANA smart data integration adapters is as follows.
You can suspend and resume capture and distribution for remote sources within the Data Provisioning Remote
Subscription Monitor.
Prerequisites
The user must have the following roles or privileges to suspend and resume capture and distribution:
Table 23: Roles and Privileges
Context
Use the Alter Remote Source button in the monitor to perform the action.
Procedure
1. Select the remote source in the Remote Source Monitor table and click Alter Remote Source.
2. Click Suspend or Resume for CAPTURE or DISTRIBUTION.
Confirmation of the action is displayed in the status console.
3. Close the Alter Remote Source dialog.
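The same suspend and resume actions can also be issued from the SQL console. A sketch, assuming a remote source named MyRemoteSource (a hypothetical name):

```sql
-- Suspend change-data capture on the remote source
ALTER REMOTE SOURCE "MyRemoteSource" SUSPEND CAPTURE;

-- Resume capture when maintenance is complete
ALTER REMOTE SOURCE "MyRemoteSource" RESUME CAPTURE;

-- Distribution of captured changes can be suspended and resumed the same way
ALTER REMOTE SOURCE "MyRemoteSource" SUSPEND DISTRIBUTION;
ALTER REMOTE SOURCE "MyRemoteSource" RESUME DISTRIBUTION;
```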
Results
Related Information
4.3.3 Alter Remote Source Parameters
You can modify some remote source parameters while the remote source is suspended.
Context
In the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data
Quality, see each adapter's remote source description topic regarding which parameters you can modify when
a remote source is suspended.
Note
You can’t change the User Name parameter when the remote source is suspended.
Procedure
1. In the Data Provisioning Remote Subscription Monitor, suspend capture on the remote source.
2. In the SAP HANA Web-based Development Workbench catalog, change the intended remote source
parameters.
3. Re-enter the credentials for the remote source and save the changes.
4. Resume capture on the remote source.
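The steps above can also be sketched in SQL. The remote source name is hypothetical and the CONFIGURATION string is abbreviated here; which parameters you may change while the source is suspended depends on the adapter:

```sql
-- 1. Suspend capture on the remote source
ALTER REMOTE SOURCE "MyRemoteSource" SUSPEND CAPTURE;

-- 2. Change the intended parameters and re-enter the credentials
--    (the XML shown here is abbreviated; supply your full configuration)
ALTER REMOTE SOURCE "MyRemoteSource"
  CONFIGURATION '<ConnectionProperties name="configurations">...</ConnectionProperties>'
  WITH CREDENTIAL TYPE 'PASSWORD' USING
  '<CredentialEntry name="credential">
    <user>myuser</user>
    <password>mypassword</password>
  </CredentialEntry>';

-- 3. Resume capture
ALTER REMOTE SOURCE "MyRemoteSource" RESUME CAPTURE;
```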
Related Information
You can drop, queue, distribute, and reset remote subscriptions within the Data Provisioning Remote
Subscription Monitor.
Prerequisites
The user must have the following roles or privileges to manage remote subscriptions:
Table 24: Roles and Privileges
Context
Use the buttons in the Remote Subscription Monitor table to perform the action.
Procedure
Note
A warning appears if you attempt to drop a remote subscription that is used by any flowgraphs or
replication tasks. Click Drop if you want to continue and drop the remote subscription anyway.
Results
The remote subscription is queued, distributed, or reset. If you drop a remote subscription, the subscription is
removed from the Remote Subscription Monitor table.
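The monitor buttons correspond to SQL statements that you can also run directly. A sketch, with MySubscription as a hypothetical subscription name:

```sql
-- QUEUE: start capturing changes for the subscription
ALTER REMOTE SUBSCRIPTION "MySubscription" QUEUE;

-- DISTRIBUTE: apply the queued changes to the target
ALTER REMOTE SUBSCRIPTION "MySubscription" DISTRIBUTE;

-- RESET: return the subscription to its initial state
ALTER REMOTE SUBSCRIPTION "MySubscription" RESET;

-- Drop the subscription entirely
DROP REMOTE SUBSCRIPTION "MySubscription";
```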
Related Information
4.3.5 Processing Remote Source or Remote Subscription
Exceptions
If an error occurs, or if, for example, the row count on the target table doesn't match the source, look at the Exceptions table and process the entries.
To process a remote source or remote subscription exception using the monitoring UI:
Related Information
PROCESS REMOTE SUBSCRIPTION EXCEPTION Statement [Smart Data Integration] [page 187]
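Exceptions can also be inspected and processed from the SQL console. A sketch; the exception ID 101 is a placeholder for an OID taken from the exceptions view:

```sql
-- List current remote source and remote subscription exceptions
SELECT * FROM "PUBLIC"."REMOTE_SUBSCRIPTION_EXCEPTIONS";

-- Retry the failed operation for a specific exception
PROCESS REMOTE SUBSCRIPTION EXCEPTION 101 RETRY;

-- Or ignore the exception and continue
PROCESS REMOTE SUBSCRIPTION EXCEPTION 101 IGNORE;
```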
Design time objects such as flowgraphs and replication tasks manage the replication and transformation of
data in SAP HANA smart data integration and SAP HANA smart data quality.
Parent topic: Administering Data Provisioning [page 35]
Related Information
You can execute design time objects including flowgraphs and replication tasks from the Data Provisioning
Design Time Object Monitor.
Restriction
Real-time flowgraphs and replication tasks can’t be executed from the Data Provisioning Design Time
Object Monitor.
Prerequisites
The user must have the following roles or privileges to execute flowgraphs and replication tasks.
Procedure
1. Select the flowgraph or replication task in the Design Time Objects table.
2. Click Execute.
a. If the object uses table type parameters, select the tables to use when executing the object.
b. If the object uses variable parameters, specify the values to use when executing the object.
c. Click Execute.
Results
The object execution begins and the task appears in the Task Monitor table.
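Behind the monitor, executing a design time object runs the task generated when the object was activated. As a sketch, assuming the generated task name matches the design time object (the schema and object names here are hypothetical), you can start and observe it from the SQL console:

```sql
-- Start the task generated for the activated flowgraph or replication task
START TASK "MYSCHEMA"."MyFlowgraph";

-- Observe task executions, most recent first
SELECT * FROM "SYS"."M_TASKS" ORDER BY START_TIME DESC;
```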
Related Information
You can schedule design time objects including flowgraphs and replication tasks within the Data Provisioning
Design Time Object Monitor.
Restriction
Real-time flowgraphs and replication tasks can’t be scheduled from the Data Provisioning Design Time
Object Monitor.
Prerequisites
The user must have the following roles or privileges to schedule flowgraphs and replication tasks.
● Enable scheduling via XS Job Admin Dashboard /sap/hana/xs/admin/jobs/ (The user who enables other
users to schedule needs the role sap.hana.xs.admin.roles::JobSchedulerAdministrator).
● To schedule design time objects, the job sap.hana.im.dp.monitor.jobs::scheduleTask needs to be enabled in
the XS Job Details page: /sap/hana/xs/admin/jobs/#/package/sap.hana.im.dp.monitor.jobs/job/
scheduleTask.
Procedure
1. Select the flowgraph or replication task in the Design Time Objects table.
2. Click the Schedules button.
The Schedules dialog appears.
3. To create a new schedule for the task, click Add.
a. Select the frequency (once or recurring), interval if recurring (year, month, week, day, hour, minute,
second), and the time (local, not server time or UTC) for the object execution.
b. If the object uses table type parameters, select the tables to use when executing the object.
c. If the object uses variable parameters, specify the values to use when executing the object.
d. Click Schedule.
The new schedule is added to the list of schedules for the object.
4. To remove an existing schedule, select the schedule and click Delete.
The schedule is removed from the list of schedules for the object.
5. Close the Schedules dialog.
Results
Your object executes as scheduled and you can monitor the results of each execution of the object.
Related Information
You can stop the execution of non-real-time flowgraphs within the Data Provisioning Design Time Object Monitor.
Prerequisites
The user must have the following roles or privileges to stop flowgraph execution.
Table 27: Roles and Privileges
Procedure
1. Select the task for the flowgraph in the Task Monitor table.
2. Click Stop.
Results
Related Information
You can start and stop tasks within the Data Provisioning Task Monitor.
Prerequisites
The user must have the following privileges to start or stop tasks:
Action Privilege
Procedure
Note
Tasks that belong to real-time design time objects can’t be started or scheduled from the Data
Provisioning Task Monitor.
● To stop a task, select the running task in the Task Execution Monitor table and click Stop.
Note that there might be a delay in stopping the task depending on when the cancellation was initiated and
the pending operation.
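The Stop button corresponds to the CANCEL TASK statement, which you can also issue from the SQL console. A sketch; the execution ID 42 is a placeholder for a value taken from the task monitoring view:

```sql
-- Find the execution ID of the running task
SELECT TASK_EXECUTION_ID, TASK_NAME, STATUS
  FROM "SYS"."M_TASKS" WHERE STATUS = 'RUNNING';

-- Cancel it; stopping may be delayed depending on the pending operation
CANCEL TASK 42;
```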
Related Information
You can schedule tasks within the Data Provisioning Task Monitor.
Prerequisites
The user must have the following roles or privileges to schedule tasks.
● Enable scheduling via XS Job Admin Dashboard /sap/hana/xs/admin/jobs/ (The user that enables
other users to schedule needs the role sap.hana.xs.admin.roles::JobSchedulerAdministrator).
● To schedule tasks, the Job sap.hana.im.dp.monitor.jobs::scheduleTask needs to be enabled in the XS Job
Details page: /sap/hana/xs/admin/jobs/#/package/sap.hana.im.dp.monitor.jobs/job/
scheduleTask.
Procedure
Results
Your task executes as scheduled and you can monitor the results of each execution of the task.
Related Information
In the Data Provisioning Task Monitor, you can view and execute partitions in the following ways:
Related Information
Context
In the Data Provisioning Task Monitor, you can view and execute specific partitions.
Procedure
If there are partitions (or table type parameters) configured for the flowgraph or reptask, the Set
Parameters for task <name> dialog displays.
2. Select the partition to execute and select Start.
Example
4.4.6.2 Execute Remaining Partitions
Context
After executing a task, in the Data Provisioning Task Monitor you can execute all the partitions that didn't run or that failed.
Procedure
Example
Context
After executing a task, in the Data Provisioning Task Monitor you can execute a specific partition that failed.
Procedure
2. In the Task Execution Monitor table, for a failed task execution, select the Partition Count.
Example
You can execute a specific failed partition using the following query:
Use the SAP HANA Enterprise Semantic Services Administration browser-based application to administer and
monitor artifacts for semantic services.
To launch the SAP HANA Enterprise Semantic Services Administration tool, enter the following URL in a web
browser:
http://<your_HANA_instance:port>/sap/hana/im/ess/ui
Component Description
Published Artifacts View and remove artifacts from the knowledge graph.
When you have drilled into a component, you can click the navigation menu in the upper-left corner to open
other components or return to the Home page.
Related Information
Monitor the Status of Publication Requests [page 75]
Manage Published Artifacts [page 78]
Data Profiling [page 81]
Setting Configuration Parameters [page 83]
Troubleshooting Enterprise Semantic Services [page 84]
Managing Agents and Adapters [page 35]
Managing Agent Groups [page 44]
Managing Remote Sources and Subscriptions [page 53]
Managing Design Time Objects [page 61]
About SAP HANA Enterprise Semantic Services
Description Role
(sap.hana.im.ess.services.views.datalineage:GET_ALL_IMPACTING_TABLES,
sap.hana.im.ess.services.views.datalineage:GET_IMPACTING_TABLES,
sap.hana.im.ess.services.views.datalineage:GET_LINEAGE_FROM_VIEW,
sap.hana.im.ess.services.views.datalineage:GET_LINEAGE_FROM_SCHEMA)
To use the search, ctid API, or the remote view (sap.hana.im.ess.services.views:REMOTE_OBJECTS): sap.hana.im.ess.roles::User
(sap.hana.im.ess.services.views.datalineage:GET_ACCESSIBLE_LINEAGE_FROM_VIEW,
sap.hana.im.ess.services.views.datalineage:GET_ACCESSIBLE_LINEAGE)
4.5.2 Enterprise Semantic Services Knowledge Graph and
Publication Requests
Enterprise Semantic Services uses a knowledge graph that describes the semantics of the datasets that are
available to users or applications connected to SAP HANA. It is natively stored in the SAP HANA database.
Datasets represented in the knowledge graph can include tables, SQL views, SAP HANA views, remote objects
in remote sources, and virtual tables that refer to remote objects.
An Enterprise Semantic Services publication request extracts information from a resource and publishes it in
the knowledge graph. When a user searches for an object based on its metadata and contents, the knowledge
graph provides the results.
The knowledge graph is populated by one or more of the following methods:
● An SAP HANA administrator uses the Enterprise Semantic Services Administration tool to publish
datasets.
● An SAP HANA administrator configures the Enterprise Semantic Services REST API so that an application
can publish datasets.
● An application that has already been configured to call the Enterprise Semantic Services REST API
populates the knowledge graph itself. For example, in SAP HANA Agile Data Preparation, when you add a
worksheet, its content is published to the knowledge graph.
Related Information
The SAP Enterprise Semantic Services (ESS) Administration tool lets you publish (or unpublish) artifacts.
You can publish or unpublish an artifact programmatically using the on-demand Enterprise Semantic Services
API. This method is useful for applications that manage the life cycle of their artifacts, that is, applications that
create, delete, and update SAP HANA artifacts. The application determines which artifacts to publish,
republish, or unpublish to the Enterprise Semantic Service knowledge graph. An example is the SAP Agile Data
Preparation application.
Administrators can also use the SAP HANA ESS Administration tool to publish or unpublish artifacts. This is
useful for applications that manage the life cycle of SAP HANA artifacts but do not want (or cannot easily)
integrate artifact management with Enterprise Semantic Services. An example is the SAP ERP application. In
those cases, it is easier to delegate to an administrator the task of determining which artifacts should be
published to Enterprise Semantic Services, depending on the needs of the application (for example, access to a
semantic service like search).
The best practice is to separate the artifacts published by applications using the on-demand ESS API from
those published by an administrator using the SAP ESS Administration tool. Therefore, the artifacts will belong
to different publisher groups, as shown on the SAP ESS Administration tool Published Artifacts tile.
Related Information
Use the SAP HANA Enterprise Semantic Services (ESS) Administration tool to publish artifacts in the
knowledge graph.
Procedure
Note that in the following browsers, you can search for an object within a node using a Filter: Publication
Schedules (catalog only), Published Artifacts, Data Profiling Blacklist, Entity Grid Tags.
1. Right-click the object and select Filter. If the Filter option does not display for the object, select Refresh.
2. In the Filter dialog box, start typing the object name or part of the name. The list filters as you type.
3. Select OK to save the filter on the node. To clear the filter later, right-click the object and select Remove
filter.
Note
Selecting Refresh on an object removes all filters from the object and its children.
Children of the artifact inherit the INCLUDED configuration of the parent unless specifically excluded.
4. Configure the publication schedule as follows.
a. For Next Scheduled Date, click in the box to select a date and time to next publish the artifact.
If you do not enter a date and time, it is set to the current date and time.
b. Enter a frequency Period.
5. Configure Data profiling options.
a. Select Discover content type to include content types in the publication. This can impact
performance and is not recommended for production scenarios.
b. Select Extract searchable values to extract values. This can impact performance and is not
recommended for production scenarios.
6. Select Active to enable the schedule.
Results
The objects appear in the browser tree marked with a solid green plus symbol (for included objects) or a solid
red minus symbol (for excluded objects). Inherited objects display an outlined green or red symbol. These
markers indicate that a request action has been initiated but is independent of the actual status of the request.
To view the status of the request, open the Publication Requests monitor. To view the results of requests, view
the Published Artifacts monitor.
The Publication Schedule displays a table that describes all of the scheduled publication and data profiling
requests.
The search field above the table lets you search for published artifacts.
Column Name: Description (Filterable)
Publication: The mode selected for the schedule (INCLUDE, EXCLUDE, INHERIT). (Filterable: Yes)
Next Scheduled Date: The timestamp for when the schedule will next execute. (Filterable: No)
Discover: Whether or not the option for Discover content type was selected. (Filterable: No)
Extracted Searchable Values: Whether or not the option for Extract searchable values was selected. (Filterable: No)
Warning: A warning indicates that a scheduled artifact has been deleted; the Delete option is then enabled on the schedule. (Filterable: Yes)
Delete: To delete a schedule for an artifact that has been deleted, select the checkbox and click Delete. (Filterable: No)
Context
When an artifact is configured as included for publication, a publication schedule appears in the table of
schedules. If the artifact is deleted, the publication schedule remains until the crawler can no longer detect the
artifact. A warning then appears for that schedule.
Procedure
Use Publication Requests to view and monitor publishing and profiling requests.
Context
To monitor the status of all requests, select the Publication Requests tile.
A user can search for only those catalog or remote objects that are described in the knowledge graph as a
result of successful publication requests. However, if the name of an artifact unexpectedly does not appear in
the search results, the publication of the corresponding artifact might have failed.
Related Information
Enterprise Semantic Services Publication Requests displays the status and any error messages for each
request.
The search field above the table lets you search for published artifacts.
Column Name: Description (Filterable)
Detail: Click the magnifying glass icon to see more details. The Detail page displays the following information and statistics. For the latter three statistics, you can see the associated number of requests that are Successful, Failed, In progress, or Not started. (Filterable: No)
ID: A number that helps identify a request in the list of requests. This number might not be unique in some cases. (Filterable: Yes)
Publication Artifact: Fully qualified name of the catalog object or repository object that was published. (Filterable: Yes)
Publisher Group: Indicates whether a publication was scheduled using the SAP HANA ESS Administration tool or using the REST API. In the former case, the predefined publisher group is sap.hana.im.ess.AdminPublisherGroup. In the latter case, a call to the publish() API must specify a publisherGroup parameter that defines the ownership of the specified publication in the knowledge graph. (Filterable: Yes)
Publisher: Name of the SAP HANA user who submitted the request. (Filterable: Yes)
Request Type: Request types on the Publication Requests monitor home page include the following. (Filterable: Yes)
● ON_DEMAND_PUBLISH
● ON_DEMAND_UNPUBLISH
● SCHEDULED_PUBLISH
● MONITORING_UNPUBLISH
● RETRY_ON_DEMAND_PUBLISH
● RETRY_ON_DEMAND_UNPUBLISH
● RETRY_SCHEDULED_PUBLISH
● PUBLISH_NOT_STARTED
● UNPUBLISH_NOT_STARTED
● UNPUBLISH_IN_PROGRESS
● PUBLISH_DONE
● PUBLISH_FAILED
● UNPUBLISH_FAILED
Status: Status values on the Publication Requests monitor home page include the following. (Filterable: Yes)
● REQUEST_PENDING
● IN_PROGRESS
● DONE
● DONE_WITH_ERRORS
● NOTHING_DONE
● STOPPING
● STOPPED
● PROFILING_NOT_STARTED
● PROFILING_IN_PROGRESS
● PROFILING_DONE
● PROFILING_FAILED
● INACTIVATED
● NOT PROFILABLE
● BLACKLISTED
● OBSOLETE
● PUBLICATION_FAILED
Retry: To retry one or more requests, select the Retry checkbox for each, or select the Retry checkbox in the column heading to select all failed requests, and select the Retry button. (Filterable: No)
Stop: To display requests that are currently in progress, select the Refresh icon to update the Status column. To move the latest publication requests to the top of the list, for Request Date select Sort Descending. (Filterable: No)
Use Published Artifacts to view the artifacts that have been published to the knowledge graph and remove
(unpublish) them.
Context
The knowledge graph describes the semantics of published artifacts (datasets). Metadata crawlers and data
profiling requests let you publish artifacts to the knowledge graph. Applications can then search for and
locate these objects and their metadata.
There are two ways to publish artifacts to the knowledge graph: The HTTP REST API publish() method and
the SAP HANA Enterprise Semantic Services Administration tool. If the same artifact gets published by both
mechanisms, the artifact is identified in the Published Artifacts monitor as belonging to a corresponding
publisher group. Therefore, publisher groups define ownership of specific publications in the knowledge graph.
When an artifact is published with a specific publisher group, it can be unpublished only by that group. If the
same artifact has been published with multiple publisher groups, it can be unpublished only when all
corresponding publisher groups unpublish it. This control helps avoid conflicts between applications and an
administrator using the Administration tool; otherwise, an application could publish an artifact and another
application or administrator could unpublish it.
In the case of the HTTP publish() API, the publisher group name is specific to the application; for example
for SAP HANA Agile Data Preparation, it could be com.sap.hana.im.adp. For the SAP HANA ESS Administration
tool, the predefined publisher group name is sap.hana.im.ess.AdminPublisherGroup.
To limit the size of both extracted metadata elements and extracted searchable attribute values in the
knowledge graph, you can also select artifacts to unpublish.
Procedure
Note that in the following browsers, you can search for an object within a node using a Filter: Publication
Schedules (catalog only), Published Artifacts, Data Profiling Blacklist, Entity Grid Tags.
1. Right-click the object and select Filter. If the Filter option does not display for the object, select Refresh.
2. In the Filter dialog box, start typing the object name or part of the name. The list filters as you type.
3. Select OK to save the filter on the node. To clear the filter later, right-click the object and select Remove
filter.
Note
Selecting Refresh on an object removes all filters from the object and its children.
View the Publication Requests monitor to confirm that the object was removed. For example, the Request
Type would indicate MONITORING_UNPUBLISH.
Related Information
Enterprise Semantic Services Published Artifacts displays artifacts that have been published and also lets you
remove (unpublish) artifacts from the knowledge graph.
The Published Artifact Browser displays all the published objects available in the Catalog, Content, and Remote
Sources folders. The size of an artifact is measured as the total number of searchable metadata elements and
searchable attribute values extracted from that artifact.
The search field above the table lets you search for published artifacts.
Name: Description (Filterable)
Publisher Group: Indicates whether a publication was scheduled using the SAP HANA ESS Administration tool or using the REST HTTP publish() API. When an artifact is published with a specific publisher group, the artifact can be unpublished only by the same group. If the same artifact has been published with different publisher groups, the artifact will be unpublished when all associated groups have unpublished it. For the SAP HANA ESS Administration tool, the predefined publisher group name is sap.hana.im.ess.AdminPublisherGroup. For the REST HTTP publish() API, a call to the publish() API must specify a publisherGroup parameter that determines the name of the publisher group. (Filterable: Not applicable)
Number of published artifacts: Number of basic artifacts recursively contained in the selected artifact when the selected artifact is a container. If the selected artifact is a basic artifact, the number of published artifacts is equal to 1. (Filterable: Not applicable)
Number of metadata elements: Total number of extracted metadata elements in the selected artifact. (Filterable: Not applicable)
Number of extracted values: Total number of attribute values extracted in the selected artifact. (Filterable: Not applicable)
Publication Artifact: Qualified name of an artifact that was published or contains published artifacts. The fully qualified name is described by three attributes. (Filterable: Yes; wildcards supported)
Oldest Refresh: Oldest date of updated basic artifacts in the corresponding container. This date is NULL in the case of a basic artifact. (Filterable: No)
Last Refresh: Most recent date of updated basic artifacts in the corresponding container. This date is the last update in the case of a basic artifact. (Filterable: No)
Basic Artifacts: Number of published basic artifacts recursively contained in the corresponding container. This value is 1 in the case of a basic artifact. (Filterable: Yes)
Removable Metadata: Number of non-shared metadata elements. It indicates the number of searchable metadata elements extracted from the corresponding published artifacts that are not shared with other published artifacts. This number gives an indication of how many metadata elements would be removed if you unpublished the artifact. (Filterable: Yes)
Removable Values: Number of searchable attribute values extracted for the catalog object represented by the published artifact. It indicates the number of metadata elements that are not shared with other published artifacts. This number gives an indication of how many profiled values would be removed in the case of unpublishing. (Filterable: Yes)
Unpublish: To unpublish the artifact, select the Unpublish check box and click Save. To unpublish all displayed artifacts, select the Unpublish check box in the column heading and click Save. (Filterable: No)
Enterprise Semantic Services can profile the contents of artifacts that have been published to the knowledge
graph.
Data profiling is a process that analyzes the values contained in specific columns of a dataset (the columns to
analyze are specified internally using ESS logic). Analysis of the data in a column discovers business types, and
searchable values can then be extracted and indexed using SAP HANA full text index.
Enterprise Semantic Services can profile the contents of the following artifacts:
● SQL tables
● Column views generated from graphical Calculation views, Attribute views, and Analytic views
● Virtual tables created from remote objects of a remote source with the PASSWORD credential type (see
the topic “CREATE REMOTE SOURCE” in the SAP HANA SQL and System Views Reference).
Note
When requesting profiling of a catalog object that does not result from an activation, you must grant the
SELECT privilege with grant option to the technical user _HANA_IM_ESS. (For activated objects, nothing is
required.)
Related Information
You can prevent artifacts (limited to catalog objects) from being profiled.
Context
Limiting the artifacts to profile lets you control the volume of searchable attribute values or avoid extracting
searchable values from datasets that hold sensitive or personal data.
To prevent an artifact from being profiled, an administrator can blacklist it. When a catalog object that
was previously profiled is blacklisted, all of its extracted searchable attribute values are immediately removed
from the knowledge graph. The catalog object is never profiled again, even if a data profiling schedule is
still associated with it.
Procedure
Note that in the following browsers, you can search for an object within a node using a Filter: Publication
Schedules (catalog only), Published Artifacts, Data Profiling Blacklist, Entity Grid Tags.
1. Right-click the object and select Filter. If the Filter option does not display for the object, select Refresh.
2. In the Filter dialog box, start typing the object name or part of the name. The list filters as you type.
3. Select OK to save the filter on the node. To clear the filter later, right-click the object and select Remove
filter.
Note
Selecting Refresh on an object removes all filters from the object and its children.
4. To blacklist the artifact, select the Blacklisted check box and click Save. To blacklist all displayed artifacts,
select the Blacklisted check box in the column heading and click Save.
To re-enable data profiling for an artifact, clear the check box and click Save.
4.5.6.2 Information Available on the Data Profiling Blacklist
The Enterprise Semantic Services Data Profiling Blacklist lets you view and choose which artifacts to blacklist
(removing their data profiling values).
The search field above the table lets you search for published artifacts.
Name: Description (Filterable)
Catalog Object: Fully qualified name of the selected object. (Filterable: Not applicable)
Blacklisted catalog objects: Number of blacklisted objects in the selected artifact. (Filterable: Not applicable)
Extracted values: The total number of extracted values for the selected object. (Filterable: Not applicable)
Schema Name: Name of the schema to which the artifact belongs. (Filterable: Yes; wildcards supported)
Extracted Values: Number of extracted searchable values for the object. (Filterable: Yes)
Blacklisted: Select the checkbox to blacklist the object and click Save. Clear the checkbox to enable data profiling for the object and click Save. (Filterable: No)
As an Enterprise Semantic Services administrator, you can set configuration parameter values, such as the
maximum sizes and rolling policy of persistent queues.
To set configuration parameters, in the SAP HANA studio Administration Console, a system administrator sets
values in the reserved table sap.hana.im.ess.eg.configuration::CONFIGURATION. To do so, specific database
procedures and user privileges are required.
Example
CALL "SAP_HANA_IM_ESS"."sap.hana.im.ess.eg.configuration::SET_CONFIGURATION_VALUE"('MAX_ESS_PROFILING_JOBS_SCHEDULE_TIME', value)
4.5.8 Troubleshooting Enterprise Semantic Services
Troubleshooting solutions, tips, and API error messages for Enterprise Semantic Services.
Related Information
Symptom: When importing the HANA_IM_ESS delivery unit (DU), an activation error occurs and appears in the SAP
HANA studio job log view.
Solution: Check the job log details in SAP HANA studio. If the error message is not meaningful, then:
Symptom: The ESS DU has been uninstalled using the uninstallation procedure. When you reimport the ESS DU,
activation errors occur, showing that dependent objects are not found.
Cause: Activated objects may have dependent objects that do not yet exist and therefore cause an error.
● Verify that all ESS DUs (including the DEMO DU) have been properly uninstalled through the SAP HANA
Application Lifecycle Management console.
● Verify that all related packages have been deleted (those whose names begin with sap.hana.im.ess...); otherwise, remove them as follows:
○ Create a workspace in the Repositories tab of SAP HANA studio.
○ Remove the packages from there.
Symptom The Publication Requests monitor displays a message that includes the phrase "If the failure repeats, contact SAP support."
Solution If the transaction that failed was a publish request, on the Publication Requests monitor for the artifact in question, select the Retry check box and click the Retry button.
Symptom Publishing requests appear as not processed in the SAP HANA ESS Administration tool's Publication
Schedules view. Request Status remains REQUEST PENDING or REQUESTED.
Symptom A publishing request failed (the Request Status is FAILED in the SAP HANA ESS Administration tool
Publication Schedules view).
Solution 1 Edit the view (make a small change, such as adding a space) and save it; this "upgrades" the format of the view.
Related Information
Symptom Publishing requests have been processed and all ESS background jobs are active, but data profiling requests
appear as not processed in the SAP HANA ESS Administration tool Data Profiling Monitor view. Profiling
Status remains as REQUEST PENDING.
Cause Investigate the error as follows:
● Set the trace level for xsa:sap.hana.im.ess to ERROR in the SAP HANA Administration Console.
● Inspect the latest diagnosis file xsengine_alert_xxx.trc.
● Check for the following error message:
Verify whether the script server was created during installation. To do so, in the SAP HANA Administration
Console, view the Configuration tab, and in daemon.ini, expand scriptserver.
Solution See “Configure Smart Data Quality” in the Installation and Configuration Guide.
Symptom A run-time object has the Request Status of FAILED in SAP HANA ESS Administration tool Data Profiling
Monitor view. An error with message code ESS805 "Insufficient privilege - user xxx must have SELECT with GRANT option on xxx" is returned.
Cause If the run-time object is not an activated object, then check that the SELECT right on the run-time object has
been granted WITH GRANT OPTION to the technical user _HANA_IM_ESS.
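If the privilege is missing, it can be granted directly in SQL. A minimal sketch, in which the schema and table names are placeholders; substitute the run-time object named in the ESS805 error:

```sql
-- Sketch: grant the ESS technical user SELECT with GRANT OPTION on the
-- object named in the error. "MYSCHEMA"."MYTABLE" is a placeholder.
GRANT SELECT ON "MYSCHEMA"."MYTABLE" TO _HANA_IM_ESS WITH GRANT OPTION;
```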
Symptom A data profiling request failed (the Request Status is FAILED in the SAP HANA ESS Administration tool Data
Profiling Monitor view).
Solution 2 Invalid arguments have been passed to the API. See API Error Messages [page 88].
Related Information
Symptom The SAP HANA user 'User' cannot perform a search using the ESS API. An internal server error message is
returned to the application.
● In the SAP HANA Administration Console, set the trace level for xsa:sap.hana.im.ess to INFO.
See Activate Error Trace for Enterprise Semantic Services [page 92].
● Inspect the latest diagnosis file xsengine_alert_xxx.trc.
● Check for the following error message:
This means that the user 'User' who is publishing has not been granted the role sap.hana.im.ess.roles::User. Grant the role with the following call:
CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('sap.hana.im.ess.roles::User','user');
Symptom A search query does not return an expected catalog object that exists in the SAP HANA instance.
OR
Suggestions do not show an expected term, although that term is associated with a database object in the
SAP HANA instance.
Solution Verify the user who is posing the search query has sufficient privileges to access the searchable elements of
the expected catalog object as in the following table.
Object type | Searchable element | Required authorization
table, SQL view, virtual table, column view | metadata | Owner, object privilege, READ CATALOG, DATA ADMIN
table, SQL view, virtual table | Profiled data | Owner, SELECT object privilege
Symptom A search query does not return an expected database object that exists in the SAP HANA instance.
Cause Assuming that the user has sufficient required authorizations, check the syntax of the search query.
Search query | Unexpected match | Correction
Sales ATT | Does not match value "AT&T" | "AT T" will match AT&T, AT-T, AT/T
Sales_2007
Symptom Acronyms or abbreviations are not matched by a search query, which, as a result, does not return an expected
database object that exists in the SAP HANA instance.
Solution To modify the entries in the term mapping table, see "Search Term Mapping" in the Modeling Guide for SAP
HANA Smart Data Integration and SAP HANA Smart Data Quality.
Symptom A search query does not return an expected database object that exists in the SAP HANA instance.
Cause Assuming that the user has sufficient authorizations and the syntax of the search query is correct, then use
the SAP HANA ESS Administration tool Publish and Unpublish Monitor view to verify whether the database
object has been successfully published to ESS, that is, its Request Status is SUCCESSFUL.
Solution If the Request Status is FAILED, take the appropriate actions according to the error code and message.
Error | API | Action
ESS100=Error occurred when extracting metadata from a publication artifact. | Publish | Not necessarily an API error. First check the trace log file for details on the error.
ESS153=Metadata Extraction Internal Error: No view definition. | Publish | Check the API artifact argument, or the SAP HANA view was concurrently deleted.
ESS154=Metadata Extraction Internal Error: No package name. | Publish | Check the API artifact argument, or the package was concurrently deleted.
ESS158=Package name ‘{0}’ does not exist. | Publish | Check the API artifact argument, or the artifact was concurrently deleted.
ESS159=HANA view ‘{0}/{1}.{2}’ does not exist. | Publish | Check the API artifact argument, or the artifact was concurrently deleted.
ESS160=Publication artifact of type ‘{0}’ is not supported. | Publish | Check the API arguments and the list of supported types of publication artifacts.
ESS161=Schema name ‘{0}’ does not exist. | Publish | Check the API artifact argument, or the schema was concurrently deleted.
ESS162=Catalog object XXX does not exist or is not supported. | Publish | Check the API artifact argument, or the catalog object was concurrently deleted.
ESS163=Invalid publication artifact qualified name ‘{0}’. | Publish | Not necessarily an API error. Verify the artifact exists.
ESS164=Invalid container path ‘{0}’. | Publish | Check the API artifact argument, or updates happened concurrently. Verify the path still exists.
Errors 180 – 186:
ESS180=Expecting character ‘{0}’ but encountered character ‘{1}’ in the publication artifact. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS181=Close quote without a matching open quote in string ‘{0}’ of the publication artifact. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS182=Invalid catalog publication artifact [’{0}’]. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS183=Invalid publication artifact ‘{0}’ [’{1}’]. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS184=First identification element of the publication artifact is not a catalog or content ‘{0}’. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS185=Unmatched quote for identification element {0} in publication artifact. | Publish | Parsing error in artifact name. Check the API artifact argument.
ESS186=Identification element {0} should not be empty. | Publish | Parsing error in artifact name. Check the API artifact argument.
Error containing string: “Search query syntax error” (Errors 500 – 526) | Search | Search syntax error. First check the explanation given in the error message. Then check the search query in the trace log with trace level DEBUG.
Error containing string: “Request error message” (Errors 600 – 645) | Search | Request message error. First check the explanation given in the error message. Then check the request message in the trace log with trace level DEBUG.
ESS705=Invalid scope name [{0}]. | Publish | Only roman letters (both lowercase and uppercase), digits, and underscore characters are valid.
ESS715=Invalid type filter [{0}] for container [{1}]. Please use one of [table,virtualtable,view,columnview] | Publish | Invalid argument for typeFilter in the API. Check the explanation given in the error message in the trace log.
ESS720=External source [{0}] not supported. | Publish | Check the API parameter “source”. Can only be LOCAL.
ESS725=Mandatory argument [{0}] is missing in function [{1}]. | Publish, CT on-demand, Search | A mandatory API argument is missing. First check the explanation given in the error message. Then check the search query in the trace log with trace level DEBUG.
ESS730=Invalid value [{0}] for argument [{1}] in function [{2}], expecting: [{3}]. | Publish, CT on-demand, Search | Invalid argument in the API. First check the explanation given in the error message. Then check the search query in the trace log with trace level DEBUG.
ESS735=Invalid value [{0}] at position [{1}] for argument [{2}] in function [{3}], expecting: [{4}]. | Publish, CT on-demand, Search | Invalid argument in the API. First check the explanation given in the error message. Then check the search query in the trace log with trace level DEBUG.
ESS808=Profiling Error: Invalid Runtime Object Type {0}. Only the following input values are supported: {1} | CT on-demand | Request to profile an unknown type of run-time object. First check the explanation given in the error message. Then check the search query in the trace log with trace level DEBUG.
ESS810=Profiling Error: {0} “{1}”.”{2}” does not exist. Runtime object does not exist | CT on-demand | Check the API argument, or the artifact was concurrently deleted.
ESS812=Profiling Error: “{0}” is not attribute of {1} “{2}”.”{3}” | CT on-demand | Check the API argument, or the artifact has been updated concurrently.
Tips for troubleshooting and preparing diagnostic information for SAP support.
Related Information
Procedure for error messages in the HANA ESS Administration Monitor view that include the phrase "If the failure repeats."
Prerequisites
The error trace for Enterprise Semantic Services must be activated before you retry the operation.
Context
Procedure
1. Go to the Enterprise Semantic Services Administration Monitor by entering the following URL in a web
browser:
http://<your_HANA_instance:port>/sap/hana/im/ess/ui
2. If the transaction that failed was a publish request, click the Publish and Unpublish Monitor tile, find your
run-time object in the Published Artifact column, and click the Retry icon.
Next Steps
If the operation still fails after you retry it, collect diagnostic information.
The error trace obtains detailed information about actions in Enterprise Semantic Services.
Prerequisites
To configure traces, you must have the system privilege TRACE ADMIN.
Context
To activate the error trace for Enterprise Semantic Services, follow these steps:
Procedure
1. Log on to SAP HANA studio with a user that has system privilege TRACE ADMIN.
2. In the Administration editor, choose the Trace Configuration tab.
3. Choose the Edit Configuration button for the trace that you want to configure.
4. Expand the XS ENGINE node.
5. Locate xsa:sap.hana.im.ess and check that the system trace level is set to INFO, ERROR, or DEBUG; it is usually set to DEFAULT.
6. Click Finish.
After the Enterprise Semantic Services error trace is activated, find the error message.
Procedure
1. Log on to SAP HANA studio with a user name that has system privilege CATALOG READ.
2. In the Administration editor, choose the Trace Configuration tab, and then go to the Diagnosis Files tab.
3. Look for one of the two most recent files (xsengine_alert_xxx.trc or
indexserver_alert_xxx.trc).
4. Go to the end of the file to see the error.
5 Maintaining Connected Systems
Avoid errors and minimize downtime by accounting for SAP HANA smart data integration components when
planning maintenance tasks for connected systems such as source databases and the SAP HANA system.
Related Information
Consider any effects to your data provisioning landscape before performing common SAP HANA maintenance
tasks.
Most commonly, suspend and resume any data provisioning remote sources before performing SAP HANA
maintenance operations.
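Suspending and resuming can be done in SQL as well as in the monitoring UI. A sketch using the standard SAP HANA ALTER REMOTE SOURCE syntax; MY_SOURCE is a placeholder remote source name:

```sql
-- Pause change-data capture before maintenance, then resume it afterward.
ALTER REMOTE SOURCE "MY_SOURCE" SUSPEND CAPTURE;
-- ... perform the SAP HANA or source-database maintenance ...
ALTER REMOTE SOURCE "MY_SOURCE" RESUME CAPTURE;
```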
Related Information
5.1.1 Update the SAP HANA System
Suspend and resume remote sources when you must update the SAP HANA system.
Prerequisites
Be sure to back up the SAP HANA system before starting the upgrade process.
Procedure
Related Information
Suspend and resume remote sources when you must perform a takeover and failback operation with SAP
HANA system replication.
Prerequisites
If the SAP HANA system is used as a data source, both the primary and secondary systems must be configured
with a virtual IP.
Procedure
Next Steps
Following an unplanned failback operation, monitor remote subscription exceptions in the Data Provisioning
Remote Subscription Monitor. To clear any exceptions, click Retry Operation.
Suspend and resume remote sources when you must perform a host auto-failover with SAP HANA scale-out.
Prerequisites
If the SAP HANA system is used as a data source, both the active and standby hosts must be configured with a
virtual IP.
Procedure
Next Steps
Following an unplanned failover operation, monitor remote subscription exceptions in the Data Provisioning
Remote Subscription Monitor. To clear any exceptions, click Retry Operation.
Related Information
Keep your databases performing at maximum capacity by learning about cleaning log files, recovery, and
preparing for restarts.
Change the Primary Archive Log Path During Replication [page 104]
Replication isn’t impacted when the primary archive log path is changed during replication.
Maintain the Source Database Without Propagating Changes to SAP HANA [page 104]
Use the Maintenance User Filter to define a source database user that can perform maintenance tasks
in a source database, without having the changes propagated to the SAP HANA system through data
provisioning adapters.
Related Information
When you must restart a source database, stop any remote source capture and restart the capture after
restarting the database.
Procedure
Related Information
When you change the password for the source database user, you must update any remote sources that access
the database with that user.
Procedure
Related Information
Suspend and Resume Remote Sources [page 57]
Avoid disk space issues and ensure smooth real-time replication by regularly cleaning up the LogReader
archives.
The Oracle, DB2, and MS SQL LogReader adapters rely on the log reader archive log to retrieve changed data
from the source databases. Over time, the archive log can grow to consume large amounts of disk space if it
isn’t cleaned up.
Related Information
Identify the Log Sequence Number (LSN) and use the RMAN utility to clean the log.
Procedure
1. Identify the ending truncation point of the last commit transaction in the SAP HANA server.
00000000000000000000000000000000000000000000000000000000000000000000000001298ee500000001000003940000670f01e0000001298ee20000000000000000
2. Use the ending truncation point to identify the LSN of the log file in Oracle.
The number of the Oracle redo log group is returned. For example: 3
4. Use the RMAN utility to clean the archive log.
The log files that can safely be cleaned have a sequence number smaller than the LSN minus the number
of the Oracle redo log group.
For example, with an LSN of 916 and an Oracle redo log group of 3, logs through sequence number 913
may be safely cleaned.
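The cleanup itself can be done with an RMAN DELETE command. A sketch under the assumption that sequence 913 was computed as above (LSN 916 minus 3 redo log groups); verify the number for your own system before deleting anything:

```sql
-- Run inside RMAN (rman target /), not SQL*Plus.
-- Deletes archive logs up to and including sequence 913 on thread 1.
DELETE ARCHIVELOG UNTIL SEQUENCE 913 THREAD 1;
```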
Identify the Log Sequence Number (LSN) and truncate the log using a Microsoft SQL Server command.
Procedure
1. Identify the end LSN of the last commit transaction in the SAP HANA server.
000000000000000000000000000000000000000000000000000000000000000000000000032b00000ef80004000004010000032b00000ef8000100000ef0000300000000
2. Use the truncation point to clean the archive log in MS SQL.
Identify the Log Sequence Number (LSN) and use the DB2 utility to clean the log.
Procedure
Note
<pds_username> is the same user specified in the remote source in SAP HANA.
db2flsn -q <log_sequence_number>
The filename of the log containing the current LSN is returned. For example: S0000354.LOG
db2 "get db cfg for <database_name>" | grep -E "First log archive method"
a. If the first archive log method is LOGRETAIN, use a DB2 command to delete the old log files.
b. If the first archive log method isn’t LOGRETAIN, manually delete log files older than the identified LSN log from the archive log path on the DB2 host.
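For the LOGRETAIN case, the cleanup can be done from the DB2 command line processor. A sketch, assuming db2flsn returned S0000354.LOG as in the example above:

```sql
-- DB2 CLP command: removes archived log files older than S0000354.LOG.
PRUNE LOGFILE PRIOR TO S0000354.LOG;
```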
5.2.4 Recover from Missing LogReader Archives
When archive logs are missing, replication fails. There are multiple solutions for recovering and restarting
replication.
Tip
If a secondary archive log path is set before replication, replication will automatically switch to the secondary log path if an archive log is missing from the primary path.
Procedure
If the error contains the filename of the missing archive log, the log can be restored from a backup. If
the error doesn’t contain a missing archive log filename, clean the archive log safely.
b. Restore the missing archive log file from the backup location on the source machine.
For example:
cp /rqa16clnx3_work1/oraclearchive/backup/1_2581_896793021.dbf /rqa16clnx3_work1/oraclearchive/o12lnxrdb
5.2.5 Recover from LogReader Database Upgrades
After upgrading a source database, you may need to force a version migration to resume remote sources on
LogReader adapters.
Context
After a Microsoft SQL Server or Oracle source database is upgraded, the associated log reader adapter may be
unable to resume remote sources, and an error such as the following appears in the framework trace log:
LogReader adapters store the source database version in an internal table. When a remote source is resumed,
the adapter compares the current database version with the stored version and prompts a migration if the
versions don’t match.
Add a parameter to the Data Provisioning Agent configuration file to force a version migration.
Procedure
1. In the agent configuration file, add the migration parameter for your database type.
Note
5.2.6 Change the Primary Archive Log Path During
Replication
Replication isn’t impacted when the primary archive log path is changed during replication.
You can use the Data Provisioning Remote Subscription Monitor to verify that the replication for all tables hasn’t
been impacted.
Use the Maintenance User Filter to define a source database user that can perform maintenance tasks in a
source database, without having the changes propagated to the SAP HANA system through data provisioning
adapters.
● For Oracle and Microsoft SQL LogReader adapters, database transactions including INSERT, UPDATE, and
DELETE, and DDL changes such as ALTER TABLE
● For the SAP HANA adapter, database transactions such as INSERT, UPDATE, and DELETE
Prerequisites
Determine the source database user that performs the maintenance tasks.
Procedure
2. In the SAP HANA Web-based Development Workbench, choose Provisioning Remote Sources and
right-click the remote source.
3. In the Maintenance User Filter option, specify the username of the database maintenance user.
4. Re-enter the logon credentials for the source database, and save the changes.
5. Resume the remote sources for the source database.
Related Information
Re-execute a replication task when Microsoft SQL Server fails over during the initial load.
Context
If a Microsoft SQL Server failover happens during the initial load of a replication task execution, the replication
task fails.
For example:
Internal error: Remote execution error occurred when getting next row from Result Set.
Note
If the failover occurs during real-time replication, no action is required, and the replication continues
automatically.
Procedure
1. In the Data Provisioning Design Time Object Monitor locate the replication task marked as FAILED.
2. After the Microsoft SQL Server failover completes, re-execute the replication task.
5.2.9 Recover with SAP HANA System Replication Failover
Re-execute a replication task when SAP HANA fails over during the initial load.
Context
If an SAP HANA failover happens during the initial load of a replication task execution, the replication task fails.
For example:
Note
If the failover occurs during real-time replication, see Failover with SAP HANA Scale-Out [page 96].
Procedure
1. In the Data Provisioning Design Time Object Monitor locate the replication task marked as FAILED.
2. After the SAP HANA failover completes, re-execute the replication task.
6 Troubleshooting and Recovery Operations
This section describes common troubleshooting scenarios for your SAP HANA smart data integration
landscape, as well as recovery steps to follow when errors occur.
Related Information
Diagnose and resolve common failure scenarios for initial queues in real-time replication tasks.
Real-time replication tasks may fail when the associated adapters haven’t been properly configured on
the Data Provisioning Agent.
Load Clustered and Pooled Table Metadata into SAP HANA [page 125]
Real-time replication tasks with clustered or pooled tables may fail when metadata hasn’t been
correctly loaded into the SAP HANA database.
Related Information
Adapters used for real-time replication require the remote source database user to be configured with
privileges specific to the source database type.
Context
Insufficient user privileges are indicated in the <DPAgent_root>/log/framework.trc trace log file.
For example:
Required privileges and/or roles not granted to the database user [ZMTEST].
Missing privileges and/or roles are [SELECT ANY TRANSACTION]
Procedure
4. Execute the replication task.
Related Information
A replication task may fail if remote source parameters are specified with invalid or out-of-range values, or if
values for any mandatory dependent parameters aren’t specified.
Context
Remote source parameter errors are indicated in the <DPAgent_root>/log/framework.trc log file, and
may vary based on the specific remote source parameter.
For example, the following scenarios may cause a remote source parameter error:
[ERROR]
com.sap.hana.dp.oraclelogreaderadapter.OracleLogReaderAdapter.cdcOpen[669]
- Adapter validation failed.
com.sap.hana.dp.cdcadaptercommons.validation.ValidationException: Failed to
validate properties. Error(s):
The value [10] of property [lr_max_op_queue_size] is not in the range [25,
2147483647].
● An Oracle remote source is configured to use TNSNAMES, but the TNSNAMES file and connection
parameters aren’t specified.
● An Oracle remote source is configured as a Multitenant Database, but the container database service
name, pluggable database service name, or Oracle multitenant credentials aren’t specified.
Property [cdb_password] is mandatory.
Property [cdb_service_name] is mandatory.
Property [cdb_username] is mandatory.
Property [pds_service_name] is mandatory. Context: null
com.sap.hana.dp.adapter.sdk.AdapterException: Adapter validation failed.
Failed to validate properties. Error(s):
Property [cdb_password] is mandatory.
Property [cdb_service_name] is mandatory.
Property [cdb_username] is mandatory.
Property [pds_service_name] is mandatory.
Resolve the error by specifying complete remote source parameter values within the valid range.
Procedure
1. Alter the remote source parameter and specify a value within the valid range or any missing dependent
parameter values.
2. Re-create the replication task or edit the existing replication task.
3. Execute the replication task.
Related Information
Real-time replication tasks may fail when the remote source database hasn’t been properly configured to
support real-time replication.
Enable the Secure File LOB Setting on Oracle [page 114]
Oracle LOB data may not replicate correctly when the DB_SECUREFILE setting is set to “ALWAYS” or
“PREFERRED”.
Related Information
Real-time replication tasks on Oracle remote sources may fail if the Oracle archive log hasn’t been enabled.
Context
For example:
[ERROR] com.sap.hana.dp.cdcadaptercommons.StatefulAdapterCDC
$CDCOpened.addSubscription[791] - [oracleSrc] Failed to add the first
subscription. SubscriptionSpecification [header=remoteTableId=466,
remoteTriggerId=178688, sql=SELECT "T1"."INT_C1", "T1"."VARCHAR_C2" FROM
"""LR_USER"".""TESTTB1""" "T1" , subscription=, customId=, seqID=SequenceId
[value=[0, 0, 0, 0]], isLastSubscription=true, withSchemaChanges=false,
firstSubscriptionOnTable=true, lastSubscriptionOnTable=true]
com.sap.hana.dp.adapter.sdk.AdapterException: Failed to start Log Reader because
of failure to initialize LogReader. Error:Oracle LogMiner must be installed in
order to use this command. Please verify that LogMiner is installed.
For example:
Resolve the error by enabling the archive log on the Oracle source database.
Procedure
6.1.3.2 Enable Supplemental Logging on Oracle
Real-time replication tasks on Oracle remote sources may fail if supplemental logging on the Oracle source
database hasn’t been enabled or doesn’t match the supplemental logging parameter in the remote source.
Context
An error may be indicated in the <DPAgent_root>/log/framework.trc log file in the following scenarios.
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[329] -
Adapter validation failed. Minimal supplemental logging not enabled.
● The remote source parameter is set to “database”, but PRIMARY KEY and UNIQUE KEY database-level
supplemental logging isn’t enabled.
For example:
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[329] -
Adapter validation failed. Database PRIMARY KEY and/or UNIQUE supplemental
logging is not enabled.
● The remote source parameter is set to “table”, but table-level supplemental logging isn’t enabled.
For example:
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[329] -
Adapter validation failed. Primary key and/or unique key supplemental logging
is not turned on for these Oracle system tables: [LOBFRAG$, TABPART$,
TABSUBPART$, COLTYPE$, INDSUBPART$, MLOG$, TYPE$, INDCOMPART$, NTAB$,
TABCOMPART$, COLLECTION$, LOB$, PROCEDUREINFO$, SNAP$, OPQTYPE$, DEFERRED_STG$, LOBCOMPPART$, ARGUMENT$, RECYCLEBIN$, SEQ$, ATTRIBUTE$, INDPART$]
Resolve the error by enabling a supplemental logging level that matches the remote source configuration.
Procedure
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY, UNIQUE) COLUMNS;
● Enable table-level supplemental logging on each table specified in the error message.
ALTER TABLE <TABLE_NAME> ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
For example:
ALTER TABLE SYS.ARGUMENT$ ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
ALTER TABLE SYS.ARGUMENT$ ADD SUPPLEMENTAL LOG DATA (UNIQUE INDEX) COLUMNS;
ALTER TABLE SYS.ATTRIBUTE$ ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
ALTER TABLE SYS.ATTRIBUTE$ ADD SUPPLEMENTAL LOG DATA (UNIQUE INDEX) COLUMNS;
...
Oracle LOB data may not replicate correctly when the DB_SECUREFILE setting is set to “ALWAYS” or
“PREFERRED”.
Context
The Oracle Log Reader adapter supports Secure File LOB only when the primary database setting
DB_SECUREFILE is set to “PERMITTED”.
Procedure
If the Oracle service name isn’t specified correctly, the remote source can be created but remote tables can’t
be browsed.
Context
When the Oracle service name is set as a domain name, the “Database Name” remote source parameter must
be specified as the Oracle service name.
For example, when checking the Oracle service name:
Procedure
For example:
USE_SID_AS_SERVICE_LISTENER=ON
LISTENER =
(DESCRIPTION_LIST =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = <agent_hostname>)(PORT = 1521))
(ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
)
)
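To confirm what the database actually registers as its service name, one common check (an assumption, not the guide's own example) is to query v$parameter in SQL*Plus:

```sql
-- Returns the configured service name(s); compare the result with the
-- "Database Name" remote source parameter.
SELECT value FROM v$parameter WHERE name = 'service_names';
```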
Replication tasks may fail if the Microsoft SQL Server transaction log can’t be read or is full.
Context
Resolve the error by installing the sybfilter driver and making the log file readable.
● If the Microsoft SQL Server transaction log is full, the following error may be reported:
Verify that the auto-extend option for the Microsoft SQL Server log file is enabled, and that the disk where
the log file is located has available free space.
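Before extending the log or freeing disk space, it can help to check how full the log actually is. A sketch using the standard SQL Server command:

```sql
-- Reports transaction log size and percent used for every database
-- on the instance.
DBCC SQLPERF(LOGSPACE);
```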
Related Information
Replication tasks may fail when the data capture mode is set to “Native Mode” and the Microsoft SQL Server
hasn’t been initialized.
Context
The Microsoft SQL Server database must be initialized to ensure that the log reader adapter can open the
supplemental log of each table marked for replication.
Note
For example:
Related Information
6.1.3.7 Enable the DB2 Archive Log
Real-time replication tasks on DB2 remote sources may fail if the DB2 archive log hasn’t been enabled.
Context
For example:
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[329] -
Failed to add the first subscription. Error: Failed to start Log Reader because
of failure to initialize LogReader. Error: An error occured while getting DB
parameter <LOGARCHMETH1>. Exception: DB2 SQL Error: SQLCODE=-286,
SQLSTATE=42727, SQLERRMC=8192;QARUSER, DRIVER=4.18.60
Resolve the error by setting the primary DB2 UDB database transaction logging to archive logging.
Related Information
Real-time replication tasks on DB2 remote sources may fail if the user temporary tablespace hasn’t been
created.
Context
For example:
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[329] -
Failed to add the first subscription. Error: Failed to start Log Reader because
of failure to initialize LogReader. Error: An error occured while getting DB
parameter <LOGARCHMETH1>. Exception: DB2 SQL Error: SQLCODE=-286,
SQLSTATE=42727, SQLERRMC=8192;QARUSER, DRIVER=4.18.60
Related Information
Remote sources for DB2 can’t be created if the DB2 port is outside the allowed range of 1-65535.
Context
For example:
Resolve the error by configuring DB2 with a port in the valid range (1-65535).
Procedure
1. Verify the DB2 port entry in the services file:
○ On Windows, C:\Windows\System32\drivers\etc\services
○ On UNIX or Linux, /etc/services
2. Update the database manager configuration.
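As a sketch of both steps; the service name and port are examples only:

```
# Entry in the services file, for example:
#   db2c_db2inst1    50000/tcp
# Point the instance at that entry, then restart it:
db2 update dbm cfg using SVCENAME db2c_db2inst1
db2stop
db2start
```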
6.1.3.10 Verify the DB2 Native Connection Settings
The LogReader may fail to initialize properly when the DB2 native connection fails.
Context
When the native connection fails, you may see the following error in the log:
[ERROR]
com.sap.hana.dp.db2logreaderadapter.DB2RepAgentWrapper.initialize[858] -
Failed to initialize LogReader.
Could not find Resource Bundle containing index: Could not get the log end
locator because: Native database connection failed with code <-1>.
Resolve the error by verifying that the connection details for your DB2 database are configured correctly:
● Host
● Port
● Database Name
● Database Source Name
Real-time replication tasks on the SAP ASE adapter may fail if an entry for the adapter hasn’t been added to
the interface file of the data server.
Context
Resolve the error by adding an entry for the SAP ASE adapter to the interface file on the SAP ASE server.
Procedure
1. Add the entry to the interface file on the SAP ASE data server.
<entry_name>
master tcp ether <agent_hostname> <port>
query tcp ether <agent_hostname> <port>
Note
The entry name must match the adapter instance name specified in the remote source configuration.
The port number must match the SAP ASE adapter server port configured in <DPAgent_root>/
Sybase/interfaces.
Real-time replication tasks may fail when the associated adapters haven’t been properly configured on the
Data Provisioning Agent.
Replication tasks may fail if the JDBC driver isn’t provided or doesn’t match the version of the source database.
Context
When a compatible JDBC driver isn’t provided, the replication task fails with an error.
● When available, use the JDBC driver distributed with the source database installation.
For example:
○ For Oracle, $ORACLE_HOME/jdbc/lib/ojdbc7.jar
○ For DB2, $INSTHOME/sqllib/java/db2jcc4.jar
● The JDBC driver version shouldn’t be lower than the source database version.
You can use the java command to verify the JDBC driver version. For example:
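Hedged sketches of such checks; the jar file names follow the examples above:

```
# Oracle JDBC drivers print their version when run directly
java -jar ojdbc7.jar

# The IBM DB2 JCC driver ships a version utility class
java -cp db2jcc4.jar com.ibm.db2.jcc.DB2Jcc -version
```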
Procedure
Replication errors may occur if an incompatible Java Runtime Environment (JRE) is provided.
Context
We recommend that you use the SAP JVM bundled with the Data Provisioning Agent.
For complete information about supported Java Runtime Environment versions, see the Product Availability
Matrix (PAM).
Remote subscriptions may fail to queue if the DB2 runtime environment variables aren’t set on the Data
Provisioning Agent host.
Context
For example:
[INFO ]
com.sap.hana.dp.db2logreaderadapter.DB2RepAgentWrapper.initialize[1186] -
Initializing LogReader ...
[ERROR]
com.sap.hana.dp.adapter.framework.core.WorkerThread.processRequest[354] - /
rqac48lnx3_work1/dpfiles/dataprovagent/LogReader/lib/linux64/libsybrauni98.so:
libdb2.so.1: cannot open shared object file: No such file or directory Context:
java.lang.UnsatisfiedLinkError: /rqac48lnx3_work1/dpfiles/dataprovagent/
LogReader/lib/linux64/libsybrauni98.so: libdb2.so.1: cannot open shared object
file: No such file or directory
Resolve the error by setting the DB2 runtime environment variables before starting the Data Provisioning
Agent.
Procedure
1. Set the DB2 environment variables by sourcing the db2profile script. For example:
source /home/db2inst1/sqllib/db2profile
Tip
By sourcing the profile in the DPAgent_env.sh file, the environment variables are set each time you
use the Data Provisioning Agent Configuration tool.
2. Restart the agent with the Data Provisioning Agent Configuration tool.
3. Re-execute the replication task.
Real-time replication tasks may fail when attempting to queue a remote subscription if the source database or
table has uncommitted transactions.
Context
For example:
SQL ERROR--
Message: ORA-00054: resource busy and acquire with NOWAIT specified or timeout
expired
SQLState: 61000
Remote Code: 54
Message: SQL execution failed
Resolve the error by committing any transactions in the source database or table.
Procedure
1. Ensure that any transactions have been committed in the source database or table.
2. Re-execute the replication task.
For trigger-based adapters such as SAP HANA or Teradata, reset the remote subscription.
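On Oracle sources, a query like the following sketch can help identify the sessions still holding locks before you commit or end them; the data dictionary views require appropriate privileges:

```sql
-- Sessions holding locks on objects (Oracle data dictionary views)
SELECT o.object_name, s.sid, s.serial#, s.username
  FROM v$locked_object l
  JOIN dba_objects o ON o.object_id = l.object_id
  JOIN v$session   s ON s.sid = l.session_id;
```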
Log reader adapters require an instance port that must not be used by any other applications on the Data
Provisioning Agent host.
Context
Log reader adapter instances can’t be created when the specified instance port is already in use.
Procedure
6.1.7 Resolve Data Provisioning Server Timeouts
When the default message timeout value for the Data Provisioning Server is too short, real-time replication
tasks may fail during the initial load operation.
Context
The data provisioning adapter used by the replication task reports a timeout error.
For example:
Error: (256, 'sql processing error: QUEUE: SUB_T002: Failed to add subscription
for remote subscription SUB_T002[id = 165727] in remote source
MSSQLECCAdapterSrc[id = 165373]. Error: exception 151050: CDC add subscription
failed: Request timed out.\n\n: line 1 col 1 (at pos 0)')
The error can be resolved by increasing the message timeout value or cleaning up the source database archive
log.
Procedure
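As a sketch of the timeout increase, assuming the parameter is messageTimeout in the framework section of dpserver.ini (both names are assumptions; verify them against your system's configuration):

```sql
-- Value is in seconds; 600 is an example
ALTER SYSTEM ALTER CONFIGURATION ('dpserver.ini', 'SYSTEM')
  SET ('framework', 'messageTimeout') = '600' WITH RECONFIGURE;
```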
6.1.8 Load Clustered and Pooled Table Metadata into SAP HANA
Real-time replication tasks with clustered or pooled tables may fail when metadata hasn’t been correctly
loaded into the SAP HANA database.
Context
For example:
[ERROR] com.sap.hana.dp.cdcadaptercommons.StatefulAdapterCDC
$CDCOpened.addSubscription[791] - [OracleECCAdapterSrc] Failed to add the
first subscription. SubscriptionSpecification [header=remoteTableId=68,
remoteTriggerId=160778, sql=, subscription=, customId=, seqID=SequenceId
[value=[0, 0, 0, 0]], isLastSubscription=true, withSchemaChanges=false,
firstSubscriptionOnTable=true, lastSubscriptionOnTable=true]
java.lang.NullPointerException: while trying to invoke the method
com.sap.hana.dp.adapter.sdk.parser.Query.getFromClause() of a null object loaded
from local variable 'query'
Note
Beginning with the ABAP Platform 1808/1809 release, clustered and pooled tables aren’t supported in SAP S/4HANA.
Procedure
1. Edit the dictionary replication script with the name of the remote source and the SAP HANA schema name.
By default, the dictionary replication script is located at <DPAgent_root>/LogReader/scripts/
replicate_dictionary.sql.
2. Execute the dictionary replication script in the SAP HANA database.
3. Re-execute the failed replication task.
Replication tasks may fail and generate remote source or subscription exceptions for a number of reasons.
Generally, recovering replication involves processing any exceptions in addition to other tasks.
Replication may stop if there are log reader errors.
Recover from Source Table and Replication Task Recreation [page 128]
Restore replication after a failure when a source table and replication task are both re-created in a short
timeframe.
For more information about log reader errors, check the trace file located in the log folder of your Data
Provisioning Agent instance. You may also choose to increase the trace log levels for the Data Provisioning
Server.
Related Information
Activate Additional Trace Logging for the Data Provisioning Server [page 148]
Table replication may fail when there’s a DDL schema change in the source table and the replication task isn’t
configured to replicate with structure.
Context
When table replication fails, remote source and remote subscription exceptions may be generated.
Recover replication by processing any exceptions and recreating the replication task.
Procedure
1. In the Data Provisioning Remote Subscription Monitor, verify any reported exceptions.
Schema change exceptions on the failed replication task can’t be ignored or retried.
2. In the SAP HANA Web-based Development Workbench editor, drop the failed replication task.
3. Re-create the replication task with the “Initial + realtime with structure” replication behavior.
4. Activate and execute the new replication task.
5. In the Data Provisioning Remote Subscription Monitor, ignore the schema change exception.
6.2.3 Recover from a Truncated Source Table
Context
Restore replication by truncating the target table and processing any remote subscription exceptions.
Procedure
1. In the Data Provisioning Remote Subscription Monitor, verify any remote subscription errors.
2. Truncate the target table in the SAP HANA system to restore data consistency between the source and
target tables.
3. In the Data Provisioning Remote Subscription Monitor, ignore the TRUNCATE TABLE errors.
Restore replication after a failure when a source table and replication task are both re-created in a short
timeframe.
Context
This failure can occur when the following conditions are met:
● There are multiple active remote subscriptions to the same source database.
● There’s a significant amount of change data on the source table or other subscribed tables that hasn’t yet been replicated to the SAP HANA target.
● The remote subscription or replication task is dropped, and the source table and replication task are re-created in a short timeframe.
Under these circumstances, Log Reader adapters may capture the DROP TABLE DDL and attempt to replicate
it to the SAP HANA target, generating an exception.
Restore replication by processing any remote source and subscription exceptions.
Procedure
1. In the Data Provisioning Remote Subscription Monitor, check for remote subscription errors.
2. Restore replication by selecting the exception and choosing Ignore.
Replication may fail when a new row is inserted in a source table that has a primary key and the target table
already contains the row.
Note
Updating or deleting source table rows that don’t exist in the target table doesn’t cause a replication failure.
Context
Recover replication by processing any remote subscription exceptions and re-executing the replication task.
Procedure
6.2.6 Recover from Data Inconsistencies
After a replication task has failed, you can use the lost data tracker in the Data Provisioning Agent command-
line configuration tool to identify and correct data inconsistencies that may have occurred.
Prerequisites
Before attempting to recover from data inconsistencies, ensure that you have downloaded and installed the
correct JDBC libraries. See the SAP HANA smart data integration Product Availability Matrix (PAM) for details.
Place the files in the <DPAgent_root>/lib folder.
Context
The tool tracks data inconsistencies by using a specified column as a partition in both the source and target
tables, and then comparing data row counts for each partition. Inconsistencies between source and target can
be printed as a SQL fix plan, or corrected automatically.
Procedure
1. Specify the source and target database connection parameters. For example:
source_db_host=<hostname>
source_db_port=<port_number>
source_db_name=<database_name>
source_db_user=<username>
source_db_pswd=<password>
target_db_host=<hostname>
target_db_port=<port_number>
target_db_name=<database_name>
target_db_user=<username>
target_db_pswd=<password>
3. Start the configuration tool with the --LostDataTracker parameter.
Tip
○ Tracking mode
○ row: Inconsistencies are processed row by row.
○ block: Inconsistencies are processed by partition block.
Default: block
○ Whether to generate a SQL fix plan for any inconsistencies
Default: yes
○ Whether to automatically fix any inconsistencies
Default: yes
○ Virtual table name to use when processing fixes
Next Steps
After specifying all configuration parameters, the lost data tracker automatically starts comparing the source
and target tables.
If a SQL fix plan was requested, a SQL command for each fix is displayed.
If automatic inconsistency correction was specified, the fix for each inconsistency detected is automatically
applied to the target table.
Related Information
SAP HANA Smart Data Integration Product Availability Matrix (PAM)
Process any remote subscription exceptions and restore replication when a network interruption or other
communication issue occurs between the Data Provisioning Agent and the SAP HANA server.
Context
A communication issue may generate remote subscription exceptions.
In some scenarios, no exceptions are generated, but changed source data isn’t replicated to the target. In these
scenarios, an error message may be indicated in the Data Provisioning Server trace log.
Procedure
○ On Windows, netstat -na | findstr "5050"
○ On Linux, netstat -na|grep 5050
Additionally, verify that no operating system firewall rules are configured for the agent TCP port.
4. Restart the Data Provisioning Server on the SAP HANA system.
5. Resume all sources on the affected agent.
6. In the Data Provisioning Remote Subscription Monitor, retry or ignore any remaining remote subscription
exceptions.
7. Verify the remote source and remote subscription statuses.
If replication from an Oracle source system is stopped or delayed, or if you notice poor performance, check the
instance log for reports of unsupported transactions.
Context
When the Oracle LogMiner starts scanning from the middle of a transaction and fails to translate the raw
record, it reports an unsupported operation. This unsupported operation occurs most often on UPDATE
operations involving wide tables.
The Oracle Log Reader adapter can manage these records by using a standby scanner, but frequent
occurrence of unsupported operations can slow scan performance.
6.2.9 Resolve Locked SAP HANA Source Tables
A replication task or flowgraph may fail if an SAP HANA source table is locked and the remote subscription
request times out.
Context
A replication task or flowgraph that has timed out due to a locked table may fail with the following error:
SAP DBTech JDBC: [256]: sql processing error: QUEUE: RT_SFC100: Failed to add
subscription for remote subscription RT_SFC100.Error: exception 151050: CDC add
subscription failed: Request timed out.
Additionally, you can verify a locked table by checking the blocked transactions in the SAP HANA source.
To resolve a locked SAP HANA source table, refer to SAP Note 0001999998 .
When log reader-based real-time replication has stopped, you can try resetting the remote subscription.
Procedure
Tip
If you have the required privileges, you can also reset remote subscriptions from the Data Provisioning
Remote Subscription Monitor. For more information, see Manage Remote Subscriptions.
Note
Execute the script using the same user that is configured for replication.
○ On Windows, use the Services manager in Control Panel.
○ On Linux, run ./dpagent_service.sh stop.
5. (Optional) Enable additional logging on the Data Provisioning Agent.
a. Open the Agent configuration file in a text editor.
Increasing the log level generates additional information useful for further debugging, if necessary. You can
safely revert the log level after resolving the issue.
6. Restart the SAP HANA Data Provisioning Agent service.
Tip
If you have the required privileges, you can also queue and distribute remote subscriptions from the
Data Provisioning Remote Subscription Monitor. For more information, see Manage Remote
Subscriptions.
Next Steps
If you’re unable to reset the remote subscription, you may need to clear any outstanding remote subscription
exceptions first.
Replication may be stopped due to remote subscription exceptions. For example, a remote subscription
exception may be generated when a primary key violation, constraint violation, or null primary key occurs.
● Remote subscription exceptions appear in the task execution log.
● E-mail alerts may be sent to the administrator.
● Exceptions are displayed in monitoring, or when you query the remote subscription exception public view.
After correcting the root cause of the remote subscription exception, you can clear the exceptions with the PROCESS REMOTE SUBSCRIPTION EXCEPTION SQL statement.
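A sketch of that statement, following the syntax described in the SQL reference section of this guide; the exception ID is a placeholder taken from the remote subscription exception monitoring view:

```sql
-- Retry the failed operation, or ignore the exception once its
-- root cause is corrected
PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_id> RETRY;
PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_id> IGNORE;
```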
Resume replication and ensure data consistency when you experience a crash or unplanned system outage.
6.3.1 Recover from an Index Server Crash
Verify your data consistency when you experience an SAP HANA index server crash.
Procedure
1. After SAP HANA restarts, verify that all SAP HANA processes show an active GREEN status.
2. Compare source and target table row counts to verify data consistency.
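A minimal sketch of the row-count comparison, assuming a virtual table exposes the source table in SAP HANA; all object names are placeholders:

```sql
SELECT (SELECT COUNT(*) FROM "<schema>"."<source_virtual_table>") AS source_rows,
       (SELECT COUNT(*) FROM "<schema>"."<target_table>")         AS target_rows
  FROM DUMMY;
```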
Context
When the Data Provisioning Server crashes, an error may be indicated in the /usr/sap/<sid>/HDB<instance>/<hana_machine_name>/trace/dpserver_alert_<hana_machine_name>.trc log file.
Recover from the crash by restarting replication and processing any remote subscription exceptions.
Procedure
5. Process any remote subscription exceptions remaining after change data capture has resumed.
Next Steps
In a multiple container SAP HANA configuration, the Data Provisioning Server is managed individually in each
tenant database. Repeat the recovery process for each tenant database.
The Data Provisioning Agent may crash if the Java virtual machine is configured with insufficient available
memory.
Context
Recover by adjusting the agent's maximum available memory and restarting the agent.
Procedure
1. Check the available memory on the Data Provisioning Agent host machine.
For example:
# free -m
total used free shared buffers cached
Mem: 258313 196343 61970 0 589 53022
-/+ buffers/cache: 142731 115582
Swap: 2047 0 2047
Replication may stop if there’s an unplanned source database outage due to a network or hardware issue.
Note
On Oracle, the Oracle Log Reader adapter automatically reconnects to the source database when available
and resumes replication without any additional actions.
Context
A source database outage may be indicated by multiple error types in the logs.
Recover by processing remote subscription exceptions and resuming replication on the remote source.
Procedure
The dpadapterfactory process used in the SAP ASE and SAP ECC ASE adapters restarts automatically if it’s
stopped or crashes.
Context
For example:
[ERROR] com.sap.hana.dp.adapter.framework.cpp.CppAdapterManager.read[236] -
Socket closed
com.sap.hana.dp.adapter.sdk.AdapterException: Socket closed
at
com.sap.hana.dp.adapter.framework.socket.SocketConnector.read(SocketConnector.jav
a:103)
at
com.sap.hana.dp.adapter.framework.cpp.CppAdapterManager.read(CppAdapterManager.ja
va:232)
This section describes error situations related to the Data Provisioning Agent and their solutions.
6.4.1 Data Provisioning Agent Log Files and Scripts
Various log files are available to monitor and troubleshoot the Data Provisioning Agent.
<DPAgent_root>/log/framework_alert.trc: Data Provisioning Agent adapter framework log. Use this file to monitor Data Provisioning Agent statistics.
Additionally, scripts for performing source database initialization and cleanup operations can be found at
<DPAgent_root>/LogReader/scripts.
If the Data Provisioning Agent is accidentally started by the root user, you should clean up the agent
installation. In a normal configuration, the Data Provisioning Agent should be started only by a user with the
same rights and permissions as the installation owner defined during the agent installation.
Context
If the agent is started by the root user, additional files are created in the installation location. The agent can’t
access these additional files because root is their owner, and they should be removed.
Note
This applies only if the agent was started by the user named root, and not other users that may belong to
the root group or have similar permissions.
Procedure
○ com.sap.hana.dp.adapterframework
○ org.eclipse.core.runtime
○ org.eclipse.osgi
Note
Depending on the permissions, you may require sudo access to remove the secure_storage file.
If the Java Virtual Machine on the Data Provisioning Agent is running out of memory, you may need to adjust
configuration parameters on the Agent or your adapters.
For Oracle log reader adapters, there are multiple possible solutions.
Setting Virtual Memory
For general memory issues, you can increase the JVM memory usage.
In dpagent.ini, change the memory allocation parameter, and restart the Agent.
You can remove the -XX:+HeapDumpOnOutOfMemoryError parameter, or you can reduce the amount of
space that the dump file will take up, to help avoid any “disk full” errors.
However, you should add the line back in, if needed, when you are attempting to debug out-of-memory errors
of the Data Provisioning Agent. You would also need to ensure that you have enough disk space for the dump
file.
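As a sketch, the relevant lines in <DPAgent_root>/dpagent.ini look like the following; the heap size is an example value, not a recommendation:

```
-Xmx8192m
-XX:+HeapDumpOnOutOfMemoryError
```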
When task execution fails due to an adapter timeout, you may need to adjust the adapter prefetch timeout
parameter.
In the Index Server trace log, you may notice an error similar to the following:
When this type of error appears, you can adjust the value of the adapter prefetch timeout parameter.
For example, you might change the value to 600 to set the timeout to 10 minutes.
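A hedged sketch of such a change, assuming the parameter lives in the framework section of dpserver.ini; the section and key names are assumptions, so verify them against your system's configuration:

```sql
-- 600 seconds = 10 minutes, per the example above
ALTER SYSTEM ALTER CONFIGURATION ('dpserver.ini', 'SYSTEM')
  SET ('framework', 'prefetchTimeout') = '600' WITH RECONFIGURE;
```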
6.4.5 Agent Reports Errors When Stopping or Starting
If the Data Provisioning Agent reports error code 5 when stopping or starting on Windows, you may need to run
the configuration tool as an administrator.
To run the configuration tool once as an administrator, right-click dpagentconfigtool.exe and choose Run
as administrator.
Alternatively, you can set the configuration tool to always run as an administrator. Right-click
dpagentconfigtool.exe and choose Properties. From the Compatibility tab, choose Run this program as an
administrator and click OK.
After uninstalling the Data Provisioning Agent and installing a new version, alerts and exceptions may be
reported for the previous installation.
If the agent is uninstalled, but associated remote sources and the agent registration aren’t dropped, any
replication tasks still using remote sources on the previous agent may report alerts and exceptions.
Solution
Drop outdated remote sources and the agent registration referring to the old installation.
If you don’t know the old agent name, you can find it using SELECT * FROM sys.m_agents.
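A sketch of the cleanup; the remote source and agent names are placeholders, and CASCADE also drops dependent objects such as virtual tables and subscriptions:

```sql
-- Find agents registered with the old installation
SELECT * FROM SYS.M_AGENTS;

-- Drop the outdated remote sources and the agent registration
DROP REMOTE SOURCE "OLD_REMOTE_SOURCE" CASCADE;
DROP AGENT "OLD_AGENT";
```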
Use the command-line agent configuration tool to generate system dumps when troubleshooting the Data
Provisioning Agent.
Context
By default, the configuration tool generates system dumps that include the following information:
● Log reader, framework, and OSGi logs and traces
● Information about running Java and Data Provisioning Agent processes
● Information about JVM, threads, OSGi, and adapters, if the agent connection is available
Note
To export a system dump, the Data Provisioning Agent must be started and running.
Procedure
6.4.8 Resolve Agent Parameters that Exceed JVM Capabilities
Replication tasks may fail during the initial load if the fetch size exceeds the capabilities of the Data
Provisioning Agent JVM.
Context
Resolve the errors by reducing the row fetch size and increasing the JVM maximum size.
Procedure
Related Information
Agent Runtime Options
This section describes solutions for various issues that aren’t related to replication failures or the Data Provisioning Agent.
Activate Additional Trace Logging for the Data Provisioning Server [page 148]
The trace logs contain detailed information about actions in the Data Provisioning server.
The trace logs contain detailed information about actions in the Data Provisioning Server. When you’re troubleshooting issues, you can gain additional insight about the root cause by increasing the level of detail in the logs.
Prerequisites
To configure traces, you must have the system privilege TRACE ADMIN.
Procedure
1. Log on to SAP HANA studio with a user that has system privilege TRACE ADMIN.
2. In the Administration editor, choose the Trace Configuration tab.
3. Choose Edit Configuration under “Database Trace”.
4. Filter the component list for dp*.
5. Set the system trace level to DEBUG for any relevant components under the DPSERVER and INDEXSERVER nodes.
Scenario traces:
● INDEXSERVER
○ dpadaptermanager: Trace information for initial smart data access requests.
● XSENGINE
○ dpadaptermanager: Trace information for initial smart data access requests from the SAP HANA Web-based Development Workbench.
Note
This option can generate a large amount of data in the trace log.
Scenario traces:
● INDEXSERVER
○ dpdistributor: Trace information for the message distributor.
○ dpapplier: Trace information for the message applier.
○ dpremotesubscriptionmanager: Trace information for remote subscription runtime details on the index server side.
Note
This option can generate a large amount of data in the trace log.
Tip
To ensure that applicable trace information is captured at the time of the error and not overwritten by
log wrapping, you can increase the maximum log size while capturing the traces. In the GLOBAL node,
increase the value of the maxfilesize parameter.
6. Click Finish.
Results
Additional debug information for the modified components is added to the trace log.
Tip
You can also modify the trace log levels by issuing commands from a SQL console and manually specifying
the component to adjust.
For example, to set the Data Provisioning Server dpframework trace to DEBUG:
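A hedged sketch of such a command, following the standard pattern for SAP HANA database trace levels:

```sql
ALTER SYSTEM ALTER CONFIGURATION ('dpserver.ini', 'SYSTEM')
  SET ('trace', 'dpframework') = 'DEBUG' WITH RECONFIGURE;
```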
If data in the source and target tables of a replication task is mismatched, fix the error by re-executing the
replication task.
6.5.3 Configuring the Operation Cache
You can improve performance by using an operation cache in some script servers.
The operation cache holds operation instances for Global Address Cleanse, Universal Data Cleanse, Geocode, and Type Identifier (TID), which are initialized and ready for use during task plan execution. This improves performance by avoiding repeated task plan operation creation, initialization, and deletion, and allows the reuse of cached instances both inside a single plan and across plans.
Having more instances cached improves performance, but those additional cached instances consume more
memory.
Operation cache instances are type-specific and are set in the file scriptserver.ini.
You can use the following monitoring views to verify the cached operations, usage count, and so on.
The operation cache can be configured in SAP HANA studio by editing the file scriptserver.ini.
To start operation cache instances, you must set the enable_adapter_operation_cache parameter to yes. The
following sample SQL statement sets the parameter to yes.
You can turn off the cache for each node, or change the default number of instances. The following sample SQL
statement changes the number of instances.
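Hedged sketches of both statements; the section name and the instance-count key shown here are assumptions, so verify them against your scriptserver.ini before use:

```sql
-- Enable the operation cache
ALTER SYSTEM ALTER CONFIGURATION ('scriptserver.ini', 'SYSTEM')
  SET ('adapter_operation_cache', 'enable_adapter_operation_cache') = 'yes'
  WITH RECONFIGURE;

-- Change the number of cached instances for one operation type
ALTER SYSTEM ALTER CONFIGURATION ('scriptserver.ini', 'SYSTEM')
  SET ('adapter_operation_cache', 'geocode_instance_count') = '20'
  WITH RECONFIGURE;
```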
By default, the operation cache is enabled for all the supported types of operations (which is recommended) and has the following default number of instances:

Option                 Default number of instances
Geocode                10
Type Identification    20
The operation cache is used for both batch jobs and real-time jobs. Batch jobs can exhaust the operation
cache, leaving insufficient resources to optimize the running of real-time jobs. If you run real-time jobs, use
these settings to ensure that a dedicated number of operation cache instances are available for real-time tasks.
By default, the number of instances made available to real-time tasks is half the total number of instances for each option:

Option                 Instances reserved for real-time tasks
Geocode                5
Type Identification    0
Within the Cleanse node, you can configure two options that are relevant to Global Address Cleanse. When caching is enabled, set these options for better performance.
Ensure that processes and system resources remain responsive in circumstances of limited memory or CPU resources.
Information on the specifics and procedures of workload management is found in the SAP HANA
Administration Guide.
SAP HANA smart data integration takes advantage of this framework and allows you to better handle
circumstances of limited resources. Workload management in SAP HANA allows you to optimize for your
system. This framework also works within the limited memory or CPU resources, as you define them in the
workload class and mapping.
For example, if the workload class sets “STATEMENT THREAD LIMIT = 5”, then SAP HANA creates up to five
instances per node or operation in parallel during the task plan execution.
If the workload class sets “STATEMENT MEMORY LIMIT = 2GB”, but any of the nodes or operations in the task plan requires more than 2 GB of memory, then the task fails with the error “[MEMORY_LIMIT_VIOLATION] Information about current memory composite-limit violation”.
Consider these options and constraints to achieve the best possible performance.
7 SQL and System Views Reference for Smart Data Integration and Smart Data Quality
This section contains information about SQL syntax and system views that can be used in SAP HANA smart
data integration and SAP HANA smart data quality.
For complete information about all SQL statements and system views for SAP HANA and other SAP HANA
contexts, see the SAP HANA SQL and System Views Reference.
For information about the capabilities available for your license and installation scenario, refer to the Feature Scope Description (FSD) for your specific SAP HANA version on the SAP HANA Platform page.
SAP HANA smart data integration and SAP HANA smart data quality support many SQL statements that allow you to perform tasks such as creating agents and adapters, administering your system, and so on.
The ALTER REMOTE SUBSCRIPTION statement allows the QUEUE command to initiate real-time data
processing, and the DISTRIBUTE command applies the changes.
PROCESS REMOTE SUBSCRIPTION EXCEPTION Statement [Smart Data Integration] [page 187]
The PROCESS REMOTE SUBSCRIPTION EXCEPTION statement allows the user to indicate how an
exception should be processed.
Related Information
SQL Reference for Additional SAP HANA Contexts (SAP HANA SQL and System Views Reference)
7.1.1 ALTER ADAPTER Statement [Smart Data Integration]
The ALTER ADAPTER statement alters an adapter. Refer to CREATE ADAPTER for a description of the AT
LOCATION clause.
Syntax
Syntax Elements
<adapter_name>
<agent_name>
<properties>
Description
The ALTER ADAPTER statement alters an adapter. Refer to CREATE ADAPTER for a description of the AT
LOCATION clause.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Examples
Add or remove an existing adapter at agent or Data Provisioning Server
Create two agents and an adapter at the first agent:
CREATE AGENT TEST_AGENT_1 PROTOCOL 'TCP' HOST 'test_host1' PORT 5050;
CREATE AGENT TEST_AGENT_2 PROTOCOL 'HTTP';
CREATE ADAPTER TEST_ADAPTER AT LOCATION AGENT TEST_AGENT_1;
Refresh configuration and query optimization capabilities of an adapter
Read configuration and query optimization capabilities of an adapter from the adapter setup at the agent or
Data Provisioning Server:
ALTER ADAPTER TEST_ADAPTER REFRESH AT LOCATION DPSERVER;
ALTER ADAPTER TEST_ADAPTER REFRESH AT LOCATION AGENT TEST_AGENT_2;
Update display name property of an adapter
Change the display name for an adapter to 'My Custom Adapter':
ALTER ADAPTER TEST_ADAPTER PROPERTIES 'display_name=My Custom Adapter';
7.1.2 ALTER AGENT Statement [Smart Data Integration]
The ALTER AGENT statement changes an agent's host name and/or port and SSL property if it uses the TCP
protocol. It can also assign an agent to an agent group.
Syntax
Syntax Elements
<agent_name>
<agent_hostname>
<agent_port_number>
{ENABLE | DISABLE} SSL
Specifies whether the agent's TCP listener on the specified port uses SSL.
<agent_group_name>
The name of the agent clustering group to which the agent should be attached.
Description
The ALTER AGENT statement changes an agent's host name and/or port if it uses the TCP protocol. It can also
assign an agent to an agent group.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Examples
● Alter TEST_AGENT's hostname test_host and port to 5051, if it uses 'TCP' protocol
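The statement for this example was not preserved in this extract; based on the syntax elements above, it would look like the following sketch:

```sql
-- Change the agent's host name and port (TCP protocol only)
ALTER AGENT TEST_AGENT HOST 'test_host' PORT 5051;
```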
7.1.3 ALTER REMOTE SOURCE Statement [Smart Data Integration]
The ALTER REMOTE SOURCE statement modifies the configuration of an external data source connected to
the SAP HANA database.
The ALTER REMOTE SOURCE SQL statement is available for use in other areas of SAP HANA, not only SAP
HANA smart data integration. Refer to the ALTER REMOTE SOURCE topic for complete information. This
information is specific to smart data integration functionality.
Syntax
| STOP LATENCY MONITORING <latency_ticket_name>
| CLEAR LATENCY HISTORY [ <latency_ticket_name> ]
Syntax Elements
Syntax elements specific to smart data integration are described as follows. For information about syntax
elements that aren’t specific to smart data integration, refer to the ALTER REMOTE SOURCE topic.
<adapter_clause>
Adapter configuration.
ALTER REMOTE SOURCE <remote_source_name> SUSPEND CAPTURE
Suspends the adapter and agent from reading any more changes from the source system. This is helpful when
the source system or SAP HANA is preparing for planned maintenance or an upgrade.
ALTER REMOTE SOURCE <remote_source_name> CLEAR OBJECTS
Clears all the data received from the adapter for this remote source from HANA tables.
ALTER REMOTE SOURCE <remote_source_name> REFRESH OBJECTS
Starts building HANA dictionary tables that contain remote source objects.
ALTER REMOTE SOURCE <remote_source_name> START LATENCY MONITORING <latency_ticket_name> [INTERVAL <interval_in_seconds>]
Starts the collection of latency statistics one time or at regular intervals. The user specifies a target latency
ticket in the monitoring view.
ALTER REMOTE SOURCE <remote_source_name> STOP LATENCY MONITORING <latency_ticket_name>
Stops the collection of latency statistics into the given latency ticket.
ALTER REMOTE SOURCE <remote_source_name> CLEAR LATENCY HISTORY [ <latency_ticket_name> ]
Clears the latency statistics (for either one latency ticket or the whole remote source) from the monitoring
view.
Description
The ALTER REMOTE SOURCE statement modifies the configuration of an external data source connected to
the SAP HANA database. Only database users with the object privilege ALTER for remote sources may alter
remote sources.
Note
You may not change a user name while a remote source is suspended.
Permissions
This statement requires the ALTER object privilege on the remote source.
Examples
<PropertyEntry name="proxyport">8080</PropertyEntry> <PropertyEntry
name="truststore"></PropertyEntry>
<PropertyEntry name="supportformatquery"></PropertyEntry>
</ConnectionProperties>' WITH CREDENTIAL TYPE 'PASSWORD'
USING '<CredentialEntry name="password"><user></user><password></password></
CredentialEntry>';
The configuration clause must be a structured XML string that defines the settings for the remote source. For
example, the CONFIGURATION string in the following example configures a remote source for an Oracle
database.
<PropertyEntry name="pdb_dflt_column_repl">true</PropertyEntry>
<PropertyEntry name="pdb_ignore_unsupported_anydata">false</
PropertyEntry>
<PropertyEntry name="pds_sql_connection_pool_size">15</
PropertyEntry>
<PropertyEntry name="pds_retry_count">5</PropertyEntry>
<PropertyEntry name="pds_retry_timeout">10</PropertyEntry>
</PropertyGroup>
</PropertyGroup>
</ConnectionProperties>'
Related Information
ALTER REMOTE SOURCE Statement (Access Control) (SAP HANA SQL and System Views Reference)
7.1.4 ALTER REMOTE SUBSCRIPTION Statement [Smart Data Integration]
The ALTER REMOTE SUBSCRIPTION statement allows the QUEUE command to initiate real-time data
processing, and the DISTRIBUTE command applies the changes.
Syntax
Syntax Elements
<subscription_name>
Description
The ALTER REMOTE SUBSCRIPTION statement allows the QUEUE command to initiate real-time data
processing, and the DISTRIBUTE command applies the changes. Typically, the initial load of data is preceded
by the QUEUE command. The DISTRIBUTE command is used when the initial load completes. The RESET
command can be used to reset the real-time process to start from the initial load again.
Permissions
This statement requires the ALTER object privilege on the remote source.
Example
Now insert or update a material record in the ECC system and see it replicated to the TGT_MARA table in SAP
HANA. Reset the real-time process and restart the load.
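The SQL for this example was not preserved in this extract; the following sketch (with a hypothetical subscription name TEST_SUB) shows the typical sequence:

```sql
-- Queue changed data while the initial load runs
ALTER REMOTE SUBSCRIPTION TEST_SUB QUEUE;
-- ... perform the initial load of the target table here ...
-- Apply the queued changes once the initial load completes
ALTER REMOTE SUBSCRIPTION TEST_SUB DISTRIBUTE;
-- Reset the real-time process to start from the initial load again
ALTER REMOTE SUBSCRIPTION TEST_SUB RESET;
```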
7.1.5 CANCEL TASK Statement [Smart Data Integration]
Cancels a task.
Syntax
Syntax Elements
Specifies the task execution ID to cancel. See the START TASK topic for more information about
TASK_EXECUTION_ID.
Number of seconds to wait for the task to cancel before returning from the command.
Description
The default behavior is for the CANCEL TASK command to return after sending the cancel request. Optionally,
a WAIT value can be specified, in which case the command waits for the task to actually cancel before
returning. If the task has not canceled after the command has waited the specified amount of time, CANCEL
TASK fails with error code 526 (request to cancel task was sent but task did not cancel before timeout was
reached).
Note
If the WAIT value is 0, the command returns immediately after sending the cancel request, as it would if no
WAIT value were entered.
Permissions
The user that called START TASK can implicitly CANCEL; otherwise, the CATALOG READ and SESSION ADMIN
system privileges are required.
Examples
Assuming that a TASK performTranslation was already started using START TASK and has a task execution ID
of 255, it would be cancelled using the following commands. The behavior is the same for the following two
cases:
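The commands themselves were not preserved in this extract; based on the syntax above, the two equivalent cases would be:

```sql
CANCEL TASK 255;
CANCEL TASK 255 WAIT 0;
```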
Assuming that a TASK performTranslation was already started using START TASK and has a task execution ID
of 256, and the user wants to wait up to 5 seconds for the command to cancel, it would be cancelled using the
following command:
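A sketch of that command, per the syntax above:

```sql
CANCEL TASK 256 WAIT 5;
```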
If the task cancels within 5 seconds, CANCEL TASK returns success. If it does not cancel within 5 seconds,
CANCEL TASK returns error code 526.
SQL Script
You can call CANCEL TASK within the SQL Script CREATE PROCEDURE. Refer to the SAP HANA SQL Script
Reference for complete details about CREATE PROCEDURE.
CANCEL TASK is not supported in the following SQL Script constructs:
● Table UDF
● Scalar UDF
● Trigger
● Read-only procedures
Related Information
7.1.6 CREATE ADAPTER Statement [Smart Data Integration]
The CREATE ADAPTER statement creates an adapter that is deployed at the specified location.
Syntax
Syntax Elements
<adapter_name>
<agent_name>
The agent name if the adapter is set up on the agent.
<properties>
AT LOCATION DPSERVER
The adapter runs inside the Data Provisioning Server process in SAP HANA.
AT LOCATION
Specify an agent that is set up outside of SAP HANA for the adapter to run inside.
Description
The CREATE ADAPTER statement creates an adapter that is deployed at the specified location. The adapter
must be set up on the location prior to running this statement. When the statement is executed, the Data
Provisioning Server contacts the adapter to retrieve its configuration details such as connection properties and
query optimization capabilities.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Examples
Create an adapter at the Data Create an adapter TEST_ADAPTER running in the Data Provisioning Server.
Provisioning Server
CREATE ADAPTER TEST_ADAPTER AT LOCATION DPSERVER;
7.1.7 CREATE AGENT Statement [Smart Data Integration]
The CREATE AGENT statement registers connection properties of an agent that is installed on another host.
Syntax
Syntax Elements
<agent_name>
PROTOCOL
The protocol for the agent.
HTTP: The agent uses the HTTP protocol for communication with the DP server. Use this
protocol when the SAP HANA database is in the cloud.
PROTOCOL 'HTTP'
TCP: The agent uses the TCP protocol and listens on the specified port to receive
requests from the DP server. Use this protocol when the SAP HANA database can
connect to the agent's TCP port.
{ENABLE | DISABLE} SSL
Specifies whether the agent's TCP listener on the specified port uses SSL.
<agent_group_name>
The name of the agent clustering group to which the agent should belong.
Description
The CREATE AGENT statement registers connection properties of an agent that is installed on another host.
The DP server and agent use these connection properties when establishing a communication channel.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Examples
Create an agent with TCP Create an agent TEST_AGENT running on test_host and port 5050.
protocol
CREATE AGENT TEST_AGENT PROTOCOL 'TCP' HOST
'test_host' PORT 5050;
Administration Guide
SQL and System Views Reference for Smart Data Integration and Smart Data Quality PUBLIC 169
Create an agent with HTTP Create an agent TEST_AGENT that uses HTTP.
protocol
CREATE AGENT TEST_AGENT PROTOCOL 'HTTP';
Create an agent with HTTP Create an agent TEST_AGENT that uses HTTP and belongs to agent
protocol in an agent group clustering group TEST_GROUP.
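The statement for this last example was not preserved in this extract; based on the syntax elements above, a sketch would be:

```sql
CREATE AGENT TEST_AGENT PROTOCOL 'HTTP' AGENT GROUP TEST_GROUP;
```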
7.1.8 CREATE AGENT GROUP Statement [Smart Data Integration]
The CREATE AGENT GROUP statement creates an agent clustering group to which individual agents can be
assigned.
Syntax
Syntax Elements
<agent_group_name>
Description
The CREATE AGENT GROUP statement creates an agent clustering group to which individual agents can be
assigned. An agent group can be used instead of a single agent to provide fail-over capabilities.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
System privilege: ADAPTER ADMIN
Examples
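The example itself was not preserved in this extract; a minimal sketch (with a hypothetical group name) would be:

```sql
CREATE AGENT GROUP TEST_GROUP;
```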
Related Information
7.1.9 CREATE AUDIT POLICY Statement [Smart Data Integration]
The CREATE AUDIT POLICY statement creates a new audit policy, which can then be enabled and cause the
specified audit actions to occur.
The CREATE AUDIT POLICY SQL statement is available for use in other areas of SAP HANA, not only SAP
HANA smart data integration. Refer to the CREATE AUDIT POLICY topic for complete information. The
information below is specific to smart data integration functionality.
Syntax
Refer to the SAP HANA SQL and System Views Reference for complete information about CREATE AUDIT
POLICY syntax.
Syntax Elements
Syntax elements specific to smart data integration are described below. For information about syntax elements
that are not specific to smart data integration, refer to the SAP HANA SQL and System Views Reference.
| DROP ADAPTER
| CREATE REMOTE SUBSCRIPTION
| ALTER REMOTE SUBSCRIPTION
| DROP REMOTE SUBSCRIPTION
| PROCESS REMOTE SUBSCRIPTION EXCEPTION
Description
The CREATE AUDIT POLICY statement creates a new audit policy. This audit policy can then be enabled and
cause the auditing of the specified audit actions to occur.
Permissions
Only database users with the CATALOG READ or INIFILE ADMIN system privilege can view information in the
M_INIFILE_CONTENTS view. For other database users, this view is empty. Users with the AUDIT ADMIN
privilege can see audit-relevant parameters.
Related Information
CREATE AUDIT POLICY Statement (Access Control) (SAP HANA SQL and System Views Reference)
7.1.10 CREATE REMOTE SOURCE Statement [Smart Data Integration]
The CREATE REMOTE SOURCE statement defines an external data source connected to the SAP HANA database.
The CREATE REMOTE SOURCE SQL statement is available for use in other areas of SAP HANA, not only SAP
HANA smart data integration. Refer to the CREATE REMOTE SOURCE topic for complete information. The
information below is specific to smart data integration functionality.
Syntax
Refer to the SAP HANA SQL and System Views Reference for complete information about CREATE REMOTE
SOURCE syntax.
Syntax Elements
Syntax elements specific to smart data integration are described below. For information about syntax elements
that are not specific to smart data integration, refer to the SAP HANA SQL and System Views Reference.
<adapter_clause>
Configures the adapter.
Description
The CREATE REMOTE SOURCE statement defines an external data source connected to the SAP HANA
database. Only database users having the system privilege CREATE SOURCE or DATA ADMIN are allowed to
add a new remote source.
Permissions
Related Information
CREATE REMOTE SOURCE Statement (Access Control) (SAP HANA SQL and System Views Reference)
7.1.11 CREATE REMOTE SUBSCRIPTION Statement [Smart Data Integration]
The CREATE REMOTE SUBSCRIPTION statement creates a remote subscription in SAP HANA to capture
changes specified on the entire virtual table or part of a virtual table using a subquery.
Syntax
Syntax Elements
<subscription_name>
ON [<schema_name>.]<virtual_table_name>
See "Remote subscription for TARGET TASK or TARGET TABLE using ON Clause"
below.
AS (<subquery>)
See "Remote subscription for TARGET TASK or TARGET TABLE using AS Clause"
below.
[WITH [RESTRICTED] SCHEMA CHANGES]
Include this clause to propagate source schema changes to the SAP HANA virtual table
and remote subscription target table.
WITH SCHEMA CHANGES corresponds to the replication task options Initial + realtime
with structure or Realtime only with structure and the flowgraph options Real-time and
with Schema Change.
With the optional RESTRICTED clause, WITH RESTRICTED SCHEMA CHANGES
propagates source schema changes only to the SAP HANA virtual table, and not the
remote subscription target table.
<table_spec>
The table definition.
<load_behavior>
For a target table that logs the loading history, these parameters specify the target
column names that will show the change type and corresponding timestamp for each
operation. The CHANGE TYPE COLUMN <column_name> displays I, U, or D for INSERT,
UPSERT, or DELETE. In the case when multiple operations of the same type occur on
the same source row with the same timestamp (because the operations are in the same
transaction), use the CHANGE SEQUENCE COLUMN <column_name>, which adds an
incremental digit to distinguish the operations.
The following example for INSERT is for the same remote subscription and includes the
CHANGE_TIME column.
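The original example was not preserved in this extract; the following sketch (all object and column names are hypothetical) illustrates a subscription whose target table logs the change type and change time:

```sql
CREATE REMOTE SUBSCRIPTION "SC"."SUB_MARA"
    ON "SC"."V_MARA"
    TARGET TABLE "SC"."TGT_MARA_HISTORY"
    CHANGE TYPE COLUMN "CHANGE_TYPE"
    CHANGE TIME COLUMN "CHANGE_TIME";
```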
<task_spec>
The task definition.
<start_task_var> specifies the name and value for a start task variable.
<var_name> is the name of variable that was defined within the task plan.
Variable values provided in this section will be used at runtime (for example, when
executing the task using START TASK).
<var_value> is the value that should be used in place of the variable name specified
when executing the task.
If the task uses table types for input and/or output, then the task expects actual table,
virtual table, or view names at runtime. These actual tables, virtual tables, or view
names are specified as task parameters. Depending on the type of remote subscription
being created, the task parameters may or may not need actual table, virtual table, or
view names for specific parameters (see below for more details).
<proc_spec>
{ PROCEDURE [<schema_name>.]<proc_name>[(<param_list>)] }
Description
The CREATE REMOTE SUBSCRIPTION statement creates a remote subscription in SAP HANA to capture
changes specified on the entire virtual table or part of a virtual table using a subquery. The changed data can
be applied to an SAP HANA target table or passed to a TASK or PROCEDURE if the changes require
transformation. The owner of the remote subscription must have the following privileges:
● EXECUTE privilege on the stored procedure
● START TASK privilege on the task
Note
If you create a remote subscription using the CREATE REMOTE SUBSCRIPTION SQL statement, use the
technical user option for the Credentials Mode parameter when creating the remote source.
Permissions
This statement requires the CREATE REMOTE SUBSCRIPTION object privilege on the remote source.
Each parameter in <param_list> is used in comparing its columns with columns for the corresponding table
type defined in the task plan. Hence, the order of parameters in <param_list> must match the order of table
types defined in the task plan for input and output sources.
The AS (<subquery>) part of the syntax lets you define the SQL and the columns to use for the subscription.
The subquery should be a simple SELECT <column_list> from <virtual_table> and should not contain a
WHERE clause. The <column_list> should match the target table schema in column order and name.
<param_list> must contain one of the parameters as table type and this table type (schema and name) must
be the same as the one defined in the task plan. This table type must also have the same columns as being
output by the subquery (excluding _OP_CODE and _COMMIT_TIMESTAMP). This table type must have
_OP_CODE as the last but one column and _COMMIT_TIMESTAMP as the last column. Only one parameter in
<param_list> can be a table type.
Example
Create a remote subscription on a virtual table and apply changes using a real-time task.
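The example statement was not preserved in this extract; a simplified sketch (subscription, virtual table, and task names are hypothetical, and the full TARGET TASK clause may also take task parameters as described above) would be:

```sql
CREATE REMOTE SUBSCRIPTION "SC"."RSUB_ORDERS"
    ON "SC"."V_ORDERS"
    TARGET TASK "SC"."RT_TRANSFORM_TASK";
```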
Related Information
SQL Notation Conventions (SAP HANA SQL and System Views Reference)
Data Types (SAP HANA SQL and System Views Reference)
7.1.12 CREATE VIRTUAL PROCEDURE Statement [Smart Data Integration]
Creates a virtual procedure using the specified programming language that allows execution of the procedure
body at the specified remote source.
The CREATE VIRTUAL PROCEDURE SQL statement is available for use in other areas of SAP HANA, not only
SAP HANA smart data integration. Refer to the CREATE VIRTUAL PROCEDURE Statement (Procedural) topic
for complete information. The information below is specific to smart data integration functionality.
Syntax
CONFIGURATION <configuration_json_string>
Syntax Elements
<configuration_json_string>
Description
The CREATE VIRTUAL PROCEDURE statement creates a new virtual procedure from a remote source
procedure. When creating a virtual procedure using the SQL Console:
1. Return the metadata of the source procedure [number, types, and configuration (JSON) string] by invoking
the built-in SAP HANA procedure:
"PUBLIC"."GET_REMOTE_SOURCE_FUNCTION_DEFINITION"
('<remote_source_name>','<remote_object_unique_name>',?,?,?);
2. Edit the CONFIGURATION JSON string to include the appropriate parameter values.
Permissions
This statement requires the CREATE VIRTUAL PROCEDURE object privilege on the remote source.
Example
If you use the SQL Console to create a virtual procedure, the following example illustrates an ABAP adapter.
TABLE (
    BANK_CTRY NVARCHAR(6),
    BANK_KEY NVARCHAR(30),
    BANK_NAME NVARCHAR(120),
    CITY NVARCHAR(70)
)
For more information about using the SQL Console, see the SAP HANA Administration Guide.
7.1.13 DROP ADAPTER Statement [Smart Data Integration]
Syntax
Syntax Elements
<adapter_name>
<drop_option>
Description
The DROP ADAPTER statement removes an adapter from all locations.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Example
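The example itself was not preserved in this extract; a minimal sketch (with a hypothetical adapter name) would be:

```sql
DROP ADAPTER TEST_ADAPTER;
```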
7.1.14 DROP AGENT Statement [Smart Data Integration]
The DROP AGENT statement removes an agent.
Syntax
Syntax Elements
<agent_name>
<drop_option>
RESTRICT drops the agent only if it does not have any dependent objects.
Description
The DROP AGENT statement removes an agent.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Example
Create an agent TEST_AGENT and adapter CUSTOM_ADAPTER on the agent. Make sure that the custom
adapter is set up on the agent.
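A sketch of the statements (the CASCADE option, which also drops dependent objects, is an assumption based on the <drop_option> description above):

```sql
CREATE AGENT TEST_AGENT PROTOCOL 'TCP' HOST 'test_host' PORT 5050;
CREATE ADAPTER CUSTOM_ADAPTER AT LOCATION AGENT TEST_AGENT;
-- Drop the agent together with its dependent adapter registration
DROP AGENT TEST_AGENT CASCADE;
```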
7.1.15 DROP AGENT GROUP Statement [Smart Data Integration]
The DROP AGENT GROUP statement removes an agent clustering group.
Syntax
Syntax Elements
<agent_group_name>
Description
The DROP AGENT GROUP statement removes an agent clustering group. All dependent objects must be
removed before an agent clustering group can be dropped.
Permissions
Role: sap.hana.im.dp.monitor.roles::Operations
Example
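The example itself was not preserved in this extract; a minimal sketch (with a hypothetical group name) would be:

```sql
DROP AGENT GROUP TEST_GROUP;
```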
7.1.16 DROP REMOTE SUBSCRIPTION Statement [Smart Data Integration]
The DROP REMOTE SUBSCRIPTION statement drops an existing remote subscription.
Syntax
Syntax Elements
<subscription_name>
The name of the remote subscription.
Description
The DROP REMOTE SUBSCRIPTION statement drops an existing remote subscription. If the remote
subscription is actively receiving changes from source table, then a RESET command is automatically called
before dropping it.
Permissions
This statement requires the DROP object privilege on the remote source.
Example
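The example itself was not preserved in this extract; a minimal sketch (with a hypothetical subscription name) would be:

```sql
DROP REMOTE SUBSCRIPTION "SC"."RSUB_ORDERS";
```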
7.1.17 GRANT Statement [Smart Data Integration]
GRANT is used to grant privileges and structured privileges to users and roles. GRANT is also used to grant
roles to users and other roles.
The GRANT SQL statement is available for use in other areas of SAP HANA, not only SAP HANA smart data
integration. Refer to the GRANT topic for complete information. The information below is specific to smart data
integration functionality.
Syntax
Refer to the GRANT topic for complete information about GRANT syntax.
Syntax Elements
Syntax elements specific to smart data integration are described below. For information about syntax elements
that are not specific to smart data integration, refer to the GRANT topic.
<system_privilege>
<source_privilege>
Source privileges are used to restrict access to, and modification of, a source entry.
CREATE REMOTE SUBSCRIPTION
This privilege allows the creation of remote subscriptions executed on this source entry. Remote subscriptions
are created in a schema and point to a virtual table or SQL on tables to capture change data.
PROCESS REMOTE SUBSCRIPTION EXCEPTION
This privilege allows processing exceptions on this source entry. Exceptions that are relevant for all remote
subscriptions are created for a remote source entry.
<object_privilege>
Object privileges are used to restrict the access and modifications on database objects.
Database objects are tables, views, sequences, procedures, and so on.
AGENT MESSAGING
Authorizes the user with which the agent communicates with the Data Provisioning Server using the HTTP
protocol. (Command type: DDL)
Not all object privileges are applicable to all kinds of database objects. To learn which
object types allow which privilege to be used, see the table below.
(Table columns: Privilege, Schema, Table, View, Sequence, Function / Procedure, Remote Subscription, Agent)
Related Information
GRANT Statement (Access Control) (SAP HANA SQL and System Views Reference)
7.1.18 PROCESS REMOTE SUBSCRIPTION EXCEPTION Statement [Smart Data Integration]
The PROCESS REMOTE SUBSCRIPTION EXCEPTION statement allows the user to indicate how an exception
should be processed.
Syntax
Syntax Elements
<exception_id>
RETRY
Indicates to retry the current failed operation. If the failure is due to opening a connection to a remote source,
then the connection is established. If the failure happens when applying changed data to a target table, then
the RETRY operation retries the transaction again on the target table.
IGNORE
Indicates to ignore the current failure. If the failure happens when applying changed data to a target table,
then the IGNORE operation skips the current transaction and proceeds with the next transaction. The
exception is cleared.
Description
The PROCESS REMOTE SUBSCRIPTION EXCEPTION statement allows the user to indicate how an exception
should be processed.
Permissions
This statement requires the PROCESS REMOTE SUBSCRIPTION EXCEPTION object privilege on the remote
source.
Example
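The example itself was not preserved in this extract; a sketch (the exception ID 101 is hypothetical; actual IDs come from the remote subscription exception monitoring views) would be:

```sql
PROCESS REMOTE SUBSCRIPTION EXCEPTION 101 RETRY;
```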
7.1.19 SESSION_CONTEXT Function [Smart Data Integration]
SESSION_CONTEXT is available for use in other areas of SAP HANA, not only SAP HANA smart data
integration. Refer to the SESSION_CONTEXT topic for complete information. The information below is specific
to smart data integration functionality.
Syntax
SESSION_CONTEXT(<session_variable>)
Description
‘TASK_EXECUTION_ID’ is a predefined session variable that is set by the server and is read-only (it cannot be
SET or UNSET).
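For example, the variable can be read with the standard SESSION_CONTEXT function (DUMMY is SAP HANA's one-row system table):

```sql
SELECT SESSION_CONTEXT('TASK_EXECUTION_ID') FROM DUMMY;
```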
Related Information
SESSION_CONTEXT Function (Miscellaneous) (SAP HANA SQL and System Views Reference)
7.1.20 START TASK Statement [Smart Data Integration]
Starts a task.
Syntax
Syntax Elements
<task_name>
<var_list>
Specifies one or more start task variables. Variables passed to a task are scalar
constants. Scalar parameters are assumed to be NOT NULL.
<start_task_var> Specifies the name and value for a start task variable. A task
can contain variables that allow for dynamic replacement of
task plan parameters. This section is where, at run time during
START TASK, the values that should be used for those variables
can be provided.
<param_list>
Task parameters. If the task uses table types for input and/or output, then those need
to be specified within this section. For more information about these data types, see
BNF Lowest Terms Representations and Data Types in the Notation topic.
Parameters are implicitly defined as either IN or OUT, as inferred from the task plan.
Arguments for IN parameters could be anything that satisfies the schema of the input
table type (for example, a table variable internal to the procedure, or a temporary
table). The actual value passed for tabular OUT parameters can be, for example, '?', a
physical table name, or a table variable defined inside the procedure.
Description
Starts a task.
When executed by a client, START TASK behaves in a way consistent with standard SQL semantics; for
example, Java clients can call it using a JDBC CallableStatement. Scalar output variables are scalar values
that can be retrieved from the callable statement directly.
Note
Unquoted identifiers are implicitly treated as uppercase. Quoting identifiers will respect capitalization and
allow for using white spaces which are normally not allowed in SQL identifiers.
Permissions
This statement requires the EXECUTE privilege on the schema in which the task was created.
Examples
The TASK performTranslation was already created, and the task plan has two table type input parameters and a
single table type output parameter. You call the performTranslation task passing in the table types to use for
execution.
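The call itself was not preserved in this extract; a sketch (the table names passed as task parameters are hypothetical) would be:

```sql
START TASK performTranslation (IN_WORDS_EN, IN_WORDS_DE, OUT_TRANSLATED);
```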
SQL Script
You can call START TASK within the SQL Script CREATE PROCEDURE. Refer to the SAP HANA SQL Script
Reference for complete details about CREATE PROCEDURE.
START TASK is not supported in the following SQL Script constructs:
● Table UDF
● Scalar UDF
● Trigger
● Read-only procedures
7.1.21 TASK_EXECUTION_ID Session Variable [Smart Data Integration]
The TASK_EXECUTION_ID session variable provides a unique task execution ID. Knowing the proper task
execution ID is critical for various pieces of task functionality, including querying for side-effect information
and task processing status, and canceling a task.
TASK_EXECUTION_ID is a read-only session variable. Only the internal start task code updates the value.
The value of TASK_EXECUTION_ID will be set during the START TASK command execution. In the case of
asynchronous execution (START TASK ASYNC), the value is updated before the command returns so it is
available before the actual task has finished running asynchronously. If the execution of START TASK was successful, the value is updated to the unique execution ID for that START TASK execution. If the execution was unsuccessful, the TASK_EXECUTION_ID variable is reset, as if no START TASK had been run.
Users can obtain the value of TASK_EXECUTION_ID by using either of the following:
● The existing SESSION_CONTEXT() function. If no tasks have been run, or the last task execution was unsuccessful, a NULL value is returned.
● The M_SESSION_CONTEXT monitoring view, queried with a KEY value of “TASK_EXECUTION_ID”. If no row exists with that key, the session variable has not been set (no tasks have been run, or the last task execution was unsuccessful).
Note
Session variables are string values. Cast the value appropriately for how you want to use it.
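For example, either of the following retrieves the ID, and the final statement shows the cast (the function and view names come from the text above):

```sql
-- Using the SESSION_CONTEXT() function (NULL if no task has run)
SELECT SESSION_CONTEXT('TASK_EXECUTION_ID') FROM DUMMY;

-- Using the M_SESSION_CONTEXT monitoring view
SELECT VALUE FROM M_SESSION_CONTEXT
 WHERE KEY = 'TASK_EXECUTION_ID';

-- Session variables are strings; cast before using as a number
SELECT CAST(SESSION_CONTEXT('TASK_EXECUTION_ID') AS BIGINT)
  FROM DUMMY;
```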
Related Information
SQL Notation Conventions (SAP HANA SQL and System Views Reference)
System views allow you to query for various information about the system state using SQL commands. The
results appear as tables.
System views are located in the SYS schema. In a system with tenant databases, every database has a SYS
schema with system views that contain information about that database only. In addition, the system database
has a further schema, SYS_DATABASES, which contains views for monitoring the system as a whole. The views
in the SYS_DATABASES schema provide aggregated information from a subset of the views available in the SYS
schema of all tenant databases in the system. These union views have the additional column DATABASE_NAME
to allow you to identify to which database the information refers. To be able to view information in these views,
you need the system privilege CATALOG READ or DATABASE ADMIN.
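For example, from the system database (assuming M_SERVICES is among the views exposed in the SYS_DATABASES schema on your release):

```sql
-- Requires CATALOG READ or DATABASE ADMIN; note the additional
-- DATABASE_NAME column identifying the tenant database.
SELECT DATABASE_NAME, SERVICE_NAME, ACTIVE_STATUS
  FROM SYS_DATABASES.M_SERVICES;
```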
SAP HANA system views are separated into two categories: metadata views and runtime views. Metadata views provide metadata about objects in the database, including options or settings that were set using a DDL statement. Runtime views provide actual SAP HANA runtime data, including statistics and status information related to the execution of DML statements. Runtime view names start with the prefix M_, for monitoring.
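The naming convention can be seen by comparing a metadata view with its runtime counterpart, for example using the task views described later in this reference:

```sql
-- Metadata view: how tasks are defined
SELECT TASK_NAME, IS_READ_ONLY FROM TASKS;

-- Runtime (M_) view: how task executions are progressing
SELECT TASK_NAME, STATUS, PROCESSED_RECORDS FROM M_TASKS;
```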
Provides details about an exception that occurred during the execution of a remote subscription. The
exceptions can be processed using the PROCESS REMOTE SUBSCRIPTION EXCEPTION SQL
statement.
CLEANSE_COMPONENT_INFO System View [Smart Data Quality] [page 219]
Identifies the location of parsed data elements in the input and output.
Related Information
System Views Reference for Additional SAP HANA Contexts (SAP HANA SQL and System Views Reference)
Structure
7.2.2 ADAPTER_LOCATIONS System View [Smart Data
Integration]
Structure
7.2.4 AGENT_CONFIGURATION System View [Smart Data
Integration]
Agent configuration
Structure
Column Data type Description
IS_SSL_ENABLED VARCHAR(5) Specifies whether the agent listening on the TCP port uses SSL
Provides the status of all agents registered in the SAP HANA database.
Structure
LAST_CONNECT_TIME TIMESTAMP The last time the session cookie was used for successful reconnection
Stores dictionary status information, remote source owner information, and the status of data collection.
Note
This system view keeps track of the status of metadata dictionaries for remote sources. If there is no dictionary for a given remote source, it does not appear in the view.
For basic remote source information, you can select from REMOTE_SOURCES, which includes the following columns:
● REMOTE_SOURCE_NAME
● ADAPTER_NAME
● CONNECTION_INFO
● AGENT_GROUP_NAME
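For example, the columns listed above can be selected directly:

```sql
SELECT REMOTE_SOURCE_NAME, ADAPTER_NAME,
       CONNECTION_INFO, AGENT_GROUP_NAME
  FROM REMOTE_SOURCES;
```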
Structure
● STARTED
● COMPLETED
● RUNNING (GET OBJECTS)
● RUNNING (GET OBJECT DETAILS)
● FAILED
● CANCELLED
● CLEARED
Structure
Provides current processing details of a remote subscription (for example, the number of messages or transactions received and applied since the start of the SAP HANA database).
Structure
Structure
1 - CLUSTER
2 - POOL
Note
The M_SESSION_CONTEXT view is available for use in other areas of SAP HANA, not only SAP HANA
smart data integration. Refer to the M_SESSION_CONTEXT topic for complete information. The
information below is specific to smart data integration functionality.
Each variable is categorized in the SECTION column as either USER (a user-defined variable set using the SET command or a client API call) or SYSTEM (a predefined variable or system property).
Related Information
M_SESSION_CONTEXT System View (SAP HANA SQL and System Views Reference)
7.2.13 REMOTE_SOURCE_OBJECT_COLUMNS System View
[Smart Data Integration]
If the adapter can provide column-level information for each table in the remote system, you can search for relationships between tables once this dictionary is built. This view is useful for analyzing relationships between tables in the remote source.
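As a sketch, such a relationship search might join the view to itself on column name. The exact column names depend on the view's structure in your release; OBJECT_NAME is assumed here as the table identifier column, and MY_SOURCE is a placeholder remote source name:

```sql
-- Sketch: find candidate join columns shared by two remote tables.
SELECT a.OBJECT_NAME AS table_a,
       b.OBJECT_NAME AS table_b,
       a.COLUMN_NAME
  FROM REMOTE_SOURCE_OBJECT_COLUMNS a
  JOIN REMOTE_SOURCE_OBJECT_COLUMNS b
    ON  a.COLUMN_NAME = b.COLUMN_NAME
    AND a.OBJECT_NAME <> b.OBJECT_NAME
 WHERE a.REMOTE_SOURCE_NAME = 'MY_SOURCE'
   AND b.REMOTE_SOURCE_NAME = 'MY_SOURCE';
```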
Structure
IS_AUTOINCREMENT
Structure
Stores browsable nodes as well as importable objects (virtual tables). This view is built from remote source
metadata dictionaries.
Structure
Remote sources
Structure
Related Information
REMOTE_SOURCES System View (SAP HANA SQL and System Views Reference)
Provides details about an exception that occurred during the execution of a remote subscription. The
exceptions can be processed using the PROCESS REMOTE SUBSCRIPTION EXCEPTION SQL statement.
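For example, you might list the pending exceptions and then retry or ignore one by its ID (the EXCEPTION_OID and ERROR_MESSAGE column names are assumptions based on typical releases, and 123456 is a placeholder ID):

```sql
-- Inspect pending exceptions
SELECT EXCEPTION_OID, ERROR_MESSAGE
  FROM REMOTE_SUBSCRIPTION_EXCEPTIONS;

-- Retry (or IGNORE) a specific exception by its ID
PROCESS REMOTE SUBSCRIPTION EXCEPTION 123456 RETRY;
```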
Structure
7.2.18 REMOTE_SUBSCRIPTIONS System View [Smart Data
Integration]
Structure
Provides the client mapping when a task is created by the ABAP API.
Structure
CLIENT NVARCHAR(128) Name of the client that created the task with the ABAP API
Structure
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
COLUMN_NAME NVARCHAR(128) Name of the column used in the task plan within a table
MAPPED_NAME NVARCHAR(128) Mapped name of the column used in a task plan within a table
Data in this view is updated while the task is in progress. For example, STATUS, PROCESSED_RECORDS, and
TOTAL_PROGRESS_PERCENT are continuously updated until the task is complete.
Users may view information only for tasks that they ran themselves or were granted permissions to view.
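Assuming this section describes the M_TASKS monitoring view, a progress check for the task most recently started in the current session might look like the following sketch:

```sql
-- STATUS, PROCESSED_RECORDS, and TOTAL_PROGRESS_PERCENT are
-- updated continuously while the task runs.
SELECT STATUS, PROCESSED_RECORDS, TOTAL_PROGRESS_PERCENT
  FROM M_TASKS
 WHERE TASK_EXECUTION_ID =
       CAST(SESSION_CONTEXT('TASK_EXECUTION_ID') AS BIGINT);
```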
Structure
PROCEDURE_PARAMETERS NVARCHAR(5000) Displays the input <param-list> values that were specified in the START TASK SQL command
HAS_SIDE_EFFECTS VARCHAR(5) 'TRUE' if the task produces side-effect data, else 'FALSE'
Structure
Contains all operations that exist for a given task, as well as details about those operations.
Structure
Data in this view is updated while the task is in progress. For example, STATUS, PROCESSED_RECORDS, and
OPERATIONS_PROGRESS_PERCENT are continuously updated until the task is complete.
Users may view information only for tasks that they ran themselves or were granted permissions to view.
Structure
● STARTING
● RUNNING
● FAILED
● COMPLETED
● CANCELLING
● CANCELLED
HAS_SIDE_EFFECTS VARCHAR(5) 'TRUE' if the task produces side-effect data, else 'FALSE'
Structure
Contains all of the tables used by the various side-effect-producing operations.
Structure
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for an operation
IS_PRIMARY_TABLE TINYINT Specifies whether this table is the primary table in a relationship
7.2.27 TASK_TABLE_RELATIONSHIPS System View [Smart
Data Integration]
Structure
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for an operation
RELATED_TABLE_NAME NVARCHAR(128) Name of the table to which the table specified in TABLE_NAME is related
FROM_ATTRIBUTE NVARCHAR(128) Name of the column in the TABLE_NAME table that relates
to the TO_ATTRIBUTE
Structure
PLAN NCLOB Task plan used to define the task, or task plan generated to call the procedure
HAS_TABLE_TYPE_INPUT VARCHAR(5) 'TRUE' if the task is modeled with a table type as input, meaning data would need to be passed at execution time
HAS_SDQ VARCHAR(5) 'TRUE' if the task contains SDQ (smart data quality) functionality
IS_READ_ONLY VARCHAR(5) 'TRUE' if the task is read only (has only table type outputs), 'FALSE' if it writes to non-table-type outputs
SQL_SECURITY VARCHAR(7) Security model for the task, either 'DEFINER' or 'INVOKER'
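Assuming this section describes the TASKS metadata view, a quick way to inspect these properties is:

```sql
SELECT TASK_NAME, HAS_TABLE_TYPE_INPUT, HAS_SDQ,
       IS_READ_ONLY, SQL_SECURITY
  FROM TASKS;
```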
Lists the properties of the columns in a virtual table sent by the adapter via the CREATE VIRTUAL TABLE SQL statement.
Structure
Lists the properties of a virtual table sent by the adapter via the CREATE VIRTUAL TABLE SQL statement.
Structure
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
Contains governance information for every column in every record that is updated in the best record process.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
DST_ROW_TYPE NVARCHAR(1) Identifies how the record was updated or if it was newly created
STRATEGY_GROUP_ID INTEGER Identification number that identifies the best record strategy
group
BEST_RECORD_RULE NVARCHAR(256) Name of the rule that updates one or more columns as it is
defined in the best record configuration
UPDATE_NUM INTEGER Number of times the column was updated in the best record
process
OPERATION_TYPE NVARCHAR(1) Identifies how the record was updated in the best record
process
Contains information on which strategies are used in each strategy group and in which order.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
STRATEGY_GROUP_NAME NVARCHAR(256) Name of the strategy group as defined in the best record
configuration
STRATEGY_NAME NVARCHAR(256) Name of the strategy as defined in the best record configuration
Describes how well an address was assigned as well as the type of address.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
ASSIGNMENT_LEVEL NVARCHAR(4) Code that represents the level to which the address matched
data in the address reference data
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
ENTITY_ID NVARCHAR(12) Identifier describing the type of record that was processed
Identifies the location of parsed data elements in the input and output.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the input table where the component element was
found
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
COLUMN_NAME NVARCHAR(128) Name of the column in the input table where the component
element was found
OUTPUT_TABLE_NAME NVARCHAR(128) Name of the output table where the component element was
written
OUTPUT_COLUMN_NAME NVARCHAR(128) Name of the column in the output table where the component element was written
Contains one row per info code generated by the cleansing process.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
ENTITY_ID NVARCHAR(12) Identifier describing the type of record that was processed
INFO_CODE NVARCHAR(10) Information code that gives information about the processing of the record
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
ENTITY_ID NVARCHAR(12) Identifier describing the type of record that was processed
NUM_RECORDS BIGINT Total number of records processed for the entity instance
NUM_VALIDS BIGINT Number of valid records processed for the entity instance
NUM_SUSPECTS BIGINT Number of suspect records processed for the entity instance
NUM_BLANKS BIGINT Number of blank records processed for the entity instance
NUM_HIGH_SIGNIFICANT_CHANGES BIGINT Number of records with high significance changes for the entity instance
Contains one row per info code generated by the geocode transformation process.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
Contains one row for each match decision made during the matching process.
Structure
TASK_EXECUTION_ID BIGINT Unique identifier for a particular run of a task plan created
when START TASK is called
TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for the operation
ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
RELATED_TABLE_NAME NVARCHAR(128) Name of the table defined in the task plan for an operation
RELATED_ROW_ID BIGINT Unique identifier of the row processed for this execution of
the task plan
POLICY_NAME NVARCHAR(256) Name of the match policy that processed the related rows
RULE_NAME NVARCHAR(256) Name of the match rule that processed the related rows
Important Disclaimers and Legal Information
Hyperlinks
Some links are classified by an icon and/or a mouseover text. These links provide additional information.
About the icons:
● Links with the icon : You are entering a Web site that is not hosted by SAP. By using such links, you agree (unless expressly stated otherwise in your
agreements with SAP) to this:
● The content of the linked-to site is not SAP documentation. You may not infer any product claims against SAP based on this information.
● SAP does not agree or disagree with the content on the linked-to site, nor does SAP warrant the availability and correctness. SAP shall not be liable for any
damages caused by the use of such content unless damages have been caused by SAP's gross negligence or willful misconduct.
● Links with the icon : You are leaving the documentation for that particular SAP product or service and are entering an SAP-hosted Web site. By using such links, you agree that (unless expressly stated otherwise in your agreements with SAP) you may not infer any product claims against SAP based on this information.
Example Code
Any software coding and/or code snippets are examples. They are not for productive use. The example code is only intended to better explain and visualize the syntax
and phrasing rules. SAP does not warrant the correctness and completeness of the example code. SAP shall not be liable for errors or damages caused by the use of
example code unless damages have been caused by SAP's gross negligence or willful misconduct.
Gender-Related Language
We try not to use gender-specific word forms and formulations. As appropriate for context and readability, SAP may use masculine word forms to refer to all genders.
www.sap.com/contactsap
SAP and other SAP products and services mentioned herein as well as
their respective logos are trademarks or registered trademarks of SAP
SE (or an SAP affiliate company) in Germany and other countries. All
other product and service names mentioned are the trademarks of their
respective companies.