Follow the SAP blog below to create your Trial BTP account and perform the initial setup for Cloud Platform Integration.
Creating Trial account for Cloud Platform Integration on Cloud Foundry Environment, Creating User Cr...
OR
Trial Account Setup for SAP Integration Suite (CPI, Integration Advisor, API Management, Enterprise ...
Make sure to assign all the required roles to your user under the Security section in BTP.
3. CPI Overview (includes Discover, Design, Monitor and Settings)
The CPI Design section provides the complete platform to develop your custom integration flows (iflows)
inside the package shown. Packages as well as iflows downloaded from other CPI tenants can be
imported here directly.
The CPI Settings section provides tenant settings that aren't required in day-to-day integration
development.
4. Designing in CPI
In the Design section, you initially have to create a new package in which all the artifacts/iflows will be
created.
You can create multiple packages to categorize different types of iflows as needed. There is also an
option to import any package that was exported from another tenant.
Inside the package, various features like Edit this package, export and delete are available.
Tags shows the tags assigned to this package for the integration solution.
In Artifacts, you can create Integration Flow (iflow), Message Mapping, Value Mapping, Script Collection,
Integration Adapter.
Integration Flow is created to design end-to-end integration between two or more systems, where
features such as data format conversion, data manipulation, data filtering, conditional routing, message
mapping, value mapping, encoding/decoding and encryption/decryption can be used. This is the main
topic, so it is explained in detail below.
Message Mapping is created to map two different structures (source and target) to transform the
message as shown below. Message mapping works with XML format only. This can be used in iflows.
Refer following for a detailed understanding of Message Mapping:
Value Mapping is created to map key-value pairs, where a defined set of keys is converted to specific
values during message processing, as shown below. This can be used in iflows.
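To illustrate the idea outside the CPI UI: conceptually, a value mapping is just a key-value lookup applied during message processing. The codes and names below are made up for this sketch:

```groovy
// Hypothetical value mapping: sender country codes -> receiver country names
def valueMap = [
    "IN": "India",
    "DE": "Germany",
    "US": "United States"
]

def incomingValue = "DE"
// look the key up; fall back to the original value if no mapping entry exists
def mappedValue = valueMap.get(incomingValue) ?: incomingValue
println mappedValue   // Germany
```

In the real Value Mapping artifact, such pairs are maintained per sender/receiver agency and looked up at runtime by the Value Mapping step.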
Refer following for a detailed understanding of Value Mapping:
Script Collection is created to store a collection of multiple Groovy scripts and JavaScripts, which can
further be used in iflows.
Using Script Collection across various Integration Flows in a Package in SAP CPI
Integration Adapter is used to add custom adapters that you've developed using the Adapter SDK. This
can be used in iflows.
Refer following for a detailed understanding of Integration Adapter:
To create a custom Integration Flow (iflow) -> Edit the package -> click the Add button -> select
Integration Flow.
Copy is to create a copy of this iflow with a different name in the same package.
View Metadata is to see and edit iflow metadata i.e. iflow name, description, etc.
Download is to download the iflow to your computer; it can later be imported on any CPI tenant.
Configure is to configure the iflow parameters that you have set during the iflow design. This doesn't
require editing the iflow.
Deploy is to deploy the iflow to the CPI tenant runtime. To run any artifact, it needs to be deployed.
Versioning is also an important concept to understand: during development you can save your work as
versions, which is helpful if you later want to revert to a previous (or newer) version.
On the top right corner, Edit is to open the iflow for making changes (Note: you might not see this
Edit option in standard iflows if they are not modifiable).
Configure is to configure the iflow parameters that you have set during the iflow design. This doesn't
require editing the iflow.
Deploy is to deploy the iflow to the CPI tenant runtime. To run any artifact, it needs to be deployed.
Delete is to delete this iflow (Warning: you might not be able to recover the artifact once deleted, so
make sure you've downloaded it beforehand to be safe).
On the right side, there are zoom options to scale the workspace.
On the bottom right corner, as well as in the light-blue bottom bar, there are options where you can
choose and configure the selected palettes in the iflow workspace.
The bar below contains all the palettes (the different components used to design an iflow). To use a
palette, click an icon, which opens a dropdown as shown (the opened dropdown is for mappings), then
choose the required palette by clicking it once and clicking again in the workspace to drop it.
Whenever you create any new iflow, it comes with the below Integration Process and a few palettes.
Integration Process is the main palette (the main component of this workspace) in which the integration
is usually designed by placing other palettes inside it (e.g. Start, an arrow and End). Processing of the
iflow also starts with the Integration Process.
To take any action on a palette, select it by clicking on it. A few direct actions (in the above
screenshot, paste and delete; these vary by palette) are shown on the right side of the selected palette,
and more functions are at the bottom.
Participant:
Receiver is used to connect to any external system as a receiver.
Event:
However, a timer is chosen from the Events in this case and put in the Integration Process:
Note: Every palette in an iflow has a General tab where you can rename the palette at your
convenience.
Timer is used to give the iflow a trigger point from which message processing will start. It can be
scheduled as required in the Scheduler tab shown below. Scheduled as Run Once here means the iflow
will be processed/run only once after deploying.
Transformation:
Refer following for a detailed understanding of message transformation and routers steps:
Content Modifier is chosen as it is used to manipulate the incoming message (Headers, Exchange
Properties, Body).
In the content modifier, a constant body is put just for the demo purpose.
It is very important to connect your iflow steps/palettes using an arrow to complete the integration as
below.
When you select any palette it shows one arrow in the right-side options. To connect it to another
palette -> click on this arrow and drag it to your target palette. It basically shows the direction of your
message processing as below:
Note: Generally, four types of parameters flow through this arrow from one palette to another, i.e.
Message Headers, Exchange Properties, Message Body and runtime parameters (message processing ID,
status, start/end time, etc.).
Filter is used to filter the XML payload based on conditions provided in XPath.
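As a rough standalone illustration of what the Filter step does (the order payload and XPath expression below are invented for this sketch):

```groovy
import javax.xml.xpath.XPathConstants
import javax.xml.xpath.XPathFactory
import org.xml.sax.InputSource

def xml = '''<orders>
  <order><id>1</id><status>NEW</status></order>
  <order><id>2</id><status>SHIPPED</status></order>
</orders>'''

def xpath = XPathFactory.newInstance().newXPath()
// keep only the orders whose status is NEW -- the same kind of condition
// you would enter in the Filter step's XPath field
def nodes = xpath.evaluate("/orders/order[status='NEW']",
        new InputSource(new StringReader(xml)), XPathConstants.NODESET)
println nodes.getLength()   // 1 matching order
```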
Routing:
Router is used to decide which path your message will take. A router can have one incoming and
multiple outgoing routes, but it chooses only one outgoing route based on the conditions provided on
the outgoing routes (conditions can be written using XPath or Camel expressions). There has to be at
least one outgoing route with the Default Route condition; this route is processed only if no other
route condition is satisfied.
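For illustration only, route conditions might look like the following (the element and header names are invented for this sketch):

```
Route 1 condition (XPath):    //Order/Priority = 'HIGH'
Route 2 condition (Non-XML):  ${header.ReceiverType} = 'B2B'
Default Route:                no condition -- taken only when no other condition matches
```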
Define Router
Multicast is used to send a single incoming message to multiple outgoing routes. The message can be
multicast into multiple routes in two ways: Parallel and Sequential Multicast.
Parallel Multicast sends the message to all outgoing routes at the same time.
Sequential Multicast sends the message to the outgoing routes one by one, in the sequence you have
configured.
Define Multicast
Gather and Join are used to merge multiple routes coming from multicast into a single outgoing route.
EIPinCPI – Aggregator
Call:
Using External Call, you can connect to any external system using various adapters, e.g. HTTP, SOAP,
OData, JDBC, etc. Using Local Call, a call from the Integration Process to a Local Integration Process can
be initiated.
External Call:
Request Reply is the most commonly used call palette for service calls where a response is expected;
various adapters are available and can be configured to connect to the external system.
SAP Cloud Platform Integration (CPI) || Part 8 || Working with Request Reply
SAP Cloud Platform Integration (CPI) || Part 11 || How to use a Local Integration Process
Persistence:
These features are used to store information on the CPI tenant at iflow runtime, which can also be
retrieved later.
In Data Store Operations, four operations are available as of now: Write is to store the message on the
CPI tenant, Delete is to delete a stored message, Get is to retrieve a stored message, and Select is to
select messages from the data store and place them in the message body of the iflow.
Persist Messages
Write Variable is used to define a variable and share its value across different iflows in the CPI tenant.
Mapping:
Refer following for a detailed understanding of different types of Mapping:
Message Mapping: Mapping Overview in SAP Cloud Platform Integration (f.k.a. HCI or SAP HANA Cloud
Platform,integratio...
Operation Mapping: [ How-To ] Moving Operation Mapping from SAP Process Orchestration To SAP
Integration Suite
Script is used to write your own code to fulfill requirements that cannot be met using CPI palettes.
CPI provides options for JavaScript and Groovy scripts.
Refer following for a detailed understanding of Groovy Script:
SAP Cloud Integration (CPI/HCI) || Writing Groovy Scripts _ With Basic Examples
Refer to the following Groovy IDE, built to develop Groovy code according to the CPI message-processing
model:
https://groovyide.com/cpi
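For reference, every Groovy script step in CPI implements the processData entry point shown below. This is only a minimal sketch: the Message class and messageLogFactory come from the CPI runtime, so it runs inside a script step rather than standalone, and the header/property names here are invented examples:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // read the incoming payload as a String
    def body = message.getBody(java.lang.String) as String

    // manipulate headers and exchange properties as needed
    message.setHeader("ProcessedBy", "GroovyScriptStep")
    message.setProperty("originalLength", body.length())

    // hand the (possibly modified) message to the next iflow step
    return message
}
```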
In CPI, there is a standard way to handle runtime errors. Whenever a failure occurs at runtime in an
iflow, it directly triggers the Exception Subprocess and continues processing the steps in it. As per
usual practice, the error is printed as an attachment in the monitor and an email alert is triggered to
report the error to the concerned team, as below.
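Inside the Exception Subprocess, the caught error is available to a script step via the CamelExceptionCaught exchange property. A common sketch for attaching the error text to the monitoring log (runnable only inside the CPI runtime; the attachment name is an example) looks like this:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // the exception that triggered the Exception Subprocess
    def ex = message.getProperty("CamelExceptionCaught")
    def errorText = (ex != null) ? ex.getMessage() : "no exception found"

    // attach the error text so it is visible in the CPI Monitor
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        messageLog.addAttachmentAsString("ErrorDetails", errorText, "text/plain")
    }
    return message
}
```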
Refer following for a detailed understanding of Exception Handling in CPI:
YouTube Demo:
SAP CPI Exception sub process || Error Handling || Step by step guide
6. Monitoring in CPI
In the monitor section, you can monitor and analyze several aspects of artifacts.
Manage Security is to maintain various security materials e.g. credentials, certificates, etc.
Manage Stores is to check and maintain stored data, e.g. variables, persisted messages, number ranges,
etc.
An iflow can be triggered either from an external system or internally in CPI, e.g. by a Start Timer. If you
want to trigger it externally, a Sender system and a sender adapter are required to generate the CPI
iflow endpoint as shown below. Consuming these endpoints, you can initiate the call to run the iflow
using Postman or any third-party system.
Refer following for a detailed understanding of How to call CPI Iflow externally:
YouTube Demo:
SAP CPI Course | Request Reply - HTTP/OData | Get Northwind Customers using HTTP Adapter
In the above step of exposing CPI endpoints for external calls, the iflow endpoint is generated in the CPI
Monitor. However, to authenticate against the endpoint from an external system you need a Service Key,
which is created in the SAP BTP subaccount associated with your CPI tenant. In the Service Key, the
Client ID and Client Secret can be used as the username and password for Basic Authentication.
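The Authorization header that Postman builds from those values is simply Basic followed by the Base64 encoding of clientid:clientsecret; the credentials below are placeholders:

```groovy
// Placeholder service-key values -- use the clientid/clientsecret
// from your own BTP service key instead
def clientId     = "sb-1234-demo"
def clientSecret = "s3cr3t"

// Basic Authentication token: Base64("clientid:clientsecret")
def token = "${clientId}:${clientSecret}".toString().bytes.encodeBase64().toString()
def authorizationHeader = "Basic " + token

println authorizationHeader
```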
Using Cloud Integration APIs with Tools on Cloud Foundry: Creating a Service Key
Conclusion
After reading the above blog, you should have a good knowledge of end-to-end integration via SAP BTP
CPI, starting from scratch. Provided you've gone through all the links referred to in the blog, you're
ready to start designing your artifacts based on requirements.
I'd like to thank everyone whose knowledgeable articles have been referred to in this blog.
Introduction
In this blog, we will see in detail how a SuccessFactors picklist lookup can be implemented, and how the
picklist <key - value> pairs can be consumed in the incoming XML and in further message processing,
using SAP Cloud Platform Integration.
Here, we will generate the output XML shown below from the input XML, where the values of three
XML elements will be replaced with their keys.
The picklist pairs consumed in our scenario are highlighted.
Description
The complete iFlow is explained step by step further below; here it is shown just to give you an
overview that there are basically two Local Integration Processes:
In the first Local Integration Process (SF Picklist from lookup), we fetch the required picklists in XML
format and store them in a global variable in CPI.
Alternative: You can also design a separate iFlow for the SF lookup according to your business
requirement.
Step 1:
Start Timer 1 triggers the execution of the Local Process Call SF Picklist from lookup.
Request Reply sfapi call is used to query the picklists from SuccessFactors using the configuration
below:
Step 2:
The XML payload fetched from SuccessFactors is printed as an attachment using the code below:
import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String;
    // attach the payload to the message processing log so it shows up in the Monitor
    // (the attachment name is arbitrary)
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        messageLog.addAttachmentAsString("Picklist Payload", body, "text/xml");
    }
    return message;
}
Result: The screenshot of XML payload attachment is provided below which contains
various picklists.
Step 3:
Step 4:
Step 5:
We also need to check whether the required input payload is arriving as per our requirement. Here
we're checking whether the results XML tag contains data.
A Local Integration Call is processed for the picklist conversion described in the next step.
Step 6:
In the content modifier GV Picklist, a property is created to access the picklists from the global
variable that was stored in Step 3.
The input payload coming from the main integration flow is passed on as is.
Step 7:
The Groovy script below accesses the XML picklists from the message property and the input payload
from the message body.
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import groovy.xml.XmlUtil;
import groovy.util.XmlParser;

def Message processData(Message message) {
    // input payload from the main integration flow
    def records = message.getBody(java.lang.String) as String;
    def xml_records = new XmlParser().parseText(records);

    // picklist XML taken from the exchange property created in the
    // GV Picklist content modifier (adjust "picklist" to your property name)
    def xml_picklist = new XmlParser().parseText(message.getProperty("picklist") as String);

    // bloodGroup: key (externalCode) -> value (en_US label in upper case)
    HashMap<String, String> bloodGroup = new HashMap<String, String>();
    xml_picklist.PicklistOption.findAll { p ->
        p.picklistId[0].text().equals("bloodGroup")      // all bloodGroup picklist entries
    }.each { p ->
        bloodGroup.put(p.externalCode[0].text(), p.en_US[0].text().toUpperCase());
    }

    // salutation: key (id) -> value (en_US label in upper case)
    HashMap<String, String> salutation = new HashMap<String, String>();
    xml_picklist.PicklistOption.findAll { p ->
        p.picklistId[0].text().equals("salutation")      // all salutation picklist entries
    }.each { p ->
        salutation.put(p.id[0].text(), p.en_US[0].text().toUpperCase());
    }

    // addressType: key (id) -> value (en_US label in upper case)
    HashMap<String, String> addressType = new HashMap<String, String>();
    xml_picklist.PicklistOption.findAll { p ->
        p.picklistId[0].text().equals("addressType")     // all addressType picklist entries
    }.each { p ->
        addressType.put(p.id[0].text(), p.en_US[0].text().toUpperCase());
    }

    xml_records.results.each { result ->
        // ----------------- Blood group -----------------
        if (!result.BLOOD_GROUP.isEmpty()) {
            def temp_variable = result.BLOOD_GROUP.text().toUpperCase();
            for (Map.Entry<String, String> entry : bloodGroup.entrySet()) {
                if (entry.getValue().equals(temp_variable)) {
                    // replace the element value with its picklist key
                    result.BLOOD_GROUP[0].value = entry.getKey();
                }
            }
        }

        // ----------------- Salutation -----------------
        if (!result.TITLE.isEmpty()) {
            def temp_variable = result.TITLE.text().toUpperCase();
            for (Map.Entry<String, String> entry : salutation.entrySet()) {
                if (entry.getValue().equals(temp_variable)) {
                    def lv_val = entry.getKey();
                    // some keys carry a "<prefix>-<key>" format; keep only the part after "-"
                    result.TITLE[0].value = lv_val.contains("-") ? lv_val.split("-")[1] : lv_val;
                }
            }
        }

        // ----------------- Address type -----------------
        if (!result.Address_type.isEmpty()) {
            def temp_variable = result.Address_type.text().toUpperCase();
            for (Map.Entry<String, String> entry : addressType.entrySet()) {
                if (entry.getValue().equals(temp_variable)) {
                    def lv_val = entry.getKey();
                    result.Address_type[0].value = lv_val.contains("-") ? lv_val.split("-")[1] : lv_val;
                }
            }
        }
    }

    // the serialized, converted payload becomes the new message body
    message.setBody(XmlUtil.serialize(xml_records));
    return message;
}
Result: The output payload produced by the above Groovy script is the final required XML payload,
which is then printed as an attachment by custom Groovy code as follows:
Conclusion
In this way, we have called the picklists from SuccessFactors in XML format, stored in
the global variable of SAP CPI, passed a sample XML payload for message processing
then accessed picklists from global variable in a newly created property and finally XML
payload is modified as required with the help of custom object Groovy Script.
You can utilize the above design and Groovy codes according to your requirement.
Ravi Jangra