Value Mapping

The document discusses the integration of IDocs from S/4HANA to the SAP BTP Integration Suite, emphasizing the need for a single HTTP destination to streamline multiple IDoc types. It proposes using a central dispatcher iFlow to dynamically route messages based on IDoc type, eliminating the need for multiple RFC destinations. Additionally, it highlights the importance of using value mapping and Groovy scripting for dynamic configuration to manage IDoc types without frequent changes to the main iFlow.


ProcessDirect is an internal adapter.

Common IDoc Routing iFlow

Question about BTP Integration Suite - IDoc Integration from S/4 to Integration Suite :

We are building multiple integrations to send different IDoc types from S/4 to Integration Suite.
Each integration flow (iFlow) generates its own endpoint URL (https://<host>:<port>/cxf/<endpoint>),
which we need to maintain in SM59 in S/4 to send the IDoc.
Because of this we have to create multiple RFC destinations (and ports), one per integration, which
doesn't really appear to be the right approach.

Is there a way to have just one HTTP Destination in SM59 to BTP Integration Suite and use that to send
multiple types of IDOCs?

In fact, not only for IDocs, but for any integration (File, REST, OData, etc.) we would like to have just
one RFC destination to Integration Suite.

Answer:

We need a generic flow in Integration Suite that calls the respective iFlow based on the IDoc message type.
I am curious whether we can do the same for other types of integration, like File, REST, OData, etc., using the
same generic iFlow and the same SM59 RFC destination.

Answer: for other channels you will probably need separate iFlows, because the channel parameters differ.

It is possible to use a single HTTP RFC destination in SM59 for all IDOC integrations with SAP BTP
Integration Suite by leveraging a central “dispatcher” iFlow. This iFlow acts as a generic entry point,
extracting the IDOC type or other metadata to dynamically route the message to specific sub-flows
based on its type (e.g., ORDERS, MATMAS). This approach eliminates the need for multiple RFC
destinations or ports in S/4HANA while centralizing routing logic in the Integration Suite. Ensure secure
communication (e.g., HTTPS with OAuth) and implement robust monitoring for scalability and
maintainability.
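Conceptually, the dispatcher reduces to a lookup from IDoc type to target sub-flow. A minimal sketch of that routing logic in plain Python (the endpoint addresses here are invented for illustration, not real CPI configuration):

```python
# Routing table: IDoc message type -> ProcessDirect address of the sub-flow.
# In CPI this table would live in a value mapping, not in code.
ROUTING_TABLE = {
    "ORDERS": "/pd/idoc_orders",   # hypothetical addresses
    "MATMAS": "/pd/idoc_matmas",
}

def dispatch(idoc_message_type):
    """Return the target address for an IDoc type, or fail loudly."""
    address = ROUTING_TABLE.get(idoc_message_type)
    if address is None:
        # Unknown types should fail visibly rather than be silently dropped
        raise ValueError("No route configured for IDoc type " + idoc_message_type)
    return address
```

The point of the pattern is that adding a new IDoc type only means adding a table entry, not changing the dispatcher itself.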

I asked chatgpt:- Give me some scenarios of iflows in terms of sap cpi. Don't tell me the solutions just
give me the question bank for practice.

you can also try these for practice and confirm the answers for iflows after u try those on own.

ChatGPT replied:

Here are some random SAP CPI scenarios for practice:
1. API Integration & Transformation

A third-party API provides customer data in JSON format, but your SAP backend expects XML. Design a
flow to retrieve, transform, and send the data to SAP.

2. Error Handling & Logging

Your integration fetches orders from an API, but sometimes the API returns an error (e.g., 429 Too Many
Requests). Implement error handling with retries and logging.

3. File-to-API Integration

A vendor sends CSV files to an SFTP server daily. You need to read the file, convert it to JSON, and send it
to an API. Ensure duplicate files are not processed.

4. Pagination in API Calls

A REST API returns only 100 records per call. Implement pagination to fetch all data and merge the
results before sending them to an SAP system.

5. OData Service Consumption

Retrieve sales order details from an SAP S/4HANA OData service and send them to a third-party system
in JSON format. Ensure only new orders are fetched.

6. Dynamic Routing

Depending on the country code in incoming messages, route the message to different API endpoints
(e.g., US orders go to Endpoint A, EU orders go to Endpoint B).

7. JWT Authentication Handling

A REST API requires JWT-based authentication. Implement a flow to generate a JWT token, use it in API
calls, and refresh it when expired.

8. Parallel Processing

A batch file contains multiple orders. Process each order in parallel and ensure that failures in one order
do not impact others.

9. SAP Event-Driven Architecture

When a new material is created in SAP S/4HANA, trigger an event in CPI to notify an external inventory
management system.

SAP Cloud Integration IDoc Receiver Handler

Integration is a critical piece of any project, and IDoc is the standard interface used to integrate with ECC / S/4
systems. Therefore, an appropriate IDoc handling strategy is a must to ensure smooth execution of business
activities.

 In this blog post, we will discuss how to create an IFlow in CPI to handle multiple IDocs received
from the source system.
 We will see the configurations required in SAP CPI. This will be useful in any project that handles
multiple IDocs coming from source systems.

Problem Statement

In CPI, we normally use a single IFlow to receive all the IDocs from the source system (unlike SAP PI,
where we develop separate interfaces for each IDoc). We achieve this in CPI with the help of the ProcessDirect
functionality, which lets us call other IFlows in the same or different packages.

The figure below shows what such an IFlow looks like:


Image Source: SAP BTP Trial Account

To achieve this, we read the IDoc control data in the IFlow and use a router to route the messages based
on the message type or (as shown above) the IDoc type of the incoming IDoc.
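In this static-router variant, each route condition is typically an XPath expression on the IDoc control record, along these lines (the segment and field names assume the standard EDI_DC40 control record and example IDoc types; adjust to your own):

```
Route "ORDERS":  //EDI_DC40/IDOCTYP = 'ORDERS05'
Route "MATMAS":  //EDI_DC40/IDOCTYP = 'MATMAS05'
Default route:   error handling / alerting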

 But the downside of this approach is that whenever a new IDoc type is added, we must add a new
condition, a new branch and a new receiver to this IFlow; in other words, we have to edit and
redeploy the IFlow. This is tedious and strongly discouraged in a live system where thousands of
messages pass daily, because changing the IFlow can affect Business as Usual (BAU) activities. It may
lead to multiple message failures, an increase in ticket counts and, worse, loss of business or
monetary loss for the client.

Solution Approach

The shortcoming of the above approach is that we need to make changes in the main IFlow to add or
remove the IDoc Types.

Therefore, we need a solution where we do not have to touch the main IFlow to add or remove
IDoc types. This can be achieved using:
 Value Mapping,

 Groovy Scripting, and

 Dynamic Configuration

The IFlow will look like this:

Image Source: SAP BTP Trial Account

The idea behind this approach is to use Value mapping for configuration and whenever we want to add
or remove any IDoc in the IFlow, it can be done by changing the Value mapping only rather than
changing the complete IFlow.

Step by Step Logic

1. Create 4 new properties in the Content Modifier (MESTYP, DOCTYP, SENDER and LS) and read
the values of the IDoc Message Type, IDoc Type, Sender and Logical System from the IDoc Control
Data (EDIDC) segment respectively, using XPath.

2. We will take the IDoc control data as the input of the Groovy script.

3. The required information is the Source, Message Type, IDoc Type and Logical System stored
in the respective exchange properties.

4. In the Groovy script we will access the Value Mapping and get the corresponding ProcessDirect
address of the next IFlow, as shown below.

5. Then we will write the address of the next IFlow in the header PD_Address.
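The Content Modifier configuration for step 1 might look like the following (the XPath expressions assume the standard EDI_DC40 control record; which field actually carries the logical system can vary by landscape, so verify against your payload):

```
Property  Source Type  Source Value
MESTYP    XPath        //EDI_DC40/MESTYP
DOCTYP    XPath        //EDI_DC40/IDOCTYP
SENDER    XPath        //EDI_DC40/SNDPRN
LS        XPath        //EDI_DC40/SNDPOR   (or whichever field carries the logical system)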

import com.sap.gateway.ip.core.customdev.util.Message;
import com.sap.it.api.ITApiFactory;
import com.sap.it.api.mapping.ValueMappingApi;

def Message processData(Message message) {

    // Read the exchange properties set by the Content Modifier
    def map = message.getProperties();
    String mestyp  = map.get("MESTYP");
    String doctyp  = map.get("DOCTYP");
    String sender  = map.get("SENDER");
    String log_sys = map.get("LS");

    // Get the ValueMapping API
    def api = ITApiFactory.getApi(ValueMappingApi.class, null);
    if (api == null) {
        throw new Exception("Could not retrieve ValueMappingApi.");
    }

    // Get the ProcessDirect address from the value mapping
    String value = api.getMappedValue(sender, mestyp, log_sys, doctyp, "ProcessDirect");
    if (value == null) {
        throw new Exception("No values found in the Value Mapping for keys: " + sender +
            " : " + mestyp + " : " + log_sys + " : " + doctyp);
    }

    // Set the ProcessDirect address as a header
    message.setHeader("PD_Address", value);
    return message;
}

Later, the header "PD_Address" can be referenced dynamically in the ProcessDirect adapter using an Apache
Camel expression:

Image Source: SAP BTP Trial Account
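The referenced channel configuration amounts to putting the Camel simple expression directly into the address field (a sketch of the setting, in place of the screenshot):

```
ProcessDirect receiver channel
  Address: ${header.PD_Address}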

The Value Mapping will look like this:


A variant of the script builds a composite key from additional IDoc control record fields and reads the value-mapping agencies and schemas from exchange properties:

import com.sap.gateway.ip.core.customdev.util.Message;
import com.sap.it.api.ITApiFactory;
import com.sap.it.api.mapping.ValueMappingApi;

def Message processData(Message message) {

    def properties = message.getProperties();

    def msgtyp    = properties.get("MESTYP")
    def idoctyp   = properties.get("IDOCTYP")
    def cimtyp    = properties.get("CIMTYP")
    def sndprn    = properties.get("SNDPRN")
    def rcvprn    = properties.get("RCVPRN")
    def srcagency = properties.get("SRCAGENCY")
    def srcschema = properties.get("SRCSCHEMA")
    def trgagency = properties.get("TRGAGENCY")
    def trgschema = properties.get("TRGSCHEMA")

    // Build the composite lookup key
    def key = sndprn + "|" + msgtyp + "|" + idoctyp + "|" + cimtyp + "|" + rcvprn

    def api = ITApiFactory.getApi(ValueMappingApi.class, null);
    if (api == null) {
        throw new Exception("Could not retrieve ValueMappingApi.");
    }

    // Get the value from the value mapping
    String value = api.getMappedValue(srcagency, srcschema, key, trgagency, trgschema);
    if (value == null) {
        throw new Exception("No values found in the Value Mapping for keys: " + sndprn + " : "
            + msgtyp + " : " + rcvprn + " : " + idoctyp);
    }

    // Set the ProcessDirect address as a header
    message.setHeader("PD_Address", value);
    return message;
}
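To make the composite-key lookup concrete, here is a plain-Python simulation of the value-mapping call (the agencies, schemas and the sample entry are invented; this is not CPI's real ValueMappingApi):

```python
# Simulated value mapping: (source agency, source schema, source value) -> target value.
# In CPI, this table is maintained in a Value Mapping artifact.
VALUE_MAPPING = {
    ("S4", "IDOC_KEY", "SND_LS|ORDERS|ORDERS05||RCV_LS"): "/pd/idoc_orders",
}

def lookup_pd_address(sndprn, msgtyp, idoctyp, cimtyp, rcvprn):
    """Build the composite key and resolve the ProcessDirect address."""
    key = "|".join([sndprn, msgtyp, idoctyp, cimtyp, rcvprn])
    value = VALUE_MAPPING.get(("S4", "IDOC_KEY", key))
    if value is None:
        raise LookupError("No value mapping entry for key " + key)
    return value
```

Because the key includes sender, receiver and extension type, two systems can route the same IDoc type to different sub-flows without touching the dispatcher.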
🚀 **Exciting Update in SAP CPI Timer Scheduler**

As SAP professionals, we are always on the lookout for features that make integrations smoother and
more efficient. Recently, SAP introduced a highly useful enhancement to the timer scheduler in CPI. I
might be a bit late to the party (as I wasn’t actively practicing for a while), but when I discovered it, I
knew I had to share my thoughts! 😊

**What’s New?**
SAP has added advanced scheduling options to the timer, which now allows you to:
✅ Schedule interfaces for **specific days of the week** (e.g., only Mondays and Fridays).
✅ Set up runs for **specific days of the month** (e.g., the 1st, 15th, or last day of the month).
✅ Define **custom time intervals** with more precision.

This is a game-changer for scenarios where timing is critical, like payroll interfaces, where the process
needs to start and stop according to cutoff dates.
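For comparison, the same schedules expressed as Quartz-style cron expressions (shown only to illustrate the timing; the timer UI has its own fields, and its exact format may differ):

```
0 0 6 ? * MON,FRI    06:00 on Mondays and Fridays
0 0 6 1,15 * ?       06:00 on the 1st and 15th of the month
0 0 6 L * ?          06:00 on the last day of the month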

**My Experience So Far**


In the past, I relied on Groovy scripts to handle these requirements. While Groovy offered flexibility,
there were two major challenges:
1️⃣ Making it **configurable** for end-users or non-technical colleagues.
2️⃣ Ensuring it was **easy to understand** for those who, like me, prefer to avoid coding when possible.

With this new update, those challenges are significantly reduced. The enhanced timer scheduler
provides a more **intuitive and user-friendly** approach, enabling better configurability without
requiring custom scripts for every unique schedule.

**Your Thoughts?**
For those of you already using this feature, how has it impacted your projects? Have you faced any
challenges or discovered any creative use cases? I'd love to hear your feedback and learn from your
experiences!

Let’s discuss and grow together. 💡


**🚀 Handling No Data Scenarios in SAP CPI Timer-Based Interfaces**

When an interface in **SAP CPI** is triggered by external applications or events, data is usually passed
along, ensuring the integration flow (iFlow) doesn’t fail at runtime. However, in **⏰ timer-based
interfaces** dealing with delta loads, there’s a chance no data will be received. This can lead to failures,
especially at steps where the iFlow expects data.

### ✅ **How to Handle This?**


To avoid such runtime failures when no data is available, I recommend **using a router to check if data
exists**. Here's how:

1. **Set a Property**:
- Use a **Content Modifier** to capture the body payload in a property.
- Check if this property is null or empty within the router.

2. **Alternative**: Use a 📝 **Groovy script** to set a flag in the properties if the body is null.
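The check itself is simple; a plain-Python sketch of the routing decision the router makes (the route names are illustrative, not CPI step names):

```python
def choose_route(body):
    """Route to processing when data exists, otherwise to a graceful end step."""
    # Treat a missing or whitespace-only payload as "no data"
    if body is None or body.strip() == "":
        return "no_data_end"      # End step that just records "No Data Exists"
    return "process_data"
```

In the iFlow, the same condition would be expressed on the property captured by the Content Modifier, e.g. checking it against an empty string.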

### 💡 **Why Content Modifier?**


I prefer the **Content Modifier** approach for its simplicity and maintainability. It avoids the need for
custom scripts, making it a cleaner solution. Check out the images below for implementation details.

### 🎯 **Bonus Tip!**

In cases where multiple **End steps** exist in an iFlow, monitoring becomes tricky—you won’t know
which End step was executed. To address this:

- Add a **🧾 Custom Header** at runtime.


- Use it to store a note like "No Data Exists" or similar to indicate the execution flow.

This approach helps ensure smoother monitoring and clarity during interface execution.

In case you're interested to get the groovy for a common custom header which you can use in all
interfaces, here's my detailed post on that:

---
**🌟 Your Turn!**
What approaches do you use to handle no-data scenarios or improve monitoring in your interfaces?
Share your thoughts below 👇 and let’s discuss ways to make our integrations more robust!
An ALM alert can be set up to trigger a notification to the person responsible.

If there is a connection issue with M2C, Integration Suite will not read data from the CDS view; the data
stays in the underlying table, so there is no data loss. The next run of the scheduler will process the
pending records automatically, based on the configured frequency.

Pattern for the M2C integration: a simple pattern where Integration Suite polls the data from the CDS view on an
hourly basis with the JDBC adapter, stores the hourly timestamp, and sends the data to M2C in JSON format via a
simple HTTPS POST. The endpoint is provided by the M2C team; it is a secure connection and supports OAuth 2.0.

Rollout requirements: for each future new ETO plant, an entry needs to be added to the table in Datasphere to
extract the data and make it available in the CDS view for integration with M2C.

It is agreed that real-time integration is not needed and hourly synchronization is acceptable. This integration
flow will consume enriched data from a CDS view built over different master tables in S/4HANA Cloud. A new custom
CDS view will be created, and Integration Suite will use a JDBC connection to connect to the view, extract the
data every hour, and send it to M2C via HTTPS POST. Integration Suite queries the CDS view, transforms the
result from XML to JSON, and sends it to the M2C API endpoint.
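A plain-Python sketch of the hourly delta-poll logic described above (the view and column names are invented; this stands in for the JDBC adapter, not the adapter itself):

```python
from datetime import datetime

def build_delta_query(last_run):
    """Select only records changed since the last successful run (the watermark)."""
    # Table and column names are hypothetical.
    return ("SELECT * FROM ZCDS_M2C_VIEW "
            "WHERE changed_at > '" + last_run.isoformat() + "'")

def poll_delta(fetch_since, last_run, now):
    """Fetch the delta; the watermark advances only after a successful run."""
    rows = fetch_since(last_run)   # stands in for the JDBC query
    # ...transform the rows to JSON and POST them to the M2C endpoint here...
    return rows, now               # new watermark, persisted after success
```

Because the watermark only moves forward on success, a failed hour is simply picked up by the next scheduled run, which matches the "no data loss" behavior noted above.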
Content Modifier (named as Send Exception Details):

Message Header:

Expression - stacktrace: ${exception.stacktrace}

Constant - InterfaceID: IFLOW_01 (you can set this to any unique name to use it as an iFlow-specific
ID)

Message Body:
<root>
<Exception_Details>
<Date>${date:now:dd-MM-yyyy}</Date>
<Time>${date:now:hh:mm:ss}</Time>
<Message_ID>${property.SAP_MessageProcessingLogID}</Message_ID>
<IFlow_Name>${camelId}</IFlow_Name>
<Exception_Message>${exception.message}</Exception_Message>
</Exception_Details>
</root>

Process Direct Adapter:


Main Exception Handling Flow:

Using the ProcessDirect adapter at the sender end and keeping the same endpoint, the exception
message is received in Exception Handling Main Flow.

Sender side Process Direct Snapshot below:

Content Modifier (named as Get Header/Properties):

Message Header:
Exchange Property:

Groovy Script (named as Call Value Map):

import com.sap.gateway.ip.core.customdev.util.Message;
import com.sap.it.api.ITApiFactory;
import com.sap.it.api.mapping.ValueMappingApi;

def Message processData(Message message)
{
    def map = message.getHeaders();
    def InterfaceName = map.get("InterfaceID") as String;

    // Look up the recipient list maintained in the value mapping for this interface
    def obj = ITApiFactory.getApi(ValueMappingApi.class, null);
    def mappedValue = obj.getMappedValue("InterfaceDetails", "InterfaceName", InterfaceName,
        "InterfaceDetails", "RecipientName");

    if (mappedValue == null)
    {
        throw new IllegalStateException("No value mapping maintained for interface: " + InterfaceName);
    }
    message.setProperty("RecipientName", mappedValue);
    return message;
}

Mail Adapter Configuration:

In the Connection tab, the connectivity details are maintained. In the Processing tab of the mail adapter,
below is the snapshot of the configuration done:
Mail Body:

Hi Team Members
</br>
</br>
This email is to notify you that an interface has failed in the middleware SAP Cloud Integration.</br>
Here are the details about the interface, and the exception caught which would help you analyze this
issue further:</br>
</br>
<table style="width:50%">
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Date</b>
</td>
<td>${property.Date}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Time</b>
</td>
<td>${property.Time}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Message ID</b>
</td>
<td>${property.Message_ID}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Integration Flow Name</b>
</td>
<td>${property.IFlow_Name}</td>
</tr> <tr>
<td style="background-color:#5B8AB6;color:white">
<b>Exception Message</b>
</td>
<td>${property.Exception_Message}</td>
</tr>
</table>
</br>
</br>
<i>This is an auto-generated email from the middleware (SAP Cloud Integration). Please do not reply to
this email.</i>
</br>
The above Groovy script calls the value mapping maintained for the InterfaceID that we set as a constant in the
Exception Subprocess Content Modifier in the earlier steps. Maintaining a unique iFlow-specific ID there enables
sending the email to the respective owners of each iFlow.

Value Map Snapshot:

In the value map, we can add more entries and, corresponding to them, assign the desired email IDs
(separated by commas) to which the email should be sent for that specific iFlow failure.

This is how we can handle the exceptions for the iFlows and send email notifications to desired emails.
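The recipient resolution amounts to a keyed lookup; a plain-Python simulation of the value-mapping table (the interface IDs and addresses are invented):

```python
# Simulated value mapping: InterfaceID -> comma-separated recipient list,
# mirroring how entries would be maintained in the Value Mapping artifact.
RECIPIENTS = {
    "IFLOW_01": "team-a@example.com,team-a-lead@example.com",
    "IFLOW_02": "team-b@example.com",
}

def recipients_for(interface_id):
    """Resolve the notification recipients for an iFlow, failing if unmapped."""
    value = RECIPIENTS.get(interface_id)
    if value is None:
        raise LookupError("No value mapping maintained for " + interface_id)
    return value.split(",")
```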

Hello Team,

A message has failed with the below error:

Error detail: ${exception.message}

Correlation ID: ${header.SAP_MplCorrelationId}

Message ID: ${property.SAP_MessageProcessingLogID}

Iflow Name: ${camelId}

Date: ${date:now:dd-MM-yyyy HH:mm} / SAP CPI - Tenant Number: ${property.SystemName}

Thanks & Regards,

SAP CPI Team

Again go to Palette ->> Participant ->> Receiver, add this Receiver to the flow and connect it with Send Request using the Mail
adapter. Configure the Mail adapter as below.

With this we have completed creating the Exception Subprocess. After deploying this iFlow, if any error occurs it will send an alert
mail like the one below.

Note: I intentionally kept an error in this iFlow to show the alert mail below.
