Value mapping
Question about BTP Integration Suite - IDoc integration from S/4 to Integration Suite:
We are building multiple integrations to send different IDoc types from S/4 to Integration Suite.
Each integration flow (iFlow) generates its own endpoint URL (https://<host>:<port>/cxf/<endpoint>),
which we need to maintain in SM59 in S/4 in order to send the IDoc.
Because of this we have to create multiple RFC destinations (and ports), one per integration, which
does not appear to be the right approach.
Is there a way to have just one HTTP destination in SM59 pointing to BTP Integration Suite and use it
to send multiple types of IDocs?
In fact, not only for IDocs but for any integration (File, REST, OData, etc.) we would like to have just one
RFC destination to Integration Suite.
Answer:
We need a generic flow in Integration Suite that calls the respective iFlow based on the IDoc message type.
Follow-up: I am curious whether we can do the same for other types of integration (File, REST, OData, etc.)
using the same generic iFlow and the same SM59 RFC destination.
Answer: For other channels you will probably need separate iFlows, because the channel parameters differ.
It is possible to use a single HTTP RFC destination in SM59 for all IDOC integrations with SAP BTP
Integration Suite by leveraging a central “dispatcher” iFlow. This iFlow acts as a generic entry point,
extracting the IDOC type or other metadata to dynamically route the message to specific sub-flows
based on its type (e.g., ORDERS, MATMAS). This approach eliminates the need for multiple RFC
destinations or ports in S/4HANA while centralizing routing logic in the Integration Suite. Ensure secure
communication (e.g., HTTPS with OAuth) and implement robust monitoring for scalability and
maintainability.
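At its core, the dispatcher iFlow described above is just a lookup from IDoc message type to a target route. The Java below is a minimal sketch of that idea, not SAP API code; the class name, the route addresses, and the `routeFor` helper are all hypothetical illustrations.

```java
import java.util.HashMap;
import java.util.Map;

class IdocDispatcher {
    // Hypothetical mapping from IDoc message type to a ProcessDirect-style address
    private static final Map<String, String> ROUTES = new HashMap<>();
    static {
        ROUTES.put("ORDERS", "/idoc/orders");
        ROUTES.put("MATMAS", "/idoc/matmas");
    }

    /** Returns the target route for a message type, or null if no route is maintained. */
    static String routeFor(String mestyp) {
        return ROUTES.get(mestyp);
    }

    public static void main(String[] args) {
        System.out.println(routeFor("ORDERS")); // /idoc/orders
    }
}
```

In the real iFlow the map would live in a Value Mapping (as the blog posts below describe), so new IDoc types can be added without redeploying the dispatcher.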
I asked ChatGPT: "Give me some scenarios of iFlows in terms of SAP CPI. Don't tell me the solutions, just
give me a question bank for practice."
You can also try these for practice and confirm the answers for the iFlows after you have tried them on your own.
ChatGPT replied:
Here are some random SAP CPI scenarios for practice:
1. API Integration & Transformation
A third-party API provides customer data in JSON format, but your SAP backend expects XML. Design a
flow to retrieve, transform, and send the data to SAP.
2. Error Handling
Your integration fetches orders from an API, but sometimes the API returns an error (e.g., 429 Too Many
Requests). Implement error handling with retries and logging.
3. File-to-API Integration
A vendor sends CSV files to an SFTP server daily. You need to read the file, convert it to JSON, and send it
to an API. Ensure duplicate files are not processed.
4. Pagination
A REST API returns only 100 records per call. Implement pagination to fetch all data and merge the
results before sending them to an SAP system.
5. OData Integration
Retrieve sales order details from an SAP S/4HANA OData service and send them to a third-party system
in JSON format. Ensure only new orders are fetched.
6. Dynamic Routing
Depending on the country code in incoming messages, route the message to different API endpoints
(e.g., US orders go to Endpoint A, EU orders go to Endpoint B).
7. API Authentication
A REST API requires JWT-based authentication. Implement a flow to generate a JWT token, use it in API
calls, and refresh it when expired.
8. Parallel Processing
A batch file contains multiple orders. Process each order in parallel and ensure that failures in one order
do not impact others.
9. Event-Driven Integration
When a new material is created in SAP S/4HANA, trigger an event in CPI to notify an external inventory
management system.
Integration is a critical piece of any project, and IDoc is the standard interface used to integrate with
ECC / S/4 systems; therefore, an appropriate IDoc handling strategy is a must to ensure smooth execution
of business activities.
In this blog post, we will discuss how to create an IFlow in CPI to handle multiple IDocs received
from the source system.
We will look at the configurations required in SAP CPI. This will be useful in any project that
handles multiple IDocs coming from source systems.
Problem Statement
In CPI, we normally use a single IFlow to receive all the IDocs from the source system (unlike SAP PI,
where we develop separate interfaces for each IDoc). We achieve this in CPI with the help of the
ProcessDirect functionality, which lets us call other IFlows in the same or different packages.
To achieve this, we read the IDoc control data in the IFlow and use a router to route the messages based
on the Message Type or (as shown above) the IDoc Type of the incoming IDoc.
But the downside of this approach is that whenever a new IDoc type is added, we must add a new
condition, a new branch and a new receiver to this IFlow. In other words, we have to edit and redeploy
the IFlow, which is a tedious task and highly inadvisable in a live system where thousands of messages
pass daily. Making changes in the IFlow can affect Business as Usual (BAU) activities and may lead to
multiple message failures, an increase in ticket counts and, worse, loss of business or monetary loss
for the client.
Solution Approach
The shortcoming of the above approach is that we need to change the main IFlow to add or
remove IDoc types.
Therefore, the solution should be one where we do not have to change the main IFlow and can
add or remove IDoc types freely. This can be achieved using:
Value Mapping,
Groovy Scripting, and
Dynamic Configuration
The idea behind this approach is to use Value Mapping for configuration: whenever we want to add
or remove an IDoc type, we change only the Value Mapping rather than the complete IFlow.
1. Create four new properties in the Content Modifier (MESTYP, DOCTYP, SENDER and LS) and read
the values of the IDoc Message Type, IDoc Type, Sender and Logical System from the IDoc Control
Data (EDIDC) segment using XPath.
2. We will take the IDoc control data as the input of the Groovy script.
3. The required information (the Sender, Message Type, IDoc Type and Logical System) is stored
in the respective exchange properties.
4. In the Groovy script we access the Value Mapping and get the corresponding ProcessDirect
address of the next IFlow, as shown below.
5. Then we write the address of the next IFlow into the header PD_Address.
import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.mapping.ValueMappingApi

def Message processData(Message message) {
    // Read the keys set earlier in the Content Modifier as exchange properties
    def sender  = message.getProperty("SENDER")
    def mestyp  = message.getProperty("MESTYP")
    def doctyp  = message.getProperty("DOCTYP")
    def log_sys = message.getProperty("LS")

    // Access the Value Mapping API
    def api = ITApiFactory.getApi(ValueMappingApi.class, null)
    if (api == null) {
        throw new Exception("Could not retrieve ValueMappingAPI.")
    }

    // Look up the ProcessDirect address of the next IFlow
    // (the agency and identifier names below are illustrative; use the ones
    // maintained in your Value Mapping artifact)
    def value = api.getMappedValue("S4", sender, mestyp + ":" + doctyp + ":" + log_sys, "CPI", "PD_Address")
    if (value == null) {
        throw new Exception("No values found in the Value Mapping for Keys: " + sender +
                " : " + mestyp + " : " + log_sys + " : " + doctyp)
    }

    // Set Process Direct Address as Header
    message.setHeader("PD_Address", value)
    return message
}
Later the header "PD_Address" can be called dynamically in the ProcessDirect receiver's address field
using the Apache Camel simple expression ${header.PD_Address}.
-------------------------------------------------------------------------------------------------------------------------------------------
🚀 **Exciting Update in SAP CPI Timer Scheduler**
As SAP professionals, we are always on the lookout for features that make integrations smoother and
more efficient. Recently, SAP introduced a highly useful enhancement to the timer scheduler in CPI. I
might be a bit late to the party (as I wasn’t actively practicing for a while), but when I discovered it, I
knew I had to share my thoughts! 😊
**What’s New?**
SAP has added advanced scheduling options to the timer, which now allows you to:
✅ Schedule interfaces for **specific days of the week** (e.g., only Mondays and Fridays).
✅ Set up runs for **specific days of the month** (e.g., the 1st, 15th, or last day of the month).
✅ Define **custom time intervals** with more precision.
This is a game-changer for scenarios where timing is critical, like payroll interfaces, where the process
needs to start and stop according to cutoff dates.
**Your Thoughts?**
For those of you already using this feature, how has it impacted your projects? Have you faced any
challenges or discovered any creative use cases? I'd love to hear your feedback and learn from your
experiences!
When an interface in **SAP CPI** is triggered by external applications or events, data is usually passed
along, ensuring the integration flow (iFlow) doesn’t fail at runtime. However, in **⏰ timer-based
interfaces** dealing with delta loads, there’s a chance no data will be received. This can lead to failures,
especially at steps where the iFlow expects data.
1. **Set a Property**:
- Use a **Content Modifier** to capture the body payload in a property.
- Check if this property is null or empty within the router.
2. **Alternative**: Use a 📝 **Groovy script** to set a flag in the properties if the body is null.
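The null/empty check described above boils down to one predicate. A plain-Java sketch of that decision logic (the class and method names are hypothetical; in CPI you would do this in the script or in the router condition itself):

```java
class PayloadCheck {
    /** Returns true when the incoming body should be treated as "no data". */
    static boolean isEmptyPayload(String body) {
        return body == null || body.trim().isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(isEmptyPayload(""));        // true
        System.out.println(isEmptyPayload("<root/>")); // false
    }
}
```

The router would then branch on this flag: one branch ends the flow gracefully, the other continues with the normal processing steps.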
In cases where multiple **End steps** exist in an iFlow, monitoring becomes tricky—you won’t know
which End step was executed. To address this:
This approach helps ensure smoother monitoring and clarity during interface execution.
In case you're interested in the Groovy for a common custom header that you can use in all
interfaces, here's my detailed post on that:
---
**🌟 Your Turn!**
What approaches do you use to handle no-data scenarios or improve monitoring in your interfaces?
Share your thoughts below 👇 and let’s discuss ways to make our integrations more robust!
An ALM alert can be set up to trigger a notification to the person responsible.
Integration Suite will not read data from the CDS view if there is a connection issue with M2C; the data
remains in the table, so there is no data loss.
The next run of the scheduler will process the pending records automatically based on the frequency.
Pattern for the M2C integration: a simple pattern where Integration Suite polls the data from the CDS view
on an hourly basis with the JDBC adapter, stores the hourly timestamp, and sends it to M2C in JSON format
via a simple HTTPS POST.
The endpoint is to be provided by the M2C team; it is a secure connection and supports OAuth 2.0.
Rollout requirements: for each future new ETO plant, the plant needs to be added to the table in
Datasphere to extract the data and make it available in the CDS view for integration with M2C.
Synchronization is OK. This integration flow will consume enriched data from a CDS view built on
different master tables in S/4HC.
A new custom CDS view will be created, and Integration Suite will use a JDBC connection to connect to
the view, extract the data every hour, and send it to M2C via HTTPS POST.
Integration Suite makes a query to pull data from the CDS view, transforms the XML, converts it to
JSON format, and sends it to the M2C API endpoint.
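The hourly delta described above relies on a stored high-water-mark timestamp: each run only selects rows changed since the previous run. The Java sketch below illustrates that bookkeeping; the `ChangedAt` column name and the filter syntax are assumptions, not the actual CDS view definition.

```java
import java.time.Instant;

class DeltaPoller {
    // High-water mark: the timestamp of the last successful run,
    // persisted between scheduler runs in the real integration
    private Instant lastRun;

    DeltaPoller(Instant initial) {
        this.lastRun = initial;
    }

    /** Builds a delta filter for the CDS view query and advances the high-water mark. */
    String deltaFilter(Instant now) {
        String clause = "ChangedAt > '" + lastRun + "' AND ChangedAt <= '" + now + "'";
        lastRun = now; // next run starts where this one ended, so no record is missed
        return clause;
    }

    public static void main(String[] args) {
        DeltaPoller p = new DeltaPoller(Instant.parse("2024-01-01T00:00:00Z"));
        System.out.println(p.deltaFilter(Instant.parse("2024-01-01T01:00:00Z")));
    }
}
```

Because the mark only advances after it is read, a failed run leaves the window untouched and the next scheduled run picks up the pending records, which matches the "no data loss" behavior noted above.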
Content Modifier (named as Send Exception Details):
Message Header:
Constant - InterfaceID: IFLOW_01 (you can set this to any unique name to use as an iFlow-specific ID)
Message Body:
<root>
<Exception_Details>
<Time>${date:now:HH:mm:ss}</Time>
<Message_ID>${property.SAP_MessageProcessingLogID}</Message_ID>
<IFlow_Name>${camelId}</IFlow_Name>
<Exception_Message>${exception.message}</Exception_Message>
</Exception_Details>
</root>
Using the ProcessDirect adapter at the sender end and keeping the same endpoint, the exception
message is received in Exception Handling Main Flow.
Message Header:
Exchange Property:
import com.sap.gateway.ip.core.customdev.util.Message
import com.sap.it.api.ITApiFactory
import com.sap.it.api.mapping.ValueMappingApi
import java.text.SimpleDateFormat
import java.util.Date

def Message processData(Message message) {
    // Read the iFlow-specific ID set as the InterfaceID header in the Content Modifier
    def InterfaceName = message.getHeaders().get("InterfaceID")

    // Set Date and Time properties for the mail body (formats are illustrative)
    def now = new Date()
    message.setProperty("Date", new SimpleDateFormat("dd-MM-yyyy").format(now))
    message.setProperty("Time", new SimpleDateFormat("HH:mm:ss").format(now))

    // Look up the recipient email IDs maintained in the Value Mapping
    // (the agency and identifier names below are illustrative; use the ones
    // maintained in your Value Mapping artifact)
    def api = ITApiFactory.getApi(ValueMappingApi.class, null)
    def mappedValue = api.getMappedValue("CPI", "InterfaceID", InterfaceName, "Mail", "Recipients")
    if (mappedValue == null) {
        throw new IllegalStateException("No valuemapping maintained for Interface: " + InterfaceName)
    }
    message.setProperty("RecipientName", mappedValue)
    return message
}
In the Connection tab, the connectivity details are maintained. In the Processing tab of the mail adapter,
below is the snapshot of the configuration done:
Mail Body:
Hi Team Members
</br>
</br>
This email is to notify you that an interface has failed in the middleware SAP Cloud Integration.</br>
Here are the details about the interface, and the exception caught which would help you analyze this
issue further:</br>
</br>
<table style="width:50%">
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Date</b>
</td>
<td>${property.Date}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Time</b>
</td>
<td>${property.Time}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Message ID</b>
</td>
<td>${property.Message_ID}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Integration Flow Name</b>
</td>
<td>${property.IFlow_Name}</td>
</tr>
<tr>
<td style="background-color:#5B8AB6;color:white">
<b>Exception Message</b>
</td>
<td>${property.Exception_Message}</td>
</tr>
</table>
</br>
</br>
<i>This is an auto-generated email from the middleware (SAP Cloud Integration). Please do not reply to
this email.</i>
</br>
The above Groovy script calls the Value Mapping maintained per the InterfaceID that we set earlier as a
constant in the Exception Subprocess Content Modifier. This unique iFlow-specific ID enables sending
the email to the respective owners of each iFlow.
In the Value Mapping we can add more entries and, against each, assign the desired email IDs
(separated by commas) to which the email should be sent for that specific iFlow failure.
This is how we can handle exceptions in iFlows and send email notifications to the desired recipients.
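Since the Value Mapping stores the recipients as a single comma-separated string, the mail step ultimately needs them as individual addresses. A minimal Java sketch of that split (the class name and the trimming behavior are assumptions; the comma-separated format is per the text above):

```java
import java.util.ArrayList;
import java.util.List;

class RecipientSplitter {
    /** Splits a comma-separated recipient string into trimmed, non-empty addresses. */
    static List<String> split(String mapped) {
        List<String> out = new ArrayList<>();
        for (String part : mapped.split(",")) {
            String addr = part.trim();
            if (!addr.isEmpty()) out.add(addr);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(split("a@x.com, b@x.com")); // [a@x.com, b@x.com]
    }
}
```

Trimming matters because entries maintained by hand in the Value Mapping often carry stray spaces after the commas, which some mail servers reject.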
Hello Team,
Again go to Palette -> Participant -> Receiver, add this Receiver to the flow, and connect it with the
Send Request step using the Mail adapter.
Configure the Mail adapter as below.
With this we have completed creating the Exception Subprocess. After deploying this iFlow, if any error
occurs it will send an alert mail like the one below.
Note: I intentionally kept an error in this iFlow to show the alert mail below.