SAP CPI: Real-Time Scenarios

IDoc to REST API:

In this IDoc to REST API integration scenario, the business requirement is to send real-time order information from an SAP ECC system to a third-party REST API, such as an e-commerce platform or CRM, for order processing, tracking, or analytics.

The integration flow starts with the IDoc Sender Adapter in SAP CPI,
which receives the IDoc from SAP ECC. In CPI, a call is made to generate
an OAuth token using an HTTP Receiver Adapter by invoking the
authentication endpoint with required credentials. The token is then
stored dynamically for reuse. Next, the IDoc data is converted to JSON
format using Message Mapping to match the REST API's expected
structure. Finally, the JSON payload is sent to the target REST API using
another HTTP Receiver Adapter, with the OAuth token added in the
Authorization header for secure communication.
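
The token-handling step can be implemented in a short Groovy script. Below is a minimal sketch, assuming the authentication endpoint returns JSON with an access_token field (the property, header, and field names are illustrative, not project-specific):

    import com.sap.gateway.ip.core.customdev.util.Message
    import groovy.json.JsonSlurper

    // Runs after the HTTP Receiver call to the authentication endpoint:
    // parses the token response, keeps the token in an exchange property
    // for reuse, and sets the Authorization header for the API call.
    def Message processData(Message message) {
        def response = new JsonSlurper().parseText(message.getBody(String))
        message.setProperty("oauthToken", response.access_token)  // assumed response field
        message.setHeader("Authorization", "Bearer " + response.access_token)
        return message
    }
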
This integration automates order data synchronization between systems,
ensuring real-time updates and reducing manual intervention, which is
critical for businesses requiring seamless communication across SAP and
third-party systems.

OData to SFTP:
In this scenario, data is fetched from an OData service and temporarily
stored in a Data Store in SAP CPI. The data store acts as a temporary
holding area, allowing us to handle errors or failures during processing. If
the data can't be sent to the target (like an SFTP server) due to an issue,
the message stays in the data store and can be retried later. The data is
written with a unique Entry ID, and it remains in the data store until
successfully processed or deleted after a certain period.

The use of data stores ensures that no data is lost due to failures and
makes error handling more efficient. For example, if an error happens
during the data transfer to SFTP, the message stays in the store and will
be retried automatically in the next run. This approach helps to manage
data efficiently across multiple integration flows, ensuring smooth and
reliable integration between systems.

AMS activities
In an AMS (Application Maintenance and Services) project for SAP CPI, the
main focus is on managing and resolving tickets raised by clients or users.
These tickets are typically raised when there are issues with data
transmission, server connectivity, or other integration-related problems.
The issues are categorized into different priority levels—P1 (very high
priority), P2 (high priority), P3 (low priority), and P4 (very low priority)—
based on the impact on client business or systems. The team is
responsible for addressing these tickets, debugging the issues, and
ensuring that the system operates smoothly.

The activities in the AMS project involve daily monitoring of the tickets,
prioritizing them, and resolving the issues in a timely manner. This
includes identifying the root cause of the problem, fixing errors, and
performing necessary troubleshooting steps to restore functionality. The
main goal is to ensure that the integration between systems, especially
SAP CPI, is stable and any disruptions are quickly resolved to minimize
business impact.

RFC Configuration:

In this lecture, the process of configuring the RFC (Remote Function Call) adapter for connecting SAP systems (like S/4HANA or ECC) with SAP Cloud Platform Integration (CPI) via the SAP Cloud Connector is explained. First, the RFC connection between the local SAP server (ECC/S4) and the SAP Cloud Connector is set up. This is typically done by the Basis team, and requires details such as the application server, system number, and system ID. Once configured, you check the connection status to confirm it is successful.

Then, the RFC connection between SAP CPI and SAP ECC/S4 is set up in
the BTP cockpit by creating a destination with RFC as the type. Details like
user credentials, client number, and system number are entered, and the
connection is tested. A function module (RFC name) must also be
maintained to avoid errors. Once everything is configured and the
connection is verified, an IFlow can be created in CPI to send data using
the RFC adapter. This involves associating the RFC package with the IFlow
for data integration between systems.
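
As a rough sketch, the RFC destination in the BTP cockpit amounts to a set of jco.client.* properties along these lines (all values are placeholders):

    Name:              ECC_RFC_DESTINATION
    Type:              RFC
    jco.client.ashost: <virtual host exposed via Cloud Connector>
    jco.client.sysnr:  00
    jco.client.client: 100
    jco.client.user:   <communication user>
    jco.client.passwd: <password>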

Number Range Use Case in SAP CPI

The Number Range feature in SAP CPI is used to assign unique, sequential
numbers to messages. It remembers the last number used and continues
from there for the next message, even if there is a delay or break.

Example:

 If three messages are processed today (001, 002, 003), the next message tomorrow starts from 004.

 It prevents duplicate numbering and ensures continuity.


Steps:

1. Create a Number Range under Monitor → Overview → Number Range.

2. Set Min Value, Max Value, Field Length, and enable Rotate (optional).

3. In a Content Modifier, store the next number in a property using the Number Range source type (as sketched below).

4. Map it to your target message (e.g., via the getProperty function in Message Mapping).

This helps maintain sequential numbering for tracking and unique identification.
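
For example, the Content Modifier entry might be configured like this (the property name is illustrative; Number Range is one of the Content Modifier source types):

    Action:       Create
    Name:         SequenceNumber
    Source Type:  Number Range
    Source Value: <your Number Range object name>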

Externalization Use Case in SAP CPI

Purpose of Externalization:
Externalization allows configurable parameters (e.g., endpoints,
credentials, file names) to be updated without modifying or redeploying
the Integration Flow (IFlow).

Use Case:

1. In a project, there are typically three environments:

o Development (DEV)

o Quality Assurance (QA)

o Production (PROD)

2. Each environment may have different connection details, such as:

o SFTP server details (address, folder, credentials)

o Endpoints, proxy, file names, etc.

3. Instead of editing the IFlow for each environment, these values are
externalized.

o Example: data_proxy, authentication_name, folder_name

4. How It Works:

o During IFlow deployment, you update the Configurable Parameters in the Configure section.

o This ensures the IFlow remains the same, and only environment-specific details change.
5. Benefits:

o No need to edit or redeploy IFlows for different environments.

o Makes migration to QA/PROD environments seamless.

o Saves time and reduces errors.
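
For instance, the externalized SFTP fields in the adapter hold {{...}} tokens instead of fixed values (parameter names are illustrative):

    Address:         {{sftp_host}}
    Directory:       {{folder_name}}
    Credential Name: {{authentication_name}}

Each environment then supplies its own values for these parameters in the Configure section at deployment time.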

Conclusion: Externalization is critical for managing configuration changes efficiently across environments without modifying the actual IFlow.

Handling Failed Messages and JMS Adapter Use Case in SAP CPI

Problem:

 Messages that fail in the process cannot be automatically resent back to SAP CPI.

 Common issues: data errors, manual intervention required, or connection failures.

JMS Adapter Solution in SAP CPI:

1. Purpose of JMS Adapter:

o Acts as a fail-safe mechanism for retrying failed messages.

o Stores the message temporarily until it can be resent or processed successfully.

2. Steps to Handle Failed Messages:

o Use the JMS Adapter to queue messages.

o Messages remain in the queue for a specific duration (e.g., hours or days) before they expire.

o Failed messages can be manually retried or deleted if needed.

3. Configuration Overview:

o Start Message Event: Captures incoming messages.

o JMS Adapter (Queue): Holds messages in the queue.

o Retry Mechanism: Attempts to resend the message to the receiver.

o Manual Deletion: In cases where retry fails repeatedly, messages can be manually removed.

4. Example Scenario:
o Use Case: Addition Calculator Service.

 Sends two input values (e.g., value1 and value2) for addition.

 If the receiver fails to process this message, it gets stored in the JMS queue.

o Retry the message until it is successfully processed.
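
As a sketch, the retry behavior is governed by settings on the JMS sender channel of the consuming flow, along these lines (queue name and values are illustrative):

    Queue Name:                       AdditionQueue
    Retry Interval (in min):          1
    Exponential Backoff:              enabled
    Maximum Retry Interval (in min):  60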

Benefits:

 Ensures message reliability and prevents data loss.

 Provides flexibility to retry or remove failed messages manually.

 Enhances fault tolerance in message processing.

Conclusion:
Using the JMS Adapter in SAP CPI is an effective way to handle failed
messages by queuing them, ensuring they can be retried without losing
data.

Why Use Data Stores?

 Error Handling: If message processing fails (e.g., issues with the target SFTP), the data remains stored in the Data Store, allowing reprocessing without relying on the source system to resend the same data.

 Message Retry: Data Stores enable retrying the message in subsequent runs.

 Dynamic Processing: We can dynamically store and retrieve messages using unique identifiers like Product ID or IDoc Number.

 Temporary Storage: For large or time-consuming operations, we can enrich data once and reuse it in multiple IFlows.

Steps:

1. Create the First IFlow (Write to Data Store):

o Retrieve data from the OData service using a scheduler.

o Use the Write Data Store step to temporarily save the message (see the configuration sketch after these steps).

o Provide the following in the Write Data Store:

 Data Store Name (e.g., DataStore_Write)

 Entry ID: A unique identifier (like Product ID) for each message.

 Visibility: Set to Global to allow access from other IFlows.

2. Create the Second IFlow (Read from Data Store):

o Use the Get Data Store step to read the stored message.

o Provide the same Data Store Name and Entry ID to fetch the exact message.

o Post-Processing: Enable Delete on Completion to remove the message after successful processing.

3. Send the Retrieved Data to the SFTP Server:

o Add an SFTP Receiver to write the retrieved data to a specified SFTP directory.

o Optionally, use a CSV Converter to format the data as needed.

4. Handle Failures:

o If the target system (SFTP) is unavailable or processing fails, the message remains in the Data Store.

o The message can be reprocessed in the next retry cycle without losing data.
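
A sketch of the two Data Store step configurations, assuming the Entry ID arrives dynamically in a header named ProductID (all names and values are illustrative):

    Write Data Store step:
      Data Store Name:      DataStore_Write
      Entry ID:             ${header.ProductID}
      Visibility:           Global
      Retention Threshold:  e.g., 2 days

    Get Data Store step:
      Data Store Name:       DataStore_Write
      Entry ID:              ${header.ProductID}
      Delete on Completion:  enabled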

Key Benefits to Highlight:

 Reliable Processing: Data Stores ensure no data is lost during failures.

 Dynamic Handling: Using unique identifiers like Product ID allows processing specific messages.

 Reusability: Data stored once can be reused in multiple IFlows.

 Error Recovery: Simplifies error handling and retry mechanisms.


Real-Life Use Case:

In SuccessFactors integrations, where data enrichment can take time, we fetch employee data once, store it in a Data Store, and reuse it in multiple processes. Similarly, in this scenario, we temporarily stored the OData message and ensured reliable delivery to SFTP.


Scenario: Inserting data into a database from CPI using JDBC Adapter.

Step-by-Step Explanation:

1. Goal:
I needed to insert records into a database (PostgreSQL in this case)
using SAP CPI. The source data was in a CSV file.

2. Approach:

o First, I established a JDBC connection in CPI to the target database.

o I used a CSV to XML Converter to convert the source CSV data into XML format.

o I designed a Message Mapping to transform the XML data into the JDBC structure required for database insertion.

3. Steps Taken:

o Scheduler/Timer: Used to trigger the iFlow.

o Content Modifier: Simulated input CSV data (used for testing as I didn’t have access to upload files).

o CSV to XML Converter: Converted the CSV data into XML format.

o Splitter: Used a General Splitter to process multiple records one by one.

o Message Mapping: Mapped the XML fields to the JDBC structure (action = "INSERT"); the typical target structure is shown after the results.

o JDBC Adapter: Automatically converted the XML structure into an SQL INSERT query and inserted the data into the target database.

4. Results:

o Verified the records in the database using a SELECT query.

o For bulk records, I used the Splitter to handle and insert each
record one by one.
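
For reference, the JDBC structure produced by the mapping for an INSERT typically follows the XML SQL format below (table and column names are illustrative); the adapter turns each access block into one inserted row:

    <root>
      <StatementName>
        <dbTableName action="INSERT">
          <table>PRODUCTS</table>
          <access>
            <PRODUCT_ID>101</PRODUCT_ID>
            <PRODUCT_NAME>Laptop</PRODUCT_NAME>
          </access>
        </dbTableName>
      </StatementName>
    </root>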

Alternative Solution (Advanced):

If I needed to insert bulk records in one go, I could optimize this using a Groovy Script. The script would combine multiple records into a single SQL INSERT query for better performance, as sketched below.
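
A minimal Groovy sketch of that idea, assuming the converted XML carries repeating Record elements with ID and NAME fields, and that the receiver channel executes the SQL statement in the message body (all names are assumptions for illustration):

    import com.sap.gateway.ip.core.customdev.util.Message

    // Collapse all Record elements into one multi-row INSERT statement.
    // Note: values are concatenated without escaping; illustration only.
    def Message processData(Message message) {
        def root = new XmlSlurper().parseText(message.getBody(String))
        def rows = root.Record.collect { r ->
            "(${r.ID.text()}, '${r.NAME.text()}')"
        }
        message.setBody("INSERT INTO PRODUCTS (ID, NAME) VALUES " + rows.join(", "))
        return message
    }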

Key Tools and Concepts Used:

 JDBC Adapter

 CSV to XML Converter

 General Splitter

 Message Mapping

 Content Modifier

 Groovy Script (for bulk operations)

Scenario:
In SAP CPI, I needed to display a specific IDoc number in the Message
Monitoring level as a custom header. This is crucial for easy message
searchability when clients or users ask for the status of a specific IDoc.

Problem:

 The incoming IDoc XML had a namespace in its structure.

 SAP CPI's XPath expression was unable to fetch the required DOCNumber due to the namespace.

 As a result, the IDoc number could not be stored in the custom header (e.g., SAP_ApplicationID).

Solution:
1. Remove XML Namespace Using XSLT Mapping

o I created an XSLT mapping to remove the namespace from the incoming XML (see the sample transform after these steps).

o After applying the XSLT, the XML structure became namespace-free, allowing CPI to access the DOCNumber using XPath.

2. Set the Custom Header

o In a Content Modifier, I set a custom header named SAP_ApplicationID (a standard header for application identifiers).

o Using an XPath expression, I extracted the DOCNumber from the modified XML and stored it in the custom header.

3. Steps in the iFlow:

o XSLT Mapping: Removes the namespace.

o Content Modifier: Fetches the DOCNumber and sets it in the SAP_ApplicationID header.
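
The namespace-removal XSLT itself is a standard identity-style transform along these lines (generic, not project-specific):

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="xml" indent="yes"/>
      <!-- Rebuild every element under its local name, dropping namespaces -->
      <xsl:template match="*">
        <xsl:element name="{local-name()}">
          <xsl:apply-templates select="@*|node()"/>
        </xsl:element>
      </xsl:template>
      <!-- Copy attributes without their namespace prefixes -->
      <xsl:template match="@*">
        <xsl:attribute name="{local-name()}">
          <xsl:value-of select="."/>
        </xsl:attribute>
      </xsl:template>
    </xsl:stylesheet>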

End Result:

 The IDoc number (DOCNumber) now appears as a custom header in the Message Monitoring UI.

 This makes it easier to search for specific messages based on IDoc numbers, especially when clients request verification of a particular IDoc.

Benefits:

1. Improved Monitoring: Quickly locate and verify messages in SAP CPI.

2. Client-Friendly: Directly search for IDocs using the custom header.

3. Robust Design: Handles XML namespaces effectively using XSLT.

Key Tools/Concepts Used:

 XSLT Mapping: To remove namespaces.

 Content Modifier: To set the custom header.

 XPath Expression: To extract DOCNumber from the XML.


In Short:
In one of my projects, we received IDoc messages in SAP CPI, and I needed to display the IDoc number (DOCNumber) in the Message Monitoring UI using a custom header (SAP_ApplicationID). However, the XML namespaces in the incoming IDoc caused the XPath expression to fail, and the IDoc number wasn’t getting stored.

Solution:

1. I created an XSLT mapping to remove the namespaces from the XML.

2. Using a Content Modifier, I extracted the DOCNumber using an XPath expression and stored it in the custom header SAP_ApplicationID.

3. After deploying and testing, the IDoc number was successfully displayed in the Message Monitoring UI.

Outcome:
This allowed us to quickly search messages using the IDoc number,
making it easier to troubleshoot and confirm message processing for
clients.

This solution simplified message tracking and solved the namespace issue
effectively.

Transport Mechanisms in SAP CPI:

In SAP CPI (Cloud Platform Integration), transporting artifacts (like iFlows, value mappings, etc.) between environments (e.g., Development, Testing, and Production) is a key activity to ensure smooth integration. Below are the primary transport mechanisms used in SAP CPI:
1. Manual Transport (Export and Import Artifacts)

 Artifacts like iFlows or Integration Packages can be exported from one environment and imported into another manually.

 Steps:

1. Go to the Design workspace in SAP CPI.

2. Select the Integration Package.

3. Use the Export option to download it as a .zip file.

4. Switch to the target environment and Import the package.

 Use case: Small-scale or one-off transports.

2. CTS+ (Change and Transport System Plus)

 CTS+ integrates SAP CPI with the traditional SAP Change and Transport System for managing transports.

 Artifacts are stored in transport requests (TRs) that can be moved between environments.

 Steps:

1. Configure CTS+ to integrate with CPI.

2. Use CTS+ tools to transport artifacts like iFlows and value mappings.

 Use case: Organizations using SAP CTS for centralized transport management.

3. Transport Management Service (TMS)

 SAP Transport Management Service is a cloud-based solution to manage transports efficiently across multiple CPI tenants.

 It automates artifact movement between environments (Dev → QA → Prod).

 Steps:

1. Enable TMS in the SAP BTP subaccount.

2. Configure Transport Nodes (e.g., Development, Test, and Production tenants).

3. Transport artifacts using TMS.

 Use case: Standard transport mechanism for SAP Cloud Integration in SAP BTP.

4. CI/CD Pipeline (Continuous Integration/Continuous Deployment)

 SAP CPI artifacts can be transported using DevOps tools like Jenkins, Azure DevOps, or GitHub Actions.

 Process:

1. Artifacts are exported from CPI and stored in a version control system (like Git); see the API sketch below.

2. Automated CI/CD pipelines transport artifacts across environments.

 Use case: For advanced DevOps-based deployments, providing automation, versioning, and rollback capabilities.
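
For example, a pipeline step might download an iFlow from the tenant through the public Integration Content OData API (tenant URL and artifact ID are placeholders):

    GET https://<tenant>/api/v1/IntegrationDesigntimeArtifacts(Id='<iflow_id>',Version='active')/$value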

Summary Table: Transport Mechanisms in SAP CPI

Mechanism                            Description                                     Best For
---------                            -----------                                     --------
Manual Transport                     Export and Import artifacts manually.          Small-scale transports.
CTS+                                 Integrates with SAP's traditional CTS system.  SAP landscapes using CTS.
Transport Management Service (TMS)   Cloud-based transport tool for SAP BTP.        Standard CPI transports.
CI/CD Pipelines                      DevOps tools for automated transport.          Automated, version-controlled deployments.

Conclusion:

The choice of transport mechanism depends on the organization’s setup, project size, and deployment frequency. For most cloud-centric setups, TMS is the preferred approach, while CI/CD pipelines are ideal for advanced automation.
