
SAP CPI

Cloud service Models:-


In cloud computing discussions, four IT service models are commonly compared:
1. Traditional IT Model
2. Infrastructure as a Service (IAAS)
3. Platform as a Service (PAAS)
4. Software as a Service (SAAS)

Traditional IT Model:-
In the Traditional IT model, the organization has to manage everything itself: data, runtime, middleware, servers, storage, and so on. Maintaining all of this is difficult and expensive, which is why many companies are now looking at IaaS, PaaS, and SaaS.
Infrastructure as a Service:-
In IaaS, the provider manages the infrastructure (servers, storage, networking); the remaining layers, such as applications, data, runtime, and middleware, are managed by the customer.
This is ideal for businesses or developers who need to build custom applications and manage their own environment but don't want to physically maintain hardware.
Ex:- AWS, Microsoft Azure, Google Cloud Platform.
Platform as a Service:-
Here the customer manages only the applications and data; the provider handles the remaining infrastructure and platform maintenance (runtime, middleware, networking, and so on).
This is ideal for developers who want to focus on application development without managing the underlying infrastructure or platform.
Ex:- Heroku, Google App Engine, Azure App Services.
Software as a Service:-
Here everything is managed by the provider; the user just interacts with the software.
This is ideal for end users and businesses who need ready-to-use software solutions without worrying about maintenance and infrastructure.
Ex:- Google Workspace (Gmail, Docs), Microsoft 365, Salesforce, etc.

SAP CPI falls under the PaaS model.

SAP CPI Overview:-


CPI is a cloud-based middleware solution that enables the integration of on-premise and cloud applications within an organization as well as across different organizations.
It is part of SAP Integration Suite and facilitates data exchange between different systems such as S/4HANA, SuccessFactors, third-party applications, and other SAP and non-SAP systems.
CPI supports a wide range of integration patterns, including A2A, B2B, and hybrid integration scenarios.
A2A Integration:
This type of integration focuses on enabling communication between different applications within the enterprise ecosystem (e.g., between two SAP systems, between SAP and third-party systems, or between any combination of cloud and on-premise applications).
A2A integrations typically use a message-driven approach, where messages are exchanged based on business events (like order creation, invoice submission, etc.).
B2B Integrations:
B2B integration in the context of SAP CPI refers to integration between different organizations to facilitate the seamless exchange of data, such as purchase orders, invoices, or inventory updates, between external systems.
These integrations typically involve industry standards and protocols like EDI, XML, EDIFACT, X12, cXML, etc., and are crucial for automating and streamlining communication between businesses in supply chains, procurement, finance, and other domains.
SAP CPI WEB IDE:-
Navigation Menu:-
On the homepage we would see five options; some users may see only four, depending on which capabilities were enabled. For all five to appear, B2B Trading Partner Management should be kept enabled.
1. Discover
2. Design
3. Monitor
4. B2B Trading Partner Management
5. B2B Scenarios
Discover:-
Here we can see the standard pre-packaged content created by SAP. If a standard interface addresses our business requirement we can use it, but we cannot modify it directly; instead we have to copy or download it and move it to the Design tab to make changes.
Design:-
In the Design tab we create new packages, and imported packages are displayed here. All interface development is done here.
This is a web-based tool in CPI that allows us to design, configure, and deploy integration scenarios.
It provides a graphical interface to define iFlows, apply transformations, map messages, and set routing and processing logic.
Integration flow:-
An iFlow is a graphical representation of integration logic. It is used to model and manage integration scenarios, defining how data flows between systems.

First, open the Design tab and click Create.

Create is used to create a new package.

Enter all the details, like package name, system name, etc. After entering the required details, click Save; this creates a new package in Design.
Package:-
Grouping all similar interfaces into one folder is called a package.
After opening the package, click the Edit option to create an artifact.
Artifact:
Each object in a package is called an artifact.

Click on Artifacts, then click Add and select Integration Flow, Message Mapping, etc., whichever we need.
Monitor:-
This is a web-based tool for monitoring and troubleshooting integration processes.
It provides detailed logs and tracking capabilities for iFlows, offering visibility into message processing errors and performance issues.
It has multiple sections.
Monitor Message Processing:-
In this section we can monitor the interfaces that ran in the past one hour. Success, failed, and completed messages are shown separately in each tile. If we need to customise the duration, we can change it from 1 hour up to 24 hours as needed.
Clicking on any message in the Message Monitoring view will show you
detailed information such as:
 Message Status: Success, Error, Pending, etc.
 Log/Trace Information: This shows the steps the message has gone
through in the integration flow, including any errors encountered.
 Error Handling: If the message failed, you will typically see error
messages or exception details that can be used to debug the issue.
 Attachments: If configured, attachments related to the message can be
viewed, such as payloads or any other external files.
Logs and Trace
 View Logs for Troubleshooting:
o You can access detailed logs for each message to debug problems
or understand performance bottlenecks. This includes logging at
the level of:
 Mapping: Data transformation logs.
 Adapter: Connectivity or communication issues.
 Process Integration Flow: Logs of each step in the process.
Manage Integration Content:-
Here we can see all the integration flows that are deployed, as well as their endpoints if any exist.
Here we can also change the log configuration from Info level to Trace or Debug, etc.

Manage Security:-
In this section we can store credentials, certificates, and PGP keys, and perform connectivity checks.
1. Security Material:-
When creating integration flows (iFlows) that involve secure communication,
we need to ensure that the correct security parameters are in place. These
include:
 Authentication credentials (e.g., OAuth, Basic Authentication)
 SSL/TLS encryption for secure connections
 Public/Private Key Pairs (for example, when using SFTP, HTTPS, or other
secure protocols)
Here we can create user credentials, OAuth2 credentials, secure parameters, OAuth2 authorization codes, etc., for secure communication.
Here we can also upload known hosts (SSH).
The "Known Hosts (SSH)" concept refers to the use of Secure Shell (SSH) for secure communication with external systems, often for tasks like connecting to an SFTP server, executing commands, or transferring files over a secure channel.
In SSH, a "known host" is a system or server whose SSH fingerprint is already
registered and known by the client. When the client attempts to connect to a
remote system using SSH, the host’s public key is validated against the known
hosts file to ensure that the client is connecting to the correct server and not to
a malicious one.
This concept is primarily used to protect against man-in-the-middle attacks
where a third party might intercept or alter the communication between the
client and the server.
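To make the known-hosts idea concrete, here is a minimal Python sketch of fingerprint pinning. The host name, key material, and function names are all invented for illustration; this is not an SSH library, just the verification concept.

```python
import base64
import hashlib

def fingerprint(pubkey_bytes: bytes) -> str:
    """OpenSSH-style SHA256 fingerprint of a raw public key blob."""
    digest = hashlib.sha256(pubkey_bytes).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Hypothetical known-hosts table: host name -> pinned, trusted fingerprint
known_hosts = {}

server_key = b"ssh-rsa AAAAB3...fake-key-material"      # key the real server presents
known_hosts["sftp.example.com"] = fingerprint(server_key)  # recorded on first trusted contact

def verify(host: str, presented_key: bytes) -> bool:
    """Accept the connection only if the presented key matches the pinned one."""
    return known_hosts.get(host) == fingerprint(presented_key)

print(verify("sftp.example.com", server_key))                   # True
print(verify("sftp.example.com", b"attacker-substituted-key"))  # False
```

A mismatch is exactly the man-in-the-middle situation described above: a third party presenting its own key fails the fingerprint check.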
Manage Keystore:-
A Keystore is a container used to store cryptographic keys and certificates. It
holds:
 Private Keys (used for authentication and encryption)
 Public Keys (used to verify signatures or for encryption)
 SSL/TLS Certificates (used to establish secure communication)
 Trusted Certificates (used to trust communication with external systems)
Steps to Import Certificates and Keys:
1. In the Keystore section, click on Import.
2. Select the Certificate Type (Private Key, Public Key, or Certificate).
3. Choose the file you want to import (e.g., a .pfx, .pem, .cer, or .key file).
4. Provide a name for the certificate or key to identify it in the Keystore.
5. If importing a private key, you may need to provide a password to access
it.
6. Click Import to add the certificate/key to the Keystore.
2. Viewing Keystore Entries
Once certificates and keys are imported into the Keystore, you can view them
to ensure they have been stored correctly.
Steps to View Keystore Entries:
1. Navigate to the Keystore section in the Security settings.
2. You will see a list of all certificates and keys stored in the Keystore.
3. Click on any entry to view detailed information about the certificate or
key, such as:
o Certificate details (Issuer, Subject, Validity Period).
o Key aliases.
o Fingerprint.
o Expiry date of certificates.
3. Exporting Certificates
In some cases, you might need to export certificates for sharing or importing
them into other systems (e.g., for integrating with other platforms or services).
Steps to Export Certificates:
1. In the Keystore tab, select the certificate you want to export.
2. Click on Export.
3. Choose the format (e.g., .pem, .cer, .pfx).
4. Specify the destination where the certificate will be exported.
4. Deleting Keystore Entries
If you no longer need a certificate or key in the Keystore (e.g., an expired
certificate or outdated key), you can delete it.
Steps to Delete Keystore Entries:
1. In the Keystore section, select the certificate or key you want to delete.
2. Click Delete.
3. Confirm the deletion to remove the entry from the Keystore.
5. Updating Certificates
When certificates expire or need to be updated (e.g., for renewed certificates),
you can replace the old certificates with new ones.
Steps to Update Certificates:
1. First, delete the expired or outdated certificate from the Keystore.
2. Import the new certificate by following the import process described
earlier.
6. Exporting the Keystore
You may need to export the Keystore to back up or transfer it to another
environment (such as from development to production).
Steps to Export the Keystore:
1. Navigate to the Keystore section.
2. Click on Export.
3. Select the Keystore format for export (e.g., .pfx or .jks).
4. Provide a password for the Keystore if needed and select a location to
save the file.
PGP Keys:-
Pretty Good Privacy (PGP) is a popular encryption method used for securing
email communications and file transfers. In the context of SAP Cloud Platform
Integration (CPI), PGP is often used for encrypting and decrypting files or
messages as part of integration scenarios. PGP keys are essential for these
processes and are used for message encryption, digital signatures, and
authentication in secure communication.
We have two PGP keys
 Public Key: Used for encrypting data that can only be decrypted by the
corresponding private key.
 Private Key or Secret Key: Used for decrypting data that was encrypted with
the corresponding public key.
1. Under Keystore, import the PGP keys:
o Choose Import and select the key file (either public key or private
key).
o For PGP public key, you’ll import it so others can use it to encrypt
files/messages for you.
o For PGP private key, you’ll import it to decrypt files/messages that
were encrypted with your PGP public key.
o Enter any required passwords (e.g., passphrase for private keys).
User Roles:-
Here we add user roles to CPI.
Ex: ESBMessaging.send

Connectivity test:-
A Connectivity Test helps verify that the communication between SAP CPI and
external systems (such as APIs, SFTP servers, databases, or cloud applications)
is functioning properly and that all configurations are correct.

Manage Stores:-
Here we can read, write, and delete payloads temporarily, create variables, store queues (used by JMS adapters), and create number ranges.
Manage Locks:-
Here we can remove the locks that get created while an iFlow is being edited. The lock ensures that another person can't edit the same iFlow in parallel.

Integration Flow:-
After creating a package, open it, click Artifacts, select Add, and choose Integration Flow to create an iFlow.

Edit:
Enables the iFlow so you can make changes and deploy.
Configure:
This option is used only after completing the iFlow development. The process is called configuring the externalized parameters.
Ex:- The SFTP server in the DEV environment will differ from the one in QA, so we would have to go inside the iFlow and edit the SFTP server details. To avoid editing inside the flow, we externalise those fields; with the Configure option we then see all the externalised parameters in one window and can set them there.
Deploy:
Deploys (activates) the iFlow.
Save As Version:
Saves the changes as a separate version so the developer can go back to a previous version.

SAP CPI Standard Palette Functions:


1. Participants
First palette function group we are going to see is the Participants group. In
participants group, there are 2 palette functions:
 Sender
 Receiver
In a typical integration process, we can use multiple senders and receivers.
There are specific properties we can configure on the sender or receiver participant. Once you connect the sender or receiver to the Start event or any palette step (say, Request Reply or Send), you will be prompted to select the adapter.

2. Process:
There are 3 types of processes available:
Integration Process
Local Integration Process
Exception Sub Process.

Integration Process:

This is the main process that you will be using for all interface developments. It is mandatory to have one integration process, and an iFlow should contain only one main integration process.

Local Integration Process:


A Local Integration Process is useful when you have a complex iFlow, or an iFlow where many palette steps (say, more than 25 or 35) are involved. It is used to split the main integration process into smaller chunks of processes.
You cannot use only a local integration process in an iFlow, as having a main integration process (as discussed above) is a must. You can have any number of local integration processes in an iFlow. You call a local integration process from the main integration process via another palette function, PROCESS CALL.
In the Process Call palette, you have to select the local integration process that you want to use in the main integration process. In a local integration process, the SENDER palette cannot be used, and Start or End Message events cannot be used either. In a Local Integration Process, only the Start Event and the End Event/Error End Event/Terminate Event can be used.
Exception Sub Process:
This process is not mandatory in an iFlow. As the name suggests, it is used for capturing error messages and doing the necessary processing thereafter, such as mailing the business about the failure. Similar to the Local Integration Process, we cannot have the Sender palette or the Start Message event. For the Exception Sub Process, we specifically have the Error Start Event and the Error End Event/End Message.
Note: You cannot use Exception Sub Process separately in the artifact editor as
we do for Local Integration Process. It should be used within the main
integration flow or inside the local integration flow.
3. Events:
There are 9 events in the Events palette group:
 Start Message
 End Message
 Start Event
 End Event
 Error Start
 Error End
 Start Timer
 Terminate Message
 Escalate End Event

Start and End Message:


The Start Message and the End Message events are used when SAP Cloud
Platform Integration receives a message from a Sender and when SAP Cloud
Platform Integration sends a message to a Receiver. By default, when you
create an integration flow, the start and end message events are made
available.
Error Start/End and Escalation End:
In case of error end, the error reason is not sent back to source. The message
just fails in CPI. Whereas, in case of escalation end event, the error reason is
sent back to source and the message goes to escalated status in CPI. Error Start
and Error End can be used only within an exception sub-process.
Start and End event:
These two events can be only used in Local Integration Process.
Terminate Message:
A Terminate Message event stops further processing of a message. For
example, you have defined specific values on the payload. If the payload
doesn’t match those values, the process is terminated. The message status
displayed in the message processing log is Completed, because it has
terminated the message successfully.
If you want to show it as Error in CPI Message Monitoring logs, then instead of
Terminate Message event, replace it with any of the below events:
Use an Error End event.
Use an Escalation End Event (sets a message status to Escalated in that case).
4. Timer:

Timer Start is especially useful in scenarios where you have to go and pull data from systems or trigger web services at specified times/intervals. The usual pattern is a timer followed by a content modifier. That is because a timer does not create a payload in the pipeline; with a content modifier, you can create the request payload to be sent to the system.
In the properties section, we have three options:
Run Once – Every time you deploy the iFlow, it will get triggered once.
Schedule on Day – Useful if you want to schedule on a specific day, every 10 secs/1 min/any interval.
Schedule to Recur – Enables the user to schedule on a daily/weekly/monthly basis.
5. Connectors:
Connectors are the arrowheads that denote the message flow in any iFlow integration process. Usually we don't pick Connectors from the palette bar; instead we click on a palette function and drag the arrow to the target palette.
6. Delete:
The Delete palette is used when we have to delete a palette step inside an integration process. As with Connectors, we usually don't delete from the palette toolbar; instead, clicking on the palette step you want to delete shows a list of options, from which you click Delete.
7. Message Transformations:
In the message transformers group, we have many transformation tools/palette functions available:
1. Content Modifier
2. Converters(CSV to XML / EDI to XML / JSON to XML / XML to
JSON / XML to CSV / XML to EDI)
3. Decoders(Base64 Decoder / GZIP Decompression / MIME
Multipart Decoder / ZIP Decompression)
4. EDI Extractor
5. Encoder(Base64 Encoder / GZIP Compression / MIME Multipart
Encoder / ZIP Compression)
6. Filter
7. Message Digest
8. Script(GroovyScript / JavaScript)
9. XML Modifier
1. Content Modifier:-

This is the most important and most used palette function in iFlow development. As the name suggests, it is used to modify the content of the incoming payload. "Content" here means not only the body of the payload but also the headers and properties.
You can also store the incoming payload in a property and use it later for any purpose. For storing the incoming payload, you would use a Camel expression.
For example, I created one parameter called IdocPayload whose source value is the body of the incoming payload, provided as ${in.body}. Since it is an expression, I selected Expression as the Source Type. Expression is just one of many source types: Constant, Global Variable, Number Range, Expression, XPath, Header, Local Variable, and Property.
You can later reference IdocPayload anywhere in the iFlow by calling the property name with "property" as a prefix, e.g. ${property.IdocPayload}. These properties won't be sent to the receiver side, as they exist only within the iFlow. Headers work in a similar way to properties, the only difference being that headers will be sent to the target system.
Whatever you are passing in the Message body section of the content modifier
will be sent to the next step of the IFlow, irrespective of the incoming payload
from source system.
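The body/headers/properties model described above can be sketched in a few lines of Python. This is an illustrative model of the Camel-style exchange that CPI builds on, not a CPI API; all names here are made up.

```python
# A message "exchange" as a Content Modifier sees it: body, headers, properties.
exchange = {
    "body": "<Order><Id>42</Id></Order>",  # incoming payload
    "headers": {},      # sent onward to the receiver system
    "properties": {},   # internal to the iFlow, never sent to the receiver
}

# Step 1: store the incoming payload in a property (like ${in.body} -> IdocPayload)
exchange["properties"]["IdocPayload"] = exchange["body"]

# Step 2: replace the body, as the Message Body tab of a Content Modifier would
exchange["body"] = "<Lookup><OrderId>42</OrderId></Lookup>"

# Step 3: later in the flow, reference the saved payload via ${property.IdocPayload}
def resolve(expr: str, exch: dict) -> str:
    """Tiny resolver for ${property.X} / ${header.X} style expressions."""
    if expr.startswith("${property."):
        return exch["properties"][expr[len("${property."):-1]]
    if expr.startswith("${header."):
        return exch["headers"][expr[len("${header."):-1]]
    return expr

print(resolve("${property.IdocPayload}", exchange))  # original payload, still available
```

The point of the sketch: overwriting the body does not lose the original payload, because the property holds a copy for the rest of the flow.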

2. Converters:
Converters are components used to convert message data from one format to
another. Converters are essential when you need to integrate systems that use
different data formats.
Converters in CPI are typically used to transform data into a desired format
before processing, or before sending data to another system. These are
especially important when your integration scenarios involve systems that
expect different data formats or encoding standards.
We have six different types of converters in CPI:
1. XML to JSON Converter:
Converts data in XML format to JSON format.
We may have an inbound message in XML format but the target system expects
a JSON payload. This converter helps in transforming the data before sending it.
2. JSON to XML Converter:
Converts data from JSON format to XML format.
Similar to the XML to JSON converter, this is used when you receive a JSON
payload but need to send the data in XML format to the target system.
3. XML to CSV Converter:
Converts XML data into CSV (Comma-Separated Values) format.
When your integration involves systems that expect data in CSV format (e.g.,
some external file systems or reporting tools), this converter is used to
transform the XML into CSV.
4. CSV to XML Converter:
Converts CSV data into XML format.
Similar to XML to CSV, this is used when the message is in CSV format but needs to be converted into XML for consumption by another system.
5. EDI to XML Converter
Converts EDI (Electronic Data Interchange) data into XML format.
Some common EDI standards are EDIFACT, ANSI X12, and TRADACOMS.
6. XML to EDI Converter
Converts XML into EDI (Electronic Data Interchange) format.
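As a rough illustration of what a converter step does, here is a minimal XML-to-JSON conversion using only the Python standard library. CPI's own converter has many more options (array handling, namespaces); this just shows the core idea, and the sample payload is invented.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_text: str) -> str:
    """Convert a simple XML document to a JSON string (leaf text only)."""
    def to_dict(elem):
        children = list(elem)
        if not children:
            return elem.text          # leaf element: keep its text
        return {child.tag: to_dict(child) for child in children}
    root = ET.fromstring(xml_text)
    return json.dumps({root.tag: to_dict(root)})

payload = "<Customer><Name>Alfreds Futterkiste</Name><City>Berlin</City></Customer>"
print(xml_to_json(payload))
# {"Customer": {"Name": "Alfreds Futterkiste", "City": "Berlin"}}
```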
3. Decoders:
This palette can be used to decode message content received over the network. No configuration needs to be done in CPI; just add it to your iFlow and the rest takes care of itself.
We have various decoder provided:
Base64 Decode: Decodes base64-encoded message content.
GZIP Decompress: Decompresses the message content using GNU zip (GZIP).
ZIP Decompress: Decompresses the message content using zip.
MIME Multipart Decode: Transforms a MIME multipart message into a message with attachments. If the multipart headers are part of the message body, select Multipart Headers Inline. If this option is not selected and the Content-Type Camel header is not set to a multipart type, no decoding takes place.
If this option is selected and the body of the message is not a MIME multipart (with MIME headers in the body), the message is handled as a MIME comment and the body is empty afterwards.
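The decoder steps can be pictured with the Python standard library: a payload arrives compressed and base64-encoded, and the decode steps simply reverse that. This illustrates the concept only; it is not CPI code.

```python
import base64
import gzip

original = b"<Order><Id>42</Id></Order>"

# A sender might transmit the payload compressed and then base64-encoded...
wire_format = base64.b64encode(gzip.compress(original))

# ...and the GZIP Decompress + Base64 Decode steps reverse that, no configuration needed:
decoded = gzip.decompress(base64.b64decode(wire_format))
print(decoded == original)  # True
```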
4. Encoders:
Like the decoders, but used to encode the data.
5. EDI Extractor:
The EDI Extractor enables you to extract EDI headers and transfer them to Camel headers. This element extracts data from a single incoming EDI document and adds it to the exchange so that this information can be used further in message processing. The EDI Extractor can read both flat-file and XML formats.
6. Filter:
Filter is a mechanism used to selectively process or route messages based on
certain conditions. Filters help in processing only specific messages that meet
predefined criteria, which can improve performance, reduce unnecessary
processing, and ensure the right data is being handled at each step of the
integration flow.
Filters in SAP CPI can be applied in various parts of an iFlow (Integration Flow)
to refine how messages are handled. These filters are commonly used to:
1. Route messages to different processing steps or systems based on
specific conditions.
2. Transform or modify the message content based on filtering criteria.
3. Conditionally stop or skip processing certain messages.
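A filter's effect can be sketched with an XPath-style selection in Python's ElementTree; the payload and condition are invented for illustration.

```python
import xml.etree.ElementTree as ET

payload = """<Orders>
  <Order><Id>1</Id><Status>OPEN</Status></Order>
  <Order><Id>2</Id><Status>CLOSED</Status></Order>
  <Order><Id>3</Id><Status>OPEN</Status></Order>
</Orders>"""

root = ET.fromstring(payload)
# Keep only the orders whose Status child element is OPEN
open_orders = root.findall(".//Order[Status='OPEN']")
print([o.findtext("Id") for o in open_orders])  # ['1', '3']
```

Only the matching records continue down the flow; everything else is dropped before any further processing.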
7. Message Digest:
A Message Digest is a cryptographic hash function that generates a fixed-size
string (or value) that uniquely represents the content of a message. It is used to
ensure data integrity, verify authenticity, and protect data during transmission.
The key feature of a message digest is that even the smallest change in the
original data will result in a completely different output hash, which makes it
useful for detecting data tampering.
In SAP Cloud Platform Integration (CPI), the Message Digest is primarily used
to ensure data integrity, verify authenticity, and create digital signatures. This
is particularly useful when integrating different systems, ensuring that the data
hasn't been tampered with during transmission. In CPI, the message digest
(generated by cryptographic hash functions like MD5, SHA-1, or SHA-256) can
be used in various integration scenarios for secure and trusted data exchanges.
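The tamper-detection property described above is easy to demonstrate with Python's hashlib; the sample payload is invented.

```python
import hashlib

msg = b"<Invoice><Amount>100.00</Amount></Invoice>"
tampered = b"<Invoice><Amount>100.01</Amount></Invoice>"  # one character changed

d1 = hashlib.sha256(msg).hexdigest()
d2 = hashlib.sha256(tampered).hexdigest()

print(d1 != d2)          # True: the smallest change gives a completely different hash
print(len(d1), len(d2))  # 64 64: the digest length is fixed regardless of input size
```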
8. Script:
In PI/PO, we would use Java for complex mapping scenarios. In CPI, we have Groovy and JavaScript, Groovy being the predominant one used in the CPI community.
10. XML Validator:
In SAP Cloud Platform Integration (CPI), the XML Validator is used to validate
XML messages against an XML Schema Definition (XSD) or other structural
rules. This validation ensures that the XML message conforms to a predefined
structure and meets the required business rules before being processed further
in the integration flow (iFlow).
Call:
We have two types of calls in CPI:
1. External Call
2. Local Call
1. External Call:
Palette functions in the External Call group are used for communicating with external/backend systems through any adapter, such as OData, HTTP, etc. There are different types of external calls provided by CPI:
Content Enricher
Poll Enrich
Request Reply
Send
1. Content Enricher:
The content enricher allows us to concatenate the incoming payload with the response payload of the receiver system, such as an OData API response. This feature enables you to make external calls during the course of an integration process to obtain additional data.
A content enricher alone won't work; it must be connected to the receiver system. Typically, the arrow is connected from the Request Reply palette to the receiver system, but with a content enricher the arrow connection is from the receiver system to the content enricher.
There are two aggregation algorithms in the properties section.
COMBINE:
The Combine algorithm has no special behaviour: it just concatenates the incoming payload with the response of the receiver system.
Source Payload:
<module>
<module_shortname>CPI</module_shortname>
<module_fullname>Cloud Platform Integration</module_fullname>
</module>
OData API Response:
<Customers>
<Customer>
<CompanyName>Alfreds Futterkiste</CompanyName>
<Address>Obere Str. 57</Address>
<Region/>
<PostalCode>12209</PostalCode>
<CustomerID>ALFKI</CustomerID>
<City>Berlin</City>
<ContactName>Maria Anders</ContactName>
<ContactTitle>Sales Representative</ContactTitle>
</Customer>
</Customers>
Content Enricher Output:
<?xml version='1.0' encoding='UTF-8'?>
<multimap:Messages xmlns:multimap="http://sap.com/xi/XI/SplitAndMerge">
<multimap:Message1>
<module>
<module_shortname>CPI</module_shortname>
<module_fullname>Cloud Platform Integration</module_fullname>
</module>
</multimap:Message1>
<multimap:Message2>
<Customers>
<Customer>
<CompanyName>Alfreds Futterkiste</CompanyName>
<Address>Obere Str. 57</Address>
<Region/>
<PostalCode>12209</PostalCode>
<CustomerID>ALFKI</CustomerID>
<City>Berlin</City>
<ContactName>Maria Anders</ContactName>
<ContactTitle>Sales Representative</ContactTitle>
</Customer>
</Customers>
</multimap:Message2>
</multimap:Messages>

ENRICH:
The Enrich algorithm allows us to combine the two payloads based on key elements in the XML, converting the two separate messages into a single enhanced payload. You have to provide four parameters for the enrich algorithm, covering the original message (the incoming payload to the content enricher) and the lookup message (the receiver response connected to the content enricher).
Note:
The key element's value should be the same in both the original and the lookup message, so that the concatenation happens as expected.
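The difference from COMBINE can be sketched in Python: ENRICH merges the lookup message into the original by matching a key element rather than concatenating the two payloads. The element names here are invented for illustration.

```python
import xml.etree.ElementTree as ET

original = ET.fromstring(
    "<Orders><Order><CustomerID>ALFKI</CustomerID><Qty>5</Qty></Order></Orders>")
lookup = ET.fromstring(
    "<Customers><Customer><CustomerID>ALFKI</CustomerID>"
    "<City>Berlin</City></Customer></Customers>")

# Index the lookup message by its key element
by_key = {c.findtext("CustomerID"): c for c in lookup.findall("Customer")}

# Enrich each original record whose key matches a lookup record
for order in original.findall("Order"):
    match = by_key.get(order.findtext("CustomerID"))
    if match is not None:
        order.append(match.find("City"))  # copy the enrichment field across

print(ET.tostring(original, encoding="unicode"))
```

Because the key value (ALFKI) matches in both messages, the City element ends up inside the matching Order; a non-matching key would leave the record unenriched.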

2. Poll Enrich:
In SAP Cloud Platform Integration (CPI), Poll Enrich refers to the process of
polling data from an external system and enriching the message with that data
before continuing the integration flow. The goal of Poll Enrich is to pull relevant
information from external sources (e.g., databases, APIs, or files) and enhance
the message being processed.
The Poll Enrich pattern is commonly used when you need to retrieve external
data (from systems like SAP or non-SAP systems) and integrate it into your
message flow. This is particularly useful in scenarios where the source system
contains the necessary context or additional information that must be included
in the message before it reaches the target system.
3. Request Reply:
Unlike the content enricher, request reply does not concatenate the incoming payload with the lookup message; instead, only the response is forwarded to the next palette function. The original/source message is no longer available after the request reply step. Request reply is synchronous by nature.

Whatever the incoming payload is, it will be replaced by the response generated by the OData lookup step.

4. Send:
The Send palette function can be used to configure a service call to a receiver system for scenarios and adapters where no reply is expected. The Send palette is not supported by the OData adapter, as the OData adapter is synchronous in nature.
Compatible adapters for Send palette:
• AS2 adapter
• FTP adapter
• JMS adapter
• Mail adapter
• SOAP SAP RM adapter
• SFTP adapter
• XI adapter (Quality of Service "Exactly Once")
Mostly we will use Request Reply in project iFlows, and in some cases the content enricher. Send is not used often, but it depends on the business requirement.
2. Local Call:
Local call palettes are used to communicate within the integration process window; they are not used for external system communication.

1. Process Call:
Process Call is used when there is a local integration process in the iFlow. Assume you have included a local integration process and have to call it in the main integration process; in this case, we can use the Process Call palette.

In the Process Call palette properties, select the local integration process (the system will list all the local integration processes available in that iFlow).
This process call runs only once.

2. Looping Process Call:


If you want to execute the local integration process multiple times, or based
on a condition, then a Looping Process Call can be used.
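The looping behaviour can be sketched as follows (a simplified model; in CPI the loop condition is an XPath or expression evaluated against the message, and a maximum number of iterations guards against endless loops):

```python
def looping_process_call(local_process, condition, message, max_iterations=99):
    # re-invokes the local integration process while the condition holds true
    iterations = 0
    while condition(message) and iterations < max_iterations:
        message = local_process(message)
        iterations += 1
    return message

# hypothetical example: keep trimming records until fewer than 3 remain
result = looping_process_call(
    local_process=lambda msg: {"records": msg["records"][3:]},
    condition=lambda msg: len(msg["records"]) >= 3,
    message={"records": list(range(7))},
)
print(result)  # {'records': [6]}
```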

3. Idempotent Process Call:

Execute a process call step to check whether an incoming message was already
processed, and skip the processing of this message if so.
For example: if an order should be executed only once and must not be
processed again upon retry, the Idempotent Process Call is the right fit. It
detects whether the value you provide in the Message ID field has already been
successfully processed and stores the status of the successful run in the
idempotent repository. If a duplicate is found, the message is marked as a
duplicate. If you have a receiver system (say, a third-party system) that can't
handle duplicate messages properly, you can call the receiver system from
within an Idempotent Process Call.
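The duplicate check can be sketched like this (a minimal stand-in: a Python set plays the role of the idempotent repository, and the message ID is recorded only after successful processing):

```python
processed_ids = set()  # stands in for the idempotent repository

def idempotent_process_call(message_id, process):
    # skip processing if this message ID was already handled successfully
    if message_id in processed_ids:
        return "DUPLICATE"
    result = process()
    processed_ids.add(message_id)  # record only after successful processing
    return result

print(idempotent_process_call("ORDER-1001", lambda: "PROCESSED"))  # PROCESSED
print(idempotent_process_call("ORDER-1001", lambda: "PROCESSED"))  # DUPLICATE
```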

Routing:
Routing consists of several palette functions whose primary role is to combine
incoming messages, or to split or route them based on a condition (an XPath or
an expression). There are several routing palette functions:

1. Aggregator:
The Aggregator is used to combine incoming chunks of messages into a single
message, based on a correlation expression. Until a further message is
received, the earlier messages are stored in a data store. The Aggregator
supports XML only.
There are two aggregation algorithms available:
Combine: Combines the incoming messages without any guarantee on their order.
Combine in Sequence: Combines the incoming messages in exact order.
Properties:
Correlation Expression (XPath): XPath expression that identifies the element
based on which the incoming messages are correlated/combined.
Incoming Format: XML.
Aggregation Algorithm: Combine or Combine in Sequence.
Message Sequence Expression: Field that determines the sequence of the
messages. The field should contain numbers so that the sequence can be
determined.
Last Message Condition: Condition that defines until when incoming messages
are aggregated/combined.
Data Store Name: By default – Aggregator-1.
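The Combine in Sequence algorithm can be sketched as follows (a simplified model: the envelope element name "messages" and the "seq" sequence field are assumptions for illustration; CPI's actual envelope depends on the step configuration):

```python
import xml.etree.ElementTree as ET

def combine_in_sequence(chunks, sequence_xpath="seq"):
    # sort incoming chunks by the numeric sequence field, then envelope them
    roots = sorted((ET.fromstring(c) for c in chunks),
                   key=lambda r: int(r.findtext(sequence_xpath)))
    envelope = ET.Element("messages")  # assumption: simplified envelope name
    envelope.extend(roots)
    return ET.tostring(envelope, encoding="unicode")

chunks = [
    "<item><seq>2</seq><line>B</line></item>",
    "<item><seq>1</seq><line>A</line></item>",
]
print(combine_in_sequence(chunks))  # line A comes before line B
```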

2. Gather and Join:


Here we cover two palette functions – Gather and Join.
Join – Used to merge multiple routes into one route. It has no
configuration/properties.
Gather – Used to concatenate the messages of all the routes.
If we have placed a Router/Multicast step with 2-3 routes, we first use JOIN
to merge all the routes, and then GATHER to concatenate all the payloads into
one single payload.
Gather can combine XML messages of the same or different formats, plain-text
messages, or any other format.
If we are using a Splitter, we only need the Gather function and not Join,
as there are no multiple routes.
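The concatenation that Gather performs on XML routes can be sketched like this (the envelope tag "gathered" is purely a hypothetical name for illustration; the actual combined-payload structure depends on the Gather step's configuration):

```python
import xml.etree.ElementTree as ET

def gather(route_payloads, envelope_tag="gathered"):
    # Join merges the routes; Gather then concatenates all payloads into
    # one single message under a common envelope
    root = ET.Element(envelope_tag)  # hypothetical envelope name
    for payload in route_payloads:
        root.append(ET.fromstring(payload))
    return ET.tostring(root, encoding="unicode")

print(gather(["<route1>A</route1>", "<route2>B</route2>"]))
# <gathered><route1>A</route1><route2>B</route2></gathered>
```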

3. MultiCast:

Multicast allows us to route the incoming message to different routes,
similar to the Router, but NOT based on a condition. There are two variants:
Parallel Multicast and Sequential Multicast.
Parallel Multicast – Passes the incoming message to all branches
simultaneously; the order in which the branches execute and complete is not
guaranteed.
Sequential Multicast – Branch order matters here: the branches are executed
one after another in the configured order, and if a branch fails, the
subsequent branches are not executed.
If you want to change the branch priority, select a branch and click the
Move Up or Move Down button.
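The stop-on-failure behaviour of Sequential Multicast can be sketched as (a simplified model; branch names are hypothetical):

```python
def sequential_multicast(message, branches):
    # branches run strictly in configured order; a failure in one branch
    # prevents all subsequent branches from executing
    executed = []
    for name, step in branches:
        step(message)            # an exception here aborts the remaining branches
        executed.append(name)
    return executed

def failing_branch(_message):
    raise RuntimeError("branch 2 failed")

try:
    sequential_multicast("payload",
                         [("b1", print), ("b2", failing_branch), ("b3", print)])
except RuntimeError as err:
    print(err)  # branch b3 never ran
```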
4. Router:
The Router is used to route the incoming message along different routes based
on a condition (XPath or expression).
The messages need not be combined after routing; each route can also be
handled individually.
A Router must have exactly ONE default route, which is executed when no
other route's condition is satisfied.
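The routing logic can be sketched as follows (the "country" element and route names are hypothetical; in CPI each condition would be an XPath such as /order/country = 'US' configured on a route):

```python
import xml.etree.ElementTree as ET

def route(xml_text):
    # condition routes are checked in order; the mandatory default route
    # handles every message that no other condition matches
    root = ET.fromstring(xml_text)
    if root.findtext("country") == "US":   # e.g. XPath /order/country = 'US'
        return "US route"
    if root.findtext("country") == "DE":
        return "DE route"
    return "default route"

print(route("<order><country>DE</country></order>"))  # DE route
print(route("<order><country>FR</country></order>"))  # default route
```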

5. Splitter:

As the name denotes, a splitter is used to split a message into smaller chunks
of messages based on the XPath provided. There are 7 types of splitters:
1. EDI Splitter
2. General Splitter
3. Iterating Splitter
4. IDOC Splitter
5. PKCS#7/CMS Splitter
6. Tar Splitter
7. Zip Splitter

General Splitter:
The General Splitter splits the incoming message into N individual messages,
each enveloped by the parent element.
Iterating Splitter:
The Iterating Splitter also splits the incoming message into N individual
messages, but unlike the General Splitter, no parent envelope surrounds each
message.
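The difference between the two splitters can be sketched with a small example (hypothetical "orders" payload; CPI performs the split based on the XPath configured in the step):

```python
import xml.etree.ElementTree as ET

SOURCE = "<orders><order>1</order><order>2</order></orders>"

def general_split(xml_text, child_tag="order"):
    # General Splitter: every chunk keeps the parent envelope
    root = ET.fromstring(xml_text)
    chunks = []
    for child in root.findall(child_tag):
        envelope = ET.Element(root.tag)
        envelope.append(child)
        chunks.append(ET.tostring(envelope, encoding="unicode"))
    return chunks

def iterating_split(xml_text, child_tag="order"):
    # Iterating Splitter: bare child elements, no envelope
    root = ET.fromstring(xml_text)
    return [ET.tostring(c, encoding="unicode") for c in root.findall(child_tag)]

print(general_split(SOURCE))    # ['<orders><order>1</order></orders>', ...]
print(iterating_split(SOURCE))  # ['<order>1</order>', '<order>2</order>']
```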

Persistence:
1. Persisting Message
Store a message so that you can access it and analyze it at a later point.
At any point in an integration flow, you can add the Persist palette function
to store the message. The message storage feature is useful for auditing
purposes. This component stores data on your tenant; note that there is an
overall disk space limit of 32 GB.
The limitation is that, unlike data store operations, you cannot view the
persisted messages in any GUI. Instead, you have to use the Cloud Integration
OData API to access them. There is also no way to access the contents of
persisted messages during the execution of the integration flow, only after it
has completed.
You need to make an OData API call to access the persisted data, using the
following URL format:
https://<Cloud Integration host>/api/v1/MessageProcessingLogs('<Message
ID>')/MessageStoreEntries
Properties:
Provide a unique Step ID. This step ID appears as MessageStoreId in the
OData API response. You also have the option to encrypt the stored message.
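A small helper for building that URL might look like this (the host and message ID below are placeholders; the real call also requires authentication against the tenant):

```python
def message_store_url(host, message_id):
    # builds the OData URL for reading the persisted entries of one message
    return (f"https://{host}/api/v1/"
            f"MessageProcessingLogs('{message_id}')/MessageStoreEntries")

# hypothetical tenant host and message ID
print(message_store_url("my-tenant.it-cpi.example.com", "12345"))
```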

2. Data Store Operations


You might come across a situation where the message payload of one interface,
after successful processing, needs to be reused in the same interface or in
another interface. To handle such requirements, SAP CPI provides the Data
Store Operations feature, whose various steps can be used in iflows for
storing, reading, and deleting the message payload of an interface.
1. DataStore – Write:

I have built a simple iflow with an HTTPS sender that receives an XML body and
passes it to a content modifier, which just captures the payload body and
hands it over to the next step. Then we have the Write palette function.
Data Store Name – Name of the data store. You can also define the data store
name dynamically from a header or property value: ${header.headername} or
${property.propertyname}. The maximum length is 40 characters.
Entry ID – An entry ID that is stored together with the message content. The
maximum length is 255 characters.
Visibility – If you want to access the entry from another IFlow, keep it as
Global; otherwise, Integration Flow.
Retention Threshold for Alerting – e.g. 2 days; once this period is exceeded,
the system raises an alert.
Expiration Period – e.g. 30 days, after which the system removes the entry
from the data store.
Encrypt Stored Message – Encrypts the stored content.
Overwrite Existing Message – If the same data store name and entry ID are
written again, the message content is overwritten on each flow execution.
Include Message Headers – Saves the message headers along with the payload.
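The dynamic ${header.x} / ${property.y} syntax can be sketched with a small resolver (a simplified model of how CPI substitutes such placeholders; the header name "storeName" is hypothetical):

```python
import re

def resolve_placeholders(expression, headers, properties):
    # resolves ${header.x} and ${property.y} tokens, as CPI does for
    # dynamically configured fields such as the data store name
    def replace(match):
        scope, name = match.groups()
        source = headers if scope == "header" else properties
        return str(source.get(name, ""))
    return re.sub(r"\$\{(header|property)\.(\w+)\}", replace, expression)

print(resolve_placeholders("${header.storeName}", {"storeName": "Orders"}, {}))
# Orders
print(resolve_placeholders("DS_${property.region}", {}, {"region": "EU"}))
# DS_EU
```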

2. Data Store – Get

The Data Store Get operation retrieves data that has previously been stored in
a data store during the execution of an integration flow.
Data Store Name – Name of the data store you want to fetch from.
Entry ID – The entry ID that, together with the data store name, identifies
the entry.
Visibility – Whether to search within the integration flow only, or across
the globally visible data stores.
Delete on Completion – Enable this if the entry must be deleted after it has
been read.
Throw Exception on Missing Entry – If enabled, an exception is thrown when no
entry is found.
3. Data Store – Select:

If there are multiple data store entries and you want to fetch all messages
that share a particular data store name, SELECT can be used. To fetch one
particular record by the combination of data store name and entry ID, use GET
instead. In the Select operation, we do not provide an Entry ID.

4. Data Store – Delete:

Delete removes data store entries. If there are multiple entry IDs, as in our
example, the Entry ID must be specified along with the data store name.
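The four data store operations can be modelled together in a short sketch (a Python dict stands in for the tenant's data store; names and payloads are hypothetical):

```python
store = {}  # (data store name, entry ID) -> payload

def ds_write(name, entry_id, payload, overwrite=True):
    if not overwrite and (name, entry_id) in store:
        raise ValueError("entry already exists")
    store[(name, entry_id)] = payload

def ds_get(name, entry_id, delete_on_completion=False):
    if (name, entry_id) not in store:
        raise KeyError("no entry found")   # 'Throw Exception on Missing Entry'
    payload = store[(name, entry_id)]
    if delete_on_completion:
        del store[(name, entry_id)]        # 'Delete on Completion'
    return payload

def ds_select(name):
    # Select: all entries sharing a data store name, no entry ID needed
    return [p for (n, _), p in store.items() if n == name]

def ds_delete(name, entry_id):
    store.pop((name, entry_id), None)

ds_write("Orders", "1001", "<order>A</order>")
ds_write("Orders", "1002", "<order>B</order>")
print(ds_select("Orders"))        # both entries
print(ds_get("Orders", "1001"))   # <order>A</order>
ds_delete("Orders", "1001")
```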

3. Write Variables

Variables have their usual meaning here: you can store data in them and
change it at any time. You define variables to share data across different
integration flows. Once a variable is written to the variable store and marked
as Global, it can be accessed from other IFlows.
We use the Write Variables step type to create a variable at a certain point
within the message processing sequence. To consume the variable (either in
another step of the same integration flow or in another integration flow),
read it into a header or property, for example with a Content Modifier.

Security:
1. Decryptor:

The Decryptor is used to decrypt encrypted messages or data within an
integration flow. When sensitive information is transmitted (for example,
passwords, credit card details, or other private data), encryption is often
used to secure it. The Decryptor step then decrypts that encrypted content so
that the data can be processed, logged, or transmitted further within the
integration.

2. Encryptor:
The Encryptor step is used to encrypt data before sending it to another system.
Encryption is a security measure that ensures the confidentiality and integrity
of sensitive information, preventing unauthorized access during transmission.
The Encryptor in SAP CPI can encrypt messages or files, using encryption
algorithms like PGP (Pretty Good Privacy) or AES (Advanced Encryption
Standard), depending on the use case.

3. Signer
The Signer step is used to digitally sign a message or file. This operation ensures the
integrity and authenticity of the data. When data is signed, it provides a way to
verify that the data has not been tampered with during transmission and that it
originates from a trusted source.

4. Verifier
The Verifier step is used to verify the digital signature applied to a message or file.
It checks that the received data has not been tampered with and confirms that
the sender is authentic.
The Verifier uses the public key corresponding to the private key that was
used to sign the message. This step is essential for ensuring data integrity and
authenticity when receiving signed messages or documents.
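The sign-then-verify integrity check can be sketched as follows. Note the assumption: CPI's Signer/Verifier use asymmetric key pairs (e.g. PKCS#7 or PGP), whereas this sketch uses a symmetric HMAC purely as a compact stand-in for the same "recompute and compare" idea:

```python
import hashlib
import hmac

KEY = b"demo-key"  # stand-in for the signing key; CPI uses a private/public key pair

def sign(payload: bytes) -> str:
    # Signer: produce a digest bound to both the key and the payload
    return hmac.new(KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    # Verifier: recompute and compare; any tampering changes the digest
    return hmac.compare_digest(sign(payload), signature)

msg = b"<invoice>100.00</invoice>"
sig = sign(msg)
print(verify(msg, sig))                           # True
print(verify(b"<invoice>999.00</invoice>", sig))  # False: tampered payload
```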

MAPPINGS:

SAP Cloud Platform Integration (SAP CPI) provides a message mapping feature.
As in PI/PO, message mapping comes with a graphical editor and the same
graphical mapping functions. CPI provides several mapping palette functions
for different use cases:
– Message Mapping
– ID Mapping
– Operation Mapping
– XSLT Mapping

1. Message Mapping:

There are two options to create a message mapping: it can be an artifact in
its own right, or it can be created inside an IFlow.
Points to know:
1. Before starting to work on the mapping operations, upload the XSD schemas
of the source and target messages via IFlow -> Resources -> Add -> Schema ->
XSD.
2. Instead of testing the mapping end to end, you can use the simulate
operation: upload a test XML file and click SIMULATE -> TEST to view the
output.

These are the standard mapping functions provided by SAP in the graphical
editor tool. There is also a provision to embed Groovy code, available at the
top of the mapping editor, if the standard functions do not fit the
requirements.

Arithmetic – add, subtract, equals (number), absolute, sqrt, square, sign, neg,
inv, power, lesser, greater, multiply, divide, max, min, ceil, floor, round, counter,
formatNumber.
Boolean – and, or, not, equals(boolean), notEquals, if, ifS, ifWithoutElse,
ifSWithoutElse, isNil.
Constant – constant, copyValue, xsi:nil.
Conversions – fixValues, ValueMapping.
Date – currentDate, dateTrans, dateBefore, dateAfter, compareDates.
Node functions – createIf, removeContexts, replaceValue, exists, splitByValue,
collapseContexts, useOneAsMany, sort, sortByKey, mapWithDefault,
formatByExample.
Statistic – sum, average, count, index.
Text – concat, substring, equals(string), indexOf(2), indexOf(3), lastIndexOf(2),
lastIndexOf(3), compare, replaceString, length, endsWith, startsWith(2),
startsWith(3), toUpperCase, toLowerCase, trim.
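A few of these functions can be approximated in Python to show what they compute (rough equivalents only; formatNumber is simplified to fixed decimal places instead of a Java number pattern, and the lookup table for fixValues is hypothetical):

```python
def format_number(value, decimals=2):
    # formatNumber (simplified: fixed decimal places, not a Java pattern)
    return f"{float(value):.{decimals}f}"

def fix_values(value, table, default=""):
    # fixValues: key/value lookup with a default
    return table.get(value, default)

def if_without_else(condition, then_value):
    # ifWithoutElse: the target node is suppressed (None) if the condition fails
    return then_value if condition else None

print(format_number("3.14159"))                       # 3.14
print(fix_values("01", {"01": "NEW", "02": "OPEN"}))  # NEW
```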
