SAP CPI
Traditional IT Model:-
In the traditional IT model, the organisation has to manage everything itself: data, runtime, middleware, servers, storage, and so on. Maintaining all of this is difficult and expensive, which is why many companies are now looking at IaaS, PaaS, and SaaS.
Infrastructure as a Service (IaaS):-
In IaaS, the provider manages the infrastructure (servers, storage, networking); the remaining layers such as applications, data, runtime, and middleware have to be managed by the customer.
This is ideal for businesses or developers who need to build custom applications and manage their own software stack but don't want to physically maintain hardware.
Ex:- AWS, Microsoft Azure, Google Cloud Platform.
Platform as a Service (PaaS):-
Here we have to manage only the applications and data; the provider handles the remaining infrastructure and platform maintenance (runtime, middleware, networking, and so on).
This is ideal for developers who want to focus on application development without managing the underlying infrastructure or platform.
Ex:- Heroku, Google App Engine, Azure App Services.
Software as a Service (SaaS):-
Here everything is managed by the provider; the user just interacts with the software.
This is ideal for end users and businesses who need ready-to-use software solutions without worrying about maintenance and infrastructure.
Ex:- Google Workspace (Gmail, Docs), Microsoft 365, Salesforce, etc.
To create a package, we have to enter all the details like Package Name, System Name, etc. After entering the required details, click on Save; this creates a new package in the Design section.
Package:-
Grouping all similar interfaces into one folder is called a package.
After opening the package, click on the Edit option to create an artifact.
Artifact:
Each object in a package is called an artifact.
Click on Artifacts, then click on Add and select Integration Flow, Message Mapping, etc., whichever we need.
Monitor:-
It is a web-based tool that allows monitoring and troubleshooting of integration processes.
It provides detailed logs and tracking capabilities for iFlows, offering visibility into message processing errors and performance issues.
It has multiple sections.
Monitor Message Processing:-
In this section we are able to monitor the interfaces that ran in the past one hour. We can see successful, failed, and completed messages separately in each tile. If we need to customise the duration, we can change it from 1 hour up to 24 hours as per need.
Clicking on any message in the Message Monitoring view will show you
detailed information such as:
Message Status: Success, Error, Pending, etc.
Log/Trace Information: This shows the steps the message has gone
through in the integration flow, including any errors encountered.
Error Handling: If the message failed, you will typically see error
messages or exception details that can be used to debug the issue.
Attachments: If configured, attachments related to the message can be
viewed, such as payloads or any other external files.
Logs and Trace
View Logs for Troubleshooting:
o You can access detailed logs for each message to debug problems
or understand performance bottlenecks. This includes logging at
the level of:
Mapping: Data transformation logs.
Adapter: Connectivity or communication issues.
Process Integration Flow: Logs of each step in the process.
Manage Integration Content:-
Here we are able to see all the integration flows that are deployed. We can also see the endpoints, if they exist.
Here we can change the log configuration from Info level to Trace, Debug, etc.
Manage Security:-
In this section we are able to store credentials, certificates, and PGP keys, and perform connectivity checks.
1. Security Material:-
When creating integration flows (iFlows) that involve secure communication,
we need to ensure that the correct security parameters are in place. These
include:
Authentication credentials (e.g., OAuth, Basic Authentication)
SSL/TLS encryption for secure connections
Public/Private Key Pairs (for example, when using SFTP, HTTPS, or other
secure protocols)
Here we can create user credentials, OAuth2 credentials, secure parameters, OAuth2 authorization codes, etc., for secure communication.
Here we can also upload the known hosts (SSH).
The "Known Hosts (SSH)" concept refers to the use of Secure Shell (SSH) for secure communication with external systems, often for tasks like connecting to an SFTP server, executing commands, or transferring files over a secure channel.
In SSH, a "known host" is a system or server whose SSH fingerprint is already
registered and known by the client. When the client attempts to connect to a
remote system using SSH, the host’s public key is validated against the known
hosts file to ensure that the client is connecting to the correct server and not to
a malicious one.
This concept is primarily used to protect against man-in-the-middle attacks
where a third party might intercept or alter the communication between the
client and the server.
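The known-hosts check described above can be sketched in a few lines. This is a minimal Python illustration of the idea, not a real SSH client; the host name, key bytes, and pinned fingerprint are hypothetical (a real client parses the OpenSSH known_hosts file):

```python
import base64
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Compute an OpenSSH-style SHA-256 fingerprint of a public key."""
    digest = hashlib.sha256(public_key_bytes).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Hypothetical known-hosts store: host name -> pinned key fingerprint.
KNOWN_HOSTS = {
    "sftp.example.com": fingerprint(b"server-public-key-bytes"),
}

def is_known_host(host: str, presented_key: bytes) -> bool:
    """Accept the connection only if the presented key matches the pin."""
    expected = KNOWN_HOSTS.get(host)
    return expected is not None and expected == fingerprint(presented_key)

# The genuine server passes; a man-in-the-middle with a different key fails.
assert is_known_host("sftp.example.com", b"server-public-key-bytes")
assert not is_known_host("sftp.example.com", b"attacker-key-bytes")
```

The point of the pin is exactly the man-in-the-middle protection described above: a different key produces a different fingerprint, so the connection is refused.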
Manage Keystore:-
A Keystore is a container used to store cryptographic keys and certificates. It
holds:
Private Keys (used for authentication and encryption)
Public Keys (used to verify signatures or for encryption)
SSL/TLS Certificates (used to establish secure communication)
Trusted Certificates (used to trust communication with external systems)
Steps to Import Certificates and Keys:
1. In the Keystore section, click on Import.
2. Select the Certificate Type (Private Key, Public Key, or Certificate).
3. Choose the file you want to import (e.g., a .pfx, .pem, .cer, or .key file).
4. Provide a name for the certificate or key to identify it in the Keystore.
5. If importing a private key, you may need to provide a password to access
it.
6. Click Import to add the certificate/key to the Keystore.
2. Viewing Keystore Entries
Once certificates and keys are imported into the Keystore, you can view them
to ensure they have been stored correctly.
Steps to View Keystore Entries:
1. Navigate to the Keystore section in the Security settings.
2. You will see a list of all certificates and keys stored in the Keystore.
3. Click on any entry to view detailed information about the certificate or
key, such as:
o Certificate details (Issuer, Subject, Validity Period).
o Key aliases.
o Fingerprint.
o Expiry date of certificates.
3. Exporting Certificates
In some cases, you might need to export certificates for sharing or importing
them into other systems (e.g., for integrating with other platforms or services).
Steps to Export Certificates:
1. In the Keystore tab, select the certificate you want to export.
2. Click on Export.
3. Choose the format (e.g., .pem, .cer, .pfx).
4. Specify the destination where the certificate will be exported.
4. Deleting Keystore Entries
If you no longer need a certificate or key in the Keystore (e.g., an expired
certificate or outdated key), you can delete it.
Steps to Delete Keystore Entries:
1. In the Keystore section, select the certificate or key you want to delete.
2. Click Delete.
3. Confirm the deletion to remove the entry from the Keystore.
5. Updating Certificates
When certificates expire or need to be updated (e.g., for renewed certificates),
you can replace the old certificates with new ones.
Steps to Update Certificates:
1. First, delete the expired or outdated certificate from the Keystore.
2. Import the new certificate by following the import process described
earlier.
6. Exporting the Keystore
You may need to export the Keystore to back up or transfer it to another
environment (such as from development to production).
Steps to Export the Keystore:
1. Navigate to the Keystore section.
2. Click on Export.
3. Select the Keystore format for export (e.g., .pfx or .jks).
4. Provide a password for the Keystore if needed and select a location to
save the file.
PGP Keys:-
Pretty Good Privacy (PGP) is a popular encryption method used for securing
email communications and file transfers. In the context of SAP Cloud Platform
Integration (CPI), PGP is often used for encrypting and decrypting files or
messages as part of integration scenarios. PGP keys are essential for these
processes and are used for message encryption, digital signatures, and
authentication in secure communication.
There are two PGP keys:
Public Key: Used for encrypting data that can only be decrypted by the
corresponding private key.
Private Key or Secret Key: Used for decrypting data that was encrypted with
the corresponding public key.
1. Under Keystore, import the PGP keys:
o Choose Import and select the key file (either public key or private
key).
o For PGP public key, you’ll import it so others can use it to encrypt
files/messages for you.
o For PGP private key, you’ll import it to decrypt files/messages that
were encrypted with your PGP public key.
o Enter any required passwords (e.g., passphrase for private keys).
User Roles:-
Here we add user roles to CPI.
Ex: EsbMessaging.send
Connectivity test:-
A Connectivity Test helps verify that the communication between SAP CPI and
external systems (such as APIs, SFTP servers, databases, or cloud applications)
is functioning properly and that all configurations are correct.
Manage Stores:-
Here we are able to read, write, and delete payloads temporarily, create variables, store queues (used by the JMS adapter), and create number ranges.
Manage Locks:-
Here we are able to remove the locks that get created when someone is editing an iFlow. The lock ensures that another person cannot edit the same iFlow in parallel; if it becomes stale, it can be removed here.
Integration Flow:-
After creating a package, open it, click on Artifacts, select Add, and choose Integration Flow to create an iFlow.
Edit:
Enables the iFlow so you can make changes and deploy.
Configure:
We use this option only after completing the iFlow development. This process is called configuring the externalized parameters.
Ex:- The SFTP server in the DEV environment will be different from the one in QA, so we would have to go inside the iFlow and edit the SFTP server details. To avoid editing the flow itself, we externalise those fields; then with the Configure option we can see all externalised parameters in one window, adjust them, and close it.
Deploy:
Deploys (activates) the iFlow.
Save As Version:
Saves the changes as a separate version so the developer can go back to a previous version.
2. Process:
There are 3 types of processes available:
Integration Process
Local Integration Process
Exception Sub Process.
Integration Process:
This is the main process that you will be using for all interface developments. It is mandatory to have one integration process, and an iFlow should contain only one main integration process.
A Timer start is especially useful in scenarios where you have to go and pull data from systems or trigger web services at specified times/intervals. The usual pattern is a timer followed by a content modifier, because a timer does not create a payload in the pipeline; with the content modifier, you can create the request payload that can be sent to the system.
In the properties section, we have three options:
Run Once – every time you deploy the IFlow, it will get triggered once.
Schedule on Day – useful if you want to schedule on a specific day, every 10 secs/1 min/any interval.
Schedule to Recur – enables the user to schedule on a daily/weekly/monthly basis.
5. Connectors:
Connectors are the arrowheads that denote the message flow in an IFlow integration process. Usually we don't pick Connectors from the palette bar; instead we click on a palette function and drag the arrow to the target function.
6. Delete:
The Delete option is used when we have to delete any palette function inside an integration process. As with Connectors, we usually don't use the palette toolbar; instead, clicking on the function you want to delete shows a list of options, from which you click Delete.
7. Message Transformations:
In the message transformers, we do have many transformation tools/palette
functions available
1. Content Modifier
2. Converters (CSV to XML / EDI to XML / JSON to XML / XML to JSON / XML to CSV / XML to EDI)
3. Decoders (Base64 Decoder / GZIP Decompression / MIME Multipart Decoder / ZIP Decompression)
4. EDI Extractor
5. Encoders (Base64 Encoder / GZIP Compression / MIME Multipart Encoder / ZIP Compression)
6. Filter
7. Message Digest
8. Script (Groovy Script / JavaScript)
9. XML Modifier
1. Content Modifier:-
This is the most important and most used palette function in IFlow development. As the name suggests, it is used to modify the content of the incoming payload: not only the body of the payload but also the headers and properties.
You can also store the incoming payload in a property and use it later for any purpose. For storing the incoming payload, you would use a Camel expression.
I have created one parameter called IdocPayload whose source value is the body of the incoming payload, hence provided as ${in.body}. Since it is an expression, I selected Expression as the Source Type. Expression is not the only option; there are several source types:
Constant, Global Variable, Number Range, Expression, XPath, Header, Local
Variable, Property.
You can later reference IdocPayload anywhere in the IFlow by calling the property name with "property" as a prefix, e.g. ${property.IdocPayload}. These properties won't be sent to the receiver side, as they exist just for the IFlow processing. Headers work in a similar way to properties, with the one difference that headers will be sent to the target system.
Whatever you pass in the Message Body section of the content modifier will be sent to the next step of the IFlow, irrespective of the incoming payload from the source system.
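The way tokens such as ${in.body}, ${header.name}, and ${property.name} get resolved can be pictured with a small sketch. This is not CPI's actual Camel expression engine, just a minimal Python approximation of the behaviour described above:

```python
import re

def resolve(expression: str, body: str, headers: dict, properties: dict) -> str:
    """Resolve ${...} tokens the way a Content Modifier source value does
    (simplified: only in.body, header.*, and property.* are handled)."""
    def repl(match):
        token = match.group(1)
        if token == "in.body":
            return body
        if token.startswith("header."):
            return str(headers.get(token[len("header."):], ""))
        if token.startswith("property."):
            return str(properties.get(token[len("property."):], ""))
        return match.group(0)  # leave unknown tokens untouched
    return re.sub(r"\$\{([^}]+)\}", repl, expression)

# Storing the incoming body in a property, then referencing it later:
properties = {}
properties["IdocPayload"] = resolve("${in.body}", "<idoc>42</idoc>", {}, {})
assert resolve("${property.IdocPayload}", "", {}, properties) == "<idoc>42</idoc>"
```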
2. Converters:
Converters are components used to convert message data from one format to
another. Converters are essential when you need to integrate systems that use
different data formats.
Converters in CPI are typically used to transform data into a desired format
before processing, or before sending data to another system. These are
especially important when your integration scenarios involve systems that
expect different data formats or encoding standards.
We have 6 different types of converters in CPI:
1. XML to JSON Converter:
Converts data in XML format to JSON format.
We may have an inbound message in XML format but the target system expects
a JSON payload. This converter helps in transforming the data before sending it.
2. JSON to XML Converter:
Converts data from JSON format to XML format.
Similar to the XML to JSON converter, this is used when you receive a JSON
payload but need to send the data in XML format to the target system.
3. XML to CSV Converter:
Converts XML data into CSV (Comma-Separated Values) format.
When your integration involves systems that expect data in CSV format (e.g.,
some external file systems or reporting tools), this converter is used to
transform the XML into CSV.
4. CSV to XML Converter:
Converts CSV data into XML format.
Similar to XML to CSV, this is used when the message is in CSV format but needs to be converted into XML for consumption by another system.
5. EDI to XML Converter
Converts EDI (Electronic Data Interchange) data into XML format.
Some common EDI standards are EDIFACT, ANSI X12, and TRADACOMS.
6. XML to EDI Converter
Converts XML into EDI (Electronic Data Interchange) format.
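As a rough illustration of what a format converter does, here is a naive XML-to-JSON sketch in Python. It is deliberately simplified (real converters also handle attributes, namespaces, repeated elements, and type coercion), and the payload is made up for the example:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_text: str) -> str:
    """Naive XML-to-JSON conversion: element text for leaves,
    nested dicts for elements with children."""
    def to_obj(elem):
        children = list(elem)
        if not children:
            return elem.text or ""
        return {child.tag: to_obj(child) for child in children}
    root = ET.fromstring(xml_text)
    return json.dumps({root.tag: to_obj(root)})

print(xml_to_json("<module><module_shortname>CPI</module_shortname></module>"))
# {"module": {"module_shortname": "CPI"}}
```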
3. Decoders:
This palette can be used to decode message content received over the network. There are no configurations needed in CPI; just add it in your IFlow and the rest takes care of itself.
We have various decoder provided:
Base64 Decode: Decodes base64-encoded message content.
GZIP Decompress: Decompresses the message content using GNU zip (GZIP).
ZIP Decompress: Decompresses the message content using zip.
MIME Multipart Decode: Transforms a MIME multipart message into a message with attachments. If the multipart headers are part of the message body, select Multipart Headers Inline. If this option is not selected and the Content-Type Camel header is not set to a multipart type, no decoding takes place.
If this option is selected and the body of the message is not a MIME multipart
(with MIME headers in the body), the message is handled as a MIME comment
and the body is empty afterwards.
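The decoder steps can be pictured as the receiver-side half of an encode/compress round trip. A minimal Python sketch of Base64 decoding followed by GZIP decompression (the payload is invented for the example):

```python
import base64
import gzip

original = b"<order><id>1001</id></order>"

# Sender side: compress with GZIP, then Base64-encode for transport.
wire = base64.b64encode(gzip.compress(original))

# Receiver side: what the two decoder steps do, in sequence.
decoded = base64.b64decode(wire)      # Base64 Decoder
payload = gzip.decompress(decoded)    # GZIP Decompressor

assert payload == original
```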
4. Encoders:
Just like the decoders, these are used to encode the data.
5. EDI Extractor:
The EDI Extractor enables you to extract EDI headers and transfer them to Camel headers. This element extracts data from a single incoming EDI document and adds it to the exchange so that this information can be used further in message processing. The EDI Extractor can read both flat-file and XML formats.
6. Filter:
Filter is a mechanism used to selectively process or route messages based on
certain conditions. Filters help in processing only specific messages that meet
predefined criteria, which can improve performance, reduce unnecessary
processing, and ensure the right data is being handled at each step of the
integration flow.
Filters in SAP CPI can be applied in various parts of an iFlow (Integration Flow)
to refine how messages are handled. These filters are commonly used to:
1. Route messages to different processing steps or systems based on
specific conditions.
2. Transform or modify the message content based on filtering criteria.
3. Conditionally stop or skip processing certain messages.
7. Message Digest:
A Message Digest is a cryptographic hash function that generates a fixed-size
string (or value) that uniquely represents the content of a message. It is used to
ensure data integrity, verify authenticity, and protect data during transmission.
The key feature of a message digest is that even the smallest change in the
original data will result in a completely different output hash, which makes it
useful for detecting data tampering.
In SAP Cloud Platform Integration (CPI), the Message Digest is primarily used
to ensure data integrity, verify authenticity, and create digital signatures. This
is particularly useful when integrating different systems, ensuring that the data
hasn't been tampered with during transmission. In CPI, the message digest
(generated by cryptographic hash functions like MD5, SHA-1, or SHA-256) can
be used in various integration scenarios for secure and trusted data exchanges.
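The tamper-detection property of a message digest is easy to demonstrate: even a one-character change in the payload yields a completely different hash. A minimal Python sketch using SHA-256 (the payloads are invented for the example):

```python
import hashlib

msg = b"<invoice><amount>100.00</amount></invoice>"
tampered = b"<invoice><amount>900.00</amount></invoice>"

# The sender computes the digest and transmits it alongside the message.
digest = hashlib.sha256(msg).hexdigest()

# The receiver recomputes it; any change to the data breaks the match.
assert hashlib.sha256(msg).hexdigest() == digest
assert hashlib.sha256(tampered).hexdigest() != digest
```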
8. Script:
In PI/PO, we would use Java for complex mapping scenarios. In CPI, we have Groovy and JavaScript, with Groovy being the predominant one used in the CPI community.
10. XML Validator:
In SAP Cloud Platform Integration (CPI), the XML Validator is used to validate
XML messages against an XML Schema Definition (XSD) or other structural
rules. This validation ensures that the XML message conforms to a predefined
structure and meets the required business rules before being processed further
in the integration flow (iFlow).
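Full XSD validation needs a schema library, but the idea can be sketched with a simplified structural check in Python. The required-element list below is a hypothetical stand-in for an XSD, invented for the example:

```python
import xml.etree.ElementTree as ET

REQUIRED = ["CustomerID", "CompanyName"]  # hypothetical schema rule

def validate(xml_text: str) -> list:
    """Return a list of validation errors; an empty list means valid."""
    try:
        root = ET.fromstring(xml_text)   # rejects malformed XML outright
    except ET.ParseError as exc:
        return [f"not well-formed: {exc}"]
    errors = []
    for tag in REQUIRED:
        if root.find(f".//{tag}") is None:
            errors.append(f"missing mandatory element <{tag}>")
    return errors

assert validate("<Customer><CustomerID>ALFKI</CustomerID>"
                "<CompanyName>Alfreds</CompanyName></Customer>") == []
assert validate("<Customer/>") == [
    "missing mandatory element <CustomerID>",
    "missing mandatory element <CompanyName>",
]
```

In a real iFlow, a failed validation would typically raise an exception so the message can be routed to error handling rather than reaching the target system.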
Call:
We have two type calls in CPI
1. External Call
2. Local Call
1. External Call:
Palette functions in External Call is used for communicating with
external/backend systems through any adapter like ODATA,HTTP etc… There
are different types of external calls provided by CPI: –
Content Enricher
Poll Enrich
Request Reply
Send
1. Content Enricher:
The content enricher allows us to concatenate the incoming payload with the response payload of the receiver system, such as an OData API response. This feature enables you to make external calls during the course of an integration process to obtain additional data.
A content enricher alone won't work; it must be connected to the receiver system. Typically the arrow is connected from a Request Reply palette to the receiver system, but with a content enricher the arrow connection is from the receiver system to the content enricher.
There are two aggregation algorithms in the properties section.
COMBINE:
Combine algorithm doesn’t have any special behaviour. It just concatenates the
incoming payload with the response of the receiver system.
Source Payload:
<module>
<module_shortname>CPI</module_shortname>
<module_fullname>Cloud Platform Integration</module_fullname>
</module>
OData API Response:
<Customers>
<Customer>
<CompanyName>Alfreds Futterkiste</CompanyName>
<Address>Obere Str. 57</Address>
<Region/>
<PostalCode>12209</PostalCode>
<CustomerID>ALFKI</CustomerID>
<City>Berlin</City>
<ContactName>Maria Anders</ContactName>
<ContactTitle>Sales Representative</ContactTitle>
</Customer>
</Customers>
Content Enricher Output:
<?xml version='1.0' encoding='UTF-8'?>
<multimap:Messages xmlns:multimap="http://sap.com/xi/XI/SplitAndMerge">
<multimap:Message1>
<module>
<module_shortname>CPI</module_shortname>
<module_fullname>Cloud Platform Integration</module_fullname>
</module>
</multimap:Message1>
<multimap:Message2>
<Customers>
<Customer>
<CompanyName>Alfreds Futterkiste</CompanyName>
<Address>Obere Str. 57</Address>
<Region/>
<PostalCode>12209</PostalCode>
<CustomerID>ALFKI</CustomerID>
<City>Berlin</City>
<ContactName>Maria Anders</ContactName>
<ContactTitle>Sales Representative</ContactTitle>
</Customer>
</Customers>
</multimap:Message2>
</multimap:Messages>
ENRICH:
The Enrich algorithm allows us to combine the two payloads based on key elements in the XML, converting the two separate messages into a single enhanced payload. You have to provide 4 parameters for the Enrich algorithm: the path to the node and the key element in the original message (the incoming payload to the content enricher), and the path to the node and the key element in the lookup message (the receiver response connected to the content enricher).
Note:
The key element’s value should be same in both original and the lookup
message, so that the concatenation would happen as expected.
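The merge-by-key behaviour can be pictured with a sketch. This is a simplified Python approximation, not CPI's actual Enrich implementation; the element names are borrowed from the OData example above:

```python
import xml.etree.ElementTree as ET

def enrich(original_xml, lookup_xml, orig_key, lookup_key):
    """Merge each lookup record's fields into the original record whose
    key value matches (simplified Enrich aggregation)."""
    original = ET.fromstring(original_xml)
    lookup = ET.fromstring(lookup_xml)
    # Index lookup records by the text of their key element.
    index = {rec.findtext(lookup_key): rec for rec in lookup}
    for rec in original:
        match = index.get(rec.findtext(orig_key))
        if match is not None:
            for field in match:
                if field.tag != lookup_key:   # don't duplicate the key
                    rec.append(field)
    return ET.tostring(original, encoding="unicode")

out = enrich(
    "<Orders><Order><CustomerID>ALFKI</CustomerID></Order></Orders>",
    "<Customers><Customer><CustomerID>ALFKI</CustomerID>"
    "<City>Berlin</City></Customer></Customers>",
    "CustomerID", "CustomerID",
)
assert "<City>Berlin</City>" in out
```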
2. Poll Enrich:
In SAP Cloud Platform Integration (CPI), Poll Enrich refers to the process of
polling data from an external system and enriching the message with that data
before continuing the integration flow. The goal of Poll Enrich is to pull relevant
information from external sources (e.g., databases, APIs, or files) and enhance
the message being processed.
The Poll Enrich pattern is commonly used when you need to retrieve external
data (from systems like SAP or non-SAP systems) and integrate it into your
message flow. This is particularly useful in scenarios where the source system
contains the necessary context or additional information that must be included
in the message before it reaches the target system.
3. Request Reply:
Unlike content enricher, request reply does not concatenate the incoming
payload with the lookup message, instead the response is only forwarded to
the next palette function. The original/source message will no longer be valid
after the request reply step. Request reply is synchronous by nature.
Whatever the incoming payload is, it will get replaced by the response
generated after the odata lookup step.
4. Send:
The Send palette function can be used to configure a service call to a receiver system for scenarios and adapters where no reply is expected. The Send palette is not supported by the OData adapter, as the OData adapter is synchronous in nature.
Compatible adapters for Send palette:
• AS2 adapter
• FTP adapter
• JMS adapter
• Mail adapter
• SOAP SAP RM adapter
• SFTP adapter
• XI adapter (Quality of Service "Exactly Once")
Mostly we will be using Request reply in the project IFlows and in some cases
content enricher. Send is not used often but it depends on the business
requirement.
2. Local Call:
Local call palettes are used to communicate inside the integration process window; they are not used for external system communication.
1. Process Call:
Process Call is used when there is a local integration process in the IFlow. Assume you have included a local integration process and you have to call it from the main integration process; in this case we can use the Process Call palette. In the Process Call palette properties, select the local integration process (the system will list all the local integration processes available in that IFlow). This process call will run only once.
Routing:
Routing consists of many palette functions whose primary role is to combine incoming messages or split them based on a condition, an XPath, or an expression. There are many routing palette functions:
1. Aggregator:
The Aggregator function is used to combine incoming chunks of messages, based on a correlation expression, into a single message. Until the next message is received, the earlier messages are stored in a data store. It only supports XML.
There are two kinds of Aggregation Algorithm available.
Combine : This will combine the incoming messages without any guarantee on
the order of the messages.
Combine In Sequence : This will combine the incoming messages in exact
order.
Properties:
Correlation Expression (XPath): XPath expression that identifies the element
based on which the incoming message is correlated/combined.
Incoming Format: XML
Aggregation Algorithm: Combine and Combine in sequence.
Message sequence expression: Field based on which the sequence of the
messages is determined. The field should contain numbers, so that sequence
can be determined.
Last Message Condition: Condition till when incoming messages must get
aggregated/combined.
Data Store name: By default – Aggregator-1
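The Combine In Sequence behaviour can be sketched in Python: chunks are ordered by a numeric sequence element and wrapped under a single root. This is a simplified approximation of the aggregator, with invented element names:

```python
import xml.etree.ElementTree as ET

def aggregate(chunks, sequence_element="ItemNumber"):
    """Combine In Sequence (simplified): sort the chunks by a numeric
    sequence element, then wrap them all under one root element."""
    parsed = [ET.fromstring(c) for c in chunks]
    parsed.sort(key=lambda e: int(e.findtext(sequence_element)))
    root = ET.Element("Aggregated")
    root.extend(parsed)
    return ET.tostring(root, encoding="unicode")

chunks = [
    "<Item><ItemNumber>2</ItemNumber><Name>B</Name></Item>",
    "<Item><ItemNumber>1</ItemNumber><Name>A</Name></Item>",
]
out = aggregate(chunks)
assert out.index("<Name>A</Name>") < out.index("<Name>B</Name>")
```

The plain Combine algorithm would skip the sort step and keep arrival order, which is exactly the difference described above.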
3. Multicast:
Multicast sends a copy of the incoming message to multiple branches of the integration flow, either in parallel or sequentially, so the same payload can be processed by several receivers or steps.
5. Splitter:
As the name denotes, splitter is used to split the messages into small chunks of
messages based on the XPath provided. We have 7 types of splitters:
1. EDI Splitter
2. General Splitter
3. Iterating Splitter
4. IDOC Splitter
5. PKCS#7/CMS Splitter
6. Tar Splitter
7. Zip Splitter
General Splitter:
The general splitter splits the incoming message into N individual messages, each one enveloped by the parent element.
Iterating Splitter:
The iterating splitter splits the incoming message into N individual messages; unlike the general splitter, there is no surrounding envelope.
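The difference between the two splitters can be shown with a small sketch. This is a simplified Python illustration with an invented payload (real splitters can stream large documents instead of parsing them whole):

```python
import xml.etree.ElementTree as ET

SOURCE = ("<Orders><Order><Id>1</Id></Order>"
          "<Order><Id>2</Id></Order></Orders>")

def general_split(xml_text, record_tag):
    """General splitter: each output keeps the parent envelope."""
    root = ET.fromstring(xml_text)
    out = []
    for rec in root.findall(record_tag):
        envelope = ET.Element(root.tag)
        envelope.append(rec)
        out.append(ET.tostring(envelope, encoding="unicode"))
    return out

def iterating_split(xml_text, record_tag):
    """Iterating splitter: each output is the bare record, no envelope."""
    root = ET.fromstring(xml_text)
    return [ET.tostring(r, encoding="unicode") for r in root.findall(record_tag)]

assert general_split(SOURCE, "Order")[0] == "<Orders><Order><Id>1</Id></Order></Orders>"
assert iterating_split(SOURCE, "Order")[0] == "<Order><Id>1</Id></Order>"
```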
Persistence:
1. Persisting Message
Store a message so that you can access it and analyze it at a later point.
At any point in an integration flow, you can add the Persist palette function to store the message. The message storage feature is useful for auditing purposes. This component stores data on your tenant; note that there is an overall disk-space limit of 32 GB.
But the limitation is that, unlike data store operations, you won't be able to see the persisted messages via any GUI. Instead, you have to use the Cloud Integration OData API to access the persisted messages, and there is no feature for accessing the contents of the persisted messages in the middle of an integration flow execution; it has to be after the execution of the integration flow. You need to make an OData API call to access the persisted data, in the URL format below:
https://<Cloud Integration host>/api/v1/MessageProcessingLogs('<Message ID>')/MessageStoreEntries
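Building that URL and calling the API might look like the sketch below. The host and message ID are hypothetical placeholders, and authentication depends on your tenant setup, so the actual network call is left commented out:

```python
from urllib import parse, request

# Hypothetical values; substitute your tenant host and the message GUID
# taken from the message processing log.
host = "my-tenant.it-cpi.example.com"
message_id = "AGbzE6L7vXyZ"

url = (f"https://{host}/api/v1/MessageProcessingLogs"
       f"('{parse.quote(message_id)}')/MessageStoreEntries")

# req = request.Request(url, headers={"Authorization": "Basic ..."})
# with request.urlopen(req) as resp:   # network call, not executed here
#     print(resp.read())

assert url.endswith("MessageProcessingLogs('AGbzE6L7vXyZ')/MessageStoreEntries")
```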
Properties:
Provide a unique Step ID. This step ID will come back as MessageStoreId in the OData API response. You have an option to encrypt the stored message.
I have built a simple iFlow with an HTTPS sender receiving an XML body, passed to a content modifier which just captures the payload body and passes it to the next step. Then we have the Write palette function.
Datastore name – name of the data store. You can also define the data store name dynamically from a header or property value, ${header.headername} or ${property.propertyname}. The maximum length is 40 characters.
Entry ID – specify an entry ID that is stored together with the message content. The maximum length is 255 characters.
Visibility – if you want to access it from another IFlow, keep it as Global; otherwise Integration Flow.
Retention threshold for alerting – 2 days by default; after 2 days, the system will throw an alert.
Expiration period – 30 days by default, after which the system removes the entry from the data store.
Encrypt stored message – encrypts the stored content.
Overwrite existing message – if you reuse the same data store name, the message content will be overwritten on each flow execution.
Include message headers – if you want to save the message headers too.
2. Data Store – Get:
The Data Store Get operation retrieves data that was previously stored in a data store during the execution of an integration flow.
Datastore name – name of the data store you want to fetch from.
Entry ID – the entry ID which, together with the data store name, identifies the record.
Visibility – whether to search inside the integration flow only or search the data store messages globally.
Delete on Completion – if the entry has to be deleted after reading, enable "Delete on Completion".
Throw Exception on missing entry – if no record is found, an exception is thrown when this option is enabled.
3. Data Store – Select:
If there are multiple data store entries and you want to fetch all the messages having a particular data store name, then Select can be used. To fetch one particular record you would instead use the combination of data store name and entry ID; in the Select operation, we don't provide an entry ID.
Data Store – Delete:
This operation is for deleting data store entries. If there are multiple entry IDs, as in our example, then we need to mention the entry ID along with the data store name.
4. Write Variables
Variables here have their usual meaning: we can store data inside them and change it at any time. You define variables to share data across different integration flows. Once you write a variable to the variable store and mark it as Global, it can be accessed from other IFlows.
We use the Write Variables step type to create a variable at a certain point within the message processing sequence. To consume the variable (either in another step of the same integration flow or in another integration flow), you can read it, for example, in a content modifier using the Global Variable or Local Variable source type.
Security:
1. Decryptor:
The Decryptor step is used to decrypt incoming encrypted content (for example PGP- or PKCS#7-encrypted payloads) so that subsequent steps can work with the plain message.
2. Encryptor:
Encryptor step is used to encrypt data before sending it to another system.
Encryption is a security measure that ensures the confidentiality and integrity
of sensitive information, preventing unauthorized access during transmission.
The Encryptor in SAP CPI can encrypt messages or files, using encryption
algorithms like PGP (Pretty Good Privacy) or AES (Advanced Encryption
Standard), depending on the use case.
3. Signer
Signer step is used to digitally sign a message or file. This operation ensures the
integrity and authenticity of the data. When data is signed, it provides a way to
verify that the data has not been tampered with during transmission and that it
originates from a trusted source.
4. Verifier
Verifier step is used to verify the digital signature applied to a message or file.
It checks that the received data has not been tampered with and confirms that
the sender is authentic.
The Verifier uses the public key corresponding to the private key that was
used to sign the message. This step is essential for ensuring data integrity and
authenticity when receiving signed messages or documents.
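The sign/verify round trip can be illustrated with a sketch. Note this uses a symmetric HMAC for simplicity, whereas CPI's Signer/Verifier use asymmetric key pairs (private key to sign, public key to verify); the integrity-and-authenticity idea is the same. The secret and payloads are invented for the example:

```python
import hashlib
import hmac

SHARED_SECRET = b"demo-secret"  # real CPI scenarios use asymmetric keys

def sign(payload: bytes) -> str:
    """Produce a signature over the payload (Signer step, simplified)."""
    return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Check the signature against the payload (Verifier step, simplified)."""
    return hmac.compare_digest(sign(payload), signature)

msg = b"<payment><amount>100</amount></payment>"
sig = sign(msg)
assert verify(msg, sig)                                        # untouched: ok
assert not verify(b"<payment><amount>999</amount></payment>", sig)  # tampered
```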
MAPPINGS:
SAP Cloud Platform Integration (SAP CPI) provides a message mapping feature. Similar to PI/PO, we have message mapping with a graphical editor and the same graphical mapping functions. CPI provides many mapping palette functions for different use cases:
– Message Mapping
– ID Mapping
– Operation Mapping
– XSLT Mapping
1. Message Mapping:
These are the standard mapping functions provided by SAP in the graphical editor tool. There is also a provision to embed Groovy code if the standard functionality is not suitable for the requirements.
Arithmetic – add, subtract, equals (number), absolute, sqrt, square, sign, neg,
inv, power, lesser, greater, multiply, divide, max, min, ceil, floor, round, counter,
formatNumber.
Boolean – and, or, not, equals(boolean), notEquals, if, ifS, ifWithoutElse,
ifSWithoutElse, isNil.
Constant – constant, copyValue, xsi:nil.
Conversions – fixValues, ValueMapping.
Date – currentDate, dateTrans, dateBefore, dateAfter, compareDates.
Node functions – createIf, removeContexts, replaceValue, exists, splitByValue,
collapseContexts, useOneAsMany, sort, sortByKey, mapWithDefault,
formatByExample.
Statistic – sum, average, count, index.
Text – concat, substring, equals(string), indexOf(2), indexOf(3), lastIndexOf(2), lastIndexOf(3), compare, replaceString, length, endsWith, startsWith(2), startsWith(3), toLowerCase, toUpperCase, trim.