
Purpose
The guiding principle for programming standards and guidelines at Syensqo is to use what is generally accepted by the industry as “best-practice”, rather than defining a bespoke set of rules. By adopting this approach, it is more likely that developers engaged by Syensqo are already familiar with the “best-practice” approach and can work effectively in the Syensqo environment immediately.
This document’s purpose is to provide developers with the standards and guidance required to develop in Syensqo’s landscape.
Assumptions
All tools required to develop according to best practice are available.
Since SAP development tools and approaches are continually evolving, this document will evolve with them.
The SAP Development Approach has been understood.
SAP Cloud Integration/SCPI/HCI
Design Principles and Modularization
Design integration flows with simplicity, modularity, and maintainability in mind. Key principles include:
- Modularize Flows: Avoid monolithic, overly complex iFlows. Break down complex processes into smaller, logical units. Use Local Integration Processes (sub-processes within an iFlow) to encapsulate reusable or distinct logic blocks. This makes the main integration process easier to read and maintain. If an iFlow is becoming lengthy or handling many tasks, it’s a sign to split it up.
- One Interface, One IFlow: Design each iFlow to handle a single integration interface or a specific sender-receiver pair. If an integration scenario involves multiple target systems, consider using one iFlow per receiver for clarity and fault isolation. Similarly, if multiple sources send similar messages, you can introduce a dispatcher iFlow that routes incoming messages to separate iFlows for each target/system. This separation improves transparency and monitoring since each iFlow represents a distinct interface. (Decoupling flows via asynchronous queues like JMS can further isolate failures and facilitate retries between parts of a process.)
- Reusable Subflows: For common sequences (e.g., data enrichment, calling a common API, or error handling routines), consider creating reusable integration processes or even separate template iFlows that can be invoked via the Process Direct adapter. This avoids duplicating logic across iFlows and centralizes updates to that logic.
- Layout and Readability: Maintain a clean and logical layout in the iFlow editor. Arrange processing steps left-to-right and top-to-bottom in sequence. Use straight connectors and alignment tools (Auto-Layout) to produce a tidy diagram. This visual clarity helps any developer quickly understand the flow. Additionally, make use of labels and annotations: for example, label branches in a router with the condition name, or add notes for complex logic. Clear design and layout will make troubleshooting and future enhancements much easier.
- Externalize Configurations: Design iFlows to be environment-agnostic. Do not hardcode environment-specific details (URLs, credentials, file paths, etc.) inside flows. Instead, externalize these parameters so they can be configured per environment (Dev/QA/Prod) without altering the flow’s logic. For example, define the endpoint URL or API keys as externally configurable parameters. This makes deployments to higher landscapes simpler and less error-prone and adheres to the principle of 12-factor app configuration.
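As an illustrative sketch of externalization (the parameter names here are hypothetical), adapter fields in SCPI can reference externalized parameters using the {{...}} placeholder syntax instead of literal values:

```
HTTP Receiver Adapter
  Address:         https://{{Receiver_Host}}/api/v1/orders
  Credential Name: {{Receiver_CredentialName}}
  Timeout (ms):    {{Receiver_Timeout}}
```

At deployment time each environment supplies its own values through the iFlow's Configure dialog, so the flow logic itself never changes between Dev, QA, and Prod.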
By adhering to these design principles, you enhance the scalability and maintainability of integrations. The goal is to build flows that are easy to understand, modify, and extend, following solid software design practices adapted to integration scenarios.
Mapping and Transformation
Integrations often require transforming data between formats (XML, JSON, CSV, etc.) or structures. Here are best practices for message mappings and transformations:
- Choose the Right Tool for the Job: Use out-of-the-box transformation tools whenever possible instead of custom code. For structured XML-to-XML transformations, leverage Graphical Message Mapping or XSLT mapping – these are optimized for XML and provide a visual mapping interface. If you need to transform XML to JSON or vice versa, consider using the message mapping step (which supports JSON as target format) or an XML-to-JSON converter if appropriate. Use Content Modifier steps for simple value assignments or constructing small payloads (especially if you need to build a new message body from scratch). Reserve Groovy scripts or JavaScript mappings for cases that cannot be handled with standard mapping functions (for example, complex computations or dynamic logic that the graphical mapping cannot easily express). This approach ensures maintainability – mappings are easier for others to understand and adjust than large script code.
- Keep Mappings Manageable: In graphical mappings, maintain a clean structure. Map only the required fields – use the Filter or Remove contexts functions to drop any unnecessary data early. Keep an eye on mapping complexity: if a single mapping becomes too complicated (e.g., with many functions or if/else logic), evaluate if it should be broken into multiple mapping steps (such as a two-step mapping) or complemented by a script for the complex portion. Simplicity improves performance and clarity.
- Handle Value Mappings and Lookups: Often you'll need to map code values (e.g., country codes, status codes) between systems. Use the Value Mapping artifact for this purpose, which acts as a lookup table for cross-reference values. Populate value-mapping tables with source-to-target value pairs rather than encoding such logic in the mapping script. At runtime, the mapping step can call these by the valueMapping function. This separates configuration from logic. Ensure value mappings are named clearly (for example, using a naming pattern like VM_<SourceSystem>_to_<TargetSystem>_<ValueSet> – e.g. VM_SAPtoLegacy_CountryCodes) to indicate their content. Maintaining these artifacts centrally makes updates easier when values change.
- Test Transformations Thoroughly: Develop mappings with sample payloads from real systems. Use the built-in Mapping Simulator (for graphical mappings) or external tools for XSLT to validate that your transformations work as expected (especially for edge cases or optional fields). This will catch issues early. Ensure to handle exceptions in mapping – for instance, if an unexpected value appears, you might map it to a default or throw a controlled error that can be caught by error handling logic.
- Avoid Unnecessary Complexity in Scripts: If you do use a Groovy or JavaScript script for transformation, keep the script focused and as simple as possible. Do not replicate features available in mapping steps or adapters via scripting. For example, do not use a script to split messages or to perform simple field mappings that the mapping step can handle. Scripts should ideally be small utilities (e.g., custom date conversion, complex string parsing) within an integration flow. Overusing scripts can make maintenance harder and can introduce performance overhead if not carefully written.
By following these guidelines, you ensure that data mappings are efficient, transparent, and easy to maintain. The key is to use SCPI's rich palette of transformation tools to their strengths and keep custom code to a minimum.
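As an illustration of the "small, focused utility" guideline above, a date-conversion script in SCPI Groovy typically looks like the sketch below (the property names are examples; the script assumes the standard SCPI script entry point):

```groovy
import com.sap.gateway.ip.core.customdev.util.Message
import java.text.SimpleDateFormat

// Converts a date exchange property from SAP internal format (yyyyMMdd)
// to ISO format (yyyy-MM-dd). Kept deliberately small: anything larger
// than this probably belongs in a mapping step instead.
def Message processData(Message message) {
    def rawDate = message.getProperty("DocumentDate") as String
    if (rawDate) {
        def parsed = new SimpleDateFormat("yyyyMMdd").parse(rawDate)
        message.setProperty("DocumentDateISO",
            new SimpleDateFormat("yyyy-MM-dd").format(parsed))
    }
    return message
}
```

The script does one thing, touches only the properties it needs, and leaves the payload untouched, so it stays cheap to review and maintain.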
Tools (pending Figaf)
- Automated testing
- IntelliJ integrated with SCPI / Eclipse integration
- Inline iFlow editor
Error Handling/Alerting (FEH? AIF?)
Pending business requirements/NFRs, standard alerting should be in place.
- Exceptions should be handled in all cases.
MPL attachments are bad (they consume tenant database resources) - DO NOT LOG ENTIRE PAYLOADS; log key business identifiers instead.
Async
Retry based on business requirements and data/message criticality.
IDoc: post ALEAUD (ALEAUDIT) status messages back to the sending system, including error details.
Catch exceptions for alerting (if the error is not posted back to another application system).
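Within an Exception Subprocess, the caught exception is available on the CamelExceptionCaught property. A minimal Groovy sketch for surfacing it to monitoring (the custom header labels are examples) could look like this:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def ex = message.getProperty("CamelExceptionCaught")
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null && ex != null) {
        // Log the error class and text, never the payload
        // (see the MPL attachment rule above)
        messageLog.addCustomHeaderProperty("ErrorType", ex.getClass().getName())
        messageLog.addCustomHeaderProperty("ErrorText", ex.getMessage() ?: "n/a")
    }
    return message
}
```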
Sync
Always return appropriate errors to the calling system - be as informative as possible (e.g. a meaningful HTTP status code plus an error description the caller can act on).
Security
Authentication (endpoints)
Certificates > OAuth > Basic Authentication (in descending order of preference)
Data at Rest
Integration should NOT maintain data at rest - the integration layer is a pass-through, not a system of record.
Credentials Storage
Credentials must be stored in the SCPI secure store (Security Material: User Credentials / Secure Parameters) - never hardcoded in iFlows or scripts.
Web-Based Tools and LLMs
DO NOT paste Syensqo data into public web-based tools (including JSON/XML formatters) or into LLM-based coding agents (is there a co-pilot sanctioned by Syensqo?).
AI tools must be used in accordance with the current IT policy - https://thehub.syensqo.com/en/syensqoai/introducing-syensqos-ai-usage-policy-key-milestone-our-ai-journey?check_logged_in=1
Documentation
Inline
ALWAYS document iFlow senders and receivers
ALWAYS provide an iFlow description
ALWAYS give descriptive names to flow steps, including sender/receiver participants and local integration processes (do not leave defaults)
Example: a default step name such as "Content Modifier 1" should be renamed to describe its purpose, e.g. "Set Receiver Routing Properties"; likewise rename default "Sender"/"Receiver" participants to the actual system names.
Confluence
Maintain as-built documentation with references to the MAPPING / FUNC SPEC / TECH OBJECT in SCPI.
Logging
MUST use custom header logging so messages are searchable in monitoring - messageLog.addCustomHeaderProperty("LABEL", "INFO like IDOC_NUM/MATNR/KUNNR");
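A minimal Groovy sketch of this pattern (the property and label names are illustrative):

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        // Log searchable business keys only - never the full payload
        messageLog.addCustomHeaderProperty("IDOC_NUM",
            (message.getProperty("IdocNumber") ?: "n/a") as String)
        messageLog.addCustomHeaderProperty("MATNR",
            (message.getProperty("MaterialNumber") ?: "n/a") as String)
    }
    return message
}
```

The custom headers then appear in the Message Processing Log and can be used as search criteria in the Monitor Message Processing view.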
Naming Standards
iFlow
Externalised Variables
General Integration Principles
- Standard > custom
- Simple > complex
- Timers are bad (prefer event-driven/triggered integrations where possible)
- Data caches are bad (avoid holding state in the integration layer)
- ????
API Management
Policy Design/Minimum security standard
When authentication is available -
OAuth + API key
When OAuth or basic auth isn't available -
IP whitelist + API key
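As a sketch of the minimum standard above, SAP API Management uses Apigee-style policies; a policy pair enforcing API key plus OAuth token verification could look roughly like this (policy names and the header carrying the key are assumptions to be adapted per API):

```xml
<!-- Verify the API key passed in a request header -->
<VerifyAPIKey name="verify-api-key">
    <APIKey ref="request.header.apikey"/>
</VerifyAPIKey>

<!-- Verify the OAuth 2.0 access token on the same request -->
<OAuthV2 name="verify-oauth-token">
    <Operation>VerifyAccessToken</Operation>
</OAuthV2>
```

Both policies would typically be attached to the proxy pre-flow so every request is checked before any target call is made.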