Real Time Transformations FAQ
General | |
What are Real Time Transformations? | Real-Time Transformations represent a new concept that enables transforming only the records that have not been processed yet. This avoids dropping and recreating data that has already been processed, with the benefits of:
|
Who should consider using Real Time Transformations? | The feature is tailored to real-time connectivity use cases. Anyone who plans to use Celonis applications and Action Engine should first achieve real-time connectivity. In general, Real-Time Transformations are an effective countermeasure against long-running and unstable transformations. |
Which source systems are supported? | Currently, Real Time Transformations are only supported for SAP. |
Will a migration be required for existing customers? | Real-Time Transformations follow a different processing logic than transformations executed in Data Jobs. Migrating requires some effort and changes to the transformation scripts. Further reference: Real Time Transformations: Setup Guide |
Which process templates are available to import? | Templates for O2C and P2P (both SAP ECC) can be installed via the Marketplace or via the import functionality in the Replication Cockpit. Note: If you are interested in the AP or AR real-time process templates, we are happy to collaborate with you. In this case, please reach out to us via the Support Portal. |
Technical | |
What are the prerequisites? | Real-Time Transformations are part of the Replication Cockpit. Therefore, the same prerequisites apply:
|
How do Real Time Transformations differ from transformations that are executed in Data Jobs? | Batch Transformation (Data Jobs):
Real Time Transformation:
|
Should I move all transformations in the Replication Cockpit to run in Real Time mode? | In theory, every transformation can be converted. However, there are some scripts for which it doesn't make sense, such as the creation of the currency conversion tables (those should remain in Data Jobs). Also, a Full Load is currently not supported in the Replication Cockpit; this use case still needs to be covered by Data Jobs. |
What is a Trigger table? | Real Time Transformations are executed each time new records are extracted to a specific table. This means that you need to map each transformation statement to a table whose extraction should trigger the transformation. This table is called a Trigger Table. |
What is a Staging table? | Staging tables are intermediary tables that allow you to execute Real-Time Transformations. They contain only a specific set of records. For every table that is extracted via the Replication Cockpit, two corresponding staging tables exist:
|
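For illustration, here is a minimal sketch of a transformation that works on such a staging table. VBAP acts as the trigger table; the staging table name VBAP_STAGING and the target table O2C_VBAP are hypothetical placeholders, since the actual staging table names are generated by the Replication Cockpit.

```sql
-- Hypothetical real-time transformation with VBAP as the trigger table.
-- VBAP_STAGING and O2C_VBAP are placeholder names; the real staging table
-- names are generated by the Replication Cockpit.
-- The statement upserts the freshly extracted VBAP items into the target table.
DELETE FROM "O2C_VBAP"
WHERE EXISTS (
    SELECT 1
    FROM "VBAP_STAGING" stg
    WHERE stg."MANDT" = "O2C_VBAP"."MANDT"
      AND stg."VBELN" = "O2C_VBAP"."VBELN"
      AND stg."POSNR" = "O2C_VBAP"."POSNR"
);

INSERT INTO "O2C_VBAP" ("MANDT", "VBELN", "POSNR", "MATNR", "NETWR")
SELECT "MANDT", "VBELN", "POSNR", "MATNR", "NETWR"
FROM "VBAP_STAGING";
```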
What are Dependent tables? | The dependent tables of a transformation are all tables that need to be up to date at the point when the transformation is executed. The Replication Cockpit automatically applies a special logic so that only those staging table records are taken into account for which corresponding updates in the dependent tables exist. |
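Conceptually, this behaves as if the transformation only saw the staging records that already have a counterpart in the dependent table. A purely illustrative sketch of that effective filter, with VBAP_STAGING as a placeholder staging table and VBAK declared as the dependent table:

```sql
-- Illustration only: the Replication Cockpit applies this dependency filter
-- internally; you do not write it yourself.
SELECT stg.*
FROM "VBAP_STAGING" stg
WHERE EXISTS (
    SELECT 1
    FROM "VBAK" hdr
    WHERE hdr."MANDT" = stg."MANDT"
      AND hdr."VBELN" = stg."VBELN"
);
```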
Do Real Time Transformations support parameters? | In the Replication Cockpit, you can use and reference all Data Pool parameters. However, unlike Data Jobs, there is no concept of replication-specific parameters. |
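As an illustration, a Data Pool parameter can be referenced from a real-time transformation in the same way as from other transformations. The parameter name earliest_date and the table names below are made up for this sketch:

```sql
-- Hypothetical transformation referencing the Data Pool parameter earliest_date
-- via the <%= ... %> placeholder syntax.
INSERT INTO "O2C_VBAP" ("MANDT", "VBELN", "POSNR", "ERDAT")
SELECT "MANDT", "VBELN", "POSNR", "ERDAT"
FROM "VBAP_STAGING"
WHERE "ERDAT" >= <%=earliest_date%>;
```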
Do you know how long the transformations will take with this tool? | We have done several pilots. Here are the overall numbers: Pilot A
Pilot B
Pilot C
|
Could it happen that the Replication Time is quicker than the Transformation - so that the staging table changes in the middle of the Transformation? | The Transformation is always a part of the Replication (Replication = Extraction + Transformation). This means that a new extraction will not be executed (and hence the staging table will not change) until the transformation has finished. |
How are those staging tables created, especially for custom tables that are customer-specific? | The staging table is created automatically before executing the transformation. You don't need to do anything manually. |
What if we only need a left join (e.g. from VBAP to VBAK): will the dependencies also be applied and will it also cut our entries from VBAP that are not in VBAK? | It is up to the user whether to define the dependency; this is done manually. For an INNER JOIN, defining the dependency is a must; for a LEFT JOIN it is optional, assuming you are okay with NULL values. If you want to enforce consistency and avoid NULLs, you have to define the dependency, and the technology behind the scenes will make sure that only the records from VBAP that have equivalents in VBAK are processed. |
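To make the difference concrete, here is a hedged sketch of both variants; the staging and target table names are placeholders:

```sql
-- Variant 1: INNER JOIN - VBAK must be declared as a dependent table, otherwise
-- staging records without a matching VBAK row would be silently skipped.
INSERT INTO "O2C_VBAP" ("MANDT", "VBELN", "POSNR", "AUART")
SELECT stg."MANDT", stg."VBELN", stg."POSNR", hdr."AUART"
FROM "VBAP_STAGING" stg
JOIN "VBAK" hdr
  ON hdr."MANDT" = stg."MANDT"
 AND hdr."VBELN" = stg."VBELN";

-- Variant 2: LEFT JOIN - works without the dependency, but AUART may be NULL
-- for items whose header has not been replicated yet.
INSERT INTO "O2C_VBAP" ("MANDT", "VBELN", "POSNR", "AUART")
SELECT stg."MANDT", stg."VBELN", stg."POSNR", hdr."AUART"
FROM "VBAP_STAGING" stg
LEFT JOIN "VBAK" hdr
  ON hdr."MANDT" = stg."MANDT"
 AND hdr."VBELN" = stg."VBELN";
```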
Do Real-Time Transformations also handle deleted records? | Yes, deletions can be handled too, but a different approach is required. The deleted records are pushed to a separate staging table, e.g. VBAP_DELETED_DATA, and you then need to set up transformations that remove these records from the source and data model tables. We have not done this for our standard process connectors, but it works in theory. You can also choose to delete the records directly from the table, i.e. VBAP. In that case you lose the flexibility of executing a transformation against it. However, you can set up nightly or weekly drop/recreate transformations which make sure that the deleted records are handled. |
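A minimal sketch of such a cleanup transformation, assuming the deleted VBAP records arrive in a staging table called VBAP_DELETED_DATA as in the example above (the target table name is a placeholder):

```sql
-- Hypothetical cleanup transformation: remove records that were deleted in the
-- source system from the target table. Only VBAP_DELETED_DATA is taken from the
-- answer above; O2C_VBAP is a placeholder target table.
DELETE FROM "O2C_VBAP"
WHERE EXISTS (
    SELECT 1
    FROM "VBAP_DELETED_DATA" del
    WHERE del."MANDT" = "O2C_VBAP"."MANDT"
      AND del."VBELN" = "O2C_VBAP"."VBELN"
      AND del."POSNR" = "O2C_VBAP"."POSNR"
);
```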
Do we have an update/insert indicator in the trigger table to distinguish between new and updated entries? | Yes, there is a column CL_CHANGE_TYPE with the values I (insert) and U (update). |
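For example, a transformation could react differently to inserts and updates by filtering on that column; a hypothetical sketch with placeholder table names:

```sql
-- Hypothetical use of CL_CHANGE_TYPE: only delete existing rows for records
-- flagged as updates ('U'), then insert everything from the staging table.
DELETE FROM "O2C_VBAP"
WHERE EXISTS (
    SELECT 1
    FROM "VBAP_STAGING" stg
    WHERE stg."CL_CHANGE_TYPE" = 'U'
      AND stg."MANDT" = "O2C_VBAP"."MANDT"
      AND stg."VBELN" = "O2C_VBAP"."VBELN"
      AND stg."POSNR" = "O2C_VBAP"."POSNR"
);

INSERT INTO "O2C_VBAP" ("MANDT", "VBELN", "POSNR", "MATNR")
SELECT "MANDT", "VBELN", "POSNR", "MATNR"
FROM "VBAP_STAGING";
```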