
Step 3: Compatibility of Data Jobs and Real-Time Transformations

While the Replication Cockpit handles your fast, real-time updates, you still need standard Data Jobs for your "big picture" tasks—like full weekly reloads or currency conversions. If your Data Jobs aren't updated to match the new Real-Time structure, they will overwrite your delta changes or cause data inconsistencies in your Data Model.

How the two tools divide the pipeline:

Stage             Full Load    Delta Load
Extraction        Data Job     Replication Cockpit
Transformation    Data Job     Replication Cockpit
Data Model Load   Data Job     Data Job

The Replication Cockpit requires physical tables to perform delta operations (inserting and deleting rows). You must update your Data Job SQL to create tables instead of views.

  • Action: Change CREATE VIEW to CREATE TABLE.

  • Manual Task: You must manually DROP the existing view in your database once before the new table-based script can run.

The syntax shift:

-- Old Batch Style
CREATE VIEW "O2C_KNA1" AS ...

-- One-time manual step: drop the legacy view first
DROP VIEW "O2C_KNA1";

-- New Compatible Style (Required for RTT)
CREATE TABLE "O2C_KNA1" AS ...

To avoid data silos, your Data Jobs must write to the exact same dedicated activity tables used by the Replication Cockpit.

  • Action: Target trigger-specific tables (e.g., _CEL_O2C_VBAP_ACTIVITIES) rather than one giant, shared activity table.
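As a sketch, a full-load insert targeting the dedicated VBAP trigger table could look like the following. The column list and the case-key concatenation are illustrative assumptions, not the exact schema of your Data Model:

```sql
-- Sketch: the full load writes into the same dedicated activity table
-- that the Replication Cockpit's VBAP trigger targets.
-- Column names and the case-key construction are illustrative.
INSERT INTO "_CEL_O2C_VBAP_ACTIVITIES"
    ("_CASE_KEY", "ACTIVITY_EN", "EVENTTIME")
SELECT
    "MANDT" || "VBELN" || "POSNR"  AS "_CASE_KEY",
    'Create Sales Order Item'      AS "ACTIVITY_EN",
    "ERDAT"                        AS "EVENTTIME"
FROM "VBAP";
```

The key point is the target: one trigger-specific table per source table, so that the Data Job and the Replication Cockpit never write to different destinations for the same activities.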

Because activities are now split into multiple tables, your Data Job needs to "glue" them back together at the end of a full load. This ensures your Data Model sees every activity as a single, continuous list.

  • Action: Add a final step to your Data Job to refresh the Union View using UNION ALL.

-- Glue your split tables back together for the Data Model
CREATE OR REPLACE VIEW _CEL_O2C_ACTIVITIES AS (
    SELECT * FROM _CEL_O2C_VBAK_ACTIVITIES
    UNION ALL
    SELECT * FROM _CEL_O2C_VBAP_ACTIVITIES
    -- Add all other dedicated trigger tables here
);

Organize your work into three distinct job types to keep the system synchronized:

Job type          Frequency   Purpose
Full Load         Weekly      Wipes and rebuilds all tables to ensure 100% data integrity.
Daily Load        Daily       Handles complex logic not yet in real-time (e.g., Currency Conversion).
Data Model Load   Hourly      Refreshes the frontend to display the latest real-time delta changes.

Before moving your new configuration into production, verify these four critical areas:

  • Database Cleanup:

    • Drop Legacy Views: Did you manually run DROP VIEW for every table you converted to a physical table? (SQL will throw an "Object already exists" error if you try to create a table where a view still resides).

    • Clear Old Activity Tables: Have you deleted the old, centralized _CEL_O2C_ACTIVITIES table to ensure it doesn't conflict with your new Union View of the same name?
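The two cleanup steps above can be sketched as one-time SQL statements, using the table names from the examples in this guide (substitute your own converted views and activity tables):

```sql
-- One-time cleanup before the first table-based run:

-- 1. Drop each legacy view that is being converted to a physical table,
--    otherwise CREATE TABLE fails with "Object already exists".
DROP VIEW "O2C_KNA1";

-- 2. Remove the old centralized activity table so the new
--    Union View can be created under the same name.
DROP TABLE "_CEL_O2C_ACTIVITIES";
```

Run these once, manually, against the database; they do not belong in the recurring Data Job, which would otherwise drop freshly built objects on every execution.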

  • Configuration Alignment:

    • Match Table Names: Open your Replication Cockpit and your Data Job side-by-side. Do the target table names (e.g., _CEL_O2C_VBAP_ACTIVITIES) match character-for-character?

    • Verify Columns: If you added new columns to your Data Job logic, have you also added them to the corresponding Real-Time Transformation script in the Cockpit?

  • Schedule Synchronization:

    • Data Model Load Frequency: Is your Data Model set to reload frequently (e.g., every 30 mins)?

      Note

      Real-time transformations update the database instantly, but users won't see those changes until the Data Model reloads.

    • Full Load Blackout: Have you ensured that your Weekly Full Load doesn't overlap with heavy extraction periods to avoid system timeouts?

  • Dependency Validation:

    • Check Trigger Logic: For every transformation in the Replication Cockpit, is the Trigger Table correctly assigned?

    • Mandatory Dependencies: Are all tables used in INNER JOIN statements listed as dependencies? (Missing a dependency here is the #1 cause of "missing records" in RTT).

Tip

After completing this checklist, perform a manual delta extraction for one table. Check the execution logs in the Replication Cockpit to ensure the status shows "Success" and that the number of "Records Processed" matches the expected delta size.

If your first run doesn't go as planned, use this table to identify the cause.

  • "Object already exists"

    • Likely cause: A legacy view is still in the database.

    • Solution: Manually run DROP VIEW [TableName] before re-running the Data Job.

  • Missing records

    • Likely cause: A dependency was not defined for an INNER JOIN.

    • Solution: Add the joined table as a dependency in the Replication Cockpit.

  • "Table Not Found"

    • Likely cause: The target table wasn't initialized.

    • Solution: Run a one-time Data Job using CREATE TABLE ... LIKE to set up the schema.

  • Stale data in dashboards

    • Likely cause: The Data Model hasn't reloaded.

    • Solution: Check your Data Model load schedule; ensure it runs shortly after your extractions.

  • Duplicate activities

    • Likely cause: Incorrect delta approach (e.g., using INSERT instead of MERGE).

    • Solution: Update your SQL to include a WHERE NOT EXISTS clause or a MERGE statement.
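For the duplicate-activities case, a minimal sketch of a delta insert that skips rows already present might look like this. The staging table name and the three key columns are illustrative assumptions; use whatever uniquely identifies an activity in your model:

```sql
-- Sketch: delta insert that only adds activities not already present.
-- "_CEL_O2C_VBAP_ACTIVITIES_STAGING" and the key columns are illustrative.
INSERT INTO "_CEL_O2C_VBAP_ACTIVITIES"
SELECT s.*
FROM "_CEL_O2C_VBAP_ACTIVITIES_STAGING" s
WHERE NOT EXISTS (
    SELECT 1
    FROM "_CEL_O2C_VBAP_ACTIVITIES" t
    WHERE t."_CASE_KEY"   = s."_CASE_KEY"
      AND t."ACTIVITY_EN" = s."ACTIVITY_EN"
      AND t."EVENTTIME"   = s."EVENTTIME"
);
```

A MERGE statement keyed on the same columns achieves the same result and can additionally update existing rows; WHERE NOT EXISTS is the simpler option when deltas are insert-only.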
