Scheduling data jobs for object-centric process mining

When you've published your object-centric data model to the data pool, set up schedules to run your data pipeline regularly. You'll need schedules for these data jobs:

  • ocpm-extraction-job, which extracts data from your source system. Your extraction package’s data job might have a different name. If you have multiple source systems, schedule the extractions for all of them.

  • ocpm-data-job, which runs the transformations to create objects and events in your production environment.

  • Any other data jobs that you’re using in your production environment for object-centric process mining.

The data job test:ocpm-data-job runs the transformations to create objects and events in your development environment. You might prefer to run it manually as needed; the instructions are at Running transformations. Alternatively, you can set up a schedule for it, for example to run it overnight.

If you're using only the OCPM Data Pool, set up the schedules there. If you're using the feature that gives you object-centric data models in all your data pools, set up the schedules in each data pool where you're using them. In the data pool, go to Schedules > Add Schedule and follow the instructions in Scheduling the execution of data jobs.

You can set up schedules that run the transformations for one or more selected perspectives. Perspective scheduling works for both Celonis catalog perspectives and custom perspectives.

When you run the OCPM data job through a schedule where you've selected perspectives, we identify all the objects and events that are part of those perspectives, then run only the transformations needed to create those objects and events from your source system data. If you've modeled several processes in the same data pool, this is a way to limit the time taken and reduce the potential for incomplete runs.
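
To picture how that selection works, here's a minimal Python sketch of the idea. It isn't Celonis internals: the perspective names, object and event names, transformation file names, and the mapping structure are all invented for illustration.

  # Conceptual sketch (not Celonis code): a perspective-scoped run only needs
  # the transformations that produce the objects and events in the selected
  # perspectives. All names below are hypothetical.
  PERSPECTIVE_CONTENTS = {
      "AccountsPayable": {"PurchaseOrder", "Invoice", "PayInvoice"},
      "OrderManagement": {"SalesOrder", "Delivery", "ShipGoods"},
  }
  TRANSFORMATION_FOR = {
      "PurchaseOrder": "create_purchase_orders.sql",
      "Invoice": "create_invoices.sql",
      "PayInvoice": "create_pay_invoice_events.sql",
      "SalesOrder": "create_sales_orders.sql",
      "Delivery": "create_deliveries.sql",
      "ShipGoods": "create_ship_goods_events.sql",
  }

  def transformations_for(perspectives):
      """Collect only the transformations needed for the selected perspectives."""
      needed = set()
      for perspective in perspectives:
          needed |= PERSPECTIVE_CONTENTS[perspective]
      return {TRANSFORMATION_FOR[name] for name in needed}

  # Selecting only one perspective runs three transformations instead of all six.
  print(sorted(transformations_for(["AccountsPayable"])))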

When queuing of data jobs is enabled and an instance of a data job is still running when another instance of the same data job is scheduled to start, the second instance is queued until the first finishes. If queuing of data jobs isn't enabled, the second instance is skipped.
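
If it helps to picture those two behaviors, here's a small conceptual sketch in Python. It isn't Celonis code; the class and method names are hypothetical, and it only illustrates "queue versus skip" for a second instance of the same data job.

  from collections import deque

  class DataJobTrigger:
      """Illustrative only: what happens when a data job is triggered again
      while an instance of it is still running."""

      def __init__(self, queuing_enabled):
          self.queuing_enabled = queuing_enabled
          self.running = False
          self.queue = deque()

      def trigger(self, job_name):
          if not self.running:
              self.running = True
              return f"{job_name}: started"
          if self.queuing_enabled:
              self.queue.append(job_name)
              return f"{job_name}: queued until the running instance finishes"
          return f"{job_name}: skipped because an instance is already running"

  with_queuing = DataJobTrigger(queuing_enabled=True)
  print(with_queuing.trigger("ocpm-data-job"))    # started
  print(with_queuing.trigger("ocpm-data-job"))    # queued

  without_queuing = DataJobTrigger(queuing_enabled=False)
  print(without_queuing.trigger("ocpm-data-job"))  # started
  print(without_queuing.trigger("ocpm-data-job"))  # skipped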

Here's how to set up schedules for perspectives (a sketch of the resulting configuration follows these steps):

  1. In the data pool that contains your object-centric data model, add as many schedules as you want for ocpm-data-job or the development version test:ocpm-data-job.

  2. In each schedule, under Selected Data Jobs, go to the context menu (the three vertical dots) for your OCPM data job and select Configure Perspectives.

  3. Click Perspectives to display the list of perspectives that are enabled in your OCPM data job. Check the box for one or more perspectives that you want to load together and click Save.

  4. To verify the results, run the schedule by clicking the Execute button at the top right and confirming that you want to execute the OCPM data job.

  5. Currently, while the feature is in Limited Availability, you also need to enable queuing of data jobs manually for every schedule where you've selected perspectives. To do this, go to the context menu (the three vertical dots) for each schedule, select Execution Settings, set the slider to on, and click Save. When the feature reaches General Availability, queuing of data jobs will become the default behavior for all schedules in the OCPM Data Pool.
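
As an informal summary of what these steps produce, here's a hypothetical representation of one schedule as a Python dictionary. The field names and the cron expression are invented for clarity; they aren't a Celonis API, export format, or configuration file.

  # Illustrative only: one nightly schedule for the OCPM data job, scoped to a
  # single (hypothetical) perspective, with queuing enabled as in step 5.
  schedule = {
      "name": "Nightly Accounts Payable load",
      "cron": "0 2 * * *",  # example trigger: 02:00 every day
      "selected_data_jobs": [
          {
              "data_job": "ocpm-data-job",
              "configured_perspectives": ["AccountsPayable"],  # steps 2-3
          }
      ],
      "execution_settings": {
          "queuing_enabled": True,  # step 5, while the feature is in Limited Availability
      },
  }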

Tip

Queuing of data jobs lets Data Integration queue additional instances of the same data job instead of skipping them. It's available in every data pool, not just the OCPM Data Pool, so you can activate it for other data jobs too, including case-centric data jobs. By default, queuing of data jobs is off in data pools that don't have object-centric data models active, and on in data pools that do.