
Prediction Scheduler



Creating a Prediction Scheduler

Use this procedure to create a prediction scheduler for a model within a project.

  1. Hover over the menu icon and select Projects. The Projects dashboard is displayed.

  2. Select the project for which you want to create a prediction scheduler. You can create prediction schedulers for different models in a project. The Canvas page is displayed.

  3. Click the Predictions tab on the project navigation menu on the left to open the Predictions page.

  4. Do one of the following:

    • Click the plus icon on the top right corner of the page.

    • Click the +Prediction Scheduler option. This option is visible only when no schedulers have been created in the project yet.

The following page is displayed where you can create a prediction scheduler to run this data pipeline at the set time interval.

  5. Select the model that you want to use to generate predictions at the scheduled frequency.

  6. Click Proceed.

  7. Select the scheduler frequency. Possible values:

    • Daily - Displays the Hrs and Min drop-downs to select the time at which the job should be triggered.

    • Weekly - Displays the days of the week and the time at which the scheduler should run.

    • Cron - Lets you enter the schedule in the Unix cron format (see the example at the end of this procedure).

  8. View the data pipeline on the canvas.

  9. Click Save to create the prediction scheduler. This also enables the +Destination option to configure the data connector to which you can publish the generated output datasets or the input dataset.

The project variables button is visible only if variables are defined at the project level. After creating the scheduler, you can change the project variable values.

  10. Click + Destination. This opens the Destinations side panel.

  11. Click + Destination.

  12. Select the dataset that you want to add to the destination. If the dataset list is long, use the search option to find the dataset you want.

  13. Select the destination from the drop-down list. Only the external data sources configured under this tenant are listed, excluding the Snowflake and Fivetran connectors.

    Info: When you select a SQL connector to synchronize or copy the output dataset generated after running the project, a table name column is displayed. Here, you can provide the table name and select either "Append" or "Replace". The "Append" option appends the dataset to the existing table, provided both share the same schema; the "Replace" option replaces the existing dataset with the new one (see the sketch at the end of this procedure).

    If you choose MongoDB as the data connector, provide the database name and collection. If the provided collection name already exists, the new dataset is appended to the existing collection.

  14. Provide the destination folder and destination file name. Each time the job runs at the scheduled time, the output file is saved in this folder with the specified file name.

  15. Click Save to save this destination. This button is enabled only after you fill in all the required destination fields.

    Note

    • You can store files in multiple destinations. To add another destination, click + DESTINATION.

    • If you no longer want to save the output to a configured destination, click the delete icon to remove that destination.

  16. Close the window after configuring the destination for the job. Once the destination details are set, the destination node appears on the scheduler canvas.

  17. Click GLOBAL VARIABLES to change the configured parameters for this job.

Note: The GLOBAL VARIABLES button is enabled only when global variables are declared at the project level. To configure global variables, refer to configuring global variables at the project level.

  18. Change the value for the key. Note that you cannot change the key itself.
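
The Cron option expects a standard five-field Unix cron expression: minute, hour, day of month, month, and day of week. The snippet below is only a sketch of how such an expression is read; it assumes the third-party croniter Python package, which is not part of RapidCanvas, and simply previews the next few trigger times for a given expression.

    # A minimal sketch, assuming the croniter package (pip install croniter).
    # It only previews when a cron expression would fire; it is not a RapidCanvas API.
    from datetime import datetime

    from croniter import croniter

    # Five space-separated fields: minute hour day-of-month month day-of-week.
    # "30 6 * * 1" fires at 06:30 every Monday; "0 */4 * * *" fires every 4 hours.
    expression = "30 6 * * 1"

    it = croniter(expression, datetime(2025, 1, 1))
    for _ in range(3):
        # Prints the next three times a scheduler with this expression would trigger.
        print(it.get_next(datetime))

For example, a Daily schedule at 06:30 corresponds to the cron expression "30 6 * * *", and a Weekly schedule on Mondays at 06:30 corresponds to "30 6 * * 1".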

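The following sketch illustrates the "Append" and "Replace" semantics described in the Info note above, along with the MongoDB append behavior. It uses pandas, SQLAlchemy, and pymongo with placeholder connection strings, table, database, and collection names; it is only an analogy for how the destination write modes behave, not the platform's implementation.

    # Illustration of destination write modes, assuming pandas, SQLAlchemy, and
    # pymongo are installed; connection details are placeholders, not RapidCanvas settings.
    import pandas as pd
    import sqlalchemy
    from pymongo import MongoClient

    predictions = pd.DataFrame({"id": [1, 2], "score": [0.87, 0.42]})

    # SQL destination: "Append" adds rows to the existing table (schemas must match),
    # while "Replace" drops the table and recreates it with the new dataset.
    engine = sqlalchemy.create_engine("postgresql://user:password@host:5432/analytics")
    predictions.to_sql("daily_predictions", engine, if_exists="append", index=False)
    # predictions.to_sql("daily_predictions", engine, if_exists="replace", index=False)

    # MongoDB destination: if the collection already exists, the new records are
    # appended to it rather than overwriting the existing documents.
    client = MongoClient("mongodb://host:27017")
    client["analytics"]["daily_predictions"].insert_many(predictions.to_dict("records"))
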
Running the Scheduler Manually

Use this procedure to manually run or re-run a prediction scheduler. Scheduled runs occur automatically based on the configured recurrence.

To manually run a prediction scheduler:

  1. Select the project in which you want to run the job.

  2. Click the Predictions tab to view the list of manual predictions and prediction schedulers for this project. The predictions list page is only visible if predictions have been created for the selected project.

  3. Click the scheduler name link you want to run manually. This opens the specific prediction scheduler's page.

  4. Click Run to initiate the job manually. This opens the Manual Run Configuration side panel.

  5. Enter a run name and click Run. Once the prediction scheduler run starts, its status changes from Created to Entity Loading and then to Running. When the run completes successfully, you can view the output on the Run History page.

To re-run a job, click the ellipsis icon in the Run Name column on the Run History page and select RE-RUN.

To delete a specific run, select DELETE.

Managing Prediction Schedulers in a Project

Use this procedure to manage all the prediction schedulers in a project.

  1. Hover over the menu icon and select Projects. The Projects dashboard is displayed.

  2. Select the project for which you want to schedule or create a job. You can create jobs for different scenarios in a project.

  3. Click the Predictions tab on the left navigation menu of the project to open the schedulers page and view the list of manual and automatic schedulers you have already created.

    Note: If there are multiple schedulers, you can use the search option to find the scheduler you want.

    You can also create a new scheduler using the plus option.

  4. Click the Prediction Scheduler name that you want to edit. This redirects you to the Prediction Scheduler page, where you can edit the prediction scheduler details.

  5. Modify the required details.

  6. Click Save to apply the changes.

    On this Prediction scheduler page, you can also:

    • Run this job manually by clicking the Run button.

    • View the run history using the Run History icon. This shows the history of all prediction scheduler runs to date, up to 300 records from the last 30 days.

    • Pause a running job using the Pause icon. Click the same icon again to resume the paused prediction scheduler.

    • Click the Actions drop-down and select Delete to permanently delete this prediction scheduler.

    • Click the Timeout 1hr option in the Actions drop-down to change the timeout duration. By default, the timeout is set to 1 hour; the prediction scheduler run is terminated if it exceeds this duration.

Viewing the Prediction Schedulers in a Project

Use this procedure to view all the prediction schedulers in a project and see the output generated after every job run.

  1. Hover over the menu icon and select Projects. The Projects dashboard is displayed.

  2. Select the project for which you want to schedule or create a job. You can create jobs for different scenarios in a project.

  3. Click the Predictions tab on the left navigation menu of the project to open the Predictions page, then go to the Prediction Scheduler tab to view the list of prediction schedulers created.

  4. Review this information:

Scheduler Name: The name of the job.

Input Model: The selected model used to create the prediction scheduler.

Status: The status of this scheduler. Possible values:

  • Scheduler Active - The scheduler is active. This is the default state.

  • Scheduler Inactive - Indicates that the scheduler has been paused.

Last Run By: Indicates whether a business user or the scheduler last ran it.

Last Run: The date and time at which the scheduler was last run.

Last 5 Runs: Indicates the status of the last five scheduler runs. Possible values:

  • Failed - The run failed.

  • Success - The run completed successfully.

  • Created - The run has been created.

Last Run Prediction Output: Click to view the output generated from the last scheduler run. This option is available only after the scheduler has been executed; until then, it remains disabled.

Last Run Log: Click to view the logs. You can check the logs to understand the errors in runs that have failed.

Last Run Canvas: Click to view the canvas page. This page is view-only; use it to see the failed and successful blocks in the data pipeline.

Last Run Project Variables: Click to view the project variables used in the last run of this job.

You can click the table settings icon to reorder the columns and to select or deselect the columns you want to view.

  5. Click the ellipsis icon next to the scheduler name to access the following options:

  • Edit: Modify the scheduler details.

  • Pause: Temporarily halt the current scheduler run.

  • Run: Manually execute the scheduler outside the automatic schedule.

  • Run History: View past runs and track executed prediction schedulers.

  • Delete: Permanently remove the scheduler from the project.

Publishing the Updated Data Pipeline to Selected Prediction Schedulers from the Canvas

Use this procedure to republish the data pipeline to prediction schedulers. When you update a dataset, delete a recipe, or add a new recipe to the data pipeline, you can republish the new flow using the Publish to Prediction Schedulers option on the canvas. This updates the canvas of the selected schedulers.

To publish the changes made to the data pipeline to all or specific prediction scheduler(s) in a project:

  1. Select the project to navigate to the canvas view page.

  2. Click the Actions drop-down and select Publish to Prediction Schedulers on the canvas. This displays the Republish Model Flow to Prediction Scheduler dialog.

    The dialog lists the prediction schedulers to which you can publish the latest or updated data pipeline.

  3. Select the checkboxes corresponding to the prediction schedulers to which you want to publish the latest canvas. This enables the Yes, Republish button.

  4. Click Yes, Republish to republish or update the latest data pipeline to the selected prediction schedulers.

From the next scheduled run onward, the prediction scheduler runs on the new modeling pipeline.

Fetching the Latest Data Pipeline to a Specific Prediction Scheduler

Use this procedure to fetch the changes made to the data pipeline on the project canvas into the data pipeline of a specific prediction scheduler.

To publish the changes made to the data pipeline on the canvas to a specific scheduler from the prediction scheduler page:

  1. Select the project to navigate to the canvas view page.

  2. Select Predictions from the project level navigation. This takes you to the Schedulers page where you can view the list of schedulers created for this project.

  3. Select the prediction scheduler to which you want to publish the changes made to the data pipeline. This takes you to the selected prediction scheduler page.

  4. Click the Republish button in the canvas section to incorporate into this pipeline all the changes made to the canvas at the project level.

    The Republish Model Flow to Prediction Scheduler window appears.

  5. Click Yes, Republish to republish the project canvas to the scheduler.

Comparing the Canvas of the Prediction Scheduler with the Current Canvas of the Project

Use this procedure to compare the current canvas of the project and the canvas of the prediction scheduler side by side to track changes.

To compare the canvas of the scheduler with the current canvas of the project:

  1. Select the project to navigate to the canvas view page.

  2. Select Predictions from the project level navigation. This takes you to the prediction schedulers page where you can view the list of schedulers created for this project.

  3. Select the scheduler that you want to compare with the current canvas of the project. This opens the scheduler page.

  4. Click Compare to view the canvas of this prediction scheduler and the current canvas of the project side by side and spot the differences.

If you notice that the canvas of the prediction scheduler is not up to date, you can click Republish to fetch the latest canvas of the project.