AI-assisted recipe


Last updated 1 month ago

Overview

When no standard template is available for a data transformation, you can use an AI-assisted recipe. This functionality integrates AI capabilities, allowing business users to prompt the AI to generate code based on their input. Once the code is generated, it can be added to the recipe and run through the data pipeline, displaying the output as datasets or charts.

You can also ask questions about the dataset directly in the chat and receive accurate answers from the AI. After receiving the AI response, you can easily express your satisfaction using the thumbs-up or thumbs-down options.

Ask AI

Click on the dataset block in the canvas and select the AI-assisted option from the Add recipes drop-down. This opens the code editor, where you can prompt the AI to generate a code snippet based on your input.

Select the Dataset

You need to select the dataset on which you want to apply the data transformation and generate code based on the given prompt. You also have the option to choose the output type: dataset, chart, text, or model. If you select dataset as the output, up to five datasets can be generated and added to the data pipeline, depending on the given prompt.
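When you request multiple datasets, the generated code returns several outputs from a single transform function (the transform pattern appears in the Code samples later on this page). A minimal sketch, where the input name, the Survived column, and the output names are all illustrative assumptions:

```python
import pandas as pd

def transform(entities, context):
    # Illustrative sketch: split one input dataset into two outputs.
    # 'titanic', 'Survived', and the output names are hypothetical.
    df = entities['titanic']

    survivors = df[df['Survived'] == 1]
    others = df[df['Survived'] == 0]

    # Each key in the returned dict becomes a dataset on the canvas.
    return {
        'output_survivors': survivors,
        'output_others': others,
    }
```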

Add the Generated Code to Recipes in the Data Pipeline

Use the +Add to recipe option in the AI-assisted code editor to include the AI-generated code in the data pipeline. This option becomes available only after running the text prompt.

Save and Run the Code Recipe in the Flow

After saving the code, click on the run option to execute the AI-assisted (code) recipe and generate the output, which can be in the form of a dataset or a chart. You can continue building custom templates or code recipes using the Ask AI feature.

Adding an AI-assisted recipe

Use this procedure to add an AI-assisted recipe to the data pipeline or ML flow using the integrated AI tool. The recipes can be related to data pre-processing or building a model.

Note: If the application is installed in the customer’s Virtual Private Cloud (VPC) environment, features that rely on external API calls will be restricted. If a user tries to access these features, a notification will inform them that the admin has disabled these features.

To add and run an AI-assisted recipe:

  1. Click the dataset block on the canvas to open the side sheet.

Select the recipe

This opens the Ask AI tab where you can type the text prompt in the provided query box to generate the code recipe.

If you want to view the column names and data type of each column in the uploaded dataset, you can expand the datasets in the Inputs section on the left.

Note:

  • Use the delete icon to delete the uploaded dataset.

  • Use the plus button to add multiple datasets to the code recipe. The drop-down lists only the datasets that you have added to your Project canvas. If there is only one dataset on the canvas, this button remains disabled.

Select the dataset to run the recipe

  1. By default, the input dataset is selected from the list. Here, the input dataset is Titanic.

Note: You can select a maximum of four datasets.

Enter the text prompt

  1. Enter the query in the provided query box. In this example, the query asks the AI to concatenate two columns in the dataset and generate the code for it: "Concatenate the First_name and Last_name columns and generate a new column with name".

You can use @ in the query box to get the list of column names available in the selected dataset.

Note: (Optional) From the ellipses icon, select Generate Query Limit to run the query only on the selected number of rows in the dataset. You have three options to select from:

  • Full data
  • 100k rows
  • 1 million rows

Select the output type you want to generate

  1. Type a slash (/) in the query box to choose the type of output you want the AI to generate for your query, such as a dataset, chart, model, or text.

You can also use the prompt suggestions option. Select this option and press Enter to view prompt suggestions for the selected dataset. Besides the suggested prompts, you can also ask a query to generate additional prompt suggestions. A run button appears next to each prompt, so you can run it directly from the chat window.

If you do not select the output type, the platform will auto-detect the output type based on the entered prompt.

Generate the code

The AI consumes the text prompt and generates the related code for concatenating the two columns. You can see the output generated by the AI along with the dataset size (total columns and rows).

The output preview shows up to 100 records of the output dataset. If you want to see fewer records, select the number of records you want to view from the drop-down list. You can also view the size of the dataset.

Info:

  • You can pin the generated dataset to use it as a source for running the subsequent set of recipes. To unpin, click the unpin icon next to the Test button to unpin all datasets.

  • For the text responses, you can use the Copy answer option to copy the responses.

View the code

Note:

  • You can use the thumbs up and thumbs down options to indicate your satisfaction with the output generated by AskAI for a given query. When you select the thumbs down, you can report whether the response generated by AskAI was incorrect, an error, gibberish, an incomplete response, or if there was another issue.

  • You can generate up to five datasets based on your prompt in AskAI. Additionally, you can view the data type of each column within these intermediate datasets, helping you better understand the structure of your data.

  • You can now view explanations for each line or code block generated by Ask AI, helping you understand the logic and functionality behind the code more easily.
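For the concatenation prompt used in this example, the code that Ask AI generates typically follows the same transform pattern shown in the Code samples later on this page. A minimal sketch, assuming the hypothetical First_name and Last_name columns and an illustrative input name:

```python
import pandas as pd

def transform(entities, context):
    # Read the input dataset selected in Ask AI ('titanic' is illustrative)
    input_df = entities['titanic']

    # Concatenate the two columns into a new 'name' column
    output_df = input_df.copy()
    output_df['name'] = output_df['First_name'] + ' ' + output_df['Last_name']

    return {'output_1': output_df}
```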

Add the generated code to recipe

  1. Click + Recipe to add the template for concatenating the two columns to the recipe in the data pipeline. If you want to remove the recipe, click - Recipe to remove it from the pipeline.

Note: You can use the dataset generated by this recipe as an input for the next prompt using the Query icon.

  1. Provide the custom name for the recipe.

  2. Click Add to Recipe. After adding, a green icon appears next to the response, indicating that this recipe has been added to the data pipeline.

Test the recipe on the dataset

  1. Click Test and select a test option to test the recipe on the full dataset, 100k rows, or 1 million rows before saving and running this recipe in the data pipeline. Possible options:

    • Test (full data)

    • Test with 100k rows

    • Test with 1 Million rows

Note: You can use the Stop button to halt the recipe execution at any point during its test run. Additionally, you can view the test outputs for the dataset, artifact, and model.

You can see the test output dataset in a new tab as shown in the screenshot below:

You can also view logs while testing the recipe by clicking the Log for Test option. This provides access to detailed records and allows you to either download the logs as a text file or open them in a new browser tab for a more detailed view.

Save and run the recipe in the data pipeline

  1. Click the Run button to run the recipe and view the output on the canvas. If needed, you can stop the recipe at any time by clicking the Stop button.

Keyboard Shortcuts:

  • Ctrl+T (or Cmd+T on Mac) → Test a recipe

  • Ctrl+R (or Cmd+R on Mac) → Run a recipe

  • Ctrl+S (or Cmd+S on Mac) → Save code changes

In this example, we generate a new column, name, by concatenating the First_name and Last_name columns.

Check the canvas

  1. Go back to the canvas to view the output.

  1. Click the dataset block on the canvas. This opens the pull-out window. The output dataset includes an extra column, name, resulting from the concatenation.

  2. Click View to see a dataset generated after concatenating two columns.

Viewing Canvas Nodes in AI-Assisted Recipe

Use this procedure to view and access canvas nodes from the AI-Assisted Recipe.

  1. Click on the dataset block and select AI-Assisted Recipe, or choose an existing AI-Assisted Recipe. This will open the Ask AI tab.

  2. On the left, click to open the input side panel, then click Canvas Nodes to expand and view a list of entities or components on the canvas—such as datasets, recipes, artifacts, and models. You can switch between different project components from here.

Note: Use the search option to find the entity you need.

  1. Click on the entity you wish to view on the canvas, and it will open in a new tab.

  2. Click View Code on the AI-Assisted Recipe to open the code generated by Ask AI in a sub-tab under the Code tab.

  3. On the Current tab, you can write your own code based on business requirements, referencing the code generated by Ask AI. The auto-suggestion feature assists you while coding by offering suggestions for file names, artifacts, and methods, helping to speed up the process and reduce errors.

Editing the Chart Output

After generating the chart, you can use the Edit option to customize it. This allows you to modify elements such as the chart's colors, type, and title. Follow the steps below to edit your chart:

  1. Enter a prompt to generate the chart. You can use the slash option to select the output type, then type your query.

  2. Click the Edit option on the chart to open the Chart Edit window.

  3. Enter a prompt describing how you want the chart to be modified.

    • For example, you can enter a query to make the chart colorful and change its title to Number of Passengers by Survival Status.

    You can see the chart color and title being updated in the screenshot below.

  4. Click Save to apply the changes to the chart.

Duplicating AI-Assisted Chat Conversations from the Original Project

When you duplicate a project, not only are DataApps, Prediction Services, and Environments copied, but you can also duplicate the AI-Assisted Chat Conversations from the original project.

However, the duplicated chat will need to be re-run or refreshed to generate updated results in the new project. To do so, navigate to the AI-Assisted Recipe Node within the data pipeline of the copied project.

Use this procedure to refresh the chat conversations in the duplicated projects:

  1. Click on the duplicated project from the Projects Dashboard to navigate to the canvas.

  2. Select the AI-Assisted Recipe Node to open the AskAI tab. If there were chat conversations in the original project, they will appear in the chat window.

Note: You must upload the data file in the dataset node to re-run the queries.

  1. Click Refresh Chat to re-run the previous queries and generate updated results. A confirmation message will appear.

  2. Click Yes to execute the queries.

Other recipe categories in AI-assisted recipes

In AI-assisted recipes, you can also use code and snippets (default) to perform data transformations.

Code

You can use the Code option in the AI-assisted recipe to write Python code and define logic for data transformation in the provided code editor. Subsequently, run this code recipe in the pipeline to transform the data and produce a dataset or a chart output.

Before running the code recipe in the data pipeline or flow, you can use the Test option to test the code and view the output. If the output is what you are expecting, you can run the custom code recipe in your flow.

Writing logic for the template from scratch

Use this procedure to write the data transformation template from scratch.

  1. Click the dataset block on the canvas to open the pull-out window.

  2. Click the Code tab.

  1. Write the logic for the code recipe in the provided coding space using Python.

Info: Sample code is also available for datasets and charts. To access it, go to the Code tab and click the Sample syntax icon next to the Test button. This opens the Sample Syntax dialog. Click the Dataset & Chart tab and copy the sample Dataset & Chart syntax.

  1. Copy the code recipe generated by AskAI into the Code tab and click Edit in Notebook to edit the code in the Jupyter Notebook editor.

  2. Click Save Back To Recipe.

  3. Click Test to test the code you have written.

  4. Click Save Code and then click the Run button to run this transformation in the data pipeline.
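As a starting point, a custom code recipe uses the same transform entry point as the samples later in this section. A minimal skeleton with illustrative input, output, and logic (the dropna call is just a placeholder for your own transformation):

```python
import pandas as pd

def transform(entities, context):
    # 'output_1' is the name of an upstream dataset; adjust it to your flow.
    input_df = entities['output_1']

    # Replace this line with your own transformation logic.
    output_df = input_df.dropna()

    # The returned key names the output dataset on the canvas.
    return {'output_clean': output_df}
```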

Writing logic to generate Artifact

Use this procedure to write the logic to generate an artifact from the Code tab and add the generated artifact to the data pipeline.

  1. Click the dataset block on the canvas to open the pull-out window.

  2. Click the Code tab and provide the below code to generate the artifact.

Info: Sample code is also available for artifacts. To access it, go to the Code tab and click the Sample syntax icon next to the Test button. This opens the Sample Syntax dialog. Click the Artifacts tab and copy the sample Artifacts syntax.

def transform(entities, context):
    from utils.notebookhelpers.helpers import Helpers
    from utils.dtos.templateOutput import ArtifactOutput

    input_df_1 = entities['output_1'] # this is for reading input dataset

    import pandas as pd
    import numpy as np
    output_df_1 = input_df_1.drop(['Age'], axis=1)

    artifactsDir = Helpers.getOrCreateArtifactsDir(context, artifactsId = "test-artifact")
    output_df_1.head(10).to_csv(artifactsDir + '/test.csv')

    return {
        'output_2': output_df_1, # output_2 is the name of the output to be generated. Change the name as per your requirements.
        "test-artifact": ArtifactOutput()
        }

Important: You can test the artifact code by using the Test option.

  1. Click Save and then click the Run icon to add the generated artifact to the data pipeline.

Writing logic to generate a model

Use this procedure to write the logic to generate an ML model from the Code tab. You can later use this model on similar datasets to make predictions.

  1. Click the dataset block on the canvas to open the pull-out window.

  2. Click the Code tab and provide the below code to generate the model.

Info: Sample code is also available for models. To access it, go to the Code tab and click the Sample syntax icon next to the Test button. This opens the Sample Syntax dialog. Click the Model tab and copy the sample Model syntax.

def transform(entities, context):
    from utils.notebookhelpers.helpers import Helpers
    from utils.dtos.templateOutput import ModelOutput
    from utils.dtos.rc_ml_model import RCMLModel


    input_df_1 = entities['output_3'] # this is for reading input dataset

    value_file = Helpers.getChildDir(context) + "/value.txt"
    with open(value_file, "w") as f:
        f.write("3")

    class TestModel(RCMLModel):
        def load(self, artifacts):
            value_file = artifacts['value']
            with open(value_file, "r") as f:
                self.value = int(f.read())
        
        def predict(self, model_input):
            x = float(model_input.values[0][0])
            output = x + self.value
            return output


    return {
        'test-model-code': ModelOutput(TestModel, {"value": value_file})
        }

  1. Click Save and then click the Run icon to generate the model, which is trained with the dataset in the pipeline.
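To make the load/predict contract above concrete, the same logic can be exercised outside the platform. This is a standalone re-implementation for illustration only; it drops the RCMLModel base class and the Helpers utilities, which are only available on the platform:

```python
import os
import tempfile
import pandas as pd

class LocalTestModel:
    # Mirrors the TestModel class above, without the RCMLModel base class.
    def load(self, artifacts):
        with open(artifacts['value'], 'r') as f:
            self.value = int(f.read())

    def predict(self, model_input):
        # model_input is expected to be a DataFrame-like object
        x = float(model_input.values[0][0])
        return x + self.value

# Usage: write the value file, load it, and predict on a single row
value_file = os.path.join(tempfile.mkdtemp(), 'value.txt')
with open(value_file, 'w') as f:
    f.write('3')

model = LocalTestModel()
model.load({'value': value_file})
print(model.predict(pd.DataFrame([[4.0]])))  # 4.0 + the stored value 3
```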

Writing logic to add global variables

Use this procedure to add global variables to store artifacts and models built on the source dataset in a project.

  1. Click the dataset block on the canvas to open the pull-out window.

  2. Click the Code tab and provide the below code to add global variables.

def transform(entities, context):
    from utils.notebookhelpers.helpers import Helpers

    input_df_1 = entities['titanic'] # this is for reading input dataset
    

    import pandas as pd
    import numpy as np
    output_df_1 = input_df_1.drop(columns=['Sex'])
    
    print("value of global variable:")
    print(Helpers.get_global_var(context, "test-var"))

    return {
        'output_1': output_df_1, # output_1 is the name of the output to be generated. Change the name as per your requirements.
        }

  1. Click Save and then click the Run icon to add the global variables.

Click the plus icon and select the AI-assisted recipe in the pull-out window.

Click the generate icon to generate the code. The generate button is enabled only after you select the dataset and provide the query.

Click the View Code icon to view the code generated by the AI for the given prompt. Viewing the code is optional. You can return to the output, whether a dataset or a chart, by closing the View Code window using the Close option.

Note: To check recipe logs, click the recipe block on the canvas and, from the side panel, click the logs icon. This displays a detailed record of both successful and failed recipe executions. For additional options, use the kebab menu to either export the logs as a text file or open them in a new browser tab for a more comprehensive view.

If you want to add a code recipe to the flow, see .

Click the plus icon and select the AI-assisted Recipe in the pull-out window. This takes you to the Ask AI tab.
