
Importing dataset(s) from the local system


To import a file from your local system to the canvas for performing predictions and generating a modelling pipeline, follow the standard upload procedure. You can upload up to 25 files at a time, with a maximum file size of 5GB each. If you have multiple files grouped in a worksheet, you can upload the worksheet to automatically add each file as a separate dataset on the canvas, making it easier to manage and work with multiple data sources in one go.

You can upload data using the following methods:

  • Dataset – Import a dataset or file onto the canvas to perform data transformations, either from the local system or through data connectors. Refer to Importing a dataset from your local system or import through the available connectors.

  • Text File – Import a text file onto the canvas to apply transformations. Refer to Importing Text Files from the Local System.

  • Artifact – Import artifacts for transformation. You can either use existing artifacts or upload new ones from your local system. Refer to the Artifacts section.

  • Code Recipe – Fetch a dataset by writing code within a code recipe. Refer to Adding a code recipe to fetch data.

To import the file from the local system:

  1. Click the project to which you want to upload the file. The Canvas page is displayed.

  2. Do one of the following:

    • Click the +Dataset option on the canvas. Note that this option is displayed only when there are no datasets uploaded onto the canvas.

    • Click the plus icon and select Dataset to navigate to the Create New Dataset window.

  3. By default, the project name is populated in the Project field.

  4. Select the source from where you want to upload. By default, File Import is selected.

    After establishing the connection and importing the files, the imported files are populated in this drop-down list.

  5. Select the Mode to upload the file. Possible options:

    • Single file import - Use this option to import only a single file onto the canvas.

    • Merge - Use this option to merge multiple files into one file. Ensure that the schema is the same in all the files.

    • Segregate - Use this option to upload multiple files together onto the canvas as separate files.

  6. Select Single file upload.

  7. Click Import Files from Local to browse and upload the file from your local system.

You can upload multiple files using the Import Files from Local option. When you upload more than one file, the Mode will automatically switch to Segregate.

  8. Click Import. Once the file is imported, you can view the file name and file size.

You can perform these actions:

  • To delete the uploaded file, click the delete icon next to the file name.

  • To rename the dataset, click the edit icon in the Dataset Name field.

  9. Click File Configuration to expand and view the file configuration fields, such as Separator and Encoding.

    The platform auto-detects the Separator and Encoding when you upload the file, including when the file has a single column containing all the column values separated by a specific separator.

Note: The separator option allows you to split all the values separated by a separator into different columns.

  10. Select the separator from the drop-down list if the platform failed to auto-detect it. Possible values:

    • Comma
    • Tab
    • Pipe
    • Colon
    • Semicolon
    • Dot
    • Space

  11. Select the encoding option if it is not auto-detected by the platform.

  12. Click Apply to apply the separator and encoding options you have selected. Note that these options are only available for CSV files.

You can now see the data in the file by clicking the Open Dataset option. This takes you to the View data page.

  13. View the sample data and the data type of each column in the Sample Data section. To change the data type of a specific column, click the data type drop-down under the column name and select the new data type.

  14. Click Done. Once the dataset is added, you are redirected to the Canvas view page where you can see the uploaded dataset node.
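Separator and encoding detection of the kind described in steps 9–12 is a standard sniffing problem. The sketch below is purely illustrative (the function name, encoding list, and detection order are assumptions, not the platform's implementation); it uses Python's standard-library `csv.Sniffer` restricted to the separators offered in the drop-down:

```python
import csv

# Candidate encodings to try, roughly in order of likelihood (assumption).
ENCODINGS = ["utf-8", "utf-16", "latin-1"]
# The separators offered in the platform's drop-down list.
SEPARATORS = ",\t|:;. "

def detect_separator_and_encoding(raw: bytes):
    """Guess the separator and encoding of a delimited file's raw bytes."""
    for encoding in ENCODINGS:
        try:
            text = raw.decode(encoding)
        except UnicodeDecodeError:
            continue  # wrong encoding; try the next one
        try:
            dialect = csv.Sniffer().sniff(text, delimiters=SEPARATORS)
            return dialect.delimiter, encoding
        except csv.Error:
            continue  # decoded, but no consistent separator found
    return None, None  # detection failed; the user must pick manually

sample = "name|age|city\nAda|36|London\nAlan|41|Manchester\n".encode("utf-8")
print(detect_separator_and_encoding(sample))  # ('|', 'utf-8')
```

When detection fails (the `(None, None)` branch), the UI falls back to manual selection, which is exactly what steps 10 and 11 cover.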

Adding a Worksheet with Multiple Files to the Canvas

If you have an .xlsx worksheet containing multiple files, you can import all the files within the sheet onto the canvas in one step. Follow the steps below to upload and manage these files:

  1. Navigate to the Project: Click the project to which you want to upload the worksheet. The Canvas page will be displayed.

  2. Start the Dataset Upload:

    • If no datasets have been uploaded yet, click the +Dataset option on the canvas.

    • Alternatively, click the plus (+) icon and select Dataset to open the dataset creation window.

  3. Import the Worksheet:

    • On the Create New Dataset page, click Import Files From Local to upload your worksheet.

    • The file names of all the sheets contained within the uploaded worksheet will be listed.

    • You can choose to remove any files you do not want to upload by clicking the delete icon next to each.

  4. Preview the Files: Click Import to proceed. You will be able to view a sample of each file's data. For worksheets with multiple files, use the navigation arrows to scroll through the previews.

  5. Complete the Upload: Click View in Canvas to display all the uploaded datasets on the canvas.
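Conceptually, a worksheet import creates one dataset per sheet. The sketch below mimics that split locally; it is a dependency-free illustration (the `segregate_sheets` helper is hypothetical), with sheet contents represented as plain lists of rows, such as `pandas.read_excel(path, sheet_name=None)` would give you for a real .xlsx file:

```python
import csv
import tempfile
from pathlib import Path

def segregate_sheets(sheets: dict, out_dir: Path) -> list:
    """Write each sheet's rows to its own CSV file: one dataset per sheet."""
    written = []
    for name, rows in sheets.items():
        path = out_dir / f"{name}.csv"
        with path.open("w", newline="") as fh:
            csv.writer(fh).writerows(rows)
        written.append(path)
    return written

# Two "sheets" from one workbook become two separate datasets.
sheets = {
    "sales":  [["order_id", "amount"], [1, 250], [2, 125]],
    "region": [["code", "name"], ["EU", "Europe"]],
}
out_dir = Path(tempfile.mkdtemp())
for p in segregate_sheets(sheets, out_dir):
    print(p.name)  # sales.csv, then region.csv
```

Each resulting file corresponds to one dataset node on the canvas, which is why removing unwanted sheets before clicking Import (step 3 above) keeps the canvas uncluttered.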

Appending a File to the Source Dataset

You can append a file to an existing dataset, provided both datasets share the same schema. However, keep in mind that:

  • When a file is appended, all recipes previously run with the source dataset will become invalid and move to an unbuilt state. You must re-run the flow after appending the dataset.

  • Any segments created will be deleted, and custom scenarios will use the entire dataset instead of the segmented data.

  • You can only append a file to the source dataset.
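The same-schema requirement can be verified before you upload. As a minimal illustration (not a platform feature), the check amounts to comparing header rows, sketched here with Python's standard `csv` module:

```python
import csv
import io

def same_schema(source_csv: str, new_csv: str) -> bool:
    """Return True when both CSV texts share an identical header row."""
    def header(text):
        return next(csv.reader(io.StringIO(text)))
    return header(source_csv) == header(new_csv)

source     = "id,name,score\n1,Ada,0.9\n"
matching   = "id,name,score\n2,Alan,0.8\n"
mismatched = "id,score\n3,0.7\n"

print(same_schema(source, matching))    # True
print(same_schema(source, mismatched))  # False
```

Running a quick check like this on your local file avoids a failed append and the resulting re-run of the flow.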

To append a file to the source dataset:

  1. Select a project to open the canvas.

  2. Do one of the following to add a file to the source dataset:

    • Right-click the source dataset block and select Add File.

    • Click the dataset block to open the side sheet, then click the plus (+) button and select File.

    • Click the dataset block, then click Preview to navigate to the View Data page. Click the plus (+) button, then select File.

  3. Review the following details, which are preselected and cannot be changed, on the Append file page:

    • Project – The current project is selected by default.

    • Source – The original source of the dataset is preselected.

    • Mode – The append mode is selected by default.

  4. Click Import Files From Local to browse and select the file to append. The dataset name is set by default, but you can rename it.

  5. Click Import. Once imported, you can view the file name and file size.

Note:

  • To delete the uploaded file, click the Delete icon next to its name.

  • To rename the file, click the Edit icon in the Dataset Name field.

  • Click File Configuration to expand and view configuration details such as Separator and Encoding (inherited from the source dataset; these cannot be modified).

  • View a preview of the sample data in the Sample Data section.

  6. Click Done. You will be redirected to the View Data page, where you can see the total number of rows after the file is appended.
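The row count shown on the View Data page is simply the source rows plus the appended data rows (the header is not counted twice). A hedged, stdlib-only sketch of that append semantics (the `append_rows` helper is hypothetical, not the platform's code):

```python
import csv
import io

def append_rows(source_csv: str, new_csv: str):
    """Append new_csv's data rows to source_csv; return merged text and data-row count."""
    src_rows = list(csv.reader(io.StringIO(source_csv)))
    new_rows = list(csv.reader(io.StringIO(new_csv)))
    if src_rows[0] != new_rows[0]:
        raise ValueError("schemas differ; append is not allowed")
    merged = src_rows + new_rows[1:]      # keep a single header row
    out = io.StringIO()
    csv.writer(out).writerows(merged)
    return out.getvalue(), len(merged) - 1  # count excludes the header

source = "id,value\n1,10\n2,20\n"
extra  = "id,value\n3,30\n"
text, total_rows = append_rows(source, extra)
print(total_rows)  # 3
```

Two source rows plus one appended row gives the total of 3 that the View Data page would report in this example.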

You can either upload the file from the local system or create a new connection using +New Connection to import files from external data connectors. For more information, see Connect to data connectors.