
Importing Text Files from the Local System


Last updated 1 month ago

Use this procedure to import text files from the local system so that you can perform predictions on them and generate a modelling pipeline.

To import the text file from the local system:

  1. Click the project to which you want to upload the file. The Canvas page is displayed.

  2. Click the plus (+) icon and select Text File to navigate to the Create New File page. Alternatively, click the +Text File option on the canvas; this option is only available when no datasets are present on the canvas.

  3. Verify the Project field. By default, it is populated with the project name.

  4. Select the source from which you want to upload. By default, File Import is selected.

    You can either upload the file from the local system or create a new connection using +New Connection to import files from external data connectors. For more information, see Connect to data connectors.

    Note:

    • Supported file formats: .txt, .json, .html, .md

    • Supported connector types: Amazon S3, Azure Blob Storage, and Google Cloud Platform

    After establishing the connection and importing the files, the imported files are populated in this drop-down list.

  5. Select the Mode to upload the file. Possible options:

    • Single file import – Import a single file onto the canvas.

    • Segregate – Upload multiple files together onto the canvas as separate files.

  6. Select Single file import.

  7. Click Import Files from Local to browse and select the file from your local system.

  8. Click Import. Once the file is imported, you can view the file name and file size.

    You can now view the data in the file by clicking the Open Text Input option, which takes you to the Data page.

  9. Click Done. Once the file is added, you are redirected to the Canvas view page, where you can see the uploaded file node.
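If you are preparing files programmatically before uploading them, you can check them against the supported format list from the note above. This is a minimal sketch; the helper function is hypothetical and not part of the RapidCanvas UI or SDK:

```python
from pathlib import Path

# File formats accepted by the Text File import, per the note above.
SUPPORTED_EXTENSIONS = {".txt", ".json", ".html", ".md"}

def is_supported_text_file(filename: str) -> bool:
    """Return True if the file extension is accepted by the Text File import."""
    return Path(filename).suffix.lower() in SUPPORTED_EXTENSIONS

print(is_supported_text_file("notes.md"))    # True
print(is_supported_text_file("report.pdf"))  # False
```

Filtering out unsupported files up front avoids failed imports when uploading many files in Segregate mode.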

Viewing the File Information

Use this procedure to view the file information.

To view data:

  1. Select the text block that you have uploaded onto the canvas. This opens the side sheet.

  2. Click Preview to navigate to the Data page.

  3. View the data within the text file you have uploaded on the Data tab.

  4. Click the Source tab.

  5. Review the Source details:

    • Source Type: The type of source from which the file was imported.

    • File Names: The name of the file.

    • Updated on: The date on which the file was last updated.

    • Audit History: Click Audit History to check the log of user activities. Each entry includes the user who performed the action, the type of action performed, and the timestamp.

Actions on the Data View Page:

Click the plus (+) icon to perform the following actions:

  • Update file – Replace the existing file with a new file.

  • Template – Select a template recipe to run on the file data.

  • Code – Select a code recipe to run on the file data.
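A code recipe applied to file data typically receives the file's text and returns a derived output. The following is only a generic illustration of that idea; the function name and return shape are hypothetical and do not reflect the actual RapidCanvas recipe API:

```python
def run_code_recipe(text: str) -> dict:
    """Illustrative transform a code recipe might apply to imported
    file data: compute simple statistics over the text."""
    return {
        "line_count": len(text.splitlines()),
        "word_count": len(text.split()),
        "char_count": len(text),
    }

sample = "RapidCanvas imports text files.\nRecipes transform them."
print(run_code_recipe(sample))
```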

Click the Actions drop-down to:

  • Export the file – Use the Export as Markdown option to export the file in the format in which you uploaded it.

  • Delete the file – Use the Delete option to remove the file and its associated recipes.