Release Notes for RapidCanvas, January 27, 2025

New Features

The following new features are introduced in this release:

Import DataApp Functionality

We’re thrilled to announce the Import DataApp feature, which allows users to bring DataApps developed outside of RapidCanvas—using frameworks like ReactJS and Streamlit—into the RapidCanvas platform.

Key Highlights:

  • DataApp Type Selection: During the import process, users must select the appropriate DataApp type to ensure seamless integration and optimal performance.

Note

Selecting the wrong type may cause functionality or visualization issues.

With this feature, RapidCanvas provides a flexible and seamless way to integrate external DataApps, while offering the power of AI-driven analytics for dynamic and interactive data exploration.
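For example, a self-contained Streamlit app like the sketch below is the kind of externally built DataApp this feature is intended for; the app name, dataset, and widgets are illustrative only and are not a RapidCanvas requirement.

    # app.py - a minimal Streamlit DataApp (illustrative only)
    import pandas as pd
    import streamlit as st

    st.title("Sales Explorer")

    # Small demo dataset; a real DataApp would read project data instead.
    df = pd.DataFrame({
        "region": ["North", "South", "East", "West"],
        "sales": [120, 95, 143, 88],
    })

    region = st.selectbox("Region", df["region"])
    st.metric("Sales", int(df.loc[df["region"] == region, "sales"].iloc[0]))
    st.bar_chart(df.set_index("region")["sales"])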

Save Code Checkpoints in the API Connector Recipe

Introducing the Save Code Checkpoints feature, offering data scientists greater flexibility and control during the recipe-building process.

Key Highlights:

  • Save Checkpoints: Save up to five code checkpoints directly within the code editor.

  • Manage Checkpoints: Easily edit or delete checkpoints to keep them relevant and aligned with current development needs.

  • Finalize Code: Finalized code within a checkpoint can be saved back to the current editor, so the recipe uses this version during pipeline execution.

Additionally, we’ve added code-wrapping functionality to the editor, improving code readability and navigation by automatically wrapping lines beyond a certain length.
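Conceptually, the checkpoint workflow behaves like the small sketch below: at most five snapshots are kept at a time, any snapshot can be edited or deleted, and a chosen snapshot can be restored as the code the recipe runs. This models only the behavior described above and is not RapidCanvas internals.

    # Illustrative model of the checkpoint behavior (not RapidCanvas internals).
    MAX_CHECKPOINTS = 5

    class CheckpointStore:
        def __init__(self):
            self.checkpoints = {}   # name -> code snapshot
            self.active_code = ""

        def save(self, name, code):
            # Up to five checkpoints can be kept at a time.
            if name not in self.checkpoints and len(self.checkpoints) >= MAX_CHECKPOINTS:
                raise ValueError("checkpoint limit reached; delete one first")
            self.checkpoints[name] = code

        def delete(self, name):
            self.checkpoints.pop(name, None)

        def finalize(self, name):
            # Restore a checkpoint into the current editor; the recipe then
            # runs this version during pipeline execution.
            self.active_code = self.checkpoints[name]
            return self.active_code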

Enhancements to DataApps

We’ve introduced two new features to enhance your DataApps experience:

Allow Column Hyperlink Toggle:

  • Linking Between Columns:

    • Users can now enable linking between two columns in a dataset using the Allow Column Hyperlink Toggle.

    • This feature turns each value in one column into a hyperlink that points to the corresponding value in another column. For example, if you use AskAI to generate hyperlinks between two columns, you can configure the Column Name (the source column whose values are displayed) and the Column Hyperlink (the target column that supplies the link), as sketched below.
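As a rough illustration of this pairing, the snippet below builds clickable links from a source column and a second column that holds the link targets; the column names and URLs are hypothetical.

    # Illustrative only: pair a display column with a column of link targets.
    import pandas as pd

    df = pd.DataFrame({
        "customer_name": ["Acme Corp", "Globex"],               # source column
        "customer_profile_url": ["https://example.com/acme",    # target column
                                 "https://example.com/globex"],
    })

    # Render each value in the source column as a hyperlink to the paired target.
    df["customer_link"] = [
        f'<a href="{url}">{name}</a>'
        for name, url in zip(df["customer_name"], df["customer_profile_url"])
    ]
    print(df["customer_link"].iloc[0])  # <a href="https://example.com/acme">Acme Corp</a>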

Show Model Response Code Toggle:

  • Model Transparency:

    • Users can now enable a toggle to view the code generated by the model in response to their queries.

    • This feature enhances transparency, giving users insight into the model’s internal processes and how it arrives at its responses; an illustrative example of such generated code follows.
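For example, with the toggle enabled, a question such as "What is the average sales by region?" might surface generated code along the following lines (a hypothetical illustration, not actual model output):

    # Hypothetical example of code a model might generate for the query
    # "What is the average sales by region?"
    import pandas as pd

    def answer(df: pd.DataFrame) -> pd.DataFrame:
        # Group by region and compute the mean of the sales column.
        return df.groupby("region", as_index=False)["sales"].mean()

    df = pd.DataFrame({"region": ["North", "South", "North"], "sales": [10, 20, 30]})
    print(answer(df))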

These enhancements improve flexibility and transparency in DataApps, providing users with more control and understanding of how their data and models interact.

Support for 50 Docs in Documents and PDF DataApp

The Documents and PDF DataApp has been enhanced to support the upload and querying of up to 50 documents.

Key Highlights:

  • Increased Document Capacity:

    • Users can now upload and query up to 50 documents within the DataApp.

  • Improved Flexibility:

    • This enhancement enables users to handle more extensive document sets, providing greater flexibility for projects requiring the analysis or retrieval of information from multiple documents.

This update improves scalability, making the Documents and PDF DataApp more powerful for handling and querying larger document collections.

Enhancements

The following enhancements have been made to existing functionality across various modules:

Change in Nomenclature Across the Platform

To align with modern practices and enhance clarity, several terminology updates have been introduced across the platform:

  • Tenant is now referred to as Workspace.

  • Prediction Job has been renamed to Prediction Run in project-level navigation.

  • Job is now called Scheduler in project-level navigation.

  • Global Variables are now referred to as Project Variables in Project Settings.

  • Packages have been renamed to Python Libraries in Environments.

These changes ensure a more intuitive and consistent user experience across the platform.

Project-Wise Artifacts and Models

We’ve introduced a dedicated Artifacts & Models tab at the project level, enabling users to efficiently view and manage artifacts and models specific to each project. While users can still access artifacts and models across all projects, this project-level tab provides a more streamlined and focused experience.

Key Highlights:

Artifacts Management:

  • View artifacts generated specifically within a project.

  • Upload new artifacts directly to a project for use in data pipelines.

  • Add existing artifacts from the global Artifacts & Models page or create entirely new artifacts.

  • Manage uploaded artifacts by previewing, downloading, or deleting them.

Note

To ensure data integrity, artifacts generated as outputs from recipes within a project cannot be deleted.

Models Management:

  • View and manage models available within the workspace.

  • Add models to a project using the Add Model option.

  • Models are sorted by their “Updated On” date, ensuring quick and easy access to the most recent versions.

This new project-level tab provides users with enhanced control and flexibility, offering a more focused way to manage artifacts and models directly within individual projects.

Project Duplication Now Includes AskAI Conversations

When duplicating a project that contains AskAI recipes, the associated AskAI conversations are now copied as well.

Key Highlights:

  • In the duplicate project, users can access the AskAI chat and click the Refresh Chat button to re-execute all queries.

  • Re-executed queries reflect updated results based on the latest data, ensuring accuracy and relevance.

This enhancement provides a seamless experience when duplicating projects with AI-driven interactions.

Enhancements to Prediction Service History (Previously Logs)

We’ve significantly improved the Prediction Service History (formerly Logs) to enhance usability and functionality.

Key Updates:

  • Renamed Logs to History:

    • The “Logs” option is now called History for better clarity.

    • Terminology changes:

      • Save Logs is now Save History.

      • The Logs tab is now the History tab, displaying real-time data for up to 30 days.

  • New Real-Time Logs Option for Debugging:

    • A new Logs option has been added to provide real-time debugging information for the Prediction Service.

    • This enables users to monitor prediction service activities and troubleshoot issues in real time.

  • New Columns in History:

    • Tracking ID: Displays the tracking ID for each prediction run.

    • Request ID: Provides the request ID for specific runs.

    • User: Shows details about the user who executed the prediction service.

  • Enhanced Sorting and Navigation:

    • History entries are now sorted by End Time in chronological order for easier analysis.

    • A search function has been added, allowing users to filter by user or prediction service.

  • Export Functionality:

    • Export prediction service logs using the export icon, or open them in a new tab for detailed analysis (see the sketch below).

  • Auto-Update of Edited Code:

    • Changes to prediction service code are now updated automatically, providing a seamless experience for the user.

These updates provide a streamlined, user-friendly, and data-rich experience for managing and debugging prediction services.
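If you export the history for offline analysis, a short script like the one below can slice it by user and order it by End Time. This assumes the export is a CSV file with the column names listed above; the file name and user value are hypothetical.

    # Assumes an exported history file in CSV form with the columns described above.
    import pandas as pd

    history = pd.read_csv("prediction_service_history.csv")   # hypothetical file name

    # Keep one user's runs and order them by End Time for review.
    mine = history[history["User"] == "jane.doe@example.com"]
    mine = mine.sort_values("End Time")

    print(mine[["Tracking ID", "Request ID", "End Time"]].head())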

Display Error Explanations in AskAI Recipe

We’ve enhanced the AskAI Recipe experience by providing improved error handling and debugging tools:

Key Highlights:

  • Error Highlighting: When an error occurs, the Code Editor now highlights the specific line where the error occurred.

  • Error Explanations: A detailed error explanation is displayed directly within the editor, offering clear guidance on how to resolve the issue.

This update helps users quickly identify and address errors, ensuring a smoother coding and debugging experience.

Addition of Linux Libraries Tab in Environments

The Environments section has been enhanced to provide greater flexibility by enabling the addition of Linux libraries alongside Python libraries when creating environments.

Key Highlights:

  • Support for Linux Libraries: Users can now install Linux libraries in environments where the operating system is Linux and Python runs on top of it, as illustrated in the sketch below.
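As a quick way to see why this matters, the snippet below checks whether a Linux shared library that some Python packages depend on is present in the environment; the library name ("GL", i.e. libGL) is just an example of such a system dependency.

    # Check for a Linux shared library that a Python package might need.
    from ctypes.util import find_library

    lib = find_library("GL")   # example system dependency
    if lib is None:
        print("libGL not found; add it as a Linux library in the environment.")
    else:
        print(f"Found system library: {lib}")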