RapidCanvas Docs
DataApps


from utils.rc.client.requests import Requests
from utils.rc.client.auth import AuthClient

from utils.rc.dtos.project import Project
from utils.rc.dtos.dataset import Dataset
from utils.rc.dtos.recipe import Recipe
from utils.rc.dtos.transform import Transform
from utils.rc.dtos.artifact import Artifact
from utils.rcclient.libs.dataapp_generator.dataapp_generator import DataappGenerator
from utils.rcclient.entities.app_template import AppTemplate, ParamMetadata
from utils.rcclient.enums import AppTemplateInputType, AppTemplateSource
from utils.rcclient.entities.dataapp import Dataapp

from utils.rc.dtos.template_v2 import TemplateV2, TemplateTransformV2

import requests
import pandas as pd
import logging
from utils.utils.log_util import LogUtil
LogUtil.set_basic_config(format='%(levelname)s:%(message)s', level=logging.INFO)
Requests.setRootHost("https://test.dev.rapidcanvas.net/api/")
AuthClient.setToken()

Create App Template

app_template = DataappGenerator.generate_app_template(
    name='testapp-template',
    display_name="RC streamlit test app 1",
    path="dataapps",
    source=AppTemplateSource.TENANT
)
app_template.create_input_param(
    name="input_dataset",
    input_type=AppTemplateInputType.ENTITY,
    metadata=ParamMetadata(input_name="Entity", is_required=True, default_value="entity")
)
app_template.create_input_param(
    name="input_artifact",
    input_type=AppTemplateInputType.ARTIFACT,
    metadata=ParamMetadata(input_name="Artifact", is_required=True, default_value="artifact")
)
app_template.publish(force=True)
app_template._id

Get App Templates

This returns a dict of templates, keyed by app template name, with the app template object as the value.

templates = AppTemplate.get_all()
templates.get('testapp-template').name

Enable/Disable an app template


testapp_tmpl = templates.get('testapp-template')
testapp_tmpl.disable()
testapp_tmpl.enable()

Creating a dataapp by using an app template

Create a sample project

# Create project
project = Project.create(
    name='Sample Employee',
    description='Sample Employee_promotion'
    # createEmpty=True
)
print(project.id)

# Add dataset
employee = project.addDataset(
    dataset_name='sample_employee',
    dataset_description='Sample Employee Promotion Dataset',
    dataset_file_path='data/sample_employee_promotion_case.csv'  # path as per your folder structure in Jupyter
)

# Add artifacts
Artifact.add_file("my-outside-artifact-v1", "data/titanic.csv")

# Publish template
time_diff_template = TemplateV2(
    name="sample_time_diff", description="Calculate the time difference between two dates", project_id=project.id,
    source="PROJECT", status="ACTIVE", tags=["UI", "Scalar"]
)
time_diff_template_transform = TemplateTransformV2(
    type = "python", params=dict(notebookName="timediff.ipynb"))

time_diff_template.base_transforms = [time_diff_template_transform]
time_diff_template.publish("transforms/timediff.ipynb")

# Create recipe
calculate_age_recipe = project.addRecipe([employee], name='calculate_age_recipe', artifacts=["my-outside-artifact-v1"])

# Add transform
calculate_age_transform = Transform()
calculate_age_transform.templateId = time_diff_template.id
calculate_age_transform.name='age'
calculate_age_transform.variables = {
    'inputDataset': 'sample_employee',
    'start_date': 'birth_date',
    'end_date': 'start_date',
    'how': 'years',
    'outputcolumn': 'age',
    'outputDataset': 'employee_with_age'
}
calculate_age_recipe.add_transform(calculate_age_transform)

# Run recipe
calculate_age_recipe.run()

Create dataapp for a given project and app template

dataapp = Dataapp.get_or_create(
    name="dataapp-using-testapp-template",
    app_template_id=testapp_tmpl._id,
    project_id=project.id,
    params={"input_dataset": employee.name}
)

Launch the dataapp

dataapp.launch()

Get dataapps

dataapp_by_name = Dataapp.find_by_name("dataapp-using-testapp-template")
dataapp_by_name._id

Update a dataapp

dataapp.description = "Updated description"
dataapp.display_name = "Updated display name"
dataapp.save()

Using Dataapp in Recipe

chart_tmpl = TemplateV2(
    name="chart-tmpl", description="sample chart tmpl", project_id=project.id,
    source="PROJECT", status="ACTIVE", tags=["UI", "Scalar"]
)
chart_tmpl_transform = TemplateTransformV2(
    type="python", params=dict(notebookName="create_chart.ipynb"))

chart_tmpl.base_transforms = [chart_tmpl_transform]
chart_tmpl.publish("transforms/create_chart.ipynb")

chart_recipe = project.addRecipe([employee], name='chart_recipe', artifacts=["my-outside-artifact-v1"])

chart_transform = Transform()
chart_transform.templateId = chart_tmpl.id
chart_transform.name = 'chart'
chart_transform.variables = {
    'input_dataset': employee.name,
    "employee_dataset": employee.name,
    "chart_app": dataapp.name
}
chart_recipe.add_transform(chart_transform)

chart_recipe.run()

Test if dataapp launched successfully

host = Requests.getRootHost()
base_url = host.split("/api/")[0]
path = dataapp.url.split('?')[0]

url = f"{base_url}{path}healthz"
res = requests.get(url)
print(f"app: {app_template.name} | url={url} | status: {res.status_code}")
assert res.status_code == 200
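
The URL assembly above can be factored into a small pure-Python helper. The name `build_healthz_url` is hypothetical, not part of the SDK; it mirrors the same string splitting as the check above, so it assumes the root host contains `/api/` and the dataapp URL carries its query string after a trailing slash.

```python
def build_healthz_url(root_host: str, dataapp_url: str) -> str:
    """Build the health-check URL for a dataapp.

    root_host:   the API root, e.g. "https://test.dev.rapidcanvas.net/api/"
    dataapp_url: the dataapp URL, e.g. "/dataapps/<id>/?dlId=<name>"
    """
    base_url = root_host.split("/api/")[0]  # strip everything from "/api/" onward
    path = dataapp_url.split("?")[0]        # drop the query string, keep the trailing slash
    return f"{base_url}{path}healthz"

url = build_healthz_url(
    "https://test.dev.rapidcanvas.net/api/",
    "/dataapps/e6ec9bb2-9633-4037-8153-8a58ba943b77/?dlId=dataapp-using-testapp-template",
)
```

With the helper in place, the check reduces to `requests.get(build_healthz_url(Requests.getRootHost(), dataapp.url))`.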

Deleting dataapps

Use this code block to delete the dataapp.

dataapp.delete()

Dataapp variables

The following table lists the variables available on a dataapp object.

| Variable syntax | Variable description | Output |
| --- | --- | --- |
| `dataapp.env_type` | Fetches the environment type. | `<EnvType.SMALL: 'SMALL'>` |
| `dataapp.cores` | Fetches the total cores in the selected environment. | `1` |
| `dataapp.description` | Fetches the description of the dataapp. | `'Updated description'` |
| `dataapp.display_name` | Fetches the display name of the dataapp. | `'Updated display name'` |
| `dataapp.disk_in_gbs` | Fetches the disk space in GB. | `20` |
| `dataapp.mem_in_mbs` | Fetches the memory in MB. | `2048` |
| `dataapp.name` | Fetches the dataapp name. | `'dataapp-using-testapp-template'` |
| `dataapp.params` | Fetches the parameters configured for the dataapp. | `{}` |
| `dataapp.project_id` | Fetches the project ID of the dataapp. | `'523fea48-e27f-4d1e-b264-07455eda450a'` |
| `dataapp.scenario_id` | Fetches the scenario ID of the project for which the dataapp is generated. | `'523fea48-e27f-4d1e-b264-07455eda450a'` |
| `dataapp.url` | Fetches the dataapp URL used to view charts. | `'/dataapps/e6ec9bb2-9633-4037-8153-8a58ba943b77/?dlId=dataapp-using-testapp-template'` |
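
As an illustration of how these attributes might be gathered in one place (for example, to log a dataapp's sizing before launch), the sketch below uses a hypothetical stand-in dataclass rather than the real SDK `Dataapp` class; only the attribute names, which match the table above, are taken from the SDK.

```python
from dataclasses import dataclass


@dataclass
class DataappStub:
    """Hypothetical stand-in exposing the same attribute names as a Dataapp."""
    name: str
    env_type: str
    cores: int
    mem_in_mbs: int
    disk_in_gbs: int
    params: dict


def summarize(dataapp) -> dict:
    """Collect the sizing-related attributes of a dataapp into a single dict."""
    return {
        "name": dataapp.name,
        "env_type": dataapp.env_type,
        "cores": dataapp.cores,
        "mem_in_mbs": dataapp.mem_in_mbs,
        "disk_in_gbs": dataapp.disk_in_gbs,
        "params": dataapp.params,
    }


summary = summarize(DataappStub(
    name="dataapp-using-testapp-template",
    env_type="SMALL", cores=1, mem_in_mbs=2048, disk_in_gbs=20, params={},
))
```

Because `summarize` only reads attributes, the same function would work unchanged on a real dataapp object returned by `Dataapp.get_or_create`.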
