ML Pipeline

This reference implementation walks through an end-to-end pipeline with the RapidCanvas SDK: it builds a model from a custom template, scores the Titanic dataset with it, syncs selected columns to a Redis-backed online feature store, and exposes the model through a prediction service.

# Get the latest lib from Rapidcanvas
# !pip install --extra-index-url=https://us-central1-python.pkg.dev/rapidcanvas-361003/pypi/simple utils==0.12dev0

from utils.rc.client.requests import Requests
from utils.rc.client.auth import AuthClient

from utils.rc.dtos.project import Project
from utils.rc.dtos.dataset import Dataset
from utils.rc.dtos.recipe import Recipe
from utils.rc.dtos.transform import Transform
from utils.rc.dtos.template import Template
from utils.rc.dtos.template import TemplateTransform
from utils.rc.dtos.template import TemplateInput
from utils.rc.dtos.artifact import Artifact
from utils.rc.dtos.dataSource import DataSource
from utils.rc.dtos.dataSource import DataSourceType
from utils.rc.dtos.dataSource import RedisStorageConfig
from utils.rc.dtos.prediction_service import PredictionService
from utils.dtos.rc_prediction_service import RCPredictionService

from utils.rc.dtos.template_v2 import TemplateV2, TemplateTransformV2

import pandas as pd
import logging

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)
# Requests.setRootHost("https://test.dev.rapidcanvas.net/api/")
# Requests.setRootHost("http://localhost:8080/api/")
AuthClient.setToken()
INFO:Authentication successful
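
The commented setRootHost lines above show how to point the SDK at a specific deployment. If you are not using the default host, set it before authenticating; a minimal sketch (the URL is the test host used in this walkthrough, substitute your own):

# Optional: select the deployment the SDK should talk to before authenticating.
Requests.setRootHost("https://test.dev.rapidcanvas.net/api/")
AuthClient.setToken()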

Creating a Project

project = Project.create(
    name="Example ML Pipeline",
    description="Testing python lib",
    createEmpty=True
)
2023-02-02 16:26:47.176 INFO    root: Found existing project by name: Example ML Pipeline
2023-02-02 16:26:47.178 INFO    root: Deleting existing project
2023-02-02 16:26:48.309 INFO    root: Creating new project by name: Example ML Pipeline
project.id
'12e039d5-3254-4cf2-85c3-6af1350a5299'

Building the Model

model_name = "add-value-model"
service_name = "add-value-service"
recipe = project.addRecipe([], name="build")
2023-02-02 16:26:50.639 INFO    root: Creating new recipe
template = TemplateV2(
    name="CreateMLModel", description="CreateMLModel", project_id=project.id, source="CUSTOM", status="ACTIVE", tags=["Number", "datatype-long"]
)
template_transform = TemplateTransformV2(type="python", params=dict(notebookName="CreateMLModel.ipynb"))
template.base_transforms = [template_transform]
template.publish("transforms/CreateMLModel.ipynb")
2023-02-02 16:26:51.790 INFO    root: Publishing template | data=TemplateV2(name='CreateMLModel', display_name=None, id=None, version='1.0', project_id='12e039d5-3254-4cf2-85c3-6af1350a5299', projectId='12e039d5-3254-4cf2-85c3-6af1350a5299', is_global=False, description='CreateMLModel', tags=['Number', 'datatype-long'], baseTransforms=[TemplateTransformV2(type='python', params={'notebookName': 'CreateMLModel.ipynb'})], base_transforms=[TemplateTransformV2(type='python', params={'notebookName': 'CreateMLModel.ipynb'})], source='CUSTOM', status='ACTIVE', inputs=[])
2023-02-02 16:26:54.391 INFO    root: Template Published
2023-02-02 16:26:54.688 INFO    blib2to3.pgen2.driver: Generating grammar tables from /Users/nikunj/miniconda3/lib/python3.8/site-packages/blib2to3/Grammar.txt
2023-02-02 16:26:54.703 INFO    blib2to3.pgen2.driver: Writing grammar tables to /Users/nikunj/Library/Caches/black/22.1.0/Grammar3.8.11.final.0.pickle
2023-02-02 16:26:54.705 INFO    blib2to3.pgen2.driver: Writing failed: [Errno 2] No such file or directory: '/Users/nikunj/Library/Caches/black/22.1.0/tmpk31rwcgx'
2023-02-02 16:26:54.707 INFO    blib2to3.pgen2.driver: Generating grammar tables from /Users/nikunj/miniconda3/lib/python3.8/site-packages/blib2to3/PatternGrammar.txt
2023-02-02 16:26:54.709 INFO    blib2to3.pgen2.driver: Writing grammar tables to /Users/nikunj/Library/Caches/black/22.1.0/PatternGrammar3.8.11.final.0.pickle
2023-02-02 16:26:54.710 INFO    blib2to3.pgen2.driver: Writing failed: [Errno 2] No such file or directory: '/Users/nikunj/Library/Caches/black/22.1.0/tmplbzvi2tj'
2023-02-02 16:26:54.748 WARNING papermill: Input notebook does not contain a cell with tag 'parameters'
2023-02-02 16:26:55.562 INFO    papermill: Executing notebook with kernel: python3
INFO:User authenticated successfully
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:
**************************************
**    CREATING INPUTS: modelName    **
**************************************
2023-02-02 16:26:58.816 INFO    utils.rc.wrapper.templates: Inputs created successfully | template_id=c0fe7cbc-5d4e-4472-9882-f59973bf87b7
Inputs created successfully | template_id=c0fe7cbc-5d4e-4472-9882-f59973bf87b7
transform = Transform()
transform.templateId = template.id
transform.name = "transform_1"
transform.variables = {
    "modelName": model_name
}
# recipe.prepareForLocal(transform, contextId="CreateMLModel")
recipe.addTransform(transform)
WARNING:
#############################################IMPORTANT#############################################
addTransform is going to deprecate soon. Please use add_transform instead
####################################################################################################

2023-02-02 16:27:01.438 INFO    root: Adding new transform
2023-02-02 16:27:03.741 INFO    root: Transform added Successfully
recipe.run()
2023-02-02 16:27:03.753 INFO    root: Started running
2023-02-02 16:27:03.758 INFO    root: You can look at the progress on UI at https://test.dev.rapidcanvas.net/#/projects/12e039d5-3254-4cf2-85c3-6af1350a5299
2023-02-02 16:27:10.691 INFO    root: No errors found
all_models = PredictionService.get_all_models()
assert model_name in all_models, "models don't match"
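
The warning printed when the transform was added recommends add_transform over the deprecated addTransform. Assuming the newer method accepts the same Transform object (not verified here), the equivalent call would be:

# Newer, non-deprecated spelling suggested by the SDK warning (same Transform object assumed).
recipe.add_transform(transform)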

Predicting with the Model

titanic_dataset = project.addDataset(
    dataset_name="titanic_dataset",
    dataset_description="titanic_dataset",
    dataset_file_path="data/titanic.csv"
)
2023-02-02 16:27:14.256 INFO    root: Creating new dataset by name:titanic_dataset
2023-02-02 16:27:15.456 INFO    root: Uploading file data/titanic.csv ....
2023-02-02 16:27:29.642 INFO    root: Uploading Done
predict_recipe = project.addRecipe([titanic_dataset], name="predict")
2023-02-02 16:27:30.786 INFO    root: Creating new recipe
template = TemplateV2(
    name="PredictMLModel", description="PredictMLModel", project_id=project.id, source="CUSTOM", status="ACTIVE", tags=["Number", "datatype-long"]
)
template_transform = TemplateTransformV2(type="python", params=dict(notebookName="PredictMLModel.ipynb"))
template.base_transforms = [template_transform]
template.publish("transforms/PredictMLModel.ipynb")
2023-02-02 16:27:31.968 INFO    root: Publishing template | data=TemplateV2(name='PredictMLModel', display_name=None, id=None, version='1.0', project_id='12e039d5-3254-4cf2-85c3-6af1350a5299', projectId='12e039d5-3254-4cf2-85c3-6af1350a5299', is_global=False, description='PredictMLModel', tags=['Number', 'datatype-long'], baseTransforms=[TemplateTransformV2(type='python', params={'notebookName': 'PredictMLModel.ipynb'})], base_transforms=[TemplateTransformV2(type='python', params={'notebookName': 'PredictMLModel.ipynb'})], source='CUSTOM', status='ACTIVE', inputs=[])
2023-02-02 16:27:34.532 INFO    root: Template Published
2023-02-02 16:27:34.547 WARNING papermill: Input notebook does not contain a cell with tag 'parameters'
2023-02-02 16:27:35.292 INFO    papermill: Executing notebook with kernel: python3
INFO:User authenticated successfully
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:
**************************************************
**    CREATING INPUTS: modelName, modelInput    **
**************************************************
2023-02-02 16:27:38.548 INFO    utils.rc.wrapper.templates: Inputs created successfully | template_id=08784ac1-4ee8-4cef-848b-705b267540f9
Inputs created successfully | template_id=08784ac1-4ee8-4cef-848b-705b267540f9
transform = Transform()
transform.templateId = template.id
transform.name = "transform"
transform.variables = {
    "modelInput": titanic_dataset.name,
    "modelName": model_name
}
# predict_recipe.prepareForLocal(transform, contextId="PredictMLModel")
predict_recipe.addTransform(transform)
predict_recipe.run()
WARNING:
#############################################IMPORTANT#############################################
addTransform is going to deprecate soon. Please use add_transform instead
####################################################################################################

2023-02-02 16:27:41.166 INFO    root: Adding new transform
2023-02-02 16:27:43.461 INFO    root: Transform added Successfully
2023-02-02 16:27:43.464 INFO    root: Started running
2023-02-02 16:27:43.465 INFO    root: You can look at the progress on UI at https://test.dev.rapidcanvas.net/#/projects/12e039d5-3254-4cf2-85c3-6af1350a5299
2023-02-02 16:27:59.152 INFO    root: No errors found
output = predict_recipe.getChildrenDatasets()['output']
output.getData()
|    | PassengerId | Survived | Pclass | Name | Sex | Age | SibSp | Parch | Ticket | Fare | Cabin | Embarked | Fare_added |
|----|-------------|----------|--------|------|-----|-----|-------|-------|--------|------|-------|----------|------------|
| 0  | 1 | 0 | 3 | Braund, Mr. Owen Harris | male | 22.0 | 1 | 0 | A/5 21171 | 7.25 | nan | S | 10.25 |
| 1  | 2 | 1 | 1 | Cumings, Mrs. John Bradley (Florence Briggs Th... | female | 38.0 | 1 | 0 | PC 17599 | 71.2833 | C85 | C | 74.2833 |
| 2  | 3 | 1 | 3 | Heikkinen, Miss. Laina | female | 26.0 | 0 | 0 | STON/O2. 3101282 | 7.925 | nan | S | 10.925 |
| 3  | 4 | 1 | 1 | Futrelle, Mrs. Jacques Heath (Lily May Peel) | female | 35.0 | 1 | 0 | 113803 | 53.1 | C123 | S | 56.1 |
| 4  | 5 | 0 | 3 | Allen, Mr. William Henry | male | 35.0 | 0 | 0 | 373450 | 8.05 | nan | S | 11.05 |
| ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
| 95 | 96 | 0 | 3 | Shorney, Mr. Charles Joseph | male | nan | 0 | 0 | 374910 | 8.05 | nan | S | 11.05 |
| 96 | 97 | 0 | 1 | Goldschmidt, Mr. George B | male | 71.0 | 0 | 0 | PC 17754 | 34.6542 | A5 | C | 37.6542 |
| 97 | 98 | 1 | 1 | Greenfield, Mr. William Bertram | male | 23.0 | 0 | 1 | PC 17759 | 63.3583 | D10 D12 | C | 66.3583 |
| 98 | 99 | 1 | 2 | Doling, Mrs. John T (Ada Julia Bone) | female | 34.0 | 0 | 1 | 231919 | 23.0 | nan | S | 26.0 |
| 99 | 100 | 0 | 2 | Kantor, Mr. Sinai | male | 34.0 | 1 | 0 | 244367 | 26.0 | nan | S | 29.0 |

100 rows × 13 columns
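
Since getData() returns the output dataset as a pandas DataFrame (as the preview above shows), the predictions can also be persisted or inspected locally. A minimal sketch (the CSV path is illustrative):

predictions_df = output.getData()
# Save the scored Titanic rows for offline inspection; the path is just an example.
predictions_df.to_csv("data/titanic_predictions.csv", index=False)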

Building Features

online_data_store = DataSource.createDataSource(
    "online-redis",
    DataSourceType.REDIS_STORAGE,
    {RedisStorageConfig.HOST: "10.41.1.3", RedisStorageConfig.PORT: "6379"}
)
2023-02-02 16:28:04.109 INFO    root: Found existing data source by name: online-redis
2023-02-02 16:28:04.111 INFO    root: Updating the same
recipe = project.addRecipe([titanic_dataset], name="feature_store_sync")
2023-02-02 16:28:06.393 INFO    root: Creating new recipe
template = TemplateV2(
    name="FeatureStoreSync", description="FeatureStoreSync", project_id=project.id, source="CUSTOM", status="ACTIVE", tags=["Number", "datatype-long"]
)
template_transform = TemplateTransformV2(type="python", params=dict(notebookName="FeatureStoreSync.ipynb"))
template.base_transforms = [template_transform]
template.publish("transforms/FeatureStoreSync.ipynb")
2023-02-02 16:28:07.586 INFO    root: Publishing template | data=TemplateV2(name='FeatureStoreSync', display_name=None, id=None, version='1.0', project_id='12e039d5-3254-4cf2-85c3-6af1350a5299', projectId='12e039d5-3254-4cf2-85c3-6af1350a5299', is_global=False, description='FeatureStoreSync', tags=['Number', 'datatype-long'], baseTransforms=[TemplateTransformV2(type='python', params={'notebookName': 'FeatureStoreSync.ipynb'})], base_transforms=[TemplateTransformV2(type='python', params={'notebookName': 'FeatureStoreSync.ipynb'})], source='CUSTOM', status='ACTIVE', inputs=[])
2023-02-02 16:28:10.183 INFO    root: Template Published
2023-02-02 16:28:10.198 WARNING papermill: Input notebook does not contain a cell with tag 'parameters'
2023-02-02 16:28:10.953 INFO    papermill: Executing notebook with kernel: python3
INFO:User authenticated successfully
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:Creating template input | nb_stage=COMPILE_TIME
INFO:
*********************************************************************************************************
**    CREATING INPUTS: datasetName, columns, featureEntityName, featureEntityColumn, dataSourceName    **
*********************************************************************************************************
2023-02-02 16:28:14.111 INFO    utils.rc.wrapper.templates: Inputs created successfully | template_id=2c4144c4-08bb-43aa-8507-7c02a6d526cf
Inputs created successfully | template_id=2c4144c4-08bb-43aa-8507-7c02a6d526cf
transform = Transform()
transform.templateId = template.id
transform.name = "transform_1"
transform.variables = {
    "datasetName": titanic_dataset.name,
    "columns": "Name,Sex,Fare",
    "featureEntityName": "Passenger",
    "featureEntityColumn": "PassengerId",
    "dataSourceName": online_data_store.name
}
# recipe.prepareForLocal(transform, "feature_store")
recipe.addTransform(transform)
WARNING:
#############################################IMPORTANT#############################################
addTransform is going to deprecate soon. Please use add_transform instead
####################################################################################################

2023-02-02 16:28:16.741 INFO    root: Adding new transform
2023-02-02 16:28:19.075 INFO    root: Transform added Successfully
recipe.run()
2023-02-02 16:28:19.092 INFO    root: Started running
2023-02-02 16:28:19.095 INFO    root: You can look at the progress on UI at https://test.dev.rapidcanvas.net/#/projects/12e039d5-3254-4cf2-85c3-6af1350a5299
2023-02-02 16:29:01.242 INFO    root: No errors found
output = recipe.getChildrenDatasets()['feature_sync_stats']
output.getData()
|   | records |
|---|---------|
| 0 | 891 |
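
A quick sanity check on the sync stats, assuming feature_sync_stats exposes the single records column shown above: the count should equal the 891 rows of the Titanic dataset.

stats_df = output.getData()
# Every row of the Titanic dataset should have been written to the online store.
assert int(stats_df["records"].iloc[0]) == 891, "feature sync did not cover all rows"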

Creating a Prediction Service

class AddValuePredictionService(RCPredictionService):

    def pre_process(self, context, input_df):
        from utils.notebookhelpers.dataStoreHelpers import DataStoreHelpers
        import pandas as pd
        import json

        data_store = DataStoreHelpers.get_data_store(context, "online-redis")
        df_json = input_df.T.to_dict()
        for key in df_json:
            value = df_json[key]
            features_json = data_store.get_feature("Passenger", str(value["PassengerId"]))
            if features_json is None:
                raise Exception("features not found")
            features = json.loads(features_json)
            value.update(features)
            df_json[key] = value

        df = pd.DataFrame(df_json).T
        return df

    def post_process(self, context, output_df):
        return output_df.to_json()
prediction_service = PredictionService.create_service(
    name=service_name,
    description="testing purposes",
    model_name=model_name,
    service_obj=AddValuePredictionService,
    env_id=None,
    data_source_ids=[online_data_store.id]
)
2023-02-02 16:29:06.276 INFO    root: Found existing service by name: add-value-service
2023-02-02 16:29:06.278 INFO    root: Updating service
2023-02-02 16:29:07.477 INFO    root: Service is updated.
INFO:curl --location --request POST 'https://test.dev.rapidcanvas.net/api//v2/predict/add-value-service'             --header 'Authorization: Bearer eyJhbGciOiJSUzI1NiJ9.eyJ0ZW5hbnRJZCI6IjBiZTA3Y2E0LWE4OTctNDViYS05NjU2LTc3MzI5MDliYzUzYyIsImlkIjoiOWFiZDJiY2ItMzgxYi00ZjJlLThiMzYtZTkxOGI2YzQ5ODVmIiwiYXVkIjoid2ViYXBwIiwiZXhwIjoxNjc1NDIxODA1LCJpYXQiOjE2NzUzMzU0MDV9.WYRe_g_cRdGHGQI105nKsQ-o2oyWNGTjKOUE8r1YJ6wAoKQuvTnQvUdJ41bSHbuYNTbSsax81_aMdeyrqg8RNd7HET1cF3VEk7SVkfTjaww3PVdLDEHJmurpcg6xx9E6vS-2ET1YGYGy72-2ZQQmUc_bW5tPwjwTHZVN7zMPwkfNcYLxOOBRID25PgGXx40oWUlK0lw2hSSonfcqkHbQm5dcuJn1uky4ayXuPxd7kpVNLbpUPSoXpC-hzfc08mUOY33PNveh4HgkepzIMV11EvdSpWk6w39VFzpAY7hWUPOBjeDdWy8ndXz6F5TR2ERV9_HowXvFpOefPkJEL1whGw' --header 'Content-Type: application/json --data-raw '<PUT YOUR DATA HERE>'
PredictionService.refresh_service(prediction_service.name)
PredictionService.predict_by_service(
    prediction_service.name,
    {"PassengerId": [134, 145]}
)
'{"PassengerId":{"0":134,"1":145},"Name":{"0":"Weisz, Mrs. Leopold (Mathilde Francoise Pede)","1":"Andrew, Mr. Edgardo Samuel"},"Sex":{"0":"female","1":"male"},"Fare":{"0":26.0,"1":11.5},"Fare_added":{"0":29.0,"1":14.5}}'