Lab 4.1 - Deploy to ACI using Python

An easy way to test your model is to deploy it in a container to an Azure Container Instance (ACI). A Container Instance is the simplest way to run a container in Azure.

Install the AzureML SDK

In your terminal, install the AzureML SDK:

pip install azureml-sdk

Download the scoring script

# Create a directory
mkdir deploy
cd deploy

# Download the scoring script
wget https://raw.githubusercontent.com/GlobalAICommunity/back-together-2021/main/workshop-assets/amls/score.py
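The downloaded score.py is the entry script the container runs for every request. A scoring script for the AzureML inference server always defines two functions, init() and run(). The minimal sketch below only illustrates that structure; the model file name and preprocessing are placeholders, and the real logic lives in the score.py you just downloaded.

import os
import json

def init():
    # Runs once when the container starts: load the model into a global.
    global model
    model_dir = os.getenv("AZUREML_MODEL_DIR")
    # model = torch.load(os.path.join(model_dir, "model.pth"))  # placeholder file name

def run(raw_data):
    # Runs for every scoring request; raw_data is the JSON request body.
    data = json.loads(raw_data)
    # prediction = model(preprocess(data))  # placeholder inference call
    return {"result": "placeholder"}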

Create the Python Script

We start by creating a Python script called deploy.py; this script will take care of deploying your model.

code deploy.py

and start by importing the dependencies and setting some variables.

from azureml.core import Workspace, Model, Environment
from azureml.core.model import InferenceConfig
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.webservice import AciWebservice

# Connect to workspace
workspaceName = ""
subscriptionId = ""
resourceGroup = ""

Next, we need to connect to our workspace. To do this, collect the subscription ID, workspace name, and resource group using the Azure CLI commands below.

Get the subscription ID

The command below shows the ID of the subscription you are currently logged in to.

az account show --query id -o table

Get the Workspace name

The command below lists all the workspaces you have access to.

az ml workspace list -o table

Get the resource group

The command below lists all your resource groups.

az group list -o table

Add the values to the variables.
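For example (these values are hypothetical; use the output of the commands above):

workspaceName = "my-aml-workspace"
subscriptionId = "00000000-0000-0000-0000-000000000000"
resourceGroup = "my-resource-group"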

Connect to your workspace

ws = Workspace.get(name=workspaceName,
               subscription_id=subscriptionId,
               resource_group=resourceGroup)
print("Workspace:", ws.name)

Load the model

# Load the model
model = Model(ws, name='SimpsonsClassification-pytorch')
print("Loaded model version:",model.version)

Define the environment

myenv = Environment(name="simpsons-inference")
conda_dep = CondaDependencies()
conda_dep.add_pip_package("azureml-defaults")
conda_dep.add_pip_package("torch")
conda_dep.add_pip_package("torchvision")
conda_dep.add_pip_package("pillow==5.4.1")
myenv.python.conda_dependencies=conda_dep

Create the deployment configuration

inference_config = InferenceConfig(
    entry_script="score.py", 
    environment=myenv
)

deploy_config = AciWebservice.deploy_configuration(
                    cpu_cores = 1, 
                    memory_gb = 2,
                    description='Simpson Lego Classifier')

Deploy the model

# Deploy the model to an ACI
aci_service = Model.deploy(ws, 
                name="simpsons-pt-aci", 
                models = [model], 
                inference_config = inference_config, 
                deployment_config = deploy_config, 
                overwrite = True)

aci_service.wait_for_deployment(show_output=True)
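Save the script and run it from the deploy directory. Building the image and starting the container instance typically takes several minutes.

python deploy.py

Once the deployment finishes, you can inspect the service from the same script; state and scoring_uri are standard AciWebservice properties:

print("State:", aci_service.state)
print("Scoring URI:", aci_service.scoring_uri)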