Lab 4.1 - Deploy to ACI using Python
An easy way to test your model is to deploy it in a container to an Azure Container Instance. A container instance is the simplest way to run a container in Azure.
Install the AzureML SDK
In your terminal, install the Azure ML SDK:
pip install azureml-sdk
Download the scoring script
# Create a directory
mkdir deploy
cd deploy
# Download the scoring script
wget https://raw.githubusercontent.com/GlobalAICommunity/back-together-2021/main/workshop-assets/amls/score.py
Create the Python Script
We start by creating a Python script called deploy.py; this script will take care of deploying your model.
code deploy.py
and start by importing the dependencies and defining some variables.
from azureml.core import Workspace, Environment
from azureml.core.model import Model, InferenceConfig
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.webservice import AciWebservice
# Connect to workspace
workspaceName = ""
subscriptionId = ""
resourceGroup = ""
Next, we need to connect to our workspace.
Get the subscription ID
The command below shows the ID of the subscription you are currently using:
az account show --query id -o table
Get the Workspace name
The command below lists all the workspaces you have access to:
az ml workspace list -o table
Get the resource group
The command below lists all your resource groups:
az group list -o table
Add the values to the variables.
Connect to your workspace
ws = Workspace.get(name=workspaceName,
                   subscription_id=subscriptionId,
                   resource_group=resourceGroup)
print("Workspace:", ws.name)
Load the model
# Load the model
model = Model(ws, name='SimpsonsClassification-pytorch')
print("Loaded model version:", model.version)
Define the environment
myenv = Environment(name="simpsons-inference")
conda_dep = CondaDependencies()
conda_dep.add_pip_package("azureml-defaults")
conda_dep.add_pip_package("torch")
conda_dep.add_pip_package("torchvision")
conda_dep.add_pip_package("pillow==5.4.1")
myenv.python.conda_dependencies=conda_dep
Create the deployment configuration
inference_config = InferenceConfig(
    entry_script="score.py",
    environment=myenv
)
deploy_config = AciWebservice.deploy_configuration(
    cpu_cores=1,
    memory_gb=2,
    description='Simpson Lego Classifier')
Deploy the model
# Deploy the model to an ACI
aci_service = Model.deploy(ws,
                           name="simpsons-pt-aci",
                           models=[model],
                           inference_config=inference_config,
                           deployment_config=deploy_config,
                           overwrite=True)
aci_service.wait_for_deployment(show_output=True)
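Once the deployment finishes, you can retrieve the endpoint from `aci_service.scoring_uri` and call it with an HTTP POST. The payload format depends on what the `run()` function in score.py expects; the snippet below is a minimal sketch that assumes a hypothetical JSON body with a base64-encoded image under an `"image"` key — check score.py for the actual contract.

```python
import base64
import json

def build_payload(image_bytes: bytes) -> str:
    # Hypothetical payload shape: {"image": "<base64-encoded bytes>"};
    # the real format is defined by the run() function in score.py.
    return json.dumps({"image": base64.b64encode(image_bytes).decode("utf-8")})

payload = build_payload(b"\x89PNG...")  # replace with the raw bytes of a test image
print(payload)

# Send it to the deployed service, e.g. with the requests library:
# import requests
# resp = requests.post(aci_service.scoring_uri, data=payload,
#                      headers={"Content-Type": "application/json"})
# print(resp.json())
```

If the call fails, `aci_service.get_logs()` is a useful first stop for debugging the container.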