Lab 4.1 - Deploy to ACI using Python
An easy way to test your model is to deploy it in a container to an Azure Container Instance (ACI). A Container Instance is the simplest way to run a container in Azure.
In your terminal, install the Azure ML SDK:
pip install azureml-sdk
# Create a directory
mkdir deploy
cd deploy
# Download the scoring script
wget https://raw.githubusercontent.com/GlobalAICommunity/back-together-2021/main/workshop-assets/amls/score.py
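The downloaded score.py follows the standard Azure ML entry-script contract: an init() function that loads the model once when the container starts, and a run() function that is invoked for every scoring request. A minimal sketch of that shape (the placeholder identity "model" and the payload format are assumptions for illustration, not the real Simpsons classifier):

```python
import json

model = None

def init():
    # Azure ML calls this once at container start-up.
    # The real score.py loads the PyTorch model here; this sketch
    # uses an identity function as a stand-in (assumption).
    global model
    model = lambda x: x

def run(raw_data):
    # Azure ML calls this for every scoring request; raw_data is the
    # JSON request body as a string.
    data = json.loads(raw_data)["data"]
    return json.dumps({"result": model(data)})
```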
We start by creating a Python script called deploy.py; this script will take care of deploying your model.
code deploy.py
Start by importing the dependencies and setting a few variables.
from azureml.core import Workspace, Environment
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import Model, InferenceConfig
from azureml.core.webservice import AciWebservice
# Connect to workspace
workspaceName = ""
subscriptionId = ""
resourceGroup = ""
Next we need to connect to our workspace.
The command below lists all your subscriptions and their IDs:
az account list -o table
The command below lists all the workspaces you have access to:
az ml workspace list -o table
The command below lists all your resource groups:
az group list -o table
Add these values to the variables in deploy.py, then connect to the workspace:
ws = Workspace.get(name=workspaceName,
                   subscription_id=subscriptionId,
                   resource_group=resourceGroup)
print("Workspace:", ws.name)
# Load the model
model = Model(ws, name='SimpsonsClassification-pytorch')
print("Loaded model version:", model.version)
myenv = Environment(name="simpsons-inference")
conda_dep = CondaDependencies()
conda_dep.add_pip_package("azureml-defaults")
conda_dep.add_pip_package("torch")
conda_dep.add_pip_package("torchvision")
conda_dep.add_pip_package("pillow==5.4.1")
myenv.python.conda_dependencies=conda_dep
inference_config = InferenceConfig(
entry_script="score.py",
environment=myenv
)
deploy_config = AciWebservice.deploy_configuration(
cpu_cores = 1,
memory_gb = 2,
description='Simpson Lego Classifier')
# Deploy the model to an ACI
aci_service = Model.deploy(ws,
name="simpsons-pt-aci",
models = [model],
inference_config = inference_config,
deployment_config = deploy_config,
overwrite = True)
aci_service.wait_for_deployment(show_output=True)
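Once the deployment finishes, aci_service.scoring_uri gives you the REST endpoint to POST images to. As a sketch of preparing a request body, assuming the scoring script expects a base64-encoded image under a "data" key (the exact payload format is defined by score.py):

```python
import base64
import json

def build_payload(image_bytes):
    # Encode the raw image bytes as base64 so they survive JSON transport.
    encoded = base64.b64encode(image_bytes).decode("utf-8")
    return json.dumps({"data": encoded})

# with open("homer.png", "rb") as f:
#     payload = build_payload(f.read())
# POST the payload to aci_service.scoring_uri
# with the header Content-Type: application/json
```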