Introduction

The Azure Function that can see

Who could have imagined a few years ago that you would be able to create an Azure Function that can tell you what is in your images? Good news: this is now possible with only a few lines of code and some sample images.

In this hands-on lab, you are going to train an ONNX classification model with the Custom Vision service and run it in an Azure Function. The language of choice is Python, and you will use Visual Studio Code as the editor and the Azure CLI to create and manage your Azure resources.

At the end of the lab you will have a serverless API that can classify images, and you will have learned how to use the Azure Custom Vision Python SDK to train an ONNX model.
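To give a flavour of the inference side of that API: before an image can be fed to an exported Custom Vision ONNX model, it has to be reshaped into the tensor layout the model expects. The sketch below is a minimal, hypothetical preprocessing helper; the 224x224 input size and BGR channel order are typical of Custom Vision compact ONNX exports, but the exact values depend on the model you export in the lab.

```python
import numpy as np


def to_onnx_input(rgb_image: np.ndarray, size: int = 224) -> np.ndarray:
    """Turn an HxWx3 uint8 RGB image into an NCHW float32 tensor.

    A sketch assuming a typical Custom Vision compact ONNX export
    (224x224 input, BGR channel order); check your model's actual
    input metadata, e.g. via onnxruntime's get_inputs().
    """
    h, w, _ = rgb_image.shape
    # Nearest-neighbour resize using plain index arithmetic (no PIL needed).
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = rgb_image[rows][:, cols]
    # Swap RGB -> BGR, since many Custom Vision exports expect BGR.
    bgr = resized[:, :, ::-1].astype(np.float32)
    # HWC -> CHW, then add a leading batch dimension: (1, 3, size, size).
    return bgr.transpose(2, 0, 1)[np.newaxis, ...]
```

In the Azure Function itself, the resulting tensor would be passed to an ONNX Runtime inference session; the lab walks through the real model and function code step by step.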

Duration:

2 hours

Format:

Hands-on Lab

Prerequisites:

Some level of programming and Azure experience will be helpful

Level:

100-200
