
Easy Setup of n8n on Azure for Workflow Automation


Introduction

Microsoft Azure offers scalable cloud services for deploying machine learning models. The deployment workflow can be automated, and n8n's built-in service integrations let you set that up quickly.

 

Prerequisites

Make sure to have:

  • An Azure account
  • n8n installed locally or hosted on a server
  • A trained ML model (TensorFlow, PyTorch, Scikit-learn, etc.)
  • Azure CLI installed and authenticated
  • Docker, if you want containerized deployments (optional)

Step 1: Set Up n8n and the Azure Environment

1. Set Up n8n

Install n8n locally

To install n8n globally, run:

npm install -g n8n

 

Then run:

n8n

 

n8n starts a service on your local machine that you can access at http://localhost:5678 in any web browser.

Server Deployment (Docker Method)

If you want to host n8n on a server, you can run it with Docker:

docker run -it --rm \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n

 

This makes n8n accessible from http://your-server-ip:5678.

2. Create a Container in Azure Blob Storage

  • Go to the Azure Portal > Storage Accounts > select Create.

  • Once the storage account is created, go to Containers > select New.

  • Name the container (for example, ml-model-container).

  • Upload your model file (e.g., model.pkl, model.h5); a scripted alternative is sketched after this list.
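
If you prefer to script the upload rather than use the portal, a minimal sketch with the azure-storage-blob SDK could look like the following; the connection string, container name, and file name are placeholders for your own values.

from azure.storage.blob import BlobServiceClient

# Placeholder values - replace with your own storage account details
connection_string = 'YOUR_AZURE_STORAGE_CONNECTION_STRING'
container_name = 'ml-model-container'
local_model_path = 'model.pkl'

# Connect to the storage account and get a client for the target blob
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=local_model_path)

# Upload the trained model file, overwriting any existing blob with the same name
with open(local_model_path, 'rb') as data:
    blob_client.upload_blob(data, overwrite=True)

print('Uploaded', local_model_path, 'to', container_name)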

3. Register the Machine Learning Model in Azure

  • Log into Azure Machine Learning Studio > open the Models tab > select Register Model.

  • Set the model's source to Azure Blob Storage.

  • Deploy the model to an endpoint; a sketch of calling it follows this list.
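
Once the model is deployed, the endpoint can be called over plain HTTP. The sketch below assumes a managed online endpoint; the scoring URI, API key, and request schema are placeholders that depend on how you deployed the model.

import json

import requests

# Placeholders - copy these from the endpoint's Consume tab in Azure ML Studio
scoring_uri = 'https://<your-endpoint>.<region>.inference.ml.azure.com/score'
api_key = 'YOUR_ENDPOINT_KEY'

headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer ' + api_key,
}

# Example payload; the expected schema depends on your scoring script
payload = {'data': [[5.1, 3.5, 1.4, 0.2]]}

response = requests.post(scoring_uri, headers=headers, data=json.dumps(payload))
print(response.status_code, response.json())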

4. Deploy on an Azure Virtual Machine (Optional)

  • Log into the Azure Portal > select Virtual Machines > select Create.

  • Select a suitable instance type (e.g., Standard_D2_v2 for small models or Standard_NC6 for GPU-based models).

  • Configure networking for HTTP access (allow port 5000 for Flask or 8000 for FastAPI).

  • SSH into the VM and install the prerequisites:

sudo apt update && sudo apt install python3-pip -y
pip install flask tensorflow azure-storage-blob

 

Step 2: Create the n8n Deployment Workflow

1. Develop the Workflow

  • Configure an HTTP Trigger Node (to accept API calls).
  • Add an Azure Blob Storage Node (to pull the model from Blob Storage).
  • For Azure ML inference, use the Azure Machine Learning Node; for VM deployments, use an HTTP Request Node.
  • Process the incoming data and return the required output.

2. Save and Activate the Workflow

  • Select the Activate Workflow option.
  • Test the workflow by sending sample JSON input, for example as shown below.
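
As a quick test, you can post sample input to the workflow's webhook URL from Python; the webhook path below is a placeholder, so use the URL shown on your own trigger node.

import requests

# Placeholder webhook URL - copy the production URL shown on your n8n trigger node
webhook_url = 'http://localhost:5678/webhook/predict-model'

# Sample JSON input matching what the workflow expects
sample_input = {'features': [5.1, 3.5, 1.4, 0.2]}

response = requests.post(webhook_url, json=sample_input)
print(response.status_code, response.text)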

 

Step 3: Exposing the Model as an API

  • For Azure ML, use the endpoint URL provided when the model was deployed.
  • For an Azure VM, deploy the model with Flask:
import pickle

from flask import Flask, request, jsonify
from azure.storage.blob import BlobServiceClient

app = Flask(__name__)

# Get model from Azure Blob Storage
connection_string = 'YOUR_AZURE_STORAGE_CONNECTION_STRING'
container_name = 'ml-model-container'
blob_name = 'model.pkl'

# Get blob service client
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

with open('model.pkl', 'wb') as f:
    f.write(blob_client.download_blob().readall())

model = pickle.load(open('model.pkl', 'rb'))

@app.route('/predict', methods=['POST'])
def predict():
    # Get model prediction
    data = request.json['features']
    prediction = model.predict([data])
    return jsonify({'prediction': prediction.tolist()})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)

 

Make sure network security groups permit traffic on port 5000.
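
To verify the VM deployment, you can call the /predict route directly; in this sketch the VM address and feature values are placeholders.

import requests

# Placeholder address - use your VM's public IP or DNS name
url = 'http://<your-vm-ip>:5000/predict'

# The /predict route above expects a JSON body with a 'features' list
response = requests.post(url, json={'features': [5.1, 3.5, 1.4, 0.2]})
print(response.json())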

 

Step 4: Automating Model Updates

  • Use an n8n Schedule Trigger Node to automatically check for a new model at regular intervals (a polling sketch follows this list).
  • Download the new version and trigger a redeployment of the workflow.
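
One simple way to detect a new model, sketched below, is to compare the blob's last-modified timestamp against a locally recorded value; the storage details are placeholders, and in practice this check would run inside the scheduled n8n workflow rather than as a standalone script.

import os
from datetime import datetime

from azure.storage.blob import BlobServiceClient

# Placeholders - replace with your own storage details
connection_string = 'YOUR_AZURE_STORAGE_CONNECTION_STRING'
container_name = 'ml-model-container'
blob_name = 'model.pkl'
stamp_file = 'last_model_update.txt'  # local record of the previous check

blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

# Read the blob's last-modified timestamp from Azure
last_modified = blob_client.get_blob_properties().last_modified

# Load the timestamp recorded on the previous run, if any
previous = None
if os.path.exists(stamp_file):
    with open(stamp_file) as f:
        previous = datetime.fromisoformat(f.read().strip())

if previous is None or last_modified > previous:
    # A newer model is available: download it and record the new timestamp
    with open(blob_name, 'wb') as f:
        f.write(blob_client.download_blob().readall())
    with open(stamp_file, 'w') as f:
        f.write(last_modified.isoformat())
    print('Downloaded updated model, last modified', last_modified)
else:
    print('Model is up to date')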

Conclusion

Using n8n on Azure simplifies model management and deployment.

n8n makes it easy to automate the entire workflow, whether you are using Azure ML, Blob Storage, or Virtual Machines.

Ready to optimize your AI infrastructure? Contact us today and leverage our AI/ML expertise!  
