OpenThinker 7B Model for Your Business?
Cost Efficiency (Open Source)
Lower long-term costs
Customised data control
Pre-trained model
Get Your OpenThinker 7B AI Model Running in a Day
Free Installation Guide - Step by Step Instructions Inside!
Hugging Face provides a great platform for hosting large language models (LLMs), including OpenThinker 7B. By downloading and running the model using Docker, you can ensure that the environment is consistent, portable and easy to scale. In this guide, we will focus on pulling the OpenThinker 7B model directly from Hugging Face's model hub using Docker.
Before we can download OpenThinker 7B, we need to have Docker installed on our system. If you haven't installed Docker yet, follow these steps for your platform:
For Ubuntu:
sudo apt update && sudo apt upgrade -y
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
Verify Installation:
To confirm that Docker is installed, run:
docker --version
This should output something like:
Docker version 24.0.5, build [Build ID]
To run OpenThinker 7B with Docker, we will use a Hugging Face-supported Docker image. You can pull the official Hugging Face transformers image from Docker Hub.
Run the following command to pull the image:
docker pull huggingface/transformers
Once the image is pulled, verify by listing the available images:
docker images | grep transformers
Expected output:
REPOSITORY                 TAG      IMAGE ID     CREATED      SIZE
huggingface/transformers   latest   [Image ID]   2 days ago   5.2GB
Now, let's create a custom Dockerfile that will set up OpenThinker 7B in the Hugging Face Docker container.
Create a New Directory for Your Project:
mkdir OpenThinker-7B-Docker-HuggingFace && cd OpenThinker-7B-Docker-HuggingFace
Create the Dockerfile:
Create a file named Dockerfile in this directory:
touch Dockerfile
nano Dockerfile
Add the following content to the Dockerfile:
# Use Hugging Face's official transformers image as the base
FROM huggingface/transformers

# Install necessary dependencies (flask is needed for the example app.py server created below)
RUN pip install --upgrade pip && pip install torch transformers flask

# Download the OpenThinker 7B model from the Hugging Face Hub at build time
# (the model weights are published under the open-thoughts organization)
RUN python -c "from transformers import AutoModelForCausalLM, AutoTokenizer; model = AutoModelForCausalLM.from_pretrained('open-thoughts/OpenThinker-7B'); tokenizer = AutoTokenizer.from_pretrained('open-thoughts/OpenThinker-7B')"

# Copy the model server script into the image
COPY app.py .

# Expose the necessary port
EXPOSE 5000

# Start the model server
CMD ["python", "app.py"]
This Dockerfile does the following:
Uses Hugging Face's official transformers image as the base.
Installs the Python dependencies (torch, transformers, and flask).
Downloads the OpenThinker 7B model and tokenizer from the Hugging Face Hub at build time, so the weights are baked into the image.
Copies the app.py server script (created in the next step) into the image.
Exposes port 5000 and starts the model server with python app.py.
Press CTRL + X, then Y and hit Enter to save the file.
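The CMD at the end of the Dockerfile runs app.py, a small model server that the guide does not otherwise provide, so you need to create it before building the image. The following is only a minimal sketch, assuming a Flask-based server (Flask, the single "/" route, and the max_new_tokens=256 setting are illustrative assumptions, not part of the OpenThinker release); it returns the same health-check and inference responses shown later in this guide. Save it as app.py in the same project directory as the Dockerfile:
# app.py - minimal example inference server (assumes Flask; adjust to your stack)
from flask import Flask, jsonify, request
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "open-thoughts/OpenThinker-7B"

# Load the model and tokenizer once at startup (the weights are already cached in the image)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

app = Flask(__name__)

@app.route("/", methods=["GET"])
def health():
    # Simple health check used by the curl command later in this guide
    return jsonify({"message": "Model is up and running"})

@app.route("/", methods=["POST"])
def generate():
    # Expects JSON like {"text": "your prompt"} and returns {"response": "..."}
    prompt = request.get_json(force=True).get("text", "")
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    reply = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return jsonify({"response": reply})

if __name__ == "__main__":
    # Listen on all interfaces so the published port 5000 is reachable from the host
    app.run(host="0.0.0.0", port=5000)
Keep in mind that loading a 7B-parameter model needs substantial RAM or a GPU; you may want to adjust the from_pretrained loading options (for example torch_dtype or device_map) to fit your hardware.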
Run the following command to build the image:
docker build -t openthinker-7b-huggingface .
After the build process completes, verify the image:
docker images | grep openthinker-7b-huggingface
Expected output:
REPOSITORY                   TAG      IMAGE ID     CREATED          SIZE
openthinker-7b-huggingface   latest   [IMAGE ID]   10 minutes ago   6GB
Now, you can run the container with the following command:
docker run -d --name openthinker_huggingface -p 5000:5000 openthinker-7b-huggingface
This command:
Runs the container in detached mode (-d).
Names the container openthinker_huggingface (--name).
Maps port 5000 in the container to port 5000 on the host (-p 5000:5000).
Verify the Container is Running:
docker ps
Expected output:
CONTAINER ID     IMAGE                        COMMAND           STATUS         PORTS                    NAMES
[Container ID]   openthinker-7b-huggingface   "python app.py"   Up 2 minutes   0.0.0.0:5000->5000/tcp   openthinker_huggingface
You can now interact with OpenThinker 7B via HTTP requests to port 5000. To check if the model is running:
curl http://localhost:5000
Expected output:
{"message": "Model is up and running"}
Alternatively, you can interact programmatically using Python:
import requests

response = requests.post("http://localhost:5000", json={"text": "What is the significance of deep learning in AI?"})
print(response.json())
Expected output:
{"response": "Deep learning is a subset of machine learning that utilizes neural networks..."}
To stop the container:
docker stop openthinker_huggingface
To remove the container:
docker rm openthinker_huggingface
To remove the Docker image:
docker rmi openthinker-7b-huggingface
Downloading and running OpenThinker 7B via Docker and Hugging Face provides a consistent, reproducible environment for deploying a large language model. Because everything is packaged in a Docker image, the setup can be recreated on any system that runs Docker, avoiding dependency issues and simplifying deployment.
Ready to transform your business with our technology solutions? Contact us today to leverage our AI/ML expertise.
Contact Us