Deploy OpenThinker 7B Using Docker: Hugging Face Integration Explained
Introduction
Hugging Face hosts a wide range of large language models (LLMs), including OpenThinker 7B. Downloading and running the model inside Docker keeps the environment consistent, portable, and easy to scale. In this guide, we will pull the OpenThinker 7B model directly from the Hugging Face Hub and serve it from a Docker container.
Step 1: Install Docker
Before we can download OpenThinker 7B, we need to have Docker installed on our system. If you haven't installed Docker yet, follow these steps for your platform:
For Ubuntu:
sudo apt update && sudo apt upgrade -y
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
Verify Installation:
To confirm that Docker is installed, run:
docker --version
This should output something like:
Docker version 24.0.5, build [Build ID]
Step 2: Pull the Hugging Face Docker Image
To run OpenThinker 7B with Docker, we will use a Hugging Face-supported Docker image: the official Hugging Face transformers image from Docker Hub.
Run the following command to pull the image:
docker pull huggingface/transformers
Once the image is pulled, verify by listing the available images:
docker images | grep transformers
Expected output:
REPOSITORY                 TAG      IMAGE ID     CREATED      SIZE
huggingface/transformers   latest   [Image ID]   2 days ago   5.2GB
Step 3: Create a Dockerfile for OpenThinker 7B
Now, let's create a custom Dockerfile that will set up OpenThinker 7B in the Hugging Face Docker container.
Create a New Directory for Your Project:
mkdir OpenThinker-7B-Docker-HuggingFace && cd OpenThinker-7B-Docker-HuggingFace
Create the Dockerfile:
Create a file named Dockerfile in this directory:
touch Dockerfile
nano Dockerfile
Add the following content to the Dockerfile:
# Use Hugging Face's official transformers image as base
FROM huggingface/transformers

# Install necessary dependencies
RUN pip install --upgrade pip && pip install torch transformers

# Download the OpenThinker 7B model from the Hugging Face Hub at build time
RUN python -c "from transformers import AutoModelForCausalLM, AutoTokenizer; \
    AutoModelForCausalLM.from_pretrained('open-thoughts/OpenThinker-7B'); \
    AutoTokenizer.from_pretrained('open-thoughts/OpenThinker-7B')"

# Copy the model server script into the image
COPY app.py .

# Expose the necessary port
EXPOSE 5000

# Start the model server
CMD ["python", "app.py"]

This Dockerfile does the following:
- Pulls the Hugging Face transformers image as the base
- Installs the necessary Python dependencies (torch, transformers)
- Downloads the OpenThinker 7B model from the Hugging Face Hub at build time, so the weights are baked into the image
- Copies the app.py server script into the image
- Exposes port 5000 for model interaction
Press CTRL + X, then Y and hit Enter to save the file.
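The Dockerfile's CMD launches an app.py server that the guide does not otherwise define. Below is a minimal sketch of such a script using only Python's standard library, matching the health-check and inference routes used later in this guide; the commented-out transformers pipeline call inside generate() is an assumption about how you would wire in the actual model, not a verified API usage. Save it as app.py next to the Dockerfile.

```python
# app.py - minimal HTTP server matching the routes used later in this guide.
# Standard library only; the real model call is sketched in a comment because
# loading a 7B checkpoint is environment-specific.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def generate(text):
    # Replace this stub with real inference, e.g. (assumption, untested here):
    #   from transformers import pipeline
    #   pipe = pipeline("text-generation", model="open-thoughts/OpenThinker-7B")
    #   return pipe(text)[0]["generated_text"]
    return f"(stub) received: {text}"

class Handler(BaseHTTPRequestHandler):
    def _send_json(self, payload, status=200):
        body = json.dumps(payload).encode("utf-8")
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        # Health check: `curl http://localhost:5000`
        self._send_json({"message": "Model is up and running"})

    def do_POST(self):
        # Inference: POST a JSON body like {"text": "..."}
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length) or b"{}")
        self._send_json({"response": generate(data.get("text", ""))})

    def log_message(self, fmt, *args):
        pass  # keep container logs quiet; remove to restore request logging

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 5000), Handler).serve_forever()
```

The stub keeps the image buildable and testable even before the model wiring is in place; only generate() needs to change to serve real completions.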
Step 4: Build the Docker Image
Run the following command to build the image:
docker build -t openthinker-7b-huggingface .
After the build process completes, verify the image:
docker images | grep openthinker-7b-huggingface
Expected output:
REPOSITORY                   TAG      IMAGE ID     CREATED          SIZE
openthinker-7b-huggingface   latest   [IMAGE ID]   10 minutes ago   6GB
Step 5: Run the Docker Container
Now, you can run the container with the following command:
docker run -d --name openthinker_huggingface -p 5000:5000 openthinker-7b-huggingface
This command:
- Runs the container in detached mode (-d)
- Maps the container port 5000 to your host’s port 5000
- Names the container openthinker_huggingface
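A 7B model is far more responsive on a GPU. If your host has an NVIDIA GPU and the NVIDIA Container Toolkit installed (an assumption about your hardware, not a required step), the same run command can pass the GPU through to the container:

```shell
# Same run command, with GPU access (requires NVIDIA Container Toolkit)
docker run -d --gpus all \
  --name openthinker_huggingface \
  -p 5000:5000 \
  openthinker-7b-huggingface
```

Without `--gpus all`, the container falls back to CPU inference, which works but will be slow for a model of this size.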
Verify the Container is Running:
docker ps
Expected output:
CONTAINER ID     IMAGE                        COMMAND           STATUS         PORTS                    NAMES
[Container ID]   openthinker-7b-huggingface   "python app.py"   Up 2 minutes   0.0.0.0:5000->5000/tcp   openthinker_huggingface
Step 6: Access the Model
You can now interact with OpenThinker 7B via HTTP requests to port 5000. To check if the model is running:
curl http://localhost:5000
Expected output:
{"message": "Model is up and running"}
Alternatively, you can interact programmatically using Python:
import requests

response = requests.post(
    "http://localhost:5000",
    json={"text": "What is the significance of deep learning in AI?"}
)
print(response.json())
Expected output:
{"response": "Deep learning is a subset of machine learning that utilizes neural networks..."}
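Loading 7B of weights can take minutes, so the first request after docker run may fail with a connection error. A small readiness helper (illustrative, standard library only; the URL and timeouts are example values) that polls the health endpoint until the server answers:

```python
# Poll the container's health endpoint until it responds, since the model
# can take a while to load after `docker run`.
import time
import urllib.error
import urllib.request

def wait_until_ready(url, timeout=120.0, interval=2.0):
    """Return True once `url` answers with HTTP 200, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not accepting connections yet; retry
        time.sleep(interval)
    return False
```

For example, call `wait_until_ready("http://localhost:5000")` before issuing the requests.post call above.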
Step 7: Stopping and Removing the Container
To stop the container:
docker stop openthinker_huggingface
To remove the container:
docker rm openthinker_huggingface
To remove the Docker image:
docker rmi openthinker-7b-huggingface
Conclusion
Downloading and running OpenThinker 7B via Docker and Hugging Face provides a consistent, reproducible environment for deploying large language models. The containerized setup behaves identically across systems, sidesteps dependency conflicts, and keeps installation down to a handful of commands.