Local AI With Docker
Prerequisites
- Working WSL2 distribution.
- Working Docker installation.
On Windows this can be done by running winget install Docker.DockerDesktop in PowerShell.
You probably need to reboot the computer, or at least log out and log in again.
Then open Docker Desktop and make sure that WSL2 integration is activated.
You might need to add your WSL user to the docker group and adjust some permissions.
Do that by running (in your WSL environment, probably Bash):
sudo usermod -aG docker $USER        # add your user to the docker group
ls -l /var/run/docker.sock           # check ownership and permissions on the Docker socket
sudo chmod 666 /var/run/docker.sock  # quick fix: make the socket world-writable
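The group change only applies to new login sessions. If you do not want to log out, you can start a shell with the new group applied directly (newgrp is a standard Linux command):
newgrp docker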
Run Ollama Via Docker
Ollama lets you test different AI models on your local machine and is available as a Docker image. In the WSL2 distribution of your choice, run:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
Once Ollama has downloaded and started, visit http://localhost:11434 and make sure you get the greeting “Ollama is running”.
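The same check can be done from the command line; the root endpoint returns the greeting as plain text:
curl http://localhost:11434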
Make NVIDIA GPU Available to Docker
Remove or rename the previously created ollama container (for example with docker rm -f ollama). Then follow the installation instructions for the NVIDIA Container Toolkit. Test the setup with:
sudo docker run --rm --runtime=nvidia --gpus all ubuntu nvidia-smi
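With the toolkit installed, Ollama can be started again with GPU access. The command below is the GPU variant of the earlier run command, keeping the same volume, port, and container name:
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama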
Run a Local LLM
As the Ollama documentation notes: “You should have at least 8 GB of RAM available to run the 7B models, 16 GB to run the 13B models, and 32 GB to run the 33B models”.
You can see the amount of available RAM in WSL with free -h. There is also the option to increase the memory limits for WSL2, as shown below.
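A minimal sketch of raising the limit, assuming you want a 16 GB cap: create or edit the file .wslconfig in your Windows user profile directory, then restart WSL with wsl --shutdown:
[wsl2]
memory=16GB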
- Make sure Ollama is running by visiting http://localhost:11434.
- Choose a model from the library.
- Run the model with:
docker exec -it ollama ollama run [libraryname]
For example:
docker exec -it ollama ollama run llama2
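Besides the interactive prompt, the container also exposes Ollama's REST API on port 11434. For example, to generate a completion with llama2 (the response streams back as JSON lines):
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'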
Example Code and the Matching Models
There are examples available for Python and JavaScript. The JavaScript examples are used below.
git clone https://github.com/ollama/ollama-js.git
cd ollama-js
# Add "type": "module", to package.json
pnpm install
pnpm run build
cd examples
pnpx tsx abort/any-request.ts      # needs llama3.1
pnpx tsx multimodal/multimodal.ts  # analyzes pictures; needs llava
pnpx tsx tools/tools.ts            # tool calling; needs a model with tool support, such as llama3.1
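For reference, here is a minimal sketch of calling a local model through ollama-js directly, assuming the llama2 model pulled above and Ollama listening on the default port 11434:
import ollama from 'ollama'

// Ask the locally running llama2 model a question via the Ollama REST API.
const response = await ollama.chat({
  model: 'llama2',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})

console.log(response.message.content)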