Ollama Tutorial

Check out the official API docs here

Run Ollama


# Listen on localhost only (the default)
ollama serve

# Allow connections from remote hosts
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Windows (cmd.exe; in PowerShell use $env:OLLAMA_HOST = "0.0.0.0")
set OLLAMA_HOST=0.0.0.0
ollama serve
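
To confirm the server is up, hit the version endpoint (11434 is Ollama's default port):

# Quick health check; returns something like {"version":"0.3.12"}
curl http://localhost:11434/api/version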

Download some models. More can be found here.



curl http://localhost:11434/api/pull -d '{
  "name": "mannix/llama3-uncensored"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "llama3:8b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "dolphin-llama3:8b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "mistral:7b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "llava"
}'


curl http://localhost:11434/api/pull -d '{
  "name": "mxbai-embed-large"
}'

# Other models worth trying: phi2, gemma2
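
Pulls stream JSON progress objects by default. To block until the download finishes and get a single final status instead, set stream to false:

# Return one final status object instead of streaming progress
curl http://localhost:11434/api/pull -d '{
  "name": "llama3:8b",
  "stream": false
}'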


List Models


curl -s http://localhost:11434/api/tags | jq
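
To list only the model names, filter the models array from the response:

# Print one model name per line
curl -s http://localhost:11434/api/tags | jq -r '.models[].name'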

Generation



# Point at a remote Ollama server (swap in your own address)
export OLLAMA_HOST="http://192.168.7.209:11434"

curl $OLLAMA_HOST/api/generate \
-d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
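
With "stream": false the whole reply arrives as one JSON object and the generated text sits in the response field, so you can extract it with jq:

# Print only the generated text
curl -s $OLLAMA_HOST/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}' | jq -r '.response'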

Chat Completion


curl http://localhost:11434/api/chat -d '{
  "model": "llama3:8b",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
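
Chat responses stream token by token by default. For multi-turn conversations, send the earlier exchange back in the messages array; a sketch with streaming disabled (the follow-up question is just an illustration):

# Multi-turn chat: include earlier turns so the model has context
curl -s http://localhost:11434/api/chat -d '{
  "model": "llama3:8b",
  "messages": [
    {"role": "user", "content": "why is the sky blue?"},
    {"role": "assistant", "content": "Because of Rayleigh scattering."},
    {"role": "user", "content": "Explain that in one sentence."}
  ],
  "stream": false
}' | jq -r '.message.content'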

Embeddings


export OLLAMA_HOST="http://192.168.7.209:11434"

curl $OLLAMA_HOST/api/embeddings -d '{
  "model": "llama3.2",
  "prompt": "Hello World"
}'
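
The response contains an embedding array of floats. mxbai-embed-large (pulled above) is a dedicated embedding model and is usually a better fit here than a chat model; a sketch that prints the vector's dimensionality with jq:

# Embed with a dedicated embedding model; print the vector length
curl -s $OLLAMA_HOST/api/embeddings -d '{
  "model": "mxbai-embed-large",
  "prompt": "Hello World"
}' | jq '.embedding | length'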
