OLLAMA

Tutorial

Check out the official API docs here

Run Ollama

# Only available on your machine
ollama serve

# Allow remote hosts
OLLAMA_HOST=0.0.0.0:11434 ollama serve
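
A quick way to confirm the server is actually up is the version endpoint (this assumes the default port 11434):

```shell
# Returns a small JSON object with the server version
curl http://localhost:11434/api/version
```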

Download some models. More can be found here



curl http://localhost:11434/api/pull -d '{
  "name": "mannix/llama3-uncensored"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "llama3:8b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "dolphin-llama3:8b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "mistral:7b"
}'

curl http://localhost:11434/api/pull -d '{
  "name": "llava"
}'


curl http://localhost:11434/api/pull -d '{
  "name": "mxbai-embed-large"
}'
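
If you pull a model you no longer want, it can be removed again with the delete endpoint (the model name below is just an example, using one of the models pulled above):

```shell
# Remove a model from local storage
curl -X DELETE http://localhost:11434/api/delete -d '{
  "name": "mistral:7b"
}'
```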

List Models


curl http://localhost:11434/api/tags | jq
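
To list just the model names instead of the full JSON, a jq filter works well (assuming the `.models[].name` path of the tags response):

```shell
# Print one installed model name per line
curl -s http://localhost:11434/api/tags | jq -r '.models[].name'
```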

Generation


curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
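
With "stream": true (the default when the field is omitted), the API instead returns one JSON object per line as tokens are generated, which you can watch arrive live:

```shell
# Each line is a JSON fragment with a "response" field;
# the final line has "done": true
curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Why is the sky blue?"
}'
```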

Chat Completion


curl http://localhost:11434/api/chat -d '{
  "model": "llama3:8b",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
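
The messages array carries the whole conversation, so a follow-up question includes the earlier turns. A sketch of a second turn (the assistant content here is a stand-in for whatever the model actually replied):

```shell
curl http://localhost:11434/api/chat -d '{
  "model": "llama3:8b",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" },
    { "role": "assistant", "content": "Because of Rayleigh scattering." },
    { "role": "user", "content": "Explain that like I am five." }
  ]
}'
```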

Embeddings


curl http://localhost:11434/api/embeddings -d '{
  "model": "llama3:8b",
  "prompt": "Hello World"
}'
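
The response contains a single "embedding" array; its length (the vector dimension) can be checked with jq. A dedicated embedding model such as mxbai-embed-large, pulled above, is the more typical choice here:

```shell
# Print the embedding dimension
curl -s http://localhost:11434/api/embeddings -d '{
  "model": "mxbai-embed-large",
  "prompt": "Hello World"
}' | jq '.embedding | length'
```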

Sources