Ollama - how to get reproducible outputs
Setting temperature to 0 makes decoding greedy (the most likely token is picked at every step), and a fixed seed pins down any remaining sampling randomness, so the same prompt against the same model returns the same output:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:1b",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": {
    "seed": 100000000,
    "temperature": 0
  }
}' | jq
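
To verify determinism end to end, a minimal sketch (assuming the server and model above are available; the run1.txt/run2.txt file names are arbitrary) is to send the identical request twice and diff the generated text:

# Send the same request twice and compare only the generated text;
# no diff output means the two runs were byte-identical.
body='{
  "model": "llama3.2:1b",
  "prompt": "Why is the sky blue?",
  "stream": false,
  "options": { "seed": 100000000, "temperature": 0 }
}'
curl -s http://localhost:11434/api/generate -d "$body" | jq -r '.response' > run1.txt
curl -s http://localhost:11434/api/generate -d "$body" | jq -r '.response' > run2.txt
diff run1.txt run2.txt && echo identical

Note that, per the issues linked below, identical output is only expected on the same hardware, backend, and Ollama version; runs on different GPUs (e.g. CUDA vs. ROCm) can still diverge.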
Sources
- My Ollama Generate Advance detailed guide | ComfyUI
- ollama/docs/api.md at main · ollama/ollama
- Random Seed Parameter · Issue #2773 · ollama/ollama
- The "seed" is not working reliable for me. · Issue #1749 · ollama/ollama
- rand=-1 to use a random seed, not rand=0 as documented · Issue #386 · abetlen/llama-cpp-python
- ch07 - ollama reproducibility · Issue #249 · rasbt/LLMs-from-scratch
- Is there a way to set a a "random seed" for responses with temperature > 0? - Prompting - OpenAI Developer Forum
- Using a seed with OpenAI API sets temperature to 0 · Issue #5044 · ollama/ollama
- First value different on CUDA/ROCM when setting `seed` · Issue #4990 · ollama/ollama
- Llama3: Generated outputs inconsistent despite seed and temperature · Issue #5321 · ollama/ollama
- Steering sampling and decoding strategy with `num_beams` and `do_sample` · ggerganov/llama.cpp · Discussion #8265