Ollama
Local LLM tool
Run large language models (LLMs) locally with minimal setup
Concepts
- Models: downloadable model weights (e.g. llama3, mistral), pulled with `ollama pull`
- Modelfile: declarative config for deriving a customized model from a base one
- API: local HTTP server (default port 11434) with both a native and an OpenAI-compatible endpoint
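A minimal Modelfile sketch (base model, parameter value, and system prompt are illustrative):

```
# Modelfile: derive a customized model from a base one
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a concise technical assistant."
```

Build it with `ollama create my-assistant -f Modelfile`, then run it with `ollama run my-assistant`.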
Pros and Cons
Pros
- + Very easy to use
- + Many models available
- + OpenAI-compatible API
- + Privacy (local)
Cons
- - Requires powerful hardware
- - Large models consume significant RAM/VRAM
- - Slower than cloud APIs
Use Cases
- Local LLM app development
- Privacy-sensitive apps
- Experimentation
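For the app-development use case, Ollama also exposes its own native REST API at /api/generate; a stdlib-only sketch of a non-streaming call (server address and model name are assumptions):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's native endpoint

def build_generate_payload(model: str, prompt: str) -> bytes:
    # stream=False asks for a single JSON response instead of streamed chunks
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

body = build_generate_payload("llama3", "Summarize Ollama in one sentence.")
# With `ollama serve` running:
# req = urllib.request.Request(
#     OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```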