Get up and running with Ollama and its dependencies through Docker Compose with minimal setup.
Leverage GPU acceleration for faster model inference, enabled with a small configuration change.
Includes a development container for testing and experimentation, supporting both Docker containers and Python virtual environments.
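A minimal Compose file along these lines starts the Ollama server with NVIDIA GPU access; the service name, volume name, and model storage path are illustrative assumptions, not necessarily what this repo uses:

```yaml
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"             # default Ollama API port
    volumes:
      - ollama:/root/.ollama      # persist downloaded models across restarts
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia      # requires the NVIDIA Container Toolkit on the host
              count: all
              capabilities: [gpu]

volumes:
  ollama:
```

Running `docker compose up -d` with a file like this brings the API up on `http://localhost:11434`; omit the `deploy` block for CPU-only hosts.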
Auto Ollama also includes test Python scripts that use LangChain to call the Ollama API programmatically.
App Container