Prerequisites
Before proceeding, ensure the following prerequisites are met:

1. Install MindsDB locally via Docker or use MindsDB Cloud.
2. To use Ollama within MindsDB, install the required dependencies following this instruction.
3. Follow this instruction to download Ollama and run models locally.
Here are the recommended system specifications:
- A working Ollama installation, as in point 3.
- For 7B models, at least 8GB RAM is recommended.
- For 13B models, at least 16GB RAM is recommended.
- For 70B models, at least 64GB RAM is recommended.
Setup
Create an AI engine from the Ollama handler. Then, create a model using `ollama_engine` as an engine.
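The two steps above can be sketched as follows, assuming MindsDB's standard `CREATE ML_ENGINE` and `CREATE MODEL` syntax; the model name, predicted column, and `model_name` value are illustrative placeholders:

```sql
-- Register the Ollama handler as an AI engine
CREATE ML_ENGINE ollama_engine
FROM ollama;

-- Create a model that uses this engine; the model named here
-- must already be available in the local Ollama installation
CREATE MODEL my_ollama_model
PREDICT completion
USING
    engine = 'ollama_engine',
    model_name = 'llama2';
```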
If you run Ollama and MindsDB in separate Docker containers, replace `localhost` with the hostname of the Ollama container. For example, `ollama_serve_url = 'http://host.docker.internal:11434'`.
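In that containerized setup, the model definition might look like this; the model name and predicted column are illustrative, while `ollama_serve_url` (from the note above) points MindsDB at the Ollama container:

```sql
CREATE MODEL ollama_docker_model
PREDICT completion
USING
    engine = 'ollama_engine',
    model_name = 'llama2',
    -- reach the Ollama container from inside the MindsDB container
    ollama_serve_url = 'http://host.docker.internal:11434';
```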
Usage
The following usage examples utilize `ollama_engine` to create a model with the `CREATE MODEL` statement.
Deploy and use the `llama2` model.

First, download Ollama and run the model locally by executing `ollama run llama2`.
Now deploy this model within MindsDB.
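A sketch of the deployment and a test query, assuming the `ollama_engine` created in the Setup section; the model name and the `text`/`completion` column names are illustrative:

```sql
-- Deploy the locally running llama2 model in MindsDB
CREATE MODEL llama2_model
PREDICT completion
USING
    engine = 'ollama_engine',
    model_name = 'llama2';

-- Query the model with a single prompt
SELECT text, completion
FROM llama2_model
WHERE text = 'Hello, how are you?';
```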
Next Steps

Go to the Use Cases section to see more examples.