Create a model that uses ollama_engine as an engine.
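A minimal sketch of such a statement, assuming a hypothetical model name (my_ollama_model), a target column named completion, and an Ollama server on its default port; the parameter names follow MindsDB's CREATE MODEL syntax:

```sql
CREATE MODEL my_ollama_model
PREDICT completion
USING
    engine = 'ollama_engine',                       -- the Ollama engine
    model_name = 'llama2',                          -- any model available in Ollama
    ollama_serve_url = 'http://localhost:11434';    -- where Ollama is serving
```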
If you run MindsDB in Docker, replace localhost with an address that is reachable from inside the container. For example, ollama_serve_url = 'http://host.docker.internal:11434'.
Use ollama_engine to create a model with the CREATE MODEL statement.
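Before a model can reference ollama_engine, the engine itself must be created. A sketch of that step, assuming the statement shape of MindsDB's CREATE ML_ENGINE syntax with the Ollama handler:

```sql
-- Register the Ollama handler as an ML engine named ollama_engine
CREATE ML_ENGINE ollama_engine
FROM ollama;
```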
Deploy and use the llama2 model.
First, download Ollama and run the model locally by executing ollama run llama2.
Now deploy this model within MindsDB.
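A sketch of the deployment statement, assuming a hypothetical model name (llama2_model) and a target column named completion:

```sql
CREATE MODEL llama2_model
PREDICT completion
USING
    engine = 'ollama_engine',    -- the engine created from the Ollama handler
    model_name = 'llama2';       -- the model started with `ollama run llama2`
```

Once created, the model can be queried like a table, for example: SELECT completion FROM llama2_model WHERE text = 'Hello';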