
I would like to point AI Assist at Ollama via its standard OpenAI-compatible API (https://ollama.com/blog/openai-compatibility?ref=upstract.com).

This would allow me to run completely locally, avoiding both the concerns of sending AI Assist traffic over the web to third parties and the costs of models running in the cloud.
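For reference, Ollama's OpenAI compatibility layer serves chat completions at /v1 on its default port 11434; an API key must be set for OpenAI-style clients but is ignored by the server. A minimal sketch of the request shape (the model name "llama3" is just an example of a locally pulled model):

```python
import json

# Ollama's OpenAI-compatible endpoint (default local install).
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion request for a local Ollama server."""
    return {
        "url": f"{OLLAMA_BASE_URL}/chat/completions",
        "headers": {
            "Content-Type": "application/json",
            # Ollama ignores the key's value, but OpenAI clients require one.
            "Authorization": "Bearer ollama",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("llama3", "Hello!")
print(req["url"])  # → http://localhost:11434/v1/chat/completions
```

Any tool that lets you override the OpenAI base URL and API key could then target this endpoint instead of api.openai.com.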


@oliver.morris thanks for sharing. It sounds very interesting; I will check it out.


The product team is taking AI Assist in a new direction. However, to call any AI API, the OpenAPI transformer in 2024.1 can be used.