Archived

AI Assist to use local AI endpoint

Related products: Integrations

oliver.morris
Contributor

I would like to point the AI Assist to use Ollama - with the standard OpenAI API spec (https://ollama.com/blog/openai-compatibility?ref=upstract.com)

This would allow me to run completely locally and avoid the concerns of running AI Assist over the web to third party sources or incurring costs of models running in the cloud.
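For context, Ollama's OpenAI-compatible endpoint is a plain OpenAI-style chat completions API served locally. A minimal sketch of what such a call looks like, assuming Ollama is running on its default port (11434) and a model such as `llama3` has already been pulled (both are assumptions, not part of any AI Assist integration):

```python
# Sketch: building an OpenAI-style chat completion request aimed at a
# local Ollama server. Model name and port are assumptions.
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # OpenAI-compatible API root

def build_chat_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a chat completion request for the local Ollama endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key, but OpenAI-style clients expect one.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )

req = build_chat_request("Summarize this workspace in one sentence.")
# To actually send it (requires a running Ollama server):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape matches the OpenAI spec, anything that can target a configurable base URL could in principle be pointed at this local server instead of a cloud provider.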


2 replies

danilo_fme
Evangelist
  • Evangelist
  • February 12, 2024
oliver.morris wrote:

I would like to point the AI Assist to use Ollama - with the standard OpenAI API spec (https://ollama.com/blog/openai-compatibility?ref=upstract.com)

This would allow me to run completely locally and avoid the concerns of running AI Assist over the web to third party sources or incurring costs of models running in the cloud.


@oliver.morris thanks for sharing this with us. It sounds very interesting. I will check it out.


LizAtSafe
Safer
  • Safer
  • October 29, 2024
The product team is taking AI Assist in a new direction. However, to call any AI API, the OpenAPI transformer available in FME 2024.1 can be used.
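For anyone trying that route: the request the OpenAPI transformer would need to issue is just an OpenAI-style chat completion call. A sketch of the JSON request body one might configure, POSTed to `http://localhost:11434/v1/chat/completions` (the model name and local endpoint are assumptions; any OpenAI-compatible server works):

```json
{
  "model": "llama3",
  "messages": [
    {"role": "user", "content": "Translate this attribute value to English."}
  ],
  "stream": false
}
```

Setting `"stream": false` keeps the response as a single JSON document, which is easier to parse downstream than a streamed reply.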
