FME Hub user sanaeatsafe just uploaded a new transformer, the OllamaConnector, to the FME Hub.

Description

The OllamaConnector lets you interact with the Ollama API from within an FME Workspace, generating responses from locally deployed large language models (LLMs).

Examples of local LLMs that can be used with the OllamaConnector include Llama, LLaVA, Phi, Gemma, Mistral, Moondream, DeepSeek, and Starling.

For more information, review the Ollama API documentation.
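The transformer's exact parameters aren't documented in this post, but the underlying Ollama call it wraps is roughly like the sketch below, which assumes a local Ollama server on its default port (11434) and a model such as llama3 already pulled with `ollama pull llama3`:

```python
import json
import urllib.request

# Assumption: Ollama is running locally on its default port and the
# "llama3" model has already been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",
    "prompt": "Summarize the purpose of an FME workspace in one sentence.",
    "stream": False,  # return one JSON object instead of a streamed response
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))

# The generated text is returned in the "response" field.
print(result["response"])
```

In an FME Workspace, the OllamaConnector handles this request for you, so no scripting is required; the snippet is only meant to show what the Ollama generate endpoint expects and returns.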


