Ollama
Deploy LLMs as a part of your Xano instance
Through Xano's Microservice feature, you can deploy Ollama as part of your Xano instance, enabling secure communication with an off-the-shelf or customizable LLM tailored specifically to your needs.
Ollama is a platform that enables seamless integration of large language models (LLMs) into various applications. It provides an environment where businesses can deploy pre-built or customized LLMs, facilitating secure and efficient communication tailored to specific business needs.
This makes it easier for companies to leverage advanced AI technology without extensive in-house development, and without the concerns of sending data to a third-party AI provider. In addition, deploying Ollama as a part of your Xano instance can be a significant cost-saving measure in comparison to leveraging a third party service.
Deploying Ollama as a microservice in Xano helps address a critical data privacy challenge when building AI-enabled applications that handle sensitive information.
By processing data within your controlled infrastructure boundary rather than sending it to external AI providers, using Xano with microservices removes one significant barrier to developing secure applications that can still leverage the benefits of AI models.
While this approach addresses an important technical aspect of data privacy, comprehensive security and compliance will require implementing additional safeguards appropriate to your specific regulatory environment.
Decide which model you're deploying before you start, so you can allocate the appropriate resources. Check out the resource linked below if you need help choosing a model.
A persistent volume is a place where the microservice can store data that survives restarts. Ollama will use this volume to store the model(s) that you're working with.
When browsing Ollama models, use the Size column for your chosen model to determine how large of a volume you should deploy.
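As a rough sizing sketch, the volume needs to hold at least the combined Size-column values of every model you plan to pull, plus headroom for additional or upgraded models. The 1.5x headroom factor below is an illustrative assumption, not a Xano or Ollama recommendation:

```python
import math

def recommended_volume_gb(model_sizes_gb, headroom=1.5):
    """Return a whole-GB volume size that fits all models plus headroom.

    headroom=1.5 is an assumed safety factor -- adjust to taste.
    """
    total = sum(model_sizes_gb)
    return math.ceil(total * headroom)

# Example: a single ~4.7 GB model (e.g. a llama3 variant)
print(recommended_volume_gb([4.7]))        # -> 8
# Example: two models stored on the same volume
print(recommended_volume_gb([4.7, 3.8]))   # -> 13
```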
Name the volume ollama, select the size, and choose SSD as the storage class. When you're ready, click Add.
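Once the microservice is running, your backend can talk to it over Ollama's standard REST API. The sketch below builds a non-streaming request to the `/api/generate` endpoint; the base URL is an assumption — substitute the hostname and port Xano assigns to your microservice:

```python
import json
from urllib import request

# Assumed internal hostname/port for the deployed microservice;
# 11434 is Ollama's default port.
OLLAMA_URL = "http://ollama:11434"

def build_generate_request(model, prompt):
    """Build a non-streaming POST to Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Usage (requires the microservice to be reachable):
# req = build_generate_request("llama3", "Summarize this support ticket: ...")
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Because the request never leaves your instance's network boundary, sensitive prompt data stays within infrastructure you control.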