Adding Custom LLM APIs / Fine Tuned LLMs
Showcases how to use Galileo with any LLM API or custom fine-tuned LLM that Galileo doesn't support out of the box.
Galileo comes pre-configured with dozens of LLM integrations across platforms including OpenAI, Azure OpenAI, Amazon SageMaker, and Amazon Bedrock.
However, if you're using an LLM service or a custom model that Galileo doesn't support, you can still get everything Galileo has to offer by using our workflow loggers.
In this guide, we showcase how to call Anthropic's claude-3-sonnet LLM outside of Galileo's built-in integrations, and then use Galileo to run deep evaluations and analysis.
First, install the required libraries. In this example, those are Galileo's Python client, Anthropic, and LangChain.
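For example, assuming you're using Galileo's `promptquality` Python client together with the Anthropic SDK and the LangChain Anthropic integration (swap in whichever packages match your setup), the install might look like:

```bash
pip install promptquality anthropic langchain-anthropic
```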
Here's a simple code snippet showing how to query the LLM of your choice (in this case, an Anthropic model) and log the results to Galileo.
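The sketch below is one way to wire this up. It assumes the `promptquality` client and its `EvaluateRun` workflow logger; the console URL, project name, run name, prompt, and scorer list are placeholders, so adapt them to your own deployment:

```python
import time

import promptquality as pq
from langchain_anthropic import ChatAnthropic

# Log in to your Galileo console (URL below is a placeholder).
pq.login("console.your-galileo-instance.com")

# Query the custom / unsupported LLM directly through LangChain.
# Requires ANTHROPIC_API_KEY to be set in your environment.
llm = ChatAnthropic(model="claude-3-sonnet-20240229")
prompt = "Explain the difference between precision and recall."

start = time.perf_counter_ns()
response = llm.invoke(prompt)
duration_ns = time.perf_counter_ns() - start

# Create an evaluation run and log the LLM call as a workflow step.
# The scorers shown here are illustrative; pick the ones you need.
evaluate_run = pq.EvaluateRun(
    run_name="custom-llm-run",
    project_name="custom-llm-demo",
    scorers=[pq.Scorers.correctness, pq.Scorers.toxicity],
)

workflow = evaluate_run.add_workflow(
    input=prompt,
    output=response.content,
    duration_ns=duration_ns,
)
workflow.add_llm(
    input=prompt,
    output=response.content,
    model="claude-3-sonnet-20240229",
    duration_ns=duration_ns,
)

# Upload the run to Galileo for evaluation and analysis.
evaluate_run.finish()
```

Because the workflow logger only needs the inputs, outputs, and metadata you hand it, the same pattern works for any fine-tuned or self-hosted model: call your model however you normally would, then record the step with `add_llm`.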
You should see a result like the one shown below: