Supported LLMs
Galileo comes with support for the following LLMs out of the box. In the Playground, you will see models for which you’ve added an integration.
Check out Setting up your LLMs for instructions on how to set up your integrations.
AZURE OPENAI
gpt-3.5-turbo (4K Context)
gpt-3.5-turbo (16K Context)
gpt-4 (8K Context)
OPENAI
gpt-3.5-turbo (4K)
gpt-3.5-turbo (16K, 0125)
gpt-3.5-turbo (16K, 1106)
gpt-3.5-turbo (16K)
gpt-4 (8K)
gpt-4 (32K)
GPT-4 Turbo (0125)
GPT-4 Turbo
babbage-002
davinci-002
VERTEX AI
text-bison@001
text-bison
gemini-1.0-pro
WRITER
Palmyra Base
Palmyra Large
Palmyra Instruct
Palmyra Instruct 30
Palmyra Beta Silk Road
Palmyra E
Palmyra X
Palmyra X 32K
Palmyra Med
Exam Works
AWS BEDROCK
AWS - Titan TG1 Large (Bedrock)
AWS - Titan Lite v1 (Bedrock)
AWS - Titan Express v1 (Bedrock)
Cohere - Command v14 (Bedrock)
Cohere - Command Light v14 (Bedrock)
AI21 - Jurassic-2 Mid v1 (Bedrock)
AI21 - Jurassic-2 Ultra v1 (Bedrock)
Anthropic - Claude Instant v1 (Bedrock)
Anthropic - Claude v1 (Bedrock)
Anthropic - Claude v2 (Bedrock)
Anthropic - Claude v2.1 (Bedrock)
Anthropic - Claude 3 Sonnet (Bedrock)
Anthropic - Claude 3 Haiku (Bedrock)
Meta - Llama 2 Chat 13B v1 (Bedrock)
Mistral - 7B Instruct (Bedrock)
Mistral - Mixtral 8x7B Instruct (Bedrock)
Mistral - Large (Bedrock)
DATABRICKS
Mixtral-8x7B Instruct
Meta Llama 3.1 405B Instruct
Meta Llama 3.1 70B Instruct
DBRX Instruct
Want to use these models in pq.run()? Check out the API docs here.
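As a quick orientation, a minimal run might look like the sketch below. It assumes the `promptquality` package is installed and that you have already configured the relevant integration; the console URL, model alias string, dataset path, and project name are illustrative, so check the API docs for the exact aliases your integrations expose.

```python
import promptquality as pq

# Authenticate against your Galileo console (URL is illustrative).
pq.login("console.your-galileo-instance.com")

# Select one of the supported models above by its alias.
# Assumption: the alias string matches how the model appears in your integration.
settings = pq.Settings(
    model_alias="gpt-3.5-turbo",
    temperature=0.7,
    max_tokens=256,
)

# Run a prompt template over a dataset and log the results to Galileo.
run = pq.run(
    template="Summarize the following support ticket: {ticket}",
    dataset="tickets.csv",  # local CSV with a `ticket` column (illustrative)
    settings=settings,
    project_name="supported-llms-demo",
)
```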
SAGEMAKER
Any model hosted on Amazon SageMaker is supported; this requires the integration to be set up. See instructions here.
Not finding the model you’re looking for? You can also log your model responses to Galileo via the LangChain callback or our custom loggers. See here for details.
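For LangChain users, a minimal sketch of logging via the callback is shown below. It assumes the `promptquality` and `langchain-openai` packages; the console URL, project name, scorer selection, and prompt are illustrative.

```python
import promptquality as pq
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Authenticate against your Galileo console (URL is illustrative).
pq.login("console.your-galileo-instance.com")

# Create the Galileo callback; project name and scorer choice are illustrative.
galileo_callback = pq.GalileoPromptCallback(
    project_name="langchain-logging-demo",
    scorers=[pq.Scorers.toxicity],
)

prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")

# Pass the callback so the chain run is logged to Galileo.
chain.invoke(
    {"question": "What is retrieval-augmented generation?"},
    config={"callbacks": [galileo_callback]},
)

# Flush the logged rows to the Galileo console.
galileo_callback.finish()
```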