Learn how to manually log data from your LangChain chains
We support integration with both Python-based and TypeScript-based LangChain systems:
Integrating with your Python-based LangChain application is the easiest and recommended route. Simply add GalileoObserveCallback(project_name="YOUR_PROJECT_NAME") to the callbacks of your chain invocation.
```python
from galileo_observe import GalileoObserveCallback
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("tell me a joke about {foo}")
model = ChatOpenAI()
chain = prompt | model

# Attach the Galileo Observe callback to this invocation so it gets logged.
monitor_handler = GalileoObserveCallback(project_name="YOUR_PROJECT_NAME")
chain.invoke({"foo": "bears"}, config=dict(callbacks=[monitor_handler]))
```
The GalileoObserveCallback logs your input, output, and relevant statistics back to Galileo, where additional evaluation metrics are computed.
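If you want every invocation of a chain to be logged without passing the callback each time, you can bind the handler once using LangChain's standard with_config helper. This is a minimal sketch using generic LangChain APIs rather than a Galileo-specific feature; the handler is the same monitor_handler created above:

```python
# Bind the callback once; subsequent invoke calls on this chain are logged.
logged_chain = (prompt | model).with_config(callbacks=[monitor_handler])

logged_chain.invoke({"foo": "bears"})
logged_chain.invoke({"foo": "penguins"})
```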