Hey everyone! The prompts Quickstart guide with LangChain is great:
However, when trying to log a simpler model than the one from the agent example, I cannot see a trace that shows me the model architecture etc. How can I accomplish this? The use case is comparing different LLMs from OpenAI and Hugging Face on a single set of prompts. I'm new to WandB, so your help is much appreciated!
from langchain import LLMChain, OpenAI, PromptTemplate
from langchain.callbacks.tracers import WandbTracer

# Configure the W&B run the trace will be logged to
wandb_config = {"project": "wandb_langchain_simple_documentation"}
tracer = WandbTracer(wandb_config)

prompt_template = "What is a good name for a company that makes {product}?"
llm = OpenAI(temperature=0)
llm_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate.from_template(prompt_template),
)

# Pass the tracer as a callback so the call is captured in W&B
llm_chain("colorful socks", callbacks=[tracer])
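To make the use case concrete, here is a minimal sketch of the comparison loop I have in mind: run every model over the same prompt set and collect the completions side by side. The `compare_models` helper and the stand-in callables are my own invention for illustration; in practice the values in `models` would be LangChain LLM objects (e.g. `OpenAI(...)` or a Hugging Face wrapper), each invoked under the tracer.

```python
# Hypothetical sketch: compare several "models" on one prompt set.
# `models` maps a label to any callable taking a prompt and returning text.

def compare_models(models, prompts):
    """Return {prompt: {model_name: completion}} for every combination."""
    results = {}
    for prompt in prompts:
        results[prompt] = {name: llm(prompt) for name, llm in models.items()}
    return results

# Stand-in "models" so the sketch runs without any API keys.
models = {
    "upper": lambda p: p.upper(),
    "reverse": lambda p: p[::-1],
}
prompts = ["colorful socks", "sturdy boots"]
print(compare_models(models, prompts))
```

Is something like this the right shape, and if so, how do I get each model's run to show up as its own trace?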