When you integrate Weave into your code, it automatically tracks and logs LLM calls made with the Anthropic SDK for both Python and TypeScript. Weave does this by instrumenting Anthropic's Messages.create method: automatically in Python, and through the wrapAnthropic wrapper in TypeScript.
Traces
Weave automatically captures traces for the Anthropic SDK when you add weave.init("your-team-name/your-project-name") to your code. If you don’t specify a team name as an argument in weave.init(), Weave logs output to your default W&B entity. If you don’t specify a project name, Weave fails to initialize.
The following examples demonstrate how to integrate Weave into a basic call to Anthropic:
import weave
# use the anthropic library as usual
import os
from anthropic import Anthropic

weave.init("anthropic_project")

client = Anthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Tell me a joke about a dog",
        }
    ],
    model="claude-3-opus-20240229",
)
print(message.content)
import Anthropic from '@anthropic-ai/sdk';
import * as weave from 'weave';
import { wrapAnthropic } from 'weave';

await weave.init('anthropic_project');

// Wrap the Anthropic client to enable tracing
const client = wrapAnthropic(new Anthropic());

const message = await client.messages.create({
  max_tokens: 1024,
  messages: [
    {
      role: 'user',
      content: 'Tell me a joke about a dog',
    },
  ],
  model: 'claude-3-opus-20240229',
});
console.log(message.content);
When you include weave.init() in your code, Weave automatically captures tracing information and outputs links. You can view the traces in the Weave UI by clicking those links.
Wrapping with your own ops
Weave ops automatically version your code as you experiment, and capture its inputs and outputs. Simply write a function decorated with @weave.op() (Python) or wrapped with weave.op() (TypeScript) that calls into Anthropic.messages.create, and Weave tracks the inputs and outputs for you.
The following examples show you how to track a function:
import weave
import os
from anthropic import Anthropic

weave.init("anthropic_project")

client = Anthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

@weave.op()
def call_anthropic(user_input: str, model: str) -> str:
    message = client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": user_input,
            }
        ],
        model=model,
    )
    return message.content[0].text

@weave.op()
def generate_joke(topic: str) -> str:
    return call_anthropic(f"Tell me a joke about {topic}", model="claude-3-haiku-20240307")

print(generate_joke("chickens"))
print(generate_joke("cars"))
import Anthropic from '@anthropic-ai/sdk';
import * as weave from 'weave';
import { wrapAnthropic } from 'weave';

await weave.init('anthropic_project');

const client = wrapAnthropic(new Anthropic());

const callAnthropic = weave.op(async function callAnthropic(
  userInput: string,
  model: string
): Promise<string> {
  const message = await client.messages.create({
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: userInput,
      },
    ],
    model: model,
  });
  const content = message.content[0];
  return content.type === 'text' ? content.text : '';
});

const generateJoke = weave.op(async function generateJoke(
  topic: string
): Promise<string> {
  return callAnthropic(`Tell me a joke about ${topic}`, 'claude-3-haiku-20240307');
});

console.log(await generateJoke('chickens'));
console.log(await generateJoke('cars'));
When you decorate or wrap a function with weave.op(), Weave captures the function's code, inputs, and outputs. You can use ops to track any function you want, including nested functions.
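Conceptually, an op behaves like a tracing decorator that records each call's inputs and outputs, including nested calls. The following toy sketch illustrates the idea only; it is not Weave's actual implementation, and toy_op, calls, and the stub functions are invented for illustration:

```python
# Toy illustration (NOT Weave's real implementation) of what an op-style
# decorator captures: each call's function name, inputs, and output.
import functools

calls = []  # each entry: (function name, args, kwargs, result)

def toy_op(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        calls.append((fn.__name__, args, kwargs, result))
        return result
    return wrapper

@toy_op
def call_model(user_input: str, model: str) -> str:
    return f"[{model}] echo: {user_input}"  # stand-in for an API call

@toy_op
def generate_joke(topic: str) -> str:
    return call_model(f"Tell me a joke about {topic}", model="stub-model")

generate_joke("chickens")
# Both the nested call_model call and the outer generate_joke call
# are recorded, which is why nested ops show up in a trace tree.
```

Weave's real ops additionally version the function's code and attach the records to your project, but the capture pattern is the same shape.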
Create a Model for easier experimentation
The weave.Model class is only available in the Weave Python SDK. For TypeScript, use the weave.op() wrapper to track functions with structured parameters.
Organizing experimentation is difficult when there are many moving pieces. By using the Model class, you can capture and organize the experimental details of your app like your system prompt or the model you’re using. This helps organize and compare different iterations of your app.
In addition to versioning code and capturing inputs/outputs, Models capture structured parameters that control your application's behavior. This can help you find which parameters work best. You can also use Weave Models with serve and with Evaluations.
In the following example, you can experiment with model and temperature:
import weave
# use the anthropic library as usual
import os
from anthropic import Anthropic

weave.init('joker-anthropic')

class JokerModel(weave.Model):
    model: str
    temperature: float

    @weave.op()
    def predict(self, topic):
        client = Anthropic()
        message = client.messages.create(
            max_tokens=1024,
            messages=[
                {
                    "role": "user",
                    "content": f"Tell me a joke about {topic}",
                }
            ],
            model=self.model,
            temperature=self.temperature,
        )
        return message.content[0].text

joker = JokerModel(
    model="claude-3-haiku-20240307",
    temperature=0.1,
)
result = joker.predict("Chickens and Robots")
print(result)
Every time you change one of these values, Weave creates and tracks a new version of JokerModel. This allows you to associate trace data with your code changes and can help you determine which configurations work best for your use case.
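As a rough intuition for why changing a parameter yields a new version (this is an illustrative sketch, not how Weave actually computes versions), a version identifier can be derived deterministically from the structured parameter values:

```python
# Toy illustration (NOT Weave's internals): derive a version identifier
# from a Model's structured parameters, so any parameter change yields
# a distinct version while identical parameters map to the same one.
import hashlib
import json

def version_id(params: dict) -> str:
    # Canonicalize with sorted keys so key order doesn't matter
    payload = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:8]

v1 = version_id({"model": "claude-3-haiku-20240307", "temperature": 0.1})
v2 = version_id({"model": "claude-3-haiku-20240307", "temperature": 0.7})
# v1 and v2 differ: a temperature change produces a new version
```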
Tools (function calling)
Anthropic provides a tools interface that allows Claude to request function calls. Weave automatically tracks tool definitions, tool use requests, and tool results throughout the conversation.
The following truncated examples demonstrate an Anthropic tool configuration:
message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "What's the weather like in San Francisco?",
        }
    ],
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        },
    ],
    model=model,
)
print(message)
const message = await client.messages.create({
  max_tokens: 1024,
  messages: [
    {
      role: 'user',
      content: "What's the weather like in San Francisco?",
    },
  ],
  tools: [
    {
      name: 'get_weather',
      description: 'Get the current weather in a given location',
      input_schema: {
        type: 'object',
        properties: {
          location: {
            type: 'string',
            description: 'The city and state, e.g. San Francisco, CA',
          },
        },
        required: ['location'],
      },
    },
  ],
  model: 'claude-3-opus-20240229',
});
console.log(message);
Weave automatically captures the tool definitions, Claude’s tool use requests, and tool results at each step of the conversation.
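A complete tool-use exchange also feeds Claude's tool_use request back to the API as a tool_result message. The following Python sketch shows the shape of that follow-up turn using a mocked response dict in the Messages API's block format; run_tool and the mock values are hypothetical stand-ins for your own dispatch logic:

```python
# Sketch of the follow-up turn a tool-use response requires.
# mock_response mimics the block shape the Messages API returns when
# stop_reason is "tool_use"; the values themselves are invented.
mock_response = {
    "stop_reason": "tool_use",
    "content": [
        {"type": "text", "text": "Let me check the weather."},
        {
            "type": "tool_use",
            "id": "toolu_01A",
            "name": "get_weather",
            "input": {"location": "San Francisco, CA"},
        },
    ],
}

def run_tool(name: str, tool_input: dict) -> str:
    # Hypothetical dispatcher; a real app calls its own functions here.
    if name == "get_weather":
        return f"Sunny, 18 C in {tool_input['location']}"
    raise ValueError(f"unknown tool: {name}")

def tool_result_message(response: dict) -> dict:
    # Build the user-role message that feeds tool results back to Claude.
    results = [
        {
            "type": "tool_result",
            "tool_use_id": block["id"],
            "content": run_tool(block["name"], block["input"]),
        }
        for block in response["content"]
        if block["type"] == "tool_use"
    ]
    return {"role": "user", "content": results}

follow_up = tool_result_message(mock_response)
# Append follow_up to the messages list and call client.messages.create
# again; Weave traces that second call like any other.
```

Because the follow-up call goes through the same traced client, it appears in the same trace tree as the original request.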
