Prompting Anthropic's Claude

In this tutorial, we'll create a Modelbit deployment that calls the Anthropic API with a prompt for the Claude model.

Add your API key to Modelbit

Create an Anthropic API key to load into Modelbit. In Modelbit's Settings, click Integrations and then click the Anthropic tile. Add your API key to the form.

You're now ready to call Anthropic from Modelbit.

Make a Python function that calls Anthropic

In your Python notebook, install Anthropic's Python package, log into Modelbit, and set your Anthropic API key:

!pip install anthropic

import os
import modelbit
from anthropic import Anthropic

# Authenticate this notebook session with Modelbit
mb = modelbit.login()

# Load the Anthropic API key from Modelbit's secret store so the Anthropic client can find it
os.environ["ANTHROPIC_API_KEY"] = mb.get_secret("ANTHROPIC_API_KEY")

Now we'll create a function that passes a prompt to Claude and returns the response as a list of strings.

def call_claude(prompt: str):
    # Send the prompt to Claude and return the text of each content block in its reply
    message = Anthropic().messages.create(
        model="claude-3-opus-20240229",
        max_tokens=10,
        messages=[
            {"role": "user", "content": prompt}
        ])
    return [c.text for c in message.content]

Let's call our function to test it:

call_claude("Hello, Claude! What color is the sky?")

Which returns:

["The sky is blue."]

Deploy to Modelbit to make a REST API

It's time to deploy our function to Modelbit so we can use it as a REST API:

mb.deploy(call_claude)
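
Modelbit usually captures your notebook's Python dependencies automatically. If you want to pin the anthropic package in the deployment's environment explicitly, you can pass it at deploy time; here's a minimal sketch, assuming the version string matches whatever you installed:

# Optionally pin the anthropic version used by the deployment.
# The version string is an example -- match it to your installed version.
mb.deploy(call_claude, python_packages=["anthropic==0.25.0"])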

Click the View in Modelbit button to see your API endpoint, and call it with code like:

curl -s -XPOST "https://<your-workspace>.modelbit.com/v1/call_claude/latest" -d '{"data": "What color is the sky?" }'
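
You can also call the endpoint from Python. A minimal sketch using the requests library, assuming you replace <your-workspace> with your workspace name and that Modelbit wraps the deployment's return value in a "data" field as in its standard REST responses:

import requests

# Call the deployed endpoint; replace <your-workspace> with your Modelbit workspace name
url = "https://<your-workspace>.modelbit.com/v1/call_claude/latest"
response = requests.post(url, json={"data": "What color is the sky?"})

# The deployment's return value (a list of strings) comes back under "data"
print(response.json()["data"])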

Next steps

You've built a simple deployment that can pass prompts to Claude. From here, you can extend it with more elaborate prompts as well as post-processing of the model's responses.
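
For example, a post-processing wrapper might build a fuller prompt and join Claude's content blocks into a single string. A minimal sketch; summarize is a hypothetical function name, and you'd likely raise max_tokens in call_claude for longer answers:

def summarize(text: str) -> str:
    # Hypothetical wrapper: add an instruction around the input, then
    # join Claude's content blocks into one string for easier handling.
    prompt = f"Summarize the following text in one sentence:\n\n{text}"
    return "".join(call_claude(prompt))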