Claude-3 api snippets #713

irthomasthomas opened this issue Mar 14, 2024 · 1 comment
Labels: code-generation (code generation models and tools like copilot and aider), python (Python code, tools, info), source-code (Code snippets)


Claude-3 API Snippets

Here are some real code snippets that use the Anthropic Claude 3 API.

The first, from a blog post by Abid Ali Awan on KDnuggets, shows how to use the API to generate a response in Urdu:

import os
import anthropic
from IPython.display import Markdown, display

# Create the client, reading the API key from the environment.
client = anthropic.Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"],
)

prompt = "Write a blog about neural networks."

# The system prompt constrains the model to answer in Urdu.
message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    system="Respond only in Urdu.",
    messages=[
        {"role": "user", "content": prompt}
    ],
)

# Render the first content block as Markdown (e.g. in a notebook cell).
display(Markdown(message.content[0].text))
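
Not from either source, but worth noting: the same SDK also exposes a streaming helper for longer generations. A minimal sketch, assuming the same environment variable and model ID as above:

import os
import anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

# Stream the response and print text deltas as they arrive.
with client.messages.stream(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    system="Respond only in Urdu.",
    messages=[{"role": "user", "content": "Write a blog about neural networks."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)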

The next two code snippets are from the Amazon Bedrock User Guide:

# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

"""
Shows how to generate a message with Anthropic Claude (on demand).
"""
import boto3
import json
import logging
from botocore.exceptions import ClientError

logger = logging.getLogger(name=__name__)
logging.basicConfig(level=logging.INFO)

def generate_message(bedrock_runtime, model_id, system_prompt, messages, max_tokens):
    # Build the request body in the Anthropic Messages format expected by Bedrock.
    body = json.dumps(
        {"anthropic_version": "bedrock-2023-05-31",
         "max_tokens": max_tokens,
         "system": system_prompt,
         "messages": messages
        }
    )
    response = bedrock_runtime.invoke_model(body=body, modelId=model_id)
    response_body = json.loads(response.get('body').read())
    return response_body

def main():
    """
    Entrypoint for Anthropic Claude message example.
    """
    try:
        bedrock_runtime = boto3.client(service_name='bedrock-runtime')
        model_id = 'anthropic.claude-3-sonnet-20240229-v1:0'
        system_prompt = "Please respond only with emoji."
        max_tokens = 1000

        # Prompt with user turn only.
        user_message = {"role": "user", "content": "Hello World"}
        messages = [user_message]
        response = generate_message(bedrock_runtime, model_id, system_prompt, messages, max_tokens)
        print("User turn only.")
        print(json.dumps(response, indent=4))

        # Prompt with both user turn and prefilled assistant response.
        # Anthropic Claude continues by using the prefilled assistant text.
        assistant_message = {"role": "assistant", "content": "<emoji>"}
        messages = [user_message, assistant_message]
        response = generate_message(bedrock_runtime, model_id, system_prompt, messages, max_tokens)
        print("User turn and prefilled assistant response.")
        print(json.dumps(response, indent=4))

    except ClientError as err:
        message = err.response["Error"]["Message"]
        logger.error("A client error occurred: %s", message)
        print("A client error occurred: " + format(message))

if __name__ == "__main__":
    main()
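
The body returned by generate_message follows the Anthropic Messages response format, so the generated text sits in its "content" list. A small helper to pull it out (my addition, not part of the AWS sample):

# Hypothetical helper, not part of the AWS sample: join the text blocks
# from a Messages-format response body returned by generate_message().
def extract_text(response_body):
    return "".join(
        block["text"]
        for block in response_body.get("content", [])
        if block.get("type") == "text"
    )

# e.g. print(extract_text(response)) after a successful call.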

And the second, which runs a multimodal prompt:

# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: Apache-2.0

"""
Shows how to run a multimodal prompt with Anthropic Claude (on demand) and InvokeModel.
"""
import json
import logging
import base64
import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)
logging.basicConfig(level=logging.INFO)

def run_multi_modal_prompt(bedrock_runtime, model_id, messages, max_tokens):
    """
    Invokes a model with a multimodal prompt.
    Args:
        bedrock_runtime: The Amazon Bedrock boto3 client.
        model_id (str): The model ID to use.
        messages (JSON): The messages to send to the model.
        max_tokens (int): The maximum number of tokens to generate.
    Returns:
        The response body from the model.
    """
    body = json.dumps(
        {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": max_tokens,
            "messages": messages
        }
    )
    response = bedrock_runtime.invoke_model(body=body, modelId=model_id)
    response_body = json.loads(response.get('body').read())
    return response_body

def main():
    """
    Entrypoint for Anthropic Claude multimodal prompt example.
    """
    try:
        bedrock_runtime = boto3.client(service_name='bedrock-runtime')
        model_id = 'anthropic.claude-3-sonnet-20240229-v1:0'
        max_tokens = 1000
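
The snippet is cut off at this point. In the Bedrock User Guide, the rest of main() reads a local image, base64-encodes it, and sends it together with a text question. A reconstructed sketch of that remainder (not verbatim; the image path and question are assumptions):

        # --- Reconstructed remainder (sketch, not verbatim from the guide) ---
        input_image = "/path/to/image.jpg"    # assumption: any local JPEG
        input_text = "What's in this image?"  # assumption: example question

        # Read the reference image and encode it as a base64 string.
        with open(input_image, "rb") as image_file:
            content_image = base64.b64encode(image_file.read()).decode("utf8")

        # A multimodal message combines an image block and a text block.
        message = {
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64",
                            "media_type": "image/jpeg",
                            "data": content_image}},
                {"type": "text", "text": input_text},
            ],
        }

        response = run_multi_modal_prompt(
            bedrock_runtime, model_id, [message], max_tokens)
        print(json.dumps(response, indent=4))

    except ClientError as err:
        message = err.response["Error"]["Message"]
        logger.error("A client error occurred: %s", message)
        print("A client error occurred: " + format(message))

if __name__ == "__main__":
    main()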
URL: [Claude-3 API Snippets](https://www.anthropic.com/blog/claude-3-api-snippets)


irthomasthomas commented Mar 14, 2024

Related content

#396 - Similarity score: 0.88

#129 - Similarity score: 0.87

#638 - Similarity score: 0.87

#683 - Similarity score: 0.87

#678 - Similarity score: 0.87

#309 - Similarity score: 0.86
