
Strands Agents SDK

Strands Agents is an open-source Python SDK from AWS that takes a model-driven approach to building AI agents. The core idea: define an agent with just a model, a system prompt, and tools, and let the LLM handle reasoning and orchestration.

from strands import Agent

agent = Agent(
    system_prompt="You are a helpful assistant.",
    tools=[my_tool],
)
response = agent("Help me with something")

Strands is used in production by multiple AWS teams, including Amazon Q Developer, AWS Glue, and VPC Reachability Analyzer.

An Agent is the central abstraction. It wraps:

  • A model (Amazon Bedrock by default)
  • A system prompt (the agent’s instructions)
  • A set of tools (what the agent can do)

The agent runs an agentic loop: send the prompt to the model, let it reason, execute any tool calls, feed results back, repeat until done.
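The loop described above can be sketched in plain Python. This is an illustrative toy, not the Strands internals: `run_agent_loop`, `toy_model`, and the message shapes are all hypothetical, chosen only to show the reason-act-observe cycle.

```python
# Toy agentic loop (illustrative only, not the Strands implementation):
# the model either requests a tool call or returns a final answer;
# tool results are appended to the conversation and the loop repeats.

def run_agent_loop(model, tools, messages):
    """Call the model repeatedly, executing any tool it requests."""
    while True:
        reply = model(messages)
        if reply.get("tool_call") is None:
            return reply["text"]  # model produced a final answer
        name = reply["tool_call"]["name"]
        args = reply["tool_call"]["args"]
        result = tools[name](**args)          # execute the requested tool
        messages.append({"role": "tool", "name": name, "content": result})

# Stand-in "model": asks for the weather tool once, then answers.
def toy_model(messages):
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_weather",
                              "args": {"city": "Seattle"}}}
    tool_output = messages[-1]["content"]
    return {"tool_call": None, "text": f"The weather is: {tool_output}"}

tools = {"get_weather": lambda city: f"Sunny, 72F in {city}"}
answer = run_agent_loop(toy_model, tools,
                        [{"role": "user", "content": "Weather?"}])
print(answer)  # → The weather is: Sunny, 72F in Seattle
```

The real SDK manages this loop for you, including multi-step tool use and streaming; the sketch only shows why "repeat until done" terminates: the loop exits as soon as the model stops requesting tools.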

Tools are Python functions decorated with @tool that the agent can call:

from strands import tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city.

    Args:
        city: The city name to check weather for.
    """
    return f"Sunny, 72F in {city}"

The docstring becomes the tool description, and type hints become the parameter schema. The LLM uses these to decide when and how to call the tool.
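To make the docstring-and-type-hints mechanism concrete, here is a minimal sketch of how a decorator could derive a tool description from a plain function using the standard library. The `describe_tool` helper and its output shape are assumptions for illustration, not the actual `@tool` implementation.

```python
import inspect
from typing import get_type_hints

# Hypothetical sketch: derive a tool schema from a function's
# docstring and type hints (not the real Strands @tool internals).
def describe_tool(fn):
    hints = get_type_hints(fn)
    hints.pop("return", None)  # only parameters go in the schema
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn).split("\n")[0],
        "parameters": {name: t.__name__ for name, t in hints.items()},
    }

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"Sunny, 72F in {city}"

schema = describe_tool(get_weather)
print(schema)
# → {'name': 'get_weather',
#    'description': 'Get the current weather for a city.',
#    'parameters': {'city': 'str'}}
```

The point is that the function itself is the single source of truth: rename a parameter or edit the docstring, and the schema the LLM sees changes with it.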

Strands supports multiple model providers:

Provider                  Usage
Amazon Bedrock (default)  Agent(), uses Claude Sonnet
Anthropic                 Agent(model=AnthropicModel())
OpenAI                    Agent(model=OpenAIModel())
Ollama                    Agent(model=OllamaModel())
LiteLLM                   Agent(model=LiteLLMModel())

Strands has first-class support for the Model Context Protocol (MCP):

from mcp import StdioServerParameters, stdio_client
from strands import Agent
from strands.tools.mcp import MCPClient

mcp_client = MCPClient(lambda: stdio_client(
    StdioServerParameters(command="python", args=["my_mcp_server.py"])
))

with mcp_client:
    tools = mcp_client.list_tools_sync()
    agent = Agent(tools=tools)

Strands supports several multi-agent patterns:

Agents-as-Tools: Wrap agents as tools for hierarchical delegation

@tool
def ask_expert(question: str) -> str:
    """Ask the domain expert agent."""
    response = expert_agent(question)
    return str(response)

  • Swarm: Dynamic handoffs between agents
  • Graph: Directed agent workflows using GraphBuilder
  • A2A Protocol: Cross-framework agent communication
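The graph pattern is easy to picture with a toy example. The sketch below runs "agent" functions in dependency order using the standard library's `graphlib`; it is an assumed shape for illustration only, and the real GraphBuilder API differs.

```python
from graphlib import TopologicalSorter

# Toy directed agent workflow (hypothetical; not the GraphBuilder API):
# each node is an "agent" that receives the outputs of its dependencies.
def run_graph(nodes, edges, seed):
    deps = {name: set() for name in nodes}
    for src, dst in edges:
        deps[dst].add(src)
    results = {}
    for name in TopologicalSorter(deps).static_order():
        # Nodes with no dependencies receive the initial input.
        inputs = [results[d] for d in sorted(deps[name])] or [seed]
        results[name] = nodes[name](inputs)
    return results

nodes = {
    "research": lambda inp: f"facts about {inp[0]}",
    "write":    lambda inp: f"draft using {inp[0]}",
    "review":   lambda inp: f"approved: {inp[0]}",
}
edges = [("research", "write"), ("write", "review")]

out = run_graph(nodes, edges, "agents")
print(out["review"])
# → approved: draft using facts about agents
```

Each node in a real graph would be a full Agent with its own model and tools; the topology just fixes who hands output to whom, in contrast to a Swarm, where handoffs are decided dynamically at runtime.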

Built-in OpenTelemetry support:

  • Traces for every agent interaction
  • Spans for model calls, tool calls, and agent handoffs
  • Compatible with CloudWatch, Datadog, Jaeger, and other OTEL backends

AgentCore is AWS’s managed platform for deploying agents:

Service        What It Does
Runtime        Serverless agent hosting (up to 8-hour execution windows)
Gateway        Convert APIs to MCP-compatible tools
Identity       Auth integration (Cognito, Okta, Entra ID)
Memory         Persistent cross-session agent memory
Observability  CloudWatch dashboards and metrics
Policy         Natural language guardrails (Cedar-based)
Evaluations    13 pre-built eval metrics

Deploy with the starter toolkit:

pip install bedrock-agentcore-starter-toolkit
agentcore configure -e app.py
agentcore launch
agentcore invoke '{"prompt": "Hello!"}'