# Products
A Product is the top-level container in Bedrock. It represents a single AI-powered application and holds all the configuration needed for your agents to run.

## What’s in a Product?
| Field | Description |
|---|---|
| `name` | Human-readable name for your product |
| `system_prompt` | Default instructions inherited by all agents |
| `default_model` | Default LLM model for new agents (e.g., `claude-sonnet-4`) |
| `openai_api_key` | Your OpenAI API key (write-only) |
| `anthropic_api_key` | Your Anthropic API key (write-only) |
| `tool_call_secret` | Secret sent to webhook tools for verification |
| `adapters` | List of adapters (tool integrations) available to agents |
## Creating a Product
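As a sketch, a create request might look like the following. The field names come from the table above; the endpoint URL and authorization header are assumptions, not confirmed by these docs.

```python
# Sketch: creating a product via a REST call.
# The base URL and auth scheme below are placeholders.
import json
import urllib.request

payload = {
    "name": "Support Assistant",
    "system_prompt": "You are a helpful support agent.",
    "default_model": "claude-sonnet-4",
    # Provider keys are write-only: they can be set here
    # but never read back via the API.
    "anthropic_api_key": "sk-ant-...",
}

req = urllib.request.Request(
    "https://api.example.com/v1/products",  # hypothetical endpoint
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with real credentials
```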
LLM provider keys (`openai_api_key`, `anthropic_api_key`) are write-only for security. They cannot be read back via the API.

## LLM Provider Configuration
Bedrock uses your own LLM provider keys. This means:

- You control costs directly with your provider
- You can use any model available on your account
- Usage shows up in your provider’s dashboard
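Since provider keys are write-only, a read response never includes them. A minimal sketch of that behavior (illustrative only, not Bedrock’s actual server code):

```python
# Sketch of write-only field handling: keys are accepted on
# write but stripped from every read response.
WRITE_ONLY = {"openai_api_key", "anthropic_api_key"}

def to_response(product: dict) -> dict:
    """Serialize a product for an API response, omitting
    write-only fields such as provider keys."""
    return {k: v for k, v in product.items() if k not in WRITE_ONLY}

stored = {
    "name": "Demo",
    "default_model": "claude-sonnet-4",
    "openai_api_key": "sk-...",  # present in storage only
}
print(to_response(stored))  # provider key is never returned
```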
## System Prompt
The product’s `system_prompt` is inherited by all agents in the product. It defines the baseline behavior and personality. Individual agents can override `system_prompt` for specialized behavior.
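The inheritance rule can be sketched in a few lines (illustrative; the actual resolution happens inside Bedrock):

```python
# Sketch of system_prompt inheritance: an agent's own prompt,
# if set, overrides the product-level default.
from typing import Optional

def effective_prompt(product_prompt: str, agent_prompt: Optional[str]) -> str:
    """Return the prompt an agent actually runs with."""
    return agent_prompt if agent_prompt is not None else product_prompt

product = "You are a helpful assistant for Acme Corp."
print(effective_prompt(product, None))  # inherits the product prompt
print(effective_prompt(product, "You triage billing tickets."))  # overrides it
```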
## Assigning Adapters
Adapters provide tools to your agents. Assign default adapters to get started quickly.

## API Keys

Each product can have multiple API keys for different environments or services.

## Product Hierarchy
- Organization
  - Product
    - API Keys (authentication)
    - Adapters (tool integrations), each with Adapter Configs (per-adapter settings)
    - Agents, each with Memory (conversation history) and State (contacts, tasks, documents)
  - Product (an organization can contain multiple products)
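The hierarchy above can be sketched as nested data. The field names here are illustrative, not the API’s exact schema:

```python
# Sketch of the product hierarchy as nested data.
organization = {
    "products": [
        {
            "name": "Support App",
            "api_keys": ["key-dev", "key-prod"],       # authentication
            "adapters": [                              # tool integrations
                {"name": "crm", "config": {"region": "us"}},
            ],
            "agents": [
                {
                    "name": "triage-bot",
                    "memory": [],                      # conversation history
                    "state": {"contacts": [], "tasks": [], "documents": []},
                },
            ],
        },
        # An organization can hold more than one product.
        {"name": "Sales App", "api_keys": [], "adapters": [], "agents": []},
    ],
}
```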
## Usage Tracking
Track LLM usage across all agents in a product.

## Best Practices
- **One Product per App**: Create separate products for separate applications or environments.
- **Descriptive System Prompts**: Write clear system prompts that define expected behavior.
- **Separate API Keys**: Use different API keys for dev, staging, and production.
- **Configure Tool Secret**: Set `tool_call_secret` if using webhook-based tools.
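On the webhook side, a tool endpoint can check the `tool_call_secret` before trusting a request. A minimal sketch, assuming the secret arrives as a plain header or field value (check how Bedrock actually delivers it):

```python
# Sketch: verifying tool_call_secret on a webhook tool endpoint.
# How the secret is transported (header name, encoding) is an
# assumption here, not specified by these docs.
import hmac

TOOL_CALL_SECRET = "s3cr3t"  # the value configured on the product

def is_authentic(received_secret: str) -> bool:
    """Constant-time comparison avoids timing side channels."""
    return hmac.compare_digest(received_secret, TOOL_CALL_SECRET)

print(is_authentic("s3cr3t"))  # True
print(is_authentic("wrong"))   # False
```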