What is Bedrock?

Bedrock handles the hard parts of building AI agents:
  • Persistent Memory: Agents remember conversations across sessions with automatic summarization
  • Tool Execution: Connect agents to SMS, email, calendars, or your own APIs via webhooks
  • Autonomous Runtime: Agents run independently, waking and sleeping on schedules
  • Full Observability: Every LLM call, tool invocation, and decision is traced

Core Architecture

  • Organization — your top-level account
    • Product — one per application, holds shared configuration
      • LLM Provider Keys (OpenAI, Anthropic)
      • System Prompt
      • Adapters (tool integrations)
      • Agent 1, Agent 2, … — deployed instances, each with its own Memory, State, and Tools
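
That hierarchy can be sketched as a simple data model. This is an illustrative sketch only; the class and field names are assumptions, not Bedrock's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Adapter:
    name: str                # e.g. "SMS", "Email"
    tools: list[str]         # callable functions this adapter provides

@dataclass
class Agent:
    # Each deployed agent carries its own memory and state.
    memory: list = field(default_factory=list)
    state: dict = field(default_factory=dict)

@dataclass
class Product:
    # Shared configuration: one Product per application.
    system_prompt: str
    llm_keys: dict[str, str]                          # provider name -> API key
    adapters: list[Adapter] = field(default_factory=list)
    agents: list[Agent] = field(default_factory=list)

product = Product(system_prompt="You are a helpful assistant.",
                  llm_keys={"anthropic": "YOUR_KEY"})
product.adapters.append(Adapter(name="SMS", tools=["send_sms"]))
product.agents.append(Agent())   # one agent per end-user
```

Note that configuration (system prompt, keys, adapters) lives on the Product, while memory and state live on each Agent.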

Key Concepts

  • Product — Your application’s configuration template: system prompt, LLM keys, adapters. Configure it once in the portal.
  • Agent — A deployed instance of your product with its own memory, contacts, and state. Create one per end-user.
  • Adapter — An integration module (SMS, Email, CRM, etc.) that gives agents tools.
  • Tool — A callable function an agent can invoke, either built-in or a webhook to your API.

Products

Configure LLM keys, system prompts, and adapters in the portal.

Agents

Deploy autonomous AI instances with their own memory and state.

Adapters

Built-in integrations for SMS, email, calendars, and more.

Custom Adapters

Build your own adapters to connect agents to any API.

Memory

Hierarchical summarization that maintains context across sessions.

Tracing

Full observability into every LLM call, tool execution, and decision.

Built-in Adapters

Bedrock includes these default integrations:
  • Contacts — Store and manage contact information
  • SMS — Send and receive text messages via Twilio
  • MMS — Send and receive iMessage/RCS via Linq
  • Surge — Send and receive SMS via Surge
  • Email — Send and receive emails via AgentMail
  • Gmail — Read/send from a user’s Gmail via OAuth
  • Google Calendar — Manage events on a user’s Google Calendar
  • Notifications — Internal notification/reminder system
  • Projects — Hierarchical project tracking with sub-projects
  • Documents — Store text documents for agent reference
  • Computer — Cloud VM sandbox for shell commands and files
  • Browser — Asynchronous browser automation for web tasks

Custom Integrations

Build your own adapters to connect agents to any system:
  1. Create an adapter — a named container for related tools
  2. Add webhook tools — each tool POSTs to your API when called
  3. Verify requests — check the X-Agent-Secret header to authenticate
  4. Return data — your JSON response becomes the tool result the agent sees
See Custom Adapters for a full walkthrough.
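
The four steps above can be sketched as a minimal handler. Only the X-Agent-Secret check comes from this page; the payload and response fields are illustrative assumptions:

```python
import json
import hmac

AGENT_SECRET = "your-shared-secret"  # configured when you create the adapter

def handle_tool_call(headers: dict, body: bytes) -> tuple[int, str]:
    """Handle a webhook tool call POSTed by a Bedrock agent."""
    # 3. Verify the request: check the X-Agent-Secret header.
    #    compare_digest avoids timing side channels on the comparison.
    secret = headers.get("X-Agent-Secret", "")
    if not hmac.compare_digest(secret, AGENT_SECRET):
        return 401, json.dumps({"error": "invalid secret"})

    # Parse the tool-call payload (field names are illustrative).
    payload = json.loads(body)

    # 4. Return data: this JSON body becomes the tool result the agent sees.
    result = {"status": "ok", "echo": payload}
    return 200, json.dumps(result)
```

Wire this into whatever web framework serves your API; the essential contract is the secret check on the way in and a JSON body on the way out.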

How Agents Run

  1. Wake agent (API call or scheduled)
  2. Load memory context
  3. Build prompt with tools
  4. Call LLM (OpenAI/Anthropic)
  5. Execute tool calls
  6. Log to memory
  7. Repeat until agent sleeps
Agents are autonomous—they decide when to sleep and what actions to take based on their tools and memory.
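
That run loop can be sketched as follows, with call_llm and execute_tool as stand-ins for the real runtime (which Bedrock manages for you):

```python
from dataclasses import dataclass

@dataclass
class LLMResponse:
    tool_calls: list      # tools the model asked to invoke this turn
    wants_sleep: bool     # the agent decides when to sleep

def run_agent(call_llm, execute_tool, memory: list) -> list:
    """One wake cycle: call the LLM, execute its tool calls,
    log results to memory, and repeat until the agent sleeps."""
    while True:
        response = call_llm(memory)        # build prompt from memory + tools, call LLM
        for call in response.tool_calls:   # execute each requested tool call
            memory.append((call, execute_tool(call)))  # log call + result to memory
        if response.wants_sleep:           # agent chooses to end the cycle
            return memory
```

The key property is the last step: the loop has no fixed iteration count; the agent itself signals when it is done.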

Use Your Own LLM Keys

Bedrock uses your LLM provider accounts:
  • You control costs directly with OpenAI/Anthropic
  • Use any model available on your account
  • Prompt caching with Claude reduces costs significantly

Quickstart

Set up a product and deploy your first agent

Custom Adapters

Connect agents to your own APIs

API Reference

Full endpoint documentation

Authentication

API key setup and usage