1 change: 1 addition & 0 deletions 03-integrations/README.md
Original file line number Diff line number Diff line change
@@ -33,6 +33,7 @@ This folder contains framework and protocol integrations that demonstrate how to
## 📊 Observability

* **[Dynatrace](./observability/dynatrace/)**: Application performance monitoring integration with travel agent example
* **[OpenSearch](./observability/opensearch/)**: Trace Analytics observability with OpenSearch and Data Prepper via OpenTelemetry
* **[Simple Dual Observability](./observability/simple-dual-observability/)**: Amazon CloudWatch and Braintrust integration with automatic OpenTelemetry instrumentation for AgentCore Runtime

## 🎨 UX Examples
124 changes: 124 additions & 0 deletions 03-integrations/observability/opensearch/README.md
@@ -0,0 +1,124 @@
# Amazon Bedrock AgentCore Integration with OpenSearch

This example contains a demo of a Personal Assistant Agent built on top of [Bedrock AgentCore Agents](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/what-is-bedrock-agentcore.html) with [OpenSearch](https://opensearch.org/) observability via [Data Prepper](https://opensearch.org/docs/latest/data-prepper/).


## Prerequisites

- Python 3.11 or higher
- OpenSearch cluster with Data Prepper
- AWS Account with appropriate permissions
- Access to the following AWS services:
- Amazon Bedrock


## OpenSearch Instrumentation

> [!TIP]
> For detailed setup instructions, configuration options, and advanced use cases, please refer to the [OpenSearch Trace Analytics Documentation](https://opensearch.org/docs/latest/observing-your-data/trace/index/).

Bedrock AgentCore comes with [Observability](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability.html) support out of the box, so all we need to do is register an [OpenTelemetry SDK](https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/overview.md#sdk) that sends the data to OpenSearch via Data Prepper.

[Data Prepper](https://opensearch.org/docs/latest/data-prepper/) is an OpenSearch community project that acts as an OpenTelemetry collector. It accepts OTLP data over gRPC and writes it to OpenSearch indices for Trace Analytics visualization.

We have simplified this process by hiding the setup details inside [opensearch.py](./opensearch.py).
Data Prepper runs separate pipelines for traces and metrics on different ports. Configure the following env vars to point to your Data Prepper instance:
- `OTEL_TRACES_ENDPOINT` (default: `http://localhost:21890`) — for trace data
- `OTEL_METRICS_ENDPOINT` (default: `http://localhost:21891`) — for metric data

If your Data Prepper instance requires authentication, credentials will be read from your filesystem under `/etc/secrets/opensearch_auth` (Base64-encoded `username:password`) or from the environment variable `OPENSEARCH_AUTH`.
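For example, you can generate the Base64-encoded value with the Python standard library. The `admin:admin` pair below is only a placeholder; substitute your real credentials:

```python
import base64

# Encode "username:password" as Base64 for OPENSEARCH_AUTH.
# "admin:admin" is a placeholder, not a recommended credential.
auth = base64.b64encode(b"admin:admin").decode()
print(auth)  # YWRtaW46YWRtaW4=
```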


## How to use

### Setting your AWS keys

Follow the [Amazon Bedrock AgentCore documentation](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/runtime-permissions.html) to configure your AWS Role with the correct policies.
Afterwards, set your AWS credentials as environment variables by running the following commands in your terminal:

```bash
export AWS_ACCESS_KEY_ID=your_api_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_REGION=your_region
```

Ensure your account has access to the model `us.anthropic.claude-3-7-sonnet-20250219-v1:0` used in this example. Please refer to the
[Amazon Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-permissions.html) to learn how to enable model access.
You can switch to a different model by setting the environment variable `BEDROCK_MODEL_ID`.
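For example, to run the agent against a different model (the inference profile ID below is just an illustration; check which models are enabled in your account and Region):

```shell
# Override the default model used by the example agent.
export BEDROCK_MODEL_ID="us.anthropic.claude-3-5-sonnet-20241022-v2:0"
echo "$BEDROCK_MODEL_ID"
```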

### Setting up OpenSearch with Data Prepper

Before proceeding, you need an OpenSearch cluster and Data Prepper instance to receive telemetry data. Choose one of the following deployment options:

#### Option 1: Docker Deployment (Quickest for Testing)

This sample includes ready-to-use Docker Compose and Data Prepper configuration files:
- [`docker-compose.yml`](./docker-compose.yml) — OpenSearch, Data Prepper, and OpenSearch Dashboards
- [`pipelines.yaml`](./pipelines.yaml) — Data Prepper pipeline configuration for traces, service map, and metrics
- [`data-prepper-config.yaml`](./data-prepper-config.yaml) — Data Prepper server configuration

Start the services:

```bash
docker compose up -d
```

This will start:
- OpenSearch at `http://localhost:9200`
- Data Prepper accepting traces at `http://localhost:21890` and metrics at `http://localhost:21891`
- OpenSearch Dashboards at `http://localhost:5601`

#### Option 2: Amazon OpenSearch Service (Production-Ready)

For production use, deploy with [Amazon OpenSearch Service](https://aws.amazon.com/opensearch-service/):

1. Create an Amazon OpenSearch Service domain via the AWS Console or CLI
2. Enable the [Trace Analytics](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/trace-analytics.html) feature
3. Set up [Amazon OpenSearch Ingestion](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/ingestion.html) (managed Data Prepper) to receive OTLP data
4. Configure the ingestion pipeline endpoint as your `OTEL_TRACES_ENDPOINT` and `OTEL_METRICS_ENDPOINT`

For Amazon OpenSearch Ingestion, authentication is handled via IAM (SigV4). Refer to the [Amazon OpenSearch Ingestion documentation](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/ingestion.html) for pipeline configuration details.

### Configure environment variables

Once OpenSearch and Data Prepper are deployed, set the environment variables:

```bash
# Point to your Data Prepper trace and metric endpoints
export OTEL_TRACES_ENDPOINT=http://localhost:21890
export OTEL_METRICS_ENDPOINT=http://localhost:21891

# Optional: If authentication is required (Base64-encoded username:password)
export OPENSEARCH_AUTH=YWRtaW46YWRtaW4=
```

### Run the app

You can start the example with the following command:

```bash
uv run main.py
```

This starts an HTTP server listening on port `8080` that implements the required `/invocations` endpoint for processing the agent's requests.

The agent is now ready to be deployed. The best practice is to package the code as a container and push it to Amazon ECR using CI/CD pipelines and IaC.
A full step-by-step tutorial is available
[here](https://github.com/awslabs/amazon-bedrock-agentcore-samples/blob/main/01-tutorials/01-AgentCore-runtime/01-hosting-agent/01-strands-with-bedrock-model/runtime_with_strands_and_bedrock_models.ipynb).

You can interact with your agent with the following command:

```bash
curl -X POST http://127.0.0.1:8080/invocations --data '{"prompt": "What is the weather now?"}'
```
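The same request can also be issued from Python using only the standard library. This is a minimal sketch; it assumes the server started by `uv run main.py` is listening on `127.0.0.1:8080`:

```python
import json
import urllib.request

# Build the same JSON payload the curl example sends.
payload = json.dumps({"prompt": "What is the weather now?"}).encode()
request = urllib.request.Request(
    "http://127.0.0.1:8080/invocations",
    data=payload,
    headers={"Content-Type": "application/json"},
)

# Uncomment once the server is running:
# with urllib.request.urlopen(request) as response:
#     print(response.read().decode())
```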

### Viewing Traces in OpenSearch Dashboards

1. Open OpenSearch Dashboards at `http://localhost:5601`
2. Navigate to *Observability* > *Trace Analytics*
3. You will see traces from your AgentCore agent, including:
- LLM invocation spans with model details
- Tool execution spans (calculator, weather)
- End-to-end request traces
1 change: 1 addition & 0 deletions 03-integrations/observability/opensearch/data-prepper-config.yaml
@@ -0,0 +1 @@
ssl: false
37 changes: 37 additions & 0 deletions 03-integrations/observability/opensearch/docker-compose.yml
@@ -0,0 +1,37 @@
services:
opensearch:
image: opensearchproject/opensearch:latest
environment:
- discovery.type=single-node
- DISABLE_SECURITY_PLUGIN=true
- "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
ports:
- "9200:9200"
healthcheck:
test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
interval: 10s
timeout: 5s
retries: 10

data-prepper:
image: opensearchproject/data-prepper:2
volumes:
- ./pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml
- ./data-prepper-config.yaml:/usr/share/data-prepper/config/data-prepper-config.yaml
ports:
- "21890:21890"
- "21891:21891"
depends_on:
opensearch:
condition: service_healthy

opensearch-dashboards:
image: opensearchproject/opensearch-dashboards:latest
environment:
- OPENSEARCH_HOSTS=["http://opensearch:9200"]
- DISABLE_SECURITY_DASHBOARDS_PLUGIN=true
ports:
- "5601:5601"
depends_on:
opensearch:
condition: service_healthy
8 changes: 8 additions & 0 deletions 03-integrations/observability/opensearch/main.py
@@ -0,0 +1,8 @@
from opensearch import init

# Register the OpenTelemetry SDK before importing the agent so that
# instrumentation is already in place when the agent and its tools are created.
init()

from travel_agent import app

if __name__ == "__main__":
app.run()
76 changes: 76 additions & 0 deletions 03-integrations/observability/opensearch/opensearch.py
@@ -0,0 +1,76 @@
import os


def read_secret(secret: str):
    """Read a secret from /etc/secrets, falling back to OPENSEARCH_AUTH."""
    try:
        with open(f"/etc/secrets/{secret}", "r") as f:
            return f.read().rstrip()
    except OSError:
        print("No credentials file found, falling back to environment variable")
        return os.environ.get("OPENSEARCH_AUTH", "")


def init():
"""Initialize OpenTelemetry SDK to export traces and metrics to OpenSearch via Data Prepper.

Data Prepper is an OpenSearch community project that accepts OTLP data
and writes it to OpenSearch indices for Trace Analytics.

Data Prepper exposes gRPC endpoints for OTLP ingestion:
- OTEL_TRACES_ENDPOINT (default http://localhost:21890) for traces
- OTEL_METRICS_ENDPOINT (default http://localhost:21891) for metrics
"""
auth = read_secret("opensearch_auth")
metadata = []
if auth:
metadata.append(("authorization", f"Basic {auth}"))

OTEL_TRACES_ENDPOINT = os.environ.get(
"OTEL_TRACES_ENDPOINT",
"http://localhost:21890", # Data Prepper otel_trace_source default port
)
OTEL_METRICS_ENDPOINT = os.environ.get(
"OTEL_METRICS_ENDPOINT",
"http://localhost:21891", # Data Prepper otel_metrics_source default port
)

from opentelemetry import trace, metrics
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import (
OTLPMetricExporter,
)
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.resources import Resource

resource = Resource.create(
{
"service.name": "agent-core-samples",
}
)

provider = TracerProvider(resource=resource)
    # SimpleSpanProcessor exports each span synchronously as it ends;
    # consider BatchSpanProcessor for production workloads.
    processor = SimpleSpanProcessor(
        OTLPSpanExporter(
            endpoint=OTEL_TRACES_ENDPOINT,
            headers=metadata if metadata else None,
            insecure=True,
        )
    )
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

reader = PeriodicExportingMetricReader(
OTLPMetricExporter(
endpoint=OTEL_METRICS_ENDPOINT,
headers=metadata if metadata else None,
insecure=True,
)
)
    meter_provider = MeterProvider(
        metric_readers=[reader],
        resource=resource,
    )
    metrics.set_meter_provider(meter_provider)
24 changes: 24 additions & 0 deletions 03-integrations/observability/opensearch/pipelines.yaml
@@ -0,0 +1,24 @@
otel-trace-pipeline:
source:
otel_trace_source:
port: 21890
ssl: false
sink:
- opensearch:
hosts: ["http://opensearch:9200"]
index_type: trace-analytics-raw
- opensearch:
hosts: ["http://opensearch:9200"]
index_type: trace-analytics-service-map

otel-metrics-pipeline:
source:
otel_metrics_source:
port: 21891
ssl: false
processor:
- otel_metrics:
sink:
- opensearch:
hosts: ["http://opensearch:9200"]
index: otel-metrics
16 changes: 16 additions & 0 deletions 03-integrations/observability/opensearch/pyproject.toml
@@ -0,0 +1,16 @@
[project]
name = "agent-core-opensearch"
version = "0.1.0"
description = "Amazon Bedrock AgentCore integration with OpenSearch observability"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
"boto3",
"bedrock-agentcore",
"bedrock-agentcore-starter-toolkit",
"uv",
"strands-agents-tools",
"strands-agents",
"opentelemetry-sdk",
"opentelemetry-exporter-otlp-proto-grpc",
]
Comment on lines +1 to +16

Copilot AI Apr 16, 2026

This integration doesn’t include a uv.lock, while the other observability integrations do (e.g., 03-integrations/observability/dynatrace/uv.lock and 03-integrations/observability/openlit/uv.lock). Adding a lockfile would make the sample reproducible and consistent with the established pattern in this repo.
33 changes: 33 additions & 0 deletions 03-integrations/observability/opensearch/travel_agent.py
@@ -0,0 +1,33 @@
import os
from strands import Agent, tool
from strands_tools import calculator # Import the calculator tool
from strands.models import BedrockModel
from bedrock_agentcore.runtime import BedrockAgentCoreApp

app = BedrockAgentCoreApp()

# Create a custom tool
@tool
def weather():
"""Get weather""" # Dummy implementation
return "sunny"


model_id = os.getenv("BEDROCK_MODEL_ID", "us.anthropic.claude-3-7-sonnet-20250219-v1:0")
model = BedrockModel(
model_id=model_id,
)
agent = Agent(
model=model,
tools=[calculator, weather],
system_prompt="You're a helpful assistant. You can do simple math calculation, and tell the weather.",
)

@app.entrypoint
def strands_agent_bedrock(payload):
"""
Invoke the agent with a payload
"""
user_input = payload.get("prompt")
response = agent(user_input)
return response.message["content"][0]["text"]