# Add OpenSearch observability integration #1370

**Open**: goyamegh wants to merge 4 commits into `awslabs:main` from `goyamegh:add-opensearch-observability-integration`
## Commits

- `4acf01d` Add OpenSearch observability integration for Bedrock AgentCore (goyamegh)
- `788a788` Fix Data Prepper port inconsistency in OpenSearch integration (goyamegh)
- `d6c114e` Add Docker Compose files and uv.lock for OpenSearch integration (goyamegh)
- `789a5ad` Fix OTLP transport and add Docker Compose test setup (goyamegh)
## Files changed

#### `README.md` (124 additions)

# Amazon Bedrock AgentCore Integration with OpenSearch

This example contains a demo of a Personal Assistant Agent built on top of [Bedrock AgentCore Agents](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/what-is-bedrock-agentcore.html) with [OpenSearch](https://opensearch.org/) observability via [Data Prepper](https://opensearch.org/docs/latest/data-prepper/).

## Prerequisites

- Python 3.11 or higher
- OpenSearch cluster with Data Prepper
- AWS Account with appropriate permissions
- Access to the following AWS services:
  - Amazon Bedrock
## OpenSearch Instrumentation

> [!TIP]
> For detailed setup instructions, configuration options, and advanced use cases, please refer to the [OpenSearch Trace Analytics Documentation](https://opensearch.org/docs/latest/observing-your-data/trace/index/).

Bedrock AgentCore comes with [Observability](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/observability.html) support out of the box. Hence, we just need to register an [OpenTelemetry SDK](https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/overview.md#sdk) to send the data to OpenSearch via Data Prepper.

[Data Prepper](https://opensearch.org/docs/latest/data-prepper/) is an OpenSearch community project that acts as an OpenTelemetry collector. It accepts OTLP data over gRPC and writes it to OpenSearch indices for Trace Analytics visualization.

We simplified this process, hiding all the complexity inside [opensearch.py](./opensearch.py). Data Prepper runs separate pipelines for traces and metrics on different ports. Configure the following environment variables to point to your Data Prepper instance:

- `OTEL_TRACES_ENDPOINT` (default: `http://localhost:21890`) — for trace data
- `OTEL_METRICS_ENDPOINT` (default: `http://localhost:21891`) — for metric data
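The fallback logic for these two variables can be sketched as follows. This is a minimal illustration of the behavior described above; `resolve_endpoints` is an illustrative helper, not a function from the sample.

```python
import os

# Data Prepper default ports: 21890 (otel_trace_source), 21891 (otel_metrics_source)
DEFAULT_TRACES = "http://localhost:21890"
DEFAULT_METRICS = "http://localhost:21891"


def resolve_endpoints(env: dict) -> tuple[str, str]:
    """Return (traces_endpoint, metrics_endpoint), falling back to the defaults."""
    return (
        env.get("OTEL_TRACES_ENDPOINT", DEFAULT_TRACES),
        env.get("OTEL_METRICS_ENDPOINT", DEFAULT_METRICS),
    )


traces_endpoint, metrics_endpoint = resolve_endpoints(os.environ)
```

Unset variables silently fall back to the localhost defaults, which matches the Docker setup below.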
If your Data Prepper instance requires authentication, credentials are read from `/etc/secrets/opensearch_auth` on your filesystem (Base64-encoded `username:password`) or, as a fallback, from the environment variable `OPENSEARCH_AUTH`.
## How to use

### Setting your AWS keys

Follow the [Amazon Bedrock AgentCore documentation](https://docs.aws.amazon.com/bedrock-agentcore/latest/devguide/runtime-permissions.html) to configure your AWS role with the correct policies. Afterwards, set your AWS keys as environment variables by running the following commands in your terminal:

```bash
export AWS_ACCESS_KEY_ID=your_api_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_REGION=your_region
```

Ensure your account has access to the model `us.anthropic.claude-3-7-sonnet-20250219-v1:0` used in this example. Please refer to the [Amazon Bedrock documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/model-access-permissions.html) to see how to enable access to the model. You can change the model used by configuring the environment variable `BEDROCK_MODEL_ID`.
### Setting up OpenSearch with Data Prepper

Before proceeding, you need an OpenSearch cluster and a Data Prepper instance to receive telemetry data. Choose one of the following deployment options.

#### Option 1: Docker Deployment (Quickest for Testing)

This sample includes ready-to-use Docker Compose and Data Prepper configuration files:

- [`docker-compose.yml`](./docker-compose.yml) — OpenSearch, Data Prepper, and OpenSearch Dashboards
- [`pipelines.yaml`](./pipelines.yaml) — Data Prepper pipeline configuration for traces, service map, and metrics
- [`data-prepper-config.yaml`](./data-prepper-config.yaml) — Data Prepper server configuration

Start the services:

```bash
docker compose up -d
```

This will start:

- OpenSearch at `http://localhost:9200`
- Data Prepper accepting traces at `http://localhost:21890` and metrics at `http://localhost:21891`
- OpenSearch Dashboards at `http://localhost:5601`
#### Option 2: Amazon OpenSearch Service (Production-Ready)

For production use, deploy with [Amazon OpenSearch Service](https://aws.amazon.com/opensearch-service/):

1. Create an Amazon OpenSearch Service domain via the AWS Console or CLI
2. Enable the [Trace Analytics](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/trace-analytics.html) feature
3. Set up [Amazon OpenSearch Ingestion](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/ingestion.html) (managed Data Prepper) to receive OTLP data
4. Configure the ingestion pipeline endpoint as your `OTEL_TRACES_ENDPOINT` and `OTEL_METRICS_ENDPOINT`

For Amazon OpenSearch Ingestion, authentication is handled via IAM (SigV4). Refer to the [Amazon OpenSearch Ingestion documentation](https://docs.aws.amazon.com/opensearch-service/latest/developerguide/ingestion.html) for pipeline configuration details.
### Configure environment variables

Once OpenSearch and Data Prepper are deployed, set the environment variables:

```bash
# Point to your Data Prepper trace and metric endpoints
export OTEL_TRACES_ENDPOINT=http://localhost:21890
export OTEL_METRICS_ENDPOINT=http://localhost:21891

# Optional: If authentication is required (Base64-encoded username:password)
export OPENSEARCH_AUTH=YWRtaW46YWRtaW4=
```
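To produce the Base64 value for `OPENSEARCH_AUTH`, encode your `username:password` pair. A minimal sketch; the `admin:admin` credentials are just the example value shown above, not real credentials:

```python
import base64


def encode_basic_auth(username: str, password: str) -> str:
    """Base64-encode 'username:password' for the OPENSEARCH_AUTH variable."""
    return base64.b64encode(f"{username}:{password}".encode()).decode()


print(encode_basic_auth("admin", "admin"))  # → YWRtaW46YWRtaW4=
```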
### Run the app

You can start the example with the following command:

```bash
uv run main.py
```

This creates an HTTP server listening on port `8080` that implements the required `/invocations` endpoint for processing the agent's requests.

The agent is now ready to be deployed. The best practice is to package the code as a container and push it to ECR using CI/CD pipelines and IaC. You can follow the guide [here](https://github.com/awslabs/amazon-bedrock-agentcore-samples/blob/main/01-tutorials/01-AgentCore-runtime/01-hosting-agent/01-strands-with-bedrock-model/runtime_with_strands_and_bedrock_models.ipynb) for a full step-by-step tutorial.

You can interact with your agent with the following command:

```bash
curl -X POST http://127.0.0.1:8080/invocations --data '{"prompt": "What is the weather now?"}'
```
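The same invocation can be issued from Python using only the standard library. A hedged sketch: `build_invocation` is an illustrative helper, and actually sending the request assumes the server above is running locally.

```python
import json
import urllib.request


def build_invocation(prompt: str) -> urllib.request.Request:
    """Build a POST request for the sample's /invocations endpoint."""
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        "http://127.0.0.1:8080/invocations",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Sending the request requires the server to be up:
# with urllib.request.urlopen(build_invocation("What is the weather now?")) as resp:
#     print(resp.read().decode())
```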
### Viewing Traces in OpenSearch Dashboards

1. Open OpenSearch Dashboards at `http://localhost:5601`
2. Navigate to *Observability* > *Trace Analytics*
3. You will see traces from your AgentCore agent, including:
   - LLM invocation spans with model details
   - Tool execution spans (calculator, weather)
   - End-to-end request traces
#### `03-integrations/observability/opensearch/data-prepper-config.yaml` (1 addition)

```yaml
ssl: false
```
#### `03-integrations/observability/opensearch/docker-compose.yml` (37 additions)

```yaml
services:
  opensearch:
    image: opensearchproject/opensearch:latest
    environment:
      - discovery.type=single-node
      - DISABLE_SECURITY_PLUGIN=true
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - "9200:9200"
    healthcheck:
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10

  data-prepper:
    image: opensearchproject/data-prepper:2
    volumes:
      - ./pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml
      - ./data-prepper-config.yaml:/usr/share/data-prepper/config/data-prepper-config.yaml
    ports:
      - "21890:21890"
      - "21891:21891"
    depends_on:
      opensearch:
        condition: service_healthy

  opensearch-dashboards:
    image: opensearchproject/opensearch-dashboards:latest
    environment:
      - OPENSEARCH_HOSTS=["http://opensearch:9200"]
      - DISABLE_SECURITY_DASHBOARDS_PLUGIN=true
    ports:
      - "5601:5601"
    depends_on:
      opensearch:
        condition: service_healthy
```
#### `main.py` (8 additions)

```python
from opensearch import init

# Initialize the OpenTelemetry SDK before importing the agent,
# so instrumentation is registered first.
init()

from travel_agent import app

if __name__ == "__main__":
    app.run()
```
#### `opensearch.py` (76 additions)

```python
import os


def read_secret(secret: str):
    try:
        with open(f"/etc/secrets/{secret}", "r") as f:
            return f.read().rstrip()
    except Exception:
        print("No credentials file found, falling back to environment variable")
        return os.environ.get("OPENSEARCH_AUTH", "")


def init():
    """Initialize the OpenTelemetry SDK to export traces and metrics to OpenSearch via Data Prepper.

    Data Prepper is an OpenSearch community project that accepts OTLP data
    and writes it to OpenSearch indices for Trace Analytics.

    Data Prepper exposes gRPC endpoints for OTLP ingestion:
    - OTEL_TRACES_ENDPOINT (default http://localhost:21890) for traces
    - OTEL_METRICS_ENDPOINT (default http://localhost:21891) for metrics
    """
    auth = read_secret("opensearch_auth")
    metadata = []
    if auth:
        metadata.append(("authorization", f"Basic {auth}"))

    OTEL_TRACES_ENDPOINT = os.environ.get(
        "OTEL_TRACES_ENDPOINT",
        "http://localhost:21890",  # Data Prepper otel_trace_source default port
    )
    OTEL_METRICS_ENDPOINT = os.environ.get(
        "OTEL_METRICS_ENDPOINT",
        "http://localhost:21891",  # Data Prepper otel_metrics_source default port
    )

    from opentelemetry import trace, metrics
    from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
    from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import (
        OTLPMetricExporter,
    )
    from opentelemetry.sdk.metrics import MeterProvider
    from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import SimpleSpanProcessor
    from opentelemetry.sdk.resources import Resource

    resource = Resource.create(
        {
            "service.name": "agent-core-samples",
        }
    )

    provider = TracerProvider(resource=resource)
    processor = SimpleSpanProcessor(
        OTLPSpanExporter(
            endpoint=OTEL_TRACES_ENDPOINT,
            headers=metadata if metadata else None,
            insecure=True,
        )
    )
    provider.add_span_processor(processor)
    trace.set_tracer_provider(provider)

    reader = PeriodicExportingMetricReader(
        OTLPMetricExporter(
            endpoint=OTEL_METRICS_ENDPOINT,
            headers=metadata if metadata else None,
            insecure=True,
        )
    )
    provider = MeterProvider(
        metric_readers=[reader],
        resource=resource,
    )
    metrics.set_meter_provider(provider)
```
#### `pipelines.yaml` (24 additions)

```yaml
otel-trace-pipeline:
  source:
    otel_trace_source:
      port: 21890
      ssl: false
  sink:
    - opensearch:
        hosts: ["http://opensearch:9200"]
        index_type: trace-analytics-raw
    - opensearch:
        hosts: ["http://opensearch:9200"]
        index_type: trace-analytics-service-map

otel-metrics-pipeline:
  source:
    otel_metrics_source:
      port: 21891
      ssl: false
  processor:
    - otel_metrics:
  sink:
    - opensearch:
        hosts: ["http://opensearch:9200"]
        index: otel-metrics
```
#### `pyproject.toml` (16 additions)

```toml
[project]
name = "agent-core-opensearch"
version = "0.1.0"
description = "Amazon Bedrock AgentCore integration with OpenSearch observability"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "boto3",
    "bedrock-agentcore",
    "bedrock-agentcore-starter-toolkit",
    "uv",
    "strands-agents-tools",
    "strands-agents",
    "opentelemetry-sdk",
    "opentelemetry-exporter-otlp-proto-grpc",
]
```
#### `travel_agent.py` (33 additions)

```python
import os
from strands import Agent, tool
from strands_tools import calculator  # Import the calculator tool
from strands.models import BedrockModel
from bedrock_agentcore.runtime import BedrockAgentCoreApp

app = BedrockAgentCoreApp()


# Create a custom tool
@tool
def weather():
    """Get weather"""  # Dummy implementation
    return "sunny"


model_id = os.getenv("BEDROCK_MODEL_ID", "us.anthropic.claude-3-7-sonnet-20250219-v1:0")
model = BedrockModel(
    model_id=model_id,
)
agent = Agent(
    model=model,
    tools=[calculator, weather],
    system_prompt="You're a helpful assistant. You can do simple math calculation, and tell the weather.",
)


@app.entrypoint
def strands_agent_bedrock(payload):
    """Invoke the agent with a payload."""
    user_input = payload.get("prompt")
    response = agent(user_input)
    return response.message["content"][0]["text"]
```
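The entrypoint above passes `payload.get("prompt")` straight to the agent, which yields `None` when the key is missing. A hedged sketch of stricter payload validation; `extract_prompt` is an illustrative helper, not part of this PR:

```python
def extract_prompt(payload) -> str:
    """Validate an /invocations payload and return a non-empty prompt string."""
    prompt = payload.get("prompt") if isinstance(payload, dict) else None
    if not isinstance(prompt, str) or not prompt.strip():
        raise ValueError("payload must contain a non-empty 'prompt' string")
    return prompt.strip()
```

Rejecting malformed payloads early produces a clear error instead of an opaque failure inside the model call.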
**Review comment:**

This integration doesn't include a `uv.lock`, while the other observability integrations do (e.g., `03-integrations/observability/dynatrace/uv.lock` and `03-integrations/observability/openlit/uv.lock`). Adding a lockfile would make the sample reproducible and consistent with the established pattern in this repo.