Edgee

Open-source LLM gateway written in Rust. Route, observe, and compress your AI traffic.



Edgee is a lightweight LLM gateway that sits between your application and AI providers. It gives you a single control point for routing, observability, and cost optimization, without changing your existing code.

Think of it as an open-source alternative to LiteLLM or OpenRouter, written in Rust for speed and low resource usage, with a built-in token compression engine that reduces your AI costs automatically.


Why Edgee

  • One gateway, any provider — Unified API for Anthropic, OpenAI, and other LLM providers. Switch models without touching your app code.
  • Token compression — Edgee analyzes request context and strips redundancy before it reaches the model. Same output, fewer tokens, lower bill.
  • Real-time observability — See exactly how many tokens you're sending, how many you're saving, and what it costs.
  • Rust-native — Fast startup, minimal memory footprint, no runtime dependencies. Runs anywhere Docker runs.

Install

macOS / Linux (curl)

```shell
curl -fsSL https://edgee.ai/install.sh | bash
```

Homebrew (macOS)

```shell
brew install edgee-ai/tap/edgee
```

Windows (PowerShell)

```powershell
irm https://edgee.ai/install.ps1 | iex
```

Installs to %LOCALAPPDATA%\Programs\edgee\. You can override the directory with $env:INSTALL_DIR before running.


Quickstart

Use with AI coding assistants

Edgee can wrap your coding assistant and compress traffic automatically:

```shell
# Claude Code
edgee launch claude

# Codex
edgee launch codex

# Opencode
edgee launch opencode
```

Use as a standalone gateway

Point any OpenAI-compatible client at Edgee:

```shell
# Start the gateway
edgee serve

# Your app talks to Edgee instead of the provider directly
export OPENAI_BASE_URL=http://localhost:1207/v1
```
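As a sketch of what a client request through the gateway looks like, the snippet below builds an OpenAI-style chat request against Edgee's local endpoint. The endpoint path and model name are assumptions based on the OpenAI-compatible API convention, not confirmed Edgee specifics:

```python
import json
import urllib.request

# Assumption: Edgee exposes an OpenAI-compatible /v1/chat/completions
# endpoint on the port shown above. The model name is a placeholder.
BASE_URL = "http://localhost:1207/v1"

def build_chat_request(prompt: str, model: str = "gpt-4o-mini") -> urllib.request.Request:
    """Build the HTTP request a client would send to Edgee, which then
    forwards it to the configured upstream provider."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarize this diff")
print(req.full_url)  # → http://localhost:1207/v1/chat/completions
```

Because the gateway speaks the same wire format as the provider, pointing an existing client at it is a one-line base-URL change.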

Features

Token compression

Edgee's compression engine analyzes tool outputs (file listings, git logs, build output, test results) and removes noise from them before they enter the LLM context. The compression is lossless from the model's perspective: responses are identical, but prompts are leaner.
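To illustrate the idea only (this is a toy stand-in, not Edgee's actual algorithm), a compressor of this kind might drop exact duplicate lines and collapse runs of blank lines in tool output:

```python
import re

def compress_tool_output(text: str) -> str:
    """Toy sketch of tool-output compression: drop exact duplicate
    lines (e.g. repeated compiler warnings) and collapse blank runs,
    preserving the order of everything kept."""
    seen = set()
    out = []
    for line in text.splitlines():
        stripped = line.rstrip()
        if stripped and stripped in seen:
            continue  # skip a line we've already emitted verbatim
        if stripped:
            seen.add(stripped)
        out.append(stripped)
    # collapse three-or-more consecutive newlines into a single blank line
    return re.sub(r"\n{3,}", "\n\n", "\n".join(out))

log = "warning: unused import\nwarning: unused import\n\n\n\nbuild ok\n"
print(compress_tool_output(log))
```

A real engine has to be far more careful than this (e.g. near-duplicates, structured formats), but the budget argument is the same: fewer prompt tokens for the same information.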

Multi-provider routing

Route requests across Anthropic, OpenAI, and other providers through a single endpoint. Switch models, load-balance, or failover without code changes.
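Conceptually, failover routing reduces to trying providers in priority order until one succeeds. The sketch below illustrates that shape; the provider interface is hypothetical, and Edgee's real routing is configuration-driven inside the gateway:

```python
from typing import Callable

def route_with_failover(
    prompt: str,
    providers: list[tuple[str, Callable[[str], str]]],
) -> tuple[str, str]:
    """Try each (name, call) provider in order; return the first success.
    Hypothetical interface, for illustration only."""
    last_err: Exception | None = None
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as err:
            last_err = err  # remember the failure, fall through to the next
    raise RuntimeError("all providers failed") from last_err

# Fake providers standing in for real upstream APIs:
def flaky(_prompt: str) -> str:
    raise TimeoutError("upstream timed out")

def healthy(prompt: str) -> str:
    return f"echo: {prompt}"

print(route_with_failover("hi", [("anthropic", flaky), ("openai", healthy)]))
# → ('openai', 'echo: hi')
```

Because the client only ever sees the gateway's endpoint, swapping the provider order (or the models behind it) requires no application change.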

Usage tracking

Real-time visibility into token consumption, compression savings, and cost per request.
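The bookkeeping behind cost-per-request and compression savings is simple arithmetic. The per-million-token prices below are placeholders, not real provider pricing:

```python
def request_cost(prompt_tokens: int, completion_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Dollar cost of one request, with prices per million tokens."""
    return (prompt_tokens * input_price
            + completion_tokens * output_price) / 1_000_000

def compression_savings(raw_tokens: int, compressed_tokens: int) -> float:
    """Fraction of prompt tokens removed by compression."""
    return (raw_tokens - compressed_tokens) / raw_tokens

# Placeholder pricing: $3 input / $15 output per million tokens.
print(round(request_cost(12_000, 800, 3.0, 15.0), 4))  # → 0.048
print(compression_savings(12_000, 9_000))              # → 0.25
```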


Supported setups

| Tool | Setup command | Status |
|------|---------------|--------|
| Claude Code | `edgee launch claude` | ✅ Supported |
| Codex | `edgee launch codex` | ✅ Supported |
| Opencode | `edgee launch opencode` | ✅ Supported |
| Cursor | `edgee launch cursor` | 🔜 Coming soon |
| Any OpenAI-compatible client | `edgee serve` | ✅ Supported |

Acknowledgments

The token compression engine in Edgee is derived from RTK, created by Patrick Szymkowiak and contributors at rtk-ai Labs. RTK pioneered local tool-output compression for AI coding assistants, and we built on their work to bring the same optimizations to a gateway architecture.

RTK is licensed under the Apache License 2.0. All derived files retain the original copyright notice and are individually marked with a modification history. See LICENSE-APACHE and NOTICE for full details.

If you're looking for a local-first compression tool, check out RTK directly; it's excellent for individual developer workflows.


Contributing

Edgee is Apache-2.0 licensed, and we genuinely want your contributions.

```shell
git clone https://github.com/edgee-ai/edgee
cd edgee
cargo build
```

See CONTRIBUTING.md for the full guide. For bigger changes, open an issue first so we can align before you build.

