A production-ready backend microservice for API rate limiting using the Token Bucket algorithm and Redis. Designed for high performance, clarity, and easy integration.
- Token Bucket algorithm (allows bursts, smooths traffic)
- Global and per-client rate limits
- Redis for distributed, atomic state
- REST API: `POST /check` verifies whether a request is allowed
- Configurable via environment variables
- Dockerized for local development
- Middleware for easy integration
- Unit tests for core logic and Redis
- Allows short bursts while enforcing a long-term average rate
- More flexible than fixed/sliding window for real-world APIs
- Efficient with Redis atomic operations
Client → [POST /check] → Go API → RateLimiter (Token Bucket) → Redis
- Each client (identified by IP, API key, etc.) has a token bucket in Redis
- On each request, the bucket is checked/updated atomically
- If tokens remain, the request is allowed; otherwise it is denied with a `retry_after` hint
- Docker & Docker Compose

```bash
git clone <repo>
cd api-rate-limiter-microservice
docker-compose up --build
```
- `REDIS_URL` (default: `redis://localhost:6379`)
- `RATE_LIMIT` (default: `100`)
- `WINDOW_SIZE` (default: `1m`)
- `PORT` (default: `8080`)
```
POST /check
Content-Type: application/json

{
  "client_id": "user-123"
}
```
Response:
- Allowed: `{ "allowed": true }`
- Rate limited: `{ "allowed": false, "retry_after": 42 }`
- Use the provided Go middleware for easy plug-in to other services.
```bash
docker-compose exec api-rate-limiter go test ./...
```
- Add per-client custom limits in `rateLimiter.go`
- Add a dashboard/CLI for usage stats (bonus)
MIT