Handle transient Sentry envelope SSL upload failures #96

Draft
cursor[bot] wants to merge 1 commit into main from cursor/high-impact-sentry-errors-56fc

Conversation

cursor[bot] commented on Apr 23, 2026

Summary

  • Add bounded retry handling for transient Sentry envelope upload failures (SSLError, ConnectionError, Timeout) in GithubClient._send_envelope.
  • Set a request timeout on envelope uploads to avoid hung requests.
  • After retries are exhausted, log a warning and return without raising, so transient ingest/network issues do not become webhook 500s.
  • Add focused unit tests covering both retry success and retry exhaustion.
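The retry behavior described above can be sketched as a small bounded-retry loop. This is a minimal, self-contained sketch, not the actual code in src/github_sdk.py: the helper name send_with_retries, the constants, and the use of builtin ConnectionError/TimeoutError (standing in for requests.exceptions.SSLError, ConnectionError, and Timeout) are all assumptions for illustration.

```python
import logging
import time

log = logging.getLogger(__name__)

# Hypothetical constants; the real values in src/github_sdk.py may differ.
MAX_RETRIES = 3
RETRY_BACKOFF_SECONDS = 0.5

# Exceptions treated as transient. Stand-ins for requests.exceptions.SSLError,
# ConnectionError, and Timeout from the real client.
TRANSIENT_ERRORS = (ConnectionError, TimeoutError)


def send_with_retries(post, *, retries=MAX_RETRIES,
                      backoff=RETRY_BACKOFF_SECONDS, sleep=time.sleep):
    """Call `post()` up to `retries` times, backing off between attempts.

    On exhaustion, log a warning and return None instead of raising, so a
    transient ingest outage does not become a webhook 500.
    """
    for attempt in range(1, retries + 1):
        try:
            return post()
        except TRANSIENT_ERRORS as exc:
            if attempt == retries:
                log.warning("envelope upload failed after %d attempts: %s",
                            retries, exc)
                return None
            # Linear backoff between attempts; the real client may differ.
            sleep(backoff * attempt)
```

The `sleep` parameter is injected so tests can skip the real delay; `post` would be a closure over the actual `requests.post(..., timeout=...)` call.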

Root Cause

Sentry issue SENTRY-GITHUB-ACTIONS-APP-4E shows intermittent TLS/connection failures while posting envelopes to Sentry ingest from src/github_sdk.py (the requests.post call in _send_envelope). These are transient transport failures outside the payload-generation logic, but they were previously unhandled and bubbled up to src/main.py, generating noisy application errors.

Validation

  • python3 -m pytest tests/test_github_sdk.py
  • Result: 9 passed
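The two behaviors the unit tests cover (retry then success, and retry exhaustion) can be exercised with mocked transport calls. This is a hedged sketch of what such tests might look like, not the contents of tests/test_github_sdk.py; the `upload` stand-in and its retry count are assumptions.

```python
from unittest import mock


def upload(post, retries=3):
    # Minimal stand-in for the retry loop in GithubClient._send_envelope.
    for attempt in range(retries):
        try:
            return post()
        except ConnectionError:
            if attempt == retries - 1:
                return None  # exhausted: swallow, do not raise


def test_retry_then_success():
    # First call fails transiently, second succeeds.
    post = mock.Mock(side_effect=[ConnectionError("reset"), "202 ok"])
    assert upload(post) == "202 ok"
    assert post.call_count == 2


def test_retry_exhaustion_returns_none():
    # Every call fails; after the bounded retries, None is returned.
    post = mock.Mock(side_effect=ConnectionError("reset"))
    assert upload(post) is None
    assert post.call_count == 3
```

`mock.Mock(side_effect=[...])` raises the exception entries and returns the plain values in order, which makes the success-after-one-failure path easy to express.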

Risk / Notes

  • This intentionally suppresses only transient network/TLS exceptions during telemetry forwarding; non-transient HTTP failures (via raise_for_status) are still surfaced.
  • A bounded retry count and a request timeout keep behavior predictable while reducing production error noise.

Co-authored-by: Armen Zambrano G. <armenzg@users.noreply.github.com>