Automatic dart scoring from a single webcam, in the browser. Uses OpenCV.js for image processing and plain JavaScript math for the geometry. No build step, no npm.
```sh
# from the repo root
python3 -m http.server 8000
```

Then open http://localhost:8000 in a browser. Camera access requires either localhost or HTTPS.
`.github/workflows/pages.yml` deploys the repo root as a static site to GitHub Pages on every push to `main` (and on manual `workflow_dispatch`). After the first run, the site is served at `https://<owner>.github.io/<repo>/`.
To enable it on a fresh clone:
- In the GitHub repo, Settings → Pages → Build and deployment → Source: choose GitHub Actions.
- Push to `main` (or run the workflow manually from the Actions tab).
- The workflow's job summary prints the live URL.
Because Pages is served over HTTPS, `getUserMedia` will work on phones and laptops without any extra setup.
- Aim a webcam at a regulation dartboard so the whole board is in frame and the red/green rings are clearly coloured.
- Click Calibrate with the board clean (no darts).
- Throw a dart. After the dart sticks and motion settles (~0.5 s), the score updates automatically.
- Use Undo if a throw was misread, Reset 501 to start over, and Debug: on to see the warped top-down view and the motion mask.
```
camera frame ─▶ HSV mask ─▶ contours ─▶ ellipse fit ─▶ homography H
                                                           │
camera frame ─▶ warpPerspective(H) ─▶ canonical 800×800 board view
                                           │
                absdiff vs clean reference + threshold + open
                                           │
            motion settled? ─▶ PCA on largest blob ─▶ dart tip
                                           │
geometry.scoreFromCanonical(x, y) ─▶ sector × multiplier ─▶ score
```
- HSV threshold → red mask + green mask, OR them together.
- `findContours` → `fitEllipse` on each fragment.
- Robust mean of fragment centres = coarse board centre.
- Cluster fragment radii into two groups (doubles vs triples) with 1‑D k-means.
- Brute-force search for a rotation offset θ₀ ∈ [0°, 18°) that best matches the observed red/green pattern of the outer ring against the known dartboard sequence.
- Each accepted fragment now has a known sector → known canonical pixel coordinate. Solve `findHomography(srcPts, dstPts, RANSAC)`.
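The rotation-offset search can be sketched as below. This is an illustrative version, not the repo's actual code: the fragment shape (`{ angle, colour }`), the 1° search step, and the red-on-even-sector convention are assumptions.

```javascript
// Score each candidate θ₀ by how many ring fragments land in a sector
// whose expected colour matches the fragment's observed colour.
const STEP = Math.PI / 180;   // 1° search resolution (assumed)
const SECTOR = Math.PI / 10;  // 18° per sector

// On a standard board the double/triple rings alternate red/green each
// sector, red on the sector containing 20 (index 0).
function expectedColour(theta) {
  const idx = Math.floor(((theta + Math.PI / 20) / (2 * Math.PI)) * 20) % 20;
  return idx % 2 === 0 ? 'red' : 'green';
}

function bestRotationOffset(fragments) {
  let best = 0, bestScore = -1;
  for (let theta0 = 0; theta0 < SECTOR; theta0 += STEP) {
    let score = 0;
    for (const f of fragments) {
      // Undo the candidate rotation, normalise to [0, 2π).
      let t = f.angle - theta0;
      t = ((t % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI);
      if (expectedColour(t) === f.colour) score++;
    }
    if (score > bestScore) { bestScore = score; best = theta0; }
  }
  return best;
}
```

Brute force is fine here: 18 degrees at 1° resolution times a few dozen fragments is a trivial amount of work, and it runs once per calibration, not per frame.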
- Warp every frame with the calibrated H.
- `absdiff` against a clean reference, threshold, morphological open.
- State machine: `READY → MOTION → READY` with an N-frame "low motion" settle window.
- On settle, `findContours` + `PCACompute2` recover the dart's principal axis. The two axis extrema are the dart's ends; the tip is whichever extremum is closer to the bullseye (the back of the dart sticks toward the camera and projects further out after warping).
- The new dart is baked into the reference for the next throw.
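The settle logic above can be sketched as a small closure. The constants and names here are illustrative (the real thresholds live in `js/config.js`); the return value is true exactly once, on the frame where a new dart should be scored:

```javascript
// Hypothetical thresholds, stand-ins for the values in js/config.js.
const MOTION_HI = 1500;   // changed pixels needed to enter MOTION
const MOTION_LO = 200;    // changed pixels that still count as "low motion"
const SETTLE_FRAMES = 15; // consecutive low-motion frames before scoring

function makeMotionGate() {
  let state = 'READY';
  let settled = 0;
  // Call once per frame with the count of changed pixels in the diff mask.
  return function update(changedPixels) {
    if (state === 'READY') {
      if (changedPixels > MOTION_HI) { state = 'MOTION'; settled = 0; }
      return false;
    }
    // MOTION: wait for N consecutive quiet frames before firing.
    if (changedPixels < MOTION_LO) {
      if (++settled >= SETTLE_FRAMES) { state = 'READY'; return true; }
    } else {
      settled = 0; // motion resumed, restart the settle window
    }
    return false;
  };
}
```

The hysteresis gap between `MOTION_HI` and `MOTION_LO` keeps pixel noise near a single threshold from toggling the state every frame.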
Pure math, no OpenCV. Given canonical pixel (x, y):
- `r = hypot(x - cx, y - cy)`, converted to mm using the calibrated outer-double radius.
- `θ = atan2(dx, -dy) - θ₀`, normalised to `[0, 2π)`.
- Bullseye / outer bull / outer-double / off-board are pure radius checks.
- Sector index `= floor(((θ + π/20) / 2π) * 20) % 20`, looked up in `[20, 1, 18, 4, 13, 6, 10, 15, 2, 17, 3, 19, 7, 16, 8, 11, 14, 9, 12, 5]`.
- Multiplier is 2 if `r ∈ [innerDouble, outerDouble]`, 3 if `r ∈ [innerTriple, outerTriple]`, else 1.
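Put together, the steps above look roughly like this. It is a sketch, not the real `js/geometry.js`: the function signature, the `pxPerMm` parameter, and the ring radii (standard board dimensions in mm) are assumptions for illustration.

```javascript
const SECTORS = [20, 1, 18, 4, 13, 6, 10, 15, 2, 17, 3, 19, 7, 16, 8, 11, 14, 9, 12, 5];

// Standard dartboard radii in mm (approximate).
const R = {
  bull: 6.35, outerBull: 15.9,
  innerTriple: 99, outerTriple: 107,
  innerDouble: 162, outerDouble: 170,
};

// (x, y): canonical pixels; (cx, cy): board centre; pxPerMm from calibration.
function scoreFromCanonical(x, y, cx, cy, pxPerMm, theta0 = 0) {
  const dx = x - cx, dy = y - cy;
  const rMm = Math.hypot(dx, dy) / pxPerMm;

  // Pure radius checks first.
  if (rMm <= R.bull) return { value: 50, label: 'BULL' };
  if (rMm <= R.outerBull) return { value: 25, label: '25' };
  if (rMm > R.outerDouble) return { value: 0, label: 'MISS' };

  // atan2(dx, -dy): 0 rad points straight up, increasing clockwise.
  let theta = Math.atan2(dx, -dy) - theta0;
  theta = ((theta % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI);

  // π/20 shift centres sector 0 (the "20") on θ = 0.
  const idx = Math.floor(((theta + Math.PI / 20) / (2 * Math.PI)) * 20) % 20;
  const sector = SECTORS[idx];

  let mult = 1;
  if (rMm >= R.innerDouble && rMm <= R.outerDouble) mult = 2;
  else if (rMm >= R.innerTriple && rMm <= R.outerTriple) mult = 3;

  return { value: sector * mult, label: (mult === 3 ? 'T' : mult === 2 ? 'D' : '') + sector };
}
```

With the board centre at (400, 400) and 2 px/mm, a point straight above the centre at 50 mm scores a single 20, and one directly right scores a single 6, matching the standard sector layout.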
Plain 501 game state with bust handling and undo. No checkout-on-double rule (kept simple).
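A minimal sketch of that state (names and the exact bust behaviour are illustrative; see `js/scoring.js` for the real thing). Here a throw that would go below zero is simply ignored, and undo pops a history stack:

```javascript
function makeGame(start = 501) {
  let remaining = start;
  const history = []; // remaining score before each throw, for undo
  return {
    get remaining() { return remaining; },
    throwDart(points) {
      history.push(remaining);
      const next = remaining - points;
      // Bust: going below zero leaves the score unchanged
      // (no checkout-on-double rule, matching the repo's simplification).
      remaining = next < 0 ? remaining : next;
    },
    undo() { if (history.length) remaining = history.pop(); },
    reset() { remaining = start; history.length = 0; },
  };
}
```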
```
dartscore/
├── index.html
├── css/style.css
├── js/
│   ├── main.js           # entry, frame loop, camera setup
│   ├── opencv-loader.js  # async OpenCV.js loader
│   ├── config.js         # tunable thresholds
│   ├── calibration.js    # HSV → ellipse fit → homography
│   ├── detection.js      # diff + motion settle → dart tip
│   ├── geometry.js       # pure-math scoring (no OpenCV)
│   ├── scoring.js        # 501 game state
│   └── ui.js             # DOM updates
├── tests/
│   └── geometry.test.html  # in-browser unit tests for geometry.js
└── README.md
```
Open `tests/geometry.test.html` in a browser. The page imports `geometry.js` as an ES module and runs ~70 assertions covering bullseyes, every sector single/double/triple round-trip, sector-boundary edge cases, and rotation offsets. The page title becomes `tests OK` if everything passes. This is pure math; no camera or OpenCV needed.
- Single front camera + tip-extrapolation is approximate. Parallax can cause off-by-one-sector errors near the board edge. The industry-standard fix is multiple cameras viewing the board edge-on; that's out of scope here.
- Lighting changes invalidate the background reference. Re-calibrate if the lighting shifts noticeably.
- The "tip = end closest to centre" heuristic assumes darts stick into the board pointing roughly outward; very shallow sticks or bounce-outs will be wrong.
- Sector orientation assumes the board is mounted with 20 at top ±9° (standard). The colour-pattern search has period 18° so it can't distinguish a board rotated by exactly one sector — mount it the conventional way.
- OpenCV.js is ~9 MB and slow on first load; expect a few seconds before the camera UI becomes responsive.
Edit `js/config.js`:

| constant | meaning |
|---|---|
| `HSV_RED_1` / `HSV_RED_2` / `HSV_GREEN` | colour ranges for ring detection |
| `RING_MORPH_KERNEL` | morphology kernel for cleaning ring masks |
| `MIN_RING_CONTOUR_AREA` | smallest accepted ring fragment |
| `DIFF_THRESHOLD` | per-pixel intensity for the motion mask |
| `MOTION_HI` / `MOTION_LO` | enter / leave the MOTION state |
| `SETTLE_FRAMES` | consecutive low-motion frames before scoring |
| `CANONICAL_SIZE` etc. | size of the warped top-down board |