
The Sharp Edge

Carlos Gutierrez at Beast Consulting

Own product · Live in production, generating subscription revenue · Updated 2026-04-25

Summary

The Sharp Edge is a multi-sport sports betting analytics platform serving paid subscribers across NBA, NFL, MLB, NHL, and NCAAB. Every morning it pulls a dozen external data feeds, runs a multi-stage AI convergence pipeline to surface where professional money is moving, and delivers the results as interactive dashboards inside a tiered membership site. I built it end-to-end — architecture, ingestion, AI layer, dashboard generators, billing integration, member portal, and the admin tooling that holds it all together.

The Problem

Public sports betting analysis is loud and shallow. Most products tell you what the public is doing, or what one model thinks. The interesting signal — where professional money is actually moving — gets buried. I wanted a platform that aggregated multiple independent signals and only flagged plays where they aligned, then delivered that to paying members daily without me touching it.

A second problem appeared once members started paying: my subscription billing platform and my membership site didn't talk to each other. There was no off-the-shelf integration. Members were being orphaned between the two systems on every failed payment, plan change, or comp account. I was supporting it manually. That wasn't going to scale.

The Approach

The whole product is a daily pipeline. At 6:10 AM Central, a GitHub Actions orchestrator kicks off ten parallel ingestion jobs. Once they land, an AI convergence layer scores every game across the day's signals — line movement, money split, situational factors — and emits a structured analysis report with tier rankings (NUCLEAR / STRONG / STANDARD / LEAN / SKIP). A second job consumes that report and regenerates per-sport HTML dashboards, then pushes them to the static site for members.

External data feeds (10+)  →  GitHub Actions orchestrator
                                       ↓
                              Multi-stage AI convergence
                                       ↓
                              Per-sport HTML dashboards
                                       ↓
                              Member portal (Vercel)
                                  ↑
       Subscription billing  ←→  Custom integration sync + admin UI
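The convergence scoring in that middle stage can be sketched as a simple agreement count: how many independent signals favor the same side, mapped onto the tier ladder. The signal names, weights, and thresholds below are illustrative stand-ins, not the production values:

```python
# Illustrative convergence scorer: count how many independent signals
# agree on a side, then map the agreement count to a tier. Signal names
# and thresholds here are hypothetical, not the production values.

TIERS = [(5, "NUCLEAR"), (4, "STRONG"), (3, "STANDARD"), (2, "LEAN")]

def convergence_tier(signals: dict, side: str) -> str:
    """signals maps signal name -> the side it favors ('home'/'away')."""
    agreeing = sum(1 for favored in signals.values() if favored == side)
    for threshold, tier in TIERS:
        if agreeing >= threshold:
            return tier
    return "SKIP"  # not enough independent agreement to flag a play

game = {
    "line_movement": "home",
    "money_split": "home",
    "situational": "home",
    "model_edge": "away",
    "public_fade": "home",
}
print(convergence_tier(game, "home"))  # 4 agreeing signals -> STRONG
```

The point of the tier ladder is that a play only surfaces when multiple independent signals converge; a single strong signal never outranks broad agreement.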

The integration sync became its own product. A daily ingest from the billing platform feeds into the membership database. A status state machine computes active, expiring, grace_period, or expired per user, with a 1-day grace window for failed payments. An override list handles permanent-access accounts. An admin UI lets me search, force a status, or extend access without touching SQL. None of this was in the original scope — I built it because the alternative was supporting two unsynced platforms by hand forever.
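The status state machine reduces to a handful of date comparisons against the billing platform's paid-through date. A minimal sketch, assuming an illustrative "expiring soon" threshold (the grace window is the 1-day window described above; `expiring_soon_days` is not necessarily the production value):

```python
from datetime import date, timedelta

GRACE_DAYS = 1  # grace window for failed payments, per the sync design

def member_status(paid_through: date, today: date,
                  override: bool = False, expiring_soon_days: int = 3) -> str:
    """Compute a member's access state from their billing paid-through date.

    States mirror the sync's state machine: active, expiring,
    grace_period, expired.
    """
    if override:                      # permanent-access / comp accounts bypass billing
        return "active"
    if today <= paid_through:
        days_left = (paid_through - today).days
        return "expiring" if days_left <= expiring_soon_days else "active"
    if today <= paid_through + timedelta(days=GRACE_DAYS):
        return "grace_period"         # failed payment inside the 1-day window
    return "expired"

print(member_status(date(2026, 4, 30), date(2026, 4, 20)))  # active
print(member_status(date(2026, 4, 30), date(2026, 5, 1)))   # grace_period
```

Keeping the state pure (dates in, status out) is what makes the daily sync idempotent: rerunning it against the same billing snapshot can never flip a member's access.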

What I Built

  • Daily multi-sport ingestion pipeline — ten parallel jobs orchestrated via GitHub Actions, with retries, freshness checks, and Slack notifications on failure
  • AI convergence analysis layer — multi-stage pipeline that aggregates independent signals per game and emits tiered recommendations
  • Per-sport HTML dashboard generators — programmatic generation of dashboards with up to 13 analytical sections per game, base64-embedded assets for print-friendly output, mobile-responsive
  • Reverse Line Movement detection — methodology baked into the convergence layer for surfacing where professional money is moving against the public
  • Subscriber portal — tiered membership ($29.99/mo with a 14-day free trial), JWT auth, magic-link login, member-only sport pages
  • Integration / workaround dashboard — daily sync between subscription billing platform and membership database, status state machine, admin UI for member management
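The Reverse Line Movement check in the list above boils down to one comparison: did the line move against the side holding the public betting majority? A minimal sketch on a point spread, home-team perspective — the sign conventions and inputs are illustrative, not the production methodology verbatim:

```python
def is_reverse_line_movement(open_line: float, current_line: float,
                             public_bet_pct_home: float) -> bool:
    """Flag reverse line movement on a point spread (home-team perspective).

    Public money on the home side should push the home spread more negative
    (home laying more points). If the majority of bets are on home but the
    line moved the other way, sharp money is likely on the away side.
    """
    line_moved_toward_home = current_line < open_line   # home laying more points
    public_on_home = public_bet_pct_home > 50.0
    # RLM: the line moved against the side holding the public majority
    return public_on_home != line_moved_toward_home and current_line != open_line

# 70% of bets on home, yet the spread eased from -6.5 to -5.5: classic RLM
print(is_reverse_line_movement(-6.5, -5.5, 70.0))  # True
```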

Engineering Highlights

  • Pipeline reliability under cron pressure. Multiple workflows push to the same repo many times per day. Implemented concurrency groups, push-retry-with-rebase, and [skip ci] markers to prevent infinite trigger loops. Daily pipeline now runs without manual intervention.
  • Cost-aware AI design. The convergence step costs real money per run. Pre-screening unlikely matchups with a lightweight classifier first, then routing only the interesting games to a frontier reasoning model, dropped daily AI cost meaningfully without losing analysis quality.
  • Workaround engineering for an integration that didn't exist. When two third-party platforms refused to integrate cleanly, I built the integration myself rather than rebuild on a different stack. The custom sync handles edge cases — failed payments, plan changes, manual overrides — that no vendor was going to solve.
  • Source-of-truth discipline across satellite repos. With multiple repos pushing to the same site many times daily, navigation and design changes were getting silently reverted. I established a canonical config, a pre-push checklist, and a satellite-repo update protocol. Drift problem solved.
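The cost-aware routing above can be sketched as a two-stage filter: a cheap pre-screen scores every game, and only games clearing a cutoff reach the expensive model. Function names, the cutoff, and the scoring rule here are hypothetical stand-ins for the real classifier and frontier-model calls:

```python
def cheap_prescreen(game: dict) -> float:
    """Hypothetical lightweight classifier: 0.0 (dull) .. 1.0 (interesting)."""
    return game["signal_strength"]

def frontier_analysis(game: dict) -> str:
    """Stand-in for the expensive frontier reasoning-model call."""
    return f"deep analysis for {game['id']}"

def route_games(games: list, cutoff: float = 0.6) -> dict:
    """Send only games above the pre-screen cutoff to the frontier model."""
    routed = {"deep": [], "skipped": []}
    for game in games:
        if cheap_prescreen(game) >= cutoff:
            routed["deep"].append(frontier_analysis(game))
        else:
            routed["skipped"].append(game["id"])   # no frontier spend
    return routed

slate = [{"id": "NYK@BOS", "signal_strength": 0.8},
         {"id": "DET@CHA", "signal_strength": 0.2}]
result = route_games(slate)
print(result["deep"])     # ['deep analysis for NYK@BOS']
print(result["skipped"])  # ['DET@CHA']
```

The economics only work because the pre-screen is nearly free relative to the frontier call; tuning the cutoff trades a little recall on borderline games for most of the daily AI spend.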

Outcome

Live, generating recurring subscription revenue. The daily pipeline runs on autopilot — I review the output, members consume the content. Members have come through both organic and paid channels, and the integration dashboard now handles the billing edge cases that previously drove churn through manual support.

Tech footprint

  • Frontend — vanilla JS + HTML/CSS for member-facing pages, React + Tailwind for admin tooling
  • Backend — Node.js serverless functions on Vercel + Python pipeline scripts on GitHub Actions
  • Data — Postgres for subscribers and access state, JSON file artifacts for daily analysis output
  • AI — multi-stage analysis combining a lightweight classifier with a frontier reasoning model
  • Payments + auth — third-party subscription billing, JWT-based session auth, magic-link login
  • Orchestration — GitHub Actions for daily ingestion, analysis, and deploy
  • Hosting — Vercel (static + serverless), GitHub-driven deploys


Have a custom AI problem you'd ship in 2-8 weeks if you had the right engineer?