All posts

Building a Local Job Application Agent: Why I Wrote My Own

Google Alerts in, AI-tailored application packets out — without ever fabricating a single fact.

Why bother

Senior PM job hunting is asymmetric work. Fifty listings show up in your alerts every day. Most are obvious mismatches. The handful that fit deserve a real, tailored application, and each one takes 30+ minutes of resume surgery and cover-letter writing. I wanted to compress that loop without joining the queue of generic LinkedIn EasyApply submissions, and without trusting an AI to make stuff up.

How it works

The agent is a local Node app with three jobs: ingest, filter, generate. Everything runs on my laptop. Nothing auto-submits.

  • Ingestion. RSS feeds from Google Alerts. Cheap, no scraping required, easy to tune by editing the alert query. The agent polls feeds, dedupes, and writes new postings to SQLite.
  • Filtering. A salary parser reads the listing, a title matcher screens for Senior PM variants, and a fit scorer (0–100) weights AI/healthcare/consumer keywords against my actual focus areas. Anything below threshold drops out before it ever hits the model.
  • Generation. GPT-4o tailors a cover letter and selects/edits resume bullets from a curated library — never freehand. The output lands in a local dashboard for review.
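The filtering stage can be sketched as a small scoring function. This is an illustrative sketch, not the actual implementation: the keyword weights, field names, and the naive substring match are all assumptions; the post only specifies a Senior PM title screen and a 0–100 fit score weighted toward AI/healthcare/consumer keywords.

```javascript
// Hypothetical fit scorer: title gate first, then weighted keyword hits.
// Weights and keywords are illustrative, not the real configuration.
const FOCUS_WEIGHTS = { ai: 30, healthcare: 25, consumer: 20 };
const TITLE_PATTERN = /senior\s+(product\s+manager|pm)/i;

function scoreListing(listing) {
  // Hard gate: title must look like a Senior PM variant.
  if (!TITLE_PATTERN.test(listing.title)) return 0;

  // Naive substring match over title + description (good enough for a sketch).
  const text = `${listing.title} ${listing.description}`.toLowerCase();
  let score = 0;
  for (const [keyword, weight] of Object.entries(FOCUS_WEIGHTS)) {
    if (text.includes(keyword)) score += weight;
  }
  return Math.min(score, 100); // clamp to the 0-100 range
}

const listing = {
  title: "Senior Product Manager, AI Platform",
  description: "Consumer healthcare product with AI features",
};
console.log(scoreListing(listing)); // 75
```

Anything scoring below the threshold is dropped before the model ever sees it, which keeps token costs proportional to real candidates rather than to alert volume.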

The truth-locked design

The thing I cared most about: the AI cannot invent a fact. Specifically:

  • Resume bullets live in a JSON library. Each has an allowed_to_edit flag — if false, the bullet is immutable, period.
  • Numbers (e.g., "30% reduction in onboarding time") are stored separately in a verified-metrics file. The model can reference them but cannot generate new ones.
  • Every change a model makes to a bullet is logged with a diff. I can audit any packet back to the bullets it drew from.
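The three rules above compose into one guarded edit path. Here is a minimal sketch: the `allowed_to_edit` flag and the verified-metrics file come from the post, but the function name, the number-detection regex, and the data shapes are assumptions for illustration.

```javascript
// Sketch of the truth-locked edit path: immutable bullets, verified numbers,
// and an audit log of every accepted change. Names are hypothetical.
const verifiedMetrics = ["30% reduction in onboarding time"];

const bulletLibrary = [
  { id: "b1", text: "Cut onboarding time 30% via redesigned flows", allowed_to_edit: false },
  { id: "b2", text: "Shipped AI triage feature for clinicians", allowed_to_edit: true },
];

const auditLog = [];

function applyEdit(bullet, newText) {
  // Rule 1: immutable bullets cannot be touched, period.
  if (!bullet.allowed_to_edit) {
    throw new Error(`Bullet ${bullet.id} is immutable`);
  }
  // Rule 2: reject any number that isn't backed by a verified metric.
  const numbers = newText.match(/\d+%?/g) || [];
  for (const n of numbers) {
    if (!verifiedMetrics.some((m) => m.includes(n))) {
      throw new Error(`Unverified number "${n}" in edit`);
    }
  }
  // Rule 3: log a before/after diff so every packet is auditable.
  auditLog.push({ id: bullet.id, before: bullet.text, after: newText });
  return { ...bullet, text: newText };
}
```

The key property is that the model's output passes through `applyEdit` rather than being written directly, so a hallucinated metric fails loudly instead of landing in a cover letter.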

This is the only design that lets me trust AI-generated application materials enough to send them. Without it, every output needs full re-verification, which defeats the point.

The approval gate

The dashboard shows me three tabs: New, Pending, and Approved. I review every generated packet before anything moves. Approve and the packet goes into packets/approved/ with the resume PDF, cover letter, and screening answers ready to paste. Skip and it goes to packets/skipped/ with a note. No application gets submitted by the system — that's still my job, and I want it to stay that way.

What I'd do differently

I started with broad Google Alerts queries ("product manager AI") and tried to filter aggressively after the fact. Wrong order. Narrow alerts ("Senior Product Manager AI healthcare") produce a 10x cleaner stream than wide alerts plus heavy filtering. Tune the source, not the sieve.

What's next

Site-specific scrapers (Playwright is already wired up) for company career pages where Google Alerts doesn't reach — the best roles are often at smaller AI companies whose listings never get indexed. And maybe a one-click "open application form pre-filled" handler. Still no auto-submit. Ever.