
Hiring funnel anomaly detection with n8n

Difficulty: intermediate
Setup time: 90 min
For: recruiting-leader · talent-acquisition · recruiting-ops (Recruiting & TA)

Stack: n8n · ATS API (Ashby / Greenhouse / Lever) · Postgres · Claude · Slack

An n8n flow that runs nightly against your ATS data, computes funnel conversion rates per role-by-stage, detects anomalies (a stage’s conversion rate drops below its rolling baseline, a candidate stalls in a stage for too long, a role’s time-to-hire trends out of bounds), and routes alerts to the right recruiting leader in Slack.

What you’ll need

  • n8n self-hosted or cloud account
  • ATS API access — Ashby, Greenhouse, or Lever
  • Postgres or similar for storing the rolling-baseline conversion rates per role-by-stage
  • Claude API key for narrative anomaly explanation
  • Slack workspace with channels per recruiting team / role family

Setup

  1. Build the baseline. Run a one-time backfill computing funnel conversion rates per role-by-stage over the trailing 90 days. Store the results in a baselines table keyed by (role_id, stage).
  2. Build the n8n flow. Eight nodes:
    • Cron trigger — runs nightly at 2am
    • ATS pull — pulls active candidates, stage transitions, scorecards from the last 24 hours
    • Per-stage aggregation — computes today’s conversion rates per role-by-stage
    • Baseline comparison — flags stages where today’s rate is >2 stddev below baseline
    • Stalled-candidate detection — flags candidates who’ve been in a stage longer than the role’s stage SLA
    • Time-to-hire trend check — flags roles where rolling 7-day time-to-hire exceeds threshold
    • Claude narrative — for each flag, generates a 1-2 sentence explanation of what changed and likely cause
    • Slack notification — posts findings to the recruiting-team channel with the explanation and links into the ATS
  3. Test and tune thresholds. Run on historical data; verify anomaly detection matches what the team would have flagged manually. Tune sensitivity (too many alerts vs too few).
  4. Update baselines monthly. Conversion rates drift; refresh the baselines monthly to avoid stale comparison.
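The two detection checks above (nodes 4 and 5) can be sketched as small functions you might drop into an n8n Code node. This is a sketch under assumptions: the field names (`mean`, `stddev`, `enteredStageAt`) are illustrative, not your ATS schema or the baselines table layout.

```javascript
// Node 4, "Baseline comparison": flag a stage whose conversion rate today
// falls more than 2 standard deviations below its rolling baseline.
// `baseline` is one row from the baselines table, keyed by (role_id, stage);
// field names are illustrative.
function isAnomalous(todayRate, baseline, sigmas = 2) {
  // Guard against degenerate baselines (brand-new roles, zero variance).
  if (baseline.stddev === 0) return todayRate < baseline.mean;
  return todayRate < baseline.mean - sigmas * baseline.stddev;
}

// Node 5, "Stalled-candidate detection": has the candidate sat in the
// current stage longer than the role's stage SLA (in days)?
function isStalled(candidate, slaDays, now = Date.now()) {
  const daysInStage = (now - candidate.enteredStageAt) / 86_400_000; // ms per day
  return daysInStage > slaDays;
}

// Example: screen -> onsite conversion with a 90-day baseline of 40% ± 5%.
const baseline = { mean: 0.40, stddev: 0.05 };
console.log(isAnomalous(0.25, baseline)); // 0.25 < 0.40 - 2*0.05 → true
console.log(isAnomalous(0.35, baseline)); // within 2 stddev → false
```

In an n8n Code node you would run these over the items emitted by the per-stage aggregation node and pass only the flagged items downstream to the Claude-narrative and Slack nodes.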

How the routing works

Anomaly type | Routed to | Suggested action
Stage conversion below baseline | Recruiter for the role | Calibration check on the stage’s screening criteria
Candidate stalled past SLA | Recruiter + hiring manager | Pick up the conversation; complete or close out
Time-to-hire trending up | Recruiting team lead | Identify and remove the bottleneck
Source channel conversion drop | Sourcing lead | Investigate channel quality
New role with no movement after 7 days | Recruiting leader + hiring manager | Job-description / sourcing-channel review
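The routing table can be expressed as a plain lookup for the Slack-notification node. A minimal sketch, assuming each flagged anomaly carries a `type` and a `role` record naming its recruiter, hiring manager, and leads; the type keys and role fields are hypothetical, not part of any ATS or n8n API:

```javascript
// Map each anomaly type to role-relative recipients and a suggested action.
// Keys and recipient field names are illustrative placeholders.
const ROUTES = {
  stage_conversion_drop:  { to: ['recruiter'], action: 'Calibration check on screening criteria' },
  candidate_stalled:      { to: ['recruiter', 'hiring_manager'], action: 'Pick up the conversation; complete or close out' },
  tth_trending_up:        { to: ['team_lead'], action: 'Identify and remove the bottleneck' },
  source_conversion_drop: { to: ['sourcing_lead'], action: 'Investigate channel quality' },
  new_role_no_movement:   { to: ['recruiting_leader', 'hiring_manager'], action: 'Job-description / sourcing-channel review' },
};

// Resolve who to ping for one flagged anomaly.
function route(anomaly) {
  const r = ROUTES[anomaly.type];
  if (!r) throw new Error(`Unknown anomaly type: ${anomaly.type}`);
  // Look up the actual people from the role record (e.g. 'recruiter' -> '@ana').
  return { recipients: r.to.map((who) => anomaly.role[who]), action: r.action };
}
```

Keeping the routing in one lookup means adding a new anomaly type is a one-line change rather than a new branch in the flow.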

Output

For each alert, the Slack message includes:

  • The role and stage
  • The metric (conversion rate, days stalled, etc.)
  • The baseline (what’s normal)
  • The current value (what’s happening)
  • A Claude-generated 1-2 sentence explanation of what likely changed
  • Direct link into the ATS to investigate
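One way to assemble that alert in the Slack node, sketched as a formatter over the fields listed above. The input shape (`role`, `stage`, `metric`, `baseline`, `current`, `explanation`, `atsUrl`) is an assumption about what the upstream nodes emit, and the URL is a placeholder:

```javascript
// Build the Slack alert text (mrkdwn) from one flagged anomaly.
// Field names assume the shape produced by the upstream nodes.
function buildAlert(a) {
  return [
    `:warning: *${a.role} — ${a.stage}*`,
    `Metric: ${a.metric}`,
    `Baseline: ${a.baseline}  |  Current: ${a.current}`,
    a.explanation,                // Claude's 1-2 sentence narrative
    `<${a.atsUrl}|Open in ATS>`,  // deep link to investigate
  ].join('\n');
}

const msg = buildAlert({
  role: 'Backend Engineer', stage: 'Onsite',
  metric: 'conversion rate', baseline: '40%', current: '25%',
  explanation: 'Onsite pass rate roughly halved after the new interview panel rotated in.',
  atsUrl: 'https://ats.example.com/roles/123', // placeholder URL
});
```

The resulting string goes straight into the `text` field of Slack's `chat.postMessage` call (or an n8n Slack node); `<url|label>` is Slack's mrkdwn link syntax.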

Where it fits

This flow is the operational layer of recruiting funnel metrics. Without it, funnel anomalies get discovered in the quarterly review (too late). With it, the recruiting team intervenes within 24-48 hours of the metric shifting.

Mature recruiting organizations also use the data feeding this flow for monthly executive reporting — the same anomaly data, aggregated up.

Watch-outs

  • Alert fatigue is real. Tune thresholds carefully; too many alerts get ignored. Start conservative; loosen thresholds as the team adapts.
  • Don’t auto-act on alerts. AI surfaces the anomaly; humans diagnose and act. Auto-actions on recruiting workflows produce wrong corrections.
  • Baseline drift. Conversion rates change with seasonality, market conditions, and team changes. Refresh baselines monthly.
  • Sensitive candidate data. ATS data includes candidate names, contact information, and demographic data. The flow’s data handling needs to align with your privacy and AI-policy requirements.
  • Source channel attribution. “Source channel conversion drop” depends on accurate source attribution in the ATS; verify the data quality before relying on the alert.