ENTRY TYPE · framework

EDRM — Electronic Discovery Reference Model

Last updated 2026-05-03 · Legal Ops

The Electronic Discovery Reference Model (EDRM) is the industry-standard map of the eDiscovery workflow. Created in 2005 and maintained by the EDRM organization (stewarded by Duke Law from 2016 to 2019, independent since), it defines nine stages from pre-litigation information governance through trial presentation. Virtually every eDiscovery platform, litigation support team, and court order on discovery procedure traces its terminology to the EDRM.

The nine stages

The EDRM is conventionally diagrammed left-to-right with volume decreasing and cost-per-document increasing as the workflow progresses:

# | Stage | What happens | Who owns it
1 | Information Governance | Pre-litigation: data retention policies, hold readiness, data minimization | Information governance team, IT, Legal Ops
2 | Identification | When litigation triggers: identify custodians, data sources, scope | Litigation counsel, IT, Legal Ops
3 | Preservation | Issue legal holds; freeze deletion; document compliance | Litigation counsel, IT
4 | Collection | Pull data from email, file shares, Slack, mobile, cloud apps | Forensic specialists, IT
5 | Processing | Deduplicate, extract metadata, OCR, normalize formats | eDiscovery team, vendor
6 | Review | Attorneys (or AI) tag documents as responsive, privileged, or hot | Review attorneys, vendor
7 | Analysis | Build the case narrative from reviewed documents | Litigation team
8 | Production | Produce responsive, non-privileged documents to the requesting party | Litigation team, vendor
9 | Presentation | Use produced documents at deposition or trial | Trial team

Stages 1-3 are pre- or early-litigation; stages 4-8 are the bulk of eDiscovery cost; stage 9 is presentation work that uses the discovery output.
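Stage 5's deduplication step is where the first big volume cut happens. A minimal sketch of exact-hash dedup (real platforms add near-duplicate detection and email-thread analysis; the document IDs and contents below are invented for illustration):

```python
import hashlib

def dedupe(documents: dict[str, bytes]) -> dict[str, str]:
    """Map each unique content hash to one representative doc ID.

    Exact-hash dedup is the baseline move in Stage 5 (Processing);
    every duplicate dropped here is a document nobody reviews.
    """
    seen: dict[str, str] = {}
    for doc_id, content in documents.items():
        digest = hashlib.sha256(content).hexdigest()
        seen.setdefault(digest, doc_id)  # first copy wins
    return seen

docs = {
    "custodian_a/report.docx": b"Q3 revenue summary",
    "custodian_b/report_copy.docx": b"Q3 revenue summary",  # exact duplicate
    "custodian_a/memo.docx": b"hold notice draft",
}
unique = dedupe(docs)
print(len(unique))  # 2 unique documents survive processing
```

In practice platforms hash at several granularities (whole file, extracted text, attachment family), which is why "deduplicated" counts differ between tools.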

Volume vs cost trade-off

The classic EDRM diagram shows volume (in documents/data) decreasing across the stages while cost-per-document increases:

  • Stages 1-2: Petabytes of organizational data
  • Stages 3-4: Terabytes of preserved and collected custodian data
  • Stage 5: Hundreds of GB of processed, deduplicated data
  • Stage 6: Millions of documents in the review universe
  • Stages 7-8: Tens of thousands of documents in the production
  • Stage 9: Hundreds of trial exhibits

Each stage that reduces volume cheaply pays back massively in downstream cost. Aggressive information governance (Stage 1) before litigation produces dramatically smaller collections (Stage 4) and dramatically lower review cost (Stage 6).
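The payoff compounds arithmetically because review is priced per document. A back-of-envelope model (all volumes and rates below are illustrative assumptions, not benchmarks):

```python
# Illustrative cost model: review is priced per document, so any
# stage that cuts volume before Stage 6 multiplies through to cost.
collected_docs = 5_000_000     # Stage 4 output (assumed)
processing_cull = 0.60         # dedup + date/custodian filters remove 60% (assumed)
review_cost_per_doc = 1.50     # blended first-pass review rate, USD (assumed)

review_universe = int(collected_docs * (1 - processing_cull))
baseline_cost = collected_docs * review_cost_per_doc
culled_cost = review_universe * review_cost_per_doc

print(f"review universe: {review_universe:,} docs")
print(f"review cost without culling: ${baseline_cost:,.0f}")
print(f"review cost after culling:   ${culled_cost:,.0f}")
print(f"savings: ${baseline_cost - culled_cost:,.0f}")
```

Under these assumed numbers, a 60% cull at Stage 5 saves $4.5M at Stage 6; that is the entire economic argument for upstream investment.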

How modern eDiscovery platforms map to EDRM

Most platforms specialize in a subset of the stages:

  • Stages 1-3: Microsoft Purview (compliance and legal hold), Google Vault, OpenText, Onna. Specialist legal-hold tools (Exterro, Relativity Legal Hold).
  • Stage 4: Forensic collection tools (Cellebrite for mobile, X1 Social Discovery, native cloud-app exports).
  • Stages 5-8: The “review platforms” — Relativity, Everlaw, DISCO, Reveal, Logikcull for self-service.
  • Stage 9: Trial presentation software (TrialDirector, Sanction, native exhibit tools in courtroom systems).

Increasingly, vendors collapse multiple stages into one platform: RelativityOne spans stages 4-8, and modern AI-native platforms add deeper Stage 7 (Analysis) capabilities.

How AI changes EDRM

Stage-by-stage:

  • Stage 1 (Governance): AI classifies documents at creation, suggesting retention categories. Reduces eDiscovery scope on every future matter.
  • Stage 2 (Identification): AI scans organizational data to surface likely-relevant custodians and data sources from a litigation description.
  • Stage 5 (Processing): AI improves OCR, foreign-language detection, dedup decisions on near-duplicates.
  • Stage 6 (Review): The biggest impact. LLM-assisted review handles first-pass review at quality competitive with junior reviewers, with 30-70% cost reductions in well-run programs.
  • Stage 7 (Analysis): Concept extraction, communication network mapping, timeline construction. AI surfaces patterns human review would miss.
  • Stage 8 (Production): Auto-redaction (see redaction workflows), privilege log generation.
  • Stage 9 (Presentation): AI drafts deposition outlines from reviewed documents; suggests trial exhibits.

The total impact: well-run AI-augmented eDiscovery programs in 2026 deliver comparable outcomes at 30-50% of historical cost.
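At its core, a first-pass AI review loop reduces to tagging each document and tallying the results. A hedged sketch, where `llm_tag` is a hypothetical stand-in for a real model or API call and the keyword rules exist only to make the example runnable:

```python
from collections import Counter

TAGS = ("responsive", "privileged", "non-responsive")

def llm_tag(doc_text: str) -> str:
    """Placeholder classifier; a real program substitutes an LLM call
    returning one of TAGS, ideally with a rationale for audit."""
    text = doc_text.lower()
    if "attorney" in text or "legal advice" in text:
        return "privileged"
    if "merger" in text:
        return "responsive"
    return "non-responsive"

corpus = [
    "Draft merger term sheet for review",
    "Per our attorney, do not forward this legal advice",
    "Lunch menu for Thursday",
]
tallies = Counter(llm_tag(d) for d in corpus)
print(dict(tallies))
```

The structure matters more than the classifier: every tag decision should be logged and sampled, which is what the validation pitfall below is about.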

Common pitfalls

  • Treating EDRM as linear. In practice, stages overlap and iterate. New custodians get identified mid-review; supplemental productions trigger return to earlier stages.
  • Under-investing in Stages 1-3. Cheap fixes upstream save expensive review downstream. Most organizations underspend on information governance.
  • Over-relying on AI without validation. AI-assisted review still requires statistical sampling and human validation; producing without it invites court challenges.
  • Ignoring proportionality. EDRM doesn’t override the duty to scope discovery proportionate to matter value. Negotiating collection scope and review tiers aggressively still matters.
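On the validation pitfall: the standard safeguard is drawing a random sample from the AI's discard pile and having humans re-check it. A rough sample-size calculation under the usual normal approximation (worst-case p = 0.5; the confidence and margin parameters are illustrative defaults, not a legal standard):

```python
import math

def sample_size(confidence_z: float = 1.96, margin: float = 0.02) -> int:
    """Docs to re-review from the 'non-responsive' pile to bound the
    elusion rate: n = z^2 * p(1-p) / margin^2, worst-case p = 0.5."""
    return math.ceil((confidence_z ** 2 * 0.25) / margin ** 2)

print(sample_size())             # 95% confidence, +/-2% margin
print(sample_size(margin=0.05))  # a looser margin needs far fewer docs
```

A few thousand sampled documents is cheap insurance against a court challenge to a multi-million-document production.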