Manual review is a critical part of any robust identity verification and KYC compliance workflow. When the automated system flags a verification session with warnings or inconsistencies, it moves to In Review status — requiring a trained reviewer to make the final decision. This guide walks you through the complete manual review process, including how to use the resubmission feature to give users a second chance.

Verification dashboard overview

The verification table view lists all sessions with their current status, document type, country, and other key details at a glance. Sessions can have the following statuses:
Status | Meaning
Approved | The user passed all identity checks
Declined | The user failed one or more verification checks
In Review | Requires your manual attention before a decision can be made
Resubmitted | The user has been asked to redo specific verification steps
Sessions marked as In Review have triggered one or more warning signals during automated processing. The total count of sessions pending review is displayed at the top of the verification dashboard.
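If your team wants to watch the review queue programmatically rather than only in the dashboard, a small polling script can surface the same pending count. The sketch below assumes a hypothetical REST endpoint and response shape: the base URL, the /v1/sessions path, the status query parameter, and the field names are all illustrative, not the documented Didit API, so treat it as a starting point only.

```typescript
// Hypothetical sketch: list sessions that are waiting for manual review.
// The endpoint path, query parameter, and response fields are assumptions
// for illustration -- consult the Didit API reference for the real schema.
const API_BASE = "https://api.example.com"; // placeholder base URL
const API_KEY = process.env.DIDIT_API_KEY ?? "";

interface SessionSummary {
  session_id: string;
  status: string;          // e.g. "In Review"
  document_type?: string;
  country?: string;
}

async function listSessionsInReview(): Promise<SessionSummary[]> {
  const url = `${API_BASE}/v1/sessions?status=${encodeURIComponent("In Review")}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${API_KEY}` },
  });
  if (!res.ok) {
    throw new Error(`Failed to list sessions: ${res.status}`);
  }
  const body = (await res.json()) as { sessions: SessionSummary[] };
  return body.sessions;
}

listSessionsInReview().then((sessions) => {
  console.log(`${sessions.length} sessions pending manual review`);
});
```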

How to conduct an effective manual review

Step 1: Initial assessment

Click on any session to open the detailed session view:
  1. Review all warnings displayed in the session overview — these are the specific signals that triggered the manual review (e.g., low liveness score, AML match, document inconsistency).
  2. Review the user’s previous verification attempts — click on the user’s vendor data to access the session history and see if they have prior verification sessions.
  3. Review the session events timeline — the events section provides a chronological log of every action taken during the session.
Step 2: Document inspection

Didit’s automated system already performs comprehensive document verification — including security feature detection, field-level data consistency checks, MRZ validation, expiry date verification, and image quality analysis. Your role is to evaluate the flagged warnings and visually confirm the document when needed.
  1. Review document warnings — check the specific warnings the system raised. Common warnings include data inconsistencies between fields, failed MRZ check digit validation, expired documents, or suspected tampering.
  2. Visually verify when warnings are ambiguous — if a warning suggests potential tampering or low image quality, inspect the document images directly. Look for signs of digital editing, cropping, screen capture, or physical manipulation.
  3. Check extracted OCR data — review the data the system extracted from the document and confirm it looks consistent. The system highlights fields where confidence is low.
  4. Review document matches — if Face Search or duplicate detection is enabled, check whether this user has been verified before under a different identity or if the document appears in your blocklist.
Hover over document images in the Didit console to use the zoom feature for detailed pixel-level inspection. This is especially useful when verifying microprinting and holograms.
Step 3: Biometric and liveness verification

Didit automatically performs face matching, liveness detection, and — if enabled — face search. Your role is to review the results when they are flagged.
  1. Face match score — the system computes a biometric similarity score between the selfie and the document portrait. When reviewing a flagged face match, focus on structural features like eye shape, nose geometry, jawline, and distinguishing marks.
  2. Liveness detection score — the liveness system detects presentation attacks such as printed photos, screen replays, deepfakes, and 3D masks. Poor camera quality or low lighting can sometimes produce lower scores for genuine users — consider requesting resubmission rather than declining.
  3. Face search results — review any matches carefully:
    • Duplicate detection — a match with a different session may indicate the same person verifying under multiple identities.
    • Blocklist matches — if the face matches a blocklisted user, this typically warrants a decline.
    • False positives — facial similarity between different people does occur. Evaluate the confidence score and compare images visually.
Step 4: Risk and signal review

Navigate through the AML, IP Analysis, and Database Validation sections to assess contextual risk.
AML Screening
  • No matches — low risk. No hits found against global watchlists, sanctions lists, or PEP databases.
  • Matches found — review each matched entry. Check whether it’s a true positive or false positive, review the match category (PEP, sanctions, watchlist, adverse media), and consider the source and recency of the data.
IP Analysis and device intelligence
  • Geographic consistency — compare the document’s country of issue, the user’s claimed address, and the IP geolocation.
  • VPN / Proxy detection — check if the user is connecting through a VPN, proxy, or Tor network.
  • Device metadata — review the device type, operating system, and browser information for anomalies.
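These contextual signals matter most in combination with the biometric results from the previous step: a low liveness score on its own may just be a bad camera, but paired with a VPN connection and an IP geolocation that contradicts the document's country of issue it deserves far more scepticism. As a rough illustration of that idea, the sketch below ranks flagged sessions by review priority; the field names, score scales, and thresholds are assumptions, not values defined by Didit.

```typescript
// Illustrative sketch: combine several flagged signals into a single
// review-priority score. Field names, scales, and thresholds are
// assumptions, not part of the Didit data model.
interface ReviewSignals {
  faceMatchScore: number;   // 0..1, higher = stronger biometric match (assumed scale)
  livenessScore: number;    // 0..1, higher = more likely a live capture
  usesVpnOrProxy: boolean;  // connection routed through VPN, proxy, or Tor
  ipCountry: string;        // ISO country code from IP geolocation
  documentCountry: string;  // ISO country code of the issuing country
  amlMatches: number;       // number of unresolved AML/watchlist hits
}

function reviewPriority(s: ReviewSignals): number {
  let score = 0;
  if (s.faceMatchScore < 0.7) score += 2;            // weak biometric match
  if (s.livenessScore < 0.5) score += 2;             // possible presentation attack
  if (s.usesVpnOrProxy) score += 1;                  // anonymised connection
  if (s.ipCountry !== s.documentCountry) score += 1; // geographic mismatch
  score += Math.min(s.amlMatches, 3);                // unresolved screening hits
  return score; // higher = review sooner and more sceptically
}
```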

Making a decision

After completing your review, you have three options: Approve, Decline, or Request Resubmission.

Approve

  • Document appears authentic and unaltered
  • All data fields are consistent and legible
  • Selfie biometric match confirms the same person
  • Liveness check score is within acceptable range
  • No relevant AML, sanctions, or PEP matches
  • Device and location data are consistent

Decline

  • Document appears tampered, forged, or altered
  • Selfie does not match the document photo
  • High-confidence AML or sanctions match confirmed
  • Strong indicators of a presentation attack
  • Fraudulent identity patterns detected
  • Critical device or location inconsistencies

Request resubmission

  • Blurry or low-quality document images
  • Failed liveness check due to technical issues
  • Wrong document submitted
  • Incomplete or expired verification steps
  • Fixable issues the user can correct
Always select a decline reason and add detailed review notes when declining a session. This supports quality assurance, regulatory audits, and internal analytics.
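Taken together, the three outcomes follow a simple routing rule: unfixable problems (tampering, biometric mismatch, confirmed sanctions hits, presentation attacks) lead to a decline, fixable problems lead to a resubmission request, and a clean session is approved. If you encode that rule for internal tooling or reviewer training, it might look like the hedged sketch below; the flag and outcome names are illustrative assumptions, not Didit API values.

```typescript
// Illustrative sketch of routing a review outcome from the criteria above.
// The flags and outcome labels are assumptions, not Didit API values.
type ReviewDecision = "approve" | "decline" | "request_resubmission";

interface ReviewFindings {
  documentTampered: boolean;    // tampering, forgery, or alteration confirmed
  biometricMismatch: boolean;   // selfie clearly does not match document photo
  confirmedAmlMatch: boolean;   // high-confidence sanctions/AML true positive
  presentationAttack: boolean;  // strong indicators of spoofing
  fixableQualityIssue: boolean; // blur, glare, wrong document, technical failure
}

function routeDecision(f: ReviewFindings): ReviewDecision {
  // Unfixable problems: decline outright.
  if (f.documentTampered || f.biometricMismatch || f.confirmedAmlMatch || f.presentationAttack) {
    return "decline";
  }
  // Fixable problems: give the user another attempt.
  if (f.fixableQualityIssue) {
    return "request_resubmission";
  }
  // Nothing blocking: approve.
  return "approve";
}
```

Whatever tooling you use, the decline reason and review notes mentioned above should still be recorded alongside the decision itself.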

Requesting a resubmission

When a session cannot be clearly approved or declined, typically because of fixable issues, you can ask the user to resubmit specific verification steps.

How to request resubmission

Step 1: Open the session

Navigate to the session in the Didit console.
Step 2: Select Request Resubmission

Click the actions menu and select Request Resubmission.
Step 3: Choose features to resubmit

The dialog shows all non-approved features with their current status. Select or deselect individual steps. Features that were never started are automatically required.
Step 4: Optionally send an email notification

If the user’s email address is available, send them a localized notification with a direct link to resume the verification.
Step 5: Confirm

The session status changes to Resubmitted and a webhook is sent to your server.

What happens after resubmission

  1. Feature data is reset — only the selected features are cleared. All previous attempt data is archived and marked as previous_attempt in the logs.
  2. The user re-enters the verification flow — they only need to complete the specific steps that were requested.
  3. Session stays in Resubmitted status — distinct from In Progress, helping you track resubmission sessions separately.
  4. Automatic re-evaluation — once all resubmitted features are completed, the system recalculates the final status automatically.
  5. New webhook fires — your server receives a webhook with the updated final status.
You can request resubmission multiple times on the same session. Each cycle preserves the complete history of all previous attempts for compliance audits.
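Because the status first changes to Resubmitted and later to a recalculated final status, your webhook endpoint will typically receive at least two events for the same session ID during each resubmission cycle. A minimal handler sketch is shown below; the route and the payload field names (session_id, status) are assumptions, so adapt them to the webhook schema documented for your integration.

```typescript
// Minimal Express sketch for handling session status webhooks during a
// resubmission cycle. Payload field names are assumptions -- adjust them
// to match the actual Didit webhook schema.
import express from "express";

const app = express();
app.use(express.json());

app.post("/webhooks/didit", (req, res) => {
  const { session_id, status } = req.body as { session_id: string; status: string };

  switch (status) {
    case "Resubmitted":
      // The user has been asked to redo specific steps; keep the same
      // session record open rather than creating a new one.
      console.log(`Session ${session_id} is waiting on resubmission`);
      break;
    case "Approved":
    case "Declined":
      // Re-evaluation finished after the resubmitted steps were completed.
      console.log(`Session ${session_id} final status: ${status}`);
      break;
    default:
      console.log(`Session ${session_id} status update: ${status}`);
  }

  // Acknowledge quickly; do any heavy processing asynchronously.
  res.sendStatus(200);
});

app.listen(3000);
```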

Resubmission vs. creating a new session

Aspect | Resubmission | New session
Session ID | Same session ID preserved | New session ID created
History | Previous attempts archived within the same session | Separate session, no linked history
User effort | Only redo failed/selected steps | Full verification from scratch
Conversion impact | Higher completion rates | Risk of user drop-off
Webhook | Same session ID, status changes to Resubmitted → final status | New session lifecycle from Not Started
Audit trail | Complete history in one place | Spread across multiple sessions

Review best practices

Apply the same standards to every review. Document your reasoning so other team members can understand and replicate your decisions.
If the problem is a blurry photo or a technical error, requesting resubmission preserves the session and avoids unnecessary friction for genuine users.
Always zoom into document images to check for subtle tampering, microprinting, or security features not visible at standard zoom levels.
Don’t evaluate signals in isolation. A low liveness score combined with a VPN connection and mismatched geolocation is more concerning than any single factor alone.
Add detailed review notes for every decision, especially declines. These notes support internal QA, regulatory audits, and help train new team members.
Configure email or Slack alerts to get immediately notified when sessions enter In Review status. Fast response times improve both compliance and user experience.
If a user has multiple resubmission cycles, review the full history carefully. Repeated failures on the same feature may indicate a systemic issue.
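As a concrete example of the alerting tip above, the sketch below forwards sessions that enter In Review to a Slack channel through an incoming webhook. The session event fields are assumptions about your own webhook payload; the Slack incoming-webhook body format ({ "text": ... }) is the standard one.

```typescript
// Sketch: forward "In Review" webhook events to a Slack channel via an
// incoming webhook. Session payload fields are assumptions; the Slack
// incoming-webhook body format ({ text: "..." }) is standard.
const SLACK_WEBHOOK_URL = process.env.SLACK_WEBHOOK_URL ?? "";

interface SessionEvent {
  session_id: string;
  status: string;
}

async function notifyIfInReview(event: SessionEvent): Promise<void> {
  if (event.status !== "In Review") return;

  await fetch(SLACK_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `Verification session ${event.session_id} needs manual review.`,
    }),
  });
}
```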