How do I validate domains before lead-gen outreach?

The problem

Lead-gen AI agents scrape thousands of domains from directories, LinkedIn, industry databases, and the open web. A significant portion of those domains are dead weight: businesses that closed, domains that lapsed, parked placeholders that were never real companies.

Every bad domain costs time. Sales reps research companies that don't exist. Emails bounce. Phone numbers are disconnected. The pipeline looks full but the close rate tells a different story.

How Unphurl solves it

Run your scraped domains through Unphurl before they enter the pipeline. The lead-gen scoring profile weights domain legitimacy and business viability signals. A parked domain, a domain without MX records, or a domain about to expire is not a lead worth pursuing.

After the batch check, filter your results by score to get a clean list. CLI users pipe through jq. Claude Cowork and OpenClaw users just ask for the filtered list in plain language.

Signals that matter for this use case

  • Parked domain means there's no business behind the domain. Just a placeholder.
  • HTTP only / SSL invalid signals poorly maintained or abandoned infrastructure.
  • Chain incomplete means the domain doesn't resolve at all. Dead lead.
  • No MX record means the domain can't receive email. Not a functioning business.
  • Expiring soon flags domains about to lapse. Business may be closing.
  • Bad domain status (pendingDelete, serverHold) confirms the domain is effectively dead.

Suggested scoring profile

{
  "name": "lead-gen",
  "weights": {
    "parked": 35,
    "http_only": 20,
    "chain_incomplete": 20,
    "domain_status_bad": 20,
    "no_mx_record": 15,
    "expiring_soon": 15,
    "ssl_invalid": 15,
    "domain_age_7": 10
  }
}
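One plausible way to read this profile: each signal that fires for a domain contributes its weight to the total score. This additive model is an assumption for illustration only; Unphurl's actual formula may cap, scale, or combine weights differently, so real scores can differ from this sketch.

```python
# Assumed additive scoring model -- not Unphurl's documented formula.
LEAD_GEN_WEIGHTS = {
    "parked": 35,
    "http_only": 20,
    "chain_incomplete": 20,
    "domain_status_bad": 20,
    "no_mx_record": 15,
    "expiring_soon": 15,
    "ssl_invalid": 15,
    "domain_age_7": 10,
}

def score(signals: set[str], weights: dict[str, int] = LEAD_GEN_WEIGHTS) -> int:
    """Sum the weights of every signal that fired for a domain."""
    return sum(weights.get(s, 0) for s in signals)

print(score({"parked"}))  # 35
```

Under this model, a domain with no triggered signals scores 0 and drops straight through your filter.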

What a result looks like

Your AI agent scrapes 5,000 prospect domains from a directory. Unphurl checks each one:

acme-corp.com        Score: 0    Known
defunct-startup.io   Score: 55   Parked + No MX
real-business.ca     Score: 0    Cached

Your AI agent filters out everything above your threshold. Clean leads enter the pipeline. Dead weight stays out.
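The filtering step itself is a one-liner once results are in hand. A sketch over the example above, assuming each result is an object with `url` and `score` fields (field names are assumptions; adjust to the actual batch output shape):

```python
# Filter batch results by score threshold; "url"/"score" field names are assumed.
results = [
    {"url": "acme-corp.com", "score": 0},
    {"url": "defunct-startup.io", "score": 55},
    {"url": "real-business.ca", "score": 0},
]

THRESHOLD = 30  # tune to your pipeline's risk tolerance

clean_leads = [r["url"] for r in results if r["score"] <= THRESHOLD]
print(clean_leads)  # ['acme-corp.com', 'real-business.ca']
```

The jq equivalent of the same filter is `select(.score <= 30) | .url` applied to each result object.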

Cost

At a typical 1-5% pipeline rate, 5,000 URLs might consume 50-250 pipeline checks. The Standard package ($39 for 500 pipeline checks) handles this. Repeat scans of similar directories cost even less as the cache grows.
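The arithmetic behind that range, for sizing your own batches (here "pipeline rate" is read as the fraction of URLs that need a full pipeline check rather than a cache hit; that reading is an assumption):

```python
# Estimate pipeline-check consumption for a batch at a given pipeline rate.
def estimate_checks(batch_size: int, pipeline_rate: float) -> int:
    return round(batch_size * pipeline_rate)

low = estimate_checks(5000, 0.01)
high = estimate_checks(5000, 0.05)
print(low, high)  # 50 250
```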

Get started

# Create the profile
unphurl profiles create lead-gen \
  --weights '{"parked":35,"http_only":20,"chain_incomplete":20,"domain_status_bad":20}'

# Batch check
unphurl --batch scraped-domains.csv --profile lead-gen

# Via the API (batch endpoint)
POST /v1/check/batch
{
  "urls": ["https://acme-corp.com", "https://defunct-startup.io", ...],
  "profile": "lead-gen",
  "webhook_url": "https://your-server.com/results"
}
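A minimal client sketch for the batch endpoint using only Python's standard library. The base URL and bearer-token auth header are assumptions, not documented values; check the API reference for the real ones. The request body mirrors the example above.

```python
import json
import urllib.request

API_BASE = "https://api.unphurl.com"  # assumed base URL
API_KEY = "YOUR_API_KEY"              # assumed auth scheme

def build_batch_payload(urls: list[str], profile: str, webhook_url: str) -> dict:
    """Assemble the batch-check request body shown above."""
    return {"urls": urls, "profile": profile, "webhook_url": webhook_url}

payload = build_batch_payload(
    ["https://acme-corp.com", "https://defunct-startup.io"],
    "lead-gen",
    "https://your-server.com/results",
)

req = urllib.request.Request(
    f"{API_BASE}/v1/check/batch",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send; results arrive at webhook_url
```

Because the batch endpoint takes a `webhook_url`, the call returns before checks finish; your server receives the scored results asynchronously.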