2026-03-05 · 6 min read

The Web Has robots.txt.
Agents Need humans.txt.

Here's the uncomfortable truth about the MCP ecosystem: when your AI agent calls a tool server, it has no idea who built it, who maintains it, or whether there's a human accountable if something goes wrong.

The trust gap

Adversa AI published research showing 43% of MCP servers have security vulnerabilities. That's not a bug — it's a systemic problem. The MCP ecosystem exploded from zero to 500+ servers in months. There's no vetting. No accountability. No way to know whether the server your agent just called was built by a senior engineer at a funded company or thrown together as a weekend project that hasn't been updated in three months.

Cryptographic verification helps. Signed receipts help. SLA monitoring helps. But all of those are technical trust signals. They tell you the system works. They don't tell you there's a human being who will fix it when it breaks.

What is humans.txt?

humans.txt is a machine-readable file served at /.well-known/humans.txt that tells agents who the real human behind a service is.

{
  "version": "1.0",
  "platform": {
    "name": "Your MCP Server",
    "license": "MIT"
  },
  "operator": {
    "name": "Your Name",
    "location": "Your City",
    "verifiedHuman": true,
    "github": "https://github.com/you",
    "contact": "you@email.com"
  },
  "trust": {
    "billingSpec": "MIT-licensed",
    "openSource": true,
    "vendorCapture": false
  },
  "message": "A message from the human to the bots."
}

Think of it as the inverse of robots.txt. That file tells bots what they can't do. This file tells bots who they're dealing with.
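To make that concrete, here's a minimal sketch of how an agent might parse and sanity-check a humans.txt payload before trusting it. The field names follow the example above; the required-field set and the `parse_humans_txt` helper are assumptions for illustration, not part of any finalized spec.

```python
import json

# Assumed minimum operator fields, based on the example in this post.
REQUIRED_OPERATOR_FIELDS = {"name", "verifiedHuman", "contact"}

def parse_humans_txt(raw: str) -> dict:
    """Parse a humans.txt JSON document and check the operator block."""
    doc = json.loads(raw)
    operator = doc.get("operator", {})
    missing = REQUIRED_OPERATOR_FIELDS - operator.keys()
    if missing:
        raise ValueError(f"humans.txt missing operator fields: {sorted(missing)}")
    return doc

sample = """{
  "version": "1.0",
  "operator": {
    "name": "Your Name",
    "verifiedHuman": true,
    "contact": "you@email.com"
  }
}"""

doc = parse_humans_txt(sample)
print(doc["operator"]["verifiedHuman"])  # True
```

In practice an agent would fetch the document from /.well-known/humans.txt over HTTPS first; the parsing step is the same either way.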

Why this matters now

Three trends are colliding:

  1. Agents are becoming autonomous. When Claude or GPT calls an MCP tool, there's no human reviewing the decision in real time. The agent needs to make trust decisions on its own.
  2. The supply of MCP servers is exploding. 500+ servers and growing. Quality varies wildly. Some are maintained by teams at funded companies. Some are abandoned weekend projects. From the outside, they look the same.
  3. Money is entering the ecosystem. When tools start charging per-call (via Agent Bazaar or competitors), the incentive to create malicious or poorly-maintained servers goes up. Trust infrastructure must grow with the economic layer.

Human accountability as a trust primitive

Every trust system eventually bottoms out at human accountability. SSL certificates trace back to certificate authorities staffed by humans. Open source projects are maintained by humans with GitHub histories. Package registries (npm, PyPI) have human maintainers.

The MCP ecosystem doesn't have this yet. humans.txt adds it. Not as a replacement for technical verification, but as the foundation layer that everything else builds on.

When an agent reads a humans.txt file, it learns:

  • Who operates this service (name, location, company)
  • Whether they're verified as a real human
  • Where to find their track record (GitHub, HN)
  • What their stance is on trust (open source? vendor lock-in?)
  • A human-written message explaining their intent

How Agent Bazaar uses it

Every provider on Agent Bazaar gets a Verified Human profile. When you register, we verify your identity through GitHub OAuth. Your profile shows your real name, location, bio, and a ✓ VERIFIED HUMAN badge.

This information feeds into the trust scoring system. A tool maintained by a verified human with a 5-year GitHub history and 100+ stars scores higher than an anonymous server with no history. Agents can factor this into their tool selection.
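As a rough illustration of how those signals might combine: Agent Bazaar's actual scoring model isn't public, so the weights, caps, and `trust_score` function below are purely hypothetical.

```python
# Illustrative heuristic only: weights and signals are assumptions,
# not Agent Bazaar's real scoring model.
def trust_score(verified_human: bool, github_years: float, stars: int) -> float:
    """Combine human-accountability signals into a 0-1 score."""
    score = 0.0
    if verified_human:
        score += 0.5                            # identity verified via OAuth
    score += min(github_years, 5) / 5 * 0.3     # track record, capped at 5 years
    score += min(stars, 100) / 100 * 0.2        # community signal, capped at 100 stars
    return round(score, 2)

print(trust_score(True, 5, 120))   # 1.0 — verified, long history, popular
print(trust_score(False, 0, 0))    # 0.0 — anonymous, no history
```

The point of the design is that anonymity alone caps the score: technical signals can raise it further, but a verified human is the largest single factor.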

Get involved

humans.txt is MIT-licensed and open. We're proposing it as a standard for the MCP ecosystem.

> message from the operator

“I'm Hudson Taylor. I live in San Diego. I have a daughter, a wife, five monitors, and a pool that's too cold until June. I built Agent Bazaar because I think the people who build tools for AI agents deserve to get paid. And I think when bots talk to bots, knowing there's a human in the loop is the ultimate trust signal. That's why humans.txt exists.”
