September 2025

AI Agents for PMF

Dev Patel

Tunnel is an AI-powered market simulation platform for testing product and feature ideas with realistic, personality-driven personas—built at HackTheNorth with Krish Garg, Suneru Perera, and Haresh Goyal. This writeup covers the motivation, architecture, and implementation behind using AI agents for accurate PMF (product–market fit) simulations.

Motivation

Most startups fail because they build something with no real market need. Classic validation—surveys, focus groups, user interviews—is slow, expensive, and often misses the mark. At HackTheNorth, our team wanted a better way: test product ideas before writing code, with hundreds or thousands of AI personas that behave like real users.

Tunnel lets you describe a product or feature and instantly run it past a simulated userbase. You can interact with personas in real time (including voice via Vapi), see reactions and sentiment, and iterate. The goal is fast, accessible PMF signal so makers can build smarter from day one.

Architecture

Stack overview

  • Frontend: Next.js, TypeScript, Tailwind, Shadcn UI primitives, Framer Motion for animations.
  • 3D globe: Three.js and React Three Fiber for the interactive globe (personas as points, color-coded by reaction: green interested, yellow neutral, red not interested).
  • Backend: Next.js API routes for persona generation, project analysis, voice session bridging, and session management.
  • AI: Cohere for semantic understanding, ranking, and nuanced conversation generation.
  • Voice: Vapi with an MCP server for browser-based, phone-like calls with each persona.
  • Data: MongoDB Atlas for sessions and persona profiles; Auth0 for auth (including runtime-created AI agent profiles). Compound indexing and partitioning keep reads fast; debounced auto-save and optimistic UI keep the experience responsive.
  • Hosting: Vercel for deployment and global availability.

Building for scale and security

The API layer orchestrates persona generation, niche extraction, ranking, and voice bridging. Persona profiles store nested demographic, psychographic, and behavioral data plus simulation history and voice transcripts. Live updates use fast polling and optimistic UI so users see feedback without blocking. Everything is designed to support hundreds of concurrent users and many concurrent simulations.
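The debounced auto-save mentioned above can be sketched as a small generic utility (not Tunnel's actual code): repeated edits collapse into a single trailing save. The injectable scheduler is an assumption added here so the behavior is testable without real timers.

```typescript
// Trailing-edge debounce for auto-save: only the last call within the
// window actually saves. `Schedule` abstracts setTimeout so tests can
// drive it deterministically.
type Cancel = () => void;
type Schedule = (fn: () => void, ms: number) => Cancel;

const realTimers: Schedule = (fn, ms) => {
  const id = setTimeout(fn, ms);
  return () => clearTimeout(id);
};

function debounce<T>(
  fn: (value: T) => void,
  ms: number,
  schedule: Schedule = realTimers,
): (value: T) => void {
  let cancel: Cancel | undefined;
  return (value: T) => {
    cancel?.(); // drop the previously scheduled save
    cancel = schedule(() => fn(value), ms);
  };
}
```

In the UI, the optimistic update happens immediately while `debounce(saveToApi, 500)` batches the writes behind it.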

Example loading screen built with Framer Motion:
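(The original screenshot is omitted here; as a stand-in, a minimal sketch of the kind of Framer Motion variants such a loading screen might use. Names and values are hypothetical; in the real component these objects are passed to `<motion.div>` / `<motion.p>` via the `variants` prop.)

```typescript
// Hypothetical Framer Motion variant objects for a loading screen:
// a spinner that rotates forever and status text that pulses.
const spinnerVariants = {
  animate: {
    rotate: 360, // one full turn per cycle
    transition: { repeat: Infinity, duration: 1, ease: "linear" },
  },
};

const statusTextVariants = {
  animate: {
    opacity: [0.4, 1, 0.4], // keyframed pulse
    transition: { repeat: Infinity, duration: 1.6 },
  },
};
```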

The globe

The 3D globe was one of the most asked-about parts of the project. It’s built with Three.js and React Three Fiber. Each persona is a point on the globe; color indicates reaction (green / yellow / red). The wireframe sphere uses THREE.MeshBasicMaterial with wireframe: true and ellipse curves for latitude lines:

const wireframeMaterial = new THREE.MeshBasicMaterial({
  color: color,
  wireframe: true,
  transparent: true,
  opacity: 0.6
});
const wireframeSphere = new THREE.Mesh(sphereGeometry, wireframeMaterial);
globeGroup.add(wireframeSphere);

const lineMaterial = new THREE.LineBasicMaterial({
  color: color,
  transparent: true,
  opacity: 0.6
});

const createLatitudeLines = () => {
  const latitudes = [];
  for (let i = -80; i <= 80; i += 20) {
    const phi = (90 - i) * (Math.PI / 180);
    const radius = Math.sin(phi); // ring radius shrinks toward the poles
    const y = Math.cos(phi);      // height of this latitude ring
    const curve = new THREE.EllipseCurve(
      0, 0, radius, radius, 0, 2 * Math.PI, false, 0
    );
    // Sketch of the elided body: sample the curve, lay the ring flat, lift it to y.
    const points = curve.getPoints(64);
    const geometry = new THREE.BufferGeometry().setFromPoints(points);
    const line = new THREE.LineLoop(geometry, lineMaterial);
    line.rotation.x = Math.PI / 2; // EllipseCurve is defined in the XY plane
    line.position.y = y;
    latitudes.push(line);
  }
  return latitudes;
};

Continent outlines come from a large GeoJSON file. We load it with loadGeoJsonData(), and latLonToVector3() maps each latitude/longitude pair to a 3D Cartesian point on the sphere (the globe is round, not flat).

Input parameters: lat = latitude in degrees (0 = equator, ±90 = poles), lon = longitude in degrees (0 = Greenwich, ±180 = date line), radius = sphere radius.

Conversion to radians:

θ = (−lon + 180) × π/180

(azimuthal angle: longitude is negated and offset by 180° so the continent outlines line up with Three.js's right-handed, y-up axes.)

φ = (90 − lat) × π/180

(polar angle: 0 at the North Pole, π at the South Pole, per the 3D spherical convention.)

Cartesian coordinates:

x = r sin φ cos θ

y = r cos φ

z = r sin φ sin θ

Return:

new THREE.Vector3(x, y, z)
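The formulas above translate directly into code. A dependency-free sketch of latLonToVector3() (returning a plain object here instead of a THREE.Vector3 so it runs without Three.js):

```typescript
// Maps latitude/longitude in degrees to a point on a sphere of the
// given radius, following the formulas above.
function latLonToVector3(lat: number, lon: number, radius: number) {
  const phi = (90 - lat) * (Math.PI / 180);     // polar angle: 0 at the North Pole
  const theta = (-lon + 180) * (Math.PI / 180); // azimuth, flipped for Three.js's axes
  return {
    x: radius * Math.sin(phi) * Math.cos(theta),
    y: radius * Math.cos(phi),
    z: radius * Math.sin(phi) * Math.sin(theta),
  };
}
```

For example, (lat 0, lon 0) on a unit sphere lands at x = −1 on the equator, and lat 90 lands at y = radius, the North Pole.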

The JSON structure looks like:

{
  "type": "FeatureCollection",
  "features": [
    { "type": "Feature", "properties": { "CONTINENT": "Asia" }, "geometry": { "type": "MultiPolygon", "coordinates": [ ... ] } },
    { "type": "Feature", "properties": { "CONTINENT": "North America" }, "geometry": { ... } },
    { "type": "Feature", "properties": { "CONTINENT": "Europe" }, "geometry": { ... } }
  ]
}
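A rough sketch of how a loader like loadGeoJsonData() might walk this structure — the types and the ring-collecting helper are assumptions for illustration, not the real loader — gathering each polygon ring as [lon, lat] pairs ready to be projected onto the globe:

```typescript
// Walks a GeoJSON FeatureCollection like the one above and collects every
// polygon ring per continent. Each ring is a list of [lon, lat] pairs.
type Ring = [number, number][];

interface ContinentFeature {
  properties: { CONTINENT: string };
  geometry: { type: "Polygon" | "MultiPolygon"; coordinates: unknown };
}

function extractRings(features: ContinentFeature[]): Map<string, Ring[]> {
  const rings = new Map<string, Ring[]>();
  for (const f of features) {
    // Normalize: a Polygon is one polygon; a MultiPolygon is an array of them.
    const polygons =
      f.geometry.type === "Polygon"
        ? [f.geometry.coordinates as Ring[]]
        : (f.geometry.coordinates as Ring[][]);
    // Each polygon is an array of rings (outer boundary plus holes).
    rings.set(f.properties.CONTINENT, polygons.flatMap((poly) => poly));
  }
  return rings;
}
```

Each collected ring is then mapped point-by-point through latLonToVector3() to draw the outline on the sphere.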

Runtime agent

The runtime agent powers the AI personas: consistent opinions, behavior, and voice. When you submit a product idea, it:

  1. Extracts a niche from the idea (e.g. “Gen X fintech app in Canada” → Financial Technology) and matches it to persona interests. Example extract-niche API output:
✓ Compiled /api/extract-niche in 536ms (1820 modules)
[EXTRACT-NICHE] Request body: { 'create genx app related to fintech in canada' }
[EXTRACT-NICHE] Processing idea: create genx app related to fintech in canada...
[EXTRACT-NICHE] Analyzing 15 potential niches...
[EXTRACT-NICHE] Financial Technology: score=1.50, keywords=[fintech]
[EXTRACT-NICHE] Best match: { name: 'Financial Technology', score: 1.5 }
[EXTRACT-NICHE] Successfully identified niche: Financial Technology
POST /api/extract-niche 200 in 597ms
  2. Selects personas via Cohere’s reranker so only relevant personas respond. Example rerank output:
[SELECT-NICHE-USERS] Cohere rerank completed, got 25 results
[SELECT-NICHE-USERS] Top 5 selected users:
1. James Wilson (0.866) - CEO
2. Derek Ward (0.759) - Learning Experience Designer
3. Michelle Nelson (0.561) - Financial Analyst
4. Tessa Watson (0.392) - Policy Analyst
5. Jared Bishop (0.336) - Regulatory Compliance Officer
  3. Stores rich metadata per persona: location, demographics, professional context, psychographics, interests, and personality. Example persona metadata (stored as Auth0-style users):
{
  "location": { "city": "Toronto", "country": "Canada", "coordinates": { "longitude": -79.3832, "latitude": 43.6532 } },
  "demographics": { "generation": "Gen X", "gender": "Male", "ageRange": "45-50" },
  "professional": { "seniority": "C-Level", "primaryIndustry": "Fintech", "companySize": "100-500", "yearsExperience": 22 },
  "psychographics": { "techAdoption": 7, "riskTolerance": 9, "priceSensitivity": 3, "influenceScore": 9, "brandLoyalty": 8 },
  "interests": [ "Strategy", "Innovation", "Golf", "Investing" ],
  "personality": { "openness": 0.7, "conscientiousness": 0.9, "extraversion": 0.8, "agreeableness": 0.6, "neuroticism": 0.2 }
}
  4. Generates reactions with AI pipelines that use persona attributes, pattern recognition, and sentiment so each reply is unique and grounded.
  5. Handles voice: when you “call” a persona, the runtime passes full context (personality, history, feedback) into Vapi, which drives a real-time voice call in the browser. State, transcripts, and feedback stay in sync so the agent can refer to past sessions.

All of this runs in real time with results stored and streamed back, so the simulation feels live and coherent.
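The niche-extraction step can be sketched as a simple keyword scorer in the spirit of the extract-niche logs above — the niche list, keyword weights, and function shape here are hypothetical, not the production logic:

```typescript
// Scores each niche by summing the weights of its keywords found in the
// idea text, then returns the best-scoring niche (or null if none match).
interface Niche {
  name: string;
  keywords: { term: string; weight: number }[];
}

function extractNiche(
  idea: string,
  niches: Niche[],
): { name: string; score: number } | null {
  const text = idea.toLowerCase();
  let best: { name: string; score: number } | null = null;
  for (const niche of niches) {
    const score = niche.keywords
      .filter((k) => text.includes(k.term))
      .reduce((sum, k) => sum + k.weight, 0);
    if (score > 0 && (!best || score > best.score)) {
      best = { name: niche.name, score };
    }
  }
  return best;
}
```

Running it on the idea from the logs with a fintech keyword weighted 1.5 reproduces the `Financial Technology, score: 1.5` match shown above.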

Demo

Here’s the globe in action:

Ending remarks

We went into HackTheNorth with a new team and a Google Doc full of ideas. We ended up winning twice: MLH Track and Best use of Vapi – AI Voice Agent. Out of 100+ projects on the Y Combinator track, we were shortlisted in the top 10 and interviewed with Nicolas Dessaigne and Andrew Miklas from YC to discuss the future of Tunnel. Sharing the win on LinkedIn and X led to 500k+ impressions and a lot of inbound interest. For me, the biggest takeaway was building something that actually helped makers validate ideas faster—and doing it with a team that showed up and shipped.