September 15th, 2025

Tunnel – AI Agent Simulation

Tunnel is an advanced AI-powered market simulation platform designed to revolutionize how product ideas are validated and refined. Instead of relying solely on traditional market research methods (which are slow, expensive, and often inaccessible), Tunnel enables you to test your ideas instantly against a diverse array of intelligent personas, each modeled with unique demographics, psychographics, and behavioral traits.

Latest updates
HackTheNorth – Vapi Track Winner
HackTheNorth – MLH Track Winner

Introduction

Have you ever spent months building something you believed in, only to launch and realize... no one actually wanted it? I’ve been there, and it’s a brutal feeling. Turns out, we're not alone—most startups fail, and the number one reason is building something with no real market need.

We looked at the usual ways to validate ideas: surveys, focus groups, user interviews. But they’re slow, expensive, and often miss the mark. At HackTheNorth, our team set out to find a better way. That’s how Tunnel was created.

Tunnel is an AI-powered market simulation platform that lets you test product and feature ideas instantly against your target userbase. You can interact with hundreds (or even a thousand, if you have a stronger machine than I clearly do...) of realistic, personality-driven personas that represent your target market. Instead of guessing what people might think, you can see real-time reactions and insights, all before writing a single line of code. It’s fast, accessible, and built to help makers like us build smarter from day one.

Building for Scale and Security

When we set out to build Tunnel, we wanted every piece of the architecture to feel seamless, responsive, and as dynamic as the market simulations themselves. On the frontend, we chose a modern stack: Next.js, TypeScript, Tailwind, and Shadcn UI primitives (my absolute favourite)!

Additionally, we created a real-time 3D globe (which ended up being a fan favourite from all the feedback we got!) powered by Three.js and React Three Fiber. All the UI logic lives alongside beautifully animated transitions, thanks to Framer Motion, while Tailwind CSS and Radix UI handle styling and accessibility for every component.


Example: Loading screen built using Framer Motion


The heart of Tunnel is our custom API layer, built with Next.js API routes, that orchestrates persona generation, project analysis, voice session bridging, and session management. Intelligence flows from Cohere's suite of AI tools, which handles everything from semantic understanding and ranking algorithms to nuanced conversation generation.

For voice, we use Vapi with an MCP server to provide instant, realistic phone-like interactions with each persona directly from the browser. All user sessions are managed in MongoDB Atlas, which lets users pick up their iterations at any time. We used Auth0 for authentication and for our (world’s-first!) AI agent profiles, which are created at runtime.

This offers the flexibility to store evolving persona profiles (with nested demographic, psychographic, and behavioral data) plus full records of simulations, feedback, and voice transcripts. We use compound indexing and partitioning to keep data retrieval fast, even when scaling to hundreds of concurrent users. Live updates come in via fast polling and optimistic UI updates, so users never wait for feedback, and all session data auto-saves in the background using a debounce system to prevent data loss.
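That auto-save debounce is a small, self-contained pattern. Here is a minimal sketch of the idea (the 1-second delay and the saveSession name are illustrative assumptions, not our actual code):

```javascript
// Minimal debounce sketch for background auto-save: every change resets a
// timer, so the save callback only fires after `delay` ms of inactivity.
function debounce(fn, delay = 1000) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Hypothetical usage: saveSession would persist the session to MongoDB Atlas.
// const autoSave = debounce(saveSession, 1000);
// autoSave(sessionState); // safe to call on every edit
```

Wrapping the save call this way means rapid edits collapse into a single write, which is what prevents data loss without hammering the database.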

Everything runs on Vercel, which enables near-instant deployments and ensures that our app stays globally available and performant. The result is our full-fledged platform that combines next-gen AI, interactive graphics, and robust backend systems, all put together to deliver instant, actionable market insights.

The Globe

Thought I’d do a separate writeup on the globe itself, as a lot of people were curious about it! Building the interactive 3D globe for Tunnel was honestly one of my favorite parts of the project. I wanted users to instantly visualize how their product idea resonated with different personas around the world, so I designed the globe using Three.js and React Three Fiber for rendering and interactivity. Each persona is represented as a point on the globe, color-coded by their reaction—green for interested, yellow for neutral, red for not interested—so you can see global trends and outliers at a glance.
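The color coding itself is just a lookup from a persona's reaction to a point color. A tiny sketch (the hex values and the fallback are assumptions for illustration, not Tunnel's exact palette):

```javascript
// Sketch of the reaction-to-color mapping for persona points on the globe.
// Hex values are illustrative, not the exact palette from Tunnel.
const REACTION_COLORS = {
  interested: 0x22c55e,       // green
  neutral: 0xeab308,          // yellow
  'not interested': 0xef4444, // red
};

function personaPointColor(reaction) {
  // Grey fallback for reactions we have not mapped yet.
  return REACTION_COLORS[reaction] ?? 0x9ca3af;
}
```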

    const wireframeMaterial = new THREE.MeshBasicMaterial({
      color: color,
      wireframe: true,
      transparent: true,
      opacity: 0.6
    });

    // Create wireframe sphere
    const wireframeSphere = new THREE.Mesh(sphereGeometry, wireframeMaterial);
    globeGroup.add(wireframeSphere);

    // Create latitude lines
    const createLatitudeLines = () => {
      const latitudes = [];
      for (let i = -80; i <= 80; i += 20) {
        const phi = (90 - i) * (Math.PI / 180);
        const radius = Math.sin(phi);
        const y = Math.cos(phi);

        const curve = new THREE.EllipseCurve(
          0, 0,
          radius, radius,
          0, 2 * Math.PI,
          false,
          0
        );

Wireframe Sphere Creation (Code Snippet)

A lot of people have also asked how the outlines for the continents were rendered on the globe, so here’s how it works. I found a very (and I mean very) large .json file that maps thousands of coordinates into a polygonal outline for each continent. Refer below for its basic structure.

    {
      "type": "FeatureCollection",
      "features": [
        { "type": "Feature", "properties": { "CONTINENT": "Asia" },
          "geometry": { "type": "MultiPolygon", "coordinates": [ [ [
            [ 93.275543212890625, 80.26361083984375 ],
            [ 93.148040771484375, 80.313873291015625 ],
            [ 91.424911499023438, 80.31011962890625 ],
            ... millions more ... ] ] ] } },
        { "type": "Feature", "properties": { "CONTINENT": "North America" },
          "geometry": { "type": "MultiPolygon", "coordinates": [ [ [
            [ -25.281669616699219, 71.39166259765625 ],
            [ -25.623889923095703, 71.537200927734375 ],
            [ -26.950275421142578, 71.578598022460938 ],
            [ -27.693889617919922, 71.930267333984375 ],
            ... ] ] ] } },
        { "type": "Feature", "properties": { "CONTINENT": "Europe" },
          "geometry": { "type": "MultiPolygon", "coordinates": [ [ [
            [ 58.061378479003906, 81.687759399414062 ],
            [ 57.889858245849609, 81.709854125976562 ],
            [ 59.435546875, 81.819297790527344 ],
            [ 59.159713745117188, 81.728866577148438 ],
            ... ] ] ] } },
        ... Africa, South America, Oceania, Australia, Antarctica ...
      ]
    }

Using this json file, I created a loadGeoJsonData() function which is basically responsible for loading up the continent borders in GeoJSON format. When the app starts or needs to render the globe, it calls this function, which reads the continents.json sitting on the server. If everything goes smoothly, it parses and returns the JSON data, which contains all the coordinates needed to draw those outlines on the 3D globe.
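The function name loadGeoJsonData() comes from the description above, but the body below is an assumed sketch: it can be little more than a fetch plus a sanity check on the GeoJSON shape.

```javascript
// Assumed sketch of loadGeoJsonData(): fetch continents.json from the server
// and verify it looks like a GeoJSON FeatureCollection before returning it.
async function loadGeoJsonData(url = '/continents.json') {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Failed to load continent data: HTTP ${response.status}`);
  }
  const data = await response.json();
  if (data.type !== 'FeatureCollection' || !Array.isArray(data.features)) {
    throw new Error('continents.json is not a valid FeatureCollection');
  }
  return data; // features[i].geometry.coordinates holds the outline polygons
}
```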

The next step, and probably the most complex part, is to map those geographic coordinates to actual 3D vector coordinates, because the globe is round, not flat.

To do that, I used a simple conversion algorithm. The latLonToVector3() function converts a point on a sphere given by geographic coordinates (latitude, longitude, and radius) into a 3D Cartesian coordinate (x, y, z) as used in 3D engines like Three.js. Here's the breakdown below:

Input Parameters
lat = Latitude (in degrees), where 0 is the equator and ±90 are the poles.
lon = Longitude (in degrees), where 0 is the Greenwich meridian, ±180 is the International Date Line.
radius = The radius of the sphere (globe) on which the point lies.

Conversion Math

1. Convert latitude and longitude to radians:
\( \phi = (90 - \text{lat}) \cdot \frac{\pi}{180} \) shifts latitude so that 0° is the North Pole and 180° is the South Pole, which matches the convention for spherical coordinates in 3D graphics.
\( \theta = (-\text{lon} + 180) \cdot \frac{\pi}{180} \) negates longitude to mirror it into the scene’s orientation, then shifts by 180° so that 0° longitude faces the front.

2. Calculate Cartesian coordinates:
\( x = \text{radius} \cdot \sin(\phi) \cdot \cos(\theta) \)
\( y = \text{radius} \cdot \cos(\phi) \)
\( z = \text{radius} \cdot \sin(\phi) \cdot \sin(\theta) \)
This mapping aligns the poles and the equator correctly in a 3D scene.

3. Return Value:\( \mathrm{THREE.Vector3}(x, y, z) \)A new 3D Vector where x, y, and z are the Cartesian coordinates calculated above—this represents the position in 3D space for given (lat, lon) on a sphere.
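Putting the three steps together, latLonToVector3() is only a few lines. In Tunnel it returns a THREE.Vector3; the sketch below returns a plain {x, y, z} object so it stands alone without Three.js.

```javascript
// Convert (lat, lon) in degrees on a sphere of the given radius into
// Cartesian coordinates, following the math above.
function latLonToVector3(lat, lon, radius) {
  const phi = (90 - lat) * (Math.PI / 180);     // polar angle: 0 at the North Pole
  const theta = (-lon + 180) * (Math.PI / 180); // azimuth, mirrored and shifted 180 degrees
  return {
    x: radius * Math.sin(phi) * Math.cos(theta),
    y: radius * Math.cos(phi),
    z: radius * Math.sin(phi) * Math.sin(theta),
  };
}

// Sanity check: the North Pole (lat 90) lands on top of the sphere.
// latLonToVector3(90, 0, 1) ≈ { x: 0, y: 1, z: 0 }
```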

Runtime Agent

One of the other things I worked on was the runtime agent powering all the AI personas inside Tunnel, so I wanted to break down how it actually works behind the scenes. The runtime agent is basically the heart of our persona simulation—it’s the thing that lets each persona have unique, consistent opinions, behaviors, and even voice conversations with users.

Whenever you submit a product idea, the agent kicks into action by grabbing all the relevant persona profiles from our database (demographics, psychographics, past interactions, and more). It fetches related agent profiles by extracting a niche from the product idea and matching it with the personas’ interests. For example, refer to the log below, where the extraction identified a Financial Technology niche from the prompt.

    ✓ Compiled /api/extract-niche in 536ms (1820 modules)
    [EXTRACT-NICHE] API endpoint called
    [EXTRACT-NICHE] Request body: { 'create genx app related to fintech in canada' }
    [EXTRACT-NICHE] Processing idea: create genx app related to fintech in canada...
    [EXTRACT-NICHE] Starting niche analysis for idea length: 44
    [EXTRACT-NICHE] Lowercase idea: create genx app related to fintech in canada
    [EXTRACT-NICHE] Analyzing 15 potential niches...
    [EXTRACT-NICHE] Financial Technology: score=1.50, keywords=[fintech]
    [EXTRACT-NICHE] Productivity & Business Tools: score=0.45, keywords=[app]
    [EXTRACT-NICHE] Best match: { name: 'Financial Technology', score: 1.5 }
    [EXTRACT-NICHE] Successfully identified niche: Financial Technology
    POST /api/extract-niche 200 in 597ms
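Our exact scoring code isn’t shown here, but the log suggests a straightforward keyword-weighting scheme. A hypothetical sketch (the niche table and weights below are made up for illustration; the real version analyzes 15 niches):

```javascript
// Illustrative keyword-weighted niche scoring; only two niches included here.
const NICHES = {
  'Financial Technology': { fintech: 1.5, banking: 1.0, payments: 1.0 },
  'Productivity & Business Tools': { app: 0.45, workflow: 1.0, saas: 1.0 },
};

function extractNiche(idea) {
  const text = idea.toLowerCase();
  let best = { name: null, score: 0, keywords: [] };
  for (const [name, weights] of Object.entries(NICHES)) {
    let score = 0;
    const matched = [];
    for (const [keyword, weight] of Object.entries(weights)) {
      if (text.includes(keyword)) {
        score += weight;
        matched.push(keyword);
      }
    }
    // Keep the highest-scoring niche seen so far.
    if (score > best.score) best = { name, score, keywords: matched };
  }
  return best;
}

// extractNiche('create genx app related to fintech in canada')
// → { name: 'Financial Technology', score: 1.5, keywords: ['fintech'] }
```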

From there, we filter through all of the agents using a quick search algorithm that matches the niche against each agent’s interests via Cohere’s reranker. You can see a small snippet of an agent’s persona below, which we generate and store as metadata for each agent. We store these as ‘users’ in the Auth0 user database.

    "location": {
      "city": "Toronto",
      "country": "Canada",
      "coordinates": { "longitude": -79.3832, "latitude": 43.6532 }
    },
    "demographics": {
      "generation": "Gen X",
      "gender": "Male",
      "ageRange": "45-50"
    },
    "professional": {
      "seniority": "C-Level",
      "primaryIndustry": "Fintech",
      "secondaryIndustry": "Leadership",
      "companySize": "100-500",
      "yearsExperience": 22
    },
    "psychographics": {
      "techAdoption": 7,
      "riskTolerance": 9,
      "priceSensitivity": 3,
      "influenceScore": 9,
      "brandLoyalty": 8
    },
    "interests": [ "Strategy", "Innovation", "Golf", "Investing" ],
    "personality": {
      "openness": 0.7,
      "conscientiousness": 0.9,
      "extraversion": 0.8,
      "agreeableness": 0.6,
      "neuroticism": 0.2
    },

For every single simulation, it orchestrates multiple stages: first, it runs Cohere’s reranking to figure out which personas actually care about this idea, then it generates tailored reactions using our AI pipelines. Each agent’s response isn’t generic—it’s built from the persona’s attributes, combined with pattern recognition and sentiment extraction, so every reply feels unique and grounded.

    [SELECT-NICHE-USERS] Cohere rerank completed, got 25 results
    [SELECT-NICHE-USERS] Relevance scores: {
      count: 25,
      average: '0.341',
      max: '0.866',
      min: '0.249',
      highRelevance: 1,
      mediumRelevance: 2,
      lowRelevance: 22
    }
    [SELECT-NICHE-USERS] Top 5 selected users:
    1. James Wilson (0.866) - CEO
    2. Derek Ward (0.759) - Learning Experience Designer
    3. Michelle Nelson (0.561) - Financial Analyst
    4. Tessa Watson (0.392) - Policy Analyst
    5. Jared Bishop (0.336) - Regulatory Compliance Officer
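Once the rerank scores come back, bucketing them and keeping the top N is simple. A sketch of that post-processing (the 0.7 and 0.4 thresholds are assumptions and don’t necessarily match the cutoffs behind the log above):

```javascript
// Sketch of post-rerank selection: summarize relevance buckets and keep
// the highest-scoring users. Threshold values are illustrative assumptions.
function selectTopUsers(results, topN = 5) {
  const sorted = [...results].sort((a, b) => b.score - a.score);
  return {
    selected: sorted.slice(0, topN),
    summary: {
      count: results.length,
      highRelevance: results.filter(r => r.score >= 0.7).length,
      mediumRelevance: results.filter(r => r.score >= 0.4 && r.score < 0.7).length,
      lowRelevance: results.filter(r => r.score < 0.4).length,
    },
  };
}
```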

But we didn’t stop at just text. When you want to actually “call” a persona, the runtime agent passes all of their context (personality, history, specific feedback) into Vapi, the AI voice agent, which then transforms that into a real-time, dynamic voice call right in the browser. All the state, conversation transcripts, and even evolving feedback are instantly synced, so the agent can remember what’s happened before and respond accordingly in future sessions.

Everything the agent does happens in real-time, with results stored, tracked, and sent back to the user. This means you’re never stuck waiting or wondering if the system is keeping up. The end result is that every simulated persona feels alive, coherent, and actually grows over time, making the whole market simulation so much more real and useful.

Ending Remarks

I walked into the hackathon with a brand-new team (shoutout @Krish Garg, @Suneru Perera & @Haresh Goyal), completely different ambitions, a Google Doc full of big ideas, and absolutely no sleep in sight. As one of the few hackers from a “non-target” university, it was definitely a whole different atmosphere. The ideas, the skill level, and the execution were all on another level. Our group wasn’t shy about setting wild goals; actually, we were pretty loud about it. We wanted to win, yes, but more than that, we wanted to build something that felt genuinely new.

We ended up pouring everything into Tunnel, a platform that helps makers validate product ideas and features in seconds with AI-driven, real-market personas. And the loss of sleep? Totally worth it. When they announced us as winners of both the MLH Track and the Best Use of Vapi (AI Voice Agent) awards, we were stunned into silence (something that rarely happened over the weekend).

The biggest plot twist? Out of the 100+ projects submitted to the Y Combinator track, we were shortlisted as one of the top 10 teams for an interview! We sat down with none other than Nicolas Dessaigne and Andrew Miklas from Y Combinator to talk about the future of our project. Having YC interviewers poke holes in your pitch is nerve-wracking and surreal: one moment you’re just a sleep-deprived student, the next you’re tossing ideas around with people who’ve seen a thousand startups rise and fall. Winning big, meeting genuine legends, and realizing how much is possible when you just show up and start building—it made every hour totally, absolutely worth it.