Dira: An AI-Powered Insights Engine Built from Scratch (Next.js + Python)

I identified a discovery gap and taught myself Python, Next.js, and RAG architecture to build a functional AI product.

My Role

Founder & Full-Stack Product Designer

I wore every hat—from product strategy and UI design to writing the production code (Python/Next.js) that powered the AI insights engine.

Tech Stack

  • Python

  • Next.js

  • React

  • OpenAI API

  • RAG (Retrieval-Augmented Generation)

  • Pinecone (Vector Database)

  • MongoDB (MongoEngine)

  • Prisma

The Punchline

The Gap: Discovery is often slow and painful, not because data doesn't exist, but because it is inaccessible. Valuable insights are locked away in thousands of unread call transcripts, forcing PMs to re-run research instead of leveraging existing knowledge.

The Build: Instead of just designing a mockup, I taught myself Python and Next.js to build a fully functional AI engine. I architected a RAG system using Pinecone and OpenAI to ingest calls and auto-generate strategic themes.

The Validation: The concept gained traction with early users and won 2nd Runner-Up at CoLaunch (a tech startup accelerator), validating the problem space.

The Strategic "Kill": Despite the technical success, I identified a critical go-to-market barrier: the high friction of accessing data from walled gardens like Salesforce and Gong. I made the difficult decision to sunset the project, proving that I value business viability over the "sunk cost" of my own code.

The Challenge: The "Buried Treasure" Problem

The Business Context: Unstructured Data = Lost Value

In B2B SaaS, the Customer Experience (CX) and Sales teams spend hours every day talking to users. These calls contain the "gold" of product strategy—pain points, feature requests, and workflow gaps.

  • The Problem: This data is unstructured (audio/video) and high-volume. Once a call ends, it becomes a "black box."

  • The Consequence: Product Managers and Designers often embark on lengthy, net-new discovery cycles to answer questions that have already been answered in previous sales calls. We were wasting time "rediscovering" the wheel.

The User Pain: Manual Consumption Doesn't Scale

For a PM to extract these insights manually, the ratio is 1:1—to analyze 10 hours of calls, they have to spend 10 hours listening.

  • The Friction: There was no way to "query" the collective knowledge of the organization.

  • The Hypothesis: If I could use LLMs to transcribe, vectorize, and query this data, I could reduce the "Time to Insight" from hours to seconds, turning a repository of recordings into an interactive insights engine.

The Old Way (Manual Analysis) vs. The New Way (AI-Augmented Discovery).

The Approach: Engineering as a Design Tool

The Build Strategy: Validation over Simulation
As a designer, my instinct was to prototype in Figma. However, a mockup couldn't answer the core risk: Can AI accurately group disparate conversations into coherent themes?

  • The Decision: I chose to build a functional MVP to validate the technical feasibility, not just the user interface.

  • The "Builder" Mindset: This wasn't just about product validation; it was about resourcefulness. Working solo without a budget or co-founder, I couldn't wait for a dev team. I realized that learning to code was the fastest way to bridge the gap between my idea and reality.

Architecting the RAG Pipeline

The technical core of Dira was a Retrieval-Augmented Generation (RAG) pipeline.

  • The Challenge: It wasn't enough to just "chat with a PDF." I needed the system to remember past insights and check if a new insight belonged to an existing theme.

  • The Solution: I used Pinecone to vectorize insights. When a new transcript was processed, the system queried the vector database for semantic similarities, grouping related feedback into "Themes" automatically.

  • The Complexity: This required orchestrating a complex flow between MongoDB (via MongoEngine), the OpenAI API, and the React frontend.
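The theme-matching step above can be sketched in a few lines of Python. This is an illustrative, in-memory stand-in for demonstration only: the real build used OpenAI embeddings and a Pinecone index, and the class names, centroid logic, and 0.8 threshold here are assumptions, not the production code.

```python
import math

SIMILARITY_THRESHOLD = 0.8  # illustrative; a real threshold is tuned empirically


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


class ThemeStore:
    """In-memory stand-in for the vector index of existing themes."""

    def __init__(self):
        self.themes = {}  # theme name -> list of (vector, insight text)

    def assign(self, name_hint, vector, insight):
        # Query existing themes for the closest centroid.
        best_name, best_score = None, 0.0
        for name, members in self.themes.items():
            centroid = [sum(col) / len(members)
                        for col in zip(*(v for v, _ in members))]
            score = cosine(vector, centroid)
            if score > best_score:
                best_name, best_score = name, score
        # Above the threshold: the new insight joins an existing theme.
        if best_name is not None and best_score >= SIMILARITY_THRESHOLD:
            self.themes[best_name].append((vector, insight))
            return best_name
        # Otherwise it seeds a new theme.
        self.themes[name_hint] = [(vector, insight)]
        return name_hint
```

The same query-then-decide shape applies with Pinecone: query for nearest neighbors, and only create a new theme when nothing similar already exists.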

Navigating the "Documentation Desert"

Building this solo meant hitting walls where documentation was sparse or overly dense.

  • The Accelerator: I used LLMs (ChatGPT) as a "Technical Co-Founder" to help me parse complex library documentation and debug obscure Python errors. This experience proved that with the right AI tools, a designer can bridge the gap to engineering faster than ever before.

The Work: Designing for AI Trust

The UX Challenge: The "Black Box" Problem
The biggest risk with AI is hallucination. If a PM couldn't verify why the AI grouped 5 calls into a "Pricing Complaint" theme, they wouldn't trust the insight.

  • The Solution: I designed a "Source of Truth" UI pattern.

  • The Interaction: Every AI-generated summary was interactive. Clicking a claim (e.g., "Users hate the login flow") instantly expanded the accordion to reveal the verbatim customer quotes and timestamps that generated it.

  • The Impact: This transparency turned the tool from a "magic box" into a "cited research assistant," building trust with skeptical PMs.

The "Source of Truth" pattern allowed PMs to verify AI claims by clicking through to the original customer quotes.
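The data shape behind that pattern is simple: every AI claim keeps pointers back to the verbatim quotes that produced it, so the UI always has evidence to expand into. A minimal sketch, with field names that are illustrative rather than the actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class SourceQuote:
    call_id: str    # which recorded call the quote came from
    timestamp: str  # position in the call, e.g. "00:14:32"
    verbatim: str   # the customer's exact words


@dataclass
class Claim:
    summary: str  # the AI-generated statement shown in the UI
    sources: List[SourceQuote] = field(default_factory=list)

    def cite(self) -> List[Tuple[str, str]]:
        """The evidence a PM sees when the accordion expands."""
        return [(q.timestamp, q.verbatim) for q in self.sources]
```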

Visualizing the Invisible

I needed a way to show the "weight" of an insight.

  • The Dashboard: I designed a table-based layout where themes were ranked by "frequency" and "revenue impact" (simulated). This allowed PMs to see not just what people were saying, but how much it mattered to the bottom line.
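The ranking itself was straightforward to express. A sketch of the sort order, assuming each theme carries a mention count and a simulated revenue figure (the field names and sample data are illustrative):

```python
def rank_themes(themes):
    """Order themes so the highest-impact ones surface first.

    `themes` is a list of dicts with 'name', 'frequency' (mention count),
    and 'revenue_impact' (simulated dollars tied to the accounts involved).
    Frequency is the primary key; revenue breaks ties.
    """
    return sorted(
        themes,
        key=lambda t: (t["frequency"], t["revenue_impact"]),
        reverse=True,
    )


themes = [
    {"name": "Login friction", "frequency": 5, "revenue_impact": 40_000},
    {"name": "Pricing complaints", "frequency": 12, "revenue_impact": 250_000},
    {"name": "Export bugs", "frequency": 12, "revenue_impact": 90_000},
]
```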

The Outcome: Success is Knowing When to Stop

The Validation

  • Technical Proof: I built a working end-to-end pipeline. The system could ingest raw audio, transcribe it, vectorize it, and output coherent, cited themes.

  • Market Recognition: The concept and problem space were robust enough to secure 2nd Runner-Up at CoLaunch, a competitive tech accelerator, confirming that the "Discovery Bottleneck" is a massive, recognized pain point.

The Strategic Decision

Despite the technical win, I made the hard decision to sunset the project.

  • The Barrier: The pilot revealed a critical friction point: Input. Valuable data lives in walled gardens like Salesforce and Gong. Getting approval as an integration partner and building enterprise-grade connectors was a massive engineering hurdle that required a full team, not a solo founder.

  • The Lesson: A great product needs more than great tech; it needs effortless distribution. Without seamless integration, the manual upload friction was too high for busy PMs.

My Takeaway

I walked away with something more valuable than a startup: Technical Empathy. I now understand the constraints my engineers face—latency, token limits, database structures—because I've faced them myself. This makes me a better, more pragmatic design partner today.
