Meet AI - AI-powered video call application
Meet AI is a fully AI-powered video call application that enables real-time conversations with intelligent agents tailored for specific roles such as Language Tutor, Therapy Assistant, Training Coach, and more. Unlike typical chatbots or apps focused on post-call summaries, Meet AI delivers a truly live, interactive experience, allowing users to engage directly with specialized AI personalities. After each session, users gain access to meeting summaries, transcripts, and ongoing AI support, making it easy to track progress, revisit key insights, and continue learning or growing beyond the call.
The purpose of Meet AI is to reimagine how people interact with artificial intelligence, not through static text or delayed responses, but through live, human-like conversations that feel intuitive and meaningful. By giving each AI agent a clear role and personality, the app is designed to support users in personal growth, emotional support, learning, and self-improvement. Whether someone wants to practice a new language, talk through challenges, or stay accountable to their goals, Meet AI offers an always-available, judgment-free space to connect and grow.
As a sole developer, I was responsible for project development, including UI/UX design, frontend and backend development, database management, and deployment.
The development of Meet AI began with a simple goal: to create a seamless, real-time conversation experience between users and intelligent digital agents through live video. The aim was to give users the feeling of speaking with a human — someone who listens, responds naturally, and creates a space where they feel free to share.
I began by designing the architecture using Next.js and React, prioritizing performance, modularity, and flexibility for future growth. With Shadcn UI and Tailwind CSS, I focused on building clean, accessible, and mobile-friendly interface components, especially within the dashboard, where most user interaction occurs. Real-time communication was enabled through the Stream SDK, while OpenAI powered the content and responses of the AI agents. From there, I built the backend infrastructure using Drizzle ORM, Neon, and PostgreSQL to ensure reliable and scalable data handling. I integrated tRPC with TanStack Query to enable full-stack type safety and smooth data fetching from backend to frontend.
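The full-stack type-safety idea can be illustrated in plain TypeScript (the names below are hypothetical, not the app's actual router): the server defines a handler once, and the client derives its input and output types instead of redeclaring them.

```typescript
// Illustrative sketch of full-stack type inference (hypothetical names, not
// the app's actual tRPC router): the server defines a handler once, and the
// client derives its input/output types instead of redeclaring them.
type Meeting = { id: string; title: string; status: "upcoming" | "completed" };

// Stand-in for a tRPC procedure resolver on the server.
function getMeeting(input: { id: string }): Meeting {
  return { id: input.id, title: "Spanish practice", status: "completed" };
}

// The client infers types from the server-side definition, so changing the
// resolver's signature immediately surfaces type errors on the client.
type GetMeetingInput = Parameters<typeof getMeeting>[0];
type GetMeetingOutput = ReturnType<typeof getMeeting>;

const input: GetMeetingInput = { id: "m_1" };
const meeting: GetMeetingOutput = getMeeting(input);
```

With tRPC, this inference happens automatically from the `AppRouter` type, which is what removes the duplicated interface definitions between backend and frontend.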
The post-call experience was also a key part of the user journey. To help users keep track of their sessions, I implemented Inngest to trigger background jobs on key events, such as when a video call ends, enabling asynchronous workflows like fetching transcripts from Stream and generating AI-powered summaries without disrupting the main user experience. Zod was used for consistent data validation across the app, and Polar handled monetization and payment integration.
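The shape of that background pipeline, reduced to a hedged sketch with stand-in functions (not the actual Inngest handler), looks like this:

```typescript
// Hedged sketch of the post-call flow (stand-in functions, not the actual
// Inngest handler): when a call-ended event arrives, transcript fetching and
// summarization run in the background, off the request path.
type CallEndedEvent = { meetingId: string };

async function fetchTranscript(meetingId: string): Promise<string> {
  // Stand-in for pulling the finished transcript from Stream.
  return `transcript for ${meetingId}`;
}

async function summarizeTranscript(transcript: string): Promise<string> {
  // Stand-in for an OpenAI summarization call.
  return `summary of: ${transcript}`;
}

async function onCallEnded(event: CallEndedEvent): Promise<string> {
  const transcript = await fetchTranscript(event.meetingId);
  return summarizeTranscript(transcript);
}
```

Because the whole chain runs in a background function rather than a request handler, a slow summarization call never blocks the user leaving the meeting.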
In building Meet AI, I focused not only on technical performance but also on creating a product that feels genuinely human, helpful, and thoughtful from start to finish.
Inngest: While building an Inngest function to summarize meeting transcripts using GPT-4o, I encountered several challenges: parsing large JSONL transcripts, resolving speaker identities across multiple tables (user and agent), and formatting the model's input and output reliably. Long transcripts also risked exceeding token limits, and debugging the async workflow was complex.
I overcame this by modularizing the workflow using step.run(), which isolates each task - fetching, parsing, speaker mapping, and summarizing - making it easier to debug and replay. I merged user and agent data safely, adding fallbacks to prevent runtime errors. To ensure consistent agent results, I formatted input with markdown structure and validated outputs with type-safe parsing.
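The parsing and speaker-mapping steps can be illustrated with a small, self-contained sketch (the shapes and names here are hypothetical):

```typescript
// Hedged sketch of the transcript-processing steps (hypothetical shapes):
// each JSONL line carries a speaker_id that must be resolved against both
// the user and agent tables, with a fallback so an unknown id never crashes
// the background job.
type TranscriptLine = { speaker_id: string; text: string };
type Speaker = { id: string; name: string };

// Parse a JSONL payload: one JSON object per non-empty line.
function parseJsonl(raw: string): TranscriptLine[] {
  return raw
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as TranscriptLine);
}

// Merge users and agents into one lookup, then resolve each line's speaker.
function resolveSpeakers(
  lines: TranscriptLine[],
  users: Speaker[],
  agents: Speaker[],
): { name: string; text: string }[] {
  const byId = new Map([...users, ...agents].map((s) => [s.id, s.name]));
  return lines.map((l) => ({
    name: byId.get(l.speaker_id) ?? "Unknown speaker", // fallback prevents runtime errors
    text: l.text,
  }));
}
```

In the real workflow, each of these stages would live inside its own step.run() call so Inngest can retry and replay them independently.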
tRPC + TanStack Query: Combining tRPC with TanStack Query was powerful but tricky at first. I had difficulties with query key naming, invalidation logic, and maintaining type safety across client and server.
To solve this, I built custom wrappers around trpc.useQuery() and trpc.useMutation() that standardized query keys and caching strategies. I also took full advantage of inferred types from AppRouter, which reduced duplication and runtime errors. Reading through tRPC's examples and TanStack Query's stale-time and invalidation patterns helped me get confident using both together.
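The query-key convention behind those wrappers can be sketched as a small helper (a hypothetical illustration, not tRPC's own API): deriving keys from the procedure path keeps invalidation logic consistent everywhere.

```typescript
// Hypothetical sketch of a standardized query-key convention (not tRPC's
// actual API): every key is derived from the procedure path plus its input,
// so cache entries can be targeted predictably when invalidating.
function makeQueryKey(path: string, input?: unknown): readonly unknown[] {
  return input === undefined ? [path] : [path, input];
}

// A list query and a single-item query share the same convention, so
// invalidating one meeting never accidentally matches the wrong cache entry.
const listKey = makeQueryKey("meetings.list");
const getKey = makeQueryKey("meetings.get", { id: "m_1" });
```

A wrapper hook can then pass these keys to TanStack Query, so invalidation after a mutation only needs the procedure path rather than a hand-written key.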
Building Meet AI was a major learning experience that challenged me to work across the full stack—from real-time video and AI integration to managing async workflows and type-safe APIs. I faced a lot of unfamiliar tools and concepts, but by breaking problems down and learning from documentation and examples, I was able to build something meaningful. This project helped me gain confidence in using modern technologies and deepened my understanding of how to structure and ship a complete product.