How Theo (T3 GG) Built T3 Chat in 5 Days
Building T3 Chat: A 5-Day Devlog of a Lightning-Fast AI Chat App (Beginner-Friendly)
Introduction
In this blog post, we’ll explore the development journey of T3 Chat, a blazing-fast AI chat application built in just five days. This project aimed to create a more fluid and responsive experience than existing AI chat tools. We’ll dive into the daily challenges, architectural decisions, and technical breakthroughs that made this rapid development possible. By the end of this post, you’ll have a good overview of the development process, understand key technical decisions, and how they contributed to the speed and responsiveness of T3 Chat.
Estimated Read Time: 25-30 minutes
Prerequisites: Basic knowledge of JavaScript, React, and web development concepts is helpful.
Day 1: Laying the Foundation
1. Initial Exploration with DeepSeek
- The project started with the discovery of DeepSeek V3, an open-source AI model that offered impressive speed and quality comparable to models like Claude.
- The existing chat interface for DeepSeek was clunky and inefficient, which inspired the need to build a better UI.
- Existing open-source starter kits for AI chat apps were not suitable, as they were designed for traditional setups, not the local-first approach desired for this project.
2. Scaffolding with Vercel AI SDK
- The initial UI was set up using Vercel’s AI SDK, which provided a basic structure.
- The Vercel AI SDK had significant limitations when it came to local syncing and more advanced features.
- A key decision was to make navigation entirely client-side for responsiveness.
3. Client-Side Routing and Data Sync
- React Router was used for all routing, ensuring the server wasn’t involved in navigation.
- A sync layer was built with React Context, which was later deemed insufficient for a local-first app.
- A rough version of the application was operational, with basic chat functionality.
- There was experimentation with different data storage options, including a Neon instance with a schema, but the team eventually settled on Upstash Redis as a KV store.
Tip: Local-first applications can greatly benefit from client-side routing and local data syncing, which results in a more immediate and responsive user experience.
4. End of Day 1
- A basic chat UI was in place, including syntax highlighting and the ability to send messages.
- The data sync mechanism was not fully functional, requiring a significant overhaul in the following days.
- The day ended with a list of to-dos and a well-deserved rest.
Day 2: Teamwork and Architectural Overhaul
1. Collaboration with the CTO
- The CTO, Mark, was brought into the project for additional support.
- A detailed list of improvements and issues was compiled.
- A major UI revamp followed, introducing tabs for navigation and an improved chat box.
2. Introduction of Dexie.js
- Dexie.js, a wrapper around IndexedDB, was chosen for local data storage.
- Dexie.js provided an elegant API for handling complex data models and syncing.
- The app’s architecture was restructured around projects, threads, and messages.
- Specific functions were created for handling new messages and threads.
- `useLiveQuery` was used to sync Dexie updates through signals.
3. Moving Away from Vercel AI SDK for Client State
- Issues with the message types and ID handling in Vercel AI SDK led to its removal for client-side state.
- The shift to Dexie allowed streaming updates directly to local storage, improving performance and stability.
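A sketch of what "streaming updates directly to local storage" can look like. This is a hypothetical shape (the post describes doing this with Dexie; here a `Map` stands in for IndexedDB, and the function names are invented for illustration):

```typescript
// Each streamed token is appended to the stored message; a live query layer
// (e.g. Dexie's useLiveQuery) would then re-render the UI from this single
// source of truth, instead of holding streaming state in React.
interface Message {
  id: string;
  content: string;
  done: boolean;
}

const messages = new Map<string, Message>();

function startAssistantMessage(id: string): void {
  messages.set(id, { id, content: "", done: false });
}

function appendToken(id: string, token: string): void {
  const msg = messages.get(id);
  if (!msg) throw new Error(`unknown message ${id}`);
  msg.content += token;
}

function finishMessage(id: string): void {
  const msg = messages.get(id);
  if (msg) msg.done = true;
}
```

Because the store is the source of truth, an interrupted stream leaves a usable partial message behind rather than losing in-memory state.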
4. User Feedback
- The app was stable enough to share with people for feedback.
- The name T3 Chat was finalized and implemented.
- The development team started using T3 Chat for daily dev work.
Warning: Data sync can be very challenging in local-first applications. Consider tried-and-tested solutions like IndexedDB (via Dexie.js) or CRDTs for better performance and stability.
Day 3: Gutting and Refinement
1. Feedback at Vercel
- The team visited Vercel to discuss frustrations with their SDK.
- Vercel committed to making meaningful changes as a result of the issues reported.
2. Removing Remaining Vercel SDK Pieces
- Effort was made to move all remaining parts of the Vercel SDK into the Dexie layer.
3. Fighting False Positives
- A significant portion of the day was spent addressing issues with anti-malware software incorrectly flagging the team’s software as malicious.
4. Local Storage and Auth
- A local-first authentication layer was implemented using cookies and local storage.
- The decision was made to handle auth locally to avoid server-side dependencies.
Tip: When building local-first apps, consider libraries that keep auth local instead of relying on traditional server-side auth.
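As a tiny illustration of "auth without a server round trip", here is a cookie-parsing helper plus a client-side login check. This is purely a sketch; the post doesn't specify T3 Chat's actual auth implementation, and the `session` cookie name is an assumption:

```typescript
// Parse a Cookie header string into a key/value record.
function parseCookies(cookieHeader: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const part of cookieHeader.split(";")) {
    const eq = part.indexOf("=");
    if (eq === -1) continue;
    const key = part.slice(0, eq).trim();
    const value = decodeURIComponent(part.slice(eq + 1).trim());
    if (key) out[key] = value;
  }
  return out;
}

// The client can decide "is this user logged in?" locally, with no fetch:
function isLoggedIn(cookieHeader: string): boolean {
  return Boolean(parseCookies(cookieHeader)["session"]);
}
```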
5. Progress Summary
- Only a small UI improvement was made (the delete message button) due to the other challenges.
- The authentication layer was functional, but not in its ideal state.
Day 4: Stream Day and Rethinking Decisions
1. Stream Day
- Development was slow due to a scheduled stream and other activities.
- Spent time at the Vercel office and hung out with the Laravel team.
2. Reverting off Next.js
- A lot of time was spent moving away from Next.js and then back again, due to struggles getting streaming to work well outside of it.
- The choice was made to temporarily stick with Next.js and refactor this in the future.
3. Focus on Core Features
- Most of the day was dedicated to polishing auth and streaming functionality.
- Linear (issue tracking) was set up to keep track of bugs and tasks.
- The React Compiler was enabled for optimization.
- The primary achievement of the day was getting a functional auth layer.
Warning: Don’t be afraid to change your mind and try a different approach, especially if the current solution is too complicated.
Day 5: Syncing and Performance
1. Sync Layer
- Significant time spent on building a client-side sync layer.
- Multiple options were considered, including Zero and Jazz.
- Zero was too hard to set up, and its source-of-truth model was a poor fit for a local-first app.
2. Jazz Investigation
- Jazz was explored but ultimately abandoned because of its focus on collaboration.
- Its data model was complicated, with a hierarchical structure.
- Jazz’s reliance on authenticated users hindered the user experience of the project.
3. Custom Dexie Sync Layer
- A custom Dexie sync layer was built because generic solutions did not meet the specific requirements.
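A minimal sketch of what a custom pull-based sync step might look like. The record shape, last-write-wins rule, and function names are all assumptions for illustration; the post doesn't show the real sync layer's code:

```typescript
// A record that both the server and the local (Dexie-like) table understand.
interface SyncedRecord {
  id: string;
  updatedAt: number; // ms timestamp set by the writer
  data: string;
}

const localTable = new Map<string, SyncedRecord>(); // stand-in for a Dexie table
let lastSyncedAt = 0;

// The server would return only records changed since the given timestamp.
function pullChanges(server: SyncedRecord[], since: number): SyncedRecord[] {
  return server.filter((r) => r.updatedAt > since);
}

function applyChanges(changes: SyncedRecord[]): void {
  for (const rec of changes) {
    const existing = localTable.get(rec.id);
    // Last-write-wins: only overwrite if the incoming record is newer.
    if (!existing || rec.updatedAt > existing.updatedAt) {
      localTable.set(rec.id, rec);
    }
    lastSyncedAt = Math.max(lastSyncedAt, rec.updatedAt);
  }
}
```

Tracking `lastSyncedAt` keeps each pull incremental, so sync cost scales with what changed rather than with the size of the dataset.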
4. Model Performance
- The team started to test the performance of different models.
- The performance of DeepSeek V3 slowed down over time.
- Azure was used to host the GPT-4o model.
- GPT-4o mini was chosen as the launch model because it was fast and affordable.
Tip: Benchmark different models and providers to ensure your application performs optimally and fits your budget.
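When benchmarking models and providers, two numbers matter most for a chat UI: time to first token (how long the user stares at an empty bubble) and streaming throughput. A small illustrative helper, with invented names and millisecond timestamps:

```typescript
// Timing data collected for one streamed completion.
interface StreamTiming {
  requestSentAt: number;
  firstTokenAt: number;
  lastTokenAt: number;
  tokenCount: number;
}

// Latency before anything appears on screen.
function timeToFirstTokenMs(t: StreamTiming): number {
  return t.firstTokenAt - t.requestSentAt;
}

// How fast the answer fills in once streaming starts.
function tokensPerSecond(t: StreamTiming): number {
  const streamMs = t.lastTokenAt - t.firstTokenAt;
  return streamMs > 0 ? (t.tokenCount / streamMs) * 1000 : t.tokenCount;
}
```

Comparing these per model/provider (rather than a single end-to-end number) makes trade-offs like the DeepSeek slowdown mentioned above easier to spot.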
5. Final UI Touch
- A proper homepage was created and added.
- A collapse toggle was added for the sidebar.
Day 6: The Final Grind and Performance Tweaks
1. Marathon Development Day
- The final day was a marathon of coding and UI work, with much of the UI overhaul done by Mark.
- The input box was styled to resemble that of Claude, a competing chat app.
- The sidebar was refactored to accommodate auth information.
2. Stripe and Payments
- Stripe integration was done for subscription payments.
- The initial payment flow had an issue where it did not flag users as paid.
3. Onboarding
- An onboarding flow was created with three introductory messages to explain the app’s features instead of a traditional homepage.
4. React Performance Optimization
- React performance expert Aiden was consulted to optimize the app.
- Markdown chunking was implemented using the `marked` lexer.
- Rendering was optimized by memoizing and only re-rendering changed blocks.
5. Final Touches
- A React Scan environment variable was added for performance monitoring.
- The result was a smooth user experience at 60 FPS, even under CPU throttling.
Tip: Memoization and chunking are critical techniques for optimizing rendering performance in React apps, especially when dealing with dynamic content and real-time updates.
6. Custom Data Layer
- A custom `useQueryWithLocalCache` data hook was created.
- It shows a default (cached) state while the server query is being fetched.
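The cache-then-network pattern behind a hook like that can be sketched in plain TypeScript. The post names the hook but doesn't show its code, so the shape below is a hypothetical illustration:

```typescript
const queryCache = new Map<string, string>();

// Deliver the cached value immediately (if any), then refresh in the
// background and deliver the fresh value when it arrives.
function queryWithLocalCache(
  key: string,
  fetcher: () => Promise<string>,
  onData: (value: string, fromCache: boolean) => void,
): void {
  const cached = queryCache.get(key);
  if (cached !== undefined) {
    onData(cached, true); // UI has something to show right away
  }
  void fetcher().then((fresh) => {
    queryCache.set(key, fresh);
    onData(fresh, false); // UI updates again with fresh data
  });
}
```

A React hook version would call `onData` via `setState`, giving the instant-then-accurate behavior the post describes.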
Conclusion and Next Steps
The result of this intense five-day effort is T3 Chat, a local-first AI chat application that excels in speed and responsiveness. The development process involved several key technical decisions:
- Local-first architecture
- Client-side routing
- Custom data syncing with Dexie.js
- Performance optimization with chunking and memoization
- Careful model selection
- A robust authentication solution
According to the team, who compared it against other chat apps, T3 Chat is the fastest AI chat app available.
Further learning
- Learn more about data syncing with Dexie.js and IndexedDB.
- Research different local-first architecture patterns and approaches.
- Experiment with different AI models and providers.
- Practice implementing performance optimization techniques like memoization and chunking in your React projects.
If you have tried T3 Chat and you feel the speed difference, the team would love to hear about it on their Discord.