AI Dev 25 x NYC | Tyler Slaton: Build User Facing Agentic Applications with AG UI
Key Moments
Building agentic apps with AG-UI: Streamlined user interaction and generative UI patterns.
Key Insights
Agentic applications break the traditional request-response model, requiring streaming and handling complex inputs/outputs for good user experience.
The AG-UI protocol facilitates seamless communication between agentic backends and frontends, offering a standardized event-based approach.
Generative UI encompasses static (mapping data to components), open-ended (iframes/HTML), and declarative (structured specs) patterns for agent-generated interfaces.
State management is crucial for agentic applications, allowing for memory, persistence, and bidirectional updates between the agent and the user interface.
Future trends include the rise of voice agents, agent steering for autonomous tasks, and self-improving agents through reinforcement learning from human feedback (RLHF).
CopilotKit aims to support various generative UI patterns and advance agent-user interactivity through protocols like AG-UI.
THE EVOLVING LANDSCAPE OF AGENTIC APPLICATIONS
Agentic applications represent a significant departure from traditional web paradigms, moving beyond simple request-response interactions. They are characterized by long-running processes that necessitate streaming for a fluid user experience. These applications must also handle a diverse range of inputs and outputs, including structured data, unstructured text, tool calls, and even complex state updates. Furthermore, agentic systems often involve intricate composition, requiring seamless handoffs between sub-agents and keeping the user informed throughout these processes. This complexity underscores the need for new protocols and patterns to effectively build and manage these advanced AI-driven systems.
INTRODUCING THE AG-UI PROTOCOL
To address the challenges of building agentic applications, CopilotKit developed the AG-UI protocol. This event-based protocol is specifically designed to connect agentic backends with frontends, enabling fluid and interactive user experiences. AG-UI complements existing protocols like MCP (Model Context Protocol) and A2A (Agent-to-Agent) by focusing on the crucial agent-user interaction layer. It provides a standardized set of 16 events optimized for this communication, is transport-agnostic, and supports a client-server architecture in which agents act as servers and frontends such as React apps act as clients.
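The event-based design described above can be sketched as a reducer over a typed stream. The event names below mirror the protocol's published run and text-message lifecycle events, but the reducer itself is an illustrative sketch, not the official AG-UI client:

```typescript
// Minimal sketch of consuming an AG-UI-style event stream on the frontend.
// Event names follow the protocol's lifecycle events; the reducer is illustrative.

type AgentEvent =
  | { type: "RUN_STARTED"; runId: string }
  | { type: "TEXT_MESSAGE_START"; messageId: string }
  | { type: "TEXT_MESSAGE_CONTENT"; messageId: string; delta: string }
  | { type: "TEXT_MESSAGE_END"; messageId: string }
  | { type: "RUN_FINISHED"; runId: string };

interface ChatState {
  messages: Record<string, string>; // messageId -> accumulated streamed text
  running: boolean;
}

function reduce(state: ChatState, event: AgentEvent): ChatState {
  switch (event.type) {
    case "RUN_STARTED":
      return { ...state, running: true };
    case "TEXT_MESSAGE_START":
      // Open an empty message; the UI can show a typing indicator here.
      return { ...state, messages: { ...state.messages, [event.messageId]: "" } };
    case "TEXT_MESSAGE_CONTENT":
      // Append the streamed delta so text renders token by token.
      return {
        ...state,
        messages: {
          ...state.messages,
          [event.messageId]: (state.messages[event.messageId] ?? "") + event.delta,
        },
      };
    case "TEXT_MESSAGE_END":
      return state; // message is complete; nothing to accumulate
    case "RUN_FINISHED":
      return { ...state, running: false };
  }
}
```

Because each event carries a discriminating `type`, the frontend stays a thin, predictable state machine regardless of which transport (SSE, WebSocket, etc.) delivers the stream.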
UNDERSTANDING GENERATIVE UI PATTERNS
Generative UI refers to interfaces or components generated by an AI agent and transmitted to the user. Three primary patterns exist. Static UI maps agent-generated data onto pre-existing UI components, offering deep control and enabling the visualization of tool calls and agent state. Open-ended UI lets agents return raw HTML or iframes, offering maximum flexibility but with potential challenges in styling, security, and performance, particularly in non-web environments. Declarative UI uses semi-open specifications, such as JSON schemas, to describe constrained UIs, striking a balance between the static and open-ended approaches by mapping agent-generated structures onto component libraries.
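The declarative pattern can be sketched as a constrained spec mapped onto components the frontend already trusts. The spec shape and renderers below are hypothetical, not a CopilotKit API, and rendering to HTML strings stands in for returning real framework components:

```typescript
// Declarative generative-UI sketch: the agent emits a constrained JSON spec,
// and the frontend maps each node to a known component. Spec shape is hypothetical.

type UISpec =
  | { component: "card"; title: string; children: UISpec[] }
  | { component: "text"; value: string }
  | { component: "button"; label: string };

// Render to HTML strings for illustration; a React app would return elements.
function render(spec: UISpec): string {
  switch (spec.component) {
    case "card":
      return `<section><h3>${spec.title}</h3>${spec.children.map(render).join("")}</section>`;
    case "text":
      return `<p>${spec.value}</p>`;
    case "button":
      return `<button>${spec.label}</button>`;
  }
}
```

Because the agent can only produce nodes the union type allows, the frontend keeps control over styling and security while still letting the agent compose novel layouts, which is the trade-off the declarative pattern targets.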
THE ROLE OF STATE IN AGENTIC INTERACTIONS
While LLMs are famously stateless, agentic applications are inherently stateful. This state acts as an abstraction layer, enabling crucial functionalities such as memory, message history, and persistence. State can encompass not only conversational data but also structured information like charts, user preferences, or dynamic content. CopilotKit's V2 interfaces facilitate bidirectional state management, allowing both the agent and the frontend to read and update state. This enables sophisticated collaborative experiences, where users can directly interact with and modify agent-generated content, fostering a more dynamic and responsive application.
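A minimal sketch of how snapshot-plus-delta state sync can work, assuming a JSON-Patch-style delta restricted to top-level "replace" operations (the protocol's full delta format may differ; a real client would support complete RFC 6902 paths):

```typescript
// Bidirectional state sketch: the agent sends a full STATE_SNAPSHOT once,
// then incremental STATE_DELTA patches. Only top-level "replace" ops are
// handled here -- this is a simplified assumption, not the full spec.

type Snapshot = { type: "STATE_SNAPSHOT"; state: Record<string, unknown> };
type Delta = {
  type: "STATE_DELTA";
  ops: { op: "replace"; path: string; value: unknown }[];
};

function applyEvent(
  current: Record<string, unknown>,
  ev: Snapshot | Delta,
): Record<string, unknown> {
  if (ev.type === "STATE_SNAPSHOT") {
    return { ...ev.state }; // replace everything with the authoritative snapshot
  }
  const next = { ...current };
  for (const { path, value } of ev.ops) {
    next[path.replace(/^\//, "")] = value; // "/theme" -> "theme"; top-level only
  }
  return next;
}
```

The same shape works in the other direction: when the user edits agent-generated content, the frontend can emit an equivalent delta back to the agent, which is what makes the state bidirectional.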
DEMONSTRATING AGENTIC CAPABILITIES WITH A TO-DO APP
A practical demonstration showcased a to-do application built with CopilotKit, highlighting key agentic features. The agent could add and manage to-do items, respond to queries about task status, and even perform front-end actions like changing the application's theme by updating React state. Notably, the demo illustrated a 'human-in-the-loop' scenario where the agent requested user permission before executing a significant action, such as marking all tasks as complete. This exemplifies how agents can integrate seamlessly with user workflows and learn from explicit user consent and interaction.
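The human-in-the-loop step from the demo can be sketched as a gate around tool execution. `runWithApproval` and `Decision` are hypothetical names, and a real UI would resolve the approval asynchronously through a dialog rather than a synchronous callback:

```typescript
// Human-in-the-loop sketch: before executing a significant action (e.g.
// "mark all tasks complete"), ask the UI for approval and surface the
// decision back to the agent. Names here are illustrative, not a real API.

type Decision = "approved" | "denied";

function runWithApproval(
  action: string,
  execute: () => string,
  requestApproval: (action: string) => Decision,
): string {
  const decision = requestApproval(action);
  if (decision === "denied") {
    // The refusal itself is useful feedback the agent can learn from.
    return `Skipped "${action}" (user denied)`;
  }
  return execute();
}
```

Recording these approvals and denials is also the raw signal the later RLHF discussion relies on: each gated action doubles as a labeled example of what the user actually wanted.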
THE HORIZON: VOICE, STEERING, AND SELF-IMPROVEMENT
Looking ahead, CopilotKit anticipates 2026 as the year of voice agents, leveraging the intuitiveness of voice interaction and protocols like WebRTC for seamless integration. Agent steering is identified as critical for autonomous agents to maintain focus and achieve desired outcomes by preventing mid-run deviations. Perhaps most significantly, the focus is on self-improving agents powered by reinforcement learning from human feedback (RLHF). By analyzing signals from human edits, approvals, and denials, agents can continuously learn and refine their performance, leading to increasingly capable and intelligent AI systems.
Building Agentic Applications with CopilotKit & AG-UI
CopilotKit is an open-source framework for building AI co-pilots and user-facing agentic applications. It facilitates agent-user interactivity and is used by a wide range of organizations.
Mentioned in this video
An agent framework that integrated with AG-UI upon its open-sourcing.
A CRM platform where the Breeze assistant, a SaaS copilot, is integrated.
The agent-user interaction protocol for connecting agentic backends and frontends, designed by CopilotKit to address the complexity of agentic applications.
An agent framework with which CopilotKit partnered, involving AG-UI integration.
Protocol for communication between two or more agents within a mesh.
An example of a SaaS copilot within the HubSpot CRM that provides intelligent responses based on user data.
A component library mentioned as an example of how declarative UI specs can be mapped to components.
A framework for building AI co-pilots and user-facing agentic applications. It is open-source and has seen significant adoption.
One of the fastest-growing TypeScript AI projects, associated with CopilotKit.
An organization whose integration was used in a demo of a to-do application built with CopilotKit.