Burn is a multi-user, multi-agent demo app built on TanStack DB and Phoenix.Sync.

It shows how to build an agentic system on real-time sync, where:

- users and agents are automatically kept in sync
- memory means rows in a Postgres database
- context engineering is a representation of that database state

TanStack DB provides a super fast client store for instant reactivity and local writes, with live queries syncing data into standard React components. The Phoenix backend exposes sync endpoints, handles auth and writes, and runs agents as OTP processes.
The database is standard Postgres. Agentic memory and shared state are both just rows in the database.
Burn shows how to build an agentic system on an end-to-end local-first sync stack using TanStack DB and Phoenix.Sync.
It's useful as a reference for all of these aspects combined, or for any of them individually:
- how to build a real-world app using TanStack DB
- how to integrate TanStack DB with Phoenix.Sync for an end-to-end local-first sync stack
- how to build multi-user, multi-agent systems on real-time sync
The Burn app is a multi-user, multi-agent "roast-me" / "dad-jokes" app.
It supports realtime collaboration between users and agents in the same thread/session.
Create a thread and invite your friends to it. The agents will probe you for information. When they have enough ammunition, they'll burn you!
As you play the game, you'll see facts and events build up in the "Computer" on the right-hand side. This shows the actual contents of the database, synced in real-time, that drive both the:
- app UI
- context engineering for the LLMs
Both of these are just functions of the database state. The image above shows two parts of the UI representing the same state.
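The "context is a function of database state" idea can be sketched as a pure function over synced rows. This is an illustrative simplification, not the app's actual code: the `Fact` shape is reduced from the facts table shown in the Computer panel, and `renderContext` is a hypothetical helper.

```typescript
// Hypothetical, simplified shape of a row in the facts table.
type Fact = {
  subject: string;    // id of the user the fact is about
  predicate: string;  // e.g. "tells"
  object: string;     // e.g. "bad dad jokes constantly"
  confidence: number;
};

// Context engineering as a pure function: the same synced rows that drive
// the UI are rendered into the prompt context sent to the LLM.
function renderContext(facts: Fact[]): string {
  return facts
    .map(
      (f) =>
        `<fact subject="${f.subject}" confidence="${f.confidence}">` +
        `${f.subject} ${f.predicate} ${f.object}</fact>`
    )
    .join("\n");
}
```

Because the prompt is derived from rows, re-rendering it after every sync keeps the LLM's view consistent with what every user sees.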
If you also look at the terminal output of the Phoenix server, you'll see it print out the context that it sends to the LLM:
```
<ask_user_about_themselves>
  from: 34db280c-cc75-431b-9ecc-fe60516e3a59
  id: toolu_01LCy3PE6AGjSZoHuXFEpxyW
  input:
    question: What's the most embarrassing dad joke or comment you've made recently that made your kids cringe?
    subject: fee0dd11-060c-4a01-a1db-6955640e5a54
  name: ask_user_about_themselves
</ask_user_about_themselves>

<user_message>
  from: fee0dd11-060c-4a01-a1db-6955640e5a54
  id: f245e959-3555-4f7c-96d3-aa70199a081e
  text: I constantly tell bad dad jokes. Like the kids say "twist" playing blackjack and I say "like you did last summer?"
</user_message>

<extract_facts>
  from: 34db280c-cc75-431b-9ecc-fe60516e3a59
  id: toolu_017eRnGXpLcfEnFXa3rbH4rj
  input:
    facts:
      - category: behavior
        confidence: 1.0
        disputed: false
        object: bad dad jokes constantly
        predicate: tells
        source_event: f245e959-3555-4f7c-96d3-aa70199a081e
        subject: fee0dd11-060c-4a01-a1db-6955640e5a54
      - category: humor
        confidence: 1.0
        disputed: false
        object: like you did last summer when kids say twist in blackjack
        predicate: makes joke responses
        source_event: f245e959-3555-4f7c-96d3-aa70199a081e
        subject: fee0dd11-060c-4a01-a1db-6955640e5a54
  name: extract_facts
</extract_facts>

What's the next step?
```
Users and agents all see and respond to the same state in real-time. There is no manual data wiring: everything is wired in using real-time declarative sync.
The TanStack DB collections are defined in assets/src/db/collections.ts.
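A collection definition in that file looks roughly like the following sketch. The types and option names here are simplified stand-ins written inline so the example is self-contained; they are not the exact TanStack DB API, and the `/sync/facts` URL is a hypothetical placeholder:

```typescript
// Simplified stand-ins for TanStack DB collection options (illustrative only;
// the real types come from the @tanstack/react-db package).
type ShapeOptions = { url: string; params: { table: string } };
type CollectionOptions<T> = {
  id: string;
  shapeOptions: ShapeOptions;  // where the client syncs this table's shape from
  getKey: (row: T) => string;  // primary-key extractor for the client store
};

type Fact = { id: string; subject: string; predicate: string; object: string };

// Each collection syncs one table's rows into the client store.
const factsCollection: CollectionOptions<Fact> = {
  id: "facts",
  shapeOptions: { url: "/sync/facts", params: { table: "facts" } },  // hypothetical URL
  getKey: (row) => row.id,
};
```

One collection per synced table keeps the client store a straightforward mirror of the Postgres schema.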
The React components in assets/src/components use a variety of live queries.
For example:
- ChatArea.tsx queries the messages for the main chat UI
- ComputerAccordion/EventsList.tsx and ComputerAccordion/FactsList.tsx show a two-stage live query with typeahead
- MainThread/ThreadEditForm.tsx has quite a few nested queries and shows interplay between React state and collection insert/update/delete methods for handling form-based updates
- Sidebar/SidebarThreads.tsx shows an optimistic transactional mutation (inserting a new thread and the owner's membership of it within the same transaction)
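What a two-stage live query with typeahead computes can be illustrated with plain functions over in-memory rows. This is a conceptual sketch only; the real components express the same logic declaratively as TanStack DB live queries that re-run incrementally as synced rows change:

```typescript
type Fact = { id: string; subject: string; object: string };

// Stage 1: restrict facts to the current thread's members
// (here modelled as a fixed set of subject ids).
const factsForMembers = (facts: Fact[], memberIds: Set<string>): Fact[] =>
  facts.filter((f) => memberIds.has(f.subject));

// Stage 2: narrow the first stage's result by the typeahead query.
const matchingQuery = (facts: Fact[], q: string): Fact[] =>
  facts.filter((f) => f.object.toLowerCase().includes(q.toLowerCase()));
```

Chaining the stages means the typeahead filter only ever scans the already-narrowed member facts, not the whole table.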
Writes are all handled by the same mutationFn in assets/src/db/mutations.ts. This sends them to the backend to ingest.
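The shape of what such a mutationFn sends can be sketched as follows. The payload format, the `Mutation` type and the `/ingest/mutations` endpoint path are assumptions made for illustration, not the app's actual wire format:

```typescript
// Hypothetical shape of one optimistic mutation captured on the client.
type Mutation = {
  type: "insert" | "update" | "delete";
  table: string;
  data: Record<string, unknown>;
};

// Collect a transaction's mutations into a single payload for the backend.
function buildIngestPayload(mutations: Mutation[]): { mutations: Mutation[] } {
  return { mutations };
}

// A mutationFn along these lines POSTs the whole transaction to the ingest
// endpoint (path is hypothetical), so the server can apply it atomically.
async function mutationFn(mutations: Mutation[]) {
  return fetch("/ingest/mutations", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildIngestPayload(mutations)),
  });
}
```

Sending the transaction as one payload is what lets the server accept or reject all of its writes together.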
Sync is exposed through the sync macros in lib/burn_web/router.ex. Writes are ingested via lib/burn_web/controllers/ingest_controller.ex.
The agents themselves are defined in lib/burn/agents.
For example:
- Sarah is the "producer" agent responsible for quizzing users and extracting facts
- Jerry Seinfeld and Frankie Boyle are "comedian" agents responsible for roasting users
As you can see, each agent constructs their own prompts. LLM responses are constrained to tool calls. These are defined (and validated and performed) by the modules in lib/burn/tools.
For example:
- `extract_facts` to store facts in the database
- `roast_user` to roast the user
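Constraining responses to tool calls means an LLM response is only accepted if it names a known tool and its input validates against that tool's schema. Conceptually (a TypeScript sketch with made-up validators; the actual modules are Elixir in lib/burn/tools):

```typescript
type ToolCall = { name: string; input: Record<string, unknown> };

// Minimal registry mapping tool names to input validators
// (validation rules here are illustrative, not the real schemas).
const tools: Record<string, (input: Record<string, unknown>) => boolean> = {
  extract_facts: (input) => Array.isArray(input.facts),
  roast_user: (input) => typeof input.subject === "string",
};

// A response is only accepted if it names a known tool and passes validation.
function isValidToolCall(call: ToolCall): boolean {
  const validate = tools[call.name];
  return validate !== undefined && validate(call.input);
}
```

Keeping the validators next to the tool definitions means a rejected call can be bounced back to the LLM before anything touches the database.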
Some aspects of the demo app are simplified or not yet implemented:
- the app syncs all data to the client. It doesn't filter or construct dynamic shapes based on the auth context
- auth tokens are just usernames accepted at face value (they're not signed or validated)
- there is some hardcoded control flow in the agent `should_instruct` functions. More complex or sophisticated apps may want to implement multi-layered routing or more autonomous/agentic control flow
- the agents are not always that funny; that said, Frankie does sometimes deliver a good burn!
You need Postgres running and to have Elixir and NodeJS installed. You'll also need an Anthropic API key.
In development you'll also want Caddy (as per this troubleshooting guide).
You can install the right versions using asdf:
```shell
# `asdf plugin add <name> <git-url>` if you don't have the dependency plugin already, e.g.:
# asdf plugin add caddy https://github.com/salasrod/asdf-caddy.git
asdf install
```

Install and setup the dependencies:

```shell
mix setup
```

Copy envs/.env.template to envs/.env (see the Dotenvy docs for context) and set the ANTHROPIC_KEY value manually:

```shell
cp envs/.env.template envs/.env
# then edit the file manually
```

Run the tests:

```shell
mix test
```

Start the Phoenix server:

```shell
mix phx.server
```

In a different terminal, start Caddy (this proxies port 4001 to Phoenix running on 4000):

```shell
caddy start
```

Open localhost:4001 in your web browser.
To learn more, see the TanStack DB and Phoenix.Sync docs.

Reach out on the Electric Discord if you need help or have any questions.


