Paloma is an AI coaching companion that coaches independently between sessions. It makes every live session more powerful by surfacing what shifted, what's compounding, and what's ready to move.
This project started inside a coaching practice, not a design studio. Building Common Project, a real-time note-taking tool designed to capture what emerges between coach and client, made it clear that the problem was bigger than capturing sessions: it was everything that happened between them. Paloma is the answer to that gap.
The traditional model treats the 60-minute session as the product. Between meetings, clients are on their own, expected to hold insights, honor commitments, and stay regulated without any infrastructure for doing so.
The result: value leakage. Coaches spend the first 20 minutes of every session re-establishing ground they already covered. Clients arrive dysregulated, or having forgotten the thing that mattered most. The work doesn't compound.
The moments that matter most happen in daily life, not in the coaching room: the pause before reacting, the decision under pressure, the small act nobody noticed.
The continuity gap isn't a coaching problem. It's a systems design problem. What was missing wasn't more sessions. It was an intelligent witness that could hold the thread, notice what's compounding, and surface the right evidence at the right moment.
“The work doesn’t happen in the room. It happens on a Tuesday, when nobody’s watching.”
Behavioral change doesn’t happen through insight alone. It happens through small acts of courage, consciously noticed and accumulated as evidence against a limiting belief. Paloma was designed around this loop: not as a metaphor, but as the literal interaction model.
Every design decision (what Paloma says, stores, asks, and refuses) was grounded in a specific behavioral science framework. This isn’t a chatbot with coaching aesthetics. The frameworks are load-bearing.
Over-remembering destroys trust. Under-remembering makes the system useless. Paloma's memory system was designed to feel personally attuned without feeling surveilled: a precise calibration between depth and restraint. Each layer maps directly to a moment in the user experience.
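The calibration between depth and restraint can be sketched as a layered store with explicit retention rules. The layer names and policies below are illustrative assumptions, not Paloma's actual schema: the point is that "what persists" is a design decision, not a side effect.

```python
from dataclasses import dataclass, field
from enum import Enum

class Layer(Enum):
    WORKING = "working"    # in-session context; deliberately discarded afterward
    THREAD = "thread"      # commitments and open loops; surfaced at the next session
    EVIDENCE = "evidence"  # small acts logged against a limiting belief; long-lived

@dataclass
class Memory:
    layer: Layer
    text: str

@dataclass
class MemoryStore:
    items: list = field(default_factory=list)

    def remember(self, layer: Layer, text: str) -> None:
        self.items.append(Memory(layer, text))

    def end_session(self) -> None:
        # Restraint: working memory is dropped; only durable layers persist,
        # so the system never "remembers" more than its contract allows.
        self.items = [m for m in self.items if m.layer is not Layer.WORKING]

    def surface(self, layer: Layer) -> list[str]:
        # Each layer maps to a distinct moment: THREAD opens the next
        # session, EVIDENCE is recalled when a limiting belief resurfaces.
        return [m.text for m in self.items if m.layer is layer]
```

Under this sketch, "feeling attuned without feeling surveilled" falls out of the retention policy: discarding `WORKING` memory is what makes keeping `EVIDENCE` memory trustworthy.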
Paloma coaches independently. But when a human coach is present, Paloma becomes the infrastructure that makes their relationship compound. The coach holds the relationship. Paloma holds the thread between sessions. Designing that boundary clearly was the most important decision in the product.
General-purpose models drift. Left unguided, an LLM will advise when it should reflect, praise when it should question, and empathize when it should redirect. IFS, solutions-focused coaching, and neuroplasticity frameworks aren’t inspiration. They’re structural constraints baked into Paloma’s cognitive design.
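One way to make such constraints structural rather than aspirational is to check a drafted response against them before it reaches the user. The patterns and fallback below are purely hypothetical, a minimal sketch of turning "refuse to advise, redirect to reflection" into enforced behavior:

```python
import re

# Hypothetical drift check: phrases that signal the model has slipped
# from reflecting into advising. Real constraints would be richer.
ADVICE_PATTERNS = [r"\byou should\b", r"\bmy advice\b", r"\bi recommend\b"]

def violates_constraints(draft: str) -> bool:
    # True if the draft drifts into advice-giving.
    return any(re.search(p, draft, re.IGNORECASE) for p in ADVICE_PATTERNS)

def respond(draft: str, reflective_fallback: str) -> str:
    # Enforce the boundary: an advising draft is replaced with a
    # reflective question rather than being softened or passed through.
    return reflective_fallback if violates_constraints(draft) else draft
```

The design choice this illustrates is that the framework lives in the control flow, not the prompt alone: a drifting draft is structurally unable to reach the user.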