Why I Started Modeling Practice Decisions
A while back, I started experimenting with a practice simulator. It takes some creative license, but it is designed to reflect real clinical pressures: the tension between speed and quality, burnout under sustained load, and the difficulty of making good decisions with incomplete information.
Not as a product. Not even as formal research.
I wanted a way to think more clearly about the business and clinical choices clinicians make every week (a toy sketch of what I mean follows the list below):
- Who we take on
- Who we refer out
- When we push
- When we rest
- How load and complexity quietly shape our decisions
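To make that concrete, here is a minimal sketch of the kind of weekly loop I mean. It is not the actual simulator: the capacity, caseload numbers, thresholds, and update rules are invented for illustration, and it only covers the intake-and-referral side of the list above.

```python
import random

# Toy sketch only: the capacity, caseload numbers, and update rules below are
# invented for illustration and are not the real simulator's parameters.

WEEKS = 26          # half a year of practice, one decision point per week
CAPACITY = 25       # assumed weekly session capacity
random.seed(7)

def simulate():
    caseload, burnout, quality = 18, 0.1, 0.9
    history = []
    for week in range(WEEKS):
        # "Who we take on" vs. "who we refer out", constrained by capacity.
        referrals = random.randint(1, 4)
        accepted = min(referrals, CAPACITY - caseload)
        referred_out = referrals - accepted
        caseload += accepted

        load = caseload / CAPACITY
        # Burnout compounds under sustained high load rather than rising linearly.
        burnout = min(1.0, max(0.0, burnout + (0.08 if load > 0.85 else -0.02)))
        # Quality does not collapse; it drifts with burnout and heavy load.
        quality = max(0.0, quality - 0.03 * burnout - 0.02 * max(0.0, load - 0.8))

        # Some clients finish or step down each week.
        caseload = max(0, caseload - random.randint(0, 2))
        history.append((week, caseload, referred_out, burnout, quality))
    return history

for week, caseload, referred_out, burnout, quality in simulate():
    print(f"week {week:2d}  caseload={caseload:2d}  referred_out={referred_out}  "
          f"burnout={burnout:.2f}  quality={quality:.2f}")
```

Even a loop this crude is enough to watch signals like burnout and quality move across weeks rather than inside a single session, which is where the surprises started.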
I assumed I would see obvious things.
I did not.
What Emerged Instead
Individual sessions looked fine. Notes looked solid. Outcomes moved in the right direction.
But when decisions were tracked across weeks, especially under increasing load, different signals started to surface.
- Burnout did not rise linearly. It spiked after specific decision clusters.
- Pushing plateaued clients felt productive until it was not.
- Referral behavior shifted subtly as capacity tightened.
- Quality did not disappear. It drifted.
None of this showed up in a single note. None of it showed up in compliance metrics.
It only appeared when behavior was observed longitudinally.
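A deliberately simplified illustration of why the longitudinal view matters: the weekly scores, threshold, and drift tolerance below are made up, but they show how every single week can clear a per-week bar while a rolling comparison against an early baseline reveals steady drift.

```python
from statistics import mean

# Illustrative numbers only: each week looks acceptable on its own,
# but the trend across weeks tells a different story.
weekly_quality = [0.91, 0.90, 0.92, 0.89, 0.90, 0.88, 0.89, 0.87,
                  0.88, 0.86, 0.87, 0.85, 0.86, 0.84, 0.85, 0.83]

THRESHOLD = 0.80   # assumed "acceptable this week" bar
WINDOW = 6         # assumed rolling window, in weeks

# Snapshot view: week by week, everything passes.
print(all(q >= THRESHOLD for q in weekly_quality))   # True

# Longitudinal view: compare each rolling average against the first window.
baseline = mean(weekly_quality[:WINDOW])
for start in range(len(weekly_quality) - WINDOW + 1):
    drift = mean(weekly_quality[start:start + WINDOW]) - baseline
    if drift < -0.03:   # assumed drift tolerance
        print(f"weeks {start}-{start + WINDOW - 1}: drift {drift:+.3f}")
```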
The Realization
Quality in mental health care does not live solely inside a note.
It lives in decision patterns, timing, tradeoffs, and how clinicians respond to pressure over time.
Documentation captures what happened. Practice patterns reveal how we work.
Those are related, but they are not identical.
Where AI Actually Helps and Where It Should Not
This changed how I think about the role of AI in mental health.
AI is often framed as a replacement, an automation layer, or an invisible assistant doing work in the background.
That framing misses the deeper opportunity.
Used well, AI can act as a practice signal amplifier. It can help clinicians notice patterns they are too close, too busy, or too exhausted to see on their own.
Not to judge. Not to optimize away humanity.
But to support reflective, intentional practice.
How This Connects to SnapNotes and UMET Labs
At SnapNotes, we started with documentation because that is where the pain is most acute.
But the deeper mission across SnapNotes and UMET Labs has always been about supporting clinical judgment, not replacing it.
Clear notes support good care. Clearer patterns support sustainable care.
That is the direction I am increasingly convinced matters.
A Quiet Invitation
What if our tools did not just help us finish faster, but helped us see ourselves more clearly as clinicians?
Not in a mirror that shames. But in one that reflects.
That feels like a future worth building.