Ambient AI is quietly creeping into the homes of America’s most vulnerable patients. But just because it can listen and record doesn’t mean it should.
Trent Smith, founder and CEO of Apricot Health—a documentation automation company for skilled home health and hospice—has a different take. His background spans national championship football at the University of Oklahoma, the NFL, and most recently, founding and scaling Accentra Home Health & Hospice (acquired by Choice Health at Home in 2024). But these days, he’s focused on something even tougher than gridiron wins: fixing how care gets documented and reimbursed in the fractured world of post-acute care.
Smith has ridden along with nurses, watched the realities of home visits, and built thousands of task-specific prompts to make AI actually useful—not ambient. In his view, the hype around ambient AI distracts from the real opportunities to reduce costs, protect clinicians, and avoid liability. Worse, he says, it could blow back on the entire industry if regulators get involved.
The Home Care Innovation Forum (HCIF) sat down with Smith to get his candid, contrarian takes on all things ambient AI, and on how at-home care leaders can best leverage artificial intelligence for their clients’ benefit.
This transcript has been edited for length and clarity.
HCIF: Why doesn’t Apricot use ambient recording technology in patients’ homes?
Smith: We’ve made a deliberate choice: no ambient recording. Ever. It’s the wrong fit for skilled home health, for three clear reasons.
- It misses the point. Roughly 80% of a home health visit is observational—body language, gait, wound appearance, and environmental cues. You can’t “record” that. So you’re only capturing 20% of the encounter, yet pretending it’s the whole picture. That’s dangerous.
- The legal risks are massive. If your AI captures audio of a nurse skipping a required assessment like BIMS, and you still submit a claim, congratulations—you’ve now documented your own Medicare fraud. I’ve ridden with nurses. They’re overwhelmed. They forget things. So now you’re going to record those mistakes? Good luck explaining that in court.
- The privacy concerns are a minefield. You’ve got federal wiretap laws, 12 different state-level consent laws, minors in the home, and spouses casually discussing their own medical info nearby. One angry family member calls their lawyer, and you’ve just handed them an audio recording as evidence.
This is not a clinic. It’s a patient’s home. Ambient may work in a hospital setting, but skilled home health is a completely different context—and the risks far outweigh any perceived value.
HCIF: What’s the risk if prompt transparency isn’t taken seriously in AI documentation?
Smith: Most of the agencies exploring AI haven’t gone deep enough to understand the risks—and most vendors are banking on that.
Here’s the reality: if a large language model is generating documentation that impacts reimbursement, you’re in regulated territory. That means you need an audit trail. But most companies can’t tell you what prompt was used, who approved it, or when it changed.
They don’t version prompts, they don’t audit them, and they’re not prepared for what’s coming.
At Apricot, we’ve built and tested thousands of prompts—but more importantly, we’ve made them traceable. If a regulator ever asks how a specific chart note or coding suggestion was generated, we can show them the exact input, the AI’s output, and the context around it.
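The traceability Smith describes—knowing exactly which prompt produced a given output, who approved it, and when it changed—can be sketched as a content-addressed prompt registry paired with an append-only audit log. The sketch below is illustrative only: the class names, fields, and the `wound-care-note` prompt are assumptions for the example, not Apricot’s actual implementation.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class PromptVersion:
    """A frozen, content-addressed version of a prompt template."""
    name: str
    text: str
    approved_by: str
    created_at: str

    @property
    def version_id(self) -> str:
        # Hash the prompt text so any edit yields a new, traceable version ID.
        return hashlib.sha256(self.text.encode("utf-8")).hexdigest()[:12]

class AuditLog:
    """Append-only log linking each AI output to the exact prompt version and input."""
    def __init__(self):
        self.records = []

    def record(self, prompt: PromptVersion, model_input: str, model_output: str) -> dict:
        entry = {
            "prompt_name": prompt.name,
            "prompt_version": prompt.version_id,
            "approved_by": prompt.approved_by,
            "input": model_input,
            "output": model_output,
            "logged_at": datetime.now(timezone.utc).isoformat(),
        }
        self.records.append(entry)
        return entry

# Usage: register a prompt, log one generation, and you can later answer
# "how was this chart note produced?" with the exact input, output, and context.
prompt = PromptVersion(
    name="wound-care-note",  # hypothetical prompt name
    text="Summarize the clinician's visit notes into a wound-care narrative.",
    approved_by="clinical-director",
    created_at="2025-01-15",
)
log = AuditLog()
entry = log.record(prompt, "Stage 2 pressure ulcer, left heel...", "Narrative: ...")
print(entry["prompt_version"])
```

Because the version ID is derived from the prompt text itself, any change to the prompt—however small—produces a new ID, which is what makes the audit trail tamper-evident.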
Can your vendor do that?
This isn’t paranoia. It’s just where the puck is headed. If you're submitting claims to CMS based on AI-generated documentation, the question isn't if regulators will ask—it’s when. And if you can't answer, you're holding the bag.
HCIF: Where is AI actually useful in home health and hospice?
Smith: Back office. Full stop.
Look at your P&L. Most agencies spend 4% of revenue on tech, 15% on RNs, and 10-12% on back office—most of whom are chasing prior auths and dealing with Medicare Advantage nonsense. That’s where AI can help today.
If we can reduce RN spend from 15% of revenue to 10% by increasing their capacity—without sacrificing documentation quality—that’s a win. But that still doesn’t even cover the annual CMS rate cuts. So we’ve got to keep innovating.
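Smith’s math can be made concrete with a back-of-the-envelope calculation. The revenue figure below is an assumed number for illustration; the 15%-to-10% RN spend figures are his.

```python
# Illustrative arithmetic on Smith's P&L figures.
# The revenue amount is a hypothetical, not data from any real agency.
revenue = 10_000_000                 # assumed annual agency revenue, USD
rn_spend_before = 0.15 * revenue     # 15% of revenue on RNs
rn_spend_after = 0.10 * revenue      # target: 10% of revenue
savings = rn_spend_before - rn_spend_after
print(f"RN savings: ${savings:,.0f}")  # prints "RN savings: $500,000"
```

Five points of revenue is real money, but as Smith notes, recurring CMS rate cuts can eat through that margin, which is why he frames the savings as necessary rather than sufficient.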
We’re not the Yankees. We’re the Oakland A’s. We don’t have the money or lobbyists. We have to Moneyball this. Ambient AI isn’t the answer. Thoughtful, targeted automation is.
HCIF: What’s the patient perspective on ambient recording?
Smith: I asked my own doctor that exact question. He said: “My patients are Baby Boomers. They don’t trust the government. If I started recording them, I’d lose half my business.”
That’s exactly who we’re caring for in skilled home health. They know what a recording device is. They won’t like it—and they shouldn’t have to accept it.
We believe in augmenting—not replacing—clinicians. That means no ambient, no audio surveillance, and no pretending AI can substitute for skilled human judgment.
We’re building AI that plays by the rules. Because when regulators come—and they will—we’ll be ready.
Hear more contrarian ideas from Smith on AI, compliance, and the future of home health and hospice care at HCIF 2026, May 17-19 in Palm Springs, CA.
Join us!
The retreat for home health care and hospice leaders and innovators.
May 17-19, 2026 | Palm Springs, CA