AI misses economic reality in healthcare
Healthcare company associate GC on where legal AI products break down
The hard part in healthcare compliance is not reading the contract; it is recognizing the economic reality hiding underneath it. In Stark and Anti-Kickback reviews, the question is often whether money, referral influence, or other value is moving to a doctor in a way that changes behavior, even when the paperwork is dressed up as services, consulting, equipment, or support. That is why pattern recognition built on real deal experience matters more than one more document database.
-
Stark law focuses on physician referrals tied to financial relationships, and CMS rules now turn on practical tests like fair market value, commercial reasonableness, and whether compensation varies with the volume or value of referrals. Those are facts about how an arrangement actually works, not just words on the page.
-
The interview describes current legal AI failing on factual nuance in FDA, AKS, Stark, and privacy work, while still being useful for drafting, document organization, and first-pass contract review. That draws a clear line between language tasks and arrangement-level judgment.
-
This also explains why CLM products like Luminance are more compelling around playbooks and clause deviations than around compliance-sensitive healthcare judgment. A system can compare a third-party contract to prior language, but that is different from spotting a disguised transfer of value across a provider network.
-
The next wave of legal AI in healthcare will win by sitting inside contract intake and review workflows, surfacing anomalies, preserving context, and routing edge cases to humans. The substantive call on whether an arrangement is really paying for legitimate services or quietly paying for referrals will stay with specialists who understand how doctors, hospitals, and intermediaries actually get paid.
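The playbook-versus-deviation routing described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual method: the playbook text, clause names, and similarity threshold are all assumptions. The point it makes is structural: a close textual match to approved language can be auto-cleared, while anything that drifts, including a clause quietly tying pay to referrals, gets escalated to a human specialist.

```python
from difflib import SequenceMatcher

# Hypothetical playbook of approved clause language (illustrative only).
PLAYBOOK = {
    "compensation": (
        "Compensation shall be consistent with fair market value and shall "
        "not be determined in a manner that takes into account the volume "
        "or value of referrals."
    ),
}

def route_clause(clause_type: str, text: str, threshold: float = 0.85) -> str:
    """Compare an incoming clause against playbook language and route it.

    Returns 'auto-approve' when the text closely tracks the approved
    language, otherwise 'human review' so a specialist makes the
    substantive call. The 0.85 threshold is an arbitrary placeholder.
    """
    approved = PLAYBOOK.get(clause_type)
    if approved is None:
        return "human review"  # no playbook baseline: always escalate
    similarity = SequenceMatcher(None, approved.lower(), text.lower()).ratio()
    return "auto-approve" if similarity >= threshold else "human review"
```

Note what the sketch cannot do: a clause that copies the approved language verbatim while the deal economics still pay for referrals sails through, which is exactly why the arrangement-level judgment stays with humans.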