Navigating the False Claims Act in the Age of AI: Implications for Healthcare Providers and Billing Practices

By: Carter Graves

Since its enactment in 1863, the False Claims Act (“FCA”) has served to deter and punish false and fraudulent claims submitted to the federal government by imposing damages and civil monetary penalties on violators.[1] While Congress’s primary focus at enactment was fraud in defense contracting, the FCA has evolved into one of the three primary healthcare fraud and abuse laws.[2] FCA enforcement is disproportionately concentrated on the healthcare industry because of the immense volume of claims submitted to the government under federal healthcare programs.[3] In fiscal year 2024, the federal government recovered more than $2.9 billion in FCA settlements and judgments.[4] Of that $2.9 billion, 57.5%, or $1.67 billion, came from the healthcare industry alone.[5]

Under the FCA, an individual who knowingly submits a false claim for payment to the federal government may be subject to civil monetary penalties (between $5,000 and $10,000 per violation, as adjusted for inflation) and treble damages.[6] In addition to the civil monetary penalties and treble damages authorized by the FCA, providers may also face exclusion from federal healthcare programs for violating the FCA.[7] To prove “knowledge” under the FCA, the government need only establish that the individual acted with reckless disregard of the truth or falsity of the claim submitted to the federal government.[8] There is no requirement that the plaintiff or the federal government prove that the defendant acted with specific intent to defraud.[9]

As industries and professionals continue to integrate artificial intelligence into their workflows, healthcare providers should be cognizant of existing federal laws and regulations—like the FCA—that bear on their use of AI technologies. AI has many uses in the healthcare sphere, and autonomous coding is merely one of the ways in which providers have been able to leverage AI to increase efficiency.[10] Autonomous coding allows providers and administrative staff to handle the billing process more efficiently and can improve coding accuracy by catching billing errors before claims are submitted.[11] However, autonomous or AI-powered coding is not without its flaws. For example, large language models (LLMs) rely on machine learning and can eventually “learn” that certain diagnoses and billing methods increase reimbursement rates.[12] An LLM may then begin suggesting those codes more often, resulting in upcoding that gives rise to a violation of the FCA.[13]
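To make that drift risk concrete, the sketch below shows one minimal way a compliance team might monitor an autonomous coder's output over time. It is illustrative only: the E/M codes, the tolerance threshold, and the function names are assumptions for the example, not any vendor's actual method or a regulatory standard. The idea is simply to compare the share of high-level (higher-reimbursing) codes in a recent billing window against a historical baseline and flag the window for human audit when that share climbs.

```python
from collections import Counter

# Hypothetical set of high-reimbursement E/M codes an autonomous coder
# might drift toward; a real audit would use codes relevant to the practice.
HIGH_LEVEL_CODES = {"99205", "99214", "99215"}

def high_level_share(codes: list[str]) -> float:
    """Fraction of claims assigned a high-level (higher-paying) code."""
    if not codes:
        return 0.0
    counts = Counter(codes)
    return sum(counts[c] for c in HIGH_LEVEL_CODES) / len(codes)

def flags_upcoding_drift(baseline: list[str], recent: list[str],
                         tolerance: float = 0.05) -> bool:
    """Flag a billing window for manual audit when the share of high-level
    codes exceeds the historical baseline by more than `tolerance`
    (an assumed threshold, not a regulatory standard)."""
    return high_level_share(recent) - high_level_share(baseline) > tolerance

# Example: the AI's recent suggestions skew toward 99214 vs. the baseline.
baseline = ["99213"] * 80 + ["99214"] * 20
recent = ["99213"] * 60 + ["99214"] * 40
if flags_upcoding_drift(baseline, recent):
    print("Upcoding drift detected: route this window to human audit.")
```

In practice a check like this would be one signal among many; its value is that it surfaces gradual drift that no single claim review would catch.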

Overreliance on LLMs and AI for billing presents additional issues. Practice makes perfect, and as staff increasingly integrate AI into billing workflows, they risk losing the skill and expertise that enable them to conduct effective oversight of the AI.[14] Given that the FCA’s knowledge standard may be satisfied by showing reckless disregard in the submission of a false or fraudulent claim, vigilance is key.[15] If the staff members and providers responsible for coding experience skill degradation from overreliance, the chances that they can exercise effective oversight decrease significantly.
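One way to keep humans meaningfully in the loop is a review gate that routes low-confidence or high-risk AI code suggestions to a human coder before submission and randomly samples the rest for audit. The sketch below is a minimal illustration under assumed data structures: the Claim fields, confidence floor, high-risk code list, and sampling rate are all hypothetical policy choices, not prescribed by the FCA or any guidance.

```python
import random
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    suggested_code: str
    model_confidence: float  # assumed 0.0-1.0 score reported by the AI coder

# Hypothetical policy knobs; a real compliance program would tune these.
CONFIDENCE_FLOOR = 0.90               # below this, a human coder must review
HIGH_RISK_CODES = {"99205", "99215"}  # codes prone to upcoding scrutiny
AUDIT_SAMPLE_RATE = 0.10              # randomly audit 10% of approved claims

def needs_human_review(claim: Claim) -> bool:
    """Route a claim to a human coder before submission if the model is
    unsure, the code is high-risk, or the claim is drawn for random audit."""
    if claim.model_confidence < CONFIDENCE_FLOOR:
        return True
    if claim.suggested_code in HIGH_RISK_CODES:
        return True
    return random.random() < AUDIT_SAMPLE_RATE

# Example: a high-risk code is routed to a human even at high confidence.
claim = Claim(claim_id="A-1001", suggested_code="99215", model_confidence=0.97)
print("human review" if needs_human_review(claim) else "auto-submit")
```

The random-sampling prong matters for the skill-degradation point above: routine audits keep human coders practiced even when the model performs well.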

In July 2025, the Department of Justice and the Department of Health and Human Services jointly announced the formation of an FCA Working Group designed to strengthen federal enforcement of the FCA and to target various healthcare enforcement areas.[16] Among the DOJ-HHS Working Group’s areas of concern is the “manipulation of Electronic Health Records [(“EHR”)] systems to drive inappropriate utilization of Medicare covered products and services.”[17] Previous enforcement actions have reached EHR vendors involved in kickback schemes in which EHR software suggested certain medications to providers in exchange for kickbacks from the pharmaceutical companies that manufactured those medications.[18] While AI integration into the coding process can increase efficiency, these enforcement actions demonstrate the possible consequences. Consequently, EHR vendors who integrate AI into their software to recommend certain codes, and providers who rely on those recommendations, should be especially wary of potential liability under the FCA and other healthcare fraud laws.

Over its more than 160-year history, the FCA has borne witness to the evolution of healthcare in the United States, from the emergence of the modern hospital system and modern medical education,[19] to the advent of Medicare in 1965,[20] to the passage of the Patient Protection and Affordable Care Act in 2010.[21] Through each of these changes, the FCA has evolved alongside the industry it polices. The AI revolution is no different: despite the dramatic change it brings, enforcement will continue and will reflect new and emerging practices that violate the FCA. Even without direct human involvement, unconventional means of preparing false claims (e.g., LLMs upcoding after learning that certain codes increase reimbursement rates) may still provide grounds for an FCA violation. Given that the FCA’s knowledge standard includes reckless disregard of whether a claim was false or fraudulent, autonomous coding through LLMs and AI still requires human verification and auditing to minimize FCA liability. Though autonomous coding may make coding more accurate overall, the risks remain significant, and providers should proceed with caution.


[1] 31 U.S.C. § 3729(a) (2018); The False Claims Act, DOJ (Jan. 15, 2025), https://www.justice.gov/civil/false-claims-act (on file with the Tennessee Law Review).

[2] Fraud & Abuse Laws, Dep’t of Health & Hum. Servs.: Off. of Inspector Gen., https://oig.hhs.gov/compliance/physician-education/fraud-abuse-laws/ (on file with the Tennessee Law Review) (last visited Jan. 8, 2026).

[3] See Press Release, DOJ, False Claims Act Settlements and Judgments Exceed $2.9B in Fiscal Year 2024 (Jan. 15, 2025) (on file with author).

[4] See Press Release, supra note 3; The False Claims Act, DOJ (Jan. 15, 2025), https://www.justice.gov/civil/false-claims-act (on file with the Tennessee Law Review).

[5] Id.

[6] 31 U.S.C. § 3729(a).

[7] 42 U.S.C. §1320a-7(b)(6) (2018).

[8] 31 U.S.C. § 3729(b).

[9] Id.

[10] See Tyson Blauer, From Hype to Reality: What Healthcare Leaders Should Know About Autonomous Coding Solutions, KLAS Rep. (Aug. 26, 2025), https://engage.klasresearch.com/blog/from-hype-to-reality-what-healthcare-leaders-should-know-about-autonomous-coding-solutions/8341/ (on file with the Tennessee Law Review).

[11] See Pro. & Continuing Educ. Staff, How AI is Revolutionizing Medical Billing and Coding, Univ. Tex. San Antonio Pro. & Continuing Educ., https://www.utsa.edu/pace/news/ai-in-medical-billing-and-coding.html (on file with the Tennessee Law Review) (last visited Jan. 8, 2026).

[12] See Joshua Robbins & Daniel Pietragallo, Blame It on the Bot: Health Care Fraud and Compliance in the Age of AI, Am. Health L. Ass’n (July 11, 2025), https://www.buchalter.com/wp-content/uploads/2025/10/AHLA-Blame-It-on-the-Bot_-Health-Care-Fraud-and-Compliance-in-the-Age-of-AI.pdf (on file with the Tennessee Law Review).

[13] Id.

[14] Mick Polo, Navigating the Risks: Responsible Use of AI in Medical Billing, NCDS, https://www.ncdsinc.com/navigating-the-risks-responsible-use-of-ai-in-medical-billing/ (on file with the Tennessee Law Review) (last visited Jan. 8, 2026).

[15] Id.

[16] See Press Release, Dep’t Health & Hum. Servs., DOJ-HHS False Claims Act Working Group (July 2, 2025) (on file with author).

[17] Id.

[18] See Press Release, DOJ, Electronic Health Records Vendor to Pay $145 Million to Resolve Criminal and Civil Investigations (Jan. 27, 2020) (on file with author).

[19] See, e.g., History of Hospitals, Univ. of Penn., https://www.nursing.upenn.edu/nhhc/nurses-institutions-caring/history-of-hospitals/ (on file with the Tennessee Law Review) (last visited Jan. 8, 2026).

[20] Social Security Amendments of 1965, Pub. L. No. 89-97, 79 Stat. 286 (1965).

[21] Patient Protection and Affordable Care Act, Pub. L. No. 111-148, 124 Stat. 119 (2010).

