
The Department of Veterans Affairs (VA) reported 367 artificial intelligence (AI) use cases in its 2025 AI use case inventory, marking a sharp increase from recent years as the department both added new applications and retired dozens of older efforts.
The VA reported 229 AI use cases in 2024. Notably, 72 of the 367 use cases in the 2025 inventory have been retired, with the department saying their “development and/or use has since been discontinued.”
Of the remaining use cases, the department reported that 138 are deployed, 21 are in a pilot stage, and 136 are in a “pre-deployment” stage – meaning they are in “development or acquisition status.”
“VA is using artificial intelligence to improve how veterans access care, benefits and services, while also helping employees work more efficiently and effectively,” Pete Kasperowicz, VA press secretary, said in an emailed statement to MeriTalk. “The department’s AI Inventory showcases VA’s dedication to transparent and responsible innovation, and enables us to track, evaluate, and optimize our AI systems while ensuring they are safe, secure and responsible.”
“This year’s inventory shows steady growth, stronger governance, and expanding real-world impact, particularly in health care, benefits processing, and operational efficiency,” he added.
One of the most prominent use cases highlighted is VA GPT, an on-network generative AI chat interface available to all VA employees to help with administrative tasks. The VA said it has onboarded over 95,000 users for the product, and “survey results show more than 70% of users report improved job satisfaction and users report saving 2-3 hours of time per week on average.”
The VA is also applying AI to software development. The agency said it has implemented AI-assisted software development tools, now used by more than 2,000 developers, saving users an average of more than 8 hours per week. Seventy-four percent of users report the product “enables them to focus on work that is more satisfying to them.”
Health and medical applications account for a large share of VA’s AI footprint. Of the deployed and piloted use cases, 107 fall within the health and medical domain.
Among them is the Veterans Health Administration’s (VHA) Stratification Tool for Opioid Risk Mitigation (STORM), an AI-assisted clinical decision support system designed to help identify veterans at elevated risk of overdose or suicide. VA said use of the system is associated with a 22% decrease in mortality.
Additionally, the VA highlighted an AI-assisted colonoscopy device that has resulted in a 21% increase in the odds of adenoma detection and an absolute detection increase of approximately 4% compared to colonoscopies without the device. Higher adenoma detection rates are associated with lower late-stage cancer incidence and reduced mortality.
In an interview with MeriTalk last year, VA Chief Technology Officer Charles Worthington previewed many of these use cases and predicted that the department’s AI use cases would continue to grow in number.
“We have a lot of promising use cases out there, and the theme, I think, of the coming year is to scale some of those most promising use cases,” Worthington told MeriTalk at the time.
The expansion of AI use cases comes as VA faces scrutiny over the use of generative AI in clinical settings. Last month, the VA Office of Inspector General (OIG) issued a report warning that the VHA’s use of generative AI chat tools for clinical care presents “a potential patient safety risk.”
The OIG voiced concern about “VHA’s ability to promote and safeguard patient safety without a standardized process for managing AI-related risks.” Because its review is ongoing, the OIG did not issue any formal recommendations.
“Given the critical nature of the issue, the OIG is broadly sharing this preliminary finding so that VHA leaders are aware of this risk to patient safety,” the OIG said in its warning.