AI Governance
- At the HLTH health innovation conference, a panel of AI experts expressed skepticism about appointing a chief AI officer in health organizations, advocating instead for improving AI literacy across the board. Some providers have established an AI oversight committee and an AI Enablement Center to democratize AI governance and ensure responsible integration of AI technologies. AI is already widely used in radiology for diagnostic support, and the growing adoption of ambient AI scribes has significantly reduced physicians’ administrative burden. AI has also shown positive results in administrative tasks such as drafting patient communications, with patients reportedly preferring AI-generated responses for their empathetic tone. Panelists nevertheless stressed the importance of maintaining a human element in AI applications, ensuring that AI supports rather than replaces clinical decision-making.
- The FDA has issued final guidance on regulating changes to AI-enabled medical devices through pre-determined change control plans (PCCPs), allowing for post-market modifications while maintaining safety and effectiveness. PCCPs, first introduced in 2019, enable performance enhancements by outlining specific, verifiable modifications and include a description of planned changes, a modification protocol, and an impact assessment. The guidance, consistent with a 2023 draft, now includes a section on version control and maintenance. While no adaptive AI-enabled devices have been authorized yet, PCCPs have been approved for devices through various regulatory pathways. Modifications under a PCCP must stay within the device’s intended use, and significant changes, such as altering a device’s user base or core functionalities, require new marketing submissions.
- The rapid evolution of AI in healthcare presents challenges for physicians and legal compliance, with shifting regulations and emerging laws at both federal and state levels. A federal rule effective July 2024 requires healthcare providers to comply with anti-discrimination regulations by May 2025, while various state bills focus on transparency, bias elimination, and AI limitations. Organizations like HIMSS and the AMA provide guidance on AI implementation, emphasizing human oversight and ethical considerations to enhance patient care and reduce costs. Legal risks associated with AI, such as data privacy, potential bias, and the unlicensed practice of medicine, necessitate legal expertise for healthcare providers. Despite these challenges, AI has the potential to generate actionable insights and improve healthcare operations, provided it is used responsibly and with appropriate legal guidance.
- A recent study published in npj Digital Medicine outlines comprehensive guidelines for the responsible integration of AI into healthcare, developed by a team from Harvard Medical School and the Mass General Brigham AI Governance Committee. The study emphasizes nine principles, including fairness, robustness, and accountability, and highlights the need for diverse training datasets and regular equity evaluations to reduce bias. A pilot study and shadow deployment were conducted to assess AI systems, focusing on privacy, security, and usability in clinical workflows. The study also stresses the importance of transparent communication regarding AI systems’ FDA status and a risk-based monitoring approach. Future efforts will expand testing to ensure AI systems remain equitable and effective across diverse healthcare settings.
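The "regular equity evaluations" called for in the Mass General Brigham guidelines typically amount to comparing a model's performance across demographic subgroups and flagging large gaps for human review. The following is a minimal, hypothetical sketch of such a check; the column names, the AUROC metric, and the 0.05 gap threshold are illustrative assumptions, not part of the published study.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

def equity_report(df: pd.DataFrame, group_col: str = "race_ethnicity",
                  label_col: str = "outcome", score_col: str = "model_score",
                  max_gap: float = 0.05) -> pd.DataFrame:
    """Compare model discrimination (AUROC) across demographic subgroups.

    Flags any subgroup whose AUROC falls more than `max_gap` below the
    best-performing subgroup -- a simple trigger for further review.
    """
    rows = []
    for group, sub in df.groupby(group_col):
        if sub[label_col].nunique() < 2:   # AUROC is undefined without both classes
            continue
        rows.append({"group": group,
                     "n": len(sub),
                     "auroc": roc_auc_score(sub[label_col], sub[score_col])})
    report = pd.DataFrame(rows)
    report["flagged"] = report["auroc"] < report["auroc"].max() - max_gap
    return report

# Example (hypothetical file and columns):
# predictions = pd.read_csv("validation_predictions.csv")
# print(equity_report(predictions))
```

A report like this would feed the risk-based monitoring approach the study describes, with flagged subgroups prompting deeper investigation rather than automatic model changes.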
Medical Judgment
- A recent study at Beth Israel Medical Center revealed that generative AI tools outperformed physicians in diagnosing patients by nearly 20%, achieving around 90% accuracy. This study challenges the “fundamental theorem of informatics,” which posits that human-computer collaboration should surpass either working alone. Despite the potential of genAI in healthcare, there are concerns about biases in AI models, their impact on clinician skills, and patient data privacy. As AI technology advances, the industry must address these issues and ensure that clinicians are adequately trained to use and manage these tools effectively.
- Use of artificial intelligence (AI) in health care quality measurement can enhance the precision and efficiency of performance assessment. However, the use of AI in measurement also raises concerns about biases that could perpetuate disparities and affect vulnerable individuals. There have been recent national discussions that emphasize the need for ethical, transparent, and equitable AI applications in health care quality measurement, such as the US Centers for Medicare & Medicaid Services’ (CMS) information session titled “AI in Quality Measurement” and the Biden-Harris Administration’s Executive Order on AI. Addressing bias is crucial to ensuring that AI tools do not exacerbate existing inequalities, but instead contribute to fair quality assessment and high-quality outcomes.
- Artificial intelligence, particularly large language models like ChatGPT, is increasingly used in healthcare for tasks such as answering patient questions and predicting diseases. A study by Ben-Gurion University researchers evaluated the performance of these models in understanding medical information, revealing that most models, even those trained on medical data, performed poorly, akin to random guessing. ChatGPT-4, however, showed better performance with an average accuracy of about 60%, though still not fully satisfactory. The research involved generating over 800,000 questions to assess model capabilities in distinguishing between medical concepts. The findings emphasize the need for caution in using AI for medical purposes and highlight the importance of developing models with a broader understanding of clinical language.
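The Ben-Gurion study's core method, generating large batches of questions and scoring whether a model picks the correct medical concept, can be approximated with a short evaluation loop. The sketch below assumes a hypothetical `ask_model()` wrapper around whichever LLM is being tested; it is an illustration of the evaluation idea, not the researchers' actual benchmark code.

```python
import random

def make_question(term: str, correct: str, distractors: list[str]) -> dict:
    """Build one multiple-choice item: which concept is most related to `term`?"""
    options = distractors + [correct]
    random.shuffle(options)
    return {"prompt": f"Which of the following is most closely related to '{term}'? "
                      + ", ".join(options),
            "answer": correct}

def evaluate(model_answer_fn, questions: list[dict]) -> float:
    """Return simple accuracy of a model over the generated questions."""
    correct = sum(1 for q in questions
                  if model_answer_fn(q["prompt"]).strip().lower() == q["answer"].lower())
    return correct / len(questions)

# Example (hypothetical data and model wrapper):
# qs = [make_question("myocardial infarction", "troponin",
#                     ["bilirubin", "creatinine", "amylase"])]
# print(evaluate(ask_model, qs))  # ~0.25 on four options would indicate random guessing
```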
Employment
- Job applicants often face the challenge of applying for ghost jobs, which are non-existent positions posted by companies to build talent pools or create an illusion of growth. A 2024 survey found that 81% of recruiters have posted ghost jobs, and 30% of companies have done so this year. This practice raises ethical and data privacy concerns, particularly under the EU’s GDPR and California’s CCPA, which require transparency and proper notice of data collection purposes. Ghost jobs may violate these regulations, leading to potential fines and reputational damage for companies. Applicants can protect themselves by recognizing signs of ghost jobs and understanding their data privacy rights.
- Companies have increasingly adopted AI tools for hiring and employment decisions, raising concerns about bias and mistakes. A California Privacy Protection Agency meeting debated proposed rules to regulate AI in employment, emphasizing worker rights and transparency. Various U.S. states have enacted laws to manage AI in hiring, such as requiring consent for using AI in interviews or mandating audits to ensure non-bias. High-profile cases illustrate the potential for AI to discriminate, echoing past issues with automated credit decisions. Despite potential benefits, there is a call for transparency and governance in AI use, as candidates may avoid opportunities if AI is involved without clear policies.
Data Privacy
- According to a new survey, data privacy remains a top challenge for one-third (33%) of healthcare professionals across the seven major markets when integrating AI into clinical practice.
- GoodRx, a telemedicine platform provider, has agreed to settle a class action lawsuit for $25 million due to its use of tracking technologies that disclosed website visitor data to third parties without user consent. The Federal Trade Commission (FTC) found that GoodRx violated the FTC Act and the Health Breach Notification Rule by sharing sensitive user data without consent, leading to a separate $1.5 million settlement with the FTC. The consolidated lawsuit, Jane Doe et al. v. GoodRx Holdings, Inc., et al., includes claims of privacy invasion and violations of various California and New York laws, with Meta, Google, and Criteo also named as co-defendants. The Court is set to rule on the $25 million settlement, which, if approved, will allow affected individuals to file claims for compensation from the settlement fund. The plaintiffs’ attorneys are seeking $8.33 million, or one-third of the settlement, for fees and costs.
- The Federal Trade Commission (FTC) has taken action against Gravy Analytics Inc. and its subsidiary Venntel Inc. for allegedly violating the FTC Act by collecting, using, and selling sensitive geolocation data without user consent. These companies are accused of unlawfully tracking and selling data related to visits to sensitive locations such as healthcare facilities, places of worship, and schools, potentially exposing consumers to privacy risks and discrimination. The FTC claims they collected over 17 billion signals daily from about a billion mobile devices, using precise geolocation data tied to unique Mobile Advertising IDs (MAIDs), which could identify individuals. The proposed FTC order requires the companies to delete all historical location data unless de-identified, notify third parties to do the same, and establish a program to prevent unauthorized use of sensitive location data. This settlement aims to protect consumer privacy and prevent misuse of sensitive geolocation information.
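The proposed order's "delete unless de-identified" requirement raises the practical question of what de-identifying location data can look like. One common first step is dropping persistent identifiers such as MAIDs and coarsening coordinates and timestamps so individual movement patterns are no longer precise. The sketch below is a hedged illustration of that idea only, with assumed column names; real de-identification requires a documented assessment of re-identification risk, not just this transformation.

```python
import pandas as pd

def coarsen_location_data(df: pd.DataFrame, decimals: int = 2) -> pd.DataFrame:
    """Illustrative de-identification step for ad-tech location records.

    Drops the persistent Mobile Advertising ID and rounds coordinates to
    roughly 1 km precision; timestamps are floored to the hour. Column
    names ("maid", "lat", "lon", "timestamp") are assumptions.
    """
    out = df.copy()
    out = out.drop(columns=["maid"])          # remove the persistent identifier
    out["lat"] = out["lat"].round(decimals)   # ~1.1 km at 2 decimal places
    out["lon"] = out["lon"].round(decimals)
    out["timestamp"] = pd.to_datetime(out["timestamp"]).dt.floor("h")
    return out
```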
HIPAA Penalties
- The U.S. Department of Health and Human Services (HHS), Office for Civil Rights (OCR) imposed a $548,265 civil monetary penalty on Children’s Hospital Colorado for violations of the HIPAA Privacy and Security Rules following breaches reported in 2017 and 2020 due to phishing attacks. The breaches compromised the protected health information (PHI) of 3,370 and 10,840 individuals, respectively, and were partly due to disabled multi-factor authentication and unauthorized email access by third parties. OCR found additional violations for failure to train staff on HIPAA Privacy Rules and conduct a proper risk analysis of electronic PHI (ePHI). In June 2024, Children’s Hospital Colorado waived its right to a hearing, leading OCR to finalize the penalty. OCR recommends that covered entities implement robust cybersecurity measures, including multi-factor authentication, encryption, regular risk analyses, and workforce training to prevent such breaches.
- The U.S. Department of Health and Human Services Office for Civil Rights (OCR) fined Gulf Coast Pain Consultants, LLC, $1.19 million for multiple HIPAA Security Rule violations, including failing to terminate a former contractor’s access to systems containing electronic protected health information (ePHI). The contractor, who had ceased providing services in August 2018, accessed ePHI of approximately 34,310 individuals without authorization and generated around 6,500 false Medicare claims. Gulf Coast Pain Consultants failed to conduct a HIPAA-compliant risk analysis until September 30, 2022, and did not implement necessary policies and procedures for access termination and activity review until April 2020. The penalty is part of OCR’s 14th HIPAA enforcement action in 2024 and highlights the importance of proactive cybersecurity measures. Despite providing evidence of mitigating factors, Gulf Coast Pain Consultants could not reach an informal settlement with OCR.
Quantum Computing
- The convergence of quantum technology and artificial intelligence in precision medicine is set to revolutionize healthcare by enabling highly personalized treatments and advancing drug design, medical imaging, and real-time health monitoring. Second-generation quantum technologies, which integrate quantum and classical computing, offer significant advantages in computing, sensing, and networking, with applications ranging from drug discovery to secure patient data sharing. However, these advancements come with regulatory challenges, as existing frameworks may not adequately address the unique risks associated with quantum devices, necessitating the development of new evaluation protocols, risk management frameworks, and clinical trial guidelines. Policymakers are encouraged to promote quantum literacy, anticipate societal impacts, and implement adaptive regulations to balance innovation with public safety. Ultimately, global collaboration and harmonized standards are essential to harnessing the potential of quantum technology in healthcare responsibly.
Centers for Medicare & Medicaid Services
- The Centers for Medicare & Medicaid Services (CMS) implemented a final rule in October requiring casualty insurers, defined as Responsible Reporting Entities (RREs), to report certain payments to Medicare beneficiaries or face Civil Money Penalties (CMPs). The rule focuses on “Non-Group Health Plans” (NGHPs), including liability insurers, no-fault carriers, and workers’ compensation plans, and emphasizes reporting timeliness while excluding penalties for reporting quality. This change follows the U.S. Supreme Court’s June 2024 decision to overturn the Chevron doctrine, adding scrutiny to CMS’s long-standing NGHP User Guide. Insurers face complexities in identifying the correct RRE and correctly reporting “total payment obligation to the claimant” (TPOC) settlements, with risks of overreporting in complex settlements involving multiple insurers. CMS’s guidance and the Supreme Court decision highlight the need for insurers to carefully assess their reporting obligations to avoid penalties.
- On November 1, 2024, the Centers for Medicare & Medicaid Services (CMS) released the CY 2025 Hospital Outpatient Prospective Payment System (OPPS) and Ambulatory Surgery Center (ASC) Payment System final rule, which includes a 2.9% increase in Medicare OPPS payments for 2025. This increase results from a 3.4% projected hospital market basket percentage increase, reduced by a 0.5% multifactor productivity reduction mandated by the ACA. The rule has been criticized by the American Hospital Association (AHA) for not adequately addressing the financial challenges faced by hospitals, particularly in rural and underserved areas. CMS has approved 21 new medical and dental procedures for the ASC covered procedures list for 2025, resulting in a projected $308 million increase in ASC payments, bringing the total to approximately $7.4 billion. The rule aligns with the Biden-Harris Administration’s goals to address health disparities and improve transparency, but concerns remain about the impact of rising labor and supply costs on healthcare delivery.
Emerging Technology
- The convergence of quantum technology and artificial intelligence in precision medicine is set to revolutionize healthcare by enabling highly personalized treatments and advancing drug design, medical imaging, and real-time health monitoring. Second-generation quantum technologies, which integrate quantum and classical computing, offer significant advantages in computing, sensing, and networking, with applications ranging from drug discovery to secure patient data sharing. However, these advancements come with regulatory challenges, as existing frameworks may not adequately address the unique risks associated with quantum devices, necessitating the development of new evaluation protocols, risk management frameworks, and clinical trial guidelines. Policymakers are encouraged to promote quantum literacy, anticipate societal impacts, and implement adaptive regulations to balance innovation with public safety. Ultimately, global collaboration and harmonized standards are essential to harnessing the potential of quantum technology in healthcare responsibly.
Fraud & Abuse
- Dr. Basem Hamid, a 52-year-old neurologist from Pearland, Texas, has agreed to pay $948,359.85 to settle allegations of submitting false Medicare claims. The claims involved billing for the surgical implantation of neurostimulator electrodes between August 27, 2019, and October 3, 2022. However, it is alleged that neither Dr. Hamid nor his staff performed these surgeries. Instead, patients received electro-acupuncture devices that were non-invasive and applied in his clinic, not in a surgical setting. Many patients reported that the devices, which were taped behind the ear, often fell off within a few days.
- The U.S. Department of Justice (DOJ) has historically focused on combating fraud against federally funded healthcare programs like Medicare, Medicaid, and TRICARE by encouraging whistleblowers to file lawsuits under the False Claims Act. Recently, the DOJ launched the Corporate Whistleblower Awards Pilot Program, a three-year initiative aimed at incentivizing reports of corporate crime, including private healthcare fraud, with potential monetary rewards for whistleblowers. This program expands the DOJ’s focus to include fraud involving private insurers and healthcare benefit programs outside the scope of the False Claims Act. The DOJ’s updated Evaluation of Corporate Compliance Program guidance emphasizes the importance of confidential reporting structures to protect whistleblowers and urges healthcare providers to enhance compliance programs to address both public and private healthcare fraud. These developments signal an increased scrutiny of corporate healthcare practices and the need for robust compliance systems.
- Edelmira Marquez, the 59-year-old owner of Marquez Medical Supply in El Paso, was sentenced to five years in federal prison and ordered to pay over $1.7 million in restitution for a health care fraud scheme involving adult diapers. Marquez pleaded guilty to conspiracy to commit health care fraud by billing Medicaid and Medicare for more expensive items while providing lower-value products. The fraud, which began as early as 2010, was uncovered by an investigation led by the Texas Attorney General Medicaid Fraud Control Unit and the FBI. In addition to the prison sentence, Marquez was fined $20,000 and admitted full responsibility for her actions. Born in Chihuahua, Mexico, and a naturalized U.S. citizen since 2008, Marquez had no prior criminal record and cooperated with investigators.
Mental Health and Substance Use
Pharmacy Benefit Managers
Private Equity
Artificial Intelligence
- Ensuring AI models provide faithful and reliable explanations is challenging, particularly in high-stakes fields like healthcare and finance, as current interpretability paradigms—intrinsic and post-hoc—fall short. Intrinsic models, though inherently interpretable, often lack general applicability and competitive performance, while post-hoc methods, although flexible, frequently produce explanations that do not align with the model’s logic. To address these issues, three new paradigms have been introduced: Learn-to-Faithfully-Explain, Faithfulness-Measurable Models, and Self-Explaining Models, which aim to enhance faithfulness and interpretability without sacrificing performance. These approaches are tested on synthetic and real-world datasets, showing significant improvements, such as a 15% increase in faithfulness metrics, while maintaining high accuracy. The new frameworks promise to bridge the gap between interpretability and performance, making AI systems more transparent and reliable for various applications.
- Explainable AI (XAI) is crucial for building trust by making AI decisions understandable, particularly in healthcare where transparency is essential for diagnostic and treatment recommendations. Autonomous and agentic AI systems enhance decision-making and patient care by automating processes, such as monitoring and treatment adjustments, while Edge AI enables real-time processing and improves data privacy by handling information locally. AI also augments the healthcare workforce by assisting with data analysis and diagnostics, allowing humans to focus on tasks requiring emotional intelligence and critical thinking. As AI reshapes job roles, it is essential for healthcare organizations to adapt and leverage these technologies effectively.
- Nearly half of Americans with health insurance receive unexpected medical bills due to systemic issues in healthcare billing, costing $210 billion annually and adding $68 billion in unnecessary expenses. Errors often stem from data entry mistakes, outdated coding practices, and duplicate billing, which AI and machine learning technologies aim to address by reducing errors and improving efficiency. AI-powered systems enhance claims processing by detecting errors in real-time, improving reimbursement rates, and reducing patient distress from rejected claims. Natural Language Processing (NLP) optimizes clinical documentation and revenue management, while AI also improves diagnostic accuracy by identifying conditions like ischemic strokes and hypertrophic cardiomyopathy early. However, human oversight is crucial to ensure AI’s responsible use, maintaining patient care standards and allowing healthcare professionals to focus on direct patient interactions.
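Many of the billing errors described above, duplicate submissions in particular, can be caught with straightforward checks before any machine learning is involved. The sketch below flags exact same-day repeat billings of the same procedure code for the same patient and provider; the field names are illustrative assumptions, not a standard claims schema.

```python
import pandas as pd

def flag_duplicate_claims(claims: pd.DataFrame) -> pd.DataFrame:
    """Flag likely duplicate billings for human review.

    A claim is flagged when the same patient, provider, procedure code,
    and service date appear more than once -- the classic duplicate-billing
    error. Column names are assumptions for this sketch.
    """
    key = ["patient_id", "provider_id", "cpt_code", "service_date"]
    claims = claims.copy()
    claims["duplicate_suspect"] = claims.duplicated(subset=key, keep=False)
    return claims[claims["duplicate_suspect"]].sort_values(key)

# Example:
# claims = pd.read_csv("claims_export.csv", parse_dates=["service_date"])
# review_queue = flag_duplicate_claims(claims)
```

AI-based approaches extend this idea to fuzzier errors (miscoded services, mismatched documentation), but the human-review step the article emphasizes applies equally to both.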
Bias & Equity
Cybersecurity
- The HHS Office of Inspector General (OIG) report criticized the Office for Civil Rights (OCR) for its narrow HIPAA audit program, which assessed only eight out of 180 requirements, failing to adequately improve cybersecurity at healthcare organizations. The audits did not evaluate physical or technical safeguards, leaving potential vulnerabilities unaddressed. The OIG recommended expanding the audit scope, enforcing corrective measures, and establishing evaluation metrics, but the OCR cited budget constraints and a lack of resources as barriers to implementing these changes. From fiscal years 2018 to 2020, the OCR’s budget remained at $38 million, while complaints and data breach reports increased, and investigative staff numbers decreased by 30% since 2010. Despite agreeing with most recommendations, the OCR disagreed with requiring corrective measures, emphasizing that HIPAA allows for civil penalties instead, and audits are intended to offer technical assistance.
- The continued success of telehealth hinges on its accessibility, but challenges remain, such as digital inequalities and the need for inclusive design for diverse populations. Security is a critical concern as telehealth platforms handle sensitive patient data, necessitating robust measures like encryption, multi-factor authentication, and compliance with privacy laws. The inherent tension between accessibility and security requires a balance to prevent vulnerabilities without deterring patients from using these services. Emerging technologies like AI and blockchain may enhance both security and accessibility, but a collective effort from healthcare providers, developers, policymakers, and patients is essential to ensure telehealth remains safe and inclusive.
Data Privacy
- Four U.S. healthcare organizations, HealthFund Solutions, Option Care Health, Liberty Endo, and Numotion, experienced unauthorized access to employee email accounts. The breaches exposed protected health information of thousands of individuals, including names, addresses, Social Security numbers, medical information, and financial details. The organizations are offering credit monitoring and identity theft protection services to affected individuals.
Elderly & Aging
- Older adults increasingly require more clinical care and social services, which places a significant burden on an already strained healthcare system. The integration of data analytics in senior care can enhance patient-centered care by enabling predictive analytics for proactive health interventions and personalized treatment plans tailored to individual needs. This approach improves health outcomes and optimizes resource allocation, ensuring efficient use of staff and financial resources. The future of senior care is data-driven, with advancements in artificial intelligence and real-time health monitoring promising further improvements in care delivery. However, challenges such as ensuring data privacy and training staff to use these technologies effectively must be addressed.
Emerging Technologies
- Technology is revolutionizing healthcare by enhancing diagnostics, patient care, and operational efficiency through innovations such as AI-driven diagnostics, wearable health devices, telemedicine, and robotic surgeries. These advancements improve accessibility and accuracy, with AI improving diagnostic precision and telemedicine expanding care to remote areas. Wearable technology empowers patients by tracking vital signs and supporting chronic disease management, while electronic health records streamline data management for continuity of care. However, challenges like data privacy, security, and accessibility persist, requiring solutions to ensure equitable healthcare access. Overall, technology is creating a future where healthcare is more efficient, personalized, and accessible.
Fraud & Abuse
- Attorney General Ken Paxton’s Medicaid Fraud Control Unit was instrumental in a significant federal prosecution involving nine pharmaceutical distributor executives and sales representatives who unlawfully distributed nearly 70 million opioid pills and over 30 million doses of other prescription drugs, valued at over $1.3 billion. These drugs were illegally sold to Houston-area pill-mill pharmacies. The investigation resulted in nine defendants pleading guilty.
- Dr. Rajesh Bindal, a 53-year-old from Sugar Land, has agreed to pay $2,095,946 to settle allegations of submitting false claims for electro-acupuncture device placements. Bindal, through Texas Spine & Neurosurgery Center P.A., billed Medicare and the Federal Employees Health Benefits Program for surgical neurostimulator electrode implantation between March 16, 2021, and April 22, 2022. However, instead of performing surgeries, his clinic allegedly inserted monofilament wires into patients’ ears and taped the devices behind the ear, which were then falsely billed as surgeries. These procedures were performed in his clinic without making any incisions, and many devices reportedly fell off within days. The U.S. Attorney and law enforcement officials emphasized the importance of accurate billing to maintain public trust and the integrity of federal health care programs.
- Federal agents detained former hospital CEO Ralph de la Torre and seized his phone as part of an escalating federal corruption and fraud investigation into the bankrupt hospital chain Steward, which he formerly led. De la Torre, held in criminal contempt of Congress in September, and Armin Ernst, who also had his phone seized, are central figures in a corruption case in Malta involving alleged bribery of government officials. A Maltese magistrate has recommended charges against them for money laundering and corruption. Domestically, Steward executives are accused of misusing the Steward-owned malpractice insurer TRACO, resulting in significant financial discrepancies, with $99 million in outstanding loans and $176 million in accounts receivable owed by Steward.
HIPAA
- Healthcare providers must comply with the new HIPAA Reproductive Health Rule by December 23, 2024, which restricts the disclosure of reproductive healthcare information (RPHI) if the care was legal in the state it was provided and the information is sought for investigative or prosecutorial purposes. The Rule faces legal challenges, notably from Texas, and its future is uncertain, especially with potential changes in federal administration. Providers must assess the legality of reproductive care and the purpose of RPHI requests, requiring attestations from requesters to ensure compliance. They must update HIPAA policies, train employees, and potentially use a model attestation from the Office for Civil Rights. Additionally, providers must update their Notice of Privacy Practices by February 16, 2026, to reflect these changes and other proposed HIPAA modifications.
- The U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) took its first enforcement action against Holy Redeemer Family Medicine for improperly disclosing a patient’s reproductive health information to a prospective employer without authorization. The disclosure included sensitive health details beyond what the patient had consented to share, violating HIPAA regulations. Holy Redeemer agreed to a settlement, paying a $35,581 penalty and adopting a corrective action plan, which includes revising privacy policies, training staff, and monitoring compliance for two years. OCR emphasized the importance of protecting patient privacy, particularly concerning reproductive health, to maintain trust in the patient-doctor relationship. Additionally, OCR’s Final Rule to enhance privacy protections for reproductive health information will take effect on December 23, 2024.
- On June 25, 2024, the Office for Civil Rights and the U.S. Department of Health and Human Services issued the HIPAA Privacy Rule to enhance privacy protections for Protected Health Information (PHI) related to reproductive healthcare. This rule prohibits healthcare entities and their associates from using or disclosing PHI for criminal, civil, or administrative investigations related to seeking or providing reproductive healthcare. Compliance with this rule is required by December 23, 2024, and updates to Notices of Privacy Practices (NPP) must be completed by February 16, 2026. The rule mandates obtaining a written attestation ensuring PHI is not used for prohibited purposes before any use or disclosure. Additionally, the rule requires updates to NPPs to reflect these protections and changes, with varying responsibilities for group health plans based on their insurance status.
Medicare Expansion
Mental Health & Substance Use
- On September 23, 2024, the Departments of Labor, Treasury, and Health and Human Services issued a final rule under the Mental Health Parity and Addiction Equity Act (MHPAEA), which requires insurers and group health plan sponsors to conduct a comparative analysis of nonquantitative treatment limitations (NQTLs) for mental health and substance use disorder benefits. Effective for plan years starting on or after January 1, 2025, the rule mandates that ERISA plan fiduciaries certify a prudent process in selecting and monitoring service providers for this analysis. The comparative analysis requirement, in effect since February 10, 2021, applies to most group health plans and includes various NQTLs like prior authorization and network design. Plan sponsors must ensure compliance by arranging for these analyses and updating contracts with insurers and vendors. Enforcement includes audits and penalties for non-compliance, with ERISA participants entitled to request the analysis within 30 days.
OIG
- The Office of Inspector General (OIG) issued Advisory Opinion No. 24-09 in response to a request from a municipal corporation about a proposal to charge insurance for treatment-in-place (TIP) emergency medical services without ambulance transport, while waiving patient cost-sharing amounts. The OIG assessed whether this proposal would violate the Federal anti-kickback statute or the Beneficiary Inducements Civil Monetary Penalty (CMP) provisions. Although the arrangement could potentially generate prohibited remuneration under these statutes, the OIG concluded that it would not impose administrative sanctions due to the low risk of fraud and abuse associated with the proposal.
- On November 20, 2024, the Office of Inspector General (OIG) released new compliance guidelines for nursing facilities, which is the first industry-specific guidance since the 2023 General Compliance Program Guidance. The guidance emphasizes best practices for nursing facilities, covering topics such as quality of care, Medicare and Medicaid billing requirements, and the federal Anti-Kickback Statute. Additionally, an OIG report published on November 12, 2024, found that Medicare overpaid acute-care hospitals an estimated $190 million over five years for outpatient services to hospice enrollees, and the OIG recommended improvements to prevent future overpayments.
- The HHS Office of Inspector General (OIG) report criticized the Office for Civil Rights (OCR) for its narrow HIPAA audit program, which assessed only eight out of 180 requirements, failing to adequately improve cybersecurity at healthcare organizations. The audits did not evaluate physical or technical safeguards, leaving potential vulnerabilities unaddressed. The OIG recommended expanding the audit scope, enforcing corrective measures, and establishing evaluation metrics, but the OCR cited budget constraints and a lack of resources as barriers to implementing these changes. From fiscal years 2018 to 2020, the OCR’s budget remained at $38 million, while complaints and data breach reports increased, and investigative staff numbers decreased by 30% since 2010. Despite agreeing with most recommendations, the OCR disagreed with requiring corrective measures, emphasizing that HIPAA allows for civil penalties instead, and audits are intended to offer technical assistance.
Cybersecurity
- The Office of Inspector General (OIG) has once again found the U.S. Department of Health and Human Services’ (HHS) information security program to be ineffective, as detailed in their report. The OIG’s annual audit, required by the Federal Information Security Modernization Act of 2014, revealed that HHS failed to meet maturity in all five functional areas of the NIST framework: Identify, Protect, Detect, Respond, and Recover. The OIG made six recommendations to improve HHS’s information security, including updating system inventories and implementing a comprehensive cybersecurity risk management strategy. Despite these recommendations, HHS only concurred with five, disagreeing on the need to fully implement a new cybersecurity risk management strategy. This audit exemplifies ongoing challenges faced by federal agencies in meeting FISMA requirements, with HHS struggling to address security flaws, particularly in its cloud systems.
- The ransomware landscape has become more distributed, with a rise in small-scale groups and a decrease in activity from previously dominant groups like LockBit and ALPHV. Poorly secured and outdated VPNs remain a primary initial access vector for ransomware groups, highlighting the critical importance of robust security measures like multi-factor authentication.
- Telehealth programs are increasingly targeted by cybercriminals due to their rapid expansion and critical role in patient care. Healthcare organizations can protect sensitive health data by conducting risk assessments of telehealth providers, implementing isolated network access points, and continuously monitoring security measures. Hospitals and health systems should integrate telehealth provider security into their overall strategy, ensuring compliance with industry standards like HIPAA and HITRUST.
- The IEEE Standards Association has published IEEE 2933, a new healthcare cybersecurity standard addressing vulnerabilities in connected medical devices. IEEE 2933, developed with input from global experts, focuses on six essential elements of medical device security, including trust, identity, privacy, protection, safety, and security. By adopting IEEE 2933, the healthcare industry can take a proactive stance in safeguarding patient safety and system integrity.
Data Privacy
- Elon Musk has been criticized for encouraging users of X, the platform he owns, to upload medical images to its AI tool, Grok, raising concerns about privacy and accuracy issues. Musk claims Grok is in early stages but already quite accurate, though results have been mixed, with some users reporting accurate diagnoses and others experiencing errors. Critics highlight the absence of HIPAA protections on X and ethical concerns about sharing sensitive health data on social media. The New York Times and experts like Bradley Malin emphasize the risks involved, including potential misuse of data and public trust issues. The debate underscores the need for regulation in AI-driven healthcare to prevent misuse and ensure safety.
- The U.S. Department of Health and Human Services, Office for Civil Rights (OCR) has announced a new enforcement initiative called the Risk Analysis Initiative, aimed at ensuring compliance with the HIPAA Security Rule Risk Analysis provision. This initiative is part of OCR’s broader efforts, including its seventh enforcement action related to ransomware, to address deficiencies in how organizations assess risks to electronic protected health information (ePHI). With a reported 264% increase in large breaches involving ransomware since 2018, the initiative emphasizes the need for healthcare entities to evaluate their cybersecurity measures and resource allocation. OCR’s focus is on enhancing the identification and remediation of threats to ePHI, a critical aspect of HIPAA compliance. This initiative follows OCR’s previous enforcement strategy, the Right of Access Initiative, suggesting a continued rigorous approach to ensuring compliance.
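A Security Rule risk analysis of the kind the Risk Analysis Initiative targets is, at its core, an inventory of threats to ePHI scored by likelihood and impact and tracked to remediation. The sketch below shows one common, simplified likelihood-times-impact scoring approach; the example entries, scales, and tier thresholds are assumptions for illustration, not OCR requirements.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    asset: str        # system or data store holding ePHI
    threat: str       # e.g. "credential phishing leads to mailbox compromise"
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

    @property
    def tier(self) -> str:
        if self.score >= 15:
            return "high"
        if self.score >= 8:
            return "medium"
        return "low"

register = [
    Risk("Email (cloud tenant)", "credential phishing without MFA", 4, 4),
    Risk("Legacy VPN appliance", "exploitation of unpatched firmware", 3, 5),
    Risk("EHR database", "ransomware encryption of ePHI", 2, 5),
]

for r in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{r.tier.upper():6} {r.score:>2}  {r.asset}: {r.threat}")
```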
Artificial Intelligence
- In a randomized clinical trial published in JAMA Network Open, it was found that the use of a large language model (LLM) did not significantly enhance diagnostic reasoning performance among physicians compared to conventional resources. The study involved 50 physicians and showed that while the LLM alone outperformed both groups of physicians, its integration with physicians did not improve diagnostic reasoning. The trial highlighted the need for further development in human-computer interactions to effectively integrate LLMs into clinical practice. Despite the LLM’s potential, the study suggests that simply providing access to LLMs is insufficient to improve diagnostic reasoning in practice.
- Public Citizen experts are urging the U.S. Food and Drug Administration (FDA) to address the risks posed by AI in healthcare, which could worsen existing issues and threaten patient safety. Dr. Robert Steinbrook, Health Research Group Director, testified before the FDA’s Digital Health Advisory Committee, emphasizing the need for stringent regulations to prevent harm from rapidly developed AI devices. A report by Eagan Kemp highlights the growing use of AI in administrative tasks, medical practices, and mental health support, warning that without safeguards, AI could lead to inequitable care and exacerbate disparities. Public Citizen has recommended regulatory measures to the Department of Health and Human Services, expressing concern that the incoming Trump administration may prioritize innovation over regulation, potentially compromising patient safety.
- AI tools, particularly GenAI, are being used to enhance healthcare by detecting health threats and unauthorized access to patient data, but they must be accurate and secure to be effective. The article warns that AI can also be exploited by cybercriminals to harm healthcare systems through social engineering and other malicious activities. It emphasizes the need for healthcare organizations to establish robust AI policies and risk management strategies to mitigate these threats. Finally, the article advises thorough testing of AI tools to ensure they do not compromise patient data or violate legal requirements.
- Microsoft and major institutions like Yale, Harvard, and the University of Michigan are advancing AI initiatives, yet the technology’s adoption may be outpacing regulatory and oversight capabilities. The FDA currently approves AI tools as devices, which undergo a different and sometimes less rigorous approval process than drugs, raising concerns about their real-world efficacy and safety. The article emphasizes the need for transparency, stronger regulations, and a public database to track AI performance and ensure accountability. It also calls for increased resources for the FDA and suggests that patients and healthcare professionals should stay informed and engaged to promote responsible AI use in medicine.
- Academic Medical Centers (AMCs) are uniquely positioned to accelerate the translation of research into clinical care, particularly through the use of artificial intelligence (AI). AMCs can leverage AI to improve patient care, especially in resource-constrained settings, and create efficiencies for providers and research organizations. Despite challenges, the potential rewards of AI implementation are significant.
Dr. Death Redux
Emerging Tech
- In a randomized clinical trial published in JAMA Network Open, it was found that the use of a large language model (LLM) did not significantly enhance diagnostic reasoning performance among physicians compared to conventional resources. The study involved 50 physicians and showed that while the LLM alone outperformed both groups of physicians, its integration with physicians did not improve diagnostic reasoning. The trial highlighted the need for further development in human-computer interactions to effectively integrate LLMs into clinical practice. Despite the LLM’s potential, the study suggests that simply providing access to LLMs is insufficient to improve diagnostic reasoning in practice.
Fraud & Abuse
- A federal jury in Dallas has found Healthcare Associates of Texas, LLC (HCAT) liable for fraudulent Medicare billing practices, potentially resulting in $304 million in penalties under the False Claims Act. The case was brought by a whistleblower, a former HCAT executive, who alleged that HCAT submitted 21,844 false claims to Medicare between 2015 and 2021. The fraudulent schemes included billing for services by uncredentialed providers, splitting bills to obscure provider identities, and charging inflated rates. The jury found HCAT, its founding physicians, former CEO, and ex-Chief Compliance Officer liable for these actions. The whistleblower will receive a percentage of any recovered funds, though the specific award has not been disclosed.
- A Texas laboratory owner was charged with healthcare fraud for submitting over $79 million in fraudulent claims to Medicare and Medicaid for unnecessary respiratory pathogen panel tests. The owner allegedly used a physician’s identity without consent and falsely represented that the lab used reference laboratories to conduct the tests. The government seized over $15 million in cash and is investigating the case with multiple agencies.
Gender-Affirming Care
Health Policy
- Robert F. Kennedy Jr. has expressed strong opposition to the FDA’s current practices, which he believes suppress public health advancements by limiting access to non-patentable treatments and products. He criticizes the pharmaceutical industry’s reliance on patents, suggesting that many FDA actions are designed to protect this business model. Kennedy advocates for allowing the manufacture and distribution of medical products without FDA approval, permitting broader marketing claims, and opposing regulations on raw milk and certain food additives. His stance suggests a push for more lenient FDA policies regarding unapproved medical uses and claims for “clean foods.” President-elect Trump has indicated he will nominate Kennedy as secretary of health and human services, potentially giving him oversight of the FDA.
- The Food and Drug Administration (FDA) has made it a federal requirement to inform patients about their breast density when they receive mammograms, following a policy initially enacted in Texas. Higher breast density, characterized by more glandular tissue, can obscure cancer detection on mammograms since both appear white. Patients with dense breasts are advised to undergo more comprehensive exams, such as ultrasounds or MRIs, for better cancer detection. This federal mandate follows the 2012 Texas rule, known as Henda’s Law, which was adopted by 18 other states before becoming a nationwide requirement. Breast cancer remains the most common cancer among women in Texas, with over 21,000 diagnoses expected this year and nearly 3,500 estimated deaths.
Patient Confidentiality
- The U.S. Department of Health and Human Services, Office for Civil Rights (OCR) has announced a new enforcement initiative called the Risk Analysis Initiative, aimed at ensuring compliance with the HIPAA Security Rule Risk Analysis provision. This initiative is part of OCR’s broader efforts, including its seventh enforcement action related to ransomware, to address deficiencies in how organizations assess risks to electronic protected health information (ePHI). With a reported 264% increase in large breaches involving ransomware since 2018, the initiative emphasizes the need for healthcare entities to evaluate their cybersecurity measures and resource allocation. OCR’s focus is on enhancing the identification and remediation of threats to ePHI, a critical aspect of HIPAA compliance. This initiative follows OCR’s previous enforcement strategy, the Right of Access Initiative, suggesting a continued rigorous approach to ensuring compliance.
- Elon Musk has been criticized for encouraging users of X, the platform he owns, to upload medical images to its AI tool, Grok, raising concerns about privacy and accuracy issues. Musk claims Grok is in early stages but already quite accurate, though results have been mixed, with some users reporting accurate diagnoses and others experiencing errors. Critics highlight the absence of HIPAA protections on X and ethical concerns about sharing sensitive health data on social media. The New York Times and experts like Bradley Malin emphasize the risks involved, including potential misuse of data and public trust issues. The debate underscores the need for regulation in AI-driven healthcare to prevent misuse and ensure safety.
Risk Management
- The Office of Inspector General (OIG) has released updated Industry-Specific Compliance Program Guidance (ICPG) for nursing facilities. The 2024 ICPG shifts the focus from fraud prevention to quality of care and resident safety, reflecting the interconnectedness of care quality and compliance. Nursing facilities are encouraged to review their practices, identify gaps, and implement changes to align with the new framework.
- Overpayments pose significant risks to healthcare providers, leading to financial losses and compliance issues. Statistical Sampling and Overpayment Estimation (SSOE) is a method that uses a small, representative sample of claims to estimate overpayments across a larger pool, offering a cost-effective alternative to reviewing every claim. The SSOE process involves sampling claims, identifying overpayments, and extrapolating results to provide a reliable picture of financial impact. Key data fields for accurate overpayment estimation include claim details, provider and patient information, service codes, and overpayment indicators. SSOE not only helps in compliance and reducing financial risks but also provides insights into improving billing processes and addressing financial leakage.
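The SSOE workflow described above (sample, audit, extrapolate) maps onto standard survey-sampling arithmetic. The sketch below estimates a total overpayment from a simple random sample of audited claims and reports a conservative lower confidence bound; it is a minimal illustration under assumed inputs, not a CMS- or OIG-endorsed methodology.

```python
import math
import statistics

def estimate_overpayment(sample_overpayments: list[float],
                         population_size: int,
                         z: float = 1.645) -> dict:
    """Extrapolate sampled overpayments to a claim population.

    `sample_overpayments` holds the audited overpayment for each sampled
    claim (0.0 when the claim was paid correctly). Projects a point
    estimate and a one-sided lower bound from the sample mean.
    """
    n = len(sample_overpayments)
    mean = statistics.mean(sample_overpayments)
    sd = statistics.stdev(sample_overpayments)
    point_estimate = mean * population_size
    # Standard error of the estimated total (finite-population correction omitted)
    se_total = population_size * sd / math.sqrt(n)
    lower_bound = point_estimate - z * se_total
    return {"point_estimate": round(point_estimate, 2),
            "lower_bound_90pct": round(max(lower_bound, 0.0), 2)}

# Example: 200 audited claims drawn at random from 12,000 paid claims
# print(estimate_overpayment(audited_amounts, population_size=12_000))
```

Quoting a lower confidence bound rather than the point estimate is a common way to keep the extrapolated figure defensible when only a small fraction of claims has been reviewed.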
Taxation
- The Fifth Circuit denied tax-exempt status to the Memorial Hermann Accountable Care Organization (MHACO), a healthcare nonprofit, under Section 501(c)(4) of the Internal Revenue Code, citing substantial nonexempt purposes. This decision extends the “substantial nonexempt purpose” test, traditionally applied to 501(c)(3) entities, to 501(c)(4) organizations, potentially affecting other nonprofits with similar structures. The court found that MHACO’s activities primarily benefited private healthcare providers and commercial insurers, rather than promoting social welfare, as required for tax exemption. The ruling could impact nonprofits with private membership or financial benefit structures, possibly affecting their operations and governance. Additionally, the decision may influence politically active nonprofits by curbing activities such as political spending.
Telehealth
- The Drug Enforcement Administration (DEA) and the Department of Health and Human Services (HHS) have issued a third extension of telemedicine flexibilities for prescribing controlled substances, now set to expire on December 31, 2025, allowing practitioners to prescribe Schedule II-V substances without an in-person evaluation under certain conditions. This extension serves as a transitional measure from the emergency provisions of the COVID-19 pandemic to a more permanent regulatory framework, reflecting ongoing efforts to modernize healthcare delivery while ensuring patient safety. The extension maintains continuity of care, particularly for patients with chronic conditions or substance use disorders, and provides time for stakeholders to adapt to future regulations. The DEA and HHS are considering public feedback, including 38,369 comments, to develop a comprehensive set of permanent rules that balance access to care with safeguards against misuse.
- The DEA and HHS have extended COVID-era tele-prescribing flexibilities for controlled substances through December 31, 2025. This extension provides more time to finalize permanent tele-prescribing rules that balance public health, access, and diversion risks. The agencies aim for a smooth transition for patients and practitioners who have come to rely on telemedicine for controlled substance prescriptions.
Artificial Intelligence
- The Department of Homeland Security (DHS) has released a voluntary framework outlining AI responsibilities for critical infrastructure sectors. The framework, developed with industry input, addresses risks like AI-based attacks and design failures, emphasizing practical implementation for safety and security. While the future of the framework under a potential Trump administration remains uncertain, DHS highlights its own AI pilot projects and integration efforts.
- The Global Privacy Assembly’s Joint Statement emphasizes that data scraping, even of publicly accessible data, must comply with privacy laws, including obtaining consent when necessary. Relying solely on platform terms for data scraping does not guarantee compliance with data protection and AI laws.
- President-elect Trump’s return to the White House may impact the artificial intelligence (AI) industry. Trump’s alliance with tech billionaire Elon Musk and his pledge to repeal President Biden’s AI executive order suggest a focus on private sector-driven innovation and competition over regulation.
- According to a recent cybersecurity report, healthcare organizations block 17.2% of AI transactions, ranking third behind the finance and insurance sector and the technology sector. This is slightly below the national average of 18.5%, indicating a lag in healthcare’s efforts to secure sensitive data against AI threats. Despite being the sixth-largest user of AI and machine learning, the healthcare sector’s AI adoption is expected to grow. The most popular AI applications in healthcare include ChatGPT, Drift, OpenAI, Writer, and Intercom. Healthcare organizations are actively engaging in AI safety initiatives, with some developing in-house AI platforms, while addressing concerns about data privacy, security, and the reliability and bias of AI algorithms in patient care.
- Pieces Technologies, a Dallas-based company, uses AI to streamline physician documentation, saving time on tasks like patient summaries, discharge notes, and progress notes. The platform, now in use at hospitals nationwide, has expanded to include a mobile version and is exploring outpatient applications. Despite facing scrutiny over AI accuracy, Pieces continues to innovate and secure funding for its patient-facing technology.
- California enacted two new laws governing the use of AI in healthcare. One law requires health plans using AI in utilization review to disclose its use and ensure determinations are based on clinical information. The other law mandates providers using AI in patient communications to obtain consent and follow specific protocols.
Cybersecurity
- The Office for Civil Rights (OCR) fined Providence Medical Institute (PMI) $240,000 for HIPAA violations, including lacking a Business Associate Agreement (BAA) and insufficient access controls. The fine reflects a 20% discount for PMI’s recognized security practices, but OCR has not publicly disclosed which practices qualify or how the discount was calculated. This marks the first time OCR has publicly acknowledged applying the discount, which was mandated by the Cybersecurity Act of 2015.
- New federal legislation called the Health Infrastructure Security and Accountability Act (HISAA) has been introduced to address cybersecurity risks in the healthcare sector, particularly for HIPAA Covered Entities and Business Associates. The Act is a response to high-profile cybersecurity incidents, such as the Change Healthcare breach, and aims to establish new security requirements, conduct annual risk assessments, and enforce audits to improve cybersecurity practices. HISAA proposes tiered penalties for noncompliance, removes statutory caps on penalties, and introduces fees to cover oversight costs. It also includes financial incentives and disincentives related to Medicare payments to encourage the adoption of enhanced cybersecurity measures. The act represents a significant shift in cybersecurity enforcement and oversight compared to existing HIPAA regulations.
Data Privacy
- The FTC published an explainer on the use of Data Clean Rooms (DCRs), cloud services that enable data exchange and analysis between companies. While DCRs can offer privacy protections when configured correctly, they are not inherently privacy-preserving and can be used to obfuscate privacy harms. Companies should not rely on DCRs to avoid legal obligations regarding data privacy and should be held accountable for any violations, regardless of the technology used.
- The healthcare industry is increasingly targeted by ransomware attacks, with notable incidents such as the Change Healthcare breach affecting nearly 100 million individuals. Healthcare organizations face complex decisions regarding whether to pay ransoms, balancing the need to minimize business disruption and protect sensitive data against the risks of legal liability, increased future targeting, and ethical concerns. Paying a ransom does not eliminate legal obligations to report breaches, and it may expose organizations to penalties if payments are made to sanctioned entities. The healthcare sector’s critical services and sensitive data make it a prime target, necessitating robust cybersecurity measures and comprehensive incident response strategies. Organizations must carefully evaluate their legal and strategic options to effectively manage ransomware risks.
- Texas is emerging as a significant player in privacy regulation following the implementation of the Texas Privacy and Data Security Act (TPDSA) in July 2024 and the Texas Securing Children Online through Parental Empowerment (SCOPE) Act in September 2024. Texas Attorney General Ken Paxton has initiated a privacy and security enforcement initiative, establishing a dedicated team within the Consumer Protection Division to enforce these laws. Notable actions include a lawsuit against TikTok for allegedly violating the SCOPE Act by sharing minors’ personal information without parental consent, and a settlement with Meta under the Texas biometric law for unauthorized data capture. Additionally, over 100 companies were notified for failing to register as data brokers, and car manufacturers are under investigation for data collection practices. Businesses processing Texans’ personal information should ensure compliance with the TPDSA and other relevant privacy laws to avoid enforcement actions.
Behavioral Health
- Behavioral health is a rapidly growing area of the healthcare sector, but it faces significant operational and financial challenges as companies scale and investor interest increases. Behavioral health organizations need to adopt innovative strategies to improve operations and financial performance, often requiring external expertise to navigate these complexities. Industry experts highlight the importance of effective management, strategic planning, and maintaining a focus on patient care amid financial pressures such as rising costs and debt. They emphasize the need for organizations to communicate their mission clearly, engage employees, and ensure consistent quality of care, and they advise investors to assess management’s ability to respond to data, maintain a positive organizational culture, and manage financial metrics effectively.
Equity & Equality
- The Department of Health and Human Services (HHS) Office of Civil Rights (OCR) published final rules implementing anti-discrimination provisions under Section 1557 of the Affordable Care Act. These rules apply to health programs receiving federal financial assistance and prohibit discrimination based on race, color, national origin, sex, age, or disability. Covered entities must meet several upcoming deadlines, including appointing a Section 1557 Coordinator by November 2, 2024, and ensuring non-discriminatory use of patient care decision support tools by May 1, 2025. By July 5, 2025, entities must post notices of language assistance services and adopt comprehensive policies and procedures to ensure compliance. The rules also require training for relevant employees on these policies within specific timeframes.
Ransomware
- The healthcare industry is increasingly targeted by ransomware attacks, with notable incidents such as the Change Healthcare breach affecting nearly 100 million individuals. Healthcare organizations face complex decisions regarding whether to pay ransoms, balancing the need to minimize business disruption and protect sensitive data against the risks of legal liability, increased future targeting, and ethical concerns. Paying a ransom does not eliminate legal obligations to report breaches, and it may expose organizations to penalties if payments are made to sanctioned entities. The healthcare sector’s critical services and sensitive data make it a prime target, necessitating robust cybersecurity measures and comprehensive incident response strategies. Organizations must carefully evaluate their legal and strategic options to effectively manage ransomware risks.
- Ransomware attacks, while slightly less frequent in H1 2024, saw a 68% increase in severity, with average losses reaching a record high. Businesses with over $100 million in revenue experienced the most significant impact, with a 140% increase in losses. Business email compromise (BEC) attacks remained the most common cause of claims, while ransomware attacks were the third most common, with exposed login panels and outdated technologies increasing the likelihood of a claim.
- A new report reveals a four-year high in ransomware attacks on healthcare organizations, with 67% reporting incidents in the past year. These attacks are increasingly complex, with longer recovery times and higher costs, averaging $2.57 million in 2024. Attackers are also targeting data backups, increasing pressure on organizations to pay ransoms.
Fraud & Abuse
- Horizon Medical Center of Denton, owned by Corinth Investor Holdings, L.L.C., paid $14.2 million to settle potential violations of Medicare regulations and the Stark Law. The center self-disclosed that it had omitted a required modifier and the service location on claims for services provided at off-campus facilities, and that it had financial relationships with its physician-owners. This settlement, along with two others, highlights the Department of Justice’s emphasis on voluntary self-disclosure and cooperation in healthcare fraud cases.
- A pharmaceutical ingredient supplier will pay $21.75 million to settle allegations of inflating Average Wholesale Prices (AWPs) for two key ingredients. A pharmacist whistleblower exposed the scheme, highlighting the critical role of whistleblowers in combating pharmaceutical fraud. The False Claims Act empowers individuals to report fraud and protects public funds from fraudulent activities.
- A Texas optometrist agreed to pay $1 million to settle allegations of healthcare fraud. The doctor operated a network of optometry practices in Central Texas that, according to the government, submitted claims to TRICARE, Medicare, and Medicaid using the National Provider Identifiers (NPIs) of optometrists who did not perform the services billed, allegedly “in circumstances where the optometrist who rendered services was not credentialed or enrolled in the Federal healthcare program billed.”
- Oak Street Health, a CVS subsidiary, agreed to a $60 million settlement for violating the False Claims Act. The company allegedly paid kickbacks to insurance agents to recruit seniors to their clinics, resulting in false claims to Medicare. The settlement includes restitution and a whistleblower reward.
Physician Fee Schedule
- The Centers for Medicare & Medicaid Services (CMS) finalized the 2025 Medicare Physician Fee Schedule, resulting in a 2.93% reduction in average payment rates. This decision has been met with strong opposition from national provider associations, who argue that the cuts, coupled with inflation, threaten the financial viability of physician practices and patient access to care. These associations urge Congress to intervene and stabilize reimbursement rates.
- The Biden administration finalized 2025 Medicare reimbursement rates, with physicians facing a 2.9% decrease and hospitals receiving a 2.9% increase for outpatient services. While hospitals argue the rates are insufficient, physician groups, particularly those operating independent practices, face more significant challenges due to rising costs and smaller profit margins. CMS also implemented changes to the Hospital Outpatient Prospective Payment System, including maternal health and safety standards and continuous coverage requirements for children in safety-net programs.
- CMS finalized a 2.83% physician pay cut for 2025 while increasing reimbursement for ASCs meeting quality reporting requirements. The rule includes updates to coding and payment policies for various services, as well as changes to the ASC quality reporting program.
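As a rough illustration of how a conversion-factor cut flows through to an individual payment, the sketch below applies the finalized 2.83% reduction to an assumed prior-year conversion factor. The dollar amount and RVU total are illustrative assumptions rather than CMS figures, and geographic adjustments are ignored.

```python
# Illustrative arithmetic only: the prior-year conversion factor and RVU total
# below are assumptions, not CMS figures, and geographic adjustments are ignored.

PRIOR_CONVERSION_FACTOR = 33.29  # assumed prior-year dollars per RVU (illustrative)
CUT = 0.0283                     # 2.83% reduction finalized for 2025

new_conversion_factor = PRIOR_CONVERSION_FACTOR * (1 - CUT)

def payment(total_rvus: float, conversion_factor: float) -> float:
    """Simplified Medicare payment: total relative value units x conversion factor."""
    return total_rvus * conversion_factor

rvus = 2.0  # assumed RVU total for a single mid-level office visit
old = payment(rvus, PRIOR_CONVERSION_FACTOR)
new = payment(rvus, new_conversion_factor)
print(f"New conversion factor: ${new_conversion_factor:.2f} per RVU")
print(f"Difference for this service: ${old - new:.2f} less per visit")
```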
Skilled Nursing Facilities