Categories
Health Law Highlights

Wade’s Healthcare Privacy Advisor for December 11, 2024

AI Governance

  • At the HLTH health innovation conference, a panel of AI experts expressed skepticism about appointing a chief AI officer in health organizations, advocating instead for improving AI literacy across the board. Some providers have established an AI oversight committee and an AI Enablement Center to democratize AI governance and ensure responsible integration of AI technologies. The widespread use of AI in radiology for diagnostic support and the growing adoption of ambient AI scribes have significantly reduced administrative burdens for physicians. The use of AI in administrative tasks, such as drafting patient communications, has shown positive results, with patients reportedly preferring AI-generated responses for their empathetic tone. Nevertheless, it is important to maintain a human element in AI applications, ensuring that AI supports rather than replaces clinical decision-making.
  • The FDA has issued final guidance on regulating changes to AI-enabled medical devices through predetermined change control plans (PCCPs), allowing for post-market modifications while maintaining safety and effectiveness. PCCPs, first introduced in 2019, enable performance enhancements by outlining specific, verifiable modifications and include a description of planned changes, a modification protocol, and an impact assessment. The guidance, consistent with a 2023 draft, now includes a section on version control and maintenance. While no adaptive AI-enabled devices have been authorized yet, PCCPs have been approved for devices through various regulatory pathways. Modifications under a PCCP must stay within the device’s intended use, and significant changes, such as altering a device’s user base or core functionalities, require new marketing submissions.
  • The rapid evolution of AI in healthcare presents challenges for physicians and legal compliance, with shifting regulations and emerging laws at both federal and state levels. A federal rule effective July 2024 requires healthcare providers to comply with anti-discrimination regulations by May 2025, while various state bills focus on transparency, bias elimination, and AI limitations. Organizations like HIMSS and the AMA provide guidance on AI implementation, emphasizing human oversight and ethical considerations to enhance patient care and reduce costs. Legal risks associated with AI, such as data privacy, potential bias, and the unlicensed practice of medicine, necessitate legal expertise for healthcare providers. Despite these challenges, AI has the potential to generate actionable insights and improve healthcare operations, provided it is used responsibly and with appropriate legal guidance.
  • A recent study published in npj Digital Medicine outlines comprehensive guidelines for the responsible integration of AI into healthcare, developed by a team from Harvard Medical School and the Mass General Brigham AI Governance Committee. The study emphasizes nine principles, including fairness, robustness, and accountability, and highlights the need for diverse training datasets and regular equity evaluations to reduce bias. A pilot study and shadow deployment were conducted to assess AI systems, focusing on privacy, security, and usability in clinical workflows. The study also stresses the importance of transparent communication regarding AI systems’ FDA status and a risk-based monitoring approach. Future efforts will expand testing to ensure AI systems remain equitable and effective across diverse healthcare settings.

Medical Judgment

Employment

Data Privacy

HIPAA Penalties

Quantum Computing


Wade’s Health Law Highlights for December 10, 2024

Centers for Medicare & Medicaid Services

Emerging Technology

Fraud & Abuse

HIPAA Penalties

Mental Health and Substance Use

Pharmacy Benefit Managers

Private Equity


Wade’s Healthcare Privacy Advisor for December 4, 2024

Artificial Intelligence

Bias & Equity

Cybersecurity

  • The HHS Office of Inspector General (OIG) report criticized the Office for Civil Rights (OCR) for its narrow HIPAA audit program, which assessed only eight out of 180 requirements, failing to adequately improve cybersecurity at healthcare organizations. The audits did not evaluate physical or technical safeguards, leaving potential vulnerabilities unaddressed. The OIG recommended expanding the audit scope, enforcing corrective measures, and establishing evaluation metrics, but the OCR cited budget constraints and a lack of resources as barriers to implementing these changes. From fiscal years 2018 to 2020, the OCR’s budget remained at $38 million, while complaints and data breach reports increased, and investigative staff numbers decreased by 30% since 2010. Despite agreeing with most recommendations, the OCR disagreed with requiring corrective measures, emphasizing that HIPAA allows for civil penalties instead, and audits are intended to offer technical assistance.
  • The continued success of telehealth hinges on its accessibility, but challenges remain, such as digital inequalities and the need for inclusive design for diverse populations. Security is a critical concern as telehealth platforms handle sensitive patient data, necessitating robust measures like encryption, multi-factor authentication, and compliance with privacy laws. The inherent tension between accessibility and security requires a balance to prevent vulnerabilities without deterring patients from using these services. Emerging technologies like AI and blockchain may enhance both security and accessibility, but a collective effort from healthcare providers, developers, policymakers, and patients is essential to ensure telehealth remains safe and inclusive.

Data Privacy


Wade’s Health Law Highlights for December 3, 2024

Elderly & Aging

  • Older adults increasingly require more clinical care and social services, which places a significant burden on an already strained healthcare system. The integration of data analytics in senior care can enhance patient-centered care by enabling predictive analytics for proactive health interventions and personalized treatment plans tailored to individual needs. This approach improves health outcomes and optimizes resource allocation, ensuring efficient use of staff and financial resources. The future of senior care is data-driven, with advancements in artificial intelligence and real-time health monitoring promising further improvements in care delivery. However, challenges such as ensuring data privacy and training staff to use these technologies effectively must be addressed.

Emerging Technologies

Fraud & Abuse

HIPAA

Medicare Expansion

Mental Health & Substance Use

OIG

  • The Office of Inspector General (OIG) issued Advisory Opinion No. 24-09 in response to a request from a municipal corporation about a proposal to charge insurance for treatment-in-place (TIP) emergency medical services without ambulance transport, while waiving patient cost-sharing amounts. The OIG assessed whether this proposal would violate the Federal anti-kickback statute or the Beneficiary Inducements Civil Monetary Penalty (CMP) provisions. Although the arrangement could potentially generate prohibited remuneration under these statutes, the OIG concluded that it would not impose administrative sanctions due to the low risk of fraud and abuse associated with the proposal.
  • On November 20, 2024, the Office of Inspector General (OIG) released new compliance guidance for nursing facilities, the first industry-specific guidance since the 2023 General Compliance Program Guidance. The guidance emphasizes best practices for nursing facilities, covering topics such as quality of care, Medicare and Medicaid billing requirements, and the federal Anti-Kickback Statute. Additionally, an OIG report published on November 12, 2024, found that Medicare overpaid acute-care hospitals an estimated $190 million over five years for outpatient services to hospice enrollees, and the OIG recommended improvements to prevent future overpayments.
  • The HHS Office of Inspector General (OIG) report criticized the Office for Civil Rights (OCR) for its narrow HIPAA audit program, which assessed only eight out of 180 requirements, failing to adequately improve cybersecurity at healthcare organizations. The audits did not evaluate physical or technical safeguards, leaving potential vulnerabilities unaddressed. The OIG recommended expanding the audit scope, enforcing corrective measures, and establishing evaluation metrics, but the OCR cited budget constraints and a lack of resources as barriers to implementing these changes. From fiscal years 2018 to 2020, the OCR’s budget remained at $38 million, while complaints and data breach reports increased, and investigative staff numbers decreased by 30% since 2010. Despite agreeing with most recommendations, the OCR disagreed with requiring corrective measures, emphasizing that HIPAA allows for civil penalties instead, and audits are intended to offer technical assistance.

Wade’s Healthcare Privacy Advisor for November 27, 2024

Cybersecurity

  • The Office of Inspector General (OIG) has once again found the U.S. Department of Health and Human Services’ (HHS) information security program to be ineffective, as detailed in their report. The OIG’s annual audit, required by the Federal Information Security Modernization Act of 2014, revealed that HHS failed to meet maturity in all five functional areas of the NIST framework: Identify, Protect, Detect, Respond, and Recover. The OIG made six recommendations to improve HHS’s information security, including updating system inventories and implementing a comprehensive cybersecurity risk management strategy. Despite these recommendations, HHS only concurred with five, disagreeing on the need to fully implement a new cybersecurity risk management strategy. This audit exemplifies ongoing challenges faced by federal agencies in meeting FISMA requirements, with HHS struggling to address security flaws, particularly in its cloud systems.
  • The ransomware landscape has become more distributed, with a rise in small-scale groups and a decrease in activity from previously dominant groups like LockBit and ALPHV. Poorly secured and outdated VPNs remain a primary initial access vector for ransomware groups, highlighting the critical importance of robust security measures like multi-factor authentication.
  • Telehealth programs are increasingly targeted by cybercriminals due to their rapid expansion and critical role in patient care. Healthcare organizations can protect sensitive health data by conducting risk assessments of telehealth providers, implementing isolated network access points, and continuously monitoring security measures. Hospitals and health systems should integrate telehealth provider security into their overall strategy, ensuring compliance with industry standards like HIPAA and HITRUST.
  • The IEEE Standards Association has published IEEE 2933, a new healthcare cybersecurity standard addressing vulnerabilities in connected medical devices. IEEE 2933, developed with input from global experts, focuses on six essential elements of medical device security: trust, identity, privacy, protection, safety, and security. By adopting IEEE 2933, the healthcare industry can take a proactive stance in safeguarding patient safety and system integrity.

Data Privacy

  • Elon Musk has been criticized for encouraging users of X, the platform he owns, to upload medical images to its AI tool, Grok, raising concerns about privacy and accuracy issues. Musk claims Grok is in early stages but already quite accurate, though results have been mixed, with some users reporting accurate diagnoses and others experiencing errors. Critics highlight the absence of HIPAA protections on X and ethical concerns about sharing sensitive health data on social media. The New York Times and experts like Bradley Malin emphasize the risks involved, including potential misuse of data and public trust issues. The debate underscores the need for regulation in AI-driven healthcare to prevent misuse and ensure safety.
  • The U.S. Department of Health and Human Services, Office for Civil Rights (OCR) has announced a new enforcement initiative called the Risk Analysis Initiative, aimed at ensuring compliance with the HIPAA Security Rule Risk Analysis provision. This initiative is part of OCR’s broader efforts, including its seventh enforcement action related to ransomware, to address deficiencies in how organizations assess risks to electronic protected health information (ePHI). With a reported 264% increase in large breaches involving ransomware since 2018, the initiative emphasizes the need for healthcare entities to evaluate their cybersecurity measures and resource allocation. OCR’s focus is on enhancing the identification and remediation of threats to ePHI, a critical aspect of HIPAA compliance. This initiative follows OCR’s previous enforcement strategy, the Right of Access Initiative, suggesting a continued rigorous approach to ensuring compliance.

Artificial Intelligence

  • In a randomized clinical trial published in JAMA Network Open, it was found that the use of a large language model (LLM) did not significantly enhance diagnostic reasoning performance among physicians compared to conventional resources. The study involved 50 physicians and showed that while the LLM alone outperformed both groups of physicians, its integration with physicians did not improve diagnostic reasoning. The trial highlighted the need for further development in human-computer interactions to effectively integrate LLMs into clinical practice. Despite the LLM’s potential, the study suggests that simply providing access to LLMs is insufficient to improve diagnostic reasoning in practice.
  • Public Citizen experts are urging the U.S. Food and Drug Administration (FDA) to address the risks posed by AI in healthcare, which could worsen existing issues and threaten patient safety. Dr. Robert Steinbrook, Health Research Group Director, testified before the FDA’s Digital Health Advisory Committee, emphasizing the need for stringent regulations to prevent harm from rapidly developed AI devices. A report by Eagan Kemp highlights the growing use of AI in administrative tasks, medical practices, and mental health support, warning that without safeguards, AI could lead to inequitable care and exacerbate disparities. Public Citizen has recommended regulatory measures to the Department of Health and Human Services, expressing concern that the incoming Trump administration may prioritize innovation over regulation, potentially compromising patient safety.
  • AI tools, particularly GenAI, are being used to enhance healthcare by detecting health threats and unauthorized access to patient data, but they must be accurate and secure to be effective. The article warns that AI can also be exploited by cybercriminals to harm healthcare systems through social engineering and other malicious activities. It emphasizes the need for healthcare organizations to establish robust AI policies and risk management strategies to mitigate these threats. Finally, the article advises thorough testing of AI tools to ensure they do not compromise patient data or violate legal requirements.
  • Microsoft and major institutions like Yale, Harvard, and the University of Michigan are advancing AI initiatives, yet the technology’s adoption may be outpacing regulatory and oversight capabilities. The FDA currently approves AI tools as devices, which undergo a different and sometimes less rigorous approval process than drugs, raising concerns about their real-world efficacy and safety. The article emphasizes the need for transparency, stronger regulations, and a public database to track AI performance and ensure accountability. It also calls for increased resources for the FDA and suggests that patients and healthcare professionals should stay informed and engaged to promote responsible AI use in medicine.
  • Academic Medical Centers (AMCs) are uniquely positioned to accelerate the translation of research into clinical care, particularly through the use of artificial intelligence (AI). AMCs can leverage AI to improve patient care, especially in resource-constrained settings, and create efficiencies for providers and research organizations. Despite challenges, the potential rewards of AI implementation are significant.

Wade’s Health Law Highlights for November 26, 2024

Dr. Death Redux

Emerging Tech

  • In a randomized clinical trial published in JAMA Network Open, it was found that the use of a large language model (LLM) did not significantly enhance diagnostic reasoning performance among physicians compared to conventional resources. The study involved 50 physicians and showed that while the LLM alone outperformed both groups of physicians, its integration with physicians did not improve diagnostic reasoning. The trial highlighted the need for further development in human-computer interactions to effectively integrate LLMs into clinical practice. Despite the LLM’s potential, the study suggests that simply providing access to LLMs is insufficient to improve diagnostic reasoning in practice.

Fraud & Abuse

Gender-Affirming Care

Health Policy

  • Robert F. Kennedy Jr. has expressed strong opposition to the FDA’s current practices, which he believes suppress public health advancements by limiting access to non-patentable treatments and products. He criticizes the pharmaceutical industry’s reliance on patents, suggesting that many FDA actions are designed to protect this business model. Kennedy advocates for allowing the manufacture and distribution of medical products without FDA approval, permitting broader marketing claims, and opposing regulations on raw milk and certain food additives. His stance suggests a push for more lenient FDA policies regarding unapproved medical uses and claims for “clean foods.” President-elect Trump has indicated he will nominate Kennedy as secretary of health and human services, potentially giving him oversight of the FDA.
  • The Food and Drug Administration (FDA) has made it a federal requirement to inform patients about their breast density when they receive mammograms, following a policy initially enacted in Texas. Higher breast density, characterized by more glandular tissue, can obscure cancer detection on mammograms since both appear white. Patients with dense breasts are advised to undergo more comprehensive exams, such as ultrasounds or MRIs, for better cancer detection. This federal mandate follows the 2012 Texas rule, known as Henda’s Law, which was adopted by 18 other states before becoming a nationwide requirement. Breast cancer remains the most common cancer among women in Texas, with over 21,000 diagnoses expected this year and nearly 3,500 estimated deaths.

Patient Confidentiality

  • The U.S. Department of Health and Human Services, Office for Civil Rights (OCR) has announced a new enforcement initiative called the Risk Analysis Initiative, aimed at ensuring compliance with the HIPAA Security Rule Risk Analysis provision. This initiative is part of OCR’s broader efforts, including its seventh enforcement action related to ransomware, to address deficiencies in how organizations assess risks to electronic protected health information (ePHI). With a reported 264% increase in large breaches involving ransomware since 2018, the initiative emphasizes the need for healthcare entities to evaluate their cybersecurity measures and resource allocation. OCR’s focus is on enhancing the identification and remediation of threats to ePHI, a critical aspect of HIPAA compliance. This initiative follows OCR’s previous enforcement strategy, the Right of Access Initiative, suggesting a continued rigorous approach to ensuring compliance.
  • Elon Musk has been criticized for encouraging users of X, the platform he owns, to upload medical images to its AI tool, Grok, raising concerns about privacy and accuracy issues. Musk claims Grok is in early stages but already quite accurate, though results have been mixed, with some users reporting accurate diagnoses and others experiencing errors. Critics highlight the absence of HIPAA protections on X and ethical concerns about sharing sensitive health data on social media. The New York Times and experts like Bradley Malin emphasize the risks involved, including potential misuse of data and public trust issues. The debate underscores the need for regulation in AI-driven healthcare to prevent misuse and ensure safety.

Risk Management

  • The Office of Inspector General (OIG) has released updated Industry-Specific Compliance Program Guidance (ICPG) for nursing facilities. The 2024 ICPG shifts the focus from fraud prevention to quality of care and resident safety, reflecting the interconnectedness of care quality and compliance. Nursing facilities are encouraged to review their practices, identify gaps, and implement changes to align with the new framework.
  • Overpayments pose significant risks to healthcare providers, leading to financial losses and compliance issues. Statistical Sampling and Overpayment Estimation (SSOE) is a method that uses a small, representative sample of claims to estimate overpayments across a larger pool, offering a cost-effective alternative to reviewing every claim. The SSOE process involves sampling claims, identifying overpayments, and extrapolating results to provide a reliable picture of financial impact. Key data fields for accurate overpayment estimation include claim details, provider and patient information, service codes, and overpayment indicators. SSOE not only helps in compliance and reducing financial risks but also provides insights into improving billing processes and addressing financial leakage.
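The extrapolation step described above (sample claims, identify overpayments, project to the full claim universe) can be sketched roughly as follows. This is a simplified illustration, not the method used by any particular auditor; the claim counts, dollar amounts, and the 90% one-sided lower bound are hypothetical assumptions chosen for the example.

```python
import random
import statistics

def estimate_overpayment(universe_size, sample_overpayments, z=1.645):
    """Extrapolate a total overpayment from a random sample of audited claims.

    universe_size: total number of claims in the review period
    sample_overpayments: overpayment found on each sampled claim (0.0 if none)
    z: z-score for a one-sided lower confidence bound
       (1.645 corresponds to roughly 90% confidence)
    """
    n = len(sample_overpayments)
    mean = statistics.mean(sample_overpayments)
    stdev = statistics.stdev(sample_overpayments)
    # Point estimate: mean overpayment per claim times the claim universe
    point_estimate = mean * universe_size
    # Standard error of the estimated total (finite-population
    # correction omitted for simplicity)
    se_total = universe_size * stdev / n ** 0.5
    lower_bound = max(point_estimate - z * se_total, 0.0)
    return point_estimate, lower_bound

# Hypothetical audit: 10,000 claims in the universe, 100 sampled,
# overpayments of $50-$400 found on roughly a quarter of sampled claims.
random.seed(42)
sample = [round(random.uniform(50, 400), 2) if random.random() < 0.25 else 0.0
          for _ in range(100)]
point, lower = estimate_overpayment(10_000, sample)
print(f"Point estimate of total overpayment: ${point:,.2f}")
print(f"One-sided 90% lower bound:           ${lower:,.2f}")
```

Demanding repayment of the lower bound rather than the point estimate is one conservative convention: it gives statistical assurance that the extrapolated figure does not overstate the true overpayment.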

Taxation

  • The Fifth Circuit denied tax-exempt status to the Memorial Hermann Accountable Care Organization (MHACO), a healthcare nonprofit, under Section 501(c)(4) of the Internal Revenue Code, citing substantial nonexempt purposes. This decision extends the “substantial nonexempt purpose” test, traditionally applied to 501(c)(3) entities, to 501(c)(4) organizations, potentially affecting other nonprofits with similar structures. The court found that MHACO’s activities primarily benefited private healthcare providers and commercial insurers, rather than promoting social welfare, as required for tax exemption. The ruling could impact nonprofits with private membership or financial benefit structures, possibly affecting their operations and governance. Additionally, the decision may influence politically active nonprofits by curbing activities such as political spending.

Telehealth


Wade’s Healthcare Privacy Advisor for November 20, 2024

Artificial Intelligence

Cybersecurity

Data Privacy

  • The FTC published an explainer on the use of Data Clean Rooms (DCRs), cloud services that enable data exchange and analysis between companies. While DCRs can offer privacy protections when configured correctly, they are not inherently privacy-preserving and can be used to obfuscate privacy harms. Companies should not rely on DCRs to avoid legal obligations regarding data privacy and should be held accountable for any violations, regardless of the technology used.
  • The healthcare industry is increasingly targeted by ransomware attacks, with notable incidents such as the Change Healthcare breach affecting nearly 100 million individuals. Healthcare organizations face complex decisions regarding whether to pay ransoms, balancing the need to minimize business disruption and protect sensitive data against the risks of legal liability, increased future targeting, and ethical concerns. Paying a ransom does not eliminate legal obligations to report breaches, and it may expose organizations to penalties if payments are made to sanctioned entities. The healthcare sector’s critical services and sensitive data make it a prime target, necessitating robust cybersecurity measures and comprehensive incident response strategies. Organizations must carefully evaluate their legal and strategic options to effectively manage ransomware risks.
  • Texas is emerging as a significant player in privacy regulation following the implementation of the Texas Privacy and Data Security Act (TPDSA) in July 2024 and the Texas Securing Children Online through Parental Empowerment (SCOPE) Act in September 2024. Texas Attorney General Ken Paxton has initiated a privacy and security enforcement initiative, establishing a dedicated team within the Consumer Protection Division to enforce these laws. Notable actions include a lawsuit against TikTok for allegedly violating the SCOPE Act by sharing minors’ personal information without parental consent, and a settlement with Meta under the Texas biometric law for unauthorized data capture. Additionally, over 100 companies were notified for failing to register as data brokers, and car manufacturers are under investigation for data collection practices. Businesses processing Texans’ personal information should ensure compliance with the TPDSA and other relevant privacy laws to avoid enforcement actions.

Wade’s Health Law Highlights for November 19, 2024

Behavioral Health

  • Behavioral health is a rapidly growing area in the healthcare sector, but it faces significant operational and financial challenges as companies scale and investor interest increases. Behavioral health organizations need to adopt innovative strategies to improve operations and financial performance, often requiring external expertise to navigate these complexities. Industry experts highlight the importance of effective management, strategic planning, and maintaining a focus on patient care amidst financial pressures, such as rising costs and debt. They emphasize the need for organizations to communicate their mission clearly, engage employees, and ensure consistent quality of care, and they advise investors to assess management’s ability to respond to data, maintain a positive organizational culture, and manage financial metrics effectively.

Drug & Device

Equity & Equality

Fraud & Abuse

Intellectual Property

Mergers & Acquisitions

No Surprises Act

Ransomware

  • The healthcare industry is increasingly targeted by ransomware attacks, with notable incidents such as the Change Healthcare breach affecting nearly 100 million individuals. Healthcare organizations face complex decisions regarding whether to pay ransoms, balancing the need to minimize business disruption and protect sensitive data against the risks of legal liability, increased future targeting, and ethical concerns. Paying a ransom does not eliminate legal obligations to report breaches, and it may expose organizations to penalties if payments are made to sanctioned entities. The healthcare sector’s critical services and sensitive data make it a prime target, necessitating robust cybersecurity measures and comprehensive incident response strategies. Organizations must carefully evaluate their legal and strategic options to effectively manage ransomware risks.

Reproductive Rights

Telehealth


Wade’s Healthcare Privacy Advisor for November 11, 2024

Blockchain

HIPAA & Cybersecurity

Ransomware

Regulation

Tech and ACOs


Wade’s Health Law Highlights for November 11, 2024

Fraud & Abuse

HIPAA & Cybersecurity

Hospice

Insulin Overpricing

Loper Bright

Med Spas

No Surprises Act

Physician Fee Schedule

Ransomware

Skilled Nursing Facilities