AI Implementation
- A recent report finds that AI in healthcare settings is used primarily for administrative tasks, with clinical applications still in early adoption. Most medical facilities have been using AI for at least 10 months, and respondents expect AI to play a larger role in reviewing electronic health records and enhancing patient care. A significant knowledge gap exists: only 24% of respondents received AI training from their employers, while 72% report concerns about data privacy and 70% about ethical issues. Currently, AI is used to transcribe patient notes and business meetings, draft routine communications, and analyze medical images. Looking ahead, over 40% of respondents expect AI to assist more in clinical applications and physician training.
- The Department of Justice (DOJ) is scrutinizing the use of AI in healthcare, updating its compliance guidance to ensure companies mitigate risks associated with AI misuse. Healthcare companies must ensure their compliance programs are equipped to handle AI-related risks, with adequate resources, risk assessments, and training to prevent misuse and ensure accountability. These developments highlight significant legal challenges and regulatory scrutiny in the healthcare sector, underscoring the need for vigilant compliance and close monitoring of ongoing litigation.
- The Texas Attorney General has opened investigations into 15 companies, including Character.AI, Reddit, Instagram, and Discord, over their privacy and safety practices for minors under the SCOPE Act and the TDPSA. The SCOPE Act prohibits sharing minors’ personal information without parental consent and requires companies to provide parental control tools, while the TDPSA imposes strict notice and consent requirements for collecting minors’ personal data; both sets of protections extend to AI products. The investigations are part of Texas’s broader data privacy enforcement initiative, following a recent lawsuit against TikTok, a $1.4 billion settlement with Meta over facial recognition data misuse, and a lawsuit against General Motors for illegal driver surveillance and data sharing with insurance companies. Texas is emerging as a leader in data privacy enforcement, and these investigations represent a significant step toward ensuring technology companies comply with state laws protecting children from exploitation and harm.
Data Privacy
- Healthcare data represents 30% of global data, with 97% currently unused, though this is changing through AI and improved accessibility. Real-world data (RWD) and real-world evidence (RWE) are becoming crucial, with 90% of life sciences executives leveraging RWE for decision-making, while AI algorithms have improved clinical trial matching by over 40% and recruitment by 1,800%. Patient-centric healthcare organizations are achieving twice the revenue growth of those with lower satisfaction scores, with AI-powered clinical decision support systems saving approximately $1,000 per patient encounter. Supply chain challenges remain significant: 80% of healthcare providers expect issues to persist or worsen, and half of suppliers lost more than 2.5% of revenue to shortages between February 2023 and February 2024, though predictive AI systems can now identify product shortages with over 90% accuracy.
- Even public-facing healthcare websites can pose significant privacy risks through seemingly innocuous features such as contact forms, appointment requests, and symptom checkers. Unauthenticated pages can inadvertently capture Protected Health Information (PHI) through web forms, tracking technologies, cookies, and web beacons, which may collect user data including IP addresses and browsing history. To maintain HIPAA compliance, healthcare organizations must implement safeguards including data encryption, secure storage, explicit consent mechanisms, and careful evaluation of third-party tracking technologies. Organizations should consider minimizing PHI collection on public pages by offering general inquiry options instead of detailed health information forms, while maintaining clear privacy notices and readily accessible contact information for privacy-related concerns. Protecting PHI requires ongoing vigilance and consistent safeguards, as even basic data points can constitute protected health information when linked to an individual’s healthcare activities.
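One practical first step in evaluating third-party tracking is simply inventorying which outside hosts a public page loads resources from. The sketch below (a minimal illustration using only the Python standard library; the domains shown are hypothetical, and a real audit would also cover dynamically injected tags, cookies, and CSP headers) flags scripts, iframes, and image pixels served from hosts outside a first-party allowlist:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyAuditor(HTMLParser):
    """Collect external script, iframe, and image (pixel/beacon) sources."""
    def __init__(self, first_party_domains):
        super().__init__()
        self.first_party = set(first_party_domains)
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src")
        if not src:
            return  # inline script or no source
        host = urlparse(src).netloc
        # Relative URLs (empty host) resolve to the page's own origin.
        if host and host not in self.first_party:
            self.findings.append((tag, host))

def audit_page(html, first_party_domains):
    auditor = ThirdPartyAuditor(first_party_domains)
    auditor.feed(html)
    return auditor.findings

# Hypothetical page markup for illustration only.
page = """
<html><body>
  <script src="https://www.example-hospital.org/js/app.js"></script>
  <script src="https://cdn.analytics-vendor.example/tag.js"></script>
  <img src="https://pixel.tracker.example/p.gif" width="1" height="1">
</body></html>
"""

for tag, host in audit_page(page, {"www.example-hospital.org"}):
    print(f"third-party {tag}: {host}")
```

Each flagged host can then be checked against the organization's business associate agreements and consent disclosures before the tag is allowed to remain on the page.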
Data Breaches
- The healthcare sector experienced unprecedented data breaches in 2024, with 168 million individuals affected across all reported incidents and 137 million from the top 10 breaches alone. Change Healthcare suffered the largest breach, affecting 100 million individuals, after a BlackCat/ALPHV ransomware attack exploited a Citrix portal that lacked multi-factor authentication (MFA), resulting in a $22 million ransom payment and widespread healthcare disruptions. Kaiser Foundation Health Plan (13.4M affected), HealthEquity (4.3M), and Concentra Health Services (4M) rounded out the top breaches, with most incidents involving hacking or IT incidents, particularly targeting third-party vendors. Nine of the ten largest breaches were attributed to hacking/IT incidents, and five originated from HIPAA business associates’ network servers. The breaches highlighted ongoing cybersecurity challenges in healthcare, including ransomware threats, third-party risk management gaps, and the need for enhanced security measures such as MFA implementation.
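The Change Healthcare breach turned on a remote-access portal without a second authentication factor. As one illustration of what a basic second factor involves, here is a minimal sketch of time-based one-time passwords (TOTP, RFC 6238) using only the Python standard library; production systems should rely on a vetted MFA library and hardened secret storage rather than hand-rolled code:

```python
import base64, hmac, struct, time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32, candidate, window=1, step=30):
    """Accept codes from adjacent time steps to tolerate clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now + i * step), candidate)
        for i in range(-window, window + 1)
    )
```

The one-step verification window is a common trade-off: it tolerates small clock drift between server and authenticator without materially widening the guessing surface.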
Cybersecurity
- The Office of Inspector General (OIG) has called for enhancements to the HIPAA audit program, citing increasing cyberattacks on healthcare organizations along with the narrow scope and ineffective oversight of the audits the Office for Civil Rights (OCR) conducted in 2016-2017. In response, OCR plans to resume HIPAA audits by late 2024 or early 2025, with an expanded focus on physical and technical safeguards and the development of criteria for compliance reviews. While OCR agreed with most of OIG’s recommendations, it did not concur with the recommendation to ensure deficiencies are corrected, citing limits on its legal authority and resources. OCR also intends to define metrics for monitoring audit effectiveness and will survey past audit participants to track compliance improvements. The enforcement process for potential HIPAA violations involves reviewing complaints, investigating breaches, and, where warranted, referring criminal violations to the Department of Justice.
- A recent report analyzing cyberattacks on cyber-physical systems (CPS) found significant financial impacts, with 27% of organizations experiencing losses of $1 million or more. Key contributors to these losses included lost revenue (39%), recovery costs (35%), and employee overtime (33%). Ransomware was a major factor, with 53% of respondents paying over $500,000 to regain access to encrypted systems, a problem particularly severe in the healthcare sector. Operational impacts were also significant: 33% of organizations experienced a full day or more of downtime, and 49% took a week or more to recover. Despite these challenges, 56% of organizations reported increased confidence in their CPS’s ability to withstand cyberattacks, and 72% expect security improvements in the coming year.
Geo-Location Data
- Fog Data Science, a location-tracking company, provides services to police departments by using geolocation data from smartphones to identify places visited by suspects, including sensitive locations like doctors’ offices. The company uses a “Project Intake Form” that asks police to provide locations and personal details about suspects to refine their search in a database of geolocation data collected from mobile apps. This practice has raised privacy concerns, particularly regarding the potential use of location tracking to prosecute abortions. An investigation by the Electronic Frontier Foundation revealed that Fog Data Science purchases extensive geolocation data from data brokers, which is then sold to law enforcement agencies. The Federal Trade Commission has taken action against one of Fog’s data sources, Venntel, for selling sensitive location data without user consent.
Enforcement
- Recent enforcement actions by state and federal law enforcement, including the Texas AG settlement with Pieces Technologies and the FTC’s Operation AI Comply, signal increased scrutiny of AI products under existing consumer protection laws. These actions highlight the importance of clear and conspicuous disclosures regarding AI accuracy, efficacy, and potential risks, as well as the need for substantiation of claims about AI bias and functionality. Companies using AI should be aware of the legal risks and take proactive steps to ensure compliance with applicable laws and regulations.