Artificial Intelligence
- The Department of Homeland Security (DHS) has released a voluntary framework outlining AI responsibilities for critical infrastructure sectors. The framework, developed with industry input, addresses risks like AI-based attacks and design failures, emphasizing practical implementation for safety and security. While the future of the framework under a potential Trump administration remains uncertain, DHS highlights its own AI pilot projects and integration efforts.
- The Global Privacy Assembly’s Joint Statement emphasizes that data scraping, even of publicly accessible data, must comply with privacy laws, including obtaining consent when necessary. Relying solely on platform terms for data scraping does not guarantee compliance with data protection and AI laws.
- President-elect Trump’s return to the White House may impact the artificial intelligence (AI) industry. Trump’s alliance with tech billionaire Elon Musk and his pledge to repeal President Biden’s AI executive order suggest a focus on private sector-driven innovation and competition over regulation.
- According to a recent cybersecurity report, healthcare organizations block 17.2% of AI transactions, ranking third behind the finance and insurance and technology sectors. This rate is slightly below the national average of 18.5%, suggesting healthcare lags other industries in securing sensitive data against AI-related threats. Healthcare is already the sixth-largest user of AI and machine learning, and its AI adoption is expected to keep growing. The most popular AI applications in healthcare include ChatGPT, Drift, OpenAI, Writer, and Intercom. Healthcare organizations are actively engaging in AI safety initiatives, with some developing in-house AI platforms, while addressing concerns about data privacy, security, and the reliability and bias of AI algorithms in patient care.
- Pieces Technologies, a Dallas-based company, uses AI to streamline physician documentation, saving time on tasks like patient summaries, discharge notes, and progress notes. The platform, now in use at hospitals nationwide, has expanded to include a mobile version and is exploring outpatient applications. Despite facing scrutiny over AI accuracy, Pieces continues to innovate and secure funding for its patient-facing technology.
- California enacted two new laws governing the use of AI in healthcare. One law requires health plans using AI in utilization review to disclose its use and ensure determinations are based on clinical information. The other law mandates providers using AI in patient communications to obtain consent and follow specific protocols.
Cybersecurity
- The Office for Civil Rights (OCR) fined Providence Medical Institute (PMI) $240,000 for HIPAA violations, including the lack of a Business Associate Agreement (BAA) and insufficient access controls. The fine reflects a 20% discount for PMI’s recognized security practices, though OCR has not publicly disclosed which practices qualified or how the discount was calculated. This marks the first time OCR has publicly acknowledged applying the discount, which was mandated by the Cybersecurity Act of 2015.
- New federal legislation called the Health Infrastructure Security and Accountability Act (HISAA) has been introduced to address cybersecurity risks in the healthcare sector, particularly for HIPAA Covered Entities and Business Associates. The Act responds to high-profile cybersecurity incidents, such as the Change Healthcare breach, and aims to establish new security requirements, require annual risk assessments, and enforce audits to improve cybersecurity practices. HISAA proposes tiered penalties for noncompliance, removes statutory caps on penalties, and introduces fees to cover oversight costs. It also includes financial incentives and disincentives tied to Medicare payments to encourage the adoption of enhanced cybersecurity measures. The Act represents a significant shift in cybersecurity enforcement and oversight compared to existing HIPAA regulations.
Data Privacy
- The FTC published an explainer on the use of Data Clean Rooms (DCRs), cloud services that enable data exchange and analysis between companies. While DCRs can offer privacy protections when configured correctly, they are not inherently privacy-preserving and can be used to obfuscate privacy harms. Companies should not rely on DCRs to avoid legal obligations regarding data privacy and should be held accountable for any violations, regardless of the technology used.
- The healthcare industry is increasingly targeted by ransomware attacks, with notable incidents such as the Change Healthcare breach affecting nearly 100 million individuals. Healthcare organizations face complex decisions regarding whether to pay ransoms, balancing the need to minimize business disruption and protect sensitive data against the risks of legal liability, increased future targeting, and ethical concerns. Paying a ransom does not eliminate legal obligations to report breaches, and it may expose organizations to penalties if payments are made to sanctioned entities. The healthcare sector’s critical services and sensitive data make it a prime target, necessitating robust cybersecurity measures and comprehensive incident response strategies. Organizations must carefully evaluate their legal and strategic options to effectively manage ransomware risks.
- Texas is emerging as a significant player in privacy regulation following the implementation of the Texas Privacy and Data Security Act (TPDSA) in July 2024 and the Texas Securing Children Online through Parental Empowerment (SCOPE) Act in September 2024. Texas Attorney General Ken Paxton has initiated a privacy and security enforcement initiative, establishing a dedicated team within the Consumer Protection Division to enforce these laws. Notable actions include a lawsuit against TikTok for allegedly violating the SCOPE Act by sharing minors’ personal information without parental consent, and a settlement with Meta under the Texas biometric law for unauthorized data capture. Additionally, over 100 companies were notified for failing to register as data brokers, and car manufacturers are under investigation for data collection practices. Businesses processing Texans’ personal information should ensure compliance with the TPDSA and other relevant privacy laws to avoid enforcement actions.