On April 18, 2024, the Federal Trade Commission (FTC), U.S. Department of Justice (DOJ), and U.S. Department of Health and Human Services (HHS) launched a public web portal, www.healthycompetition.gov, for reporting anticompetitive practices in the health care sector. The portal allows anyone to submit a complaint about potentially anticompetitive conduct and provides information about the federal laws that protect competition, along with examples of conduct that can harm it. The agencies have placed no limits on who may submit a report, leaving the field open to informants ranging from the general public to industry insiders. The launch calls for increased vigilance from healthcare entities, as any submission could trigger an investigation by the FTC or DOJ.
The Biden-Harris Administration, through the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS), has announced a Final Rule strengthening the HIPAA Privacy Rule to protect reproductive health care privacy. The rule prohibits the disclosure of protected health information (PHI) related to lawful reproductive health care under certain conditions. It responds to calls for stronger patient confidentiality and aims to prevent the misuse of medical records related to reproductive health care. Regulated health care providers and organizations must update their Notices of Privacy Practices and obtain a signed attestation for certain requests for PHI related to reproductive health care. The current HIPAA Privacy Rule remains in effect until the new rule is implemented.
Summary of article from Ars Technica, by Jon Brodkin:
The Federal Trade Commission (FTC) has issued a final rule banning noncompete clauses, rendering most existing clauses unenforceable, on the ground that they are an unfair method of competition and a violation of Section 5 of the FTC Act. The rule takes effect 120 days after its publication in the Federal Register and affects approximately 30 million US workers currently bound by such clauses. The main exception is for senior executives, defined as those earning more than $151,164 annually in policy-making positions, whose existing noncompete clauses remain enforceable. The FTC argues that noncompete clauses suppress wages, innovation, and economic dynamism, and maintains that businesses can protect trade secrets through other means, such as nondisclosure agreements. The US Chamber of Commerce intends to sue the FTC, claiming the rule undermines the competitiveness of American businesses.
Gwendolyn Gibbs, the owner of the Houston-based Daybreak Rehabilitation Center, has been sentenced to 84 months in federal prison and ordered to pay $8.68 million in restitution to Medicare for conspiracy to commit healthcare fraud. Gibbs fraudulently billed Medicare for unnecessary mental health services provided to vulnerable adults with intellectual disabilities. From 2007 to 2016, she submitted fraudulent claims for partial hospitalization program (PHP) services, falsified medical records, and paid kickbacks for patient referrals. Charles Guidry Jr., a manager at Daybreak and Gibbs’ ex-husband, was previously sentenced to 70 months imprisonment for his involvement. Gibbs will remain in custody until her transfer to a U.S. Bureau of Prisons facility. Source: Press Release.
Summary of article from Sheppard Mullin Richter & Hampton LLP, by Carolyn Metnick, Gianfranco Spinelli:
PrivacyCon’s takeaways for healthcare organizations highlighted key considerations for the use of AI in healthcare, centered on privacy themes, Large Language Models (LLMs), and AI functionality. The research identified four privacy concerns: the potential for data misuse, the personal nature of the data collected, the lack of awareness of and consent to data collection, and government surveillance. It also flagged security, privacy, and safety concerns in LLM platforms, particularly those involving third-party applications, urging developers to prioritize these issues. The fallacy of AI functionality, in which users trust AI outputs without validating the underlying data, was identified as a major issue, especially in healthcare, where it can lead to misdiagnosis. The post concluded by emphasizing the need for healthcare organizations to establish governance and compliance committees to address these challenges and to foster responsible AI development with privacy and ethical considerations in mind.
Summary of article from mytexasdaily.com:
Gwendolyn Gibbs, the 72-year-old owner of a Houston-based mental health clinic, has been sentenced to seven years in federal prison for a healthcare fraud scheme. Gibbs pleaded guilty to conspiracy to commit healthcare fraud in December 2021 and was ordered to pay over $8.6 million in restitution to Medicare. The court found that, from 2007 to 2016, Gibbs fraudulently billed Medicare for services provided to adults with intellectual disabilities who did not require mental health services. She admitted to falsifying medical records and paying kickbacks for patient referrals. The case was investigated by multiple agencies, including the FBI and the Department of Health and Human Services.
Summary of article from Davis Wright Tremaine, by David L. Rice, Adam H. Greene, Rebecca L. Williams:
The “My Health My Data Act” in Washington, effective March 31, 2024, imposes strict regulations on the collection and use of “consumer health data” (CHD), extending even to data only indirectly related to a consumer’s health. The Act covers businesses operating in Washington and those providing services or products to its consumers, and it applies to both residents and non-residents whose CHD is collected within the state. It requires consumer consent for the collection, processing, or disclosure of CHD and prohibits the sale of CHD without a valid authorization, which must be renewed annually. The Act also forbids the use of “geofences” around healthcare facilities for data collection or advertising. Finally, the Act grants enforcement authority to the Washington Attorney General and creates a private right of action for consumers; Nevada has enacted a similar law.
Summary of article from Healthcare IT News, by Andrea Fox:
A survey of 100 US physicians found that 81% believe generative AI can enhance care teams’ interactions with patients. A large majority (89%) said vendors must be transparent about the sources of the data behind clinical decision support (CDS) tools. Physicians also overestimate patients’ readiness for AI in healthcare: 66% believe patients would be confident in AI-assisted decisions, while only 48% of patients express such confidence. The survey further highlighted a lack of clear AI usage guidelines in healthcare organizations. Despite initial skepticism, adoption of AI in healthcare is growing, with companies such as Wolters Kluwer integrating AI into their products to aid clinical decision-making.
Summary of article from Federal Trade Commission, Office of Technology:
For more than two decades, the FTC has enforced national consumer protection laws against companies with inadequate security practices, such as failing to encrypt sensitive data or to require multi-factor authentication. The FTC and the Cybersecurity and Infrastructure Security Agency (CISA) recommend practices such as root-cause analysis of vulnerabilities, template rendering systems to prevent cross-site scripting (XSS), query builders to prevent SQL injection, and memory-safe programming languages to prevent buffer overflows and use-after-free vulnerabilities. CISA’s Secure by Design Alert Series offers additional strategies for protecting systems from design flaws that lead to security incidents. The FTC asserts that companies have a legal obligation to protect consumers’ data, and violations can lead to enforcement actions.
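To illustrate the query-builder/parameterized-query approach recommended above for SQL injection, here is a minimal Python sketch using the standard library’s sqlite3 module; the patients table, column names, and sample input are hypothetical and chosen only for illustration:

```python
import sqlite3

# Hypothetical example: look up a record by a user-supplied identifier.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id TEXT, name TEXT)")
conn.execute("INSERT INTO patients VALUES ('123', 'Jane Doe')")

user_supplied_id = "123' OR '1'='1"  # attacker-controlled input

# Unsafe pattern (shown only as a comment): building SQL by string
# concatenation lets the input rewrite the query and return every row.
#   f"SELECT name FROM patients WHERE id = '{user_supplied_id}'"

# Safer pattern: the placeholder binds the value as data, never as SQL,
# so the injection attempt simply matches no rows.
rows = conn.execute(
    "SELECT name FROM patients WHERE id = ?", (user_supplied_id,)
).fetchall()
print(rows)  # -> []
```

Query-builder libraries and ORMs apply the same principle automatically; the analogous defense against XSS is a template engine that escapes output by default rather than HTML assembled by hand.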
Summary of article from HackerNoon, by mcmullen:
Artificial intelligence (AI) is revolutionizing various aspects of healthcare, but it also presents privacy and security risks, particularly in the context of data breaches. Compliance with the Health Insurance Portability and Accountability Act (HIPAA) is crucial when integrating AI into healthcare. To remain HIPAA compliant, healthcare organizations must understand AI algorithms, regularly update policies, and implement robust security measures. Despite the challenges, the implementation of AI in healthcare, when done responsibly and ethically, offers significant potential benefits for patient care and research.