
USA: Harnessing AI in compliance with HIPAA

The use of artificial intelligence ('AI') has become the norm in many industries, and the health and pharmaceutical sector is no exception. As such, complex sectoral laws like the Health Insurance Portability and Accountability Act of 1996 ('HIPAA') deserve close attention as to how they govern the use of such practices. Elizabeth G. Litten, Partner and Chief Privacy & HIPAA Compliance Officer at Fox Rothschild LLP, and Kristina N. Burland, Associate at Fox Rothschild LLP, discuss this issue and the different factors for organisations to consider in this regard.


Even though we cannot see it, AI1 is a prevalent part of everyday life. From unlocking your cell phone, to producing relevant responses to your every Google search, to providing up-to-the-minute navigation, to serving up product recommendations on Amazon, AI 'is encountered by most people from morning until night2.' While AI can certainly make daily life more convenient - and we are now accustomed to it doing so - AI can be harnessed to provide powerful and life-altering information, particularly in the field of healthcare. As one scholar has observed, AI 'will have a positive impact on various diagnostic fields' including radiology, pathology, dermatology, and ophthalmology3. For example, a 2018 study concluded that the use of AI 'improved diagnostic performance for detection of breast cancer at mammography4.' Similarly, a 2019 study details how machine learning was employed to 'distinguish between environmental enteropathy and celiac disease more reliably than human doctors, at least at the earliest stages when the two are easily confused5.' Perhaps most timely, AI has been used in various capacities during the COVID-19 pandemic to 'enhance diagnostic efforts, deliver medical supplies, and even assess risk factors from blood tests', as well as to help 'forecast future COVID-19 cases6.'

Beyond the diagnostic utility of AI, AI also facilitates 'improv[ed] clinician and administrative workflows', plays an important role in connecting patients to health resources and care, and 'can help improve drug development7.' It is clear that 'artificial intelligence is poised to be the engine that drives improvements across the [health]care continuum8.' Research suggests that 'AI has the potential to improve patient outcomes by 30% to 40% while reducing treatment costs by up to 50%9.'

But as is often the case with the rapid development of new technologies, there is a barrier to the use of AI, including the use of AI in healthcare: our laws are not readily adaptable to the rapid advancement of AI. Top among the laws implicated by the use of AI in healthcare is HIPAA, as amended by the Health Information Technology for Economic and Clinical Health Act of 2009 and its implementing regulations. This Insight explores the interplay between HIPAA and the use of AI, and provides recommendations for the use of AI in compliance with HIPAA.

Does HIPAA apply at all? Consumer-generated health data may provide an out

Of course, deploying AI in compliance with HIPAA is a concern only where HIPAA applies. While at first blush the use of AI for healthcare purposes seems to necessarily implicate HIPAA, the reality is that there is a swath of AI activities that need not comply with HIPAA. HIPAA's requirements only apply to 'covered entities' and 'business associates.' Notably, this means that many app developers that create applications that collect consumer-generated health data may be able to employ AI without having to meet HIPAA's requirements10. In 2016, the Office for Civil Rights ('OCR') within the U.S. Department of Health and Human Services ('HHS') released guidance addressing the relationship between health applications and HIPAA. The guidance specifically addresses how 'HIPAA appli[es] to health information that a patient creates, manages or organizes through the use of a health app' and when application developers must comply with HIPAA rules11. The guidance provides that if a developer is 'only offering services directly to and collecting information for or on behalf of consumers, and not on behalf [of] a provider, health plan, or health care clearinghouse, you are not likely to be subject to HIPAA as either a covered entity or business associate12.' Accordingly, developers that independently (and not on a covered entity's or business associate's behalf) use consumer-generated health and other data to develop AI are similarly not subject to HIPAA requirements.

The HHS articulated the distinction between independent consumer-generated health data and health data generated by or on behalf of a covered entity or business associate in a Notice of Proposed Rulemaking to amend HIPAA released on 10 December 2020 ('the Notice')13. The Notice proposes a new definition of 'personal health application' under HIPAA. A 'personal health application' is proposed to mean 'an electronic application used by an individual to access health information about that individual in electronic form, which can be drawn from multiple sources, provided that such information is managed, shared, and controlled by or primarily for the individual, and not by or primarily for a covered entity or another party such as the application developer14.' The HHS explains, 'Put another way, a personal health application is a service offered directly to consumers ... individuals ... use a personal health application for the individuals' own purposes, such as to monitor their own health status and access their own PHI using the application.'

Processing PHI for AI purposes in compliance with HIPAA

While developers that are not 'covered entities' or 'business associates' may be able to freely use AI to make contributions and improvements to healthcare, the utility of AI in healthcare would ultimately be quite limited if the AI relied largely on consumer-generated data. To fully harness AI, 'large quantities of images or other real world data' are necessary to 'train the technology' - data that is held by HIPAA-covered entities like healthcare providers and insurers15 and their business associates like electronic health records vendors. Further, this data qualifies as protected health information ('PHI'), meaning that 'health care providers and insurers that hold this data are limited in their ability to transfer the PHI to third parties, including developers', who will use this data for AI purposes16. Covered entities and business associates are also potentially constrained in using the data for AI purposes internally17.

De-identification of PHI: Proceed with caution

A covered entity (or business associate) that wishes to disclose PHI to a third party for purposes of AI can do so in one of two ways. First, the covered entity can de-identify the data using either the Safe Harbor method or Expert Determination method, so that HIPAA requirements no longer apply to the de-identified data18. However, this method is not without risk. As one observer notes, 'de-identifying sounds like a good solution - but many health providers don't have the time, finances, or technical know-how to strip out the patient health information. Even when they do have the necessary tools, they make mistakes which breach HIPAA19.' In addition, both covered entities and business associates are limited by HIPAA (and, for business associates, by the terms of their business associate agreements) in terms of permitted uses of PHI, including the use of PHI to create de-identified data.
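To make the Safe Harbor method concrete, the sketch below shows the general mechanics of stripping or generalising identifiers from a record. This is a minimal illustration only: the field names, record format, and identifier list are hypothetical, and a real implementation must cover all 18 identifier categories enumerated in 45 C.F.R. § 164.514(b)(2) (it is emphatically not a compliance guarantee).

```python
# Minimal sketch of Safe Harbor-style de-identification.
# Field names and the identifier subset below are hypothetical; the full
# Safe Harbor method (45 C.F.R. § 164.514(b)(2)) lists 18 categories.

# Hypothetical subset of direct identifiers to remove outright.
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "health_plan_number", "device_id", "photo_url",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed and
    quasi-identifiers generalized (illustrative sketch only)."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Dates more specific than the year must go: keep only the year.
    # (Safe Harbor also requires aggregating ages over 89 into a single
    # "90 or older" category, which this sketch does not implement.)
    if "birth_date" in out:                      # e.g. "1957-03-14"
        out["birth_year"] = int(out.pop("birth_date")[:4])
    # ZIP codes are generalized to the initial three digits (and zeroed
    # entirely for sparsely populated three-digit zones, omitted here).
    if "zip_code" in out:
        out["zip_code"] = out["zip_code"][:3] + "00"
    return out

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "birth_date": "1957-03-14",
    "zip_code": "19103",
    "diagnosis": "J45.901",
}
print(deidentify(patient))
```

Even this toy version shows why the method is error-prone in practice: every field that could identify the patient has to be anticipated in advance, and a single overlooked field (or free-text note) leaves PHI in the 'de-identified' output.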

Further, many predict that as AI expands, there is a looming threat of litigation in connection with HIPAA's de-identified data exception; some argue that because of the sheer volume of information that Big Data holds, 'it may be almost impossible to find that a data giant could not combine almost any information with other data to identify an individual - meaning it is technically impossible to say that data can ever fall within HIPAA's 'de-identified data' exception20.' This camp warns that 'entities involved in AI-focused partnerships should not rely on the de-identified data exception without considering the risk of litigation21.'
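The re-identification concern above is concrete: records stripped of direct identifiers can often still be matched against an auxiliary dataset on the quasi-identifiers that remain. The toy illustration below (all records fabricated, field names hypothetical) shows how a unique match on birth year, ZIP prefix, and sex can re-attach a name to a 'de-identified' diagnosis.

```python
# Toy linkage attack: "de-identified" clinical records are matched to a
# hypothetical auxiliary dataset (e.g. a voter roll) on quasi-identifiers.
# All data below is fabricated for illustration.

deidentified = [
    {"birth_year": 1957, "zip3": "191", "sex": "F", "diagnosis": "J45.901"},
    {"birth_year": 1984, "zip3": "100", "sex": "M", "diagnosis": "E11.9"},
]

# Auxiliary dataset with names attached (fabricated).
auxiliary = [
    {"name": "Jane Doe", "birth_year": 1957, "zip3": "191", "sex": "F"},
    {"name": "John Roe", "birth_year": 1984, "zip3": "100", "sex": "M"},
    {"name": "Ann Poe",  "birth_year": 1957, "zip3": "606", "sex": "F"},
]

QUASI = ("birth_year", "zip3", "sex")

def link(record: dict, aux: list) -> list:
    """Return auxiliary rows matching the record on every quasi-identifier."""
    key = tuple(record[q] for q in QUASI)
    return [row for row in aux if tuple(row[q] for q in QUASI) == key]

for rec in deidentified:
    matches = link(rec, auxiliary)
    if len(matches) == 1:
        # A unique match re-identifies the individual behind the record.
        print(matches[0]["name"], "->", rec["diagnosis"])
```

The larger and richer the auxiliary data a holder can draw on, the more records will produce a unique match, which is precisely the argument that 'data giants' may be unable to rely on the de-identified data exception.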

Business associate agreements remain the safer bet for transferring PHI to a third party

Alternatively, the covered entity or business associate can disclose PHI to a third party for purposes of AI by entering into a business associate agreement or subcontractor agreement with the developer and/or other third party employing AI. Of course, upon doing so, the business associate or subcontractor must comply with HIPAA - which then raises the question: how can a covered entity or business associate use PHI to create AI in a HIPAA-compliant manner?

Covered entities and business associates that process PHI for AI may find comfort within the healthcare operations exception, but uncertainty remains

Some question whether processing PHI for purposes of AI even qualifies as a 'use'. At least one commenter has pointed out that the HHS commentary to the Privacy Rule advises that 'computer processing' of data does not constitute a 'use' that is subject to the Privacy Rule22. Accordingly, 'one reasonably can argue that an algorithm combing through terabytes of EHR data to 'learn' does not constitute a 'use' of the data for purposes of the Privacy Rule if no human ever lays eyes on the PHI23.'

However, the more conservative approach is to presume that such processing is a 'use' governed by HIPAA, and thus, that the Privacy Rule does apply. While HIPAA's Privacy Rule generally prohibits a covered entity or business associate from using or disclosing PHI without a patient's authorisation, it permits the use and disclosure of PHI absent an individual's authorisation for certain limited purposes. Most notable for purposes of processing PHI for purposes of AI is the use and/or disclosure of PHI for 'health care operations.' The term 'health care operations' is defined in detail in 45 C.F.R. § 164.501, but generally speaking, includes 'administrative, financial, legal, and quality improvement activities of a covered entity that are necessary to run its business and to support the core functions of treatment and payment24.' Of relevance is the stipulation that 'health care operations' include 'conducting quality assessment and improvement activities, population-based activities relating to improving health or reducing health care costs, and case management and care coordination25.'

The Notice proposes to amend the definition of 'health care operations' to replace the commas with semi-colons, such that health care operations will specifically include 'conducting quality assessment and improvement activities; population-based activities relating to improving health or reducing health care costs; and case management and care coordination26.' The HHS explains that this punctuation change will clarify that the term 'population-based activities' is not meant to place a condition on the activities following that term (case management and care coordination), which are activities that can be focused on individual cases or on a population. However, the Notice does not mention how or whether AI relates to or can be implemented as part of 'health care operations', either to benefit an individual or a population.

Although it will of course depend on the precise purpose for which a covered entity or business associate is processing PHI for purposes of AI, processing PHI for AI arguably fits within the 'health care operations' exception and could be performed by a covered entity for its own healthcare operations purposes or by a business associate for the healthcare operations purposes of the covered entity on whose behalf it is providing services.

Nevertheless, covered entities and business associates should be alert to situations in which the processing of PHI for AI purposes goes from 'health care operations' to 'research' purposes. HIPAA defines 'research' as 'a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge27.' If processing PHI for AI purposes becomes 'research', patient authorisation and/or informed consent may be required. Further, even where the individual's authorisation is obtained, the covered entity or business associate can only process an individual's PHI for research purposes in a manner consistent with that described in the authorisation28.


Although covered entities, business associates, and developers that are neither of the above do appear to have HIPAA-compliant avenues to pursue processing PHI for purposes of AI, there is a need for additional clarity from the HHS. For example, the HHS could explain, in the final rule adopting the Notice's proposals, that 'health care operations' purposes include the processing of PHI for AI. Alternatively, although non-binding, the OCR could issue guidance that specifically addresses the use of AI in compliance with HIPAA and helps resolve uncertainties regarding de-identification of data and the scope of the 'health care operations' exception when it comes to processing PHI for purposes of AI. Such clarity would help allay concerns about violating HIPAA, which many experts believe has had the unfortunate consequence of 'creat[ing] a risk-averse environment for data sharing29.'

AI is not going anywhere, and indeed, it is rapidly garnering increasing attention. On 12 January 2021, the White House announced the establishment of the National Artificial Intelligence Initiative Office 'charged with overseeing and implementing the United States national AI strategy and will serve as the central hub for Federal coordination and collaboration in AI research and policymaking across the government, as well as with private sector, academia, and other stakeholders30.' Although the establishment of the National Artificial Intelligence Initiative Office came in the last days of the Trump administration and amidst a turbulent transition of power to the Biden administration, many have had their eyes on what we can expect for AI in the Biden administration31. Politics aside, it is clear that the use of AI has become a matter of national importance, and the use of AI in healthcare is no exception.

Elizabeth G. Litten Partner and Chief Privacy & HIPAA Compliance Officer
[email protected]
Kristina N. Burland Associate
[email protected]
Fox Rothschild LLP, Philadelphia

1. AI takes many forms and definitions. Indeed, the Brookings Institution has commented that 'few concepts are as poorly understood as artificial intelligence', and 'part of the problem is the lack of a uniformly agreed upon definition.' See Darrell West, What is Artificial Intelligence, Brookings, Oct. 4, 2018. For purposes of this article, AI is broadly defined as: 'machines that respond to stimulation consistent with traditional responses from humans, given the human capacity for contemplation, judgment, and intention', and/or software systems that 'make decisions which normally require [a] human level of expertise and help people anticipate problems or deal with issues as they come up.' See id. (internal quotations omitted).
2. See Bernard Marr, The Ten Best Examples of How AI is Already Used in Our Everyday Life, Forbes, Dec. 16, 2019 (available at:; see also Ilija Mihajlovic, How Artificial Intelligence is Impacting our Everyday Lives and How you Encounter it Every Day, Towards Data Science, June 13, 2019 (available at:,using%20a%20ride%2Dshare%20service)
3. See Abhimanyu S. Ahjuja, The Impact of Artificial Intelligence in Medicine on the Future Role of the Physician, PeerJ 7:e7702, Oct. 2019, available at
4. See Alejandro Rodriguez-Ruiz, Elizabeth Krupinski, et. al., Detection of Breast Cancer with Mammography: Effect of an Artificial Intelligence Support System, 290 Radiology No. 2, Nov. 20, 2018 (available at
5. See AI in Healthcare: 4 Examples in Health Informatics, University of Illinois Chicago - Health Informatics Blog (last visited Jan. 13, 2021).
6. See Using AI and Big Data to Predict the Future Spread of COVID-19 Cases, News-Medical, Sept. 29, 2020 (available at:
7. See Center for Open Data Enterprise, Sharing and Utilizing Health Data for AI Applications: Roundtable Report, 2019 ('the CODE Report') (available at:
8. See Jennifer Bresnick, Top 12 Ways Artificial Intelligence Will Impact Healthcare, HealthITAnalytics - Tools & Strategies News, April 30, 2018 (available at:
9. See Ahjuja, supra note 3, at p. 2.
10. Note, however, that other privacy and data security laws may apply to the use of AI in this manner. Further, experts in government, industry, clinical research, nonprofits, and academia convened at a 2019 roundtable recommended that 'as interest in patient data increases, there is a need for clearer rules around its appropriate use, particularly in the context of AI development.' See the CODE Report, supra note 7, at p. 17.
11. See Health App Use Scenarios & HIPAA, Feb. 2016 (available at
12. See id.
13. See HHS Press Office, HHS Proposes Modifications to the HIPAA Privacy Rule to Empower Patients, Improve Coordinated Care, and Reduce Regulatory Burdens, Dec. 10, 2020 (available at; Notice of Proposed Rulemaking, 45 CFR Parts 160 and 164, Proposed Modifications to the HIPAA Privacy Rule to Support, and Remove Barriers to, Coordinated Care and Individual Engagement (available at
14. See NPRM. HHS notes that the 'proposed definition of personal health application would not apply to or otherwise affect the requirements of the ONC Cures Act Final Rule or the CMS Interoperability and Patient Access Rule.' Ibid. at 47.
15. See Dianne J. Bourque, Strategies to Unlock AI's Potential in Health Care, Part 3: HIPAA and Other Privacy Considerations, The National Law Review (Oct. 25, 2018) (available at:; see also John Schneider, How HIPAA is Undermining IT and AI's Potential to Make Healthcare Better, Electronic Health Reporter, Sept. 18, 2019 ('Artificial intelligence and machine learning applications depend on large, real-world datasets and could be put to use to build technology and resources to identify distinct risk profiles, analyze the effectiveness of treatment protocols across specific patient populations, or surface insights that can dramatically improve the speed and quality of care. But only the few commercial entities that have access to data can play in this space.')
16. See Bourque, supra n. 15.
17. As discussed in more detail infra, covered entities may use and disclose PHI without an individual's authorisation for payment, treatment, and health care operations purposes, as well as other specific purposes enumerated under HIPAA. See 45 C.F.R. §164.502 through 512. Notably, the terms 'payment', 'treatment' and 'health care operations' are defined under HIPAA (45 C.F.R. §160.103; 45 C.F.R. §164.501), such that not even all internal uses of PHI are permitted. Additionally, business associates are limited not only by the applicable provisions of HIPAA, but by the terms of their business associate agreements with covered entities. 45 C.F.R. §164.504.
18. See e.g. De-identification of Protected Health Information: How to Anonymize PHI, HIPAA Journal, (see (last visited Jan. 13, 2021).
19. See Artificial Intelligence and the Health Insurance Portability and Accountability Act of 1996 (HIPAA), Cohen Healthcare Law (available at (last visited Jan. 13, 2021); see also Bourque, supra n. [ ] (noting 'de-identifying a massive trove of PHI is a complex undertaking, and missteps could lead to a reportable breach under HIPAA and other laws.')
20. See Patricia S. Calhoun and Patricia M. Carreiro, De-identified Data Exception in HIPAA Poses a Litigation Risk, Stat, Apr. 16, 2020 (available at
21. See id.
22. See Adam H. Greene, More Data Please! The Challenges of Applying Health Information Privacy Laws to the Development of Artificial Intelligence, Davis Wright Tremaine LLP Privacy & Security Law Blog, Feb. 26, 2020 (available at
23. See id.
24. See Uses and Disclosures for Treatment, Payment, and Health Care Operations, Department of Health and Human Services, July 26, 2013 (available at
25. See id.; see also 45 C.F.R. § 164.501.
26. See the Notice, supra note 13, at pp. 111-112.
27. See 45 C.F.R. § 164.501.
28. See e.g. HIPAA Authorization for Research, U.S. Department of Health and Human Services National Institute of Health, Apr. 2004 (available at:
29. See CODE Report, supra n. 7, at p. 13.
30. See White House Launches the National Artificial Intelligence Initiative Office, White House Statements and Releases, Jan. 12, 2021 (available at
31. See Jonathan Vanian, What a Biden Harris Administration Means for Artificial Intelligence, Forbes, Nov. 10, 2020 (available at: