
USA: The rising use of biometrics and consequential legal and compliance risks

Whether it is facial recognition technology ('FRT') used by law enforcement or in connection with physical security and access management applications, fingerprint-based time management systems, or voiceprint technologies used to validate identity, applications in the public and private sectors involving biometric identifiers and information continue to grow. Correlated with that growth are concerns about privacy and security, as well as civil liberties. Over the past few years, significant compliance and litigation risks have emerged, factoring heavily into the deployment of biometric technologies, particularly facial recognition. Joseph J. Lazzarotti, Principal at Jackson Lewis P.C., explores the current legislation covering biometrics and privacy in the US, as well as the link between biometrics and bias.


The biometrics market was worth around $20 billion in 2020, and research suggests it will grow to approximately $44 billion by 2026. This is not hard to imagine considering how ubiquitous biometric applications have become in everyday life. Biometrics are used for identity verification in a myriad of circumstances, such as unlocking smartphones, accessing theme parks, operating cash registers, clocking in and out for work, and travelling by plane. Concerns about security and identity theft, coupled with weak password practices, have led some to ask whether biometrics will eventually replace passwords as the primary means of identity verification. While that remains to be seen, there is little doubt that the use of biometrics will continue to expand.

A significant piece of that market, FRT, has become increasingly popular in the employment and consumer space (e.g. employee access, passport check-in systems, payments on smartphones), as well as with law enforcement. For approximately 20 years, law enforcement has used FRT to aid criminal investigations, with mixed results. The COVID-19 pandemic helped to drive broader use of this technology. The need to screen persons entering a facility for symptoms of the virus, including temperature, led to increased use of thermal cameras, kiosks, and similar devices embedded with facial recognition capabilities. When federal and state unemployment benefit programs experienced massive fraud as they tried to distribute hundreds of billions of dollars in COVID-19 relief, many turned to facial recognition and similar technologies to help. By late summer 2021, more than half of the states in the US had contracted with an online identity network company to provide ID verification services.

Many have objected to the use of this technology in its current form, however, citing concerns such as the lurch toward a more 'Orwellian society' and threats to due process, and noting shortcomings in accuracy and consistency. Others have observed that the ability to compromise the technology offers a new path to committing fraud against individuals.

BIPA

With the increasingly broad use of facial recognition and other biometrics has come increased regulation, along with compliance and litigation risks. Perhaps the best-known law regulating biometric information in the US is the Illinois Biometric Information Privacy Act ('BIPA')1. Enacted in 2008, BIPA was one of the first state laws to address a business's collection of biometric data. BIPA protects:

  • 'biometric identifiers', namely a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry; and
  • 'biometric information', namely any information, regardless of how it is captured, converted, stored, or shared, based on an individual's biometric identifier used to identify an individual2.

Many biometric technology vendors promote that their systems do not retain a face scan or a fingerprint, but instead convert that information into a hash or other code which cannot be reverse-engineered to recreate the identifier, e.g. the fingerprint. Nonetheless, a reading of BIPA suggests that the statute applies both to the fingerprint and to the information converted from that identifier.
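To make that distinction concrete, the sketch below (purely illustrative Python, not any vendor's actual method; real biometric systems rely on proprietary feature templates and fuzzy matching rather than exact cryptographic hashes) shows the general idea of storing a derived, non-reversible code instead of the raw scan. Under BIPA's definitions, both the raw scan (a 'biometric identifier') and the derived value ('biometric information') can fall within the statute's scope.

```python
import hashlib
import os

def derive_reference_code(raw_scan: bytes) -> tuple[bytes, str]:
    """Illustration only: keep a salted, one-way digest of the captured scan
    data rather than the scan itself. Real systems store feature templates
    and use fuzzy matching, not exact cryptographic hashes like this."""
    salt = os.urandom(16)                       # per-enrolment random salt
    digest = hashlib.sha256(salt + raw_scan).hexdigest()
    return salt, digest                         # the raw scan is then discarded

# Dummy bytes stand in for a fingerprint capture; no real biometric data is used.
salt, code = derive_reference_code(b"example fingerprint feature data")
print(code)  # the stored value; the original scan is never retained
```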

BIPA established a comprehensive set of rules for companies collecting biometric identifiers and information from Illinois residents, including the following:

  • informed consent in connection with collection;
  • maintaining a written, public policy concerning retention and destruction;
  • disclosure limitations;
  • reasonable data security safeguards;
  • prohibition on profiting from biometric data; and
  • a private right of action for individuals aggrieved by violations of BIPA.

BIPA largely went unnoticed until 2015, when a series of five similar class action lawsuits was brought against businesses alleging unlawful collection and use of the biometric data of Illinois residents3. Since BIPA was enacted, more than 2,000 putative class action lawsuits have been filed. The onslaught is primarily due to BIPA's private right of action provision, which provides statutory damages of up to $1,000 for each negligent violation and up to $5,000 for each intentional or reckless violation. Adding fuel to the fire, the Illinois Supreme Court ruled that an individual is 'aggrieved' under BIPA and can sue for technical violations, such as a failure to provide the notice called for under the law4. While most of these cases have involved the collection of fingerprints for time management systems, several cases have involved facial recognition, including one that reportedly settled recently for $650 million.

Additionally, there have been some important case law developments that generally have favoured plaintiffs, thereby increasing the potential exposure for violations of BIPA, with notable Illinois Supreme Court decisions including that:

  • a separate claim under BIPA accrues each time an entity scans or transmits an individual's biometric identifier or biometric information5;
  • the one-year statute of limitations under Section 13-201 of the Illinois Code of Civil Procedure does not apply to claims under BIPA and, because the statute does not contain its own limitations period, the 'catch-all' five-year statute of limitations applies6; and
  • the exclusivity provisions of the Illinois Workers' Compensation Act do not bar claims for statutory liquidated damages brought by employees against their employers under BIPA7.

Beyond BIPA

Many have argued that BIPA went too far, opening the floodgates to litigation for plaintiffs who, in many cases, suffered little to no harm. Indeed, several efforts have been made to moderate BIPA's impact. However, massive data breaches and surges in identity theft and fraud have strengthened calls for stronger measures to protect sensitive personal information, including biometric information.  

Beyond BIPA, state and local governments have enacted laws to regulate the collection, use, and disclosure of biometric information. Here are a few examples:

Comprehensive biometric privacy laws

Both Texas8 and Washington9 have enacted comprehensive biometric laws similar to BIPA, but without the same kind of private right of action provision. Other states, such as New York10, have considered BIPA-like privacy bills that mirror BIPA's enforcement scheme, but without success. New York City has passed two industry-specific measures to safeguard biometric information, which include private rights of action.

Texas

The Texas law protects 'biometric identifiers', defined as a 'retina or iris scan, fingerprint, voiceprint, or record of hand or face geometry'. This definition is somewhat narrower than the scope of data protected under BIPA, although the Lone Star State's law includes requirements similar to BIPA's: notice and consent; a maximum retention period (one year); prohibitions on sale and disclosure; and reasonable safeguards. Unlike BIPA, there is no private right of action. The state's Attorney General ('AG') may bring an action for a civil penalty of not more than $25,000 for each violation.

Washington

Washington's biometric data protection statute defines 'biometric identifiers' to mean: 'data generated by automatic measurements of an individual's biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual'. 'Biometric identifier' does not include a physical or digital photograph, video or audio recording, or data generated therefrom, or information collected, used, or stored for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996.

The statute prohibits persons from 'enrolling' biometric identifiers in a database for a 'commercial purpose' without first providing notice, obtaining consent, or providing a mechanism to prevent the subsequent use of the biometric identifiers for a commercial purpose.

'Enroll' means 'to capture a biometric identifier of an individual, convert it into a reference template that cannot be reconstructed into the original output image, and store it in a database that matches the biometric identifier to a specific individual'. 'Commercial purpose' means 'a purpose in furtherance of the sale or disclosure to a third party of a biometric identifier for the purpose of marketing of goods or services when such goods or services are unrelated to the initial transaction in which a person first gains possession of an individual's biometric identifier'.

Considering the scope of the commercial purpose definition, the Washington law does not appear to apply in an employment context, such as with the time management devices that have led to significant litigation in Illinois.

The exact type of notice and consent required will depend on the context, and notice must be given through a procedure reasonably designed to be readily available to affected individuals. Also, in general, a person that has obtained a biometric identifier from an individual and enrolled that identifier may not sell, lease, or otherwise disclose the identifier absent consent. Persons that possess biometric identifiers of individuals that have been enrolled for a commercial purpose must: (i) maintain reasonable safeguards to protect against unauthorised access to or acquisition of the identifiers; and (ii) not retain the identifiers for longer than is necessary to carry out certain functions, such as providing the product for which the identifier was acquired.

There is no private right of action under the Washington law. As in Texas, it is enforced by the state's AG.

New York City

The Big Apple took a more targeted approach to regulating biometric information in the City, while retaining a private right of action similar to BIPA.

First, it amended Title 22 of its Administrative Code to create BIPA-like requirements for retail, restaurant, and entertainment businesses relating to the collection of customers' 'biometric identifier information', that is, 'a physiological or biological characteristic that is used by or on behalf of a commercial establishment, singly or in combination, to identify, or assist in identifying, an individual, including, but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic'. Notably, this definition is not limited to the listed physiological or biological characteristics.

The law has two primary requirements: (i) provide clear and conspicuous signage to customers disclosing the collection, retention, conversion, storage, or sharing of biometric identifier information; and (ii) do not sell, lease, trade, share in exchange for anything of value, or otherwise profit from transactions involving biometric identifier information. Under the law, customers have a private right of action to remedy violations, subject to a 30-day notice and cure period, with damages ranging from $500 to $5,000 per violation, along with attorneys' fees.

Next, New York City passed the Tenant Privacy Act11, which, among other things, requires owners of 'smart access' buildings, i.e., those that use key fobs, mobile apps, biometric identifiers12, or other digital technologies to grant access to their buildings, to provide privacy policies to their tenants prior to collecting certain types of data from them, as well as to strictly limit: (i) the categories and scope of data that the building owner collects from tenants; (ii) how the covered entities use that data (including a prohibition on data sales); and (iii) how long the information may be retained. The law creates a private right of action for tenants whose data is unlawfully sold. Those tenants are empowered to seek either compensatory damages or statutory damages ranging from $200 to $1,000 per tenant, along with attorneys' fees.

Comprehensive state privacy laws

Modelled to some degree on the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR'), comprehensive privacy laws have emerged in six US states, most notably the California Consumer Privacy Act of 2018 ('CCPA')13. The CCPA seeks to give California residents ('consumers') greater control over their personal information. That control is provided through, among other things, requirements on covered businesses to provide notice about data collection practices, to maintain a website privacy policy, and to extend an expanding array of rights, including the right to delete and the right to limit the use of sensitive personal information.

The CCPA defines personal information broadly and sets out several non-exhaustive categories of personal information, one being 'biometric information', defined as:

'individual's physiological, biological, or behavioral characteristics, including information pertaining to an individual's deoxyribonucleic acid (DNA), that is used or is intended to be used singly or in combination with each other or with other identifying data, to establish individual identity. Biometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted, and keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, or exercise data that contain identifying information'.

It is worth noting that, in certain respects, biometric information covered by the CCPA is significantly broader than the kinds of information covered under BIPA and some of the other laws discussed above. Additionally, the California Privacy Rights Act of 2020 ('CPRA') amended the CCPA to add a new category of personal information, 'sensitive personal information', which is defined to include certain biometric information. Sensitive personal information comes with specific protections including the right to limit uses and disclosures of that information under certain circumstances.  

There is no private right of action under the CCPA for most violations of the law. However, if a CCPA-covered business experiences a data breach involving a subset of personal information, which includes biometric information14, the CCPA authorises a private cause of action against the business if a failure to implement reasonable security safeguards caused the breach. If successful, a plaintiff can seek to recover statutory damages in an amount not less than $100 and not greater than $750 per consumer per incident or actual damages, whichever is greater, as well as injunctive or declaratory relief and any other relief the court deems proper. This means that similar to BIPA, plaintiffs generally do not have to show actual harm to recover.

Other state laws

One area where states have expanded protections for biometric information is in breach notification laws. That is, nearly 20 states, as well as the District of Columbia, have added biometric information to the categories of personal information that, if breached, may require notification15.

Breach notification laws are not the only measures intended to protect biometric information. State laws addressing data destruction and mandating reasonable safeguards for the security of personal information include biometric information in their scope. Additionally, several state laws require a business to obtain reasonable assurances in writing from third-party service providers that process personal information, including biometric data, on behalf of the business.

Biometrics and bias

Biometrics can be invaluable in use cases involving identification and validation. The laws summarised above recognise that value, as well as the need to maintain the privacy and security of biometric information in the process. However, concerns related to the use of biometric information are not limited to privacy and security. Mismatches and allegations of bias when using biometric information, particularly in applications using facial recognition, have led to calls for additional protections, including with regard to civil rights.

In the last couple of years, there has been an uptick in hackers trying to 'trick' FRT in a variety of settings, such as fraudulently claiming unemployment benefits from state workforce agencies. As noted above, the majority of states now use FRT to verify persons eligible for government benefits and to prevent fraud. Hackers of facial recognition systems use a myriad of techniques, including deepfakes (artificial intelligence-generated images), special masks, or even holding up images or videos of the individual the hacker is looking to impersonate.

Fraud is not the only concern with FRT. Despite its appeal for employers and organisations, there are concerns regarding the accuracy of the technology, as well as significant legal implications to consider. A recent report by the National Institute of Standards and Technology ('NIST') studied 189 facial recognition algorithms, which it considered the 'majority of the industry'16. The report found that most of the algorithms exhibit bias, falsely identifying faces of some ethnicities more often than others. Moreover, false positives are significantly more common for women than for men, and are more elevated for the elderly and children than for middle-aged adults.

As a result, there has been an increase in the regulation of the use of biometrics, including facial recognition.

Facial recognition bans

Several US localities have banned the use of facial recognition for law enforcement, other government agencies, and/or private and commercial use.

Law enforcement bans

Over the past few years, several states, cities, and localities have banned the use of facial recognition by law enforcement. These include Vermont, Virginia, San Francisco, Boston, New Orleans, and Minneapolis.

Portland

In September 2020, the City of Portland, Oregon, became the first city in the US to ban the use of 'facial recognition technologies' in the private sector. Proponents of the measure cited, among other things, a lack of standards for the technology and wide ranges in accuracy and error rates that differ by race and gender.

The term 'facial recognition technologies' is broadly defined to include automated or semi-automated processes using face recognition that assist in identifying, verifying, detecting, or characterising the facial features of an individual, or in capturing information about an individual based on the individual's face. The measure carves out limited exceptions, including the use of FRT to comply with the law, to verify users of personal or employer-provided devices, and in social media applications. Failure to comply can be painful. Similar to BIPA, the measure provides persons injured by a material violation a cause of action for damages or $1,000 per day for each day of violation, whichever is greater.

Baltimore

The City of Baltimore recently banned the use of FRT by city residents, businesses, and most of the city government (excluding the city police department) until December 2022. Council Bill 21-0001 prohibits persons from 'obtaining, retaining, accessing, or using certain face surveillance technology or any information obtained from certain face surveillance technology'17. Any person who violates the ordinance is guilty of a misdemeanor and, on conviction, is subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both.

Conclusion

Facial recognition and other biometric-related technology is booming and continues to infiltrate different facets of everyday life. The technology brings innumerable potential benefits as well as significant data privacy and cybersecurity risks. Organisations that collect, use, and store biometric data increasingly face compliance obligations as the law attempts to keep pace with technology, cybersecurity crimes, and public awareness of data privacy and security. It is critical they maintain a robust privacy and data protection program to ensure compliance and minimise business and litigation risks.

Joseph J. Lazzarotti, Principal
[email protected]
Jackson Lewis P.C., New Jersey


1. 740 ILCS 14 et seq, see: https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57
2. 740 ILCS 14/10
3. Pezen v. Facebook Inc., 1:15-cv-03484 (N.D. Ill. Apr. 21, 2015); Licata v. Facebook Inc., 1:15-cv-04022 (N.D. Ill. May 5, 2015); Patel v. Facebook Inc., 1:15-cv-04265 (N.D. Ill. May 14, 2015); Gullen v. Facebook Inc., 1:15-cv-07861 (N.D. Ill. Aug. 31, 2015); and Norberg v. Shutterfly, Inc., 1:15-cv-05351 (N.D. Ill. June 17, 2015).
4. Rosenbach v. Six Flags Entertainment Corp., 2019 IL 123186.
5. See: Cothron v. White Castle Systems, Inc., 2023 IL 128004 (Feb. 17, 2023).
6. See: Tims v. Black Horse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023).
7. See: McDonald v. Symphony Bronzeville Park, 2022 IL 126511 (Feb. 3, 2022).
8. See: https://statutes.capitol.texas.gov/Docs/BC/htm/BC.503.htm
9. See: https://app.leg.wa.gov/RCW/default.aspx?cite=19.375
10. See: https://www.nysenate.gov/legislation/bills/2021/A27
11. See: https://legistar.council.nyc.gov/LegislationDetail.aspx?ID=4196254&GUID=29A4B0E2-4C1F-472B-AE88-AE10B5313AC1&Options=ID%7cText%7c&Search=
12. The law defines biometric identifier information similar to Title 22 of its Administrative Code. N.Y.C. Admin. Code § 26-3001.
13. Since 2020, six states have enacted comprehensive consumer privacy laws – California, Connecticut, Colorado, Iowa, Utah, and Virginia. These laws are more similar than they are different, although the CCPA's private right of action provision did not find its way into the other states' comprehensive laws. For purposes of this article, we have summarised the effects of these laws on biometric information through the CCPA.
14. For this purpose, the CCPA points to personal information as 'defined in subparagraph (A) of paragraph (1) of subdivision (d) of Section 1798.81.5', which defines biometric information as 'Unique biometric data generated from measurements or technical analysis of human body characteristics, such as a fingerprint, retina, or iris image, used to authenticate a specific individual. Unique biometric data does not include a physical or digital photograph, unless used or stored for facial recognition purposes'. Cal. Civ. Code § 1798.150.
15. These states include: Arizona, Arkansas, California, Colorado, Connecticut, Delaware, D.C., Illinois, Iowa, Louisiana, Maryland, Nebraska, New Mexico, New York, North Carolina, South Dakota, Vermont, Washington, Wisconsin, and Wyoming.
16. See: https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf#page=4
17. See: https://baltimore.legistar.com/LegislationDetail.aspx?ID=4749282&GUID=3605654F-5629-41A1-BD96-89946A2C32FB&Options=&Search=