In the first quarter of 2022, there have already been significant legal developments in the biometric technology space. Most notably, the Illinois Supreme Court—which has actively taken Illinois Biometric Information Privacy Act (“BIPA”) cases amid the surge of such class action litigation in federal and state courts—issued several consequential BIPA opinions this year, though 2022’s most critical BIPA decisions are likely still on the horizon.
On February 17, 2022, the California Legislature introduced a biometric privacy law (SB 1189) similar to the Illinois Biometric Information Privacy Act (“BIPA”). SB 1189 would dramatically increase biometric privacy protections for California consumers, expand regulation of private businesses, and add to the flurry of biometric privacy class action litigation that has taken hold of U.S. courts.
David J. Oberly
In mid-2021, Baltimore, Maryland, passed Council Bill 21-0001 (the “FRT Ordinance”), becoming the second U.S. jurisdiction to enact sweeping facial recognition regulation that bans the use of facial biometrics by any private entity or individual within city limits.
While a number of cities have enacted laws prohibiting law enforcement and other governmental agencies from using facial recognition, Portland, Oregon, enacted the nation’s first blanket ban on the use of this technology by the private sector at the beginning of 2021. The Baltimore FRT Ordinance goes even further than Portland by imposing criminal penalties of up to a year in jail on companies and individuals that run afoul of the ban.
As federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy regulatory regime, companies should anticipate that cities and states will continue to take the lead in implementing new biometrics regulation in 2022. In particular, the success Baltimore and Portland have had in enacting outright bans on the commercial use of facial recognition software is likely to encourage lawmakers in other cities and states to follow suit with tighter controls over the collection and use of facial geometry data.
Taken together, all businesses that operate in Baltimore and use any type of facial recognition software should assess whether the Baltimore FRT Ordinance applies to them and, if so, take prompt measures to ensure compliance with the law. From a broader perspective, as this strict type of biometric privacy regulation is likely to expand to additional parts of the country, companies that use or intend to use facial recognition technology (“FRT”) need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.
- Scope/Applicability: The Baltimore ordinance bars “persons” from obtaining, retaining, accessing, or using any “face surveillance system,” or any information obtained from a face surveillance system, within the City of Baltimore.
- “Person”: The ordinance defines the term “person” as any individual, partnership, firm, association, corporation, other entity, receiver, trustee, guardian, personal representative, or fiduciary.
- “Face Surveillance System”: “Face surveillance system” means “any computer software or application that performs face surveillance.”
- “Face Surveillance”: “Face surveillance,” in turn, is defined as “an automated or semi-automated process that assists in identifying or verifying an individual based on the physical characteristics of the individual’s face.”
- Access Control Systems: Excluded from the scope of the ordinance are “biometric security system[s] designed specifically to protect against unauthorized access to a particular location or an electronic device.”
- Maryland Image Repository System: Also excluded from the scope of the ordinance is the Maryland Image Repository System (facial recognition software that allows law enforcement to compare images of unidentified individuals to images from motor vehicle records and criminal mugshots).
Core Compliance Requirement
- Prohibition on FRT Use: Under the ordinance, a person may not obtain, retain, access, or use in Baltimore City: (1) any face surveillance system; or (2) any information obtained from a face surveillance system.
Enforcement and Remedies
- Misdemeanor: Any person who violates the Baltimore FRT Ban is guilty of a misdemeanor and subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both fine and imprisonment.
- Each Day a Separate Offense: Each day that a violation continues is a separate offense.
Practical Compliance Tips & Best Practices
All businesses that maintain operations in Baltimore should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Baltimore ordinance if the organization falls under the scope of the law:
- Determine Whether Technology Falls under Scope of Law: First, companies should determine if their technology falls under the scope of the law. To do so, the system must assist in identifying or verifying individuals based on their facial characteristics.
- Evaluate Applicability of Access Control Exemption: If the technology falls under the scope of the ban, evaluate whether the narrow exemption offered by the ordinance for facial recognition-powered access control systems applies to allow the company to continue its use of the technology.
- Cease Use if Exemption Inapplicable: If the technology does not serve the purpose of protecting against unauthorized access to a particular location or electronic device, eliminate the use of facial recognition across the board immediately.
- Identify Availability of Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current facial biometrics technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.
At the end of 2021, two developments laid the groundwork for a definitive resolution of one of the most significant, yet unsettled, issues under the Illinois Biometric Information Privacy Act (“BIPA”)—claim accrual. While all litigants would appreciate some certainty surrounding this hot-button issue, resolution of when a BIPA violation “accrues” (i.e., occurs) will have a seismic impact on the trajectory of all BIPA litigation for years to come—depending on how the Illinois Supreme Court rules in the coming term.
Watson and Cothron BIPA Decisions
In mid-December 2021, an Illinois appellate panel in Watson v. Legacy Healthcare Financial Services, LLC, held BIPA claims accrue each and every time a defendant captures biometric information in violation of the statute, as opposed to only accruing at the first instance of collection.
Just a few days after Watson, the Seventh Circuit Court of Appeals issued its decision in Cothron v. White Castle System, Inc.—another appeal involving claim accrual. But rather than decide when a BIPA claim accrues, and after acknowledging the existence of Watson, the Cothron court certified the question to the Illinois Supreme Court to provide definitive guidance.
While neither Watson nor Cothron offers a conclusive answer, the issue is now teed up to be definitively decided by Illinois’ highest court.
Impact & Implications
The accrual date is a significant issue in BIPA class action litigation. Depending on the circumstances, accrual can serve as the basis for a statute of limitations defense, which, if successful, may require dismissal. But the issue is even more consequential in the context of damages and determining the overall value of a biometric privacy class action. If continuing BIPA violations constitute separate, independent claims, then the associated statutory damages of $1,000 per negligent violation (or $5,000 per intentional or reckless violation) could begin to compound. And because the law provides for liquidated damages for each violation, a ruling that claims accrue each time a defendant runs afoul of the law’s requirements could expand such liability exponentially.
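To see just how much the accrual ruling matters, a back-of-the-envelope sketch helps. The class size, scan counts, and the `exposure` helper below are hypothetical illustrations, not figures from any actual case:

```python
# Back-of-the-envelope illustration (hypothetical numbers) of how the
# accrual question changes BIPA exposure. Statutory damages: $1,000 per
# negligent violation, $5,000 per intentional or reckless violation.

NEGLIGENT, INTENTIONAL = 1_000, 5_000

def exposure(class_size, scans_per_person, per_scan_accrual, rate=NEGLIGENT):
    """Total statutory damages for a hypothetical class.

    per_scan_accrual=False: one claim per person (first collection only).
    per_scan_accrual=True: a separate claim for every capture (the Watson view).
    """
    violations = class_size * (scans_per_person if per_scan_accrual else 1)
    return violations * rate

# Hypothetical: 500 employees, each scanned twice a day over 250 workdays.
scans = 2 * 250
print(exposure(500, scans, per_scan_accrual=False))  # 500000 ($500,000)
print(exposure(500, scans, per_scan_accrual=True))   # 250000000 ($250 million)
```

Under a per-capture accrual rule, the same hypothetical workforce produces an exposure figure 500 times larger—the exponential expansion described above.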
Companies should pay close attention to how the Illinois Supreme Court decides the Cothron appeal, as the ruling could result in a drastic shift in the biometric privacy legal landscape. In the interim, companies should—with the assistance of experienced biometric privacy counsel—take the time to reassess their compliance with BIPA to ensure they are satisfying the full range of requirements to mitigate potential class action risk.
David J. Oberly
By now, most Golden State employers are well versed in the California Consumer Privacy Act of 2018 (“CCPA”), as well as its soon-to-be successor, the California Privacy Rights Act of 2020 (“CPRA”), which goes into effect at the start of 2023.
At the same time, more and more employers operating in California (and in other parts of the nation) are integrating biometrics into their operations in a variety of ways, including for timekeeping and access control purposes, among others.
Many of these employers need not worry about satisfying the onerous requirements of the CCPA because they do not meet any of the law’s three applicability thresholds: (1) annual gross revenue in excess of $25,000,000; (2) buying, receiving, sharing, or selling the personal information of more than 50,000 consumers, households, or devices; or (3) deriving 50 percent or more of revenue from the sale of personal information.
And those employers that do fall under the scope of California’s consumer privacy law are largely exempted from compliance in connection with the personal information of employees (and job applicants) under the CCPA’s employee information exemption.
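For orientation only, the three CCPA applicability thresholds can be reduced to a simple any-one-prong check. This is a sketch using the pre-CPRA figures quoted above; the `ccpa_applies` helper is hypothetical and no substitute for legal analysis:

```python
# Sketch of the CCPA's three applicability thresholds (pre-CPRA figures).
# A business falls under the law if it meets ANY one of the three prongs.

def ccpa_applies(annual_gross_revenue: float,
                 consumers_households_or_devices: int,
                 share_of_revenue_from_selling_data: float) -> bool:
    """Return True if any of the three CCPA thresholds is met."""
    return (
        annual_gross_revenue > 25_000_000               # prong 1
        or consumers_households_or_devices > 50_000     # prong 2
        or share_of_revenue_from_selling_data >= 0.5    # prong 3
    )

# A small employer with modest revenue and no data sales is out of scope.
print(ccpa_applies(5_000_000, 2_000, 0.0))  # False
```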
As a result, many California employers operate under the assumption that there are no applicable legal requirements that must be satisfied when using biometrics in their day-to-day operations. Organizations that take this approach do so at their peril, as California Labor Code § 1051 imposes clear, unambiguous requirements and limitations on employers that utilize certain biometric data in the workplace. More than that, this California law—which often flies under the radar of many employers and even their biometric privacy counsel—can result in criminal penalties for noncompliance.
California Labor Code § 1051
Specifically, Labor Code § 1051 bars employers that require employees or job applicants to furnish their fingerprints from disclosing that fingerprint biometric data to any third party. For example, employers are generally barred under Labor Code § 1051 from disclosing fingerprints to other employers to prevent subsequent employment, or to law enforcement agencies unless required pursuant to a court order or subpoena. Any employer that violates Labor Code § 1051 is guilty of a misdemeanor.
Practical Compliance Tips
Consequently, California employers must proceed with caution when using fingerprint biometric data in the workplace and ensure they are in strict compliance with Labor Code § 1051 when collecting, using, or storing employee fingerprint biometrics.
To do so, employers should first ensure that their biometrics service providers and vendors are completely precluded from accessing any fingerprint data collected by the employer through the service provider/vendor’s technology.
In addition, employers must maintain robust policies and protocols to prevent inadvertent disclosures of employee fingerprint data to any third parties, as a mishap of this nature—while unintentional—nonetheless runs afoul of Labor Code § 1051.
Similarly, employers must maintain robust security measures to safeguard employee fingerprint data, as any unauthorized acquisition of such data by hackers or other malicious third parties also constitutes a violation of Labor Code § 1051.
To make matters worse, any inadvertent disclosure or other data compromise event also likely constitutes a violation of employees’ right to privacy under the California Constitution. And if that wasn’t enough, breach incidents involving the compromise of fingerprint data will also oftentimes form the basis for an actionable violation of the CCPA, opening the door for class action litigation. It should be noted, however, that Labor Code § 1051 does not apply to employers’ use of other types of biometric data (only fingerprint data) and is inapplicable outside of the employment context.
David J. Oberly
The city of Portland, Oregon, made headlines last year when it became the first jurisdiction in the nation to enact a blanket ban on the use of facial recognition technology (“FRT”) by all private entities physically located within its city limits. While many cities have banned the use of face biometrics by law enforcement and parts of the public sector, the Portland ordinance is noteworthy because it drastically expanded the scope of this new type of regulation to also reach the private sector.
Since that time, the city of Baltimore, Maryland, followed suit with a similar private-sector facial biometrics ban of its own. More jurisdictions, including both cities and potentially states as well, are likely to add new laws mirroring those of Portland and Baltimore in the immediate future, especially as facial recognition continues to receive regular negative media coverage highlighting its claimed shortcomings, including potential accuracy and bias problems.
Taken together, all companies that operate in Portland and use any type of software or other technology that may capture images of individuals’ faces should evaluate whether the new ordinance applies to them and, if so, take immediate action to ensure compliance with the law. And from a broader perspective, as this draconian type of biometric privacy regulation is likely to expand to additional parts of the country moving forward, companies that use or intend to use any type of facial recognition technology need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.
- Scope/Applicability: The Portland ordinance bars the use of “facial recognition technologies” by “private entities” in “places of public accommodation” within the City of Portland.
- “Private Entity”: The ordinance defines the term “private entity” in similar fashion to the Illinois Biometric Information Privacy Act (“BIPA”) as “any individual, sole proprietorship, partnership, limited liability company, association, or any other legal entity, however organized.”
- “Face Recognition Technologies”: Face recognition technologies means “automated or semi-automated processes using Face Recognition that assist in identifying, verifying, detecting, or characterizing facial features of an individual or capturing information about an individual based on an individual’s face.”
- “Face Recognition”: Face recognition, in turn, is defined as “the automated searching for a reference image in an image repository by comparing the facial features of a probe image with the features of images contained in an image repository (one-to-many search).”
- “Places of Public Accommodation”: “Places of public accommodation” is defined broadly to mean “[a]ny place or service offering to the public accommodations, advantages, facilities, privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise.”
- Certain Places of Public Accommodation: Excluded from the scope of the ordinance are “institution[s], bona fide club[s], private residence[s], [and] place[s] of accommodation that [are] in [their] nature distinctly private.”
- Legal Compliance: The ordinance does not apply to the use of FRT to the extent necessary to comply with federal, state, or local laws.
- User Verification: The ordinance does not apply to the use of FRT for user verification purposes, but only in the narrow context of allowing an individual to access his or her individual or employer-issued communication or electronic device.
- Automatic Face Detection: Finally, the ordinance does not apply to the use of FRT “[i]n automatic face detection services in social media applications.”
Core Compliance Requirement
- Prohibition on FRT Use: Under the ordinance, private entities are barred from using face recognition technologies in places of public accommodation within city limits.
Enforcement and Remedies
- Private Right of Action: Any person “injured” by a material violation of the ordinance may pursue a class action against the offending private entity.
- Recoverable Damages: A person injured by a violation of the ban can recover $1,000 per day for each day of the violation or actual damages sustained as a result of the violation, whichever is greater, as well as “such other remedies as may be appropriate.”
- Attorneys’ Fees: Attorneys’ fees are also recoverable, but only if certain actions are taken by the injured person before filing suit. Specifically, a plaintiff must submit a written demand for the payment of a claim on the offending private entity and its insurer (if known to the plaintiff) at least 30 days before the filing of the complaint. Where this is completed, a court may award to a prevailing plaintiff a “reasonable amount” of attorneys’ fees. Conversely, a plaintiff cannot recover attorneys’ fees if, before suit was filed, the offending private entity tendered to the plaintiff an amount that is at least equivalent to the damages awarded to the plaintiff in the litigation, exclusive of any costs, interest, and prevailing party fees.
Practical Compliance Tips & Best Practices
Companies operating in Portland should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Portland ordinance if the organization falls under the scope of the law:
- Determine Whether Technology Falls under Scope of Law: First, companies should determine if their technology falls under the scope of the law. To do so, the system must engage in identifying, verifying, detecting, or characterizing facial features or capture information about an individual based on his or her facial features.
- Evaluate Applicability of Exceptions to Ban: If the technology is found to fall under the scope of the ban, the next step is to evaluate whether any of the limited exemptions offered by the ordinance can be satisfied to allow the company to continue its use of the technology.
- Cease All Use of FRT If No Exceptions Apply: If none of the exceptions apply, the company must immediately cease all use of its FRT.
- Identify Availability of Any Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current FRT should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.
David J. Oberly
2021 has brought with it a sizeable expansion in the types of technology and companies that are now being targeted with bet-the-company Illinois Biometric Information Privacy Act (“BIPA”) class action lawsuits. The first major expansion involved the targeting of virtual try-on technology, a feature made even more popular during the COVID-19 pandemic, which, according to plaintiffs, utilizes facial recognition technology. More recently, a high volume of BIPA class action suits have been filed targeting the use of voice-powered technologies.
BIPA & Voice Data
BIPA regulates the collection, use, and storage of “biometric identifiers,” which includes—among other things—“voiceprints.” However, the term “voiceprint” is not defined in Illinois’ biometric privacy statute. “Voiceprint” is generally defined as a distinctive pattern of curved lines and whorls made by a machine that measures human vocal sounds for the purpose of identifying an individual speaker. It is this hallmark of identifying (or verifying the identity of) an individual that makes voice data a “voiceprint” under BIPA. In this respect, courts have noted that voice biometrics, also known as voiceprinting, is the use of biological characteristics—one’s voice—to verify an individual’s identity.
Thus, a critical distinction exists between general voice data, which is not covered by BIPA, and voiceprints, which fall under the scope of Illinois’ biometric privacy statute—with the important dividing line being the identifying quality of the biometric information. In a 2017 case, an Illinois federal court recognized this distinction, noting the difference between the mere capture of voice data and an actual “voiceprint.” In doing so, the court explained that if an entity simply captures a person’s voice without generating a voiceprint for the specific purpose of identifying, or verifying the identity of, an individual, then there is no violation of BIPA.
Class action litigation against biometric technology manufacturers and vendors is on the rise. Several courts have recognized the viability of such claims and held manufacturers/vendors may be subject to liability under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (“BIPA”). 740 ILCS 14/15(b) & (d); Figueroa v. Kronos Inc., 454 F. Supp. 3d 772, 784-86 (N.D. Ill. 2020). The merits of these BIPA claims are yet undetermined. But the risk of having to defend such claims in state and federal courts is real and ongoing.
As the saying goes, the best defense is a good offense. Rather than face uncertain liability, or incur exorbitant litigation defense costs, potential BIPA defendants often turn to arbitration provisions. For manufacturers/vendors of biometric technology, however, this approach may not be that simple.
David J. Oberly
New York City (“NYC”) has quickly become one of the newest hotbeds of biometric privacy legislative activity, having enacted several laws since the start of 2021 that directly govern the collection and use of biometric data.
In addition to the New York City Tenant Data Privacy Act (“TDPA”), which regulates the use of biometric data by owners and operators of “smart access buildings,” New York City Council also enacted the nation’s first municipal-level biometric privacy law regulating “commercial establishments” (the “NYC Biometrics Ordinance”), which went into effect on July 9, 2021.
Because the NYC Biometrics Ordinance will almost certainly not be the last of its kind, commercial establishments that utilize biometric data in their business operations—even those located beyond the borders of the Big Apple—should take proactive steps to implement robust biometric privacy compliance programs, both to ensure continued compliance with current and anticipated biometrics laws and to mitigate potential liability exposure.
- Scope/Applicability: The NYC Biometrics Ordinance applies to the collection and use of “biometric identifier information” by “commercial establishments.”
- “Biometric Identifier Information”: Biometric identifier information is defined in broad terms as any “physiological or biological characteristic that is used by or on behalf of a commercial establishment, singly or in combination, to identify, or assist in identifying, an individual, including but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic.”
- “Commercial Establishment”: Commercial establishment is broadly defined to mean “a place of entertainment, a retail store, or a food and drink establishment.”
David J. Oberly
On September 17, 2021, the Illinois Appellate Court First District delivered its much-anticipated decision in Tims v. Black Horse Carriers, Inc., 2021 IL App (1st) 200563 (1st Dist. Sep. 17, 2021), addressing the applicable statute of limitations for causes of action asserted under the Illinois Biometric Information Privacy Act (“BIPA”).
Importantly, in finding that BIPA’s two most commonly asserted provisions, Sections 15(a) and (b), are subject to the longer five-year limitations period, the opinion ensures that the tsunami of class action BIPA filings will continue to flood the courts for the foreseeable future.