Categories: Biometric Privacy Compliance Tips, Biometric Privacy Legal Landscape, Case Law Developments

Illinois Supreme Court Dramatically Expands Liability by Ruling Each Scan of a Biometric Identifier Is a Separate Violation

Amanda M. Noonan

In a 4-3 split, the Illinois Supreme Court ruled earlier this month that claims under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (“BIPA”) accrue each time a private entity scans a person’s biometric identifier or transmits such a scan to a third party, rather than only upon the first collection or disclosure. Cothron v. White Castle System, Inc., 2023 IL 128004 (Feb. 17, 2023). This decision, which dramatically expands the scope of potential liability for BIPA defendants, comes just weeks after the Illinois Supreme Court held that a five-year statute of limitations applies to all BIPA causes of action in Tims v. Blackhorse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023).

Cothron’s ruling on claim accrual, coupled with Tims’ resolution of the statute of limitations question, will have an immense and immediate impact on BIPA class action lawsuits, many of which had been stayed pending these decisions.

For many businesses that use biometric time clocks, which scan biometric identifiers to track employee time and attendance, this means a new BIPA violation accrues each time an employee scans in or out of work. Combined with the five-year limitations period, BIPA defendants may now face hundreds, if not thousands, of independent BIPA violations for a single complainant.
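
To put rough numbers on that exposure, the short sketch below walks through the multiplication for a single hypothetical employee. The scan frequency, workday count, and years of employment are illustrative assumptions only; the $1,000 and $5,000 figures are BIPA’s statutory damages amounts for negligent and intentional or reckless violations, respectively.

    # Hypothetical illustration of per-scan BIPA accrual for one employee.
    # The usage figures are assumptions for illustration only; the statutory
    # damages amounts ($1,000 / $5,000) come from 740 ILCS 14/20.
    SCANS_PER_DAY = 2        # assumed: one scan in, one scan out per shift
    WORKDAYS_PER_YEAR = 250  # assumed
    YEARS = 5                # five-year limitations period recognized in Tims

    violations = SCANS_PER_DAY * WORKDAYS_PER_YEAR * YEARS  # 2,500 separate violations
    negligent_exposure = violations * 1_000                 # $2,500,000
    reckless_exposure = violations * 5_000                  # $12,500,000

    print(f"Violations for one employee: {violations:,}")
    print(f"Exposure at $1,000 per violation: ${negligent_exposure:,}")
    print(f"Exposure at $5,000 per violation: ${reckless_exposure:,}")

Under those assumptions, a single employee’s claims alone would amount to 2,500 violations and between $2.5 million and $12.5 million in statutory damages, before any class is certified.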

Categories: Biometric Privacy Legal Landscape, Case Law Developments, Class Action Litigation Defense Strategies

Illinois Supreme Court Holds Five-Year Statute of Limitations Applies to All Biometric Information Privacy Act Claims

Amanda M. Noonan

In a highly anticipated decision, the Illinois Supreme Court in Tims v. Blackhorse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023), recently resolved longstanding uncertainty about the statute of limitations under the Illinois Biometric Information Privacy Act (“BIPA”). The Court held that all claims arising under BIPA are governed by the five-year “catch-all” limitations period provided by section 13-205 of the Illinois Code of Civil Procedure. See 735 ILCS 5/13-205. In so holding, the Court adopted the more expansive of the two limitations periods at issue and rejected the defendant’s (and the broader defense bar’s) contention that Illinois’ one-year limitations period for certain privacy and defamation actions should extend to all BIPA actions.

Notably, the Supreme Court reversed, in part, the First District Illinois Appellate Court’s decision, which had incongruously applied a one-year limitations period to claims arising under Sections 15(c) and 15(d) but a five-year limitations period to BIPA actions accruing under Sections 15(a), 15(b), and 15(e). Under the Appellate Court’s reasoning, Sections 15(c) and 15(d) included elements of publication analogous to certain common law privacy torts and, for that reason, required application of Illinois’ one-year statute of limitations for “actions for slander, libel or for publication of matter violating the right of privacy.” See 735 ILCS 5/13-201. At the same time, the Appellate Court applied the “catch-all” five-year limitations period to claims under Sections 15(a), 15(b), and 15(e), reasoning that no publication element was involved. See 735 ILCS 5/13-205.

Categories: Biometric Privacy Legal Landscape, Case Law Developments, Class Action Litigation Defense Strategies

First Biometric Privacy Jury Trial Results in Massive $228 Million Verdict

Amanda M. Noonan

A federal district court in the Northern District of Illinois conducted the first-ever jury trial in an Illinois Biometric Information Privacy Act (“BIPA”) case. On October 12, 2022, the jury returned a verdict for the plaintiff and more than 45,000 class members on their claims that defendant BNSF Railway (“BNSF”) recklessly violated BIPA. See Rogers v. BNSF Railway Co., No. 1:19-cv-03083 (N.D. Ill. Oct. 12, 2022). Plaintiffs’ claims centered on BNSF’s collection of their fingerprints, used to verify their identities and allow access to BNSF’s facilities, without first obtaining the written consent required under BIPA Section 15(b).

After a five-day trial, and only an hour of deliberations, the jury found BNSF not only violated BIPA 45,600 times, but did so intentionally or recklessly under 740 ILCS 14/20(2). The jury’s finding on that issue quintupled the damages award to $5,000 per violation, as opposed to the $1,000 available per negligent violation. As a result, District Judge Matthew Kennelly entered a $228 million damages award in the plaintiffs’ favor following the verdict. BNSF has stated it intends to appeal.
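
As a quick check on the figures reported above, the verdict math is a single multiplication of the jury’s violation count by the enhanced statutory amount (a minimal sketch, using only the numbers from the verdict):

    # Verdict arithmetic from Rogers v. BNSF, as reported above.
    violations = 45_600      # violations found by the jury
    per_violation = 5_000    # enhanced amount for intentional or reckless violations
    print(f"${violations * per_violation:,}")  # $228,000,000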

The implications of the verdict loom large. On the plaintiffs’ side, counsel will likely accelerate the already large-scale pace of BIPA filings and push for higher settlement amounts, using the prospect of a successful jury trial as a bargaining chip. Given the stakes, BIPA defendants may be more inclined to seek early resolution once named in a BIPA class action, rather than risk bet-the-company litigation.

Considering the verdict, early compliance efforts by companies implementing biometric technology are even more crucial to avoiding BIPA litigation in the first instance. Significantly, companies using any technology that could arguably constitute biometrics, regardless of its sophistication, may be targeted by zealous plaintiffs’ attorneys seeking to join the ever-increasing cascade of BIPA class action filings. Biometric privacy counsel should thus be consulted at the earliest possible opportunity to develop compliance strategies that protect against the catastrophic risk of a BIPA verdict.

Categories: Biometric Privacy Legal Landscape, The Lighter Side of Biometrics

Delta Air Lines Debuts “Parallel Reality” Biometric Flight Information Display

Rachel Evans*

On June 29, 2022, travelers at the Detroit Metropolitan Airport were the first to interact with a new flight information display that uses facial recognition technology “to identify participating travelers and show them the appropriate information.”

How It Works

Customers can opt in to the experience by either scanning their boarding pass or activating facial recognition at the Parallel Reality kiosk to check in to their flight and receive day-of-travel information at their fingertips (or, more appropriately, at their facial scan).

Once a customer has checked in and approaches the flight information board, cameras embedded in the board match the customer to their picture and engage multi-view pixels to display a unique message that only the intended customer can see.

Multiple travelers can look at the display simultaneously, with each receiving completely different, personalized information relating to their own travel plans.

Categories: Legislative Developments & Trends

Congressional Hearing Update: “Privacy in the Age of Biometrics”

Rachel Evans*

On June 29, 2022, the Subcommittee on Investigations and Oversight for the House Committee on Science, Space, and Technology heard testimony from multiple experts about the expansion of biometric technology and the future of American privacy law. This hearing signals Congress’ intention to consider new legislation aimed at protecting privacy interests while simultaneously encouraging the expansion of biometric technologies.

Chairman Bill Foster (D-IL) opened the hearing with concerns that the broad impact of the U.S. Supreme Court’s Dobbs v. Jackson Women’s Health Organization decision would significantly weaken privacy protections, explaining that “[t]he timing of our discussion … is notable[.] [T]his makes protecting Americans’ biometric data more important than ever.” Congress’ focus will now “be on how technological solutions can secure our privacy while allowing us to enjoy the benefits of biometric tools.”

Categories: Biometric Privacy Legal Landscape, Case Law Developments

Q1 Biometric Privacy Litigation Update

Amanda M. Noonan

In the first quarter of 2022, there have already been significant legal developments in the biometric technology space. Most notably, the Illinois Supreme Court, which has actively taken Illinois Biometric Information Privacy Act (“BIPA”) cases amid the surge of such class action litigation in federal and state courts, issued several consequential BIPA opinions this year, though 2022’s most critical BIPA decisions are likely still on the horizon.

Categories: Biometric Privacy Legal Landscape, Legislative Developments & Trends

California Legislature Introduces Expansive Biometric Privacy Law

Amanda M. Noonan

On February 17, 2022, the California Legislature introduced a biometric privacy bill (SB 1189) similar to the Illinois Biometric Information Privacy Act (“BIPA”). SB 1189 would dramatically increase biometric privacy protections for California consumers, expand regulation of private businesses, and add to the flurry of biometric privacy class action litigation that has taken hold of U.S. courts.

Categories: Biometric Privacy Compliance Tips

Practical Compliance Tips: Baltimore Private-Sector Facial Recognition Ban

David J. Oberly

In mid-2021, Baltimore, Maryland, passed Council Bill 21-0001 (the “FRT Ordinance”), becoming the second U.S. jurisdiction to enact sweeping facial recognition regulation that bans the use of facial biometrics by any private entity or individual within city limits.

While a number of cities have enacted laws prohibiting law enforcement and other governmental agencies from using facial recognition, Portland, Oregon, enacted the nation’s first blanket ban on the use of this technology by the private sector at the beginning of 2021. The Baltimore FRT Ordinance goes even further than Portland’s by imposing criminal penalties of up to a year in jail on companies and individuals that run afoul of the ban.

As federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy regulatory regime, companies should anticipate that cities and states will continue to take the lead in implementing new biometrics regulation in 2022. In particular, the success of Baltimore and Portland in enacting outright bans on the commercial use of facial recognition software is likely to encourage lawmakers in other cities and states to follow suit by enacting tighter controls over the collection and use of facial geometry data.

Taken together, these developments mean that all businesses operating in Baltimore and using any type of facial recognition software should assess whether the Baltimore FRT Ordinance applies to them and, if so, take prompt measures to ensure compliance with the law. From a broader perspective, because this strict type of biometric privacy regulation is likely to expand to additional parts of the country, companies that use or intend to use facial recognition technology (“FRT”) need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.

Overview

  • Scope/Applicability: The Baltimore ordinance bars “persons” from obtaining, retaining, accessing, or using any “face surveillance system,” or any information obtained from a face surveillance system, within the City of Baltimore.
  • “Person”: The ordinance defines the term “person” as any individual, partnership, firm, association, corporation, other entity, receiver, trustee, guardian, personal representative, or fiduciary.
  • “Face Surveillance System”: “Face surveillance system” means “any computer software or application that performs face surveillance.”
  • “Face Surveillance”: “Face surveillance,” in turn, is defined as “an automated or semi-automated process that assists in identifying or verifying an individual based on the physical characteristics of the individual’s face.”

Exemptions

  • Access Control Systems: Excluded from the scope of the ordinance are “biometric security system[s] designed specifically to protect against unauthorized access to a particular location or an electronic device.”
  • Maryland Image Repository System: Also excluded from the scope of the ordinance is the Maryland Image Repository System (facial recognition software that allows law enforcement to compare images of unidentified individuals to images from motor vehicle records and criminal mugshots).

Core Compliance Requirement

  • Prohibition on FRT Use: Under the ordinance, a person may not obtain, retain, access, or use in Baltimore City: (1) any face surveillance system; or (2) any information obtained from a face surveillance system.

Enforcement and Remedies

  • Misdemeanor: Any person who violates the Baltimore FRT Ban is guilty of a misdemeanor and subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both fine and imprisonment.
  • Each Day a Separate Offense: Each day that a violation continues is a separate offense.

Practical Compliance Tips & Best Practices

All businesses that maintain operations in Baltimore should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps, summarized in a brief sketch after the list below, to determine whether the ban applies to their operations and, if it does, to come into compliance with the Baltimore ordinance:

  • Determine Whether Technology Falls under Scope of Law: First, companies should determine whether their technology falls under the scope of the law, i.e., whether the system assists in identifying or verifying individuals based on their facial characteristics.
  • Evaluate Applicability of Access Control Exemption: If the technology falls under the scope of the ban, evaluate whether the narrow exemption offered by the ordinance for facial recognition-powered access control systems applies to allow the company to continue its use of the technology.
  • Cease Use if Exemption Inapplicable: If the technology does not serve the purpose of protecting against unauthorized access to a particular location or electronic device, eliminate the use of facial recognition across the board immediately.
  • Identify Availability of Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current facial biometrics technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.
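
The decision sequence described in the action steps above can be summarized in a short, purely illustrative sketch. It is not legal advice, and the class fields and outcome strings are hypothetical shorthand rather than terms drawn from the ordinance itself:

    # Illustrative sketch of the compliance steps above (hypothetical labels; not legal advice).
    from dataclasses import dataclass

    @dataclass
    class FacialTechnology:
        identifies_or_verifies_by_face: bool         # performs "face surveillance"?
        protects_access_to_location_or_device: bool  # fits the access-control exemption?

    def assess_baltimore_frt(tech: FacialTechnology) -> str:
        # Step 1: does the technology fall under the scope of the ban?
        if not tech.identifies_or_verifies_by_face:
            return "Outside the scope of the ban"
        # Step 2: does the narrow access-control exemption apply?
        if tech.protects_access_to_location_or_device:
            return "Likely exempt access-control system; use may continue"
        # Step 3: otherwise, cease use in Baltimore and evaluate alternatives.
        return "Prohibited; cease use and evaluate alternative technologies"

    # Example: facial recognition used for marketing analytics, not access control
    print(assess_baltimore_frt(FacialTechnology(True, False)))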

Categories: Case Law Developments

Claim Accrual Ruling Could Bring Seismic Shift to Biometric Privacy Landscape in 2022

Amanda M. Noonan

At the end of 2021, two developments laid the groundwork for a definitive resolution of one of the most significant, yet unsettled, issues under the Illinois Biometric Information Privacy Act (“BIPA”): claim accrual. While all litigants would appreciate some certainty surrounding this hot-button issue, how the Illinois Supreme Court resolves when a BIPA violation “accrues” (i.e., occurs) in the coming term will shape the trajectory of all BIPA litigation for years to come.

Watson and Cothron BIPA Decisions

In mid-December 2021, an Illinois appellate panel in Watson v. Legacy Healthcare Financial Services, LLC, held that BIPA claims accrue each and every time a defendant captures biometric information in violation of the statute, as opposed to accruing only at the first instance of collection.

Just a few days after Watson, the Seventh Circuit Court of Appeals issued its decision in Cothron v. White Castle System, Inc.—another appeal involving claim accrual. But rather than decide when a BIPA claim accrues, and after acknowledging the existence of Watson, the Cothron court certified the question to the Illinois Supreme Court to provide definitive guidance.

While neither Watson nor Cothron offers a conclusive answer, the issue is now teed up to be definitively decided by Illinois’ highest court.

Impact & Implications

The accrual date is a significant issue in BIPA class action litigation. Depending on the circumstances, accrual can serve as the basis for a statute of limitations defense, which, if successful, may require dismissal. But the issue is even more consequential in the context of damages and determining the overall value of a biometric privacy class action. If continuing BIPA violations constitute separate, independent claims, then statutory damages of $1,000 per negligent violation (or $5,000 per intentional or reckless violation) accumulate with every violation. And because the law provides for liquidated damages for each violation, a ruling that claims accrue each time a defendant runs afoul of the law’s requirements could expand such liability dramatically.

Conclusion

Companies should pay close attention to how the Illinois Supreme Court decides the Cothron appeal, as the ruling could result in a drastic shift in the biometric privacy legal landscape. In the interim, companies should—with the assistance of experienced biometric privacy counsel—take the time to reassess their compliance with BIPA to ensure they are satisfying the full range of requirements to mitigate potential class action risk.

Categories: Biometric Privacy Compliance Tips

Beware of Hidden Pitfalls: Biometric Privacy Guidance for California Employers

David J. Oberly

By now, most Golden State employers are well versed in the California Consumer Privacy Act of 2018 (“CCPA”), as well as its soon-to-be successor, the California Privacy Rights Act of 2020 (“CPRA”), which goes into effect at the start of 2023.

At the same time, more and more employers operating in California (and in other parts of the nation) are integrating biometrics into their operations in a variety of ways, including for timekeeping and access control purposes, among others.

Many of these employers need not worry about satisfying the onerous requirements of the CCPA because they do not meet any of the law’s three applicability thresholds: (1) more than $25,000,000 in annual gross revenue; (2) buying, receiving, sharing, or selling the personal information of 50,000 or more consumers, households, or devices; or (3) deriving 50 percent or more of annual revenue from the sale of personal information.

And those employers that do fall under the scope of California’s consumer privacy law are largely exempted from compliance in connection with the personal information of employees (and job applicants) under the CCPA’s employee information exemption.

Taken together, these carve-outs lead many California employers to operate under the assumption that no legal requirements apply when they use biometrics in their day-to-day operations. Organizations that take this approach do so at their peril, as California Labor Code § 1051 imposes clear, unambiguous requirements and limitations on employers that utilize certain biometric data in the workplace. More than that, this California law, which often flies under the radar of many employers and even their biometric privacy counsel, can result in criminal penalties for noncompliance.

California Labor Code § 1051

Specifically, Labor Code § 1051 bars employers that require employees or job applicants to furnish their fingerprints from disclosing that fingerprint biometric data to any third party. For example, employers are generally barred under Labor Code § 1051 from disclosing fingerprints to other employers to prevent subsequent employment, or to law enforcement agencies unless required pursuant to a court order or subpoena. Any employer that violates Labor Code § 1051 is guilty of a misdemeanor.

Practical Compliance Tips

Consequently, California employers must proceed with caution when using fingerprint biometric data in the workplace and ensure they are in strict compliance with Labor Code § 1051 when collecting, using, or storing employee fingerprint biometrics.

To do so, employers should first ensure that their biometrics service providers and vendors are completely precluded from accessing any fingerprint data collected by the employer through the service provider/vendor’s technology.

In addition, employers must maintain robust policies and protocols to prevent inadvertent disclosures of employee fingerprint data to any third parties, as a mishap of this nature, while not intentional, nonetheless runs afoul of Labor Code § 1051.

Similarly, employers must maintain robust security measures to safeguard employee fingerprint data, as any unauthorized acquisition of such data by hackers or other malicious third parties also constitutes a violation of Labor Code § 1051.

To make matters worse, any inadvertent disclosure or other data compromise event also likely constitutes a violation of employees’ right to privacy under the California Constitution. And if that weren’t enough, breach incidents involving the compromise of fingerprint data will also oftentimes form the basis for an actionable violation of the CCPA, opening the door to class action litigation. It should be noted, however, that Labor Code § 1051 applies only to fingerprint data, not to other types of biometric data, and is inapplicable outside the employment context.