Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

NYC Introduces Bills to Limit Facial Recognition in Private Sector

Tianmei Ann Huang

New York City Council (“Council”) members are expected to formally introduce two Local Laws on April 27, 2023, during the next Council meeting, seeking to regulate private-sector use of facial recognition (or similar surveillance technology) for identification or verification purposes.

The first bill would amend New York City’s administrative code to prohibit businesses and venues from using “biometric identifier information” (e.g., face scans) to identify or verify customers without first obtaining their written consent. Covered businesses and venues would also be required to develop and make publicly available a retention-and-destruction policy, and to comply with certain data protection, privacy, and security obligations. The proposal also includes a private right of action for civil damages of up to $500 per negligent violation and up to $5,000 per intentional or reckless violation, as well as attorneys’ fees.

The second bill would ban owners of “multiple dwelling” properties (e.g., residential buildings) from installing, activating, or using “biometric recognition technology” to identify tenants or their guests. The legislation, if enacted, would be one of the first laws to place city-wide restrictions on the use of biometric recognition technology in the private sector.

Given the introduction of these dual bills, companies in NYC that currently collect biometric data, or are considering doing so, are encouraged to consult experienced counsel to implement protective compliance measures before they become targets of civil litigation.

Categories: Biometric Privacy Compliance Tips | Case Law Developments | Class Action Litigation Defense Strategies

Illinois Supreme Court: Federal Labor Law Preempts Union Employees’ BIPA Claims

Tianmei Ann Huang

The Illinois Supreme Court in Walton v. Roosevelt University, 2023 IL 128338 (Mar. 23, 2023), unanimously affirmed dismissal of a putative class action arising under the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 (“BIPA”), concluding that federal labor law preempts BIPA claims brought by unionized employees covered by a collective bargaining agreement (“CBA”). Consistent with Seventh Circuit decisions in support of federal preemption, Walton holds that Section 301 of the federal Labor Management Relations Act (“LMRA”), 29 U.S.C. § 185, preempts BIPA claims asserted in Illinois state courts by union employees (or bargaining unit employees) covered by a CBA. The federal preemption defense may therefore be used to foreclose these unionized employees from bringing BIPA claims in state and federal courts alike, including on a class action basis.

In Walton, the representative plaintiff was a member of a union subject to a CBA, which included a broad management-rights clause, during his employment with Roosevelt University. The putative class alleged that Roosevelt University used scanning devices to enroll employees’ hand geometry scans for timekeeping purposes without fulfilling BIPA’s Section 15 requirements. Under the LMRA, however, the provisions of the CBA govern; even if “biometric” data is not expressly discussed in the CBA, a broad management-rights clause, together with provisions on employee timekeeping and grievance resolution procedures, may be sufficient to preclude BIPA litigation.

Overall, the Walton decision offers a measure of relief to defendants in BIPA disputes brought by union employees, particularly following the liability-expanding Illinois Supreme Court decisions in Cothron and Tims, as previously discussed. To avoid future litigation, employers should carefully exercise their exclusive rights to direct employees covered by a CBA or other contract.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Case Law Developments

Illinois Supreme Court Dramatically Expands Liability by Ruling Each Scan of a Biometric Identifier Is a Separate Violation

Amanda M. Noonan

In a 4-3 split, the Illinois Supreme Court ruled earlier this month that claims under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (“BIPA”) accrue each time a private entity scans a person’s biometric identifier and/or submits such scan to a third party—rather than only upon first collection. Cothron v. White Castle System, Inc., 2023 IL 128004 (Feb. 17, 2023). This decision—which dramatically expands the scope of potential liability for BIPA defendants—comes just weeks after the Illinois Supreme Court held a five-year statute of limitations applies to all BIPA causes of action in Tims v. Blackhorse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023).

The impact of Cothron on claim accrual, coupled with Tims’ resolution of the statute of limitations, will have an immense and immediate impact on BIPA class-action lawsuits—many of which had been stayed pending these decisions.

For many businesses that implement biometric time clocks, which scan biometric identifiers to track employee time and attendance, this means a new BIPA violation accrues each time an employee scans in and out of work. Combined with the five-year statute of limitations, BIPA defendants may now face hundreds, if not thousands, of independent BIPA violations for a single claimant.
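A back-of-the-envelope sketch, using purely hypothetical figures rather than numbers from either opinion, illustrates how quickly per-scan accrual compounds:

```python
# Hypothetical illustration of per-scan accrual under Cothron combined with
# the five-year lookback under Tims. None of these figures come from the
# opinions; they are illustrative assumptions only.

scans_per_day = 4           # hypothetical: clock in/out plus a meal break
work_days_per_year = 250    # hypothetical work schedule
limitations_years = 5       # five-year statute of limitations (Tims)

violations = scans_per_day * work_days_per_year * limitations_years
print(violations)  # 5000 separate violations for a single employee

# BIPA liquidated damages: $1,000 per negligent violation, $5,000 per
# intentional or reckless violation (740 ILCS 14/20).
print(f"${violations * 1_000:,} to ${violations * 5_000:,}")
# $5,000,000 to $25,000,000
```

Because exposure scales linearly with assumed scan frequency and headcount, even modest workplaces can generate enormous aggregate figures, which is why the accrual question mattered so much to the defense bar.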

Categories: Biometric Privacy Legal Landscape | Case Law Developments | Class Action Litigation Defense Strategies

Illinois Supreme Court Holds Five-Year Statute of Limitations Applies to All Biometric Information Privacy Act Claims

Amanda M. Noonan

In a highly anticipated decision, the Illinois Supreme Court in Tims v. Blackhorse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023), resolved longstanding uncertainty about the statute of limitations under the Illinois Biometric Information Privacy Act (“BIPA”). The Court held that all claims arising under BIPA are governed by the five-year “catch-all” statute of limitations provided by section 13-205 of the Illinois Code of Civil Procedure. See 735 ILCS 5/13-205. In so holding, the Court adopted the more expansive of the two limitations periods at issue, rejecting the defendant’s, and the broader defense bar’s, contention that Illinois’ one-year limitations period for certain privacy and defamation actions should extend to all BIPA actions.

Notably, the Supreme Court reversed, in part, the First District Illinois Appellate Court’s decision, which incongruently applied a one-year limitations period to claims arising under Sections 15(c) and 15(d) but a five-year limitations period to BIPA actions accruing under Sections 15(a), 15(b), and 15(e). Under the Appellate Court’s reasoning, Sections 15(c) and 15(d) include elements of publication analogous to certain common law privacy torts, and for that reason required application of Illinois’ one-year statute of limitations for “actions for slander, libel or for publication of matter violating the right of privacy.” See 735 ILCS 5/13-201. At the same time, the Appellate Court applied the five-year “catch-all” statute of limitations to claims under Sections 15(a), 15(b), and 15(e), reasoning that no publication element was involved. See 735 ILCS 5/13-205.

Categories: Biometric Privacy Legal Landscape | Case Law Developments | Class Action Litigation Defense Strategies

First Biometric Privacy Jury Trial Results in Massive $228 Million Verdict

Amanda M. Noonan

A federal district court in the Northern District of Illinois conducted the first-ever jury trial in an Illinois Biometric Information Privacy Act (“BIPA”) case. On October 12, 2022, the jury returned a verdict for the plaintiff, and more than 45,000 class members, regarding defendant BNSF Railway’s (“BNSF”) reckless violations of BIPA. See Rogers v. BNSF Railway Co., No. 1:19-cv-03083 (N.D. Ill. Oct. 12, 2022). Plaintiffs’ claims centered on BNSF’s collection of class members’ fingerprints, used to verify their identities and allow access to BNSF’s facilities, without first obtaining their written consent, as required under BIPA Section 15(b).

After a five-day trial, and only an hour of deliberations, the jury found BNSF not only violated BIPA 45,600 times, but did so intentionally or recklessly under 740 ILCS 14/20(2). The jury’s finding on that issue quintupled the per-violation damages to $5,000, as opposed to the $1,000 available per negligent violation. As a result, District Judge Matthew Kennelly entered a $228 million damages award in plaintiffs’ favor following the verdict. BNSF has stated it intends to appeal.
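The arithmetic behind the headline number is straightforward: the $228 million award corresponds to one enhanced statutory award of $5,000 per violation.

```python
# The math behind the $228 million Rogers v. BNSF award: one statutory
# award per violation, at BIPA's enhanced rate for intentional/reckless
# conduct (740 ILCS 14/20(2)) rather than the negligence rate (14/20(1)).

TOTAL_AWARD = 228_000_000
NEGLIGENT_RATE = 1_000   # per negligent violation
RECKLESS_RATE = 5_000    # per intentional or reckless violation

violations = TOTAL_AWARD // RECKLESS_RATE
print(violations)  # 45600 enhanced awards of $5,000 each

# A negligence-only finding on the same violation count would have cut
# the award to one fifth:
print(f"${violations * NEGLIGENT_RATE:,}")  # $45,600,000
```

The fivefold gap between the two statutory rates is why the jury’s recklessness finding, not just liability itself, drove the size of the judgment.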

The implications of the verdict loom large. On the plaintiffs’ side, counsel will likely accelerate the already large-scale pace of BIPA filings and push for higher settlement amounts, using the prospect of a successful jury trial as a bargaining chip. Given the stakes, BIPA defendants may be more inclined to seek early resolution once named in a BIPA class action rather than risk bet-the-company litigation.

Considering the verdict, early compliance efforts by companies implementing biometric technology are even more crucial to avoid BIPA litigation in the first instance. Significantly, companies using any technology that could arguably constitute biometrics, regardless of its sophistication, may be targeted by zealous plaintiffs’ attorneys seeking to join the ever-increasing cascade of BIPA class action filings. Biometric privacy counsel should thus be consulted at the earliest possible opportunity to develop compliance strategies that protect against the catastrophic risk of a BIPA verdict.

Categories: Biometric Privacy Legal Landscape | The Lighter Side of Biometrics

Delta Air Lines Debuts “Parallel Reality” Biometric Flight Information Display

Rachel Evans*

On June 29, 2022, travelers at the Detroit Metropolitan Airport were the first to interact with a new flight information display that uses facial recognition technology “to identify participating travelers and show them the appropriate information.”

How It Works

Customers can opt in to the experience by either scanning their boarding pass or activating facial recognition at the Parallel Reality kiosk to check in to their flight and receive day-of-travel information at their fingertips (or, more appropriately, at their facial scan).

Once a customer has checked in and approaches the flight information board, cameras embedded in the board match the individual to their picture and engage multi-view pixels to display a unique message that only the intended customer can see.

Nearly all travelers can simultaneously look at the display and receive completely different, personalized information relating to their travel plan.

Categories: Legislative Developments & Trends

Congressional Hearing Update: “Privacy in the Age of Biometrics”

Rachel Evans*

On June 29, 2022, the Subcommittee on Investigations and Oversight for the House Committee on Science, Space, and Technology heard testimony from multiple experts about the expansion of biometric technology and the future of American privacy law. This hearing signals Congress’ intention to consider new legislation aimed at protecting privacy interests while simultaneously encouraging the expansion of biometric technologies.

Chairman Bill Foster (D-IL) opened the hearing with concerns that the broad impact of the U.S. Supreme Court’s Dobbs v. Jackson Women’s Health Organization decision would significantly weaken privacy protections, explaining that “[t]he timing of our discussion … is notable[.] [T]his makes protecting Americans biometric data more important than ever.” Congress’ focus will now “be on how technological solutions can secure our privacy while allowing us to enjoy the benefits of biometric tools.”

Categories: Biometric Privacy Legal Landscape | Case Law Developments

Q1 Biometric Privacy Litigation Update

Amanda M. Noonan

In the first quarter of 2022, there have already been significant legal developments in the biometric technology space. Most notably, the Illinois Supreme Court, which has actively taken Illinois Biometric Information Privacy Act (“BIPA”) cases amid the surge of such class action litigation in federal and state courts, issued several consequential BIPA opinions this year, though 2022’s most critical BIPA decisions are likely still on the horizon.

Categories: Biometric Privacy Legal Landscape | Legislative Developments & Trends

California Legislature Introduces Expansive Biometric Privacy Law

Amanda M. Noonan

On February 17, 2022, the California Legislature introduced a biometric privacy bill (SB 1189) similar to the Illinois Biometric Information Privacy Act (“BIPA”). SB 1189 would dramatically increase biometric privacy protections for California consumers, expand regulation of private businesses, and add to the flurry of biometric privacy class action litigation that has taken hold of U.S. courts.

Categories: Biometric Privacy Compliance Tips

Practical Compliance Tips: Baltimore Private-Sector Facial Recognition Ban

David J. Oberly

In mid-2021, Baltimore, Maryland, passed Council Bill 21-0001 (the “FRT Ordinance”), becoming the second U.S. jurisdiction to enact sweeping facial recognition regulation that bans the use of facial biometrics by any private entity or individual within city limits.

While a number of cities have enacted laws prohibiting law enforcement and other governmental agencies from using facial recognition, Portland, Oregon, enacted the nation’s first blanket ban over the use of this technology by the private sector at the beginning of 2021. The Baltimore FRT Ordinance goes even further than Portland by imposing criminal penalties of up to a year in jail for companies and individuals that run afoul of the ban.

As federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy regulatory regime, companies should anticipate that cities and states will continue to take the lead in implementing new biometrics regulation in 2022. In particular, the success seen by Baltimore and Portland in enacting outright bans over the commercial use of facial recognition software is likely to encourage lawmakers in other cities and states to follow suit by enacting tighter controls over the collection and use of facial geometry data in other parts of the country.

Taken together, all businesses that operate in Baltimore and use any type of facial recognition software should assess whether the Baltimore FRT Ordinance applies to them and, if so, take prompt measures to ensure compliance with the law. From a broader perspective, as this strict type of biometric privacy regulation is likely to expand to additional parts of the country moving forward, companies that use or intend to use facial recognition technology (“FRT”) need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.

Overview

  • Scope/Applicability: The Baltimore ordinance bars “persons” from obtaining, retaining, accessing, or using any “face surveillance system,” or any information obtained from a face surveillance system, within the City of Baltimore.
  • “Person”: The ordinance defines the term “person” as any individual, partnership, firm, association, corporation, other entity, receiver, trustee, guardian, personal representative, or fiduciary.
  • “Face Surveillance System”: “Face surveillance system” means “any computer software or application that performs face surveillance.”
  • “Face Surveillance”: “Face surveillance,” in turn, is defined as “an automated or semi-automated process that assists in identifying or verifying an individual based on the physical characteristics of the individual’s face.”

Exemptions

  • Access Control Systems: Excluded from the scope of the ordinance are “biometric security system[s] designed specifically to protect against unauthorized access to a particular location or an electronic device.”
  • Maryland Image Repository System: Also excluded from the scope of the ordinance is the Maryland Image Repository System (facial recognition software that allows law enforcement to compare images of unidentified individuals to images from motor vehicle records and criminal mugshots).

Core Compliance Requirement

  • Prohibition on FRT Use: Under the ordinance, a person may not obtain, retain, access, or use in Baltimore City: (1) any face surveillance system; or (2) any information obtained from a face surveillance system.

Enforcement and Remedies

  • Misdemeanor: Any person who violates the Baltimore FRT Ban is guilty of a misdemeanor and subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both fine and imprisonment.
  • Each Day a Separate Offense: Each day that a violation continues is a separate offense.

Practical Compliance Tips & Best Practices

All businesses that maintain operations in Baltimore should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Baltimore ordinance if the organization falls under the scope of the law:

  • Determine Whether Technology Falls under Scope of Law: First, companies should determine whether their technology falls under the scope of the law, i.e., whether the system assists in identifying or verifying individuals based on the physical characteristics of their faces.
  • Evaluate Applicability of Access Control Exemption: If the technology falls under the scope of the ban, evaluate whether the narrow exemption offered by the ordinance for facial recognition-powered access control systems applies to allow the company to continue its use of the technology.
  • Cease Use if Exemption Inapplicable: If the technology does not serve the purpose of protecting against unauthorized access to a particular location or electronic device, eliminate the use of facial recognition across the board immediately.
  • Identify Availability of Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current facial biometrics technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.