Categories
Biometric Privacy Legal Landscape, Case Law Developments, Class Action Litigation Defense Strategies

First Biometric Privacy Jury Trial Results in Massive $228 Million Verdict

Amanda M. Noonan |

A federal district court in the Northern District of Illinois conducted the first-ever jury trial in an Illinois Biometric Information Privacy Act (“BIPA”) case. On October 12, 2022, the jury returned a verdict for the plaintiff—and more than 45,000 class members—finding that defendant BNSF Railway (“BNSF”) recklessly violated BIPA. See Rogers v. BNSF Railway Co., No. 1:19-cv-03083 (N.D. Ill. Oct. 12, 2022). The claims centered on BNSF’s collection of class members’ fingerprints—used to verify their identities and allow access to BNSF’s facilities—without obtaining the written consent required under BIPA Section 15(b).

After a five-day trial—and only an hour of deliberations—the jury found that BNSF not only violated BIPA 45,600 times, but did so intentionally or recklessly under 740 ILCS 14/20(2). That finding quintupled the statutory damages from $1,000 per negligent violation to $5,000 per violation. As a result, District Judge Matthew Kennelly entered a $228 million damages award in plaintiffs’ favor following the verdict. BNSF has stated it intends to appeal.
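As a back-of-the-envelope cross-check, the award figures line up with BIPA’s liquidated-damages tiers. A minimal sketch (the dollar amounts come from the statute and the reported award; the violation count is derived by division):

```python
# BIPA liquidated-damages tiers (740 ILCS 14/20):
NEGLIGENT = 1_000   # per negligent violation
RECKLESS = 5_000    # per intentional or reckless violation

award = 228_000_000             # damages entered following the verdict
violations = award // RECKLESS  # implied number of violations
print(violations)               # 45600

# Had the jury found only negligence, the same count would have yielded:
print(f"${violations * NEGLIGENT:,}")  # $45,600,000
```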

The implications of the verdict loom large. Plaintiffs’ counsel will likely accelerate the already high volume of BIPA filings and push for higher settlement amounts, using the prospect of a successful jury trial as a bargaining chip. Given the stakes, BIPA defendants may be more inclined to seek early resolution once named in a class action rather than risk bet-the-company litigation.

Considering the verdict, early compliance efforts by companies implementing biometric technology are even more crucial to avoid BIPA litigation in the first instance. Significantly, companies using any technology that could arguably constitute biometrics—regardless of its sophistication—may be targeted by plaintiffs’ attorneys seeking to join the ever-increasing cascade of BIPA class action filings. Biometric privacy counsel should thus be consulted at the earliest possible opportunity to develop compliance strategies that protect against the catastrophic risk of a BIPA verdict.

Categories
Biometric Privacy Legal Landscape, The Lighter Side of Biometrics

Delta Airlines Debuts “Parallel Reality” Biometric Flight Information Display

Rachel Evans* |

On June 29, 2022, travelers at the Detroit Metropolitan Airport were the first to interact with a new flight information display that uses facial recognition technology “to identify participating travelers and show them the appropriate information.”

How It Works

Customers can opt-in to the experience by either scanning their boarding pass or activating facial recognition at the Parallel Reality kiosk to check in to their flight and receive day-of-travel information at their fingertips—or, more appropriately, at their facial scan.

Once a customer has checked in and approaches the flight information board, cameras embedded in the board match the individual to their picture and engage multi-view pixels to display a unique message that only the intended customer can see.

Nearly 100 travelers can simultaneously look at the display and receive completely different, personalized information relating to their travel plans.

Categories
Legislative Developments & Trends

Congressional Hearing Update: “Privacy in the Age of Biometrics”

Rachel Evans* |

On June 29, 2022, the Subcommittee on Investigations and Oversight for the House Committee on Science, Space, and Technology heard testimony from multiple experts about the expansion of biometric technology and the future of American privacy law. This hearing signals Congress’ intention to consider new legislation aimed at protecting privacy interests while simultaneously encouraging the expansion of biometric technologies.

Chairman Bill Foster (D-IL) opened the hearing with concerns that the broad impact of the U.S. Supreme Court’s Dobbs v. Jackson Women’s Health Organization decision would significantly weaken privacy protections, explaining that “[t]he timing of our discussion … is notable[.] [T]his makes protecting Americans’ biometric data more important than ever.” Congress’ focus will now “be on how technological solutions can secure our privacy while allowing us to enjoy the benefits of biometric tools.”

Categories
Biometric Privacy Legal Landscape, Case Law Developments

Q1 Biometric Privacy Litigation Update

Amanda M. Noonan |

In the first quarter of 2022, there have already been significant legal developments in the biometric technology space. Most notably, the Illinois Supreme Court—which has actively taken Illinois Biometric Information Privacy Act (“BIPA”) cases amid the surge of such class action litigation in federal and state courts—issued several consequential BIPA opinions this year. Even so, 2022’s most critical BIPA decisions are likely still on the horizon.

Categories
Biometric Privacy Legal Landscape, Legislative Developments & Trends

California Legislature Introduces Expansive Biometric Privacy Law

Amanda M. Noonan |

On February 17, 2022, the California Legislature introduced a biometric privacy law (SB 1189) similar to the Illinois Biometric Information Privacy Act (“BIPA”). SB 1189 would dramatically increase biometric privacy protection for California consumers, expand regulation of private businesses, and add to the flurry of biometric privacy class action litigation that has taken hold of U.S. courts.

Categories
Biometric Privacy Compliance Tips

Practical Compliance Tips: Baltimore Private-Sector Facial Recognition Ban

David J. Oberly |

In mid-2021, Baltimore, Maryland, passed Council Bill 21-0001 (the “FRT Ordinance”), becoming the second U.S. jurisdiction to enact sweeping facial recognition regulation that bans the use of facial biometrics by any private entity or individual within city limits.

While a number of cities have enacted laws prohibiting law enforcement and other governmental agencies from using facial recognition, Portland, Oregon, enacted the nation’s first blanket ban over the use of this technology by the private sector at the beginning of 2021. The Baltimore FRT Ordinance goes even further than Portland by imposing criminal penalties of up to a year in jail for companies and individuals that run afoul of the ban.

As federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy regulatory regime, companies should anticipate that cities and states will continue to take the lead in implementing new biometrics regulation in 2022. In particular, the success seen by Baltimore and Portland in enacting outright bans over the commercial use of facial recognition software is likely to encourage lawmakers in other cities and states to follow suit by enacting tighter controls over the collection and use of facial geometry data in other parts of the country.

Taken together, all businesses that operate in Baltimore and use any type of facial recognition software should assess whether the Baltimore FRT Ordinance applies to them and, if so, take prompt measures to ensure compliance with the law. And from a broader perspective, as this strict type of biometric privacy regulation is likely to expand to additional parts of the country moving forward, companies that use or intend to use facial recognition technology (“FRT”) need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.

Overview

  • Scope/Applicability: The Baltimore ordinance bars “persons” from obtaining, retaining, accessing, or using any “face surveillance system,” or any information obtained from a face surveillance system, within the City of Baltimore.
  • “Person”: The ordinance defines the term “person” as any individual, partnership, firm, association, corporation, other entity, receiver, trustee, guardian, personal representative, or fiduciary.
  • “Face Surveillance System”: “Face surveillance system” means “any computer software or application that performs face surveillance.”
  • “Face Surveillance”: “Face surveillance,” in turn, is defined as “an automated or semi-automated process that assists in identifying or verifying an individual based on the physical characteristics of the individual’s face.”

Exemptions

  • Access Control Systems: Excluded from the scope of the ordinance are “biometric security system[s] designed specifically to protect against unauthorized access to a particular location or an electronic device.”
  • Maryland Image Repository System: Also excluded from the scope of the ordinance is the Maryland Image Repository System (facial recognition software that allows law enforcement to compare images of unidentified individuals to images from motor vehicle records and criminal mugshots).

Core Compliance Requirement

  • Prohibition on FRT Use: Under the ordinance, a person may not obtain, retain, access, or use in Baltimore City: (1) any face surveillance system; or (2) any information obtained from a face surveillance system.

Enforcement and Remedies

  • Misdemeanor: Any person who violates the Baltimore FRT Ban is guilty of a misdemeanor and subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both fine and imprisonment.
  • Each Day a Separate Offense: Each day that a violation continues is a separate offense.
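Because each day of continued noncompliance is a separate offense, monetary exposure scales linearly with the length of the violation. A toy sketch (the 30-day period is a hypothetical, not a figure from the ordinance):

```python
MAX_FINE_PER_OFFENSE = 1_000  # misdemeanor fine cap per offense under the ordinance

def max_cumulative_fine(days_in_violation):
    """Each day a violation continues is a separate offense, so fines can stack."""
    return days_in_violation * MAX_FINE_PER_OFFENSE

# Hypothetical: a company keeps using a face surveillance system for 30 days.
print(max_cumulative_fine(30))  # 30000 -- before any imprisonment exposure
```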

Practical Compliance Tips & Best Practices

All businesses that maintain operations in Baltimore should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Baltimore ordinance if the organization falls under the scope of the law:

  • Determine Whether Technology Falls under Scope of Law: First, companies should determine if their technology falls under the scope of the law. To do so, the system must assist in identifying or verifying individuals based on their facial characteristics.
  • Evaluate Applicability of Access Control Exemption: If the technology falls under the scope of the ban, evaluate whether the narrow exemption offered by the ordinance for facial recognition-powered access control systems applies to allow the company to continue its use of the technology.
  • Cease Use if Exemption Inapplicable: If the technology does not serve the purpose of protecting against unauthorized access to a particular location or electronic device, eliminate the use of facial recognition across the board immediately.
  • Identify Availability of Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current facial biometrics technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.

Categories
Case Law Developments

Claim Accrual Ruling Could Bring Seismic Shift to Biometric Privacy Landscape in 2022

Amanda M. Noonan |

At the end of 2021, two developments laid the groundwork for a definitive resolution of one of the most significant, yet unsettled, issues under the Illinois Biometric Information Privacy Act (“BIPA”)—claim accrual. While all litigants would appreciate some certainty surrounding this hot-button issue, resolution of when a BIPA violation “accrues” (i.e., occurs) will have a seismic impact on the trajectory of all BIPA litigation for years to come—depending on how the Illinois Supreme Court rules in the coming term.

Watson and Cothron BIPA Decisions

In mid-December 2021, an Illinois appellate panel in Watson v. Legacy Healthcare Financial Services, LLC, held BIPA claims accrue each and every time a defendant captures biometric information in violation of the statute, as opposed to only accruing at the first instance of collection.

Just a few days after Watson, the Seventh Circuit Court of Appeals issued its decision in Cothron v. White Castle System, Inc.—another appeal involving claim accrual. But rather than decide when a BIPA claim accrues, and after acknowledging the existence of Watson, the Cothron court certified the question to the Illinois Supreme Court to provide definitive guidance.

While neither Watson nor Cothron offers a conclusive answer, the issue is now teed up to be definitively decided by Illinois’ highest court.

Impact & Implications

The accrual date is a significant issue in BIPA class action litigation. Depending on the circumstances, accrual can serve as the basis for a statute of limitations defense, which, if successful, may require dismissal. But the issue is even more consequential in the context of damages and determining the overall value of a biometric privacy class action. If continuing BIPA violations constitute separate, independent claims, then the statutory damages of $1,000 per negligent violation (or $5,000 per intentional or reckless violation) begin to compound. And because the law provides for liquidated damages for each violation, a ruling that claims accrue each time a defendant runs afoul of the law’s requirements could expand liability exponentially.
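The stakes of the accrual question can be made concrete with a rough sketch. The workforce size, scan frequency, and work days below are purely illustrative assumptions, not figures from Watson or Cothron:

```python
def bipa_exposure(class_size, collections_per_person, per_violation=1_000,
                  accrue_per_collection=True):
    """Estimate statutory-damages exposure under the two competing accrual theories.

    accrue_per_collection=True  -> a claim accrues at every biometric collection
    accrue_per_collection=False -> a claim accrues only at the first collection
    """
    per_person = collections_per_person if accrue_per_collection else 1
    return class_size * per_person * per_violation

# Hypothetical: 500 workers scanning a fingerprint twice a day, 250 days a year
collections = 2 * 250
first_collection_only = bipa_exposure(500, collections, accrue_per_collection=False)
every_collection = bipa_exposure(500, collections, accrue_per_collection=True)
print(f"${first_collection_only:,} vs. ${every_collection:,}")  # $500,000 vs. $250,000,000
```

The gap between the two theories, not the per-violation amount itself, is what makes the Cothron certified question so consequential.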

Conclusion

Companies should pay close attention to how the Illinois Supreme Court decides the Cothron appeal, as the ruling could result in a drastic shift in the biometric privacy legal landscape. In the interim, companies should—with the assistance of experienced biometric privacy counsel—take the time to reassess their compliance with BIPA to ensure they are satisfying the full range of requirements to mitigate potential class action risk.

Categories
Biometric Privacy Compliance Tips

Beware of Hidden Pitfalls: Biometric Privacy Guidance for California Employers

David J. Oberly |

By now, most Golden State employers are well versed in the California Consumer Privacy Act of 2018 (“CCPA”), as well as its soon-to-be successor, the California Privacy Rights Act of 2020 (“CPRA”), which goes into effect at the start of 2023.

At the same time, more and more employers operating in California (and in other parts of the nation) are integrating biometrics into their operations in a variety of ways, including for timekeeping and access control purposes, among others.

Many of these employers need not worry about satisfying the onerous requirements of the CCPA because they do not meet any of the law’s three applicability thresholds: (1) more than $25,000,000 in annual gross revenue; (2) annually buying, receiving, sharing, or selling the personal information of more than 50,000 consumers, households, or devices; or (3) deriving 50 percent or more of annual revenue from the sale of personal information.

And those employers that do fall under the scope of California’s consumer privacy law are largely exempted from compliance in connection with the personal information of employees (and job applicants) under the CCPA’s employee information exemption.

Taken together, these carve-outs lead many California employers to assume that no legal requirements apply when they use biometrics in their day-to-day operations. Organizations that take this approach do so at their peril: California Labor Code § 1051 imposes clear, unambiguous requirements and limitations on employers that utilize certain biometric data in the workplace. More than that, this California law—which often flies under the radar of many employers and even their biometric privacy counsel—can result in criminal penalties for noncompliance.

California Labor Code § 1051

Specifically, Labor Code § 1051 bars employers that require employees or job applicants to furnish their fingerprints from disclosing that fingerprint biometric data to any third party. For example, employers are generally barred under Labor Code § 1051 from disclosing fingerprints to other employers to prevent subsequent employment, or to law enforcement agencies unless required pursuant to a court order or subpoena. Any employer that violates Labor Code § 1051 is guilty of a misdemeanor.

Practical Compliance Tips

Consequently, California employers must proceed with caution when using fingerprint biometric data in the workplace and ensure they are in strict compliance with Labor Code § 1051 when collecting, using, or storing employee fingerprint biometrics.

To do so, employers should first ensure that their biometrics service providers and vendors are completely precluded from accessing any fingerprint data collected by the employer through the service provider/vendor’s technology.

In addition, employers must maintain robust policies and protocols to prevent inadvertent disclosures of employee fingerprint data to any third parties, as a mishap of this nature—while unintentional—nonetheless runs afoul of Labor Code § 1051.

Similarly, employers must maintain robust security measures to safeguard employee fingerprint data, as any unauthorized acquisition of such data by hackers or other malicious third parties also constitutes a violation of Labor Code § 1051.

To make matters worse, any inadvertent disclosure or other data compromise event also likely constitutes a violation of employees’ right to privacy under the California Constitution. And if that wasn’t enough, breach incidents involving the compromise of fingerprint data will also oftentimes form the basis for an actionable violation of the CCPA, opening the door for class action litigation. It should be noted, however, that Labor Code § 1051 does not apply to employers’ use of other types of biometric data (only fingerprint data) and is inapplicable outside of the employment context.

Categories
Biometric Privacy Compliance Tips

Practical Compliance Tips: Portland Private-Sector Facial Recognition Ban

David J. Oberly |

The city of Portland, Oregon, made headlines last year when it became the first jurisdiction in the nation to enact a blanket ban on the use of facial recognition technology (“FRT”) by all private entities physically located within its city limits. While many cities have banned the use of face biometrics by law enforcement and parts of the public sector, the Portland ordinance is noteworthy because it drastically expanded the scope of this new type of regulation to also reach the private sector.

Since that time, the city of Baltimore, Maryland, followed suit with a similar private-sector facial biometrics ban of its own. More jurisdictions, including both cities and potentially states as well, are likely to add new laws mirroring those of Portland and Baltimore in the immediate future, especially as facial recognition continues to receive regular negative media coverage highlighting its claimed shortcomings, including potential accuracy and bias problems.

Accordingly, all companies that operate in Portland and use any type of software or other technology that may capture images of individuals’ faces should evaluate whether the new ordinance applies to them and, if so, take immediate action to ensure compliance with the law. And from a broader perspective, as this draconian type of biometric privacy regulation is likely to expand to additional parts of the country moving forward, companies that use or intend to use any type of facial recognition technology need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.

Overview

  • Scope/Applicability: The Portland ordinance bars the use of “facial recognition technologies” by “private entities” in “places of public accommodation” within the City of Portland.
  • “Private Entity”: The ordinance defines the term “private entity” in similar fashion to the Illinois Biometric Information Privacy Act (“BIPA”) as “any individual, sole proprietorship, partnership, limited liability company, association, or any other legal entity, however organized.”
  • “Face Recognition Technologies”: Face recognition technologies means “automated or semi-automated processes using Face Recognition that assist in identifying, verifying, detecting, or characterizing facial features of an individual or capturing information about an individual based on an individual’s face.”
  • “Face Recognition”: Face recognition, in turn, is defined as “the automated searching for a reference image in an image repository by comparing the facial features of a probe image with the features of images contained in an image repository (one-to-many search).”
  • “Places of Public Accommodation”: “Places of public accommodation” is defined broadly to mean “[a]ny place or service offering to the public accommodations, advantages, facilities, privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise.”

Exemptions

  • Certain Places of Public Accommodation: Excluded from the scope of the ordinance are “institution[s], bona fide club[s], private residence[s], [and] place[s] of accommodation that [are] in [their] nature distinctly private.”
  • Legal Compliance: The ordinance does not apply to the use of FRT to the extent necessary to comply with federal, state, or local laws.
  • User Verification: The ordinance does not apply to the use of FRT for user verification purposes, but only in the narrow context of allowing an individual to access his or her individual or employer-issued communication or electronic device.
  • Automatic Face Detection: Finally, the ordinance does not apply to the use of FRT “[i]n automatic face detection services in social media applications.”

Core Compliance Requirement

  • Prohibition on FRT Use: Under the ordinance, private entities are barred from using face recognition technologies in places of public accommodation within city limits.

Enforcement and Remedies

  • Private Right of Action: Any person “injured” by a material violation of the ordinance may pursue a class action against the offending private entity.
  • Recoverable Damages: A person injured by a violation of the ban can recover $1,000 per day for each day of the violation or actual damage sustained as a result of the violation, whichever is greater, as well as “such other remedies as may be appropriate.”
  • Attorneys’ Fees: Attorneys’ fees are also recoverable, but only if certain actions are taken by the injured person before filing suit. Specifically, a plaintiff must submit a written demand for the payment of a claim on the offending private entity and its insurer (if known to the plaintiff) at least 30 days before the filing of the complaint. Where this is completed, a court may award to a prevailing plaintiff a “reasonable amount” of attorneys’ fees. Conversely, a plaintiff cannot recover attorneys’ fees if, before suit was filed, the offending private entity tendered to the plaintiff an amount that is at least equivalent to the damages awarded to the plaintiff in the litigation, exclusive of any costs, interest, and prevailing party fees.
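The damages and fee provisions above can be summarized in a small sketch. The day count, dollar figures, and function names are hypothetical illustrations of the ordinance’s rules, not part of the ordinance itself:

```python
def portland_statutory_damages(days_in_violation, actual_damages=0):
    """Greater of $1,000 per day of violation or actual damages, per the ordinance."""
    return max(1_000 * days_in_violation, actual_damages)

def fees_recoverable(timely_presuit_demand, presuit_tender, damages_awarded):
    """Attorneys' fees require a written demand at least 30 days before suit,
    and are barred if the defendant tendered at least the eventual award pre-suit."""
    return timely_presuit_demand and presuit_tender < damages_awarded

# Hypothetical: a 90-day violation causing $40,000 in actual damages
damages = portland_statutory_damages(90, 40_000)
print(damages)                                  # 90000 (per-day amount exceeds actual)
print(fees_recoverable(True, 0, damages))       # True
print(fees_recoverable(True, 90_000, damages))  # False -- adequate pre-suit tender
```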

Practical Compliance Tips & Best Practices

Companies operating in Portland should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Portland ordinance if the organization falls under the scope of the law:

  • Determine Whether Technology Falls under Scope of Law: First, companies should determine if their technology falls under the scope of the law. To do so, the system must engage in identifying, verifying, detecting, or characterizing facial features or capture information about an individual based on his or her facial features.
  • Evaluate Applicability of Exceptions to Ban: If the technology is found to fall under the scope of the ban, the next step is to evaluate whether any of the limited exemptions offered by the ordinance can be satisfied to allow the company to continue its use of the technology.
  • Cease All Use of FRT If No Exceptions Apply: If none of the exceptions apply, the company must immediately cease all use of its FRT.
  • Identify Availability of Any Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current FRT technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.

Categories
Biometric Privacy Compliance Tips, Case Law Developments, Legislative Developments & Trends

Current BIPA Trends: Class Actions Targeting the Use of Voice Data

David J. Oberly |

2021 has brought with it a sizeable expansion in the types of technology and companies that are now being targeted with bet-the-company Illinois Biometric Information Privacy Act (“BIPA”) class action lawsuits. The first major expansion involved the targeting of virtual try-on technology, a feature made even more popular during the COVID-19 pandemic, which, according to plaintiffs, utilizes facial recognition technology. More recently, a high volume of BIPA class action suits have been filed targeting the use of voice-powered technologies.

BIPA & Voice Data

BIPA regulates the collection, use, and storage of “biometric identifiers,” which includes—among other things—“voiceprints.” However, the term “voiceprint” is not defined in Illinois’ biometric privacy statute. “Voiceprint” is generally defined as a distinctive pattern of curved lines and whorls made by a machine that measures human vocal sounds for the purpose of identifying an individual speaker. It is this hallmark of identifying (or verifying the identity of) an individual that makes voice data a “voiceprint” under BIPA. In this respect, courts have noted that voice biometrics, also known as voiceprinting, is the use of biological characteristics—one’s voice—to verify an individual’s identity.

Thus, a critical distinction exists between general voice data, which is not covered by BIPA, and voiceprints, which fall under the scope of Illinois’ biometric privacy statute—with the important dividing line being the identifying quality of the biometric information. In a 2017 case, an Illinois federal court recognized this distinction, noting the difference between the mere capture of voice data and an actual “voiceprint.” In doing so, the court noted that if an entity simply captures a person’s voice without generating a voiceprint for the specific purpose of identifying, or verifying the identity of, an individual, then there is no violation of BIPA.