Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Case Law Developments | Class Action Litigation Defense Strategies

Texas Attorney General Reaches Largest-Ever Biometrics Settlement with Meta

Amanda M. Noonan

On July 30, 2024, a Texas state court issued an Order finalizing the largest-ever biometrics settlement: a staggering $1.4 billion agreement between the Texas Attorney General and Meta. The settlement resolves a longstanding civil action brought by the Texas Attorney General in 2022 asserting violations of Texas’s Capture or Use of Biometric Identifier Act (“CUBI”).

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Case Law Developments

Northern District of Illinois Weighs in on Employment-Related Examinations under Illinois’ GIPA

Gabrielle N. Ganze

In an important privacy law development, Judge Sharon Johnson Coleman of the United States District Court for the Northern District of Illinois has issued two of the first federal decisions applying a substantive analysis to the provisions of the Illinois Genetic Information Privacy Act, 410 ILCS 513/1 et seq. (“GIPA”), that govern employment-related examinations.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

Maryland Passes Unique and Operationally Challenging Privacy Law

Philip N. Yannella, Sharon R. Klein, Timothy W. Dickens, and Jason C. Hirsch

Maryland recently became the fifth state in 2024—and the 17th U.S. state overall—to pass a comprehensive data privacy law. Effective October 1, 2025, the Maryland Online Data Privacy Act (“MODPA”) contains a number of unique provisions that govern the processing of sensitive and children’s data, among other things. These unique provisions, combined with the broad applicability of the law, make MODPA one of the more operationally challenging privacy laws passed in the United States to date.

Scope and Applicability

MODPA applies to persons that do business in Maryland or target products or services to Maryland residents and that, during the prior calendar year, either (i) controlled or processed the personal data of at least 35,000 Maryland residents or (ii) controlled or processed the personal data of at least 10,000 Maryland residents and derived more than 20 percent of their gross revenue from the sale of personal data. The 35,000 threshold (roughly 0.57 percent of Maryland’s total population of 6.18 million) is notably lower than the thresholds in other state privacy laws. Most U.S. states set a processing threshold of 100,000 state residents; only Delaware, with a population of 990,000, has a processing threshold as low as Maryland’s. The law also lacks a full exemption for nonprofit institutions and institutions of higher education.

The relatively low threshold for compliance, combined with the lack of familiar exemptions, means that MODPA will likely trigger compliance obligations for a swath of institutions that have not had to comply with many other U.S. state privacy laws.
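
For organizations building an internal applicability screen, the threshold test described above can be reduced to a simple check. The Python sketch below is purely illustrative: the function name and inputs are hypothetical, the entity is assumed to already do business in Maryland or target Maryland residents, and MODPA’s exemptions and data-level exclusions are not modeled. It is a rough screening aid under those assumptions, not a substitute for legal analysis.

```python
def meets_modpa_thresholds(md_residents_processed: int,
                           pct_revenue_from_data_sales: float) -> bool:
    """Illustrative MODPA applicability screen (hypothetical helper, not legal advice).

    Assumes the entity does business in Maryland or targets products or
    services to Maryland residents; exemptions are not modeled here.
    """
    # Prong 1: controlled or processed the personal data of at least 35,000
    # Maryland residents during the prior calendar year.
    if md_residents_processed >= 35_000:
        return True

    # Prong 2: controlled or processed the personal data of at least 10,000
    # Maryland residents AND derived more than 20 percent of gross revenue
    # from the sale of personal data.
    return md_residents_processed >= 10_000 and pct_revenue_from_data_sales > 20.0


# Example: processing data of 12,000 Maryland residents while deriving 25% of
# gross revenue from data sales satisfies the second prong.
assert meets_modpa_thresholds(12_000, 25.0)
```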

Read the full client alert on our website.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

The Oregon Consumer Data Privacy Act Takes Effect July 1, 2024

Gabrielle N. Ganze

With its passage of the Oregon Consumer Data Privacy Act (“OCDPA”), Oregon became one of 16 states to pass a comprehensive data privacy law.

Regulated Entities and Data

The OCDPA generally applies to any person who meets two requirements:

  1. conducts business in the state, or “provides” products or services to Oregon’s residents; and
  2. within a calendar year, controls or processes personal data of
    • 100,000 or more consumers, or
    • 25,000 or more consumers and also derives at least 25 percent of its gross revenue from selling personal data.

“Personal data” regulated by the Act broadly includes any “derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”

The OCDPA imposes additional requirements for personal data that is considered “sensitive data.” Such data includes children’s data; genetic or biometric data; precise geolocation data; or data that “reveals a consumer’s” national origin, citizenship or immigration status, racial or ethnic background, religious beliefs, mental or physical condition or diagnosis, sexual orientation, transgender or non-binary status, or status as a victim of crime. This definition of sensitive data is more expansive than those in other privacy statutes, particularly in its inclusion of categories such as transgender or non-binary status.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

Colorado Becomes the First State to Explicitly Protect “Neural Data”

Daniel R. Saeedi

Colorado has amended its privacy statute, the “Colorado Privacy Act,” to explicitly protect “neural data.” The amendment, signed into law by Governor Jared Polis on April 17, 2024, adds both “neural data” and “biological data” as defined terms under the statute’s umbrella of “Sensitive Data,” which must be protected in accordance with the statute and cannot be collected without first obtaining the consumer’s consent.

The new amendment defines “neural data” as “information that is generated by the measurement of the activity of an individual’s central or peripheral nervous systems and that can be processed by or with the assistance of a device.” “Biological data,” which was previously an undefined term under the statute, now means “data generated by the technological processing, measurement, or analysis of an individual’s biological, genetic, biochemical, physiological, or neural properties, compositions, or activities or of an individual’s body or bodily functions,” if the data is “intended to be used, singly or in combination with other personal data, for identification purposes.” The amendment further clarifies that “biological data” includes “neural data.”

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

Is Your Business in Compliance with Washington’s New Data Privacy Statute?

Gabrielle N. Ganze

Washington’s new data privacy statute, the “My Health, My Data Act” (“MHMD” or the “Act”), officially became fully effective on March 31, 2024, for regulated entities under the Act, while small businesses have until June 30, 2024, to comply. The purpose of MHMD is to protect consumers’ personal health data not otherwise protected by federal law, such as HIPAA. Businesses should be familiar with Washington’s preexisting biometric privacy law, RCW 19.375, and should recognize that MHMD’s coverage is far more expansive. MHMD regulates the collection, sharing, selling, and processing of “consumer health data,” and it applies to entities that conduct business in Washington as well as those that provide services or products targeted to Washington consumers.

Notably, the Act does not regulate the collection of employee data, unlike some other privacy statutes. However, the scope of MHMD’s regulation extends far beyond the traditional health data and biometric data that have been the focus of many other data privacy statutes throughout the country. Unlike Washington’s biometric statute, MHMD can be enforced by private parties through a private right of action, in addition to enforcement by the Attorney General. Consumers can sue for damages and other relief for violations of MHMD, which gives the law the potential to spur class action litigation.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Legislative Developments & Trends

NYC Introduces Bills to Limit Facial Recognition in Private Sector

Tianmei Ann Huang

Members of the New York City Council (“Council”) are expected to formally introduce two Local Laws on April 27, 2023, during the next Council meeting, seeking to regulate private-sector use of facial recognition (or similar surveillance technology) for identification or verification purposes.

The first bill would amend New York City’s administrative code to prohibit businesses and venues from using “biometric identifier information” (e.g., face scans) to identify or verify customers without first obtaining their written consent. These businesses and venues must also develop and make publicly available a retention-and-destruction policy, and must further comply with certain data protection, privacy, and security obligations. The proposal also includes a private right of action for civil damages up to $500 per negligent violation and up to $5,000 per intentional or reckless violation, as well as attorneys’ fees.

The second bill would ban owners of “multiple dwelling” properties (e.g., residential buildings) from installing, activating, or using “biometric recognition technology” to identify tenants or their guests. The legislation, if enacted, would be one of the first laws to place city-wide restrictions on the use of biometric recognition technology in the private sector.

Given the introduction of these dual bills, companies in NYC that currently collect biometric data, or are considering doing so, are encouraged to contact experienced counsel to develop protective compliance measures—lest they become the target of civil litigation.

Categories: Biometric Privacy Compliance Tips | Case Law Developments | Class Action Litigation Defense Strategies

Illinois Supreme Court: Federal Labor Law Preempts Union Employees’ BIPA Claims

Tianmei Ann Huang

The Illinois Supreme Court in Walton v. Roosevelt University, 2023 IL 128338 (Mar. 23, 2023), unanimously affirmed dismissal of the putative class action arising under the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”), concluding that federal labor law preempted BIPA claims brought by unionized employees covered by a collective bargaining agreement (“CBA”). Consistent with Seventh Circuit decisions supporting federal preemption, the Walton court’s ruling specifically provides that Section 301 of the federal Labor Management Relations Act (“LMRA”), 29 U.S.C. § 185, preempts BIPA claims asserted in Illinois state courts by union employees (or bargaining unit employees) covered by a CBA. Therefore, the federal preemption defense may be used to foreclose these unionized employees from bringing BIPA claims in state and federal courts, including on a class action basis.

In Walton, the representative plaintiff was a member of a union subject to a CBA, which included a broad management-rights clause, during his employment with Roosevelt University. The putative class alleged that Roosevelt University used scanning devices to enroll employees’ hand geometry scans for timekeeping purposes but failed to fulfill BIPA’s Section 15 requirements. Under the LMRA, however, the provisions of the CBA govern, and even if “biometric” data is not expressly discussed within the CBA, a broad management-rights clause, along with provisions regarding employee timekeeping and grievance resolution procedures, may be sufficient to preclude BIPA litigation.

Overall, the Walton decision offers a measure of relief to defendants involved in BIPA disputes brought by union employees, particularly following the liability-expanding Illinois Supreme Court decisions in Cothron and Tims, as previously discussed. To avoid future litigation, employers should carefully exercise their exclusive rights to direct the employees covered by a CBA or other contract.

Categories: Biometric Privacy Compliance Tips | Biometric Privacy Legal Landscape | Case Law Developments

Illinois Supreme Court Dramatically Expands Liability by Ruling Each Scan of a Biometric Identifier Is a Separate Violation

Amanda M. Noonan

In a 4-3 split, the Illinois Supreme Court ruled earlier this month that claims under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (“BIPA”) accrue each time a private entity scans a person’s biometric identifier and/or submits such scan to a third party—rather than only upon first collection. Cothron v. White Castle System, Inc., 2023 IL 128004 (Feb. 17, 2023). This decision—which dramatically expands the scope of potential liability for BIPA defendants—comes just weeks after the Illinois Supreme Court held a five-year statute of limitations applies to all BIPA causes of action in Tims v. Blackhorse Carriers, Inc., 2023 IL 127801 (Feb. 2, 2023).

The impact of Cothron on claim accrual, coupled with Tims’ resolution of the statute of limitations, will have an immense and immediate impact on BIPA class-action lawsuits—many of which had been stayed pending these decisions.

For many businesses that implement biometric time clocks, which scan biometric identifiers to track employee time and attendance, this means a new BIPA violation accrues each time an employee scans in and out of work. Coupled with the five-year statute of limitations, BIPA defendants may now be facing hundreds—if not thousands—of independent BIPA violations for a single complainant: an employee who scans in and out once per day for roughly 250 workdays per year, for example, would generate approximately 2,500 discrete scans over the five-year limitations period.

Categories: Biometric Privacy Compliance Tips

Practical Compliance Tips: Baltimore Private-Sector Facial Recognition Ban

David J. Oberly

In mid-2021, Baltimore, Maryland, passed Council Bill 21-0001 (the “FRT Ordinance”), becoming the second U.S. jurisdiction to enact sweeping facial recognition regulation that bans the use of facial biometrics by any private entity or individual within city limits.

While a number of cities have enacted laws prohibiting law enforcement and other governmental agencies from using facial recognition, Portland, Oregon, enacted the nation’s first blanket ban on private-sector use of this technology at the beginning of 2021. The Baltimore FRT Ordinance goes even further than Portland’s by imposing criminal penalties of up to a year in jail for companies and individuals that run afoul of the ban.

As federal lawmakers continue to drag their feet on enacting a nationwide, uniform biometric privacy regulatory regime, companies should anticipate that cities and states will continue to take the lead in implementing new biometrics regulation in 2022. In particular, the success seen by Baltimore and Portland in enacting outright bans over the commercial use of facial recognition software is likely to encourage lawmakers in other cities and states to follow suit by enacting tighter controls over the collection and use of facial geometry data in other parts of the country.

Taken together, all businesses that operate in Baltimore and use any type of facial recognition software should assess whether the Baltimore FRT Ordinance applies to them and, if so, take prompt measures to ensure compliance with the law. And from a broader perspective, as this strict type of biometric privacy regulation is likely to expand to additional parts of the country moving forward, companies that use or intend to use facial recognition technology (“FRT”) need to familiarize themselves with this new type of biometrics regulation and consider taking proactive steps to minimize their anticipated liability exposure.

Overview

  • Scope/Applicability: The Baltimore ordinance bars “persons” from obtaining, retaining, accessing, or using any “face surveillance system,” or any information obtained from a face surveillance system, within the City of Baltimore.
  • “Person”: The ordinance defines the term “person” as any individual, partnership, firm, association, corporation, other entity, receiver, trustee, guardian, personal representative, or fiduciary.
  • “Face Surveillance System”: “Face surveillance system” means “any computer software or application that performs face surveillance.”
  • “Face Surveillance”: “Face surveillance,” in turn, is defined as “an automated or semi-automated process that assists in identifying or verifying an individual based on the physical characteristics of the individual’s face.”

Exemptions

  • Access Control Systems: Excluded from the scope of the ordinance are “biometric security system[s] designed specifically to protect against unauthorized access to a particular location or an electronic device.”
  • Maryland Image Repository System: Also excluded from the scope of the ordinance is the Maryland Image Repository System (facial recognition software that allows law enforcement to compare images of unidentified individuals to images from motor vehicle records and criminal mugshots).

Core Compliance Requirement

  • Prohibition on FRT Use: Under the ordinance, a person may not obtain, retain, access, or use in Baltimore City: (1) any face surveillance system; or (2) any information obtained from a face surveillance system.

Enforcement and Remedies

  • Misdemeanor: Any person who violates the Baltimore FRT Ban is guilty of a misdemeanor and subject to a fine of not more than $1,000, imprisonment for not more than 12 months, or both fine and imprisonment.
  • Each Day a Separate Offense: Each day that a violation continues is a separate offense.

Practical Compliance Tips & Best Practices

All businesses that maintain operations in Baltimore should take immediate action (if they have not already done so) to ensure compliance with the city’s FRT ban. Companies should consider the following action steps to determine the applicability of the ban to their operations and to come into compliance with the Baltimore ordinance if the organization falls under the scope of the law (a simplified decision-path sketch follows these steps):

  • Determine Whether Technology Falls under Scope of Law: First, companies should determine whether their technology falls under the scope of the law; the ordinance reaches systems that assist in identifying or verifying individuals based on the physical characteristics of their faces.
  • Evaluate Applicability of Access Control Exemption: If the technology falls under the scope of the ban, evaluate whether the narrow exemption offered by the ordinance for facial recognition-powered access control systems applies to allow the company to continue its use of the technology.
  • Cease Use if Exemption Inapplicable: If the technology does not serve the purpose of protecting against unauthorized access to a particular location or electronic device, eliminate the use of facial recognition across the board immediately.
  • Identify Availability of Suitable Alternative Technologies: At the same time, companies that are no longer permitted to use their current facial biometrics technology should evaluate whether any alternative technologies can be implemented to accomplish the same objectives—such as identification, verification/authentication, or security—for which facial recognition was used.
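
To make the decision path in these steps concrete, the following is a minimal, illustrative Python sketch. The function name, inputs, and outputs are hypothetical and deliberately simplified (it assumes the scope and exemption questions can be answered as simple yes/no facts); it is a rough screening aid under those assumptions, not legal advice.

```python
from enum import Enum


class FRTAction(Enum):
    OUT_OF_SCOPE = "Ordinance not triggered; document the analysis."
    EVALUATE_EXEMPTION = "Access-control exemption may apply; confirm with counsel."
    CEASE_USE = "Cease FRT use in Baltimore and evaluate alternative technologies."


def baltimore_frt_screen(identifies_or_verifies_by_face: bool,
                         protects_against_unauthorized_access: bool) -> FRTAction:
    """Hypothetical screen mirroring the compliance steps above (not legal advice)."""
    # Step 1: Does the technology assist in identifying or verifying individuals
    # based on facial characteristics? If not, the ban's core prohibition is not triggered.
    if not identifies_or_verifies_by_face:
        return FRTAction.OUT_OF_SCOPE

    # Step 2: The ordinance excludes biometric security systems designed specifically
    # to protect against unauthorized access to a particular location or electronic device.
    if protects_against_unauthorized_access:
        return FRTAction.EVALUATE_EXEMPTION

    # Step 3: Otherwise, cease use and identify suitable alternative technologies.
    return FRTAction.CEASE_USE


# Example: a facial-recognition kiosk that identifies customers (and is not an
# access-control system) lands in the "cease use" branch.
print(baltimore_frt_screen(True, False))
```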