Articles Posted in Privacy Law

Efforts by an alleged perpetrator and his legal team to unmask a Jane Doe plaintiff (by revealing her name) were declared dead on arrival by the Illinois Appellate Court today. Our firm assisted lead counsel Tamara Holder with the appellate briefs. In these types of matters, our firm concentrates on defending alleged sexual assault victims who are revictimized by being subjected to what we contend, in court papers on our clients’ behalf, are strike suits for defamation or libel. This practice of suing the alleged victim for libel or defamation is, unfortunately, becoming an all-too-common tactic used, we contend, to bully victims into silence or to pressure them into retracting their claims.

The forceful and well-reasoned concurring opinion by Justice Hyman explains why efforts to expose the names of alleged victims of sexual misconduct or assault are a pernicious practice. The opinion provides guideposts for courts in Illinois and across the country to encourage alleged victims of sexual misconduct or assault to seek justice without having to suffer more trauma from having their names spread all over the internet. It also notes that the alleged perpetrator should have similar privacy rights prior to a judgment on guilt or innocence.

The concurring opinion states:

In a world where the Internet already has created privacy, confidentiality, and security issues, we now enter the age of artificial intelligence, exacerbating these issues and making secrecy vital. No longer, in the famous observation of Justice Brandeis almost 100 years ago, is the “right to be let alone” enough. Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting). In the 21st century, the right to be left unknown will join the right to be let alone as a vexing subject of intense legal debate. Indeed, the question of anonymity has taken on increased significance as court records have become readily available to the general public through even casual Internet searches. As the appellant notes in his brief, a Google search of a litigant’s name can produce an untold number of articles describing the lawsuit. Those articles may be available online for a lifetime, unless kept confidential. Although Illinois case law offers slight guidance on petitions to proceed anonymously, an alleged victim deserves anonymity whether or not their identity has been divulged elsewhere, including in a case not brought by them. …

Although no reported Illinois cases address whether a claim of sexual violence constitutes an “exceptional” situation warranting the use of a pseudonym, federal courts in Illinois have recognized that allegations of sexual assault are “highly sensitive, personal matters that involve the disclosure of information of the utmost intimacy.” Doe v. Cook County, Illinois, 542 F. Supp. 3d 779, 786 (N.D. Ill. 2021); accord Doe No. 2 v. Kolko, 242 F.R.D. 193, 195 (E.D.N.Y. 2006) (while the Seventh Circuit disfavors fictitious names, it has “recognized that sexual assault victims are a paradigmatic example of those entitled to a grant of anonymity” (citing Doe, 112 F.3d at 872)). Even so, a sexual violence allegation alone has been considered not dispositive. See Cook County, Illinois, 542 F. Supp. 3d at 786 (“allegation of sexual assault alone does not end the inquiry”); Doe v. Skyline Automobiles, Inc., 375 F. Supp. 3d 401, 405-06 (S.D.N.Y. 2019) (“other factors must be taken into consideration and analyzed in comparison to the public’s interest and the interests of the opposing parties”).

Illinois has taken steps to protect individuals’ private information. Examples include the Personal Information Protection Act (815 ILCS 530/1 et seq. (West 2022)), and the Biometric Information Privacy Act (740 ILCS 14/1 et seq. (West 2022)), and two laws regulating data obtained by artificial intelligence, the Artificial Intelligence Video Interview Act (820 ILCS 42/5 (West 2022)) and the Illinois Health Statistics Act (410 ILCS 520/1 et seq. (West 2022)). Nonetheless, the law cannot keep pace with the speed of innovations, compromising privacy. Corinne Moini, Protecting Privacy in the Era of Smart Toys: Does Hello Barbie Have A Duty to Report?, 25 Cath. U.J.L. & Tech. 281, 299 (2017) (asserting that privacy torts do not provide adequate protection for privacy implications of artificial intelligence and data collection). When methods of intruding into private lives and stripping anonymity outpace lawmakers’ ability to address them, courts have a duty under existing rules of procedure to protect sexual assault and abuse victims.

Plaintiff, a minor when the alleged sexual assault occurred, undeniably constitutes an “exceptional” situation. The lawsuit involves matters of a highly personal nature warranting anonymity. Indeed, Illinois Supreme Court rules acknowledge the need for anonymity in cases involving minors. For instance, the Illinois Supreme Court rules provide that minors shall be identified by first name and last initial or by initials in adoption cases (Ill. S. Ct. R. 663 (eff. Oct. 1, 2001)) and appeals involving the Juvenile Court Act of 1987 (705 ILCS 405/1 et seq. (West 2022)). Ill. S. Ct. R. 660(c) (eff. Oct. 1, 2001). Moreover, the Style Manual for the Supreme and Appellate Courts of Illinois (5th ed. rev. 2017) provides for using the minor’s initials in cases involving the Department of Children and Family Services. These rules reflect the need to protect the identity of a minor in matters of a personal nature that involve potentially stigmatizing issues such as termination of parental rights or juvenile criminal conduct. An alleged victim of sexual violence has similar reasons for protecting their identity when filing a lawsuit under the Gender Act. The alleged conduct involves highly personal conduct likely to embarrass and stigmatize, regardless of its availability on the Internet. Thus, I would find that an alleged victim has a compelling reason to proceed anonymously when filing a complaint. Similarly, an accused perpetrator should be able to seek anonymity on petition….

The appellant contends that Doe waived her right to proceed anonymously because she filed an affidavit supporting a motion to dismiss the defamation lawsuit the appellant filed against his other accusers. (The appellant added Doe as a defendant in the defamation litigation after she filed her complaint.) I must disagree that she waived her right. When Doe filed the affidavit in the defamation case, she had yet to file her complaint against defendant. The decision to help another litigant should not bar an individual from proceeding anonymously in their own lawsuit, regardless of an affidavit in another proceeding. Filing suit creates a different level of exposure than filing an affidavit in support of others.

You can read the entire opinion here.

Apple recently sued the NSO Group, an Israeli surveillance company that allegedly uses Apple products to spy on targets for its government clients. While the NSO Group has tried to portray itself as a company that helps bring criminals to justice and save lives, a closer look at their clients (and the targets of those clients) tells a more insidious story.

According to internal documents from the NSO Group that were leaked to the press, the surveillance company’s clients include the United Arab Emirates and Mexico, and the targets of those clients have included dissidents, activists, and journalists. The documents also revealed that the teenaged children of those targets (some of whom were living in the U.S.) were also surveilled.

The NSO Group’s legal troubles started back in 2019 when Facebook sued the surveillance company for targeting its WhatsApp users. The surveillance company tried to claim foreign sovereign immunity to have the lawsuit dismissed, but the United States Court of Appeals for the Ninth Circuit rejected that argument, thereby paving the way for the case to proceed through the courts.

The unanimous decision also paved the way for Apple to file its own lawsuit against the NSO Group. When Apple discovered that the NSO Group had created spyware that allowed it to access data on a target’s Apple product and transmit it back to government servers without the target’s knowledge, Apple took steps both to prevent future attacks and to bring the NSO Group to justice for this invasion of privacy.

When it turned out that NSO’s engineers had created more than 100 fake Apple IDs to carry out the attack, Apple was able to sue the surveillance company for violating Apple’s Terms and Conditions, to which every user must agree in order to set up an account. One section of Apple’s Terms and Conditions specifies that users’ engagement with Apple and its products and services is to be governed by California state law. That’s the clause that allowed the Silicon Valley company to sue an Israeli surveillance company in U.S. federal court.

Recently, the Illinois Appellate Court for the First District issued a significant decision on the question of which statute of limitations governs claims for violations of the Illinois Biometric Information Privacy Act (“BIPA”). In its opinion, the Court ruled that claims for unlawful profiting from or disclosure of biometric data, those brought under sections 15(c) and (d) of the BIPA, are subject to a one-year limitations period, while claims involving violations of the notice, consent, and retention requirements, those brought under sections 15(a), (b), and (e) of the BIPA, are subject to a five-year limitations period. This decision should bring much-needed clarity to class-action plaintiffs and defendants alike.

The BIPA, one of the most robust privacy statutes in the country, imposes various obligations on anyone who collects, stores, or uses biometric identifiers such as fingerprints, retina or iris scans, voiceprints, or face geometry from Illinois residents. Failure to comply with the BIPA’s requirements can be costly, as violations of the statute entitle successful plaintiffs to statutory damages ranging from $1,000 to $5,000 for each violation (plus attorney fees). This can add up quickly because, as we have seen in recent years, claims for violations of the BIPA are frequently brought as class actions.

The underlying case was brought by two former drivers for Black Horse Carriers, a trucking and logistics company. The plaintiffs filed the case as a class action. In their lawsuit, the former drivers alleged that Black Horse failed to obtain consent to use drivers’ fingerprints or to institute a retention schedule. They also accused the company of unlawfully disseminating their biometric data by sharing fingerprints with a third-party vendor that processed timekeeping records for the company.

In a putative class-action lawsuit filed against Apple concerning alleged violations of the Illinois Biometric Information Privacy Act (BIPA), the parties disputed the scope of discovery to which the plaintiffs were entitled. The plaintiffs sought to compel Apple to produce certain identifying information for Illinois residents with Apple devices containing the Photos App. The plaintiffs also issued document subpoenas to major resellers of Apple products for the personal data of individual customers. The district court ultimately denied the request to compel and quashed the subpoenas, citing concerns about how personal information would be protected given the increase in cyber attacks and hacking incidents.

The suit centers on the Photos App contained on Apple devices, which displays photos stored on those devices. According to the plaintiffs, the Photos App collects biometric identifiers and biometric information, including scans of facial geometry, of the individuals in the photos. Apple collects these biometric identifiers, the plaintiffs allege, without first notifying the individuals in writing and obtaining their informed consent. The plaintiffs further allege that Apple possessed biometric identifiers and biometric information without creating and following a written, publicly available policy with retention schedules and destruction guidelines. According to the plaintiffs’ complaint, these actions violate the BIPA.

The Supreme Court recently issued its first-ever opinion interpreting the Computer Fraud and Abuse Act, 18 U.S.C. §1030. In issuing its opinion, the Court limited the scope of the Computer Fraud and Abuse Act and resolved a circuit split on the meaning of “exceeds authorized access” found in the statute. Writing for the 6-3 majority in her first signed opinion, Justice Amy Coney Barrett explained that the Court would not turn “millions of otherwise law-abiding citizens” into criminals for violating their employers’ computer-use policies at work by using their computers to send personal e-mails, shop online, or plan a vacation.

At issue, the Court said, were so-called “inside hackers” who have legal access to a computer but exceed their authority by using the information for unauthorized purposes. Adopting the government’s “breathtaking” interpretation of the phrase “exceeds authorized access,” the Court explained, would turn every violation of a computer-use policy into a criminal act.

The immediate beneficiary of the Court’s ruling was a former Georgia police sergeant, Nathan Van Buren. Van Buren was authorized to use the Georgia Crime Information Center database to check license plates as part of his job. He unwittingly found himself caught up in an FBI sting when he took a $5,000 payment from a man who claimed that he wanted to learn about a stripper he had just met. After using his official computer to perform the requested search, Van Buren was charged and convicted of violating the Computer Fraud and Abuse Act for exceeding his “authorized access.”

The Computer Fraud and Abuse Act was enacted in 1986, during the early stages of the internet. The statute imposes criminal or civil liability on any person who “intentionally accesses a computer without authorization” or “exceeds authorized access” and, in doing so, obtains information from a “protected computer.” The statute does not define the term “without authorization” but does define the term “exceeds authorized access” in a rather opaque way. Pleading a claim under the statute requires a plaintiff to allege that the defendant (i) intentionally accessed a computer, (ii) lacked authority to access the computer or exceeded authorized access to the computer, (iii) obtained data from the computer, and (iv) caused a loss of $5,000 or more during a one-year period.

As we have previously written about here, here, and here, the Illinois Biometric Information Privacy Act (BIPA) has generated some high-profile litigation in recent years. The Illinois Supreme Court’s last opportunity to consider one of the country’s most protective biometric data laws came in 2019 in its decision in Rosenbach v. Six Flags Entertainment Corporation, which we wrote about here. Recently, the Illinois Supreme Court granted permission to appeal another potentially impactful decision interpreting the BIPA.

BIPA was enacted in 2008 to help regulate the collection, use, safeguarding, handling, storage, retention, and destruction of biometric identifiers and information. The BIPA defines “biometric identifier” as “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” It defines “biometric information” as “any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual.” The BIPA provides for fines of $1,000 to $5,000 for each violation.

On January 27, 2021, the Illinois Supreme Court granted leave to appeal the Illinois Court of Appeals for the First District’s recent decision in McDonald v. Symphony Bronzeville Park LLC, 2020 IL App (1st) 192398. The McDonald case considered the narrow yet important issue of whether the exclusivity provisions of the Illinois Workers’ Compensation Act preempt claims for statutory damages under the BIPA. In its decision, the First District ruled that the Illinois Workers’ Compensation Act, and specifically its exclusive remedy provisions, does not bar claims for statutory damages under the BIPA.

An AI company harvested publicly available photographs from social media sites across the internet and then used those photographs to derive a biometric facial scan of each individual in the photograph. The company sold this database to law enforcement agencies to use in identifying persons of interest or unknown individuals. A woman sued in a class action, arguing that the harvesting of biometric data violated Illinois’ Biometric Information Privacy Act. The company removed the case to federal court, and the federal court ruled that the plaintiffs’ claims lacked standing under Article III. The appellate court agreed with the district court and affirmed, ordering that the case be remanded to state court.

Clearview AI is in the business of facial recognition tools. Users may download an application that gives them access to Clearview’s database. The database is built from a proprietary algorithm that scrapes pictures from social media sites such as Facebook, Twitter, Instagram, LinkedIn, and Venmo. The materials that it uses are all publicly available. Clearview’s software harvests from each scraped photograph the biometric facial scan and associated metadata, which it stores in its database. The database currently contains billions of entries.

Many of Clearview’s clients are law enforcement agencies. The clients primarily use the database to find out more about a person in a photograph, such as to identify an unknown person or confirm the identity of a person of interest. Users upload photographs to Clearview’s app, and Clearview creates a digital facial scan of the person in the photograph and then compares the new facial scan to those in its database. If the program finds a match, it returns a geotagged photograph to the user and informs the user of the source social-media site for the photograph.

In the wake of a New York Times article profiling Clearview, Melissa Thornley filed suit in Illinois state court under the Illinois Biometric Information Privacy Act (BIPA). BIPA provides robust protections for the biometric information of Illinois residents. Thornley’s complaint, filed on behalf of herself and a class, asserted violations of three subsections of BIPA. Clearview removed the case to federal court. Shortly after removal, Thornley voluntarily dismissed the action. Thornley then returned to the Circuit Court of Cook County in May 2020 with a new, significantly narrowed, action against Clearview. The new action alleged only a single violation of BIPA and defined a more modest class.

In a 3-0 decision, the U.S. Court of Appeals for the Ninth Circuit ruled that Facebook users in Illinois can move forward with a class-action lawsuit challenging the company’s use of facial recognition technology. Facebook had argued that the court should not let the plaintiffs proceed on a class basis with claims that it violated the Illinois Biometric Information Privacy Act (often referred to as “BIPA”). The Ninth Circuit’s ruling in Patel v. Facebook affirmed the District Court’s decision to certify a class of Illinois Facebook users.

The BIPA is intended to protect the biometric privacy of Illinois citizens by imposing restrictions on the collection and storage of certain biometric information by private companies. One of the protections afforded by the law is the requirement that a company must obtain an individual’s written consent before collecting and storing any such biometric information.

The case stems from a class action complaint filed by three Illinois Facebook users on behalf of all Illinois Facebook users accusing the social media company of unlawfully gathering and storing its users’ biometric information, including through the use of facial recognition technology, without consent. Specifically, the suit targets a feature Facebook launched in 2010 called “Tag Suggestions,” which uses facial recognition technology to build a “face template” of an individual from pictures uploaded to the site. The software builds these face templates by analyzing an individual’s face in uploaded photos and measuring various geometric data points, such as the distance between the eyes, nose, and ears. Users are able to opt out of the feature, and Facebook argued that it only builds face templates of Facebook users who have not opted out and have the feature turned on.

If you’ve used Facebook at all in the past few years, you’ve probably noticed that every time you post a photo with one of your friends, Facebook automatically suggests you tag that person. While that might seem innocent enough, the facial recognition technology Facebook uses to accomplish that is highly controversial and possibly illegal.

Facial recognition technology is a relatively recent development and it didn’t take long for it to become controversial. With the abundance of cameras all around us, facial recognition technology allows owners of the technology to find us just about everywhere we go, which is why Facebook is now facing a class action consumer lawsuit on behalf of millions of Illinois users.

According to the lawsuit, Facebook used its facial recognition technology to gather and store biometric data on its users without their consent, which violates the Illinois Biometric Information Privacy Act of 2008. Facebook tried to have the class action dismissed and to force each plaintiff to sue individually, knowing the costs of filing suit would prevent most, if not all, of the plaintiffs from pursuing legal action.

But the court said the class action was the proper format for this particular lawsuit. Facebook appealed that decision, and the appellate court recently upheld the lower court’s ruling, allowing the class action to proceed as is.

The line between security and privacy has always been a bit blurry and it continues to get blurrier every day as technology advances. One of the latest developments in surveillance technology has been facial recognition software, which is allegedly capable of identifying you with just a quick scan of your face. While this could have far-reaching effects in the crime-solving world, it also eliminates much of our personal privacy in the process.

Brian Hofer is a paralegal in California who has been fighting to ban facial recognition software for the past five years. As soon as he became aware of the technology in 2014, he joined activist groups to try to get the technology banned from his hometown of Oakland. Once that was accomplished, he started working with other local government bodies across California to ban the technology from their streets. Since then, Hofer has drafted 26 different privacy laws for cities and counties all over the state of California, and all 26 have been approved.

While facial recognition technology may have been the catalyst for Hofer to start fighting for each citizen’s right to privacy, his work has extended beyond that to include demands that companies and governing bodies be transparent about the kinds of technology they use in their surveillance efforts. He has also convinced some cities, including Richmond and Berkeley, to cancel their contracts with tech companies like Vigilant Solutions and Amazon. Both Richmond and Berkeley have sanctuary policies, and both Vigilant Solutions and Amazon share information with ICE, so Hofer successfully argued that maintaining both the sanctuary policies and the contracts with those companies constituted a conflict.