Tracking the COVID-19 Pandemic While Respecting EU Data Protection Laws

French representative Mounir Mahjoubi published a note on April 6 on how technology can help contain the COVID-19 pandemic while respecting privacy laws and ethics.

IoT, Big Data analytics, AI and blockchain may all be used to monitor this major health crisis. For instance, China has been using the “Alipay Health Code,” which uses big data to draw automated conclusions about whether an individual is sick. However, there is a fine line between tracking and surveillance. Singapore collaborated with WhatsApp to offer updates about the virus: interested parties can sign up to receive the latest information. Individuals in the US and the UK can download an app to voluntarily report how they are feeling, even if they do not have any symptoms. Users can then follow how the virus is developing.

The Mahjoubi note cites an article published last month on the Nature website about how digital tools can help us mitigate the COVID-19 outbreak. This article provides several examples, including how Johns Hopkins University’s Center for Systems Science and Engineering has developed a real-time tracking map of COVID-19 cases across the world, using data collected from the US Centers for Disease Control and Prevention (CDC), the World Health Organization (WHO), the European Centre for Disease Prevention and Control (ECDC), the Chinese Center for Disease Control and Prevention (China CDC) and the Chinese website DXY, which aggregates data from China’s National Health Commission and the China CDC. The map allows everyone to follow the location and number of confirmed COVID-19 cases, as well as deaths and recoveries, around the world.

The note explains that mobile data tracking may be used in three different ways to control the pandemic: (1) to monitor collective mobility practices during containment, (2) to identify individuals who have been in contact with a person affected by the virus, and (3) to verify whether an individual is indeed confined.

Technologies used for these different purposes include telephones, GPS and Bluetooth applications, video surveillance, and bank cards, among others.

The first use, monitoring collective mobility, can be implemented using, for example, cell phone data pooled from telephone operators. GPS data can also be used for this purpose. The data is anonymized and aggregated; as such, this use complies with the GDPR, which does not apply to anonymized data. However, it should be noted that data can sometimes be de-anonymized (or re-identified).
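The anonymization-and-aggregation step can be sketched in a few lines of Python. Everything below is an illustrative assumption, not any operator's actual pipeline: the data shape (user, coarse cell) and the minimum group size are invented for the example.

```python
def aggregate_mobility(pings, min_group_size=10):
    """Aggregate individual location pings into per-cell counts,
    suppressing any cell with fewer people than min_group_size
    (a simple k-anonymity-style safeguard against re-identification).

    `pings` is a list of (user_id, cell_id) tuples; names and the
    threshold are illustrative assumptions, not from the note.
    """
    # Count distinct users per coarse cell, not raw pings,
    # so one chatty phone cannot inflate a cell's count.
    users_per_cell = {}
    for user_id, cell_id in pings:
        users_per_cell.setdefault(cell_id, set()).add(user_id)
    # Drop small cells entirely: low counts are the easiest to re-identify.
    return {cell: len(users) for cell, users in users_per_cell.items()
            if len(users) >= min_group_size}
```

Suppressing small groups is what makes re-identification harder, though, as noted above, never impossible.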

The other two uses, identifying affected individuals and tracking whether they are indeed staying confined, raise ethical and legal issues.

Several technologies can be used to identify individuals who have been in contact with an infected person, a practice sometimes called “contact tracing.” It seems that both China and South Korea used such technology to track the spread of COVID-19. France is developing its Stop Covid contact tracing app. The Pan-European Privacy Preserving Proximity Tracing (PEPP-PT) initiative brings together 130 researchers and technologists from eight European countries collaborating to create a tracing solution that is fully GDPR-compliant and privacy-preserving.

Indeed, in the European Union, such use of data must be done in compliance with the GDPR. Marie-Laure Denis, the President of the French data protection agency, the CNIL, said in an interview with Le Monde (in French) that such monitoring must be done on a voluntary basis only, based on free and informed consent, and that there should be no consequences for someone refusing, for example, to download an application. Ms. Denis also noted that such a scheme would have to comply with the data protection principles. Indeed, the GDPR sets out seven key principles: lawfulness, fairness and transparency; purpose limitation; data minimization; accuracy; storage limitation; security; and accountability. She noted that if these principles are respected, there would be no need for a legislative provision.

The GDPR provides that only data necessary for an explicit purpose should be collected. The COVID Symptom Tracker app explains, for instance, that the data can “only be used to help medical science and healthcare providers to better understand Coronavirus.”

Ms. Denis also noted that the choice of the tracking technology is important: for instance, an application using Bluetooth to detect whether another phone equipped with the same application is in the immediate vicinity provides more guarantees than an app using precise and continuous geolocation.
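A rough sketch of how such a Bluetooth scheme can avoid geolocation altogether, modeled loosely on the decentralized designs discussed at the time (all function names, parameters, and the derivation scheme below are illustrative assumptions, not PEPP-PT's actual protocol):

```python
import hashlib

def daily_ephemeral_ids(secret_key: bytes, day: int, per_day: int = 96):
    """Derive the rotating Bluetooth pseudonyms a phone would broadcast
    on a given day (here, one every ~15 minutes). Only these short-lived
    IDs ever leave the device; no location data is involved."""
    ids = []
    for slot in range(per_day):
        material = secret_key + day.to_bytes(4, "big") + slot.to_bytes(2, "big")
        ids.append(hashlib.sha256(material).hexdigest()[:32])
    return ids

def exposed(heard_ids, published_keys, day):
    """After a positive case publishes its daily key, every phone checks
    *locally* whether it heard any ID derived from that key."""
    for key in published_keys:
        if set(daily_ephemeral_ids(key, day)) & set(heard_ids):
            return True
    return False
```

The privacy guarantee Ms. Denis alludes to comes from the structure: each phone only ever learns that it was near a published pseudonym; neither locations nor identities leave the device.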

On April 8, the European Commission recommended the development of a common EU approach for the use of mobile applications and mobile data in response to the coronavirus pandemic.

These recommendations include:

  1. strictly limiting personal data processing for the purposes of combating the pandemic and ensuring that personal data is not used for any other purposes such as law enforcement or commercial purposes;
  2. ensuring that regular reviews are being conducted of the continued need for personal data processing for the purposes of fighting COVID-19 and setting appropriate sunset clauses to ensure that such processing does not extend beyond what is strictly necessary for those purposes;
  3. taking measures to ensure that processing is terminated once it is no longer needed and that the personal data is then “irreversibly destroyed unless, on the advice of ethics boards and data protection authorities, [its] scientific value in serving the public interest outweighs the impact on the rights concerned, subject to appropriate safeguards.”

Newly Arch-Popular Zoom App in Privacy Crosshairs

The COVID-19 virus has forced employees and contractors to work from home and educators to scramble for ways to continue teaching. Zoom, a video teleconferencing platform, has become the service of choice for many: 200 million people now use Zoom daily.

A New York Times article recently revealed that Zoom automatically sent the names and email addresses of meeting participants to a system that matched them with their LinkedIn profiles, if the host of the meeting had signed up for the sales prospecting program LinkedIn Sales Navigator. This article came after Motherboard revealed that the iOS version of the Zoom app was sending some analytics data to Facebook, regardless of whether Zoom users had a Facebook account.

Eric Yuan, Zoom’s CEO, acknowledged these security concerns in a blog post, and the company released a new version of the app. However, a class action suit has been filed against Zoom in the Northern District of California, claiming that Zoom failed to implement adequate security protocols and to provide accurate disclosures to its users, and thus fell well short of the promises stated in its privacy policy.

The New York Times reported that New York’s attorney general, Letitia James, sent a letter to Zoom asking which security measures it has put in place to prevent hacking. Senator Richard Blumenthal wrote a letter to Eric Yuan asking for information about how Zoom handles the personal data of its users and protects against security threats and abuse of its service.

Another issue is “Zoom bombing,” where uninvited participants “crash” a teleconference to disrupt it with noise or unwelcome images, including pornographic images. Indeed, it has been reported that several educational Zoom meetings have been crashed by uninvited parties. In one instance, the intruder exposed himself to students, and in another, the intruder disrupted a doctoral defense with obscene drawings and racial slurs.

Zoom bombing is a crime, as Matthew Schneider, United States Attorney for the Eastern District of Michigan, reminded the public, warning that anyone who hacks into a teleconference can be charged with state or federal crimes.

The FBI warned last month about this new risk and provided some security tips, such as not making meetings or classrooms public, but instead organizing private meetings, either by requiring a password to participate or by using Zoom’s waiting room feature to control which guests are admitted.

The FBI also recommended not sharing a link to a Zoom event on social media but instead sending the link directly to invited parties, setting screensharing to “Host Only,” and making sure that participants use the January 2020 updated version of the Zoom software, which disabled the ability to randomly scan for meetings to join.

Zoom announced in early April that it would enable the “waiting room” feature for all accounts, including free accounts, and would also require additional password settings. Free K-12 education accounts will be required to use passwords and will not be able to turn off the password feature. The New York City Department of Education, however, has announced that it will no longer allow the use of Zoom for distance learning, out of security concerns.


Selfie, Privacy, and Freedom of Speech Collide in France

On December 12, 2015, Brahim Zaibat, a dancer and choreographer, posted on social media a selfie he had taken two years earlier, showing him in an airplane, seated just behind Jean-Marie Le Pen, the honorary president of the French National Front, who had fallen asleep.

Mr. Zaibat wrote under the selfie: “Put them KO tomorrow by going to vote. To preserve our fraternal France!!!” Mr. Zaibat was referring to the second round of France’s regional elections, which were to take place the next day, on December 13, 2015, as the National Front, a party on the far right, was leading in the first round of the election in six of the thirteen French regions.

Mr. Le Pen took umbrage at this photograph, which he considered an infringement of his privacy and of his right to his image (droit à l’image). He sued Mr. Zaibat on December 31, 2015, asking the Tribunal de Grande Instance (TGI) of Paris, sitting in emergency proceedings, to order the selfie taken down from the social media site, to order the publication in several magazines of a message informing the public of this judicial measure, and to order Mr. Zaibat to pay 50,000 euros in provisional damages.

Mr. Zaibat argued in defense that ordering his selfie to be removed would undermine his freedom of expression as protected by article 10 of the European Convention on Human Rights (ECHR). According to Mr. Zaibat, he had “not overstepped the limits of freedom of expression, in a humorous way, in the context of a political debate on a topic of general interest.” For Mr. Zaibat, the selfie was a photograph taken in public, which represented, in a humorous way, a politician whose party was then in the spotlight.

But the judge, quoting both article 9 of the French civil Code and article 8 of the ECHR, both of which protect privacy, considered that the selfie had indeed violated Mr. Le Pen’s privacy and the right to his image. However, since the selfie was “neither degrading nor malicious,” she considered it appropriate to award the politician only one euro in compensation. The judge nevertheless forbade Mr. Zaibat from republishing the photograph, under penalty of 1,000 euros per infringement.

Mr. Zaibat has appealed the interim order.

No legal definition of privacy

Article 9 of the French civil Code provides that “[e]veryone has the right to respect for his private life” but it does not define what privacy is. Similarly, article 8 of the ECHR guarantees the right to respect for private life, without defining that concept. In 1970, the Parliamentary Assembly of the Council of Europe defined the right to respect for privacy in its declaration on mass communication media and human rights, Resolution 428, as “consist[ing] essentially in the right to live one’s own life with a minimum of interference” which includes “non-revelation of irrelevant and embarrassing facts, unauthorized publication of private photographs.”

Freedom of expression

Article 10 of the ECHR protects freedom of expression, which may nevertheless, under Article 10-2 of the Convention, “be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society.” The European Court of Human Rights interpreted this text in 1979, in Sunday Times v. The United Kingdom, as requiring that measures necessary in a democratic society correspond to a pressing social need and be proportionate to the legitimate aim pursued. Furthermore, the reasons given by the national authorities to justify such measures must be relevant and sufficient (see Sunday Times v. The United Kingdom § 62).

The difficult balance between freedom of expression and protection of privacy

The selfie violates the privacy of Mr. Le Pen. But does his right to privacy necessarily prevail over the right of Mr. Zaibat to express himself and over the public’s right to be informed?

It can be argued that a public person has the right to fall asleep on an airplane without having his picture taken and published. Mr. Le Pen had argued in court that Mr. Zaibat was not a political debater or a comedian, but a dancer, and thus could not avail himself of free political expression; rather, he had “in fact expressed himself as a private citizen and had simply taken advantage of the political agenda to create some ‘buzz’ by broadcasting a stolen photograph taken two years before.” But freedom of opinion belongs to everyone: dancers, politicians, and the media alike.

In a recent judgment, Couderc and Hachette Filipacchi Associés v. France, the Grand Chamber of the European Court of Human Rights unanimously held that France had violated Article 10 of the ECHR when interfering in 2005 with the right of weekly magazine Paris Match to publish photos of the then secret son of Prince Albert of Monaco.

In Couderc, the Court observed that “[t]he right to the protection of one’s image is… one of the essential components of personal development. It mainly presupposes the individual’s right to control the use of that image, including the right to refuse publication thereof” (Couderc § 85).

The Court added:

In determining whether or not the publication of a photograph interferes with an applicant’s right to respect for his or her private life, the Court takes account of the manner in which the information or photograph was obtained. In particular, it stresses the importance of obtaining the consent of the persons concerned, and the more or less strong sense of intrusion caused by a photograph… In this connection, the Court has had occasion to note that photographs appearing in the “sensationalist” press or in “romance” magazines, which generally aim to satisfy the public’s curiosity regarding the details of a person’s strictly private life… are often obtained in a climate of continual harassment which may induce in the person concerned a very strong sense of intrusion into their private life or even of persecution (Couderc § 86).

What about the picture taken by Mr. Zaibat? Does it represent Mr. Le Pen in his private life or in his public life? Mr. Le Pen is a public person. The photo was taken in a public place, an airplane, without the subject of the photograph being harassed. Everyone in the aircraft could see Mr. Le Pen asleep in his seat when passing by him in the aisle. But does the public have the right to know that he was asleep on this plane?

The judge took the position that privacy is defined subjectively, as the sphere that the person herself delineates when determining what may be disclosed by the press.

In accordance with Article 9 of the Civil Code and Article 8 of the [ECHR], every person, regardless of his reputation, has a right to privacy and is entitled to protection by setting for herself what may be disclosed by the press.

But this is an “extreme” conception of privacy according to three law professors (Traité du Droit de la Presse et des Médias, Bernard Beignier, Bertrand De Lamy, and Emmanuel Dreyer, paragraph 1589). The authors note that this conception allows an individual to “oppose any publication even in the presence of a legitimate public interest.” According to the authors, this conception of privacy is “not viable.”

In a case with somewhat similar facts, François Hollande, then First Secretary of the French Socialist Party, was photographed in 2006 while on vacation, in a market in the South of France. The photos were published in a weekly magazine, which illustrated them with humorous comments referring to current events. Mr. Hollande sued the magazine for invasion of privacy. The Cour de Cassation, France’s highest civil court, ruled on October 22, 2007, that the photos had indeed interfered with the privacy of Mr. Hollande:

Whereas the limits of the protection afforded by article 9 of the civil Code may be drawn more widely for persons performing official and public functions, the information revealed in this case was not directly related to the political functions exercised by the applicant, as the photographs were taken on the occasion of a private activity carried out during his vacation; these elements therefore do not fall within legitimate public information, despite the humorous reference in the article to the upcoming organization by François Hollande of a conference on the purchasing power of the French.

Did Mr. Zaibat participate in a debate of general interest by publishing the selfie? In its judgment in Von Hannover v. Germany (n. 2), the European Court of Human Rights explained that:

an initial essential criterion is the contribution made by photos or articles in the press to a debate of general interest… the definition of what constitutes a subject of general interest will depend on the circumstances of the case (Von Hannover v. Germany (n. 2) § 109).

Yet, since its Couderc judgment of November 2015, the European Court of Human Rights seems to tip the balance in favor of freedom of expression. The French courts, however, are often more favorable to the protection of privacy. Stay tuned…

Image is courtesy of Flickr user Gautier Poupeau under a CC BY 2.0 license.


May an Employer Prohibit Employees From Taking Pictures at the Workplace?

In a recent National Labor Relations Board (NLRB) case, Professional Electrical Contractors of Connecticut, Inc., 34-CA-071532, Administrative Law Judge (ALJ) Raymond P. Green found that some rules contained in the employee handbook, which prohibited employees from disclosing their location to third parties and from taking photographs or making recordings at their workplace, violated Section 8(a)(1) of the National Labor Relations Act (NLRA).

This section prohibits an employer from interfering with protected, concerted activity (PCA) as defined by Section 7 of the NLRA, that is, the exercise of employees’ rights “to self-organiz[e], to form, join, or assist labor organizations, to bargain collectively through representatives of their own choosing, and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection.”

In this case, the employer was an electrical company whose employees spent most of their working days at clients’ sites. The employer argued that these rules aimed to protect the privacy of its clients and testified that clients often prohibited pictures from being taken on their property without authorization.

No Disclosure of Location or Telephone Number of Customer Assignment to Third Parties

The employee handbook stated that employees were forbidden to disclose the location and telephone number of their customer assignment to outsiders. It also stated that “[v]iolation of customer confidentiality may lead to discipline up to and including termination.” According to the employee handbook, maintaining such confidentiality prevented employees from “disclos[ing] customers’ information to outsiders, including other customers or third parties and members of one’s own family.”

The General Counsel argued that such a prohibition could be interpreted by employees as barring Section 7 activities. The employer argued that the rule aimed to protect customers’ confidentiality, as its employees spend most of their working hours at customer sites and thus may have access to information that must be protected.

ALJ Green found that the rule prohibiting disclosure of a customer’s location was too broad and interfered with Section 7 rights. However, prohibiting disclosure of a customer’s phone number did not violate the NLRA: ALJ Green noted that all employees had personal or professional cell phones and thus had no need to disclose a customer’s phone number.

Information Technology Policy

The employee handbook also prohibited employees from sending communications or posting information, on or off duty, or from using personal computers in a way that might adversely affect the employer’s business interests or reputation.

ALJ Green found this rule invalid as too broad, because it did not merely cover communications made using computers owned by the employer, but also communications made by employees using their personal computers.

Prohibiting Employees From Taking Photographs or Making Recordings at Their Workplace

The employee handbook also prohibited employees from taking photographs or making recordings at the workplace without prior authorization.

The General Counsel contended that this rule could reasonably be construed as prohibiting employees from photographing or recording Section 7 activities, such as picketing, or employee communications used on social media. The employer testified that the rule aimed to protect customers’ confidentiality and privacy.

ALJ Green cited a recent NLRB case, The Boeing Company, 19-CA-90932 (May 15, 2014), in which an ALJ found that a Boeing rule prohibiting employees from using personal camera-enabled devices violated Section 8(a)(1) of the NLRA. The Boeing rule allowed employees to carry camera-enabled devices on all company properties and locations “except as restricted by government regulation, contract requirements or by increased local security requirements,” but prohibited using these devices to take pictures or record video without authorization.

Boeing had argued that the rule was meant to protect the confidentiality of its manufacturing process. However, the ALJ was not convinced, noting that the areas designated as camera-free included areas covered by VIP tours of the plant, and such visitors were allowed to take pictures during the tours. The ALJ noted that “[Boeing]’s manufacturing process is no more in need of protection than an automobile assembly line.” The ALJ distinguished the facts in Boeing from those in the Flagstaff Medical Center case, where the employer had forbidden employees to take pictures in order to protect hospital patients’ privacy. The ALJ concluded that Boeing’s no-camera rule “reasonably discourages its employees from taking photos of protected concerted activities.”

In the Professional Electrical Contractors of Connecticut case, ALJ Green concluded that the rule prohibiting employees from taking photographs and videos at the workplace violated Section 8(a)(1) of the NLRA, and ordered the employer to rescind the language of its rules. This case is a reminder that, while a social media policy may contain language prohibiting employees from taking pictures in the workplace, the language of the policy must clearly indicate that the purpose of the rule is to protect the privacy of the third party whose photograph is taken.

Image is Taking pictures courtesy of Flickr user Greg Habermann under a CC BY 2.0 license


Is Collecting License Plate Data Protected by the First Amendment?

Two companies, Digital Recognition Network (DNR) and Vigilant Solutions (Vigilant), filed a complaint last month in federal district court in Utah against Utah’s Governor and Utah’s Attorney General. They are asking the court to declare that the state’s Automatic License Plate Reader System Act (the Act) violates the First Amendment.

The case number is 2:14-cv-00099-CW-PMW.

What are Automatic License Plate Readers?

DNR manufactures automatic license plate reader (ALPR) systems. They are composed of mobile or fixed high-speed cameras, which record the license plates of every vehicle around them. Vigilant’s proprietary software converts these images into computer-readable data, thus allowing the creation of databases of the locations of all of these vehicles.

Clients are mainly recovery companies, better known as “the repo man,” which cruise the streets in vehicles carrying DNR cameras, looking for cars whose owners are in arrears. By using DNR services, they receive a ping when a car they want to repossess is in their vicinity.

As described on the DNR web site, clients are able to “[r]eceive up to the minute location updates” when using its services. That is because DNR “maintain[s] up to date information on asset locations for timely recovery.” In other words, the repo man knows where the cars in arrears are at any time, and it is then up to him to reclaim them.
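The “ping” described above amounts to matching freshly scanned plates against a hotlist. A minimal illustrative sketch, assuming a flat list-of-strings data model that is not DNR’s actual schema:

```python
def repo_alerts(scanned_plates, hotlist):
    """Return the scanned plates that match a repossession hotlist.
    Plates are normalized (case, spaces, dashes) before comparison.
    The data model here is an illustrative assumption."""
    def normalize(plate):
        return plate.replace(" ", "").replace("-", "").upper()

    wanted = {normalize(p) for p in hotlist}
    # A camera-equipped vehicle would run this over each batch of
    # camera reads and "ping" the driver on any hit.
    return [p for p in scanned_plates if normalize(p) in wanted]
```

The location databases the complaint describes arise from keeping every read, matched or not, which is precisely what the Utah Act restricts.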

According to DNR’s website, 190,000 vehicles have been recovered using this system. The data is also sold to law enforcement: DNR provides it to Vigilant, which then shares it with law enforcement agencies.

The Utah Act

Utah enacted the Automatic License Plate Reader System Act in April of last year.

§ 41-6a-2003 of the Utah Act generally prohibits the use of automatic license plate reader systems. There are a few exceptions, such as use by law enforcement agencies to protect public safety, conduct criminal investigations, comply with the law, enforce parking laws, enforce motor carrier laws, collect tolls electronically, or control access to a secured area. There is no “repo man” exception.

§ 41-6a-2003 of the Act prohibits selling the data thus collected. The data cannot be retained for more than 30 days by a private entity or more than 90 days by the government, except when it is necessary to preserve it longer pursuant to a preservation request, a disclosure order or a warrant. The Act therefore prohibits building commercial databases with ALPR data.
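The Act’s retention limits could be enforced with a purge routine along these lines; the record shape, function names, and the preservation mechanism are illustrative assumptions, not statutory text:

```python
from datetime import datetime, timedelta

PRIVATE_RETENTION = timedelta(days=30)     # private entities (per the Act)
GOVERNMENT_RETENTION = timedelta(days=90)  # government entities (per the Act)

def purge_expired(records, now, is_government, preserved_ids=frozenset()):
    """Drop ALPR records older than the applicable retention period,
    keeping any record subject to a preservation request, disclosure
    order, or warrant. Records are illustrative (id, captured_at) pairs."""
    limit = GOVERNMENT_RETENTION if is_government else PRIVATE_RETENTION
    return [(rid, ts) for rid, ts in records
            if now - ts <= limit or rid in preserved_ids]
```

Run periodically, such a routine prevents exactly the long-lived commercial location databases the Act targets.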

Because the Utah Act clearly prevents the plaintiffs from operating their business model in the state, they had to stop collecting and sharing ALPR data there, and they are bringing this suit because they wish to resume their business in Utah.

Privacy Laws

The plaintiffs are careful to note that ALPR data does not contain any personally identifiable information. They argue that, to identify a person, this data would have to be combined with other data, such as data held by departments of motor vehicles, which they cannot access since the federal Drivers Privacy Protection Act (DPPA), as well as several state laws, protects the privacy of motor vehicle records.

Is Disseminating ALPR Data Protected by the First Amendment?

Plaintiffs claim that disseminating ALPR data they collect is speech protected by the First Amendment.

They argue that the Utah Act is a content-based speech restriction, as it is the content of what is being photographed that is regulated. They also claim that the Act discriminates based on the content of the speech and the identity of the speaker, and thus cannot survive the “heightened scrutiny” standard of review, that is, the intermediate level of scrutiny, under which a law must further an important government interest by means that are substantially related to that interest.

Laws protecting privacy indeed restrict the dissemination of truthful information. Justice Marshall noted in the Supreme Court’s 1989 opinion in The Florida Star v. B.J.F. that there is “tension between the right which the First Amendment accords to a free press, on the one hand, and the protections which various statutes and common-law doctrines accord to personal privacy against the publication of truthful information, on the other” (at 530). There is no doubt that ALPR data is truthful information.

Warren and Brandeis wrote in their classic law review article that “[t]he right of privacy does not prohibit any publication of matter which is of public or general interest” (at 214). But what is the public interest in publishing ALPR data?

Plaintiffs make the argument that such data helps law enforcement agencies fight crime. In the 2001 Bartnicki v. Vopper case, the Supreme Court noted that “there are some rare occasions in which a law suppressing one party’s speech may be justified by an interest in deterring criminal conduct by another” (at 530). But here, the law would suppress speech which, at least according to the plaintiffs, contributes to deterring criminal conduct… The data may also be used for commercial purposes, as third parties that are not law enforcement agencies may buy it. It is not clear whether the plaintiffs are selling this data to private parties, but it would be possible to do so.

ALPR systems affect everybody’s privacy, as they record information about all the cars in their vicinity, not just a few chosen targets. Much can be inferred, whether rightly or wrongly, from the location of our cars and our whereabouts, and not only for surveillance purposes. Did we park at the shopping mall to go to the health club or to gorge on French fries? The answer may interest our health care provider. Did we visit a big box hardware store several times last week? That may interest its competitor across town.

Some Legislative Answers

Like Utah, Vermont enacted a law last year regulating the use of ALPR systems, restricting their use to legitimate law enforcement purposes.

Other states, such as Massachusetts, are considering bills banning license plate scanners. A New Hampshire bill, HB 675, is now dead; it would have authorized the use of license plate scanning devices, but only by law enforcement agencies, not by private parties.

A federal bill, HR 2644, would withhold certain grants from law enforcement agencies using automated license plate readers unless they have written and binding policies limiting how long the data is stored, do not store the data in a database, and do not share it with any third party that is not another law enforcement agency. HR 2644 is not likely to be enacted.

The Utah case will certainly be followed by legislators around the U.S. Meanwhile, Utah legislators are anticipating its outcome by trying to quickly enact a new law. SB 222 would restrict the scope of the Utah Act by providing that the restrictions on the use of ALPR systems apply only to a governmental entity. SB 222 has been received from the Senate for enrolling and should be enacted very soon.

Image is courtesy of Flickr user Nan Palmero pursuant to a CC BY 2.0 license.


Two New California Laws: Right to be Forgotten for Minors and Protection from Cyber Revenge

California enacted a “right to be forgotten” law on September 23rd. SB 568 was passed to give “Privacy Rights for California Minors in the Digital World.” The law, which is now part of the California Online Privacy Protection Act, will take effect on January 1, 2015.

Under §22581(a) of the law, operators of web sites or apps “directed to minors” which have “actual knowledge” that a minor is using the site/app, must allow minors who are registered users of their service “to remove or, if the operator prefers, to request and obtain removal of, content or information posted on the operator’s Internet Web site, online service, online application, or mobile application by the user.”

Operators must provide notice to minors that such removal service is available to them and also provide them “clear instructions” on how to remove or to request removal of content or information. The notice must also inform minors that the removal “does not ensure complete or comprehensive removal of the content or information posted.”

Exceptions to the Right to Be Forgotten

There are five exceptions to this obligation to erase information:

1. A federal or a state law requires maintaining the content or information

2. The content or information was stored on or posted by a third party, including any content or information posted by the minor that was stored, republished, or reposted by the third party

3. The operator anonymizes the content or information posted by the minor in such way that the minor cannot be individually identified

4. The minor does not follow the instructions provided by the operator on how to erase or request deletion of the information or content

5. The minor has received compensation or other consideration for providing the content

Is Content Really Deleted?

Many sites already offer a delete button. Minors (and adults) can delete tweets and Facebook posts, and it is easy to delete a blog post or even an entire blog in a few seconds. However, one can never be sure that the information has been deleted from servers.

Also, as data is frequently republished by third parties, deleting all the subsequent reposting of that information is impossible, hence the necessity of the third-party republication exception of the California law.

The Particular Issue of Cyber Revenge

This third-party republication exception is good news from a free speech point of view, but it is not good news for victims of cyber bullying and cyber revenge. The New York Times published a sobering article on revenge porn last week, describing the plight of a young girl whose former boyfriend had posted intimate pictures of her online out of spite.

The article mentions that owners and operators of revenge porn sites are in most cases protected by federal law.

Indeed, under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. §230, “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The law provides Internet service providers a safe harbor from liability for speech posted by a third party.

Section 230 of the CDA has, however, no effect on criminal law or intellectual property law. That means that if the image of a naked minor is posted on a revenge porn site, and the State where the posting took place considers that such posting constitutes child pornography, the revenge porn site will not be able to use Section 230 of the CDA as a defense.

Also, if a posting falsely implies that the person featured would freely engage in sexual activities if asked and/or paid for this service, it would be defamatory, and the victim could sue the site. However, most posts show a picture and identify the subject of the photograph, without further comment.

Copyright law could also be used to have a photograph taken down, but only if the subject of the photograph owns its copyright. That would be the case if a racy ‘selfie’ is emailed to, say, a boyfriend, who later posts it on a site out of spite. In that case, the subject of the photograph could use another federal law, the Digital Millennium Copyright Act, to require the site to take down the photograph. In order to be protected by copyright, however, the ‘selfie’ would have to be original, but the threshold for protection is low.

It could also be argued that revenge porn is a breach of the federal stalking law, 18 U.S.C. § 2261A, passed as part of the Violence Against Women Act. This act, at 18 U.S.C. § 2261A(2), specifically criminalizes stalking using a computer.

Another new California law, SB 255, was enrolled on September 19 and specifically addresses the issue of cyber revenge. It is now a crime in California to publicize the picture of someone partially or completely naked, even if that person consented to having the picture taken. The offense is defined as follows:

“Any person who photographs or records by any means the image of the intimate body part or parts of another identifiable person, under circumstances where the parties agree or understand that the image shall remain private, and the person subsequently distributes the image taken, with the intent to cause serious emotional distress, and the depicted person suffers serious emotional distress.”

The author of the bill, California Senator Anthony Cannella, argued that such law was necessary to protect victims of cyber-bullying, especially since some victims have committed suicide as a result of being bullied online.

The ACLU opposed the bill, arguing that “[t]he posting of otherwise lawful speech or images even if offensive or emotionally distressing is constitutionally protected. The speech must constitute a true threat or violate another otherwise lawful criminal law, such as stalking or harassment statute, in order to be made illegal. The provisions of this bill do not meet that standard.”

The ACLU cited in its argument the United States v. Cassidy case, where the defendant had argued that 18 U.S.C. § 2261A(2)(A) violated the First Amendment because it was overbroad. The District Court of Maryland found in favor of defendant, reasoning that the government’s indictment was directed at protected speech, described by the court as “anonymous, uncomfortable Internet speech addressing religious matters” (at 583).

The court also noted that this speech did not fall into any of the categories of speech outside First Amendment protection, that is, obscenity, fraud, defamation, true threats, incitement, or speech integral to criminal conduct.

Whether revenge porn postings are protected by the First Amendment remains an open question, likely to be answered at some point by a court of law.

Image is Graffiti courtesy of Flickr user Cliff Beckwith pursuant to a CC BY 2.0 license.

 


SCA Protects Privacy of Non-Public Facebook Wall Posts

The U.S. District Court for the District of New Jersey ruled on August 20 that non-public Facebook wall posts are covered by the Stored Communications Act (SCA). However, the authorized user exception applied in this case, as a colleague who had legal access to Plaintiff’s Facebook wall had forwarded the controversial post, unsolicited, to their common employer.

The case is Ehling v. Monmouth-Ocean Hospital.

Plaintiff Deborah Ehling was a registered nurse and paramedic working for Defendant Monmouth-Ocean Hospital Service Corp. (MONOC). In June 2009, she posted a message on her Facebook account implying that the paramedics who took care of the man who had killed a security guard outside the U.S. Holocaust Memorial Museum in 2009 should have let him die.

The privacy settings of Ehling’s Facebook account limited access to her Facebook wall to her Facebook friends. No MONOC managers were her Facebook friends, but several of her MONOC coworkers were, including Tim Ronco, who apparently decided on his own to provide screenshots of Ehling’s controversial Facebook post to a MONOC manager.

Plaintiff was then suspended with pay and was told that her comment reflected a “deliberate disregard for patient safety.” Plaintiff then filed a complaint with the National Labor Relations Board (NLRB), which found that MONOC had not violated Ehling’s privacy, as the post had been sent unsolicited to management.

Plaintiff was eventually fired and filed an action against MONOC, but the court granted defendant’s motion for summary judgment. I will only discuss the SCA claim here.

Stored Communications Act

Plaintiff claimed that Defendant had violated the SCA when accessing the messages posted on her Facebook wall.

The SCA, 18 U.S.C. § 2701, is part of the Electronic Communications Privacy Act (ECPA) of 1986. The SCA forbids unlawful access to stored communications, that is, “(1) intentionally access[ing] without authorization a facility through which an electronic communication service is provided; or (2) intentionally exceed[ing] an authorization to access that facility.”

According to the Ehling court, Facebook wall posts are indeed electronic communications as defined by 18 U.S.C. § 2510(12), that is, “any transfer of signs, signals, writing, images, sounds, data, or intelligence of any nature transmitted in whole or in part by a wire, radio, electromagnetic, photoelectronic or photooptical system that affects interstate or foreign commerce.”

Facebook’s users transmit data over the Internet, from their devices to Facebook’s servers, and thus their posts are electronic communications within the meaning of the SCA. No new issue here. The New Jersey court cited the 2010 Central District Court of California case, Crispin v. Audigier, which stated that Facebook and MySpace were electronic communications providers. Again, nothing new.

Finally, Facebook saves posts on its servers indefinitely, thus backing them up. Therefore, Facebook wall posts are in electronic storage within the meaning of the SCA, as 18 U.S.C. § 2510(17)(B) defines electronic storage to include the storage of electronic communications for purposes of backing them up. We are all set: the SCA applies in this case.

Public Electronic Communications/Private Electronic Communications

The controversial issue was instead whether Plaintiff’s Facebook posts were public or private. This is important, as the ECPA only protects private communications. The Ninth Circuit had noted in the 2002 case Konop v. Hawaiian Airlines, Inc. that “the legislative history of the [SCA] suggests that Congress wanted to protect electronic communications that are configured to be private, such as email and private electronic bulletin boards.”

But a completely public BBS is not protected by the SCA. Indeed, the legislative history of the SCA explains that:

“the bill does not for example hinder the development or use of electronic bulletin boards or other similar services where the availability of information about the service, and the readily accessible nature of the service are widely known and the service does not require any special access code or warning to indicate that the information is private. To access a communication in such a public system is not a violation of the Act, since the general public has been ‘authorized’ to do so by the facility provider.” (S. REP. NO. 99-541, at 36)

Konop was cited in Crispin v. Audigier, where the court reasoned that “there is no basis for distinguishing between a restricted-access BBS and a user’s Facebook wall or MySpace comments.” The New Jersey District court cited Audigier to conclude that non-public Facebook wall postings are covered by the SCA.

As the privacy settings of Plaintiff’s Facebook account prevented non-Facebook friends from accessing the messages on her wall, these messages were not truly “public,” and therefore the SCA applied to them. However, the authorized user exception of the SCA applied in this case.

Why the SCA’s Authorized User Exception Applied in this Case

There is no liability under the SCA if access “[is] authorized … by a user of that service with respect to a communication of or intended for that user,” 18 U.S.C. § 2701(c)(2).

The court cited its own 2009 Pietrylo v. Hillstone Rest. Grp. case, which had found that there is no violation of the SCA if the access to an electronic communication has been authorized. In the Pietrylo case, the manager of a restaurant had accessed the MySpace account of an employee, accessible only by invitation, by asking another employee to provide him the password. In Ehling, one of Plaintiff’s colleagues had voluntarily forwarded the electronic communication to the employer “without any coercion or pressure.” Therefore, the access was authorized. The difference lies in how access was obtained: asking or coercing an employee for access violates the SCA, while learning about the communication from an unsolicited third party does not.

Take away

Case law is consistent on this issue. While employers should not coerce or pressure employees to provide them access to the social media account of another employee, it is not illegal for them, under the SCA, to access a social media post if a third party willingly shares this information with them.

As for providing access to one’s own social media account to one’s employer, New Jersey recently enacted a law prohibiting employers from asking for user names, passwords, or other means of accessing employees’ electronic communications devices. Several states have similar laws, but New York is not one of them yet.

Image is Facebook wall courtesy of Flickr user Marcin Wichary pursuant to a CC BY 2.0 license.

 


Big Data and Privacy at the American Bar Association

I will be presenting next Friday the privacy and Big Data portion of this CLE webinar presented by the American Bar Association: The Big Data IP Problem: Big Issues for “Big Data.”

From the program description:

The program will explore the ways that “big data” enhances and impedes innovation, the privacy issues it creates, and the viable solutions to creating a more secure system of compiling data.

 

 

Image is Spin-Torque MRAM courtesy of Flickr user jurvetson pursuant to a CC BY 2.0 license.
