Connecticut College Students Join a Suit Against Google

As a Yale student in 2014, Rachel Ett used her university-issued Google Apps for Education account for, well, almost everything.
She took and saved class notes in Google Drive cloud storage, used Gmail to email back and forth with professors and used Google Calendar to plan and manage her social as well as academic life. What she didn’t realize until later is that Google was allegedly watching it all.
Despite assurances from Google and Yale’s Data & Security page that Yale Gmail accounts would not be scanned, Google was apparently scanning the emails for advertising purposes, something Ett believes was an invasion of her privacy.
“Google would have been able to know exactly where I would be at any given time, my grades, my relationships with professors, my work schedule, the library books I checked out, my lunch dates and what parties I attended,” says Ett, a 24-year-old from Charleston, South Carolina.
Google Apps for Education is provided free to thousands of K-12 schools and universities and is used by more than 30 million students and teachers nationwide. It includes a Gmail service that was supposed to be free of the data scanning Google’s general apps are subject to. However, the Mountain View, California-based tech giant announced in a blog post in April 2014 that it had “permanently removed all ads scanning” in its email service for schools, a tacit acknowledgment that it had been scanning the emails previously.
Ett is one of 38 Yale students and seven University of Connecticut students, as well as more than 800 students from 16 other universities including Harvard and New York University, who have joined a lawsuit against Google that seeks damages for the alleged breach of privacy. The lawsuit is being led by Ray Gallo, a California-based lawyer who is himself a Yale graduate.
The suit charges that Google, which did not respond to requests for comment for this story, violated the Electronic Communications Privacy Act and asks the company to delete any information derived from the scanned apps as well as pay damages of up to $10,000 for each plaintiff or $100 a day for each day the law was broken, whichever number is greater.
Gallo believes he and the students he represents have a strong case. But the cards can often be stacked against consumers in these types of cases, says Lauren Henry Scholz, resident fellow and Knight Law and Media Scholar at the Information Society Project at Yale Law School. “Privacy cases in general struggle in the courts. There is no federal general consumer privacy act.”
But there have been some successes. In 2013 Facebook settled a case for more than $20 million prompted by privacy concerns over targeted ads on the social media site. The payout in that case was unusually large, Scholz says. “Many times privacy cases need to be taken pro bono because in privacy, damages are often low. In Boring v. Google, a couple won their trespass case against Google, but got damages of only $1.” The cases are still important, she says. “The general public has broad fears about privacy invasion by private companies, but they do not understand the specific mechanisms for data collection, processing and trafficking. It’s worse than people think.”
Google data mining
In a process known as data mining, Google scans the content of Gmail and other apps, as well as the terms users enter into its ubiquitous search engine. This information is used to create user profiles that can be sold to other companies or used to create targeted ads. Facebook and many other tech companies employ similar methods, and users often unknowingly consent to this scanning. “The disclosures that are generally made to consumers, they’re very confusing, and they’re very long,” says Matthew Fitzsimmons, Connecticut’s assistant attorney general who heads that office’s privacy and data security department. “There are companies that their whole existence is figuring out how to use that data. That’s not necessarily a bad thing — there’s a lot of good that comes from big data — but it can’t all be done at the expense of consumers.”
Fitzsimmons would like to see a more concise, easy-to-understand and standardized model for digital privacy disclosures that would be modeled on food labels, which are able to clearly give consumers information about what they’re actually consuming. He also wants to make sure companies stick to the terms laid out in the digital agreement forms users consent to. “If a company’s upfront and says, ‘Hey, here’s this product, it’s free for you to use but here’s how we’re going to make it free,’ that’s fine. But, we’ve had a number of cases that we’ve settled where maybe a company isn’t doing what they said they were going to do with the data,” he says.
In 2013, Fitzsimmons helped his boss, Attorney General George Jepsen, lead a 38-state case against Google after it came to light that the cars used for its street view mapping project were scooping up data, including passwords, emails and other information, from unencrypted Wi-Fi networks. Google settled the case by paying a $7 million fine and agreeing to help educate the public about privacy issues. The Connecticut attorney general’s office also met with Google executives to discuss concerns about Google Glass and met with Apple executives prior to the launch of the Apple Watch to discuss concerns about how the personal health data recorded by the device would be protected.
• • •
The disconnect between what was promised and what occurred is at the heart of the college students’ lawsuit against Google. Gallo says, “Everybody’s privacy is important, but if folks want to waive it in exchange for free email, that is their right. The problem arises when no consent is given. Plaintiffs [in this case] are university users required to use these Gmail accounts by the universities and they did not intend to consent to Google scanning their emails for commercial purposes.”
Alexander Cerjan, 29, currently a postdoctoral scholar at Stanford University, was a Yale graduate student studying physics and teaching undergraduate students when the email scanning in question occurred. “I don’t mind Google reading emails like receipts from online orders. All of that information is already available to the wider internet from my choice of search criteria through other tracking mechanisms,” he says. “Most of my personal communication is also rather banal. If Google wants to know that I’m going to get sushi on Wednesday, that’s fine.” But he adds there was content disclosed in his emails that should remain private. “Material that concerns others, especially the students I was the teaching fellow for, is qualitatively different. It’s not my place to decide for a student whether the grade they received on a particular homework assignment or test should be public.” Had he known Google was scanning the emails, he says he would have provided information about grades to students in a different manner.
The debate about digital privacy gets more intense when the government is involved. National headlines were generated earlier this year after Apple refused to help the FBI unlock an iPhone 5c belonging to one of the San Bernardino shooters. Ultimately, the FBI managed to unlock the phone without Apple’s help, which raised other concerns about mobile devices’ vulnerability to hacking.
In the spring, a cellphone privacy bill unanimously passed Connecticut’s House of Representatives and Senate. The law, which will go into effect this fall, requires police to demonstrate evidence that a person has committed or will commit a crime in order to receive a court order to track someone’s cellphone information. It also requires authorities to obtain permission before using “stingrays” (devices that act like cellular towers in order to track phones within certain areas), and notify people by mail that they have been tracked. There are also safeguards in place to make sure law enforcement is adhering to these requirements.
The bill was championed by the American Civil Liberties Union of Connecticut. David McGuire, the organization’s legislative and policy director and interim executive director, says the bill puts Connecticut at the forefront of digital privacy legislation. “It is, in my mind, one of the most comprehensive government privacy bills in the country. This is a relatively future-proof bill that will do a great job of protecting people’s privacy while enabling police to get cellphone tracking information when appropriate. So it strikes that balance between privacy and public safety.”
Another issue in Connecticut and beyond is digital privacy in schools. In December the Electronic Frontier Foundation, a San Francisco-based privacy advocacy group, filed a complaint with the Federal Trade Commission against Google. The case is separate from the college-student suit against Google and charges that the tech company is reneging on its promise to not data-mine Google Apps for Education. Thousands of K-12 schools use the app, and Google continues to mine schoolchildren’s personal information, including Google searches, but is not using the information gathered for targeted ads. It’s unclear what Google is doing with the data, and that’s part of the problem, says Nate Cardozo, the Electronic Frontier Foundation’s chief attorney. “Google is collecting a rather extraordinary amount of information about students and this was despite the fact that Google signed a pledge saying they wouldn’t collect information about students without getting the students’ or parents’ consent, which they didn’t do.”
Like the college case, the problem is that students at these schools are required to use Google products and were led to believe their use would be private. “Google Apps for Education is an awesome product,” Cardozo says. “It helps classrooms around the country. It’s free for schools and kids can do things now that they could never do before. It provides an enormous benefit to society, but there’s also a cost and that cost is privacy.”
Protecting personal privacy
Questions about digital privacy are not unique to Google, but the company’s size and data-centered business model make it unusual. “Google’s business is data in a way that some other companies’, mainly Apple’s, isn’t,” Cardozo says. “Apple makes its money selling hardware and selling media content on the iTunes store. They don’t make their money scanning your data. Google, their only business is scanning data.”
Avoiding this scanning is difficult; Google and other companies with similar practices are nearly inescapable. This story was researched using Google searches, written on Google Docs, and the interviews were scheduled via Gmail.
Experts say you can pay for an email account that doesn’t data-mine and use alternative search engines that don’t track search terms. However, Cardozo says, the only surefire way to avoid data-mining from Google is to not use Gmail and stop emailing anybody who does use Gmail, and “that doesn’t sound realistic to me.”
Fitzsimmons says vigilance is needed in general and that email and many other internet communications are not secure.
“I would always caution against anybody putting any kind of sensitive information into an email unless it’s absolutely necessary,” he says. “Using public Wi-Fi to do anything involving sensitive or personal information is a terrible idea.”
As for the college case, as of press time a mediation was scheduled, but Gallo was not optimistic it would be resolved. Either way, the issue of digital privacy does not seem likely to be settled anytime soon. “Email is the primary form of communication in the modern day. It should be subject to the same privacy, if not more, of secure written documents and correspondence,” says Ett, the Yale student.
Cerjan says, “We take a vast number of services for granted on the internet, and it is understandable that the companies providing those services should be able to monetize them somehow. As a society, it would be great if we could decide which particular methods for monetizing these services are acceptable. However, in the meantime, the terms and conditions we are forced to accept to use these services should be stated in plain text, not buried in pages of documents, and companies should be held accountable to these agreements just as the users are.”