FTC Beat
Posts Tagged ‘Internet privacy’
Sep 29, 2013

FTC Takes First Enforcement Action on ‘Internet of Things’

A company that markets video cameras designed to allow consumers to monitor their homes remotely has agreed to settle FTC charges that it failed to properly protect consumers’ privacy. This marks the FTC’s first enforcement action against a marketer of a product connected to the Internet and other mobile devices, a category commonly referred to as the “Internet of Things.”

The FTC’s complaint alleges that TRENDnet marketed its cameras for uses ranging from baby monitoring to home security and told customers that its products were “secure.” In fact, however, the devices were compromised by a hacker who posted links on the Internet to live feeds from over 700 cameras. Additionally, TRENDnet stored and transmitted user credentials in clear, unencrypted text.
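For readers who build connected devices, the clear-text credential allegation is worth pausing on. Below is a minimal sketch of the alternative, using only Python’s standard library; the function names and flow are our own illustration, not TRENDnet’s code, and protecting credentials in transit would separately require an encrypted channel such as TLS.

```python
# Illustrative sketch only (not TRENDnet's code): store a salted hash of a
# login credential instead of the clear-text password. Transmitting the
# credential safely is a separate problem, solved with TLS rather than hashing.
import hashlib
import hmac
import os

def hash_credential(password: str, salt: bytes = None) -> tuple:
    """Return (salt, derived_key); store these instead of the raw password."""
    salt = salt if salt is not None else os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 200_000)
    return salt, key

def verify_credential(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    _, candidate = hash_credential(password, salt)
    return hmac.compare_digest(candidate, stored_key)

# Usage: keep (salt, key) at account creation; the clear-text password is never stored.
salt, key = hash_credential("camera-owner-password")
assert verify_credential("camera-owner-password", salt, key)
```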

Under the terms of its settlement with the FTC, TRENDnet is prohibited from misrepresenting the security of its cameras or the security, privacy, confidentiality, or integrity of the information that its cameras or devices transmit. The company must also establish a comprehensive security program, notify customers about security issues with the cameras, and provide a software update to address those issues.

“The Internet of Things holds great promise for innovative consumer products and services,” FTC Chairwoman Edith Ramirez said. “But consumer privacy and security must remain a priority as companies develop more devices that connect to the Internet.”

The FTC’s authority to regulate and penalize companies that, in the agency’s view, fail to protect consumers with sufficient data security is being challenged in federal court in New Jersey by Wyndham Worldwide and its hotel subsidiaries. Wyndham has argued, among other things, that the FTC has not published any formal rules on data security and therefore cannot penalize companies that it deems have not protected consumer information. That case is pending.

This is the first time the FTC has brought an enforcement action involving the “Internet of Things,” but the agency has already signaled that it will be watching closely as the Internet of Things develops. In particular, the FTC will host a workshop in November to explore these new technologies. The agency previously sought comment from interested stakeholders on the Internet of Things, including the privacy and data security implications of interconnected devices. We expect that the FTC will continue to explore these issues, with a particular emphasis on how these devices collect and share information, especially sensitive personal information such as health data.

Sep 09, 2013

With Complaint Against LabMD, FTC Continues to Flex Enforcement Muscle on Data Security

The Federal Trade Commission recently filed another complaint against a company for alleged data security lapses. As readers of this blog know, the FTC has initiated numerous lawsuits against companies in various industries for data security and privacy violations, although Wyndham and large industry organizations have pushed back, arguing that the agency lacks the authority to set data security standards in this way.

The FTC’s latest target is LabMD, an Atlanta-based cancer detection laboratory that performs tests on samples obtained from physicians around the country. According to an FTC press release, the FTC’s complaint (which is being withheld while the FTC and LabMD resolve confidentiality issues) alleges that LabMD failed to reasonably protect the security of the personal data (including medical information) of approximately 10,000 consumers, in two separate incidents.

Specifically, according to the FTC, LabMD billing information for over 9,000 consumers was found on a peer-to-peer (P2P) file-sharing network. The information included a spreadsheet containing insurance billing information with Social Security numbers, dates of birth, health insurance provider information, and standardized medical treatment codes.

In the second incident, the Sacramento, California Police Department found LabMD documents in the possession of identity thieves. The documents included names, Social Security numbers, and some bank account information. The FTC states that some of these Social Security numbers were being used by multiple individuals, indicating likely identity theft.

The FTC’s complaint alleges that LabMD did not implement or maintain a comprehensive data security program to protect individuals’ information, that it did not adequately train employees on basic security practices, and that it did not use readily available measures to prevent and detect unauthorized access to personal information, among other alleged failures.

The complaint includes a proposed order against LabMD that would require the company to implement a comprehensive information security program. The order would also require that program to be evaluated every two years for the next 20 years by an independent certified security professional. LabMD would further be required to provide notice to any consumers whose information it has reason to believe was or could have been accessible to unauthorized persons, and to those consumers’ health insurance companies.

LabMD has issued a statement challenging the FTC’s authority to regulate data security and stating that it was the victim of Internet “trolls” who presumably stole the information. This latest complaint is yet another sign that the FTC continues to monitor companies’ data security practices, particularly respecting health, financial, and children’s information. Interestingly, the LabMD data breaches were not huge: only about 10,000 consumers were affected. But the breach of, and potential unauthorized access to, sensitive health information and Social Security numbers tend to attract the FTC’s attention.

While industry awaits the district court’s decision on Wyndham’s motion to dismiss based on the FTC’s alleged lack of authority to set data security standards, companies should review and document their data security practices, particularly when it comes to sensitive personal information. Of course, in addition to the FTC, some states, such as Massachusetts, have their own data security standards, and most states require reporting of data breaches affecting personal information.

Aug 06, 2013

FTC Looking Closely at Impact on Consumers of ‘Big Data’

Manufacturers and marketers know that the more consumer data they have, the better they can tailor and direct their advertising, their products, and their product placement. This helps them maximize sales and minimize costs. Thanks to the combination of cheap data storage and ubiquitous data-capture devices (e.g., smartphones, credit cards, the Web), the amount of consumer data out there to mine is astounding. Hence the recently popularized term, “Big Data.”

But the misuse of data could result in government enforcement actions and, more importantly, serious privacy violations that can affect everyone.

FTC Commissioner Julie Brill recently addressed some of the practical challenges and concerns flowing from the use of big data at the 23rd Computers, Freedom and Privacy conference on June 26. The issues she raised included noncompliance with the Fair Credit Reporting Act (FCRA) and consumer privacy matters such as transparency, notice and choice, and deidentification (scrubbing consumer data of personal identifiers).

The FCRA: Those whose business includes data collection or dissemination should determine whether their practices fall within the boundaries of the FCRA. As Brill pointed out, “entities collecting information across multiple sources and providing it to those making employment, credit, insurance and housing decisions must do so in a manner that ensures the information is as accurate as possible and used for appropriate purposes.” If Brill’s comments are any indication of enforcement actions to come, businesses should be aware that the FTC is on the lookout for big data enterprises that don’t adhere to FCRA requirements.

Consumer Privacy: Brill gave some credit to big data giant Acxiom for its recent announcement that it plans to allow consumers to see what information the company holds about them, but she noted that this access is of limited use when consumers have no way of knowing who the data brokers are or how their information is being used. Brill also highlighted a questionable use of consumer data by national retailer Target: the somewhat funny yet disturbing news story about Target identifying a teen’s pregnancy based on her purchasing patterns. It is a classic example of why consumers ought to have notice of what data is being collected about them and how that information is being used.

Consumers also need to have, as Brill suggested, the opportunity to correct information about themselves. This makes sense. Data collection is imperfect: different individuals’ information may be inaccurately combined, someone’s information may have been hacked, someone could be the victim of cyber-bullying, and other mishaps and errors can occur. Consumers should be able to review their information and correct errors. Finally, Brill highlighted concerns that current efforts to scrub consumer data may be ineffective, as companies become better at combining disparate data points to accurately identify individuals. “Scrubbed” data in the wrong hands could be as harmful as a direct security breach.
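To make the scrubbing concern concrete, here is a minimal, hypothetical sketch of pseudonymization. The record layout, field names, and key handling are assumptions for illustration only; the point is Brill’s: even after direct identifiers are replaced with keyed hashes, the quasi-identifiers that remain can still allow re-identification when combined with other data sets.

```python
# Hypothetical de-identification sketch: direct identifiers are replaced with
# keyed hashes, but quasi-identifiers (ZIP code, birth year, gender) remain
# and can still be matched against outside data to re-identify a person.
import hashlib
import hmac

SECRET_KEY = b"example-key-stored-separately-and-rotated"  # assumption for illustration
DIRECT_IDENTIFIERS = ("name", "email", "ssn")

def pseudonymize(record: dict) -> dict:
    scrubbed = dict(record)
    for field in DIRECT_IDENTIFIERS:
        if field in scrubbed:
            digest = hmac.new(SECRET_KEY, str(scrubbed[field]).encode("utf-8"),
                              hashlib.sha256).hexdigest()
            scrubbed[field] = digest
    return scrubbed

record = {"name": "Jane Doe", "email": "jane@example.com", "ssn": "123-45-6789",
          "zip": "20001", "birth_year": 1990, "gender": "F"}
print(pseudonymize(record))  # zip/birth_year/gender survive scrubbing: residual risk
```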

Brill encouraged companies to follow the “privacy by design” recommendations issued by the FTC in order to build more protections into their products and services. She further emphasized her “Reclaim Your Name” initiative, which aims to give consumers knowledge of, and access to, the data collected about them. Companies in the business of data collection, mining, and analytics should take note of the FTC’s efforts to empower the consumer against the overuse or misuse of consumer data. If you want to stay on the good side of the FTC – and on the good side of the informed consumer – work with the consumer, and provide meaningful notice, choice, and consent.

posted in:
Privacy
Jul 18, 2013

Google Glass Sounds Exciting — But What About Privacy?

Beta testing is underway for Google Glass, a new technology that provides the functionality of a smartphone in a headset worn like glasses. Much like a smartphone, the Glass headset is able to exchange messages with other mobile devices, take pictures, record videos, and access search engines to respond to user queries. But unlike a smartphone, the device’s optical head-mounted display, voice recognition, and front-facing camera give users hands-free access to its features, including the ability to capture photographs and video recordings of the world in front of them.

For now, Glass is only available to developers and a small group of test users known as Google Explorers. The device is expected to go on sale to the public in 2014. In the meantime, public speculation swells, and the blogosphere is full of conflicting reports about what the device can and cannot do. Some suggest that the device will utilize facial recognition and eye-tracking software to show icons and statistics above people whom the user recognizes. A more common concern is that the device will be able to photograph and record what the user sees and then share that data with third parties without permission from the user or those whose likenesses are being captured.

Because of this lack of clarity, lawmakers around the world are urging Google to affirmatively address the myriad of privacy concerns raised by this new technology. Last month, an international group of privacy regulators – including representatives from Australia, Canada, Israel, Mexico, New Zealand, Switzerland, and a European Commission panel – signed off on a letter to Google’s CEO Larry Page asking for more information regarding the company’s plans to ensure compliance with their data protection laws.

Here in the United States, the House Bipartisan Privacy Caucus issued a similar letter of its own. In addition to a variety of questions regarding the device’s capabilities, the letters reference some of Google’s recent data privacy mishaps and ask whether Google intends to take proactive steps to ensure the protection of user and nonuser privacy.

Google’s Vice President of Public Policy and Governmental Relations (and former New York Congresswoman) Susan Molinari issued a formal response to the House Bipartisan Privacy Caucus. According to Molinari, Google “recognize[s] that new technology is going to bring up new types of questions, so [they] have been thinking carefully about how [they] design Glass from its inception.”

To address concerns about the picture and video capabilities, Molinari highlighted several features designed to “give users control” and “help other people understand what Glass users are doing.” For example, specific user commands are required to search the Internet or find directions, and the user must either press a button on the arm of the Glass or say “Take a photo” or “Record a video” in order to access those features.

Molinari’s letter plainly states that Google’s privacy policies will not change for Glass. Instead, Glass will be governed by the terms of Google’s current privacy policy. However, the company has created policies for developers making Glass apps, also known as “Glassware.” For example, developers will not be allowed to incorporate facial recognition into their Glassware, and they are prohibited from disabling or turning off the display when the camera is in use. Glassware developers must also agree to the terms of service for Glass’s application programming interfaces (APIs) and must maintain and follow their own privacy policies. We are carefully observing Google’s actions to ensure that they are in keeping with Google’s promises, and we encourage all Glassware developers to comply with Google’s policies as well.

posted in:
Privacy
Jun 12, 2013

Wyndham Case Challenges FTC’s Authority Over Cybersecurity

Over the past decade the Federal Trade Commission has brought cybersecurity enforcement actions against various private companies, imposing tens of millions of dollars in monetary penalties and requiring companies to maintain more stringent data-security practices. No company has ever challenged the FTC’s authority to regulate cybersecurity in this way in court – until now. On June 17, 2013, a federal court will finally get a chance to weigh in on whether the scope of the FTC’s regulatory jurisdiction is so broad as to include setting standards for cybersecurity.

In FTC v. Wyndham Worldwide Corporation, et al., the FTC launched a civil action against the parent company of the Wyndham hotels and three of its subsidiaries for data security failures that led to three major data breaches in less than two years. The Commission’s complaint charges that Wyndham’s security practices were unfair and deceptive in violation of the FTC Act.

Unlike many other data-security FTC enforcement actions, in which the defendant has chosen to cut its losses and settle out of court, Wyndham has decided to stand and fight with a motion to dismiss. Judge Esther Salas of the U.S. District Court for the District of New Jersey is expected to rule on Wyndham’s motion on June 17.

The FTC complaint alleges that Wyndham Hotels and Resorts (a Wyndham subsidiary and named defendant) had a privacy policy stating that they “safeguard customers’ personally identifiable information by using industry standard practices” and “make commercially reasonable efforts to make [their] collection of such information consistent with all applicable laws and regulations.”

The FTC argues that this policy covers the individual hotels and that it was deceptive because the defendants failed to implement “reasonable and appropriate” data-security measures. Wyndham’s motion to dismiss attacks the facts of the deception claim by quoting language from the Wyndham Hotels and Resorts privacy policy that expressly explains that the privacy policy does not apply to the individual hotels. Wyndham argues that, taken as a whole, Wyndham Hotels and Resorts’ privacy policy is not deceptive.

With respect to the FTC’s unfairness claim, Wyndham’s motion asserts that the FTC is attempting to circumvent the legislative process by acting as if “it has the statutory authority to do that which Congress has refused: establish data-security standards for the private sector and enforce those standards in federal court.”

According to Wyndham, “on multiple occasions in the 1990s and early 2000s the FTC publicly acknowledged that it lacked authority to prescribe substantive data-security standards under the [FTC Act]. For that very reason, the FTC has repeatedly asked Congress to enact legislation giving it such authority.” Further, Wyndham highlights the Senate’s failure to pass the Cybersecurity Act of 2012, which sought to address the need for specific data-security standards for the private sector, and President Obama’s February 2013 Executive Order on cybersecurity that was issued in response to the Congressional stalemate.

On its face, Wyndham’s motion to dismiss seems quite strong. However, the facts that the FTC is alleging do not cut in Wyndham’s favor. The Commission’s complaint alleges that Wyndham’s failure to “adequately limit access between and among the Wyndham-branded hotels’ property management systems, [Wyndham] Hotels and Resorts’ corporate network, and the Internet” allowed intruders to use weak access points (e.g., a single hotel’s local computer network) to hack into the entire Wyndham Hotels and Resorts’ corporate network. From there, the intruders were able to gain access to the payment management systems of scores of Wyndham-branded hotels.

According to the FTC, Wyndham failed to remedy known security vulnerabilities, employ reasonable measures to detect unauthorized access, and follow proper incident response procedures following the first breach in April 2008. Thus, the corporation remained vulnerable to attacks that took place the following year. All told, the intruders compromised over 600,000 consumer payment card accounts, exported hundreds of thousands of payment card account numbers to a domain registered in Russia, and used them to make over $10.6 million in fraudulent purchases.

Unfortunately – as Wyndham notes in its motion to dismiss – hacking has become an endemic problem. There has been no shortage of stories about major cyber-attacks on private companies and governmental entities alike: from Google and Microsoft to NASA and the FBI. And the FTC has not been shy about bringing enforcement actions against private companies with inadequate security measures.

If Wyndham prevails, the case could usher in a major reduction in FTC enforcement efforts. However, if the court sides with the FTC, the Commission will be further empowered to regulate data security practices. With such high stakes on both sides, any decision is likely to be appealed. In the meantime, companies in various industry sectors that maintain personal consumer information are awaiting next week’s decision.

posted in:
Cybersecurity
May 28, 2013

FTC Serves Notice on App Developers About Children’s Privacy

The Federal Trade Commission has made it quite clear that it is serious about advising mobile app developers that the rules of the road will be changing very soon. Since 2011, the Commission has been working to update the rules governing the collection of children’s personal information by mobile apps. The relevant law is the Children’s Online Privacy Protection Act (COPPA), and the rules are set to change in just over a month, on July 1.

As part of its effort to encourage compliance, the Commission recently issued more than 90 warning letters to app developers, both foreign and domestic, whose online services appear to collect data from children under the age of 13. The letters alert the recipients about the upcoming COPPA rule change and encourage them to review their apps, policies, and procedures for compliance. According to the letters, the Commission did not evaluate whether the recipients’ apps or company practices are in compliance. Therefore, we view this move as a public warning to all app developers that may be collecting personal information from children.

Until now, COPPA, which was originally enacted in 1998, defined “personal information” to include only the basics, such as a child’s name, contact information, and Social Security number. Over the past decade, it has been rendered antiquated by the development of mobile apps and other technological advances affecting data collection. Unfortunately but understandably, COPPA’s original incarnation failed to account for the proclivities of today’s children, who – reared in the age of smartphones, Facebook, and Google-everything – routinely use mobile apps to share their lives with their friends, their family, and the world.

The FTC has expressed major concerns that, unbeknownst to many users, mobile app developers also collect and disseminate their users’ persistent identifiers (things such as cookies, IP addresses, and mobile device IDs). This information, which can recognize users over time and across different websites and online services, is often used by developers and third parties to market products to children based on each child’s specific online behavior. Come July 1, this practice will be illegal.

Under the revised rule, the definition of “personal information” has been expanded to include persistent identifiers, photos and videos containing a child’s image, and recordings of a child’s voice. Additionally, developers of apps directed to children under 13 – or that knowingly collect personal information from children under 13 – will be required to post accurate privacy policies, provide notice, and obtain verifiable parental consent before collecting, using, or disclosing such information. However, there are some exceptions for developers that use the information only to support internal operations (e.g., analyzing the app’s functionality or authenticating app users).
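For developers mapping the revised rule onto their collection logic, the structure reduces to a simple gate: persistent identifiers may be collected to support internal operations, and otherwise only with verifiable parental consent. The sketch below is our own simplified illustration of that structure, not FTC-issued guidance or any particular app’s implementation, and the purpose labels are hypothetical.

```python
# Simplified illustration of the revised COPPA rule's structure for
# child-directed apps (not legal advice or FTC-issued code): persistent
# identifiers may be collected for internal operations; otherwise verifiable
# parental consent is required. Purpose names are hypothetical.
INTERNAL_OPERATIONS = {"authentication", "functionality_analysis", "crash_reporting"}

def may_collect_persistent_identifier(purpose: str, verified_parental_consent: bool) -> bool:
    """Gate collection of a device ID, cookie, or IP address for a child user."""
    if purpose in INTERNAL_OPERATIONS:
        return True  # internal-operations exception
    return verified_parental_consent  # e.g., behavioral advertising needs consent

# Usage
assert may_collect_persistent_identifier("authentication", verified_parental_consent=False)
assert not may_collect_persistent_identifier("behavioral_advertising", verified_parental_consent=False)
```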

Protecting children’s privacy continues to be one of the Commission’s major initiatives, and the FTC has levied some hefty penalties for COPPA violations over the past year. That said, the Commission has indicated that it may be more lenient in cases where a small business has violated the rule despite well-intentioned attempts to comply. As we mentioned back in February, developers should beware of increased data privacy enforcement on the state level, as well. We encourage all mobile app developers to be proactive and review/update their policies to ensure compliance and avoid costly penalties.

posted in:
Privacy
May 02, 2013

FTC: Data Brokers That Compile Tenant Data May Be Covered by FCRA

On April 3, 2013, the Federal Trade Commission issued a press release that marks yet another step in its continuing trend of actions involving data brokers and data providers. As we have noted in earlier blog posts, the agency is making a concerted effort on a number of fronts to enforce the laws that protect consumer data and privacy.

The FTC’s current action involves a letter that it sent to a number of data brokerage companies that provide tenants’ rental histories to landlords. The letter is simply a notification to the companies that they may be considered credit reporting agencies under the Fair Credit Reporting Act (FCRA) and that they thus may be required to ensure that their websites and practices comply with that law.

The FTC letter also listed some of the obligations of credit reporting agencies to take reasonable steps to ensure the fairness, accuracy, and confidentiality of their reports, such as (1) ensuring that landlords are actually using the reports for tenant screening purposes and not as a pretext, (2) ensuring the maximum possible accuracy of the information in the tenant reports, (3) if the company is a nationwide provider, providing consumers with a free copy of their report annually, and (4) ensuring that landlords who use the reports meet their notification obligations to consumers (e.g., letting consumers know about a denial based on a tenant report, the right to dispute information in the report, and the right to get a free copy of the report).

The FTC letter specifically noted that the agency has not evaluated whether the company receiving the letter is in compliance with the FCRA but that “we encourage you to review your websites and your policies and procedures for compliance.”

We have discussed FTC actions against data brokers before. In March, we discussed the FTC’s announcement of a settlement with Compete, Inc., a web analytics company. Compete sells reports on consumer browsing behavior to clients looking to drive more traffic to their websites and increase sales. Compete obtained the information by getting consumers to install the company’s web-tracking software on their computers. The FTC alleged that the company’s business practices were unfair and deceptive because the company did not sufficiently describe the types of information it was collecting from its users.

We are confident that the companies that received the letter regarding tenant information are reviewing their websites and policies, as encouraged by the FTC. However, what really intrigues us is the motivation behind the FTC’s decision to send the letters.

Of course, part of that motivation is to help ensure that the companies follow rules for privacy protection. Nonetheless, it is also interesting to note that there is a significant consequence under the FCRA – namely, individuals are permitted to seek punitive damages for willful violations of the FCRA. Thus, the letter arguably puts the companies on notice to become compliant immediately, since future violations may be considered willful breaches that warrant punitive damages.

posted in:
Privacy
Apr 08, 2013

A Cautionary Tale: EU Probing Google for Possible Privacy Violations

The increasing difficulties faced by internet providers and data gatherers in the international realm have yet again come to the fore. Privacy regulators in France, Germany, Spain, the Netherlands, the United Kingdom and Italy have banded together to investigate whether to fine Google for what they perceive to be violations of European Union privacy laws.

The background is that in March 2012, Google replaced its disparate privacy policies applicable to its various products (such as Gmail and YouTube) with a single policy that applied to all of its services.

However, as part of a report issued in October 2012, the EU’s Article 29 Data Protection Working Party declared that Google’s unified policy did not comply with EU data protection laws. The EU’s primary, but not only, quibble with Google’s new policy involved the sharing of personal data across multiple Google services and platforms. At that time, the president of the French regulatory body, the CNIL, indicated that litigation would be initiated if Google did not implement the Working Party’s recommendations within three to four months.

Google – whose privacy policies appear to comply with U.S. law – did not bow to the EU regulators’ demands. As a result, on April 2, 2013, the CNIL announced that it would lead efforts by the various EU states’ data protection authorities to coordinate “repressive action” against Google for not implementing the changes to its privacy policy demanded in the Working Party’s October report. Google has defended its privacy policy and contends that its new single policy complies with EU data protection laws.

As a result, Google now faces the time and costs of substantial regulatory oversight and investigation, as well as potential fines, from multiple national privacy watchdogs. In fairness to Google, the EU privacy regulators have tended to take a rather expansive view of what is and is not required by law. This is unfair to Google and to other companies that comply with what they believe to be the letter and spirit of the law, only to have regulators reinterpret the law and move the goal posts. But this is typical in the EU regulatory realm.

Google’s predicament sends a stern warning to all internet providers that gather personal data. Any provider’s natural inclination is to focus on complying with the privacy rules applicable in the country where the provider is located. But the internet is borderless, subjecting providers to multiple laws in multiple jurisdictions. This creates the need for each provider to carefully analyze its privacy policies to ensure, as best it can, that it complies with the rules applicable across the globe. EU regulators and others are no longer content to allow the United States to set the guidelines for privacy and other rights, creating new challenges for privacy compliance in the United States and abroad.

posted in:
Privacy
Mar 24, 2013

FTC Points to Key Issues in New World of Mobile Payments

Earlier this month, the Federal Trade Commission released a staff report, entitled “Paper, Plastic . . . or Mobile? An FTC Workshop on Mobile Payments,” outlining key issues facing consumers and companies as they adopt mobile payment services. The report is based on a workshop held by the FTC in 2012 to examine the mobile payment industry.

Consumer use of mobile payment services continues to grow quickly. Mobile payment systems have the potential to be beneficial for both companies and consumers. However, many issues regarding fraud, privacy and security arise, and the FTC is looking to the industry to take the lead on establishing sound policies.

The FTC encourages companies that use mobile payment systems to develop clear policies for resolving disputes over unauthorized or fraudulent charges. Consumers fund their mobile purchases from a variety of sources (e.g., credit cards, bank accounts, mobile phone bills), and under current regulations each funding method has a different process for consumers to dispute an unauthorized or fraudulent charge. The FTC wants a clearer, more streamlined process for consumers when a disputed charge arises. The FTC is planning to hold a separate roundtable on this issue in May.

The report highlights the problems associated with “cramming,” which involves placing unauthorized charges on a consumer’s phone bill. The FTC suggests that mobile carriers should perform some due diligence on companies from which they accept charges.

The report also discusses the idea of “privacy by design,” which involves building strong privacy policies and transparency for consumers into a company’s offerings from their inception. Consumers understand that they will need to provide some information to access a company’s services, but they may want to control how that information is stored and shared. The FTC and the industry realize that mobile payment systems can be an efficient, favored payment method. However, companies offering mobile payments need to be clear with consumers about how their data is being collected, maintained, and used. Privacy issues are of paramount concern with mobile payment systems because of the enormous amount of data available on smartphones.

The report also notes the potential privacy issues that can arise in the mobile payment process. Since mobile payment providers have access to both the financial information and the contact information of the payer, a lapse on their part could result in a serious privacy breach. The report suggests that companies consider privacy throughout the development process, be transparent about their data practices, and give consumers options for how their information is collected.

The report also encourages the industry to adopt measures to ensure that the entire mobile payment process is secure since financial information could potentially be disclosed. The FTC notes that there is technology available to make the protection of payment information more secure and suggests that financial information should be encrypted at all points in the transaction.
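As a rough illustration of encryption at all points, the sketch below sends an already-tokenized card reference over TLS with certificate verification, so the card number itself is neither stored nor transmitted in the clear. The endpoint, path, and field names are hypothetical; a real deployment would follow its payment processor’s documented API and applicable PCI DSS requirements.

```python
# Hypothetical sketch: transmit a tokenized card reference (never the raw card
# number) over TLS with certificate verification. The endpoint and field names
# are illustrative only, not any provider's actual API.
import http.client
import json
import ssl

def send_charge(card_token: str, amount_cents: int) -> int:
    """POST a charge request over an encrypted, certificate-verified connection."""
    context = ssl.create_default_context()  # enables certificate and hostname checks
    conn = http.client.HTTPSConnection("payments.example.com", context=context)
    body = json.dumps({"card_token": card_token, "amount_cents": amount_cents})
    conn.request("POST", "/v1/charges", body, {"Content-Type": "application/json"})
    status = conn.getresponse().status
    conn.close()
    return status
```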

Companies should take note of the FTC’s report and adjust their practices. The FTC has put companies on notice about its expectations for mobile payments, and it would not surprise us to see enforcement actions in this area in the future. Companies should, in particular, make clear their policies for explaining charges and how charges may be authorized. The more support a company has in showing that a charge was justified, the easier that charge will be to defend, and this kind of specificity may also dissuade authorities from bringing charges at all. For companies offering mobile payment services, opt-in screens that require a click or a password to approve a charge, together with a secure network, are best practices that may save an organization from being on the receiving end of an enforcement action.

Mar 18, 2013

Google Settles Another Privacy Allegation — This One Concerning Its ‘Street View’

Google recently agreed to a settlement after a three-year investigation conducted by 38 state attorneys general stemming from allegations that Google violated individuals’ privacy rights by collecting information from unsecured wireless networks while engaged in its Street View mapping project. Full text of the settlement is available here.

Google used special vehicles to create the pictures that are seen on Google Street View. Google tried to improve its location services by identifying wireless Internet signals that could provide reference points on the map. In the process, the vehicles collected network identification information, as well as data, from unsecured wireless networks.

Google has stated that the collection of any personal information from the wireless networks was unintentional and that the information was never used or looked at. The company has agreed to destroy the personal data that it collected. Google will also be required to pay a $7 million fine as part of the settlement.

As part of the settlement, Google also agreed to launch a new internal privacy education program. The settlement requires Google to hold an annual privacy week event for its employees and to make privacy education available for select employees. Additionally, it must provide refresher training for its lawyers who oversee new products.

The settlement also requires Google to educate the public on privacy. Google must create a YouTube video explaining how people can easily encrypt data on their wireless networks and run a daily online ad promoting the video for two years. It must also run educational ads in the biggest newspapers in the 38 participating states. Google will have to submit data to the state attorneys general to show that it is complying with the requirements of the settlement.

The Connecticut Attorney General’s office led an eight-state committee whose investigation of the data collection produced this settlement. Connecticut Attorney General George Jepsen said in a statement, “Consumers have a reasonable expectation of privacy. This agreement recognizes those rights and ensures that Google will not use similar tactics in the future to collect personal information without permission from unsuspecting consumers.”

This is another example of states taking a more aggressive approach to protecting consumer privacy rights when the federal government does not. The Federal Trade Commission investigated this activity by Google but closed its case without a fine. The Federal Communications Commission also investigated, and issued a $25,000 fine, but that fine was largely for Google allegedly hindering the investigation. Companies that do business on the Internet should be aware that states will continue to enforce privacy laws. Companies must make sure that they do not unintentionally collect unnecessary sensitive information in the course of their business activities.

posted in:
Privacy

About Ifrah Law

Crime in the Suites is authored by the Ifrah Law Firm, a Washington DC-based law firm specializing in the defense of government investigations and litigation. Our client base spans many regulated industries, particularly e-business, e-commerce, government contracts, gaming and healthcare.

Ifrah Law focuses on federal criminal defense, government contract defense and procurement, healthcare, and financial services litigation and fraud defense. Further, the firm's e-commerce attorneys and internet marketing attorneys are leaders in internet advertising, data privacy, online fraud and abuse law, and iGaming law.

The commentary and cases included in this blog are contributed by founding partner Jeff Ifrah, partners Michelle Cohen, David Deitch, and associates Rachel Hirsch, Jeff Hamlin, Steven Eichorn, Sarah Coffey, Nicole Kardell, Casselle Smith, and Griffin Finan. These posts are edited by Jeff Ifrah. We look forward to hearing your thoughts and comments!

