Four years after the Federal Trade Commission, using its authority under Section 6(b) of the FTC Act, issued Orders to nine social media and video streaming services, the Commission released a report detailing its findings. Below, we summarize the key findings and provide takeaways.
What is the FTC’s 6(b) Authority?
The FTC’s 6(b) authority grants the Commission the power to require “annual or special . . . reports or answers in writing to specific questions” about a company’s “organization, business, conduct, practices, management, and relation to other corporations, partnerships, and individuals.”[1] As part of the FTC’s investigatory powers, the 6(b) authority enables the Commission to conduct wide-ranging studies to gain a deeper understanding of industry practices rather than to serve a law enforcement purpose.
The FTC’s Report: Key Findings
The FTC divided its findings into four general categories: (1) data practices; (2) advertising; (3) algorithms, data analytics, and artificial intelligence; and (4) children and teens. The report acknowledged and qualified many of its findings, which did not necessarily apply uniformly to every recipient of the 6(b) Orders. The report also generalized its findings in part to preserve some degree of confidentiality over each recipient’s specific information and practices.
- Data Practices. The FTC report concluded that the companies failed to adequately police their data handling; that they often did not implement privacy protection measures; that their data minimization and retention practices were “woefully inadequate”; and that they did not consistently make consumer privacy a priority.
The FTC attributed these findings in part to privacy policy disclosures that it called “very hard to read,” “nearly impossible to understand,” “too vague,” and “subject to change.” Further, many companies allegedly lacked properly written and implemented retention or deletion policies. The Commission noted that while companies had restrictions governing data sharing with third parties, their reliance on template contractual language for sharing between corporate affiliates was potentially insufficient to protect consumer privacy.
- Advertising. The report found that the collection of users’ personal data may lead to significant privacy invasions and other risks through targeted advertising, often through the use of tracking pixels and inferences drawn from sensitive categories of personal information.
The report specifically took issue with consumers’ lack of awareness that their online behavior may be used for advertising purposes. Of the companies surveyed, those that engaged in advertising allegedly allowed advertisers to target users based on demographic information, location, device information, and user interests, a practice that the report states is contrary to the interests of a majority of consumers.
- Algorithms, Data Analytics, and Artificial Intelligence. The FTC observed that companies broadly used algorithms or data analytics to automate decisions and analyze large amounts of information. Consumers, the FTC contended, do not adequately understand these use cases and should have a meaningful, universal way to opt in to or out of such uses of their data.
The FTC also commented that there was no “uniform or standard approach” to monitoring and testing AI, which it viewed as misaligned with recent calls for transparency and testing standards.
- Children and Teens. The FTC also found that companies did not adequately protect children and teens, in part because they failed to detect when children used their platforms. The FTC reported that companies often treated teens thirteen and older like adult users, collecting their information as they would an adult’s. Although no federal law restricts this approach, the FTC views it as problematic because children “do not become fully formed adults the moment they turn thirteen, and the harms Congress enacted COPPA to prevent can affect teenagers as much as—or even more than—they affect children.” Given the FTC’s prior efforts to expand legislative protections to cover teens, the findings in this report are unsurprising.
Takeaways
While not tied to enforcement proceedings, this report on social media and video streaming services provides insight into the Commission’s stance on particular data practices. For instance, the FTC continues to reinforce the importance of long-standing concepts like notice, and it builds support for closing what the Commission views as gaps in current federal law, such as the Children’s Online Privacy Protection Act’s exclusion of teens. Among the broadly applicable takeaways are the following:
- Implement concrete policies that cover data minimization, retention, and deletion
- Include strong data protections in contracts for data transfers between corporate affiliates as well as with third parties
- Exercise caution around sensitive information, such as location and health data, when using advertising technology
- Build strong transparency, testing, and user control policies for AI and automated systems
- Allow parents and guardians the means to manage their child’s information
- Recognize when children are using a service, and restrict the collection and use of their data
[1] 15 U.S.C. § 46(b).