Social media, online video companies conduct ‘massive surveillance’ of users, FTC investigation finds
Social media and online video companies are collecting vast amounts of your personal information on and off their websites or apps and sharing it with a host of third parties, according to a new report from the Federal Trade Commission (FTC) on nine technology companies.

The FTC report, published Thursday, looked at the data collection practices of Facebook, WhatsApp, YouTube, Discord, Reddit, Amazon, Snap, TikTok and Twitter/X between January 2019 and Dec. 31, 2020. Most of the companies’ business models revolved around tracking how people interact with their platforms, collecting their personal data and using it to determine what content and ads users see in their feeds, the report said.

The FTC’s findings corroborate years of reporting on the depth and breadth of these companies’ tracking practices, accusing the tech companies of engaging in “massive surveillance of users.” The agency recommends that Congress enact federal privacy regulations based on what it has documented. In particular, the agency urges lawmakers to recognize that the business models of many of these companies do little to encourage effective self-regulation or protection of user data.

“Recognizing this basic fact is important for both enforcers and policymakers, because any attempt to limit or regulate how these companies collect troves of people’s personal data will conflict with their primary business incentives,” FTC Chair Lina Khan said in a statement. “To create effective rules or remedies that limit this data collection, policymakers will need to ensure that breaking the law is not more lucrative than complying with it.”

The FTC also asks the companies named in the report to invest in “limiting data retention and sharing, limiting targeted advertising, and strengthening protections for teens.”

Notably, the report highlights how consumers have little control over how these companies use and share their personal data. Most companies collected or inferred demographic information about users, such as age, gender, and language. Some collected information about household income, education, and parental and marital status. But even when this type of personal information wasn’t explicitly collected, some companies were able to analyze users’ behavior on the platform to infer details of their personal lives without them knowing. For example, some companies’ interest categories included “baby, children, and pregnancy,” which would reveal parental status, or “newlyweds” and “divorce support,” which would reveal marital status. This information was then used by some companies to customize what content people saw to drive engagement on their platforms. In some cases, that demographic information was shared with third-party entities to help them deliver more relevant advertising.

Regardless of the product used, opting out of data collection wasn’t easy, the FTC said. Nearly all of the companies said they fed personal information into automated systems, usually to deliver content and ads. Yet almost none offered “a comprehensive ability to directly control or opt out of the use of your data by any algorithm, data analytics or AI,” the report said.

Several companies said it was impossible even to compile a complete list of whom they share user data with. When asked to name the advertisers, data brokers, and other entities with which they shared consumer data, none of the nine companies provided the FTC with a full inventory.

The FTC also found that, despite evidence that children and teens use many of these platforms, many of the companies claimed that because their platforms are not directed to children, they do not need different data-sharing practices for children under 13. According to the report, none of the companies treated data collected about and from 13- to 17-year-olds through their sites and apps any differently than data collected from adults, even though data about minors is more sensitive.

The FTC called the companies’ data minimization practices “woefully inadequate” and found that some companies failed to remove information when users requested it. “Even those companies that actually removed data only removed some of the data, but not all of it,” the report said.

“That’s the most basic requirement,” said Mario Trujillo, a staff attorney at the Electronic Frontier Foundation. “The fact that some people didn’t do that, even in light of state privacy laws that require it, shows that there needs to be stronger enforcement, particularly by consumers themselves.”

Some companies have disputed the report’s findings. In a statement, Discord said the FTC report was an important step but “threw very different models into one bucket.”

“Discord’s business model is very different: we’re a real-time communications platform with strong user privacy controls, not endless-scrolling feeds. At the time of our investigation, Discord did not have a formal digital advertising service,” Kate Sheerin, Discord’s head of public policy in the U.S. and Canada, said in a statement.

A Google spokesperson said the company has the strictest privacy policy in the industry. “We never sell people’s personal information, and we don’t use sensitive information to serve ads. We prohibit ad personalization for users under 18, and we don’t personalize ads for anyone who views ‘made for kids’ content on YouTube,” Google spokesman José Castañeda said.

The other companies have not issued an official comment or did not immediately respond to a request for comment.

However, if companies dispute the FTC’s findings, the burden is on them to prove otherwise, according to the Electronic Privacy Information Center (Epic), a Washington, D.C.-based organization that conducts research on privacy and free speech.

“I’ve worked in privacy compliance for companies, and let’s just say I absolutely don’t believe anything without documentation to support the claim,” said Epic global privacy counsel Calli Schroeder. “And I agree with the FTC’s conclusion that self-regulation has been a failure. Companies have repeatedly demonstrated that their priority is profits, and they will only take consumer protection and privacy issues seriously if failure to do so impacts those profits.”