A Good Idea That Hasn't Delivered
At first glance, the data privacy labels displayed on mobile app stores seem like a meaningful step forward for consumer protection. Modeled loosely on the concept of nutrition labels for food, they are designed to give users a clear picture of what personal data a given app collects, how that data is used, and with whom it might be shared. The idea is that, armed with this information, consumers can make genuinely informed decisions about whether to download an app — just as they might scrutinize a food label before making a purchase.
But according to leading privacy researchers and practitioners, the current implementation of these labels is deeply flawed, and the promise they represent remains largely unfulfilled.
Expert Skepticism at RSAC
Lorrie Cranor, director and Bosch distinguished professor at Carnegie Mellon University's CyLab Security & Privacy Institute, addressed the issue directly during her talk at the recent RSAC Conference in San Francisco. While she acknowledged the value of the concept, she was candid about its limitations.
"We're not kidding ourselves. Having these labels is not going to actually protect privacy," Cranor said. "But it's going to be a way for us all to get more information and hopefully lead to better privacy practices and help people protect privacy."
Kelly Peterson, chief privacy and compliance officer at AI startup Yobi — and formerly chief privacy officer at Grindr, with privacy leadership experience at Amazon — shares that skepticism. She argues that companies have historically prioritized regulatory compliance over genuinely informing consumers about data practices.
"I like the concept," Peterson told Dark Reading. "I like trying to make this really hard, technical stuff attainable for someone who's like, 'I don't know if I want to use this app or not,' but I don't think that they're solving a problem."
According to Peterson, when companies post a data privacy label, they are often simply publishing it for informational purposes and implicitly vouching for its accuracy, without necessarily doing the due diligence required to verify it. Crucially, they are not addressing the underlying data privacy problems the labels might highlight.
A Long Road to Adoption
Cranor's work on privacy labels stretches back more than a decade. She began collaborating with Carnegie Mellon students in 2010 to develop privacy labels for websites. Those labels performed well in testing but were never widely adopted. Her team also explored labeling for Internet of Things devices before turning their attention to mobile apps in 2013.
It was not until 2020 that Apple announced it would begin including privacy labels in its App Store, with Google later following with a similar initiative for the Play Store.
"When these came out, we were at first very excited that they were finally doing something," Cranor told RSAC attendees.
That initial enthusiasm quickly gave way to concern. Multiple reports emerged indicating that companies were not being truthful in their labeling. A subsequent study conducted by Cranor and her research team uncovered numerous inaccuracies — though it also found that these errors were more often the product of honest mistakes and developer misunderstandings than deliberate attempts to mislead consumers.
Apple and Google Aren't Even on the Same Page
Compounding the accuracy problem is a fundamental inconsistency between the two dominant platforms. Apple and Google use different methodologies to define what counts as data collection. Google defines data collection as any data transmitted from a user's device. Apple, by contrast, only considers data to be collected if it is both transmitted from the device and stored. This definitional gap means that consumers comparing labels across platforms are not actually comparing like with like.
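The consequence of that definitional gap can be made concrete. The following is a minimal illustrative sketch — not either platform's real API or label format, and the example data flows are hypothetical — modeling how the same app behavior can produce two different "collected" lists under the two definitions described above:

```python
# Hypothetical model of the two platforms' differing definitions of
# "data collection". Names and data flows here are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataFlow:
    data_type: str      # e.g., "location"
    transmitted: bool   # sent off the user's device
    stored: bool        # retained after transmission

def collected_google(flow: DataFlow) -> bool:
    # Google: any data transmitted from the device counts as collected.
    return flow.transmitted

def collected_apple(flow: DataFlow) -> bool:
    # Apple: data counts as collected only if transmitted AND stored.
    return flow.transmitted and flow.stored

# Hypothetical app behavior: search history is sent for processing
# but (in this example) not retained server-side.
flows = [
    DataFlow("location", transmitted=True, stored=True),
    DataFlow("search_history", transmitted=True, stored=False),
]

google_label = [f.data_type for f in flows if collected_google(f)]
apple_label = [f.data_type for f in flows if collected_apple(f)]

print("Google-style label:", google_label)  # ['location', 'search_history']
print("Apple-style label:", apple_label)    # ['location']
```

Under this toy model, a consumer comparing the same app across stores would see search history flagged in one label and absent from the other, even though the app's behavior is identical — which is exactly the like-with-like problem the differing methodologies create.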
What Would Make Labels Actually Useful
Peterson suggests that consumers are sometimes better served by visiting a company's online trust center or reading its privacy policy directly — even if the latter is often an overwhelming document filled with legal jargon. She argues that companies should be able to offer simplified versions of their privacy policies for everyday users, reserving the lengthy technical language for legal compliance purposes.
Cranor outlined several concrete changes she believes are necessary to make app privacy labels genuinely effective:
- Standardization: Aligning Apple and Google on a common methodology would reduce confusion for both developers and consumers.
- Greater prominence: Labels should be more visibly featured in app store listings rather than buried in secondary screens.
- Developer tools: Platforms should provide resources to help developers create accurate labels, reducing errors born of misunderstanding.
- Verification mechanisms: App stores should implement processes to audit and confirm the accuracy of submitted labels.
- AI-powered search: In an era of artificial intelligence, consumers could benefit from tools that allow them to search for apps that match their specific privacy preferences, removing the burden of reading labels manually.
Cranor was blunt in her overall assessment, describing the current labels as "not at all useful." What makes this especially problematic, she noted, is that the labels create a false sense of progress — making it appear that companies are actively protecting consumer privacy when the reality falls considerably short.
Labels Alone Won't Fix a Systemic Problem
Both Cranor and Peterson agree that privacy labels are an inherently limited tool. Just as nutrition labels on food packaging have not resolved the United States' obesity crisis, data privacy labels cannot by themselves transform how companies handle personal information. The root issue is a culture within the tech industry that has long treated privacy as a compliance checkbox rather than a genuine user right.
Until labeling standards are harmonized, enforcement mechanisms are introduced, and companies are held accountable for the accuracy of what they publish, consumers will remain largely in the dark about what happens to their data — regardless of what any label might claim.
Source: Dark Reading