Applications profiling primary school children | Information Age
Children's data is being collected without their consent or knowledge. Photo: Shutterstock
Unscrupulous mobile app developers are collecting personal data from Australian children under the age of five, a privacy watchdog group has warned, after a nine-month analysis of 186 popular gaming apps found 59% of them had “problematic data collection behaviour”.
As part of a campaign called Apps Can Track, the technical analysis – commissioned by Children and Media Australia (CMA) with the assistance of Dr Serge Egelman, Research Director of the Usable Security and Privacy Group at the International Computer Science Institute (ICSI), which is affiliated with the University of California, Berkeley – used the ICSI-backed AppCensus tool to examine the data collection practices of 208 Android apps.
Of these, 22 were specifically designed to facilitate school-home or parent-child communication, and the remaining 186 were popular games – 54 of which were classified as engaging in “risky” privacy practices.
This means that despite being targeted at very young children – think Dr Panda’s Pool – the apps collect unique Android ID (AID) and Android Advertising Identifier (AAID) codes and send them to as many as six advertising-related companies.
Another 40 apps, including Icecream Cone Cupcake Baking Maker, were flagged as requiring caution because they passed the AAID to at least one advertising-related company.
And seven apps – including Star Wars Pinball 7 – were classified as “highly risky” because they transmit personal data to advertising-related companies “in an insecure manner”.
AID and AAID codes allow advertisers to identify individual mobile handsets and track user behaviour across apps – behaviour that has been strictly regulated for children under 13 since the US adopted the Children’s Online Privacy Protection Act (COPPA) in 1998.
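To see why a single device-wide identifier is so powerful, consider a minimal, hypothetical sketch (not drawn from the CMA report – the app names, event names, and AAID value below are invented for illustration) of how an ad network receiving telemetry from two unrelated apps could join it into one behavioural profile per device:

```python
from collections import defaultdict

# Hypothetical event logs an ad SDK might receive from two unrelated apps.
# Because the AAID is the same across every app on a device, the two logs
# can be trivially joined on it.
events_app_a = [
    {"aaid": "38400000-8cf0-11bd-b23e-10b96e40000d",
     "app": "kids_game", "event": "level_complete"},
]
events_app_b = [
    {"aaid": "38400000-8cf0-11bd-b23e-10b96e40000d",
     "app": "cupcake_baker", "event": "ad_click"},
]

def build_profiles(*logs):
    """Group events from every app by advertising ID, producing one
    cross-app behavioural profile per device."""
    profiles = defaultdict(list)
    for log in logs:
        for event in log:
            profiles[event["aaid"]].append((event["app"], event["event"]))
    return dict(profiles)

profiles = build_profiles(events_app_a, events_app_b)
# A single AAID now links activity across both apps on the same handset.
print(profiles["38400000-8cf0-11bd-b23e-10b96e40000d"])
```

Resetting or zeroing the advertising ID breaks this join – which is exactly why its transmission from children’s apps is the focus of the analysis.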
The lack of legislation to protect Australian children prompted CMA, which has separately reviewed the content of more than 1,500 films and 990 apps since 2002, to expand its remit to include privacy protection.
Project results feed into CMA’s growing database of mobile app privacy reviews, which the organization has been building since June 2021 and now includes 208 app reviews.
CMA’s goal is “to help parents who may not be aware of the risks associated with the apps their children are playing, nor may be able to detect or prevent them,” the organisation said, warning that common privacy practices create “an extensive digital footprint that can be used by predators and marketers, and can compromise prospects later in life, such as jobs, college places, health insurance, etc.”
“Big Tech’s business model is to collect user data to sell them things, and that won’t be easy to change.”
The clock is TikTok-ing
Apple and Google’s detailed policies set out clear expectations about what data apps will and won’t collect, but developers have repeatedly been caught ignoring these restrictions and collecting far more user data than they are supposed to – or need to.
Earlier this year, for example, Google banned more than a dozen apps after AppCensus analysis identified secret data-gathering practices.
In March, the United States Federal Trade Commission fined weight loss company WW International (formerly Weight Watchers) and its subsidiary Kurbo, Inc. $2.2 million for unlawfully collecting data on children as young as eight without parental consent.
Pervasive remote learning in the pandemic era has upped the ante, with Human Rights Watch (HRW) reporting last year that 89% of 163 EdTech products analysed “directly endangered or violated children’s privacy and other children’s rights, for purposes unrelated to their education”.
The products “monitored or had the ability to monitor children, in most cases covertly and without the consent of the children or their parents”, HRW found – with most online learning platforms using tracking technologies that “followed children outside of their virtual classrooms and across the internet, over time”.
Almost all of the apps shared children’s data with third parties, HRW found, warning that “in doing so, they appear to have enabled sophisticated algorithms from AdTech companies to stitch together and analyse this data to guess a child’s personal characteristics and interests, and to predict what a child might do next and how they might be influenced”.
The yawning gap between the theoretical protection of children and the actual practices of app developers has sounded alarms at the highest levels.
Earlier this month, US Federal Communications Commission (FCC) Commissioner Brendan Carr called on Apple and Google to remove the ubiquitous TikTok app from their app stores after investigations suggested that it was a “sophisticated surveillance tool that collects large amounts of personal and sensitive data”.
For now, the CMA is focused on advocating for the “urgent need for effective regulatory protections”, pushing developers to adopt “child-centric safety by design”, and empowering parents – including by securing additional funding to make AppCensus results more accessible to non-technical parents.