COPPA Compliance

Interesting research: "'Won't Somebody Think of the Children?' Examining COPPA Compliance at Scale":

Abstract: We present a scalable dynamic analysis framework that allows for the automatic evaluation of the privacy behaviors of Android apps. We use our system to analyze mobile apps' compliance with the Children's Online Privacy Protection Act (COPPA), one of the few stringent privacy laws in the U.S. Based on our automated analysis of 5,855 of the most popular free children's apps, we found that a majority are potentially in violation of COPPA, mainly due to their use of third-party SDKs. While many of these SDKs offer configuration options to respect COPPA by disabling tracking and behavioral advertising, our data suggest that a majority of apps either do not make use of these options or incorrectly propagate them across mediation SDKs. Worse, we observed that 19% of children's apps collect identifiers or other personally identifiable information (PII) via SDKs whose terms of service outright prohibit their use in child-directed apps. Finally, we show that efforts by Google to limit tracking through the use of a resettable advertising ID have had little success: of the 3,454 apps that share the resettable ID with advertisers, 66% transmit other, non-resettable, persistent identifiers as well, negating any intended privacy-preserving properties of the advertising ID.
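The paper's last finding is easy to state as a predicate: sharing the resettable advertising ID is only privacy-preserving if no non-resettable persistent identifier travels with it. A minimal sketch of that check, with illustrative identifier names (these are not the authors' actual schema or framework):

```python
# Hypothetical check inspired by the paper's finding that 66% of apps
# sharing the resettable advertising ID also transmit non-resettable
# persistent identifiers, negating the ad ID's privacy properties.
# The identifier names below are illustrative assumptions.

RESETTABLE = {"advertising_id"}
PERSISTENT = {"android_id", "imei", "wifi_mac", "serial_number"}

def negates_ad_id_privacy(transmitted_ids):
    """Return True if a transmission bundles the resettable ad ID
    with at least one non-resettable persistent identifier."""
    ids = set(transmitted_ids)
    return bool(ids & RESETTABLE) and bool(ids & PERSISTENT)

# An advertiser receiving both can re-link the user after an ad ID reset:
print(negates_ad_id_privacy({"advertising_id", "android_id"}))  # True
print(negates_ad_id_privacy({"advertising_id"}))                # False
```

The point of the pairing rule is that any one persistent identifier acts as a join key: resetting the advertising ID changes nothing if the recipient can correlate the old and new IDs through, say, the IMEI.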

Posted on April 13, 2018 at 6:43 AM • 11 Comments

Comments

Dave • April 13, 2018 9:19 AM

Finally, a reasonable "Think of the children" initiative. I guess. It might extend to everyone else!
Thanks, Bruce.

Vasu Ganti • April 13, 2018 10:06 AM

And Congress thinks it can rein in FB and other online media apps and services from commercialization (and profiteering) off of our PII? What?

Humdee • April 13, 2018 10:16 AM

There is another aspect of "protecting children" where app makers fail, and that is in policing what their users do online. Many apps targeted at children feature some type of social or chat feature that involves the sharing of text and even images. What goes on in these exchanges is eye-opening, to say the least, and in my observation developers engage in little to no policing of content. Either they try some automated solution, which people work around, or they use some human intervention, which tends to be very slow to respond because app developers have more profitable things to be concerned with.

Dave • April 13, 2018 12:13 PM

Expecting app makers to police app users is rather like expecting hammer makers to police hammer users. AI is just not capable as yet. But the restraint of indiscriminate surveillance is clearly achievable by stopping its use, and it would be easier to do that for everybody than just for children. True?

Komal Bansal • April 13, 2018 9:19 PM

Hi Bruce, During my master's, I did research on the effectiveness of COPPA and Safe Harbors among parents of children under 13. I would be happy to share my research with you if you are interested.

Alejandro • April 15, 2018 7:14 AM

@Dave et al re: "AI is just not capable as yet..."

I know it's off topic, but I have to wonder if AI is real or simply another high tech buzzword.

Seems to me AI is nothing more than creating a really, really big rule set for an app, which is NOT artificial in the least but entirely man-made, with all its positives...and negatives. Then "AI" simply does the math and delivers the result a lot faster than (most) humans.

As regards protecting the children, then, the issue becomes who is building the rule set and whether THEY can be trusted. What are the creators' biases? I would guess some are not good.

Ditto for Autonomous Vehicle (AV) rule sets. It's assumed AI can or will make vehicles nearly accident- and fail-proof. The recent death in Tempe certainly erases any credibility of that assumption.

Anyway, is AI real or just another very human and flawed app with a lot of code?

Alejandro • April 15, 2018 7:17 AM

Re: apps and policing

Right, we have learned time and again police cannot police themselves. It's practically a rule. Ditto for app makers.

Ard • April 16, 2018 12:14 AM

Thank you, Bruce, for sharing this topic.

Our children in this context are, as I see it, two things: the most exploitable targets, and the most mobilizing victims (prey). They agree to terms of service that not even educated adults can understand, with results that can affect their lives and future careers more pervasively than for almost any population in human history.

This is both a profound risk and a subject that can rally profound support if we position the threat appropriately. This is an area where "for the children" can actually be appropriate.

Dave • April 16, 2018 9:38 AM

@Alejandro:
I was being cynical.
I do not believe the state of 'ai' is anywhere near adequate.
The non-cynical point I make is about the massive flow of user data to dubious entities. Throttling that would be desirable for all those who sit between chair and keyboard.

oliver • April 16, 2018 10:38 AM

This is horseshit!
"Won't somebody think of the children" is Helen-Lovejoy-esque bullshit!!
Nobody needs this, certainly not the children.

They access everything, anyway!

Cheers, Oliver

vas pup • April 19, 2018 8:44 AM

See research related to the subject:
Digital addiction increases loneliness, anxiety and depression:
https://www.sciencedaily.com/releases/2018/04/180411161316.htm
"The behavioral addiction of smartphone use begins forming neurological connections in the brain in ways similar to how opioid addiction is experienced by people taking Oxycontin for pain relief -- gradually," Peper explained.
On top of that, addiction to social media technology may actually have a negative effect on social connection. In a survey of 135 San Francisco State students, Peper and Harvey found that students who used their phones the most reported higher levels of feeling isolated, lonely, depressed and anxious. They believe the loneliness is partly a consequence of replacing face-to-face interaction with a form of communication where body language and other signals cannot be interpreted. They also found that those same students almost constantly multitasked while studying, watching other media, eating or attending class. This constant activity allows little time for bodies and minds to relax and regenerate, says Peper, and also results in "semi-tasking," where people do two or more tasks at the same time -- but half as well as they would have if focused on one task at a time.
[!!!!]Peper and Harvey note that digital addiction is not our fault but a result of the tech industry's desire to increase corporate profits.[!!!] "More eyeballs, more clicks, more money," said Peper. Push notifications, vibrations and other alerts on our phones and computers make us feel compelled to look at them by triggering the same neural pathways in our brains that once alerted us to imminent danger, such as an attack by a tiger or other large predator. "But now we are hijacked by those same mechanisms that once protected us and allowed us to survive -- for the most trivial pieces of information," he said.
But just as we can train ourselves to eat less sugar, for example, we can take charge and train ourselves to be less addicted to our phones and computers. The first step is recognizing that tech companies are manipulating our innate biological responses to danger. Peper suggests turning off push notifications, only responding to email and social media at specific times and scheduling periods with no interruptions to focus on important tasks.

