The company was handed a staggering $5.7 million fine after the FTC filed a complaint alleging that the video-sharing app violated the Children’s Online Privacy Protection Act (COPPA).
The company’s mistake: the app, TikTok, did not require parental consent before collecting personal information from users under the age of 13. As with many apps of this type, it collected vast amounts of information, including usernames, email addresses, first and last names, phone numbers, profile pictures, user-entered biographical information, location data, and more.
In addition to the obvious COPPA violations, the app’s development team came under fire when it was discovered that much of each user’s account information remained visible to the general public, even if the user opted to make their profile private.
Worst of all, the FTC filing noted that adults had made numerous attempts to contact children via the app. It also stated that until the company released an update in 2016, a feature allowed any user to view all other signed-in users within a fifty-mile radius of their location.
The general state of app security and permissions is quite poor, but even by the relatively low standards of today’s market, the TikTok app set new lows on several fronts. The hefty fine levied by the FTC was not only wholly justified but will also, it is hoped, serve as a warning shot across the bow of app developers to clean up their collective acts, especially when marketing apps to children.
John Fokker, the head of Cyber Investigations at McAfee, applauded the ruling but also cautioned:
“…the responsibility also lies with parents to ensure their children are only signing up for services they’re old enough and wise enough to use.”
Wise words indeed, and kudos to the FTC.