It’s been an astonishing few days for Facebook: the company has had both an app removed and its enterprise certificate revoked, with big consequences.
What happened?

Apple issues enterprise certificates to organizations so they can create internal apps. Those apps don’t end up on the App Store, because the terms of service don’t allow it. Anything storefront-bound must go through Apple’s mandatory app review before being put up for sale.
What went wrong?

Facebook put together a “Facebook Research” market research app using the internal process. However, it then went on to distribute the app externally to non-Facebook employees. And by “non-Facebook employees” we mean “people between the ages of 13 and 35.” In return for access to large swathes of user data, the participants received monthly $20 gift cards.
Problem solved?

Not exactly. Apple has, in fact, revoked Facebook’s certificate, essentially breaking all of its internal apps and causing major disruptions for its 33,000 or so employees in the process. As per the Apple statement:
We designed our Enterprise Developer Program solely for the internal distribution of apps within an organization. Facebook has been using their membership to distribute a data-collecting app to consumers...a clear breach of their agreement.
Whoops

Yes, whoops. Now the race is on to get things back up and running over at Facebook HQ. Things may be a little tense behind the scenes due to, uh, something similar earlier this year involving a VPN-themed app collecting data it shouldn’t have been collecting. That one didn’t use the developer certificate, but it took some 33 million downloads before Apple noticed and decided to pull the plug.
Could things get any worse for Facebook?

Cue Senator Ed Markey, with a statement on this particular subject:
“It is inherently manipulative to offer teens money in exchange for their personal information when younger users don’t have a clear understanding of how much data they’re handing over and how sensitive it is,” said Senator Markey. “I strongly urge Facebook to immediately cease its recruitment of teens for its Research Program and explicitly prohibit minors from participating. Congress also needs to pass legislation that updates children’s online privacy rules for the 21st century. I will be reintroducing my ‘Do Not Track Kids Act’ to update the Children’s Online Privacy Protection Act by instituting key privacy safeguards for teens.

But my concerns also extend to adult users. I am alarmed by reports that Facebook is not providing participants with complete information about the extent of the information that the company can access through this program. Consumers deserve simple and clear explanations of what data is being collected and how it is being used.”

Well, that definitely sounds like a slide towards “worse” instead of “better.”
A one-two punch?

Facebook has also drawn heavy criticism this past week for the wonderfully-named “friendly fraud”: the practice of kids making dubious purchases, followed by parents requesting chargebacks. It happens, sure, but perhaps not quite like this. From the linked Register article:
Facebook, according to the full lawsuit, was encouraging game devs to build Facebook-hosted games that allowed children to input parents’ credit card details, save those details, and then bill over and over without further authorisation.

While large amounts of money were being spent, some refunds proved to be problematic. Employees were querying why most apps with child-related issues were “defaulting to the highest-cost setting in the purchase flows.” You’d better believe there may be further issues worth addressing.