Apple has revoked Facebook's ability to distribute and run internally deployed apps on iOS, as punishment for flouting strict privacy and data rules designed to protect consumers.
The abuse relates to a market research project the social media giant has been running known as "Facebook Research."
The scheme, while openly requiring that participants hand over all manner of personal data -- including browser activity, private messages, chat sessions, and both photos and videos sent to contacts -- came under fire this week not only because of the information gathered, but also because of the target group and how the app was developed.
Facebook Research is aimed at users aged 13 to 35 and promises participants up to $20 a month. The app does require parental consent where minors are concerned, but these checks were lackluster, at best requiring only a tick-box, which could result in teenagers signing up without fully understanding what they were handing over.
"Since this research is aimed at helping Facebook understand how people use their mobile devices, we've provided extensive information about the type of data we collect and how they can participate," Facebook said. "We don't share this information with others and people can stop participating at any time."
When examined further, however, another potential transgression came to light.
The Facebook Research application, once downloaded and executed, reportedly required that users install a root developer certificate.
If accepted, a root certificate can provide close to limitless access to a mobile device. Most mobile apps on the market today ask for permission to access specific features, such as the camera, contact list, or GPS data, but a root certificate grants access far beyond what is generally deemed acceptable.
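For contrast, the standard iOS permission model requires an app to declare a usage string for every protected resource it wants, and the user approves each one individually at runtime. A minimal sketch of an app's Info.plist fragment illustrates this (the key names are Apple's real usage-description keys; the strings are invented for illustration):

```xml
<!-- Excerpt from an app's Info.plist: each protected resource needs
     its own usage-description string, which iOS shows to the user
     when the app first requests access to that feature. -->
<key>NSCameraUsageDescription</key>
<string>Used to take profile photos.</string>
<key>NSContactsUsageDescription</key>
<string>Used to find friends already on the service.</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>Used to tag posts with your location.</string>
```

A root certificate or custom profile sits beneath this per-feature prompt model entirely, which is why installing one is treated so differently from granting an ordinary app permission.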
Apple said at the time of the original report that the issue was being investigated, and now, it appears the scrutiny has resulted in Facebook's Enterprise Developer Program privileges being revoked.
The Apple Enterprise Developer Program lets companies and developers side-step the public Apple App Store in order to offer and test apps internally and privately. Root certificates are acceptable, and often required, in the development process -- but they should not end up in apps offered to the public.
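The distinction shows up in the provisioning profile embedded in an enterprise-signed app. A simplified, hypothetical excerpt of such a profile's property list (the keys are real; the values here are invented for illustration) might read:

```xml
<!-- Simplified excerpt of an enterprise provisioning profile payload.
     ProvisionsAllDevices is what lets an in-house app install on any
     device without App Store review -- and it is tied to the enterprise
     certificate that Apple can revoke. -->
<key>Name</key>
<string>In-House Distribution</string>
<key>TeamIdentifier</key>
<array><string>EXAMPLE123</string></array>
<key>ProvisionsAllDevices</key>
<true/>
<key>ExpirationDate</key>
<date>2020-01-31T00:00:00Z</date>
```

When Apple revokes the underlying certificate, the signature on every app distributed under that profile becomes invalid, which is why affected apps stop launching across the whole organization at once.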
Speaking to sister site CNET, Apple said that "Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple."
Given its use of a developer certificate and the vast amount of device data it can access, the market research app would not have been permitted as a consumer offering had it been submitted through Apple's app standards and review process.
Apple said the enterprise certificate was revoked in order to "protect its users and their data."
The loss of the certificate may end up causing chaos internally for the social media giant, and it may also be one of the few real-world consequences the company has suffered for its privacy practices, despite the constant stream of data scandals linked to Facebook in recent years.
Unreleased employee iOS apps, from beta versions of Facebook, Instagram, and Messenger to lunch menu and transport apps, may all have stopped working, according to The Verge.
Facebook confirmed to the publication that internal apps have been affected by Apple's decision. Some apps reportedly no longer work or fail to launch at all.
The data scandals plaguing Facebook are potentially indicative of a company playing fast and loose with user privacy. However, the company needs to understand that its practices do not just impact its own reputation; abuse of the app ecosystem can also damage the reputation of other firms -- such as Apple.
It might be ironic considering the recent exposure of a critical vulnerability in Apple's own software that placed user privacy at risk, but the lesson appears to have hit home: abuse of developer platforms in the iOS environment will not be tolerated.