Fahad H

Facebook bans an app once used by 4M people, suspends 200 more

Facebook has banned another app from its platform since announcing in March that it had banned Cambridge Analytica for misusing user data obtained through the ThisIsYourDigitalLife app.

The myPersonality app banned by Facebook was active prior to 2012, according to the company, and was used by approximately 4 million people. This is the first app Facebook has banned since it started its app investigation following the Cambridge Analytica crisis.

Facebook says it will notify users who gave their information to the myPersonality app, but it will not notify those users’ Facebook friends because there is no evidence the app accessed users’ friends lists (unlike the app used by Cambridge Analytica). Facebook noted that if that changes and it discovers the app was able to access data beyond the people who used it, it will notify users’ Facebook friends.

Facebook’s ongoing app investigation

In addition to banning the myPersonality app from the platform, Facebook has suspended more apps as part of its investigations. In May, it announced it had suspended 200 apps for the misuse of user data after auditing more than 1,000. Now, more than 400 apps have been suspended.

From Ime Archibong, Facebook’s VP of product partnerships:

Since launching our investigation in March, we have investigated thousands of apps. And we have suspended more than 400 due to concerns around the developers who built them or how the information people chose to share with the app may have been used — which we are now investigating in much greater depth.

Since news broke earlier this year that Cambridge Analytica had used an app to exploit user data, Facebook has been under the gun to clean up its app platform. Initially, the company paused all app reviews, then released a new app review process. It has also limited the amount of data available to app developers and has given users tools to easily remove apps in bulk from their profiles.

Industry experts weigh in on Facebook, user privacy & GDPR

While Facebook’s latest moves around apps and the handling of user data were spurred by the Cambridge Analytica crisis, the EU’s GDPR, which took effect in May, has also shaped how Facebook manages user data. The Media Trust CEO Chris Olsen says Facebook’s ongoing investigation into how third parties use its platform shows the company is being responsive to consumer outrage over its inability to create a safe environment for users.

“Facebook’s decision to suspend 400+ apps from its platform sends a strong message that app developers and their third parties must meet new security and privacy thresholds and will be subject to stringent audits,” says Olsen. “As data becomes increasingly regulated, companies will need to put together a robust digital vendor risk management program that will enable them to pay close attention to their direct and indirect digital third parties’ activities, ensure third parties align with policies, and terminate activities that violate policies.”

Eyal Katz, a senior marketing manager at Namogoo, agrees with Olsen and says that all digital platforms should pay attention to the public scrutiny Facebook has faced over its handling of user data.

“Now, with the implementation of GDPR and increased focus on the protection of personal information, these companies must have a complete understanding of which third-party services are operating on their platforms and more importantly, what those services are doing with user data,” says Katz.

Facebook safety measures beyond apps

Facebook CEO Mark Zuckerberg has admitted several times, including in his appearances before Congress and the European Parliament, that his company did not take a broad enough view of Russian interference on the platform during the 2016 US elections.

Now, with the November 2018 US midterm elections looming and the GDPR in effect, Facebook has had to significantly increase its user safety and privacy efforts. In May, the company launched a number of related initiatives, including its first-ever transparency report and new political and issue-based ad policies. Facebook also said it planned to grow its safety and security teams to 20,000 employees this year, 15,000 of whom have already been hired.

The company has tried to be more transparent as well, sharing how it reviews content and its process for removing posts and accounts from the platform. After taking down 32 Pages for coordinated inauthentic behavior in July, Facebook announced this week it had removed another 652 Pages, groups and accounts originating in Iran.

