Apple briefly pulled Telegram over child pornography distribution


When Apple temporarily pulled Telegram from the App Store over "inappropriate content," it left many wondering just what that content was. We now know: 9to5Mac has learned that the company removed the app after discovering that people had been distributing child pornography through it. Apple contacted both Telegram's team and the authorities (including the National Center for Missing and Exploited Children), both to address the specific violation and to ensure that there were "more controls" in place to prevent a repeat.

As a rule, internet services use a range of safeguards to prevent the spread of child porn, such as shared hash lists that let a file flagged on one service be blocked automatically on others. It's not certain what solutions Telegram implemented, but the relatively short turnaround (its software was back within hours) suggests it didn't require a fundamental change.
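For readers curious how a shared hash list works in practice, here's a minimal sketch in Python. It is purely illustrative: the list contents and function names are made up, and real systems (such as PhotoDNA) rely on perceptual hashes that survive resizing and re-encoding rather than the plain SHA-256 digest used below.

import hashlib

# Hypothetical shared block list of known-bad file hashes (placeholder value).
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_blocked(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears on the shared block list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A service would run this check before letting an upload propagate.
if is_blocked(b"example upload contents"):
    print("Upload rejected: matches shared hash list")
else:
    print("Upload allowed")

Because every participating service consults the same list, a file identified once doesn't have to be rediscovered independently by each platform.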

The nature of the discovery might provide a clue as to how the material was distributed. Telegram's bread and butter is end-to-end encrypted messaging, which should rule out a non-participant (including Apple itself) directly intercepting the messages. The 9to5Mac team suggests that the material may have been made public through a third-party plugin. Your privacy should remain intact as a result: Apple may have just been fortunate enough to spot the vile content and take action.

Via: The Verge

Source: 9to5Mac


