Facebook came out swinging on Wednesday, revealing a bunch of changes that will be made to the social network in light of the Cambridge Analytica scandal. Continuing its apologetic tour, Facebook and Mark Zuckerberg made plenty of disturbing revelations about the way Facebook hasn’t been protecting your data.
Zuckerberg said that up to 87 million people may have been affected by the Cambridge Analytica data-gathering apps. He also said that all 2 billion users should assume that malicious individuals could have scraped their profile data if certain settings were enabled, revealing that sophisticated attacks had been detected on the network but that Facebook chose not to address them until now.
And, of course, Facebook has been reading your Messenger chats. This isn’t really news.
Facebook first hinted about the capability a few days ago when Zuckerberg talked to Vox — that’s the same interview during which the exec took several shots at Apple’s Tim Cook.
Zuckerberg said that Facebook’s automated systems can surface content that doesn’t abide by its rules, such as messages tied to the ethnic cleansing in Myanmar. “In that case, our systems detect what’s going on,” Zuckerberg said, explaining that Facebook detected people trying to send sensational messages through Messenger. “We stop those messages from going through.”
Facebook further explained to Bloomberg that Messenger conversations are private, but Facebook does scan them to prevent abuse.
“For example, on Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery, or when you send a link, we scan it for malware or viruses,” a Facebook Messenger spokeswoman said. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
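The mechanics behind this kind of photo matching aren’t public, but the standard approach is to compare a fingerprint (hash) of each uploaded file against a database of known-bad fingerprints, rather than having anyone look at the photo itself. Here is a minimal, purely illustrative sketch of that idea; the blocklist and function names are hypothetical, and real systems (such as Microsoft’s PhotoDNA) use perceptual hashes that survive resizing and re-encoding, whereas a plain SHA-256 only matches byte-identical files:

```python
import hashlib

# Hypothetical blocklist of hashes of known-bad files (illustrative only).
# This entry happens to be the SHA-256 digest of empty input, so the demo
# below has something to match against.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_flagged(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears in the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(is_flagged(b""))       # matches the illustrative blocklist entry
print(is_flagged(b"photo"))  # no match
```

The point of the design is that the service only ever sees a digest comparison, not the image content, which is how a company can claim it scans messages without “reading” them.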
Facebook says it doesn’t actually read the contents of those chats, and that it doesn’t use the data to target you with ads. On any given day, that should be enough from a tech company you trust. But how do we really know that’s the case?
One could argue that it’s great to see Facebook step up its game against malicious behavior, including hate, pornography and even attempted malware attacks. Of course, it is. Also, Facebook already tracks your online activity, whether you’re a user or not, which can reveal even more relevant information for ad targeting.
The problem, again, is that Facebook hasn’t been forthcoming about these features and many people have no idea that everything they may type inside Messenger is screened by Facebook bots, and flagged messages can even be read by moderators.
On the other hand, let’s not forget that Google, which is probably watching Facebook’s scandal unfold very closely, has been scanning your Gmail for years and serving you ads based on it. Google announced last summer that it would stop the practice for ads, but it will keep scanning email for malware attacks and to offer smart features, like Smart Reply.
Finally, you can enable full end-to-end encryption on Messenger via the Secret Conversations feature, but you must turn it on for every chat you want to keep private.