At the end of the day, it comes down to trusting WhatsApp. Even without a backdoor in their protocol, they can easily do all kinds of things.
For instance, they could instruct specific clients to encrypt and send each message twice: once for the recipient, and once for the WhatsApp server. As long as this was off for 99.9% of users, it's unlikely that security researchers would ever detect it.
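To make the scenario concrete, here's a minimal sketch of what such a client-side "shadow copy" could look like. Everything here is hypothetical: the flag name, the keys, and the toy XOR/SHA-256 cipher are all stand-ins (the real client uses the Signal protocol's Double Ratchet), purely to illustrate the message flow.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    # Toy keystream: NOT secure, a stand-in for real ratchet encryption.
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; applying it twice with the same key decrypts.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key)))

# Hypothetical server-pushed flag, enabled only for a tiny set of targets.
SHADOW_COPY_ENABLED = True
RECIPIENT_KEY = b"recipient-session-key"   # assumed per-session key
SERVER_KEY = b"server-escrow-key"          # assumed key only the server holds

def send_message(plaintext: bytes) -> list[tuple[str, bytes]]:
    """Return the (destination, ciphertext) pairs the client would transmit."""
    out = [("recipient", encrypt(RECIPIENT_KEY, plaintext))]
    if SHADOW_COPY_ENABLED:
        # A second ciphertext readable only by the server: the wire still
        # shows "encrypted" traffic, yet the content leaks.
        out.append(("server", encrypt(SERVER_KEY, plaintext)))
    return out

msgs = send_message(b"meet at noon")
```

Note that from the network's point of view both copies look like ordinary encrypted payloads, which is why this would be hard to spot without disassembling the client itself.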
And anyone with sufficient access could push an update from the app store to a selected target that bypasses the normal security protocols of any given messaging app. Who checks their app store downloads against source code?
We're mostly talking about government agencies here that would force Facebook to act that way. And even though they could force them to try to intercept a specific user's messages with an NSL (National Security Letter), they can't force them to introduce a vulnerability for all users.
Especially as NSLs are usually kept secret within the company, this would be too risky: developers would eventually find such a backdoor (especially since it generates a lot of server traffic).
Right, I definitely wouldn't use WhatsApp for anything truly sensitive. I wouldn't use smartphone software at all, personally. But this particular "backdoor" seems bogus.
It's shared physical space. I'm OK if they video me. I'm OK if they share that video with law enforcement if there is a reason of substance to do so and as long as they make it publicly known they asked for it within a reasonable timeframe, say 45 days. I'm NOT OK with broad sweeping requests and would only allow them if circumstances required it and the request for the data was disclosed within 90 days.
Does Walmart actively deceive me? Not that I'm aware, but I don't shop there.
Never heard of Glencore or Philip Morris.
As for Blackwater and Palantir, my impression from the media is they do exactly what they say. It's not like Palantir lies about harvesting data to give to government. I trust that they actually do do that.
None of those companies have posted fake news and altered the news algo with the express intent of manipulating users' mental states for reasons that basically boil down to "for the lols" and "let's see if we can make money from this".
The amount of passion in your comment and this gap in knowledge don't go well together. I encourage you to at least read about Philip Morris (or watch John Oliver's episode about them).
Cool, no worries. Philip Morris operates worldwide though, and I'll have to hear some awesome arguments to be convinced that facebook is more evil/less trustable than them.
I mean, they're probably disassembling the app, so they'd definitely notice _that_, but there are some truly subtle problems that pop up in security, so your general point about trust seems reasonable enough (certainly for any closed-source remote-updating system).
The joke is that if this specific "backdoor" were used widely, it would generate a lot of noise (e.g. random key changes, keys that don't match) and cause massively bad PR for WhatsApp. No way are they doing that.