
> Back in the day hackernews had some fire and resistance.

Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built into clickbait headlines.

> Too many tech workers decided to rollover for the government and that's why we are in this mess now.

I take it you've never worked at a company where law enforcement came knocking for data?

The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.


> I take it you've never worked at a company where law enforcement came knocking for data?

The solution to that is to not have the data in the first place. You can't avoid warrants for data you've collected, so the next best thing is to never collect it.


"But I forgot my password! You need to fix this!"

The technology exists to trivially encrypt your data if you want to. That's not a product most people want, because the vast majority of people (1) will forget their password and don't want to lose their data, and (2) aren't particularly worried about the feds barging in and taking their laptop during a criminal investigation.
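
For what it's worth, "trivially" is about right. Here's a minimal sketch in Python with the cryptography package (the parameters are illustrative, not a vetted design): derive a key from a password, encrypt with it, and note that nothing but the password can ever get the data back, which is exactly problem (1).

  import base64, os
  from cryptography.fernet import Fernet
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

  password = b"correct horse battery staple"
  salt = os.urandom(16)  # not secret, but must be stored with the ciphertext

  # Stretch the password into a 32-byte key (Fernet wants it base64-encoded).
  kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
  key = base64.urlsafe_b64encode(kdf.derive(password))

  token = Fernet(key).encrypt(b"my secret data")

  # Decrypting requires re-deriving the exact same key; a KDF instance is
  # single-use, so build a fresh one from the same password and salt.
  kdf2 = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
  key2 = base64.urlsafe_b64encode(kdf2.derive(password))
  assert Fernet(key2).decrypt(token) == b"my secret data"

Forget the password and that last line is unreachable; there is no recovery path, which is both the point and the problem.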

That's not what the idealists want, but that's the way the market works. When the state has a warrant, and you've got a backdoor, you're going to need to give the state the keys to the backdoor.


Apple approaches it differently with iCloud: Advanced Data Protection gives you a clear option to not hand those keys over.

That shows your idea of how the market works is clearly not representative of the actual market.


You realize the famous case of Apple pushing back against the govt ended because their encryption was breakable by a third party, right?

There are some errors in what you wrote, and even setting those aside, it is not clear to me what the supposed ‘realization’ would be.

1. The famous 2016 San Bernardino case predates iCloud's Advanced Data Protection for backups. It was never about handing over encryption keys; it was about signing a ‘bad’ iOS update.

2. Details are limited, but it involved a third-party exploit to gain access to the device, not a break of the encryption itself. Those are different things; both need to be addressed for security, but separately.

Evidently, after this case ended, Apple continued its efforts. It rolled out end-to-end encryption that protects backups even from Apple, and a requirement for successful user authentication before installing iOS updates (which also protects against a coerced Apple or stolen signing keys).

There is a market here.


Yes, just hand over the encrypted data that you have no way of retrieving the keys for. "Have fun, officer."

Until the NSA knocks on your door and says, "encrypt it like this."

"Good" companies in the old days would ensure they don't have your data, so they don't have to give it to the police.

Plenty of companies would do that if they could. The problem is it has become illegal for them to do that now. KYC/AML laws form the financial arm of warrantless global mass surveillance.

KYC/AML is luckily still confined to the financial sector. There's no law requiring operating system vendors to do KYC/AML.

There is no law yet.

Where I live, the government passed a law similar to the UK's online identification law not too long ago. It creates obligations for operating system vendors to provide secure identity verification mechanisms. They can't just ask the user if they're over 18 and believe the answer.

The goal is of course to censor social media platforms by "regulating" them under the guise of protecting children. In practice the law is aimed at, and will probably impact, the mobile platforms, but if interpreted literally it essentially makes free computers illegal. The implication is that only corporation-owned computers will be allowed to participate in computer networks, because only they are "secure enough". People with their own Linux systems need not apply, because if you own your machine you can easily bypass these idiotic verifications.


Which law is that?

Online Safety Act in the UK.

In Brazil, where I live, it's law 15.211/2025. It requires the tech industry to verify everyone's identity in order to proactively ban children from harmful activities. It explicitly mentions "terminal operating systems" when defining which software the law is supposed to regulate.


OpenAI does KYC. I refuse to deal with that.

If you design it so you don't have access to the data, what can they do? I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
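
There is, and it's the standard envelope-encryption pattern, nothing exotic. This is not how BitLocker is actually implemented; it's just a minimal Python sketch of the idea: a random data key encrypts the data, the data key is wrapped with a key derived from the user's passphrase, and only the wrapped copy ever leaves the machine.

  import base64, os
  from cryptography.fernet import Fernet
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

  def derive_wrapping_key(passphrase: bytes, salt: bytes) -> bytes:
      kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
      return base64.urlsafe_b64encode(kdf.derive(passphrase))

  # Client side: a random data key encrypts the actual contents...
  data_key = Fernet.generate_key()
  ciphertext = Fernet(data_key).encrypt(b"drive contents")

  # ...and only a *wrapped* copy of that key is ever uploaded.
  salt = os.urandom(16)
  wrapped_key = Fernet(derive_wrapping_key(b"user passphrase", salt)).encrypt(data_key)

  # The vendor stores (ciphertext, wrapped_key, salt). None of the three,
  # alone or together, recovers the data key without the passphrase.
  unwrapped = Fernet(derive_wrapping_key(b"user passphrase", salt)).decrypt(wrapped_key)
  assert Fernet(unwrapped).decrypt(ciphertext) == b"drive contents"

The catch is the thread above: a vendor doing this can still offer recovery if you remember the passphrase, but it can't help you (or comply with a warrant for the plaintext) if you forget it.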

If you design it so you don't have access to the data, how do you make money?

Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data; they want to sell it and use it for advertising and maintaining good relationships with governments around the world.


> If you design it so you don't have access to the data, how do you make money?

The same way companies used to make money before they started bulk-harvesting data and forcing ads into products that we're _already_ _paying_ _for_?

I wish people would have integrity instead of squeezing out every little bit of profit from us they can.


People arguably cannot have integrity unless everyone they compete with also has integrity. The answer is legislation. We have no reason to allow our government to use “private” companies to do what it legally cannot do itself and then turn over the results to government agencies. Especially when the incompetence is willful.

The same can be said of using “allies” to mutually snoop on citizens and then turn over the data.


I think you’re conflating lots of different types of data into one giant “data.”

Microsoft does not sell, or use for advertising, the data on your BitLocker-encrypted laptop.

They do use the following for advertising:

Name / contact data
Demographic data
Subscription data
Interactions

This seems like what a conspiracy theorist would imagine a giant evil corporation does.

https://www.microsoft.com/en-us/privacy/usstateprivacynotice


What are you talking about?

> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.

FTA (3rd paragraph): don't upload the keys to MSFT by default.

>If you design it so you don't have access to the data, what can they do?

You don't have access to your own data? If not, they can compel you to testify about who or what the next step to accessing the data is, and then they chase that.


That's not the point. Microsoft shouldn't be silently taking your encryption key in the first place. The law doesn't compel them to do that.

It's not silent. It tells you when you set up BitLocker and it also allows you to recover the drive.

It doesn't sound like it tells you now that it's the default, but I'll see what it says next time. If they make the key-sharing clear and easy to disable, then it's fine.
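
In the meantime, you can see which key protectors already exist on a volume. A quick sketch, Python shelling out to the built-in manage-bde tool (needs an elevated prompt; matching on "Numerical Password" is an assumption, since the exact wording can vary by Windows version and locale):

  import subprocess

  # "manage-bde -protectors -get C:" lists every key protector on the
  # volume, including any numerical recovery password that could have
  # been escrowed to a Microsoft account.
  out = subprocess.run(
      ["manage-bde", "-protectors", "-get", "C:"],
      capture_output=True, text=True, check=True,
  )
  print(out.stdout)

  if "Numerical Password" in out.stdout:
      print("A recovery password exists; check whether a copy was uploaded.")

If one was uploaded, escrowed copies show up (and can be deleted) under the devices section of your Microsoft account.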


