> There is security, and then there is freedom. You can have the most secure system in the world -- but if there are state-sponsored or company backdoors, it means nothing.
Okay, so you're saying: "If a backdoor is present, then your security prioritization doesn't matter; the result is bad." I understand, but:
1. If there is a backdoor in open source code that goes unnoticed (and that certainly happens) because of persistent bad practices in the open source community (e.g., a stubborn refusal to stop using C-like memory management semantics and primitives when dealing with untrusted inputs), then why don't those accidental backdoors invalidate the open source work?
2. Does "control" actually matter in the context of AOSP? Strictly speaking, you have essentially everything you need up utill you hit the hardware drivers. You can easily rewrite that to your hearts content.
3. Given Librem's recent move into commodity-based social products (and the poop-from-great-height attitude they initially adopted), are you genuinely sure that they're actually trustworthy actors? If they're coerced, how will you attest that they never injected a deeply subtle backdoor into millions of lines of code which you'd like to be unique and less scrutinized?
I can't really work out why you feel the way you do, so I ask these questions.
> persistent bad practices in the open source community (e.g., a stubborn refusal to stop using C-like memory management semantics and primitives when dealing with untrusted inputs)
This applies to the entire industry. It's not something specific to the open source community. It's also extreme to call the use of C "bad practice," as every language has its own strengths and weaknesses.
Not the entire industry, as many companies have thankfully moved on from plain old C, or at the very least reduced its use quite considerably.
BSD/Linux derived FOSS is still the C stronghold.
The Morris worm was in 1988; since then, C has collected enough CVEs due to memory corruption issues to consider its use bad practice.
That is something even Apple, Google, and Microsoft security reports now advise against, and Google is actively working to tame C's usage in the Linux kernel.
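To make the class of bug concrete, here is a minimal, hypothetical sketch of the pattern being criticised (the handler name and buffer size are invented for illustration, not taken from any specific CVE): an unchecked copy of untrusted input into a fixed-size stack buffer, the same family of flaw the Morris worm exploited.

```cpp
#include <cstdio>
#include <cstring>

// Hypothetical request handler: copies attacker-controlled input into a
// fixed-size stack buffer. strcpy performs no bounds checking, so any input
// longer than 63 characters writes past the end of `buffer` -- classic
// memory corruption of the kind behind a large share of C CVEs.
void handle_request(const char* untrusted_input) {
    char buffer[64];
    std::strcpy(buffer, untrusted_input);  // no length check on untrusted data
    std::printf("handling: %s\n", buffer);
}

int main() {
    handle_request("short and harmless");
    // handle_request() with a string of 64+ characters would corrupt the stack.
}
```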
The operating system is only a tiny fraction of the commercial code out there, most of which is written in (more) memory-safe languages like Java, C#, or C++. SAP's code base alone is 1 billion lines of mostly C++ and their own proprietary scripting language.
Not to the extent that it is tainted by C's copy-paste compatibility.
Still, it does provide a stronger type system, proper strings, vectors, reference parameters, and strongly typed enumerations, which prevent a large class of C security exploits.
C++ teams that care about security do use such features, plus the respective static analysers in their CI/CD pipelines to enforce them.
While it doesn't cover everything, it is much safer than plain C.
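As a rough sketch of what those features buy you (the names here are illustrative, not from any particular codebase), the same kind of handler written against std::string, std::vector, and a scoped enumeration has no fixed-size buffer to overrun:

```cpp
#include <iostream>
#include <string>
#include <vector>

// Strongly typed enumeration: values do not implicitly convert to int.
enum class RequestKind { Query, Update };

// std::string owns and resizes its own storage, so there is no fixed-size
// buffer to overflow; passing by const reference avoids raw pointer handling.
void handle_request(RequestKind kind, const std::string& untrusted_input) {
    std::vector<char> payload(untrusted_input.begin(), untrusted_input.end());
    // at() throws std::out_of_range instead of silently corrupting memory.
    const char first = payload.empty() ? '?' : payload.at(0);
    if (kind == RequestKind::Query) {
        std::cout << "query starting with '" << first << "'\n";
    }
}

int main() {
    handle_request(RequestKind::Query, "short and harmless");
    handle_request(RequestKind::Query, std::string(10000, 'A'));  // still safe
}
```

Of course, the C subset (raw arrays, strcpy, pointer arithmetic) still compiles in C++, which is the copy-paste-compatibility caveat raised above.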
Ideally, we will reach a state where both C and C++ get nuked, or where ISO C++ just drops its C copy-paste compatibility, at which point it would arguably be easier to switch to something else anyway.
However, that process will take decades and is hampered by our reliance on POSIX-based systems.
Desktops are dwarfed by mobile devices. AFAICT a Linux kernel variant is present on most of the world's smartphones (with most of the rest being iOS devices, which I know little about), though you've addressed that by saying Google is pushing to reduce the impact of C underlying their system.
I don't want to make a song and dance about C being awesome or anything - we've certainly got massive issues with allowing that extreme amount of flexibility without ensuring that the developer really, really means what they've just told the machine to do - but it's hardly a small enclave that's holding out; it's still huge.
And there are still companies developing in it. I've seen a sort-of-microservices-in-C-implemented-as-a-sort-of-supersized-cgi-bin approach relatively recently.
Windows Phone: JavaScript, .NET (VB and C#) and C++.
iOS: JavaScript, Objective-C, C++ and Swift; C only due to BSD stuff.
Android: Java, Kotlin, JavaScript, C++; C only due to the Linux kernel. Project Treble drivers use Java and C++. Its successor, Fuchsia, is written in a mixture of Rust, Dart, and C++.
ChromeOS: JavaScript, C++, Rust; C only due to the Linux kernel.
> Windows Phone: JavaScript, .NET (VB and C#) and C++.
An irrelevance given their complete lack of market presence.
The rest all have the significant underlying C components you've identified. All I'm saying is that it's hardly a 'niche holdout' when it appears to be at the heart of the vast majority of shipping devices.
Why do you assume that OSS has more bugs than proprietary software? I would probably argue the opposite.
With OSS you get more people working on a project who actually care. A proprietary business project prioritizes making money over actually creating a good product everyone loves.
You're right that this is not a perfect solution. All software has bugs and all software may have malicious back doors. I just find it much easier to trust development that happens in the open with community involvement than development that happens in secret, where I have absolutely no way to see what's going on.
If you had an inkling that someone was trying to poison you, would you rather eat the food you watched be prepared or the food that was prepared in secret? Both dishes might be poisoned, but it's reasonable to prefer the one you were able to examine.
> Why do you assume that OSS has more bugs than proprietary software? I would probably argue the opposite
I don't. But nor do I assume it has fewer. My point, as restated elsewhere, is that from a user's point of view, openness of source is more about protecting against negligence.
Who exactly is talking about anti-openness here? We're talking about which open source piece of code to reuse. Someone gave a bad argument against one company's offering.
Microsoft of the 90s, which no one emulates these days (and it's a wrongheaded comparison anyway), would have said that all the open options are bad to begin with.
If you meant to say "anti-free software" then maybe we could have a conversation, but that's hardly the problem Microsoft faced in the 90s and 2000s.
Seriously, what does your post mean? Could you maybe be specific? And while we're at it, what's your connection, if any, with the company that sells Purism phones?
“Open source is not safer because people won’t read the source”, “having control doesn’t matter”, and trying to raise doubts about the trustworthiness of the people involved... that’s the old Microsoft textbook approach.
At least MS wasn’t built on open software, unlike Google.
> And while we're at it, what's your connection, if any, with the company that sells Purism phones?
None at all. I only heard of this project a few days ago via a DDG search.
Believe it or not, not everyone is a corporate shill.
> Open source is not safer because people won’t read the source
That's not what I said. To sum it up: Open source is not really a security proposition. It eliminates problems related to negligence.
> having control doesn’t matter
In what concrete way does the Purism OS give you more control over your device than AOSP?
It really seems like you are confusing open source and free software for this entire conversation, as literally every line of code we are discussing is shared under a license that allows you to look at, modify, and use it as you see fit.
> None at all. I’ve just heard of this project a few days ago via a DDG search.
The depth of your consideration was already fairly easy to guess, but thanks for being honest.
> Believe it or not, not everyone is a corporate shill.
Bad software is bad whether it's open or not. But historically, closed software has more lock-in. If a particular open lib or component is bad, it can often be fixed by somebody who didn't create it. Or, for those who don't want to touch the scary hairball, it can often be replaced by a completely new hairball written from scratch by a completely different party. Even if there's nothing broken with the original, open software is friendlier to alternatives. It might take a bit of work, but you can replace one open part with another just because it's shinier or smaller or faster or not Oracle or whatever.
I don't trust all open source software, but I trust it by default more than I trust closed software. And I know that if something really bad gets exposed the odds of a solid fix are better in open source. I get to see the warts of OSS. There's public criticism over small details on a lot of important projects. That doesn't happen for closed stuff. Sure, a vendor may have four of the brightest devs in that field and they might hash it all out behind closed doors. The open alternative usually has another four of the top 12 minds in that field along with four pretty competent others and they have a better process for hashing it out.
Then there's that other guy who's not in the top 12 who goes it alone and comes up with something spectacular. So three of the four from the other open project jump on board because they can. And since this new project tries very hard to be backwards compatible, it just snaps in as an overnight replacement. That's part of the awesomeness of OSS.