> Isn’t it common sense to not buy a toaster expecting it to be a server even though they both have circuit boards and technically can both compute?
The key here is control, not computational power. An ideal law, in my opinion, would prohibit building and selling any device that can run code in a way that gives the manufacturer more control over it than the legal owner has after the sale is complete. What makes this idea appealing is that it never restricts how locked-down a device can be; it only prohibits building it such that the OEM/maker retains more control over it than the end user/new owner does.
As an example, say you make a "smart toaster" with Wi-Fi and all that "good stuff" in it. If you just burn the firmware into the silicon and that program has no way of updating itself, then you're good to go, because both the company and the end user are stuck with the same level of control. (In this context, "control" means "the ability to make the computing parts run the code you wish them to run.")
If you put the firmware in a writable EEPROM, and the only check on update firmware is a checksum, you're also golden, because then both the new owner and you (the OEM) can exercise the same level of control over it.
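To make the checksum case concrete, here's a minimal sketch (the SHA-256 checksum and the image layout are my assumptions, purely for illustration): no secret is involved, so anyone holding the device can produce a valid image.

```python
import hashlib

def make_image(firmware: bytes) -> bytes:
    """Build a flashable image: the payload followed by its SHA-256 checksum."""
    return firmware + hashlib.sha256(firmware).digest()

def bootloader_accepts(image: bytes) -> bool:
    """All the bootloader checks: does the payload match the trailing checksum?"""
    payload, checksum = image[:-32], image[-32:]
    return hashlib.sha256(payload).digest() == checksum

# Because there is no secret, the new owner can regenerate a valid
# checksum for modified firmware exactly as easily as the OEM can.
assert bootloader_accepts(make_image(b"owner-built firmware"))
```

The checksum still does its honest job of catching corrupted flashes; it just doesn't (and can't) distinguish OEM firmware from owner firmware.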
If, however, you decided to include signature checking using a public key burned into the silicon, then and only then would you be violating this hypothetical law, because that creates a situation in which you, the OEM, can exercise more control than the device's legitimate owner after purchase.
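The asymmetry in the signature case is easier to see in code. Here's a toy sketch using textbook RSA with tiny, insecure numbers (purely illustrative; real bootloaders use proper schemes like RSA-PSS or Ed25519): the public exponent stands in for the key burned into the silicon, and the private exponent never leaves the OEM's factory.

```python
import hashlib

# Hypothetical OEM keypair. In a real device, n and e are burned into
# the silicon; d stays locked up at the factory. (Tiny insecure primes,
# for illustration only.)
p, q = 61, 53
n = p * q                            # public modulus (in the chip)
e = 17                               # public exponent (in the chip)
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (OEM only)

def oem_sign(firmware: bytes) -> int:
    """Only the OEM can do this: it requires the private exponent d."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(digest, d, n)

def chip_accepts(firmware: bytes, signature: int) -> bool:
    """The bootloader's check uses only the burned-in public key."""
    digest = int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n
    return pow(signature, e, n) == digest
```

The owner can run `chip_accepts` all day, but without `d` they cannot produce a signature the chip will accept, so the OEM permanently holds a capability the owner doesn't: exactly the imbalance this hypothetical law would prohibit.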
So, to summarize, from the OEM's point of view under this law, less control is good, equal control is good, more control is bad.
I think this is what should be proposed as a new bill in the U.S. Congress, although I have to admit the Open App Markets Act serves a great purpose right now for some specific devices.