
> As our friend Cory Doctorow has been explaining for years, DRM for books is dangerous for readers, authors and publishers alike.

Richard Stallman explained it a decade earlier in a piece worth reviewing now and then -- https://www.gnu.org/philosophy/right-to-read.html.



Some of Stallman's predictions were spot on.

> In his software class, Dan had learned that each book had a copyright monitor that reported when and where it was read, and by whom, to Central Licensing. (They used this information to catch reading pirates, but also to sell personal interest profiles to retailers.)


> It was also possible to bypass the copyright monitors by installing a modified system kernel. Dan would eventually find out about the free kernels, even entire free operating systems, that had existed around the turn of the century. But not only were they illegal, like debuggers—you could not install one if you had one, without knowing your computer's root password. And neither the FBI nor Microsoft Support would tell you that.

Not quite there, but almost (wrt PCs).

https://en.wikipedia.org/wiki/Unified_Extensible_Firmware_In...

https://en.wikipedia.org/wiki/Trusted_Platform_Module#Critic...

http://rt.com/news/windows-8-nsa-germany-862/


Let's not forget the widening gap between "advanced user" and "developer", created by schemes like code signing, forced app-store distribution models, and the gradual removal of "only developers will need them" features from software. While there are arguably security benefits to such closed ecosystems, they also make it much harder for users to be in control of their computers - which includes modifying and writing new software for them.

Compared to a PC today, the original IBM PC/XT/AT was amazingly open. All versions of DOS came with a basic debugger, which could also be used to write short Asm programs. I remember these being popularly published in the computing magazines of the time, and there was a general attitude of openness around that.

Perhaps the average user today doesn't care, but what astounds me is how much freedom we've given up in pursuit of security and safety. We seem to have gone down a path (or the corporations have led us down one) in which we're led to believe that having responsibility and freedom is a bad thing, and that we'd be happier - blissfully ignorant - if we let these corporations (and governments) take over for us.


We've moved to a security model that Bruce Schneier calls Digital Feudalism. We give up rights so that large centralized organizations will take care of our security for us. In exchange we give them data that they are allowed to sell. This applies both to devices and to services (notably Web 2.0).

There are arguably benefits to this arrangement. Prostitutes have pimps, even though they cut into profits without taking much personal risk or providing much labor, because pimps do provide a safety mechanism. Likewise, vassals gave up large portions of what they produced for security from their lords and kings.

We've returned to this arrangement because:

1.) Paying is painful and copying software is easy. Instead of making money from you, Web 2.0 companies have monetized you as a target for advertisers. Web services are 'in' because they're the only kind of software that can't be pirated. Much of our software today displays a GUI on your screen but runs on other computers for this reason. The new Office products run in the cloud, MMOs offer the same deal, and games like Diablo III can only run with a constant connection to the internet, even though there's no technical reason that should be the case. Microsoft tried to make the Xbox One always-on and always-connected (and backpedaled). Other streaming game and content services try the same. Essentially, the internet has become glorified TV (minimally interactive), with more channels.

2.) The internet is the Wild Wild West. Security professionals have been yelling for ages that the rush to release and the need to compete, together with closed source development, was making the world an insecure place. Essentially every layer of our computing stack is insecure. Attribution of computer intrusions is ridiculously hard. And there's money (and geopolitical power) to be had. We designed things insecure from the start, and continue to do so. Individuals can't stay secure on their own. It's not possible. So up with the lords and vassals - we know it sort of works from history.

I do think the average user cares. I don't think the average user is informed.


> Web services are 'in' because that's the only way software can't be pirated.

That is cognitive dissonance. If your revenue is derived from advertising, then users installing ad-blocking software is revenue-equivalent to users pirating your software. The fact that it's legal for users to do when piracy is not provides no support for the argument that ad-supported services are on stronger financial footing.

The reason ad-supported services are winning is much simpler. Users prefer them to paying. So if one competitor is "free" (with ads) while another is charging money, the market picks the free one. But ad-supported services don't inherently require centralization or feudalism anyway.

> The internet is the Wild Wild West.

The internet has never been the wild west. Modern computing devices are dramatically more secure than physical things like your house or your car. The internet is so far from the wild west that we consider even the possibility of a security breach under rare circumstances to be a serious vulnerability and work quickly to close it. And the feudal lords are the ones making it worse -- is it even possible to patch a vulnerability on an un-rooted iPhone that is too old for Apple to patch officially?

Having said that, I don't disagree that the reasons you're listing are the ones used to justify feudalism, they just happen to be factually incorrect.


> Modern computing devices are dramatically more secure than physical things like your house or your car.

Er, no. You can't break into my house from another country. Moreover, my computer is also a physical thing in my house, so it cannot easily be more secure than the house itself.

The patching of possible security breaches has become a weekly ritual. And there are hundreds of thousands of unpatched compromised computers sloshing around in botnets.


> You can't break into my house from another country.

An attacker could certainly pay someone to break into your house from another country in much the same way as they pay their ISP to deliver malicious packets to your computer.

> Moreover, my computer is also a physical thing in my house, so it cannot easily be more secure than the house itself.

That isn't strictly true. If your device has full disk encryption using a strong password, physical access doesn't get you much in the way of accessing the data.

It's easy to confuse this with the explanation of why DRM can't work, but they aren't the same thing. To access data you need either the plaintext, or the ciphertext and the key. DRM fails because you need one or the other in order to watch the content, which means an attacker inherently has in his possession what is necessary to get the data. But a locked or turned off device with disk encryption only gives the attacker the ciphertext without the key, which is an entirely different situation.
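The ciphertext-versus-ciphertext-plus-key distinction can be sketched in a few lines. This is a toy stream cipher (SHA-256 in counter mode) for illustration only, not a real FDE scheme like AES-XTS, and the key, nonce, and data are made up:

```python
import hashlib
import itertools

def keystream(key: bytes, nonce: bytes):
    # Toy keystream: hash the key, nonce, and a block counter.
    # Illustration only -- real disk encryption uses AES-XTS or similar.
    for counter in itertools.count():
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        yield from block

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same operation encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key, nonce)))

key = hashlib.sha256(b"a strong passphrase").digest()
nonce = b"disk-sector-0"
plaintext = b"tax records, diary, private keys"
ciphertext = xor_cipher(key, nonce, plaintext)

# An attacker who steals the powered-off device holds only the ciphertext,
# which on its own reveals nothing about the plaintext.
assert ciphertext != plaintext

# With ciphertext AND key, the data comes right back -- which is why DRM
# (where the viewer must hold both) is a different problem from FDE.
assert xor_cipher(key, nonce, ciphertext) == plaintext
```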

> The patching of possible security breaches has become a weekly ritual.

That's kind of my point. Nobody recalls your car windows when they're discovered to be vulnerable to the "blunt force with heavy object" attack.

> And there are hundreds of thousands of unpatched compromised computers sloshing around in botnets.

Hundreds of thousands out of billions is what, 0.01%?


> If your device has full disk encryption using a strong password, physical access doesn't get you much in the way of accessing the data.

Well, even if assuming that all your software is bug free, someone with physical access can either replace the unencrypted boot image with a compromised one which gives him remote access or install a hardware keylogger and come back later for the disk and password.

In the hypothetical situation of a targeted attack and attacker with physical access, FDE only protects you if you never turn on your computer.


There are plenty of effective ways to detect physical intrusion into a premises.


I'd be shocked and amazed if someone broke into my locked computer. Someone could break into my house with something as simple as a rock, though.


If an adversary has physical access to a computer, they can get basically anything. In unencrypted scenarios they can pull the drives and mount them under their own OS. If the device is running, cold boot attacks can allow the encryption to be attacked.

Finally, there is the eventual cracking of many encryption algorithms via cryptanalysis and Moore's law.

[1] Answer about Moore's law's effect on bits of security: http://crypto.stackexchange.com/a/1828

[2] http://en.wikipedia.org/wiki/Cold_boot_attack


My computer is encrypted. It's far more complicated to break into my computer than to use a rock against a window.


Cold boot is hard to implement, and can be mitigated by keeping the 16 KB of key material in the L2/L3 cache or some other piece of memory that instantly clears on power off.

With FDE and memory encryption, how else can you get past this?


Thing is, you would know if someone broke into your house with a rock through the window. Breaking into your computer, or breaking the encryption of your data on some server, can be done without anybody noticing for years.

Securing your data is a completely different problem, and a much more difficult one, than securing your house.


Chances are your computer has already been compromised. Your house? Probably not.


If it was compromised already it's because it's safer for the hacker to do so than to break my window and go in my house. Safer does not equal easier.


The amount of knowledge, skill, and effort to gain access to your computer from another country vastly outweighs the amount of skill and effort required to break your window and carry it out the front door.


> > Web services are 'in' because that's the only way software can't be pirated.

> That is cognitive dissonance. If your revenue is derived from advertising, then users installing ad-blocking software is revenue-equivalent to users pirating your software. The fact that it's legal for users to do when piracy is not provides no support for the argument that ad-supported services are on stronger financial footing.

It is not difficult to foresee a world in which "forging the appearance of a website" is illegal and userscripts and userstyles are forbidden as well. How many sites try very hard to block right click to hide their code?

There are five actors that can modify the appearance of a website:

* the author,

* the server,

* the network,

* the browser,

* the user (instructing a browser's add on).

By definition the author can do whatever they want.

But what if the server started serving slightly different pages, maybe with advertisements? This is happening now with hosted blogs and publishing platforms; it is frowned upon but accepted.

If the network changes the content of a page, for example adding advertisements, the network will have to face a lot of bad reactions, so very few networks are doing it now. (see http://justinsomnia.org/2012/04/hotel-wifi-javascript-inject... )

The browser is in a similar situation. When Internet Explorer wanted to modify the received HTML to "enrich" it, there was a big backlash from authors: «Microsoft thinks they can improve my writing. This makes me want to get a gun and go to war.» «With smart tags, Microsoft is able to insert their ads right into competitors’ sites.» (http://alistapart.com/article/smarttags )

So the authors are against modifications to the way their sites appear. Their remaining problem is _the user_. Authors cannot force the user to see what they produced: users have the freedom to install an ad-blocker or strip the CSS away. But we have to realize that this may be only a temporary situation. With the app-ification of the web, userscripts and userstyles are gone. And the content producers are happy about this.


>>It is not difficult to foresee a world in which "forging the appearance of a website" is illegal and userscripts and userstyles are forbidden as well. How many sites try very hard to block right click to hide their code?

Not illegal, but made technically impossible. That will be the logical result of the EME standard combined with WebCrypto, where the website owner, not the user, controls the browser.

All of these users chanting about how great Netflix "HTML5" is do not understand the long-term game that is being played for control over the web browser.


Interesting perspectives, so here's another volley.

I think "users prefer ads to paying" is not a full explanation: I did include "paying is painful". If that's all there were - just upfront cost, how do we deal with the trend of connected software (Diablo III, etc) and devices whose primary developer feature is DRM?

The internet totally is the Wild Wild West. Well, it's an analogy, so no, it's not, literally. But the analogy is useful in its most spirited form. There is no truly enforceable law, and criminal and conspiratorial enterprises abound. It's questionable whether the feudal lords are making it better or worse. (One way they make it better is by having direct accountability.)


> I think "users prefer ads to paying" is not a full explanation: I did include "paying is painful". If that's all there were - just upfront cost, how do we deal with the trend of connected software (Diablo III, etc) and devices whose primary developer feature is DRM?

Connected software is a different business model. It has certain advantages for the developer (like collecting a monthly fee rather than a one time payment), but how is that supposed to be providing any security or other benefit to the user?

> There is no truly enforceable law, and criminal and conspiratorial enterprises abound.

There are lots of enforceable laws -- probably too many. And the threat is vastly overhyped. Actual criminals and criminal organizations are the likes of Ted Bundy and the Zetas Cartel. The internet version of that is supposed to be weev and Anonymous? They're not even on the same planet.

Moreover, what is the feudal lord supposed to do better than anyone else to prevent some jerk from cracking into a webserver and stealing private data? If anything, centralization makes it worse by creating juicier targets. If the bad guys compromise Apple or Google, you're roasted, toasted and burnt to a crisp.

> It's questionable whether the Feudal lords are making it better or worse.

I have a hard time thinking of any way they could legitimately make it better that wouldn't work just as well without a locked boot loader.

> (One way they make it better is by having direct accountability.)

The fact that they aren't accountable is half the problem. At best the user can throw away their device and buy one from a different vendor, but that's hardly much consolation when you can't get your money back. And the app developers have even less leverage. The only way to opt out is to abandon hundreds of millions of prospective customers.


If the Internet is the Wild Wild West, are we (the early adopters, the knowledgeable users) the Native Americans? New businesses are coming, destroying our way of life, ruining our lands, exterminating certain breeds, making our traditional ways forever extinct?

I guess it is an apt analogy.


Glad to hear someone air these thoughts out loud. I vastly preferred back when I could simply trade money for software. I don't like this new world in which I have to enter into a vaguely abusive intimate relationship with Google or whoever to pay for their product.


I find that my stance on this topic has changed as I have more money to pay for services. I used to vastly prefer free services. Now I would happily pay a few dollars. Unfortunately, a large portion of my privacy and anonymity has already been given up from past use of these free services, and I most likely will never be able to reclaim that. I can at any point choose to use a less invasive service, but the choice is not the same now that I view much of my anonymity already gone.


Piracy is a really huge part of it. Nothing is free: pay for it, or it finds someone else who will pay, with you as the product.

... And if neither occurs, you won't get it at all. I've become increasingly convinced that huge-scale piracy really did harm the quality of popular music. You can find some good obscure music today, but the average quality seems to have hugely declined in some objective way. Art is no more free than code, and good art really does cost more than bad art, for all kinds of reasons: fewer can make it, and quality takes time and sustained focus. Having some talentless tart squall into an autotuner over a cliched catchy melody is cheap. Finding an Elizabeth Fraser and paying her to practice for 40 hours a week under the supervision of a professional choral voice instructor, while you coach the band on composition in a recording studio, is not. Why invest in a product nobody will pay for?

Funny how when I bring up the issue in the context of software everyone agrees. Of course people have to get paid somehow and of course polished products cost money. But on every forum including this one, nobody gets how there could be any relationship whatsoever between the quality of art and its ability to be financed. I think this comes down to one of the greatest myths about art: that it comes from some automatic and magical place and people either have it or they don't, and that artists make art solely out of duty to the muse. Like anything else art is a skill and doing it well requires practice, research, focus, coaching, even peer review, and all of that takes time during which many people including the artist must be supported. There is a component of inspiration but raw inspiration without the rest of it results in rough draft quality work that is only of value to the artist themselves.

Piracy undermines the ability to finance art just like it does in software. What's one of the first questions a VC asks about? Defensibility. In other words how will you protect your ability to monetize the value you create. You think investors only ask that question in software?


Art is subjective, code is not. Good code is testable. Good art is not testable.

Top40 music is palatable to many and is the exact result of what you describe: practice, research, focus, coaching, and peer review on a massive financed scale that "startup" bands could never afford.

Yet despite that, the indie music scene has never been more vibrant and alive than it is today. Instead of a small number of bands/groups producing music that all indie fans like, there's a huge pool of artists each producing their unique sound based on their own personal values of how music should be produced (including things like practice, focus, peer review, and other things not mentioned like experience, personality, and culture). That, to me, is art -- not a rehearsed manufactured production but an embedded experience unique to the artist you are listening to. That's what you heard in popular music until the mid nineties. The lack of quality popular music is not due to lack of craftsmanship.


The financial costs a young artist or group incurs to bring one or a few albums into the world are slim, but the cost in time is great, especially if they're also trying to keep food in their bellies by touring. But here's the thing - who can afford to do that for their whole careers? I think part of the costs parent is describing have to do with making art as a viable lifetime career, meaning, providing for healthcare and retirement. Sure, it's easier than ever for college-age people to put out some records, but if anyone is to make art a viable profession, money has to come from somewhere, and enough of it to make it viable as a living.


That's exactly what I meant: a viable profession.

I also disagree with the OP about cost vs. quality. The key difference comes later when you're managing the artist. If the artist has talent then they also have a certain amount of leverage in the relationship. If you make them feel like a slave then they have an ultimate trump card: turning off the juice. (The "Atlas Shrugged" maneuver?) A talentless piece of meat plus auto tune gives you total leverage, which means risk mitigation and a more controllable product.

Right now the environment in the music industry is one of extreme risk aversion, which is something you see when the bottom falls out of an industry. You've arguably seen a much more risk-averse tech industry since the 2001 crash.

I think my points about piracy fit into a broader critique of the concept of "free" that I've been thinking about and that I've heard others talking about. Unlike previous critiques it's more of a liberal/progressive critique.


I don't know about the music, but Avatar, one of the most pirated movies of all times, was also one of the most financially successful. It seems like making quality movies still pays off.


Citing a single example isn't evidence. Also movies have been less affected than music, which got the brunt of it due to Napster's VC-funded normalization of bulk music piracy. Movies have their theater runs, while musicians can only get live revenue by touring (which is hard work).


Computing isn't a hobby anymore. Everything changes when big money is on the table. Back then the virus said "your PC is now stoned" and everyone just laughed. Now if your computer isn't totally locked down it's root level malware produced by professional career criminals that steals your financial data and conscripts your computer into a botnet to extort money from web sites.

I share these concerns but coming up with a way to be both open and secure is deeply hard. Companies like Apple have decided to just punt on the problem, especially in the mobile space. Not only does it save money on R&D while delivering a product that isn't instantly malware ridden, but it also gives them App Store revenue.

My point though is that it's not just them ramming these models down our throats. Other factors are in play helping to seal the deal.


"in pursuit of security and safety"

Pursuit is the critical word. Of course you don't get either; often enough, the opposite.


And of course it's illegal to install other operating systems on smartphones and other devices.

Citation required?


Most modern smartphones are bootloader locked. Circumventing those locks is arguably violating the anti-circumvention bits of the DMCA. Therefore, to install another OS, you have to break the law.

Worth mentioning is that it's illegal in the same way breaking the speed limit by 1MPH is illegal, i.e. technically against the law, but your chances for ending up in trouble for doing it are nil.


That's actually not worth mentioning.

Speeding by 1mph isn't a practical or useful law to enforce. Creating a new OS and breaking the boot loader would require a group of people working together in some organized form, which would make it practical to target, and it'd probably be financially threatening to a larger company, which would make it useful to enforce.

In 2014, companies are the political and cultural entities in our society, and a law like this kills off a class of entity.


It might be illegal in _your_ country. But AFAIK reverse engineering or installing custom software on a device I bought is not illegal in Germany. Probably in most countries.


It's not reverse engineering that is illegal but "circumventing a digital protection device" (modulo translation issues). What does Germany's implementation of the EUCD say?

The EUCD was the DMCA pushed through the EU by trade agreements. In some ways it's worse, since it lacks the safe harbor provision.

The end-user situation isn't as bad, though, because there are other laws that protect you. In Sweden, for example, there is a provision that specifically allows it for interoperability reasons.

Funny you mention Germany though. It is one of the few countries that tried to outlaw "hacking tools", broadly defined to include a lot of the reverse engineering tools you would use for the purpose you mentioned. I don't know what happened with that; perhaps the situation has improved?


>Funny you mention Germany though. It is one of the few countries that tried to outlaw "hacking tools", broadly defined to include a lot of the reverse engineering tools you would use for the purpose you mentioned. I don't know what happened with that; perhaps the situation has improved?

No.


As I understand it, EUCD is about copyright protection. Unlocking my bootloader is not a violation of this. So EUCD does not apply here.


Motorola tried to tell me my phone warranty was off because I requested the bootloader unlock code from their site. Well, they're wrong. They cannot do that. It's illegal. I have every right to unlock my phone. Europe really is different from the USA.


Something voiding the warranty and something being illegal are quite different matters, though. I'm pretty sure that disassembling your phone voids your warranty, but I would be surprised if it were illegal.


Disassembling your phone only voids the warranty on the overall device, not on the individual parts. Also changing the software does not void the warranty of the hardware.

Also if a contract contains a clause forbidding you to disassemble, decompile, or debug something, then the clause is automatically void.

Yay for EU rights.


Exactly. Doesn't matter what Motorola says. The Terms they made me "read" and "agree" upon are void because they go against the law. I'm free to get rid of the Motorola software on my phone and replace it without voiding the 24 months warranty on my phone.


Warranties can be tied to limitations and restrictions the manufacturers make up. (There probably are some limitations to that to do with contract law. It surely is quite complicated, but voiding your warranty by unlocking your device doesn’t seem unreasonable to me.)

Warranties are voluntary and distinct from seller liability for defects in the product. That liability is legally required in the EU. The details of the implementation are different in different countries.

In Germany it’s like this: The seller (and only the seller) is liable and you have to go to the seller to make a liability claim. So if you bought your phone from Amazon you have to write Amazon, not Motorola. If you bought it from a carrier you have to go to the carrier. If you didn’t buy the phone directly from the manufacturer that manufacturer doesn’t have to do anything when you come to them with a defect phone. It’s the seller’s problem. (This is mostly to prevent a runaround, where seller and manufacturer both tell you each other is responsible for fixing the problem. This creates one entity that is clearly and obviously responsible and has to handle the problem.)

It’s for defects present when the device was sold and that’s it. (However, subsequent defects of some component because the device was delivered with some faulty or not up to spec components count, too.)

In the first six months the seller has to prove the defect wasn’t present when the device was delivered to not be liable. You, the buyer, don’t have to prove anything. That burden of proof, however, reverses after six months and up to 24 months (then there is no more liability), making it very hard in practice for buyers to prove after those six months that the defect was there when the device was sold. (However, courts have relaxed the requirements for that. Buyers usually don’t need to get some expensive expert opinion or something like that. If you can make a very good and informed argument for your case you should usually be covered.)

The seller can repair or replace the device. Buyers don’t have a right to get their device replaced when it’s possible to repair it. If repairing and replacing fails a number of times (I think three) the buyer can demand their money back.

That’s the liability. A legal requirement and very complicated – and also wholly distinct from warranties.

By unlocking your phone you may have voided your warranty, but you certainly didn’t void the legal liability of the seller.


They're still called warranties in many EU countries, and simply saying it'll "void your warranty" is incorrect, and possibly illegal.

Oh, and in some countries the law has more teeth: here in Portugal the burden of proof never shifts during the 24 months :)


Yeah looks like that was the argument that held up until just a few years ago. I remember it being 'illegal' (like breaking the speed limit) at that time. As of 2010 it appears to have changed. Note that it IS illegal to modify a phone so that it connects to a carrier it wasn't 'intended' to connect to.


That is only for smartphones specifically though, as I understand it.


There was an exemption for that until about a year ago; it expired, and this year it was added back in.

http://www.theverge.com/2014/8/1/5959915/president-barack-ob...


I looked it up and am wrong as of 2010 when the Registrar of Copyright of the Library of Congress ruled that, in fact, it was legal to jailbreak iPhones and other smartphones. Apologies, I will remove the comment above.


That was a prediction, not a statement - it's from Stallman's 1997 'right to read' cited above.


Except, even if Stallman didn't realize this, the story is mostly about state violence ("you could go to prison for many years for letting someone else read your books", "free operating systems ... were they illegal") where DRM is used as a tool to control and monitor people.

DRM itself is not good or bad. As everything else, it can be abused, as Sony and Adobe showed us, or can be used for good (or neutral), for example, to allow indie shareware developers to let us try their software before purchasing a little registration code (I know, not free software, but let's not start this debate for now).

It's a logical error to say that DRM itself is evil when some of the instances of it are evil.

DRM is a "smart contract", a protocol for enforcing a contract without laws or violence. When you read that a person goes to prison for breaking DRM, realize that it has nothing to do with DRM, it's about state using violence to protect a failed smart contract, which is the opposite of the purpose of such smart contract.


>DRM itself is not good or bad. As everything else, it can be abused

We're going to have to disagree on this. DRM, any DRM, no matter how benign, places the computer in an adversarial role against its user and owner. There is no circumstance in which that is a healthy relationship, and there is no case where removing the DRM entirely wouldn't enhance the value of whatever it's attached to.

This is one of the things I fully agree with RMS on. DRMing something means it's broken by design.


"DRM is a "smart contract", a protocol for enforcing a contract without laws or violence."

No it isn't, because DRM fundamentally cannot enforce the contract it is supposed to; you can't allow people to view something while preventing them from copying it. DRM only "works" if circumventing DRM is made illegal; DRM is an attempt by copyright holders to introduce legal restrictions on fair-use copying and on copying devices with substantial non-infringing use. DRM is entirely legal and political, not technical.


That's a good argument, thank you! I'd argue that for DRM to work reasonably well (without coercion) it doesn't need to be perfect; it can work the same way password stretching works: you don't get absolute protection, but you influence the cost. For example, when applied to software, DRM can be stripped, but then you'll be receiving the executable from a possibly untrusted source. You are spot on that currently DRM is mostly legal and political, but I don't think that's an inherent property of it.

* * *

Unfortunately, seeing my comments downvoted makes me uncomfortable continuing the discussion here on HN. When applied to reasonable comments, downvoting provides instant feedback that you're going against the popular opinion, and I'd rather avoid such feedback, so I'll go think about it more and then write a blog post or something.


DRM doesn't work nearly as well as password stretching. I can sit down for a few hours with IDA and crack nearly any DRM scheme.

With password stretching, how long attacks will take is predictable, and they can actually be made difficult or require more hardware (at larger expense) rather than simply requiring a more skilled attacker.
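To illustrate the parent's point about predictable cost: with a standard stretching scheme like PBKDF2, the defender picks an iteration count, and an attacker's cost per guess scales linearly with it — a tunable knob, not a matter of reverse-engineering skill. A minimal Python sketch (password and salt values are made up for illustration):

```python
import hashlib
import time

def stretch(password: bytes, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256: each guess an attacker makes costs roughly
    # `iterations` HMAC evaluations, so doubling the count doubles
    # the attack cost -- predictably, unlike a DRM scheme.
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations)

if __name__ == "__main__":
    pw, salt = b"hunter2", b"per-user-salt"
    for n in (10_000, 100_000):
        t0 = time.perf_counter()
        key = stretch(pw, salt, n)
        elapsed = time.perf_counter() - t0
        print(f"{n:>7} iterations: ~{elapsed * 1000:.1f} ms per guess")
```

The contrast with DRM: here the work factor is a parameter you can raise at will, whereas cracking DRM is a one-time fixed cost for one skilled person, after which everyone benefits from the result.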


It's not a good argument. DRM is an executable contract. Contracts cannot enforce themselves. That is why we have police, courts, etc. The fact that it is an executable contract makes it more difficult to circumvent than a paper one, but circumvention is still circumvention.


> It's a logical error to say that DRM itself is evil when some of the instances of it are evil.

All of the instances of DRM are evil, because all instances of DRM'ed stuff prevent practical and unlimited sharing.

In the digital world, any limitations on copying bits are akin to virtual restraints and locking information away. They prevent you from doing anything other than what the gilded cage allows.


> In the digital world, any limitations on copying bits are akin to virtual restraints and locking information away.

Jennifer Lawrence might disagree with your laissez faire attitude.


I'd argue that Jennifer Lawrence's sensitive bits shouldn't have been given to someone else (Apple) in the first place.


I assume you store all your money under the mattress? We all store "sensitive bits" (information) with others.


Well, to get his money from under his mattress would require you to break into his home. That's different from him storing his money under your mattress.


Well, yes, that was my point, since chances are (s)he doesn't actually keep money under the mattress, but in a bank.


Right, I see that now. I misread you earlier.


Sure. But there's not much to complain about when those third parties f* up, if you used them voluntarily...


I think you've just invalidated contract law wholesale.


Is that even valid when the contract is nothing more than a ToS that everybody agrees on without reading and most probably not enforceable in court?


I side with icebraining here.

The statement "there's not much to complain about when those third parties f* up, if you used them voluntarily", taken in general, can refer to pretty much everything we encounter in everyday life. There are a lot of implicit contracts made, and breaking some of them could be recognized in court (there's the concept of acting in bad faith).

In this particular case, JenLaw et al. have every right to be mad at Apple because of the broken ToS/implicit contract that said "this is my data; it's only backed up and will not be shown to third parties". Whether or not they showed practical wisdom by using the service is another matter entirely.

That's basically the crux of disagreements around the "victim blaming" concept. People confuse two different things here - the morality of whether something should be done, and the probability it will happen in practice. If I get mugged under a bridge in the middle of the night, I'm not morally at fault for being mugged (it's something that shouldn't be done), but I also haven't shown practical wisdom by going alone at night under a bridge in a dangerous area (by doing so I increased the probability it would happen to me).


As far as I remember, though, the "breach" was not on Apple's part but on the part of the victims, who chose weak passwords; can we blame Apple for this? Except maybe for a lack of forceful education?

The original sentence becomes "Jennifer Lawrence shouldn't have stored sensitive information externally without using a minimum of good security measures" in this vision.


> As far as I remember, though, the "breach" was not on Apple's part but on the part of the victims, who chose weak passwords; can we blame Apple for this? Except maybe for a lack of forceful education?

In this case I guess we can blame Apple only for the "lack of forceful education"/crappy security ideas (security questions in the 21st century, really?).

There is one funny thing about the Fappening - there was this movie[0] released a few months ago that featured a couple making a sex tape that ends up accidentally distributed to their extended families and friends thanks to iPads and cloud backup. The best line from the trailer:

    - It went up! It went up to the cloud!
    - And you can't get it down from the cloud?
    - NOBODY UNDERSTANDS THE CLOUD! It's a fucking mystery!
Call it a prophecy.

[0] - http://www.imdb.com/title/tt1956620/


Hah. You say that now. And then the bank messes up, and your savings account is empty. You're not going to complain? I think not. I think you're going to be screaming at the top of your lungs, calling lawyers, and so on.


I use Bitcoin. Don't have a bank account.


Your argument that bits should be copyable without restrictions is independent from the fact that these bits were not on her local hard drive.


Indeed, if you start from the assumption that it's evil to limit the sharing of information/software, then DRM is evil, because that's the exact purpose of it. However, in that case, login systems, encryption, etc. are also evil, because they prevent the sharing of information. Speaking of which, why do you lock information away from Adobe? Also, pass me your HN password, please :-)

But even if DRM's purpose is evil, it still doesn't invalidate my argument: that it's a tool for the peaceful enforcement of contracts (something you might not like), the exact opposite of the state's violent enforcement of contracts. The latter is the problem in Stallman's story, not the former.


Hmm, I was talking about DRM in the context of published information.

Obviously, in the case of private and/or sensitive information, you don't release anything to the public, so I'm not sure those protections (login systems, etc.) still fit the definition of DRM.

https://en.wikipedia.org/wiki/Digital_rights_management:

    Digital Rights Management (DRM) is a class of technologies that are used by
    hardware manufacturers, publishers, copyright holders, and individuals with
    the intent to control the use of digital content and devices after sale.
With publications, you give other people access (a restricted one, if DRM is involved). That's not the same as not giving access to anyone but yourself.


The argument of copyright is that 'published' information is not public. It is still owned by its creator (or more often by one of the corporations that employ them), you just get a very limited license to do certain things with it, like reading it and maybe creating personal backups, but not other things, such as sharing it with others, either for free or as part of a commercial venture.

This distinction between owning and licensing information didn't use to be necessary in the publisher's business model for popular culture, because the cost (and, equally importantly, the profit margin) of distribution was significant: authors made and continue to make far less than their publishers, with rare exceptions.

The cost of distribution of digital information is so low that consumers will do it for free (BitTorrent). And since the power of publishers primarily derives from their ability to distribute copies, that is what they attempt to preserve, even though they do a lot of other things that continue to be valuable in today's digital world, like financing and advertising. This (not entirely irrational) attempt to preserve a dated business model is in turn perceived by consumers as a clampdown on their rights, leading to a backlash against the established publishing industry and enabling the (so far limited) success of new services like Spotify and Netflix, which don't have to take into account any preconceived notions of what their business is. People associate licensing an ebook with buying a treebook, but they tend to associate streaming with borrowing a book from the library.


>People associate licensing an ebook with buying a treebook //

People don't just associate it. Companies offer e-books for sale. Amazon says "Kindle Purchase" and gives a price for the e-book: that's a subtle sort of fraud if they really mean "Kindle license" and ought to offer a "licensing fee" rather than a price.

Companies want people to think they are purchasing stuff because otherwise people would be reluctant to "buy". Unfortunately the largest companies have been able to play this fraud long enough to establish the system; only now are people realising that what they thought they had bought doesn't technically belong to them and the rights they thought they had are not in place. Like I said, it's a subtle fraud.

What's more egregious is that the copyright deal has been corrupted. With DRM, companies are saying their work will never enter the public domain - that means they've failed to uphold their end of the copyright deal ... why then should the demos uphold its end? There is no compulsion to if the contract has already been broken.

There should be no protection for works which have been crippled so they cannot enter the public domain; the contract is gone. It would be good if the legal system could come in line with the reality of this situation.


Yes, locking private information away is not DRM. But if you separate "private" information from "published" information, then for DRM to remain [logically] evil, you need to restate your claim about why it is evil to prevent unlimited sharing, because the generic "virtual restraints" argument about bits no longer works.


So, some files on your system likely have read rights but no write rights were I to log in. Is that evil?



