
The same firearm can be used to assault an innocent person or to protect one (whether in that person's own hands or those of law enforcement or the army).

There are certain devices, like napalm bombs, whose sole and specific purpose is destruction. But the technologies used to make the bomb casings or the napalm itself are not specific to that purpose, and have all kinds of peaceful and constructive uses.

Likewise, there can be malicious, harm-seeking programs written in a particular general-purpose language, but that does not taint the language itself.



One of the few takeaways I remember from my engineering ethics class is one way to think about the ethical implications of a tool: if you have a situation, and you introduce a tool, how do the possible outcomes of the situation change?

For example, if you have two people arguing without weapons, the likely outcomes of the situation aren't strongly weighted towards one or both of the participants being maimed or killed; it's difficult, and takes real commitment, to cause horrific damage when you're just hitting each other.

If you introduce a (supposedly value-neutral) tool, like a gun, into the situation, the outcomes become much more strongly weighted towards someone being maimed or killed.

Even though it's always a human using the tool, the tool itself can be seen as having an ethical character.


I like that lens of analysis, thank you for sharing.



