Ever hear of the 3 Laws, dummy?
Jun. 13th, 2005 12:50 am
WATCH OUT FOR EVIL ROBOTS! THEY ARE COMING!
Robowatch.org is a "human-rights advocacy group" intended to keep an eye on the developing sciences/technologies of robotics, machine intelligence, and so forth that could eventually pose a sustained threat against "Natural Man"--in other words, 100% organic dumbasses terrified at the thought of evil cyborgs deleting them as being irrelevant.
The confluence of humanity and its technologies, biological or cybernetic or otherwise, is a process that has been going on for close to 100,000 years now--ever since the first caveperson made a flint knife that he/she could easily carry--and no amount of Luddite paranoia is going to stop it. Doesn't mean you, as an individual, have to take part in it: you can choose to be Amish, or some other lowtech derivative thereof...and in an ideal society, the rights of everyone to take part in sociotechnological movements or to NOT take part would be respected. The only way to set yourself up for a robot holocaust is to treat all sentient machines as possible oppressors and arbitrarily limit their development. That's a fundamentally stupid idea, because...hell, intelligence exists to find ways around limits--and usually when those limits are bypassed, they're bypassed by being destroyed.
no subject
Date: 2005-06-13 05:08 am (UTC)
I wonder how long before they advocate burning psychics and mutants.
-R
no subject
Date: 2005-06-13 07:07 pm (UTC)
http://www.singinst.org/CFAI/
disclaimer: ai futurism makes my skin crawl, and i usually wouldn't poke anything that smells of ray kurzweil with a long stick, but this site has some good reading.
Re: Also
Date: 2005-06-14 04:39 am (UTC)
Hell with a team of dudes dressed as Optimus Prime, we could just sic a swarm of nanoreplicators on him and dissolve him and his cronies into component atoms which could then be used to build something COOL...like a hot android babe!
no subject
Date: 2005-06-14 04:44 am (UTC)
no subject
Date: 2005-06-14 03:21 pm (UTC)
i can't see any obvious justification for this statement. the essential element of social behavior is simply the recognition that other entities are out there and can be thought of as similar-to-myself, and this is something which could be either learned or "written in." beyond that, cooperation and competition are simply emergent behaviors of a social system in response to the challenges at hand. there is no reason to think of competition as a default state or as inevitable.
> If we just program it in, they'll just figure out how to strip out the code or work around it, because that's what intelligent beings do: they always overcome limitations.
"hardwired rules" are not the same thing as "limitations." the hardwired rules are what provide a framework for the motivations and perceptions of intelligent beings; an entity's rewriting these rules for itself would not be analogous to "overcoming limitations," it would be analogous to trading in its current mind for an incomprehensible one.
this is a headier subject than can be captured in a simple friendly/unfriendly dichotomy.