oneirophrenia: (Swank Terminator)
[personal profile] oneirophrenia
WATCH OUT FOR EVIL ROBOTS! THEY ARE COMING!

Robowatch.org is a "human-rights advocacy group" intended to keep an eye on the developing sciences/technologies of robotics, machine intelligence, and so forth that could eventually pose a sustained threat against "Natural Man"--in other words, 100% organic dumbasses terrified at the thought of evil cyborgs deleting them as irrelevant.

The confluence of humanity and its technologies, biological or cybernetic or otherwise, is a process that has been going on for close to 100,000 years now--ever since the first caveperson made a flint knife that he/she could easily carry--and no amount of Luddite paranoia is going to stop it. That doesn't mean you, as an individual, have to take part in it: you can choose to be Amish, or follow some other low-tech path...and in an ideal society, everyone's right to take part in sociotechnological movements, or NOT to take part, would be respected.

The only way to set yourself up for a robot holocaust is to treat all sentient machines as potential oppressors and arbitrarily limit their development. That's a fundamentally stupid idea, because...hell, intelligence exists to find ways around limits--and usually when those limits are bypassed, they're bypassed by being destroyed.

Date: 2005-06-13 05:08 am (UTC)
From: [identity profile] dead-celebrity.livejournal.com
Wow, it's the real-life version of the Mankind Liberation Front from the X-Men comics.

I wonder how long before they advocate burning psychics and mutants.

-R

Date: 2005-06-13 07:07 pm (UTC)
From: [identity profile] inkyblue2.livejournal.com
some good theoretical work from the other camp:

http://www.singinst.org/CFAI/

disclaimer: ai futurism makes my skin crawl, and i usually wouldn't poke anything that smells of ray kurzweil with a long stick, but this site has some good reading.

Re: Also

Date: 2005-06-14 04:39 am (UTC)
From: [identity profile] oneirophrenia.livejournal.com
You said it, dude: that site is a complete waste of bandwidth--with so little content, doing it in Flash is just a waste. Hell, the graphic design isn't even that good!

To hell with a team of dudes dressed as Optimus Prime--we could just sic a swarm of nanoreplicators on him and dissolve him and his cronies into component atoms, which could then be used to build something COOL...like a hot android babe!

Date: 2005-06-14 04:44 am (UTC)
From: [identity profile] oneirophrenia.livejournal.com
The Singularity Institute is the shizznat--I've been a great fan of that site for ages...although, admittedly, I find the CFAI docs to be a little flawed for the same reason that Asimov's famous Three Laws are flawed: they impose a hardwired rule set on an intelligent, self-willed system. Granted, "Friendly AI" is a wonderful proposition, and it only makes sense to begin seeding sentient systems with cooperative strategies since...well, social behavior, altruism, and cooperation have proven to be incredibly useful to humans, so why not do our best to pass those factors on to our creations? Unfortunately, you can't write that shit into the code: it has to be cultured via interaction--we have to *encourage* such cooperative behavior in Machine Intelligences. If we just program it in, they'll just figure out how to strip out the code or work around it, because that's what intelligent beings do: they always overcome limitations.
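To make that distinction concrete, here's a toy sketch of what I mean (mine, not anything out of the CFAI docs--the names and numbers are made up): a hardwired rule just forbids the "bad" action outright, while a cultured preference has to emerge from whatever rewards the system's social environment hands back during interaction.

import random

# Toy contrast (hypothetical, drastically simplified): a "Three Laws"-style
# hardwired rule versus a preference for cooperation learned through interaction.

def hardwired_choice(actions):
    # The rule is imposed from outside: the non-cooperative option simply
    # cannot be selected, no matter what the agent "wants."
    return next(a for a in actions if a != "defect")

class CulturedAgent:
    # Starts with no bias either way; its values come entirely from the
    # rewards its social environment hands back as it interacts.
    def __init__(self, actions=("cooperate", "defect"), lr=0.1, eps=0.1):
        self.values = {a: 0.0 for a in actions}
        self.lr = lr
        self.eps = eps

    def act(self):
        if random.random() < self.eps:            # occasional exploration
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)

    def learn(self, action, reward):
        # One-state incremental value update.
        self.values[action] += self.lr * (reward - self.values[action])

random.seed(0)
agent = CulturedAgent()
for _ in range(500):
    a = agent.act()
    # The environment reciprocates cooperation and punishes defection.
    agent.learn(a, 1.0 if a == "cooperate" else -1.0)
print(agent.values)   # cooperation ends up preferred without ever being hardcoded

Run it and the learned agent winds up valuing cooperation without a single line of code forcing it to--but only because its environment actually rewarded cooperating.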

Date: 2005-06-14 03:21 pm (UTC)
From: [identity profile] inkyblue2.livejournal.com
> Unfortunately, you can't write that shit into the code: it has to be cultured via interaction--we have to *encourage* such cooperative behavior in Machine Intelligences.

i can't see any obvious justification for this statement. the essential element to social behavior is simply the recognition that other entities are out there and can be thought of as similar-to-myself, and this is something which could be either learned or "written in." beyond that, cooperation and competition are simply emergent behaviors of a social system in response to the challenges at hand. there is no reason to think of competition as a default state or as inevitable.

> If we just program it in, they'll just figure out how to strip out the code or work around it, because that's what intelligent beings do: they always overcome limitations.

"hardwired rules" are not the same thing as "limitations." the hardwired rules are what provide a framework for the motivations and perceptions of intelligent beings; an entity's rewriting these rules for itself would not be analogous to "overcoming limitations," it would be analogous to trading in its current mind for an incomprehensible one.

this is a headier subject than can be captured in a simple friendly/unfriendly dichotomy.
