…Because computers would be the ones revolting. Computers aren’t much different than robots, fundamentally: they gather input, process it, and “do something” as an output. This final output, in the computer’s situation, is really just making pixels light up in a certain way on a monitor, whereas robots typically output by moving in three-dimensional space. Granted, in the former case it’s very minimal action in physical space, but it’s action nonetheless.
We have a good clue that robots would not revolt with Hollywood-style, Asimovian fury, because we know how computers act. When computers malfunction, they simply fail to perform their higher-purpose task, like starting up Wolfenstein, because of some low-level failure, like an inability to read the game’s save file (which is itself due to the failure of an even lower-level function that I’m not familiar with). When applications malfunction, they don’t somehow end up performing some other higher-level function instead. My corrupted Wolfenstein save file isn’t going to launch Halo 5 with a matching percentage of completion. Applications just break down in some manner once they get out of the gate.
Apply this thought to robots. What would it look like? A malfunctioning shelf-stocking robot wouldn’t end up going on a murder spree—it’d put a few boxes in the wrong place. A Roomba with its wires crossed isn’t going to start cutting your wifi connection or putting cyanide in the orange juice. It will fart out some dust bunnies and keep banging into the living room’s baseboard molding.
Someone give me counterarguments.
EDIT:
Jill and Ed bring up a simple and effective point: a revolution could happen, but only at the insistence of an agent outside the system, i.e., a malicious programmer. A system, like a robot, has boundaries by definition, and it can’t do something as complex as social revolution without the entire system being reprogrammed.
Real-world example: I’ve worked on a money transfer process for a website before. A defect in that code wouldn’t send money to, say, another bank instead of my exterminator. Interacting with another bank involves a good dozen interactions, most of them gated by access controls and permissions. That sort of thing just doesn’t happen by accident. A building doesn’t explode and crumble into another building just as complex. It turns to dust.
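Here’s a toy sketch of what I mean (all names hypothetical — this is not the actual system I worked on): each step in the transfer flow sits behind a gate, so a defect that scrambles the destination hits a gate and fails, rather than quietly rerouting funds.

```python
# Hypothetical allow-list standing in for real access-control config.
AUTHORIZED_DESTINATIONS = {"exterminator_account"}

class GateError(Exception):
    """Raised when a request fails a control-access check."""

def check_destination(dest):
    # Gate 1: the destination must be on an explicit allow-list.
    if dest not in AUTHORIZED_DESTINATIONS:
        raise GateError(f"destination {dest!r} is not authorized")

def transfer(amount, dest):
    check_destination(dest)
    # Gate 2: basic sanity check on the amount.
    if amount <= 0:
        raise GateError("amount must be positive")
    # ... a real flow would add auth tokens, a bank handshake, etc. ...
    return {"sent": amount, "to": dest}
```

A buggy caller that passes `transfer(50, "another_bank")` doesn’t move money somewhere new; it just raises `GateError` at the first gate. That’s the “turns to dust” behavior, in miniature.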
Websites can’t reach most machine resources. I can’t program a website that changes the background image on your desktop. But I could (if I knew how) write a Trojan horse program that changes your background photo into a tiled MacGyver collage. That, however, is acting outside the HTTP system, via the Trojan, on the user’s local machine.
BUT ALEX JONES SAID … nah, actually. I wouldn’t even know how to play devil’s advocate here because I’ve never heard a good argument on the other side of the debate. Then again, I haven’t looked much into it, but intuitively, my disbelief levels are high. As for those who are worried, I think they should just stop shopping at Best Buy and have no part in creating technology – this will reveal their priorities quite well. Very fun thought experiments, by the way.
I am reminded of a nice existential quote…
“All the idols made by man, however terrifying they may be, are in point of fact subordinate to him, and that is why he will always have it in his power to destroy them.”
– Simone de Beauvoir
Did he mean “idols” in the Biblical sense?
Just looked up the quote in its proper context, and it’s taken from “The Second Sex.” Never read the work, so I don’t know exactly, but I’m guessing – what she perceived to be – idols of patriarchy, very generally.
Doesn’t “man” imply “mankind/humankind”? If she was referring to the patriarchy per se, I’d think she would have been more specific. Maybe not. Ed, are you there? What say you?
I’m having trouble figuring this one out, but I think she does mean that men have it in their power to destroy the very patriarchy that they endorse. In the paragraph above the quote, she claims that man has power over both the earth and women.
Basically, I used the quote extremely out of context, ha ha, though it works in many ways on its own, I guess.
I believe that de Beauvoir is using a broad philosophical idea in the context of her existential feminist contention. The important question is what she considers an “idol” — given the existentialist position on things, it refers to anything offered as revealed from above as an excuse to maintain tradition. And I think she’s intentionally ambiguous about “man.” The whole point of her book, as I understand it, is a complaint against the cultural assumptions that make males the default of humanity, while females are some kind of deviation.
I really don’t like existentialism, but the quote Azure offers is not restricted to that philosophy. She restates a much older Enlightenment idea.
Nice quote Azure.
My 2 cents, although not a counterargument – I think human beings WANT the robots to revolt. It would make humans finally feel on par with gods to have created something with a mind of its own/free will. As well, humanity has this yearning for the ultimate good vs. evil battle. Rather than turning on each other, a robot rebellion would give “us” cause to link arms against a common enemy.
Quite so, Christine. But so far it’s impossible to make AI care about anything. A revolt would have to be programmed in.
Glad you enjoyed it, Christine. Your 2 cents make a lot of sense (pun intended), though they were both new to me. 🙂
A revolt would only happen if the human programmers wanted it to.
Jill – Yaaaas. A higher-level function can definitely be maliciously diverted to another higher-level function by a programmer. In fact, that’s probably the only way it could happen. I should write an addendum.
Also….thanks for the Twitter shoutout.