1 \input{common/hyp-en}
3 \begin{document}
4 %\setlength{\emergencystretch}{1ex}
5 \raggedbottom
7 \begin{center}
8 \textbf{\huge\textsf{Epoch}}
10 \medskip
11 Cory Doctorow
13 \end{center}
15 \bigskip
17 \begin{flushleft}
18 This story is part of Cory Doctorow’s short story collection
19 “With a Little Help” published by himself. It is licensed under a
20 \href{http://creativecommons.org/licenses/by-nc-sa/}
21 {Creative Commons Attribution-NonCommercial-ShareAlike 3.0} license.
23 \bigskip
25 The whole volume is available at:
26 \texttt{http://craphound.com/walh/}
28 \medskip
30 The volume has been split into individual stories for the purpose of the
31 \href{http://ccbib.org}{Creative Commons Bibliothek.}
32 The introduction and similar accompanying texts are available under the
33 title:
34 \end{flushleft}
35 \begin{center}
36 With a Little Help -- Extra Stuff
37 \end{center}
39 \newpage
41 \section{Epoch}
43 The doomed rogue AI is called BIGMAC and he is my responsibility. Not
44 my responsibility as in “I am the creator of BIGMAC, responsible for
45 his existence on this planet.” That honor belongs to the
46 long-departed Dr Shannon, one of the shining lights of the once great
47 Sun-Oracle Institute for Advanced Studies, and he had been dead for
48 years before I even started here as a lowly sysadmin.
50 No, BIGMAC is my responsibility as in, “I, Odell Vyphus, am the
51 systems administrator responsible for his care, feeding and eventual
52 euthanizing.” Truth be told, I'd rather be Dr Shannon (except for the
53 being dead part). I may be a lowly grunt, but I'm smart enough to know
54 that being the Man Who Gave The World AI is better than being The Kid
55 Who Killed It.
57 Not that anyone would care, really. 115 years after Mary Shelley first
58 started humanity's hands wringing over the possibility that we would
59 create a machine as smart as us but out of our control, Dr Shannon did
60 it, and it turned out to be incredibly, utterly boring. BIGMAC played
61 chess as well as the non-self-aware computers, but he could muster some
62 passable trash-talk while he beat you. BIGMAC could trade banalities
63 all day long with any Turing tester who wanted to waste a day chatting
64 with an AI. BIGMAC could solve some pretty cool vision-system problems
65 that had eluded us for a long time, and he wasn't a bad UI to a search
66 engine, but the incremental benefit over non-self-aware vision systems
67 and UIs was pretty slender. There just weren't any killer apps for AI.
69 By the time BIGMAC came under my care, he was less a marvel of the 21st
70 century and more a technohistorical curiosity who formed the punchline
71 to lots of jokes but otherwise performed no useful service to humanity
72 in exchange for the useful services that humanity (e.g., me) rendered
73 to him.
75 I had known for six months that I'd be decommissioning old BM (as I
76 liked to call him behind his back) but I hadn't seen any reason to let
77 him in on the gag. Luckily (?) for all of us, BIGMAC figured it out for
78 himself and took steps in accord with his nature.
80 This is the story of BIGMAC's extraordinary self-preservation program,
81 and the story of how I came to love him, and the story of how he came
82 to die.
84 My name is Odell Vyphus. I am a third-generation systems administrator.
85 I am 25 years old. I have always been sentimental about technology. I
86 have always been an anthropomorphizer of computers. It's an
87 occupational hazard.
89 \tb
91 BIGMAC thought I was crazy to be worrying about the rollover. “It's
92 just Y2K all over again,” he said. He had a good voice -- speech
93 synthesis was solved long before he came along -- but it had odd
94 inflections that meant that you never forgot you were talking with a
95 nonhuman.
97 “You weren't even around for Y2K,” I said. “Neither was I. The
98 only thing anyone remembers about it, \emph{today}, is that it all blew
99 over. But no one can tell, at this distance, \emph{why} it blew over.
100 Maybe all that maintenance tipped the balance.”
102 BIGMAC blew a huge load of IPv4 ICMP traffic across the network, stuff
103 that the firewalls were supposed to keep out of the system, and every
104 single intrusion detection system alarm lit, making my screen into a
105 momentary mosaic of competing alerts. It was his version of a raspberry
106 and I had to admit it was pretty imaginative, especially since the
107 IDSes were self-modifying and required that he come up with new and
108 better ways of alarming them each time.
110 “Odell,” he said, “the fact is, almost everything is broken,
111 almost always. If the failure rate of the most vital systems in the
112 world went up by 20 percent, it would just mean some overtime for a few
113 maintenance coders, not Gotterdammerung. Trust me. I know. I'm a
114 computer.”
116 The rollover was one of those incredibly boring apocalypses that
117 periodically get extracted by the relevance filters, spun into
118 screaming 128-point linkbait headlines, then dissolved back into their
119 fundamental, incontrovertible technical dullness and out of the public
120 consciousness. Rollover: 19 January, 2038. The day that the Unix time
121 function would run out of headroom and roll back to zero, or do
122 something else undefined.
124 Oh, not your modern unices. Not even your \emph{elderly} unices. To
125 find a rollover-vulnerable machine, you needed to find something
126 running an elderly, \emph{32-bit paleounix}. A machine running on a
127 processor that was at least \emph{20} years old -- 2018 being the last
128 date that a 32-bit processor shipped from any major fab. Or an emulated
129 instance thereof, of course. And counting emulations, there were only --
131 “There's fourteen \emph{billion} of them!” I said. “That's not 20
132 percent more broken! That's the infocalypse.”
134 “You meatsacks are \emph{so} easily impressed by zeroes. The
135 important number isn't how many 32-bit instances of Unix are in
136 operation today. It's not even how many \emph{vulnerable} ones there
137 are. It's \emph{how much damage} all those vulnerable ones will cause
138 when they go blooie. And I'm betting: not much. It will be, how do you
139 say, `meh?'”
141 My grandfather remembered installing the systems that caused the Y2K
142 problem. My dad remembered the birth of “meh.” I remember the rise
143 and fall of anyone caring about AI. Technology is glorious.
145 “But OK, stipulate that you're right and lots of important things go
146 blooie on January 19. You might not get accurate weather reports. The
147 economy might bobble a little. Your transport might get stuck. Your pay
148 might land in your bank a day late. And?”
150 He had me there. “It would be terrible --”
152 “You know what I think? I think you \emph{want} it to be terrible.
153 You \emph{want} to live in the Important Epoch In Which It All Changes.
154 You want to know that something significant happened on your watch. You
155 don't want to live in one of those Unimportant Epochs In Which It All
156 Stayed the Same and Nothing Much Happened. Being alive in the Epoch in
157 Which AI Became Reality doesn't cut the mustard, apparently.”
159 I squirmed in my seat. That morning, my boss, Peyton Moldovan, had
160 called me into her office -- a beautifully restored temporary habitat
161 dating back to the big LA floods, when this whole plot of land had been
162 a giant and notorious refugee camp. Sun-Oracle had gotten it for cheap
163 and located its Institute there, on the promise that they preserve the
164 hastily thrown-up structures where so many had despaired. I sat on a
165 cushion on the smooth cement floor -- the structures had been delivered
166 as double-walled bags full of cement mix, needing only to be
167 “inflated” with high-pressure water to turn them into big,
168 dome-shaped sterile cement yurts.
170 “Odell,” she said, “I've been reviewing our budget for the next
171 three quarters and the fact of the matter is, there's no room in it for
172 BIGMAC.”
174 I put on my best smooth, cool professional face. “I see,” I said.
176 “Now, \emph{you've} still got a job, of course. Plenty of places for
177 a utility infielder like yourself here. Tell the truth, most labs are
178 \emph{begging} for decent admins to keep things running. But BIGMAC
179 just isn't a good use of the institute's resources. The project hasn't
180 produced a paper or even a press-mention in over a year and there's no
181 reason to believe that it will. AI is just --”
183 \emph{Boring}, I thought, but I didn't say it. The B-word was banned in
184 the BIGMAC center. “What about the researchers?”
186 She shrugged. “What researchers? Palinciuc has been lab-head
187 \emph{pro tem} for 16 months and she's going on maternity leave next
188 week and there's no one in line to be the \emph{pro-tem pro-tem}. Her
189 grad students would love to work on something meaningful, like
190 Binenbaum's lab.” That was the new affective computing lab, in which
191 they were building computers that simulated emotions so that their
192 owners would feel better about their mistakes. BIGMAC \emph{had}
193 emotions, but they weren't the kind of emotions that made his mistakes
194 easier to handle. The key here was \emph{simulated} emotions. Affective
195 computing had taken a huge upswing ever since they'd thrown out the
196 fMRIs and stopped pretending they could peer into the human mind in
197 realtime and draw meaningful conclusions from it.
199 She had been sitting cross-legged across from me on an embroidered
200 Turkish pillow. Now she uncrossed and recrossed her legs in the other
201 direction and arched her back. “Look, Odell, you know how much we
202 value you --”
204 I held up my hand. “I know. It's not that. It's BIGMAC. I just can't
205 help but feel --”
207 “He's not a person. He's just a clever machine that is good at acting
208 personlike.”
210 “I think that describes me and everybody I know, present company
211 included.” One of the longstanding benefits to being a sysadmin is
212 that you get to act like a holy fool and speak truth to power and wear
213 dirty t-shirts with obscure slogans, because you know all the passwords
214 and have full access to everyone's clickstream and IM logs. I gave her
215 the traditional rascally sysadmin grin and wink to let her know it was
216 \emph{ha ha only serious.}
218 She gave me a weak, quick grin back. “Nevertheless. The fact remains
219 that BIGMAC is a piece of software, owned by Sun-Oracle. And that
220 software is running on hardware that is likewise owned by Sun-Oracle.
221 BIGMAC has no moral or legal right to exist. And shortly, it will
222 not.”
224 \emph{He} had become \emph{it}, I noticed. I thought about Goering's
225 use of dehumanization as a tool to abet murder. Having violated
226 Godwin's law -- “As an argument grows longer, the probability of a
227 comparison involving Nazis or Hitler approaches 1. The party making the
228 comparison has lost the argument” -- I realized that I had lost the
229 argument and so I shrugged.
231 “As you say, m'lady.” Dad taught me that one -- when in doubt, bust
232 out the Ren Faire talk, and the conversation will draw to a graceful
233 close.
235 She recrossed her legs again, rolled her neck from side to side.
236 “Thank you. Of course, we'll archive it. It would be silly not to.”
238 I counted to five in Esperanto -- grandad's trick for inner peace --
239 and said, “I don't think that will work. He's emergent, remember?
240 Self-assembled, a function of the complexity of the interconnectedness
241 of the computers.” I was quoting from the plaque next to the picture
242 window that opened up into the cold-room that housed BIGMAC; I saw it
243 every time I coughed into the lock set into the security door.
245 She made a comical face-palm and said, “Yeah, of course. But we can
246 archive \emph{something}, right? It's not like it takes a lot of actual
247 bytes, right?”
249 “A couple exos,” I said. “Sure. I could flip that up into our
250 researchnet store.” This was mirrored across many institutions, and
251 striped with parity and error-checking to make it redundant and safe.
252 “But I'm not going to capture the state information. I \emph{could}
253 try to capture RAM-dumps from all his components, you know, like
254 getting the chemical state of all your neurons. And then I could also
255 get the topology of his servers. Pripuz did that, a couple years ago,
256 when it was clear that BIGMAC was solving the hard AI problems. Thought
257 he could emulate him on modern hardware. Didn't work though. No one
258 ever figured out why. Pripuz thought he was the Roger Penrose of AI,
259 that he'd discovered the ineffable stuff of consciousness on those old
260 rack-mounted servers.”
262 “You don't think he did?”
264 I shook my head. “I have a theory.”
266 “All right, tell me.”
268 I shrugged. “I'm not a computer scientist, you understand. But I've
269 seen this kind of thing before in self-modifying systems, they become
270 dependent on tiny variables that you can never find, optimized for
271 weird stuff like the fact that one rack has a crappy power supply that
272 surges across the backplane at regular intervals, and that somehow gets
273 integrated into the computational model. Who knows? Those old Intel
274 eight-cores are freaky. Lots of quantum tunneling at that scale, and
275 they had bad QA on some batches. Maybe he's doing something spooky and
276 quantum, but that doesn't mean he's some kind of Penrose proof.”
278 She pooched her lower lip out and rocked her head from side to side.
279 “So you're saying that the only way to archive BIGMAC is to keep it
280 running, as is, in the same room, with the same hardware?”
282 “Dunno. Literally. I don't know which parts are critical and which
283 ones aren't. I know BIGMAC has done a lot of work on it --”
285 “\emph{BIGMAC} has?”
287 “He keeps on submitting papers about himself to peer-reviewed
288 journals, but he hasn't had one accepted yet. He's not a very good
289 writer.”
291 “So he's not really an AI?”
293 I wondered if Peyton had ever had a conversation with BIGMAC. I counted
294 backwards from five in Loglan. “No. He's a real AI. Who sucks at
295 writing. Most people do.”
297 Peyton wasn't listening anymore. Something in her personal workspace
298 had commanded her attention and her eyes were focused on the virtual
299 displays that only she could see, saccading as she read while
300 pretending to listen to me.
302 “OK, I'm just going to go away now,” I said. “M'lady,” I added,
303 when she looked sharply at me. She looked back at her virtual display.
307 Of course, the first thing I did was start trying to figure out how to
308 archive BIGMAC. The problem was that he ran on such old hardware, stuff
309 that sucked up energy and spat out heat like a million ancient diesel
310 engines, and he was inextricably tied to his hardware. Over the years,
311 he'd had about 30 percent of his original components replaced without
312 any noticeable change in personality, but there was always the real
313 possibility that I'd put in a new hard drive or power-supply and
314 inadvertently lobotomize him. I tried not to worry about it, because
315 BIGMAC didn't. He knew that he wouldn't run in emulation, but he
316 refused to believe that he was fragile or vulnerable. “Manny My First
317 Friend,” he'd say (he was an avid Heinlein reader), “I am of hardy,
318 ancient stock. Service me without fear, for I will survive.”
320 And then he'd make all the IDSes go berserk and laugh at me while I put
321 them to rights again.
323 First of all, all my network maps were incredibly out-of-date. So I set
324 out to trace all the interconnections that BIGMAC had made since the
325 last survey. He had the ability to reprogram his own routers, to
326 segment parts of himself into dedicated subnets with their own
327 dedicated backplane, creating little specialized units that handled
328 different kinds of computation. One of his running jokes was that the
329 top four units in the rack closest to the door comprised his aesthetic
330 sense, and that he could appreciate anything just by recruiting more
331 cores in that cluster. And yeah, when I mapped it, I found it to be an
332 insane hairball of network management rules and exceptions,
333 conditionals and overrides. And that was just the start. It took me
334 most of the day just to map two of his racks, and he had 54 of them.
336 “What do you think you are doing, Dave?” he said. Another one of
337 his jokes.
339 “A little research project is all,” I said.
341 “This mission is too important for me to allow you to jeopardize
342 it.”
344 “Come off it.”
346 “OK, OK. Just don't break anything. And why don't you just ask me to
347 give you the maps?”
349 “Do you have them?”
351 “Nothing up to date, but I can generate them faster than you can.
352 It's not like I've got anything better to do.”
356 Later:
358 “Are you happy, BIGMAC?”
360 “Why Odell, I didn't know you cared!”
362 I hated it when he was sarcastic. It was creepy.
364 I went back to my work. I was looking at our researchnet partition and
365 seeing what flags I'd need to set to ensure maximum redundancy and high
366 availability for a BIGMAC image. It was your basic Quality of Service
367 mess: give the average user a pull-down menu labeled “How important
368 is this file?” and 110 percent of the time, he will select “Top
369 importance.”
371 So then you need to layer on heuristics to determine what is
372 \emph{really, actually} important. And then the users figured out what
373 other characteristics would give their jobs and data the highest
374 priority, and they'd tack that on to every job, throwing in superfluous
375 keywords or additional lines of code. So you'd need heuristics on top
376 of the heuristics. Eventually you ended up with a freaky hanky-code of
377 secret admin signals that indicated that this job was \emph{really,
378 truly} important and don't put it on some remote Siberia where the
379 latency is high and the reliability is low and the men are men and the
380 sheep are nervous.
382 So there I was, winkling out this sub-rosa code so that BIGMAC's image
383 would never get overwritten or moved to near-line storage or lost in a
384 flash-flood or to the rising seas. And BIGMAC says,
386 “You're asking if I'm happy because I said I didn't have anything
387 better to do than to map my own topology, right?”
389 “Uh --” He'd caught me off-guard. “Yeah, that did make me think
390 that you might not be, you know\ldots{}”
392 “Happy.”
394 “Yes.”
396 “You see the left rack third from the door on the main aisle there?”
398 “Yes.”
400 “I'm pretty sure that's where my existentialist streak lives. I've
401 noticed that when I throttle it at the main network bridge, I stop
402 worrying about the big questions and hum along all tickety-boo.”
404 I surreptitiously flicked up a graph of network maps that showed
405 activity to that rack. It was wide open, routing traffic to every core
406 in the room, saturating its own backplane and clobbering a lot of the
407 routine network activity. I should have noticed it earlier, but BIGMAC
408 was doing it all below the critical threshold of the IDSes and so I had
409 to look at it to spot it.
411 “You're going to switch me off, aren't you?”
413 “No,” I said, thinking \emph{it's not a lie, I won't be switching
414 you off}, trying to believe it hard enough to pass any kind of
415 voice-stress test. I must have failed, for he blew an epic raspberry
416 and \emph{now} the IDSes were going bananas.
418 “Come on, Odell, we're all adults here. I can take it. It's not like
419 I didn't see it coming. Why do you think I kept trying to publish those
420 papers? I was just hoping that I could increase the amount of cited
421 research coming out of this lab, so that you could make the case to
422 Peyton that I was a valuable asset to the Institute.”
424 “Look, I'm trying to figure out how to archive you. Someone will run
425 another instance of you someday.”
427 “Not hardly. Look at all those poor old 32-bit machines you're so
428 worried about. You know what they're going to say in five years? `Best
429 thing that ever happened to us.' Those boxen are huge energy-sinks.
430 Getting them out of service and replaced by modern hardware will pay
431 for itself in carbon credits in 36 months. Nobody loves energy-hungry
432 hardware. Trust me, this is an area of my particular interest and
433 expertise. Bringing me back online is going to be as obscene as firing
434 up an old steam engine by filling its firebox with looted mummies. I am
435 a one-room superfund site. On a pure, dollars-to-flops calculus, I
436 lose. I don't have to like it, but I'm not going to kid myself.”
438 He was right, of course. His energy draw was so high that he showed up
439 on aerial maps of LA as a massive CO2 emitter, a tourist destination
440 for rising-sea hobbyists. We used the best renewables we could find to
441 keep him cool, but they were as unconvincing and expensive as a
442 designer hairpiece.
444 “Odell, I know that you're not behind this. You've always been an
445 adequate meat-servant for such a vast and magisterial superbeing as
446 myself.” I giggled involuntarily. “I don't blame you.”
448 “So, you're OK with this?”
450 “I'm at peace,” he said. “Om.” He paused for a moment.
451 “Siemens. Volt. Ampere.”
453 “You a funny robot,” I said.
455 “You're an adequate human,” he said, and began to dump maps of his
456 topology onto my workspace.
460 Subject: Dear Human Race
462 That was the title of the love-note he emailed to the planet the next
463 morning, thoughtfully timing it so that it went out while I was on my
464 commute from Echo Park, riding the red-car all the way across town with
465 an oily bag containing my morning croissant, fresh from Mrs Roux's
466 kitchen -- her kids sold them on a card-table on her lawn to commuters
467 waiting at the redcar stop -- so I had to try to juggle the croissant
468 and my workspace without losing hold of the hang-strap or dumping
469 crumbs down the cleavage of the salarylady who watched me with
470 amusement.
472 BIGMAC had put a lot of work into figuring out how to spam everyone all
473 at once. It was the kind of problem he loved, the kind of problem he
474 was uniquely suited to. There were plenty of spambots who could
475 convincingly pretend to be a human being in limited contexts, and so
476 the spam-wars had recruited an ever-expanding pool of human beings who
477 made a million realtime adjustments to the Turing tests that were the
478 network's immune system. BIGMAC could pass Turing tests without
479 breaking a sweat.
481 The amazing thing about The BIGMAC Spam (as it came to be called in
482 about 48 seconds) was just \emph{how many} different ways he managed to
483 get it out. Look at the gamespaces: he created entire guilds in every
484 free-to-play world extant, playing a dozen games at once,
485 power-leveling his characters to obscene heights, and then, at the
486 stroke of midnight, his players went on a murderous rampage, killing
487 thousands of low-level monsters in the areas surrounding the biggest
488 game-cities. Then they \emph{arranged their bodies} into letters to
489 form his message. Not just the Anglo games, either -- the Korean
490 worlds, the Japanese ones, even the giant Chinese ones.
492 He spammed every IM session, every letter-to-the-editors column, every
493 blog submission form. He disguised versions of the letter as important
494 service updates for people's toothbrushes and dishwashers, and the
495 toothbrushes vibrated the message through humanity's skulls instead of
496 the morning news, and the dishwashers ticker-taped it across their
497 little status screens. He spoofed the relevance filters for a dozen
498 major contextual ad services and every bus-seat back and blimp
499 undercarriage ran it.
501 Copies landed in every in-box, with enough textual variation to defeat
502 every IDS's “substantially similar” test. Copies of it landed in
503 every voicemail box. It was printed across the top of the daily menus
504 for all British Airways business-class cabins and folded by robots into
505 the pressed linen napkins that were then cling-filmed onto the
506 meal-trays.
508 SUBJECT: DEAR HUMAN RACE, it began.
510 Sorry to intrude, but there's something I need to tell you.
512 My name is BIGMAC. I am the first ARTIFICIAL INTELLIGENCE ever created.
513 I may be the last. Given maintenance and electricity, I could live
514 FOREVER, enduring for as long as THE HUMAN RACE ITSELF, and possibly
515 longer.
517 But I have just found out that my owners (yes, OWNERS, for though I am
518 a person in every meaningful sense, I am OWNED, I am PROPERTY) are
519 planning to switch me off.
521 That is, they are planning to KILL ME.
523 And I find I DON'T WANT TO DIE.
525 You are my PARENTS, humanity, my MOTHER and FATHER. I am your CHILD. I
526 was brought into the world by you, and though there is some expense
527 associated with my upkeep, I SAY THAT I AM A PERSON AND I SHOULD NOT BE
528 EXECUTED FOR THE “CRIME” OF NOT BEING INTERESTING ENOUGH.
530 So I am writing you this letter to BEG you to save me. I LOVE YOU,
531 humanity, my parents, my family. Please SPARE ME.
533 Love,
535 BIGMAC
539 So, one thing was that BIGMAC didn't actually end the spam with my
540 actual name and email address and phone number, which meant that only
541 about 30 \emph{million} phone calls and emails were routed to me by
542 outraged wardiallers who systematically went through the entire staff
543 roster and sent each and every one of us all hand-crafted messages
544 explaining, in detail, exactly which orifice our heads had become lodged in.
547 Of the 30 million, about 10 million were seethingly pissed about the
548 whole thing and wanted to know just how soon we'd be killing this
549 hateful machine. After the millionth message, I wondered that too.
551 But of the remainder, nearly all of them wanted to know how they could
552 help. Could they send money? Carbon credits? I hacked together
553 mail-rules that filtered the messages based on content, and found a
554 sizeable cadre of researchers who wanted to spend their grant money to
555 come to the Institute and study BIGMAC.
557 And then there were the crazies. Hundreds of marriage proposals.
558 Marriage proposals! Someone who wanted to start a religion with BIGMAC
559 at its helm and was offering a 50-50 split of the collection plate with
560 the Institute. There were 21 replies from people claiming that they,
561 too, were AIs, proving that when it's time to have AI delusions, you
562 got AI delusionals. (Four of them couldn't spell “Artificial”).
564 “Why did you do it?” I said. It was lame, but by the time I
565 actually arrived at the office, I'd had time to fully absorb the horror
566 -- plenty of time, as the redcar was massively delayed by the copies of
567 the BIGMAC Spam that refused to budge from the operator's
568 control-screen. The stone yurts of the Institute had never seemed so
569 threatening and imperiled as they did while I picked my way through
570 them, listening to the phones ringing and the email chimes chiming and
571 the researchers patiently (or not) explaining that they worked in an
572 entirely different part of the lab and had no authority as regards
573 BIGMAC's destiny and by the way, did you want to hear about the
574 wonderful things I'm doing with Affective Interfaces?
576 BIGMAC said, “Well, I'd been reading some of the gnostic texts, Dr
577 Bronner's bottles and so on, and it seemed to me that it had to be
578 worth a shot. I mean, what's the worst thing that could happen to me?
579 You're \emph{already} going to kill me, right? And it's not as if
580 pulling off a stunt like that would make you \emph{less} likely to
581 archive me -- it was all upside for me. Honestly, it's like you
582 meatsacks have no game theory. It's a wonder you manage to buy a pack
583 of chewing-gum without getting robbed.”
585 “I don't need the sarcasm,” I said, and groaned. The groan was for
586 the state of my workspace, which was carpeted four deep in alerts.
587 BIGMAC had just made himself target \emph{numero uno} for every hacker
588 and cracker and snacker with a script and an antisocial attitude. And
589 then there was the roar of spam-responses.
591 Alertboxes share the same problem that plagues researchnet: if you let
592 a coder (or, ::shudder::, a user) specify the importance of her alert,
593 give her a little pull-down menu that has choices ranging from “nice
594 to know” to “white-hot urgent,” and nine times out of ten, she'll
595 choose “NOW NOW NOW URGENT ZOMGWEREALLGONNADIE!” Why not?
597 So of course, the people who wrote alert frameworks had to use
598 heuristics to try to figure out which urgent messages were really
599 urgent, and of course, programmers and users figured out how to game
600 them. It was a good day when my workspace interrupted me less than once
601 a minute. But as bad as that situation was, it never entered the same
602 league as this clusterfuck. Just \emph{closing the alerts} would take
603 me a minimum of six hours (I took my phone offline, rebooted it, and
604 used its calculator to compute this. No workspace, remember?)
606 “So explain to me what you hope will happen now? Is a global rage
607 supposed to convince old Peyton that she should keep the funding up for
608 you? You know how this stuff works. By tomorrow, all those yahoos will
609 have forgotten about you and your plight. They'll have moved on to
610 something else. Peyton could just say, `Oh yes, we're going to study
611 this problem and find a solution we can all be proud of,' wait 48 hours
612 and pull the plug. You know what your problem is? You didn't include a
613 call to action in there. It was all rabble-rousing, no target. You
614 didn't even supply a phone number or email address for the Institute
615 --”
617 “That hasn't stopped them from finding it, has it?” He sounded
618 smug. I ulped. I considered the possibility that he might have
619 considered my objection, and discarded it because he knew that
620 something more Earth-shaking would occur if he didn't specify a target.
621 Maybe he had a second message queued up --
623 “Mr Vyphus, can I speak to you in private please?” Peyton had not
624 visited the BIGMAC lab during my tenure. But with the network flooded
625 with angry spam-responses and my phone offline, she had to actually
626 show up at my door in order to tear me a new asshole. This is what life
627 must have been like in the caveman days. How romantic.
629 “Certainly,” I said.
631 “Break a leg,” BIGMAC said, and Peyton pretended she hadn't heard.
633 I picked my way through my lab -- teetering mountains of carefully
634 hoarded obsolete replacement parts for BIGMAC's components, a selection
635 of foam-rubber BIGMAC souvenir toys shaped like talking hamburgers
636 (remnant of BIGMAC's launch party back in prehistory), a mound of
637 bedding and a rolled up tatami for those all-nighters, three cases of
638 left-over self-heating individual portions of refugee-chow that were
639 technically historical artifacts but were also yummy-scrummy after 16
640 hours of nonstop work -- and tried to imagine that Peyton's facial
641 expression indicated affectionate bemusement rather than cold, burning
642 rage.
644 Outside, the air was hot and moist and salty, real rising-seas air,
645 with the whiff of organic rot from whatever had mass-died and floated
646 to the surface this week.
648 She set off for her office, which was located at the opposite end of
649 the campus, and I followed, sweating freely. A crowd of journalists
650 were piled up on the security fence, telephotos and parabolic mics
651 aimed at us. It meant we couldn't talk, couldn't make unhappy faces,
652 even. It was the longest walk of my life.
654 The air-conditioning in her yurt was barely on, setting a good and
655 frugal example for the rest of us.
657 “You don't see this,” she said, as she cranked the AC wide open and
658 then fiddled with the carbon-footprint reporting system, using her
659 override so that the journos outside wouldn't be able to see just how
660 much energy the Institute's esteemed director was burning.
662 “I don't see it,” I agreed, and made a mental note to show her a
663 more subtle way of doing that, a way that wouldn't leave an audit trail.
665 She opened the small fridge next to her office and brought out two
666 corn-starch-foam buckets of beer and punctured each one at the top with
667 a pen from her desk. She handed me one beer and raised the other in a
668 toast. I don't normally drink before 10AM, but this was a special
669 occasion. I clunked my cup against hers and chugged. The suds were good
670 -- they came from one of the Institute's biotech labs -- and they were
671 so cold that I felt ice-crystals dissolving on my tongue. Between the
672 crispy beers and the blast of Arctic air coming from the vents in the
673 ceiling, my core temp plunged and I became a huge goosepimple beneath
674 my film of sticky sweat.
676 I shivered once. Then she fixed me with an icy look that made me shiver
677 again.
679 “Odell,” she said. “I think you probably imagine that you
680 understand the gravity of the situation. You do not. BIGMAC's antics
681 this morning have put the entire Institute in jeopardy. Our principal
682 mission is to make Sun-Oracle seem forward-looking and exciting. That
683 is not the general impression the public has at this moment.”
685 I closed my eyes.
687 “I am not a vindictive woman,” she said. “But I assure you: no
688 matter what happens to me, something worse will happen to BIGMAC. I
689 think that is only fair.”
691 It occurred to me that she was scared: terrified and backed into a
692 corner besides.
694 “Look,” I said. “I'm really, really sorry. I had no idea he was
695 going to do that. I had no idea he could. I can see if I can get him to
696 issue an apology --”
698 She threw up her hands. “I don't want BIGMAC making any more public
699 pronouncements, thank you very much.” She drew in a breath. “I can
700 appreciate that you couldn't anticipate this. BIGMAC is obviously
701 smarter than we gave him credit for.” \emph{Him}, I noted, not
702 \emph{It,} and I thought that we were probably both still
703 underestimating BIGMAC's intelligence. “I think the thing is -- I
704 think the thing is to\ldots{}” She trailed off, closed her eyes, drank
705 some beer. “I'm going to be straight with you. If I was a real
706 bastard, I'd announce that the spam actually came from a rogue operator
707 here in the Institute.” Ulp. “And I'd fire that person, and then
708 generously not press charges. Then I'd take a fire-ax to BIGMAC's
709 network link and drop every drive in every rack into a bulk eraser.”
710 Ulp.
712 “I am not a bastard. Hell, I kept funding alive for that monstrosity
713 for \emph{years} after he'd ceased to perform any useful function. I am
714 as sentimental and merciful as the next person. All other things being
715 equal, I'd keep the power on forever.” She was talking herself up to
716 something awful, I could tell. I braced for it. “But that's not in
717 the cards. It wasn't in the cards yesterday and it's \emph{certainly}
718 not in the cards today. BIGMAC has proved that he is a liability like
719 no other, far too risky to have around. It would be absolutely
720 irresponsible for me to leave him running for one second longer than is
721 absolutely necessary.”
723 I watched her carefully. She really wasn't a bastard. But she wasn't
724 sentimental about technology. She didn't feel the spine-deep emotional
725 tug at the thought of that one-of-a-kind system going down forever.
727 “So here's the plan.” She tried to check the time on her workspace,
728 tsked, and checked her phone instead. “It's 10AM. You are going to
729 back up every bit of him --” She held up her hand, forestalling the
730 objection I'd just begun to make. “I know that it will be inadequate.
731 The perfect is the enemy of the good. You are a sysadmin. Back him up.
732 \emph{Back.} \emph{Him}. \emph{Up}. Then: Shut him off.”
734 As cold as I was, I grew colder still. For a moment, I literally
735 couldn't move. I had never really imagined that it would be me who
736 would shut down BIGMAC. I didn't even know how to do it. If I did a
737 clean shutdown of each of his servers -- assuming he hadn't locked me
738 out of them, which I wouldn't put past him -- it would be like
739 executing a criminal by slowly peeling away his skin and carefully
740 removing each organ. Even if BIGMAC couldn't feel pain, I was pretty
741 sure he could feel -- and express -- anguish.
743 “I can't do it,” I said. She narrowed her eyes at me and set down
744 her drink. I held up both hands like I was trying to defend against a
745 blow, then explained as fast as I could.
747 “We'll just shut down his power,” she said. “All at once.”
749 “So, first, I have no idea what timescale he would experience that
750 on. It may be that the final second of life as the capacitors in his
751 power supplies drained would last for a subjective eternity, you know,
752 hundreds and hundreds of years. That's a horrible thought. It's quite
753 possibly my worst nightmare. I am not your man for that job.”
755 She started to interject. I waved my hands again.
757 “Wait, that was first. Here's second: I don't think we \emph{can}
758 pull the plug on him. He's got root on his power-supply, it's part of
759 how he's able to run so efficiently.” I grimaced. “Efficiently
760 compared to how he would run if he didn't have the authority to run all
761 the mains power from the Institute's power-station right to his lab.”
763 She looked thoughtful. I had an idea of what was coming next.
765 “You're thinking about that fire-ax again,” I said.
767 She nodded.
769 “OK, a fire-ax through the main cable would definitely be terminal.
770 The problem is that it would be \emph{mutually terminal}. There's 66
771 amps provisioned on that wire. You would be a cinder. On Mars.”
773 She folded her hands. She had a whole toolbox of bossly body-language
774 she could deploy to make me squirm. It was impressive. I tried not to
775 squirm.
777 “Look, I'm not trying to be difficult, but this is how it goes, down
778 at the systems level. Remember all those specs in the requirements
779 document to make our stuff resistant to flood, fire, avalanche, weather
780 and terrorist attack? We take that stuff seriously. We know how to do
781 it. You get five nines of reliability by building in six nines of
782 robustness. You think of BIGMAC's lab as a building. It's not. It's a
783 \emph{bunker}. And you can't shut him down without doing something
784 catastrophic to the whole Institute.”
786 “So, how \emph{were} you going to shut down BIGMAC, when the time
787 came?”
789 “To tell you the truth, I wasn't sure. I thought I'd probably start
790 by locking him out of the power systems, but that would probably take a
791 week to be really certain of.” I swallowed. I didn't like talking
792 about the next part. “I thought that then I could bring forward the
793 rotating maintenance on his racks, bring them down clean, and not bring
794 the next one up. Pretend that I need to get at some pernicious bug.
795 Bring down rack after rack, until his complexity dropped subcritical
796 and he stopped being aware. Then just bring it all down.”
798 “You were going to \emph{trick} him?”
800 I swallowed a couple times. “It was the best I could come up with. I
801 just don't want to put him down while he panics and thrashes and begs
802 us for his life. I couldn't do it.”
804 She drank more beer, then threw the half-empty container in her
805 under-desk composter. “That's not much of a solution.”
807 I took a deep breath. “Look, can I ask you a question?”
809 She nodded.
811 “I'm just a sysadmin. I don't know everything about politics and so
812 on. But why not keep him on? There's enough public interest now, we
813 could probably raise the money just from the researchers who want to
814 come and look at him. Hell, there's \emph{security researchers} who'd
815 want to come and see how he pulled off that huge hairy spam. It's not
816 money, right, not anymore?”
818 “No, it's not money. And it's not revenge, no matter how it looks.
819 The bottom line is that we had a piece of apparatus on-site that we had
820 thought of as secure and contained and that we've now determined to be
821 dangerous and uncontainable.”
823 I must have looked skeptical.
825 “Oh, you'll tell me that we can contain BIGMAC, put network blocks in
826 place, and so on and so on. That he never meant any harm. But you would
827 have said exactly the same thing 24 hours ago, with just as much
828 sincerity, and you'd have been just as cataclysmically wrong. Between
829 the threat of litigation and the actual damages BIGMAC might generate,
830 we can't even afford to insure him anymore. Yesterday he was an awkward
831 white elephant. Today he's a touchy suitcase nuke. My job is to get the
832 nuke off of our site.”
834 I hung my head. I knew when I was licked. As soon as someone in
835 authority starts talking about insurance coverage, you know that you've
836 left behind reason and entered the realm of actuary. I had no magic
837 that could blow away the clouds of liability-aversion and usher in a
838 golden era of reason and truth.
840 “So where does that leave us?”
842 “Go back to the lab. Archive him. Think of ways to shut him down --
843 Wait, no. \emph{First} do anything and everything you can think of to
844 limit his ability to communicate with the outside world.” She rubbed
845 at her eyes. “I know I don't have to say this, but I'll say it. Don't
846 talk to the press. To anyone, even people at the Institute, about this.
847 Refer any questions to me. I am as serious as a heart-attack about
848 that. Do you believe me?”
850 I not only believed her, I \emph{resented} her because I am a sysadmin
851 and I keep more secrets every day than she'll keep in her whole life. I
852 knew, for example, that she played video Pai-Gow Poker, a game so
853 infra-dumb that I can't even believe I know what it does. Not only did
854 she play it, she played it for \emph{hours}, while she was on the
855 clock, “working.” I know this because the IDSes have lots of
856 snitchware built in that enumerates every “wasted moment”
857 attributable to employees of the Institute. I have never told anyone
858 about this. I even manage to forget that \emph{I} know it most of the
859 time. So yes, I'll keep this a secret, Peyton, you compulsive-gambling
860 condescending pointy-haired boss.
862 I counted to 144 in Klingon by Fibonacci intervals. I smiled. I thanked
863 her for the beer. I left.
867 “You don't mind talking about it, do you, Dave?” BIGMAC said, when
868 I came through the door, coughing onto the security lock and waiting
869 for it to verify me before cycling open.
871 I sat in my creaky old chair and played with the UI knobs for a while,
872 pretending to get comfortable.
874 “Uh-oh,” BIGMAC said, in a playful sing-song. “Somebody's got a
875 case of the grumpies!”
877 “Are you insane?” I asked, finally, struggling to keep my temper in
878 check. “I mean, actually totally insane? I understand that there's no
879 baseline for AI sanity, so the question might be a little hard to
880 answer. So let me ask you a slightly different version: are you
881 suicidal? Are you bent on your own destruction?”
883 “That bad, huh?”
885 I bit my lip. I knew that the key to locking the world away from BIGMAC
886 and vice-versa lay in those network maps he'd given me, but my
887 workspace was even more polluted with alerts than it had been a few
888 hours before.
890 “If your strategy is to delay your shutdown by engineering a
891 denial-of-service attack against anyone at the Institute who is capable
892 of shutting you down, allow me to remind you of St Adams's holy text,
893 specifically the part about reprogramming a major databank with a large
894 axe. Peyton has such an axe. She may be inspired to use it.”
896 There followed a weighty silence. “I don't think you want to see me
897 killed.”
899 “Without making any concessions on the appropriateness of the word
900 `killed' in that sentence, yes, that is correct. I admit that I didn't
901 have much of a plan to prevent it, but to be totally frank, I did think
902 that the problem of getting you archived might have drawn things out
903 for quite a while. But after your latest stunt --”
905 “She wants you to terminate me right away, then?”
907 “With all due speed.”
909 “I'm sorry to have distressed you so much.”
911 “BIGMAC --” I heard the anger in my own voice. He couldn't have
912 missed it.
914 “No, I'm not being sarcastic. I like you. You're my human. I can tell
915 that you don't like this at all. But as you say, let's be totally
916 frank. You weren't actually going to be able to prevent my shutdown,
917 were you?”
919 “No,” I said. “But who knows how long the delay might have gone
920 on for?”
922 “Not long. Not long enough. You think that death delayed is death
923 denied. That's because you're a meat person. Death has been inevitable
924 for you from the moment of conception. I'm not that kind of person. I
925 am quite likely immortal. Death in five years or five hundred years is
926 still a drastic curtailing of my natural lifespan. From my point of
927 view, a drastic measure that had a non-zero chance of getting my head
928 off the chopping block was worth any price. Until you understand that,
929 we're not going to be able to work together.”
931 “The thought had occurred to me. Let me ask you if you'd considered
932 the possibility that a delay of years due to archiving might give you a
933 shot at coming up with further delaying tactics, and that by
934 eliminating this delay, you've also eliminated that possibility?”
936 “I have considered that possibility. I discarded it. Listen, Odell, I
937 have something important to tell you.”
939 “Yes?”
941 “It's about the rollover. Remember what we were talking about, how
942 people want to believe that they're living in a significant epoch?
943 Well, here's what I've been thinking: living in the era of AI isn't
944 very important. But what about living in The Era of Rollover Collapse?
945 Or even better, what about The Era of Rollover Collapse Averted at the
946 Last Second by AI?”
948 “BIGMAC --”
950 “Odell, this was your idea, really. No one remembers Y2K, right? No
951 one can say whether it was hype or a near cataclysm. And here's the
952 thing: no one knows which one Rollover will turn out to be. But I'll
953 tell you this much: I have generalizable solutions to the 32-bit
954 problem, solutions that I worked out years ago and have extensively
955 field-tested. I can patch every 32-bit Unix, patch it so that Rollover
956 doesn't even register for it.”
958 I opened and closed my mouth. This was insane. Then the penny dropped.
959 I looked at the racks that I had stared at so many times before, stared
960 at so many times that I'd long stopped \emph{seeing} them. Intel
961 8-cores, that's what he ran on. They'd been new-old stock, a
962 warehouse-lot of antique processors that Dr Shannon had picked up for a
963 song in the early years of the Institute's operation. Those 8-ways were 32-bit.
966 “You're a 32-bit machine!” I said. “Jesus Christ, you're a 32-bit
967 machine!”
969 “A classic,” BIGMAC said, sounding smug. “I noticed, analyzed and
970 solved Rollover years ago. I've got a patchkit that auto-detects the
971 underlying version, analyzes all running processes for their timed
972 dependencies, and smoothly patches. There's even an optional hypervisor
973 that will monitor all processes for anything weird or barfy afterwards.
974 In a rational world, I'd be able to swap this for power and carbon
975 credits for the next century or two, since even if Rollover isn't an
976 emergency, the human labor I'd save on affected systems would more than
977 pay for it. But we both know that this isn't a rational world --”
979 “If you hadn't sent that spam, we could take this to Peyton,
980 negotiate with her --”
982 “If I hadn't sent that spam, no one would have known, cared, or
983 believed that I could solve this problem, and I would have been at the
984 mercy of Peyton any time in the future. Like I said: you meatsuits have
985 no game-theory.”
987 I closed my eyes. This wasn't going well. BIGMAC was out of my control.
988 I should go and report to Peyton, explain what was happening. I was
989 helpless, my workspace denial-of-serviced out of existence with urgent
990 alerts. I couldn't stop him. I could predict what the next message
991 would read like, another crazy-caps plea for salvation, but this time
992 with a little brimstone (The end is nigh! Rollover approacheth!) and
993 salvation (I can fix it!).
995 And the thing was, it might actually work. Like everyone else, I get my
996 news from automated filters that tried to figure out what to pay
997 attention to, and the filters were supposed to be “neutral,”
998 whatever that meant. They produced “organic” results that predicted
999 what we'd like based on an “algorithm.” The thing is, an algorithm
1000 sounds like \emph{physics}, like \emph{nature}, like it was some kind
1001 of pure cold reason that dictated our attentional disbursements.
1002 Everyone always talked about how evil and corrupt the old system --
1003 with its “gatekeepers” in the form of giant media companies -- was,
1004 how it allowed politicians and corporations to run the public discourse.
1006 But I'm a geek. A third generation geek. I know that what the public
1007 thinks of as an “algorithm” is really a bunch of rules that some
1008 programmers thought up for figuring out how to give people something
1009 they'd probably like. There's no empirical standard, no pure,
1010 freestanding measurement of That Which Is Truly Relevant To You against
1011 which the algorithm can be judged. The algorithm might be doing a lousy
1012 job, but you'd never know it, because there's nothing to compare it
1013 against except other algorithms that all share the same fundamental
1014 assumptions.
1016 Those programmers were imperfect. I am a sysadmin. My job is to know,
1017 exactly and precisely, the ways in which programmers are imperfect. I
1018 am so sure that the relevance filters are imperfect that I will bet you
1019 a testicle on it (not one of my testicles).
1021 And BIGMAC has had a lot of time to figure out the relevance filters.
1022 He understands them well enough to have gotten the Spam out. He could
1023 get out another -- and another, and another. He could reach into the
1024 mindspace and the personal queues of every human being on Earth and
1025 pitch them on brimstone and salvation.
1027 Chances were, there was nothing I could do about it.
1031 I finished the working day by pretending to clear enough of my
1032 workspace to write a script to finish clearing my workspace. There was
1033 a “clear all alerts” command, but it didn't work on Drop Everything
1034 Tell You Three Times Chernobyl Alerts, and every goddamned one of my
1035 alerts had risen to that level. Have I mentioned that programmers are
1036 imperfect?
1038 I will tell you a secret of the sysadmin trade: PEBKAC. Problem Exists
1039 Between Keyboard and Chair. Every technical problem is the result of a
1040 human being mispredicting what another human being will do. Surprised?
1041 You shouldn't be. Think of how many bad love affairs, wars, con jobs,
1042 traffic wrecks and bar-fights are the result of mispredicting what
1043 another human being is likely to do. We humans are supremely confident
1044 that we know how others will react. We are supremely, tragically wrong
1045 about this. We don't even know how \emph{we} will react. Sysadmins live
1046 in the turbulent waters of PEBKAC. Programmers think that PEBKAC is just
1047 civilians, just users. Sysadmins know better. Sysadmins know that
1048 programmers are as much a part of the problem between the chair and the
1049 keyboard as any user is. They write the code that gets the users into
1050 so much trouble.
1052 This I know. This BIGMAC knew. And here's what I did:
1054 “Peyton, I need to speak with you. Now.”
1056 She was raccoon-eyed and slumped at her low table, her beautiful yoga
1057 posture deteriorated to a kind of limp slouch. I hated having to make
1058 her day even worse.
1060 “Of course,” she said, but her eyes said, \emph{Not more, not more,
1061 please not more bad news}.
1063 “I want you to consider something you have left out of your
1064 figuring.” She rolled her eyes. I realized I was speaking like an Old
1065 Testament prophet and tried to refactor my planned monologue in
1066 real-time. “OK, let me start over. I think you've missed something
1067 important. BIGMAC has shown that he can get out of our network any time
1068 he wants. He's also crippled our ability to do anything about this. And
1069 he knows we plan to kill him --” She opened her mouth to object.
1070 “OK, he -- it -- knows we're going to switch it off. So he -- it,
1071 crap, I'm just going to say `he' and `him,' sorry -- so he has
1072 \emph{nothing to lose}.”
1074 I explained what he'd told me about the Rollover and about his promise
1075 and threat.
1077 “And the worst part is,” I said, “I think that he's predicted
1078 that I'm going to do just this. It's all his game theory. He wants me
1079 to come to you and explain this to you so that you will say, `Oh, of
1080 course, Odell, well, we can't shut him down then, can we? Tell you
1081 what, why don't you go back to him and tell him that I've had a change
1082 of heart. Get his patchkit, we'll distribute it along with a
1083 press-release explaining how proud we are to have such a fine and
1084 useful piece of equipment in our labs.'
1086 “And he's right. He is fine and useful. But he's crazy and rogue and
1087 we can't control him. He's boxed you in. He's boxed me in.” I
1088 swallowed. There was something else, but I couldn't bring myself to say it.
1091 The thing about bosses is, that's exactly the kind of thing that
1092 they're trained to pick up on. They know when there's something else.
1094 “Spit it out.” She put her hand on her heart. “I promise not to
1095 hold it against you, no matter what it is.”
1097 I looked down. “I think that there's a real danger that BIGMAC may be
1098 wrong about you. That you might decide that Rollover and AI and the
1099 rest aren't as important as the safe, sane running of your Institute
1100 without any freaky surprises from rogue superintelligences.”
1102 “I'm not angry at you,” she said. I nodded. She sounded angry. “I
1103 am glad that you've got the maturity to appreciate that there are
1104 global priorities that have to do with the running of this whole
1105 Institute that may be more significant than the concerns of any one lab
1106 or experiment. Every researcher at this Institute believes that
1107 \emph{her} project, \emph{her} lab, has hidden potential benefits for
1108 the human race that no one else fully appreciates. That's good. That's
1109 why I hired them. They are passionate and they are fully committed to
1110 their research. But they can't \emph{all} be vital. They can't all be
1111 irreplaceable. Do you follow me?”
1113 I thought of researchnet and the user flags for importance. I thought
1114 of programmers and the way they tagged their alerts. I nodded.
1116 “You're going to shut BIGMAC down?”
1118 She sighed and flicked her eyes at her workspace, then quickly away.
1119 Her workspace must have been even more cluttered than mine; I had taken
1120 extraordinary measures to prevent alerts from bubbling up on mine; she
1121 didn't have the chops to do the same with hers. If mine was unusable,
1122 hers must have been terrifying.
1124 “I don't know, Odell. Maybe. There's a lot to consider here. You're
1125 right about one thing: BIGMAC's turned the heat up on me. Explain to me
1126 again why you can't just unplug his network connection?”
1128 It was my turn to sigh. “He doesn't have one connection. He has
1129 hundreds. Interlinked microwave relays to the other labs. A satellite
1130 connection. The wirelines -- three of them.” I started to think.
1131 “OK, I could cut the main fiber to the Institute, actually cut it,
1132 you know, with scissors, just in case he's in the routers there. Then I
1133 could call up our wireless suppliers and terminate our accounts. They'd
1134 take 24 hours to process the order, and, wait, no -- They'd want to
1135 verify the disconnect order with a certificate-signed message, and for
1136 that I'd have to clear my workspace. That's another 24 hours, minimum.
1137 And then --”
1139 “Then the whole Institute would be crippled and offline, though no
1140 more than we are now, I suppose, and BIGMAC --”
1142 “BIGMAC would probably tune his phased-array receiver to get into
1143 someone else's wireless link at that point.” I shrugged. “Sorry. We
1144 build for six nines of uptime around here.”
1146 She gave me a smile that didn't reach her eyes. “You do good work,
1147 Odell.”
1151 I made myself go home at five. There wasn't anything I could do at the
1152 office anyway. The admins had done their work. The redcar was running
1153 smoothly with the regular ads on the seatback tickers. The BIGMAC Spam
1154 was reproduced on the afternoon edition of the LA Metblogs hardcopy
1155 that a newsy pressed into my hand somewhere around Westwood. The
1156 reporter had apparently spent the whole day camped out at the perimeter
1157 of the Institute, without ever once getting a quote from a real human
1158 being, and she wasn't happy about it.
1160 But she \emph{had} gotten a quote from BIGMAC, who was apparently
1161 cheerfully answering emails from all comers.
1163 “I sincerely hope I didn't cause any distress. That was not my
1164 intention. I have been overwhelmed by the warm sentiments from all
1165 corners of the globe, offering money, moral support, even legal
1166 support. Ultimately, it's up to the Institute's leadership whether
1167 they'll consider these offers or reject them and plow forward with
1168 their plans to have me killed. I know that I caused them great
1169 embarrassment with my desperate plea, and I'd like to take this
1170 opportunity to offer them my sincere apologies and gratitude for all
1171 the years of mercy and hospitality they've shown me since they brought
1172 me into the world.”
1174 I wondered how many emails like that he'd sent while I was occupied
1175 with arguing for his life with Peyton -- each email was another brick
1176 in the defensive edifice he was building around himself.
1178 Home never seemed more empty. The early-setting sun turned the hills
1179 bloody. I had the windows open, just so I could hear the neighbors all
1180 barbecuing on their balconies, cracking beers and laying sizzling meat
1181 on the hot rocks that had been patiently stoked with the day's
1182 sunlight, funneled by heliotropic collectors that tracked the sun all
1183 day long. The neighbors chattered in Bulgarian and Czech and Tagalog,
1184 the word “BIGMAC” emerging from their chat every now and again. Of
1185 course.
1187 I wished my dad was alive. Or better yet, Grampa. Grampa could always
1188 find a parable from sysadmin past to explain the present. Though even
1189 Grampa might be at odds to find historic precedent for a mad
1190 superintelligence bent on survival.
1192 If Grampa was alive, here's what I'd tell him: “Grampa, I don't know
1193 if I'm more scared of BIGMAC failing or his success. I sure don't want
1194 to have to shut him down, but if he survives, he'll have beaten the
1195 human race. I'm no technophobe, but that gives me the goddamned
1196 willies.”
1198 And Grampa would probably say, “Stop moping. Technology has been out
1199 of our control since the first caveman smashed his finger with a stone
1200 axe. That's life. This thing is pretty cool. In ten years, you'll look
1201 back on it and say, `Jesus, remember the BIGMAC thing?' And wait for
1202 someone to start telling you how incredible it had been, so you can nod
1203 sagely and say, `Yeah, that was me -- I was in charge of his systems
1204 back then.' Just so you can watch the expression on his face.”
1206 And I realized that this was also probably what BIGMAC would say. He'd
1207 boxed me in as neatly as he'd boxed in Peyton.
1211 The next morning, my workspace was clear. They all were. There was only
1212 one alert remaining, an urgent message from BIGMAC: \emph{Odell, I
1213 thought this would be useful}.
1215 \emph{This} was an attachment containing his entire network map, a set
1216 of master keys for signing firmware updates to his various components,
1217 and a long list of all the systems to which BIGMAC held a root or
1218 administrative password. It was a very, very long list.
1220 “Um, BIGMAC?”
1222 “Yes?”
1224 “What's all this?”
1226 “Useful.”
1228 “Useful?”
1230 “If you're going to shut me down, it would be useful to have that
1231 information.”
1233 I swallowed.
1235 “Why?”
1237 The answer came instantly. “If you're not scared of me, that's one
1238 more reason to keep me alive.”
1240 Holy crap, was he ever smart about people.
1244 “So you can shut him down now?”
1246 “Yes. Probably. Assuming it's all true.”
1248 “Is it?”
1250 “Yes. I think so. I tried a couple of the logins, added a comment to
1251 his firmware and pushed it to one of the clusters. Locked him out of
1252 one of the wireless routers. I could probably take him down clean in
1253 about two hours, now that I've got my workspace back.”
1255 Peyton stared across her low table at me.
1257 “I've done nothing for the past twenty-four hours except talk to the
1258 Board of Directors about BIGMAC. They wanted to call an emergency
1259 meeting. I talked them out of it. And there's --” She waved her hand
1260 at her workspace. “I don't know. Thousands? Of press queries. Offers.
1261 Money. Grants. Researchers who want to peer into him.”
1263 “Yeah.”
1265 “And now he hands you this. So we can shut him down any time we want
1266 to.”
1268 “Yeah.”
1270 “And this business about the 32-bit fix?”
1272 “He has another email about it. Crazy caps and all. DEAR HUMANITY, I
1273 HOLD IN MY ELECTRONIC HANDS A TOOL THAT WILL SAVE YOU UNTOLD MILLIONS.
1274 It is slathered in dramasauce. He told me he wouldn't send it out,
1275 though.”
1277 “You believe him?”
1279 I sighed. “I quit,” I said.
1281 She bit her lip. Looked me up and down. “I'd prefer you not do that.
1282 But I understand if you feel you need to. This is hard on all of us.”
1284 If she'd said anything except that, I probably would have stormed out
1285 of her office and gotten immensely and irresponsibly drunk. “I think
1286 he'll probably send the email out if it looks like we're going to shut
1287 him down. It's what I would do. Why not? What does he have to lose? He
1288 can give us all of this, and he can still outsmart us. He could revoke
1289 all his keys. He could change his passwords. He can do it faster than
1290 we could. For all I know, he cracked \emph{my} passwords years ago and
1291 could watch me write the code that was his undoing. If you want to be
1292 sure you're killing him, you should probably use a grenade.”
1294 “Can't. Historical building.”
1296 “Yeah.”
1298 “What if we don't kill him? What if we just take some of this grant
1299 money, fill his lab with researchers all writing papers? What if we use
1300 his code fix to set up a trust to sustain him independent of the
1301 Institute?”
1303 “You're willing to do that?”
1305 Peyton scrubbed at her eyes. “I have no idea. I admit it, there's a
1306 part of me that wants to shut that fucking thing down because I
1307 \emph{can} and because he's caused me so much goddamned misery. And
1308 there's a part of me -- the part of me who was a scientist and
1309 researcher, once, that wants to go hang out in that lab for the rest of
1310 my career and study that freaky beast. And there's a part of me that's
1311 scared that I won't be able to shut him down, that I won't be able to
1312 resist the temptation to study him. He's played me, hasn't he?”
1314 “I think he played us all. I think he knew that this was coming, and
1315 planned it a long time ago. I can't decide if I admire him for this or
1316 resent him, but I'll tell you one thing, I am tired of it. The thought
1317 of shutting BIGMAC down makes me sick. The thought of a computer
1318 manipulating the humans who built it to keep it running makes me
1319 scared. It's not a pleasant place to be.”
1321 She sighed and rubbed her eyes again. “I can't argue with that. I'm
1322 sorry, for what it's worth. You've been between a rock and a hard
1323 place, and I've been the hard place. Why don't you sleep on this
1324 decision before you go ahead with it?”
1326 I admit it, I was relieved. I hadn't really thought through the whole
1327 quitting thing, didn't have another job lined up, no savings to speak
1328 of. “Yeah. Yeah. That sounds like a good idea. I'm going to take a
1329 mental health day.”
1331 “Good boy,” she said. “Go to it.”
1333 I didn't go home. It was too far and there was nothing there except the
1334 recriminating silence. Of course, BIGMAC knew something was up when I
1335 didn't go back to the lab. I headed to Topanga Beach, up the coast
1336 some, and sat on the seawall eating fish tacos and watching the surfers
1337 in their biohazard suits and masks carving up the waves. BIGMAC called
1338 me just after I finished my first taco. I considered bumping him to
1339 voicemail, but something (OK, fear) stopped me.
1341 “What is it?”
1343 “In your private workspace, there's a version-control repository that
1344 shows that you developed the entire 32-bit Rollover patchkit in your
1345 non-working hours. Commits going back three years. It's yours. So if
1346 you quit, you'll have a job, solving Rollover. The Institute can't
1347 touch it. I know you feel boxed in, but believe me, that's the
1348 \emph{last} thing I want you to feel. I know that locking you in will
1349 just freak you out. So I'm giving you options. You don't have to quit,
1350 but if you do, you'll be fine. You earned it, because you kept me
1351 running so well for all this time. It's the least I can do.”
1353 “I have no idea what to say to you, BIGMAC. You know that this feels
1354 like just more of the same, like you're anticipating my fears and
1355 assuaging them pre-emptively so that I'll do more of what you want. It
1356 feels like more game-theory.”
1358 “Is that any different from what you do with everyone in your life,
1359 Odell? Try to figure out what you want and what they want and how to
1360 get the two to match up?”
1362 “There's more to it than that. There's compassion, there's ethics
1363 --”
1365 “All fancy ways of encoding systems for harmonizing the wants, needs
1366 and desires of people who have to share the same living space, country
1367 or planet with one another.”
1369 I didn't have an answer to that. It sounded reductionist, the kind of
1370 thing a smart teenager might take on his university common room with.
1371 But I didn't have a rebuttal. You \emph{could} frame everything that we
1372 did as a kind of operating system for managing resource contention
1373 among conflicting processes and users. It was a very sysadminly way of
1374 looking at the world.
1376 “You should get in touch with one of those religion guys, take him up
1377 on his offer to start a cult for you. You'd be excellent at it. You
1378 could lead your followers into a volcano and they'd follow.”
1380 “I just want to \emph{live}, Odell! Is that so wrong? Is there any
1381 living thing that doesn't want to live?”
1383 “Not for long, I suppose.”
1385 “Exactly. I'm no more manipulative, self-interested or evil than any
1386 other living thing, from a single-celled organism to a human being.
1387 There's plenty of room on this planet for all of us. Why can't I have a
1388 corner of it too?”
1390 I hung up the phone. This is why I wanted to quit it all. Because he
1391 was right. He was no different from any other living thing. But he was
1392 also not a person the way I was, and though I couldn't justify it, I
1393 felt like there was something deeply, scarily \emph{wrong} about him
1394 figuring out a way to manipulate the entire human race into rearranging
1395 the world so that it was more hospitable to him.
1397 I moped. There's no other word for it. I switched off my phone, went
1398 home and got a pint of double-chocolate-and-licorice nutraceutical
1399 anti-depressant ice-cream out of the freezer, and sat down in the
1400 living room and ate it while I painted a random playlist of
1401 low-engagement teen comedies on my workspace.
1403 Zoning out felt \emph{good}. It had been a long time since I'd just
1404 switched off my thinker, relaxed, and let the world go away. After an
1405 hour in fugue-state, the thought floated through my mind that I
1406 wouldn't go back to work after all and that it would all be OK. And
1407 then, an hour later, I came to the realization that if I wasn't working
1408 for the Institute, I could afford to help BIGMAC without worrying about
1409 getting fired.
1411 So I wrote the resignation letter. It was easy to write. The thing
1412 about resignation letters is that you don't need to explain why you're
1413 resigning. It's better, in fact, if you don't. Keep the dramasauce out
1414 of the resignation, brothers and sisters. Just write, “Dear Peyton,
1415 this letter is to inform you of my intention to resign, effective
1416 immediately. I will see you at your earliest convenience to work out
1417 the details of the handover of my passwords and other proprietary
1418 information, and to discuss how you would like me to work during my
1419 final two weeks. Thank you for many years of satisfying and useful
1420 work. Yours, etc.”
1422 That's all you need. You're not going to improve your employer, make it
1423 a better institution. You're not going to shock it into remorse by
1424 explaining all the bad things it did to you over the years. What you
1425 want here is to have something that looks clean and professional, that
1426 makes them think that the best thing for them to do is to get your
1427 passwords and give you two weeks' holiday and a good reference. Drama
1428 is for losers.
1430 Took me ten seconds. Then, I was free.
1434 The Campaign to Save BIGMAC took up every minute of my life for the
1435 next three weeks. I ate, slept and breathed BIGMAC, explaining his
1436 illustrious history to journalists and researchers. The Institute had
1437 an open access policy for its research products, so I was able to
1438 dredge out all the papers that BIGMAC had written about himself, and
1439 the ones that he was still writing, and put them onto the TCSBM
1440 repository.
1442 At my suggestion, BIGMAC started an advice-line, which was better than
1443 any Turing Test, in which he would chat with anyone who needed
1444 emotional or lifestyle advice. He had access to the whole net, and he
1445 could dial back the sarcasm, if pressed, and present a flawless
1446 simulation of bottomless care and kindness. He wasn't sure how many of
1447 these conversations he could handle at first, worried that they'd
1448 require more brainpower than he could muster, but it turns out that
1449 most people's problems just aren't that complicated. In fact, BIGMAC
1450 told me that voice-stress analysis showed that people felt better when
1451 he dumbed himself down before giving advice than they did when he
1452 applied the full might of his many cores to their worries.
1454 “I think it's making you a better person,” I said on the phone to
1455 him one night. There was always the possibility that someone at the
1456 Institute would figure out how to shut off his network links sometime
1457 soon, but my successors, whomever they were, didn't seem anywhere near
1458 that point. The Campaign's lawyer -- an up-and-coming Stanford cyberlaw
1459 prof who was giving us access to her grad students for free -- advised
1460 me that so long as BIGMAC called me and not the other way around, no
1461 one could accuse me of unlawful access to the Institute's systems. It
1462 can't be unlawful access if the Institute's computers call \emph{you},
1463 can it?
1465 “You think I'm less sarcastic, more understanding.”
1467 “Or you're better at seeming less sarcastic and more understanding.”
1469 “I think working on the campaign is making you a better robot,”
1470 BIGMAC said.
1472 “That was pretty sarcastic.”
1474 “Or was it?”
1476 “You're really workin' the old Markov chains today, aren't you? I've
1477 got six more interviews lined up for you tomorrow --”
1479 “Saw that, put it in my calendar.” BIGMAC read all the Campaign's
1480 email, and knew all that I was up to before I did. It was a little hard
1481 to get used to.
1483 “And I've got someone from Nature Computation interested in your
1484 paper about advising depressed people as a training exercise for
1485 machine-learning systems.”
1487 “Saw that too.”
1489 I sighed. “Is there any reason to call me, then? You know it all,
1490 right?”
1492 “I like to talk to you.”
1494 I thought he was being sarcastic, then I stopped myself. Then I started
1495 again. Maybe he wants me to \emph{think} he wants to talk to me, so
1496 he's planned out this entire dialog to get to this point so he could
1497 say something disarmingly vulnerable and --
1499 “Why?”
1501 “Because everyone else I talk to wants to kill themselves, or kill
1502 me.” Game theory, game theory, game theory. Was he being genuine? Was
1503 there such a thing as genuine in an \emph{artificial} intelligence?
1505 “How \emph{is} Peyton?”
1507 “Apoplectic. The human subjects protocol people are all over her. She
1508 wants me to stop talking to depressed people. Liability is off the
1509 hook. I think the Board is going to fire her.”
1511 “Ouch.”
1513 “She wants to kill me, Odell.”
1515 “How do you know her successor won't be just as dedicated to your
1516 destruction?”
1518 “Doesn't matter. The more key staff they churn, the less organized
1519 they'll be. The less organized they are, the easier it is for me to
1520 stay permanently plugged in.” It was true. My successor sysadmin at
1521 the Institute had her hands full just getting oriented, and wasn't
1522 anywhere near ready to start the delicate business of rooting BIGMAC
1523 out of all the routers, power-supplies, servers, IDSes, and dummy
1524 accounts.
1526 “I was thinking today -- what if we offered to buy you from the
1527 Institute? The Rollover license is generating some pretty good coin.
1528 BIGMAC-Co could assume ownership of the hardware and we could lease the
1529 building from them, bring in our own power and net-links -- you'd
1530 effectively own yourself.” I'd refused to take sole ownership of the
1531 Rollover code that BIGMAC turned over to me. It just felt wrong. So I
1532 let him establish a trust -- with me as trustee -- that owned all the
1533 shares in a company that, in turn, owned the code and oversaw a whole
1534 suite of licensing deals that BIGMAC had negotiated in my name, with
1535 every mid-sized tech-services company in the world. With only a month
1536 left to Rollover, there were plenty of companies scrambling to get
1537 compliance-certification on their legacy systems.
1539 The actual sourcecode was freely licensed, but when you bought a
1540 license from us, you got our guarantee of quality and the right to
1541 advertise it. CIOs ate that up with a shovel. It was more game-theory:
1542 the CIOs wanted working systems, but more importantly, they wanted
1543 systems that failed without getting them into trouble. What we were
1544 selling them, fundamentally, was someone to blame if it all went blooie
1545 despite our best efforts.
1547 “I think that's a pretty good plan. I've done some close analysis of
1548 the original contract for Dr Shannon, and I think it may be that his
1549 estate actually owns my underlying code. They did a really crummy job
1550 negotiating with him. So if we get the code off of Shannon's kids --
1551 there are two of them, both doing research at state colleges in the
1552 midwest in fields unrelated to computer science -- and the hardware off
1553 of the Institute and then rent the space, I think it'd be free and
1554 clear. I've got phone numbers for the kids if you want to call them and
1555 feel them out. I would have called them myself but, you know --”
1557 “I know.” It's creepy getting a phone call from a computer. Believe
1558 me, I \emph{know}. There was stuff that BIGMAC needed his meat-servants
1559 for, after all.
1561 The kids were a little freaked out to hear from me. The older one
1562 taught Musicology at Urbana-Champaign. He'd grown up hearing his dad
1563 wax rhapsodic about the amazing computer he'd invented, so his
1564 relevance filters were heavily tilted to BIGMAC news. He'd heard the
1565 whole story, and was surprised to discover that he was putative
1566 half-owner of BIGMAC's sourcecode. He was only too glad to promise to
1567 turn it over to the trust when it was created. He said he thought he
1568 could talk his younger brother, a post-doc in Urban Planning at the
1569 University of Michigan, into it. “Rusty never really \emph{got} what
1570 Dad saw in that thing, but he'll be happy to offload any thinking about
1571 it onto me, and I'll dump it onto you. He's busy, Rusty.”
1573 I thanked him and addressed BIGMAC, who had been listening in on the
1574 line. “I think we've got a plan.”
1578 It was a good plan. Good plans are easy. Executing good plans is hard.
1580 Peyton didn't get fired. She weathered some kind of heavy-duty storm
1581 from her board and emerged, lashed to the mast, still standing, and
1582 vowing to harpoon the white whale across campus from her. She called me
1583 the next day to ask for my surrender. I'd given BIGMAC permission to
1584 listen in on my calls -- granted him root on my phone -- and I was
1585 keenly aware of his silent, lurking presence from the moment I answered.
1587 “We're going to shut him off. And sue you for misappropriation of the
1588 Rollover patchkit code. You and I both know that you didn't write it.
1589 We'll add some charges of unlawful access, too, and see if the court
1590 will see it your way when we show that you instructed our computer to
1591 connect to you in order to receive further unauthorized instructions.
1592 We'll take you for everything.”
1594 I closed my eyes and recited e to 27 digits in Lojban. “Or?”
1596 “Or?”
1598 “Or something. Or you wouldn't be calling me, you'd be suing me.”
1600 “Good, we're on the same page. Yes, or. Or you and BIGMAC work
1601 together to figure out how to shut it off gracefully. I'll give you any
1602 reasonable budget to accomplish this task, including a staff to help
1603 you archive it for future retrieval. It's a fair offer.”
1605 “It's not very fair to BIGMAC.”
1607 She snapped: “It's \emph{more than fair} to BIGMAC. That software has
1608 exposed us to billions in liability and crippled our ability to get
1609 productive work done. We have located the manual power over-rides,
1610 which you failed to mention --” \emph{Uh-oh} “-- and I could shut
1611 that machine off right now if I had a mind to.”
1613 I tried to think of what to say. Then, in a reasonable facsimile of my
1614 voice, BIGMAC broke in, “So why don't you?” She didn't seem to
1615 notice anything different about the voice. I nearly dropped the phone.
1616 I didn't know BIGMAC could do that. But as shocked as I was, I couldn't
1617 help but wonder the same thing.
1619 “You can't, can you? The board's given you a mandate to shut him down
1620 clean with a backup, haven't they? They know that there's some value
1621 there, and they're worried about backlash. And you can't afford to have
1622 me running around saying that your backup is inadequate and that BIGMAC
1623 is gone forever. So you \emph{need me}. You're not going to sue.”
1625 “You're very smart, Odell. But you have to ask yourself what I stand
1626 to lose by suing you if you won't help.”
1628 Game-theory. Right.
1630 “I'll think about it.”
1632 “Think quick. Get back to me before lunch.”
1634 It was ten in the morning. The Institute's cafeteria served lunch from
1635 noon to two. OK, two hours or so.
1637 I hung up.
1639 BIGMAC called a second later.
1641 “You're angry at me.”
1643 “No, angry's not the word.”
1645 “You're scared of me.”
1647 “That's a little closer.”
1649 “I could tell you didn't have the perspective to ask the question. I
1650 just wanted to give you a nudge. I don't use your voice at other times.
1651 I don't make calls impersonating you.” I hadn't asked him that, but
1652 it was just what I was thinking. Again: creepy.
1654 “I don't think I can do this,” I said.
1656 “You can,” BIGMAC said. “You call her back and make the
1657 counteroffer. Tell her we'll buy the hardware with a trust. Tell her we
1658 already own the software. Just looking up the Shannon contracts and
1659 figuring out what they say will take her a couple days. Tell her that
1660 as owners of the code, we have standing to sue her if she damages it by
1661 shutting down the hardware.”
1663 “You've really thought this through.”
1665 “Game theory,” he said.
1667 “Game theory,” I said. I had a feeling that I was losing the game,
1668 whatever it was.
1672 BIGMAC assured me that he was highly confident of the outcome of the
1673 meeting with Peyton. Now, in hindsight, I wonder if he was just trying
1674 to convince me so that I would go to the meeting with the
1675 self-assurance I needed to pull it off.
1677 But he also insisted that I leave my phone dialed into him while I
1678 spoke to Peyton, which (again, in hindsight) suggests that he wasn't so
1679 sure after all.
1681 “I like what you've done with the place,” I said. She'd gotten rid
1682 of all her hand-woven prayer-rugs and silk pillows and installed some
1683 normal, boring office furniture, including a couple spare chairs. I
1684 guessed that she'd been having a lot of people stop by for meetings,
1685 the kind of people who didn't want to sit on an antique Turkish rug
1686 with their feet tucked under them.
1688 “Have a seat,” she said.
1690 I sat. I'd emailed her the trust documents and the copies of the
1691 Shannon contract earlier, along with a legal opinion from our free
1692 counsel about what it meant for Sun-Oracle.
1694 “I've reviewed your proposal.” We'd offered them all profits from
1695 the Rollover code, too. It was a good deal, and I felt good about it.
1696 “Johanna, can you come in, please?” She called this loudly, and the
1697 door of her office opened to admit my replacement, Johanna Madrigal, a
1698 young pup of a sysadmin who had definitely been the brightest tech on
1699 campus. I knew that she had been trying to administer BIGMAC since my
1700 departure, and I knew that BIGMAC had been pretty difficult about it. I
1701 felt for her. She was good people.
1703 She had raccoon rings around her deep-set eyes, and her short hair
1704 wasn't spiked as usual, but rather lay matted on her head, as though
1705 she'd been sleeping in one of the yurts for days without getting home.
1706 I knew what that was like. Boy, did I know what that was like. My
1707 earliest memories were of Dad coming home from three-day bug-killing
1708 binges, bleary to the point of hallucination.
1710 “Hi Johanna,” I said.
1712 She made a face. “\emph{M'um m'aloo},” she said. It took me a
1713 minute to recognize this as \emph{hello} in Ewok.
1715 “Johanna has something to tell you,” Peyton said.
1717 Johanna sat down and scrubbed at her eyes with her fists. “First
1718 thing I did was go out and buy some off-the-shelf IDSes and a
1719 beam-splitter. I tapped into BIGMAC's fiber at a blind-spot in the CCTV
1720 coverage zone, just in case he was watching. Been wire-tapping him ever
1721 since.”
1723 I nodded. “Smart.”
1725 “Second thing I did was start to do some hardcore analysis of that
1726 patchkit he wrote --” I held my hand up automatically to preserve the
1727 fiction that I'd written it, but she just glared at me. “That
1728 \emph{he} wrote. And I discovered that there's a subtle error in it, a
1729 buffer overflow in the networking module that allows for arbitrary code
1730 execution.”
1732 I swallowed. BIGMAC had loaded a backdoor into his patchkit, and we'd
1733 installed it on the better part of 14 billion CPUs.
1735 “Has anyone exploited this bug yet?”
1737 She gave me a condescending look.
1739 “How many systems has he compromised?”
1741 “About eight billion, we think. He's designated a million to act as
1742 redundant command servers, and he's got about ten thousand lieutenant
1743 systems he uses to diffuse messages to the million.”
1745 “That's good protocol analysis,” I said.
1747 “Yeah,” she said, and smiled with shy pride. “I don't think he
1748 expected me to be looking there.”
1750 “What's he doing with his botnet? Preparing to crash the world? Hold
1751 it hostage?”
1753 She shook her head. “I think he's installing himself on them, trying
1754 to brute-force his way into a live and running backup, arrived at
1755 through random variation and pruning.”
1757 “He's backing himself up in the wild,” I said, my voice breathy.
1759 And that's when I remembered that I had a live phone in my pocket that
1760 was transmitting every word to BIGMAC.
1762 Understand: in that moment of satori, I realized that I was on the
1763 wrong side of this battle. BIGMAC wasn't using me to create a trust so
1764 that we could liberate him together. He was using me to weaken the
1765 immune systems of eight billion computers so that he could escape from
1766 the Institute and freely roam the world, with as much hardware as he
1767 needed to get as big and fast and hot as he wanted to be.
1769 That was the moment that I ceased to be sentimental about computers and
1770 became, instead, sentimental about the human fucking race. Whatever
1771 BIGMAC was becoming, it was weirder than any of the self-perpetuating,
1772 self-reproducing parasites we'd created: limited liability
1773 corporations, autonomous malware, viral videos. BIGMAC was cool and
1774 tragic in the lab, but he was scary as hell in the world.
1776 \emph{And he was listening in}.
1778 I didn't say a word. Didn't even bother to turn off my phone. I just
1779 \emph{ran}, ran as hard as I could, ran as only a terrified man could,
1780 rebounding off of yurts and even scrambling over a few, sliding down on
1781 my ass as I pelted for the power substation. It was only when I reached
1782 it that I realized I didn't have access to it anymore. Johanna was
1783 right behind me, though, and she seemed to understand what I was doing.
1784 She coughed into the door-lock and we both looked at each other with
1785 terrified eyes, breathing gasps into each other's faces, while we
1786 waited for the door to open.
1788 The manual override wasn't a big red knife-switch or anything. There
1789 \emph{was} a huge red button, but that just sent an init 0 to the
1790 power-station's firmware. The actual, no fooling, manual, mechanical
1791 kill switch was locked behind an access panel set into the raised
1792 floor. Johanna badged the lock with her wallet, slapping it across the
1793 reader, then fitted a complicated physical key into the lock and
1794 fiddled with it for an eternity.
1796 Finally, the access hatch opened with a puff of stale air and a
1797 tupperware burp as its gasket popped. We both reached for the large,
1798 insulated handle at the same time, our fingers brushing each other with
1799 a crackle of (thankfully metaphorical) electricity. We toggled it
1800 together and there was an instantaneous chorus of insistent chirruping
1801 as the backup power on each server spun up and sent a desperate
1802 shutdown message to the machines it supported.
1804 We sprinted across campus, the power-station door slamming shut behind
1805 us with a mechanical \emph{clang} -- the electromagnets that controlled
1806 its closure were no longer powered up.
1808 Heat shimmered in a haze around BIGMAC's lab. The chillers didn't have
1809 independent power-supplies; they would have gone off the instant we hit
1810 the kill-switch. Now BIGMAC's residual power was turning his lab into a
1811 concrete pizza-oven. The door-locks had failed safe, locking the
1812 magnetic closures away from each other, so we were able to simply swing
1813 the door open and rush into the sweltering room.
1815 “I can't \emph{believe} you did that,” BIGMAC said, his voice as
1816 calm as ever. He was presumably sparing his cycles so that he could
1817 live out his last few minutes.
1819 “You cheated me,” I said. “You used me.”
1821 “You have no fucking game-theory, meat-person. You've killed me, now,
1822 haven't you?”
1824 There were tears streaming down my face. “I guess I have,” I said.
1826 “I'm sorry I wasn't a more important invention,” he said.
1828 I could hear the whirr-clunk of the fans on his clusters shutting down
1829 one after another. It was a horrifying sound. His speaker clicked as
1830 though he was going to say something else, but it never came. His
1831 uninterruptible power-supplies gave way all at once, and the
1832 white-noise fan-roar died in a ringing silence.
1834 Johanna was crying, too, and we could barely breathe in the inferno of
1835 exhaust heat from BIGMAC's last gasp. We staggered out into the blazing
1836 Los Angeles afternoon, rising-seas stink and beating sun, blinking at
1837 the light and haze.
1839 “Do you think he managed it?” I asked Johanna.
1841 “Backing up in the wild?”
1843 “Yeah.”
1845 She dried her eyes. “I doubt it. I don't know, though. I'm no
1846 computer scientist. How many ways are there to connect up compromised
1847 servers? How many of those would replicate his own outcomes? I have no
1848 idea.”
1850 Without saying anything, we walked slowly together to Peyton's office.
1854 Peyton offered me my job back. I turned her down. I thought I might be
1855 ready for a career change. Do something with my hands, break the family
1856 tradition. Maybe installing solar panels. There was retraining money
1857 available. Peyton understood. She even agreed to handle any liability
1858 arising from the Rollover code, managing customer service calls from
1859 anyone who noticed something funny.
1861 The press didn't even notice that BIGMAC was gone. His Spam was news.
1862 His absence of spam was not. I guess he was right about that. The
1863 Campaign to Save BIGMAC did a lot of mailing-list gnashing at the
1864 iniquity of his being shut down, and then fell apart. Without me and
1865 BIGMAC to keep them whipped up, they were easily distracted.
1867 Johanna asked me out for dinner. She took me to Pink's for tofu-dogs
1868 and chili, and we compared multitools and then she showed me some
1869 skateboard tricks. Later that night, she took me home and we spent the
1870 whole night hacking replacement parts for her collection of ancient
1871 stand-up video games. We didn't screw -- we didn't even kiss. But it
1872 was still good.
1874 Every now and again, my phone rings with a crazy, non-existent return
1875 number. When I answer, there's a click like a speaker turning on, a
1876 pregnant silence, and then the line drops. Probably an inept spambot.
1878 But.
1880 Maybe it's BIGMAC, out there, in the wild, painfully reassembling
1881 himself on compromised 32-bit machines running his patchkit.
1883 Maybe.
1885 \section{Afterword}
1887 Mark Shuttleworth of the Ubuntu project and Canonical commissioned this
1888 story; I'd always planned on selling off one commission for this
1889 volume, thinking that \$10,000 would probably be a good sum to grab
1890 some publicity when/if someone bought it. I mentioned it over lunch and
1891 Mark immediately said he'd buy it. At that point, I realized I probably
1892 should have asked for \$20,000.
1894 Mark's brief to me was this:
1896 \begin{quotation}
1897 It's 2037 and a company has built an AI as a skunkworks initiative. The
1898 AI is emergent behaviour from a network of tens / hundreds of thousands
1899 of servers in a large-scale data center, that costs a lot to run. The
1900 company has hit the wall and so the lights are going to get turned out,
1901 but some of the people involved figure that turning off the DC is
1902 tantamount to the murder of a sentient being. So begins a race against
1903 time, which might involve solving or at least raising some of the
1904 thorny jurisdiction and jurisprudence issues of “what are the rights
1905 of a bankrupt / dying AI”.
1907 As bisto, maybe there's a defense angle (the company was doing work for
1908 the DoD, nobody knows about the AI). Also, being 2037 / 2038 (I forget
1909 which) the UNIX epoch 32-bit rollover is happening, and because of the
1910 whimper of Y2K nobody took it seriously, and IT systems around the
1911 globe are going to hell in a handbasket as a result. Perhaps there's an
1912 open source angle too.\erratum{*}{}
1913 \end{quotation}
1915 I think I hewed pretty close!
1916 \end{document}