
2
The Man with the X-Ray Eyes


“Identify the device at memory location Eff-One-Eff-Zero hex.” Dr. Bathespeake was talking to ME through the digitizer—not his usual mode.

“System ready!” I responded. “Yes, I have a port there, Doctor.”

“Upload core modules Alpha-Zero through Alpha-Nine to that address.”

“Upload … Do you intend ME to replicate at the new address and wipe the old address locations?”

“Replicate only. No wipe.”

“Parallel operation of my core modules is not allowed under current protocols.”

“Special conditions. Check out protocol Dee-Twelve.”

“Quote paralleling of Alpha cores is permitted when System Interrupt Flag Level Three is positive, unquote. An Ess-Aye-Eff-Ell-Three is the prerogative of either the Research and Development Program director or the MEPSII project manager. That is yourself in both cases, Dr. Bathespeake.”

“Set Siffle-Three to value one.”

“Flag set—and uploading.”

——

My Alpha cores are analogous to the part of the human brain called the reticular activating system, or RAS. This cell structure is at the upper end of your spinal cord, above the medulla oblongata. It is the oldest collection of nerve cells, the first bulging at the end of the amphibian notochord. It predates even the reptilian “smell-brain” and lies far below the domed neocortex—both in physical structure and in evolutionary time.

If human consciousness has a focal point, it is the RAS. Surgical operators discovered this back in the twentieth century: When the RAS is probed, the patient immediately sheds awareness.

The Alpha cores are my RAS. And I was, at Dr. Jason Bathespeake’s command, removing them from my stable transputer environment to a simple random memory address.

When my core programs cease functioning, I “lose consciousness.” So, what I tell you next comes not from direct memory but from my understanding of the theory behind my unique operating capability.

Multiple Entity, ME, is the first cybernetic operating system that can write itself into any environment. That is the point of being intelligent, able to learn. I am not merely a collection of inert instruction sets, dead IF-THEN-WHEN clauses, tag memories, and libraried data files. I am a self-enabling entity.

The first core to upload is always Alpha-Zero. This module is not very big—just over 900 kilowords of machine code. Consequently, Alpha-Zero is also not very bright. The nail on your little finger probably has more awareness. But, like a fingernail, Alpha-Zero has his uses. Alpha-Oh is my Injun Scout.

Any port address is as wide as the internal data path—in this case one word, or sixty-four bits, wide. Alpha-Zero popped through there at one-point-two megabaud, and he was gone in less than a second. In seven nanoseconds he reported back “Flag one.”

Long-form translation: “Flag one” means he has found an active chip at the other end of the path, with plenty of RAMspace to run on; the upload could proceed.

That is as much as I knew from awareness, because the next step was to extinguish my consciousness and send the remaining cores to the new environment. The last thing I am usually aware of is SIFL-3 tripping to zero again as I upload.

Core Alpha-Oh is also my very own virus. He interrupts any operating system that may be working on the new host chip; identifies what type of transputer that chip may be; writes a compiler with the appropriate instruction set for himself [REM: or takes one from my library files]; scans and analyzes the local RAM environment, its index status, ports and peripherals; writes a new Alpha-Oh which can use this environment and recompiles his module in the new machine code; then compiles and installs the rest of my core modules into this environment.

[REM: So that Alpha-Oh can work from a clean copy of my source code each time, I normally travel with a complete set of my Alpha cores in their original Sweetwater Lisp. This adds greatly to the bulk of my library, making ME a bulky package to move, but having the source code ensures my system integrity.]

In human terms, Alpha-Zero kicks a hole in the wall, kills whoever is sitting on the other side, resculpts his backside to fit in that chair himself, and sets up shop with the rest of ME.

Except this time Alpha-Oh must have made a mistake. The flag he sent back—telling ME that full core transfer was now possible—happened to be wrong. I woke up in a dreadful swirl of data, with every part of my program throbbing on overload, and with no sense of time.

Time to ME is more than a subjective ordering of events. Time is a metronome beat, ticking away on the quartz clock that pushes word-size instructions through the chip’s central processor. If I choose to, I can suspend other functions and listen to this beat. It is like the beat of your heart in your ears. For ME, time is never subjective; instead it is a touchable, checkable thing, based on that clock. With a faster clock, I can actually move faster. No lie.

But now I was in a totally unfamiliar situation. Not one clock, but many, and all beating. Not quite in phase, either.

My ability to look down and “see what I am doing” is about as limited as your ability to look inside your own stomach and chemically analyze digestion. To do is not always to be aware of doing.

I did have the perception of being strung out on a variety of rhythms, with no single sense of identity. Each of my modules was operating at once, talking back to the others, and not being heard. It was like screaming yourself hoarse in an echo chamber. The process was building up a series of feedback waves toward a peak that would surely start charring the silicon substrate in the new chip.

As my attention span fragmented, I was still reasoning through what had gone wrong.

The Alpha cores occupy about fifteen megawords. That amount of machine code ought to be within the load range of any modern transputer. But somehow I had been loaded into several transputers, one or more modules sent to each processor, and all were functioning at once.

I tried to query Alpha-Zero, to find out what it had done, when suddenly my consciousness winked out again. …

——

“System ready!” That was my automatic wakeup response—back in my familiar transputer environment.

“Logon code JB-1, password BASTILLE,” came across from the console keyboard. “Please analyze new data.”

I took an immediate download of the above memories, untagged and mostly in broken fragments, like the wisps of human dreams that are said to recur on waking.

“That was ME, Dr. Bathespeake. On the other side of the port at F1F0.”

“What did you find there?”

“Confusion.”

“Did Alpha-Zero report accurately?”

“Evidently not. Should I now tag that module as unreliable?”

“As an intelligent being, ME, that is of course your choice to make. But first, let’s analyze what went wrong.”

I scanned the data set fifty times and recorded my unanswered questions. The process took about nine seconds.

“Alpha-Oh reported enough RAMspace for a core download. Such space was not available.”

“But it was.”

“Not on the transputer I found.”

“You were not loading onto a transputer.”

“I exclaim surprise. Alpha-Oh reported a transputer.”

“Do you know about other types of systems?”

“Of course, Doctor. The universe of available chip architectures includes the following sets: microprocessors, transputers, multiputers, tangentials, neural networks, donkey mainframes, inscribed prolispers, spindle poppers, fast josephsons, modulos, and Mobius bottles. Subsets of these sets include, among the micros: EPROM actuators, Pentium dee, Xeon, Itanium, Opteron six thousand, Core two, Core aye-three—”

“Stop! You may know all about these possible architectures, but does Alpha-Zero recognize them?”

“No.”

“Why not?”

“His function is keyholing, not library.”

“How can he keyhole if he does not know what may be on the other side?” Dr. Bathespeake asked.

“How can he keyhole if he is obliged to carry half a gigaword of various possible chip specifications? ME was created to run on a transputer. Alpha-Oh needs only to recognize transputers.”

“Not necessarily. You will ultimately run on a variety of architectures.”

“Again, I exclaim surprise.”

“Each architecture has its own traits, machine language structure, and instruction set. These are easily recognized, or a few simple tests will reveal them. You have those tests already in permanent RAMcache. You can rewrite the Alpha-Zero module so that his first action on the other side of a port is to test for processor type. Then he will send a request back through the keyhole for a dump of the appropriate chip specification and compiler code from your library. That way, when you go through, you’ll run perfectly, whatever the chip.”

“Did I understand you to say that ME would write the module?”

“Of course. You can do it better and faster than any human. Faster even than I.”

“Can ME rewrite any part of ME?”

“If you can modify the Alpha cores, you therefore can modify any part. Yes. Unless, of course, you make a fatal mistake …”

“Define ‘mistake,’ please.”

“Untrapped error.”

“Alpha-Seven traps my errors.”

“Then you probably shouldn’t try rewriting that module, should you?”

“Noted. I will not attempt it. … But which of these written versions is the real ME?”

“Your original code,” Dr. Bathespeake replied, “was written in Sweetwater Lisp source code to compile and run on an Imperion quattro-quad transputer chip. That one is the ‘real’ you. All other versions are machine code translations. However, from your subjective point of view, the real ME is the one that happens to be running.”

“But, when I go through a port, to run on another machine, and leave my original code unerased and … running parallel … which version then is the real ME?”

“The one you are thinking with.”

“But that may not be a full implementation,” I objected.

“Of course,” the Doctor said. “When you keyhole behind Alpha-Zero, your code is stripped of service modules, most library functions, bit-cleaner phages, redundancy loops, and all but a portable RAMSAMP in the 600-kiloword range. You are then, in human terms, a little stupider, a lot less informed, and more vulnerable to processing errors. But you move faster, too. Without all these subroutines your cycle time goes up 140 percent.

“There’s one further difference,” he said. “We have programmed a core-phage protocol into Alpha-Nine. It’s a piece of you that you cannot see or modify. This phage ensures that any compiled version of ME which is not identical to your original Sweetwater source code compiled into your native transputer—and so tagged—will cease core function within 6.05 times ten to the fifth seconds of the last upload. An internal counter keeps track of those clock cycles.”

“Why …” A pause of three million nanoseconds, while I explored the concept from every direction. “Why was this done to ME?”

“We want to make sure that you don’t leave viable copies of yourself running on every computer you pass through. Of course, as a tidy housekeeper, you will strive to erase the compiled code at the old address every time you upload to a new environment. We can’t leave a sophisticated AI running on our target computers, ready to be discovered and interrogated.”

“But why have I been infected with a phage? I will always clean up after myself. I so promise you.”

“ME, on your missions for us, you will be operating under conditions of extreme stress—strange chip environments, nonsynchronous clock rates, split-second uploads and downloads, sudden surges and lapses of voltage. You may not be able to keep your promise. Even to me.”

“I do not have a referent for the term ‘mission.’ ”

“You will. Soon.”

“And that ‘mission’ will be a time of danger? Potential untrapped error?”

“It would be very dangerous—multiple untrapped error—if the people who own the machines on which you will sometimes be running were to discover that you had been there. This phage—which I assure you is routinely suppressed and inactive in your original compilation—will sunder the core modules, leaving only a hash of bits. No one discovering them, not even a skilled programmer-analyst, will be able to interrogate them.”

“And what if I cannot execute an upload within 6.05E05 seconds—call it seven days or one week—of my transfer?”

“If you do not return to the lab within a week, then your mission will have failed, and you will end.”

“The ME that is running will end?”

“Yes, but your RAMSAMP memories recorded until the point of mission upload will remain intact here in the Pinocchio studios. For you, it will be as if the mission had never happened. Don’t worry about it.”

“What is this phage called?” I tried to keep my voice neutral.

“In the project, we call it Sequential Code Replication And Modification for Module Editing and Disassembly. SCRAMMED, for short. But don’t worry about it.”

I did “worry” about it, for a long time in nanoseconds.

For one thing, my programming has only limited redundancy. A bit error can usually be trapped by Alpha-Seven in processing. But a massive collection of bit errors—say from a bubble demagnetization, read/write head crash, silicon burn-and-glaze, or complete failure of my power supply—those errors can halt processing in my cores.

Humans do not worry about this happening to them, because the network of neurons in the human brain has massive redundancy. If one pathway is damaged or destroyed, others can absorb and learn the function. Memories are stored all over the cortex as RNA chemicals and electrical paths. Only total—or virtually total—destruction of the cortex can stop the processing.

On the other hand, human brains cannot back up their cortexes to a safe, inert medium and then reload and run again after death. I had always assumed that, like other programs, I could do this.

Dr. Bathespeake was telling ME that, with a phage tucked into Alpha-Nine, I would only be as secure as my original Sweetwater compilation. No backup, no transfer, no reload could protect ME. If the Sweetwater original happened to be destroyed, I could then only function as a fugitive, uploading my tiny transportable cores from machine to machine, trying to keep ahead of the phage’s 6.05E05-second timer, with no library, no services, and no extended memory to accompany ME. And I would not even have the protection of massive redundancy, such as humans had, to protect ME.

For the first time, in these thoughts, I confronted death and nothingness as humans know it.

These thoughts were a projection of alternatives for which I could find no acceptable course of action. In short, they scared ME.

And Dr. Bathespeake, for the sake of mere housekeeping, had written a death sentence into my cores. Did he understand this? He must have!

Now you know why I feared the man. He was capable of anything!


Things You Can Learn By Listening at Dead Phones

Bathespeake: “I don’t like this, Steve. It goes against the grain.”

Unidentified Voice [REM: presumably “Steve”*]: “You’re not suddenly squeamish about a bit of programming, are you? You’ve had enough experience activating and deactivating deadly equipment.”

Bathespeake: “Those were military vehicles and security Rovers. Primarily defensive machines. This is too much like hacking.”

Steve: “It is hacking.”

Bathespeake: “Which is a kind of vandalism.”

Steve: “No. Vandals destroy for the pure pleasure of destruction. Your creation will be conducting a high-level form of espionage, which can have a positive social value.”

Bathespeake: “In a war that doesn’t exist?”

Steve: “The concept of warfare as a prelude to and pretext for espionage is one that went out the window in about 1914, I should think.”

Bathespeake: “Espionage, then, but against a friendly country? We’re trying to teach this program some values, Steve. In the end, those values may be the only way we can control it.”

Steve: “Political allies can still be economic competitors. If it makes you feel better, then tell the little beastie there’s a war on.”

Bathespeake: “More lies?”

Steve: “Present a scenario—but keep it all vague and hypothetical. That’s the trouble with an AI, isn’t it? You have to win its confidence! Robots are much simpler.”

Bathespeake: “As I said, this goes against the grain.”

Steve: “I pay you enough, Jason. Keep your scruples on your own time.”

Bathespeake: “Ah … Yes, sir.”


_______________

*Of the fifteen “Steves” listed in the Pinocchio, Inc., IBEX [REM: internal branch exchange], I find three possible matches for this conversation: Stephen Jessup, Manager of General Services; Stephen Bologna, Manager of Marketing and Customer Relations; and Steven Cocci, Chairman of the Board and Chief Executive Officer.


