
Chapter 8

Of all the Earthling’s crew, Raja Lon Flattery has been provided with the most accurate information, suitably weighted, of course. This was necessary because he had to be provided with a secret terminal in his quarters through which he can monitor the mood of ship and crew. A primary fuse has to be connected to the system, and Flattery is that fuse.

—Morgan Hempstead, Lectures at Moonbase

She had come into Com-central still feeling weak and disoriented. It was obvious that the shift of dominance had gone faster than expected, and she had forced herself to overcome her body’s weaknesses, putting on a mask of wellbeing and composure that she did not feel.

The ovoid Com-central room should not have confused her—she had put in too many hours of training among these dials and gauges and pipes and keyboard consoles before their departure—but the feeling of unfamiliarity persisted. Then, as awareness increased, she saw the subtle changes in connections and controls and readouts. Bickel’s handiwork.

All the changes were necessary to put the ship on manual, she realized, but she could feel the inadequacies of what had been done.

It was only then that she realized the thin edge they walked, and she turned her attention to Flattery who was finishing out his shift on the big board. The signs of strain were obvious in his movements—still exact with a surgeon’s sureness, but the control betrayed its thinning energy in the way he relaxed abruptly after each adjustment of the board.

He should be relieved now, she thought, but she knew she was not yet ready to have that green dial point down at her, and she was not sure of the conditions of Bickel and Timberlake.

Timberlake radiated glum silence.

Bickel had greeted her warmly enough, then handed her a load of programming. The task obviously pointed toward construction of an electronic multi-simulation model of their main computer’s core memory input/output.

Much of the programming remained to be completed. She lay back on her action couch, examined the test display of one series on the screen beside her. She felt the couch’s enfolding cocoon through the vacsuit, wished there were time to let her body recover fully from its dehyb ordeal.

The evidence was all around her, though, that she had to get to work. There was no time for the luxury of slow recuperation.

Okay, you’re so proud of your position and title … Prudence Lon Weygand, M.D., she told herself. You asked for this job. You know what you have to do; get with it …

The old self-lecture failed to rekindle her energies, and she steeled herself to hide all signs of weakness before speaking.

“Moonbase is taking longer to answer this time than it did before,” she said. “And I gave ’em some questions to answer.”

“They’re too busy trying to decide what our reply really means,” Bickel said.

“Or they could be figuring out how to tell us we’ve bitten off more’n we can chew,” Timberlake said.

She heard the fear in his voice. “Raj has been on that board over four hours. Isn’t it time somebody spelled him, Tim?”

Flattery knew what she was doing, but could not prevent the feeling of tension from gripping his spine. There was always the possibility Timberlake couldn’t take this.

Timberlake felt the dryness in his mouth. Naturally, she assumed he was giving orders here. He was the life-systems man. She had not volunteered to take the board, either … the bitch. But maybe it was too soon after dehyb. Metabolisms differed. She would know her own capacities, certainly. Besides, she was scheduled to follow Bickel on the board in the normal rotation.

His glance followed the Com-central track, the way the board circled around their positions. Bickel was in number one spot, then Prue, then Flattery—and he sat here on the end.

It’s my watch, Timberlake told himself.

He felt perspiration start in his palms.

Bickel had taken the board in his turn, obviously begrudging every minute away from his damned computations. He would not volunteer.

I’ve got to take that board, Timberlake told himself.

He thought of the more than three thousand lives immediately dependent on him when that green arrow slid over to his position … all the other lives and dreams that had been poured into this project.

Every bit of it pointing a finger at him.

I can’t! he thought.

He’s taking too long, Flattery thought. “I’ll give you the board on the count, Tim. I’m wearing pretty thin.”

Before Timberlake could protest, the count had started and his hand went automatically to the big red switch. Board and arrow came to him. Necessities of the job caught him immediately. Almost a third of the shield temperature control needed trimming to bring it into better balance.

We should trace out the OMC linkages for this and install automatics for the gross part of the job, he thought.

Presently, he fell into the routine of the watch.

“Here’s our operating procedure,” Bickel said. He looked up, caught an exchange of knowing glances between Flattery and Prue, hesitated. Something going on between those two? If it was man-woman problems, that could cause trouble.

“You were saying,” Prudence said.

Bickel saw she was staring directly at him. He cleared his throat, glanced at his figures and schematics for reassurance. “The computer must be the basis for anything we build, but we can’t interfere with core memory and switching controls. That means we have to use an electronic simulation model. Part of the AAT system …”

“What about communication with Moonbase?” Prudence asked.

That’s a stupid question, he thought, but he hid his irritation. “A switching system will automatically restore AA function when the reply burst hits our antennas. We’ll use an alarm klaxon.”

“Oh.” She nodded, wondering how far she could go before he realized he was being irritated purposely.

“This will be an operational model,” he said. “It’ll duplicate real characteristics of the total system, but won’t function as completely as the computer-based system. However it will give us direct observation of functions with conventional equipment. It’ll tell us where we have to go unconventional. The environment, the signals, and the system parameters can be observed and changed as development progresses. And we’ll only need a one-way, fused link with the computer to permit it to record all our results.”

This much was predictable, Flattery thought. But where does he go from here?

“We’ll generate an environment in scaled time and apply its own effect signals to the system under analysis,” Prudence said. “Good. What then?”

“Based on my experience with the UMB experiments,” Bickel answered, “I can tell you which avenues aren’t worth exploring and which avenues may give us an artificial consciousness. May do it. From here on in, it’s cut and try.”

“Are we going to have to fight the time lag and possibility of transmission errors while we let Moonbase analyze our progress?” Flattery asked.

Bickel glanced at his computations and schematics, looked back at Prudence. “Do we have a mathematician aboard competent enough to break down the embodied transducers of our results?”

Prudence looked across Bickel at the displays and stacks of schematics. She had followed enough of what he was doing there to combine that with the programming he had handed her, but it was the same old self-reflexive circle every time they faced this problem—where did the round of consciousness begin?

“Maybe I can handle the math,” she said. “And that’s all—just maybe.”

“Then which avenue do we explore first?” Flattery asked.

“The field-theory approach,” Bickel said.

“Oh, great!” Timberlake growled. “We’re going to assume that the whole is greater than the sum of its parts.”

“Okay,” Bickel said. “But just because we can’t see a thing or define it, that doesn’t mean it isn’t there and shouldn’t be added into the sum. We’re going to be juggling one hell of a lot of unknowns. The best approach to that kind of job is the engineering one: if it works, that’s the answer.”

“Define consciousness for me,” Prudence said.

“We’ll leave that up to the bigdomes at UMB,” Bickel said.

“And our only contact between the simulation model and the main computer will be through the loading channels?” Prudence asked. “What do we do about the supervisory control programs?”

“We’re not going to touch the inner communications lines to the computer,” Bickel said. “Our auxiliary will go into it through a one-way channel, fused against backlash.”

“Then it won’t give us total simulation,” she pointed out.

“That’s right,” Bickel agreed. “We’ll have an error coefficient to contend with all along the line. If it gets too high, we change our plan of attack. The simulator will be just an auxiliary—kind of dumb in some respects.”

“And there’s no way for this auxiliary to run wild?” Flattery asked.

“Its supervisory program will always be one of us,” Bickel said, fighting to keep irritation from his voice. “One of us will always be in the driver’s seat. We’ll drive it—like we’d drive an ox pulling a wagon.”

“This ox won’t have any ideas of its own, eh?” Flattery persisted.

“Not unless we solve the consciousness problem,” Bickel said.

Flattery pounced.

“And when it’s conscious, what then?” he asked.

Bickel blinked at him, absorbing this. Presently, he said, “I … suppose it’ll be like a newborn baby … in a sense.”

“What baby was ever born with all the information and stored experiences of this ship’s master computer?” Flattery demanded.

Bickel’s being fed this too fast, Prudence thought. If he’s kept too much off balance he may rebel or start to probe in the wrong places. He mustn’t guess.

“Well … the human is born with instincts,” Bickel said. “And we do train the human baby into … humanity.”

“I find the moral and religious aspects of this whole idea faintly repugnant,” Flattery declared flatly. “I think there’s sin here. If not hubris, then something equally evil.”

Prudence stared at him. Flattery betrayed signs of real agitation—a flush in his cheeks, fingers trembling, eyes bright and glaring.

That wasn’t in the program, she thought. Perhaps he’s tired.

“All right,” she said. “We construct a field of interacting impulses and that puts us right smack dab into a games theory problem where countless bits are—”

“Oh, no!” Bickel snapped. “The UMB stab at this thing got all fouled up with games-theory ideas like the ‘Command Constant’ and ‘Mobility Constant’ and inner-outer-directed behavior. It took me one hell of a long time to realize they didn’t know what they were talking about.”

“Easy for you to say,” Prudence said, holding her voice to a slow, cold beat. “You forget I saw the games machine they produced. The more it was used, the more it changed in—”

“Okay, it changed,” Bickel admitted. “The machine absorbed part of its … personality from its opponents. What’s that mean? It had some of the characteristics of consciousness, sure—but it wasn’t conscious.”

She turned away, conveying a sneer by the movement alone. He has to think he can rely on no one but himself.

Flattery shifted his attention from Bickel to Prudence and back. He found it increasingly difficult to hide his resentment of Bickel.

Psychiatrist, heal thyself, he thought. Bickel has to take charge. I’m just the safety fuse.

Flattery glanced at the false plate on his personal repeater board, thinking of the trigger beneath that plate and the mate to it in his quarters concealed by the lines of the sacred graphic on the bulkhead.

Arbitrary turn-back command, Flattery reminded himself. That was the code signal he must listen for from UMB. That was the signal he must obey—unless he judged the ship had to be destroyed before receiving that signal.

A simple push on one of the hidden triggers would activate the master program in the ship’s computer, open airlocks, set off explosive charges. Death and destruction for crew, ship, all the colonists and their supplies.

Colonists and their supplies! Flattery thought.

He was too good a psychiatrist not to recognize the guilt motives behind the careful provisioning of this ship.

“If you solve the Artificial Consciousness problem, you can plant a human colony somewhere in space. Not at Tau Ceti, of course, but …”

And he was too good a diviner not to penetrate the religious hokum, not to see through to the essential rightness of his role in the project.

Given the known perils, there had to be a safety fuse. There had to be someone willing and able to blow up the ship.

Flattery knew the reasons. They were reality of the most brutal kind.

The first crude attempts at mechanical reproduction of consciousness had been made on an island in Puget Sound. The island no longer existed. “Rogue consciousness!” they had screamed. True enough. Something had defied natural laws, slaughtered lab personnel, destroyed sensors, sent slashing beams of pure destruction through the surrounding countryside.

Finally, it had taken the island—God knew where.


No island.

No lab personnel.

Nothing but gray water and a cold north wind whipping whitecaps across it and the fish and the seaweed invading the area where land and men and machinery had been.

Just thinking about it made Flattery shiver. He conjured up in his mind the image of the sacred graphic from his quarters, absorbed some of the peace from the field of serenity, the tranquility of the holy faces.

Even Moonbase didn’t walk too close to this project now. It was all a sham to educate ship personnel, to frustrate the eager young men and women.

“Each project ship must maintain its coefficient of frustration,” went the private admonition. “Frustration must come from both human and mechanical sources.”

They thought of frustration as a threshold, a factor to heighten awareness.

It made a weird kind of sense.

Thus, there were crew members like Flattery … and Prudence Lon Weygand, and machinery that broke down, robox repair units that had to have a human monitor every second—and programmed emergencies to complicate real emergencies.
