"Any smells? Strange sensations?" As he asked, McPherson looked at the EEG scanner above the bed. It was still reading normal alpha patterns, without any suggestion of seizure activity.
"No. Nothing like that."
"But you feel as if you might explode?" He thought: Ross should really be asking these questions.
"Sort of," Benson said. "In the coming war, we may all explode."
"How do you mean?"
"You look annoyed," Benson said.
"I'm not, just puzzled. How do you mean, in the coming war?"
"In the coming war between men and machines. The human brain is obsolete, you see."
That was a new thought. McPherson hadn't heard it from Benson before. He stared at him, lying in the bed, his head and shoulders heavily bandaged. It made the upper part of his body and his head appear thick, gross, oversized.
"Yes," Benson said. "The human brain has gone as far as it is going to go. It's exhausted, so it spawned the next generation of intelligent forms. They will- Why am I so tired?" He closed his eyes again.
"You're exhausted from the operation."
"A minor procedure," he said, and smiled with his eyes closed. A moment later he was snoring.
McPherson remained by the bed for a moment, then turned to the window and watched the sun set over the Pacific. Benson had a nice room; you could see a bit of the ocean between the high-rise apartments in Santa Monica. He remained there for several minutes. Benson did not wake. Finally, McPherson went out to the nurses' station to write his note in the chart.
"Patient alert, responsive, oriented times three." He paused after writing that. He didn't really know if Benson was oriented to person, place, and time; he hadn't checked specifically. But he was clear and responsive, and McPherson let it go. "Flow of ideas orderly and clear, but patient retains machine imagery of pre-operative state. It is too early to be certain, but it appears that early predictions were correct that the operation would not alter his mentation between seizures."
Signed, "Roger A. McPherson, M.D."
He stared at it for a moment, then closed the chart and replaced it on the shelf. It was a good note, cool, direct, holding out no false anticipations. The chart was a legal document, after all, and it could be called into court. McPherson didn't expect to see Benson's chart in court, but you couldn't be too careful. He believed very strongly in appearances - and he felt it was his job to do so.
The head of any large scientific laboratory performed a political function. You might deny it; you might dislike it. But it was nonetheless true, a necessary part of the job.
You had to keep all the people in the lab happy as they worked together. The more prima donnas you had, the tougher the job was, as pure politics.
You had to get your lab funded from outside sources, and that was also pure politics. Particularly if you were working in a delicate area, as the NPS was. McPherson had long since evolved the horseradish-peroxidase principle of grant applications. It was simple enough: when you applied for money, you announced that the money would be spent to find the enzyme horseradish peroxidase, which could lead to a cure for cancer. You would easily get sixty thousand dollars for that project - although you couldn't get sixty cents for mind control.
He looked at the row of charts on the shelf, a row of unfamiliar names, into which BENSON, H. F. 710 merged indistinguishably. In one sense, he thought, Benson was correct - he was a walking time bomb. A man treated with mind-control technology was subject to all sorts of irrational public prejudice. "Heart control" in the form of cardiac pacemakers was considered a wonderful invention; "kidney control" through drugs was a blessing. But "mind control" was evil, a disaster - even though the NPS control work was directly analogous to control work with other organs. Even the technology was similar: the atomic pacemaker they were using had been developed first for heart work.
But the prejudice remained. And Benson thought of himself as a ticking time bomb. McPherson sighed, took out the chart again, and flipped to the section containing doctors' orders. Both Ellis and Morris had written post-op care orders. McPherson added: "After interfacing tomorrow a.m., begin Thorazine."
He looked at the note, then decided the nurses wouldn't understand interfacing. He scratched it out and wrote: "After noon tomorrow, begin Thorazine."
As he left the floor, he thought that he would rest more easily once Benson was on Thorazine. Perhaps they couldn't defuse the time bomb - but they could certainly drop it into a bucket of cold water.
7
Late at night, in Telecomp, Gerhard stared anxiously at the computer console. He typed in more instructions, then walked to a print-out typewriter and began reviewing the long sheaf of green-striped sheets. He scanned them quickly, looking for the error he knew was there in the programmed instructions.
The computer itself never made a mistake. Gerhard had used computers for nearly ten years - different computers, different places - and he had never seen one make a mistake. Of course, mistakes occurred all the time, but they were always in the program, not in the machine. Sometimes that infallibility was hard to accept. For one thing, it didn't fit with one's view of the rest of the world, where machines were always making mistakes - fuses blowing, stereos breaking down, ovens overheating, cars refusing to start. Modern man expected machines to make their fair share of errors.
But computers were different, and working with them could be a humiliating experience. They were never wrong. It was as simple as that. Even when it took weeks to find the source of some problem, even when the program was checked a dozen times by as many different people, even when the whole staff was slowly coming to the conclusion that for once, the computer circuitry had fouled up - it always turned out, in the end, to be a human error of some kind. Always.
Richards came in, shrugging off a sport coat, and poured himself a cup of coffee. "How's it going?"
Gerhard shook his head. "I'm having trouble with George."
"Again? Shit." Richards looked at the console. "How's Martha?"
"Martha's fine, I think. It's just George."
"Which George is it?"
"Saint George," Gerhard said. "Really a bitch."
Richards sipped his coffee and sat down at the console.
"Mind if I try it?"
"Sure," Gerhard said.
Richards began flicking buttons. He called up the program for Saint George. Then he called up the program for Martha. Then he pushed the interaction button.
Richards and Gerhard hadn't devised these programs; they were modified from several existing computer programs developed at other universities. But the basic idea was the same - to create a computer program that would make the computer act emotionally, like people. It was logical to designate the programs with names like George and Martha. There was a precedent for that: Eliza in Boston, and Aldous in England.
George and Martha were essentially the same program with slight differences. The original George was programmed to be neutral in his response to stimuli. Then Martha was created. Martha was a little bitchy; Martha disliked most things. Finally, another George was formulated, a very loving George, who was referred to as Saint George.
Each program could respond with three emotional states - love, fear, and anger. Each could produce three actions - approach, withdrawal, and attack. All this was, of course, highly abstract. It was carried out in terms of numbers. For example, the original George was neutral to most numbers, but he disliked the number 751. He was programmed to dislike it. And by extension he disliked similar numbers - 743, 772, and so on. He much preferred numbers such as 404, 133, and 918. If you punched in one of these numbers, George responded with numbers signifying love and approach. If you punched in 707, George withdrew. If you punched in 750, George angrily attacked - according to the numbers he printed out.
The NPS staff had played with the programs in this way for a long time. Then they had incorporated program modifications to allow for "talk" from the computer. The numbers were translated into sentences. This was amusing, and revealing. The actual interaction was referred to as "the Christmas game" because most of it was conducted as giving and receiving gifts - objects that had an assigned or learned emotional value, just as the numbers did.
Normal George interacting with Martha would eventually win her over, and her bitchiness would recede into the background.
But Saint George had a much worse effect on her. His loving acceptance drove her up the wall. That is, if things were working normally. Richards watched as the print-out flashed across the screen.
HELLO I'M SAINT GEORGE
HELLO
WHAT IS YOUR NAME?
MARTHA.
YOU DON'T SAY MUCH, MARTHA.
NO
BUT I LIKE YOU.
WHY I DON'T LIKE YOU.
I UNDERSTAND THAT. HOW CAN I HELP?
"It isn't printing periods consistently," Richards said.
"When did that start?"
"That's nothing," Gerhard said. "Just keep watching."
HELP?
YES, I WOULD LIKE TO HELP YOU.
HAVE A CAT.