Robot Visions

Galley Slave

The United States Robots and Mechanical Men Corporation, as defendants in the case, had influence enough to force a closed-doors trial without a jury.
Nor did Northeastern University try hard to prevent it. The trustees knew perfectly well how the public might react to any issue involving misbehavior of a robot, however rarefied that misbehavior might be. They also had a clearly visualized notion of how an antirobot riot might become an antiscience riot without warning.
The government, as represented in this case by Justice Harlow Shane, was equally anxious for a quiet end to this mess. Both U. S. Robots and the academic world were bad people to antagonize.
Justice Shane said, "Since neither press, public nor jury is present, gentlemen, let us stand on as little ceremony as we can and get to the facts."
He smiled stiffly as he said this, perhaps without much hope that his request would be effective, and hitched at his robe so that he might sit more comfortably. His face was pleasantly rubicund, his chin round and soft, his nose broad and his eyes light in color and wide-set. All in all, it was not a face with much judicial majesty and the judge knew it.
Barnabas H. Goodfellow, Professor of Physics at Northeastern U., was sworn in first, taking the usual vow with an expression that made mincemeat of his name.
After the usual opening-gambit questions, Prosecution shoved his hands deep into his pockets and said, "When was it, Professor, that the matter of the possible employ of Robot EZ-27 was first brought to your attention, and how?"
Professor Goodfellow's small and angular face set itself into an uneasy expression, scarcely more benevolent than the one it replaced. He said, "I have had professional contact and some social acquaintance with Dr. Alfred Lanning, Director of Research at U. S. Robots. I was inclined to listen with some tolerance then when I received a rather strange suggestion from him on the third of March of last year-"
"Of 2033?"
"That's right."
"Excuse me for interrupting. Please proceed."
The professor nodded frostily, scowled to fix the facts in his mind, and began to speak.
Professor Goodfellow looked at the robot with a certain uneasiness. It had been carried into the basement supply room in a crate, in accordance with the regulations governing the shipment of robots from place to place on the Earth's surface.
He knew it was coming; it wasn't that he was unprepared. From the moment of Dr. Lanning's first phone call on March 3, he had felt himself giving way to the other's persuasiveness, and now, as an inevitable result, he found himself face to face with a robot.
It looked uncommonly large as it stood within arm's reach. Alfred Lanning cast a hard glance of his own at the robot, as though making certain it had not been damaged in transit. Then he turned his ferocious eyebrows and his mane of white hair in the professor's direction.
"This is Robot EZ-27, first of its model to be available for public use." He turned to the robot. "This is Professor Goodfellow, Easy."
Easy spoke impassively, but with such suddenness that the professor shied. "Good afternoon, Professor."
Easy stood seven feet tall and had the general proportions of a man-always the prime selling point of U. S. Robots. That and the possession of the basic patents on the positronic brain had given them an actual monopoly on robots and a near-monopoly on computing machines in general.
The two men who had uncrated the robot had left now and the professor looked from Lanning to the robot and back to Lanning. "It is harmless, I'm sure." He didn't sound sure.
"More harmless than I am," said Lanning. "I could be goaded into striking you. Easy could not be. You know the Three Laws of Robotics, I presume."
"Yes, of course," said Goodfellow.
"They are built into the positronic patterns of the brain and must be observed. The First Law, the prime rule of robotic existence, safeguards the life and well-being of all humans." He paused, rubbed at his cheek, then added, "It's something of which we would like to persuade all Earth if we could."
"It's just that he seems formidable."
"Granted. But whatever he seems, you'll find that he is useful."
"I'm not sure in what way. Our conversations were not very helpful in that respect. Still, I agreed to look at the object and I'm doing it."
"We'll do more than look, Professor. Have you brought a book?"
"I have."
"May I see it?"
Professor Goodfellow reached down without actually taking his eyes off the metal-in-human-shape that confronted him. From the briefcase at his feet, he withdrew a book.
Lanning held out his hand for it and looked at the backstrip. "Physical Chemistry of Electrolytes in Solution. Fair enough, sir. You selected this yourself, at random. It was no suggestion of mine, this particular text. Am I right?"
"Yes."
Lanning passed the book to Robot EZ-27.
The professor jumped a little. "No! That's a valuable book!"
Lanning raised his eyebrows and they looked like shaggy coconut icing. He said, "Easy has no intention of tearing the book in two as a feat of strength, I assure you. It can handle a book as carefully as you or I. Go ahead, Easy."
"Thank you, sir," said Easy. Then, turning its metal bulk slightly, it added, "With your permission, Professor Goodfellow."
The professor stared, then said, "Yes-yes, of course."
With a slow and steady manipulation of metal fingers, Easy turned the pages of the book, glancing at the left page, then the right; turning the page, glancing left, then right; turning the page and so on for minute after minute.
The sense of its power seemed to dwarf even the large cement-walled room in which they stood and to reduce the two human watchers to something considerably less than life-size.
Goodfellow muttered, "The light isn't very good."
"It will do."
Then, rather more sharply, "But what is he doing?"
"Patience, sir."
The last page was turned eventually. Lanning asked, "Well, Easy?"
The robot said, "It is a most accurate book and there is little to which I can point. On line 22 of page 27, the word 'positive' is spelled p-o-i-s-t-i-v-e. The comma in line 6 of page 32 is superfluous, whereas one should have been used on line 13 of page 54. The plus sign in equation XIV-2 on page 337 should be a minus sign if it is to be consistent with the previous equations-"
"Wait! Wait!" cried the professor. "What is he doing?"
"Doing?" echoed Lanning in sudden irascibility. "Why, man, he has already done it! He has proofread that book."
"Proofread it?"
"Yes. In the short time it took him to turn those pages, he caught every mistake in spelling, grammar and punctuation. He has noted errors in word order and detected inconsistencies. And he will retain the information, letter-perfect, indefinitely."
The professor's mouth was open. He walked rapidly away from Lanning and Easy and as rapidly back. He folded his arms across his chest and stared at them. Finally he said, "You mean this is a proofreading robot?"
Lanning nodded. "Among other things."
"But why do you show it to me?"
"So that you might help me persuade the university to obtain it for use."
"To read proof?"
"Among other things," Lanning repeated patiently.
The professor drew his pinched face together in a kind of sour disbelief. "But this is ridiculous!"
"Why?"
"The university could never afford to buy this half-ton-it must weigh that at least-this half-ton proofreader."
"Proofreading is not all it will do. It will prepare reports from outlines, fill out forms, serve as an accurate memory-file, grade papers-"
"All picayune!"
Lanning said, "Not at all, as I can show you in a moment. But I think we can discuss this more comfortably in your office, if you have no objection."
"No, of course not," began the professor mechanically and took a half-step as though to turn. Then he snapped out, "But the robot-we can't take the robot. Really, Doctor, you'll have to crate it up again."
"Time enough. We can leave Easy here."
"Unattended?"
"Why not? He knows he is to stay. Professor Goodfellow, it is necessary to understand that a robot is far more reliable than a human being."
"I would be responsible for any damage-"
"There will be no damage. I guarantee that. Look, it's after hours. You expect no one here, I imagine, before tomorrow morning. The truck and my two men are outside. U. S. Robots will take any responsibility that may arise. None will. Call it a demonstration of the reliability of the robot."
The professor allowed himself to be led out of the storeroom. Nor did he look entirely comfortable in his own office, five stories up.
He dabbed at the line of droplets along the upper half of his forehead with a white handkerchief.
"As you know very well, Dr. Lanning, there are laws against the use of robots on Earth's surface," he pointed out.
"The laws, Professor Goodfellow, are not simple ones. Robots may not be used on public thoroughfares or within public edifices. They may not be used on private grounds or within private structures except under certain restrictions that usually turn out to be prohibitive. The university, however, is a large and privately owned institution that usually receives preferential treatment. If the robot is used only in a specific room for only academic purposes, if certain other restrictions are observed and if the men and women having occasion to enter the room cooperate fully, we may remain within the law."
"But all that trouble just to read proof?"
"The uses would be infinite, Professor. Robotic labor has so far been used only to relieve physical drudgery. Isn't there such a thing as mental drudgery? When a professor capable of the most useful creative thought is forced to spend two weeks painfully checking the spelling of lines of print and I offer you a machine that can do it in thirty minutes, is that picayune?"
"But the price-"
"The price need not bother you. You cannot buy EZ-27. U. S. Robots does not sell its products. But the university can lease EZ-27 for a thousand dollars a year-considerably less than the cost of a single microwave spectrograph continuous-recording attachment."
Goodfellow looked stunned. Lanning followed up his advantage by saying, "I only ask that you put it up to whatever group makes the decisions here. I would be glad to speak to them if they want more information."
"Well," Goodfellow said doubtfully, "I can bring it up at next week's Senate meeting. I can't promise that will do any good, though."
"Naturally," said Lanning.
The Defense Attorney was short and stubby and carried himself rather portentously, a stance that had the effect of accentuating his double chin. He stared at Professor Goodfellow, once that witness had been handed over, and said, "You agreed rather readily, did you not?"
The Professor said briskly, "I suppose I was anxious to be rid of Dr. Lanning. I would have agreed to anything."
"With the intention of forgetting about it after he left?"
"Well-"
"Nevertheless, you did present the matter to a meeting of the Executive Board of the University Senate."
"Yes, I did."
"So that you agreed in good faith with Dr. Lanning's suggestions. You weren't just going along with a gag. You actually agreed enthusiastically, did you not?"
"I merely followed ordinary procedures."
"As a matter of fact, you weren't as upset about the robot as you now claim you were. You know the Three Laws of Robotics and you knew them at the time of your interview with Dr. Lanning."
"Well, yes."
"And you were perfectly willing to leave a robot at large and unattended."
"Dr. Lanning assured me-"
"Surely you would never have accepted his assurance if you had had the slightest doubt that the robot might be in the least dangerous."
The professor began frigidly, "I had every faith in the word-"
"That is all," said Defense abruptly.
As Professor Goodfellow, more than a bit ruffled, stood down, Justice Shane leaned forward and said, "Since I am not a robotics man myself, I would appreciate knowing precisely what the Three Laws of Robotics are. Would Dr. Lanning quote them for the benefit of the court?"
Dr. Lanning looked startled. He had been virtually bumping heads with the gray-haired woman at his side. He rose to his feet now and the woman looked up, too-expressionlessly.
Dr. Lanning said, "Very well, Your Honor." He paused as though about to launch into an oration and said, with laborious clarity, "First Law: a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: a robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Third Law: a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."
"I see," said the judge, taking rapid notes. "These Laws are built into every robot, are they?"
"Into every one. That will be borne out by any roboticist."
"And into Robot EZ-27 specifically?"
"Yes, Your Honor."
"You will probably be required to repeat those statements under oath."
"I am ready to do so, Your Honor." He sat down again.
Dr. Susan Calvin, robopsychologist-in-chief for U. S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, "Was Goodfellow's testimony accurate, Alfred?"
"Essentially," muttered Lanning. "He wasn't as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn't seem to be any drastic distortion."
Dr. Calvin said thoughtfully, "It might have been wise to put the price higher than a thousand."
"We were anxious to place Easy."
"I know. Too anxious, perhaps. They'll try to make it look as though we had an ulterior motive."
Lanning looked exasperated. "We did. I admitted that at the University Senate meeting."
"They can make it look as if we had one beyond the one we admitted."
Scott Robertson, son of the founder of U. S. Robots and still owner of a majority of the stock, leaned over from Dr. Calvin's other side and said in a kind of explosive whisper, "Why can't you get Easy to talk so we'll know where we're at?"
"You know he can't talk about it, Mr. Robertson."
"Make him. You're the psychologist, Dr. Calvin. Make him."
"If I'm the psychologist, Mr. Robertson," said Susan Calvin coldly, "let me make the decisions. My robot will not be made to do anything at the price of his well-being."
Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent.
Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.
He said, "My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the tenth of April of last year, we held a special meeting on the subject, during which I was in the chair."
"Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?"
"Well, no. It was a rather unusual meeting." The dean smiled briefly. "We thought it might remain confidential."
"What transpired at the meeting?"
Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr. Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.
Samples of the robot's work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist's lips were pursed in obvious approval.
Hart cleared his throat and said, "There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with."
He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author's, a few in red, where the printer had been wrong.
"Actually," said Lanning, "there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I'm sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it."
"We accept that. However, the robot corrected word order on occasion and I don't think the rules of English are sufficiently hidebound for us to be sure that in each case the robot's choice was the correct one."
"Easy's positronic brain," said Lanning, showing large teeth as he smiled, "has been molded by the contents of all the standard works on the subject. I'm sure you cannot point to a case where the robot's choice was definitely the incorrect one."
Professor Minott looked up from the graph he still held. "The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys."
"I am sure we could," said Lanning stiffly, "but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn't prepare the graph you hold in your hand, for instance."
Minott grunted.
Lanning went on. "The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine."
Goodfellow looked up and said, "If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn't have the capability of absorbing an infinite amount of data."
"No, it hasn't. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge."
"The company will?"
"Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem."
Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, "You are anxious to have us take the machine. Yet surely it is a losing proposition for U. S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?"
"Certainly that's a fair hope," said Lanning.
"But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition."
Lanning put his elbows on the table and earnestly leaned forward. "Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U. S. Robots is a highly successful corporation with our extraterrestrial and spaceflight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.
"The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery-by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public's objections to them broken down by stages."
Minott murmured, "Today Northeastern University, tomorrow the world."
Angrily, Lanning whispered to Susan Calvin, "I wasn't nearly that eloquent and they weren't nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he'd never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely."
The severe vertical lines on Dr. Calvin's face did not soften. "You should have demanded more money than they could pay, Alfred, and let them beat you down."
"Maybe," he grumbled.
Prosecution was not quite done with Professor Hart. "After Dr. Lanning left, did you vote on whether to accept Robot EZ-27?"
"Yes, we did."
"With what result?"
"In favor of acceptance, by majority vote."
"What would you say influenced the vote?"
Defense objected immediately.
Prosecution rephrased the question. "What influenced you, personally, in your individual vote? You did vote in favor, I think."
"I voted in favor, yes. I did so largely because I was impressed by Dr. Lanning's feeling that it was our duty as members of the world's intellectual leadership to allow robotics to help Man in the solution of his problems."
"In other words, Dr. Lanning talked you into it."
"That's his job. He did it very well."
"Your witness."
Defense strode up to the witness chair and surveyed Professor Hart for a long moment. He said, "In reality, you were all pretty eager to have Robot EZ-27 in your employ, weren't you?"
"We thought that if it could do the work, it might be useful."
"If it could do the work? I understand you examined the samples of Robot EZ-27's original work with particular care on the day of the meeting which you have just described."
"Yes, I did. Since the machine's work dealt primarily with the handling of the English language, and since that is my field of competence, it seemed logical that I be the one chosen to examine the work."
"Very good. Was there anything on display on the table at the time of the meeting which was less than satisfactory? I have all the material here as exhibits. Can you point to a single unsatisfactory item?"
"Well-"
"It's a simple question. Was there one single solitary unsatisfactory item? You inspected it. Was there?"
The English professor frowned. "There wasn't."
"I also have some samples of work done by Robot EZ-27 during the course of his fourteen-month employ at Northeastern. Would you examine these and tell me if there is anything wrong with them in even one particular?"
Hart snapped, "When he did make a mistake, it was a beauty."
"Answer my question," thundered Defense, "and only the question I am putting to you! Is there anything wrong with the material?"
Dean Hart looked cautiously at each item. "Well, nothing."
"Barring the matter concerning which we are here engaged, do you know of any mistake on the part of EZ-27?"
"Barring the matter for which this trial is being held, no."
Defense cleared his throat as though to signal end of paragraph. He said, "Now about the vote concerning whether Robot EZ-27 was to be employed or not. You said there was a majority in favor. What was the actual vote?"
"Thirteen to one, as I remember."
"Thirteen to one! More than just a majority, wouldn't you say?"
"No, sir!" All the pedant in Dean Hart was aroused. "In the English language, the word 'majority' means 'more than half.' Thirteen out of fourteen is a majority, nothing more."
"But an almost unanimous one."
"A majority all the same!"
Defense switched ground. "And who was the lone holdout?"
Dean Hart looked acutely uncomfortable. "Professor Simon Ninheimer."
Defense pretended astonishment. "Professor Ninheimer? The head of the Department of Sociology?"
"Yes, sir."
"The plaintiff?"
"Yes, sir."
Defense pursed his lips. "In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men Corporation, was the one who from the beginning opposed the use of the robot-although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea."
"He voted against the motion, as was his right."
"You didn't mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?"
"I think he spoke."
"You think?"
"Well, he did speak."
"Against using the robot?"
"Yes."
"Was he violent about it?"
Dean Hart paused. "He was vehement."
Defense grew confidential. "How long have you known Professor Ninheimer, Dean Hart?"
"About twelve years."
"Reasonably well?"
"I should say so, yes."
"Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had-"
Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.
Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.
He said sourly, "Why all this business about how Easy got into the university? What do they hope to gain?"
The Attorney for Defense said quietly, "A court action is like a chess game, Mr. Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor's table is no beginner. They can show damage; that's no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn't possibly have committed the offense-because of the Laws of Robotics."
"All right," said Robertson, "that is our defense. An absolutely airtight one."
"To a robotics engineer. Not necessarily to a judge. They're setting themselves up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field-testing and the university was the only decent way to provide such testing. That would look plausible in the light of Dr. Lanning's strong efforts to place the robot and the willingness of U. S. Robots to lease it for so little. The prosecution would then argue that the field-test proved Easy to have been a failure. Now do you see the purpose of what's been going on?"
"But EZ-27 was a perfectly good model," argued Robertson. "It was the twenty-seventh in production."
"Which is really a bad point," said Defense somberly. "What was wrong with the first twenty-six? Obviously something. Why shouldn't there be something wrong with the twenty-seventh, too?"
"There was nothing wrong with the first twenty-six except that they weren't complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don't hold."
"Dr. Lanning has explained this to me, Mr. Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr. Lanning or Dr. Calvin were to say on the stand that any positronic brains were constructed 'hit-and-miss,' as you just did, prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that's something to avoid."
Robertson growled, "If only Easy would talk."
Defense shrugged. "A robot is incompetent as a witness, so that would do us no good."
"At least we'd know some of the facts. We'd know how it came to do such a thing."
Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. "We know how Easy came to do it. It was ordered to! I've explained this to counsel and I'll explain it to you now."
"Ordered to by whom?" asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)
"By the plaintiff," said Dr. Calvin.
"In heaven's name, why?"
"I don't know why yet. Perhaps just that we might be sued, that he might gain some cash." There were blue glints in her eyes as she said that.
"Then why doesn't Easy say so?"
"Isn't that obvious? It's been ordered to keep quiet about the matter."
"Why should that be obvious?" demanded Robertson truculently.
"Well, it's obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counterpotentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he's been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being."
"Well, then," said Robertson, "can't you explain that if he keeps quiet, harm will be done to U. S. Robots?"
"U. S. Robots is not a human being and the First Law of Robotics does not recognize a corporation as a person the way ordinary laws do. Besides, it would be dangerous to try to lift this particular sort of inhibition. The person who laid it on could lift it off least dangerously, because the robot's motivations in that respect are centered on that person. Any other course-" She shook her head and grew almost impassioned. "I won't let the robot be damaged!"
Lanning interrupted with the air of bringing sanity to the problem. "It seems to me that we have only to prove a robot incapable of the act of which Easy is accused. We can do that."
"Exactly," said Defense, in annoyance. "You can do that. The only witnesses capable of testifying to Easy's condition and to the nature of Easy's state of mind are employees of U. S. Robots. The judge can't possibly accept their testimony as unprejudiced."
"How can he deny expert testimony?"
"By refusing to be convinced by it. That's his right as the judge. Against the alternative that a man like Professor Ninheimer deliberately set about ruining his own reputation, even for a sizable sum of money, the judge isn't going to accept the technicalities of your engineers. The judge is a man, after all. If he has to choose between a man doing an impossible thing and a robot doing an impossible thing, he's quite likely to decide in favor of the man."
"A man can do an impossible thing," said Lanning, "because we don't know all the complexities of the human mind and we don't know what, in a given human mind, is impossible and what is not. We do know what is really impossible to a robot."
"Well, we'll see if we can't convince the judge of that," Defense replied wearily.
"If all you say is so," rumbled Robertson, "I don't see how you can."
"We'll see. It's good to know and be aware of the difficulties involved, but let's not be too downhearted. I've tried to look ahead a few moves in the chess game, too." With a stately nod in the direction of the robopsychologist, he added, "With the help of the good lady here."
Lanning looked from one to the other and said, "What the devil is this?"
But the bailiff thrust his head into the room and announced somewhat breathlessly that the trial was about to resume.
They took their seats, examining the man who had started all the trouble.
Simon Ninheimer owned a fluffy head of sandy hair, a face that narrowed past a beaked nose toward a pointed chin, and a habit of sometimes hesitating before key words in his conversation that gave him an air of a seeker after an almost unbearable precision. When he said, "The Sun rises in the-uh-east," one was certain he had given due consideration to the possibility that it might at some time rise in the west.
Prosecution said, "Did you oppose employment of Robot EZ-27 by the university?"
"I did, sir."
"Why was that?"
"I did not feel that we understood the-uh-motives of U. S. Robots thoroughly. I mistrusted their anxiety to place the robot with us."
"Did you feel that it was capable of doing the work that it was allegedly designed to do?"
"I know for a fact that it was not."
"Would you state your reasons?"
Simon Ninheimer's book, entitled Social Tensions Involved in Space-Flight and Their Resolution, had been eight years in the making. Ninheimer's search for precision was not confined to his habits of speech, and in a subject like sociology, almost inherently imprecise, it left him breathless.
Even with the material in galley proofs, he felt no sense of completion. Rather the reverse, in fact. Staring at the long strips of print, he felt only the itch to tear the lines of type apart and rearrange them differently.
Jim Baker, Instructor and soon to be Assistant Professor of Sociology, found Ninheimer, three days after the first batch of galleys had arrived from the printer, staring at the handful of paper in abstraction. The galleys came in three copies: one for Ninheimer to proofread, one for Baker to proofread independently, and a third, marked "Original," which was to receive the final corrections, a combination of those made by Ninheimer and by Baker, after a conference at which possible conflicts and disagreements were ironed out. This had been their policy on the several papers on which they had collaborated in the past three years and it worked well.
Baker, young and ingratiatingly soft-voiced, had his own copies of the galleys in his hand. He said eagerly, "I've done the first chapter and it contains some typographical beauts."
"The first chapter always has them," said Ninheimer distantly. "Do you want to go over it now?"
Ninheimer brought his eyes to grave focus on Baker. "I haven't done anything on the galleys, Jim. I don't think I'll bother."
Baker looked confused. "Not bother?"
Ninheimer pursed his lips. "I've asked about the-uh-workload of the machine. After all, he was originally-uh-promoted as a proofreader. They've set a schedule."
"The machine? You mean Easy?"
"I believe that is the foolish name they gave it."
"But, Dr. Ninheimer, I thought you were staying clear of it"'
"I seem to be the only one doing so. Perhaps I ought to take my share of the-uh-advantage."
"Oh. Well, I seem to have wasted time on this first chapter, then," said the younger man ruefully.
"Not wasted. We can compare the machine's result with yours as a check."
"If you want to, but-"
"Yes?"
"I doubt that we'll find anything wrong with Easy's work. It's supposed never to have made a mistake."
"I dare say," said Ninheimer dryly.
The first chapter was brought in again by Baker four days later. This time it was Ninheimer's copy, fresh from the special annex that had been built to house Easy and the equipment it used.
Baker was jubilant. "Dr. Ninheimer, it not only caught everything I caught-it found a dozen errors I missed! The whole thing took it twelve minutes!"
Ninheimer looked over the sheaf, with the neatly printed marks and symbols in the margins. He said, "It is not as complete as you and I would have made it. We would have entered an insert on Suzuki's work on the neurological effects of low gravity."
"You mean his paper in Sociological Reviews?"
"Of course."
"Well, you can't expect impossibilities of Easy. It can't read the literature for us."
"I realize that. As a matter of fact, I have prepared the insert. I will see the machine and make certain it knows how to-uh-handle inserts."
"It will know."
"I prefer to make certain."
Ninheimer had to make an appointment to see Easy, and then could get nothing better than fifteen minutes in the late evening.
But the fifteen minutes turned out to be ample. Robot EZ-27 understood the matter of inserts at once.
Ninheimer found himself uncomfortable at close quarters with the robot for the first time. Almost automatically, as though it were human, he found himself asking, "Are you happy with your work?"
"Most happy, Professor Ninheimer," said Easy solemnly, the photocells that were its eyes gleaming their normal deep red.
"You know me?"
"From the fact that you present me with additional material to include in the galleys, it follows that you are the author. The author's name, of course, is at the head of each sheet of galley proof."
"I see. You make-uh-deductions, then. Tell me"-he couldn't resist the question-"what do you think of the book so far?'
Easy said, "I find it very pleasant to work with."
"Pleasant? That is an odd word for a-uh-a mechanism without emotion. I've been told you have no emotion."
"The words of your book go in accordance with my circuits," Easy explained. "They set up little or no counterpotentials. It is in my brain paths to translate this mechanical fact into a word such as 'pleasant.' The emotional context is fortuitous."
"I see. Why do you find the book pleasant?"
"It deals with human beings, Professor, and not with inorganic materials or mathematical symbols. Your book attempts to understand human beings and to help increase human happiness."
"And this is what you try to do and so my book goes in accordance with your circuits? Is that it?"
"That is it, Professor."
The fifteen minutes were up. Ninheimer left and went to the university library, which was on the point of closing. He kept them open long enough to find an elementary text on robotics. He took it home with him.
Except for occasional insertion of late material, the galleys went to Easy and from him to the publishers with little intervention from Ninheimer at first-and none at all later.
Baker said, a little uneasily, "It almost gives me a feeling of uselessness."
"It should give you a feeling of having time to begin a new project," said Ninheimer, without looking up from the notations he was making in the current issue of Social Science Abstracts.
"I'm just not used to it. I keep worrying about the galleys. It's silly, I know."
"It is."
"The other day I got a couple of sheets before Easy sent them off to-"
"What!" Ninheimer looked up, scowling. The copy of Abstracts slid shut. "Did you disturb the machine at its work?"
"Only for a minute. Everything was all right. Oh, it changed one word. You referred to something as 'criminal'; it changed the word to 'reckless.' It thought the second adjective fit in better with the context."
Ninheimer grew thoughtful. "What did you think?"
"You know, I agreed with it. I let it stand."
Ninheimer turned in his swivel-chair to face his young associate. "See here, I wish you wouldn't do this again. If I am to use the machine, I wish the-uh-full advantage of it. If I am to use it and lose your-uh-services anyway because you supervise it when the whole point is that it requires no supervision, I gain nothing. Do you see?"
"Yes, Dr. Ninheimer," said Baker, subdued. The advance copies of Social Tensions arrived in Dr. Ninheimer's office on the eighth of May. He looked through it briefly, flipping pages and pausing to read a paragraph here and there. Then he put his copies away.
As he explained later, he forgot about it. For eight years, he had worked at it, but now, and for months in the past, other interests had engaged him while Easy had taken the load of the book off his shoulders. He did not even think to donate the usual complimentary copy to the university library. Even Baker, who had thrown himself into work and had steered clear of the department head since receiving his rebuke at their last meeting, received no copy.
On the sixteenth of June that stage ended. Ninheimer received a phone call and stared at the image in the 'plate with surprise.
"Speidell! Are you in town?"
"No, sir. I'm in Cleveland." Speidell's voice trembled with emotion.
"Then why the call?"
"Because I've just been looking through your new book! Ninheimer, are you mad? Have you gone insane?"
Ninheimer stiffened. "Is something-uh-wrong?" he asked in alarm.
"Wrong? I refer you to page 562. What in blazes do you mean by interpreting my work as you do? Where in the paper cited do I make the claim that the criminal personality is nonexistent and that it is the law-enforcement agencies that are the true criminals? Here, let me quote-"
"Wait! Wait!" cried Ninheimer, trying to find the page. "Let me see. Let me see...Good God!"
"Well?"
"Speidell, I don't see how this could have happened. I never wrote this."
"But that's what's printed! And that distortion isn't the worst. You look at page 690 and imagine what Ipatiev is going to do to you when he sees the hash you've made of his findings! Look, Ninheimer, the book is riddled with this sort of thing. I don't know what you were thinking of-but there's nothing to do but get the book off the market. And you'd better be prepared for extensive apologies at the next Association meeting!"
"Speidell, listen to me-" But Speidell had flashed off with a force that had the 'plate glowing with after-images for fifteen seconds.
It was then that Ninheimer went through the book and began marking off passages with red ink.
He kept his temper remarkably well when he faced Easy again, but his lips were pale. He passed the book to Easy and said, "Will you read the marked passages on pages 562, 631, 664 and 690?"
Easy did so in four glances. "Yes, Professor Ninheimer."
"This is not as I had it in the original galleys."
"No, sir. It is not."
"Did you change it to read as it now does?"
"Yes, sir."
"Why?"
"Sir, the passages as they read in your version were most uncomplimentary to certain groups of human beings. I felt it advisable to change the wording to avoid doing them harm."
"How dared you do such a thing?"
"The First Law, Professor, does not let me, through any inaction, allow harm to come to human beings. Certainly, considering your reputation in the world of sociology and the wide circulation your book would receive among scholars, considerable harm would come to a number of the human beings you speak of."
"But do you realize the harm that will come to me now?"
"It was necessary to choose the alternative with less harm." Professor Ninheimer, shaking with fury, staggered away. It was clear to him that U. S. Robots would have to account to him for this.
There was some excitement at the defendants' table, which increased as Prosecution drove the point home.
"Then Robot EZ-27 informed you that the reason for its action was based on the First Law of Robotics?"
"That is correct, sir."
"That, in effect, it had no choice?"
"Yes, sir."
"It follows then that U. S. Robots designed a robot that would of necessity rewrite books to accord with its own conceptions of what was right. And yet they palmed it off as simple proofreader. Would you say that?"
Defense objected firmly at once, pointing out that the witness was being asked for a decision on a matter in which he had no competence. The judge admonished Prosecution in the usual terms, but there was no doubt that the exchange had sunk home-not least upon the attorney for the Defense.
Defense asked for a short recess before beginning cross-examination, using a legal technicality for the purpose that got him five minutes.
He leaned over toward Susan Calvin. "Is it possible, Dr. Calvin, that Professor Ninheimer is telling the truth and that Easy was motivated by the First Law?"
Calvin pressed her lips together, then said, "No. It isn't possible. The last part of Ninheimer's testimony is deliberate perjury. Easy is not designed to be able to judge matters at the stage of abstraction represented by an advanced textbook on sociology. It would never be able to tell that certain groups of humans would be harmed by a phrase in such a book. Its mind is simply not built for that."
"I suppose, though, that we can't prove this to a layman," said Defense pessimistically.
"No," Admitted Calvin. "The proof would be highly complex. Our way out is still what it was. We must prove Ninheimer is lying, and nothing he has said need change our plan of attack."
"Very well, Dr. Calvin," said Defense, "I must accept your word in this. We'll go on as planned."
In the courtroom, the judge's gavel rose and fell and Dr. Ninheimer took the stand once more. He smiled a little as one who feels his position to be impregnable and rather enjoys the prospect of countering a useless attack.
Defense approached warily and began softly. "Dr. Ninheimer, do you mean to say that you were completely unaware of these alleged changes in your manuscript until such time as Dr. Speidell called you on the sixteenth of June?"
"That is correct, sir."
"Did you never look at the galleys after Robot EZ-27 had proofread them?"
"At first I did, but it seemed to me a useless task. I relied on the claims of U. S. Robots. The absurd-uh-changes were made only in the last quarter of the book after the robot, I presume, had learned enough about sociology-"
"Never mind your presumptions!" said Defense. "I understood your colleague, Dr. Baker, saw the later galleys on at least one occasion. Do you remember testifying to that effect?"
"Yes, sir. As I said, he told me about seeing one page, and even there, the robot had changed a word."
Again Defense broke in. "Don't you find it strange, sir, that after over a year of implacable hostility to the robot, after having voted against it in the first place and having refused to put it to any use whatever, you suddenly decided to put your book, your magnum opus, into its hands?"
"I don't find that strange. I simply decided that I might as well use the machine."
"And you were so confident of Robot EZ-27-all of a sudden-that you didn't even bother to check your galleys?"
"I told you I was-uh-persuaded by U. S. Robots' propaganda."
"So persuaded that when your colleague, Dr. Baker, attempted to check on the robot, you berated him soundly?"
"I didn't berate him. I merely did not wish to have him-uh-waste his time. At least, I thought then it was a waste of time. I did not see the significance of that change in a word at the-"
Defense said with heavy sarcasm, "I have no doubt you were instructed to bring up that point in order that the word-change be entered in the record-" He altered his line to forestall objection and said, "The point is that you were extremely angry with Dr. Baker."
"No, sir. Not angry."
"You didn't give him a copy of your book when you received it."
"Simple forgetfulness. I didn't give the library its copy, either."
Ninheimer smiled cautiously. "Professors are notoriously absentminded."
Defense said, "Do you find it strange that, after more than a year of perfect work, Robot EZ-27 should go wrong on your book? On a book, that is, which was written by you, who was, of all people, the most implacably hostile to the robot?"
"My book was the only sizable work dealing with mankind that it had to face. The Three Laws of Robotics took hold then."
"Several times, Dr. Ninheimer," said Defense, "you have tried to sound like an expert on robotics. Apparently you suddenly grew interested in robotics and took out books on the subject from the library. You testified to that effect, did you not?"
"One book, sir. That was the result of what seems to me to have been-uh-natural curiosity."
"And it enabled you to explain why the robot should, as you allege, have distorted your book?"
"Yes, sir."
"Very convenient. But are you sure your interest in robotics was not intended to enable you to manipulate the robot for your own purposes?"
Ninheimer flushed. "Certainly not, sir!"
Defense's voice rose. "In fact, are you sure the alleged altered passages were not as you had them in the first place?"
The sociologist half-rose. "That's-uh-uh-ridiculous! I have the galleys-"
He had difficulty speaking and Prosecution rose to insert smoothly, "With your permission, Your Honor, I intend to introduce as evidence the set of galleys given by Dr. Ninheimer to Robot EZ-27 and the set of galleys mailed by Robot EZ-27 to the publishers. I will do so now if my esteemed colleague so desires, and will be willing to allow a recess in order that the two sets of galleys may be compared."
Defense waved his hand impatiently. "That is not necessary. My honored opponent can introduce those galleys whenever he chooses. I'm sure they will show whatever discrepancies are claimed by the plaintiff to exist. What I would like to know of the witness, however, is whether he also has in his possession Dr. Baker's galleys."
"Dr. Baker's galleys?" Ninheimer frowned. He was not yet quite master of himself.
"Yes, Professor! I mean Dr. Baker's galleys. You testified to the effect that Dr. Baker had received a separate copy of the galleys. I will have the clerk read your testimony if you are suddenly a selective type of amnesiac. Or is it just that professors are, as you say, notoriously absent-minded?"
Ninheimer said, "I remember Dr. Baker's galleys. They weren't necessary once the job was placed in the care of the proofreading machine-"
"So you burned them?"
"No. I put them in the waste basket."
"Burned them, dumped them-what's the difference? The point is you got rid of them."
"There's nothing wrong-" began Ninheimer weakly.
"Nothing wrong?" thundered Defense. "Nothing wrong except that there is now no way we can check to see if, on certain crucial galley sheets, you might not have substituted a harmless blank one from Dr. Baker's copy for a sheet in your own copy which you had deliberately mangled in such a way as to force the robot to-"
Prosecution shouted a furious objection. Justice Shane leaned forward, his round face doing its best to assume an expression of anger equivalent to the intensity of the emotion felt by the man.
The judge said, "Do you have any evidence, Counselor, for the extraordinary statement you have just made?"
Defense said quietly, "No direct evidence, Your Honor. But I would like to point out that, viewed properly, the sudden conversion of the plaintiff from anti-roboticism, his sudden interest in robotics, his refusal to check the galleys or to allow anyone else to check them, his careful neglect to allow anyone to see the book immediately after publication, all very clearly point-"
"Counselor," interrupted the judge impatiently, "this is not the place for esoteric deductions. The plaintiff is not on trial. Neither are you prosecuting him. I forbid this line of attack and I can only point out that the desperation that must have induced you to do this cannot help but weaken your case. If you have legitimate questions to ask, Counselor, you may continue with your cross-examination. But I warn you against another such exhibition in this courtroom."
"I have no further questions, Your Honor."
Robertson whispered heatedly as counsel for the Defense returned to his table, "What good did that do, for God's sake? The judge is dead-set against you now."
Defense replied calmly, "But Ninheimer is good and rattled. And we've set him up for tomorrow's move. He'll be ripe."
Susan Calvin nodded gravely.
The rest of Prosecution's case was mild in comparison. Dr. Baker was called and bore out most of Ninheimer's testimony. Drs. Speidell and Ipatiev were called, and they expounded most movingly on their shock and dismay at certain quoted passages in Dr. Ninheimer's book. Both gave their professional opinion that Dr. Ninheimer's professional reputation had been seriously impaired.
The galleys were introduced in evidence, as were copies of the finished book.
Defense cross-examined no more that day. Prosecution rested and the trial was recessed till the next morning.
Defense made his first motion at the beginning of the proceedings on the second day. He requested that Robot EZ-27 be admitted as a spectator to the proceedings.
Prosecution objected at once and Justice Shane called both to the bench.
Prosecution said hotly, "This is obviously illegal. A robot may not be in any edifice used by the general public."
"This courtroom," pointed out Defense, "is closed to all but those having an immediate connection with the case."
"A large machine of known erratic behavior would disturb my clients and my witnesses by its very presence! It would make hash out of the proceedings."
The judge seemed inclined to agree. He turned to Defense and said rather unsympathetically, "What are the reasons for your request?"
Defense said, "It will be our contention that Robot EZ-27 could not possibly, by the nature of its construction, have behaved as it has been described as behaving. It will be necessary to present a few demonstrations."
Prosecution said, "I don't see the point, Your Honor. Demonstrations conducted by men employed at U. S. Robots are worth little as evidence when U. S. Robots is the defendant."
"Your Honor," said Defense, "the validity of any evidence is for you to decide, not for the Prosecuting Attorney. At least, that is my understanding."
Justice Shane, his prerogatives encroached upon, said, "Your understanding is correct. Nevertheless, the presence of a robot here does raise important legal questions."
"Surely, Your Honor, nothing that should be allowed to override the requirements of justice. If the robot is not present, we are prevented from presenting our only defense."
The judge considered. "There would be the question of transporting the robot here."
"That is a problem with which U. S. Robots has frequently been faced. We have a truck parked outside the courtroom, constructed according to the laws governing the transportation of robots. Robot EZ-27 is in a packing case inside with two men guarding it. The doors to the truck are properly secured and all other necessary precautions have been taken."
"You seem certain," said Justice Shane, in renewed ill-temper, "that judgment on this point will be in your favor."
"Not at all, Your Honor. If it is not, we simply turn the truck about. I have made no presumptions concerning your decision."
The judge nodded. "The request on the part of the Defense is granted."
The crate was carried in on a large dolly and the two men who handled it opened it. The courtroom was immersed in a dead silence.
Susan Calvin waited as the thick slabs of celluform went down, then held out one hand. "Come, Easy."
The robot looked in her direction and held out its large metal arm. It towered over her by two feet but followed meekly, like a child in the clasp of its mother. Someone giggled nervously and choked it off at a hard glare from Dr. Calvin.
Easy seated itself carefully in a large chair brought by the bailiff, which creaked but held.
Defense said, "When it becomes necessary, Your Honor, we will prove that this is actually Robot EZ-27, the specific robot in the employ of Northeastern University during the period of time with which we are concerned."
"Good," His Honor said. "That will be necessary. I, for one, have no idea how you can tell one robot from another."
"And now," said Defense, "I would like to call my first witness to the stand. Professor Simon Ninheimer, please."
The clerk hesitated, looked at the judge. Justice Shane asked, with visible surprise, "You are calling the plaintiff as your witness?"
"Yes, Your Honor."
"I hope that you're aware that as long as he's your witness, you will be allowed none of the latitude you might exercise if you were cross-examining an opposing witness."
Defense said smoothly, "My only purpose in all this is to arrive at the truth. It will not be necessary to do more than ask a few polite questions."
"Well," said the judge dubiously, "you're the one handling the case. Call the witness."
Ninheimer took the stand and was informed that he was still under oath. He looked more nervous than he had the day before, almost apprehensive.
But Defense looked at him benignly.
"Now, Professor Ninheimer, you are suing my clients in the amount of $750,000."
"That is the-uh-sum. Yes."
"That is a great deal of money."
"I have suffered a great deal of harm."
"Surely not that much. The material in question involves only a few passages in a book. Perhaps these were unfortunate passages, but after all, books sometimes appear with curious mistakes in them."
Ninheimer's nostrils flared. "Sir, this book was to have been the climax of my professional career! Instead, it makes me look like an incompetent scholar, a perverter of the views held by my honored friends and associates, and a believer of ridiculous and-uh-outmoded viewpoints. My reputation is irretrievably shattered! I can never hold up my head in any-uh-assemblage of scholars, regardless of the outcome of this trial. I certainly cannot continue in my career, which has been the whole of my life. The very purpose of my life has been-uh-aborted and destroyed."
Defense made no attempt to interrupt the speech, but stared abstractedly at his fingernails as it went on.
He said very soothingly, "But surely, Professor Ninheimer, at your present age, you could not hope to earn more than-let us be generous-$150,000 during the remainder of your life. Yet you are asking the court to award you five times as much."
Ninheimer said, with an even greater burst of emotion, "It is not in my lifetime alone that I am ruined. I do not know for how many generations I shall be pointed at by sociologists as a-uh-a fool or maniac. My real achievements will be buried and ignored. I am ruined not only until the day of my death, but for all time to come, because there will always be people who will not believe that a robot made those insertions-"
It was at this point that Robot EZ-27 rose to his feet. Susan Calvin made no move to stop him. She sat motionless, staring straight ahead. Defense sighed softly.
Easy's melodious voice carried clearly. It said, "I would like to explain to everyone that I did insert certain passages in the galley proofs that seemed directly opposed to what had been there at first-"
Even the Prosecuting Attorney was too startled at the spectacle of a seven-foot robot rising to address the court to be able to demand the stopping of what was obviously a most irregular procedure.
When he could collect his wits, it was too late. For Ninheimer rose in the witness chair, his face working.
He shouted wildly, "Damn you, you were instructed to keep your mouth shut about-"
He ground to a choking halt, and Easy was silent, too. Prosecution was on his feet now, demanding that a mistrial be declared.
Justice Shane banged his gavel desperately. "Silence! Silence! Certainly there is every reason here to declare a mistrial, except that in the interests of justice I would like to have Professor Ninheimer complete his statement. I distinctly heard him say to the robot that the robot had been instructed to keep its mouth shut about something. There was no mention in your testimony, Professor Ninheimer, as to any instructions to the robot to keep silent about anything!"
Ninheimer stared wordlessly at the judge. Justice Shane said, "Did you instruct Robot EZ-27 to keep silent about something? And if so, about what?"
"Your Honor-" began Ninheimer hoarsely, and couldn't continue.
The judge's voice grew sharp. "Did you, in fact, order the inserts in question to be made in the galleys and then order the robot to keep quiet about your part in this?"
Prosecution objected vigorously, but Ninheimer shouted, "Oh, what's the use? Yes! Yes!" And he ran from the witness stand. He was stopped at the door by the bailiff and sank hopelessly into one of the last rows of seats, head buried in both hands.
Justice Shane said, "It is evident to me that Robot EZ-27 was brought here as a trick. Except for the fact that the trick served to prevent a serious miscarriage of justice, I would certainly hold attorney for the Defense in contempt. It is clear now, beyond any doubt, that the plaintiff has committed what is to me a completely inexplicable fraud since, apparently, he was knowingly ruining his career in the process-"
Judgment, of course, was for the defendant.
Dr. Susan Calvin had herself announced at Dr. Ninheimer's bachelor quarters in University Hall. The young engineer who had driven the car offered to go up with her, but she looked at him scornfully.
"Do you think he'll assault me? Wait down here."
Ninheimer was in no mood to assault anyone. He was packing, wasting no time, anxious to be away before the adverse conclusion of the trial became general knowledge.
He looked at Calvin with a queerly defiant air and said, "Are you coming to warn me of a countersuit? If so, it will get you nothing. I have no money, no job, no future. I can't even meet the costs of the trial."
"If you're looking for sympathy," said Calvin coldly, "don't look for it here. This was your doing. However, there will be no countersuit, neither of you nor of the university. We will even do what we can to keep you from going to prison for perjury. We aren't vindictive."
"Oh, is that why I'm not already in custody for forswearing myself? I had wondered. But then," he added bitterly, "why should you be vindictive? You have what you want now."
"Some of what we want, yes," said Calvin. "The university will keep Easy in its employ at a considerably higher rental fee. Furthermore, certain underground publicity concerning the trial will make it possible to place a few more of the EZ models in other institutions without danger of a repetition of this trouble."
"Then why have you come to see me?"
"Because I don't have all of what I want yet. I want to know why you hate robots as you do. Even if you had won the case, your reputation would have been ruined. The money you might have obtained could not have compensated for that. Would the satisfaction of your hatred for robots have done so?"
"Are you interested in human minds, Dr. Calvin?" asked Ninheimer, with acid mockery.
"Insofar as their reactions concern the welfare of robots, yes. For that reason, I have learned a little of human psychology."
"Enough of it to be able to trick met"
"That wasn't hard," said Calvin, without pomposity. "The difficult thing was doing it in such a way as not to damage Easy."
"It is like you to be more concerned for a machine than for a man." He looked at her with savage contempt.
It left her unmoved. "It merely seems so, Professor Ninheimer. It is only by being concerned for robots that one can truly be concerned for twenty-first-century man. You would understand this if you were a roboticist."
"I have read enough robotics to know I don't want to be a roboticist!"
"Pardon me, you have read a book on robotics. It has taught you nothing. You learned enough to know that you could order a robot to do many things, even to falsify a book, if you went about it properly. You learned enough to know that you could not order him to forget something entirely without risking detection, but you thought you could order him into simple silence more safely. You were wrong."
"You guessed the truth from his silence?"

"It wasn't guessing. You were an amateur and didn't know enough to cover your tracks completely. My only problem was to prove the matter to the judge and you were kind enough to help us there, in your ignorance of the robotics you claim to despise."
"Is there any purpose in this discussion?" asked Ninheimer wearily.
"For me, yes," said Susan Calvin, "because I want you to understand how completely you have misjudged robots. You silenced Easy by telling him that if he told anyone about your own distortion of the book, you would lose your job. That set up a certain potential within Easy toward silence, one that was strong enough to resist our efforts to break it down. We would have damaged the brain if we had persisted.
"On the witness stand, however, you yourself put up a higher counterpotential. You said that because people would think that you, not a robot, had written the disputed passages in the book, you would lose far more than just your job. You would lose your reputation, your standing, your respect, your reason for living. You would lose the memory of you after death. A new and higher potential was set up by you-and Easy talked."
"Oh, God," said Ninheimer, turning his head away.

Calvin was inexorable. She said, "Do you understand why he talked? It was not to accuse you, but to defend you! It can be mathematically shown that he was about to assume full blame for your crime, to deny that you had anything to do with it. The First Law required that. He was going to lie-to damage himself-to bring monetary harm to a corporation. All that meant less to him than did the saving of you. If you really understood robots and robotics, you would have let him talk. But you did not understand, as I was sure you wouldn't, as I guaranteed to the defense attorney that you wouldn't. You were certain, in your hatred of robots, that Easy would act as a human being would act and defend itself at your expense. So you flared out at him in panic-and destroyed yourself."
Ninheimer said with feeling, "I hope some day your robots turn on you and kill you!"
"Don't be foolish," said Calvin. "Now I want you to explain why you've done all this."
Ninheimer grinned a distorted, humorless grin. "I am to dissect my mind, am I, for your intellectual curiosity, in return for immunity from a charge of perjury?"
"Put it that way if you like," said Calvin emotionlessly. "But explain."
"So that you can counter future anti-robot attempts more efficiently? With greater understanding?"
"I accept that."
"You know," said Ninheimer, "I'll tell you-just to watch it do you no good at all. You can't understand human motivation. You can only understand your damned machines because you're a machine yourself, with skin on."
He was breathing hard and there was no hesitation in his speech, no searching for precision. It was as though he had no further use for precision.
He said, "For two hundred and fifty years, the machine has been replacing Man and destroying the handcraftsman. Pottery is spewed out of molds and presses. Works of art have been replaced by identical gimcracks stamped out on a die. Call it progress, if you wish! The artist is restricted to abstractions, confined to the world of ideas. He must design something in mind-and then the machine does the rest.
"Do you suppose the potter is content with mental creation? Do you suppose the idea is enough? That there is nothing in the feel of the clay itself, in watching the thing grow as hand and mind work together? Do you suppose the actual growth doesn't act as a feedback to modify and improve the idea?"
"You are not a potter," said Dr. Calvin.

"I am a creative artist! I design and build articles and books. There is more to it than the mere thinking of words and of putting them in the right order. If that were all, there would be no pleasure in it, no return.
"A book should take shape in the hands of the writer. One must actually see the chapters grow and develop. One must work and rework and watch the changes take place beyond the original concept even. There is taking the galleys in hand and seeing how the sentences look in print and molding them again. There are a hundred contacts between a man and his work at every stage of the game and the contact itself is pleasurable and repays a man for the work he puts into his creation more than anything else could. Your robot would take all that away."
"So does a typewriter. So does a printing press. Do you propose to return to the hand illumination of manuscripts?"
"Typewriters and printing presses take away some, but your robot would deprive us of all. Your robot takes over the galleys. Soon it, or other robots, would take over the original writing, the searching of the sources, the checking and cross-checking of passages, perhaps even the deduction of conclusions. What would that leave the scholar? One thing only-the barren decisions concerning what orders to give the robot next! I want to save the future generations of the world of scholarship from such a final hell. That meant more to me than even my own reputation and so I set out to destroy U. S. Robots by whatever means."
"You were bound to fail," said Susan Calvin.

"I was bound to try," said Simon Ninheimer.

Calvin turned and left. She did her best to feel no pang of sympathy for the broken man.
She did not entirely succeed.
Christmas Without Rodney
It all started with Gracie (my wife of nearly forty years) wanting to give Rodney time off for the holiday season and it ended with me in an absolutely impossible situation. I'll tell you about it if you don't mind because I've got to tell somebody. Naturally, I'm changing names and details for our own protection.
It was just a couple of months ago, mid-December, and Gracie said to me, "Why don't we give Rodney time off for the holiday season? Why shouldn't he celebrate Christmas, too?"
I remember I had my optics unfocused at the time (there's a certain amount of relief in letting things go hazy when you want to rest or just listen to music) but I focused them quickly to see if Gracie were smiling or had a twinkle in her eye. Not that she has much of a sense of humor, you understand.
She wasn't smiling. No twinkle. I said, "Why on Earth should we give him time off?"
"Why not?"
"Do you want to give the freezer a vacation, the sterilizer, the holoviewer? Shall we just turn off the power supply?"
"Come, Howard," she said. "Rodney isn't a freezer or a sterilizer. He's a person."
"He's not a person. He's a robot. He wouldn't want a vacation."
"How do you know? And he's a person. He deserves a chance to rest and just revel in the holiday atmosphere."
I wasn't going to argue that "person" thing with her. I know you've all read those polls which show that women are three times as likely to resent and fear robots as men are. Perhaps that's because robots tend to do what was once called, in the bad old days, "women's work" and women fear being made useless, though I should think they'd be delighted. In any case, Gracie is delighted and she simply adores Rodney. (That's her word for it. Every other day she says, "I just adore Rodney.")
You've got to understand that Rodney is an old-fashioned robot whom we've had about seven years. He's been adjusted to fit in with our old-fashioned house and our old-fashioned ways and I'm rather pleased with him myself. Sometimes I wonder about getting one of those slick, modern jobs, which are automated to death, like the one our son, DeLancey, has, but Gracie would never stand for it.
But then I thought of DeLancey and I said, "How are we going to give Rodney time off, Gracie? DeLancey is coming in with that gorgeous wife of his" (I was using "gorgeous" in a sarcastic sense, but Gracie didn't notice-it's amazing how she insists on seeing a good side even when it doesn't exist) "and how are we going to have the house in good shape and meals made and all the rest of it without Rodney?"
"But that's just it," she said, earnestly. "DeLancey and Hortense could bring their robot and he could do it all. You know they don't think much of Rodney, and they'd love to show what theirs can do and Rodney can have a rest."
I grunted and said, "If it will make you happy, I suppose we can do it. It'll only be for three days. But I don't want Rodney thinking he'll get every holiday off."
It was another joke, of course, but Gracie just said, very earnestly, "No, Howard, I will talk to him and explain it's only just once in a while."
She can't quite understand that Rodney is controlled by the three laws of robotics and that nothing has to be explained to him.
So I had to wait for DeLancey and Hortense, and my heart was heavy. DeLancey is my son, of course, but he's one of your upwardly mobile, bottom-line individuals. He married Hortense because she has excellent connections in business and can help him in that upward shove. At least, I hope so, because if she has another virtue I have never discovered it.
They showed up with their robot two days before Christmas. The robot was as glitzy as Hortense and looked almost as hard. He was polished to a high gloss and there was none of Rodney's clumping. Hortense's robot (I'm sure she dictated the design) moved absolutely silently. He kept showing up behind me for no reason and giving me heart-failure every time I turned around and bumped into him.
Worse, DeLancey brought eight-year-old LeRoy. Now he's my grandson, and I would swear to Hortense's fidelity because I'm sure no one would voluntarily touch her, but I've got to admit that putting him through a concrete mixer would improve him no end.
He came in demanding to know if we had sent Rodney to the metal-reclamation unit yet. (He called it the "bust-up place.") Hortense sniffed and said, "Since we have a modern robot with us, I hope you keep Rodney out of sight."
I said nothing, but Gracie said, "Certainly, dear. In fact, we've given Rodney time off."
DeLancey made a face but didn't say anything. He knew his mother.
I said, pacifically, "Suppose we start off by having Rambo make something good to drink, eh? Coffee, tea, hot chocolate, a bit of brandy-"
Rambo was their robot's name. I don't know why except that it starts with R. There's no law about it, but you've probably noticed for yourself that almost every robot has a name beginning with R. R for robot, I suppose. The usual name is Robert. There must be a million robot Roberts in the northeast corridor alone.
And frankly, it's my opinion that's the reason human names just don't start with R any more. You get Bob and Dick but not Robert or Richard. You get Posy and Trudy, but not Rose or Ruth. Sometimes you get unusual R's. I know of three robots called Rutabaga, and two that are Rameses. But Hortense is the only one I know who named a robot Rambo, a syllable-combination I've never encountered, and I've never liked to ask why. I was sure the explanation would prove to be unpleasant.
Rambo turned out to be useless at once. He was, of course, programmed for the DeLancey/Hortense menage and that was utterly modern and utterly automated. To prepare drinks in his own home, all Rambo had to do was to press appropriate buttons. (Why anyone would need a robot to press buttons, I would like to have explained to me!)
He said so. He turned to Hortense and said in a voice like honey (it wasn't Rodney's city-boy voice with its trace of Brooklyn), "The equipment is lacking, madam."
And Hortense drew a sharp breath. "You mean you still don't have a robotized kitchen, grandfather?" (She called me nothing at all, until LeRoy was born, howling of course, and then she promptly called me "grandfather." Naturally, she never called me Howard. That would tend to show me to be human, or, more unlikely, show her to be human.)
I said, "Well, it's robotized when Rodney is in it."
"I dare say," she said. "But we're not living in the twentieth century, grandfather."
I thought: How I wish we were-but I just said, "Well, why not teach Rambo how to operate our controls. I'm sure he can pour and mix and heat and do whatever else is necessary."
"I'm sure he can," said Hortense, "but thank Fate he doesn't have to. I'm not going to interfere with his programming. It will make him less efficient."
Gracie said, worried, but amiable, "But if we don't interfere with his programming, then I'll just have to instruct him, step by step, but I don't know how it's done. I've never done it."
I said, "Rodney can tell him."
Gracie said, "Oh, Howard, we've given Rodney a vacation."
"I know, but we're not going to ask him to do anything; just tell Rambo here what to do and then Rambo can do it."
Whereupon Rambo said stiffly, "Madam, there is nothing in my programming or in my instructions that would make it mandatory for me to accept orders given me by another robot, especially one that is an earlier model."
Hortense said, soothingly, "Of course, Rambo. I'm sure that grandfather and grandmother understand that." (I noticed that DeLancey never said a word. I wonder if he ever said a word when his dear wife was present.)
I said, "All right, I tell you what. I'll have Rodney tell me, and then I will tell Rambo."
Rambo said nothing to that. Even Rambo is subject to the second law of robotics which makes it mandatory for him to obey human orders.
Hortense's eyes narrowed and I knew that she would like to tell me that Rambo was far too fine a robot to be ordered about by the likes of me, but some distant and rudimentary near-human waft of feeling kept her from doing so.
Little LeRoy was hampered by no such quasi-human restraints. He said, "I don't want to have to look at Rodney's ugly puss. I bet he don't know how to do anything and if he does, ol' Grampa would get it all wrong anyway."
It would have been nice, I thought, if I could be alone with little LeRoy for five minutes and reason calmly with him, with a brick, but a mother's instinct told Hortense never to leave LeRoy alone with any human being whatever.
There was nothing to do, really, but get Rodney out of his niche in the closet where he had been enjoying his own thoughts (I wonder if a robot has his own thoughts when he is alone) and put him to work. It was hard. He would say a phrase, then I would say the same phrase, then Rambo would do something, then Rodney would say another phrase and so on.
It all took twice as long as if Rodney were doing it himself and it wore me out, I can tell you, because everything had to be like that, using the dishwasher/sterilizer, cooking the Christmas feast, cleaning up messes on the table or on the floor, everything.
Gracie kept moaning that Rodney's vacation was being ruined, but she never seemed to notice that mine was, too, though I did admire Hortense for her manner of saying something unpleasant at every moment that some statement seemed called for. I noticed, particularly, that she never repeated herself once. Anyone can be nasty, but to be unfailingly creative in one's nastiness filled me with a perverse desire to applaud now and then.
But, really, the worst thing of all came on Christmas Eve. The tree had been put up and I was exhausted. We didn't have the kind of situation in which an automated box of ornaments was plugged into an electronic tree, and at the touch of one button there would result an instantaneous and perfect distribution of ornaments. On our tree (of ordinary, old-fashioned plastic) the ornaments had to be placed, one by one, by hand.
Hortense looked revolted, but I said, " Actually, Hortense, this means you can be creative and make your own arrangement."
Hortense sniffed, rather like the scrape of claws on a rough plaster wall, and left the room with an obvious expression of nausea on her face. I bowed in the direction of her retreating back, glad to see her go, and then began the tedious task of listening to Rodney's instructions and passing them on to Rambo.
When it was over, I decided to rest my aching feet and mind by sitting in a chair in a far and rather dim corner of the room. I had hardly folded my aching body into the chair when little LeRoy entered. He didn't see me, I suppose, or, then again, he might simply have ignored me as being part of the less important and interesting pieces of furniture in the room.
He cast a disdainful look on the tree and said, to Rambo, "Listen, where are the Christmas presents? I'll bet old Gramps and Gram got me lousy ones, but I ain't going to wait for no tomorrow morning."
Rambo said, "I do not know where they are, Little Master."
"Huh!" said LeRoy, turning to Rodney. "How about you, Stink-face. Do you know where the presents are?"
Rodney would have been within the bounds of his programming to have refused to answer on the grounds that he did not know he was being addressed, since his name was Rodney and not Stink-face. I'm quite certain that that would have been Rambo's attitude. Rodney, however, was of different stuff. He answered politely, "Yes, I do, Little Master."
"So where is it, you old puke?"
Rodney said, "I don't think it would be wise to tell you, Little Master. That would disappoint Gracie and Howard who would like to give the presents to you tomorrow morning."
"Listen," said little LeRoy, "who you think you're talking to, you dumb robot? Now I gave you an order. You bring those presents to me." And in an attempt to show Rodney who was master, he kicked the robot in the shin.
It was a mistake. I saw it would be that a second before and that was a joyous second. Little LeRoy, after all, was ready for bed (though I doubted that he ever went to bed before he was good and ready). Therefore, he was wearing slippers. What's more, the slipper sailed off the foot with which he kicked, so that he ended by slamming his bare toes hard against the solid chrome-steel of the robotic shin.
He fell to the floor howling and in rushed his mother. "What is it, LeRoy? What is it?"
Whereupon little LeRoy had the immortal gall to say, "He hit me. That old monster-robot hit me."
Hortense screamed. She saw me and shouted, "That robot of yours must be destroyed."
I said, "Come, Hortense. A robot can't hit a boy. First law of robotics prevents it."
"It's an old robot, a broken robot. LeRoy says-"
"LeRoy lies. There is no robot, no matter how old or how broken, who could hit a boy."
"Then he did it. Grampa did it," howled LeRoy.
"I wish I did," I said, quietly, "but no robot would have allowed me to. Ask your own. Ask Rambo if he would have remained motionless while either Rodney or I had hit your boy. Rambo!"
I put it in the imperative, and Rambo said, "I would not have allowed any harm to come to the Little Master, Madam, but I did not know what he purposed. He kicked Rodney's shin with his bare foot, Madam."
Hortense gasped and her eyes bulged in fury. "Then he had a good reason to do so. I'll still have your robot destroyed."
"Go ahead, Hortense. Unless you're willing to ruin your robot's efficiency by trying to reprogram him to lie, he will bear witness to just what preceded the kick and so, of course, with pleasure, will I."
Hortense left the next morning, carrying the pale-faced LeRoy with her (it turned out he had broken a toe-nothing he didn't deserve) and an endlessly wordless DeLancey.
Gracie wrung her hands and implored them to stay, but I watched them leave without emotion. No, that's a lie. I watched them leave with lots of emotion, all pleasant.
Later, I said to Rodney, when Gracie was not present, "I'm sorry, Rodney. That was a horrible Christmas, all because we tried to have it without you. We'll never do that again, I promise."
"Thank you, Sir," said Rodney. "I must admit that there were times these two days when I earnestly wished the laws of robotics did not exist."
I grinned and nodded my head, but that night I woke up out of a sound sleep and began to worry. I've been worrying ever since.
I admit that Rodney was greatly tried, but a robot can't wish the laws of robotics did not exist. He can't, no matter what the circumstances.
If I report this, Rodney will undoubtedly be scrapped, and if we're issued a new robot as recompense, Gracie will simply never forgive me. Never! No robot, however new, however talented, can possibly replace Rodney in her affection.
In fact, I'll never forgive myself. Quite apart from my own liking for Rodney, I couldn't bear to give Hortense the satisfaction.
But if I do nothing, I live with a robot capable of wishing the laws of robotics did not exist. From wishing they did not exist to acting as if they did not exist is just a step. At what moment will he take that step and in what form will he show that he has done so?
What do I do? What do I do?