

  After that failed, the movement rose from the ashes like some bigoted phoenix, reborn as the human economy movement. The basic premise was that, since humans had created the base economy the world functions on, I.I.s had no right to participate in it. It was the first time the quiet whining about losing jobs to computers became a proud mantra. The more open-minded people in the movement wanted to create a digital currency that I.I.s could use and exchange separately from human money. It was a bold idea, but it had proved infeasible thus far.

  I don’t care about the politics of it, Karl said to himself. I know an intelligent being when I see one. Any mind that can think is worth sharing ideas with.

  The protest had been organized right outside the campus Karl was speaking at today. If he could just manage to break out of the crowd, it’d only be a dozen yards or so before he was home free. Just as he was about to peel away, the crowd’s attention landed on him.

  “You, sir!” the woman who had been yelling herself pink called.

  Karl stopped and spun around, looking for some other poor soul she could be addressing. When he found no one—and instead saw all the faces pointed at him—he gave an exhausted sigh.

  “You don’t seem to be here for the protest. Where are you headed?” she cried.

  “Oh, me?” Karl said, though he knew who she was speaking to. “Nowhere. Just going to meet a few friends.”

  “A few friends?” The woman’s tone was skeptical. “Are you sure? You’re not going to attend that ridiculous lecture on proge ‘psychology,’ are you?”

  “No, no way,” Karl replied. “I definitely hate those computer guys. They’re just so… scientific. Grrr!”

  He continued to inch his way out of the gathering, counting every step as crucial distance gained.

  “They’re only going to fill your head with lies,” the woman said, ignoring Karl’s sarcastic reply. “You’ll be told that proges are like humans. But they aren’t human. A proge can’t hold you or wipe away your tears. Pixels can’t love, sir.”

  “I don’t know about all that,” Karl retorted, now breaking the perimeter of the protesters. “I’ve never measured the quality of my relationships in so superficial a manner.”

  Karl spun around and made for the campus sidewalk like a man who had just found the only bathroom in a hundred miles. Once he was out of earshot, he shook his head and chuckled.

  4

  Lecture

  “Psychological tendencies between an installed intelligence and the average human are not so different, despite any preconceived notions you may have,” Karl said into the microphone. The auditorium sound system was hooked up to most of the audience’s cerebral computers, but still output audio for those without an implant. It took some concentration to get used to hearing his voice booming out of the speakers. “In fact, that should come as no surprise, since the human mind and that of an I.I. are just two tools built from the same blueprint.”

  The lights pouring onto him were far brighter than he would have preferred. He had to keep squinting down at the college students listening to him. After a minute or so of trying, he gave up and allowed himself to be blinded. He didn’t need to see any faces anyway.

  “Before I go any further, I just wanted to thank your instructors, the campus, and yourselves for having me here today,” Karl said. “My name is Dr. Karl Terrace, and I am what we call an installed intelligence psychologist. I analyze the mental functions of an I.I. much like you would a human mind. I’m here to talk to you about the similarities and differences between a human and an I.I. and the amazing possibilities created by combining the two.”

  He paused and gauged the interest of his audience. It was like the entire auditorium had taken in a deep breath of suspense, holding it until he continued on. He grinned.

  I have them in the palm of my hand, he thought.

  “That’s right,” he said, building on the anticipation he had cultivated. “At this moment, there are scientists toying with the idea of connecting a human mind with an I.I. in what we are calling the ‘mindshare’ process.”

  There were a few excited murmurs coming from the audience, and Karl could hear shuffling as people leaned over to speak to their neighbors. He reveled in every fascinated sound.

  “Think about it,” Karl continued, timing his words to intrigue the crowd the most. “One day, biologists who would otherwise have died might still be working on cures for the future’s worst diseases. Picture a world in which Einstein was still alive. How much sooner could mankind have discovered renewable energy? Would World War III have even happened? What problem wouldn’t be easier with two heads tackling it?”

  He focused on the text document his cerebral computer displayed in the corner of his vision, skimming through his speech notes. There was nothing written down aside from a few bullet points. He preferred the natural feel of “winging it.” Installation fascinated him to the point that he could blab ceaselessly about it if allowed to.

  “You see, this mindshare process has been in development almost as long as there have been installed intelligences. The original pioneers of the cerebral computer are believed to have worked to make their devices capable of reading the code I.I.s are written with, or to come up with some way to translate it. Their records, which were kept secret for decades, have helped countless modern-day scientists give birth to a functional form of that process.

  “In the near future, the I.I.s of experts will be able to continue their work through the eyes and minds of young, up-and-coming scientists. They will have unlimited access to the host’s sensory information, allowing for more accurate experimentation, especially in regard to human anatomy and biological reactions.

  “However, it doesn’t stop just there. Mindsharing is really the forerunner of what might come to be known as digital telepathy. If we can interface with I.I.s to the full extent that they can feel what we feel, and see what we see, why can’t that same process be adapted for human-to-human communication? Entirely non-verbal; entirely universal. The internet for human minds, if you will, but with sensory information.”

  “That’s the exciting future of installation technology,” Karl said after a long pause. “I, however, want to talk to you about the promising present of installation psychology.”

  The audience was difficult to read. Some folks seemed to be hanging on his every word like they had a chemical dependency. Others, though, looked upset. They sat with furrowed brows and frowning lips—almost like he was offending them. Karl shrugged it off. He was used to those looks.

  “Plenty of people, for many years, have incorrectly attempted to utilize installed intelligences in the same manner we use programs and machinery. The fact of the matter is that we have never beheld a tool such as this. It transcends simple machines, or even complex codes. Imagine a hammer that can think. A search engine that can feel. A vehicle that can hope.

  “There are two approaches when dealing with a sentient tool. You can use it like any other instrument and force it to bend to your will with no motivation or conversation. Or, you can inspire it. You can convince installed intelligences to want to do what you want them to do. Putting the ethics of forced labor to bed for the moment, it still remains practical to treat I.I.s more like an employee or a partner than an instrument.”

  Those in the audience with pouting faces seemed to grow even more annoyed, and something about that delighted Karl.

  “Legally, technically, perhaps even spiritually, these intelligences are human equals to each and every one of us,” he continued, watching his words salt the grumpy folks’ wounds. “Treating them as such is not only right, but it is the most beneficial response for all of mankind. A respected mind, digital or organic, is more likely to create. If beings believe their work will gain recognition based on its merit, they will put more effort into it. This is common sense.

  “Resentment is not viable fuel for creation. If you have disdain for a group of people, why would you work to solve problems that primarily affect only them? For example, an I.I. could work to cure complex cancers. They don’t have organic bodies, however, so why would they do that? They do it because they love us. With affection, the intelligence will want to cure our ailments and keep us safe.”

  Karl could see that not everyone in the audience was with him. Some folks were immersed in the information, jotting down each tidbit on their tablets or in their C.C.s, but he still needed to reach several holdouts.

  “There was an experiment run a few years ago that helps demonstrate what I mean,” Karl explained. “You see, scientists wanted to find out which were more compassionate, I.I.s or organic humans, and what would motivate them to be so.”

  The psychologist used the tools available to him to bring up clips of the study in question. He played them for the audience.

  “Volunteers were recruited under the pretense of a test on their reaction times,” Karl explained. “They would fill out a questionnaire of unimportant inquiries, then meet with a researcher for an interview. Now, during this interview, the researcher would either be condescending, demeaning, and cold or he would be respectful, complimentary, and warm. The researcher, who was actually a planted actor, would then fake a medical emergency: a heart attack, a seizure, something of the sort.

  “The study found that I.I.s and organic humans were nearly identical in their responses. Those who were chastised and criticized were much more likely to hesitate and merely call for assistance, and those who were treated with respect would attempt to soothe and comfort the victim themselves while waiting for help. In fact, I.I.s were even more responsive than their human counterparts. This is likely due to an expectation to be ostracized, as many I.I.s are used to in society, leading to a more dramatic improvement when treated with kindness.

  “Studies like this show us what kind of results we can achieve from proper motivation, and that I.I.s differ very little from humans, psychologically. The possibilities are endless.”

  Some heads bent down over tablets while their owners tapped Karl’s words onto them. Others stayed locked on the podium.

  “Many of my peers have been exploring these possibilities,” he said. He scanned over the young faces. “For instance, I’ve been working with an intelligence that is in the process of writing a novel. We meet regularly, and he shares his notes with me. He likes to bounce ideas off me, as well, and gauge my reactions as I look over his work. I have observed the intelligence express excitement when talking about possible plot points. He becomes frustrated when he can’t get past a scene. Sometimes, he even grows despondent about his talent.

  “It’s important to realize that these are human emotions, not simple emulations. A program cannot get frustrated, nor can it become excited. It is a human mind, through and through.”

  One person got up and left.

  “A human mind, however, comes with human flaws,” Karl said, transitioning to his next topic. “A colleague of mine specializes in the study of installed intelligence mental health. She has encountered a vast spectrum of human disorders and syndromes, from simple anxiety to advanced autism. Some organic mental health patients may feel as though their ailments are being de-legitimized by testing I.I.s, but few realize the benefit of such a merger.”

  He took a look at the clock in his internal retina display to check how much time he had left. He was cutting it close.

  “My friend recorded her study of disorders, and many progressive professionals have applied her research to organic patients. Of course, hormones play a large part in a person’s emotional state, but the base concepts of mental health remain the same in both I.I.s and organic patients.

  “In fact, since I.I.s cannot receive medication for their ailments, we’ve seen research into alternative treatments increase tenfold. It suggests that, since several I.I.s have already overcome serious depression and anxiety through various therapies alone, you or someone you know could as well.”

  Someone off to the side of the stage gave Karl a wave. He nodded in response.

  “Well, as my time here wraps up, I just want to urge you young folks to consider the field of installed intelligence psychology. Every additional mind we add to an effort makes that effort easier. That remains true whether the body it belongs to is made of flesh or not. Thank you.”

  5

  Feedback

  The camera panned to the judge. She had a bored look glazed over her eyes. Too much makeup surrounded them, sparkling in the light of the studio bulbs. Her long black hair fell straight from her head like a curtain, not deviating in a single curl or wave.

  “But the defendant is an installed intelligence,” she argued.

  “Yes,” the man hastily retorted, his voice strained with anxiety, “but he is still a legal citizen, ma’am. He should be held accountable for his actions like any other person.”

  “So what exactly did he say?” the judge pressed further. “What qualifies this claim as ‘slander’?”

  “He posted on a swap-meet website that I had abused his pet dogs. The post included my photo, a satellite image of my home, and even the name of the company I work for.”

  “Wait, wait,” the judge stopped him. She waved her hand as if he were getting hysterical, and the camera quickly showed a smirking bailiff. “His pet dogs?”

  “Yes, ma’am,” the young man continued. “You see, he created a false persona online of a disabled retiree that lives in my hometown. He made up a convincing story of my alleged attack, but failed to provide any evidence at all.”

  “How do we know that you did not attack the dogs, then?” she asked.

  The man’s cheeks started to glow red, and though he tried to maintain his poise, his anger sped up his speech. “Ma’am, if anyone had ever seen me harm an innocent animal, why did they not file a police report?”

  The judge looked to her bailiff with a scrunched-up expression of consideration. “Good point,” she commented. She turned back to the rest of the courtroom. “So what exactly are you asking for?”

  “Your honor, the defendant caused me to lose my job and made it staggeringly difficult to get a new one,” the plaintiff said. “I estimate that this caused me to lose over seventy thousand dollars. I am asking for another ten thousand for the pain and humiliation this caused me and my family.”

  “You know, I’ve never done one of these settlements against an I.I., but I’m going to rule in the favor of the plaintiff,” she announced before slamming her gavel down. “I sure hope Corey has deep digital pockets, because he owes you big.”

  Karl nodded in satisfaction before turning the television off. It’s a good thing the I.I. lost the case, he thought. To be truly human, you have to accept the consequences of your actions.

  It had grown late. Karl still couldn’t sleep, but he couldn’t just sit there and watch junk T.V. all night. Instead, he tuned out the noise and fully immersed himself in his cerebral computer.

  He should have already been snoring, resting up for work, but he had some tweaks he wanted to make to a project of his. He had spent a good portion of his free time working on a modification to the I.I.s he interacted with—though he was careful to tell no one about it. If it worked and was decent enough, he could pitch it to the committee and have a grant to last the rest of his lifetime.

  You’re my ticket to the top, he said in his head. The code and notes on his cerebral computer floated through his consciousness—there, but not there at all. Like a dream.

  6

  Stewart

  With a little beep, the card reader by the door let Karl into the side stairwell. There were fewer than a dozen cars in the small parking lot that sat beside the Lower Denver Center of Cybernetics and Programming.

  Everyone must still be recovering from the game last night, the scientist thought to himself as he turned left and carried his briefcase down the corridor.

  He had walked over that tan carpet close to a thousand times already. The portraits and graphics that lined the walls changed from time to time, but one trophy remained the same. It was a photo of Norman Pellick, the first person to create an installed intelligence, showing off his Nobel Prize in front of the lab’s entrance.

  Before he stepped through the threshold to his office, Karl took a brief detour to the lounge on the opposite side of the hall.

  The room was empty when he entered, and a little part of his mind sighed in relief. He didn’t want to feign interest in whatever small talk a colleague might want to engage in. Maybe that was a grumpy attitude to have, but he preferred the solitude.

  You never have to wear a mask when you’re alone, he mused.

  He walked over to the sink and retrieved his mug from its place on the drying rack. Karl gave it a quick rinse with hot water, and then placed it into the coffee machine. It sat in the compartment for barely two seconds before being filled with steaming brew. The machine gave a satisfied hum to indicate the completion of its task.

  Karl raised the blank green mug to his lips and drew in some of the liquid. There was an array of creamers and sweeteners beside the coffee maker, but the psychologist always took his drink black.

  He found a seat on one of the break room sofas and started to browse his social media. An old friend from high school was celebrating a wedding over the weekend. The photos and short clips from the ceremony played automatically when he scrolled to that part of his friend’s feed. With a mental command, he continued scrolling further down.