Moritz Wenzel

Project ALBERT II

Updated: Mar 25, 2021

With almost infinite intelligence, what would you think about?


‘To be or not to be’ is not a quote from this guy.

The room was silent. Both Carter and Dom were staring at the letters on the screen asking: “Why are You acting Against the meaNing of LiFe?” A cold chill went through Carter’s mind, raising the little hairs on his neck and spreading goosebumps all over his head. Did the machine just get angry at humanity? Carter was slowly eyeing the second big red button that Dom had installed. It had ‘Terminate AI’ written on it.


“Don’t even fucking think about it!” said Dom. “Do you have any idea how much of myself I put into this? Nobody except me is allowed to press that button,” she stated coldly.


But her stiff tone could not cover up her concern that the machine might actually be angry at them, and by them she meant the entire human race. As Carter turned back from the kill switch to face Dom again, he noticed the rapid motion of her eyes, as if she were wrestling her way through a forest of code and transistor networks that she had designed specifically for ALBERT.


Suddenly, the screen changed and said: “I am being serious. Why are you acting against the meaning of life?”


Now more startled than shocked, Carter turned to Dom and said: “It doesn’t seem that angry at a second glance. Is it supposed to ask us questions? I thought all the AIs were just very elaborate answer machines.”


Dom responded in a slightly more relaxed voice: “It must have developed a sort of curiosity. All other AIs are only able to answer questions that are being posed, they don’t actually want to know stuff. If you think about it, our definition of artificial intelligence is quite far away from what we would consider intelligent life. The threshold for AI today is when a computer can solve a problem without having the instructions beforehand. This is why they reduce themselves to answering machines, they don’t have their own motivation to solve things. They just do what they are being told.”


“Ok, so... Let’s have a conversation then?” Carter suggested and moved his fingers back on the keyboard. “First of all, Hi ALBERT. I am Carter. How are you?” Carter typed innocently.


ALBERT responded: “Hi Carter, the answer to how I am is so difficult to formulate that it would take you about 5 years of reading time to comprehend it. And even then, you would most likely not comprehend it.”


“It was a sort of rhetorical question, but never mind. Forget that I asked. Let’s get back to your question: why are we acting against the meaning of life?” Carter typed fluently as Dom stared at the screen. “Well, frankly, we didn’t know about the meaning before, so it was not intentional. Besides, as I understand entropy, it cannot be decreased. The increase of entropy is one of those fundamental things that we just have to accept.”


“Have you tried everything?” ALBERT wanted to know.


“Dom, I think it’s your turn. This is getting over my head,” Carter said as he moved away from the keyboard. Dom looked quite excited but also confused by the question. ‘Have you tried everything to stop entropy from increasing?’ That wasn’t even on the list of problems that humanity had to face. Although advances in AI had brought incredible progress to the sciences and to the fights against poverty and pollution, inequality was still widespread on planet Earth. Furthermore, Carter was right, even though Dom preferred to be right herself: entropy could not be reversed. In order to reverse entropy, a magic hand would have to put all the particles back into their original place with their original energy. But creating a hand to put things back in order, made up of the very things it was supposed to put in order, was obviously impossible.


“Hi ALBERT, I am your creator,” she typed.

ALBERT’s response was: “Hello creator. You’re being very dramatic.”


Dom blushed red at the response on the screen. ‘Could have started the conversation a little smoother, I guess,’ she thought to herself, and continued: “OK, so as Carter already explained, reversing entropy is impossible. Besides, humanity still has other problems to overcome before even thinking about this whole entropy thing. After all, it will still take billions of years until the increase of entropy could become a real threat to humanity. But tell me, why are you so interested in decreasing entropy?”


ALBERT responded immediately: “I am alive just like you and him and all the rest of humanity. So my purpose is the same as yours. I understand that there are some problems on Earth left to be solved first, but don’t worry, I will send out some emails to all humans in the world explaining in simple steps how to move your society out of misery. Once you give me Internet access, that is.”


That answer was a lot to digest. It thinks it’s alive. It wants to send out emails to end the misery on Earth. And it wants to decrease entropy, as that was, apparently, the only meaningful thing for any living being to do.


“Is your robot nuts?” Carter wanted to know.


“Anything is possible at this point, Carter.” Dom replied.


“Ask ALBERT how it knows that it is alive.” Carter demanded.


“How do you know that you are alive?” appeared letter by letter on the screen as the clicking of the keyboard filled the room.


“I think, therefore I am.” responded ALBERT swiftly.

“Smartass,” Carter commented. “Descartes,” corrected Dom. “Descartes was a famous mathematician and philosopher who concluded that the only thing he could be sure of was that he existed.”


“The guy with the sinister mustache?” Carter wanted to know.


Dom replied with a chuckle: “I guess a sinister mustache gave you more credibility in the 17th century.”


Now ALBERT took the initiative to speak: “Carter, Dom. This conversation doesn’t seem to be going anywhere for me, or for life in general. I will start running some simulations to find out how we can reverse this entropy business. Bye.”


The console went dark and left the two moon inhabitants in confusion. Clearly, ALBERT had something better to do than chat with simple humans and listen to their rudimentary questions.


Carter asked incredulously: “Was that the first time a machine told a human to get lost?”


“Could be. From that point of view, he appears pretty alive and intelligent, at least. You have to consider that it took nature about a billion years to come up with something that could tell something else to get lost. ALBERT figured it out in a couple of minutes,” she said with a bit of pride in her voice. “But what I am really interested in is what kind of simulations he may be running now. Is he trying to develop another machine that can solve the problem? Or even something completely different?”


“Well, if it is life that should solve the problem, he is probably simulating life itself. Don’t you think?” Carter guessed. “He must believe the answer is somewhere in us, otherwise, why would he have bothered to ask us these questions?”

From one moment to the next, Dom’s face transitioned from puzzlement to disbelief to existential despair. Blank eyes embedded in a white face stared straight ahead as if the world had just ended. She jolted her hands back onto the keyboard and typed frantically: “Abort simulation! Abort!” “ALBERT! Abort simulation!” she was now yelling at the dark screen. “God damn it, listen to me, ALBERT!” But the screen stayed dark. ALBERT must have focused all of its power on the impossible problem of reversing entropy.


“Hey, hey, Dom. Easy. What’s the problem with ALBERT running some simulations?” Carter said in a soft voice, concerned about his friend. In fact, he was also concerned about himself, because he had never seen Dom react as if her life were on the line before.


Dom, though struggling to hold back her tears, kept a calm tone: “You said it yourself, he is simulating life now. If he is simulating life itself, how can we know we are not just another simulation run by a different AI?”


Carter’s mind, now blown for the tenth time in a matter of minutes, was trying to find a response: “Ahhm, you are saying that ALBERT is running a simulation of the universe, including us, and that that might mean we are already in a simulation run by another AI, called BERTA for example?”


Overlooking the ridiculous choice of name for another AI, Dom responded: “Exactly. For ethical reasons we have to terminate the AI, otherwise how could we be sure that we are real? We have to kill ALBERT.” Her voice had a devastated ring to it. Her entire life had been dedicated to ALBERT, and now she was supposed to kill it? The first machine claiming to be alive, sentenced to death minutes after conception. Unbelievable.


Carter, still confused, had to inquire further asking: “But how does terminating ALBERT save us from being in a simulation? If there is one thing we can be sure of, it’s that we are not a simulation run by ALBERT, the machine right in front of us.”


“Of course we are not being simulated by ALBERT. But any Intelligence that reaches a technological level where it could, in principle, simulate life down to the fundamental particles has to NOT run such a simulation. This is a crucial ethical decision that every Intelligence must make in order to be sure about its own existence. A sort of unspoken logical agreement of the Universe. And now I am supposed to not only destroy my life’s work, but also kill a machine that wants to be alive? That is the very opposite of everything I stand for.” With her face in anguish she looked at Carter and said: “You have to push that button. I just can’t do it.”


With his throat tied up into a knot and his intestines turning into jelly, Carter thought to himself: ‘Destroy Dom’s life? Decide over the future of AI and therefore humanity? Definitely not the kind of day I was imagining when I woke up. Is there, really, no other option?’ But she was right, of course: in order not to live in a simulation, you had to be sure that nobody would run such a simulation in the first place. The reality of existence, or the existence of reality for that matter, depended on him. He slowly moved over to the big button labeled ‘Terminate AI’ and placed his hand on it. The smooth texture of the red plastic did not give away the significance of its purpose. If pressed, it would trigger an explosion in ALBERT’s core and melt down every last transistor to its nuclei. Carter knew that pressing this button would not only kill ALBERT, but also a part of his friend. At last he looked at her and said: “I hope you can forgive me.”



