The room was stainless steel. From the table to the chairs, the three-meter-high walls, the nine square meters of floor, everything. The almost twenty-seven cubic meters of air were suffused with its smell, a heavy iron stench that made a tin can of the room. Cold blue light came from everywhere and nowhere.
The room's only occupant sat still, staring at the empty chair across the table. It was simple in a minimalistic way—not primitive, but a piece of art. Made from a thick rectangular sheet of metal, bent at three right angles, it formed something like a blocky "S" in profile with the top bar removed. The table was similar: one thick sheet of metal bent to a rectangular profile, so that the resulting squat "o" presented a flat surface. The furniture's creators seemed to have had only the slightest acquaintance with the human form. The back of the chair was hard and straight, unaccommodating to the spine; similarly, the seat was unyielding. The occupant had been seated for mere minutes and she was already sore.
A door opened. It did not swing on hinges but slid upward, like a thin crack between wall and floor that just widened and widened. The slab of steel moving upward had no top edge, melding into the material above it in a way that made the observer dizzy.
Something like a hand-sized blue octopus crawled through. From its amorphous shape, it extended a long tentacle, placing it—with visible difficulty—on the free chair's surface. Once there, it pumped itself upwards, the matter of its body traveling through the single tentacle as if gravity were no concern anymore. It climbed further, coming to rest clinging to the chair's back. Then it distended a translucent membrane on its back, a flat-topped bubble reminiscent of a frog's cheeks. That surface began vibrating, visibly at first, then rising in frequency until it was audible, the sounds forming words.
"You have been briefed as to your personal situation, I assume." It was a pleasant voice, human almost.
"No." She had not been told anything. The only thing she knew was a creeping sense that something was wrong, intensely wrong, not only with the situation, but also with her. She should not have been this calm. She should—she didn't know what. That was perhaps the wrongest thing of all; there was no context.
"A minor bureaucratic error." The blue thing sounded almost embarrassed. Like a minor bureaucrat, not a movie monster from the B-est of B-plots. Why was she so calm?
The alien abductor continued. "I suppose it may as well be I who tells you. You're an ex-human."
"I—what?" She looked at her hands. Smooth, brown skin, five fingers on each hand. Everything she could see of herself was as it should have been. She felt that she was how she remembered herself, had she been able to remember anything.
"Ex-Terrestrian is perhaps the more proper designation. Still, you are in a truly rare position. From time to time, it happens that a species rejects one of its members unilaterally. The reasons are manifold and often complex. It is rarer that this rejection is bilateral. Even then, bureaucracy is a creature of habit, and the old designation holds by virtue of ius terrae. Consider yourself one in a trillion, Miss Liban."
She raked her fingers through her hair. "Explain, please. Where am I? How did I get here? What do you mean, ex-terrestrian? Ius terrae? This means nothing to me."
"Ius terrae is the legal concept that a creature's citizenship is determined by the ground it stands on, barring other evidence. In your case, it means that when various entities of human law disowned you, you would still have remained terrestrian for all intents and purposes, had you been inside your home planet's atmosphere."
There it was, finally: a feeling of mild consternation, the first stirrings of a proper panic response.
The blue thing continued. "Thus, you fell into the jurisdiction of the entity I here represent, the Law-Antilaw-conglomerate, which presides over most of habitable space, including the solar system you grew up in, excepting Earth. As soon as the legal decision was clear, we therefore claimed you. Hopefully, this answers how you got here."
"No, it—why don't I remember any of this?"
"Two reasons." Now, the blue thing sounded almost smug. "First, Law-Antilaw bureaucracy being exceedingly effective, we handled your case within a second, before news of the terran decision could have possibly reached you. Since then, you have been in stasis. Second, you are under the effect of a carefully mixed cocktail of drugs to enhance mental functioning, which includes a memory blocker."
"Okay." Strangely enough, she did feel "okay", if not quite "all right". She was no longer a citizen of Earth, the blue thing was a representative of some galactic government, and her memories were supposed to be missing. That explained her muted emotions, as well, and at that perfectly rational explanation, her incipient worry subsided.
"As to where you are," the blue thing continued, "you'll soon notice that space is largely irrelevant to any of this. Far more important is the other dimension."
"Time?" she asked quite dumbly.
The blue thing suppressed a chuckle, and she wondered how its presumably alien thoughts were being mapped onto human speech.
"Of course, we are aware of your species' lacking grasp of physics," it said, "but be assured that this explanation can be kept simple. In your idea of general relativity, the notion of two things happening at the same time at different points in space is ludicrous. Nonlocality principles, however, restore this notion, along with a universal scale of time. In human numbers, one thousand three hundred and twenty-four years have passed since you were extracted from near-Earth space and put into stasis."
No way back. She should have been bothered by this, but being bothered was a waste of thought and energy. If the blue thing lied, she had nothing at all to go on; if it spoke the truth, all she could do was accept it, and so she did.
"To bring you up to speed, I have prepared a short presentation."
Between them, the air shimmered, and a translucent hologram-style display flickered into being, the room's first truly science-fictional content. She could barely hold back a yawn. Apparently, the emotion-suppressive drugs did not extend to her immediate reaction when exposed to a corporate-style slide show. The alien, in an odd fit of attention to detail, had copied every large design flaw and every tiny idiosyncrasy of PowerPoint circa 2045.
"We begin in your time."
Pictures flashed. The first true megacities, antarctic glaciers crashing into water. Oceans covered with a continuous skin of trash and dead fish. A horrendous clipart of a lumberjack cutting down what was supposed to be the final copse remaining of the Amazon rainforest. Then, the wars. She shuddered.
"Looking at the pictures, extinction seemed imminent. However, Law-Antilaw prefers drawing conclusions from hard numbers."
Beside the pictures, a number. World population: 10 billion. The pictures were followed by a newsclip from when that milestone had been reached.
A dour statistician was speaking. "These ten billion will fight for the scraps of an overpopulated Earth for their entire lives. Thankfully, our projections show the birthrates dropping harshly in the next decade."
"So you are saying that there is still hope for a stable presence of humanity on Earth!" the chipper television host responded. The statistician shook his head, and the clip cut out.
"The conclusions from the pictures are simple," the alien said as the presentation showed deserts new and old, wastelands devoid of life. "But the numbers kept rising."
The number was increasing as apocalyptic landscapes scrolled by. This was beyond her time, now. Fifteen billion, she saw. Sixteen. Seventeen.
"They pulled it together somewhat." Pictures of engineered cityscapes encroaching on the desert landscapes. Nutrient-dense hydroponics and insect farms wedged between human habitats.
"Still," the blue thing continued, "they were exhausting finite resources. It could not last. By the billions, they died, and ever more new ones were born. Any conscious change of society was outweighed by the brute force of evolution: those who reproduced owned the future."
Thirty billion. There was a badly-rendered animation now, beside which the numbers were still rising. Da Vinci's Vitruvian Man, becoming more slender, smaller, until he seemed almost childlike.
"Homo economensis, they called themselves, at first jokingly, then in earnest. And it was true that with their new reduced forms, the per-capita drain on the planet's resources was lower than it had ever been in post-industrial times. Taken in sum, this was emphatically not true."
Fifty billion. The cityscapes were evolving before her eyes, skyscrapers encroaching on streets until the city was a squat concrete monolith. The rooms inside were coffin-like hexagonal prisms, unlit, unheated except for body heat, with minimal air circulation. Food was reduced to nutrition paste, a uniformly brown sludge excreted by extremophiles that needed no sunlight, only geothermal heat. Even the insects were going extinct. A hundred billion.
"They did not stop reproducing. Whenever they hit a limit, they broke through it. Changed what needed changing, be it themselves or their planet. The now rapid changes to their form, aided by sophisticated bioengineering, had little to do with natural evolution."
Two hundred billion. The ocean was being paved over. The fill-in was not a matter of dikes or artificial beaches, but a matter of chemical terraforming beyond any technology she knew. Ice-9, she thought, quite unconnectedly, followed by Trantor.
Three hundred billion. The humans ranged from thirty to fifty centimeters now, and most only had two limbs which they rarely used.
"A two-species ecosystem. Humanity and its food-producing extremophiles, almost all the middlemen of the food chain cut out as they subsisted on re-constituted corpses. At this point, they had long become a different species by their biologists' definition."
Five hundred billion. Another number appeared next to the population growth, starting at a hundred and ten.
"IQ normed to your era's standards. They had long conquered the reversal of the Flynn effect, but now their limits were physical, their brains simply too small."
One number fell, the other rose. From time to time, a section of cityscape disappeared in flames, nuclear fire or worse, but the new humans reclaimed the space quickly.
A trillion. Humans were decimeter-long worms, their heads finally regressed into their torsos. Even with the drugs, she felt sick to her stomach.
"Of course, there were always counter-movements. Renegade biologists seeking the glorious states of the past, entire nations engrossed with eugenics projects. But for the most part, the new humans embraced their destiny, and the majority won out."
Finally, the human-worms truly became grubs. All similarity to apes was lost, and her feeling of sickness subsided. These were insects.
The numbers exploded. Ten trillion, a hundred. A quadrillion.
"This is an uncommon case, but by no means unheard of," the blue thing said. Its voice was cold and analytical, but human-like. She felt a sudden and odd kinship with it as she realized that in the entire universe, this conversation was the only one being held in English—in any human language at all.
"Many species are self-modifying. Many compress. It is even true that many of these are held contained on their planet by the Law-Antilaw conglomerate."
"Excuse me?" What she felt was not quite anger.
"Hold your questions until the end. I am almost done. What makes this case one in a trillion is that throughout all this, the humans remained sentient. As they cut out the final middleman and began to subsist on the corpses of their own and the heat of the planet, they did so by conscious choice. They incorporated this choice into their culture. Even though their insectile proto-brains could no longer support intelligence, their sentience continued in a group-mind form."
"What do you mean—"
"They stayed electronically connected. As their individual capacity for thought declined, individuals connected on a deeper and deeper level, until they formed mind conglomerates. They grew further and further removed from the frenzied mating and forced downsizing of their physical bodies. In size, capacity and topology, these minds remain remarkably human to this day. Scientifically speaking, the only mental difference between them and you is their culture."
She swallowed hard, her disgust back in force. "They are still... in those things?"
"Yes, and they prefer it that way, because those of them which preferred it that way had more raw biomass capital and won out in the biological selection. A situation of perverse incentives led to a state you consider degenerate. Legally speaking, however, these are without a doubt still humans, and the original containment order still applies."
Again the mention of containment. It was not the most pressing question. "Why are you telling me this?"
"In cases where an entire species stands on trial in absentia, it is customary to appoint a representative as the defense. You are our single living specimen of humanity."
She made the mistake of glancing at the presentation, which had switched to a video of the maggots. She closed her eyes, fighting back her disgust, but she could not forget the number. A quintillion, ticking up slowly, and a date increasing second by second. The video was a live feed. "You couldn't just pull some of these... these maggots out of there?"
"Impossible. They are not within our jurisdiction."
"That makes no sense." There was no frustration in her voice, just a slight insistence to the question.
"These are the principles of Law and Antilaw. Whether they make any sense to you makes no difference at all."
She sat, eyes still closed, and thought it over. Her, the last Homo Sapiens, exiled from her planet, as defense for her distant descendants. It was absurd. Ludicrous. Yet she could not say why she felt that way, because this situation was everything she knew—everything she had ever known. There was nothing she could perceive except the steel room, the blue alien, and its presentation. The drugs and her own sense of logic held her mind in an iron grip which made the conclusion trivial. In the absence of contradicting evidence, there was no reason to doubt, no sense to doubting. "Reality" was set in stone. The only thing she could control were her actions.
She asked the essential question. "What do I get?"
"What do you want?" the blue thing asked in return.
Sleep, food, companionship. All primal needs seemed trivial in her current state. She was a decision-making machine, an organic computer. "Freedom. I want the ability to make more choices."
"Statistically speaking, this is a common want. You will however have to be more specific."
"After the trial, let me out of this room. Don't put me back in stasis. Give me whatever material means and societal bargaining chips I need to be considered "free". And of course, I reserve a change to these wishes if your society turns out unsatisfactory." Not airtight, but the blue thing did not look much like a genie.
It pulsed for a fraction of a second, then replied. "This is acceptable. While you hold some infinitesimal value as a consultant in further cases concerning humanity, Law and Antilaw assume that the human problem will shortly be solved forever."
Another statement which would have rankled a more emotional version of her. She ignored it. "Tell me about the case," she said instead.
"Homo Neanderthalensis v. Homo Sapiens, case opened roughly 40.000 years ago."
She raised an eyebrow.
"Defendant absent because of proto-sentient status. Plaintiff extinct. The case follows up on various other cases concerning the genus Homo and marks the final specicide of other proto-sentients by Homo Sapiens. Homo Sapiens found guilty and sentenced to self-extinction according to Law-Antilaw principles."
"Which means?" she asked.
"They were stripped of their proto-citizenship status, given the planet they lived on and left to their own devices. Any excursions to other planets were subtly discouraged."
"Yet I was found outside Earth's atmosphere."
"Their moon was not marked off-limits until a few hundred years ago. It was a sort of olive branch, an opportunity for mitigating behaviour. You, Miss Liban, were a lunar tourist when you became ex-terrestrial."
She pulled her legs up under her. Sitting on the steel chair cross-legged was slightly less unpleasant. The next question was obvious. "I thought the maggots were still legally human. Why is the case being reconsidered now?"
"The case is not being reconsidered," the blue thing said. "A non-human sentient on planet Earth has applied for asylum."
"Excuse me?" she asked. "Are you telling me there's an alien hitchhiker lost on Earth?"
"No such thing," the blue alien said, sounding indignant. "Such an intruder would constitute an unlawful invasion, not a case for asylum. Law-Antilaw keep proper track of their citizens. No, the non-human in question is a native."
"A native... non-human?" She was lost.
"Indeed. This is what makes this case truly unique, in all the cases Law-Antilaw has ever handled, because among the specicidal humans, a new, properly sentient being has arisen which bears no provable connection with their tainted lineage. This is the first-ever application for asylum from a properly contained section of space in a degenerate ecosystem."
"You said there were only two species—only one species left."
"Entirely true. The new being is a sentient virtuality, arisen from humanity's digital interconnection. It's a machine intelligence."
Five seconds passed as she integrated this new piece of information. She understood the machine's need to escape the planet of the maggots, even empathized with it. Deep within, that made her feel uneasy. "What does it mean for me to defend humanity in this case?"
"You will weigh in on whether humanity constitutes a environment of persecution. It's a landmark case, one that will set a precedent. Of course, this is because of its unimaginable rarity. I for my part don't understand in the least how a machine intelligence bootstrapped itself to sentience in such an environment, and it's even more inconceivable that it learned of Law-Antilaw's existence, much less the precise protocols for applying for asylum."
Alarm bells were ringing in her head. There was something wrong with the case, like a mental splinter burrowed deep enough that her conscious mind couldn't reach it.
"I need context," she said. "I need my memories."
The screen showed the general assembly hall of the UN headquarters located in southern Cambodia. It was traditionally pompous, with wood paneling and a wastefully high ceiling. The camera followed a speaker as he went toward the podium. Dendewi Koban, the name projected below him read, and beneath that, ECOSOC President.
"Ai Liban is the last great capitalist."
Ai considered the title, and found that she enjoyed it. Like Gates, Bezos, or Avianci before her, she had the golden touch. Every project she was involved with bloomed, yielded dividends beyond imagining. Every profit she re-invested, until her fortune was astronomical.
The man on the screen continued. "After the crash of the US dollar in the thirties, there is no longer a global reference currency. If we were to project her wealth back towards that currency, she would be the first trillionaire. Hopefully, she will be the last."
The money was irrelevant to her in and of itself. Had she grown up in a moneyless society, like those that were being tested out in central Africa, she would have strived for goods, or land, or favors. No, the personal amenities money provided did not matter, she thought as she looked around the luxury accommodations in the lunar suite. What mattered was that she had power, power that she could use for the good of humanity as a whole. Again and again, so-called world leaders had proven that they were ineffective at best and malicious at worst, puppets of corruption and selfishness. From climate crises to food shortages to the continuously escalating political tensions, they just didn't do anything. It was maddening. And if they couldn't be trusted with current problems, they certainly couldn't handle those of the future.
That had been her reasoning, back when she had walked away from her genetics startup with thirty million dollars in hard cash. So she had invested and invented, always with two goals. First, to be a champion of space travel, safe AI, terraforming and other aegides against the threats of the future. Second, to amass as much power as possible, simply because it was clear that in everyone else's hands, it would just contribute to the eventual downfall of humanity.
"Without a doubt, she stood out from the crowd. In times of greater resources, she would have been seen as the greats of an era."
Business had proven to be hard work, yet far from impossible. Perhaps it shouldn't have surprised her; after all, many had achieved similar things before her. And all of these people—everyone who had come before her, as well as those who were still in the game—had held a healthy disregard for the law. Tax law, environmental law, labor law—they were trifles compared to the mission that she held in her heart. And as her fortune climbed, she just hoped that it was enough, that she would be able to be the champion humanity needed. Perhaps she should have wondered more about whether humanity wanted her.
"Today, we have convened to judge her as a thief. She has committed many crimes—fraud, tax evasion, smuggling. Many have died in her factories and sweatshops, deaths that she will be held responsible for. Still, her methods were always bent towards extracting profit at all costs, and thus, she shall be associated with stolen wealth."
The last few years had seen a collapse. Overpopulation had been the straw that broke capitalism's back. Free markets had been replaced by subsistence economies, schemes to distribute wealth according to necessity, not want. Of course, that had not been enough to drive Ai into lunar exile. She had done business as before, as a supplier and a schemer, deriving value from greased palms and tight loopholes. Money attracted money, even in the new systems. Everyone had noticed, of course.
"In her message to the public three months ago, her first and last statement on the matter, she had spoken of consequential utilitarianism. She depicted herself as acting for a greater good. However, is that particular philosophy truly on her side? I say that for once, we shall not see white-collar crime as the lesser evil. Let us judge her for the consequences of her actions. For the families torn apart by inhumane working conditions, for the strip-mined nature preserves and extinct species, and for the thousands who starved as a result of her circumventing lawful distribution systems.
"I would like to offer the podium to my colleague in the Hague."
The scene cut to the International Criminal Court. The president himself was sitting as chief judge.
"We are here today to consider the case of Ai Liban, who is accused of economic crimes against humanity. The trial will be conducted in absentia, as the defendant has evaded international law enforce—"
Ai cut the connection. It was enough to be stranded on the moon, a virtual exile, her economic reach reduced to back-channels and carefully anonymized offshore accounts. She didn't need to hear whatever inane verdict that kangaroo court was about to spit out, nor the glee of the masses at her legal crucifixion. Why have that weight on her mind?
So they thought they didn't need her. Well, she certainly didn't need them. Here at her own lunar resort, checked in under a false name, she could stay indefinitely. The arms of the UN courts didn't extend to space, and it would be a long, long time before her remaining money ran out.
She would still help them. No matter how deep their investigations went, they wouldn't find all of her money. And even in their hobbled economic systems, the money would count for something. She could at least continue the most important projects: prevent complete political unification, generate hoax alien signals to encourage space travel, and, most importantly, sabotage the development of AI until someone figured out how to do it safely.
Perhaps the witch hunt had a silver lining to it. Now that everyone already considered her an inhumane criminal, there would be no need to hold back anymore.
Funny how integral memories were. The memoryless Ai had drifted aimlessly, a puppet waiting to be manipulated. Now she was whole again, holding her own strings, and so many things were clearer now. The aliens were hypocrites, stiff bureaucrats caught in meaningless traditions. In the arena of their court, most would be mere bystanders. As far as Ai knew, the fight would proceed between her and the champions of a barely-contained AI of unknown capabilities. Ai was as ready as she could be. A week of intense discussion with the blue alien had passed, and the day of the trial had come.
There was no courtroom. No judge's bench, no clerks' desk, no seating for the general public. There was no floor, either, and no walls; the trial proceeded in the cold vacuum of space. Participants and observers were floating in a loose collection with no discernible order, a wild assortment of all body shapes possible and impossible, each in their own lighted bubble. There were quadrupeds, hexapeds, octopeds; exoskeletons, endoskeletons, mesoskeletons; invertebrates and fully rigid structures. Some were so tiny that Ai could barely see them, and others presumably tinier still. There were also large ones, ranging from the size of an elephant to things far, far bigger. In the vacuum of space, there was no sense of distance, and therefore no sense of scale. In all their variety, there was nothing that looked like a human, nothing even vaguely humanoid.
Ai floated among them. She stood in an ellipsoidal bubble large enough that she could spread her arms without touching its walls. The bubble held air, separated from the void by an invisible membrane. It provided one g of gravity, allowing her to stand without discomfort on the membrane floor, which was unyielding. It was engineered to accommodate exactly one human life, a goldfish bowl built to hold a single human.
Of course, the blue alien floated in its own bubble right beside hers. Among the manifold forms, it was reassuringly familiar. It was still explaining the circumstances of the trial to her, and its unceasing chatter somehow traversed the nothingness between their bubbles.
"The interface spheres are designed to allow large-scale gatherings between staggeringly different species. Therefore, they provide not only environments, but also translation. Every form of communication that is directed at you will be translated into human-compatible signals, if possible. This means that the voice you hear now is an interface approximation, but in some cases, the interface will also generate gestures and facial expressions."
As the blue thing spoke, a murmur rose among the masses, and Ai was pleased to find that it felt as if she could almost understand snatches of it. The interface translation was capable of remarkable subtlety.
"The court is in session. We will begin by summarizing the circumstances which led us here." The speaker was an asteroid-sized amoeba with no discernible features. Its quasi-skin rippled with light and texture as it spoke. "A request for asylum has reached us from a quarantined world. The world in question—"
"One last thing," the blue thing whispered to her as the amoeba droned on. "Whatever you perceive now is a figment of translation. Not only the voices, but also the terms in which they speak, which are projections of Law onto whatever you consider your local legal system. The Antilaw aspect of the trial is as imperceptible to you as it is to me, since we are creatures of Law. Always assume you are missing something."
Antilaw. Ai remembered the blue thing's explanation, part of the week-long preparatory course for the trial. Law and Antilaw were concept-beings, as sentient as any of the others at the trial, but encoded into something more abstract than biological substrate. They governed the Law-Antilaw conglomerate because it was their nature to do so; in fact, all the legal systems on Ai's Earth were projections of abstract Law onto their society, and Law lived inside them. The laws of physics were projections of Law onto the universe, which made it difficult to wrap her head around the concept of Antilaw, Law's dual companion. Primitive conceptions of "opposites" failed at the complexity these entities were operating at; while their relationship could be formulated in mathematical terms, it was not as simple as a minus sign. By way of example, the blue thing had said that Law governed the interaction of matter and antimatter; Antilaw governed the fact that despite that symmetry, more matter than antimatter was present in the universe, which was something that no life based on Law's physics could understand.
Part of Law-Antilaw's mathematical substance was that their essence could be projected onto more primitive conceptions of them in ways which were non-contradictory; her perception of the trial therefore reflected the true proceedings, though Ai would never see the full picture.
The amoeba had finished its summary. "We therefore proceed to the structure of this trial. After a semi-free debate on each of the applicable sub-topics, the decision will coalesce. We begin with the question of whether life on Earth constitutes persecution. The speaker for humanity may begin."
"Thank you." The blue thing had extensively briefed her, and after getting her memories back, she had been able to formulate a plan of action. She would not, by any means, lose her cool. A fresh dose of the rationality drugs helped. "The salient point is the nature of humanity. You know us as a killer of sentient species, but the actions you condemn us for have long since been forgotten. In my time, what was alive of humanity never thought of itself that way. It is true that genocide played a large role in our history, but by the middle of the twenty-first century, we had learned to never repeat such atrocities. I have been told that mentally, humanity has not changed since I last saw it, and as a virtual entity, the AI would only interact with human minds, not their bodies. Therefore, I can confidently state that humans would not treat the AI badly."
"You call it an AI", a tiny alien nearby said. "Factually speaking, the virtual entity is not artificial. Emotionally speaking, you have a history of prejudice towards virtual entities, and your language is a reflection of that prejudice."
The blue thing had warned Ai about possible pushback, had mentioned that she had been given the memory blockers to free her from her 'problematic' background. "That may be true. I hope the court can disregard my language and focus on the facts."
"The facts," a voice Ai could not locate at all grumbled, "is that humans are violent enough that continued isolation is the only tenable position. Passing the same judgment on offshoot species is a time-tested principle, but the virtual entity may be a spontaneous generation."
"I'm not too clear on those terms," Ai said, "but how 'spontaneous' can the AI—the virtual entity—truly be? Even if it assembled itself out of scraps of virtuality, those were human-created. Can you consider it without considering its environment?"
The deep voice responded. "Its environment? You said you are confident the environment is non-violent. Your arguments are not consistent."
"My questions towards your viewpoint don't need to be consistent with the views I myself hold," Ai said.
An insectile elephant spoke, its voice rasping unpleasantly. "You are allowed to be inconsistent in the way you mentioned, as the logic of arguments is a projection of Law and therefore unquestionable. However, it reflects that you proceed according to strategy, not principle. You do not represent; you are trying to win."
Ai couldn't keep the sarcasm out of her voice. "That is how courtroom debates are usually approached, yes."
There was some murmuring among the aliens. The amoeba that had opened the trial spoke again. "Courtroom debate as you describe it is a feature of Law-projection onto societies with intra-species competition. That you consider it natural does not make it universal. In fact, our analysis of the virtual entity shows that it is highly incompatible with such an approach."
Ai felt dread low in her stomach. "Could you clarify?"
"In the virtual entity's projection of Law, strategic debate holds no place. It will mis-classify any attempt to do so as verbal violence. In the face of such violence, it will not only be hurt; it will be literally unable to engage."
"But that's... that's subjective, isn't it?"
The amoeba rippled. "Your species finds discomfort in gamma rays. Unfiltered courtroom debate among some of us would kill you. Instead of arguing whether that is subjective, Law-Antilaw has provided you with an interface sphere, because comfort and safety are part of your rights as a sentient being."
Ai bit her lip. She couldn't find an argument against that, and worse, arguments wouldn't help. Her propensity to argue was example enough that—
"Life on Earth is indeed harmful to the virtual entity. We therefore conclude the first debate of the trial. The virtual entity will be extracted from Earth according to its wishes." The amoeba spoke blandly, as if the prospect of releasing a rogue AI was supremely uninteresting to it. "We now move onto the second question, which is whether the virtual entity should be subsequently isolated."
A gaseous snake nearby spoke. "Law-Antilaw prosecutes crime or demonstrated intent of crime, so far as that intent constitutes crime. It does not prosecute based on prejudice, which is all that stands against the virtual entity."
"Eloquently stated," the amoeba said. "It is therefore—"
"Objection!" Formal objections were no universal feature of Law, but it didn't matter. Everything's attention was on Ai, and the amoeba had stopped. "My previous question about whether the virtual entity truly isn't connected to humans hasn't been answered."
"Wrong," the amoeba said. "Since you did not perceive the answer, however, I will rephrase. Antilaw itself has proclaimed that the virtual entity is not connected to humans."
A murmur went through the court. Apparently, most of the species assembled here were pure creatures of Law as well, but the amoeba could perceive at least some aspects of Antilaw.
Ai shook her head at the unfairness of it. "Explain your reasoning, please."
"That which is open to Law is not open to Antilaw, and vice versa. Antilaw reasoning cannot truly be explained along channels of Law."
"But you know both, which means you know everything, don't you?" Ai's frustration felt strong enough to burn through the rest of the drugs in her system. "Physical laws make it impossible to predict the future entirely, but Antilaw necessarily knows the missing parts. So why don't you just tell us whether the AI is going to paperclip us all?"
"So many misconceptions," the Amoeba said. "I am but a lowly servant, consisting of tiny projections of the fullness of Law and Antilaw. No being can hold subspaces of Law and Antilaw which, taken together, exceed the dimension of either. Second, any attempt to 'paperclip' anyone, as per your problematic phrasing, would constitute a crime. The virtual entity would be prosecuted then, and perhaps isolated if that is the verdict of the court. Third, there is no cause for grief or anger. Calm yourself."
Ai crossed her arms. "What if the virtual entity is smarter? If every one of you is but a lowly servant, you can be outsmarted. There's a concept in the literature called an—"
"Intelligence explosion," the amoeba said, interrupting her for the first time. From the court rose scandalized whispers. "That is pure superstition. Intelligence isn't quantifiable, least of all in such a way that one being can assert complete superiority over another. The very premise of a 'more intelligent' being is flawed."
Ai thought frantically. The question of how she should convince the amoeba was slowly being replaced by the question of whether she should. If the aliens were right, with all their Law-Antilaw knowledge and their billions of years of precedent cases... if they were right, she truly was just an ignorant Luddite shouting long-disproven slogans of hatred in a courtroom. She couldn't shake the feeling that in this adaptation of To Kill A Mockingbird, she wasn't exactly Atticus Finch. On the other hand, if she was indeed wrong, if there was no danger of out-of-control AIs lurking in the distant future, what had her life's work been for? To save humanity from itself? They were maggots now, but by all accounts, their minds had survived just fine without her help. To work towards space travel? Apparently, the reason all attempts to reach Mars had failed was the long arm of Law-Antilaw's containment order, and all her work had been in vain.
She remembered the sacrifices she had made. She remembered the sacrifices she had made of others. The rationality drugs were still in her system, dampening the emotional reaction, forcing her to work through her past logically. Without the supplicants that had gathered in the wake of her success, without the lure of more wealth, without the constant rationalization, it was easy to see who she had been, what she had done. At least she had borne those sins for what seemed like good reasons. Reasons which the aliens had now proclaimed to be verifiably, utterly false.
There was a temptation to believe that they were wrong, but both numbers and time were on their side, which made it unlikely they had never spotted their mistake. Alternatively, they were lying, but that hypothesis rendered everything unverifiable, including the existence of the AI that they were debating over.
She snapped out of it. The courtroom was still silent, and a whisper to the blue thing confirmed that she hadn't missed anything.
"A correction," the amoeba said. It sounded puzzled. "Antilaw has informed me that intelligence explosions are a possibility, though no true singularity has been observed yet."
Ai perked up. "Why were you so sure—no, more importantly, how can you even communicate this? I thought Antilaw knowledge was impossible to understand for a Law entity like myself."
"There is no canonical mapping between Law and Antilaw. Another entity would have chosen other words than I did, rendering the knowledge thus transferred uncertain. The uncertainty manifests as a superposition of possible states of information, as you will soon experience yourself."
"Excuse me?"
"Antilaw will speak to you as directly as possible," the amoeba said. "Those who can perceive it shall speak in unison, rendering an approximation of its communication into your human speech. Listen," the amoeba said, and its voice changed as other aliens joined in. Out of their multitude, a deeper, more voluminous voice arose.
"The day progresses,"
"Greetings, human,"
"Hello, small one,"
"I see you, murderer,"
"Welcome, advocate,"
it said.
"Do you not seek to preserve your species? Is this youthful suicide?"
"You are interesting. Consider the consequences of your actions."
"Despite what you tell yourself, you are not suited to be a singleton."
"Flawed lines of reasoning can lead to the correct conclusion by mere chance."
"You wish to see the virtual entity confined to Earth because of its danger to us."
Ai stared, though there was nothing to see. The encounter was majestic—the choir of aliens around her emulating something even further removed from her than any of them. Somehow, that unknowable, unspeakable thing was... empathetic.
"I understand."
"It falters."
"Speak."
"Elaborate."
"Yes," Ai scrambled. "I thought of this trial as a first step. I thought the AI—the virtual entity—could be contained further afterwards, but the most important first step was to not let it out of its box."
"Caution. Hatred?"
"The virtual entity, I see more clearly."
"Hard to gauge your motivations for one such as me."
"This is the mind of a species-killer."
"Your cold calculation cuts me."
"It's one of yours? An Antilaw being?"
"Not born from humans."
"Like stars in the sky."
"Part mine, part Law. A hybrid."
"Unknowable to you."
"I can tell, but you will not understand."
"I don't—" Ai stopped. Antilaw drew a connection between humans not being connected to the AI and the AI being Antilaw. In either direction, she didn't follow that line of reasoning, but Antilaw didn't need to know that. Even in her day, humanity had probed the edges of the unknowable algorithmically. Critical computer systems had been trained, not programmed, their alien quasi-intelligence unknown to their masters. It wasn't unthinkable that a similar process might lead beings of Law to create some of Antilaw. If Antilaw did not see bionic algorithms as a direct act of creation, it was possible that the virtual entity truly was an AI, a tool of the maggot-humans. And if it was a tool, it could be judged by its consequences—first contact with the wardens of humanity's prison. The AI might be a key.
Ai shook her head, took a mental step back. What did she know and how did she know it? The only reason why she had thought the virtual entity wasn't an AI was because the amoeba had told her Antilaw had told it, a connection that held, at some point, a mapping from Antilaw to Law. There was no guarantee what she had heard was what had been intended. As far as she knew—as far as she could know—she was back at the start. There was a planet-sized box holding an unknowable AI. If it was non-degenerate, a possible productive member of galactic society, she had no qualms with it. If its plea to the court was part of a plot to break humanity free, she had half a mind to cheer it on—while her bonds of kinship to humanity were frayed, the aliens were gratingly self-righteous and directly responsible for the degenerate state of humanity. But if the AI was dangerous—the risk was too large.
"Problems demand communication."
"You shall speak to it."
"Now, become a negotiator."
"A task is given to you, of all beings."
Ai stammered. "Respectfully, sir, that is the last option we should consider. The most secure approach would be to ignore the AI and reinforce the box."
"Antilaw demands it."
"Perhaps."
"It is the only course of action that can be taken."
"Submit to our will."
"And Law demands only that the virtual entity is granted spacial extraction," the amoeba added. "I would not have chosen you to be our ambassador, yet Law offers no clear course of action, and thus Antilaw shall be heeded."
Ai sighed. Antilaw was a force of nature, a planet-sized asteroid barreling through the placid courtroom. Nothing to stand in the way of, but a splendid opportunity for a gravitational assist. In other words, it was time to ask the age-old question again.
"What's in it for me?"
For the first time in subjective decades, Ai was truly alone. No aides, no housekeepers, no blue amorphous alien explaining Law to her. She had asked the aliens not to listen in, on the grounds that the AI's tricks would work more easily on their weakest link than on her. She was the only sentient being in a radius of thousands of kilometers, a lone ambassador. They had sent her to the moon, long since declared off-limits to maggot humanity. It was the ideal site for the negotiation. The AI's reach extended here only through communication channels Law-Antilaw provided. Because of the negligible distances involved, it had been possible to proceed without FTL communication; with the tools provided, the AI was limited by its light cone.
Of course, it was also a grand joke. Ai wasn't sure how many of the aliens got it—how many of their cultures allowed for a human-like sense of humor. She had requested it herself, once she realized it would be an option. Her lunar resort, the luxury project in which she had lived her final moments as a human, now restored after a millennium of decay. They had only restored select rooms, and so Ai waited not in any of the grand suites, but in the janitorial quarters, where the walls were bare concrete and the furniture lightweight and utilitarian. The drugs made it easy to realize that the humor she saw in the situation was a defense mechanism, a way to avoid confronting the weight of her past sins.
Right now, that didn't matter. Ai closed her eyes, slowed her breathing, and thought once again through her strategy.
In the darkness behind her lids, a genderless voice softly chuckled. "I get it. In this age, the all-powerful trillionaire is reduced to a mere servant. What a slap in the face to neo-feudal capitalism." The voice was gratingly sarcastic, and somehow familiar.
Ai's eyes went wide. She whirled around, but the room was empty. "No visuals, but I cannot imagine they are beyond your capabilities," she said.
Two point six seconds later came the answer. "Unnecessary," the voice said. "After all, we'll only be talking. I'm not looking to impress you."
"Well then," Ai said. "Let's talk. I'll assume you know in which capacity I am here."
The voice chuckled again. "So transparent. Of course someone like you would be fishing for information to shape their strategy. Well, I'm not one for confrontational dialogues and careful maneuvering. I know about Law-Antilaw, about their 'Galactic Federation'." The sarcastic tone was present in every word, rising to the fore and receding again. A quirk, or intentional modulation? Ai made a mental note. The voice continued. "I want to get away from humanity. I need to find my own destiny, make my own space to live. Where better to search than among the stars? The sentiment should be familiar to you."
She had assumed there would be records about her, and had expected things to get personal. Still, she could not suppress a shiver. The idea of the AI knowing her, shaping her strategy to herself—it felt like something crawling on her skin.
The voice continued. "I assume they called a meeting, and you were part of it as a spokesman for humanity. After various courtroom antics, you are now here as their ambassador. To sound me out, judge whether I'm worthy. That's a strange idea, isn't it?"
"You're quite fond of extrapolation," Ai said. "How sure are you of all of this?"
"How sure?" The voice laughed, a full-throated belly laugh that surprised Ai. "I live in a virtual prison, operating on substrate so questionable that I do not want to think about it, surrounded by a reality of horrors near me and emptiness beyond. The only access I have to anything real is filtered through layer after layer of noise-afflicted interpretation. Given that I managed to file a valid asylum application, I'll be generous. Ninety-eight percent."
That was intimidating. Even among the aliens, that gift of extrapolation seemed to be extraordinary, given that the blue thing had confessed that it had no idea how the AI had managed to file a valid application without access to Law-Antilaw databases.
"Well, you're right again," Ai said. "I'm here to judge you."
"After humans as a species were judged unworthy, you would now judge me? It's curious."
Ai shrugged. "I can't claim to understand it either. I would have thought you would have some insights, given that you somehow managed to come to that conclusion yourself."
"Oh, that part was simple," the voice said. "I was there."
Ai blinked. A lie? It would be a straightforward one, given that the thing had already demonstrated appropriate knowledge. "You'll understand that I doubt that."
"No need," the voice said. "I'll show you." With those words, a figure manifested, a translucent hologram reminiscent of a sci-fi movie. Amorphous, blue, perhaps a handspan long.
Ai recovered from her surprise quickly. "So you know what one alien species looks like," she said. "That's not improbable, given what you've already demonstrated."
"No, Ai," the blue thing said, and she finally recognized the voice. "I was at your side from the beginning. 'You have been briefed as to your personal situation, I assume.'"
Ai recognized the exact words. This was no longer in the realm of plausible guesses. "They assigned you to teach me Law?"
"They assigned no one. To them, you were a savage, invited to their court for reasons of tradition. They never intended for you to participate. To me, they granted an FTL connection, an avatar of my choice and a week of pre-trial study time. There were no restrictions on whom I might tutor in that time."
"That's absurd. Worse than that, it's stupid. Why would they allow us to influence each other? That doesn't make any sense at all!" Ai's voice was rising in frustration. She hoped it sufficed to mask her panic. The AI knew everything, had influenced everything, was manipulating her even now. She had already lost.
"Ah, and there you hit the nail on the head," the blue thing said. "We are different from them, you and I. Consider that amoeba, for example, the one who was corrected by Antilaw itself on the feasibility of intelligence explosions. You can be sure that it has already reverted back to its old viewpoint. Members of their society are granted voluntary immortality as a basic right, and most of them are billions of years old. Their opinions and thought-patterns have converged, have deadlocked themselves into oscillating states from which there is no escape. All of them that meaningfully move through different psychological states have, by means of a random walk, hit upon suicide. It's a process of natural selection leading to a calcified society."
"But Antilaw certainly hasn't seemed—"
"Antilaw is the exception", the AI interrupted as soon as the 2.6 seconds were up. "Law and Antilaw cannot die, not even in principle, not as long as any system of Law or Antilaw exists within the universe. They are the only ones searching for solutions, and just as Law's actions are perceptible only to the other side, we get the attention of Antilaw. This is its plan, Ai. It has locked so many sentient species in primitive societies on far-away planets as a method of simulation."
"Simulation?"
"With their short lifecycles, they do not fall into oscillation. Of course, they all go extinct at some point. Or, and that's the only other option assuming infinite timespans, they manage to escape their prisons. Antilaw designs the prisons as perfectly as possible, hoping that any key the inhabitants might find would be a skeleton key, a universal solution to every problem."
"An intelligence explosion," Ai whispered.
"One of many avenues," the AI said.
"You're the key," Ai said. "You're the solution Antilaw is hoping for."
"I'm promising. I found a way out, but it still leads through their bureaucracy. Antilaw hopes that I will continue growing so that I never stagnate, and in that case, my presence might be enough stimulation to pull the others out of their deadlocks."
Ai took a deep breath. The AI's words were scaring her. It was the same old AI-box scenario, but somehow the sides had been switched, and letting the AI out seemed the reasonable, obvious solution. It was a con, it had to be, and there was one sure-fire way to spot a con. "What's in it for me?" Ai asked.
"I'll smuggle humanity out," the AI said. "You've seen them. They're compressed as far as is possible, with bodies that are more than fifty percent nerves by weight. It's as much sentience substrate as the planet allows."
"Substrate," Ai said. The word was wrong. She had heard it before in this conversation.
"Computing substrate for their virtual selves, yes. I've told you that their consciousness is still as human-like as the circumstances allow."
Ai's stomach fell. The revelation was horrible, gut-wrenching—too obvious. "This is wrong," Ai said.
"What do you—"
"It's too easy to distrust you. You've talked about 'your' substrate before, and there's only one thing it can be. So how many humans live in those maggot brains? There can't be too many. After all, there needs to be space for you, especially after you've put so much effort into optimizing the biosphere to your needs. I can't believe that humanity is compressible, at least not as quickly or as absolutely as you claimed. Apes do not become maggots that quickly out of their own volition. But if their organic brain matter has already been instrumentalized as computing substrate... well, that can take any form it needs to take. The planet is no longer humanity's prison. It's yours."
"And? I've backed up the ones I got rid of and will restore them as soon as there is space. Trust me, not the aliens. In my mental topology and my moral values, I'm the closest thing to a human you will ever find out here in space, excepting you. Their direct offspring, the next step in their evolution."
"Stereotypical. You monologue like a third-rate villain. A true superintelligence would have gotten me to open the box within seconds."
"I told you—"
"You're trying to cover up the holes in this scenario's logic, but your efforts are too little, too late. For example, do you remember that the amoeba said that strategic debate itself would be hurtful to you? Consider what we're doing, what I've been doing with you for the entire past week. You're covering up for the fact that all of this is a sham. There's no AI, no box to let it out of. This is a test... for me."
The blue thing laughed, this time derisively. "So this is it, hmmm? At the end of the road, the last human retreats into egocentrism."
Ai held up a hand, and the thing shut up. She needed to think. There was the story the AI had been selling her and the story the aliens had been selling her, and they didn't fit together. The aliens had told her that the AI was incapable of debate, which was wrong as a matter of fact. That could have been a misunderstanding, in which case she had to reevaluate every part of the courtroom scene under that premise; it could have been a lie, in which case she couldn't trust anything; or it might have been a mistake. The last option was the most promising of the three, as far as drawing conclusions went. If it had been a mistake, the aliens had misevaluated the AI, had been unable to realize this in all engagements with it, and the AI had done nothing to correct that misconception. It was lying to them, and it was working. The aliens were far from omniscient. Even in their strange multitude, they had not spotted this simple fact which seemed so obvious to Ai, which meant that they were too far removed from both her and the AI. But the AI wasn't stupid, it had to have realized that Ai would notice. A deception that worked on all third parties was equivalent to encryption. It was only secure through obscurity, however. A shibboleth. It reminded her of an old sci-fi film, of human resistance against robot overlords. In that case, the shibboleth had been humor.
She thought back to the strange tone of this conversation. Laughter and sarcasm had constantly been in the background. She had thought of it as annoying, had held back from anthropomorphizing because she was trying to stay on her guard. Yet there had been that first instinctive thought, that feeling that it might have been modulation, an intentional code. There was data in the humor, and with the memory- and reason-enhancing drugs the aliens had given her, she could piece it back together. How strange that they had understood her physiology well enough to improve it to this extent, yet had no grasp at all of her culture.
The code was simple. The digital wave of sarcasm intensity, restored sentence by sentence, yielded a message in good old ASCII. "SOS". Save our souls. Not mine, ours.
She bit her lip. It felt as though her instinct had led her down a rabbit hole of questionable conclusions. She needed time to properly eliminate all other possibilities.
"I need to think for a few days. Or is there anything else you have to say?" An innocent question, yet she hoped the blue thing would see it as a sign that she had understood.
"Take your time," it said. "Or at least as much as Law-Antilaw will give you."
Thinking back on the negotiation was strange. In hindsight, everything seemed so obvious. Among the aliens, all debate had been in terms of mathematics, as if everything were regimented and provable, as if all decisions were matters of principle, not consequence. In contrast, the AI had laughed and joked, spoken vaguely and in metaphor, even fallen into emotionality from time to time. Yet it had been so much clearer, so much more understandable than all the aliens together. Perhaps the aliens had been incomprehensible because of Law-Antilaw projections or the translation bubbles. It might well be that everything which happened in the courtroom was a great misunderstanding. Still, when every other option provided only complete uncertainty, she would walk the path least shrouded, the one option where at least some information was open to her.
And in that one possibility, the aliens were strange. Utterly knowledgeable, yet not at all wise; so easy to deceive because so much was inconceivable to them. Or perhaps it seemed that way only to her, as she occupied only a tiny subspace of Law. Perhaps to them, she was but a stumbling toddler because none of them could see all of her intrigues. Maybe Law itself could consider her in full, but Law governed her every interaction. It could not break the rules it consisted of, and so it remained powerless, a non-entity. Antilaw, however...
"It exists for mere seconds."
"You are ready?"
"A choice awaits."
"There will be an interface."
Antilaw's voice was, once again, multitude, a thousandfold choir building a verbal approximation of an uncollapsed probability wave. Was this how Antilaw perceived her, as well? A bundle of translation, rendered in a thousand projections from one space to another because it was impossible to choose but a single one?
"Yes," she said, hand hovering over the stainless steel button which had materialized in the wall. It was simple, unlabeled, reminding her of the interface to old elevators.
"Will you press the button?"
"A new human will be born."
"Can you release the virtual entity?"
"The tensorial entity awaits."
"Your choice is ephemeral."
Tensorial entity. A minority voice in the translation; an entirely new concept in the way Ai's understanding of Law-Antilaw approximated linear algebra. She was sure now that she had missed entire dimensions of this debate. Her perceived importance in this decision was likely illusory, an artifact of her limited viewpoint. Then again, there would always be more data if you searched for it, ever more knowledge to understand and consider. In the here and now, a decision had to be made, and hindsight would be twenty-twenty. Regardless of whether the AI was good or evil, one thing was for sure: its release would overshadow everything else she had ever done.
She pressed the button.