autocracy. “I can split the money with you, more than you were going to get from Kali.”
“What about Soo-hyun?” JD said.
Enda put her cup down on the floor next to her phone. “We can call the police. Kali’s people killed Khoder and shot up Troy’s apartment, the cops will have to do something.”
“The whole city’s a disaster area,” Troy said. “By the time the police do anything, it might be too late.”
“Zero, then,” Enda said. “I make Soo-hyun’s safety a condition of handing the datacube over. They have private security on retainer, they can take care of it, and we don’t have to put ourselves in danger.”
“How long will that take? What if it’s too late?” JD said, voice thick with worry.
Enda nodded. She saw Khoder on the chair, the blood, the bruising, the ragged black hole of his mouth as he gasped his last breath. She’d seen worse—she’d done worse—but this image was fresh.
“I want to help you, JD, but we can’t do it on our own. It’s me, your wrench, and his philosophy degree against a pack of teenage monsters. But if we give it to Zero, we’ll have them on our side.”
“I don’t think we can do that,” Troy said.
“Why not?” Enda asked.
“Here we go,” JD said with a knowing smile.
Troy leaned forward. “I didn’t want to believe Jules at first, but I’ve talked to it, and … What if he’s right?” Troy said. “What if it’s an AGI? An honest-to-god strong-AI?”
Crystal sat up a little straighter, the information broker’s interest piqued.
Enda checked her phone.
>> Hello, Enda. Your name is a young name. Have you had many names?
“How does it …” Enda looked to Troy, and he only shook his head. “We don’t even know what we have here. If we give it to Zero, they can sort it out.”
“But if it’s an AGI,” Troy said, “if it’s genuinely intelligent, can we trust it to a corporation?”
JD nodded. “We’ve got no way of knowing what they could do with it, but we can’t trust any corporation with that kind of power.”
“I’m not talking about what Zero will do, or what the AGI can do—I’m talking about Zero’s philosophy.”
“He’s a philosophy professor, in case you didn’t guess,” JD told Crystal.
Troy continued. “Corporations abuse their employees and contractors, and profit off human misery. At this point in history we have enough data to know that those behaviors are endemic to the corporate structure. How can we justify giving them a new species to subjugate?”
“A new species?” Enda said. “I think you’re getting ahead of yourself.”
“The AGI—if that’s what it is—could be copied countless times, the copies molded and mutilated to fit different functions. In no time at all, Zero would have a broad variety of intelligent machines forced to do their bidding, to follow their mandate.
“We’re talking about slavery, and I don’t use that word lightly. If it’s a truly intelligent machine, then it could be sentient. If it’s sentient, then it’s a person. And if it’s a person, then it deserves personhood, it deserves rights. Zero would give it neither. Do you want to be responsible for helping establish a slave species?”
“I hate to break it to you,” Enda said, “but there are already slaves out there, working in places where people’s lives are valued less than machines.”
“And that’s a fucking travesty,” Troy said, “but it doesn’t absolve us of responsibility for what we decide to do here.”
“Your line of argument only matters if it’s smart enough to know it’s a slave,” Enda said. “How do we know it’s sentient?”
“How do we know you are?”
Enda opened her mouth, then closed it. “I don’t want to get into a philosophical debate—”
“Too late,” JD interjected.
“—I just want answers.”
“What about the Turing Test?” Crystal asked. She took a sip from the mug clasped between her hands.
“It’s an interesting thought experiment,” Troy said, “but it was never actually going to work. The Turing Test as imagined doesn’t even take neurodivergent people into account. A reasonably sophisticated neural network might pass the test, but a person on the autism spectrum might not. Then, is that person with autism not sentient, not actually a person?”
“Of course they are,” Crystal said. “But how can we tell—”
“We can’t,” Troy said. “I’ve gone back and forth on this. Part of me is still waiting for the hoax to reveal itself, but if it’s not a hoax, I think we have to give this being the benefit of the doubt. Assume it is sentient and treat