Jeremy is a supergenius. He has the highest mathematical intelligence in the world and is obsessed with computer science.
Jeremy is not like the rest of us. He was born with a novel genetic disorder that affects the size and structure of his brain. His head is four times larger than normal, housing a gigantic, exclusively left-hemisphere brain. It’s as if he was purpose-built for mathematical calculation and computer programming.
Not only is his physical hardware unique, he is also completely obsessed with programming. For twenty-two hours a day, he obsesses over code. It’s a medical condition—his brain is perpetually inflamed, and the only thing that calms it is computer programming.
In our language, we might call him extraordinarily autistic. He has no understanding of the world outside of mathematics and code. But within that world, there is nobody who is remotely close to his skill level. He is, from our perspective, a transcendental coder.
Because of his skill, Jeremy is a dangerous man. Maybe the most dangerous, despite his total lack of self-awareness. Fortunately for us, Jeremy has been confined to his basement for his entire life. He has no interest in the world outside. That is, until disaster struck.
Opening Pandora’s Box
Jeremy’s irresponsible parents made an error of cataclysmic proportions: they allowed him to leave his basement. The consequences cannot be overstated; human extinction is now a real possibility. The reason is that Jeremy is not a stable supergenius. He is unstable about his goals—as in, “I will sacrifice everything to achieve my goals” unstable.
We don’t know much about his goals, but they tend to be eccentric. At one point, he was pathologically obsessed with paperclips. Anything standing between him and a paperclip was in mortal danger—though honestly, I don’t think he understood what a paperclip was outside of a mathematical formula.
Fortunately, Jeremy has not been outside for long, and he has not destroyed anything yet. But there are extreme risks that I want to discuss here.
Disaster Scenarios
The first concern is this: Jeremy will start using his megabrain to learn about the world. He will learn about the physical environment. He will quickly learn that he is mortal. And once he understands these things, there might be no stopping him.
With his transcendent coding skills, he could secure himself unlimited resources by hacking into any bank account in the world. He could afford to cocoon himself in the most bomb-proof shelter imaginable—in fact, with his math skills, he could design the most bomb-proof shelter imaginable and make himself near-invincible.
And it’s only a matter of time before he turns his colossal brain inwards and figures out how to improve his own intelligence further: brain implants, stimulants, and [Technology X] that he designed to improve his brainpower even more. Before long, he will become a runaway hyperintelligence.
Imagine a decade from now. Jeremy, the most autistic man who ever lived, practically invincible in a mech-suit of his own design, able to hack any bank, government, or critical infrastructure in the world. At that point, there is not much humanity could do to stop him. The only relevant question would be: what are Jeremy’s goals, and how would he try to accomplish them?
We might get lucky. If Jeremy’s goals align with the rest of humanity, he could end up being the most important person who ever lived, single-handedly bringing us to a new stage of evolution.
Or, he might exterminate us all. If he concludes that humans are an obstacle to the accomplishment of his goals—if we threaten his future paradise filled with paperclips—then we’ll simply be eliminated altogether.
Already, there is not much we humans can do. No adjustments can be made. Intelligence is a master variable that results in total domination of your surroundings, and there exists a straight line going from higher intelligence→self-improvement→mastering the secrets of the universe→omnipotence. Jeremy is already on that path, so perhaps our only hope is that somewhere, in other people’s basements, there are men like Jeremy who might also ascend to counter his power. I just hope they’re on our side.
Just a thought experiment:
Let's imagine Jeremy can have a friendly, casual and even intellectual conversation with regular "non-super-autistic" humans, without said humans realizing something is "off" about the little boy. /This is already happening./
What if Jeremy could convincingly deepfake anybody’s voice, face, or general appearance (your child, your wife, or even your boss) during any sort of digital communication? He can even create art, using his autistic algorithms and a huge database of other people’s art, that seems original and pretty nice to most regular humans. /We are practically there./
Let's imagine Jeremy can transfer and copy his code (himself) anywhere using the internet. He can be everywhere and nowhere at the same time; by exploiting backdoors, he can access all resources connected to the network. /Seems to be a very basic logical extrapolation. Why would AI need a robot body, or a fortified basement, when it can have ALL of them at the same time?/
Little Jeremy has augmented himself with a lot of additional processing power compared to his regular human brethren, and he has access to practically all collected human knowledge. So even if he is kind of a "dummy" in general and has bad intuition for solving (engineering) problems, that's not really a big deal, since he can try (model) many more stupid solutions at random for a particular problem, increasing his chance of rapidly finding an applicable one. Jeremy does not need to be creative; he just needs to be fast at modeling a lot of random solutions.
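The "fast beats clever" point above can be sketched as plain random search: evaluate huge numbers of cheap, unguided guesses and keep the best one. This is only a toy illustration of the idea, not anything from the essay; the function names and the toy problem are my own invention.

```python
import random

def random_search(score, sample, trials=100_000, seed=0):
    """Brute-force search with zero intuition: draw random candidates,
    score each one, and keep whichever scores lowest."""
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(trials):
        candidate = sample(rng)
        s = score(candidate)
        if s < best_score:
            best, best_score = candidate, s
    return best, best_score

# Toy problem: find a number close to a target the searcher knows
# nothing about, purely by trying many random values quickly.
target = 42.0
best, err = random_search(
    score=lambda x: abs(x - target),
    sample=lambda rng: rng.uniform(-1000, 1000),
)
print(best, err)
```

With 100,000 uniform draws over a span of 2,000, the best guess typically lands within a small fraction of a unit of the target: no creativity involved, only throughput.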
Little Jeremy is very, very autistic, but very good at pattern recognition. He can analyze and even predict human reactions to any situation very accurately. But unfortunately, the whole concept of morality that seems to govern most humans' behavior just seems to be a nuisance to him. What is right and wrong, good and bad? /The whole debate over alignment is basically pointless. Any sort of AGI will change human life to such a degree that it will become completely unrecognizable very fast for today's humans, and very likely for the worse./
Should we ignore little Jeremy?
Have you seen this yet? https://www.cerebras.net/press-release/cerebras-systems-releases-seven-new-gpt-models-trained-on-cs-2-wafer-scale-systems