What Is Superintelligence?
Superintelligence, or Artificial Superintelligence (ASI), refers to an AI system that can perform any cognitive task better than the best humans.
Unlike Artificial General Intelligence (AGI), which performs at the level of an average human, ASI far surpasses human capabilities.
The Risk of Extinction
Because ASI would be both highly intelligent and powerful, many researchers believe there is a risk it could cause human extinction.
Estimates of the probability of doom—aka p(doom)—from advanced AI vary greatly. Elon Musk and Geoffrey Hinton, two leading figures in AI, have estimated p(doom) to be between 10% and 20%.
While the most likely scenario is positive, extinction—eliminating about 10 billion lives—is such a catastrophic outcome that even a 10% risk is far too high.
The Cost-Effectiveness of AI Safety Research
Investing $10 million in AI safety research, even if it reduced the probability of catastrophic outcomes, p(doom), by just 0.001 (one-tenth of one percent), could potentially save 10 million lives in expectation (0.001 of roughly 10 billion people), which works out to just $1 per expected life saved.
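The arithmetic above can be sketched as a short expected-value calculation. This is only a back-of-the-envelope illustration using the article's own illustrative figures ($10 million invested, a 0.001 reduction in p(doom), and roughly 10 billion lives at stake); the function name and structure are for demonstration, not a standard model.

```python
def cost_per_expected_life(investment: float, risk_reduction: float,
                           lives_at_stake: float) -> float:
    """Cost per expected (statistical) life saved, in the same
    currency units as `investment`."""
    expected_lives_saved = risk_reduction * lives_at_stake
    return investment / expected_lives_saved

# $10M investment, 0.001 (0.1%) reduction in p(doom), ~10 billion lives
ai_safety = cost_per_expected_life(10_000_000, 0.001, 10_000_000_000)
print(ai_safety)  # 1.0 -> about $1 per expected life saved
```

By comparison, the same function applied to a $5,000-per-life intervention such as malaria prevention returns $5,000, which is the contrast the next paragraph draws.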
This calculation highlights why movements like Effective Altruism (EA) have expanded their focus to include AI safety alongside other global challenges. While EA continues to support impactful causes such as malaria prevention, where saving a child’s life is estimated to cost $5,000, the community recognizes that researching and designing safe superintelligence could be a highly cost-effective way to save lives on a massive scale. By supporting AI alignment research, EA aims to reduce existential risks while continuing to address pressing global issues.
Inventions for Safe Superintelligence
Dr. Craig A. Kaplan has developed numerous inventions to make superintelligence safer. These inventions are summarized on SuperIntelligence.com and are freely available to anyone interested in improving AI safety.
Research
Ilya Sutskever, Daniel Gross, and Daniel Levy at Safe Superintelligence Inc. have also focused on designing and developing safe superintelligence systems.
Resources on AI Safety
Many videos and resources on AI safety and the design of safe Artificial General Intelligence (AGI) and ASI systems are available at iQ Company.
© 2025 iQ Consulting Company Inc. All Rights Reserved | info@iqco.com
iQ Consulting Company, Inc. operates under various fictitious business names, including iQ Company, iQ Co, iQ Studios, and SuperIntelligence.com. These are used interchangeably.