1. AI WILL TAKE OUR JOBS
2. EXTREME WEALTH INEQUALITY
People who have the skill to program AI, or the wealth to own it, have the most to gain. Meanwhile, low-skilled workers will wallow in unemployment.
3. WEAPONIZED NANOTECHNOLOGY
Nanotechnology will be weaponized. Imagine tiny, intelligent machines as pervasive as insects, able to self-replicate. It could get out of control. Fast.
Source – scientificvisuals.tumblr.com
4. ELON MUSK IS WORRIED
Elon Musk should be taken seriously. He made electric cars desirable (Tesla) and injected new life into space exploration (SpaceX). He clearly has a firm grip on what technology is capable of. If he’s worried about AI, you should be too.
Hope we’re not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable
— Elon Musk (@elonmusk) August 3, 2014
Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes. — Elon Musk (@elonmusk) August 3, 2014
5. STEPHEN HAWKING SPENDS MORE TIME WORRYING ABOUT AI THAN STRING THEORY
“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.” – Stephen Hawking via HuffPo
Source: Last Week Tonight with John Oliver
6. THEY’RE FUCKING SCARY
7. SCIENCE FICTION HAS PREDICTED THE ROBOT APOCALYPSE FOR DECADES
The Terminator, Battlestar Galactica, and Mass Effect (the best video game franchise of all time) predict a struggle between humans and machines. Machines don’t care about humans.
“ORGANIC LIFE IS NOTHING BUT A GENETIC MUTATION, AN ACCIDENT. YOUR LIVES ARE MEASURED IN YEARS AND DECADES. YOU WITHER, AND DIE. WE ARE ETERNAL.” – Sovereign, Mass Effect
Source: thats-n0-moon.tumblr.com
8. WE’RE TRAINING THEM TO TAKE OVER