Technology puts power in the hands of consumers, companies, and politicians. Doing so safely and responsibly is a complex problem. How can we raise awareness and rally support from all levels of society, so that we can put the proper measures in place and introduce powerful technologies safely?
Let’s use cars to make the discussion more concrete. If your next car was 10 times more powerful than today’s cars, would you insist on much better safety features, appropriate driver training, suitable traffic rules, proper insurance, etc.? Would you look for independent safety and reliability reviews before you even buy such a car? Would you count on car companies, engineers, the government, and poorly funded advocacy groups to make sure everything is safe? Would you allow product development and testing of these powerful cars without any oversight? What if terrorists crashed one of these 10 times more powerful cars into a building? What if there was a sudden breakthrough and cars became 1,000 times more powerful?
This little hypothetical covers a lot of ground. First of all, let’s remind ourselves that having flawed humans operate heavy, fast-moving vehicles in close proximity to one another is inherently very dangerous. Again, technology is power. The way we deal with dangerous, powerful automotive technology is to:
- Make sure it has built-in safety features (e.g., seat belts, air bags).
- Make sure users are properly trained (driver training, license requirements, frequent public service announcements, etc.).
- Value independent, rigorous safety and reliability reviews.
- Make sure there are rules and regulations for proper use (traffic rules).
- Make sure there is enforcement (police).
- Make sure there is insurance to mitigate the costs of bad things that could happen.
- Develop and test new features in a controlled, responsible manner.
- Guard against deliberate or accidental misuse.
- Guard against large advances that carry large uncertainties.
- Promote safe attitudes and mutual accountability (e.g., reject road rage).
How many of these safety strategies do you feel are adequately applied to social media, cybersecurity, artificial intelligence, genetic engineering, nuclear weapons, and robotics?
Safety is not something that only engineers or legislators have to think about. It concerns and involves all of us. Companies obviously want to sell us shiny, powerful new features, grow market share, and make money. In the process, they are often pressured to cut corners on safety and reliability. Consumers are also tempted to get the latest and greatest, even though they can’t make informed buying decisions without independent research. Legislators often need sufficient support from corporations and the public in order to pass useful regulations.
Society is obviously more familiar with certain technologies than with others. But that is no excuse for failing to apply responsible, comprehensive safety practices across all technologies. When a major disaster happens with some widget technology (maybe tomorrow, maybe in 100 years), we can’t just say that nobody knew the risks, that nobody could have anticipated it.
Besides, leadership in safety and reliability is a selling point. Producing independent, rigorous safety and reliability reviews is big business. Insurance is big business. Safety doesn’t stifle development. It is an integral part of responsible development and deployment.
We have to educate technologists, companies, the press, the public, and politicians in accurate and compelling terms. We have to remind ourselves that power is inherently dangerous, and that technology always carries risk. We have to take our most successful and comprehensive safety cultures and practices, and inspire societies to apply them to other technologies. We have to be mature in handling safety for all technologies. Otherwise, we will always be one step away from a major disaster.
P.S. I've known about EA for just a few months. See my other posts:
Bringing Out The Best In Humanity
Aligning Self-Interest With Survival And Thriving