Contrary to what many of the leading lights of AI innovation have told us, existential AI risk isn’t something far off in the future, or limited to some evil robot destroying mankind.
AI is changing your existence right now.
It is affecting how you live, work, and see yourself and the world. The greatest risk is that you’re not aware of its impacts and implications.
Of course, you know about the work underway to prevent or mitigate bias, bad data, hallucinated sources, and unreliable performance on particular tasks in individual use cases.
But even a perfectly functioning AI carries the risk that its very success will produce outcomes we didn’t anticipate and aren’t prepared for, as even its smallest impacts on individuals add up to massive impacts overall.
It’s a huge problem because the risks associated with these changes are both purposeful and unintended: good things might lead to bad ones, or yield effects you’re not prepared for and may not like.
The ultimate risks aren’t that AI fails, but rather that it succeeds and that we’re shocked by the world it gives us.
To make matters worse, the changes AI is bringing are irreversible, and they’re happening as you read this sentence. Changes come with every new AI development or deployment. It’s relentless and present tense. If we don’t acknowledge and understand the risks that come with these changes, we doom ourselves to suffer their consequences.
If we only see the opportunities for AI to improve our lives, we risk missing the ways it might make our lives less, well, livable, or certainly less familiar to us.
Those changes could ultimately end up destroying us, too.
We don’t need to worry about a future sci-fi moment. We need to focus on the changes to our existence right now.
I’ve written a white paper that decodes the three major existential AI risks we face, then provides a three-step action plan that every consumer can apply to reduce those risks for themselves, their communities, and the world at large.
You can download it here.