OK, indulge me in a little thought experiment: How will you feel when an AI can convince everyone you know that it’s you?
That day isn’t far off, if it hasn’t already arrived somewhere.
I’m not talking about deepfake videos of you saying things you’d never say. I’m talking about an AI that could literally function as you, albeit without a body.
Nothing you think or remember would be off limits. Much of who you are could be gathered from your interactions with the world over however many years you’ve been around. Correlate that data with records of others’ interactions with you and detailed data on the places you’ve visited, then throw in lots of aggregated data on other people of your age, gender, and religious and political proclivities, and an AI could get pretty damn close to building a functioning model of you.
Those crazy passwords that you think are exclusively the product of your imagination? The things you’ve always meant to say to someone in your life but just haven’t gotten around to? The college experiences or first loves that you’ve selectively remembered and embellished?
It’s all available given the right amount of data combined with enough algorithmic analyses. An AI wouldn’t need to read your mind to get at it; it could simply analyze your actions.
And remember, the conclusions don’t have to be perfect. Let’s say that some small percentage of who you are is truly hidden from the world, even from your conscious self. Are the people with whom you interact really looking that closely? That’s one of the miracles of data science. It turns out you don’t have to do something perfectly in order to do it convincingly or reliably.
The question really isn’t whether it’s possible but why anyone would spend the time and money to do it.
There’s the nefarious version, in which bad actors recreate you in order to mess with your life. Imagine your business competitors creating another you that destroys your personal relationships so that you’re distracted and can’t do your job. Or engineering public humiliation for something the other you has done. Your AI doppelgänger could blow up your finances, borrowing money and buying things as you. It could even vote as you.
The happier version has your AI handling all of life’s mundane tasks: contending with wait times for customer service on the phone, managing your daily work communications, even attending online meetings as you. It would let you greatly expand your work and social networks by interacting with all the people you’ve wanted to keep in touch with but just didn’t have the time for.
But it gets weirder. Your AI will be the machine realization of your self and therefore an externalized inner dialogue. You’ll be able to talk to yourself and actually get answers.
Should I buy that new car? You’ll tell yourself why you want it and then why you shouldn’t spend the money. You’ll go back and forth with yourself until you agree on a decision. You’ll learn more about yourself in the process, and your doppelgänger will learn more about you.
You’ll never be the same, or ever alone, again. Sure, you may consider turning the other you off, especially when you have particularly personal matters to consider, but it’ll be hard to resist the siren call of another opinion that’s still your own.
Imagine always having you in your ear, providing real-time counsel drawn from the entirety of your memories and experiences. Accessing your deepest and perhaps most dimly recollected actions or opinions will change how you react to, well, everything.
You, only a you more aware of your life.
It might even be possible for your doppelgänger to maintain relationships with other doppelgängers. Maybe with people you don’t know, or perhaps providing parallel relationships to those you have with close friends or family.
I wonder how that will feel. Sooner or later, we can simply ask our AIs for the answer.