As humans we are scared of the change we perceive within our lifetimes, but not the change we see over millennia, because that has already happened. We are also scared of the dark thoughts and behaviors we are all capable of – selfishness vs. sharing, manipulation vs. alignment, indifference vs. empathy. Necessity has been the prime driver of the adaptation that got us this far, at least. And that’s been a painful journey. Quite literal pain, all those eons ago, for the first few fish who ‘walked’ on fins that had only ever been in water to reach the bigger pond. But better than the pain of dying. And we all have a different perspective on where the original hardcoding to live, to propagate and to seek to get better came from.
We are scared of things we cannot predict reliably, and for the most part that instinct has served us well. It’s better to take flight than fight with an unknown opponent, and isn’t that essentially what the Future is? We wage war with our own conservatism. The balance is tricky: you have to be innovative enough to try some new things, because those jumps may be fruitful, and yet conservative enough that you don’t die in the process. It also helps, in the expensive experiment that is life, to have volumes of test subjects (yup, that’s you and me) so that we learn whether the hypothesis being tested was wise or foolish. And that’s the key – learning. Learning is enriched with empathy; it’s the way we have of sharing experience, whether we were animating tales around a fire while painting the cave walls or sharing the success stories of business results on our enterprise social network. The empathetic engine in our brains is why we learn more from failure than success, because the consequences of failure are far more terminal in this journey of life. Empathy and the emotional triggers associated with it are there to instill key memories in our brains, to make sure we remember, because we (generally) are unable to remember everything. Emotion is a shortcut, a kind of signal amongst noise, there to remind us of experience when there are only milliseconds to respond to a stimulus.
It is our empathy that, ironically, makes us hard to predict. We live inside an infinitely complex adaptive system, a multidimensional Prisoner’s Dilemma we all play with each other every day on many different levels, from negotiating what we have for lunch to how much we get paid. We learn to test responses when we are very young: modifying crying tonality to get food, ‘white’ lying as a toddler to manipulate our parents into new toys, falling in love. All these tests help us to model future responses. That’s what the voice of experience inside our heads tells us, and it helps us to model the world, which is ultimately about dealing with other people, you and me.
And then, if that wasn’t hard enough, the idea of some other construct arrives: an intelligence that could one day be similar to our own, and even surpass it, yet without any of this empathy engine built in, making it very hard for us to understand, model and predict. This intelligence can retrieve huge amounts of data when we cannot, and so has no need for the pangs of brain-chemical stimuli we call emotion, the shortcuts that cause disruption and sometimes irrational behaviors because our own model has had to adapt based on experiences (negative or positive).
I don’t believe any of those aforementioned metaphors will reflect how our relationship with AI will play out, any more than any of us is consciously making an effort to breathe, process nutrients, recycle water and all the other things our brain is in control of right now, in addition to higher thought. There are tasks ‘we’ simply don’t need to be involved with. If we were conscious of all those tasks, we would never have had time even to develop language to communicate with each other, let alone been able to wait nine months to replicate and be a burden for years before independence. The automation we are scared of, because it takes us time to learn new skills and, unlike programmed machines, we don’t all learn them uniformly, is not our enemy; it is part of an evolutionary process that helps us become capable of higher thinking. The lack of emotion we are scared of, because pure rationality sometimes seems cruel, can be tempered by our choosing to influence the model(s) with environmental variables. Conscience does not require consciousness (in the system).
But what if we could communicate at (close to) the speed of light? Would it matter if some of the other intelligences we communicated with were people or not? Wouldn’t it all be a matter of perspective, of the random firing of synapses (human and machine) that causes an idea to form, to be built on, to be tested in theory and then in practice, to model, interpret and solve for thousands or millions of variables instead of the few we can handle now?
People are not easily changed, but their circumstances are. Predicting how individuals and communities adapt when technologies change is the new art of leadership. No longer the “Art of War” control paradigm; the pace of technological change makes an “Art of Lore”, a participatory paradigm, preeminent. What people believe about the outcomes of their ideological and creative participation in the development of their communities will define their commitment to continuing our collective evolution.