The social dilemma described in the recent movie of the same name exists because technology groups believe they are not responsible for the content and data that people put into their applications. We tell ourselves we can build the technology and let people decide how they will use it. The truth is that we have always influenced entries by editing for format and offering lists of valid options to choose from. Now artificial intelligence (AI) examines previous entries and makes suggestions. Ads and articles appear that the AI thinks might interest you. Suggestions complete our sentences and, especially, our search requests. All of this technology nudges us to lean into our conspiracy theories.
The range of opinions on any topic can be represented by a bell curve. In the middle of the curve is the "normal" or "average" reaction. The tails of the bell curve hold mostly conspiracy theories and little-known facts. The internet was initially seen as a way for everyone to get their voices heard without first having to meet the catch-22 of being mainstream before anyone would listen. In this way, we could learn things that were not part of the mainstream media's focus. Now, however, everyone is trying to push us out to the tails. Chris Heivly, co-founder of MapQuest, advises:
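Under this bell-curve framing, standard normal tail probabilities give a feel for how few people actually sit in the "spicy" tails versus the average middle. A minimal sketch (the one-standard-deviation middle and two-standard-deviation tails are illustrative choices, not from the article):

```python
from math import erf, sqrt

# For a standard normal "bell curve" of opinions:
# share of people within one standard deviation of the average
middle_share = erf(1 / sqrt(2))

# share of people more than two standard deviations out (both tails combined)
tail_share = 1 - erf(2 / sqrt(2))

print(f"middle: {middle_share:.1%}, tails: {tail_share:.1%}")
```

Roughly two-thirds of reactions fall in the "average" middle, while under five percent sit out in the extreme tails, which is precisely why the tails need amplification to be heard.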
“Being average is only optimal for middle school … the only way to get more attention is at the spicy end of the bell curve.”
I have only seen the trailer for the movie The Social Dilemma. In the promotional material, the interviewees describe how they once believed these technologies did good, but they now see that they were naïve, and they no longer believe in that good. They know that social media causes polarization. They explain that content aimed lower on the brainstem, content that provokes fear or worry, has been found to be more addictive.
“These business models are not really aligned with the well-being of society.”
Add to that our tendency to hear only the voices we want to hear. An echo chamber is an environment in which a person encounters only beliefs or opinions that coincide with their own, so that their existing views are reinforced and alternative ideas are not considered.
“People are living in partisan and ideological echo chambers.”
There has been a fair amount of scholarly research on echo chambers and social media. Most of it warns that we need to hear all viewpoints and adjust our opinions accordingly. Yet our AI now serves us a skewed version of what is out there, pushing us toward fears and worries that are not helpful.
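The skewing mechanism is easy to see in a toy model. Here is a sketch, entirely my own assumption rather than how any real platform works, in which items carry a one-dimensional "viewpoint" score and a hypothetical `naive_recommender` always serves the items closest to what the user has already consumed:

```python
import random

random.seed(0)

# Toy model: each item has a "viewpoint" score in [-1, 1]
items = [random.uniform(-1, 1) for _ in range(1000)]

def naive_recommender(history, pool, k=10):
    """Serve the k items closest to the user's average past viewpoint."""
    center = sum(history) / len(history)
    return sorted(pool, key=lambda v: abs(v - center))[:k]

history = [0.1]  # the user starts with only a slight lean
for _ in range(20):
    history.extend(naive_recommender(history, items))

# The full pool spans nearly the whole opinion range...
pool_spread = max(items) - min(items)
# ...but what the user actually sees has collapsed to a sliver around the lean
served_spread = max(history[-10:]) - min(history[-10:])
```

Even with no malicious intent, optimizing for "more of what you clicked" shrinks the range of viewpoints a user ever encounters: the echo chamber is an emergent property of the feedback loop.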
Information technology professionals must take responsibility for the software we build. Will it improve the well-being of society? We must study the business models used to justify the software and understand the data and content that will flow through it. Of course, we cannot foresee all the consequences our creations may cause, but we can see further than most people, and we must be ready to speak up when we see problems.