I thought of writing this article after a recent conversation with a Kashmiri youth, a third-year college student, who ended up abusing the Indian Army on Twitter. An ex-army friend of mine reacted to that with harsh words. When I asked the youth why he had posted the tweet, he opened up about how he came to hate the Indian Army. Apparently it had nothing to do with the actions of the Indian Army, he had never encountered a soldier personally, but everything to do with the news, tweets, and Facebook posts he was exposed to. He said that whenever he went on social media, all he saw were posts abusing the Indian Army or presenting damaging pictures of it, and that is how he came to believe the Indian Army is evil. Once that initial threshold was crossed, he himself started posting abuse at the Indian Army for no apparent reason; he felt compelled to do so. He apologized for his ignorance, deleted the tweets, and the matter was closed. But for me, the matter had just opened up. The whole incident kept bothering me, and I decided to write this article. Pardon me if you find it too long; I could not make it shorter.
I have been in the field of artificial intelligence for the past two years and have been exposed to various aspects of it. There is no denying that artificial intelligence has brought us very close to what we call a futuristic society. First the Internet and then artificial intelligence have made our lives easier, whether we want to find a doctor, order meals, purchase goods, read news articles and books, watch movies, make friends, or simply operate our satellites from the comfort of our homes. The list is endless, and as of now, the future of artificial intelligence looks very bright.
However, this very dependence on the internet for almost every aspect of our lives has opened up a parallel dimension: a dimension in which we are nothing more than data points, tracked by algorithms that follow our every move and predict the next as best they can. Advertising companies are investing billions of dollars to find out about us: when we wake up; when, where, what, and with whom we eat; whom we speak with, for how long, and in what tone; where we read the news; which books, and which pages of those books, we prefer; which movies we watch, which parts we fast-forward and which parts we rewind; what our friends prefer; when we sleep and with whom; where we go on vacation; whether we are right-wing, left-wing, or no-wing; what our friends like and what their friends like; whether we go to the gym; whether we visit the doctor often and, if so, for what; whether we wear glasses; what kind of headphones we like; which phone we buy; what clothes we purchase; which groceries we often buy; what we write in our emails. As expected, the list is once again endless. By now you get the idea: to advertising companies and internet giants we are nothing but data points, data points whose maps are to be charted and exploited for financial gain.
Our very smart software professionals got together and built complicated algorithms to monitor our lives dot by dot; however, not one of these professionals understands what they have built in its totality, what effects it is going to have, or how it is going to treat us. Yet their creation is already deployed and mapping everyone on the internet. These innocent-looking algorithms, designed with the primary purpose of making our lives easier, know us better than we know ourselves. They decide which friends we make, which restaurants we order food from, where and what we read, where we go on vacation, what jewelry we buy, what clothes we wear, which hair color suits us better, all of it based on just a handful of our previous choices.
Initially, when one of these algorithms encounters a subject (us), it builds a space for us in its memory. Slowly and silently it watches us make choices, clicking through different web pages, for a day or so. Soon it starts experimenting with us by showing us what it thinks we might like, and that is where the game begins. There is an old saying: "Sometimes you win, and other times you learn." That is exactly how the algorithm functions. With each passing day we lose and the algorithm wins, and on the days it does not, it learns and comes back with a better picture of us. Soon the algorithm wins every single time.
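To make this win-or-learn loop concrete, here is a minimal sketch in Python. It assumes a simple epsilon-greedy scheme, a textbook technique, and is not the actual code of any real platform; the topics, click probabilities, and the `ToyRecommender` class are all hypothetical.

```python
import random

# A toy epsilon-greedy recommender: it mostly "exploits" what it has
# learned about the user, and occasionally "explores" to learn more.
# Every ignored recommendation is a lesson; every click sharpens the map.

class ToyRecommender:
    def __init__(self, topics, epsilon=0.1):
        self.epsilon = epsilon                 # how often to experiment
        self.clicks = {t: 0 for t in topics}   # observed clicks per topic
        self.shown = {t: 0 for t in topics}    # times each topic was shown

    def recommend(self):
        # Exploration: occasionally try something new, i.e. "learn".
        if random.random() < self.epsilon:
            return random.choice(list(self.clicks))
        # Exploitation: otherwise show the topic with the best click rate.
        return max(self.clicks, key=lambda t: self.clicks[t] / (self.shown[t] or 1))

    def feedback(self, topic, clicked):
        # Each interaction refines the algorithm's picture of the user.
        self.shown[topic] += 1
        if clicked:
            self.clicks[topic] += 1

# Simulate a user with a hidden preference the algorithm is never told.
user_preference = {"sports": 0.1, "politics": 0.7, "music": 0.3}
rec = ToyRecommender(list(user_preference))
for _ in range(1000):
    topic = rec.recommend()
    rec.feedback(topic, random.random() < user_preference[topic])

# After enough rounds this almost always prints "politics": the algorithm
# has inferred the preference purely from clicks. Soon it wins every time.
print(max(rec.clicks, key=lambda t: rec.clicks[t] / (rec.shown[t] or 1)))
```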
Humans tend to think of themselves as special, unique, unpredictable. Algorithms have proved these ideas wrong. Many people who reach for their smartphones for news and daily activities are reading what algorithms want them to read, watching what algorithms want them to watch, and doing exactly what algorithms want them to do, throughout the day and then day after day. Last January, a friend and I were discussing something when he suddenly got a notification from Spotify with the day's collection of songs he might like: many new ones and some old. He jumped in surprise and said, "These are exactly the songs I wanted to listen to today!" That is the impact of a good algorithm: it can predict what you will want before you want it. How predictable we unpredictables are!
Once deployed, these algorithms start building a bubble around you, a data point kept in its safe space. At first they slowly filter out things from your feed that you are not interested in. Later, the filtering speeds up as the algorithms learn more about you. Soon you end up in a bubble, your own personal universe of like-minded friends, favorite restaurants, favorite goods to purchase, favorite videos to watch, favorite destinations to visit, favorite predictions, everything you almost cannot say no to. This is what Eli Pariser termed the "filter bubble". It is not difficult to break out of one; the difficult part is that most of us do not realize we are in a filter bubble, and why would we? By nature we favor like-minded people and scenarios, and that is exactly what the filter bubble provides. It seems like a match made in heaven, and we are happy to click, click, and click. It is a win-win situation: we are happy, the algorithms are happy, the advertisers are happy, and the internet giants are happy. Business is good, after all.
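The filtering itself can be startlingly simple. Below is a hedged sketch, my own illustration rather than any platform's real ranking code; the `build_feed` function, the topics, and the scores are all invented:

```python
# Rank candidate items by the user's learned interest in their topic;
# only the top of the ranking survives. Disagreeable or unfamiliar
# topics are filtered out before the user ever sees them.

def build_feed(candidate_items, interest_scores, feed_size=10):
    ranked = sorted(
        candidate_items,
        key=lambda item: interest_scores.get(item["topic"], 0.0),
        reverse=True,
    )
    return ranked[:feed_size]

interests = {"left_wing": 0.9, "sports": 0.4, "right_wing": 0.05}
candidates = [
    {"title": "Left-wing rally roundup", "topic": "left_wing"},
    {"title": "Right-wing policy debate", "topic": "right_wing"},
    {"title": "Weekend match report", "topic": "sports"},
]

for item in build_feed(candidates, interests, feed_size=2):
    print(item["title"])
# The right-wing item never makes the cut: the bubble is sealed,
# and the user has no way of knowing it was ever there.
```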
Filter bubbles are tolerable up to a point. They seem harmless, just giving us what we want. However, there is a catch: filter bubbles end up creating what we call "echo chambers", places where personal beliefs get a strong push through repetition and insulation from rebuttal. If someone is slightly left-leaning today and spends a day searching YouTube for left-wing videos, reading related tweets, and visiting Facebook for related news, then within a day or two that person will be placed in a filter bubble with its own echo chamber, where all he or she will hear is talk of the left wing, with no opinions or ideas from any other wing, so to speak. No counter-opinions. What results is a hard-core left-wing persona in its very own left-wing universe, with zero tolerance for other viewpoints.
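How quickly can a slight leaning harden? Here is a toy feedback-loop model of the dynamic just described, entirely an assumption of mine with made-up numbers, not an empirical result: the feed over-serves the side the user already favors, and each one-sided item nudges the preference a little further.

```python
# A slight leaning (0.55) plus a feed that always serves the favored
# side drifts monotonically toward the extreme (1.0): repetition plus
# insulation from rebuttal, the echo-chamber dynamic in miniature.

def simulate_echo_chamber(leaning=0.55, nudge=0.02, rounds=50):
    """leaning: probability of preferring 'left' content (0.5 = neutral)."""
    history = [leaning]
    for _ in range(rounds):
        # The feed shows whichever side the user currently favors...
        served_left = leaning >= 0.5
        # ...and seeing one-sided content pushes the leaning further.
        leaning += nudge if served_left else -nudge
        leaning = min(max(leaning, 0.0), 1.0)
        history.append(leaning)
    return history

trajectory = simulate_echo_chamber()
print(f"start: {trajectory[0]:.2f} -> end: {trajectory[-1]:.2f}")
# Prints "start: 0.55 -> end: 1.00": a mild preference saturates into
# a persona with zero exposure to, or tolerance for, the other side.
```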
When I spoke with the Kashmiri youth, I realized how difficult life has become. The media plays its negative part by creating and propagating false information and twisting the presentation of available data to mislead the vulnerable, all for TRPs and money. The algorithms then pick up the same content to radicalize a youth who might have clicked on one such video or article, intentionally or unintentionally. Once he clicks two or three times on such articles, they become his main feed no matter which site he visits. He is shielded from other points of view. That becomes the whole truth for him. He gets no option, no say in anything he watches or hears, and intolerance for anything unfamiliar seeps in.
No one clearly understands what these algorithms are doing or how they function. They (the algorithms and their creators) do not consider us alive or special. These algorithms are studying us and, in the process, manipulating us for the benefit of their creators and their users (advertising companies). Even today, artificial intelligence runs without control, regulation, or accountability in most countries. Most of us who spend considerable time on the internet are already sitting safely in our filter bubbles, which echo our views back at us and make us feel empowered. The possibility of discussion between different viewpoints, and of reaching agreement, has been all but eliminated.
I would like to conclude with a call: unless we bring in strict measures to control the functioning of these algorithms, and in doing so hold their creators and deployers responsible for the outcomes, we will neither be able to run a democracy in its true sense nor be able to eliminate terrorism from Kashmir and other parts of the world. There will always be someone ready to be radicalized, and there will always be YouTube videos, Facebook feeds, Twitter threads, and Google news feeds supporting that radicalization in the name of "free speech", a phrase that has itself become an oxymoron, since these same tech giants deploy algorithms that quietly place us in filter bubbles and echo chambers to change the way we think and behave. A difficult and dangerous world we are in right now, in which everyone is right and no one is in control.
DISCLAIMER: The author is solely responsible for the views expressed in this article. The author carries the responsibility for citing and/or licensing of images utilized within the text.