Have you fallen into a trust trap on the Net?

Big Tech recommends what they think we need, but the power of choice should rest with us

By Shalini Verma


Published: Mon 4 Mar 2019, 6:45 PM

Last updated: Mon 4 Mar 2019, 8:50 PM

It's Saturday evening. I ensconce myself on the couch, TV remote in hand, to escape the dystopian cacophony of social media. I turn to Netflix. The Internet entertainment service provider instantly serves up an assortment of movie and TV show titles based on my viewing history. I can't help but wonder at its uncanny ability to predict the movies I am likely to watch. I feel like Netflix can almost read my mind.
Behind this crystal ball gazing is a sophisticated recommendation engine - Netflix's secret sauce that helps the company stay one step ahead of the competition. To understand viewing preferences, it combines the work of experts who tag the content on its platform with the rich user data it constantly gathers. It then matches the two. The more data it collects, the more accurately it can perform the match-making. Each time we play or pause a movie, the algorithm powering the recommendation engine gets a little wiser about our viewing habits.
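In code, this kind of tag-matching can be sketched very simply. The titles, tags, and weights below are invented purely for illustration; Netflix's actual system is vastly more sophisticated:

```python
# A minimal sketch of tag-based matching, loosely in the spirit of the
# approach described above. All titles, tags, and weights are invented.

# Expert-assigned tags per title
catalog = {
    "Dark Mirror": {"dystopian", "sci-fi", "drama"},
    "Cosmic Bake-Off": {"cooking", "reality", "comedy"},
    "Neon Heist": {"thriller", "sci-fi", "crime"},
}

# Tag weights inferred from one viewer's history
# (each play or pause would refine these numbers)
viewer_profile = {"sci-fi": 0.9, "drama": 0.6, "crime": 0.4}

def score(title_tags, profile):
    """Sum the viewer's weight for every tag the title carries."""
    return sum(profile.get(tag, 0.0) for tag in title_tags)

recommendations = sorted(
    catalog, key=lambda t: score(catalog[t], viewer_profile), reverse=True
)
print(recommendations)  # most relevant title first
```

The more viewing data flows in, the better the weights in the profile become - which is exactly why the engine "gets a little wiser" with every play and pause.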
This is the quintessential holy grail of every brand trying to predict our buying behaviour. In the old days, television channels would design their content around notions of their viewers' gender and age. Not Netflix. It runs experiments on us. You and I could be among the 100,000 viewers it selects at random for the hundreds of experiments it runs every year. These tests offer different scenarios or selections to viewers in order to study how they respond. My Netflix experience could be wildly different from that of my neighbour.
Recommendation engines use many methodologies. Content-based filtering suggests items similar to those you have liked previously. Collaborative filtering draws on the preferences of users with similar tastes. The engines are constantly looking for correlations and patterns in our habits, and fine-tuning their predictive models. The Instagram feed is powered by an algorithm that determines what we see based on three factors: interest, recency (how recent the post is), and relationship (how often we have interacted with the person who shared the post).
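Collaborative filtering can be sketched in a few lines: find the user most similar to you, then suggest what they rated highly and you have not yet seen. The users, items, and ratings below are invented for illustration, and the similarity measure is deliberately simplistic:

```python
# A minimal sketch of collaborative filtering: recommend what a
# similar user liked. All ratings here are invented for illustration.
ratings = {
    "alice": {"A": 5, "B": 4, "C": 1},
    "bob":   {"A": 5, "B": 5, "D": 4},
    "carol": {"C": 5, "D": 2},
}

def similarity(u, v):
    """Fraction of commonly rated items on which two users roughly agree."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    agree = sum(1 for i in common if abs(ratings[u][i] - ratings[v][i]) <= 1)
    return agree / len(common)

def recommend(user):
    """Suggest unseen items that the most similar user rated highly."""
    others = [v for v in ratings if v != user]
    nearest = max(others, key=lambda v: similarity(user, v))
    seen = set(ratings[user])
    return [item for item, r in ratings[nearest].items()
            if item not in seen and r >= 4]

print(recommend("alice"))  # items alice hasn't seen but her "taste twin" loved
```

Production systems replace this toy similarity with matrix factorisation or deep learning over millions of users, but the underlying intuition - people with similar tastes predict each other's preferences - is the same.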
The Internet is crawling with recommendation engines serving up products and services we are likely to buy, use, read, and watch: Amazon, Pinterest, Spotify, YouTube, and the list goes on. Amazon has the mother of all recommendation engines, studying our every move on its ecommerce site, from product discovery through to checkout. It has a wide selection of 'recommendations for you'. It even recommends products that are frequently bought together. It is estimated that more than a third of Amazon's revenue is generated by its recommendation engine.
These service providers also send their recommendations via email. I was recently searching for home speakers on Souq.com. Soon after, I started receiving emails with Souq.com's recommendations from top brands. These emails have an enviable conversion rate that boosts the sales numbers of ecommerce sites. This is partly because ecommerce sites send us recommendations for products that offer a higher revenue opportunity for them.
The more they recommend what we like, the more we trust these online service providers. Once the online behemoths have gained their users' trust, it is easy for them to fall into the temptation of misusing that power. We should be wary of this trust trap. The likes of Amazon and Instagram have started to wield an inordinate amount of influence over us unsuspecting users. It is in Netflix's interest, for instance, to push more of its own content to us.
You are watching an embedded YouTube video on a site. Before you know it, you have spent an hour binge-watching similar videos on YouTube. Viewers call this the YouTube rabbit hole. It is harmless when you are watching a dozen cute cat videos. However, YouTube's recommendations could easily include inflammatory videos from terror groups, white supremacists, and other radicals. Any mainstream political video could take you to more videos on the far right or the far left. Experts have written about how YouTube's recommendation engine leads you down a radical path even if you are not watching political content.
YouTube videos on vegetarianism attract recommendations for videos on veganism. Videos on flu vaccines could lead you to anti-vaccine conspiracy theories. We are just an algorithm away from getting radicalised. One reason these platforms push us towards the outrageous is that they need to hold our attention in the first few seconds, or we will switch to doing something else - what the industry terms user churn.
Recommendation engines are the reason we get more personalised content. In many ways they are still a work in progress as they try to offer more relevant choices. We cannot deny the sheer convenience of not having to wade through hundreds of choices before we find ours. Yet Facebook's election propaganda scandal teaches us that we can become victims of recommendation excesses. Let us not lose our own discerning ability to choose what we really want. We need to be aware of the choices we make online. The power of choice must always rest with us consumers.
Shalini Verma is the CEO of PIVOT technologies
