Mohammad Chowdhury is PwC's Telecom, Media and Technology consulting leader across Australia, SE Asia and New Zealand. Until recently he built the practice in India, where he became one of the most quoted industry experts in the country. Mohammad has served as an adviser on telecom sector reform in Saudi Arabia, Zimbabwe, Ethiopia, Slovakia, Poland and Slovenia, and during 2015 as national telecommunications adviser to the Government of Myanmar. Earlier in his career he held significant strategic roles at Vodafone and IBM. He is quoted regularly by the Financial Times, Wall Street Journal, BBC, CNBC, TV-18 and NDTV. Mohammad has worked in 83 countries, lived in 7 and speaks 6 languages. He has a BA in Politics, Philosophy and Economics from Oxford University, an MPhil in Economics from Cambridge University, and strategy training from Harvard Business School. He was born in London, has family origins in Bangladesh, and is married with two sons.
On a recent fly-by trip through Singapore, I arranged to meet an old friend for dinner. To my surprise, another friend came along, and the three of us had a great evening catching up on old times. After a sumptuous meal, we triumphantly posed for a selfie. As one of us did a quick picture crop and Facebook post, suitably tagged and located, I uttered an awkward realisation aloud to one of my co-diners: “You know, I haven’t seen anything from you on Facebook for months!” I had assumed they simply hadn’t been active for a while.
“Oh? Have you blocked me?” was the troubled response. “I put up a few posts every week!”
“No! Perish the thought. Why would I block you?” I replied, brushing past a moment of embarrassment. “I take an interest in your posts, especially the videos and pics of your young son. I haven’t blocked you. Perhaps Facebook has stopped your newsfeed popping up on my timeline.” An interesting thought.
Facebook employs algorithms to curate what news is, and isn’t, featured in a particular user’s view. It has to, since otherwise most of us would be flooded with more traffic than we could handle. Take my example: I have around 800 Facebook friends (carefully but sometimes callously culled down to this number). If 400 friends post an update three times a week, and 200 post once, I would have 1,400 news updates from friends every week. Since I check Facebook once every couple of days, this would amount to around 400 new updates every time I open the app. Given that old posts with new comments get featured too, the number to view could be as much as 600-800, or 30 minutes of scrolling at 2-3 seconds on each. I scroll up and down for maybe 10 minutes each time I go to Facebook, register a few likes here and there and post the odd comment. There is a finite limit to what I can see, and I am sure that, even with filtered feeds, I regularly miss well over half of my friends’ news.
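The back-of-envelope arithmetic above can be sketched in a few lines (the figures are the ones quoted in this article; the 3.5 visits per week and the 700-item midpoint are my own illustrative assumptions):

```python
# Back-of-envelope estimate of weekly feed volume, using the figures above.
frequent_posters = 400      # friends posting ~3 times a week
occasional_posters = 200    # friends posting ~once a week
weekly_updates = frequent_posters * 3 + occasional_posters * 1   # 1,400

# Checking in roughly every two days means ~3.5 visits a week,
# so each visit surfaces around 400 fresh updates.
visits_per_week = 3.5
updates_per_visit = weekly_updates / visits_per_week

# Old posts resurfacing with new comments push the per-visit total
# toward 600-800; at 2-3 seconds each, that is about half an hour.
items_per_visit = 700       # midpoint of the 600-800 range (assumption)
seconds_per_item = 2.5
minutes_scrolling = items_per_visit * seconds_per_item / 60

print(weekly_updates, updates_per_visit, round(minutes_scrolling, 1))
```

Ten minutes of actual scrolling against roughly thirty minutes’ worth of material is what makes the “well over half missed” claim plausible.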
Facebook evidently thinks hard about this problem: essentially, the dilemma between total (unfiltered) transparency and the effectiveness of one’s social media experience. As mentioned, total transparency would flood the user’s feed with thousands of updates and diminish the quality of their experience. Too much filtering may jaundice it, imposing preferences set by the network and de-personalising the experience altogether. So a balance is required, forcing decisions around how to route traffic.
Recently, the social network announced that it has altered its feed policies so that users are exposed less to sponsored feeds paid for by advertisers, leaving more space for posts from their own friends. Mark Zuckerberg announced on his page, “One of our big focus areas for 2018 is making sure the time we all spend on Facebook is time well spent… I'm changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions… The first changes you'll see will be in newsfeed, where you can expect to see more from your friends, family and groups.”
Evidently Mr Zuckerberg cares deeply that users have a fulfilling and meaningful experience. But something is not right here, and my Marina Bay dinner brought the issue to the surface. I see three types of challenge:
• Huge filtering challenge: To a large extent, Facebook decides on the filtering of my newsfeed. I am given a few options to determine what I see and what of my posts others see, but beyond that the traffic flow is determined by the network, which sets routing principles around what in reality must be thousands of criteria for what someone does or doesn’t want to see. Question: Is the balance of control between user and network the right one, or should the user be asked more questions to determine what they want to see?
• “Facebook vs real life” challenge: Some of us want Facebook to mimic the social interactions we have in our day-to-day, offline lives. I am not trying to say Facebook isn’t real, because it is as real as other parts of our social existence; what I mean is that some of us want social interactions on Facebook to be like our other social interactions. But others, like me, want Facebook to play a complementary role to the rest of our social life, using it to keep updated on the lives of those we rarely have any other contact with. To me, Facebook is more a remote fill-in than a way of sharing with those I see regularly. Question: How can Facebook curate users’ newsfeeds through a common algorithm if they have hugely different motives for using it?
• “I don’t know what I don’t know” challenge: I don’t know what is going on with my friends on Facebook, and it is difficult for me to pre-determine what I would want to see and what I wouldn’t. Somebody died, a couple had a baby, a family has been on an amazing holiday and others have graduated from college or started a new job. Or a friend just cooked a lovely meal and wants to share the recipe. It would be close to impossible for me, let alone any prescient, AI-driven network, to figure out what the right balance is for a filtered feed from such a wide variety of goings-on across a disparate set of friends. Question: Is it even possible to have a filtering algorithm that solves routing when even the user doesn’t know what they want?
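Facebook’s actual ranking system is proprietary, so as a purely hypothetical sketch, here is what routing a feed by weighted criteria might look like. Every name, weight and signal below is my own illustration, not Facebook’s; a real system would tune thousands of such signals rather than three:

```python
from dataclasses import dataclass

# Hypothetical sketch only: NOT Facebook's actual algorithm. A routing
# filter might score each candidate post on a few weighted criteria and
# surface only the top-scoring items.

@dataclass
class Post:
    author_closeness: float   # 0-1: how often the user interacts with the author
    recency: float            # 0-1: newer posts score higher
    engagement: float         # 0-1: likes and comments from others
    is_sponsored: bool

# Illustrative weights, invented for this sketch.
WEIGHTS = {"author_closeness": 0.5, "recency": 0.3, "engagement": 0.2}
SPONSORED_PENALTY = 0.4       # down-weight sponsored posts, per the 2018 change

def score(post: Post) -> float:
    s = (WEIGHTS["author_closeness"] * post.author_closeness
         + WEIGHTS["recency"] * post.recency
         + WEIGHTS["engagement"] * post.engagement)
    return s - SPONSORED_PENALTY if post.is_sponsored else s

def curate(posts: list[Post], limit: int) -> list[Post]:
    """Return the `limit` highest-scoring posts for the feed."""
    return sorted(posts, key=score, reverse=True)[:limit]
```

The sketch also makes the three challenges concrete: the weights are set by the network, not the user (challenge one); a single set of weights cannot serve users with different motives (challenge two); and no weight exists for the post you didn’t know you wanted to see (challenge three).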
The problem these challenges create is that a common routing algorithm for everyone would either be too difficult to build, too difficult to customise to different motives, or just plain impossible, because users don’t know what they want to see until they see it.
Quoting further from Mr Zuckerberg’s announcement in the New Year, “Now, I want to be clear: By making these changes, I expect the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable. And if we do the right thing, I believe that will be good for our community and our business over the long term too.”
The comment “if we do the right thing” is most revealing. It indicates that Mr Zuckerberg sees it as a moral or ethical burden for Facebook to get it right. But perhaps he is taking on a question too big for him and his team to answer; perhaps the social media algorithm is an algorithm too far to get “right”. There is a bigger discussion to be had about algorithms in society: to understand the implications they will have on our lives, where we need to be careful about how they influence outcomes, and what role they should play at a societal level. The power to moderate exposure to culture, thinking and curated news media all come to mind. In a world wowed by common talk of the power of AI, a sobering thought is that maybe the search for the perfect algorithm for all things is the wrong objective, and we should start by thinking about where we need algorithms most, and where they will add most value. Traffic management: yes; curated social newsfeed: to some extent; filtered world news headlines: no. And when it comes to the Facebook newsfeed, as in life, maybe we need to reset our expectations about what we can and can’t get from social networking.