Does Facebook polarize our society, and how do we stop it?

Note: this post has received a large amount of interest, and I was asked to translate it into English. It was translated by one of the blog's readers (thanks!), so now you can read it in English as well. Enjoy!

————-

In the last couple of days I realized something horrible: I no longer see extreme right-wing or left-wing opinions on Facebook.

Do you think I'm joking? I am dead serious. I have more than 3,000 friends on Facebook, and I am convinced that some hold extreme right-wing ideologies while others consider themselves extreme leftists. When I venture into my Facebook feed, the page that aggregates the new posts my friends have written, I am exposed to the cakes they baked, the cat they thought was adorable, the diaper they just changed, their loves, fears and other joys. Just one thing is missing: there is not a single extreme opinion to be found.

This is a disturbing phenomenon. After all, we are currently in the middle of a military operation in Gaza, hotly disputed by many in Israel and supported by many others. I hear about great debates being fought on Facebook, and I know full well that the social network is supposed to reflect what happens in the physical world; the conversations by the water cooler, so to speak. If so, where did all those extreme opinions disappear to? Could it be that the only way I can take part in the discussions that shape Israeli society is by actively skimming my friends' private walls and finding out what was hidden from me?

But this is not the only discussion I was shut out of. In the last year I began to realize that Facebook was actively hiding yet another sector of my opinionated friends from my feed: the vegans.

Half-truths and lies

In 2013 I started noticing something strange: my vegan friends were gone.

I don't mean they stopped being my friends. For better or worse, they stayed on my friend list. If I bothered to visit their profile pages, I could read their posts, see the pictures they uploaded and chat with them as much as I cared to. But I could simply not find many of their posts in my Facebook feed.

We all use the feed page, but we are rarely aware of its full meaning. Every time we log in to Facebook, a complicated algorithm selects a small number of statuses it 'thinks' we will like most, ones that fit our hobbies and interests. Since most of us have several hundred online friends and not nearly enough time to read all the posts they keep uploading every hour, it is obvious why we need such an algorithm to pick and choose the posts suited to our tastes.

And thus my vegan friends were gone.

In retrospect, I can see how that happened. People who read my recent blog posts know I believe that animal experimentation in the lab is critically important for the progress of medicine. I also eat meat several times a week and have never minded looking at a picture of a sizzling steak. I believe that throughout my years of using Facebook, the company's algorithms tracked my actions. They noticed I press 'Like' on steak pictures and on posts supporting animal testing, but never on a post about a tortured animal. I actively refrain from diving into comment debates about the inherent evil of eating meat: I read these discussions to better understand the other side, but never post anything myself.

According to Brian Boland, VP of Ads Product Marketing at Facebook, the company's algorithms must sift through some 1,500 posts that could appear each time an average user logs in to the site. This is an enormous surplus of information that needs to be boiled down to the roughly 300 posts our feed can hold[i]. To choose the posts the user will be exposed to, the algorithm looks at different aspects that include, according to Facebook engineer Lars Backstrom (quoted in The Guardian): “how often you interact with a friend, page or public figure; how many likes, shares and comments individual posts have received; how much you have interacted with that kind of post in the past;…”[ii]
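To make the mechanics concrete, here is a minimal sketch of what relevance-based ranking of this kind could look like. Everything in it is an assumption for illustration: the weights, field names and scoring formula are invented, and Facebook's real system is far more complex and not public. It only shows how the signals Backstrom lists (interaction with the author, engagement on the post, past interaction with that kind of post) can boil 1,500 candidates down to 300.

```python
# Illustrative sketch only: invented weights and fields, not Facebook's system.
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    shares: int
    comments: int

def relevance_score(post, friend_interactions, topic_affinity):
    """Combine the three kinds of signals Backstrom describes."""
    author_weight = friend_interactions.get(post.author, 0)        # how often we interact with this friend
    popularity = post.likes + 2 * post.shares + 3 * post.comments  # engagement the post itself received
    affinity = topic_affinity.get(post.topic, 0.0)                 # our history with this kind of post
    return 0.2 * author_weight + 0.05 * popularity + 10 * affinity

random.seed(0)
topics = ["steak", "veganism", "trains", "atheism", "politics"]
candidates = [Post(f"friend{i % 400}", random.choice(topics),
                   random.randint(0, 50), random.randint(0, 10),
                   random.randint(0, 20)) for i in range(1500)]
friend_interactions = {f"friend{i}": random.randint(0, 30) for i in range(400)}
topic_affinity = {"steak": 0.9, "trains": 0.8, "atheism": 0.7,
                  "veganism": 0.0, "politics": 0.1}  # learned from my past 'Likes'

# 1,500 candidates in, the 300 highest-scoring posts out.
feed = sorted(candidates,
              key=lambda p: relevance_score(p, friend_interactions, topic_affinity),
              reverse=True)[:300]
print(sum(p.topic == "veganism" for p in feed), "vegan posts out of 300")
```

Note what never appears in this sketch: any term that rewards challenging the reader. A topic I never engaged with gets an affinity near zero and is quietly crowded out, which is exactly what happened to my vegan friends.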

Based on the expert opinions cited above, the algorithm that analyzed my online behavior quickly realized that my opinions are very different from those of my vegan friends, and started tailoring my news feed to my own areas of interest. And so, every time I log in to Facebook I get updates about what happens on the trains in Israel (I take the train every day), new and interesting articles about atheism (I am an atheist), and revealing pictures of my online female friends (I still don't know why). I will definitely not get an article about the hardship of living in Israel's periphery, statuses written by ultra-Orthodox Jews, or posts from the aforementioned vegans. This algorithm has de facto erected a tall barrier around me, without my knowledge and without my asking for it. And what's most important about this barrier is not its height, but the fact that it's invisible: I had no idea it existed, or what reasoning lay behind its filtering. All I know is that every day, the opinions I am exposed to online seem to match my own in a very suspicious way. At the very least, I can say that the information I read online no longer challenges my beliefs in any significant way.

And that is a very big problem.

A Net Polarized

In 1961 a graduate student by the name of James Stoner ran a simple experiment. Test subjects were asked to advise a certain person (let's call him Mark) on whether he should quit his job and go work for a small company that could give him equity, but might also go bankrupt and leave Mark with nothing. Stoner asked the subjects to form an opinion on their own, and then divided them into random groups of six. Each group discussed the issue and then reached a collective decision about the path Mark should take. Strangely enough, the groups usually preferred that Mark take the risk for the greater potential profit.[iii] In professional terms, they were less risk-averse.

These results surprised the experts in the fields of management and decision making, since the common belief in those days was that groups were more careful, cautious and moderate in their decisions. At first glance it is obvious why that should be so: in a group, participants are supposedly exposed to a wide array of opinions that help them see a problem from many sides, not only their own. Unfortunately, Stoner's research and the many studies that followed revealed the opposite: individuals who come to a group discussion holding one opinion tend to leave that discussion with a more extreme version of the same opinion.[iv] They become polarized.

Group polarization tends to happen in online discussions too. In fact, research findings show that polarization tends to be more extreme when subjects do not see each other and have to hold the discussion online.[v] Polarization also occurs when subjects play games such as the Dictator Game, which examines a participant's ability to see the world from the opponent's point of view.

These findings are particularly relevant in the field of business management, where managers are encouraged to discuss potential decisions in groups. When such a discussion is mismanaged, the barriers that usually urge us toward a more cautious solution are forgotten. The group as a collective loses the individual's ability to think the way the other side does. It loses its empathy, and thus prevents itself (the group and its members) from collaborating with others.

What are the reasons for group polarization? According to psychologists who study the phenomenon, one of the main causes is reliance on arguments that seem logical and reasonable to the members of the group, but which originate only within the group. These arguments are never challenged by external ideas and opinions coming from outside, which shackles the entire team to its original opinion; an opinion that only grows more extreme as members compete with one another over who is more loyal to the cause.[vi]
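The mechanism is easy to demonstrate with a toy simulation. The model below is my own illustration, not the one used in the cited studies: it is a bounded-confidence model (in the spirit of Hegselmann-Krause) in which each agent only 'hears' peers who already roughly agree with it, plus a small 'loyalty' push toward the nearer extreme. All parameter values are arbitrary assumptions.

```python
# Toy bounded-confidence model of group polarization; illustrative only.
import random

def step(opinions, tolerance=0.4, pull=0.3, push=0.02):
    new = []
    for x in opinions:
        # Arguments from outside the agent's tolerance band never reach it.
        peers = [y for y in opinions if abs(y - x) <= tolerance]
        target = sum(peers) / len(peers)       # the like-minded group's average
        moved = x + pull * (target - x)        # drift toward the in-group consensus
        moved += push if moved > 0 else -push  # loyalty competition: outdo the average
        new.append(max(-1.0, min(1.0, moved)))
    return new

random.seed(1)
opinions = [random.uniform(-1, 1) for _ in range(100)]  # opinions on a -1..1 axis
for _ in range(30):
    opinions = step(opinions)
print(f"after 30 rounds: min={min(opinions):.2f}, max={max(opinions):.2f}")
```

Run it and the population splits into internally homogeneous camps that drift apart toward the extremes, even though everyone started out with mixed, moderate views; widen the tolerance so agents hear the other side, and the split into camps disappears.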

Facebook and Israeli society

How is group polarization related to Facebook's filters and algorithms? When Facebook chooses which of my friends' posts and statuses I get to see, it has a very real potential of enhancing group polarization. The extreme vegans, the extreme leftists, the extreme rightists, the extreme atheists and all the rest are partially a product of a divided and categorized social network; a network that delivers to every person a personalized virtual 'habitat' designed to get them to log in again and again and to relish the fact that all of their friends think alike. I am obviously not suggesting that this is the sole reason extremists exist, but research does strongly suggest that when people see that most of their friends support the cause they believe in, their beliefs tend to grow stronger, and challenging those views becomes more difficult.

This is a frightening situation, and one I fear will only get worse in the near future. Facebook's purpose is, after all, to be profitable. I do not say this as an accusation, but simply as the definition of a publicly traded company. Facebook's economic model relies on people logging in every day and exposing themselves to advertisements from companies that pay Facebook for the right to advertise on the social network. This gives Facebook an incentive to create a virtual environment people will want to log in to again and again, and stay in as long as they can. To do so, Facebook shapes the social environment of our feed to be more comfortable than the physical world, where we are forced to be aware of the other side's point of view.

This, at least, is one of the ideas behind the filtering algorithm. It obviously serves us well (no sane person can go over 1,500 posts by his or her friends), but it has potential downfalls even when its purpose is benign. The most frightening of these is the fact that we, the public, do not know how the algorithm works or what its parameters are. We cannot push back against what we do not understand; and if we don't understand it, we remain unaware of just how much it filters from our eyes and minds.

In the worst case I can imagine, the other side simply disappears from certain social networks. We won't even need to know that that side and its opinions exist; they will be filtered from our reach unless we actively search for them. Can an entire point of view be made to disappear? Not necessarily. Instead, you could be exposed only to stereotypical caricatures that present the "other side" in a ridiculous manner, and never to its complicated opinions and beliefs. This might be even more dangerous, since you will think you understand the world and the people in it, when all you have obtained is a very simplistic version of both.

Is there a solution?

How can we prevent group polarization online? The easiest thing to do is to change the feed's settings so that it no longer selects the posts you see by interest, but by the time they were published. However, most people will never do this: the default setting is the one they usually stick with.
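In code terms the change is trivial; the difference lies entirely in which key the feed is sorted by. The snippet below uses hypothetical post records (the 'score' and 'timestamp' fields are stand-ins for Facebook's internal data) to show how the same three posts are ordered under each setting:

```python
# Hypothetical posts: 'score' stands in for the relevance ranking,
# 'timestamp' for the publication time (Unix seconds).
posts = [{"text": "steak pic",      "score": 9.1, "timestamp": 1406412000},
         {"text": "vegan op-ed",    "score": 1.2, "timestamp": 1406415600},
         {"text": "protest report", "score": 2.3, "timestamp": 1406419200}]

by_interest = sorted(posts, key=lambda p: p["score"], reverse=True)      # the default
by_time = sorted(posts, key=lambda p: p["timestamp"], reverse=True)      # newest first

print([p["text"] for p in by_interest])  # ['steak pic', 'protest report', 'vegan op-ed']
print([p["text"] for p in by_time])      # ['protest report', 'vegan op-ed', 'steak pic']
```

Sorting by time is transparent and opinion-blind, which is exactly why it undoes the filter; but, as noted above, hardly anyone changes the default.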

Legal systems around the world have not yet grasped the power social networks hold over public opinion. One could hope that governments will require social networks to set the default filter to work by time, or to present a more balanced view to the user. Perhaps this demand will come from the users themselves, who will take offence at being manipulated by algorithms. But probably not.

The solution I found for myself is quite simple, and yet almost physically painful. I still want to use the filter to avoid much of the 'friendly spam' around me (no offense, mom, but I don't need to be alerted whenever you do your fingernails). So I keep the filter active, and as a result I am not exposed to extreme posts. Some of my friends, however, reply to my own posted opinions and spell out their extreme views there. These opinions make me furious; they insult me and make me want to walk away from the computer in disgust. And this is exactly why I make an effort never to delete other people's posts, ignore them, or unfriend their authors. In the last year I have not unfriended a single Facebook friend for holding opinions opposite to mine.

And if that’s not enough, I also 'like' random posts to confuse the algorithm. It's a petty revenge, I know, but still.

Will these steps keep my opinions from being reflected and amplified back at me by my friends? Probably not. I feel myself becoming more and more extreme, partly because I'm getting older and more vindictive, and partly because many of my friends online share my opinions. I am also aware that some friends who hold opposite opinions choose to unfriend me (after all, I must be pretty annoying to them!), leaving me in my own self-focused group whether I want it or not. To counter that, I make an effort to read a wide spectrum of news media, from the most liberal newspapers to the extreme right wing. I listen to religious leaders' podcasts and read atheists' blogs.

This is a confusing and complicated way to form an individual view of the world we live in, but as of today it is still free and open to us all – and I believe that if you take this road, you will be the better for it.

—————

Dr. Roey Tzezana

Dr. Roey Tzezana has a PhD in nanotechnology, and in recent years has conducted research at the Unit for Technology & Society Foresight at Tel Aviv University. He is currently a research fellow at the Yuval Ne'eman Workshop for Science, Technology & Security, and chief futurist and scientist of TelaLabs, a communal lab in Israel. He is a graduate of the Singularity University managers' course, an academic lecturer on bio-medical engineering and emerging and disruptive technologies, and a judge in many innovation and entrepreneurship competitions. Dr. Tzezana provides consultation for various ministries and institutes, including the Israeli police and the Ministry of Defense, and is often invited to speak on subjects related to the future, disruptive technologies and innovation. His first book, "Guide to the Future", was released at the end of 2013, became an instant bestseller with three editions, and is currently being translated into several languages.

References

[i] B. Boland, "Organic Reach on Facebook: Your Questions Answered," Facebook, 5 6 2014. [Online]. Available: https://www.facebook.com/business/news/Organic-Reach-on-Facebook. [Accessed 27 7 2014].

[ii] S. Dredge, "How does Facebook decide what to show in my news feed?," The Guardian, 30 6 2014. [Online]. Available: http://www.theguardian.com/technology/2014/jun/30/facebook-news-feed-filters-emotion-study. [Accessed 27 7 2014].

[iii] J. A. F. Stoner, "A comparison of individual and group decisions involving risk," 31 7 1961. [Online]. Available: http://dspace.mit.edu/bitstream/handle/1721.1/11330/33120544.pdf. [Accessed 5 6 2014].

[iv] D. G. Myers, "Polarizing Effects of Social Interaction," in Group Decision Making, Academic Press, 1982, pp. 125–161.

[v] L. M. V. Swol, "Extreme members and group polarization," Social Influence, vol. 4, no. 3, pp. 185–199, 2009.

[vi] D. G. Myers, "Polarizing Effects of Social Interaction," in Group Decision Making, Academic Press, 1982, pp. 125–161.
