Social influence bias

Short description: Herd behaviours in online social media
Figure 1: Effect of manipulation on voting behaviour. A: probability of up-voting. B: probability of down-voting. C: mean final scores (number of up-votes minus number of down-votes) of the manipulated and control group comments, inferred from Bayesian linear regression; 95% confidence intervals shown.[1]

The social influence bias is an asymmetric herding effect on online social media platforms which makes users compensate for negative ratings but amplify positive ones. Positive social influence can accumulate and result in a rating bubble, while negative social influence is neutralized by crowd correction.[2] This phenomenon was first described in a paper written by Lev Muchnik,[3] Sinan Aral[4] and Sean J. Taylor[5] in 2013;[1] the question was later revisited by Cicognani et al., whose field experiment reinforced Muchnik and his co-authors' results.[6]

Figure 2: Mean final scores of positively manipulated and control group comments, inferred from Bayesian linear regression (95% confidence intervals shown).[1]

Relevance

Online customer reviews are trusted sources of information in various contexts such as online marketplaces, dining, accommodation, movies, and digital products. However, these online ratings are not immune to herd behavior, which means that subsequent reviews are not independent of each other. Because preceding opinions are visible to a new reviewer on many such sites, he or she can be heavily influenced by earlier evaluations when judging a particular product, service, or piece of online content.[7] This form of herding behavior inspired Muchnik, Aral and Taylor to conduct their experiment on influence in social contexts.

Experimental Design

Muchnik, Aral, and Taylor designed a large-scale randomized experiment to measure social influence on user reviews. The experiment was conducted on a social news aggregation website similar to Reddit. Over the 5-month study period, the authors randomly assigned 101,281 comments to one of the following treatment groups: up-treated (4,049), down-treated (1,942), or control (the proportions reflect the observed ratio of up- and down-votes). Comments assigned to the first group were given an up-vote upon creation, comments in the second group received a down-vote upon creation, and comments in the control group remained untouched. A vote is equivalent to a single rating (+1 or −1). Because other users are unable to trace who cast a given vote, they were unaware of the experiment. Due to randomization, comments in the control and treatment groups did not differ in expected rating. The treated comments were viewed more than 10 million times and rated 308,515 times by subsequent users.[1]
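
The assignment procedure can be illustrated with a short sketch. The following Python snippet is not the authors' code; it simply assigns hypothetical comments to the three groups with probabilities matching the proportions reported above and records the artificial seed vote applied at creation.

# Illustrative sketch, not the study's implementation: random assignment of
# newly created comments to treatment groups, using the reported proportions
# (4,049 up-treated and 1,942 down-treated out of 101,281 comments).
import random

P_UP = 4049 / 101281      # share of comments receiving an artificial up-vote
P_DOWN = 1942 / 101281    # share of comments receiving an artificial down-vote

def assign_treatment(rng=random):
    """Return a (group, seed_vote) pair for a newly created comment."""
    r = rng.random()
    if r < P_UP:
        return "up-treated", +1      # one artificial up-vote at creation
    if r < P_UP + P_DOWN:
        return "down-treated", -1    # one artificial down-vote at creation
    return "control", 0              # left untouched

# Example: assign ten hypothetical comments.
for comment_id in range(10):
    group, seed_vote = assign_treatment()
    print(comment_id, group, seed_vote)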

Results

The up-vote treatment increased the probability of up-voting by the first viewer by 32% over the control group (Figure 1A), while the probability of down-voting did not change compared to the control group, meaning that users did not correct the random positive rating. The upward bias remained in place for the observed 5-month period. The accumulating herding effect increased the comments' mean rating by 25% compared to the control group (Figure 1C). Positively manipulated comments received higher ratings at all parts of the distribution, which means they were also more likely to collect extremely high scores.[1] The negative manipulation created an asymmetric herding effect: although the probability of subsequent down-votes was increased by the negative treatment, the probability of up-voting also grew for these comments. The community performed a correction that neutralized the negative treatment and resulted in final mean ratings that did not differ from the control group. The authors also compared the final mean scores of comments across the most active topic categories on the website. The positive herding effect was present in the “politics”, “culture and society”, and “business” categories, but was not observed in “economics”, “IT”, “fun”, and “general news”.[1]
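
The asymmetric mechanism can be illustrated with a toy simulation. The sketch below uses assumed baseline voting probabilities and assumed effect sizes (only the 32% figure echoes the study, and it is applied to all subsequent voters rather than just the first viewer for simplicity); it shows how a positive seed can accumulate into a higher mean score while corrective up-votes neutralize a negative seed. It is not a reproduction of the published analysis.

# Toy simulation of asymmetric herding with made-up parameters.
import random

def simulate_comment(seed_vote, n_voters=100, rng=random):
    """Return a final score (up-votes minus down-votes), including the seed vote."""
    p_up, p_down = 0.05, 0.02        # assumed baseline voting probabilities
    if seed_vote > 0:
        p_up *= 1.32                 # positive seed: more likely to attract up-votes
    elif seed_vote < 0:
        p_down *= 1.5                # assumed: more down-votes follow a negative seed ...
        p_up *= 1.4                  # ... but corrective up-votes offset them
    score = seed_vote
    for _ in range(n_voters):
        r = rng.random()
        if r < p_up:
            score += 1
        elif r < p_up + p_down:
            score -= 1
    return score

for label, seed in [("up-treated", +1), ("down-treated", -1), ("control", 0)]:
    mean_score = sum(simulate_comment(seed) for _ in range(5000)) / 5000
    print(label, round(mean_score, 2))   # up-treated ends higher; down-treated ends near control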

Implications

The skewed nature of online ratings makes review outcomes different from what they would be without the social influence bias. In a 2009 experiment,[8] Hu, Zhang and Pavlou showed that the distribution of reviews of a given product made by unconnected individuals is approximately normal, whereas the rating of the same product on Amazon followed a J-shaped distribution with roughly twice as many five-star ratings as any other score. Cicognani, Figini and Magnani came to similar conclusions after their experiment on a tourism services website: positive preceding ratings influenced raters' behavior more than mediocre ones.[6] This positive herding makes community-based opinions upward-biased.
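
A small numerical example, using made-up shares rather than Hu, Zhang and Pavlou's data, shows how a J-shaped distribution of posted reviews can pull the displayed mean above what independent opinions would produce.

# Hypothetical star-rating distributions (shares sum to 1 in each case).
star_values = [1, 2, 3, 4, 5]
independent = [0.05, 0.20, 0.50, 0.20, 0.05]  # roughly normal, centred on three stars
j_shaped    = [0.20, 0.05, 0.05, 0.20, 0.50]  # extremes dominate; five stars twice as common as any other score

def mean_rating(shares):
    """Average star rating implied by a distribution of review shares."""
    return sum(star * share for star, share in zip(star_values, shares))

print("independent reviewers:", mean_rating(independent))   # 3.0
print("posted (J-shaped) reviews:", mean_rating(j_shaped))  # 3.75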

Social media bias

Media bias is reflected in search systems on social media. Kulshrestha and her team found through research in 2018 that the top-ranked results returned by these search engines can influence users' perceptions when they search for events or people, an effect that is particularly pronounced for political and polarizing topics.[9] Fueled by confirmation bias, online echo chambers allow users to be steeped in their own ideology. Because social media is tailored to a user's interests and selected friends, it is an easy outlet for political echo chambers.[10]

Social media bias is also reflected in the hostile media effect. Social media plays a role in disseminating news in modern society, and viewers are exposed to other people's comments while reading news articles. In their 2020 study, Gearhart and her team showed that viewers' perceptions of bias increased, and perceptions of credibility decreased, after seeing comments with which they disagreed.[11]

References

  1. Muchnik, Lev; Aral, Sinan; Taylor, Sean J. (2013-08-09). "Social Influence Bias: A Randomized Experiment". Science 341 (6146): 647–651. doi:10.1126/science.1240466. ISSN 0036-8075. PMID 23929980. Bibcode: 2013Sci...341..647M. 
  2. Centola, Damon; Willer, Robb; Macy, Michael (2005-01-01). "The Emperor's Dilemma: A Computational Model of Self‐Enforcing Norms". American Journal of Sociology 110 (4): 1009–1040. doi:10.1086/427321. ISSN 0002-9602. https://repository.upenn.edu/cgi/viewcontent.cgi?article=1604&context=asc_papers. 
  3. Muchnik, Lev. "Lev Muchnik's Home Page". http://www.levmuchnik.net/. 
  4. "SINAN@MIT ~ Networks, Information, Productivity, Viral Marketing & Social Commerce". http://web.mit.edu/sinana/www/. 
  5. "Sean J. Taylor". http://seanjtaylor.com/. 
  6. Cicognani, Simona; Figini, Paolo; Magnani, Marco (2016). Social Influence Bias in Online Ratings: A Field Experiment. doi:10.6092/unibo/amsacta/4669. http://amsacta.unibo.it/4669. 
  7. Aral, Sinan (December 19, 2013). "The Problem With Online Ratings". MIT Sloan Management Review. http://sloanreview.mit.edu/article/the-problem-with-online-ratings-2/. Retrieved June 3, 2017. 
  8. Hu, Nan; Zhang, Jie; Pavlou, Paul A. (2009-10-01). "Overcoming the J-shaped Distribution of Product Reviews". Commun. ACM 52 (10): 144–147. doi:10.1145/1562764.1562800. ISSN 0001-0782. 
  9. Kulshrestha, Juhi; Eslami, Motahhare; Messias, Johnnatan; Zafar, Muhammad Bilal; Ghosh, Saptarshi; Gummadi, Krishna P.; Karahalios, Karrie (2019). "Search bias quantification: investigating political bias in social media and web search". Information Retrieval Journal 22 (1–2): 188–227. doi:10.1007/s10791-018-9341-2. https://link.springer.com/content/pdf/10.1007/s10791-018-9341-2.pdf. 
  10. Peck, Andrew (2020). "A Problem of Amplification: Folklore and Fake News in the Age of Social Media". The Journal of American Folklore 133 (529): 329–351. doi:10.5406/jamerfolk.133.529.0329. ISSN 0021-8715. https://www.jstor.org/stable/10.5406/jamerfolk.133.529.0329. 
  11. Gearhart, Sherice; Moe, Alexander; Zhang, Bingbing (2020-03-05). "Hostile media bias on social media: Testing the effect of user comments on perceptions of news bias and credibility". Human Behavior and Emerging Technologies 2 (2): 140–148. doi:10.1002/hbe2.185. ISSN 2578-1863.