
Detecting Fictitious Consumer Reviews: A Theory-Driven Approach Combining Automated Text Analysis and Experimental Design

Ann Kronrod, Jeffrey K. Lee, and Ivan Gordeliy, 2017, 17-124

Fraudulent user-generated content harms both consumers and marketers and increases uncertainty about consumption experiences and offerings. To improve the online consumer experience and increase consumer trust, marketers need a robust method for identifying potentially fictitious product reviews.

Here, Ann Kronrod, Jeffrey Lee, and Ivan Gordeliy address this need with a novel method that combines linguistic theory, experiment-driven data sampling, and automated text analysis of the language used in reviews.

Drawing on the linguistics literature on experienced versus imagined events, they develop the following conceptualization: fictitious reviews should exhibit (1) fewer verbs in the past tense (given the lack of memory of a sequence of events), (2) fewer unique words (because the writer relies on general knowledge rather than on a unique experience), and (3) more abstract language (given the lack of concrete memories to share).

The authors tested these predictions by applying automated text analysis tools to authentic and fictitious reviews written by volunteer participants for the purpose of this work. As expected, writers of authentic reviews used significantly more past-tense verbs, unique words, and concrete nouns than writers of fictitious reviews. Importantly, these features of authentic reviews proved difficult to falsify: even when writers of fictitious reviews received clues about these aspects of authentic reviews, they were unable to match the frequencies with which authentic review writers used them.
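The report does not reproduce the authors' analysis code, but the three features can be approximated with standard natural-language tools. The sketch below is a hypothetical illustration in Python using NLTK, not the authors' actual pipeline; it computes rough proxies for past-tense usage, lexical uniqueness, and concreteness for a single review.

```python
# Illustrative sketch only -- not the authors' pipeline. It uses NLTK
# part-of-speech tags to approximate the three linguistic features.
import nltk


def review_features(text):
    """Return rough proxies for past-tense usage, lexical uniqueness,
    and concreteness in a single review."""
    tokens = nltk.word_tokenize(text.lower())
    tags = nltk.pos_tag(tokens)

    # Feature 1: share of verbs in the past tense (VBD) or past participle (VBN).
    verb_tags = [tag for _, tag in tags if tag.startswith("VB")]
    past_tags = [tag for tag in verb_tags if tag in ("VBD", "VBN")]
    past_tense_ratio = len(past_tags) / len(verb_tags) if verb_tags else 0.0

    # Feature 2: lexical uniqueness as a type-token ratio (unique words / total words).
    words = [w for w in tokens if w.isalpha()]
    unique_word_ratio = len(set(words)) / len(words) if words else 0.0

    # Feature 3: crude concreteness proxy -- share of tokens tagged as nouns.
    # (A concreteness lexicon would be a better measure; this is a placeholder.)
    noun_tags = [tag for _, tag in tags if tag.startswith("NN")]
    concrete_noun_ratio = len(noun_tags) / len(tokens) if tokens else 0.0

    return {
        "past_tense_ratio": past_tense_ratio,
        "unique_word_ratio": unique_word_ratio,
        "concrete_noun_ratio": concrete_noun_ratio,
    }


if __name__ == "__main__":
    nltk.download("punkt", quiet=True)
    nltk.download("averaged_perceptron_tagger", quiet=True)
    print(review_features(
        "We stayed two nights in March; the lobby smelled of fresh paint "
        "and the staff remembered our late check-out request."
    ))
```

Under the conceptualization above, authentic reviews should score higher on all three proxies than fictitious ones; the authors' actual measures and analysis tools are described in the full report.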

Using an experimental design, the authors also investigated people's ability to detect fictitious reviews. Replicating previous findings in the literature, the experiment showed that participants could not distinguish fictitious from authentic reviews better than chance (49%-52% successful detection). Interestingly, some participants were informed about the linguistic aspects that distinguish authentic from fictitious reviews. These participants became more suspicious, labeling more reviews as fictitious, but their overall detection rates did not improve. These findings suggest that a computerized detection approach offers advantages over an approach that relies on human judgment of review authenticity.

These findings offer insights for consumers as well as for managers of digital platforms that depend on consumer trust and on an abundance of authentic user-generated content. The study contributes to theory on the linguistic features of lying and shows consumers how to avoid naïve reading of product reviews. The results also demonstrate the advantages of automated tools for detecting potentially fraudulent online content and provide a basis for developing practical methods to detect deception in consumer reviews.

Ann Kronrod is Assistant Professor of Marketing, Department of Marketing, Entrepreneurship and Innovation, Robert J. Manning School of Business, University of Massachusetts Lowell. Jeffrey K. Lee is Visiting Assistant Professor of Marketing, New York University, Shanghai. Ivan Gordeliy is a Postdoctoral Researcher, Group for Neural Theory, LNC, DEC, ENS, École Normale Supérieure, Paris.

Acknowledgments
The authors would like to thank the Marketing Science Institute for funding this project. They thank Seshadri Tirunillai for his help in developing the code and computational approach in the earlier stages of this work, and Ravi Kiran and Wang Wan for their assistance on this project. They also acknowledge valuable input from Alireza Alemi and thank Liudmyla Kushnir, Pantelis Leptourgos, Vasily Pestun, Vasil Khalidov, and Alexey Arbuzov for fruitful discussions that advanced this work. Part of this project was conducted while the first author was Assistant Professor of Advertising, The College of Communication, Michigan State University.
