Online reviews: Filter the fraud, but don’t tell us how


When you try a new restaurant or book a hotel, do you consider the online reviews? Do you submit online reviews yourself? Do you pay attention to whether they are filtered and moderated? Does that affect the reviews you submit?

A research team comprising Rensselaer Polytechnic Institute’s T. Ravichandran, Ph.D., professor in the Lally School of Management; Jason Kuruzovich, Ph.D., associate professor in the Lally School of Management; and Lianlian Jiang, Ph.D., assistant professor in the Bauer College of Business at the University of Houston, examined these questions in recently published research. In a world where businesses thrive or die by online reviews, it is important to consider the implications of a platform’s review moderation policies, the transparency of those policies, and how both affect the reviews that are submitted.

“In 2010, Yelp debuted a video to help users understand how its review filter works and why it was necessary,” said Jiang. “Then, Yelp added a section to display filtered reviews. Previously, Yelp did not disclose information about its review filter. This change presented the perfect opportunity to examine the effect of policy transparency on submitted reviews.”

Ravichandran and his team compared reviews of more than 1,000 restaurants on Yelp with reviews of the same restaurants on TripAdvisor, whose practices remained unchanged and which was not transparent about its review filter. Using a difference-in-differences (DID) approach, they found that the number of reviews submitted to Yelp decreased, and that the reviews that were submitted were more negative and shorter than those on TripAdvisor. They also found that the more positive a review, the shorter it tended to be.
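
The article does not include the authors’ estimation code, so the following is only a minimal sketch of the DID idea: regress a review outcome (here, review length) on a platform indicator, a post-disclosure indicator, and their interaction. All column names and data values are hypothetical, not taken from the study.

```python
# Minimal difference-in-differences sketch (illustrative only; the
# column names and data are hypothetical, not from the published study).
import pandas as pd
import statsmodels.formula.api as smf

# Each row is one review: which platform it was posted on, whether it
# was posted after Yelp's 2010 filter disclosure, and an outcome such
# as review length in words.
df = pd.DataFrame({
    "yelp": [1, 1, 1, 1, 0, 0, 0, 0],   # 1 = Yelp, 0 = TripAdvisor
    "post": [0, 0, 1, 1, 0, 0, 1, 1],   # 1 = after the disclosure
    "length": [120, 110, 80, 75, 115, 118, 112, 117],
})

# The coefficient on yelp:post is the DID estimate: the change on Yelp
# after the disclosure, net of the contemporaneous change on TripAdvisor.
model = smf.ols("length ~ yelp * post", data=df).fit()
print(model.summary())
```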

“Platforms are pressured to have content guidelines and take measures to prevent fraud and ensure that reviews are legitimate and helpful,” said Ravichandran. “However, most platforms are not transparent about their policies, leading consumers to suspect that reviews are manipulated to increase profit under the guise of filtering fraudulent content.”

Platforms use sophisticated software to flag and filter reviews. Once a review is flagged, it is filtered out and not displayed, and it is not factored into the overall rating for a business.
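
No platform discloses its internal logic, so purely as a hedged illustration of the mechanics described above, the sketch below drops flagged reviews before computing a business’s displayed average rating. The `flag_review` heuristic and its threshold are invented for the example; real filters are far more sophisticated.

```python
# Illustrative review-filtering sketch (hypothetical heuristic; real
# platforms use proprietary, undisclosed models).
from dataclasses import dataclass

@dataclass
class Review:
    stars: int        # 1-5 rating
    text: str

def flag_review(review: Review) -> bool:
    """Toy heuristic: flag very short five-star reviews as suspicious."""
    return review.stars == 5 and len(review.text.split()) < 5

def displayed_rating(reviews: list[Review]) -> float | None:
    """Average only the reviews that survive the filter, mirroring the
    description above: flagged reviews are hidden and excluded from
    the business's overall rating."""
    kept = [r for r in reviews if not flag_review(r)]
    if not kept:
        return None
    return sum(r.stars for r in kept) / len(kept)

reviews = [
    Review(5, "Best place ever!!!"),  # flagged by the toy rule
    Review(4, "Solid food, friendly staff, a bit slow on weekends."),
    Review(2, "Overpriced and the soup was cold."),
]
print(displayed_rating(reviews))  # averages only the two unflagged reviews -> 3.0
```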

“Whether or not to be transparent about review filters is a critical decision for platforms with many considerations,” said Kuruzovich.

Users may put less time and effort into their reviews if they suspect that those reviews have a significant chance of being filtered, or they may do the opposite to make their reviews less likely to be filtered. Since most fake reviews are overly positive, users may assume that positive reviews are the most likely to be filtered and act accordingly. However, with a transparent policy, those who submit fake reviews may be incentivized to change their ways.

“Review moderation transparency comes at a cost for platforms,” said Ravichandran. “Users reduce their contribution investment, or the amount of time and effort that they put into their reviews. This, in turn, affects the quality and characteristics of reviews. Although transparency helps to position a platform as unbiased toward advertisers, the resultant decrease in the number of reviews submitted impacts the platform’s usefulness to consumers.”

“This research informs businesses on best practices and consumer behavior in the digital world,” said Chanaka Edirisinghe, Ph.D., acting dean of the Lally School of Management. “Online reviews pose great opportunity for firms, but also raise complex questions. Platforms must earn the trust of users without sacrificing engagement.”