Arizona Public Radio | Your Source for NPR News

Online Review Too Good To Be True? Sometimes It Is

Online reviews are many, but it's often difficult to tell the real ones from the fakes. To help sort the genuinely delighted customers from profit-driven praise, researchers at Cornell University have developed software that can successfully tell the difference.
Yanik Chauvin / iStockphoto

From local plumbers to luxury hotels, just about everyone selling a service these days has an online reputation. Increasingly, that reputation is shaped by online reviews. Customer ratings on sites such as Yelp and Urbanspoon can, for example, make or break a new restaurant.

It's no wonder, then, that some businesses are trying to fake us out. On Craigslist and online forums, posters offer to buy and sell gushing reviews for just a few bucks, and potential customers can't tell the difference.

To help sort the genuinely delighted customers from profit-driven praise, Cornell University researcher Jeff Hancock and his colleagues have developed software that successfully unmasks fake online hotel reviews.

Hancock tells Laura Sullivan, guest host of weekends on All Things Considered, that too many bogus ratings could undermine the system.

"It gets at the very basic idea of what these reviews are about: trust," Hancock says.

The researchers started by "training" their computer algorithm on both fake reviews written for the study and real online reviews. Their software then went head-to-head against real humans and summarily defeated them: The computer was 90 percent accurate while the humans were correct three out of five times at best.
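The approach described above — training a classifier on a labeled mix of fake and real reviews, then scoring new text — can be sketched in miniature. The study's actual model and corpus aren't detailed here, so this is a minimal illustration using a simple word-count (Naive Bayes) classifier and a tiny made-up dataset; the function names and example reviews are hypothetical.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(labeled_reviews):
    # labeled_reviews: list of (text, label) pairs. Returns per-label
    # word counts and document counts for a multinomial Naive Bayes model.
    counts, totals = {}, Counter()
    for text, label in labeled_reviews:
        counts.setdefault(label, Counter()).update(tokenize(text))
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    # Pick the label with the highest log prior + log likelihood,
    # using add-one smoothing so unseen words don't zero out a score.
    vocab = set().union(*counts.values())
    best_label, best_score = None, float("-inf")
    for label, words in counts.items():
        score = math.log(totals[label] / sum(totals.values()))
        denom = sum(words.values()) + len(vocab)
        for w in tokenize(text):
            score += math.log((words[w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy labeled data standing in for the study's training corpus (hypothetical).
data = [
    ("my husband and i loved our stay amazing experience", "fake"),
    ("i will definitely return my vacation was perfect", "fake"),
    ("the bathroom was small and the lobby smelled of smoke", "real"),
    ("room 412 overlooked the parking lot check in took ten minutes", "real"),
]
counts, totals = train(data)
print(classify("i loved it my experience was amazing", counts, totals))
```

A real system would use far more data and richer features, but the shape is the same: the algorithm learns which words occur more often in each class and scores new reviews accordingly.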

It turns out humans are just bad at telling when people are lying.

"This is consistent with about 40 years of psychological research on deception detection," Hancock says.

Part of the reason is that the clues to deceptive reviews are often found in the function words.

"[These are] the 'the's and 'ah's, the prepositions, the pronouns — all the little words. We as humans, we completely ignore those," Hancock says. Fake reviews, for example, were more likely to include self-reference and often came up short on specific spatial information.
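Those cues — more self-reference, less spatial detail — are easy to turn into features a classifier can use. The word lists below are short hypothetical stand-ins for the study's much richer lexicons:

```python
# Hypothetical cue lexicons; the researchers' actual feature set is far richer.
FIRST_PERSON = {"i", "me", "my", "we", "our", "us"}
SPATIAL = {"bathroom", "floor", "lobby", "location", "room", "small", "large"}

def cue_profile(review):
    # Rate of first-person pronouns and count of concrete spatial terms.
    words = review.lower().split()
    return {
        "self_reference_rate": sum(w in FIRST_PERSON for w in words) / len(words),
        "spatial_terms": sum(w in SPATIAL for w in words),
    }

# A gushing, detail-free review scores high on self-reference, zero on spatial terms.
print(cue_profile("my husband and i loved it i cannot wait to go back"))
```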

This type of deception is not new to the Internet.

"We've been lying as long as we've been talking," Hancock says, "and that's about 60,000 years, so we have lots of practice."

The difference is that online deception, like these hotel reviews, relies on text.

"It's much more of a tell-me world rather than a show-me world," Hancock says.

Copyright 2023 NPR. To see more, visit https://www.npr.org.