Fake online reviews are everywhere. Meet the UTSC prof who’s developing strategies to stop them

Shreyas Sekar

More shoppers than ever are turning to online platforms such as Amazon to buy their goods, and those platforms are run by algorithms. Algorithms determine what you’ll want to watch on Netflix. They determine the route your Uber driver takes and, if you deliver food, whether you’re assigned preferred (i.e. “tipping”) customers.

Never mind the future, warns Shreyas Sekar, assistant professor of operations management in UTSC’s Department of Management: “The present is algorithms.” Algorithms run in the background of almost every aspect of our daily lives, influencing even the smallest decisions about what you buy.

But how much can you trust them?

Sekar, who is also cross-appointed to the Rotman School of Management, says that, at the moment, not very much at all.

“These are huge marketplaces where everyone interacts,” says Sekar, who completed his PhD in computer science at Rensselaer Polytechnic Institute and a postdoctoral fellowship at Harvard Business School. And while ‘everyone’ interacts on these online vending platforms, that doesn’t mean they play nice.

“People try to manipulate the system,” Sekar says.

Rank really does matter

In fact, a significant number of vendors on platforms such as Amazon are thought to be gaming the algorithms to jockey for better product placements – essentially, how quickly a product comes up when you give the platform an idea of what you’re looking for.

Yet vendors don’t have direct control over where their products come up in search results. Product reviews and click rates are among the few signals they can influence, and those signals help determine where in the search results a product is displayed – and how many potential customers it will reach. So, some vendors have found creative ways to “buy” fake reviews and clicks to get a quick leg up.
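To see why those two signals are such a tempting target, consider a toy illustration of a marketplace ranking score. This is purely hypothetical – the weights, fields and products below are invented for the example and have nothing to do with Amazon’s actual algorithm – but it shows how a handful of purchased reviews and clicks can vault a new listing past an established, honestly reviewed one.

```python
# Hypothetical illustration only: a toy ranking score, not any real
# marketplace's algorithm. Weights and product data are invented.
from dataclasses import dataclass


@dataclass
class Product:
    name: str
    avg_rating: float   # mean star rating, 1-5
    num_reviews: int    # how many reviews the rating is based on
    click_rate: float   # fraction of searchers who click the listing


def toy_rank_score(p: Product) -> float:
    """Combine review and click signals into a single placement score."""
    return 0.7 * (p.avg_rating / 5.0) + 0.3 * p.click_rate


catalog = [
    Product("Honest Kettle", avg_rating=4.3, num_reviews=800, click_rate=0.12),
    Product("Boosted Kettle", avg_rating=4.9, num_reviews=40, click_rate=0.30),
]

# A few dozen bought five-star reviews plus paid-for clicks are enough to
# push the "boosted" listing above the product with 800 genuine reviews.
for p in sorted(catalog, key=toy_rank_score, reverse=True):
    print(f"{p.name}: score={toy_rank_score(p):.3f}")
```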

"The problem is that fake reviews are getting more sophisticated."

“It’s more common than you think,” says Sekar, who notes that there are “entire Facebook pages dedicated to giving people cash incentives to post fake reviews.”

In fact, according to the independent monitor Fakespot, up to 42 per cent of Amazon’s reviews could be fake.

While Amazon employs thousands of people to sift through reviews and weed out the fake ones, plus automated tools to catch the rest, “the problem is that fake reviews are getting more sophisticated,” explains Sekar. “By the time you’ve detected the problem, maybe they’ve made a few thousand,” he adds.

Sekar thinks we can do better, and he’s developing the algorithms to do just that.

Finding the good despite the bad

“We have an algorithm that identifies which are the good products despite the presence of fake users,” says Sekar, who has been collaborating with researchers from MIT, Yale and Google Research on the project.

“In other words, we want a system that is robust to manipulation and converges to a good outcome no matter what the fake users throw at us,” adds Sekar. He also collaborates on analytics projects with The BRIDGE – a multipurpose academic space spanning teaching, research and experiential learning for business, developed in partnership with the Department of Management and the UTSC Library – and looks forward to working with students to mine data to help study the problem.

The team is developing a machine learning process that “learns more carefully,” says Sekar, “by being circumspect instead of making decisions too quickly with too little data.” The system will also deploy multiple algorithms simultaneously, which will help keep more fraudsters at bay.
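A minimal sketch of what “learning cautiously” can look like in practice: shrink a product’s observed rating toward a neutral prior until enough evidence accumulates, so a burst of early (possibly fake) five-star reviews moves the estimate only slowly. This is a generic smoothing technique chosen to illustrate the idea, not Sekar’s actual algorithm, and the numbers are assumptions.

```python
# Sketch of cautious estimation: a weighted average of a neutral prior and
# the observed mean rating. With few reviews, the prior dominates; only a
# large body of evidence can pull the estimate far from neutral.
# Illustrative only - not the research team's actual method.

def cautious_rating(avg_rating: float, num_reviews: int,
                    prior_mean: float = 3.0, prior_weight: int = 50) -> float:
    """Blend a neutral prior rating with the observed average rating."""
    total = prior_weight + num_reviews
    return (prior_weight * prior_mean + num_reviews * avg_rating) / total


# 20 purchased five-star reviews barely budge the estimate...
print(round(cautious_rating(5.0, 20), 2))    # 3.57
# ...while 500 genuine reviews dominate the prior.
print(round(cautious_rating(4.4, 500), 2))   # 4.27
```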

Uniquely, his algorithms will continue to “learn” well after a product launches – launch being the key moment when vendors load the system with false information – a step that sharply decreases a dishonest vendor’s chances of fooling the system.

In the near future, the team hopes to move its fraud-busting algorithm into real-world testing, and Sekar wants to bring more student researchers in on the complex project.

But while Sekar’s system may soon help online giants crack down on product fraud, he adds that consumers still need to pay close attention to algorithms.

As machine-learning tools become omnipresent, their programming and outcomes remain all too opaque. As Sekar explains, “We don’t know what the algorithms are doing.”

 

Photo: Shreyas Sekar, assistant professor in the Department of Management