
Joke's on them, I never look at 5- or 1-star reviews. I think rating systems are fundamentally flawed in that a star rating is anything but a universal concept. The people struggling to piece together how they feel about something that resists a simple verdict generally have the most insightful comments. There's no way to stop someone from gaming the system, though. If the economic incentive is to cheat, cheaters gonna cheat. Eventually plugins like Fakespot will become as common as ad blockers, imo.



The fundamental problem, though, is that companies want to discreetly inject these overly positive reviews. If users can identify these “paid-for” reviews, they will most likely treat them with distrust.

I really can’t see a way around this.


Sadly that can be gamed too. The fake account can give 2-4 star reviews to randomly selected apps in an attempt to keep its targeted review from being devalued.

But it would be useful for normal accounts. Some people are just super-critical of everything; a 1-star review from such a person shouldn't count for as much. Same thing on the other side with people who only give 5-star reviews.
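One way to sketch this idea is to re-center each reviewer's scores around their own habits, so a habitual 1-star grump and a habitual 5-star enthusiast both get discounted. This is just an illustrative scheme (the data shapes and the z-score approach are my assumptions, not any platform's actual algorithm):

```python
from statistics import mean, stdev

def normalize(reviews):
    """Adjust each rating for the reviewer's own habits.

    `reviews` maps reviewer -> list of (item, stars). A 1-star review
    from someone who averages 1.5 stars says less than one from someone
    who averages 4.5, so we convert each score to a z-score within that
    reviewer's history before aggregating per item.
    """
    adjusted = {}
    for reviewer, given in reviews.items():
        scores = [s for _, s in given]
        mu = mean(scores)
        sigma = stdev(scores) if len(scores) > 1 else 1.0
        for item, s in given:
            z = (s - mu) / (sigma or 1.0)  # guard against zero spread
            adjusted.setdefault(item, []).append(z)
    return {item: mean(zs) for item, zs in adjusted.items()}
```

With this scheme a grump's rare 2-star review ranks an item above a fan's rare 4-star review, which is roughly the intuition in the comment above.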


True that rating systems have been hacked. Maybe it really boils down to only following reviews from people you trust, which could be selected professional reviewers.

The fundamental problem of rating systems is how to prevent one party or the other (those who receive reviews and those who write them) from abusing the system for their own advantage. The two obvious failure modes are: 1) fake reviews, and 2) blackmail ("give me free stuff / a discount / do what I say, or I'll give you 1 star").

While the first problem is obvious, the latter is especially hard, and I have yet to see it solved properly. The only place that comes to mind that somewhat mitigated it was early eBay, which had ratings for both sellers and buyers, so dishonest buyers would receive low ratings. But it also tended to discriminate against new users, as there's no way to distinguish between an honest new user and just another single-use scam account. Furthermore, it only works on an established system, as you need many ratings to filter out noise, so it's a chicken-and-egg game ("nobody uses the platform because there aren't enough users" / "there aren't enough users because nobody uses the platform").

I don't see how these problems could be solved without third-party arbitration, but to do that you have to move back to square one (a centralized service).


> No incentive to game the system with inflated review scores.

There are huge incentives for the companies who make the products to influence the ratings, regardless.

Is there any way that this site can prevent fake reviews?


It's also that fake reviewers probably don't try to game third-party sites' rating systems. It seems plausible that if you implemented, on Amazon itself, the same algorithm some fake-review-spotting site uses, sellers would quickly find a way around it.

> Fake reviews are certainly a big challenge, even for companies like fakespot that try to detect them.

Not that big: if you cut away all 5- and 1-star reviews, you'll have stripped roughly 99% of astroturfing (for and against a company).
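The heuristic above is easy to state as code: recompute the average after discarding the extremes, on the theory that astroturfing clusters at 1 and 5 stars. This is just a sketch of the commenter's rule of thumb, not a vetted anti-fraud method:

```python
def trimmed_mean(stars):
    """Mean rating after discarding all 5- and 1-star reviews.

    Falls back to the plain mean if nothing is left in the middle,
    since an all-extremes product would otherwise have no score.
    """
    middle = [s for s in stars if 1 < s < 5]
    pool = middle or stars
    return sum(pool) / len(pool)
```

For example, `trimmed_mean([5, 5, 5, 1, 4, 3])` ignores the three 5s and the 1, averaging only the 4 and the 3.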


I’m suggesting that Fakespot might not be effective at weeding out fake reviews. We can’t see their core algorithm, and we don’t know how they determine a score. Oftentimes, their scoring feels as good as random selection.

Out of curiosity, what is the alternative? Just allow twenty 5-star or 1-star reviews to show up and say, hey, if business owners want to scam the system by putting in fake reviews, then great? Or just open-source their algorithms?

I guess to me it seems infinitely easier to criticize the way someone else does it than to present a more efficient way of doing it.


I use Fakespot, and I wonder about its accuracy. Seeing a 5-star product given a `D` rating is shocking. If Fakespot is reasonably accurate, then Amazon is inexcusably bad at removing fake reviews. Amazon has orders of magnitude more developer talent and user information than Fakespot to tackle accurate ratings. But I suppose that's the reality of differing incentives...

Fakespot and others like it have crappy algorithms. I know sellers that have never done anything shady for reviews yet get graded poorly by Fakespot, while their Chinese competitors that blatantly fake reviews get an A.

Programs like this also encourage bots to post reviews, or even the outright buying of reviews, to game the system, and if there isn't a good way to combat this misuse it can get out of hand pretty quickly. Most reviews I look at on Amazon are pretty useless to me nowadays because I can't tell what's real and what's not.

Nobody gets exactly 100% positive reviews. And it's about having the highest reviews, not about clearing some bar. Cheaters must be punished.

> Humans just aren't very good at ranking things on a normal distribution

Anecdotally this feels correct; the user rating distributions I most often see are an inverted bell curve: lots of 5s and 1s, not too many 3s.

But as far as the problems that plague every other review system go, I would say paid/fake reviews are the far bigger one.


Idk, is that very useful? There will be dubious reviews on any popular product. Isn't the true utility of Fakespot that it decides the credibility of a product's overall rating for you, based on some aggregation of the total reviews? That's what I'm questioning: whether or not it can be manipulated or tampered with, in which case we'd be back at the same problem Fakespot tries to solve, requiring a Fakespot for Fakespot reviews.

Fake reviews can be seen as an instance of Goodhart's law, where the metric is the rating or score of the business. Initially those ratings may correlate highly with something real, say the "quality" of the business. But the more people rely on those scores and the reviews underlying them, the more incentive businesses have to game the system, which destroys the original correlation between ratings and quality.

A big part of the problem with review systems is the one-to-many nature of nearly all of them: when a person posts a review, that review and its score can be seen by everyone. This leverage makes it very efficient for businesses to game the system, as a small amount of fake information can "infect" the purchasing decisions of a large number of users.

So, one alternative might be a many-to-many review system where you only see reviews and ratings from your network of friends/follows (and maybe friends-of-friends, to increase coverage). So essentially Twitter, but with tools and UI that focus on reviews and ratings. That way, fake reviews could only affect a limited number of people, making the cost/benefit calculus much less attractive for would-be astroturfers and shills.
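The many-to-many scheme above amounts to a reachability query on the follow graph: show a review only if its author is within a couple of hops of the viewer. A minimal sketch, where the data shapes (`follows` as user → set of followed users, `reviews` as user → list of (item, stars)) are hypothetical:

```python
from collections import deque

def visible_reviews(viewer, follows, reviews, max_hops=2):
    """Return reviews written within `max_hops` of `viewer` in the
    follow graph (friends and friends-of-friends by default).

    Uses breadth-first search so each user is visited at their
    shortest hop distance from the viewer.
    """
    seen = {viewer}
    frontier = deque([(viewer, 0)])
    while frontier:
        user, hops = frontier.popleft()
        if hops == max_hops:
            continue  # don't expand past the hop limit
        for nxt in follows.get(user, set()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, hops + 1))
    return [r for user in seen - {viewer} for r in reviews.get(user, [])]
```

The cost/benefit point follows directly: a shill account outside everyone's two-hop neighborhood is simply never rendered, so buying such reviews reaches almost nobody.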


The problem with ratings is that you are relying on altruism.

People doing the real ratings are not getting any direct benefit from the work to rate.

However, people doing fake ratings are getting direct monetary benefits.

Which of these two groups do you think will create more ratings?

For a consumer, I think the only thing that ultimately works is something like consumer reports which people pay money for and which do not accept any money or gifts or free samples from the makers of the stuff getting reviewed.


I've seen this too - a fake business (one selling bait-and-switch low quality services through online storefronts) had many 5 star reviews, and if you clicked into their profiles you'd see a smattering of 5 stars reviews all across the country, sometimes in different countries. Some of them also gave a few 1 star reviews to a few places, probably the flip side of the same review selling business.

I keep thinking this ends with credibility ratings weighting reviewers. Similar to how Google uses the odds that you're a real human to drive its captchas, we'd see a credit-like rating attached to your email account and used across platforms to determine whether you're an established, normal human or a spambot or troll, and your opinion weighted accordingly.
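Mechanically, that kind of scheme reduces to a weighted average: each review counts in proportion to its author's credibility score. The function name and the [0, 1] credibility scale here are made up for illustration; nothing like this is specified in the thread:

```python
def weighted_rating(reviews):
    """Aggregate (stars, credibility) pairs into one score.

    Each review is weighted by the reviewer's credibility in [0, 1],
    so a burst of 5-star reviews from zero-history accounts barely
    moves an established product's rating. Returns None when the
    total weight is zero (no credible reviews at all).
    """
    total_weight = sum(cred for _, cred in reviews)
    if total_weight == 0:
        return None
    return sum(stars * cred for stars, cred in reviews) / total_weight
```

For example, two 5-star reviews from throwaway accounts (credibility 0.05 each) barely dent a 2-star review from a long-established account (credibility 0.9).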

It's dystopian but I don't see another route out of this growing crisis of credibility online.
