PSA: ReviewMeta Is Not Accurate

There’s an article doing the rounds at the moment from the Washington Post suggesting that Amazon is undergoing some kind of fake review crisis. There are problems with Amazon reviews, of course, but this article is based on some pretty flawed data – at least as it pertains to the world of books, which is what I know and what I’ll focus on here. I can’t speak to the world of diet supplements or fake tan or giant tubs of lube – alas.
The article’s claims are largely based on a flaky site called ReviewMeta, which seems far better at getting publicity for itself than at correctly analyzing the trustworthiness of reviews – a pity, as it would be a wonderful tool if it were in any way accurate.
I first heard about ReviewMeta back in 2016 and was very excited to test it out. Naturally, I started with my own books, as I can be pretty sure there are no fake reviews there, being the author, publisher, and marketer of all these titles – and someone fastidious about the rules, because my name is literally my brand.
However, ReviewMeta seems to call into question a large number of my reviews and reviewers. And by extension me, I guess. And all of you too, because many of the books in the random selection I checked had similar issues.
The ReviewMeta site helpfully gives explanations for why its system made these determinations, and you can actually break down each component and get a further explanation. This transparency is hugely commendable.
Digging into this data, though, shows the extreme limitations of the site and the way it calculates the trustworthiness of reviews – at least as it pertains to the world of books. Perhaps it is more accurate for jellybeans or computer peripherals, I really can’t say. But when it comes to books, it makes a number of pejorative assumptions about what constitutes legitimate reader behavior – such as reviewing Book 2 of a series after reviewing Book 1, or mentioning the title of the book in the review – and these routine reader actions cause ReviewMeta to flag reviews as questionable or suspicious.
This then casts aspersions on the integrity of the authors of these books – who are self-employed people working in an industry where reputation and integrity are critically important, not huge faceless brands… if that matters. Worse still, the site has been aware of these issues for two years, and not only have they not corrected them, they reacted with hostility when presented with this information.
Let’s take a look at some concrete examples. Once again, I’m happy to be the guinea pig here, and have all of you (and ReviewMeta) poke and prod my reviews and check the authenticity of same, because I have 100% confidence that they are all genuine.
Here is ReviewMeta’s take on “David Gaughran” the brand and how trustworthy it is (that’s me, btw).

Okay, this doesn’t look good. And if you look down the page, it shows each of the assessed products that fed into this overall brand trustworthiness score. You can see that many of my books have “failed” in the eyes of ReviewMeta, and “Unnatural reviews detected” has been appended to several of them. Crikey.

You can click on each product and see how it came to that determination, and the supposed evidence for each component of that decision. Again, I stress, this transparency is truly commendable.

But this breakdown also reveals the faulty assumptions that led to these incorrect determinations about my reviews. And it’s not just my reviews, of course. These simplistic calculations affect most authors. (You can search for your own books here.)
Of course a reviewer of Books 1 and 2 is likely to review the third book in a trilogy. If you don’t account for that wholly natural behavior when analyzing book reviews, then all your results will be skewed. Mentioning the title of the book is another pretty common thing that (genuine) book reviewers do, but one which ReviewMeta views with extreme suspicion.
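To make those false positives concrete, here is a minimal sketch of the kind of rules being described – not ReviewMeta’s actual code, which isn’t public, just two hypothetical checks of the sort its explanations suggest (all names and data invented):

    from dataclasses import dataclass

    @dataclass
    class Product:
        title: str
        author: str

    @dataclass
    class Review:
        text: str
        authors_previously_reviewed: list  # this reviewer's history

    def naive_flags(review, product):
        """Hypothetical rules of the kind described above."""
        flags = []
        # Rule 1: reviewer has reviewed other products from the same "brand".
        # Plausible-ish for supplements; for books it just means a series fan.
        if product.author in review.authors_previously_reviewed:
            flags.append("repeat reviewer of this brand")
        # Rule 2: the review text mentions the product's name.
        # Genuine book reviewers name the title all the time.
        if product.title.lower() in review.text.lower():
            flags.append("mentions product title")
        return flags

    book3 = Product("Book Three of the Trilogy", "Jane Author")
    fan = Review("Book Three of the Trilogy is the best one yet!",
                 authors_previously_reviewed=["Jane Author"])
    print(naive_flags(fan, book3))
    # -> ['repeat reviewer of this brand', 'mentions product title']

A genuine series fan trips both rules at once, which is exactly the false-positive pattern showing up in my results.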

I pointed all this out in a series of tweets to ReviewMeta back in 2016, and they responded with a pissy blog post which was very selective in its use of my comments, presenting them in the worst possible light. (You can view the entire dialog here instead.)
You can see in these tweets that I raised all these issues, and more. I made a number of suggestions for how they could improve the site to take account of the way that reviewers actually review books – both to eliminate the false positives they seem to be generating, and to catch the suspicious patterns in book reviews that they were missing.
But none of these issues have been addressed by ReviewMeta in the last two years. Instead, they seem exclusively focused on publicity. Here is founder Tommy Noonan talking to Techspot in November 2016, then CNET a few months later in February 2017, and there have been similar pieces over the last couple of years – in Forbes, NYMag, Scientific American, Quartz, PBS NewsHour, BuzzFeed, ZDNet, Business Insider, and many, many more – which I couldn’t be bothered linking to. In all that time of furious self-promotion, I haven’t seen ReviewMeta improve the accuracy of its site.
The sad thing about all of this is that Amazon does have a fake review problem, one compounded by Amazon deploying a fake-review detection algorithm that seems about as accurate as the one from ReviewMeta, perhaps for similar reasons too. Which means that authors innocent of any wrongdoing get genuine, organic reviews from bona fide reviewers removed every day, while the scammers and cheaters with fake reviews keep getting away with it. Sites like ReviewMeta aren’t helping with this problem; they are making it worse.
But the worst part of all, perhaps, is the complete misunderstanding of how Amazon’s algorithms work. Reviews don’t cause success; they are a symptom of it. Yes, a lot of overwhelmingly positive reviews will sway an on-the-fence purchaser, but they don’t automatically lead to sales – not in the world of books, at least. Maybe if I’m looking for a phone charger and they are all more-or-less fungible, then reviews become the tie-breaker for a lot of people. Not with novels. I don’t care how many reviews The Da Vinci Code has, I’m never going to read it.
The continued media focus on “fake” reviews – driven in part by ReviewMeta’s relentless publicity drive – is taking attention away from much more serious issues that the media have not covered in any depth, such as clickfarming, bookstuffing, incentivized purchasing, and mass gifting.
Manipulation of rank and payouts, rather than reviews, is the truly serious issue here.
That’s not to say that ReviewMeta – or something like it – couldn’t serve a useful purpose. Unfortunately, ReviewMeta itself doesn’t seem interested in challenging the flawed assumptions underlying its product and trying to make it more accurate. Which is such a shame.
I tried to engage with them once more in 2016. I left a comment under that prickly blog post of theirs responding to my series of tweets.

In the most ironic twist since it rained on Alanis Morissette’s wedding day, that comment was deleted. It seems ReviewMeta doesn’t like being reviewed…

38 Replies to “PSA: ReviewMeta Is Not Accurate”

  1. I feel your pain as a reviewer. I have had a number of my very legitimate reviews
    taken down for sexual content. Now, I am a marriage, relationship, and sexual coach,
    and I was reviewing books with substantial sexual content, and they claimed that I
    violated their standards (which I did not – I have over 800 reviews published).

  2. Wow, just checked out one of my own books – they failed it on something they call ‘reviewer ease’. Here is what it says: ‘The ease score is the average rating for all reviews that a given reviewer submits. The average ease score for reviewers of this product is 4.5, while the average ease score for reviewers in this category is 4.3. Based on our statistical modeling, the discrepancy in average rating between these two groups is significant enough to believe that the difference is not due to random chance, and may indicate that there are unnatural reviews.’
    On what planet does a difference between 4.5 and 4.3 count as statistically significant?
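    To be fair, with enough reviews even a 0.2-star gap will clear a standard significance test – which is really the problem: statistical significance is a function of sample size, not evidence of fraud. A back-of-the-envelope sketch with entirely invented numbers (say 200 reviewers of the product vs. 2,000 in the category):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        # Invented rating samples: means 4.5 vs 4.3, similar spread,
        # clipped to the 1-5 star range
        product = np.clip(rng.normal(4.5, 0.6, 200), 1, 5)
        category = np.clip(rng.normal(4.3, 0.6, 2000), 1, 5)

        # Welch's t-test: with samples this large, a small gap is "significant"
        t, p = stats.ttest_ind(product, category, equal_var=False)
        print(f"t = {t:.2f}, p = {p:.5f}")  # p lands well below 0.05

    So the difference can be ‘statistically significant’ and still tell you nothing about whether any review is unnatural.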

  3. When the Amazon site requires only a $50 purchase to entitle you to review every product – whether purchased or used – and Goodreads, their other site, has a competition to see who can post the most fake reviews, then both their platforms are frauds. There’s also a pesky FTC regulation that makes it a crime to give false testimonials for products never bought or used, which Amazon seems to think doesn’t apply to them. I have been saying since 2013 that the reviews on Amazon are worthless.

  4. Thank you so much for tackling this bizarre organization (but we live in an increasingly crazy world) and doing so as elegantly and truthfully as is allowed.
    One question, inspired by ‘follow the money’ – how does ReviewMeta attract its revenue? Who pays it for flawed data and why?
    Thank you.

    1. That’s where things get interesting, as they always do. The guy behind it runs another site which *drum roll* provides reviews on stuff like supplements. Plus he has monetized his sites with ads and affiliate relationships. So he seems to have an interest in overstating the problem. An incentivized meta-reviewer, if you will.

  5. Truly, the best thing Amazon could do is get rid of reviewing altogether. There is so much fraud – literally, entire industries have been built up around devising ways to pull it off. Many of the folks who do so well on Amazon are the ones who are willing to pay said companies for the reviews.

    1. Honestly, the worst fraud doesn’t involve reviews. With books at least, I think the problem is overstated to an extent (and more serious problems ignored). There are issues with reviews – absolutely no argument that there are unscrupulous sellers (and buyers) of fake reviews. And there are authors/publishers who constantly break the rules in terms of inducements and so on. And Amazon has botched the whole issue. But I’d prefer to fix it rather than write off something which helps readers find books they like and helps authors find their target readers.

  6. Are these the same folks as Fakespot? I had them assess my books, and they not only declared I had many fake reviews (I don’t, as I am the author/publisher/marketer of all my books) but then tweeted that fact to the entire world using my book title as a hashtag. I took it up with them, as did a lot of indie authors, and they reworked their algorithm when they finally realized that reviewing books is a bit different…

      1. A while back Fakespot gave my two major works decent ratings. I just ran the ratings again, and they gave a C and a D. They’ve played with their algorithms, too, and I’d now classify them as Not Reliable.
        To be quite honest, given that most of the reviews are from verified purchases, I’m a little pissed.
        Ah well. If that’s the worst thing that happens to me this week, it’ll be a good week.

  7. What about the reviews that look like they’re part of an RP forum? I’ve seen a ton of them on BN.com and wouldn’t be surprised if they are on Amazon too. Are these flagged as fake reviews?

  8. All the more reason for authors to organize and leverage their united clout. Though I hear you, David, about the importance of valid reviews, I’ve been saying what Kaytee has said for years. “Reviews” is broken. (Although they’re actually mostly “reader comments,” not reviews, technically. Still relevant, of course, if “real.”) The folks who would write intelligent and insightful reviews – other authors – often get their fingers smacked for doing so.
    So … hmmmm … is it not of peculiar interest that Jeff owns The Washington Post? What’s afoot?

    1. The Washington Post has published articles that have been both highly critical of Amazon, and also some very positive coverage too. Doesn’t appear to be any major discernible slant to coverage that I’ve seen.

  9. Back in early 2016 I found a company with a similar service called Fakespot. When I first tested it, I kept seeing obviously bogus scores for book reviews.
    https://the-digital-reader.com/2016/01/25/spot-fake-kindle-reviews-with-fakespot/
    Fakespot had to go rework their algorithms to take into account the quirks of the book community’s social graph.
    https://the-digital-reader.com/2016/01/31/fakespot-responds-to-complaints-over-book-review-ratings-promises-changes/
    Or at least that is what I thought at the time. I just checked Fakespot today, and they agreed with ReviewMeta that you are a sketchy character. Funny thing is, the two companies can’t agree on which books have fake reviews:
    https://www.fakespot.com/company/david-gaughran

  10. Ugh. Once again, far too much trust is placed in a new technology before it’s really been shown to work. Or before it matures to the point where it actually can be trusted. 🙁

  11. I am very sceptical of reviews for anything after 15 years of internet purchases. One person’s view of a riveting ‘best-seller’ has previously turned into a wasted hour or so of my precious reading time and yet another paperback for the charity store – which, I think, they don’t need. I do find the sample chapters sometimes useful, as you can get a feel for the style. No doubt R####w M#t# will have its brief moment in the sun and sink without trace soon enough. And no, I am not a robot.
    Thanks again for such an entertaining and informative blog, David. I always learn something! And it’s nice to see so many people are so supportive of each other.
    Roger

  12. I find it interesting that they flagged my reviews for “overrepresented word groups”, which means, as near as I can determine, that the reviews for my books tend to be longer than reviews for a typical product. Well, I don’t write typical books, and a lot of my reviewers spent a lot of words discussing them. They seem to think that’s a bad thing.

  13. David – thanks so much for this enlightening information. I wrote a piece on Sunday about the thousands – maybe hundreds of thousands – of Amazon users whose accounts have been deleted in the last month because they broke some rule nobody will tell them about. Now I think I see what’s happening. If somebody mentions the name of the book, reviews a series, or commits one of ReviewMeta’s “infractions”, they’re flagged as “bad actors” and Amazon deletes their accounts. No explanation, no refunds. Their stories are heartbreaking. I’ve updated my post with this information and a link to this post. http://annerallen.com/2018/04/amazon-paid-reviews/

  14. I’m with you, David! I was astounded at how my books failed. Although this one book had 4- and 5-star reviews, they gave 100% trustworthiness to the one review that hated my book and gave it 2 stars. If people look at these ratings on my books, I’ll never sell another copy.

  15. How nice it would be if somebody who actually knew what they were doing could come up with an effective algorithm. I think that you nailed something very important – that book reviews are a completely different animal than reviews of other products. And maybe that is the Zon’s problem too: they are applying the same data set to book reviews as they are to spoons and tablecloths and various other widgets. Of course you would get repeat reviews by the same readers if you are writing a series – that’s the whole idea of a series, to offer a reader more than one chance to be with your characters. Once again, thanks for all you do.

  16. One of my wife’s reviews was flagged for having the following phrase in more than one review: “I would recommend this book”. If that is suspicious, then so are puppies and rainbows.
    They also placed a warning that over 10% of the reviews mentioned that they were incentivized. Someone should tell them that Amazon allows authors to give free review copies of books, even though they have made it against the rules to incentivize reviews for other kinds of products.
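    If the check really is ‘the same phrase appears in more than one review’ – which is what the flag suggests – it’s easy to see why it misfires: the most common reviewer phrases are exactly the ones honest readers repeat. A hypothetical version of such a rule, with the obvious fix of an allow-list of stock reviewer phrases (all names and data invented):

        from collections import Counter

        STOCK_PHRASES = {"i would recommend this book", "couldn't put it down"}

        def repeated_phrases(reviews, min_count=2):
            # Naive rule: flag any exact sentence appearing in 2+ reviews
            counts = Counter(
                sentence.strip().lower()
                for review in reviews
                for sentence in review.split(".")
                if sentence.strip()
            )
            # The allow-list keeps ubiquitous, innocent phrases from firing
            return {
                phrase for phrase, n in counts.items()
                if n >= min_count and phrase not in STOCK_PHRASES
            }

        reviews = [
            "Great pacing. I would recommend this book.",
            "Slow start but strong finish. I would recommend this book.",
        ]
        print(repeated_phrases(reviews))  # set() - nothing suspicious left

    Without the allow-list, ‘I would recommend this book’ gets flagged in both reviews – puppies and rainbows indeed.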
