ReviewMeta works very hard to position itself as some kind of independent arbiter of review authenticity, but an examination of its methods proves two things: 1) ReviewMeta is not very accurate and 2) ReviewMeta does not like being reviewed.
There’s an article doing the rounds at the moment from the Washington Post suggesting that Amazon is undergoing some kind of fake review crisis. There are problems with Amazon reviews, of course, but this article is based on some pretty flawed data – at least as it pertains to the world of books, which is what I know, and what I’ll focus on here. I can’t speak to the world of diet supplements or fake tan or giant tubs of lube – alas.
The article’s claims are largely based on a flaky site called ReviewMeta, which seems far better at getting publicity for itself than at correctly analyzing the trustworthiness of reviews – a pity, as it would be a wonderful tool if it were in any way accurate.
I first heard about ReviewMeta back in 2016 and was very excited to test it out. Naturally, I started with my own books, as I can be pretty sure there are no fake reviews there, being the author, publisher, and marketer of all these titles, and someone who is fastidious about the rules as my name is literally my brand.
However, ReviewMeta seems to call into question a large number of my reviews and reviewers. And by extension me, I guess. And all of you too, because many of the random selection of books I checked had similar issues.
The ReviewMeta site helpfully gives explanations for why its system made these determinations, and you can actually break down each component and get a further explanation. This transparency is hugely commendable.
Digging into this data, though, shows the extreme limitations of the site and the way it calculates the trustworthiness of reviews – at least as it pertains to the world of books.
Perhaps it is more accurate for jellybeans or computer peripherals, I really can’t say. But when it comes to books, it makes a number of pejorative assumptions about what is legitimate reader behavior – such as reviewing Book 2 of a series after reviewing Book 1, or mentioning the title of the book in the review – and these routine reader actions cause ReviewMeta to flag reviews as questionable or suspicious.
This then casts aspersions on the integrity of the authors of these books – who are self-employed people working in an industry where reputation and integrity are critically important, not huge faceless brands… if that matters. Worse still, the site has been aware of these issues for two years. Not only has it failed to correct them, it reacted in a hostile way when presented with this information.
Let’s take a look at some concrete examples. Once again, I’m happy to be the guinea pig here, and have all of you (and ReviewMeta) poke and prod my reviews and check the authenticity of same, because I have 100% confidence that they are all genuine.
ReviewMeta Case Study: David Gaughran
Here is ReviewMeta’s take on “David Gaughran” the brand and how trustworthy it is (that’s me, btw).
Okay, this doesn’t look good. And if you look down the page it shows each of the products they have assessed that led to this overall brand trustworthiness score. You can see many of my books have “failed” in the eyes of ReviewMeta and “Unnatural reviews detected” has been appended to several of my books.
You can click on each product and see how it came to that determination, and the supposed evidence for each component of that decision. Again, I stress, this transparency is truly commendable.
But this breakdown also reveals the faulty assumptions that led to these incorrect determinations about my reviews. And it’s not just my reviews, of course. These simplistic calculations affect most authors. (You can search for your own books here.)
Of course a reviewer of Books 1 and 2 is likely to review the third book in a trilogy. If you don’t take account of that wholly natural behavior when analyzing book reviews, then all your results will be skewed. Mentioning the title of the book is another pretty common thing that (genuine) book reviewers do, but ReviewMeta views it with extreme suspicion.
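To see why these assumptions generate false positives for books, here is a toy sketch of the kind of heuristics described above – purely hypothetical, not ReviewMeta’s actual code – showing how a completely natural series reader trips both checks:

```python
# Hypothetical sketch: two simplistic "suspicion" heuristics of the kind
# described above. Neither is taken from ReviewMeta's real system.

def naive_flags(review_text, title, reviewer_history, series):
    """Return the heuristic flags a simplistic checker might raise."""
    flags = []
    # Heuristic 1: review mentions the product title.
    if title.lower() in review_text.lower():
        flags.append("mentions product title")
    # Heuristic 2: reviewer has reviewed other products in the same line.
    if any(book in reviewer_history for book in series if book != title):
        flags.append("reviewer overlaps with same-series products")
    return flags

# A genuine reader who followed a trilogy trips both heuristics:
series = ["Example Book 1", "Example Book 2", "Example Book 3"]
history = ["Example Book 1", "Example Book 2"]
review = "Example Book 3 is a fitting end to the trilogy."

print(naive_flags(review, "Example Book 3", history, series))
# → ['mentions product title', 'reviewer overlaps with same-series products']
```

For fungible products like phone chargers, overlap between reviewers of “competing” items might genuinely be suspicious; for a book series, it is exactly what loyal readers do, which is why the same heuristics skew the results.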
I pointed all this out in a series of tweets to ReviewMeta back in 2016, and they responded with a pissy blog post, which was very selective in its use of my comments, trying to paint them as badly as possible. (You can view the entire dialog here instead.)
You can see in these tweets that I raised all these issues, and more. I made a number of suggestions as to how they could improve the site to take account of the way that reviewers review books – both to eliminate the false positives they seem to be generating, and to catch the suspicious patterns in book reviews they were missing.
ReviewMeta PR Drive
But none of these issues have been addressed by ReviewMeta in the last two years. Instead they seem exclusively focused on publicity. Here is founder Tommy Noonan talking to Techspot in November 2016, then CNET a few months later in February 2017, and there have been similar pieces over the last couple of years – in Forbes, NYMag, Scientific American, Quartz, PBS NewsHour, BuzzFeed, ZDNet, Business Insider, and many, many more – which I couldn’t be bothered linking to.
In all that time of furious self-promotion, I haven’t seen ReviewMeta improve the accuracy of its site.
The sad thing about all of this is that Amazon does have a fake review problem, one which is compounded by Amazon deploying a fake review detection algorithm that seems about as accurate as the one from ReviewMeta – perhaps for similar reasons too. Which means that authors innocent of any wrongdoing get genuine, organic reviews from bona fide reviewers removed every day, while the scammers and cheaters with fake reviews keep getting away with it. Sites like ReviewMeta aren’t helping with this problem; they are making it worse.
But the worst part of all, perhaps, is the complete misunderstanding of how Amazon algorithms work. Reviews don’t cause success; they are a symptom of it. Yes, a lot of overwhelmingly positive reviews will sway an on-the-fence purchaser, but they don’t automatically lead to sales – not in the world of books, at least. Maybe if I’m looking for a phone charger and they are all more-or-less fungible, then reviews become the tie-breaker for a lot of people. Not with novels. I don’t care how many reviews The Da Vinci Code has, I’m never going to read it.
The continued media focus on “fake” reviews – driven in part by ReviewMeta’s relentless publicity drive – is taking attention away from much more serious issues that the media have not covered in any depth, such as clickfarming, bookstuffing, incentivized purchasing, and mass gifting.
Manipulation of rank and payouts, rather than reviews, is the truly serious issue here.
That’s not to say that ReviewMeta – or something like it – couldn’t serve a useful purpose. Unfortunately, ReviewMeta itself doesn’t seem interested in challenging the flawed assumptions underlying its product, or in trying to make it more accurate. Which is such a shame.
I tried to engage with them once more in 2016. I left a comment under that prickly blog post responding to my series of tweets on ReviewMeta.
In the most ironic twist since it rained on Alanis Morissette’s wedding day, that comment was deleted. It seems ReviewMeta doesn’t like being reviewed…
Okay, so I was wrong. Here’s something even more ironic: a user of ReviewMeta on the consumer side left a lengthy review of ReviewMeta on Trustpilot – a genuinely constructive review which sought to identify real shortcomings in ReviewMeta, based on the user’s own experience.
And here’s the crazy part: ReviewMeta’s owner was so annoyed – once again – at being criticized that he contacted the reviewer and asked them to change or delete their review.
“This review is hurtful to our business… please consider deleting or changing your review.” – ReviewMeta
Even this author couldn’t make it up.