The US Federal Trade Commission (FTC) is swinging its big regulatory stick at fake online reviews. Rules announced earlier this year are now in effect, banning reviews from fake people, AI-generated review text, and reviews from people who don’t disclose their status as company insiders or affiliates.
If you spot violations of the new policy, you can report them directly to the FTC; violators face a maximum fine of $51,744 per violation.
The FTC’s updated guidelines for online reviews outlaw six specific types of fakery: reviews from people who aren’t real or that use AI-generated text or images; paid reviews (or other forms of compensation, like those sneaky $10 coupons thrown in with an Amazon shipment); reviews from employees of the company offering the product or from others with a financial relationship; fake review sites designed to gum up search results; suppression of negative reviews; and buying or selling fake “followers” or other social influence.
FTC chairwoman Lina Khan announced yesterday that the rule is now in effect, posting on X, the social media site formerly known as Twitter (as spotted by PCMag). She encourages Americans to report infractions at reportfraud.ftc.gov.
I think the new guidelines are great, and they should make companies like Amazon and Walmart take a much harder look at the increasing number of review bots that ultimately make their marketplaces and product listings less trustworthy.
That being said, I have some serious doubts about the Commission’s ability to actually enforce these rules. Since most of the infractions come from small-time sellers gaming automated systems — and many of them fall outside US jurisdiction — the measurable effect of this policy could be negligible.
Some means of going after the platforms that enable these problems — stores like Amazon, social networks like X and Facebook, etc. — would be a more meaningful avenue toward actually protecting consumers. But it would also be a much harder regulatory achievement, and even if attempted, would cause instant legal blowback from some of the biggest corporations on the planet and their political allies.
But having these rules on paper might dissuade certain well-known laptop makers from, say, nudging the scores on their online customer reviews or hiding any reviews that highlight particularly egregious service examples. It’s a step in the right direction.
Author: Michael Crider, Staff Writer, PCWorld
Michael is a 10-year veteran of technology journalism, covering everything from Apple to ZTE. On PCWorld he’s the resident keyboard nut, always using a new one for a review and building a new mechanical board or expanding his desktop “battlestation” in his off hours. Michael’s previous bylines include Android Police, Digital Trends, Wired, Lifehacker, and How-To Geek, and he’s covered events like CES and Mobile World Congress live. Michael lives in Pennsylvania where he’s always looking forward to his next kayaking trip.