Online reviews have become one of the most powerful trust signals in modern commerce. Every day, millions of consumers rely on star ratings and customer feedback to decide where to spend their money, which brands to trust, and which businesses to avoid.
But new analysis suggests that the reliability of that system may be under far greater pressure than previously understood.
For years, industry estimates suggested that between 7% and 15% of online reviews were fake or misleading. Those figures shaped public perception and influenced how platforms, businesses, and regulators approached the issue. Fake reviews were widely seen as a contained problem, present but limited. That assumption no longer reflects what the data shows today.
Recent analysis conducted by TruthEngine® indicates that in some sectors, up to 50% of online reviews now display characteristics consistent with manipulation or coordinated inauthentic activity. This does not mean every individual review is provably fake, but it does indicate that large portions of review ecosystems are showing behavioural patterns associated with organised reputation manipulation. This represents a structural shift in the digital trust landscape.
Fake reviews were historically associated with isolated incidents: small-scale attempts by individual businesses to influence their own ratings. What is emerging now looks fundamentally different. The problem has evolved from occasional misuse into structured, scalable manipulation.
Several forces have contributed to this change. Online reviews have become deeply embedded in commercial performance, directly influencing search visibility, marketplace rankings, conversion rates, and consumer purchasing decisions. As reviews became economically valuable, incentives to influence them increased.
At the same time, advances in automation and generative AI have dramatically lowered the barrier to producing convincing review content at scale. Reviews can now be generated faster, cheaper, and in far greater volumes than ever before, often appearing authentic when viewed individually.
The result is that manipulation is no longer obvious at the single-review level. Many suspicious reviews look entirely normal in isolation. The patterns only become visible when behaviour is analysed across large datasets over time.
TruthEngine®’s research focuses on these behavioural signals rather than subjective judgments about individual comments. Analysis examines indicators such as posting velocity and timing anomalies, language similarity and templating patterns, reviewer network relationships, rating distribution irregularities, and cross-platform behavioural correlation. When viewed collectively, these signals reveal coordinated activity that traditional moderation approaches may struggle to detect.
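To make two of these signals concrete, here is a minimal illustrative sketch of how posting-velocity bursts and language-similarity (templating) patterns might be surfaced from raw review data. All field names, sample data, and thresholds are invented for illustration; this is not TruthEngine®'s methodology.

```python
# Illustrative sketch: two behavioural signals from the article,
# posting-velocity anomalies and near-duplicate (templated) wording.
# Data, field names, and thresholds are hypothetical.
from datetime import datetime, timedelta
from itertools import combinations

reviews = [
    {"id": 1, "time": datetime(2026, 3, 1, 9, 0),  "text": "Great product, fast delivery, would buy again"},
    {"id": 2, "time": datetime(2026, 3, 1, 9, 7),  "text": "Great product and fast delivery, would buy again"},
    {"id": 3, "time": datetime(2026, 3, 1, 9, 12), "text": "Amazing product, fast delivery, buy again"},
    {"id": 4, "time": datetime(2026, 2, 14, 15, 30), "text": "Item arrived late and the box was damaged"},
]

def velocity_bursts(reviews, window=timedelta(hours=1), threshold=3):
    """Flag any time window containing `threshold` or more reviews."""
    times = sorted(r["time"] for r in reviews)
    bursts = []
    for i, start in enumerate(times):
        count = sum(1 for t in times[i:] if t - start <= window)
        if count >= threshold:
            bursts.append((start, count))
    return bursts

def jaccard(a, b):
    """Word-level Jaccard similarity between two review texts."""
    wa = set(a.lower().replace(",", "").split())
    wb = set(b.lower().replace(",", "").split())
    return len(wa & wb) / len(wa | wb)

def similar_pairs(reviews, cutoff=0.6):
    """Pairs of reviews whose wording overlaps suspiciously."""
    return [(r1["id"], r2["id"])
            for r1, r2 in combinations(reviews, 2)
            if jaccard(r1["text"], r2["text"]) >= cutoff]

print(velocity_bursts(reviews))  # one burst: three reviews inside an hour
print(similar_pairs(reviews))    # reviews 1-2 and 1-3 share templated wording
```

Each review looks plausible in isolation, which is the point the article makes: only the timing cluster and the wording overlap, visible across the dataset, mark the group as suspicious. Production systems would use far richer features, but the principle is the same.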
Importantly, most review platforms primarily analyse activity within their own environments. Coordinated manipulation, however, often operates across multiple platforms and accounts simultaneously, meaning broader patterns can remain hidden without cross-dataset analysis.
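A toy example of why cross-dataset analysis matters: reviews of the same business landing on two platforms within minutes of each other can point to a single coordinated campaign, yet neither platform alone would see the pattern. The data, business name, and ten-minute window below are invented for illustration.

```python
# Hypothetical cross-dataset check: same business reviewed on two
# platforms within a short window. Platform names, data, and the
# window size are assumptions, not a real detection rule.
from datetime import datetime, timedelta

platform_a = [("acme-ltd", datetime(2026, 3, 1, 9, 2)),
              ("acme-ltd", datetime(2026, 3, 1, 9, 5))]
platform_b = [("acme-ltd", datetime(2026, 3, 1, 9, 4)),
              ("acme-ltd", datetime(2026, 2, 20, 12, 0))]

def cross_platform_hits(a, b, window=timedelta(minutes=10)):
    """Cross-platform review pairs for the same business within `window`."""
    return [(biz, ta, tb)
            for biz, ta in a
            for biz_b, tb in b
            if biz == biz_b and abs(ta - tb) <= window]

hits = cross_platform_hits(platform_a, platform_b)
print(len(hits), "correlated cross-platform review pairs")
```

Within either platform's own data the activity looks like ordinary, lightly clustered reviews; only when the two streams are joined does the synchronised posting stand out.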
The implications extend far beyond individual purchasing decisions. Online reviews influence market competition, brand reputation, and consumer confidence at scale. When trust in review systems weakens, honest businesses may be disadvantaged, consumers may be misled, and platforms face growing scrutiny over the reliability of the information they host.
Regulatory attention is also increasing. Across multiple jurisdictions, including the UK and EU, authorities are placing greater emphasis on misleading commercial practices and digital transparency. Review authenticity is rapidly moving from a marketing concern to a governance and compliance issue.
The most important takeaway is not simply that fake reviews exist. It is that the scale and sophistication of manipulation appear to be increasing faster than public understanding or industry assumptions. The long-standing belief that fake reviews represent only a small minority of activity may no longer hold true in certain sectors.
Restoring confidence in online reviews will require greater transparency, improved detection methodologies, and a broader recognition that digital trust systems must evolve alongside the technologies influencing them.
Online reviews remain an essential part of modern commerce. Ensuring their reliability is now one of the defining trust challenges of the digital economy.
TruthEngine® analyses large-scale review data to identify patterns of inauthentic activity, helping organisations understand review risk exposure, strengthen governance, and support greater transparency in digital marketplaces.
Written by Andrew Winton on March 3, 2026