
Software Review Site TrustRadius Has A New Way to Treat Reviews Obtained Through Vendors


Online user reviews are among the most powerful influences on purchase decisions. But do they accurately represent the views of most users?

Today, business software review platform TrustRadius is announcing a new way — called trScore — to handle the bias introduced in reviews by users obtained through the vendor of the reviewed software product. The site says more than two million software buyers visit each year to check out its product reviews.

To understand trScore, let’s first look at TrustRadius’ approach.

The site says it authenticates all users through their LinkedIn profiles. It also requires users to answer eight to ten questions about the product, to weed out anyone with no real familiarity with it. Additionally, a staff member reads every review before it is posted, and the site says about three percent of reviews are rejected for not meeting its guidelines.

As for the reviews themselves, TrustRadius puts them into two main buckets: independently-sourced reviews and vendor-sourced reviews. (Consider “reviews” to also mean “ratings,” in most cases.)

First, independently-sourced reviews:

These include reviews written by the thousands of users who register on the site to read its assessments and agree to write reviews of their own. Additionally, the site solicits reviewers through places such as LinkedIn and Quora, based on their interests.

TrustRadius also obtains users from customer lists provided by vendors. But TrustRadius considers these to be independently-sourced, because it accepts only either a vendor’s full customer list or what it calls “a representative sample,” which it determines by randomly sampling the full list. TrustRadius then contacts those users itself and says it asks for unbiased reviews.
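To make the sampling step concrete, here is a minimal sketch of drawing a representative sample from a customer list. TrustRadius has not published how it samples; the function name, list contents and sample size below are hypothetical.

```python
import random

def representative_sample(customer_list, sample_size, seed=None):
    """Draw a simple random sample from a vendor's full customer list.

    The idea of a "representative sample": every customer has an equal
    chance of being invited, so a vendor cannot hand-pick only its
    happiest users.
    """
    rng = random.Random(seed)
    # If the list is smaller than the requested sample, invite everyone.
    if len(customer_list) <= sample_size:
        return list(customer_list)
    return rng.sample(customer_list, sample_size)

# Hypothetical usage: pick 200 customers to contact from a full list.
full_list = [f"customer_{i}@example.com" for i in range(5000)]
invitees = representative_sample(full_list, 200, seed=42)
```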

Sometimes, TrustRadius is paid by vendors to solicit more reviews, or to “scale out reviews,” so the vendor can use them for marketing. This might mean, for instance, turning 10 reviews into 50. TrustRadius says it always asks for “a fair customer sample” of users and for objective assessments.

These reviews are considered independently-sourced, and the site does not note that it has been paid by the vendor to obtain them. TrustRadius points out that overall product scores depend not on the number of reviews but on their average assessments.

Now, vendor-sourced reviews:

Vendor-sourced reviews are ones where the vendor directly approaches its customers and asks them to write reviews. Vendors are also supposed to ask those customers to be objective.

TrustRadius says it treats reviews from vendors that participate in its free review program on the same footing as ones the site has been paid to collect.

The problem that TrustRadius is trying to correct, CEO Vinay Bhagat told me, is that vendor-sourced users tend to generate more positive ratings and reviews than users obtained elsewhere. Unsurprisingly, the distribution of highly positive scores is greater for vendor-solicited reviews than for independently-sourced ones.

That selection bias, VP of marketing Bertrand Hazard said via email, has “escalated in the last 12–18 months” because of a greater emphasis on reviews by vendors and a greater ability for vendors to identify product advocates among customers. The new trScore, he said, is the “action/response we’ve taken.”

Here are two graphs of the score distribution. The first, for vendor-sourced ratings and reviews, shows the number of reviews/ratings (0 to 3500) versus ratings of 1–10, with 10 being the highest:

[Chart: Distribution of Ratings and Reviews Sourced by Vendors (n=2,733)]

And this is the distribution for independently-sourced reviews:

[Chart: Distribution of Ratings and Reviews Sourced Independently by TrustRadius (n=15,640)]

The new trScore weights the averages of vendor-solicited reviews to adjust for their more positive bias. More recent reviews and ratings are given more weight, reviews count more than ratings, and independently-sourced reviews count more than vendor-sourced ones.
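Here is a rough sketch of how such a weighted average could look in practice, purely as an illustration. TrustRadius has not published its formula; the specific weights, the recency decay and the field names below are assumptions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Feedback:
    score: float          # 1-10 rating given by the user
    submitted: date       # when it was submitted
    is_review: bool       # full written review vs. a bare rating
    vendor_sourced: bool  # solicited by the vendor vs. independently sourced

def feedback_weight(item: Feedback, today: date) -> float:
    """Assign a relative weight to one rating/review.

    Illustrative assumptions: weight decays with age, written reviews
    count more than bare ratings, and independently sourced feedback
    counts more than vendor-sourced feedback.
    """
    age_years = (today - item.submitted).days / 365.0
    recency = max(0.25, 1.0 - 0.5 * age_years)            # newer counts more
    review_factor = 1.5 if item.is_review else 1.0        # reviews > ratings
    source_factor = 0.6 if item.vendor_sourced else 1.0   # vendor-sourced down-weighted
    return recency * review_factor * source_factor

def tr_score_sketch(feedback: list[Feedback], today: date) -> float:
    """Weighted average of all scores for a product."""
    weights = [feedback_weight(f, today) for f in feedback]
    return sum(w * f.score for w, f in zip(weights, feedback)) / sum(weights)
```

Under assumptions like these, a product whose positive feedback comes mostly from vendor-solicited ratings lands closer to the average of its independently-sourced reviews.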

The main point, Bhagat said, is to get the distribution of high–low scores for vendor-sourced reviews to more closely resemble the distribution of independently-sourced ones.

Since it corrects for this positive tilt among vendor-supplied reviews, trScore has brought overall ratings down a bit. Here are the overall ratings of social software products before trScore:

  1. Agora Pulse [8.9]

  2. Expion [8.8]

  3. ViralHeat [8.7]

  4. Sprinklr [8.1]

  5. Shoutlet [8.1]

And after:

  1. Expion [8.7]

  2. Sprinklr [8.3]

  3. ViralHeat [8.1]

  4. Agora Pulse [8.1]

  5. Shoutlet [7.5]

TrustRadius says it is the only software review site making an effort to compensate for vendor-introduced bias, and it notes that its leading competitor, G2 Crowd, does not do this.
