The Instructables Rating System

As you have all noticed, Instructables has recently implemented a new rating system, based on stars (insert 2001: A Space Odyssey joke here). We've done a bit of experimenting -- and reserve the right to do more in the future -- but it's stable for now, so here's how it works.

UPDATE 2010-09-24: The below calculation has been tweaked to bring Instructables' ratings closer to the level of ratings of comparable content on the Internet generally.  We found that we run about a star low, based on a rough look at a bunch of other sites and our (totally unbiased of course!) estimates of their content quality.  So, we have added a star to all ratings across the board.  It's like grade inflation... but we know our users put up really good stuff and we want that to be recognized.

UPDATE: we've tweaked the system a bit more. The below describes the rating system as of 2008-11-15.

The rating system uses a weighted average, where Instructables with only a few ratings, or ratings by users who have not rated many Instructables, have a total closer to the middle of the star range than the actual average of the ratings. For example, an Instructable with one five-star rating will have a total closer to 4 than to 5, because there are so few ratings on it. If the user who rated it hasn't rated other Instructables, the total will be lower than if the user who rated it has done a lot of rating in the past. The purpose of this difference is to minimize the effect of 'shill' ratings; we've seen some members attempt to bump their ratings by creating fake accounts just to rate up an Instructable or two!

As more people rate each Instructable, the total becomes closer and closer to the true average of the ratings. And as each user rates more and more Instructables, we weight their rating higher and higher. We have done it this way because we want the rating to reflect the opinion of the whole community, and we believe that members who do a lot of rating have been around enough to have a good sense of what makes a good Instructable. Also, members who comment and rate have their rating counted higher, as the comment indicates a higher level of involvement with the Instructable than rating alone.

This method makes it so a less-than-stellar Instructable can't be pushed to the very top of the ratings because the author got their friends to sign up and rate it at 5 stars; similarly a really good Instructable can't be dragged too far down by a malicious user.

For the mathy amongst you, this is the formula (we changed the way one component is calculated):

           (siteAvgNumRates * siteAvgRating) + (numRates(ible) * avgRating(ible))
R(ible) = -------------------------------------------------------------------------
                           siteAvgNumRates + numRates(ible)
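The damping effect of the site-wide averages can be sketched in a few lines of Python. Note that siteAvgNumRates and siteAvgRating are constants held by the site and never published; the values 10 and 2.5 below are made-up placeholders for illustration only.

```python
# Illustrative sketch of the rating formula above.
# These two constants are assumptions, NOT the real site values.
SITE_AVG_NUM_RATES = 10.0
SITE_AVG_RATING = 2.5

def overall_rating(num_rates: int, avg_rating: float) -> float:
    """Pull an Instructable's average toward the site average when it
    has few ratings; converge to its own average as ratings pile up."""
    return ((SITE_AVG_NUM_RATES * SITE_AVG_RATING
             + num_rates * avg_rating)
            / (SITE_AVG_NUM_RATES + num_rates))

# A single five-star rating lands near the (assumed) site average:
print(round(overall_rating(1, 5.0), 2))    # → 2.73
# A hundred five-star ratings converge toward the true average of 5:
print(round(overall_rating(100, 5.0), 2))  # → 4.77
```

With more ratings, the site-average terms shrink relative to the Instructable's own terms, which is exactly the "closer and closer to the true average" behavior described above.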
The avgRating(ible) is no longer a straight average, but a weighted one. It's calculated like this:
                  (W1*R1 + W2*R2 + ... + Wx*Rx) + 2*(V1*C1 + V2*C2 + ... + Vy*Cy)
avgRating(ible) = ----------------------------------------------------------------
                       (W1 + W2 + ... + Wx) + 2*(V1 + V2 + ... + Vy)

Pretty complicated algebra, huh? W and V are both weight values, between 0 and 1. R is the rating of someone who has not commented, and C is the rating of someone who has commented.
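The weighted average above translates directly into code. This is a sketch of the described formula, not the actual site implementation; the sample weights and ratings at the end are made up.

```python
def weighted_avg_rating(plain, commented):
    """Weighted per-Instructable average, per the formula above.

    plain:     (weight, rating) pairs from raters who did not comment.
    commented: (weight, rating) pairs from raters who also commented;
               their terms count double, reflecting higher involvement.
    Weights are between 0 and 1.
    """
    numerator = (sum(w * r for w, r in plain)
                 + 2 * sum(v * c for v, c in commented))
    denominator = (sum(w for w, _ in plain)
                   + 2 * sum(v for v, _ in commented))
    return numerator / denominator

# Two ordinary raters (weights 0.5 and 1.0) and one commenter who rated 3:
print(weighted_avg_rating([(0.5, 4), (1.0, 5)], [(1.0, 3)]))  # → about 3.71
```

Because the commenter's weight appears doubled in both the numerator and the denominator, a comment boosts that rater's influence without inflating the scale of the average itself.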

siteAvgNumRates and siteAvgRating are estimates, held constant in the pursuit of not having ALL ratings change every time one person rates one Instructable.

As you can see, when the number of ratings for a particular Instructable is low, the site averages count for much more of the rating value. When the rating count is high, the site averages are only a small part of the total rating value. We believe this algorithm strikes a good balance between letting the cream rise to the top and preventing 'rating spam', to coin a term.

Please comment!

kelseymh, 7 years ago
As announced early this morning (23 Sep 2010), the calculation of ratings has been changed, to essentially bump all the values up by approximately one star. The details haven't yet been propagated to this forum topic.
rachel (author), 9 years ago
@highvoltage2000, we don't currently show details of who rated what anywhere. But it's an interesting idea, what do people think? As an author, should we display this info? As a rater, should we display it?
yokozuna, in reply to rachel, 8 years ago
As an author, I'm quite curious as to who rated my instructables, and how that would compare to their other ratings. As a rater, I've started letting authors know when I rate their projects for that reason. Since we won't ever know the site averages, we could at least see how people had ranked it. If you take away anonymity, it would also help people spot "rating spam", would it not? I think people would be more likely to rate more accurately if they knew people could see what they'd been up to.
rachel (author), in reply to yokozuna, 8 years ago
Thanks, I appreciate the input! We are still undecided about whether to expose this info or not, so it's good to have people's opinions.
kelseymh, in reply to rachel, 8 years ago
I would also be interested in Nacho's suggestion. I don't care about who rated, but the frequency distribution would have some useful information: unimodal, bimodal, flat, and so on...
How about:

xxx people rated it 1 star
xxx people rated it 2 stars
...

And if you want to get real fancy, add the rating formula with the appropriate values filled in and highlighted. Authors would get the info they want and the raters are still anonymous.
Kiteman, in reply to rachel, 9 years ago
I think it would be interesting to see who gave it a rating at least (the people who hit + used to be displayed, after all).

It would also be interesting to see a histogram of the ratings - is the rating average because it's had a lot of average ratings, or because it's had a lot of low-ish ones and a few high ones?
I would suggest the same as Nacho, just make it visible for the authors to see how many people gave 1 star, 2 stars, etc... That way you won't get people spamming other people with things like "why did you only give my ible one star?" or "hey thanks for rating" and that sort of stuff.

That's one of the plagues you have on, for example, DeviantArt. They can see who has favorited their artwork and start spamming everyone with "thanks for the fav". Although that's very friendly, it also gets annoying after a while.

So yeah what Nacho said but only Authors please :)
yeehacmh, 7 years ago
I was recently qualified, by Instructables, for a different contest. Why, I don't know. I think my 'able and others were qualified but have NOTHING to do with the contest. Therefore, when people look at our 'ables, they vote very low on them, and that hurts our chances in the real contest for which they were intended.
If I try to remove my 'able, I am not allowed, because it is in a contest. Your methods leave something to be desired. In the future, people would be wise to stay away from contests completely if the 'able doesn't even remotely resemble the new contest.
You have to enter stuff into contests. You might have forgotten doing it, but you ought to be the only person who would add your stuff into a contest.
