The convergence of social media, mobility and analytics has turned the once incredible concept of crowdsourcing into an enabler of practical applications. Crowdsourced locational data beamed up from smartphones and car navigation systems powers Google's real-time traffic reports and drive-time estimates. Restaurant reviews that casual diners post on Yelp are supplanting the work of newspaper critics.
But crowdsourcing the performance review seems like a distinctly bad and impractical idea.
At the risk of being labeled a contrarian for contrariness' sake, let it be known that I have always been idealistic about information technology's potential to improve lives. But journalism demands skepticism, and many of the technologies once hyped as revolutionary have deserved their permanent place in the dustbin of history.
When I first heard about crowdsourcing performance reviews, I was skeptical. After reading a colleague's interview with Eric Mosley -- CEO of Globoforce, a vendor of social recognition tools, and author of the book The Crowdsourced Performance Review -- my doubts became deafening.
I have little faith in crowds. Perhaps they have their own wisdom, as other crowdsourcing popularizers have claimed, but whenever the subject of the crowd mentality comes up, I immediately think of the cautionary tales in a much older book, one from the mid-19th century.
Extraordinary Popular Delusions and the Madness of Crowds, by Scottish journalist Charles Mackay, came out in 1841 but remains a respected analysis of the ways that erroneous and harmful ideas can catch fire in society, and how even benign ones can produce unforeseen consequences. The Salem witch trials and financial bubbles such as the 17th-century Dutch tulip mania are among the book's many examples.
Crowdsourced performance reviews could be susceptible to the same ills while amplifying such human failings as jealousy, resentment, pettiness, territorialism and obsequiousness. What about that arrogant jerk with the obviously low emotional IQ whom everyone has worked with at one time or another (or multiple times) -- who is also a genius programmer or the star salesperson who out-earns everyone and makes sure they know it? Will that guy get favorable reviews from the crowd?
Sometimes the undifferentiated mass, called a crowd, turns into a mob. It's not always a benign flash mob that provides dance entertainment to lunchtime passersby or rescues small businesses by quadrupling sales in one well-coordinated afternoon of shopping. Mobs can gang up on people and do real damage.
This isn't a democracy
While its advocates will probably offer good reasons to disagree, crowdsourcing assumes the existence of democratic management processes, where the people who make the big decisions for an organization must consider the opinions of everyone else. (In cooperatives, they owe their very jobs to the continued approval of voting members.) Until most businesses adopt democratic governance models, I don't see crowdsourced performance reviews taking hold.
In contrast, companies that crowdsource customer opinions from social media have a clearly defined interest in taking those opinions seriously. They believe they can glean real value from gauging popular sentiments that are being expressed on social media, then apply what they learn directly to their products and services. But the relevance of consensus opinion to an employee's performance review is weak and unclear. Why should the input of nearly everyone a worker comes in contact with carry so much weight? Precious few companies are like HubSpot, which claims to have a culture of openness that makes social performance reviews possible.
Nor is a crowd a democracy. It's a crowd; no need to reach for Webster's to be absolutely clear what the word means. A crowd doesn't take a vote and operate consciously by majority rule. Rumors, notions, misinformation -- and yes, a type of madness -- often inform its actions. Why should such a group be trusted to judge anyone's job performance?
Online comments about Mosley's book seem to view crowdsourcing in a positive light, often as a check against favoritism and arbitrariness, which they see as inherent in the hierarchical, command-and-control structures of most companies. But I don't have much faith that a crowd of peers will be any less arbitrary, nor do I think most managers who handle performance evaluations are prone to singling out favorites. In my experience, they're more interested in getting the work done and earning strong reviews themselves for leading others toward common goals.
One of the main arguments of crowdsourcing advocates, including Mosley, is that the approach is superior because it provides "data" from social recognition. The book's introduction states that such data comes from thousands of "recognition moments" from co-workers and managers. But slapping statistics onto poll responses and other informal mechanisms, let alone unstructured text, does not make the resulting data more rigorous or scientific than the observations of a single manager. The traditional performance reviews that Mosley maligns are just as good, and probably better, for gathering quantifiable data, such as the number of products delivered or the dollar value of sales. The key is making employee goals specific and measurable.
Despite this raft of concerns, I can see some advantages. Mosley is correct that an aggregation of opinions, like Amazon's and Yelp's star ratings, can often provide a more accurate evaluation than any single reviewer's. And I haven't considered all the nuances of how this could work.
Crowdsourcing performance reviews could yet become the standard way employee assessments are done. Just don't look to the crowd to tell you whether they're right for your organization.