Tuesday, 22 November 2011

Unintended Consequences in Academia

Found this via a posting on Facebook: a very interesting look at the unintended consequences of trying to measure the output of academics, and, as with any metric, the ways in which it ends up being misused. It reminds me of that classic software engineering example of paying bonuses for bugs fixed...

Perverse Incentives in Academia

In particular, I quite like these two:

Incentive: Researchers rewarded for an increased number of citations.
Intended effect: Researchers do work that is relevant and influential.
Actual effect: H-index obsession; list of references no longer included in the page limit at many conferences.

Incentive: Teachers rewarded for increased student test scores.
Intended effect: Improved teacher effectiveness.
Actual effect: Teaching to the tests; emphasis on short-term learning.

Of course, one issue here is that you are trying to measure some seriously intelligent people. Scott Berkun has a very good discussion on managing intelligent people; I'm not sure he has all the answers, but simply acknowledging the problem is a step towards properly understanding it.

Finally, a nod to that classic tome of unintended consequences (and the economics that drives them): Freakonomics by Levitt and Dubner.
