Eureka: Quantifying research
by Akli Hadid
2017-01-21 11:29:08

In the “golden age” of research, tourists and other curious intellectuals used to visit Paris, London and New York City just to attend lectures by the big names of research. Dewey, Skinner, Sartre, Hayek, Foucault and many other big names were fixtures of the lecture halls. Back then those halls were smoke-filled and male-dominated, and the big names would read from their lecture notes, often in a dull and boring way, while students and other aficionados took careful notes, sometimes publishing them.

In the 1980s, a new research tool made its appearance: the computer. The People’s Republic of China was the first country to decide to build a full research database, and its algorithm was the one later used by Google among others: rank researchers by the number of citations they receive from other researchers in books and papers. Of course, the most cited thinkers were Confucius, Lao Tzu, Chairman Mao and Sun Tzu, among other classical Chinese thinkers, and then came, wait for it, the historians.
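To make the idea concrete, here is a minimal sketch in Python of the kind of citation-count ranking described above. The papers and names are purely illustrative, not taken from any real database, and the actual systems the article refers to are of course far more elaborate.

from collections import Counter

# Each paper lists the researchers it cites; the ranking is simply
# the total number of citations each researcher receives.
papers = [
    {"author": "A", "cites": ["Confucius", "Lao Tzu"]},
    {"author": "B", "cites": ["Confucius", "Sun Tzu"]},
    {"author": "C", "cites": ["Confucius", "A"]},
]

citation_counts = Counter()
for paper in papers:
    citation_counts.update(paper["cites"])

# Researchers ranked from most cited to least cited.
for researcher, count in citation_counts.most_common():
    print(researcher, count)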

So in China and South Korea, with historians cited more and ranked higher while chemists and physicists sat far lower in the rankings, many geneticists, physicists and medical researchers, among other scientists, chose history over science, since historical papers would earn them more citations.

Another consequence was what I call the rat on the cow’s head effect: in Chinese mythology there was a race among the animals, and the rat won although technically the cow was faster, simply by riding on the cow’s back and getting off very near the finish line. That is, researchers mostly did their research by checking who had the highest citation ranking and writing papers that would either affirm or contradict the researchers at the top of the rankings.

Citation rankings then became popular in North America, and this meant that conferences and other lectures no longer targeted the public but other researchers, because the more researchers cite you, the better. Researchers also started writing papers aimed at other researchers rather than at the public at large, so they could collect more citations.

What is the consequence of quantifying research? Since history and political science are naturally the fields where much of the research takes place, many researchers have shifted from science to politics and history. Another consequence is the “click bait” effect of research: researchers doing more provocative work precisely so that it will be contradicted, because the more contradictions there are, the more citations there will be.

I used to be a lecturer who did not chase citations, and the longer I stayed in the field, the larger my audience grew. But the financial side of it all is a little frustrating. These days no university will hire me because I don’t have enough “citations” (not that I care whether anyone cites my work), and conferences, knowing that more citations are better, charge egregious fees in exchange for large crowds of researchers who will cite or contradict your research, fees far too expensive for the public to attend. No more enthusiastic students taking notes, that is.

I’m not saying there is no golden age of research. The more I look, the more fascinating books I find that are worth my time. But I do find that researchers have become so obsessed with citing and contradicting each other that they have lost sight of their public. Researchers competing in citation rankings also means that heavyweights are often thrown out of the ring, while lightweights compete over click bait issues such as “Palestine”, “World War II” and “Imperialism”, which are sure to land you at the top of the rankings. I hope this will change soon enough.


     

Comments (2)

Emanuel Paparella, 2017-01-21 11:59:26
The necessary other side of the coin:

In a positivist world where everything is quantified and measured, it will come about that quantity rather than quality becomes the supreme value, and that what the Greeks called “aretè” loses out to mediocrity and inanity.


Emanuel Paparella, 2017-01-21 16:29:31
A relevant follow-up to the previous comment:

82% of articles published in the humanities are not cited even once. Of the articles that are cited, only 20% have actually been read. Half of academic papers are never read by anyone other than their authors, peer reviewers, and journal editors.

But of course we keep on counting and measuring and categorizing, as good positivists ought.

