Thursday, April 2, 2009


Mark Twain said, "There are three kinds of lies: lies, damned lies, and statistics." In an interesting article on QSAR (the process of correlating chemical structure with chemical or biological activity), the author presents a graph showing that the US highway fatality rate decreases linearly as lemon imports from Mexico to the US increase. He goes on to say, "By not following through with careful, designed, hypothesis testing we have allowed scientific thinking to be co-opted by statistics and arbitrarily defined fitness functions. Statistics must serve science as a tool; statistics cannot replace scientific rationality, experimental design, and personal observation."
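The lemon-import example is easy to reproduce. Here is a minimal sketch (the yearly figures below are invented for illustration, not taken from the article): two series that merely trend in opposite directions will show a near-perfect linear correlation, with no causation anywhere in sight.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical yearly figures: lemon imports rise while the fatality
# rate falls. Both numbers are made up purely for illustration.
lemons = [230, 280, 350, 480, 530]           # tonnes imported
fatalities = [15.9, 15.7, 15.3, 14.9, 14.8]  # deaths per 100,000

r = pearson_r(lemons, fatalities)
print(round(r, 2))  # -0.99: a "strong" correlation, yet obviously not causal
```

Any two variables that drifted in opposite directions over the same years would produce an equally impressive fit, which is exactly the author's point about arbitrary fitness functions.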

While this is one extreme, where statistics are used to bolster weak arguments, in India we usually go to the other extreme and disparage statistics that do not support our perceptions. For example, IISc and the IITs are often compared with the top universities of the USA. As a country, India published 43 articles in Science over the ten-year period 2000-2009 (including several with a foreign collaborator). A single university, MIT, published 291 articles in the same period. The numbers are almost identical for Nature (45 and 295).

Apart from these top journals, which are common to all of science, I have also looked at the top three journals in each field of science and engineering. The numbers for chemical engineering have already been published, and the rest will appear in a separate paper.


pradeepkumar said...

Those who are very much interested in the statistics of publications in the so-called high-impact journals may want to look at the following articles by Peter Lawrence of Cambridge:

(1) Politics of Publications

(2) Lost in Publication

Anonymous said...

A useful parameter to consider in all these statistics is the amount of scientific research funding as a percentage of GDP. On that count, India, at 0.5%, is lower than China, Taiwan, Korea, Israel, Russia, and other developed countries. This partly explains the lower research output.
Ref: Scientific balance of power

Nilesh Sawarkar said...

Publications in "high-prestige" journals are not necessarily a barometer of scientific progress; in fact, they almost certainly are not. Let us remember that, as late as 1985, there are well-documented cases of sound papers that were rejected by well-regarded journals and went on to win Nobel prizes.

Another commonly used metric, citations, is also flawed, especially in engineering. A senior technical manager of a leading research lab (one that publishes regularly in these high-impact journals) told me that he was forced to retain chemists and cell biologists on his engineering research staff to drive up his organization's citation counts, which reflects why so many high-impact journals are in those fields. He went on to say that most of these "discoveries" were completely useless, and that when he scrutinizes outgoing papers, he sees an inverse relationship between a journal's impact factor and real usefulness (useful work is patented or kept secret). It's a very insightful observation from a guy who has several Nature and Science papers and appears on Thomson's most-cited list!

Given that most top Indian universities are in engineering, I would expect them to write in IEEE or ASME journals, or whatever it is that engineers read. So perhaps it is true that engineering schools in India are as good as those elsewhere; it is science education that may need an overhaul.

Anonymous said...

Nilesh Sir,

Your comments are very insightful. In fact, the best research and patents in India come from unknown engineering colleges. As you say, there is an inverse relationship between impact factor and usefulness. That's why IIT professors do not publish in 0.1-impact-factor journals: their research is not useful. We should encourage faculty to publish only in journals with an impact factor below 0.1.


Nilesh Sawarkar said...


What you fail to understand about impact factors is this: you get cited only if people use your research to write more papers. Typically this means scientists doing early-stage research get cited more, which means they did some pioneering and useful research.

Now if you write articles in a journal where most papers are used to make products, who cites you? Product literature rarely counts. So engineers, being closer to product realization, will have a lower impact factor. For an engineer, publishing an article in JAMA would be pointless, unless he/she hopes the article is read by potential creators of new products.

As for your comment about encouraging faculty to publish in certain journals: that's not what I said. I merely pointed out that if your research figures out how to build a new 100-billion semiconductor fabrication facility, it is unlikely that 35 other groups can afford to do the same, hence unlikely that you'll get cited 35 times. What I am saying is not rocket science; in fact, the very people who keep track of citations point out such examples as reasons why you should take these numbers with a pinch of salt.