
India ranks 72 in 'global educational survey'

Delhi, 20 Jan 2012: The Indian news media has been aflutter for the last few days, since the release of the results of the OECD (Organisation for Economic Co-operation and Development) Secretariat's Programme for International Student Assessment (PISA), which ranked India 72nd out of 73 countries. The PISA results are based on data collected from some 500,000 students who took two-hour tests. The tests are meant to conduct comparative analyses, across vastly different international contexts, of 15-year-old students' "reading, mathematical and scientific literacy." India "fared miserably," ending up just above Kyrgyzstan, the media has scornfully noted, and India's performance has been variously labeled "embarrassing," "shocking," and "disappointing." A disheartened Shaheen Mistry, CEO of the Teach for India program, stated: "I am glad that now there is data that lets people know how far we still have to go."
In my opinion, the uproar about the "data" stems from an unfortunate tendency we have as a people to let mere numbers, more often than not, tell the whole story about the Indian educational system. In the second most populous nation on the planet, home to the second biggest educational system in the world, the preferred way to bring clarity to a massive, murky educational landscape seems to be to let statistics paint the picture cleanly and efficiently.
In doing so, however, too many inequities and differences are subsumed. As a PhD candidate at an American university, having attended a South Delhi school, and having collected dissertation data in an impoverished village in a Delhi suburb, I know that a "Delhi area" statistic could not really capture the differences, and inequities, between the focal subjects of my study and the quality of my own earlier schooling there. The Indian context is so complex, so multi-dimensional, that trying to understand its depth merely through a tale told in numbers is not just silly but detrimental to our ability to work on fixing what's wrong.
"India" in the PISA test stood for Himachal Pradesh and Tamil Nadu, selected in their capacity as "showpieces for education and development." Two-hour tests cutting across vast socio-economic, linguistic, and ethnic divides tell us little about the context-specific literacy practices of those areas. PISA itself notes: "it assesses to what extent students near the end of compulsory education have acquired some of the knowledge and skills essential for full participation in society." But the kind of knowledge and skills required to function as a member of society depends on the context; it changes from one place (or society) to another. And to what extent can we obtain a holistic picture of students' socio-economic backgrounds from surveys that take students and principals a mere "20-30 minutes to complete"?
I looked into some sample questions from the 2009 test (you can see the sample here), and a couple of examples make my point clear. One question involves the warranty card received with a camera (named Rolly Fotonex 250 Zoom) and a tripod. Now, any statewide testing in India will include a very large group of students who have never bought a camera themselves (especially by 15 years of age), nor had their parents buy one for them. The very idea of a "warranty" may be encountered for the first time in the test itself, disadvantaging such students. In another example I noted, students were asked to characterize a particular story as a "folk tale," a "travel story," or a "historical account." These are subjective labels, resting on particular historical understandings of narrative categories and conventions. A third question asked about a library's schedule of hours; again, a vast majority of Indian students may never have encountered an institutionalized library of that sort.
What we end up with, then, are overbroad characterizations of how poorly Indian education is doing, based on large-scale data collection that doesn't tell us what's actually going on (and going wrong) in the classroom. This isn't to say that PISA is useless: just that the data, at the end of the day, tells us little about our own educational system.