Monday, June 08, 2009

Education enumerated, not evaluated

How does one measure education? By numbers, it would seem. As an engineer I ought to be most comfortable with this, but I am not. I do not think the effectiveness of education can be measured by numbers alone.

I started thinking along these lines when I read two articles, "Chennai No. 1 in CBSE exams" and "A jolly good show by Delhi in Class X exams", both in The Hindu of May 30, 2009. If you go through these pieces, I am sure you will be confounded by the plethora of numbers, many of them on a comparative scale. The first thought that should come to your mind is, "They are not talking about education." But that, ostensibly, is exactly what they think they are doing.

An increase of 1.76 percentage points in the Class X pass rate in 2009 vis-à-vis 2008; if you are a sucker for details, from 87.08% to 88.84%. So precise. Does this mean that students have become better over the course of the year by exactly that margin, through some combination of hard work, intelligence and sheer luck? Your guess is as good as mine. Then there is a drop of 29 candidates, from 4,503 to 4,474, in the metric of "above 90% scorers" in Delhi, out of a total that should run into the tens of thousands. This is a fit case for apoplexy, is it not?
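An aside for the numerically curious: the two reported figures fully determine the celebrated "increase", and a few lines of Python (only the 87.08 and 88.84 come from the article; everything else is illustration) show how little the number says by itself.

    # Pass percentages reported for CBSE Class X (figures from the article).
    pass_2008 = 87.08
    pass_2009 = 88.84

    # The celebrated "increase": a difference in percentage points.
    point_change = pass_2009 - pass_2008              # 1.76 points

    # The same change expressed relative to the 2008 figure.
    relative_change = point_change / pass_2008 * 100  # about 2.02%

    print(f"{point_change:.2f} points, i.e. {relative_change:.2f}% relative")
    # Neither figure says anything about WHY the rate moved: easier papers,
    # more generous evaluation, sheer luck, or genuinely better students.

Precision, in other words, is not the same thing as meaning.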

The topper scored 98.8%, whereas below him six students were bunched up at 98.6% (98.7% went unrepresented and unrepented). So, the topper is a genius and the others are duds. In overall pass percentage, Chennai is at the top, with Ajmer, Allahabad, Panchkula, Delhi and Guwahati trailing, in that order. Never mind that the comparison is between apples and oranges. In Chennai, CBSE is just one of several systems, the state board of education among them, and it is quite possible that an implicit selection process is at work, funnelling above-average achievers into CBSE. In Delhi, CBSE is the only game in town. Will this difference not affect the numbers in the aggregate? Even without knowing the details, I would venture to guess yes. Then does the comparative statement carry any meaning? No.
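To make the apples-and-oranges point concrete, here is a toy simulation; every number in it is invented purely for illustration, and no claim is made about any actual board's data. Two cities draw students from the identical distribution, but in one of them only the stronger half opts into the board being ranked.

    import random

    random.seed(1)

    # Illustrative model: a student "passes" if a notional ability score
    # exceeds a fixed cutoff. Both cities draw from the SAME distribution.
    def population(n):
        return [random.gauss(50, 15) for _ in range(n)]

    cutoff = 40
    city_a = population(100000)  # everyone writes this board's exam
    city_b = population(100000)  # only stronger students opt in

    # In city B, students below the median ability go to a different
    # (state) board; the ranked board sees only the rest.
    median_b = sorted(city_b)[len(city_b) // 2]
    city_b_board = [s for s in city_b if s >= median_b]

    def pass_rate(students):
        return 100 * sum(s > cutoff for s in students) / len(students)

    print(f"City A pass rate: {pass_rate(city_a):.1f}%")        # ~75%
    print(f"City B pass rate: {pass_rate(city_b_board):.1f}%")  # ~100%

    # City B "tops the table" although the two populations are identical:
    # the ranking measures selection, not education.

The design choice is deliberate: both populations are generated by the same code, so any gap in the printed pass rates can only come from who chose to sit the exam.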

If anything meaningful at all can be gleaned from such numbers, it will be in a trend, discerned over a longer period, say a decade. There indeed is one: the pass percentage of girls running higher than that of boys, seemingly established over the "past few years". Yes, that data deserves to be scrutinized in depth and the trend analyzed.
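A minimal sketch of what "analyzing the trend" might look like, with a decade of made-up pass percentages for girls and boys (the numbers are hypothetical; only the method, a least-squares slope per year, is the point):

    # Hypothetical pass percentages over a decade (invented for illustration).
    years = list(range(2000, 2010))
    girls = [82.1, 82.9, 83.4, 84.0, 84.8, 85.1, 85.9, 86.4, 87.0, 87.6]
    boys  = [80.5, 80.9, 81.6, 81.8, 82.5, 82.7, 83.3, 83.5, 84.1, 84.4]

    def slope(xs, ys):
        # Ordinary least-squares slope: change in pass rate per year.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    print(f"Girls: {slope(years, girls):+.2f} points/year")
    print(f"Boys:  {slope(years, boys):+.2f} points/year")
    # A gap that persists with a steady slope over ten years is worth
    # analyzing; a single year-on-year blip of 1.76 points is not.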

This year, the cohort of Class X students whose results we are discussing is a veritable pool of budding social scientists. 1,770 candidates, nearly thrice the number in the class of 2008, got a "perfect 100" in Social Science. But look at Science: only six achieved "centums" (100/100) this year, as against 287 last year. So, what do we conclude? Two years from now, all the social science courses in colleges will be deluged and the science courses will go begging. A dismal scenario.

Not so fast. You see, there are apologists, of sorts, for this "dismal" performance of students in Science (as though 99/100 is something to sneeze at). "One three mark question in the Science theory paper was out of syllabus," "there was ambiguity in some questions in the 20-mark multiple choice questions of the practicals," and "for the first time, the MCQ [do not ask me] paper for Class X was based on a list of experiments from Class X as well as Class IX syllabus."

So, students do not study the subject but only the syllabus. No, I am not asking for an open-ended syllabus, but peeking even minimally out of the syllabus cocoon can be a differentiator; that may partially explain the six versus the 287. And if ambiguities are acknowledged, how could anyone have got full marks? That is, we know for sure that the six students who scored hundred out of hundred resolved the ambiguity, or failed to notice it, in exactly the same way the paper-setters did. They came out on top in the crapshoot. Class IX is for Class IX, and Class X is something else; then Class X is for Class X and Class XI for Class XI, exclusively. Education is thus pigeonholed into years, with no continuity. An array of numbers substitutes for meaningful analysis. And we call this education!

Please hold me back. Otherwise this will become a never-ending lament.

Raghuram Ekambaram

2 comments:

Aditi said...

Raghu, the rant is 100% justified; this is the bitter truth about 'education'. The 'evaluation directions' given to the evaluators are so mechanically precise, with references to the NCERT textbooks, that just about anybody, irrespective of whether they teach the subject being evaluated, can evaluate an answer paper with the NCERT textbook open (open-book evaluation, if you like, hahahahah) and award marks with great precision.

For 'Higher Order Thinking Skills' questions, acronymed as HOTS, which are targeted to separate the wheat from the chaff among students, the evaluators were apparently directed to give full marks if the steps involved in solving the question were in 'the right direction', irrespective of whether the answer was correct or not.

mandakolathur said...

Aditi,

"mechanically precise" - that is such a wonderful phrase.

When I was a Teaching Assistant at the University of Kentucky, the professor too asked me to consider the "effort" of the students while evaluating. I did not know exactly what that was, but by trial and error (the students were the guinea pigs, I suppose!) I think I got it right.

Thanks for endorsing the rant. I will be louder next time on this topic, hahaha...!

Raghuram Ekambaram