In India, B-school rankings have existed for over a decade and a half (MDRA was among the first to design and develop B-school rankings in 1998, publishing them in 1999). However, despite all the stated benefits of these rankings, participation has generally been low. Currently, there are more than 4,000 schools imparting business education in India, of which about 1,500 B-schools fulfil the criteria for participation in the BT-MDRA B-school ranking 2014. Of those not eligible, most do not meet the criterion of having at least three graduating batches. Among these 1,500-odd B-schools, despite a longer time-frame and efforts to spread awareness of the ongoing rankings through different media throughout the process, about 250 B-schools participated, roughly one in six. This figure is quite impressive compared with other participation-based rankings in India or elsewhere: in the last three years, participation in rankings by various publications and research organisations has hovered between 100 and 200.
This leaves much to ponder: why do many B-schools not want to participate in such rankings despite the best of efforts and a high level of transparency? While deeper analysis and deliberations with various stakeholders are required, a preliminary list of reasons based on our recent experience is good enough to initiate a discussion on the subject:
LACK OF QUALITY FOCUS
It is no longer a secret that B-schools are mushrooming without any control on quality, and many are shutting shop. There are plenty of business schools that are in the business of either awarding diplomas or merely acting as placement agencies, without any serious approach to imparting quality education or doing research. When they see the entire methodology of our rankings and the detailed questionnaire, combined with the need for documentary evidence, they feel ill-equipped even to participate in our rankings, let alone face the evaluation.
FEAR OF AUDIT
While the BT-MDRA ranking ensures the veracity of data through a physical audit of campuses and programmes, this requirement also keeps many colleges away. Some colleges have been reluctant to submit to our audit system.
LARGE VOLUME OF DATA
To ensure a comprehensive evaluation and comparison of B-schools on all key parameters, the BT-MDRA methodology keeps adding relevant attributes year on year. This makes the participation form quite demanding in terms of putting together all the information along with documentary evidence; several departments have to work in sync to complete it. This, many colleges admit, becomes a big barrier. Some colleges simply decline to participate, saying too much information has to be collated and they do not have the time for it.
SPECIALISED RANKING FOR SECTORAL PROGRAMMES
Certain business management programmes (such as those offered by institutes like MICA, TISS, etc.) are sector-specific. Some of these institutes demand a specialised ranking for such programmes and hence do not participate in more generalised B-school rankings.
FEAR OF NEGATIVE PUBLICITY
While B-schools generally agree that rankings are an excellent way of marketing, brand building and raising awareness, some B-schools look at them differently. They fear that a low rank would negatively affect their brand and hence choose to stay away from the rankings. But this fear is far from the truth.
RANKING VS RATING
Some institutions simply do not want to be compared with their peers, especially those in the same region or city. They prefer to be placed in the same bucket as their peer group instead of being ranked below or above them, which is possible only in a rating system (such as A++, A+, A, B++, B+, B, etc.) that avoids a direct comparison.
DIFFERENT RANKINGS, DIFFERENT METHODOLOGY
Even good B-schools have their own merits and demerits. Different ranking agencies use different parameters and evaluation methods to compare B-schools. If B-schools feel that a particular methodology suits them because they are strong on its parameters of evaluation, they choose to participate only in such rankings.
In addition to these strategic decisions on participation, a number of other factors, mostly tactical in nature, play their own roles. Some of these are:
TIMING OF RANKINGS
As an example, the participation dates for the BT-MDRA ranking 2014 fell between April and July, when colleges are busy with admissions, internships, placements or summer vacations. The physical absence of decision-makers at this time also hampers participation.
Many tier-II colleges still have placements ongoing during this period. As placement is a key parameter of evaluation, they do not want to participate on the basis of incomplete placement data.
Other reasons include short-term factors such as the relocation of colleges, or a change in management or in the officer who works on the participation form. Another reason might be disagreement with the ranking outcomes or with the methodology itself.
Whatever the reasons, there has to be a process through which integrity, transparency and robustness of methodology come together to produce a credible ranking. It would help if B-schools engaged deeply and consistently with such ranking processes, which is possible by involving them at the time of designing and developing the rankings and then discussing the published findings with them to take feedback.
If sincere efforts are put in the right direction, the desired outcome will not be far away.