The latest issue of the Australian Financial Review's (AFR) BOSS magazine has a special report on business schools (b-schools) in Australia. Apparently, they publish one every year. This is the first time, however, that they've broken away from lumping Australia's b-schools into four broad categories and have ranked them individually instead.
Now rankings, especially b-school rankings, are contentious, at both the personal and professional levels. That's because every publication does them differently (each uses a slightly different ranking methodology) and thus comes up with different rankings, sometimes drastically different ones. On the one hand, that makes rankings in general a lot less relevant to, say, b-school applicants, especially when one school is ranked highly in one ranking and not so highly in another. How do you interpret that?
On the other hand, two good things come out of everyone producing different rankings. First, some schools score highly in all of them, which generally means they're good regardless of how you look at them (i.e. how you slice the numbers). Second, it tells you that rankings aren't all that useful after all; there isn't one best way to rank schools, and, ultimately, it makes you wonder how useful it is to quantify all this stuff anyway.
Of course, if you're a real b-school candidate, wanting to quantify everything probably comes naturally to you. Numbers are powerful. They can be placed in balance sheets and used in NPV calculations. You can talk about them, throw them around, and make charts and trends out of them. They're also shorter than words. And so you look at not only the rankings but also the methodology used to produce them. Basically, rankings do matter, regardless of their relevance to your actual, often highly personal opinion on the "quality" of a particular business school.
Incidentally, this often helps you decide which of the major financial publications (Financial Times, BusinessWeek, etc.) suits your style of thinking, analyzing, and writing. That ends up being quite useful in the long run.
Anyway, coming to the point of this article: six of the "top" b-schools in Australia (AGSM, MBS, Monash GSB, MGSM, U Queensland, and UWA GSM) didn't like the way BOSS was putting its rankings together. Under the auspices of the Australian Business Deans Council, they drafted a white paper presenting their collective opinion on how b-schools should be ranked. BOSS, however, stuck to its own ranking system, and that's what it printed in its September issue.
So, if you're thinking of doing your MBA in Australia, my advice is to (a) check out all the various rankings and ranking methodologies, (b) read the ABDC white paper, and (c) develop your own criteria for judging the schools you want to apply to. Just keep in mind, though, that rankings are rarely (if ever) everything.