The Ranking of Economics Journals by the Economic Society of Australia

Peter Abelson
Department of Economics, University of Sydney1, 2009

This paper describes how, in response to a government request, the Economic Society of Australia (ESA) prepared a ranking of economics journals for use in Australia. ESA commissioned the writer to manage the process, and it was agreed that the journal ranking should be based on a survey of all professors of economics and econometrics employed in Australian universities. Following the development of the rankings, Heads of Economics Departments were asked in a follow-up survey whether they viewed the process and the results as acceptable or whether amendments should be made. ESA concluded, as does this paper, that the process and the results were credible and generally accepted, and that the findings should not be altered on an ad hoc, piecemeal basis. ESA also concluded that any revision of these rankings should be based on a similar transparent process of consultation with the profession. An annex provides details of the rankings of just over half of the journals surveyed.

JEL: A10, A12, A13, A20

The Economic Society of Australia (ESA) has published on its website (www.ecosoc.org.au) its recommended ranking of 890 journals. There are, in fact, two lists with 602 and 288 journals respectively. Nearly all journals in the first list are clearly economic journals. Many journals in the second list may be described as interdisciplinary journals. In each list, each journal is assigned to one of four grades from highest to lowest: A*, A, B or C. In this paper, I describe why this ranking exercise was undertaken, how it was conducted, and some implications of the exercise. Results for the top half of the journals are supplied in the annex.

In mid-2007, the Australian Department of Education, Science and Training (DEST) invited the Academy of the Social Sciences in Australia (ASSA) to participate in the ranking of academic journals for the ongoing Research Quality Framework exercise. ASSA in turn invited ESA to provide a ranking of economics journals. DEST provided ESA with two lists of economics journals: an ‘A’ list of 540 journals and a ‘B’ list of 67 “other journals”. When duplicate journals were excluded, the total was 602 journals.

In September 2007, ESA commissioned the writer to manage the process whereby the Society would prepare a ranking of these journals by the end of the year (as required by DEST). Initially ESA envisaged that a committee of some half dozen persons would be established to undertake this study. This committee would examine the existing literature on the subject, including the work on journal rankings by a group of Deans of Australian Business Schools (known as Bardsnet), and draw up a recommended ranking.

It quickly became apparent to me that this approach would be a Sisyphean task. The literature on ranking economics journals is large. Many members of the economics profession were challenging the draft Bardsnet rankings, despite Bardsnet’s considerable efforts to measure journal impacts and citations. And there was neither the time nor the expertise available to undertake a stand-alone research exercise.

Moreover, subjective judgments would be unavoidable on several counts. First, the selection of committee members would itself involve implicit judgments about academic values and the treatment of mainstream versus minority economics topics. If not carefully selected, the committee could contain hard-to-reconcile positions. Second, as leading experts on the subject acknowledge (see Kalaitzidakis et al., 2003, or Kodrzycki and Yu, 2006), citation- and impact-based measures of journal importance also rest on value judgments. Citation and impact measures depend critically on the journals included in the counting, the weight given to each journal in the set,2 and the period over which citations are counted. An issue for Australia is the weight, if any, to be given to Australian publications. Several Australian economics journals (for example, the Australian Economic Review, Australian Economic Papers and Economic Papers) are not included in the Kalaitzidakis et al. (2003) database. In short, it was hard to see how any such ranking exercise would carry legitimacy or authority.

Accordingly, I proposed a different approach, based on ranking journals via a survey of economists. Specifically, I proposed, and the ESA Supervising Committee3 accepted, that the ESA ranking should be based on the rankings of economics and econometrics professors currently employed by Australian universities. The Committee agreed that, although this group of economists would not necessarily be a completely representative sample (it would, for example, be generationally biased), it would have sufficient authority to speak for the profession.4 In any case, it would have been hard to conduct a survey of all economists in Australian universities given the time and resources available.

It may also be noted that the definition of an economics (or econometrics) professor is an issue. Not all economics professors work in economics departments. Some work in a related field such as finance. Others work in a multi-disciplinary teaching faculty or research institute. To draw up a list of economics professors currently employed in Australian universities, I consulted leading economics professors in universities in each state and territory and modified my draft list in response to their comments.

The A and B lists of journals provided by DEST were sent to the 137 economics professors thus identified, with a request that they consult widely with their colleagues and rank the journals in four tiers (grades), A*, A, B and C, in accordance with the DEST criteria for each tier. The A* journals were defined as the top 5 per cent, the A journals as the next 15 per cent, the B journals as the next 30 per cent, and the C journals as the whole bottom half of the field. Apart from these quantified targets, the DEST criteria were quite broad, referring to ‘best’ or ‘important’ or ‘high quality’ publications, and open to the subjective views of respondents. Given the inevitable subjectivity of some rankings, it was important that there be an adequate and representative set of responses.

Given also the large number of journals to rank, I provided the professors with the draft Bardsnet rankings to assist them. Because that ranking was widely perceived to be contentious, I believed this information would assist, but not unduly influence, the responses. This was borne out by the responses, which did indeed vary from the Bardsnet rankings (see below). However, to minimize bias in responses, no further information was provided. Specifically, I did not distribute current rankings of, or data on, economics journals that had been developed by individual economics departments.5

Eighty-two replies (a 60% response rate) were received for the core List A journals, and sixty responses for the List B journals. Table 1 shows the university affiliations of the responding professors. The response was broadly representative of universities across the country. Two-thirds of the responses came from professors in Group of Eight universities, which employ most economics professors, and one-third from other universities. The professors responding included many of the most experienced and respected economists in Australia.6

Numerical values 1 to 4 were assigned to the four grades A*, A, B and C respectively. I then computed, for each journal, the mean score across responses along with its standard deviation. Strictly speaking, this implies that the ordinal grades provided by the respondents can be interpreted cardinally. The results for the top half of the List A and B journals (those ranked A*, A or B) are shown in the annex.
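To illustrate the scoring step, the following minimal Python sketch converts each respondent’s grade into its numerical value and summarises the results per journal. It is an illustrative reconstruction, not the code actually used for the survey, and the journal names and responses are invented.

```python
from statistics import mean, stdev

# Numerical values assigned to the four grades (lower = better).
GRADE_SCORE = {"A*": 1, "A": 2, "B": 3, "C": 4}

def summarise(grades):
    """Return the mean score and standard deviation for one journal."""
    scores = [GRADE_SCORE[g] for g in grades]
    sd = stdev(scores) if len(scores) > 1 else 0.0
    return mean(scores), sd

# Invented responses for two hypothetical journals.
responses = {
    "Journal X": ["A*", "A", "A*", "A", "B"],
    "Journal Y": ["B", "C", "B", "B", "A"],
}
for name, grades in responses.items():
    m, sd = summarise(grades)
    print(f"{name}: mean {m:.2f}, sd {sd:.2f}")
```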

Table 1 Number of professors responding to the List A survey by university

University                    Number of responses   Number of professors in survey
ANU                                    11                       22
CQU                                     1                        1
Curtin University                       5                        6
Flinders University                     1                        3
Griffith University                     4                        7
La Trobe University                     2                        4
Macquarie University                    1                        3
Monash University                      13                       15
QUT                                     2                        4
RMIT                                    1                        1
University of Adelaide                  3                        4
University of Canberra                  2                        2
University of Melbourne                 9                       12
University of New England               3                        5
University of Notre Dame                1                        1
University of Queensland                8                       10
University of Sydney                    2                        6
University of Tasmania                  1                        2
University of Wollongong                1                        1
UNSW                                    5                       11
UTS                                     3                        5
UWA                                     1                        4
UWS                                     2                        4
Total                                  82                      131

The recommended grades were based strictly on the average professorial rankings, subject to the DEST constraints for each tier (5% for A*s, 15% for As, 30% for Bs and 50% for Cs), with just one modification. The ESA Supervising Committee included three Berkeley Electronic (BE) journals in the ‘A’ grade, although this took the percentage just over the 15% benchmark. This reflected an executive judgment that the BE journals, which were new journals at that time, were A-class journals. It did not involve demoting any other journal. About one in three ESA rankings for A* and A journals varied from the draft Bardsnet rankings for these journals.
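As a rough sketch of how the quota rule operates (ignoring the one BE-journal modification just described), journals can be sorted by mean score and the grade boundaries drawn at the cumulative 5, 20 and 50 per cent marks. This is an illustrative Python reconstruction under those assumptions, not the actual procedure used:

```python
def assign_grades(mean_scores):
    """Assign A*/A/B/C by rank under the 5/15/30/50 per cent quotas.

    mean_scores: dict mapping journal name to mean survey score
    (lower = better). Ties are broken arbitrarily by sort order.
    """
    ranked = sorted(mean_scores, key=mean_scores.get)
    n = len(ranked)
    # Cumulative cut points: 5%, 5+15=20%, 20+30=50%; the rest are C.
    cuts = [round(n * 0.05), round(n * 0.20), round(n * 0.50)]
    grades = {}
    for i, journal in enumerate(ranked):
        if i < cuts[0]:
            grades[journal] = "A*"
        elif i < cuts[1]:
            grades[journal] = "A"
        elif i < cuts[2]:
            grades[journal] = "B"
        else:
            grades[journal] = "C"
    return grades
```

Applied to the 602 journals in Lists A and B, these quotas imply roughly 30 A*, 90 A, 181 B and 301 C journals.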

In addition, in response to our request in the survey, respondents identified another 288 journals that were not on DEST Lists A or B which they considered should be counted as relevant journals. To obtain rankings for these journals (which I describe as List C), a supplementary survey was conducted in January 2008 using the same approach as before. Twenty-nine responses were received from a cross-section of the profession, with a good representation of well-established professors. The number responding for List C was lower because this survey was undertaken in the holiday month of January and the newly elected Labor Government had indicated that the research ranking exercise was likely to be terminated. The journals were scored in the same way as before and graded according to the cut-off points that applied to the List A and B journals: an average score of 1.63 or lower was graded A*, an average score of 1.64 to 2.50 was graded A, and an average score of 2.51 to 3.72 was graded B.
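A minimal sketch of this cut-off rule, using the boundaries quoted above (the average scores fed in below are invented for illustration):

```python
def grade_from_cutoffs(avg):
    """Grade a List C journal from its average score (lower = better),
    using the cut-off points carried over from Lists A and B."""
    if avg <= 1.63:
        return "A*"
    if avg <= 2.50:
        return "A"
    if avg <= 3.72:
        return "B"
    return "C"

for avg in (1.40, 2.10, 3.00, 3.90):  # invented averages
    print(avg, grade_from_cutoffs(avg))
```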

In early 2008, the government transferred responsibility for determining the funding of research in Australian universities from DEST to the Australian Research Council (ARC). In mid-2008, ARC published its draft rankings of economics and other journals. These journal rankings differed significantly from the rankings that ESA had provided to DEST. However, the ARC did not publish any details on its method(s) for reaching its draft recommendations.

Table 2 provides a summary of the differences between the ESA and draft ARC rankings. In an immediate preliminary submission to ARC, ESA expressed special concern about the differences in Lists A and B, which are predominantly economics journals. In particular, the Society expressed concern about both the downgrading of 10 journals ranked by ESA and the inflation of the grades given to another 45 journals.

ESA also expressed concern about the large number of differences relating to List C journals. ESA recognized that many journals in List C are not principally economics journals and accepted that the core discipline of each journal should determine its ranking. The Society has to adopt this approach if it is to argue for the primacy of its valuations of economics journals. However, disciplinary demarcations are not always clear, and there appears to be a systematic tendency for special-interest disciplines to give relatively high rankings to their preferred journals.

Table 2 Differences between ESA and draft ARC ranking of economic journals

Journal List     Total   No. ranked differently   ARC rank lower than ESA   ARC rank higher than ESA
Lists A and B     602             59                        10                        49
List C            290            162                        46                       116

Demarcation issues are also evident from the journal classifications and apparent anomalies. In summary, by the author’s count:

● 56 List A journals and 10 List B journals were not shown in ARC’s economics lists.

● 19 List C journals could not be found in any ARC list.

● 24 journals in ARC’s economics lists are not on any ESA list.

In thinking about how to respond to ARC’s draft ranking of journals, ESA considered two strategies. One was to provide a journal-by-journal critique, as requested by the ARC. The other was to argue for the ESA rankings as a whole. ESA’s view was that the journal ranking process was professional and produced results carrying credibility and authority. Accordingly, ESA believed that the ARC ranking should be based on this survey and that it would be inappropriate for the Society to make individual journal adjustments. However, the ESA Supervising Committee recognized that it needed to obtain authority and support for this position from the profession.

In order to prepare a final submission to ARC, the writer sent a full comparison of the ESA and draft ARC journal rankings to all Heads of Economics Departments (HoDs) in Australia, seeking views on a strategy of affirming the initial survey results in full. Responses were received from 13 HoDs or their representatives. Nine responses were strongly supportive of ESA’s general position and initiatives. Three raised procedural points but did not indicate any core dissent. Only one indicated some dissent: in this case, the HoD described the objective of ranking some 600 journals, and of including so many journals as class ‘A’ journals, as “highly flawed”. However, this was a critique of the whole ranking exercise rather than of the ESA process.

Based on these and other responses, the ESA Supervising Committee concluded that the journal ranking process and results had integrity and were quite widely accepted. Of course, the Committee understood, as I do, that some rankings are arguable and that changes may be appropriate in due course. However, the Committee also believed that there was a strong case for accepting the basic integrity of the process and the results.

Following this consultation process, in September 2008, ESA recommended to ARC that:

1. The ESA ranking of all journals in Lists A and B should be adopted in its entirety, and the ESA ranking of primarily economics journals in List C should also be adopted.

2. Any variations to the ESA rankings should be based on transparent and organized principles and on consultation with the economics profession, not on ad hoc individual responses.

These recommendations were formally accepted at an ESA Executive meeting later that month.

Shortly thereafter, an ARC official told the writer in a telephone discussion that ARC accepted the professional process adopted by ESA and would adopt the rankings of economics journals provided by the Society. However, despite ESA requests that this advice be formalised in writing, no such written assurance had been provided at the time of writing this paper.

Annex A: Economic Society Evaluation of List A and B Journals (excluding the bottom 50 per cent, ranked C).

Annex B: Economic Society Evaluation of List C Journals (excluding the lowest 125 ranked journals).

References

Kalaitzidakis, P., Mamuneas, T.P. and Stengos, T., 2003, ‘Rankings of Academic Journals and Institutions in Economics’, Journal of the European Economic Association, 1(6), 1346-1366.

Kodrzycki, Y.K. and Yu, P., 2006, ‘New Approaches to Ranking Economics Journals’, Working Papers, Federal Reserve Bank of Boston. http://www.bos.frb.org/economic/wp/index.htm

Notes

1 Many people contributed to the journal ranking study described in this paper. The study was supervised, and all major steps and recommendations agreed, by an ESA Supervising Committee made up of the President of ESA (Professor Bruce Chapman), the President of the Australian Econometrics Society (Professor Trevor Breusch), and the Secretary of ESA (Professor Jeff Sheen). I am grateful for their support and for comments on a draft of this paper by Andrew Leigh and two referees.

2 Kalaitzidakis et al. (2003) weight the journals according to the number of citations from each journal, but this weighting is itself a value judgment, and it still leaves open the choice of journal set and time period.

3 See footnote 1.

4 A reviewer also pointed out that economics professors tend to be mainstream economists and that this could result in a lower ranking of “heterodox” economics journals than people working in that field might consider appropriate.

5 I believe that the Head of one Department of Economics circulated that department’s data on journals to all economics professors (using the list of professors that I had supplied in response to the Head’s request).

6 ESA has treated the names of respondents as confidential. The names (but not respondents’ detailed answers) have been provided on a confidential basis to the Australian Research Council.
