
ACADEMIC REPUTATION



METHODOLOGY



The Academic Reputation Index is the centrepiece of the QS World University
Rankings®, carrying a weighting of 40%. It is an approach to international
university evaluation that QS pioneered in 2004, and it is the component that
attracts the greatest interest and scrutiny. Together with the Employer
Reputation Index, it is the aspect that sets this ranking most clearly apart
from any other.




BACKGROUND



QS World University Rankings® are based partly on hard data and partly on
factors drawn from two large global surveys – one of academics and another of
employers. These surveys are a key characteristic of the QS ranking approach
and offer some distinct benefits.

QS has rejected many proposed criteria (e.g. financial metrics such as research
income) because they cannot be independently validated or are subject to
exchange-rate and business-cycle fluctuations. Instead, our Advisory Board
favours maintaining a strong emphasis on peer review, for several important
reasons:

Geographical/Cultural Diversity

Many evaluations appear to be based on a US model of what defines excellence in
a university, so their results are often dominated by large, comprehensive,
English-speaking universities with medical schools. A widely distributed pool
of academic experts helps identify excellence in areas unmapped by other
metrics; as a result, institutions from 32 countries appear in the top 200 of
QS's ranking.

Unbiased approach to different subjects

Without peer review, institutions with key strengths in Arts and Social Sciences
might be penalised in the rankings simply because they don’t publish much
research.

Contemporary Relevance

Founded as recently as 1991, HKUST came top in the QS Asian University Rankings
in 2011. Nanyang Technological University was also formed in 1991, through a
merger, and is the top-rated university in Asia within the classification of
large, multidisciplinary, research-intensive institutions without a medical
school.

Reduced Language Bias

Respondents to our academic survey identify research excellence both in English
and in their native languages, which avoids a bias towards internationally
recognised journals published in English.

Statistical Validity

Over 62,000 academic respondents contributed to our 2013 academic results, four
times more than in 2010. Independent academic reviews have confirmed these
results to be more than 99% reliable.

Resistant to Data Manipulation

The peer-review survey results are collected independently and in such numbers
that they are almost impossible to manipulate and very difficult for
institutions to 'game'.




SOURCE OF RESPONDENTS



The results are based on responses to a survey distributed worldwide to
academics drawn from a number of different sources:

Previous Respondents

QS has been conducting this work since 2004, and all previous respondents to
our survey are invited to respond again to provide an updated viewpoint on the
quality of universities in their broad field. In 2014, 1,724 previous
respondents returned to revise their responses.

World Scientific

www.worldscientific.com
An academic publishing company headquartered in Singapore, World Scientific
publishes about 500 titles a year as well as 120 journals in a variety of
fields. World Scientific holds a subscription database of well over 300,000
contacts worldwide, from which, until 2010, QS drew 180,000 active records. The
effectiveness of this channel dropped off over the years, and in 2011 QS chose
to redirect and draw more records from the Mardev lists. Responses from this
channel will remain in the sample for at least two years, and World Scientific
may be drawn upon in the future to fill any specific shortfalls.

Mardev-DM2

The data division of Reed Business Information, Mardev-DM2 is one of the world’s
leading providers of business information and services. Mardev-DM2 controls
access to IBIS (International Book Information Service), a database with over
1.2 million academic and library contacts. This channel has grown increasingly
effective over the years and in 2014 QS drew 200,000 records.

Academic Signup

In 2010, QS initiated an Academic Signup process to enable the thousands of
academics we meet each year to actively signal their interest in participating.
Volunteers are screened to ensure institutions are not using the signup process
to unduly influence the position of their own or rival institutions. Over
25,000 academics have signed up since the process was launched in
February 2010.

Institution Supplied Lists

Since 2007, institutions have been invited to submit lists of employers for us
to invite to participate in the Employer Survey. In 2010, that invitation was
extended to lists of academics as well. Since academics are not able to respond
in favour of their own institution, the risk of bias is minimal; nonetheless,
submissions are screened, and sampling is applied where any institution submits
more than 400 records. In 2014, nearly 400 institutions supplied lists,
contributing over 190,000 additional academic contacts.

Wherever sampling is required, respondents are selected randomly, with a focus
on delivering a sample balanced by discipline and geography. Naturally, all
databases carry a certain amount of noise, and email invitations do get passed
on. Responses are screened to remove inappropriate responses prior to analysis.
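
As an illustration only (the schema and helper names below are assumptions, not
QS's code), a minimal Python sketch of the cap-and-sample rule described above:

```python
import random
from collections import defaultdict

CAP_PER_INSTITUTION = 400  # sampling threshold for institution-supplied lists (from the text)

def sample_contacts(contacts, cap=CAP_PER_INSTITUTION, seed=None):
    """Cap the number of records any single submitting institution contributes,
    sampling randomly within the cap. `contacts` is a list of dicts with a
    'submitted_by' key (hypothetical schema); balancing the overall sample by
    discipline and geography is omitted for brevity."""
    rng = random.Random(seed)
    by_institution = defaultdict(list)
    for contact in contacts:
        by_institution[contact["submitted_by"]].append(contact)

    sampled = []
    for records in by_institution.values():
        sampled.extend(rng.sample(records, cap) if len(records) > cap else records)
    return sampled
```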




THE SURVEY



The survey has evolved since 2004 but largely follows the same general
principles. Respondents are not asked to comment on the sciences if their
expertise is in the arts, nor on Europe if their knowledge is centred on Asia.
The survey asks each respondent to specify their knowledge at the outset and
then adapts based on their responses: the interactive list from which
respondents are invited to select features only entries from their own region.

The survey is broken into the following sections:

 1. Personal Details
 2. Knowledge Specification
 3. Top Domestic Institutions
 4. Top International Institutions
 5. Additional Information

Personal Details

Name, Institution, Job Title & Classification, Department, Years in Academia.

Knowledge Specification

Country – respondents are requested to indicate the country with which they are
most familiar, rather than the country where they are based. This enables new
international faculty members to comment on their sphere of knowledge rather
than speculate on an area they may as yet know little about.

Region – regional knowledge responses are grouped into three supersets that
define the list of institutions from which the respondent can select: the
Americas; Asia, Australia & New Zealand; and Europe, Middle East & Africa.

Faculty Area – respondents are asked to select one or more faculty areas in
which they consider their expertise to lie. These are Arts & Humanities;
Engineering & Technology; Life Sciences & Medicine; Natural Sciences; and Social
Sciences. Sections 3 and 4 below are repeated for each faculty area selected.

Field – respondents are asked to select up to two specific fields that best
define their academic expertise.

Top Domestic Institutions

Respondents are asked to identify up to ten domestic institutions they consider
best for research in each of the faculty areas selected in Section 2. Their own
institution, if it would otherwise be included, is excluded from the presented
list.

Top International Institutions

Respondents are asked to identify up to thirty international institutions they
consider best for research in each of the faculty areas selected in Section 2.
Their own institution, if it would otherwise be included, is excluded from the
presented list, which consists solely of institutions from the region(s) with
which they expressed familiarity in Section 2.
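
A minimal sketch (hypothetical data model, not QS's implementation) of how the
presented list could be built from the two rules above, restricting candidates
to the respondent's declared region supersets and excluding their own
institution:

```python
def candidate_list(institutions, respondent):
    """Build the selection list for one respondent.
    `institutions`: iterable of (name, region_superset) pairs, where the
    superset is one of the three groupings from the Knowledge Specification.
    `respondent`: dict with 'institution' and 'regions' keys (assumed schema)."""
    return [
        name
        for name, region in institutions
        if region in respondent["regions"]      # only regions they claim to know
        and name != respondent["institution"]   # own institution always excluded
    ]

# Example usage with the three region supersets named above:
unis = [("MIT", "Americas"), ("Harvard", "Americas"),
        ("NTU", "Asia, Australia & New Zealand")]
me = {"institution": "MIT", "regions": {"Americas"}}
print(candidate_list(unis, me))  # ['Harvard'] – own institution filtered out
```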

Additional Information

We use this section to gather additional information from respondents, such as
feedback on previous publications and the importance of various measures in
evaluating universities.




RESPONSE PROCESSING



The work is not done once the survey is designed and delivered. Once the
responses are received a number of steps are taken to ensure the validity of the
sample.

Five Year Aggregation

To boost the size and stability of the sample, QS combines responses from the
last five years. Where any respondent has responded more than once in the
five-year period, previous responses are discarded in favour of the latest
numbers.

The survey samples contributing to this work have grown substantially over the
lifetime of the project, resulting in inherently more robust reputation
measures. The window for both reputation measures has been extended to five
years, from the previous three, with responses from the two earliest years
carrying relative weights of 25% and 50% respectively.
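
A sketch of the aggregation rule under an assumed record layout (dicts with
'respondent_id' and 'year' keys; not QS's actual pipeline):

```python
def aggregate_responses(responses, current_year):
    """Keep each respondent's most recent response within the five-year
    window, then attach the relative year weights described above."""
    window = range(current_year - 4, current_year + 1)
    in_window = [r for r in responses if r["year"] in window]

    # Later responses overwrite earlier ones from the same respondent.
    latest = {}
    for r in sorted(in_window, key=lambda r: r["year"]):
        latest[r["respondent_id"]] = r

    # Earliest year weighted 25%, second-earliest 50%, latest three in full.
    year_weight = {current_year - 4: 0.25, current_year - 3: 0.50}
    return [dict(r, weight=year_weight.get(r["year"], 1.0)) for r in latest.values()]
```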

Junk Filtering

Any online survey will receive a volume of test or speculative responses. QS
runs an extensive filtering process to identify and discard responses of this
nature.

Anomaly Testing

It is well documented, on the basis of other high-profile surveys in higher
education, that universities are not above attempting to get respondents to
answer in a certain fashion. QS runs a number of processes to screen for any
manipulation of survey responses. If evidence is found to suggest an
institution has attempted to overtly influence its performance, any responses
acquired through sources 4 and 5 (above) are discarded.
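
The discard rule itself is simple to express; the detection heuristics are not
published, so both the flagging and the field names below are hypothetical:

```python
def apply_anomaly_rule(responses, flagged_institutions):
    """Drop responses recruited via sources 4 and 5 (Academic Signup and
    Institution Supplied Lists) wherever the recruiting institution has been
    flagged for attempted manipulation. Schema is assumed."""
    screened_sources = {"academic_signup", "institution_supplied"}
    return [
        r for r in responses
        if not (r["source"] in screened_sources
                and r.get("source_institution") in flagged_institutions)
    ]
```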


RESULTS ANALYSIS



Once the responses have all been processed, the fun really begins. For each of
our five faculty areas, the analysis works as follows (a simplified code sketch
follows the list):

 1. Devise weightings based on the regions with which respondents consider
    themselves familiar – weightings are (now) based only on completed responses
    for the given question. This is slightly complicated by the fact that
    respondents are able to relate to more than one region.
 2. Derive a weighted count of international respondents in favour of each
    institution, ensuring any self-references are excluded.
 3. Derive a count of domestic respondents in favour of each institution,
    adjusted against the number of institutions from that country with a certain
    level of international nominations and the total response from that country,
    again ensuring any self-references are excluded.
 4. Apply a straight scaling to each of these to achieve a score out of 100.
 5. Combine the two scores with a weighting of 85% international, 15% domestic –
    these numbers were based on analysis of responses received before we
    separated the domestic and international responses three years ago, but the
    low weighting for domestic also reflects the fact that this is a world
    university ranking. We use 50:50 for the employer survey.
 6. Take the square root of the result – we do this to draw in the outliers, but
    to a lesser degree than other methods might achieve; our intention is that
    excellence in one of our five areas should have an influence, but not too
    much influence.
 7. Scale the rooted score to present a score out of 100 for the given faculty
    area.
 8. Combine the five totals with equal weighting to produce a final score, which
    is then standardised relative to the sample of institutions used in any
    given context.
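
To make the arithmetic of steps 4 to 8 concrete, here is a simplified Python
sketch. It is illustrative only: steps 1 to 3 are reduced to precomputed
weighted nomination counts, which the real method derives from regional
familiarity and domestic adjustments:

```python
import math

INTL_WEIGHT, DOM_WEIGHT = 0.85, 0.15  # step 5 weighting

def scale_to_100(scores):
    """Steps 4 and 7: straight scaling so the best institution scores 100."""
    top = max(scores.values()) or 1.0  # guard against an all-zero column
    return {inst: 100.0 * s / top for inst, s in scores.items()}

def faculty_area_score(intl_counts, dom_counts):
    """Steps 4-7 for one faculty area, given weighted nomination counts."""
    intl, dom = scale_to_100(intl_counts), scale_to_100(dom_counts)
    combined = {
        inst: INTL_WEIGHT * intl.get(inst, 0.0) + DOM_WEIGHT * dom.get(inst, 0.0)
        for inst in set(intl) | set(dom)
    }
    # Step 6: the square root draws in outliers more gently than other transforms.
    rooted = {inst: math.sqrt(s) for inst, s in combined.items()}
    return scale_to_100(rooted)  # step 7

def academic_reputation(per_area_scores):
    """Step 8: combine the five faculty-area scores with equal weighting."""
    institutions = set().union(*per_area_scores)
    return {
        inst: sum(area.get(inst, 0.0) for area in per_area_scores) / len(per_area_scores)
        for inst in institutions
    }
```

The square root in step 6 compresses the top of the distribution, so a single
dominant faculty area lifts an institution's overall score without overwhelming
the other four.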
