DORA Frequently Asked Questions

The primary aim of DORA is to end the use of journal- or platform-based metrics, specifically JIFs, as a surrogate measure of the quality of individual research outputs, or of an author as represented through a body of their cited outputs.

Although a recruitment panel may, for example, expect evidence of publishing in leading journals for the discipline, the job advert should refrain from referring to impact factors, naming a journal title as a proxy for quality (e.g., “a track record of publication in journal X”), or using lists of ‘target journals’ based on perceived prestige.

DORA does not prohibit the use of citation metrics, the choice of publication outlet, or referring to a publication (e.g., in a promotion case or a grant application). Metrics can be used in the assessment of research quality; ideally a combination of multiple indicators (both qualitative and quantitative) is used, but there are occasions where qualified use of a single metric (e.g., FWCI – Field-Weighted Citation Impact) is appropriate, provided the associated risks are understood.

Where research metrics are used in publication, funding or staffing decisions (e.g., recruitment, reward, promotion, retention or redundancy), this should be clearly stated in the application/guidance documentation. Documentation should ideally clarify that the content of a research paper is more important than the identity of the outlet in which it was published.

The inappropriate use of quantitative metrics encourages a tendency to game the system, favouring short-term achievement (i.e., a rush to publish) over longer-term goals such as reproducibility, integrity and broader contribution to science and society. It hinders inclusion (including cross- and inter-disciplinary research), favours English-language publications, and encourages an (arguably exploitative) economic model in which publishers charge higher publication costs from finite university publication budgets. Want to know more about the methodological flaws of using JIFs and why they harm science:

Journal-based metrics, specifically JIFs, are identified as not for use (principle #1 of the DORA principles). Other metrics that need care in their use include other journal-based metrics, the h-index, university or journal rankings, author position in a publication’s author list, and the reputation of an author’s institution. Refer to the Metrics Toolkit and Guide to Metrics to learn more about specific metrics: what they do and do not measure, and the pros and cons of key metrics:

JIF (Journal Impact Factor) [Note – this is the only metric explicitly identified in the DORA principles]:

  • What it is: a quantitative measure for ranking, evaluating, categorising and comparing journals. The JIF is the ratio between citations received and recent citable items published, and thus measures the frequency with which the ‘average article’ in a journal has been cited over a given timeframe.
  • Appropriate uses: can be useful in comparing the relative influence of journals within a discipline, as measured by citations. Used appropriately and in conjunction with other metrics, the JIF can help librarians decide on specific journal subscriptions.
  • Inappropriate uses: JIFs should not be used as an indicator of the quality or impact of particular articles or authors, because the JIF is not statistically representative of the citations to individual articles and cannot summarise the quality of an author’s entire body of work. Additionally, because of the skewed distribution of citations (relatively few articles receive most citations, sometimes described as “the long tail”), the mean rather than the median value of citations per article does not offer a reliable prediction of the number of citations an individual article can expect to receive.
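To make the ratio and the mean-versus-median point concrete, here is a minimal sketch; all citation counts below are invented for illustration:

```python
# Hypothetical example: computing a 2-year Journal Impact Factor.
# Citations received in 2023 to each citable item the journal
# published in 2021-22. Note the skewed distribution: a few
# articles attract most of the citations ("the long tail").
citations_per_article = [120, 45, 9, 4, 3, 2, 1, 1, 0, 0]

# JIF(2023) = citations in 2023 to items from 2021-22,
#             divided by the number of citable items from 2021-22.
jif = sum(citations_per_article) / len(citations_per_article)
print(f"JIF (mean): {jif:.1f}")  # 18.5

# The median describes the 'average article' far better here.
ordered = sorted(citations_per_article)
n = len(ordered)
median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2
print(f"Median citations: {median}")  # 2.5
```

A JIF of 18.5 says little about any individual article: most articles in this invented journal received three citations or fewer.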

DORA applies to all scholarly disciplines. Responsibility is shared between the institution (through its administrative and managerial procedures, practices and research culture) and individual researchers (through their own research conduct; their participation and influence within their research group and wider School; and by informing or making management decisions, e.g., as a member of a School committee or in an ex officio role).

Universities are in constant change, which provides opportunity (and risk) for advancing the spirit of open, fair and inclusive research cultures. Staffing constraints hinder the capacity to initiate, engage with and sustain change in this space.

DORA principles are part of the internal landscape to foster a positive, inclusive and open research culture. Internally, this links to the Research Career Development Concordat; the University statement on Open and Engaging Research (published Sept ’23); Next Generation aspects of the University Research Plan (Jun ’22); the Research Code of Practice (Aug ’21); and the University statement on the Responsible Use of Research Metrics (Nov ’20). Individual contribution to the local research environment is recognised in promotion criteria (citizenship) and reward schemes like the University Research Excellence Awards. Incremental change is evidenced through staff surveys (e.g., CEDARS).

Nationally and beyond, DORA sits alongside the EC initiative CoARA (Coalition for Advancing Research Assessment), the Leiden Manifesto, the Metric Tide report and other similar policy reviews. Research England uses the Research Excellence Framework (REF) to incentivise change in the research system, and for REF 2028 this includes open research practices and inclusive research cultures and environments. As a measure of the extent to which DORA is embraced in the sector, ~98 UK universities/university colleges are DORA signatories (out of ~160). These signatories account for ~88% of all research volume assessed by the REF in English universities.

Many UK and international funders have signed DORA and therefore commit to moving away from journal-based metrics for assessing research quality within grant applications; they will either ignore JIFs or restrict them to qualified use in specific circumstances. See guidance from the relevant funder. Additionally, they have committed to principles #2 and #3 of the DORA declaration:

#2. Be explicit about the criteria used in evaluating the scientific productivity of grant applicants and clearly highlight, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published.

#3. For the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.

Prominent funders, publishers and societies that have either signed up to, or endorse, the DORA principles include UKRI (including all Research Councils in their own right), the EC, the European Research Council, the British Academy, the Royal Society, the Royal Academy of Engineering, Oxford University Press, the Wellcome Trust and the Gates Foundation.

View the list of current signatories

The intrinsic quality of a research output was assessed in the Research Excellence Framework (REF) 2021 in terms of originality, significance and rigour (para 191-204). JIFs were expressly not used in REF 2021: “No sub-panel will use journal impact factors or any hierarchy of journals in their assessment of outputs.” (para 250). Citation data were used by sub-panels A3-Allied Health and A4-Psychology, and made available for possible use by B7-Earth Sciences, B9-Physics and B11-Computing, as a potential indicator of academic ‘significance’ to inform peer-review judgements of output quality. REF 2028 will further incentivise open and inclusive research cultures and environments – details to be confirmed.

Email the Research REF team for further information.

Academic members of the DORA steering group act as champions across the Faculties (as of 1/6/23):

Faculty of Arts and Social Sciences

  • Darren Langdridge (Psychology)
  • Steve Pile (Geography)
  • Shonil Bhagwat (Social Sciences and Global Studies)

Faculty of Business and Law

  • Kim Barker (Law)

Institute of Educational Technology

  • (vacant)

Faculty of Science, Technology, Engineering and Mathematics

  • Martin Bootman (Life, Health and Chemical Sciences)
  • Sally Jordan (Physical Sciences)
  • Tracie Farrell/Petr Knoth (Knowledge Media Institute)

Faculty of Wellbeing, Education and Language Studies

  • Steph Doehler (Education, Childhood, Youth and Sport)

Additionally, Faculty-level Research support teams can advise or signpost (to e.g., the DORA academic chair and team – please email the RES Research and Enterprise team).

Refer to the intrinsic quality of the research output. It is also appropriate to consider a broad range of additional evidence, including qualitative indicators such as influence on policy and practice (e.g., testimonials from peers/users as to the significance). Use multiple and contextualised indicators where available, and use metrics that best reflect the nature of the research discipline in terms of publication, citation and external practices. Consider other types of research outputs and outcomes, evidence of impact, collaboration, supervision and/or career paths. Normalised metrics should be used where they are available and robust (although normalisation can compromise transparency, and it becomes less important when assessing within the same discipline).
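As a concrete illustration of field normalisation, the sketch below computes a field-weighted citation impact (FWCI): citations received divided by the expected citations for outputs of the same field, year and publication type. The numbers and baselines are invented for illustration:

```python
# Hypothetical sketch of a field-weighted citation impact (FWCI).
# An FWCI of 1.0 means the output is cited exactly as often as the
# average for outputs of the same field, year and type; values above
# 1.0 indicate above-average citation performance for that field.

def fwci(actual_citations: int, expected_citations: float) -> float:
    """Ratio of citations received to the field/year/type baseline."""
    return actual_citations / expected_citations

# A humanities article with 6 citations against a field baseline of 3
# outperforms a biomedical article with 40 citations against a
# baseline of 50 -- raw counts alone would suggest the opposite.
print(fwci(6, 3.0))    # 2.0
print(fwci(40, 50.0))  # 0.8
```

This is why normalised indicators matter when comparing across disciplines, and why they add less when comparisons stay within a single discipline.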

DORA should be implemented in a discipline-sensitive manner. The OU implementation of DORA is guided by the principles of the Leiden Manifesto and accounts for “variation [by discipline] in publication and citation practices” (principle #6). The readiness of different disciplines will also differ, reflecting local practices around JIFs, cultures, and the external context in which they work. This means that there will be sector-wide instances that necessitate the use of JIFs as a measure of research quality (e.g., providing teaching courses with a ranking for accreditation purposes in a way that is knowingly consistent with DORA). Your Faculty DORA representative can advise in the first instance (see the Who in my Faculty can help FAQ). We welcome your examples of discipline-specific encounters with JIFs and similar metrics, so that we can better understand their use in our research environment and tailor our advice – please email the RES Research and Enterprise team.

Avoid judging quality when assessing an individual (i.e., within selection criteria or recruitment decisions) based on journal/outlet title, ranking or JIF, author position in the author list, or the position of their university in rankings (aggregate quality does not equal individual quality). Instead, judge quality on a blend of informative quantitative and qualitative indicators (where feasible), for example:

  • Ask applicants to describe a sample of their strongest research outputs in 100 words, covering the originality and rigour of the research, its significance to the field, and the applicant’s contribution.
  • Request an outline of the reception their papers have had, including who has cited them and why, in addition to, or instead of, citation data.
  • Consider a wider range of evidence of contributions alongside the quality of outputs alone (e.g., data, software, and preprints; a demonstrable commitment to open practices; contributions to the research and innovation community and to society; and, especially for senior roles, mentoring and supporting the careers of others).
  • Use the job advert as an opportunity to describe your own group philosophy (e.g., approach to the early and wide sharing of data, code, or software; the use of preprints; adoption of reproducible methods; approach to determining authorship; and the criteria used to evaluate research quality).

The Open Reviewers Toolkit is designed to help guide the responsible evaluation of research manuscripts. It comprises three standalone elements:

  • Bias Reflection Guide: a tool to help a reviewer assess their own biases and assumptions while reviewing a research manuscript, guiding them through a non-judgmental and self-reflective process.
  • Reviewer Guide: a framework to guide a reviewer in the process of reading and evaluating a research manuscript, and writing a peer review report.
  • Review Assessment Rubric: a rubric to help the reviewer assess and improve their own review.

Alternative metrics (altmetrics) work alongside traditional (peer-review or citation-based) metrics to inform the impact and reach of research through things like policy documents, social media, news outlets, and blogs. Alongside traditional metrics, they provide a more complete picture and are quicker to generate. However, they are not a replacement for peer review or citation-based metrics and, like any metric, have potential for gaming. Usage of altmetrics is still relatively new and varies by discipline and between sectors – you are advised to speak with your Faculty champion (see the Who in my Faculty can help FAQ) to understand whether and how best to use them.

Further information is available on the Library Research Support website.

The Open University (OU) has committed to a set of principles for change. Systemic and continued use of JIFs may see the University expelled as a signatory. DORA principles codify a mindset that requires time to gain awareness, traction and advocacy. It is likely that numerous colleagues, particularly new staff, are unaware of DORA and the commitments it places upon individuals across Faculties and professional services, and collectively upon the research organisation. Individuals can be directed here or to their Faculty DORA representative (see the Who in my Faculty can help FAQ) for further information and advice.

Whilst any single instance of using JIFs is likely to have low-level implications, such instances have the potential to jeopardise staffing or funding decisions and can have a corrosive influence. Systemic use (e.g., referenced in policy documentation for rewarding staff or allocating funding) is likely an oversight by a policy owner unfamiliar with DORA and should be directed to the RES Research and Enterprise team.

Instances where an individual or group, aware that the OU is a DORA signatory and understanding the rationale, continues the wilful use of JIFs should be resolved informally and locally where possible, otherwise referred to the Faculty DORA rep and thence to the central DORA team (who maintain an open-door policy, and anonymity where requested). You may also want to refer to the University’s procedure for dealing with research misconduct, which references the University whistleblowing policy. Experience to date suggests, however, that practices inconsistent with the DORA obligations reflect a lack of awareness and fall significantly short of misconduct concerns.

Please contact us if you have any other questions.