English Language Tests for Higher Education Admissions Purposes

Is this test fit for purpose? Evaluating the impact of English language testing
on international student transitions and outcomes


As language assessment continues to adapt to a context of rapid change, a major challenge for Higher Education Institutions (HEIs) is how to ensure that high-stakes admissions decisions are informed by empirical data. In recent years a broader range of English tests has been accepted by HEIs as proof of language proficiency. Whilst this provides enhanced choice and accessibility, concerns have been raised about the proficiency levels of international students and the ensuing impact on their ability to engage and thrive academically (Wood, 2023). Low language proficiency and a lack of English support are often cited as reasons why students may have an inferior academic experience (Russell et al., 2022).

International students earn the opportunity to study at universities in other nations through their hard work at school; they are able to pursue their chosen discipline at the university of their choice alongside international academics, and to develop social and academic skills as a result of the experience (Jindal-Snape & Rienties, 2016; Moores & Popadiuk, 2011). However, they have also long been recognised as a vulnerable student population (Sherry et al., 2010).

Studies have long highlighted problems international students experience, “such as adapting to a new culture, English language problems, financial problems and lack of understanding from the broader University community” (Sherry et al., 2010; Udah & Francis, 2022). This is often referred to as the ‘Triple Shock’ of student internationalisation (Ryan, 2013). More recent research demonstrates that this vulnerability was, and remains, exacerbated by the pandemic (Du, 2022). International students continue to be “one of the most vulnerable populations” (ibid.), suffering both physical and psychological harms, including mental health impacts and depressive disorders (Tancredi et al., 2022). This is reflected in poorer outcomes for students, who are more susceptible to marginalisation and exploitation. However, the potentially negative impact of the tests themselves should not be overlooked.

To investigate the impact of expanding the range of admissions tests, in conjunction with the interwoven factors noted above, a cross-institutional research group was set up, bringing together differing backgrounds, motivations, and perspectives on higher education. The group comprises the University of Dundee, the British Council, the University of Cambridge and Cambridge University Press & Assessment. It is currently engaged in a large-scale mixed-methods study investigating the prevalence of different tests at institutions, the perceptions of university personnel towards the various language tests used, transparency around decision-making for test acceptance, and the perceptions and experiences of the students themselves. Combining qualitative and quantitative data, the study involves: i) desk-based research on institutions’ admissions tests, their required scores and how the range of accepted tests has changed in recent years (n=50 institutions); ii) survey data (n=300) and interviews (n=20) with key groups of university personnel (faculty, recruitment, admissions, EAP). Focus group discussions (n=20) with international students from a range of countries, year groups and disciplines are now under way.

The project has multiple streams, including the comparability of different testing regimes, the link between international student entrance criteria and the student experience, and mapping academics’ level of understanding of teaching practice relative to language ability.

To date, the study has demonstrated that the weaker a student’s language ability, the poorer their student experience and life outcomes tend to be. A 2023 Russell Group study of 8,800 UK students found that almost a third were living off less than £50 a month. Those most affected were international students, particularly female international students, who were most at risk of precarity (Koebel, 2023). Students with poorer language skills are also less likely to be able to access work opportunities and healthcare support, while a recent NUS UK survey reports that 90% of students say poor mental health is impacting their ability to study (NUS, 2022). The Russell Group Students’ Unions survey reports that, as a result, 21% of students have considered deferring their studies and 18% have considered dropping out (Russell Group Students’ Unions, 2023).

The cost to universities of accepting students with insecure English language skills is also considerable. Academic misconduct and poor student performance have been linked to universities accepting lower-quality language testing regimes in the drive to recruit increasing numbers of international students (Wood, 2023). In response, universities have been forced to:

  • Increase bursaries (one UK Russell Group university has had to increase bursary support by £1.5 million this year alone)
  • Increase staffing budgets for additional Student Funding Advisors and Students Union Advisors
  • Increase staffing support for Mental Health Counselling and Wellbeing Support

Some UK universities are reporting a three-fold increase in academic misconduct cases associated with poor language skills and students’ inability to transition into the UK HE environment. At the same time, academic misconduct and poor academic performance are a reputational risk to universities, some of which are reporting first-attempt fail rates of over 50%.


  • Baker, B. (2016). Language Assessment Literacy as Professional Competence: The Case of Canadian Admissions Decision Makers. Canadian Journal of Applied Linguistics , 19 (1), 63–83. Retrieved from https://journals.lib.unb.ca/index.php/CJAL/article/view/23033
  • Bruce, E., & Hamp-Lyons, L. (2015). Opposing tensions of local and international standards for EAP writing programmes: Who are we assessing for? Journal of English for Academic Purposes, 18, 64–77.
  • Clark, T., & Yu, G. (2022). Test preparation pedagogy for international study: Relating teacher cognition, instructional models and academic writing skills. Language Teaching Research, 0(0). https://doi.org/10.1177/13621688211072381
  • Du L-J (2022) The crisis of international education students and responsive service in and after COVID-19 pandemic. Front. Psychol. 13:1053512. doi: 10.3389/fpsyg.2022.1053512
  • Koebel, C., (2023). What are the real effects of the cost-of-living crisis on students?  17 March 2023.  UCL News. https://www.ucl.ac.uk/news/2023/mar/what-are-real-effects-cost-living-crisis-students accessed 15 January 2024
  • Lam, D. M. K., Green, A., Murray, N., & Gayton, A. (2021). How are IELTS scores set and used for university admissions selection: A cross-institutional case study. IELTS Research Reports Online Series, No. 3. IELTS Partners: British Council, Cambridge Assessment English and IDP: IELTS Australia.
  • Moores, L., & Popadiuk, N. (2011). Positive aspects of international student transitions: A qualitative inquiry. Journal of College Student Development, 52(3), 291–306. https://doi.org/10.1353/csd.2011.0040
  • NUS, (2022). Cost of living rise sees 96% of students cutting back. NUS news, June 2022. https://www.nus.org.uk/cost_of_living_rise_sees_96_of_students_cutting_back  accessed 15th January 2024.
  • Ryan, J. (Ed.). (2013). Cross-Cultural Teaching and Learning for Home and International Students: Internationalisation of Pedagogy and Curriculum in Higher Education. Routledge.
  • Rienties, B., & Jindal-Snape, D. (2016). Multiple and multi-dimensional transitions of international students to higher education: a way forward. In D. Jindal-Snape & B. Rienties (Eds.), Multi-dimensional transitions of international students to Higher Education (pp. 259–283). (New Perspectives on Learning and Instruction). Routledge.
  • Russell et al. (2022). A mixed-method investigation into international university students’ experience with academic language demands.
  • Russell Group Students Unions, (2023). Student Cost of Living Report. Russell Group Students’ Union, Gordon Street, London. https://static1.squarespace.com/static/63f4ed73056f42572785c28e/t/640b4a3d20fc6122160c275e/1678461513650/Cost+of+Living+Report+-+March+2023.pdf  accessed 15 January 2024
  • Sherry, M., Thomas, P., & Chui, W. H. (2010). International students: a vulnerable student population. Higher Education, 60, 33–46. https://doi.org/10.1007/s10734-009-9284-z
  • Tancredi, S., Burton-Jeangros, C., Ruegg, R., Righi, E., Kagstrom, A., Vallee, A. Q., et al. (2022). Financial loss and depressive symptoms in university students during the first wave of the covid-19 pandemic: Comparison between 23 countries. Int. J. Public Health. 67:1604468. doi: 10.3389/ijph.2022.1604468
  • Taylor, L. (2013). Communicating the theory, practice and principles of language testing to test stakeholders: Some reflections. Language Testing, 30(3), 403-412. https://doi.org/10.1177/0265532213480338
  • Udah, H., & Francis, A. (2022).  Vulnerability and Well-Being: International Student’s Experience in North Queensland, Australia. Journal of Comparative and International Higher Education. Vol 14, Issue 5 (2022), pp. 171-196.  DOI: 10.32674/jcihe.v14i5.3942 | https://ojed.org/jcihe
  • Wagner, E. (2020). Duolingo English test, revised version July 2019. Language Assessment Quarterly, 17(3), 300–315.
  • Wood, P. (2023, March 20). ‘Academic misconduct’ and poor student performance in UK universities linked to Duolingo entry tests. The i. https://inews.co.uk/news/universities-warn-over-poor-performance-after-admitting-students-using-duolingo-tests-during-pandemic-2199916


Research Team

Dr Emma Bruce

British Council

Dr Tony Clark

Cambridge University Press & Assessment

Professor Susan Kinnear

University of Dundee

Professor Karen Ottewell

University of Cambridge


British Council: IELTS

Stephen Carey

British Council

Ashleigh Bodell

British Council

Megan Agnew

British Council



University of Cambridge, UK
British Council, UK
University of Dundee, Scotland
Cambridge University Press & Assessment, UK

Project Overview

Our research aims to make a significant impact on admissions into the UK HE sector by creating greater understanding and awareness of ‘high-stakes’ English certification standards (British Council, 2023)1 within UK government policy, and by introducing new research-based practices in the following areas:

  • The tests used for HE admissions
  • Language assessment literacy to inform admissions practice across the HE sector globally
  • Duty of care practices adopted by HEIs in the UK and Canada

These changes will be achieved by developing:

  • A new framework for evaluating the tests used for HE admissions
  • A new AdvanceHE training course for academics under the ‘Internationalisation’ stream of the Professional Standards Framework 2023, to be embedded into HEA Fellowships and continuous professional development opportunities for academics globally
  • UK government policy advice for high-stakes tests based on the outcomes of the research

1. A high-stakes test provides certification that allows individuals to study, work, or migrate wherever a demonstration of their English language proficiency is required (British Council, 2023, High Stakes Language Tests, what are they and what are they used for? https://www.britishcouncil.it/en/blog/high-stakes-language-tests-what-are-there-and-what-are-they-used, accessed 11 April 2024)


Latest News

Press Releases

10 June 2024

University of Dundee: English Language Testing and its Impact on International Student Admissions

University of Dundee professor joins research collaboration to evaluate English language testing criteria and its impact on international student admissions in Higher Education.

Professor Susan Kinnear from the University of Dundee’s School of Business has joined forces with a team of researchers from the University of Cambridge, Cambridge University Press & Assessment, and British Council, to evaluate English language testing criteria and its impact on international student admissions in Higher Education.

The project aims to make a significant impact on admissions into the UK HE sector by introducing new research-based practices in the following areas:

  • The tests used for HE admissions
  • Language assessment literacy to inform admissions practice across the HE sector globally
  • Duty of care practices adopted by Higher Education Institutions in the UK and Canada

By introducing a framework for evaluating the tests used for HE admissions and delivering new AdvanceHE training courses globally for academics in line with the Professional Standards Framework 2023, the project aims to assist universities in making more informed decisions about admissions criteria when selecting the English language tests they accept.

It is hoped that a greater understanding of the standards of academic integrity reflected in suitable tests will help identify what additional support international students may need on arrival to the UK to integrate successfully into the English-speaking academic environment.


22 April 2024

English language tests for entry to UK higher education: does it matter which tests universities accept?

The short answer is: yes, it does matter which tests universities accept, and at what cut scores (the minimum scores required), both overall and at individual component level (i.e. writing, speaking, etc.). This is because universities have a duty of care to the admissions standards of the institution. But, in many respects more importantly, they have a duty of care to the students they admit. Universities must pay due care and attention to the selection of the tests they accept, and to the levels at which they set their entry standards, so that they can be confident that students achieving these scores will be able to meet the challenges of their chosen degree courses – or, at the least, so that they are conscious of what additional, scaffolded English for Academic Purposes (EAP) support the students will need on arrival, and ensure that this is provided.

The longer answer is still: Yes, it does matter, but unfortunately, it is not as straightforward as it should be.

Every few years the UK government invites English language test providers to apply to have their tests accredited as SELTs, or Secure English Language Tests. Accreditation is a tough process, requiring assurances on test security, accessibility, availability, and validity, to name but a few areas. All of these are good things, but this accreditation alone does not necessarily translate into a reliably curated list for UK higher education admissions purposes. Indeed, as Highly Trusted Sponsors, universities can accept tests that are not on the SELT list – but they must then bear the responsibility of ensuring that any additional tests they choose to accept meet the minimum requirements for a Student Visa.

This is where it all gets a little more interesting, as there is no independent body that regulates or oversees the expanding range of English language tests on the market. When making their decisions, universities have to scrutinise the materials that the test providers themselves issue on the validity, reliability, and security of their tests, and on how they benchmark themselves against other tests on the market. Sometimes this is backed up by rigorous research, as in the cross-mapping carried out between two of the best-known tests on the market – IELTS (International English Language Testing System) and TOEFL (Test of English as a Foreign Language) – but for the newer players, or for those with smaller research budgets, universities have to ascertain for themselves whether the providers’ claims as to validity, reliability, and comparability are accurate.

Selecting which tests to accept, though, is not as straightforward as it might seem. At the end of the day, test providers are commercial entities which, like universities, have bottom lines to consider, and so what is spun in the glossy advertising for a test might not, when the details are scrutinised, translate into a valid and reliable indicator of an applicant’s academic literacy proficiency in English.

This situation became all the more acute during the pandemic, which, as with all other aspects of life, forced test providers to pivot swiftly to online tests following the closure of test centres. Many universities were consequently forced into accepting tests that, under normal circumstances, they would perhaps not have considered. A large number of these universities have since reversed their decisions, and equally there seems to be a preference for a return to tests taken in test centres, on the part of universities and test takers alike.

As life adapts to the post-pandemic new normal, the time is opportune both to reflect on what the pandemic has taught us, negative and positive, about the potential for online English language testing, and, more far-reachingly, to investigate the strengths and weaknesses of the range of tests on the market, both old and new.

At the end of the day, universities are, of course, free to stipulate which tests their applicants can take and the levels required to evidence that they meet the minimum academic literacy standards in English for their chosen degree programme. But this should be an informed decision. Universities need to know not only about the security, availability, and reliability of the available tests, but far more importantly about the validity of each test as a predictor of the applicant’s readiness, in terms of academic literacy in English, to undertake their chosen degree course. The university has a duty of care to do so. Accepting tests which are cheap, quick, and of little such validity, whilst perhaps preferred by some test takers, is in the long run a false economy, as numerous universities found out during the pandemic. Many universities that accepted such tests subsequently reported that a large proportion of the admitted students did not have the necessary linguistic wherewithal to study through the medium of English – despite having test scores that providers asserted would evidence this.

This particular Gordian knot is the focus of a research collaboration between the British Council, Cambridge University Press & Assessment, and the universities of Dundee and Cambridge, which came about through shared concerns based on alarming stories and first-hand witnessing of serious issues in post-pandemic international student admissions. Ultimately, what we as a group are looking to do is to publish a guidance document and run a series of workshops for higher education institutions on how to choose a suitable proficiency test from those currently on offer, covering what they assess, what they do not assess, what levels might be best suited for undergraduate and/or postgraduate entry, their global availability, their accessibility, and, of course, their security. Higher education institutions will, of course, make their own decisions as to which tests they accept, why, and at which levels – but with many more players coming onto the market, and many more online opportunities having opened up due to the necessitated move online during the pandemic, what we are aiming to do with this research study is to help make these decisions more informed.

The research is a mixed-methods study investigating different tests at different institutions, the perceptions of university personnel towards various language tests, and transparency around decision-making for test acceptance. This has involved: 1) an initial desk-based study on the range of English language tests higher education institutions accept for admission onto both undergraduate and postgraduate programmes, at what levels, and on any recent changes in the tests accepted; 2) a survey exploring the decisions as to which tests are accepted, and why, from the perspectives of recruitment personnel, admissions staff, English for Academic Purposes support staff, and academics. Participants were asked to comment on what they saw as the strengths and weaknesses of the tests they accepted; those in direct contact with international students – academics and EAP staff – were also asked what they see as the main challenges facing international students and, by extension, what skills students should ideally arrive with to meet these challenges. We are now in the process of carrying out focus groups with international students to gain insights into their experiences, both of the admissions testing requirements and of how they are navigating studying in the UK.

Preliminary results in the UK context indicate that processes around test acceptance are not uniform across institutions, and that there are often competing tensions within universities – for example, between recruitment and academics – as to which tests to accept and at what levels. 63% of academics who responded said that they were extremely or somewhat dissatisfied with the range of tests that their university accepts, whereas 67% of respondents working in recruitment were somewhat satisfied. Additionally, almost half of the academics who replied noted that their institutions do not have dedicated academic English language and literacy support sections. According to responses from those universities that do, the majority of English for Academic Purposes (EAP) staff are involved in making suggestions or recommendations with respect to test acceptance; however, these staff noted that, despite holding the English language assessment expertise within their institutions, they were only one of many stakeholders consulted, and their views were sometimes overruled. Some even noted that they were ignored during the pandemic and thus had no involvement in their institution’s current list of accepted tests.

The initial survey data has also shown that, whilst for the most part higher education institutions are somewhat satisfied with the information available to them on current proficiency tests, this satisfaction rests mainly with the established players in the market, such as IELTS and TOEFL – namely, those test providers which regularly publish research on the validity and reliability of their tests. 20% of respondents commented specifically that they need more information on test comparability, as there is some uncertainty as to the veracity of some of the claims made by newer tests on the market, especially when these claims are apparently based on the providers’ own cross-mapping research to IELTS and TOEFL. In one instance, for example, despite claims on the website of one of the newer tests that research had been carried out to cross-map the test to IELTS, when a university probed further it turned out that this was not the case at all: the provider had carried out a cross-mapping to the Common European Framework of Reference (CEFR) and then cross-mapped from this to IELTS. Such questionable research practices would not be accepted from novice postgraduates, so it is all the more worrying that high-stakes English language test providers are seemingly allowed to get away with them.

When it comes to changes in the range of tests accepted during the pandemic, 64% of responding universities (n=67) said that they had expanded what they accepted – but of this 64%, 54% had since reversed the decision. The reasons given ranged from security concerns, to validity and apprehension about online testing, to perceptions that students who came in with some of the newly accepted online tests were not adequately prepared for the tertiary learning environment. When asked about their perceptions of the English levels of the students they taught, responding academics considered only 9% to be ‘Good’, 47% to be ‘Mixed/Varied’ and, concerningly, 44% to be ‘Poor’.

In probing these themes in the follow-up interviews, what has become evident is that, in light of the actual English levels of the international students in the classroom, irrespective of their test scores, there is a feeling amongst some academics that they need to ‘dumb down the curriculum’. A clear message also emerged that language proficiency alone is insufficient – academic skills in English are vital – and several interviewees noted that students who have attended pre-sessional academic English literacy courses anecdotally fare far better. In terms of test providers, proficiency tests from the big players, such as IELTS, TOEFL and, to a lesser extent, Pearson PTE and Trinity ISE, are generally trusted as reliable – but it was also noted that such tests may need to adapt to keep up, whilst not engaging in a race to the bottom, if they are to remain trusted.

There are significant costs both to admitting institutions and to the students themselves when students are admitted with lower levels of language proficiency than are required to engage successfully with their chosen course. As mentioned at the start, universities have a duty of care – to their institution, but primarily to the international students they are looking to attract – when selecting the English language tests they accept. Greater language assessment literacy (LAL) is therefore required of HEIs, both in making such decisions and in understanding what additional support international students may need on arrival in order to acculturate successfully into the English-speaking academic environment.

The aim of this research collaboration is to address this gap by developing a framework with which universities can evaluate the tests they accept, so that they can make informed decisions in the best interests of their own admissions standards – but, arguably more importantly, in the best interests of the international students they admit. Many of the claims being made by some of the newer test providers do not meet the standards of academic integrity that are the cornerstone of university teaching and research – so it is about time they were scrutinised.





We will be looking to publish interim updates on the research study, both here and more formally in academic journals. Ultimately, our aim is to develop a guidance document for universities, together with language assessment literacy training workshops, so that different personnel within higher education can make more informed decisions for their university.

Should anyone be interested in participating in the research, or indeed in getting involved more formally, we would very much like to hear from you. Please get in touch with Stephen Carey, Head of Relationship Management, British Council: stephen.carey@britishcouncil.org