The EF English Proficiency Index for Schools (EF EPI-s) examines the acquisition of English skills by full-time students aged 13 to 22.
This third edition of the EF EPI-s includes test data from more than 350,000 students at thousands of partner schools and universities in 43 countries.
Although most school systems in the world teach English, student assessment tools vary widely from country to country, and international testing initiatives such as PISA and TIMSS do not cover English language skills. As a result, there is no standardized way to compare English skill acquisition internationally. EF aims to bridge that gap by providing a free English language assessment platform for school systems, universities, and individual teachers, and by producing this biennial analysis of global English learning trends. The EF EPI-s tracks student English skills and provides benchmarks for comparison. It is a companion to our annual EF EPI report, which evaluates adult English proficiency levels around the world.
All the test data included in this report was collected using the EF Standard English Test (EF SET), designed to the same exacting standards as TOEFL, IELTS, and other leading standardized tests. Because the EF SET is free and online, entire cities, regions, and countries can evaluate their students every year using the test, for only the cost of coordinating the effort.
Hundreds of millions of children around the world learn English at school. In the majority of countries, English instruction begins in primary school and continues at least until the end of secondary education. Many countries include an English language assessment on their standardized secondary school exit exam or university entrance test. These assessments are generally written by educators in the country and calibrated to test the knowledge and skills included on that country’s curriculum. While these test results are helpful to educators who are studying English proficiency trends within a specific country, their relevance as international assessments is limited.
English proficiency among older students and adults varies widely from country to country and region to region. But few of those differences are visible among younger students: one striking feature of the EF EPI-s dataset is how similar students’ English skills are at age 13 around the world.
In part, this parity reflects efforts to improve English education—efforts that are starting to show results. For example, in the last decade, a large number of Asian and Latin American school systems have directed resources toward improving English language instruction, launching initiatives that retrain teachers, offer scholarships for international study, bring educational technology into the classroom, and recruit native English-speaking teachers. The impacts of these investments are evident in the results at age 13, when Asian and Latin American students perform on par with their European peers. In fact, Latin American students’ results are in line with their European peers’ up until age 16, and it is only among older secondary students and university students that a gap between the two regions appears. The gap between Latin America and Asia opens much earlier, with Latin American students beginning to outpace their Asian peers by age 14. Because Asian students continue to improve their English proficiency beyond age 16, while progress in Latin America stalls, the proficiency difference between these two regions is about the same at age 20 as it was at age 14.
At age 13, over 70% of students worldwide have a beginner (A1) or elementary (A2) level of English. Gains in English proficiency over the subsequent two years show a rapid migration of students out of these lower bands, in particular level A1. From then until the end of secondary school, however, most students who remain in the lowest proficiency bands no longer migrate up. These students appear to be stuck. Growth for many higher-level students, though, continues as they move from the intermediate bands (B1 and B2) into more advanced levels (C1 and C2).
Trends at the university level are less clear. On the one hand, students at age 21 have the lowest proportion of A1-level English speakers of any age group. On the other hand, 21- and 22-year-olds have lower proportions of advanced C2 students than 17- and 18-year-olds. In other words, by the end of university, students are more concentrated in the middle of the proficiency spectrum. One worrying finding here is that most students are not reaching the skill level they need for professional-level English: the minimum proficiency level required for an international workplace is upper intermediate (B2), a level attained by less than 20% of the university-aged students we tested.
When studying this data, it is important to keep in mind that it represents a snapshot rather than a time lapse. Students were tested once in 2017 or 2018, not followed year by year. Still, the lack of a coherent trend at the university level is indicative of a broader lack of coordination in English instruction in tertiary education worldwide. Some universities may be teaching English, or even offering English-medium instruction for some majors and in some courses. Others have abandoned English instruction entirely. Student progress in acquiring higher-level professional English skills is haphazard as a result.
Looking at a typical curriculum, it might seem that students’ English skills should improve steadily from year to year during secondary school. After all, in any given school system, students receive similar amounts of instruction each year, learn from teachers with similar qualifications, and follow a curriculum designed to produce steady progress. The data, however, tells a different story. On average worldwide, students experience disproportionately large gains in lower secondary school and only moderate gains in upper secondary school. At the university level, as previously discussed, English instruction is more haphazard, and the data bears this out.
Why is progress slower in high school than in middle school? One reason has to do with the nature of language learning itself: as a rule, it is easier to acquire lower-level language skills than higher-level ones. Simply put, beginners learn faster. But that observation alone does not offer a full explanation for the erratic progress we see in our data. In particular, we find that regions experience this slowdown at different points in schooling and to different degrees, indicating that a natural drop-off in learning rate is not the only factor. In some places, we find many students’ progress stalling completely—even though the curriculum indicates that they are still receiving the same number of hours of English instruction.
Adult English proficiency is highest in Europe, so it is helpful to explore how the continent’s school systems manage to teach English so well. Our most striking finding is that, although European students’ learning speed also slows down as they age, they continue to improve their English proficiency steadily, gaining, on average, more than one point per year throughout their education (on this scale, a CEFR band corresponds to about 10 points). In Latin America, students’ improvement slows earlier and more markedly. In Asia, learning speed is slower throughout, perhaps because many students struggle with the added difficulties of mastering a new alphabet and a very different type of language. Even so, Asian students show a further decline in learning rate as they get older, progressing by only half a point per year, on average, by the time they reach university.
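To make these growth rates concrete, the sketch below converts them into the approximate time needed to climb one CEFR band of about 10 points. The exact per-year figures are assumptions chosen to match the approximate rates cited above, not EF EPI-s data:

```python
# Illustrative sketch: years needed to advance one CEFR band (~10 points)
# at the approximate annual gains described in the text. The specific
# rates below are assumed values for illustration, not EF EPI-s data.

BAND_WIDTH = 10  # points per CEFR band on the EF EPI-s scale

annual_gain = {
    "Europe (steady learners)": 1.2,  # "more than one point per year" (assumed)
    "Asia (university age)": 0.5,     # "half a point per year"
    "Switzerland (university)": 3.0,  # "nearly three points per year"
}

for group, rate in annual_gain.items():
    years = BAND_WIDTH / rate
    print(f"{group}: about {years:.1f} years to advance one CEFR band")
```

At these assumed rates, a steady European learner crosses a band in roughly eight years, while a university-age Asian learner would need about twenty, which illustrates how much a late-stage slowdown compounds over a full education.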
These patterns suggest that European adults speak better English not so much because they learn the language more quickly in the early years of schooling, but because they keep learning it steadily, even after they have arrived at university.
It is important to note, though, that these trajectories vary substantially from country to country. Students in Brazil, for example, improve quickly in middle school but hardly at all in the years afterwards. Students in Spain learn more English in upper secondary school than in lower secondary, but their progress during university is minimal. In Switzerland, the learning trajectory follows the Europe-wide trend, but students make more annual progress than the regional average every year and continue to make substantial gains even as they reach the end of their formal education. Indeed, Swiss university students improve by nearly three points per year, much higher than the European average.
This data suggests that many school systems, although successful in introducing students to the basics of the English language, are struggling to build competency beyond that level. From a teaching perspective, sustaining momentum at higher proficiency levels requires a substantially different toolkit. In many countries, English teachers themselves have only intermediate proficiency in English.
Unfortunately, beginner English is of little use in the workplace. Most jobs that use English at all require a B1 or B2 level. University systems and technical schools may be best equipped to build proficiency at the intermediate and advanced levels, particularly with regard to sector-specific vocabulary. Clearer definition of tertiary English language requirements and curricula would improve results.
In all age groups, female students outpace male students in English learning. This finding parallels our results among adults, where we consistently find that women have stronger English than men. A more striking trend emerges when we examine listening and reading skills separately. Listening skills follow the overall trend, with a slight female advantage that narrows with age. Reading skills, however, are equivalent among younger students. The gender divide only appears at age 17, with boys trailing girls in English reading comprehension from upper secondary school through university. This disparity places boys at a particular disadvantage in courses that use English-language textbooks and reading material. Instructors at these educational levels would do well to take this deficit into account when considering how best to support individual students.
English listening comprehension skills develop more quickly than reading comprehension skills, and the gap between the two competencies widens every year up to age 20. This gap is much larger than the gender gap. Zooming in to look at students of a single age cohort, we find a wider range of English listening skills, whereas reading skills are clustered more tightly at lower levels. These findings pose concerns for educators trying to train students for a digital-focused, 21st century workforce, in which English text skills are essential.
One reason for this skills gap is that many education systems are placing increased emphasis on oral communication, as they move away from rote grammar and translation exercises and toward communicative instruction. Another factor may be media consumption. Students today have more frequent exposure to spoken English outside the classroom, through English-language movies, TV, and music.
To be clear, communication-centered instruction and English-language media consumption are positive developments for English learners, but educators clearly need to do more to develop reading skills. Written English uses a much wider range of vocabulary and more complex sentence structures than spoken English, and it requires dedicated instruction and practice. The skill is particularly important in the workplace, where professionals need to understand documentation, emails, news, and research.
Through continuous, standardized assessment of English language skills, educators can pinpoint areas for improvement and identify successful strategies at the institutional, national, and international levels. The EF Standard English Test (EF SET) was designed for that purpose.
Offered at no cost and built with the same methods as other standardized English tests, the EF SET rests on a foundation of evidence-based research and years of continuous investment. Test items are created by experienced examiners, carefully reviewed by a panel of experts, and piloted on more than 150,000 learners from 80 countries. A third-party review in 2014 showed that EF SET results correlate highly with TOEFL iBT and IELTS results for the same test takers. This means all three exams measure a common set of reading and listening comprehension traits. For further information on the EF SET and the research behind it, visit www.efset.org/research/.
Education systems use the EF SET to evaluate their students on their own timetables, with whatever frequency they deem appropriate. Because the test is completely free, it is possible to evaluate a large population of students in different school types and education levels for only the cost of coordinating the effort. Education ministries have also successfully used the EF SET to evaluate teachers in contexts where further training will be made available to those who need it.
Upon completion of testing, participating schools receive customized reports with their students’ EF SET scores and CEFR levels, as well as comparisons between student groups defined by the organizers of the evaluation project, whether within an individual school or across a broader education system. In addition, each student can receive an EF SET level certificate corresponding to the CEFR, provided that the test is administered in a proctored environment. All student test data from around the world is anonymized and used to produce international benchmarks for English learning as well as this biennial report. We invite all schools, universities, and ministries of education to participate in our ongoing research.
You’ll receive a personalized school result and certificates for each participating student.
Your students are tested on the EF SET testing platform, completely free of charge.
Your school can partner with us to create an exclusive co-branded testing page for your students.
Your students will join over 150,000 participants who have contributed to the EF English Proficiency Index for Schools (EF EPI-s).