A Level Awards 2021 and Their Obfuscation

Statistical analysis is an aid to thinking, not a replacement for it. Statistical analysis, like words, can be used to reveal the truth or to hide it.


In Girls Are Just Cleverer I promised to look at sex bias in the 2021 A Level awards. I do so here.

Mary Curnock Cook has beaten me to it, though, with an article in The Times and another with detailed statistics in FENEWS. The former quotes the latter where Curnock Cook opines that “this goes further than the usual concerns about boys’ underachievement in education compared to girls and needs a convincing explanation to eliminate what seems, on the face of it, to indicate systemic bias against boys”.

This will not be a surprise to readers of this blog. I have been identifying systemic sex bias in education for 7 years now. My latest update on bias in Key Stage 2 SATS (age 10/11) can be found in Girls Are Just Cleverer. And the bias resulting from A Level awards based on teachers’ assessments was covered in A Levels 2020: The Year of Utter Nonsense.

Having used the phrase “year of utter nonsense” for 2020, I am left unable to think of an appropriate term for 2021. I am apt to say “things will only get worse”. I apologise for being such a misery – but unfortunately reality has a way of reinforcing my pessimism by proving me right. This year’s A Levels are a case in point. They have become about as credible as a Monty Python sketch.

At the end of Curnock Cook’s opinion piece in The Times, the Department for Education is quoted as rejecting her claims of systemic bias against boys saying that this has been ruled out by Ofqual, the exams regulator. Via FOI I have been informed that the Ofqual analysis which purports to demonstrate this lack of bias is Summer 2021 student-level equalities analysis: GCSE and A level by Ming Wei Lee. This post will therefore address two things,

  • Firstly, my own presentation of the 2021 A Level awards, particularly the evidence for their sex bias in the top grades. This builds on my analysis of the 2020 data in State Education Dying, Dying….Dead? and uses the same methodology.
  • And secondly, a brief critique of the Ofqual analysis by Ming Wei Lee.

(I note in passing that the same author, with Paul Newton, wrote May 2021’s Systematic divergence between teacher and test-based assessment: literature review​ where they observe that “evidence of teacher bias in relation to gender is mixed, but a slight bias in favour of girls (or against boys) is a common finding”. However I shall not review that work here.)

A Level Awards 2021

In both 2020 and 2021, A Level awards were based on teachers’ assessments, not on exams. A Level awards were last based on exams in 2019.

I shall concentrate here on the top grades, A/A*. All data was taken from National examination figures (bstubbs.co.uk).

The massive grade inflation in these top grades is illustrated by Figure 1, which shows the number of A/A* grades awarded as a percentage of total A Levels versus year (all subjects, both sexes). This was 25.4% in 2019, had been falling slightly, and had not exceeded 27.1% over the ten years 2010 – 2019, all based on exams. In 2020 it leapt upwards, and it has increased markedly again this year, to nearly 45%.

Figure 1

Inflation in the top grades can be seen for both girls and boys. However, it is more marked for girls. Figure 2 plots the percentage by which the number of A/A* grades awarded to girls exceeds the number awarded to boys. This had been between 20% and 25% for ten years based on exams, then, based on teachers’ assessments, it leapt to 35.3% in 2020 and now to 36.9%.

Figure 3 shows what this looks like in absolute numbers: the number of A/A* grades awarded to girls minus the number awarded to boys. Girls have been awarded more top grades for a very long time (several decades). In the last ten years of exam-based awards, 2010 – 2019, between 20,000 and 25,000 more top grades were awarded to girls than to boys each year. In 2020, based on teachers’ assessments, this leapt to 45,000 and in 2021 to 57,500.

The excess of top grades awarded to girls over boys has nearly tripled since the last exam year, 2019. Bear this in mind when we get to the Ofqual analysis.
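As a quick sanity check on the “nearly tripled” claim, using the Figure 3 numbers (the 22,500 baseline is my assumption, simply the midpoint of the historical 20,000 – 25,000 range):

```python
# Excess of top A/A* grades awarded to girls over boys (numbers from Figure 3)
excess_exam_years = 22_500   # assumed midpoint of the 20,000-25,000 range, 2010-2019
excess_2021 = 57_500

ratio = excess_2021 / excess_exam_years
print(f"{ratio:.1f}x the exam-era excess")  # roughly 2.6x, i.e. "nearly tripled"
```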

Figure 2

Figure 3

Table 1 lists the percentage of candidates awarded an A* grade this year, 2021, compared with 2019, for all subjects – and separately for boys and girls.

Based on exams in 2019, and across all subjects, a larger percentage of boys (8.2%) than girls (7.5%) were awarded A* grades. Based on teachers’ assessments in 2021, a larger percentage of girls than boys are now awarded A* grades overall.

Subject              Girls 2019   Girls 2021   Boys 2019   Boys 2021
Art & Design             13.0        23.0         9.6        16.9
Further Maths            22.1        50.7        25.8        49.1
Sports & PE               6.1        27.0         2.5        11.5
D & T                     5.4        25.2         4.0        14.3
Religious Studies         4.3        16.1         4.5        15.8
Other Sciences            4.
Other Subjects            3.2        14.4         2.7        11.9
All Subjects              7.5        19.7         8.2        18.4

Table 1: Percentages awarded Grade A*

It is worth perusing Table 1 to gauge how preposterous the A* grade inflation has been. In 2019 just 3.7% of girls were awarded A* in Computing; in 2021 that became 25.7%. In 2019 just 6.1% of girls were awarded A* in Sports & PE; in 2021 that became 27%. Similarly, in Further Maths 22.1% became 50.7%, and in Physics 8.5% became 25.3%, and so on. Boys’ awards have inflated also, but not by quite so much.

As a result, girls have become dominant in almost all subjects as regards the percentage being awarded A* grades. Based on exams in 2019, a larger percentage of boys than girls attained A* grades in English Literature, Chemistry, French, Maths, Further Maths, Physics, Classics, German, Music, Religious Studies and Other Sciences. In 2021, based on teachers’ assessments, this list has shrunk to just Chemistry, French and Other Sciences. A larger percentage of girls than boys are now awarded A* in the male bastions of Maths, Further Maths and Physics.

Let’s look now at the sex bias, based on the assumption that 2019, based on exams, can be taken as the equitable datum.

Bias is then defined, as I did in 2020, as the excess of the percentage of girls over the percentage of boys gaining A or A* grades in 2021 (based on teacher assessments) minus the excess of the percentage of girls over the percentage of boys gaining A or A* grades in 2019 (based on exam results). Expressed algebraically,

bias = (G_TA - B_TA) - (G_Ex - B_Ex)

where G and B denotes girls and boys respectively, and subscripts TA and Ex denote teachers’ assessments (in 2021) and exam based results (in 2019) respectively. In other words, the bias is defined as how much the girls’ advantage has increased as a result of being assessed by teachers rather than by exam.

A positive bias is to girls’ advantage, or to boys’ disadvantage, and a negative bias vice-versa.
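As a minimal sketch, the bias defined above can be computed directly from the percentages quoted earlier (the function and variable names are mine; note too that Table 2 is computed on A/A* grades, whereas the A* figures from Table 1 are used here purely for illustration):

```python
def bias(g_ta, b_ta, g_ex, b_ex):
    """bias = (G_TA - B_TA) - (G_Ex - B_Ex), in percentage points.

    g_ta, b_ta: % of girls/boys awarded the grade under teacher assessment (2021)
    g_ex, b_ex: % of girls/boys awarded the grade under exams (2019)
    """
    return (g_ta - b_ta) - (g_ex - b_ex)

# A* across all subjects, from Table 1:
# girls 7.5% (2019) -> 19.7% (2021); boys 8.2% (2019) -> 18.4% (2021)
print(round(bias(19.7, 18.4, 7.5, 8.2), 1))  # 2.0 percentage points in girls' favour
```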

Table 2 lists the bias for all subjects separately. (NB: “All subjects” means all subjects where total candidates was in excess of 2,500. This only excludes Welsh, Irish, ICT, and Performing Arts).

Of the 31 subjects listed, the bias is in girls’ favour in all but 3 (all of them modern languages). Moreover, in all but one of the 28 subjects where the bias is in favour of girls, the bias has increased since 2020.

Across all subjects, the average bias in 2020 was 3.2%. In 2021 this has increased to 4.7%.

The bias is greater than 5% in 16 subjects, and in double figures for two subjects.

Figure 3 puts the impact of this bias in perspective. It has resulted in nearly 40,000 more A/A* grades being awarded to girls, compared to boys, than would otherwise have been the case. Given the competition for top university places, this suggests that of the order of 20,000 places at top universities will go to girls that might otherwise have gone to boys. And this will have come about through teachers preferencing one sex over the other. The small biases expressed as percentages belie the potential magnitude of the impact on people.

Subject                 Bias 2020   Bias 2021
Art & Design                 3.4         4.0
Classical studies            5.4         4.6
Design & Technology          5.2        10.6
English Literature           2.9         3.5
English Language             4.3         5.2
English Lit & Lang           2.7         3.0
Further Maths                5           6.3
Media & film                 6.6         9.6
PE / Sports                  8.4        11.7
Religious Studies            2           2.3
Other Sciences              -1.1         7.8
Other Subjects               2.3         4.0
All Subjects                 3.2         4.7

Table 2: Bias (percent)

The Ofqual Analysis

Recall that the Department for Education has rejected claims of systemic bias against boys in this year’s A Levels, saying that this possibility has been ruled out by Ofqual’s analysis Summer 2021 student-level equalities analysis: GCSE and A level by Ming Wei Lee. Here I make a brief critique of that analysis. I emphasise that I have no reason to doubt either the numerical accuracy or the factual correctness of the text of that report.

I note firstly that the A Levels used in the analysis did not include Maths. This is extremely odd because Maths is far and away the most popular A Level, attracting nearly 100,000 entrants. In particular, roughly twice as many boys take Maths as take boys’ next favourite subjects, Physics and Chemistry. I also note that Law and Politics – and Further Maths – were not included, which between them would account for about 47,000 entrants.

The analysis uses multivariate regression which means that the effect of sex can be isolated from the effect of the other variables used. I have no difficulty with this as a technique, nor with controlling for variables such as ethnicity, primary language, special educational needs or disabilities (SEND), or free school meal status or other markers of socioeconomics.

However, the use of prior attainment as a control variable has a huge, glaring problem. It turns out not to matter much for this particular analysis, but it would matter a great deal in more general situations. Let me explain…

For A Level candidates their “prior attainment” is based on their GCSE score (which would usually be two years earlier), specifically a mean GCSE score normalised to lie between 0 and 100.

Similarly, although I’m not looking at this here, the “prior attainment” used in conjunction with GCSEs was Key Stage 2 SATS results (usually taken 5 years earlier).

The huge, glaring problem I have with this as a methodology is that it’s an excellent way of hiding bias if the bias existed, and was roughly constant, throughout the whole of schooling – or, at least from Key Stage 2 SATS to A Level. In this situation, bias would be aliased by prior attainment and hence would become invisible.

To spell this out: if you did poorly in your GCSEs then did comparably poorly in your A Levels two years later, your poor performance would not get attributed to your sex, your ethnicity, or to any of the other variables because “prior attainment” explains it. But what if you were actually discriminated against (perhaps on grounds of sex, or of ethnicity, or whatever) when doing your GCSEs, and equally discriminated against at A Level? This could not be picked up because the discrimination has already been built into the “prior attainment” measure at GCSE.
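The aliasing argument can be illustrated with a toy simulation (entirely hypothetical numbers; a sketch, not a reconstruction of Ofqual’s actual model): apply a constant pro-girl bias at both GCSE and A Level, then regress A Level results on prior attainment and sex. Most of the bias is absorbed into the prior-attainment term, and the estimated sex coefficient comes out far below the true bias:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
female = rng.integers(0, 2, n)          # 0 = boy, 1 = girl (hypothetical cohort)
ability = rng.normal(0, 1, n)           # true ability, same distribution for both sexes
BIAS = 0.3                              # hypothetical constant pro-girl bias at BOTH stages

# The same bias contaminates "prior attainment" (GCSE) and the outcome (A Level)
gcse = ability + BIAS * female + rng.normal(0, 0.5, n)
alevel = ability + BIAS * female + rng.normal(0, 0.5, n)

naive_gap = alevel[female == 1].mean() - alevel[female == 0].mean()

# Ofqual-style control: regress A Level result on prior attainment and sex
X = np.column_stack([np.ones(n), gcse, female])
coef, *_ = np.linalg.lstsq(X, alevel, rcond=None)

print(f"raw girl-boy gap at A Level:        {naive_gap:.2f}")  # close to the true 0.3
print(f"gap after prior-attainment control: {coef[2]:.2f}")    # most of the bias has vanished
```

The only reason any residual shows up at all is measurement noise in the GCSE score; with a noiseless prior-attainment measure the sex coefficient would be exactly zero, and the bias completely invisible.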

The same applies to GCSEs in respect of the prior attainment measure based on Key Stage 2 SATS. And since I have already identified sex-based bias against males in Key Stage 2 SATS (see Girls Are Just Cleverer), the “prior attainment” process is broken from the start.

The way this would work in practice is that an apparent sex bias at A Level might be rationalised away (via the multivariate regression) as due to differing “prior attainments”, which is effectively code for saying “yes, girls did better at A Level because they did better at GCSE (sub-text: because they are cleverer, not because of bias)”. Here are Ming Wei Lee’s words explaining how the process works,

Multivariate analyses allow the effect of a variable to be examined while holding other variables constant. For example, the descriptive statistics may show that females outperform males, and that candidates with high prior attainment outperform candidates with low prior attainment. A multivariate analysis allows us to hold prior attainment constant while estimating the gender difference in results, and vice versa. If the gender difference seen in descriptive statistics disappears in the multivariate analysis, we would conclude that the females in our sample had higher prior attainment indicating higher ability than the males and that it was this difference in ability that led to their higher performance, not their being female per se.

There are other problems with the “prior attainment” control such as the subjects in question. At GCSE it is usual to take a wide range of subjects, and most will be verbal subjects – languages and “essay” subjects. Many people, especially boys, have had weak verbal skills throughout their school career and these people flood into the non-verbal subjects, maths and the sciences, at A Level. This is, I expect, why Ming Wei Lee finds a positive gender effect in 2018 and 2019 (i.e., essentially meaning, due to prior attainment control, that boys did better at A Level than their mean performance at GCSE – when all were exam based).

However, I digress as this prior attainment issue is not very important for the key result, which is the comparison between 2019 and 2021. The reason is that the skew introduced by the prior attainment control cancels when these two A Level years are compared (as both are subject to prior attainment control in the same way). The Ofqual result most nearly comparable with my analysis is that for the change between 2019 and 2021 of the probability of attaining Grade A and above. The outcome of interest is this,

Probability of grade A and above: (i) 19 of the 22 between-group comparisons (on ethnicity, language, FSM, deprivation) showed no notable change from 2019 to 2021; (ii) the gap between males and females and the gap between SEND candidates and non-SEND candidates, have shifted from positive to negative, indicating that in 2019, male candidates and SEND candidates had higher outcomes than prior-attainment-matched female candidates and non-SEND candidates respectively, but in 2021, the direction of the difference reversed. The shifts represent changes of 4.36 percentage points on the gender variable and 2.07 percentage points on the SEND variable.”

So, the bottom line is that the Ofqual analysis identifies, as an average across the subjects analysed, that being female improved your probability of gaining an A or A* grade by 4.36%. This is closely comparable with the average bias I found, namely 4.7% (Table 2).

The same finding is there in the Ofqual analysis if one knows how to read it. However, Ofqual did not report this result on a subject by subject basis – and my Table 2 shows that 16 individual subjects have a greater bias than this.

My beef with the Ofqual report is that it fails to convey how serious the identified bias is – as best illustrated by my Figure 3. Nearly 40,000 more top grades to one sex simply by virtue of their sex is not a small matter, yet there is no hint of this significance in the Ofqual report. The Executive Summary contains this rather anodyne reference to the key result,

Some groups showed notable changes from 2019 to 2021 on all 3 results measures. Male candidates, candidates with SEND, candidates in secondary selective schools, sixth form and tertiary colleges, have seen, from 2019 to 2021, a small decrease in outcomes (small changes not exceeding 0.2 grade) relative to prior-attainment-matched candidates of their respective comparator group, namely, female candidates, candidates without SEND, candidates in academies respectively.”

The main text of the report is no more explicit. The small-scale plots used to illustrate the findings graphically do a good job of minimising differences, and one suspects that might have been the intention. Figures 4 and 5 are the two plots which show the gender effect, Figure 4 based on the mean grade and Figure 5 on Grades A/A*…

Figure 4: from Ofqual report
Figure 5: from Ofqual report

I think we can agree that these Figures do not have the same impact as my Figures 2 and 3. There is more to honesty than merely never being untruthful. Here’s an example of an honest depiction of what has happened: the excess of top A/A* grades awarded to girls over boys has nearly tripled since the last exam year, 2019.

And as for the Department for Education’s claim that this Ofqual analysis has “ruled out” any systemic bias against boys – err, no, it’s there – in that 4.36%. But between a report which minimises this finding, close to the point of invisibility, and the Chinese whispers that no doubt apply between Ofqual management and Department for Education officials, the reality of my Figures 2 and 3 is conveniently magicked away so as to cause no embarrassment.

8 thoughts on “A Level Awards 2021 and Their Obfuscation”

  1. Douglas

    Employers will learn to look at the 2020 and 2021 exams with disdain, automatically reducing the likely ‘real’ pass. This is unfortunate for those who truly deserve the top grades but will likely mean more stringent hiring tests, to determine true ability.

    But wait…
    Given that the UK government—like many around the world—is moving away from equality of opportunity and towards equality of outcome, how long before boys are awarded the same grades as girls as a matter of equality?

    While we’re about it, the less academically inclined should be awarded the same outcome as the brilliant. Real equality of outcome.

    So, we should in future see all school leavers given an A* pass, regardless of sex, ethnicity OR ABILITY. We could go even further and grant the passes to children regardless of whether they have had to sit through two years of specialist education to gain the qualification, giving everyone nine passes in anything they choose. THAT will be equality! And utter nonsense, of course, but for anyone who gives it some thought, equality of outcome IS nonsense.

  2. AJ

    That the clear sex bias in the award of grades by teachers is being minimised, and is not widely publicised or accepted, is not at all surprising when you consider that the massive grade inflation introduced by teacher assessment is likewise not widely publicised or accepted; there is in fact a concerted campaign to minimise and obscure this grade inflation. Figure 1 says it all, and that this is not being recognised as a disastrous problem, but instead minimised, does not bode well for the future.

    Pity those who are responsible for university admissions. How on earth do you use these results to distinguish between the mediocre and the talented?

    The real problem to be concerned about is not this but what is going to happen when there is a return to exams.

    What should happen is a massive fall in the pass rate and in the percentage that receive A grades, with a bigger drop in the proportion of girls who get top grades. It’s not hard to imagine how this is going to be received, and the reaction of politicians. The reaction is likely to be measures to massively increase the pass rate and to favour girls through elements of teacher assessment. There is a good chance that the exam system will be permanently distorted to provide an excessively high percentage who receive top grades, and that a bias in favour of girls will be designed into the exam system.
    This will of course be on top of the existing bias in the education system, and the results of these exams will then be used as evidence of girls’ superiority to boys.

    The trend is already clear. Exams were still possible in 2020 using volunteer invigilators, and it is beyond doubt that exams were possible in 2021, with a very high proportion of teachers vaccinated and schools reopened. The reason they were not held is a fear of the response to a set of clearly worse results. This fear is not going to go away.

    A return to independent assessment will be portrayed as unfair to children, and to girls in particular. The reaction to this is what has the potential for major long-term damage.

    1. Nigel Johnson

      The more I think on it, and observe the political landscape, the more it looks completely impossible that any politician of any ilk could preside over a massive step back to the same level of awarding under exams. Not only would it bring howls from students and parents who missed the past two years’ bonanza, it would make all those students from the past two years suspect. It would also call into question the ability of the teaching profession, which has proved so unable to accurately assess its students. And with a little digging one will probably find some schools evidently playing games by uniformly and wildly overestimating their students.
      For once I feel incredibly sorry for whoever is education minister when the decision comes up next year. This being the land of “fudge”, I’m expecting the examining boards will save everyone’s bacon by setting very easy exams.
      One way or another, the key issue this debacle exposes is that the teaching profession is unreliable without exams; left in their hands, things are a complete mess.

  3. Nick Langford

    I was looking forward to this article, but the situation it describes is even worse than I had feared. One might have hoped that grades would have remained at the already preposterous level they achieved last year, without a further ridiculous rise. The question is how assessment should be conducted in 2022 and what the grades will look like if the same policy is adopted.

    I had an argument last year with a colleague who raised the inevitable objection that criticising this grade inflation disparages the wonderful achievement of students who have worked harder than ever, blah, blah, blah. As we know, whether students have worked hard has depended entirely on the opportunities given by their schools and colleges, some of which have maintained a level of teaching comparable with earlier years and some of which have not. No doubt I shall have the same argument this year.

    1. William Collins Post author

      The trouble with “everyone gets a prize” is that no one gets one. And one fears for young people joining university courses with which they are not truly equipped to cope. But, then, those courses will be dumbed down too, I suppose. In which case it is the genuinely able who lose out. What a mess.

  4. Nigel Johnson

    The percentage bias matches that found in a series of studies over the past few years. The studies were national and international, and found that there was bias in teacher assessment in particular, and in marking in general, where the sex of the producer of the work was “known”. The studies used “blind marking”, applying it to the same pieces of work for both sexes. All found that boys were assessed or marked at the same level as under the “blind” marking of the same work, while it was the girls who were given “inflated” marks. The trend is observed in many countries, and both male and female teachers show much the same favouritism towards girls. Corroboration appeared to come from the fact that, in the exam regime prior to Covid, girls were in fact less likely to achieve their “expected” grades. Further corroboration comes from your analysis of the two years in which there have been no actual exams. The studies were done by feminists expecting to find the reverse of their actual findings. They clearly believed their findings, because their arguments to “do nothing” were not to challenge the fact of the bias, but to suggest it was acceptable because later in life men earn more. One can imagine the uproar if one found the reverse and suggested it was acceptable because most girls would go on to not make best use of their education by working part time or taking time out to be stay-at-home mothers.
    In effect they are saying girls are to be favoured because boys need handicapping. 40,000 boys this year.
    The same studies at least suggest some theory for this preferential treatment: that girls are rewarded for their better behaviour in general. Of course, no mention is made of decades of feminism in teacher training.
    Decades ago, when I had some interest in feminism, in whatever “wave” was current in the late ’70s, “benign sexism” was an accepted “fact”, certainly in criminal justice and in treatment by public services (divorce etc. were still rarities); indeed a number of writers then were concerned that too often females were treated as either “mentally ill” or irresponsible. How things have changed.
    I cannot help but think that the dissembling from Ofqual, in choices of subjects and in presentation, shows their fear of this 40,000 becoming a more publicly known figure. Because parents of boys would start to wonder if their boys were the ones being so discriminated against.
    The very fact that such concealment happens, and the somewhat unconvincing argument of male earnings in middle age (irrelevant to the parents of today’s sons hoping for their future), shows the importance of pushing at this issue.

  5. Michael Porter

    Two key words are responsible for this state of affairs.
    These are the twin poisons being intravenously fed into the body of education (inter alia) and disguised by deception and legerdemain.

    But why; what is its purpose?

    Its purpose is to re-engineer society by those who realise that other ways have failed, will be likely to fail, or are unavailable to them.
    Thus corruption and perversion of already existing processes present themselves as the weapons of choice for this circumstance.
    I’m tempted to say that this will get worse in the short to middle term, so we may be stuck with it for now, I’m afraid…
    But because nothing that is corrupt can stand, it will be brought down by that same decay it practises, which will rot it from the head down.
    In the meantime people will suffer and damage will be done – as is intended.
    Its duration, however, can be much shortened by the prophylactic of revelation and truth, which is why Rick’s work is so very valuable.

