Civitas Misleads with Dishonest Report
Civitas' report on sex education misrepresents its own data to fit a political narrative
Civitas, a conservative UK think tank, recently released a report on relationships, sex and health education (RSHE) in UK schools: “Show Tell and Leave Nothing to the Imagination” (yes, it’s a very odd title). The report is by writer and broadcaster Jo-Anne Nadler, and its basic claim is clear from its subtitle: “How critical social justice is undermining British schooling.” Indeed, the entire report is predicated both on the assumption that social justice ideologies are overrunning British schools and on the conviction that the evidence Civitas collected proves this beyond doubt.
The report has already been picked up by major news outlets, which are using it to scaremonger about an epidemic of wokeness in schools: the Daily Mail called it a “damning report” that revealed “Teenagers are questioning their biological sex as they are exposed to 'woke' sex education material”. The Times used the report as the basis of a headline that read “One in ten older teenagers would like to change gender”. The report was even mentioned in a House of Lords debate, showing its potential influence on new legislation. So it’s worth asking: does the Civitas report actually show what the media - and its author - claim it does?
As an educationalist with expertise in questions of children's autonomy and liberal education, I have an academic interest in this topic; as a gay man who has watched the increasing assault on trans people in the UK with horror, I have a personal interest, too. And I can firmly declare that the report is based on shoddy research which is then misrepresented to fit the political agenda of its author, Jo-Anne Nadler. Nadler begins with the assumption that something is wrong with how schools teach about sexuality and gender, then ruthlessly applies this assumption to data which, despite coming from questions seemingly designed to elicit biased responses, does not in fact support it.
In this post I will explore the many problems with the “Show Tell and Leave Nothing to the Imagination” report. I will examine how the report is framed by an obvious political narrative which shapes the way the reader understands the results that follow; the study’s methods and survey design; and, finally, what the report’s findings actually are, and how they are comprehensively misrepresented by Civitas.
Framing - Politically-Motivated Research
The report opens with a sort of polemical preamble and potted history which clearly lays out the ideological leanings of the author. It is useful, because it shows the conclusion Nadler clearly wishes to draw. The opening sections make a series of tendentious claims about the state of British education. We read, for instance, that when schools display the Pride Flag they are not “merely denoting support for the broadly liberal consensus shared by most British people” but rather engaging in “the partisan discipline Critical Social Justice, that encompasses a range of identity driven ideologies”.
All this is indicative, apparently, of “a revolution” in British schools “delivered largely by stealth…that has disrupted the distinct remit of schools and of teachers to educate” and is the result of an “unholy alliance…[that] might be labelled a ‘Social Justice Educational Complex’” (p. 1). This, we are told, is dangerous, because children are now being taught what to think instead of how to think, effectively indoctrinated with all sorts of stuff about white privilege, the sins of the British state, and details about transgender people (that they exist, that they deserve equality).
I will allow the reader to make their own judgment as to whether any of this sounds plausible (I personally think it is total bollocks), but the important thing for our purposes is this: the report is framed and positioned within this highly conspiratorial and politically slanted perspective, and everything that follows is interpreted through this lens. “The results” of Civitas’ research, it is claimed, “underline the change in school culture” which the introduction describes - and so the proper thing for an informed critic to ask is whether or not this is true.
It is not true. In fact, the results of Civitas’ own research flatly contradict their hypothesis, as I shall explain in the final section below. But first, we have to look at the research and how it was conducted, to identify the many flaws in its design.
Sampling - Who Are These Students and Parents?
So, to the report’s methods. The report is based on two separate online surveys: one polled parents of 12-16 year-olds, the other polled 16-18 year-olds themselves. This, in itself, is odd: why not survey parents of children in the same age bracket, so that comparisons could be made between children's views at a particular age and the views of people who might plausibly be their parents? No methodological rationale is given for this decision, and although it doesn’t discredit the results, it raises the question of why the surveys were designed in this way - a question which goes unanswered, given the paucity of methodological information offered by the report’s author.
There is, for example, very little information given about the polling sample: we are told the online surveys were completed by 1,097 parents and 1,168 children, but no information is given about how these participants were selected. It seems particularly important to know whether some of the students attend the same school, or whether some of the parents have children at the same school, because many of the questions relate to observations at the school level, and there is no indication in the report of how many *schools* are represented in the sample. If the sampling method drew too many students or parents connected with the same handful of schools, we would not be getting useful data about schools in general.
The polling company seems to be a respected one, so I assume it does hold this information, but it isn't presented in the report, which seems to me a problem: if Civitas wants its research to inform policy, it should provide policymakers with as much information as possible about how that research was conducted, so that informed judgments can be made. This report does not do so, which raises troubling questions.
Survey Design - Why Are These Questions So Bad?
Worse than unanswered questions about the sample, though, are concrete problems with the design of the surveys themselves, and with the wording of the survey questions. There are two main problems: first, the questions posed are inconsistent in their focus, and second, some of the questions are either incoherent or seem designed to elicit biased responses.
Regarding inconsistency, some questions ask parents about their own child's school, while others ask about their perceptions of schools in general. We know from reams of educational research that these two types of question reliably produce very different sorts of answers. For instance, when asked about the school their own children attend, parents tend to be much more positive than when asked about schools in general. So researchers have to be careful when designing surveys like this to have a clear rationale as to why they are asking one type of question or another.
There seems to be no such rationale here. Parents are asked “Thinking about what you may have seen, read or heard, how worried are you, if at all, about the teaching of issues around sex, sexuality, and gender in your child’s school?” This is a question about a school they know, that they have personal experience with.
Later, though, parents are asked whether they think “Schools spend much too much time promoting social issues on sexuality, race and gender.” This is a question about schools in general, which we know is likely to elicit a different sort of response than a question about their own child’s school. Given this, why change the nature of the question? Why not ask again about their own child’s school? There are legitimate reasons why a study might switch between these two sorts of question, but no reasoning is given here, and that is suspect.
Much more problematic, though, is that some of the questions are either leading, poorly-worded, or both. Consider this question:
"As far as you are aware, which one of the following do you most agree with?
Schools spend much too much time promoting social issues on sexuality, race and gender
Schools spend a bit too much time promoting social issues on sexuality, race and gender
Schools spend about the right amount of time on this
Schools don’t spend quite enough time promoting social issues on sexuality, race and gender
Schools don’t spend nearly enough time promoting social issues on sexuality, race and gender
Don’t know"
The framing of the question seems to me to direct respondents toward a particular way of thinking about a contentious issue. Instead of a more neutral phrasing like "Schools spend too much time teaching about sexuality, race, and gender", the question asks whether schools do too much "promotion" of "social issues". These are conservative trigger words which have deep resonance with a portion of the population, and should be avoided in survey design. (Also, as with too many of the questions, the phrasing is ungrammatical: what does “promoting social issues on sexuality” mean?)
A number of key questions are written in this leading way. For instance, parents are asked to choose between "Schools should spend more time promoting social issues on sex, gender, race" and "Schools should spend more time on traditional subjects". This implies that you cannot teach traditional subjects while also addressing questions of sex, gender, and race - a position no serious educator would agree with. Thus the survey questions themselves incorporate a particular framing of the issues, one in which there is a competition between “social issues” and “traditional subjects.” I am not surprised that, when asked to choose between the two, more parents choose “traditional subjects” - but this is not a choice schools, students, or parents genuinely have to make, however much the biased framing makes it seem so.
Finally, the questions are frequently written in a confusing and even ungrammatical way. The worst offender, ironically, is the question which gave rise to the most surprising statistic in the whole report: the one which the media have interpreted as “one in ten older teenagers would like to change gender”. A deep dive into this question illustrates problems which persist throughout the whole survey.
The question reads:
"Moving on, do you personally, or do you know of anyone at your school, who wants to change their gender or has done so in the past?
Yes – you yourself
Yes – one person (not you)
Yes – several people
Yes – you yourself and other people
No
Don’t know
Prefer not to answer"
This is an extraordinarily bad survey question. First, it is wholly ungrammatical: “do you personally, or do you know of anyone at your school, who wants to change their gender” does not parse as a sentence. It would have to read, at the very least, “do you personally want, or do you know of anyone at your school who wants, to change their gender” to make basic grammatical sense at all.
This is not just nitpicking. It is important, because if those taking the survey understand the question differently, they will give answers based on their different understandings - in which case they are effectively answering different questions, and the answers do not provide reliable data.
Second, four questions are being asked here at once:
Do you want to change your gender?
Have you in the past wanted to change your gender?
Do you know of anyone at your school who wishes to change their gender?
Do you know of anyone in your school who has changed their gender?
These questions should simply be split apart, and made into different entries in the poll. Not doing this is a recipe for profound confusion, and a dereliction of a researcher’s intellectual and moral duty to conduct ethical and informative research.
When you consider the range of answers offered to this question, the problems get worse, because some of the categories overlap. Let’s say you are a student who personally wants to change their gender, and who has a single trans friend - what are you supposed to pick? Perhaps “Yes - you yourself”, because that would identify that you want to change gender yourself. But then later there is “Yes - one person (not you)”. Should you pick that, for your trans friend? But then that would seem to exclude you. OK, what about “Yes - several people”? Seems reasonable, because you plus your friend make several people, right? But then there is “Yes - you yourself and other people” - that seems like it might be the best option, except you don’t know other trans people, plural; you only know a single other trans person, and there is an option for “one person (not you)”, so the survey clearly distinguishes between knowing a single trans person and knowing many. So what to do? And then, to add to the confusion, the question is also asking whether you or anyone you know has transitioned in the past - which is a totally different question!
It’s a complete mess. Any competent survey designer would reject this question and redesign it, in the knowledge that no reliable information can come out of it. So when media outlets and policymakers claim that the report shows that 10% of older teenagers want to change their gender, I call bullshit. I think it probable that, given the atrocious wording of this question, a fair number of the 16-18 year-olds who selected the first option (“Yes – you yourself”) were responding that they themselves personally knew someone at their school who wanted to change their gender, not that they wanted to themselves.
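To get a feel for how much damage this kind of ambiguity can do to a headline statistic, here is a rough back-of-the-envelope simulation in Python. Every number in it is an assumption invented purely for illustration - the survey gives us no way of knowing the true rates - but it shows how a modest amount of misreading could turn a small genuine figure into something close to the reported 10%.

    import random

    random.seed(0)

    N = 1168              # roughly the number of 16-18 year-olds polled
    TRUE_RATE = 0.03      # hypothetical: share who genuinely want to change gender
    KNOWS_SOMEONE = 0.40  # hypothetical: share who know of someone at school who does
    MISREAD_RATE = 0.20   # hypothetical: share who read "do you personally" as "do you personally know of"

    reported_yes_self = 0
    for _ in range(N):
        wants_to_themselves = random.random() < TRUE_RATE
        knows_someone = random.random() < KNOWS_SOMEONE
        misreads_question = random.random() < MISREAD_RATE
        # A respondent ticks "Yes - you yourself" either because it is true of them,
        # or because they misread the question and are answering about someone they know.
        if wants_to_themselves or (misreads_question and knows_someone):
            reported_yes_self += 1

    print(f"Assumed true rate: {TRUE_RATE:.0%}")
    print(f"Rate the survey would report: {reported_yes_self / N:.0%}")

Under these invented assumptions, a true rate of 3% comes out of the survey at roughly 10-11%. I am not claiming these are the real numbers; the point is simply that, with a question this ambiguous, the reported figure cannot be taken at face value.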
Given the problems with the political framing of other questions, I would not be hugely surprised to find that the survey was designed to elicit such confusion.
Misinterpretation of Results - These Stats Don’t Say What You Say They Say!
The most brazen problem with this report, though, is that the findings of the original research are distorted so that they seem to reaffirm the author's preferred narrative, when in fact the data flatly contradicts it. There is some useful data here which we can rescue from the overall mess of the survey - it’s just that all the salvageable data pulls against the narrative Civitas is trying to push.
Remember, Civitas and Nadler’s argument is that woke ideologies have overrun and are undermining British education. The report closes with statements such as "a specific worldview has come to permeate schooling through a deliberate process of embedding, advocated by activists", and "Two separate polling exercises commissioned for this report find that contentious ideas are seemingly widespread in schools" (p. 46). Those are empirical claims, and the actual research does not support either.
Some examples, of many, of the findings of Civitas’ research, from pages 36-45 of the report:
More parents were "not very worried" or "not worried at all" "about the teaching of issues around sex, sexuality, and gender" in their children's school (52%) than were “very worried” or “quite worried” (46%). Overall, parents were chill about these supposedly “contentious” ideas becoming widespread, then. (p. 36)
More than half of parents thought schools did either "about the right amount" or *too little* "promoting social issues on sexuality, race and gender" (53%). By contrast, only 39% of parents thought schools did this too much. The written report conveys the opposite impression. (This is despite the extremely poorly-worded and leading question, examined above, which frames the issue in terms of "promoting social issues" rather than "teaching about" these issues.) (p. 37)
48% of parents thought schools either got the balance between traditional subjects and social issues like sex, race, and gender right, or should teach *more* about social issues than they do currently. Only 46% thought that schools should teach *less* about social issues than currently. So, despite the binary presentation, parents on the whole are happy with the balance between social issues and traditional subjects, or want more social issues to be explored. (p. 37)
Substantially more parents supported schools promoting events like Pride Week and LGBT+ History Month (45%) than opposed them (28%) (p. 39). Yet the report suggests support for such events is NOT "merely denoting support for the broadly liberal consensus shared by most British people" (p. 1). The evidence here suggests otherwise - more of the British public want schools to support these events than not!
More than twice as many parents thought that boys in their child's school were *not* made to feel ashamed for being male (50%) as thought they were made to feel ashamed (24%). This suggests that the Great Woke Project of making boys feel ashamed of their maleness is going more poorly than the report makes out. (p. 40)
As for students, despite scaremongering at the start of the report about how the traditional British school calendar is being usurped by "Critical Social Justice" events, students reported that their school recognizes traditional celebrations like Christmas (73%), Easter (59%), and Remembrance Day (50%) far more than anything to do with social justice.
By contrast, only 33% of students polled reported a celebration of Pride at their school in the last 12 months, and only 32% LGBT History Month. This is comparable to the percentage who recalled celebrations of St. George's Day (30%) and the Platinum/Diamond Jubilee (31%). So much for the Critical Social Justice takeover! (p. 41)
Transgender Day of Remembrance was the LEAST celebrated of the 19 cultural events asked about, at 10%. Pancake Day, by contrast, was celebrated in 46% of schools according to these teens. If Pancake Day is more than four times as likely to be marked in our schools as Transgender Day of Remembrance, then the narrative that gender ideology has permeated our schools at the expense of British traditions must be garbage. (p. 41)
70% of teenagers surveyed reported never having even discussed White Privilege in their school in any way - and this was the most commonly discussed social justice topic of the 15 the researchers polled on. 75% had never heard of Unconscious Bias in school. 83%, microaggressions. 85% of students said they had never had a discussion of decolonisation. The report, however, reads as if these percentages were reversed, as if every school was discussing white privilege and beating white children every day. (p. 41)
More students reported NOT having been taught that the UK is currently a racist country (52%) than reported having been taught that it is (42%). Given the clear evidence of the persistence of structural racism in UK society, it seems to me shocking that more than half of students say this has never even been discussed with them - and the report’s author would probably be surprised, too, given she can’t have read these results before writing about them. (p. 42)
The overwhelming majority of students surveyed agreed that "My school positively encourages different viewpoints when discussing contentious social issues" (80%), though a minority were concerned about the social consequences from other students if they expressed certain perspectives (35%). This speaks against the idea that British schools have been overtaken by a censorious monoculture. (p. 42-43)
More students reported NOT being taught that young men are a problem for society (55%) than reported the opposite (41%). Remember, this includes female students, some of whom may, one hopes, have been taught that, in some cases, men can be especially dangerous to women! (p. 43)
The teenagers surveyed overwhelmingly supported reducing the age at which someone can apply for a Gender Recognition Certificate from 18 years old to 16 years old (56% support vs. 22% oppose). The children, it seems, are a lot more progressive than this report gives them credit for. (p. 43)
All this evidence - from Civitas’ own surveys! - runs entirely contrary to the alarmist narrative of the report. The evidence shows that “contentious ideas” are not “widespread in schools" - unless you consider ideas which 70%+ of students have never heard of “widespread.”
Indeed, the results run so counter to Civitas’ preferred narrative that the report is reduced to utterly feeble statements like “We found that a great majority of pupils had encountered at least one of a list of partisan concepts at school” (p. 2). Excuse me, but when you list fifteen different, very broad ideas, I am not surprised that most young people have, at some point in their schooling, discussed at least one of them at least once! This is not serious analysis or serious argumentation.
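A quick, purely illustrative calculation shows why this statistic is so unimpressive. Suppose - and this number is mine, not Civitas’ - that each of the fifteen concepts independently had only a 15% chance of ever coming up at any point in a pupil’s entire school career:

    # Illustrative only: the per-concept probability is an assumption, not a figure from the report
    p_single = 0.15   # assumed chance a pupil ever encounters any one given concept
    n_concepts = 15   # number of concepts the survey asked about

    p_at_least_one = 1 - (1 - p_single) ** n_concepts
    print(f"Chance of encountering at least one: {p_at_least_one:.0%}")  # about 91%

Even when every individual idea is rare, and even treating encounters (over-simply) as independent, we would expect a “great majority” of pupils to have met at least one item from a list that long. Finding that most pupils have encountered one of fifteen broad concepts tells us next to nothing about whether any particular “partisan concept” is widespread.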
I cannot say this strongly enough: Civitas’ study disconfirms their theory. Their narrative is false. Even if you agree with Nadler and Civitas that the ideas they abhor are "contentious", the evidence they collected shows they are not widespread in British schools.
Conclusion
I have read a lot of reports from think tanks in my time. I have read a lot of academic research. But I have never read such a shoddy piece of research presented in such a brazenly inaccurate way. It is breathtaking in its audacity, ploughing ahead with a preconceived narrative entirely at odds with its own data, and hoping readers won’t notice. This report from Civitas and Jo-Anne Nadler is deeply academically unsound and, in the way it represents its findings, fundamentally dishonest. It is an embarrassment to the think tank and should be retracted.
In “Show Tell and Leave Nothing to the Imagination” Jo-Anne Nadler and Civitas show us a set of data, tell us a story which contradicts that data, and prove that the “woke takeover” of schools they fear is nothing but a figment of their imagination.