Author Archives: tlamjs

About tlamjs

Sometimes have a life, biology teacher, Assistant Head, former Head of Department, interested in teaching and learning. All views are my own or more likely flatteringly borrowed from far more intelligent people, but in no way represent the institutions I work for. Also on twitter @tlamjs.

How Many GCSE ‘Clean Sweeps’ Should We Expect in our Results?

Prologue: I asked my GCSE German teacher if she would tell me my mock grade before our lesson. I cannot believe how well she said I had done! I got a 9! Yay! 

– – – – –

We are at the end of an unprecedented period of reform for GCSEs. It is still unclear exactly how challenging the reformed qualifications and assessments are for pupils, nor can we yet predict with any certainty where grade boundaries will be set. At GCSE the grade 9 was to be reserved for only the very highest performances. Tim Leunig, then chief scientific adviser to the Department for Education, ‘guessed’ on Twitter in 2017 that only two pupils in the UK would achieve a clean sweep – straight grade 9s in all of their GCSEs – once the qualifications had switched to number grades (Leunig, 2017). A more scientific estimate was produced by Cambridge Assessment in an excellent article by Tom Benton in the run-up to the summer 2018 exam session. It was predicted that:

  • Of pupils taking eight GCSEs, between 200 and 900 candidates will achieve straight 9s.
  • Of pupils taking ten GCSEs, between 100 and 600 candidates will achieve straight 9s.

(Benton, 2018)

Nationwide, this means we are potentially looking at a very small number, in contrast to the familiar newspaper stories of huge swathes of pupils at the most selective schools achieving a clean sweep of A*s. It is therefore unfair to expect a raft of pupils to achieve straight 9s in the summer. Are your school leaders aware that possibly only 100 pupils will achieve ten grade 9s this summer? If not, show them Tom’s article in all its glory; it really is a thing of beauty.

So how many students actually achieved straight 9s in the 2018 results? A good question! Dave Thomson at the Education Data Lab has written a far better blog than the one you are reading that answers it (Thomson, 2018). He points us to Ofqual’s official guide to the 2018 data, which looked at pupils who took at least seven reformed GCSEs: only 732 achieved a grade 9 in all of their GCSEs, of whom 62% were female and 38% male (Ofqual, 2018). Do you think this data should be something your school knows about? If you think so, send them a link to Dave’s wonderful blog.
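As an aside, the gap between a naive expectation and the actual figure of 732 is itself instructive. A toy back-of-envelope sketch in Python makes the point; the cohort size and per-subject grade 9 probability below are my own illustrative assumptions, and Benton’s actual modelling is far more sophisticated than this:

```python
# Naive estimate of expected straight-9 'clean sweeps', assuming
# (unrealistically) that subjects are independent and every subject
# has the same per-pupil probability of a grade 9.
def expected_clean_sweeps(cohort_size: int, n_subjects: int, p_grade9: float) -> float:
    """Expected number of pupils achieving grade 9 in all n_subjects."""
    return cohort_size * (p_grade9 ** n_subjects)

# Illustrative numbers only: ~500,000 pupils, ~4% chance of a 9 per subject.
print(expected_clean_sweeps(500_000, 8, 0.04))  # about 3.3e-06, effectively zero
```

That the independence assumption predicts essentially zero clean sweeps, while 732 pupils actually achieved one, shows just how strongly grades are correlated across subjects: pupils who excel in one GCSE tend to excel in the rest.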

The overall picture is that the new GCSE qualifications are purposefully tough – the new “gold standard” according to the DfE (Hansard, 2019) – but still in their infancy. The more data that is formally collected and analysed, the better we can evaluate the new qualifications. It is certainly clear that comparisons to pre-2018 data will be tricky if not actively unhelpful. Yet you can be sure that there will be an unhealthy focus in some quarters on achieving ‘clean sweeps’. To end the post and counteract this tunnel vision I would like to bring attention to this quote from Tom Benton:

“Regardless of whether the predictions [of how many pupils will achieve straight 9 grades] are right or wrong, one thing is clear: achieving grade 9 in any GCSE subject is hard. Congratulations to all those students who achieve it in any subject at all.”

(Benton, 2018)

– – – – –

Epilogue: I asked my A Level Spanish teacher if he would tell me what my UCAS prediction was. I am so disappointed. I really feel I can do better than a C. It’s not fair 😦

– – – – –

References:

  1. Leunig, T (2017) “2 is my guess – not a formal DfE prediction. With a big enough sample, I think someone will get lucky…” Tweeted 25th March 2017. Accessed 10th June 2019. View here.
  2. Benton, T (2018) How many students will achieve straight grade 9s in reformed GCSEs? Accessed 10th June 2019. View here.
  3. Thomson, D (2018) GCSE results 2018: How many grade 9s were awarded in the newly reformed subjects? Accessed 10th June 2019. View here.
  4. Ofqual (2018) Guide to GCSE results for England, 2018. Accessed 10th June 2019. View here.
  5. Hansard (2019) IGCSE: Written question 231357, answered on 19th March 2019. Accessed 10th June 2019. View here.
  6. Prologue and Epilogue ‘jokes’ the work of the author of this blog post. Sorry!

Process and outcome

In February Andreas Schleicher, director of education and skills at the OECD, took aim at the lack of depth of study in many schools. His view is that a “mile wide, inch deep” education does not prepare young people for an uncertain future. In an effort to be responsive to the demands of the day, schools keep broadening their curricula, but what is added is often only of the moment and potentially irrelevant in future. The headline “No point teaching coding, says Pisa chief” makes his opinion on this particular area of study clear.

The article goes on to suggest there is no longer a need to teach trigonometry in mathematics. This is indicative of a more general misunderstanding: Herr Schleicher has never taught in a school; he is a statistician and researcher. It is therefore unsurprising that he has no real understanding of what happens in a classroom or a departmental meeting. The act of learning can sometimes be as important as the content that is learnt. The premise that content itself is the sole objective of education ignores the process of learning in favour of merely the demonstrable outcome of what has been learnt. Education is all about processes rather than outcomes, yet we find ourselves in a position where statistical analysis reduces ‘success’ to the metric that can most easily be measured, e.g. examination grades. This unfortunate state of affairs is compounded by pupils and schools being judged on the outcome rather than the process. The act of learning trigonometry is arguably more important than the understanding it brings, just as an undergraduate degree signifies that a student can read and review a variety of information under inflexible deadlines and work – sometimes with others – to analyse a specific topic, almost regardless of what is being analysed. In itself this demonstrates qualities of application before the topic-specific expertise or class of degree are even considered.

However, Schleicher does at least highlight the debate over depth versus breadth. The curriculum is finite, so anything that is included has ousted some other – perhaps equally important – content. There should always be debate over what is taught within a specific subject as well as what is learnt as a whole. Beyond hundreds of years of tradition, many topics are ‘outdated’ in one way or another or carry ‘redundant’ content; there are even whole subjects that could be questioned! Changes at GCSE and A Level have seen a narrowing of curriculum precisely because outcomes are so highly regarded when assessing ‘success’. Schleicher might be wrong to question the topics being taught in the classroom, but as someone who ultimately assesses the worth of education systems, his views on depth of study carry immense political influence.

 

Lessons from a Lion

Yesterday I had the pleasure of attending ‘An Evening with Stuart Pearce’, one of the events that make up Marlborough College’s Memorial Hall Festival. Without doubt, Stuart Pearce is one of the best speakers I have listened to. The former England international is reflective, insightful and incredibly honest; nothing like you might imagine an ex-professional footballer to be. During the event these points particularly resonated:

  • Treat both adversity and success as learning opportunities.
  • Do not fear failure. Using an example from his England career, Pearce declared that not putting yourself forward or not taking a chance is the bigger failing (referencing his successful kick from the penalty mark against Spain at Euro 96, following his miss in the Italia 90 semi-final shoot-out against West Germany): “failure is staying on the halfway line and not taking a penalty when you know you are one of the top five penalty takers in the team.”
  • Success breeds success. Until you have broken the glass ceiling and achieved real success you do not know what it takes to get there or how it feels to achieve it.
  • Success is also about hard work. Pearce stated talent was somewhere between 5-20% responsible for success.
  • While executing a game plan, playing well and winning is good, it pales in comparison with long-term progress. Pearce gave the example of seeing Jordan Henderson’s development from England U21 captain to established senior international as something much more important than any quick win.
  • The story that made the biggest impression on me was from mid-1995, when the then England manager, Terry Venables, telephoned him. El Tel told him that Graeme Le Saux would be his first-choice left back, perhaps hoping Pearce would decide to retire from international duty. Instead Pearce elected to continue making himself available as a squad player; a 32-year-old, 62-cap former England captain. Le Saux then suffered the misfortune of breaking his ankle two months before Euro 96, and this gave Pearce the opportunity to go on to become a great, repaying the hard work and humility of his decision.
The evening was fascinating. Not least because it was a chance for me to hear a childhood idol talk about events that I had seen from the outside. But also because Pearce was erudite and open. And I have not even mentioned his anecdotes about Brian Clough!

Questioning Questioning: SASFE18

This year sees the third St Albans School Forum on Education (SASFE) on Saturday 12th May. An energetic conference with a small-scale and very human touch; delegates are not there to make up the numbers but are part of a forum. SASFE is very much built on the successes of conferences such as the Schools History Project, TLAB, Pedagoo and other events of that ilk. There is no hidden agenda, no corporate branding and it is not-for-profit. We simply put on an event hosted by teachers, for teachers, listening to and learning from our peers, taking the best aspects of a TeachMeet and combining an overarching theme with plenty of time to stop, think and discuss what is being presented. Conversations flow throughout the day (they might even end up in the pub after the official event is over!) and hopefully impact on actual teachers in classrooms across the country.


    Our theme for 2018 is ‘Questions and Questioning’ with keynotes from Drs Caroline Creaby, Nick Dennis and Bettina Hohnen, as well as three seminars from a variety of figures in education. Tickets for this event are just £27 and this includes on-site parking, refreshments and lunch. To preserve the smaller-scale feel of the day places are limited, so if you are interested please book tickets via TicketSource (booking fee applies) or get in touch via email.


In the past SASFE has tackled ‘Assessment and Feedback’ and ‘Learning Relationships’. This year’s theme looks to explore the importance of questions. Teachers ask questions all of the time; questioning is an ever-present weapon in the armoury of teachers around the world. What could be more apt than starting to question questioning? Chris Moyse’s excellent series of Research in 100 Words posters features ‘Ask Questions’. In it he surmises that the most effective teachers ask pupils to explain how they answered a question, homing in on the thought process rather than just the final answer. Come and join us to question questioning itself. It is the delegates that really make the conference so worthwhile; I do hope you might consider coming along and experiencing SASFE for yourself.

    Thoughts from SASFE17: Education is in good hands

    Thoughts from SASFE16: The Conversation


    Book now

    Interdependent learning in action #srocks18

Do you really want your students to be independent learners? I don’t think so, and I will be talking about my stance on this issue at the inaugural Southern Rocks conference on Saturday 3rd February… If you have a ticket and are interested in this topic please come along to my postprandial workshop. If you are not lucky enough to have secured a ticket I am sure you could independently find this all out by yourself. Or not.

    More and more I am convinced that there should be interdependence between learner and teacher, combining elements of dependent (teacher-led) and independent (student-led) learning. At a very basic level this could be a teacher going through worked examples on a board moving on to students practising further exercises by themselves. In fact all lessons involve a spectrum* of interdependence.

    My Southern Rocks session will look at three ways of trying to promote interdependent learning:

    1. Self-regulation of learning
    2. Rules of studying
    3. Building topic-specific vocabulary

    1 and 2 are based on whole-school interventions, whereas 3 is an action research project from my very own classroom. However, if you will not be at Southern Rocks, Chapter 10 of Carl Hendrick and Robin Macpherson’s What does this look like in the classroom? is a very useful stimulus for discussing what we really mean and want from independent learning. Or for those really short of time see this thread from Daniel Sabato.

    *If you are being fussy you might call it a ‘continuum’ rather than ‘spectrum’.

    Key Words in A level Biology

This post follows up from Keywords and fluency of understanding, a post written before the Autumn Term. It seems apt to reflect on the vocab books and weekly testing I have introduced with my Upper Sixth class.


    How it looked in the classroom

Every student in my class was issued with a vocabulary book in September, specifically for the first topic, Plant Physiology and Biochemistry. The students themselves numbered it from 1 to 35, with two numbers per page; examples from three different students below should make this format clear. I had decided the key words in advance (see the appendix for the full list) and – as we gradually covered the topic – gave the students words to put in the correct place. For example, number 13 was the Calvin cycle. For homework students found a definition – handily enough there was a full glossary at the back of their textbook – copied it down and learned it for a vocab test at the start of the next lesson.


    Student A – key words 23 to 26


    Student B – key words 27 to 30

     


    Student C – key words 31 to 34

Fortuitously, the arrangement of the other Upper Sixth Biology classes gave this intervention a quasi-experimental design in which I could compare my class to the other three classes. The fact that the cohort numbers 40 and I have exactly a quarter of them, ten students, in my class was further good luck. This allows me to evaluate the vocab books and weekly testing for my class, which I will now call the experimental group. The other 30 Biology A level students in the Upper Sixth did not use the vocab books or have regular key word testing, so these students will be called the control group.

    What happened?

Good question (and for those who like a bit of value-added data (VAD) tomfoolery, this will be a real treat!). Firstly, as a basic comparison, the table below shows the mean average score in the Plant Physiology and Biochemistry end-of-topic test for the experimental and control groups:

    Experimental Group 66%
    Control Group 64%

Two percent! I have written before about gains as small as 1%, but is 2% a good result? Ultimately this does not tell us too much. The classes are all mixed-ability by the standards of the school, so it might seem that my intervention has had a positive effect. But students are organised into classes by the option blocking of their other subjects, so I might simply have been allocated students whose attainment happens to be higher.

This is where VAD might help. Our students are given ALIS scores, which project future A level performance from GCSE results; see the clever people at CEM for more details. I can therefore compare each student’s projected ALIS grade to their performance in the end-of-topic test and calculate a residual. For example, if a student is projected an A grade but scores a B their VAD residual is -1; if the same student gains an A* their residual is +1; and if they get their projected grade the residual is 0. The table below shows the average VAD residual for both groups:

    Experimental Group -0.8
    Control Group -1.2

    This means the experimental group, on average, scored 0.8 of a grade lower than their projected grade. However, this was better than the control group where students scored, on average, 1.2 grades lower.
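For the curious, the residual arithmetic described above can be sketched in a few lines of Python. The grade-point mapping and the example students here are my own illustrative assumptions, not the school’s actual ALIS data:

```python
# Toy sketch of the VAD residual calculation described above.
# Grade points and the example students are illustrative assumptions.
GRADE_POINTS = {"A*": 6, "A": 5, "B": 4, "C": 3, "D": 2, "E": 1}

def vad_residual(projected: str, achieved: str) -> int:
    """Residual = achieved grade minus ALIS-projected grade.
    Negative means the student scored below projection."""
    return GRADE_POINTS[achieved] - GRADE_POINTS[projected]

def mean_residual(students) -> float:
    """Average residual across a group of (projected, achieved) pairs."""
    return sum(vad_residual(p, a) for p, a in students) / len(students)

# A hypothetical five-student group:
group = [("A", "B"), ("A", "A*"), ("B", "B"), ("A*", "A"), ("B", "C")]
print(vad_residual("A", "B"))  # -1, as in the example above
print(mean_residual(group))    # -0.4
```

Averaging these signed residuals is exactly how the group figures of -0.8 and -1.2 above were produced.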

    Does this mean anything? Well, let us muddy the waters a little by considering that the experimental group has a second teacher (an excellent person and very knowledgeable biologist) who takes them for half of their allocated teaching time. How do the experimental group’s scores compare in the topic they did with me, using vocab books and regular testing, to the topic they did with Teacher 2? The information is tabulated below:

Topic | Mean average score | Average VAD residual
Plant Phys. and Biochem. | 66% | -0.8
Genetics | 65% | -1.1

We are still looking at very fine margins, but it certainly seems that they attained better grades, in terms of VAD, in the topic for which they used vocab books and regular key word testing. However (as all good biologists know), correlation does not imply causation. There are a huge number of possible reasons for this really very small discrepancy. For example, is the Plant Physiology and Biochemistry topic the same difficulty as the Genetics topic? Questions like this will continue to be considered as the year goes on.

    What did the students think?

They have humoured me and done as asked, producing the vocab books and gamely accepting the vocab tests. They do not seem to object massively to the intervention and, as the topic came to an end, were quite pleased with their completed vocab books (I think). In fact, they have even suggested making the vocab tests harder by mixing up previous key words.

    What next?

My class / the experimental group have almost finished their second topic, Energy and Respiration, so I will have further data to consider in due course. However, their points of view are equally important to me, so I have arranged a group interview with a few of my students and another with a few students from the control group. It will be interesting to consider their responses alongside the data!

    Appendix

Key words for the Plant Physiology and Biochemistry topic:

    1. Abscisic acid (ABA)
    2. Absorption spectrum
    3. Accessory pigment
    4. Action spectrum
    5. ADP
    6. Aerenchyma
    7. ATP
    8. ATP Synthase
    9. Auxin
    10. Bundle sheath cells
    11. C3 plant
    12. C4 plant
    13. Calvin cycle
    14. Carotenoid
    15. Chlorophyll
    16. Chloroplast
    17. Dormancy
    18. Endosperm
19. Gibberellin
    20. Granum
    21. IAA
    22. Light dependent reactions
    23. Light independent reactions
    24. Limiting factor
    25. Mesophyll
    26. Palisade mesophyll
    27. PEP
    28. PEP carboxylase
    29. Phosphorylation
    30. Photoautotroph
    31. Photolysis
    32. Photophosphorylation
    33. Photorespiration
    34. Photosystem
    35. Reaction centre

     

    Questions and Questioning

    This post is based on a school TLAG (Teaching, Learning and Assessment Group) meeting held at lunchtime on Monday. Wonderfully, sixteen teachers came along and gave up their free time to debate and discuss how we use questions. The stimulus material is shared below.

    Why Do We Ask Questions?

Good question! Well, it seems they give students the practice they need with new material and help connect new information to prior learning. Questions also allow a teacher to assess how well material has been learned and whether there is a need for re-teaching. Effective teachers ask students to explain the process they used to answer a question, e.g. orally, through rough working, mini-whiteboards, etc.

    Wait Times

Teachers significantly overestimate the time they give students to think about and answer a question. The typical teacher pauses, on average, just 0.7 to 1.4 seconds after asking a question before continuing to talk or attempting to ‘repair’ the interaction… Even worse, if teachers think a student does not know the answer or will be slow in giving one, this pause is often less than 0.7 seconds. However, if we can just wait three seconds, amazing things start to happen:

• The length and correctness of responses increase.
• The number of “I don’t know” and non-responses decreases.
• The number of volunteered answers greatly increases.

    Patterns of Questioning

    Is there a pattern of questioning that you notice? Using I (initiation of a question), R (response), E (evaluation) or F (feedback) and the colours red (teacher) and blue (pupil) we can construct some theoretical patterns of questioning:

    • IRERF
    • IRF
    • IRERERF

    Or how about getting peer evaluation or feedback from another pupil:

    • IREF
    • IRERF

    Next time you observe a lesson, try to make a note of the patterns that questioning takes in the classroom.

    ‘Open’ and ‘Closed’ Questions

‘Open’ questions give pupils the chance to “influence the subsequent course of the lesson”; they therefore transfer power from the teacher to the learners. ‘Closed’ questions tend to have one answer, or a narrow range of correct answers – e.g. recalling facts or information – and therefore follow a more predictable pattern (IRF).

    DISCLAIMER: Harris and Williams argue that the words ‘open’ and ‘closed’ are not sufficiently complex terms to cover the concept!

    Hands Up?

Finally, a consideration of how to select pupils to answer a question. A random name generator? Lollipop sticks? A brief clip from Dylan Wiliam’s BBC2 programme The Classroom Experiment, first shown in 2010, was used to end the session.

    Read Further

    Why Do We Ask Questions?

    Via Chris Moyse’s Research in 100 Words series.

    Good and Grouws (1977) Teaching effects: a process-product study in fourth grade mathematics classrooms. Journal of Teacher Education.

King (1994) Effects of teaching children how to question and how to explain. American Educational Research Journal.

    Wait Times

    Stahl (1994) Using “Think-Time” and “Wait-Time” Skillfully in the Classroom. ERIC Digest.

    Patterns of Questioning

    Mehan, H. (1979) ‘Learning Lessons: Social Organisation in the Classroom’. Cambridge, MA, Harvard University Press.

Mortimer, E. and Scott, P. (2003) ‘Capturing and characterizing the talk of school science’ in Meaning Making in Secondary Science Classrooms. Buckingham: Open University Press.

    ‘Open’ and ‘Closed’ Questions

Harris and Williams (2012) The association of classroom interactions, year group and social class. British Educational Research Journal, 38, 3, 373-397.

    Hands Up?

    BBC2 The Classroom Experiment (2010), Episode 1 

    BBC2 The Classroom Experiment (2010), Episode 2