New Specification A Levels – Waiting for the First Results

This post was originally written the week before the 2017 A Level exam results were released (hence the reference to 17th August on the image). I then updated it with the postscript once the results were published. I also re-posted my post on UCAS clearing.

I teach psychology (among other things) and last year I wrote about the Summer 2016 AS exams which were then the first test of the new specification, my teaching of it and interpretation of the assessment criteria. You can read that post here.

This year we’re waiting for the first results for the full two-year Advanced Level exams. While we had a good experience with AS, all those concerns about the first run-through of a Specification are still in my mind as I wait for the Advanced psychology results: 

  1. How will my students perform in the actual exams as opposed to our own assessments based on specimen materials?
  2. Will performance nationally vary widely from the usual norm, with a large consequent adjustment of grade boundaries (either up or down)?


1. Performance in the actual exams

One of the reasons I opted for the AQA specification was the support this board offered for the new specification, including sample assessments, mark schemes and commentaries. The last time the specification changed, the actual exam papers had contained some questions very different in style from the somewhat sparse sample papers. Support from AQA in advance was much better this time: there weren’t the same differences in the AS papers, nor in the A Level exams this Summer.

There were quite a few widely-reported errors in exams this season, and more recent reporting of the possible impact on students, for example this article from The Guardian on ‘the stress of sitting new untested exams’. Whether or not there were more mistakes than usual, this publicity does seem to have shaken the confidence of many students in the exams process itself.  

Although there were no errors in AQA psychology papers, one thing my students did have to contend with was errors in their brand new textbooks, particularly first print runs of first editions. I’ve seen this before when publishers rush to get texts out for new specifications. There are often mislabelled images, errors in tables, or inaccuracies in the indexing (i.e. mistakes arising in the production of pages, rather than in the authors’ text), but this time there seemed to be several factual errors. Much as it gives my ego a boost to be able to show, through reference to primary sources, that I was right and the textbook was in error, it doesn’t help students (except perhaps to teach them to question everything) and shakes their confidence in their reference materials.


2. Will performance vary nationally with unpredictable consequences?

This is a question we will only be able to answer when the results are out. As I wrote in my post about the AS results, such problems have occurred in the past when specifications have changed, most notably in 2011 (DFE, 2012). This did not seem to be the case for the 2016 AS exams, although more A grades were awarded in psychology. Hopefully this is an indication that Ofqual are on the ball and ensuring a smooth transition between specifications, so that students sitting the first year of a new exam will not be penalised.

Nevertheless, whatever the speculation, it’s the actual results that matter. So, like my year 13 students, I’ll be awaiting the A level results a little more nervously than usual this year. I’ll also be hoping that their results, and everyone else’s, will be a true indication of each student’s performance.

Postscript – 18th August 2017

Now that the results are available, it seems there was no wide variation nationally compared with the 2016 results (see this Ofqual infographic), although the media made much of the fact that more boys than girls received top grades. A* and A grades for the new A levels were slightly down on 2016, with Ofqual stating the changes reflected differences in prior attainment. The proportion of top grades in (unreformed) languages increased, as had been previously agreed, to counter the skewing of results by native speakers. I find it interesting that Ofqual’s analysis focussed on the top grades.

As for psychology, the proportion of A*/A grades fell 0.3% to 18.8%. There weren’t any shocks as far as the results of my own students went, although a couple did a bit better than I predicted and a couple missed out on a grade. It’s a small number to draw valid conclusions from, but if there was a theme, I think it was that those who worked hard did well, irrespective of their starting point, which must be a good thing.

Ten tips to avoid exam stress (revisited)

Exam season looms large on the horizon, and we teachers must balance motivating our students appropriately with awareness of the stress or anxiety they are likely to feel.

I wrote an earlier version of this post in April 2016. In 2017 there seems to be even more uncertainty, for teachers and students alike. In the new GCSEs we can’t guide students with any real certainty as to which grades they will achieve. For A levels, it’s the first time any of the new Advanced exams have been set, and only the second for new AS qualifications. Such uncertainties are likely to add to the anxiety of some students. We teachers need to be especially careful not to project our own worry onto those we teach.

Here, then, I am revisiting ten helpful things students can do to keep motivated and stay healthy too. The list originates from an (old specification!) A level psychology task I gave my students to do when they studied a unit on stress. The aim was to use what they had learned to write advice for fellow students. I have developed it over the years and this latest version is influenced by advice from our School Health Nurse, the NHS, and the charity Mind. 

Ten tips to beat exam stress

  1. Get organised. Make sure you know what exams you have, what kind of questions they will contain and when they are.
  2. Manage your time. Your time is precious, so make the best use of it by drawing up a revision timetable. Make sure you build in breaks between sessions.
  3. Stay in control by sticking to your plan and using it to review what you have achieved and what is coming next.
  4. The right environment. Work somewhere that is light, has enough space and is distraction-free. Music may be OK (you’ll know what works for you) but visual input from TV, screens & social media will just distract you.
  5. Boost your confidence. Use a revision journal, recall things that have gone well in the past and visualise your success.
  6. Eat healthily and stay hydrated. Avoid ‘energy’ drinks: they may give the illusion of alertness but actually impair your performance (that’s why you never see an advert saying ‘Drink Red Bull: it helps you revise’ – because it doesn’t).
  7. Get enough sleep; don’t stay up late revising. A tired brain does not work well, either at the time or the next morning.
  8. Friends & family. Let them know you have exams and need to revise. Keep in touch during those breaks you planned into your revision.
  9. Avoid life changes. Now isn’t the time to start a new relationship or plan to run away to the circus (however tempting that may seem).
  10. Understand your body and the signals it sends you. Recognise that signs of exam nerves like ‘butterflies in the stomach’, a dry mouth, or sweaty palms are nothing to worry about. They are just symptoms telling you that your body is preparing for action.

We include a version of this list in the revision advice we give to students and share it with parents through our school newsletter. This year we have also run special sessions on tackling exam anxiety, which have proved popular.

Students can get more help and advice on student life from the Student Minds website and these pages on the Mind website, where you can also download a PDF document. Advice directed at parents and carers can be found in this area of the NHS Choices website.

I hope you found this post useful. Please feel free to use and adapt it as you wish. I’d be interested in which resources other schools use.

Dear Santa… An education wish list

Dear Santa,

I know this is your busiest time of year, but amidst running your workshop, feeding your reindeer, checking your list (twice), and delivering all those toys, would you be kind enough to have a look at my school wish list? These are just suggestions; I certainly don’t expect everything, but some progress on one or two would be really helpful.

Invisible goal posts. Many children respond well to sporting analogies and I’d like a way to help explain how the new GCSE grades work. We could play a match where we know that there are goalposts, but aren’t allowed to know exactly where they are. Players can take shots at the end of the field and then, after the final whistle has blown, we can reveal where the goalposts were (adjusting them so that only a few players’ attempts count) and only then reveal the final score.

A new Progress 8 coefficient. I know I had one of these last year, so it isn’t very old, but it just doesn’t seem to be working properly. What I’d really like is a progress measure that measures progress and doesn’t get caught up in whether a school has got enough pupils doing particular qualifications.

A bucket. To be honest I’m not sure how I feel about buckets. I know they can be useful – you probably have one hanging off the back of your sleigh to clear up after the reindeer – and it seems that in English schools nowadays, everyone has to have their buckets full. The trouble is, I can’t seem to find the bucket I want. It’s called the ‘Really useful qualifications that help individual students fulfil their career aspirations, progress in life and become productive, responsible citizens within an egalitarian compassionate society’. If you could help with the search for this, that would be fantastic.

An understanding of the delegated SEND budget. My role at school is now focussed on inclusion and I have tried to understand how this funding works, but however hard I think about it, it doesn’t seem to make sense. The Bible has been of some help: Jesus apparently fed 5000 people with a few loaves and fishes. This seems to equate closely to the funding model, but even in this example there is no explanation of what to do when more people turn up, undergo a lengthy assessment process, have their needs identified in an EHC Plan, and then the school receives additional funding of… well, nothing.

A ticket to Shanghai. I’ve been hearing a lot about how well pupils do in Shanghai, particularly in maths, so I’d like to take a trip there. Hopefully I’ll be able to bring back some useful things: some resources and teaching methods, yes, but also generous non-contact time, a millennia-old appreciation of the value of learning, consistently high parental engagement, and an ingrained universal cultural respect for the status of the teaching profession, which also make up the full package.

Mousetrap. You know, the board game with lots of plastic bits that my mum said would only get lost. Not educational maybe, but I put it on my Christmas list each year through the 1970s. Thought I’d give it another go.

Thanks Santa, I’ll leave a mince pie, a nip of single malt, and a carrot for Rudolf by the fireplace as usual.
What’s on your list to Santa?


No more mobiles 

From September we are banning the use of mobile phones in the secondary phase of our all-through school (we have never allowed them at primary). As this is a contentious issue, I’m going to blog about how it works and what issues arise.

This first post is about why we took the decision.

Our school opened in 2003, combining former middle and upper schools. We moved into new buildings in 2006 and opened our primary phase in 2012. We have grown as a school in parallel with the growth in mobile phone use by children and the development of smartphone technology. Up to now our policy has been that mobile phones should be off and put away in lessons (unless a teacher specifies their use for an educational purpose) but can be used at break or lunch.

Last year a team at the LSE produced this paper on the positive impact on exam results at schools which had banned mobile phones – Ill Communication: Technology, Distraction & Student Performance. The results were clear-cut and chimed with the concerns that mobiles, and especially mobile access to social media, were a distraction to students. Nevertheless, we could also see that there could be educational advantages in using this technology, and that we could involve students in the decision. We spent a year monitoring the impact of mobiles. Our Principal spoke to students in assembly about the research on mobiles, how persuasive the evidence was for a ban, and how we would be considering it.

At the end of last term we announced the new rule, publishing it in our newsletter. We took many factors into consideration but four predominated in our thinking:

  1. A proportion of students are distracted by social media. As the LSE research found, these were more likely to be those who had fallen behind their peers and could least afford the distraction.
  2. A high proportion of behaviour incidents in school centred on phone usage. This was a minority of students but took up a disproportionate amount of staff time.
  3. In almost all incidents of bullying (and one-off interpersonal nastiness) social media was a key component. Incidents were prolonged and magnified because of social media comment. This involved a small number of students but a vast amount of staff time.
  4. Mobile phones are a central aspect of the involvement of children, by older peers and adults, in substance abuse, crime, and CSE (child sexual exploitation). Thankfully this involved only a very small number of children, but they are the most vulnerable of those in our care.

The most frequent educational uses of phones, outside of computing lessons, were recording homework or using them for reference (e.g. dictionary, web search). Other resources for these are available to all students.

In view of all this we took the decision to ban mobiles in school. We have addressed parental concerns about safety on the journey to and from school by collecting phones at the start of the day and handing them back at the end (yes, that has required logistical planning). If a pupil keeps their phone off and in their bag all day we will be none the wiser, of course, but if a phone is used it will be confiscated until a parent / carer can collect it. The exception we have made is for students with special needs or disabilities, or with English as an additional language, who use specific apps to help them access the curriculum.

All that has been theory up to now; we start on Monday 5th September. I’ll be posting updates on how it goes. Your comments are always welcome and I’d be interested in hearing how other schools address this issue.

New Specification AS – Waiting for Results Day

I originally wrote this a few days before the 2016 AS and A Level results came out. I added the postscript after the dust had settled on results day.

I teach psychology, one of the new AS specifications taught from September 2015. This Summer’s AS exams were the first test of the new specification, my teaching of it and interpretation of the assessment criteria. That adds a little spice to the wait for results day, for teachers as well as students!

We opted for the AQA specification, and had done the AQA A specification previously. There was really good support for the new specification from this exam board and a wealth of sample assessments, mark schemes and commentaries. The last time the specification changed, there were considerable differences in the style of some questions from the sample materials, but this wasn’t the case this time round. I’m aware, however, that odd things can happen in the first year of a new exam, and research commissioned by the DFE shows that grade distributions can plateau or fall when specifications change, as for A levels in 2010 and GCSEs in 2011 (DFE RB203, March 2012). It’s this knowledge that means that, irrespective of how long I’ve been teaching, how much I prepared for the new course, or how well the students performed in internal assessments or PPEs, I can’t be as sure of how they will do in the actual exams as I was the year before.

I was pleased to see that Ofqual are also aware of the issues surrounding assessment of the new qualifications and say they have taken steps to ensure standardisation across the transition, so that students examined in the first year of a new course are not penalised. They also plan to publish their analysis of the results on AS results day (you can read the Ofqual blog post on setting standards for new AS qualifications here). I haven’t been impressed with some strategies used to ‘maintain standards’, especially moving grade boundaries (as you can read in my post Beating the Bounds from last year), but I’m hoping that this move to be ahead of the curve and transparent with a prompt statistical analysis is a positive one.

So, like my students, I’ll be awaiting the results a little nervously this year – probably good for me to feel a little of what it’s like in their shoes – but I’m hoping that lessons have been learned from the introduction of new assessments in the past, so results will be a true indication of students’ performance.

Postscript – 18/8/16

The results are now all in; I now know how my own students did in their exams, and Ofqual have published their promised analysis (read it here).

Nationally, the picture seems fairly stable, with a 1% increase in A grades at AS continuing a trend (although only 0.2% for psychology). In general, Ofqual say the outcomes for the new AS exams were similar to those for the exams taken by 17-year-olds last year.

As for my own students, much the same story. There was a slightly positive VA versus forecasts overall, so I didn’t need to worry and will have a keen bunch to embark on the first run of ‘A level Year 2’ (we decided to enter everyone for AS this time round).

I hope your experiences with the A Level & BTEC results were similarly positive.

Beating the Bounds: A call for transparency over grade boundaries

I wrote this post in August 2015. A year later, anticipating the next batch of GCSE results, I still have the same concerns. As Geoff Barton has said (in this Guardian article), “I used to know what a C grade in English looked like and a grade A. Now it feels as if someone somewhere, in an obscure back office, makes the decision.”

‘Beating the bounds’ is an ancient practice that still survives in several English parishes. Members of the parish would travel the boundary beating the marker stones with greenery. Before the advent of accurate maps this ritual had the very practical purpose of ensuring that everyone knew the agreed position of the parish boundaries should any dispute arise in the future.

In recent years, controversy over exam grade boundaries has become a depressingly regular feature of results days. For some reason the need to ‘maintain standards’ now seems to require annual moving of the goalposts. I am not referring to the adjustments exam board panels make to raw marks to take account of annual variations in the difficulty of papers (which have always occurred and are explained by examiners each year) but to the wholesale statistical manipulation of results to maintain the proportion of students gaining a particular grade. Evidence of this often has to be inferred from strange grade distributions, the suspicious clumping of candidates’ scores just below the C grade boundary, or the realisation that teachers were spot on with their predictions, except at the C/D boundary.
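The objection here is essentially to norm-referencing: the boundary floats to wherever it must sit so that a fixed share of the cohort achieves the grade. A minimal sketch in Python (all marks and proportions below are invented for illustration, not real grade data) shows why improvement across a whole cohort then disappears from the results:

```python
# Hypothetical sketch of norm-referenced boundary-setting: the grade
# boundary is chosen so that a fixed proportion of candidates reach it,
# whatever the published assessment criteria say.

def norm_referenced_boundary(scores, top_proportion):
    """Return the lowest mark that keeps `top_proportion` of the
    cohort at or above the boundary."""
    ranked = sorted(scores, reverse=True)
    cutoff_index = int(len(ranked) * top_proportion) - 1
    return ranked[max(cutoff_index, 0)]

# An invented cohort of ten candidates' marks.
cohort_2014 = [45, 50, 52, 58, 60, 63, 67, 70, 74, 80]
# The next year, better teaching lifts every candidate by five marks...
cohort_2015 = [mark + 5 for mark in cohort_2014]

# ...but the boundary rises with them, so the same 60% get the grade.
print(norm_referenced_boundary(cohort_2014, 0.6))  # 60
print(norm_referenced_boundary(cohort_2015, 0.6))  # 65
```

Criterion-referencing would instead publish the boundary mark in advance and hold it fixed, so that genuine improvement shows up as more students achieving the grade.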

I believe that for students to succeed, it is essential that they know what they will be assessed on and how they will be assessed. That includes the criteria for each particular grade. In recent years Ofqual has become muscular in flexing its power to adjust grade boundaries – somehow maintaining standards by changing them. This has led to the peculiar situation where teachers and students strive for improvement and Ofqual seems to do its best to stamp it out!

At this point you may be thinking ‘but what about grade inflation?’ I appreciate that a function of external exams is to identify the differing abilities of candidates and an exam fails in this function if everyone gains the top grades. So, how can we maintain the credibility of exams when schools are continually improving the learning and exam technique of students through better teaching?

I think the answer is transparency. In the annual ritual of beating the bounds it wasn’t just the priests who marked the boundaries but the whole parish community. The whole point was that everyone knew where the boundary lines lay. In a similar way, it is surely important that all involved in education understand the requirements for achieving a particular grade. If, over time, the spread of results for a particular subject becomes too slanted towards top grades, such that the existing assessment criteria are in danger of becoming unfit for purpose, they should be adjusted. This should be announced in advance of students commencing the course. In this way, teachers, students, and their parents will all know what will be required to secure a particular grade, and the public will be aware that that year’s cohort faces a tougher exam, understanding that a year-on-year comparison cannot be made.

Of course, this already happens. A couple of years ago, GCSE science students in England knew it would be harder to get a higher grade than the year before. This only made it more extraordinary that those same students had to suffer retrospective tinkering with their English Language results after they had sat the exam.

If we have an exam regulator that is adept at monitoring and committed to transparency – if we all participate in an annual beating of the bounds – we should be able to achieve a robust, credible exam system that ensures that students can sit exams in the assurance that what is expected of them will never be changed after the event.