New Specification A Levels – Waiting for the First Results

This post was originally written the week before the 2017 A Level exam results were released (hence the reference to 17th August on the image). I then updated it with the postscript once the results were published. I also re-posted my post on UCAS clearing.

I teach psychology (among other things) and last year I wrote about the Summer 2016 AS exams which were then the first test of the new specification, my teaching of it and interpretation of the assessment criteria. You can read that post here.

This year we’re waiting for the first results for the full two-year Advanced Level exams. While we had a good experience with AS, all those concerns about the first run-through of a specification are still in my mind as I wait for the A Level psychology results:

  1. How will my students perform in the actual exams as opposed to our own assessments based on specimen materials?
  2. Will performance nationally vary widely from the usual norm, with a large consequent adjustment of grade boundaries (either up or down)?


1. Performance in the actual exams

One of the reasons I opted for the AQA specification was the support this board offered for the new specification, including sample assessments, mark schemes and commentaries. The last time the specification changed, the actual exam papers had contained some questions very different in style from the somewhat sparse sample papers. Support from AQA in advance was much better this time: there hadn’t been the same differences in the AS papers, nor were there any in the A Level exams this summer.

There were quite a few widely-reported errors in exams this season, and more recent reporting of the possible impact on students, for example this article from The Guardian on ‘the stress of sitting new untested exams’. Whether or not there were more mistakes than usual, this publicity does seem to have shaken the confidence of many students in the exams process itself.  

Although there were no errors in AQA psychology papers, one thing my students did have to contend with was errors in their brand new textbooks, particularly first print runs of first editions. I’ve seen this before when publishers rush to get texts out for new specifications. There are often mislabelled images, errors in tables, or inaccuracies in the indexing (i.e. mistakes arising in the production of pages, rather than in the authors’ text), but this time there seemed to be several factual errors. Much as it gives my ego a boost to be able to show through reference to primary sources that I was right and the textbook was in error, it doesn’t help students (except perhaps to question everything) and shakes their confidence in their reference materials.


2. Will performance vary nationally with unpredictable consequences?

This is a question we will only be able to answer when the results are out. As I wrote in my post about the AS results, such drops have occurred in the past when specifications have changed, most notably in 2011 (DfE, 2012). This did not seem to be the case for the 2016 AS exams, although more A grades were awarded in psychology. Hopefully this is an indication that Ofqual are on the ball and ensuring a smooth transition between specifications, so that students sitting the first year of a new exam will not be penalised.

Nevertheless, whatever the speculation, it’s the actual results that matter. So, like my year 13 students, I’ll be awaiting the A level results a little more nervously than usual this year. I’ll also be hoping that their results, and everyone else’s, will be a true indication of each student’s performance.

Postscript – 18th August 2017

Now that the results are available, it seems there was not wide variation nationally compared with the 2016 results (see this Ofqual infographic), although the media made much of the fact that more boys than girls received top grades. A* and A grades for the new A levels were slightly down on 2016, with Ofqual stating the changes reflected differences in prior attainment. The proportion of top grades in (unreformed) languages increased, as had been previously agreed, to counter skewing of results by native speakers. I find it interesting that Ofqual’s analysis focussed on the top grades.

As for psychology, the proportion of A*/A grades fell by 0.3 percentage points to 18.8%. There weren’t any shocks as far as the results of my own students went, although a couple did a bit better than I predicted and a couple missed out on a grade. It’s a small number to draw valid conclusions from, but if there was a theme, I think it was that those who worked hard did well, irrespective of their starting point, which must be a good thing.

Storm Doris: Do windy days wind children up?

This is a perennial topic for the staff room or playground duty, and this week many a veteran was predicting that the strong winds brought by Storm Doris would lead to some challenging behaviour.

But is there any evidence that high winds do affect children’s behaviour? I’ve often wondered, so I took the opportunity to collect some data on wind speed (published by the nearest weather station) and the behaviour incidents logged at our school over the last two school weeks, one of which featured lower wind speeds, the other higher speeds as Doris passed over the UK.

I’m not sure what that shows, and it isn’t a lot of data, but it doesn’t look like any kind of convincing correlation. On the other hand, it isn’t a precise measure (‘incident’ covers everything from homework not handed in to having to be removed from a lesson). Another interesting point is the positive side of behaviour: we gave out 12% more achievement points in the Doris week than when wind speeds were low. As in every week, the number of achievements recorded exceeded the behaviour incidents, with teachers giving out over ten times as many positive achievement points as negative behaviour ones.
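If you fancy running the same informal check on your own school’s figures, this is a minimal sketch of the comparison described above. All the numbers below are hypothetical placeholders (not our school’s actual data), and `pearson_r` is just a hand-rolled helper:

```python
# Hypothetical daily figures: a calmer week followed by the Doris week.
daily_wind_kmh = [14, 18, 12, 20, 16, 55, 60, 48, 52, 40]
incidents = [6, 5, 8, 4, 7, 5, 6, 7, 5, 6]  # behaviour incidents logged per day

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

r = pearson_r(daily_wind_kmh, incidents)
print(f"r = {r:.2f}")  # values near 0 suggest no convincing relationship
```

With real data, a scatter plot and a much larger sample would tell you more than a single coefficient, but it is a quick first look.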

In Oxford we were only on the southern edge of the storm, maybe the effect would be greater further north. Anyone want to share some data? 

What does published research tell us?

I had a brief look at the range of research on this topic (incidentally, it’s best to avoid typing ‘wind’ and ‘children’ into a search engine unless you’re researching flatulence). There are several ideas as to how high winds could affect behaviour, including the change in air pressure associated with storm fronts, extra-low-frequency atmospheric pressure oscillations, increased sensory stimulation, and an increase in positively charged ions. I didn’t explore this last one because the ions are created by hot, dry winds, and that doesn’t apply to February in the UK.

Bill Badger and Eric O’Hare of the University of Lancaster researched the effect of weather on the behaviour of students at a secondary school in Cumbria in 1989. They found that behaviour was affected by weather, but by changes in the prevailing conditions rather than by the type of weather itself. You can read the abstract here. In a US preschool study in 1990, Essa, Hilton and Murray found that stormy, unsettled weather caused children, especially girls, to interact more with other people than with toys (abstract here, and the paper is free if you sign up). A small lab study by Delyukov and Didyk in 1999 showed that artificially produced pressure oscillations reduced attention. Lovely controlled conditions (abstract here), but a long way from Year 9 on a windy, wet Wednesday lunchtime.

So, research suggests that changes in weather and atmospheric pressure do affect children (and adults), but there isn’t a clear link to increases in ‘wild’ behaviour at school.

If you’re interested in involving students in the topic the Met Office have produced a maths investigation for use with their Weather Observation Website.

Making the most of working memory capacity

“My problem is that I have been persecuted by an integer.” That’s how psychologist George Miller began his groundbreaking account of short term memory capacity in 1956 (read his original paper here). That integer was 7, the “magic number” that kept appearing in research on our ability to process incoming information. 

Short-term memory stores information collected from our senses. This may be transferred to our long-term memory, or it may be lost. From his own research and that of others, Miller concluded that the capacity of short-term memory is limited to 7 +/- 2 items. We usually lose information before it can be transferred to long-term memory because it is displaced by new incoming information.
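Displacement can be sketched with a toy simulation, purely illustrative (the word list and the fixed capacity of 7 are my own choices for the demo, not a cognitive model):

```python
from collections import deque

# A store capped at 7 items, echoing Miller's 7 +/- 2. Once the cap is
# reached, each new item displaces the oldest one.
store = deque(maxlen=7)
for item in ["cat", "dog", "sun", "map", "pen", "cup", "key", "jar", "box"]:
    store.append(item)

print(list(store))  # "cat" and "dog" have been displaced
```

Nine items went in, but only the most recent seven survive; the first two were pushed out just as new classroom information pushes out what came before it.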

Cognitive psychologists Alan Baddeley and Graham Hitch then developed the model of a simple memory store into the concept of working memory, but the principle of a limited capacity remains. Teachers need to be aware of this in presenting students with new subject content. A feeling of being overwhelmed by new information isn’t because our brain is ‘full’ but rather because the capacity of our working memory to process new information is being exceeded. An understanding of the limits of working memory can help teachers plan accessible learning activities for all students and also recognise those who have poor working memory.

In general, we should think about how much information is presented at once and how many items, or instructions in a sequence, students have to recall without prompts in order to complete a task. Most of us would struggle beyond 7 for an unfamiliar task, and some students will not readily recall this many. Examples where teachers should consider this in the design of resources and tasks include:

  • The layout of presentation slides and the number of items on each
  • The number of options or menu items in electronic / online resources
  • The layout of activity sheets – how much information is presented at once
  • The number of steps or stages in a sequence of instructions. Should some steps be broken down further into sub-stages?
  • The number of verbal instructions, repetition, and availability of non-verbal memory aids.
  • What assumptions do instructions for practical activities make about students’ recollection of previous routines?
  • How much do students have to remember in order to complete homework?

Much of this would be considered good advice for general planning. We also have to give additional consideration to children who may have more limited working memory capacity.

Characteristics of children with poor working memory (Susan Gathercole)

  • Children have good social skills but may be quiet or reserved in collaborative learning activities.
  • May appear forgetful, inattentive or easily distracted in class
  • May not follow through instructions or complete tasks
  • Forget key content of messages, instructions or homework

If you’re like me, when reading that list you will recall children you teach who have these characteristics. It is well worth considering that the ‘inattentive’ or ‘distracted’ child may be experiencing difficulties with working memory. This can often result in poor academic progress over time. Research has focussed on reading and mathematics, but other areas of study are also likely to be affected. 

On recognising these signs, there are a number of things that teachers can do to help students, including:

  • Reducing the working memory load by decreasing the number of items that need to be remembered at one time, particularly by restructuring complex tasks
  • Increasing the meaningfulness of new material by placing it in context, and its familiarity by making explicit links with prior learning and similar information or tasks that the student has encountered before
  • Repeating key information frequently, using different formats
  • Using memory aids as appropriate for the student; these could include key vocabulary, visual scripts, framing tools to break down tasks into stages, number lines or grids, literacy place mats, etc.
  • Helping the child to develop specific strategies, such as devising their own memory aids, confidence in asking for help, ‘3 before me’ resourcefulness strategies (e.g. ‘Brain, book, buddy’), and improved organisational skills.
  • Providing specific support for students in collaborative tasks, providing context and making roles and outcomes clear. I’ve written more on this in my post on ‘Making group work work’.

It is also worth recognising that our working memory capacity increases throughout childhood. For some children, the issue may be a developmental delay and with support they will catch up with their peers.


Can training improve working memory?

A considerable amount of research has been conducted into whether it is possible to train children (and adults) to improve working memory. The results are mixed, but overall this research indicates that training can improve short-term performance in specific tasks, and that these improvements do not generalise to other tasks or skills. This evidence suggests that our efforts as teachers may be better placed in helping students make the most effective use of the working memory they have, rather than attempting to increase its capacity.

Further reading

Miller, George A. (1956) The Magical Number Seven, Plus or Minus Two: Some Limits on our Capacity for Processing Information. Originally published in Psychological Review 63: 81-97. A transcript of Miller’s lecture on short-term memory capacity mentioned at the start of this post.

Gathercole, Susan & Alloway, Tracy (2007) Working memory and learning: a classroom guide. Harcourt Assessment, London. A very accessible short practical guide for teachers.

Melby-Lervåg, Monica & Hulme, Charles (2013) Is Working Memory Training Effective? A Meta-Analytic Review. Developmental Psychology 49(2): 270-291. A meta-analysis of the effectiveness of working memory training, covering over twenty studies.

New Specification AS – Waiting for Results Day

I originally wrote this a few days before the 2016 AS and A Level results came out. I added the postscript after the dust had settled on results day.

I teach psychology, one of the new AS specifications taught from September 2015. This Summer’s AS exams were the first test of the new specification, my teaching of it and my interpretation of the assessment criteria. That adds a little spice to the wait for results day, for teachers as well as students!

We opted for the AQA specification, and had done the AQA A specification previously. There was really good support for the new specification from this exam board and a wealth of sample assessments, mark schemes and commentaries. The last time the specification changed, there were considerable differences in the style of some questions from the sample materials, but this wasn’t the case this time round. I’m aware, however, that odd things can happen in the first year of a new exam, and research commissioned by the DfE shows that grade distributions can plateau or fall when specifications change, as for A levels in 2010 and GCSEs in 2011 (DfE RB203, March 2012). It’s this knowledge that means that, irrespective of how long I’ve been teaching, how much I prepared for the new course, or how well the students performed in internal assessments or PPEs, I can’t be as sure of how they will do in the actual exams as I was the year before.

I was pleased to see that Ofqual are also aware of the issues surrounding assessment of the new qualifications and say they have taken steps to ensure standardisation across the transition, so that students examined in the first year of a new course are not penalised. They also plan to publish their analysis of the results on AS results day (you can read the Ofqual blog post on setting standards for new AS qualifications here). I haven’t been impressed with some strategies used to ‘maintain standards’ especially moving grade boundaries (as you can read in my post Beating the Bounds from last year), but I’m hoping that this move to be ahead of the curve and transparent with a prompt statistical analysis is a positive one.

So, like my students, I’ll be awaiting the results a little nervously this year – probably good for me to feel a little of what it’s like in their shoes – but I’m hoping that lessons have been learned from the introduction of new assessments in the past, so the results will be a true indication of students’ performance.

Postscript 18/8/16

The results are now all in, I now know how my own students did in their exams, and Ofqual have published their promised analysis (read it here).

Nationally, the picture seems fairly stable, with a 1% increase for A grades at AS continuing a trend (although only 0.2% for psychology). In general, Ofqual say the outcomes for the new AS exams were similar to those for exams taken by 17 year olds last year.

As for my own students, much the same story. There was a slightly positive VA versus forecasts overall, so I didn’t need to worry and will have a keen bunch to embark on the first run of ‘A level Year 2’ (we decided to enter everyone for AS this time round).

I hope your experiences with the A Level and BTEC results were similarly positive.

Don’t Read This – learning with a little reverse psychology

In December I contributed at a TeachMeet organised by Rob Bown (@CheneyLearning1) at Cheney School, Oxford. My presentation was about using visual cues to help A Level students link researchers with particular research studies and theories.

However, what seemed to catch most people’s imagination was one particular type of resource I mentioned – files and folders in the student shared network area specifically titled ‘do not read this’. I haven’t found a better way of getting students to read files!

The presentation was about how I’d addressed the increased number of named researchers on the new AQA A Level Psychology specification. Some students find it difficult to link researchers with particular studies or theories, so I wanted to introduce more support.

I introduced more photos of researchers into my teaching. One way was by linking them to descriptions of their work and research findings. 

Another was to add visual cues when we were thinking about the significance of their research findings in class.


I also did this when we looked at how researchers were influenced by the work of others.


Apart from these visual cues within lessons, I also gave some biographical detail and encouraged students to research the life and work of psychologists as homework. In addition to all this, I ‘hid’ some further information in plain sight on the student area of the network. Each folder is headed with something like ‘Do not open’ or ‘Don’t read this!’

 These files go beyond the specification content to consider issues connected with the research covered in lessons and the background of researchers. For example, for the social psychology unit on social influence, I wrote a piece considering the fact that so many researchers in the field of minority influence themselves grew up as members of deprived, and sometimes persecuted, minority groups. These seem to have met with a good response; I quipped at the TeachMeet that I haven’t found a better way of getting students to read around the subject. Of course, I have to use this approach sparingly or the network would just become cluttered with files telling people not to read them!

Does it work? Well, comparing the responses of students to 12-mark questions this year and last year, there has been an increase of around 40% in accurate references linking researchers to studies and theories, so it does seem to help. I think it’s a strategy that can be applied to any subject with similar requirements; I hope you find it useful.

Your comments are always useful and I’d love to hear about strategies you use to address this issue.

Let it Go – Achieving a better work-life balance.

I wrote my original post about using Brandon Smit’s self-regulatory technique to improve work-life balance on 7 January 2016. I updated it at the end of February with my reflections after 6 weeks. In short, I’d really recommend giving it a go.

Last year I wrote a post, Getting the Better of Email, about my attempt to deal with email more efficiently (it’s going quite well, thanks for asking). In that post I also mentioned planning my day in 15-minute chunks so that when the unexpected occurs, it only derails what I had planned for a few of those chunks.

The problem is: what to do with the work that gets derailed? I have to reschedule it, and sometimes that will have to be for another day. I often find, however, that it’s thoughts about this planned-but-unfinished work that intrude into my downtime or prevent me from getting to sleep.

I recently came across this research paper by Brandon W. Smit, reported in the British Psychological Society Research Digest here, which looks at the effectiveness of a simple technique for dealing with this type of difficulty in ‘detaching’ from work.

Smit asked workers to create plans of where, when and how to resolve goals they had not yet completed at work. Adapted for teachers, this could be:

“I’ll go into work tomorrow and after morning staff briefing I’ll collate the data I need so that I can complete the CPD evaluation requested for the Governors’ meeting.”

He found that for a subset of his participants, those high in job involvement (sounds like teachers to me), this simple planning technique increased their ability to detach from work when at home to a statistically significant extent.

Putting this together with my previous post, I’m going to start the New Year by using the following elements to try and make a clearer work-life boundary:

  • Segment work tasks into 15-minute blocks, or multiples of them.
  • Define clear goals for each of these work blocks.
  • At the end of the day take stock of the goals I have successfully met and any that remain incomplete.
  • Use Smit’s suggested planning technique to decide when, where and how I’ll deal with unresolved goals.

February 2016 Update

I’ve been using this idea for about six weeks now and it really does seem to make a difference. Ending my working day by reviewing what I have achieved and writing a single-sentence plan on how I’ll deal with incomplete tasks or unresolved issues does seem to allow me to detach more from work so family time can be family time. I’m also sleeping better – I no longer lie awake thinking about work issues and the number of times I wake up in the night with work thoughts has reduced to only two occasions in the six week period. It’s also helped me be better organised and more able to prioritise.

The technique doesn’t, of course, reduce the workload, so it hasn’t stopped the fatigue that comes at the end of a hard day! Nevertheless, I’ve found that using this simple exercise each day has made a real improvement in my work-life balance.

As ever, I welcome your thoughts and comments. If you decide to give this a go, it would be good to hear how it works out for you.