Save time on data and spend it on teaching

I really believe that efficient data analysis that takes the minimum amount of time helps everyone.

It always depresses me when the calculator comes out for working out averages on a set of data.

Yet there is such a focus on getting the data right that sometimes double-checking it all can take over. Yes, the system might be inefficient, but it works and gets the right answers… Our children deserve better than that. The skilled teachers in our schools are there for doing skilled teaching: they shouldn’t be wasting their time on inefficient processes.

And here is where I make my admission: I use inefficient processes. I am comfortable with spreadsheets, so I use them – I do all my number crunching and contextual analysis in spreadsheets. Yet I have a super-powerful management information system in school. I know that it could do all the number crunching I need and more. But I’m not comfortable with it. One day, I will grow up. I just know it. And on that day, I will understand databases and be able to use them to make data work even better for my school.

But for now I’ll have to be content with my countifs and my vlookups.
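For readers who don’t live in spreadsheets: a VLOOKUP finds a key in the first column of a table and returns a value from another column. A minimal Python sketch of the same idea (the names and levels here are invented, not my actual data):

```python
# A VLOOKUP-style lookup: find a pupil in a table and return their assessment.
# The pupils and levels below are invented for illustration.

assessments = [("Amy", "3a"), ("Ben", "2b"), ("Cara", "4c")]

def vlookup(key, table, col=1):
    """Spreadsheet-style VLOOKUP: match key in the first column, return another column."""
    for row in table:
        if row[0] == key:
            return row[col]
    return None  # roughly what #N/A means in a spreadsheet

print(vlookup("Ben", assessments))
```

The spreadsheet equivalent would be something like `=VLOOKUP("Ben", A2:B4, 2, FALSE)`.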

I’m speaking at #WeTweetEd at BETT this Thursday. Come along and contribute to the discussion on using data as best as we possibly can.

Good data processes have an impact on teaching and learning.

You can make really efficient data processes within school that don’t actually do anything.

Some senior leaders would laugh if they came into one of our pupil progress meetings. We hold them three times each year. In each meeting the headteacher, deputy and class teacher come together to talk about each child’s progress in reading, writing and maths.

Much of the discussion arises from numbers on a sheet, but it is focused on the barriers each child faces and how, as a team, we might overcome those barriers. Sometimes that might involve a conversation between a senior leader and a parent; sometimes bringing in external professionals; sometimes the tweaking of some classroom practice.

I get the impression that at some schools, senior leaders are so concerned about the ‘big picture’ of what the data shows them, that they forget about the details. But the details are called children. And a school where children are just numbers on a sheet of paper is no school at all.

I have to admit, sometimes I can get a little task focused when I’m creating my latest uber-sheet: in this one, I tell myself, the data process will be so efficient it will hardly detract from our time at all.

It’s important to remember, that each number crunched, each set averaged, is just another tool to help teachers with their job: teaching.

I’m speaking at #WeTweetEd at BETT this Thursday – it would be great to have you there to contribute your thoughts on data too.

Good data requires a good person more than a good process

I’m speaking for a few minutes at WeTweetEd #5 at BETT on Thursday. The subject is data, and I’m essentially going to say three things:

  1. Processes on data are only any good if they have an impact on teaching and learning.

  2. Efficient data analysis that takes the minimum amount of time helps everyone.

  3. Moderation should be treated as data’s beautiful bride, and not its jilted lover.

However, for now I’m going to muse on this thought: it is more important to have a person in your school who is good with data than a good process for handling data.

The reason for this is that the amount of data we have to process each year increases. RAISEOnline gets larger and the emphasis on what kind of data is important changes.

Recent changes to curricula – EYFS, the National Curriculum – and also to Special Educational Needs provision mean that new systems have had to be developed on an almost yearly basis. Yet the core process remains pretty constant:

  1. teachers assess where their children are at;

  2. we give these assessments numbers;

  3. we use maths to analyse the numbers so we can maintain a big picture of what is going on;

  4. we target school resources appropriately, both at a classroom and a whole school level.
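The four steps above can be sketched in code. Everything here is invented for illustration (the names, the point values, the support threshold); the point mapping follows the common convention of two points per National Curriculum sub-level:

```python
# Illustrative sketch of the four-step process; all data is invented.

# 1. Teachers assess where their children are at (NC sub-levels).
assessments = {"Amy": "3a", "Ben": "2b", "Cara": "4c", "Dev": "2a"}

# 2. We give these assessments numbers (two points per sub-level).
points = {"2b": 15, "2a": 17, "3c": 19, "3b": 21, "3a": 23, "4c": 25}
scores = {name: points[level] for name, level in assessments.items()}

# 3. We use maths to analyse the numbers and keep the big picture.
average = sum(scores.values()) / len(scores)

# 4. We target resources: flag children well below the class average
#    (the two-point margin is an arbitrary choice for this sketch).
needs_support = [name for name, s in scores.items() if s < average - 2]

print(average)
print(needs_support)
```

In practice steps 2–3 are my spreadsheet formulae and step 4 is the pupil progress meeting, but the shape of the process is the same.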

However, because of almost yearly changes to the context of most schools, the numbers change. And the contextual data changes. So a person is needed to manage these changes and make them work for each school.

I’ll give you an example:

In our last Ofsted we knew our school was good and we had the data to prove it. Even better, the teaching in the classrooms was so good it was almost irrefutable.

Almost.

The Ofsted inspector was looking for numbers that we didn’t quite have. Instead of in-year numbers, he wanted numbers that showed progress over the last year (i.e. from February to February instead of September to July).

It took me 6 hours and quite a bit of jiggering around with formulae to make the spreadsheet do what I wanted it to: the 363 calculations that would generate the 363 numbers the Ofsted inspector required. I learnt a lot about ‘countif’ functions that night. Without that spreadsheet, though, it would have taken a lot longer, probably 3 days, and we wouldn’t have got the data done in time.
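The heart of that night’s formula-jiggering was counting how many children had made expected progress between two snapshots. A sketch of the idea, with invented point scores and an assumed expected-progress figure (not our actual data):

```python
# COUNTIF-style progress count between two snapshots (February to February).
# Point scores and the expected-progress figure are invented for illustration.

feb_last_year = {"Amy": 17, "Ben": 15, "Cara": 21, "Dev": 19}
feb_this_year = {"Amy": 21, "Ben": 17, "Cara": 25, "Dev": 21}

EXPECTED = 3  # one year's expected progress, in points (an assumption)

def countif(values, predicate):
    """Spreadsheet-style COUNTIF: count the values meeting a condition."""
    return sum(1 for v in values if predicate(v))

gains = [feb_this_year[n] - feb_last_year[n] for n in feb_last_year]
made_expected = countif(gains, lambda g: g >= EXPECTED)
print(made_expected, "of", len(gains))
```

Repeat that count across subjects, year groups and pupil groups and the 363 numbers stop sounding so implausible.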

Now Ofsted is an extreme example, but with change after change to how we assess things, the tweaks needed to keep data processes working could grind a school to a halt without the right person in place.

What AfL is for

Rob Coe recently posted an interesting essay about how AfL might well be over-rated.

I broadly agree. And of course I’m in no position to argue against him – my experience only relates to the impact of AfL on 12 teachers in a small primary school of around 240 children. However, my experience of AfL has been really positive and I’ll explain why…

Everyone knows that there are only 4 things that improve teaching, and one of them is subject knowledge; the purpose of AfL is to increase subject knowledge.

Teachers have curriculum strengths and weaknesses – this is particularly apparent in the upper reaches of primary school, where the required qualification in English and maths for a teacher is a ‘C’ grade at GCSE. Significant numbers of children in those year groups may be working close to that level, so the teacher’s subject knowledge may simply not be high enough to meet the needs of the students.

This is where AfL comes in. Assessing the children closely against rigorous banks of knowledge statements such as those found in the APP materials for English and maths, means the teacher discovers holes in their own subject knowledge – they find out what their students can do, they can see the next steps and they can determine whether they have the subject knowledge to teach those steps. At this point, if they don’t have the subject knowledge, it’s either time to panic, or seek help from their senior colleagues.

It is exactly at this point that things go wrong – senior colleagues (in other schools, I might add) are often keen to tick the AfL box rather than address the underlying problem. Unfortunately it is far easier to make things look like AfL is happening than to actually increase the subject knowledge in your staff – this involves a level of skill and compassion that is beyond many senior leaders in our education system. In this culture, rather than seeking the improvement they need, teachers who need to develop their own subject knowledge will develop all sorts of strategies to conceal it. In fact one of those strategies is writing the letters WALT and WILF on your whiteboard – a point that Professor Coe alludes to.

It is the culture of the school that makes a difference here. In my school we are all learners and my headteacher repeatedly reinforces a ‘no blame’ culture. Only yesterday, my year 6 teacher (whom I line manage) was teaching me what modal verbs are. Similarly we are all happy to educate each other so that we increase each other’s subject knowledge. We have found systems such as APP and Incerts (an online assessment system based on the old National Curriculum) really useful because they have helped us identify what we are good at teaching and which areas we still don’t know much about. We use them as assessment for learning, but really that means increasing our own subject knowledge so we can teach better.

Good Data: the inspection clincher

Wednesday 14th May was a particularly stunning day for me. Not only did I finally teach a lesson good enough to be judged ‘outstanding’ by Ofsted, but the data I produced also helped us do well in the inspection overall.

First, some context. Ofsted is the national body that inspects state-funded education in England. Recently (January 2012) a new inspection framework was produced that streamlined some 22 categories into only 4. Consequently, we had begun hearing horror stories of many schools in our area moving down a category – it seemed it was harder to average out at the grade you had previously held. Ofsted judges schools in one of 4 ways: 1 – Outstanding, 2 – Good, 3 – Satisfactory and 4 – Inadequate.

Of course, our fears were that we would move down a category, losing our good status to take on that dreadful label – ‘satisfactory’. It was not to be. We came out as a ‘Good’ school and the report reads particularly well (I think).

So what of the data?

Well, we knew in our hearts that we do a good job for our children. The school is set in a part of Birmingham within the highest 20% of deprivation in the country. The children enter the school well below average and leave the school broadly in line with national expectations, but how could we prove that in numbers?

It was a function and 3 Google Spreadsheets that came to the rescue.

I keep tracking sheets for reading, writing and mathematics for all students, and looking at them I could see that the children we’ve taught for a while achieve better than those who’ve just joined us. In other words, the children we teach do well; we have a small but significant group of children who join us late and don’t make as much progress.

One of the data sheets that impressed the Ofsted inspectors

So I used my Google Spreadsheets to calculate a range of measures, from current attainment in each subject to the progress being made. The function that helped me the most was the ‘countif’ function – I’d recommend finding out how it works if you don’t already – there’s guidance within both Excel and Google Docs.
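If you haven’t met it, a spreadsheet formula like `=COUNTIF(B2:B31, ">=21")` counts the cells in a range that meet a condition. The same idea in Python, with invented point scores standing in for a tracking-sheet column:

```python
# A Python mirror of =COUNTIF(range, ">=21"): count point scores at or
# above a threshold. The scores below are invented for illustration.

scores = [15, 17, 21, 23, 19, 25, 21]

at_or_above = sum(1 for s in scores if s >= 21)   # like COUNTIF(B2:B31, ">=21")
percentage = 100 * at_or_above / len(scores)

print(at_or_above, round(percentage, 1))
```

One countif per group, per subject, per measure – that is how a handful of simple formulae turns into hundreds of numbers.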

I used the countif function to help me calculate 12 important numbers for each group – overall, boys, girls, SEND (special educational needs or disabled), FSM (free school meals) and higher achievers. This data showed that all groups who had been taught by us through the Key Stage 2 department (ages 7 to 11) were achieving at or above national expectations.
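The per-group breakdown is just the same calculation repeated over filtered sets of pupils. A sketch with an invented pupil list (the group definitions mirror the ones above; the numbers are not our real data):

```python
# Per-group averages over filtered pupil lists; all data is invented.

pupils = [
    {"name": "Amy",  "gender": "F", "fsm": True,  "send": False, "reading": 25},
    {"name": "Ben",  "gender": "M", "fsm": False, "send": True,  "reading": 17},
    {"name": "Cara", "gender": "F", "fsm": False, "send": False, "reading": 23},
    {"name": "Dev",  "gender": "M", "fsm": True,  "send": False, "reading": 21},
]

groups = {
    "overall": lambda p: True,
    "boys":    lambda p: p["gender"] == "M",
    "girls":   lambda p: p["gender"] == "F",
    "FSM":     lambda p: p["fsm"],
    "SEND":    lambda p: p["send"],
}

# Average reading points for each group (guarding against empty groups).
averages = {
    g: sum(p["reading"] for p in pupils if keep(p))
       / max(1, sum(1 for p in pupils if keep(p)))
    for g, keep in groups.items()
}
print(averages)
```

In the spreadsheet world this is a column of countifs and averageifs per group; the logic is identical.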

In addition, a second sheet showed that in each year group, progress in reading, writing and maths was good or outstanding.

Sample of the progress data for each year group (if you’re a UK education data guru, you’ll understand what those numbers mean).

In all, I used the spreadsheets to calculate 363 separate numbers to demonstrate to Ofsted that we are still a good school.

I was helped in this process because we use an assessment system called Incerts, which fills up my spreadsheets with meaningful numbers from teachers’ assessments. Once we demonstrated that our monitoring of this assessment was effective, by analysing current samples of teacher assessments in books, the inspection team were content to believe that our data did indeed demonstrate that we are doing a good job for our children.

And next time we’ll be ready to argue for ‘outstanding’.

What’s being abused here – the teachers or the data?

I was surprised to see the report on the BBC a few days ago about teachers being abused online. Surprised for two reasons: firstly, the headline statement read that over 4 in 10 teachers had been abused online by pupils or parents; and secondly, I had contributed to the NASUWT online survey which generated the results.

The email I received from the NASUWT

I was pleased to receive the email, because we’ve begun to have some highly positive experiences with Facebook at my school. I wanted to share them.

We had encountered some unpleasant Facebook incidents some two years ago and so had decided to set up our own Facebook page. It may be just good luck, but it seems that merely having a Facebook presence has deterred any pupils or parents from saying anything inappropriate. Both pupils and parents refer to the page to find out what’s going on in school – maybe that has ameliorated their language on the platform.

Anyway – I know that one swallow doesn’t make a summer, so I was hoping that, by contributing to a big online survey about social networking, the growing number of schools using Facebook positively might be discovered and reported on. Nope. Not this time. There was nowhere to record number of times abused = 0. There was nowhere to record positive statements about social networking sites.

It seems that just by filling in the survey I was recording that I had definitely been abused online by either parents or pupils or both.

But if that’s the case, I don’t get where the 4 in 10 teachers comes from. You see, only about half the teachers in the country belong to the NASUWT. Even in the strike ballot last year, less than half of those voted – I can’t imagine more teachers responding to an online survey than voted on striking over pension changes.

Furthermore, looking at the BBC report I can see no reference to the data of the survey, no methodology. There are no numbers saying how many people were actually questioned (I’ve no way of knowing whether it was the online survey I took part in that generated these numbers). The BBC have previous on this. Back in August 2011, they took a report from Plymouth University by Professor Andy Phippen to claim that 35% of teachers have been bullied online. Again, there are no numbers. 35% of teachers sounds bad – but if only 20 teachers were questioned, it’s not much of a survey.

Delving further, the Andy Phippen survey exists, but again its methodology is questionable. We finally have a number of responders – 377:

In total 377 people responded to the survey, providing a solid, broad base for the
rest of the research. (p4)

We discover that these responders answered an online questionnaire which was sent to them via ‘teaching mailing lists’ (p3) – although that still doesn’t tell us by what criteria each mailing list was generated.

The crunch for me comes with the question that generates the claim that 35% of teachers have been abused online. I was expecting to see some words akin to:

Have you ever been subjected to any online abuse?

But instead I see the question:

Have you or colleagues ever been subject online abuse?

Or your colleagues? Or your colleagues? What on earth does that do to the data? I work in a small primary school. Aside from ‘my colleagues’ from other schools that I work with, I have 30 colleagues from my school alone. Given that my school could be about average (and it certainly isn’t), my one vote actually counts for 30. That means each of the 377 responders to this survey is actually answering the key question not for themselves but for 10, 20, 30, maybe 100 or more colleagues. If we average at 30, that means there are effectively 11310 people in the survey. And 117 out of 11310 as a percentage is 1.03%.
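The arithmetic above is simple enough to check in a couple of lines (the 30-colleague average is, as stated, my own assumption):

```python
# Checking the survey arithmetic: 377 responders each answering
# for an assumed 30 colleagues, with 117 reports of abuse.

responders = 377
colleagues_each = 30     # assumed average colleagues per responder
reported_abuse = 117     # minimum number of reports in the survey

implied_sample = responders * colleagues_each
rate = 100 * reported_abuse / implied_sample

print(implied_sample)
print(round(rate, 2))
```

Change the assumed 30 and the implied sample changes with it, which is exactly why the question wording matters so much.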

So 1% of teachers have been abused online.

Don’t get me wrong, that is still a terrible number. With nearly a million teachers in the country, that means there are over 10000 of us who’ve gone through the pain of online abuse. It’s great that the government funded Safer Internet Centre exists to provide counselling and support for those teachers and strategies to reduce online bullying in the future.

But that’s not the issue here. The issue is that bad data has been used to create a statistic that just isn’t right. Now I’ve got no way of knowing the actual number of educators who have been abused in the sample of 377. The minimum number is 117, because that is how many have been reported. It could, of course, go much higher, but the questioning in the Prof Phippen survey isn’t good enough to find that out.

At least, to give Prof Phippen some credit, his survey does actually have the sample size in it. No joy from the NASUWT survey. The press release about the survey just tells us that 42% of those responding to the survey reported online abuse.

Hang on! “Of those responding?” “Of those responding?”

Again. That could be 42% of 50 people, making the survey next to meaningless. But now think back to the survey – it was a survey of online abuse – there was no opportunity to report ‘no abuse’. And only 42% of those responding said they had been abused? In a survey where you can only say “Yes I have been.”

This quite simply is at best bad data, and at worst plain lying. And of course the BBC and other reputable news media such as Channel 4 here, and the Independent here, are completely taken in by it.

To be fair to the Independent, they did interview Chris Keates and found that 1200 teachers had responded to the survey, but nobody asked about the questioning. With around 300,000 members, a response of fewer than 3,000 is again in the right ballpark to indicate that about 1% of teachers have experienced online abuse.

For the last time, online abuse of teachers is a terrible thing, but we’re not going to fix it by inaccurate data and sensationalised headlines during conference season.

Low attaining pupils in low attainment shock

The BBC article on the school league tables surprised me this morning. According to both the BBC and various politicians, low attaining children don’t attain well. Let me put that another way: children who are below average when they are 7 don’t become average by the time they are 11.


It reminds me of when Tony Blair, newly in power back in 1997, was alleged to have said that he wants all children to become better than average.


So what is supposed to happen? Bearing in mind that the National Curriculum is divided into ‘levels’ – broad descriptors of a child’s knowledge in each subject area – children are supposed to make 2 levels of progress between Key Stage 1 and Key Stage 2. Children are also expected to finish Key Stage 1 at level 2, although some low attainers finish at level 1 and some high attainers at level 3. If all goes to plan, the children:
  • move from level 1 to level 3; or
  • move from level 2 to level 4; or
  • move from level 3 to level 5.
Apparently a quarter of children who are ‘low attainers’ actually made it to level 4 – this means moving from level 1 to level 4, a great achievement. Disappointing, then, that Stephen Twigg, Shadow Education Secretary, should see his glass as half empty with this statement: “The fact that only a quarter of low attainers at age seven go on to meet the expected Level 4 in English or maths when they leave primary school is not good enough.”
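To make the arithmetic explicit – using only the two-levels-of-progress rule described above:

```python
# Expected progress: 2 National Curriculum levels between KS1 and KS2.

def expected_ks2_level(ks1_level, expected_progress=2):
    """The KS2 level a child 'should' reach under the progress rule."""
    return ks1_level + expected_progress

# A 'low attainer' finishing KS1 at level 1 is expected to reach level 3.
# Reaching level 4 is three levels of progress: beyond expectation.
print(expected_ks2_level(1))
print(4 - 1)
```

So the quarter of low attainers reaching level 4 didn’t merely meet the expected standard; they beat the progress expectation by a whole level.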


Fortunately we have a country with such amazing secondary schools that they will pick up these disastrously low expectations from primary schools and make good their low attaining pupils.


I’ll write next about how this announcement is akin to thrusting a red hot poker into the nether regions of all secondary schools, given the current SATs regime.