The next time someone asks for your targets to be SMART, you should smack them in the face (metaphorically, of course)


Please stand up if you’ve ever been criticised for your targets not being SMART enough. And please raise your hand if you’ve ever been the critic…

To the critics, this post might make you feel a little uncomfortable, and might also make you think carefully about what you say in the future. And to the criticised: would you like some ammunition to rebuff the criticism? If so, then please read on…

We’ll be discussing this subject in a lot of detail next Wednesday (October 7th) as part of an online training session on Development Planning, but I’d like to whet your appetite a little here. Wouldn’t it be interesting if some of the key issues with quality improvement were caused BY ‘smart’ target setting, rather than because it’s so often done badly? Not so smart after all, eh?

As we saw in last month’s session on self-assessment writing, the main job of self-assessment is to identify, as precisely as possible, the root-cause issues behind any adverse symptoms. What we so often see in providers’ SARs is insufficient analysis of these root causes, and so a jump to action far too early. This jump to action results in staff (egged on by their managers, of course) attempting to resolve the adverse symptoms as quickly as possible. That way lies disaster, dear reader, as you can never, ever resolve a symptom directly.

Incidentally, one of my favourite TV programmes in recent months has been The Medical Detectives. This show beautifully illustrates what happens when you try to treat symptoms rather than work hard to find what’s causing them. Anyway, back to target setting…

When you do find (or at least have a strong hypothesis for) the root-cause issue, you next have to frame it in terms of its negative impact on learning. Once we know what this negative impact looks like, we then have to write the opposite; in other words: what the positive impact on learning would be if we resolved the issue. This becomes our vision of success. And then we need to ask ourselves: ‘How would we know that we have achieved that positive impact?’ What indicators (a much nicer word than data) would we look for? All of this has to be done before asking: ‘So what will we do differently?’ (Tone of voice is all-important here, by the way, so make sure you ask the question with eyebrows raised and a gentle, curious tone.)

Have you spotted the issue with smart targets yet?

Specific   Measurable   Achievable   Realistic   Timebound

Though the ‘R’ can also stand for reasonable, relevant or results-based.

I’ll gloss over the fact that ‘achievable, realistic and/or reasonable’ are often no more than synonyms for mediocrity, and that (when teamed with dreadfully unambitious targets in value-added processes) they can consign a learner to the second-class learning they’ve probably been experiencing all their school lives… and concentrate on ‘measurable’.

Too many managers are overly keen on the idea that ‘measurable’ means placing a quantifiable number into a spreadsheet. So, are you now ready for a bit of homework? Have a look at your last development plan and tot up the proportion of action lines that use terminal data as the measure of success. In other words: success, pass, retention, high-grade, value-added rates, etc.

There are two issues here. Firstly, none of these data are directly relevant to the root-cause issues – ever. Secondly, they ‘happen’ once the learner has left, by which point it’s far too late to do anything for that learner. So it’s actually morally reprehensible to use terminal data as a success measure, not just ignorant – forgive my bluntness here…

The data, sorry, indicators, must provide an articulation of the root-cause issue once resolved. Period. The problem is that this often requires staff to make – are you ready for it? – a professional judgement. And managers don’t like professional judgements, do they? – which is why they criticise your SMART target setting…

Actually, that’s not quite true, is it? They’re fine with your professional judgement when you’re marking work and completing formative assessments, observing learners, completing formal assessments and writing predicted grades, and so on. They’re also fine with inspectors coming in and making a whole raft of professional judgements, but they won’t let you write them into targets. This means that targets are rarely an expression of the resolved issue. Which means that they’re often not just useless, but proactively build a massive flaw into your quality system, one that often manifests as a great deal more bureaucracy in your day jobs.

So what acronym might I now present for consideration? If not SMART, then what should target setting be?

  • The opposite of the adverse impact you’re wanting to resolve (in other words, low-level indicators, directly relevant to the root-cause issue).
  • A description of how learners will be different after you’ve worked your magic.
  • Requiring research, trial, evaluation and refinement (okay, milestones).
  • (It goes without saying that if you’re going to really fix the issue, it will take a few goes to get it right – so say goodbye to short-term fixes.)
  • Requiring evaluation, rather than audit.
  • Often multi-faceted, as there may be a range of related and observable positive impacts.
  • This isn’t a nod to SMART; it’s a crucial element of the research approach. In other words: what is the gestation period of your research? When will your planned new strategy happen? When will you begin looking at your indicators to see if the issue is being resolved?

(And all of this should make you really question any quality team’s calendar that says you should look at your development plan once a term… You need to look for the indicators at exactly the right time for each action line.)

So. All of that appears to make: TAARROT (and I’m now crying with laughter as I honestly didn’t plan that; but hey!)

Quality improvement isn’t a process; it’s a wonderful, creative journey.

There’s lots more to discuss, so do please join us next Wednesday if you can, or get in touch if you’d like to discuss it some other way.

 

To join the session on Wednesday October 7th, please use this link: https://www.skillsandeducationgroup.co.uk/events-details/self-assessment-using-a-data-springboard-approach-to-perfect-your-development-plan/?mc_cid=7c91309ed9&mc_eid=7d8197cd3a

 
