Leaping over crocodiles
Evaluation, can it work? … I think it can. I have worked for several years as a museum
evaluator for Renaissance East of England, Norfolk Museums and Archaeology
Service and as a freelancer. I started with a distinctly academic approach, having studied for an MSc in Research Methods, and was often surprised at what passed as hard evidence in the museum sector – even in studies commissioned from
professional researchers. Research or evaluation reports can make poor use of existing
theory and literature, may contain only minimal explanations about approaches
to sampling, data collection and data analysis and, when making claims from numerical data, do not always use appropriate statistical methods.
There is, therefore, a case for
improved skills around evaluation in museums, especially in critically
evaluating and reviewing research. This way we can be more demanding when
commissioning research and more proactive in using what evidence is already out
there. A further issue with evaluation
in the sector is that it often leans towards advocacy and has an emphasis on
demonstrating success rather than identifying what can be built on and improved.
The reasons for this are understandable, as museums are not a statutory service and are often focussed on making a case for their own worth. The issue, however, is not just the quality of the evaluation, but how it is able to influence practice.
Does evaluation make a difference?
This is a thorny issue and, in the past, I have felt that the gap between a
formal research report and its implementation might as well be a leap over a
crevasse filled with hungry crocodiles.
Nile Crocodile (Crocodylus niloticus), Norwich Castle Natural History Collection
I have begun to realise that the
word ‘evaluation’ has different meanings and that there are different levels of
evaluation, from formal research studies to what might better be termed
‘reflective practice’. When looking for
evidence that evaluation makes a difference, the focus is often on the big
decisions and on matters of policy – but what about the smaller things that
affect our everyday service delivery?
How evaluation is implemented
will depend on its ‘level’. When an individual is involved in evaluating their
own work, or has a direct connection to it, they often have the power to make
the changes they deem necessary first-hand. On the other hand, where the
evaluation concerns structures, policy or a number of stakeholders, implementing
change is more complicated and the level of ‘proof’ needed to persuade an
organisation to change is higher. Therefore, when planning evaluation, we
should start with the organisation’s structure and ask ourselves in each case
‘How will the decisions be made here?’ and ‘What is the level of proof needed to
instigate change?’
When it comes to getting research
findings implemented, I have had a lot to learn in my own approach. By and
large, staff are too overloaded to be interested in the nuances, caveats and methods of an evaluation – preferring concrete recommendations or clearly defined concepts. Working in this way can take some nerve, because it involves sticking your neck out and making judgements, and as an evaluator you are all too aware of the limitations of your data. It is also important to find a number of ways of getting the evaluation findings across, and nothing’s better than getting out there and talking to people, whether informally or formally.
Doing evaluation is only one piece of the puzzle when it comes to bringing
about change. Evaluation needs ‘embedding’ within systems – this might mean
involving the evaluator in a project team, ensuring there are built-in cycles
of service review, or having a clearly laid-out process for implementing
evaluation findings.
So where is my evidence that
evaluation can work? When writing this
blog I asked around our organisation for examples of how evaluation had been
used – here is my evidence.
School children enjoying a trip to Time and Tide Museum in Great Yarmouth
‘I always try to speak to the lead teacher during an event so I can gauge how the event is working, and will actively ask
for an honest (not just pleasing) evaluation. Often, I find that with
written evaluations people are reluctant to criticise or reflect on what
could be improved. My fear is that if the event didn't work, the school might
not tell us and simply not book again. We also always hand out evaluation forms, most
of which are returned completed. In either case, when we receive
comments which indicate something is amiss, we will contact the teacher directly and either explain why x, y or z happened, or ask for ideas to improve whatever didn't work. This follow-up is essential because the teachers feel that their comments are valued. They become part of the planning process and are more likely to book with us again. Over the years, our
evaluation and assessment procedures have enabled me to build up good working
relationships with local teachers and our events have benefited from this
process.’ PD: Time and Tide Museum
‘When I first started running events in the
Norwich Castle Study Centre, I had a very small budget for marketing. I
imagine this is how a lot of smaller museums find themselves. I did a
basic feedback form for the first ten or so events, asking
people how they found out about the event as well as how much they enjoyed it
etc. The information about how much they enjoyed it was pretty much
useless in a way, because I just had 60 people telling me they enjoyed it – which is, of course, great, but it didn't help me change anything. The
evaluation was far more useful for targeted marketing as we were quickly able
to see what worked (whether people found
out about the event from the local paper or e-postcards). In the
end I saved a lot of money on flyers and it meant that when we were
arguing for support from marketing, we were able to advocate for those services. So I suppose the moral of that story is to
only ask questions where you care what the answer is, and that you can use the
results from… I think the most powerful and immediate evaluation is probably verbal. It is so informal that we often overlook it. I have realised this from working with the
public on a daily basis. Every time someone asks a question about what we
do, or why something is the way it is, or enquires about information, you subconsciously make a note to ensure that information is available in future –
whether you include it in a tour, change a bit of text, or raise it yourself as
a question. So I suppose that comment is about having an attitude which
seeks feedback and investigates it.’
RB: Norwich Castle Study Centre
‘A great example of how audience data has changed a museum in Great Yarmouth came when we mapped how visitors moved through the galleries before we did any redisplay work. I compiled a
floor plan and gave this to the Front of House Team. I then asked them to map ten
visitor movements per day for three weeks. The results of this mapping (along with written feedback) suggested that a high percentage of visitors used part of one gallery as a corridor, walking straight through rather than
looking in some half-bays to one side. When we were designing the new gallery
plans, this data helped me persuade staff that we should break up the long corridor space by adding an extra wall and enhancing the interpretation – this made
the new Romans gallery more of an individual statement space rather than just
part of a bigger room.’ JO:
Time and Tide Museum
‘The repeated requests for a guide book in
the visitor book enabled us to make a good case for this – and we now have a lovely, inexpensive souvenir guide. I regularly look at the 'What could we improve?' section of the visitor book to check we're not missing anything, or to pick out and rectify anything that has obviously gone pear-shaped.’
CT: Strangers Hall
‘Evaluation is very important to the way I
work as an individual, and I encourage my team to do likewise. I rely primarily on feedback forms, but also on other sources like TripAdvisor. I use
evaluation to find out if there is a problem, and if so I'll sort it out right
away. As a manager I always read all feedback as it helps me keep up to date
with what's going on.’
CM: Norwich Castle
I have come to the conclusion
that, for evaluation to work, you need both the will and the way:
- A genuine desire to use evidence and a culture that
is open to change
- Systems for embedding the use of evidence within planning and decision-making cycles.
Evaluation is deeply linked to
organisational change and should be an ingredient in the cake, rather than the
icing. So, happy cake making, and good luck leaping over those crocodiles!
Author | Amanda Burke, Evaluation Officer | Amanda.burke@norfolk.gov.uk / 01603 493657
Norfolk Museums and Archaeology Service.
Image courtesy of Norfolk Museums and Archaeology Service