What Do We Mean When We Talk About ‘Evaluation’, ‘Learning’ and ‘Reflection’?

 

In this essay Susanne Burns ponders the relationships she develops with companies through her work as an arts evaluator. With particular reference to working across multiple projects with Fevered Sleep, she gives a personal reflection on her own learning.

 
 

I have carried out numerous evaluation projects over many years and I have come to realise that my approach to evaluation is somewhat distinctive. Not unique, but perhaps evolving from more traditional approaches to impact evaluation. I believe that evaluation must be formative, interacting with the work; that data gathering should be built into the DNA and working processes of a programme of work. I do not stand outside a project as an observer but engage with it. I ‘walk alongside’ it.

I am committed to reflection, to reflexivity, to ongoing learning and enquiry. I may create ‘logic models’ but do not believe that a good ‘evaluation’ process is linear or logical. Instead, it has to be embedded and iterative; it has to reflect the working processes of the programme, project or company being evaluated; and it has to be owned and ‘lived’.

I am passionate about building relationships with organisations, over time and across projects, that embed evaluation and learning at the heart of the organisation.

Working with Fevered Sleep has been an important part of this process of refinement and adjustment for me. Directed by artists David Harradine and Sam Butler, Fevered Sleep have, for more than 20 years, been making innovative performance that challenges traditional approaches to making work for and with people. My first work with them began in 2014 when I was asked to evaluate the three-year pioneering programme Future Play, which sought to challenge the existing models for making and touring work for children:

“Future Play was exciting and challenging to support. It was a complex programme of work and the extraction of learning was always the principal driver for what was essentially an action learning project. Evaluation of it therefore needed to be formative – sitting within the project and interacting with it. The process of evaluation itself was important as it needed to engage partners from the outset and encourage learning and problem solving, so we used round table meetings at regular key stages in the project as an opportunity to gather data. We designed a process that was not onerous but built on what the venues and the companies were already doing, and this was supplemented by interviews and visits at the end of the project. We ‘triangulated’ the data being gathered by engaging a range of stakeholders: programmers, marketing, press and sales staff, local ambassadors, teachers, audience members and, of course, company members. We gathered qualitative and quantitative data throughout using a range of different tools including interviews, box office data, surveys, observations and round table meetings. There is no ‘one size fits all’ and no ‘right way’ to carry out meaningful evaluation. There is a toolbox of methods and approaches, but what seems to me to be most important is to ensure that the methodology is developed to ‘fit’ both the organisation and the project and, of course, to match the desired outcomes.”

— Susanne Burns, Evaluator, 2015

In 2016, when the company secured long-term funding from the Paul Hamlyn Foundation and the Wellcome Trust under the Sustaining Excellence strand, I was asked to continue working with them over a three-year period to examine and interrogate their work in participation. The relative security afforded by the funding allowed the company to ‘step behind’ the work and take some time and space to reflect and gather learning across projects rather than within individual projects. This is a rare and special opportunity and one to be maximised. We are still working out what this might mean, but we are clear that we need to capture a wide range of voices involved in the work. How do we capture the voices of people at different distances from the work, and what kinds of different stories or narratives might come out of this? What can we learn about the company’s approach to participation and to making work? Can we render the implicit explicit and surface the learning? This approach aligns with the overall approach adopted by the company to making work:

“We actively try to learn from each project and that feeds into the next one. We are constantly trying to learn and interrogate the process when we’re in it but then debrief at the end. We are constantly questioning and improving as we go along. We are constantly questioning. We are always working out what works as we are engaged in the process.”

— David Harradine, 2018

This internalised reflective process is built into the way work is made, embedded in working processes that draw on multiple voices and perspectives, that make research public and that result in multiple artworks and deep engagement and participation. In some senses it can be seen as an internal evaluation process.

But there is a potential tension here: an ongoing dilemma between external accountability and internal working processes. David articulated this as follows:

“I have had a strong internal feeling about articulating it as internal evaluation processes. There is an endless tension that I feel between being an artist who wants to work, think and collaborate in a particular way and the business that supports this. We need to question everything – we have to be nimble – but this is primarily about the texture of the art and how critical the process of questioning and provocation is to the making of the work. The art/creative process is the main thing – if I won £10 million and we didn’t need ACE funding we would still be reflective because this is what the art is about. We do not do it for funders. We are working as humans and know how we want to do it and we are doing it for ourselves not for external agencies. I suppose it is something about the same stuff being articulated in different ways.”

This approach also means that learning is embedded in the actual working processes of the company and is not an add-on for the purpose of accountability. This brings massive benefits to artists, who welcome the time, space and opportunity for reflection:

“The fact that we’re doing this, it makes me realise how many projects I’ve been part of where this isn’t built in. To like, evaluate, and maybe do it with like, getting money, and evaluation through Arts Council’s way of evaluating. But yeah this thing we said at the start about going back on what we’ve done throughout the year, I’ve realised I’m just rolling and rolling… And this feels so much like a completion in a way, of like, processing thoughts that have been kind of simmering away since March.”

— Kip Johnson, Artist

It adds value for the presenters, promoters and venue managers, who see reflection on the work as a kind of aftercare from the company:

“It is great to have this opportunity to take time to think and talk about what happened and what was learned, to consider how we might do things in future and to really reflect on it all. We don’t do this often enough as it tends to be about form filling and tick boxes rather than a real conversation and consideration of what worked and what worked less well.” 

— Promoter

So what have I learned and could this have relevance to other organisations and projects?

I have learned that ‘walking alongside’ a project is a challenging way to evaluate, but it is worth it! Standing outside is easier and brings objectivity, but you miss so much that way. I have learned to value the small things, the texture of the project and the relationships, and to give them weight in assessing impact.

I have learned to ask questions that are tough, that ‘check and challenge’ and that provoke reflection from all parties involved in the work. 

I have learned to truly value the voices of multiple participants in the work and to value these ‘stories’ and experiences as evidence of impact rather than undervaluing them as subjective, qualitative and therefore ‘soft’.

I have learned that ‘contribution’ is more important than ‘attribution’ – it is always a challenge to attribute change to any one cause because people and contexts are complex. I realised it was not worth worrying about it too much as, if change was occurring and the work was part of the mix, it had to be making some kind of contribution to the change. 

I have learned that any evaluation process requires flexibility, openness, honesty and transparency, and a commitment to changing processes and approaches when they do not appear to work. Fevered Sleep have demonstrated this at several stages in the process of this project and over the time I have worked with them. It is a privilege to support a commitment to reflection, learning and improvement and to support real ownership in a team of deeply reflective people.

I have reached the stage where I think of my work as providing learning support rather than ‘evaluation’, where I see my role as being to surface learning, to ask the right questions at the right time and to support the reflection of all those engaged in a programme of work or with a company. 

 
 
 

“I may create ‘logic models’ but do not believe that a good ‘evaluation’ process is linear or logical”