Last Friday, 45 delegates descended on the Ashmolean Museum for a one-day conference on Understanding Audiences. I’ll post a full round-up of the event shortly, but this time I was presenting, sharing our experience at the Oxford University Museums of starting from scratch with a new audience evaluation system.
Where we were pre-2012
Previously, when we were funded by the MLA (Museums, Libraries and Archives), they sent a team to conduct surveys at our museums on our behalf a couple of times a year. Not only was this cost-neutral for us, but we received extremely high satisfaction ratings, with 97-98% of respondents describing their visit as very satisfying – excellent support for reports and funding applications! But the arrangement was not ideal: we had no control over what questions were asked, responses were only collected on a small number of days each year, and the analysis was based on fewer than 1,000 responses per museum per year. The museums also conducted their own surveys – usually paper surveys available in the gallery, occasionally face-to-face surveys conducted by volunteers and work experience students – but again these yielded only a couple of hundred responses each year.
With the demise of the MLA these surveys of course stopped, but this gave us an opportunity to re-examine the way we understand our audiences.
Automated Surveying Kiosks
So one of the first things I did when I started working for the museums was research what we could do. We spoke to many of the sector’s leading audience research agencies, but mostly their services were beyond our budget. In the end, one of the companies recommended some touch-screen kiosks which they had used in the past as part of their work. After much deliberation we invested in a number of these: a full-size kiosk for each of our museums, plus a tablet for gathering feedback at special events. The company we used is called CRT Viewpoint, who have a good back-end system which is very easy to use both for producing surveys and analysing results. They also have sophisticated ‘quarantine’ technology that can tell if someone is not reading the survey – for example, by the speed with which they answer – and can quarantine those responses automatically, saving us quite a bit of work!
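CRT Viewpoint’s quarantine logic is proprietary, but the idea of flagging responses by answer speed is simple enough to sketch. The threshold, field names and sample data below are my own illustrative assumptions, not the vendor’s actual rules:

```python
# Sketch of speed-based quarantining: flag responses answered implausibly fast.
# The threshold and record fields are illustrative assumptions only.

MIN_SECONDS_PER_QUESTION = 1.5  # assumed lower bound for genuinely reading a question

def quarantine(responses):
    """Split responses into (valid, quarantined) by average seconds per question."""
    valid, quarantined = [], []
    for r in responses:
        avg = r["total_seconds"] / r["question_count"]
        (quarantined if avg < MIN_SECONDS_PER_QUESTION else valid).append(r)
    return valid, quarantined

valid, flagged = quarantine([
    {"id": 1, "total_seconds": 45.0, "question_count": 10},  # plausible pace
    {"id": 2, "total_seconds": 6.0, "question_count": 10},   # button-mashing
])
```

A real system would presumably combine several signals (straight-lining, repeated identical answers, and so on), but per-question timing alone already catches the worst offenders.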
Since introducing the kiosks at the end of 2012, we have been collecting approximately 1,000 valid responses from each museum each quarter, a big increase on what we had before, and we are collecting data every day.
Using an automated system rather than face to face surveys does have its drawbacks:
- Our reported satisfaction levels have dropped. I suspect that this is because people who have had a bad experience are more likely to choose to leave feedback, and people are more likely to respond negatively anonymously to a screen than to an individual.
- It attracts under 16s – about a third of responses come from under 16s. While in theory this may be an accurate reflection of our audience, it is not a sample of all under 16s, but rather a disproportionately high sample of 12-16s.
- We get a disproportionate number of first time visitors responding. This was not a problem when we first introduced the kiosks, as they were new to everyone, but now 60% of responses come from first time visitors. We now need to look at innovative ways to encourage return visitors to respond again.
- Qualitative vs quantitative – quantitative responses are easier to collect and easier to analyse; respondents are much more likely to answer yes/no and rating questions than to give extended feedback. We find we can only legitimately fit two qualitative questions into the survey and still expect people to complete it.
One of our biggest challenges with the new kiosks has been limiting the length of the surveys. We were all very excited to get them and at the prospect of being able to gather the information that we want, but the temptation is to collect everything, from what people thought of label style and shop stock, to all their vital statistics including which newspaper they read. While all this information is useful, it renders the survey very long, and people walk away, both undermining our data and perhaps leaving a slightly sour note at the end of their visit. Our response to this has been to have shorter surveys which we change regularly, so we still get an even spread of responses over a certain period of time.
We are still working on this, and what I really emphasise when discussing it with colleagues is not to get bogged down in what questions you should be asking – often borrowed from other surveys you have seen – but to start with what information would be useful for current decisions and discussions, and then work back to the questions that can provide it.
Postcode Analysis with ACORN
The data I spend most of my time on is postcodes (which arguably we don’t need this elaborate system to collect, but I will return to this). We analyse the postcodes we collect using ACORN, a marketing segmentation tool, as it is the system used elsewhere in the university; this means we can benchmark our data against other areas within the university. We look at all UK postcodes, but the most useful thing we do is compare the composition of our Oxford and Oxfordshire audience with the general population profile of the city and county. This allows us to clearly identify the gaps in our local constituency and approach them.
As you can see in the profile below, our local audience tracks relatively closely with the local population, except for a spike under ‘Rising Prosperity’ – a group which includes a lot of young families, who of course use the museum regularly, taking advantage of our extensive range of family activities – and a shortfall in the ‘Urban Adversity’ segment, which we are not reaching as well as we would like.
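The gap analysis itself is just a comparison of proportions. A minimal sketch, using ACORN’s five top-level category names but with made-up figures rather than our real data:

```python
# Illustrative gap analysis: compare visitors' ACORN category shares with the
# local population's. All percentages below are invented for the example.

population = {"Affluent Achievers": 0.25, "Rising Prosperity": 0.20,
              "Comfortable Communities": 0.30, "Financially Stretched": 0.15,
              "Urban Adversity": 0.10}
visitors = {"Affluent Achievers": 0.27, "Rising Prosperity": 0.31,
            "Comfortable Communities": 0.28, "Financially Stretched": 0.10,
            "Urban Adversity": 0.04}

# Positive gap = over-represented among visitors; negative = under-represented.
gaps = {seg: round(visitors[seg] - population[seg], 2) for seg in population}
target = min(gaps, key=gaps.get)  # segment with the largest shortfall
```

With these invented numbers the spike shows up under ‘Rising Prosperity’ and the largest shortfall under ‘Urban Adversity’, mirroring the pattern in our real profile.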
Having identified ‘Urban Adversity’ as a target segment within the city, and knowing where this segment is predominantly based, we can then return to the data collected from our kiosks and gain a deeper insight into this particular audience. Looking at respondents who did attend from these postcodes, we see that they are (unsurprisingly, being local) more likely to have visited before than other visitors, that when they do visit they have a positive experience, and that in addition to under 16s, many of whom come as part of a school trip, most visitors from these areas fall into the 25-45 age category. A large number list their reason for visiting as entertaining children, but socialising is also a strong reason.
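Drilling into the kiosk data for one segment amounts to filtering responses by postcode district and summarising the rest of the record. A sketch, where the district list, field names and sample responses are all hypothetical:

```python
# Sketch: profile kiosk responses from postcode districts we associate with a
# target segment. Districts, fields and data below are illustrative assumptions.
from collections import Counter

TARGET_DISTRICTS = {"OX4"}  # hypothetical districts mapped to the target segment

responses = [
    {"postcode": "OX4 2AB", "age_band": "25-45", "visited_before": True,
     "reason": "entertain children"},
    {"postcode": "OX1 1AA", "age_band": "45-60", "visited_before": False,
     "reason": "exhibition"},
    {"postcode": "OX4 3CD", "age_band": "25-45", "visited_before": True,
     "reason": "socialising"},
]

# Outward code (the part before the space) identifies the district.
segment = [r for r in responses if r["postcode"].split()[0] in TARGET_DISTRICTS]
age_profile = Counter(r["age_band"] for r in segment)
repeat_rate = sum(r["visited_before"] for r in segment) / len(segment)
```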
With this information, we know that this segment, when they attend, generally like our offer. So we need to focus on getting them through the door. Possible tactics could be promoting our activities for families, as well as social events like our free late night openings or gig nights.
Anyway, that is our core, ongoing benchmarking evaluation. It is coupled with more focussed, one-off pieces of research.
In 2013 we conducted a non-user survey in partnership with the County Council. They sent the survey to their Oxfordshire Voice group, members of the public who have agreed to answer surveys for the County for a certain period of time, as well as making it available on their website. We promoted the survey in local newspapers to try and garner responses. Unfortunately, everyone who responded had visited our museums and had positive things to say about them. While this is good to hear, we had clearly missed the section of the public we needed to target; it was individuals with an interest in museums who chose to respond to the survey. We are looking to conduct additional non-user research soon, but will be taking a more targeted approach using focus groups.
Last year we also worked with the Oxford University Student Consultancy – an organisation through which students take time out of their studies to conduct small pieces of consultancy work and gain experience of different sectors. They looked into the profile of our student visitors and gauged awareness of the museums within the student population. Awareness was high, with most students aware of most of the museums and aware that they were free; however, students who had engaged with the museums as part of their studies were much more likely to return as visitors, and to engage with the museums as volunteers.
Earlier this year we took on a student intern who spent six weeks with us. Her major piece of work was to conduct surveys with visitors, asking them how they would use public Wi-Fi should it be available in the museum. We were reassured that iPhone and Android were our key market, with very few visitors owning other Wi-Fi enabled smartphones, and still very few visitors bringing their tablets. We also found out that only 50% of visitors usually carry headphones – a negative for audioguides, or an opportunity to sell museum-branded headphones? Finally, when asked if they were more interested in information on individual objects or overall displays, most respondents picked the latter. See more results from the survey here.
In 2013-14 we also invested in some research from Morris Hargreaves McIntyre into the visitor profile at the Ashmolean, in particular for paying exhibitions. They profile people into culture segments based on how they like to consume culture, and how best to communicate with these different audiences. MHM profiled the population of the immediate area and the wider South East based on their cultural consumption profile, alongside current Ashmolean visitors, and identified gaps where the Ashmolean’s offer should appeal but wasn’t reaching its full potential. MHM later ran workshops with Ashmolean staff to identify potential ways to engage with these audiences.
Responding to Research
Key to our project of introducing new audience evaluation methods across the museums is introducing a way to respond to what we learn. Shortly after introducing the kiosks we introduced the Audience Engagement Fund, a small pot of money that colleagues across the museums could apply to in order to support activity that specifically responds to audience research. We thought this was a good approach as it would allow us to be agile in responding to issues as they arose, and the museums and their audiences vary greatly, requiring flexibility in our response.