Last week we were able to take EnergySparks into the classroom for the first time. Here’s a short write-up of some things we learned.
We’ve been developing EnergySparks from a prototype through to the current early release since the end of last year. We’ve iterated the basic functionality a number of times based on our understanding of how the application might best support schools.
We’ve also got a lot of ideas for further improvements. For example, we’d like to create richer tools to support exploration and analysis of the data. But we didn’t want to rush into building further enhancements without doing some user testing. No product design, however well thought out, survives its first encounter with users!
We were very pleased to be given the opportunity for our school liaison, Claudia Towner, to visit Freshford CE Primary School last week. She spent the day working with a fantastic group of pupils. It was our first opportunity to test out the application in a classroom setting using tablet devices.
And it was time well spent, because we had some great feedback and suggestions from the pupils.
To structure the testing we first gave the whole group of pupils a short talk about the application and energy efficiency in general. Hands-on testing was then done in smaller groups. These sessions were organised as a mixture of tasks (“can you find your school”, “can you find your school’s electricity data”, etc) and some open discussion and exploration of the data and functionality of the site.
The pupils really enjoyed working with the application and were particularly engaged by its gamification aspects. They were keen to know how they could score points and how they could compare their progress with other schools. This confirmed our thinking that gamification is a great way to get children engaged and to encourage ongoing behaviour change.
But we also learnt that:
- Not all the children understood what “badges” were, so we need to think about our labelling. Apparently the Minecraft players were quick to understand that feature!
- Our current set of “badges” is boring, so we need a better set of icons to make them more appealing
- We need to put more effort into describing how to achieve the badges, and include suggestions about which ones to tackle first
We also realised we need to be careful about how we position comparisons across schools. There are many factors that make cross-school comparison tricky to do reliably. We’ve avoided building this into the service as a feature, but interestingly the children were keen to make these types of comparisons.
The children enjoyed working with the graphs and thinking through how to interpret the data. They had lots of great suggestions about how to better label the graphs and improve the interactivity to make them more usable.
Interestingly, they were quick to start looking for significant dates, for example their birthdays or dates when the school was holding an event. This helped to give the data some meaning and context for them.
One of the main areas we need to improve is the recording of activities. The children found adding updates easy and were quick to scatter unicorn emojis across their updates! But the selection and recording of energy-saving activities needs a lot of work. The navigation needs a complete overhaul, and the features need to be better integrated across the site so that using the application feels more natural.
This is exactly the kind of feedback we wanted to get. It’s really helped us focus on the most important areas to improve over the next few months.
Thanks to Freshford school for their support and all of the pupils for taking part!