We met with Ben Gonshaw, Game Designer and Experience Lead at AKQA, and Paul Deane of the Natural History Unit, to chat about Amana.
Note: This pilot is a fictionalised drama, based on published research on lion behaviour and personal accounts from scientists and rangers, and created using footage shot by the Natural History Unit over a number of years.
Can you sum up Amana in a sentence?
Paul: Amana is an interactive story where you make decisions as a lioness with four cubs to protect, as you try to survive in a dangerous pride.
What is the aim of this project?
Ben: From the content angle, this isn’t The Lion King. We wanted to expose the harsh reality of a lion’s life. Males aren’t regal kings, but lazy, domineering fighters. Females play a constant game of submission to them. The result should be something that really grabs you and leaves you shaken, but also that opens your eyes to natural history.
From a technical perspective, we wanted to get something running on an iPad, as people are using tablets more and more as entertainment devices – so we thought this was a good place to start.
How did you go about creating the pilot?
Paul: We started by pulling together the latest research on lion behaviour and deciding the story we wanted to bring to the screen. We then pored over literally 25 years’ worth of our footage to pull together the story we have. A lot of the time we couldn’t tell the story we wanted to because the footage simply didn’t exist, so we’d have to completely rewrite it. Other times the story wrote itself. Our writer and editor Mark Marlow crafted the story by putting different shots together, working with Joe Fenton and Chris Kidd, who researched lion behaviour and roughly assembled the footage.
We always felt this needed high production values, with proper grading and sound (Happy Hour), original score (Jean-Marc Petsas) and narrator (Colin Salmon). Anything less, we thought, would ruin the experience.
Ben: I worked with Paul to help make the player’s choices as interesting as possible and to vary the types of button-presses from moment to moment. Often this meant altering the story to give more meaningful choices to the player. Sometimes it changed the way things were set up, so we could give enough information for a decision to be taken thoughtfully.
As we went along we mapped everything out as a set of flow charts, which became our guide for how to edit the whole piece together. Our video player needed clips to flow together in solid sequences as the story branches and merges, so that meant making a really strange edit that jumps all over the place. If you were to sit down and try to watch it through like a normal video it would make absolutely no sense at all!
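To give a flavour of what a flow chart like that might look like once it becomes data, here is a minimal sketch in Swift (our assumption for an iPad app, not the pilot's actual code): each node maps to a stretch of the master edit plus the choices that lead out of it. The node ids, prompts and timings below are invented for illustration.

```swift
import CoreMedia

// One outgoing choice from a story node (hypothetical structure).
struct Choice {
    let prompt: String      // text shown on the button
    let nextNodeID: String  // node to jump to when chosen
}

// One node of the branching story (hypothetical structure).
struct StoryNode {
    let id: String
    let clipRange: CMTimeRange  // where this segment sits in the master edit
    let choices: [Choice]       // empty means the story carries straight on
}

// Example: one decision point that branches into two paths.
let nodes: [String: StoryNode] = [
    "hunt": StoryNode(
        id: "hunt",
        clipRange: CMTimeRange(
            start: CMTime(seconds: 0, preferredTimescale: 25),
            duration: CMTime(seconds: 42, preferredTimescale: 25)),
        choices: [
            Choice(prompt: "Stalk the wildebeest", nextNodeID: "stalk"),
            Choice(prompt: "Return to the cubs", nextNodeID: "return")
        ]
    )
    // "stalk" and "return" would be defined the same way, and can merge
    // back into a shared node later, mirroring the branching-and-merging edit.
]
```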
What technologies did you use to create it?
Ben: We built this system from scratch and it’s custom-made to run on iPad. We’re quite proud of what we’ve achieved technically – even with the very specific limitations of the platform. One example is that the iPad only tells us where it is in the video every 250 milliseconds. It doesn’t sound like much, but it means that any cuts we make could be up to 6 frames early or late, making it hard to do seamless loops or precise cuts.
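For readers curious how a constraint like this could be worked around, here is a hedged sketch using AVFoundation's boundary time observer, which asks the player to call back when playback reaches a specific time rather than relying on coarse position polling. The URL, branch time and UI hook are invented for illustration; this is not the pilot's actual player code.

```swift
import AVFoundation

// Hypothetical hook: in a real app this would present the decision buttons.
func presentChoiceOverlay() {
    print("Show decision buttons")
}

// Placeholder path to the master edit (assumption, not a real asset).
let storyVideoURL = URL(fileURLWithPath: "/path/to/master-edit.mp4")
let player = AVPlayer(url: storyVideoURL)

// The moment in the master edit where the next decision must be presented
// (hypothetical: 60 seconds in, expressed at a 25 fps timescale).
let branchPoint = CMTime(value: 1500, timescale: 25)

// Keep a strong reference to the returned token while the observer is needed.
let observerToken = player.addBoundaryTimeObserver(
    forTimes: [NSValue(time: branchPoint)],
    queue: .main
) {
    // Hold on the branch frame and show the choice overlay.
    player.pause()
    presentChoiceOverlay()
}

player.play()
```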
What do you think are the most compelling aspects of Amana?
Ben: I think that putting you in the headspace of a lioness trying to survive is a really interesting scenario. We have a tendency to put human emotions and motivations on animals, and this is a chance to help players shift their mindset. It’s exciting to be able to capture that and immerse people in something so different.
Paul: We hope that a fictionalised drama aimed at a young audience, and on a different device, might open a new and exciting way to bring the natural world into people’s daily lives and capture a wider audience.
What happens next?
Ben: This method of storytelling can be used to get under the skin of people, animals and situations. With enough time, we can really evolve how you make choices, which can give you the freedom to explore where you want to, pick up and use items, and choose more fluidly how to tackle situations and in what order.
I’d like to push much further away from a linear journey and back towards the original freeform vision. We can also open this up to work on other platforms beyond just iPad.
With consumer VR units just around the corner, the framework that we have built could plug in 3D VR videos to really transport you into another place.