Documentation for the ZALF AR Cookbook app, created by students of Filmuniversität Babelsberg.
While designing an app for ZALF, we faced the difficult task of delivering complex research in a straightforward, entertaining format. We wanted to communicate ZALF’s scientific research in a playful, immersive way that would attract young audiences, so we decided to use technology that lets users become part of the story. Thanks to this approach, the person using our app isn’t just a passive observer, but the center of the narrative.
Based on the prototype as seen in the demo, below are some dot-points identifying what we would improve.
First, we asked ZALF’s team what they wanted to communicate and to whom. Based on this information, we came up with a concept.
We started by deciding on our target group. After a couple of conversations, it was clear that even though ZALF’s message is universal, it would be impossible to design an experience for everyone.
Our design persona lives in a big city, is young, well educated and already interested in ecological sustainability. We call her Alexa. Alexa studied in Berlin and is now working her first job. She shops at organic supermarkets, but when she thinks about it, she doesn’t really know what the difference between “organic” and “Demeter” is. She doesn’t know whether it really makes any difference, and if so, when. Alexa would like to make informed choices, but doesn’t have time to read long publications. She likes streaming series, meeting friends, cooking, eating out, and listening to podcasts and audiobooks.
Together with ZALF, we decided on the crucial message we wanted to communicate: “Soil matters!” While researching and exchanging ideas, we found out that soil influences almost everything around us - soil is important when we talk about subjects like food production, climate change, the provision of clean water, biodiversity, and more.
Based on ZALF’s research, we initially explored the four subjects that we found most interesting as the basis for an interactive experience:
We came up with five ideas for interactive projects, which we pitched to ZALF’s teams. ZALF decided on an “interactive cooking experience”, which we then developed further.
Our interactive cooking experience is inspired by location-based audio walks. Experiences with an added audio layer are deeply transporting for the listener.
Our cooking experience is designed for everyone’s kitchen. We add interactive listening and AR layers to the physical world, illuminating the stories hidden in the ingredients and the dishes being prepared. We would like to indulge curiosity, as our experience has the potential to reshape how we perceive our surroundings, old and new, bringing the past, present, and future of spaces to life simultaneously.
We decided to base our cooking experience on a legume-rich recipe for mujaddara. We started by writing a script for the app’s audio layer, which we wanted to be both informative and entertaining. We succeeded in delivering the message about the importance of shifting to a legume-rich diet while keeping a light, playful tone.
Based on the script, we designed the recipe page with markers for the AR app.
For the graphics to work well as markers, we had to create illustrations with many small details and high contrast, so that the AR app can identify them easily.
An early concept and technical test:
The technical implementation involved the following general processes:
The ZALF Mujaddara AR app uses three main technologies: Unity, Vuforia and Xcode.
In short, the app works by performing a given action when the camera ‘sees’ what is called an ‘image target’.
An image target is a visual element on the printed page that we would like to have AR-enabled. This means that the user can view the printed image through their phone’s camera, at which point the associated AR content is shown on their screen.
The aforementioned process is achieved using the 3D engine Unity.
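As an illustration of this trigger logic, here is a minimal sketch of how a script attached to an image target could react when the target is found or lost. The class name, field names and the wiring of the two methods to the target’s “found”/“lost” events (for example via Vuforia’s observer event handler in the Inspector) are our assumptions for illustration, not the exact setup used in the app.

using UnityEngine;

// Hypothetical helper attached to an image target in the scene.
// OnTargetFound / OnTargetLost are meant to be wired to the target's
// "found" / "lost" events in the Inspector.
public class ImageTargetContentTrigger : MonoBehaviour
{
    public GameObject arContent;   // AR overlay paired with this image target
    public AudioSource narration;  // audio clip associated with this target

    public void OnTargetFound()
    {
        // Reveal the AR content and start (or resume) the narration
        arContent.SetActive(true);
        if (!narration.isPlaying)
        {
            narration.Play();
        }
    }

    public void OnTargetLost()
    {
        // Hide the AR content and pause the narration
        arContent.SetActive(false);
        narration.Pause();
    }
}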
Below is a screen capture of the image targets, along with their paired AR content, arranged in Unity.
Within Unity runs Vuforia, the engine that takes care of specific components of the AR functionality, such as image target recognition and AR content presentation. As an initial stage of image target preparation, the images destined to become image targets need to be analysed by Vuforia’s online system. This process creates a graphical ‘map’ of each image, which the Vuforia engine in Unity uses to interpret whether or not it is seeing something it recognises.
Below you can see an example of a section of the page that is destined to become an image target. The image after it shows the resulting ‘map’ that the online Vuforia system has calculated. As mentioned, this ‘map’ allows the engine running in Unity not only to identify what it is seeing as an image target, but also to differentiate it from other image targets.
The audio playback in the ZALF Mujaddara app follows the steps below:
The following image shows the script below in use within Unity. What’s happening is:
All of these elements are then programmatically controlled in the script below to achieve two main functionalities: toggling playback of the audio clip paired with each image target, and ensuring that only one audio source plays at a time, with the toggle button sprites updated to match.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class audio_system_control : MonoBehaviour
{
    public AudioSource[] audioSources;   // one audio source per image target
    public Toggle[] toggleButtons;       // play/pause toggle paired with each audio source
    public AudioSource worldAnimation;   // audio belonging to the world animation
    public Sprite onSprite;              // button sprite shown while audio is playing
    public Sprite offSprite;             // button sprite shown while audio is stopped
    private bool[] isPlaying;

    private void Start()
    {
        // Initialize isPlaying array
        isPlaying = new bool[audioSources.Length];

        // Add toggle function to each button
        for (int i = 0; i < toggleButtons.Length; i++)
        {
            int index = i; // Create a copy of i for use in the listener
            toggleButtons[i].onValueChanged.AddListener((value) => TogglePlayPause(index, value));
        }
    }

    // Called when the world animation is displayed
    public void worldDisplayed()
    {
        // Stop all audio sources and reset their toggle buttons
        for (int i = 0; i < audioSources.Length; i++)
        {
            audioSources[i].Stop();
            isPlaying[i] = false;

            // Change the image of the toggle buttons to offSprite
            toggleButtons[i].GetComponentInChildren<Image>().sprite = offSprite;
        }
    }

    private void TogglePlayPause(int index, bool value)
    {
        if (!value)
        {
            worldAnimation.Stop();

            // Stop all other audio sources before playing
            for (int i = 0; i < audioSources.Length; i++)
            {
                if (i != index)
                {
                    audioSources[i].Stop();
                    isPlaying[i] = false;

                    // Change the image of the other toggle buttons to offSprite
                    toggleButtons[i].GetComponentInChildren<Image>().sprite = offSprite;
                }
            }

            // Play audio and set isPlaying to true
            audioSources[index].Play();
            isPlaying[index] = true;

            // Change the image of the selected toggle button to onSprite
            toggleButtons[index].GetComponentInChildren<Image>().sprite = onSprite;
        }
        else
        {
            // Pause audio and set isPlaying to false
            audioSources[index].Pause();
            isPlaying[index] = false;

            // Change the image of the selected toggle button to offSprite
            toggleButtons[index].GetComponentInChildren<Image>().sprite = offSprite;
        }
    }
}
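A short note on how this script is presumably wired up in the scene, based on the code itself: the audioSources and toggleButtons arrays are public fields, so they are assigned in the Inspector such that each toggle button is paired with the audio clip of the corresponding image target. worldDisplayed() has no caller inside the script, so it is presumably invoked from an event (for example when the world-animation target becomes visible) to stop any narration that is currently playing.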
If you would like to build the project for iOS yourself, please see the building page.
Building and testing is the process of exporting the Unity project as an Xcode project, opening that project in Xcode and deploying it to a target iOS device, at which point the AR app can be tested on the intended hardware.
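As a rough sketch of the export step, the Xcode project can also be generated from an editor script instead of through Unity’s Build Settings window. The scene path, output folder and menu entry below are placeholder assumptions, not this project’s actual configuration.

using UnityEditor;

// Hypothetical editor helper that exports the Unity project as an Xcode project.
// Place it in an Editor folder; scene path and output folder are placeholders.
public static class IOSBuildExport
{
    [MenuItem("Build/Export Xcode Project")]
    public static void Export()
    {
        BuildPipeline.BuildPlayer(
            new[] { "Assets/Scenes/Main.unity" }, // assumed scene name
            "Builds/iOS",                         // the Xcode project is generated here
            BuildTarget.iOS,
            BuildOptions.None);
    }
}

The generated folder is then opened in Xcode, signed, and run on the connected iOS device.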
Apart from some initial bugs due to version incompatibility between the target iOS device, Unity and Xcode, the process was smooth and painless. Below is an early test of the AR system. The final product is presented on the demo page.