
Cheffrey – The Voice Of Your Recipe

Cheffrey is a voice assistant for the kitchen. Choose a recipe, and Cheffrey will assist you and answer every question you may have about it. The screen shows all the information you need to follow the recipe's step-by-step instructions. All you have to do is cook!

Hi! I'm Cheffrey.

Cheffrey offers a broad variety of recipes for you to cook and assists you while doing so. Once you've found a recipe you want to cook, either press “Start Cooking” or say “Cheffrey, start cooking”, and from there on Cheffrey will guide you through the process and answer any question you might have about the recipe.

With Cheffrey you don't have to waste time trying to understand the recipe in order to cook it. Instead, you can concentrate fully on actually cooking – whether it's chopping an onion, blanching chard or flambéing a soufflé.

Why do I need Cheffrey?

Recipes can be hard to understand. While they're meant to enable you to cook any meal without prior knowledge, they often fall short in communicating the instructions efficiently. You want to cook a meal, but you end up spending more time looking at the recipe.

Cheffrey aims to eliminate miscommunication and improve your cooking experience. If you want to cook a meal, the recipe should not stand in your way. With Cheffrey you'll spend less time wondering what turmeric is and more time actually cooking the food you want to eat.

[Related apps and assistants: KptnCook, HelloFresh, Alexa cooking companion, Google Home cooking]

A number of apps and voice assistants already set out to improve the cooking experience and address this specific problem. Apps like Kitchen Stories or KptnCook have created interactive recipes that centre around the cooking process, and both Google's Google Assistant and Amazon's Alexa include features that read out parts of a recipe and respond to spoken questions. But none of them puts this specific problem first.

Cheffrey aims to eliminate miscommunication and improve your cooking experience by combining the lean user experience of interactive, process-centric recipes with the convenience of voice interaction – symbiotically creating a new format for human-computer interaction.

If you want to cook a meal, the recipe should not stand in your way. With Cheffrey you’ll spend less time wondering and trying to understand, and more time actually cooking the food you want to eat.

How to cook with Cheffrey

The journey starts with choosing a recipe. That means looking at recipes beforehand and checking whether you have the right utensils and ingredients, or whether you even have the time to cook this meal right now.

Once you've chosen your recipe, press “Start Cooking” or say “Cheffrey, start cooking” and Cheffrey will launch right into the first step.

Cheffrey narrates every step in detail. A summary of what was said appears below the picture in broad bullet points. A picture accompanies each step to give a visual cue on how the step is to be performed.

The ingredients and their amounts stand out in bold while remaining incorporated in the sentence structure provided by the verbal narration.

At the bottom, a green-on-grey line indicates how much progress you've made in the recipe so far. With each completed step the green bar grows.

The button with the Cheffrey logo has a multitude of functions: tapping it reveals or hides a list of all the ingredients in the recipe with the used ones crossed out, pulling the button to the right or to the left skips forward or backward, and long-pressing it activates the voice assistant.
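To make that interaction model concrete, here is a minimal sketch of how such a multi-gesture button could map gestures to actions. The names and thresholds are illustrative assumptions, not the project's actual implementation:

```typescript
// Sketch of the Cheffrey button's gesture-to-action mapping.
// All names and thresholds below are illustrative assumptions.
type ButtonAction =
  | "toggleIngredients" // tap: show/hide the ingredient list
  | "nextStep"          // pull right: skip forward
  | "previousStep"      // pull left: skip backward
  | "startListening";   // long press: activate the voice assistant

const LONG_PRESS_MS = 500;    // assumed long-press threshold
const DRAG_THRESHOLD_PX = 40; // assumed minimum horizontal drag

function resolveAction(pressDurationMs: number, dragDeltaX: number): ButtonAction {
  if (dragDeltaX >= DRAG_THRESHOLD_PX) return "nextStep";
  if (dragDeltaX <= -DRAG_THRESHOLD_PX) return "previousStep";
  if (pressDurationMs >= LONG_PRESS_MS) return "startListening";
  return "toggleIngredients";
}

// A quick tap (120 ms, no drag) toggles the ingredient list:
console.log(resolveAction(120, 0)); // "toggleIngredients"
```

The point of a single dispatch function is that the three interactions never compete: a drag past the threshold always wins over press duration, so skipping a step with greasy fingers can't accidentally open the ingredient list.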

The upper segment of the screen is dedicated entirely to pictures. In observing people using recipes, we found that while they look to the text for content, they often use pictures to check whether they're doing a step right. Without pictures it was hard for them to tell whether they were on the right track.

The steps are aligned from left to right – matching the reading direction in Western countries and the direction in which the progress bar advances. There is nothing else on this axis, to make navigating between steps as easy as possible.

The ingredients slide in from the bottom circle – the Cheffrey button – and offer stability throughout the recipe, for the content of this screen is the same in every step.

Talking to Cheffrey

Cheffrey's wake words are “Cheffrey”, “Hey, Cheff” or “Okay, Cheff” – or any combination thereof. Say those words together with a question or a command and Cheffrey will answer.

Saying “Hey, Cheff. Next step”, for instance, prompts Cheffrey to show you the next step in the recipe and immediately tell you what you need to do to complete it.
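As a rough illustration of how such commands could be handled, here is a sketch that assumes a speech-to-text layer already delivers a plain transcript; the intent names and phrase matching are our own simplifications:

```typescript
// Sketch of wake-word detection and command matching.
// Intent names and phrase lists are illustrative assumptions.
const WAKE_WORDS = ["okay, cheff", "hey, cheff", "cheffrey"];

type Intent = "nextStep" | "previousStep" | "startCooking" | "recipeQuestion";

function parseCommand(transcript: string): Intent | null {
  const text = transcript.toLowerCase().trim();
  const wake = WAKE_WORDS.find((w) => text.startsWith(w));
  if (!wake) return null; // speech not addressed to Cheffrey is ignored

  const rest = text.slice(wake.length).replace(/^[\s.,!?]+/, "");
  if (rest.includes("next step")) return "nextStep";
  if (rest.includes("previous step") || rest.includes("go back")) return "previousStep";
  if (rest.includes("start cooking")) return "startCooking";
  return "recipeQuestion"; // anything else is handed to the recipe Q&A layer
}

console.log(parseCommand("Hey, Cheff. Next step")); // "nextStep"
```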

If a question or a problem arises, ask Cheffrey and you'll get the assistance you need.

Cheffrey aims not to distract you, so as to maintain your focus on the cooking process. Only when necessary will Cheffrey draw your attention to the screen by showing a visual answer to your question.

Construction of Cheffrey

You'll notice the screen layout changes once you start the cooking process. That's because these two screens belong to two different modes of use.

Normally you hold your phone fairly close to your eyes (approximately 30 cm) as you browse content across apps and sites and try to take in as much information as possible. This also goes for looking at recipes when you haven't decided what to cook yet: you hold the phone in your hand as you carefully pick and choose a recipe.

But once that decision has been made, you put the phone down somewhere so you're able to perform the instructions of the recipe. Ideally, you shouldn't have to pick it back up again.

The latter screen state is what we have come to call “Voice First Screen Design”. It means that the visual screen design advocates speech input instead of touch input.

The entire Cheffrey concept rests on this principle: you shouldn't pay any more attention to the phone than necessary – you should focus on cooking.

Layout

The spatial freedom the components enjoy on each screen comes from the dissection of the recipe into manageable steps and the way the navigation connects them with one another.

All that needs to be shown per screen are the ingredients used, the task to be performed and a guiding picture.

Pictures

Pictures are important for cooking with recipes. They give you a point of reference; they show you how to do the task and whether or not you've done it right.

If the phone isn't in your hand, the picture needs to be as big as possible. That means it has to be as wide as possible, filling the upright screen from left to right – which amounts to the picture taking up 50% of the screen.

Text

Text needs to be readable from difficult angles at difficult distances. That means it has to be big, and it has to have good background contrast to be easily identifiable as writing.

Using the calculator from leserlich.info we estimated that text should ideally be 32pt to 48pt in size, at a distance of 1.2m for a visus of 0.7.
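For orientation, the relationship between viewing distance, visual acuity and text size can be roughed out with a simple visual-angle model. The sketch below is our own approximation rather than the leserlich.info calculation; the 18-arcminute comfort angle and the 0.5 x-height ratio are assumptions:

```typescript
// Rough visual-angle estimate of a minimum font size. Illustrative only:
// leserlich.info applies the more detailed DIN 1450 model.
const ARCMIN_AT_VISUS_1 = 18;   // assumed comfortable x-height angle at visus 1.0
const X_HEIGHT_RATIO = 0.5;     // assumed x-height as a fraction of the point size
const MM_PER_POINT = 25.4 / 72; // 1 pt = 1/72 inch, roughly 0.353 mm

function minFontSizePt(distanceM: number, visus: number): number {
  // Lower acuity requires a proportionally larger visual angle.
  const angleRad = (ARCMIN_AT_VISUS_1 / visus / 60) * (Math.PI / 180);
  const xHeightMm = 2 * distanceM * 1000 * Math.tan(angleRad / 2);
  return xHeightMm / X_HEIGHT_RATIO / MM_PER_POINT;
}

// At 1.2 m and visus 0.7 this lands around 50 pt, the same order of
// magnitude as the 32-48 pt range we got from leserlich.info.
console.log(minFontSizePt(1.2, 0.7).toFixed(0)); // "51"
```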

Cheffrey Button

This button is a hide-away. In normal touch-first screen design, each of the functions embedded in the Cheffrey button would get its own clearly defined button. But that sort of screen design would invite touch interaction, so these functions – which still need to be available – are hidden away to favour voice interaction.

How we came up with Cheffrey

We were thinking out loud about cooking with recipes and agreed that while there is a certain thrill to cooking something you've never cooked before, it is also very time-consuming and far from simple.
We started wondering where the problem lies and came up with our solution hypothesis: if the recipe could talk, it would make things a lot easier. Since people usually follow a recipe, we could learn what issues occur in the cooking process by analysing the recipe and observing the cooks.

Isolating the problem area

We wanted to see if we could correctly identify the source of the problem in order to fix it. So, we brainstormed everything that bothered us about cooking with recipes and tried to be as specific as possible in doing so.

It quickly became apparent that we had to get input from other people as well, to make sure this wasn't just a problem for us.

In order to understand how people cook and what they like and dislike about the entire process, we interviewed different people about their cooking habits. We found that time is often the most important factor for their meals, and that when they cook something new they take extra time for it. Even experienced cooks said that when they cook new meals with a recipe, they follow it thoroughly because they want to know what the original recipe should ideally taste like.


This gave us a broad overview of how people cook, but to get an even better understanding of the cooking procedure we shadowed three groups of people, giving each a new recipe that they were supposed to cook.

The cooks faced many issues during the cooking process. One of the main issues was that the text was written very compactly, and they often had to scroll through the recipe because the information they needed wasn't where they expected it to be. And since the cooks' hands kept getting dirty, they constantly had to wash them before scrolling through the recipe.

One group even assigned one person to read and understand the recipe while the other two were cooking. The cooks also looked at the pictures very often and tried to make their meal look exactly as seen in the pictures. This led to confusion in some groups, because their ingredients looked a bit different from the ones pictured.

Understanding recipes

To understand what cooks might want to know about what they're cooking and what problems they might face, we needed to understand how recipes are structured.

A good classic recipe is usually divided into steps with short, precise sentences and a picture for every step. Smaller steps are often bundled into larger ones. Ingredients and sometimes utensils are most commonly listed on top with the amount of every ingredient next to it when needed.

This style works well for visual interfaces, since the eye can quickly jump between sentences and get a lot of information with little effort. However, we found that our test subjects spent an average of 2-3 minutes reading the recipe for every 10 minutes of total preparation and cooking time.

We tried reading out the recipe steps to keep the test persons' focus on the cooking process. But the test persons had a hard time following the voice as it read out the instructions. We concluded that our test persons' attention span for auditory stimuli was too short to follow the instructions as they were read out, and that they quickly forgot important pieces of information.

Unlike the eye, the ear can't quickly skim the recipe to pick out the information the user wants. This meant that classical recipes can't simply be read out loud by a voice assistant – they need to be divided into smaller sub-steps.

These sub-steps can't just be created by dividing one step into two, since some steps would be put out of context, while others would be unnecessarily elongated just because they consist of more text.

Another important aspect was the duration of certain tasks. In our user tests we found that people tend to treat long and tedious tasks as one step. On some occasions the test subjects were instructed to peel and dice 10 potatoes. After peeling them, they went on to the next step, forgetting that they still needed to dice them.

Other test persons, who only had to peel and dice 4 potatoes, didn't forget this step. Since the duration of tasks changes with the serving size, the sub-steps need to adapt to the time each one takes. However, increasing the portion size doesn't automatically increase the duration of every step.
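One way to picture this is a sub-step model that separates serving-dependent durations from fixed ones. The following sketch reflects our own assumptions, not an actual Cheffrey data model:

```typescript
// Illustrative sub-step model: some durations scale with the serving size
// (peeling 10 potatoes instead of 4), others stay fixed (preheating the oven).
interface SubStep {
  instruction: string;
  baseDurationMin: number;     // duration at the base serving size
  scalesWithServings: boolean; // true for per-item tasks like peeling or dicing
}

const BASE_SERVINGS = 2;
const SPLIT_THRESHOLD_MIN = 8; // assumed: longer tasks get split into sub-steps

function estimatedDuration(step: SubStep, servings: number): number {
  return step.scalesWithServings
    ? step.baseDurationMin * (servings / BASE_SERVINGS)
    : step.baseDurationMin;
}

function shouldSplit(step: SubStep, servings: number): boolean {
  // "Peel and dice the potatoes" becomes two sub-steps once it runs long,
  // so the second half is never forgotten.
  return estimatedDuration(step, servings) > SPLIT_THRESHOLD_MIN;
}

const peelAndDice: SubStep = {
  instruction: "Peel and dice the potatoes",
  baseDurationMin: 5,
  scalesWithServings: true,
};
console.log(shouldSplit(peelAndDice, 5)); // true: 12.5 minutes, split it up
```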

Since steps are usually compressed into short sentences, one can't really see the complexity of a recipe until every single step is visualized on a single canvas. Creating information architectures not just of the recipe but of all the steps our test cooks actually performed gave us a broader understanding of the issues people face when cooking.

During the process the cooks had to do things that weren't mentioned in the recipes. Cutting cauliflower into pieces is quickly said, but you first have to wash it. And how big should the pieces be? Can you also add the stem? Those problems are not addressed in classic recipes but are still relevant to the cook.

Thinking of a solution

We formulated How-Might-We questions around the biggest pain points and came up with ideas on how to solve each. We quickly decided on what to carry on with and determined that the biggest problem area was adjusting content to context: how might the recipe support me in what I'm doing right now?

The ideas we came up with covered everything from sound recognition – where the microphone on a voice assistant could tell that I'm chopping onions based on what that sounds like – to an IoT kitchen with sensors in every corner. We narrowed our focus down to the technology people already use today.

In a survey we found that around 84% of people prefer to use their smartphone when cooking with a recipe. Because voice assistants are still new and people are sceptical about them, it's better to introduce the technology through something familiar. Smart kitchens will probably be the future, but that is a big step, and users might be overwhelmed by having multiple sensors in the kitchen. A voice-based app might be the best solution for a soft transition into smart kitchens.

Prototyping a cooking experience

We quickly scribbled different interface concepts onto Post-its and focused on the most promising ones.

Testing, Testing, Testing!

By creating paper prototypes we were able to quickly test whether the interface was understandable and what users would expect to happen when pressing certain elements. We digitized the best ones, as well as those that needed more detail to be conceivable. One problem with the early prototypes was that the font was too small for the testers to see from a large distance. By prototyping quickly we were able to find and fix such issues before we started to create more detailed screens.


We thought about what kinds of commands Cheffrey should react to and created a small mind map with categories for the different questions and commands a user might ask, to get an overview of the issues that might need addressing. For the final prototype we created all the screens and voice outputs necessary to cook a meal and tested it with a test person. This helped us polish the result and fix the last few issues.


In the end everything went right, but the test person had some acoustic trouble understanding parts of the spoken recipe, so we had to add longer pauses between some sentences.

Final Presentation

What Cheffrey means for the cooking experience

Establishing ease of use with Cheffrey means establishing ease of use with new recipes. Being confronted with something new becomes less of a barrier.

Cooking with recipes usually takes more time and effort for a number of reasons – starting with having to buy the right ingredients and ending with actually putting them together in the correct order using the right tools. Cheffrey supports the cooking process and saves valuable time, creating an even more positive cooking experience.

With Cheffrey, you won't say “it tastes really good, but I'm too tired to cook something that complicated”, you'll say “if I can cook that, I can probably cook anything”.

One last thought...

The ideas we applied to cooking with recipes can be applied to other forms of instructions, like IKEA instructions for putting up a shelf. You are given a set of “ingredients” (pieces of wood, bolts and screws) and step-by-step instructions on how to get them to the desired state, aided by visual examples.

So if Cheffrey can help you cook a recipe, then maybe Cheffrey could also help you build a shelf. Instructions of any kind could become part of Cheffrey's repertoire.

Subject group

Interfacedesign

Type of project

Student project in the foundation course

Supervision

Dipl. Des. Christian Wendrock-Prechtl

Associated workspace

Voice Based Apps – Sprachassistenten und ihre Auswirkungen auf das Interfacedesign

Project period

Winter semester 2018/2019
