Tailored in its functionality to teaching in design degree programs... an interface for modern teaching
The data visualization project “touch, touch, swipe” was developed as part of the “Data Talks / Data Walks” course by Francesca Morini and Prof. Marian Dörk. The result of this team effort is a scrollytelling website visualizing an Instagram user’s interactions with her feed alongside a reflexive conversation about her behavior.
We were tasked with conducting a so-called data-based interview: a single person was to be interviewed on a freely chosen topic. During this interview, additional quantitative data was to be collected with the help of sensors, which could also be freely chosen.
The goal was to juxtapose these two data sources by displaying the sensor data alongside corresponding interview segments. This combination was to be presented within the framework of a journalistic article and implemented as an interactive web page.
Our group decided to investigate the use of social media, focusing on Instagram. Instagram is currently one of the most popular and most widely used social media platforms. The possibilities of interaction that the app provides are the same for every user, and yet each of us experiences the journey through our personalized feed differently.
Where does our fascination with Instagram come from? What makes us invest our valuable time into using this app for up to several hours each day? Which parts of this experience consume most of our attention? And do we act autonomously or does Instagram play its part in pulling the strings? With the project “touch, touch, swipe”, we aimed to get to the bottom of these questions.
Sensor Data
As the first step, we had to decide which sensors to use for collecting our quantitative data. Since our interest revolved around a user’s attention, the first method we considered was eye tracking. In combination with Instagram’s own usage stats, this approach seemed quite promising. Unfortunately, we were not able to establish a working eye-tracking workflow in the short time available. We thus opted for a completely different tracking method.
Android’s developer tools make it possible to establish a special debugging connection to an Android device via USB. Through this connection, we were able to access and record the device’s internal log of touch events; we found instructions for this on GitHub, along with a Python script that stores the touch data in a more accessible format. We were thus able to record all touches and swipes a user performs while scrolling through and interacting with Instagram.
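The GitHub script we used is not reproduced here; as a rough sketch of what parsing Android’s touch log can look like, the snippet below reads lines in the style of `adb shell getevent -lt` output. The log format shown is an assumption based on typical devices; real output varies by device and kernel.

```python
import re

# Example lines in the style of `adb shell getevent -lt` output
# (format is an assumption; real output varies by device).
SAMPLE = """\
[   1000.000100] /dev/input/event2: EV_KEY       BTN_TOUCH            DOWN
[   1000.000200] /dev/input/event2: EV_ABS      ABS_MT_POSITION_X    0000021c
[   1000.000300] /dev/input/event2: EV_ABS      ABS_MT_POSITION_Y    00000345
[   1000.050000] /dev/input/event2: EV_KEY       BTN_TOUCH            UP
"""

LINE = re.compile(
    r"\[\s*(?P<ts>[\d.]+)\]\s+\S+:\s+(?P<type>\S+)\s+(?P<code>\S+)\s+(?P<value>\S+)"
)

def parse_getevent(text):
    """Turn raw getevent log lines into (timestamp, event code, value) tuples."""
    events = []
    for line in text.splitlines():
        m = LINE.match(line)
        if not m:
            continue
        value = m["value"]
        # Coordinate values are reported as hexadecimal numbers.
        if m["type"] == "EV_ABS":
            value = int(value, 16)
        events.append((float(m["ts"]), m["code"], value))
    return events

events = parse_getevent(SAMPLE)
```

From tuples like these, consecutive events between a touch-down and touch-up can then be grouped into individual gestures.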
Interview
For the interview, we opted for a narrative, loosely structured approach that would prompt our interviewee to do most of the talking herself, rather than follow a predetermined structure. We nonetheless prepared a catalogue of possible questions beforehand, which we drew on during certain sections and topics of the conversation.
Data Sketching
In this very early phase of the project, we already started to sketch out possible uses and visualizations of our data, both on paper and digitally. This continuous sketching of visualization ideas helped us explore (and discard) many options and possibilities from early on. As a consequence, we were able to decide on a direction relatively quickly once the data was finally ready to be used much later.
Touch Event Data Pipeline
The data pipeline for our touch event recordings turned out to be quite complex. Using Python scripts embedded within a Jupyter Notebook, we restructured the raw event log from the Android phone into data points, each representing a single interaction. Touches were sorted into different categories based on their positions on the screen, while for swipes the overall direction of movement was calculated.
Visualizations of the touches’ coordinates and a timeline of events (both of which you can view in the slides above) helped us understand and inspect the data we had collected. We ended up exporting this reorganized data as a JSON file for use in the web page.
Interview Setup
After multiple test runs that each revealed certain problems and shortcomings with our initial approaches, we decided to divide our interview process into two stages. In the first stage, we let our test user scroll through her Instagram feed alone, interacting with the app naturally. We recorded the touch events of all these interactions using the methods described above and also recorded a video of the screen.
This screen recording was essential for the second stage of the interview. After using the app, we asked the user to watch the screen recording together with us, prompting her to explain to us what she had been doing on her journey through the app. Within these mostly autonomous reflections, we also asked questions from the catalogue we had prepared beforehand.
Again, we recorded audio and video of this conversation. This recording later served as the basis for the transcription of the interview, the selection of specific segments as well as the synchronization of these segments with our recording of touch events. The introduction and conclusion texts, which frame our conversation and provide context, were written as one of the very last steps, after the editing of our interview data had been completed.
Synchronization
Finally, our two interview stages as well as our touch event data had to be synchronized. This synchronization worked by associating each segment of the conversation with a corresponding timestamp within our timeline of touch events. Our pipeline for transforming the interview data was once again quite complex: it ranged from an export of chapter markers from Adobe Premiere, through the manual insertion of our interview transcript using Excel, to another Python script that calculated proper timestamps and exported the interview sections as another JSON file.
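The final step of that pipeline can be sketched roughly as below. The marker timecodes, transcript rows, and the frame rate are hypothetical placeholders, not our actual data.

```python
import json

# Assumed frame rate for converting "HH:MM:SS:FF" timecodes to seconds.
FPS = 25

def timecode_to_seconds(tc, fps=FPS):
    """Convert a Premiere-style HH:MM:SS:FF timecode into seconds."""
    h, m, s, frames = (int(part) for part in tc.split(":"))
    return h * 3600 + m * 60 + s + frames / fps

# Hypothetical chapter markers exported from Premiere and transcript
# segments inserted via Excel, in matching order.
markers = ["00:00:05:00", "00:01:10:12"]
segments = ["Here I open the app ...", "This is where I skip a story ..."]

synced = [
    {"t": timecode_to_seconds(tc), "text": text}
    for tc, text in zip(markers, segments)
]
out = json.dumps(synced)  # second JSON file for the web page
```

Each record now carries the timestamp at which its interview segment should freeze the touch-event timeline.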
Data Analysis
To be able to translate our collected data into actual interpretations of the actions being taken by our user, we examined the Instagram app in detail. We noted down the different types of content available (posts, stories, IGTV videos, etc.) and the possibilities of interaction given within the contexts of accessing each of those content types. The same gestures, like touches in a certain position or swipes in a certain direction, turned out to cause different actions within different sections of the app – so the same touch data within our recording signifies the skipping of a post in some situations and a more in-depth engagement with a post in others. This insight was crucial for giving our data context and meaning as well as providing us with a clear outlook on what we wanted our visualization of the touch events to communicate.
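This context-dependence can be captured with a simple lookup keyed on app section and gesture. The section names and mappings below are illustrative reconstructions, not a complete model of Instagram’s interface.

```python
# The same gesture triggers different actions in different parts of the app.
# Sections, gestures, and action labels here are illustrative assumptions.
ACTIONS = {
    ("feed", "swipe_up"): "scroll to next posts",
    ("feed", "touch_double"): "like post",
    ("story", "touch_right"): "skip to next story",
    ("story", "swipe_down"): "close stories",
}

def interpret(section, gesture):
    """Resolve a recorded gesture into a user action, given the app section."""
    return ACTIONS.get((section, gesture), "unknown interaction")
```

With a table like this, the same swipe in our recording can be labeled as skipping content in one section and as in-depth engagement in another.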
Information Structure
For sketching out the multiple layers our collected information consisted of, we used a website template provided within the course as our point of departure. While on a technical level this template turned out to be too complex and too narrowly focused to adapt to our specific needs (prompting us to develop our own website using Vue.js), seeing how the layers of data were combined and arranged within the template helped us derive our own possible information architecture. Certain visual and layout-related elements of the template were incorporated into our own web page.
Data Visualization
The sensor data to be visualized were our test user’s touches and swipes carried out while interacting with the Instagram app. Following input from Francesca, we decided to place the touches and swipes within a stylized wireframe of our test device. We chose to represent single touches as larger circles and swipes as a trail of smaller, overlapping circles that sequentially appear following the direction of movement.
We additionally chose to indicate the current type of content within the app as well as the action triggered by the most recent interaction as status labels above and below the device. The faint, simplified UI elements displayed in the background of the device’s screen react to changes in the type of content viewed by our test user.
Website Interface
The layout of our website was partly derived from the provided template, with the transcript of our interview segments being placed on the left and the visualization of our sensor data on the right side. The functionality and interactivity of the website revolves around the paradigm that the passage of time is tied to the scroll position. Scrolling up and down influences the timestamp visible to the right of the device wireframe and, consequently, the touch events displayed on the device’s screen.
When an interview segment on the left scrolls into view, the data visualization freezes at the point in time this interview section is associated with. The specific interaction the conversation refers to is additionally highlighted with a yellow background in the text. Once each interview segment scrolls above the center of the screen, the passage of time and the visualization resume.
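At its core, the scroll-to-time paradigm is a linear mapping with clamping. The sketch below shows the idea in Python with assumed constants; the actual site implements this in Vue.js on the client.

```python
# Assumed values for illustration: length of the recorded session and
# the scrollable height of the article in pixels.
TOTAL_SECONDS = 600
PAGE_HEIGHT = 12000

def scroll_to_timestamp(scroll_y):
    """Map a scroll offset to a point in the touch-event timeline."""
    fraction = min(max(scroll_y / PAGE_HEIGHT, 0.0), 1.0)
    return fraction * TOTAL_SECONDS
```

The freezing behavior then amounts to ignoring this mapping while an interview segment is pinned in view and resuming it once the segment passes the screen center.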
Visual Language
The visual language of the website was defined in a series of rapid design iterations which all happened within the final two days of the project. With the help of a mood board that had been compiled beforehand, we designed several variants in Figma, discussed the pros and cons of each draft, combined certain parts of those designs and finally decided on a visual style which we all liked best. Our visual language incorporates certain elements from the Instagram interface while also being inspired by the rapid up-and-down movement of scrolls and swipes.
Below, you can see a screen capture of our final, interactive website. You can also try scrolling yourself at https://lab.jona.im/swipe/.
Tim
My biggest takeaway from this course was gaining a quick insight into the whole process of a data visualization project. Starting with collecting data with a sensor, conducting an interview, cleaning the data, and finishing with the visualization and coding, I learned a lot about the possible methods and tools of a data viz project.
Working together with a very motivated and skilled team made it possible to develop a working concept and build a functional prototype in the short project time.
The course was a big motivation to dive deeper into Jupyter notebooks, JavaScript and working with sensor data.
Vi
Working on this project gave me my very first experiences with quantitative research, data journalism and data visualization.
I have to say it was a fascinating journey to work with such motivated and supportive team members in a smoothly planned workflow, and with the helpful guidance of Francesca and Prof. Marian Dörk. Getting my hands on the analysis of sensor data in combination with interview data for the first time, I learned new ways of processing information and got to know data visualization tools that I hadn’t had the chance to explore before. After this course, I definitely want to find out more about Jupyter notebooks, the use of Vue.js and the methods of data journalism.
Jona
I very much enjoyed being part of this interesting project and fantastic team. The individual strengths within our group were really well balanced, enabling us to cooperate effectively, work on various tasks in parallel and, in the end, complete the rather complex structure of this project in this short period of time.
I also really liked the course’s approach of combining qualitative and quantitative data, as the two are rarely synthesized in this way. This was my first time going through a full quantitative data collection workflow and I learned a lot about the intricacies of successfully capturing sensor data as well as the restructuring and “massaging” of the data (as Prof. Dörk likes to put it) using Jupyter Notebooks.
This project has definitely solidified my interest in data visualization and I would love to pursue more projects in this direction. Thanks a lot to Francesca and Prof. Marian Dörk for their support and to my teammates for this great experience. This start to my studies at FHP couldn’t have been better :)
Anna
Working on this project brought me great joy. We were a very good team, and the teamwork therefore went very well. I myself was surprised by how much the topic offered in terms of content, as I was rather skeptical about it at first and took it on mainly to challenge myself.
Working through the entire workflow of a data visualization project within the short span of one project week showed me very clearly the sequence of steps that build on each other, as well as my own personal strengths and weaknesses in this long process. I want and need to learn to visualize better :) And of course there is JavaScript, waiting patiently on the internet for me to dive in.
I liked data journalism as a field of activity for interface designers so much that I am in fact considering making it my major field of study during the Hauptstudium at the FHP.
A sincere thank you for that wonderful experience to the team, to Francesca and to Prof. Marian Dörk.