
Smart Implicit Interaction

The “Internet of things” offers great potential for automating and hiding much of the tedium of our everyday lives. It is predicted to revolutionise logistics, transportation, household electricity consumption, connectivity, and even the management of our homes.

Yet these systems are beset with manifest human interaction problems. The fridge warns you with a beep if you leave the door open, the washing machine signals when it is finished, and even chainsaws now warn you when you have been using them for too long. Each individual system has been designed with a particular, limited interaction model: the smart lighting system in your apartment has not been designed for the sharing economy, the robotic lawn mower might run off and leave your garden, and different parts of your entertainment system turn the volume up and down and fail to work together. Each 'smart' object comes with its own form of interaction, its own mobile app, its own upgrade requirements and its own manner of calling for users’ attention. Interaction models have been inherited from the desktop metaphor, and mobile apps often rely on non-standardised icons, sounds or notification frameworks. When put together, current forms of smart technology do not blend, they cannot interface with one another, and most importantly, as end-users we have to learn how to interact with each of them, one by one.

In some senses this is like personal computing before the desktop metaphor, the Internet before the web, or mobile computing before touch interfaces. In short, IoT lacks its killer interface paradigm.

This project is built around developing a new interface paradigm that we call smart implicit interaction. Implicit interactions stay in the background, thriving on analysis of speech, movements and other contextual data, and avoid unnecessarily disturbing us or grabbing our attention. When we turn to them, depending on context and functionality, they either shift into an explicit interaction, engaging us in a classical interaction dialogue (but one that starts from an analysis of the context at hand), or they continue to engage us implicitly through entirely different modalities that do not require an explicit dialogue: the smart objects respond to the ways we move or engage in other tasks.

One form of implicit interaction we have experimented with lets mobile phones listen to the surrounding conversation and continuously adapt so that a relevant starting point is ready once the user decides to turn to the phone. When the user activates the mobile, the search app may already have search terms from the conversation inserted, the map app may show places discussed in the conversation, or, if the weather was mentioned and the user was located in their garden, the gardening app may have combined the weather forecast with data from the humidity sensor in the garden to provide a relevant starting point. This is only possible by drawing on massive data sets and continuously adapting to what people say, their indoor and outdoor location, their movements and any smart objects in their environment, thriving off the whole ecology of artefacts, people and their practices.
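To make the paradigm concrete, the sketch below shows one way such background adaptation could be wired up: context is collected implicitly (speech snippets, location, sensor readings) and is only turned into app starting points at the moment the user explicitly turns to the device. This is a minimal illustration under stated assumptions, not the project's actual implementation; the names (ContextSnapshot, suggest_starting_points) and the toy keyword extraction are hypothetical.

```python
# Hypothetical sketch of the implicit-to-explicit hand-over described above.
# All names and the simplistic keyword extraction are illustrative assumptions,
# not code from the project.

from collections import Counter
from dataclasses import dataclass, field

STOPWORDS = {"the", "a", "an", "is", "it", "and", "to", "we", "in", "of", "on"}

@dataclass
class ContextSnapshot:
    """Rolling context gathered implicitly, in the background."""
    recent_utterances: list = field(default_factory=list)  # transcribed speech snippets
    location: str = "unknown"                               # e.g. "garden", "kitchen"
    sensor_readings: dict = field(default_factory=dict)     # e.g. {"soil_humidity": 0.31}

    def observe_speech(self, utterance: str) -> None:
        self.recent_utterances.append(utterance.lower())

    def keywords(self, top_n: int = 3) -> list:
        """Toy keyword extraction: most frequent non-stopwords in recent speech."""
        words = [w.strip(".,?!") for u in self.recent_utterances for w in u.split()]
        counts = Counter(w for w in words if w and w not in STOPWORDS)
        return [w for w, _ in counts.most_common(top_n)]

def suggest_starting_points(ctx: ContextSnapshot) -> dict:
    """Turn implicit context into starting points, surfaced only on explicit use."""
    kws = ctx.keywords()
    suggestions = {
        "search_app": {"prefilled_query": " ".join(kws)},
        "map_app": {"highlight_terms": kws},
    }
    if "weather" in kws and ctx.location == "garden":
        suggestions["gardening_app"] = {
            "show": ["weather_forecast"],
            "soil_humidity": ctx.sensor_readings.get("soil_humidity"),
        }
    return suggestions

if __name__ == "__main__":
    ctx = ContextSnapshot(location="garden",
                          sensor_readings={"soil_humidity": 0.31})
    # Background phase: listen and adapt without grabbing attention.
    ctx.observe_speech("Do you think the weather will hold this weekend?")
    ctx.observe_speech("The weather forecast said rain, the tomatoes might like that.")
    # Explicit phase: the user picks up the phone and gets adapted starting points.
    print(suggest_starting_points(ctx))
```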

Team

Kristina Höök, Professor
Anna Ståhl, PhD, Senior Researcher, anna.stahl@ri.se
Karey Helms
Pavel Karpashevich
Vasiliki Tsaknaki
Ylva Fernaeus, Associate Professor
Pedro Sanches
Madeline Balaam, Professor
Charles Windlin

Funding

SSF (Swedish Foundation for Strategic Research)

Project duration

2016–2021

Publications

[1] K. Helms, "A Speculative Ethics for Designing with Bodily Fluids," in CHI Conference on Human Factors in Computing Systems Extended Abstracts, 2022.
[2] P. Tennent et al., "Articulating Soma Experiences using Trajectories," in CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021, pp. 1-16.
[3] K. Helms et al., "Away and (Dis)connection: Reconsidering the Use of Digital Technologies in Light of Long-Term Outdoor Activities," Proceedings of the ACM on Human-Computer Interaction, vol. 3, no. GROUP, 2019.
[4] K. Helms, "Careful Design: Implicit Interactions with Care, Taboo, and Humor," in Designing Interactive Systems (DIS 2020), 2020.
[5] A. Russo, D. Foffano and A. Proutiere, "Conformal Off-Policy Evaluation in Markov Decision Processes," in 62nd IEEE Conference on Decision and Control, Dec. 13-15, 2023, Singapore, 2023.
[6] M. Gaissmaier et al., "Designing for Workplace Safety: Exploring Interactive Textiles as Personal Alert Systems," in Proceedings of the Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction, 2020, pp. 53-65.
[7] P. Karpashevich, "Designing Monstrous Experiences Through Soma Design," Doctoral thesis, Stockholm: KTH Royal Institute of Technology, TRITA-EECS-AVL, 2023:38, 2023.
[8] K. Helms, "Designing with care: Self-centered research for interaction design otherwise," Doctoral thesis, KTH Royal Institute of Technology, TRITA-EECS-AVL, 2023:7, 2023.
[9] C. Windlin, "Designing with the Body: Addressing Emotion Regulation and Expression," in DIS ’20 Companion, Doctoral Consortium, July 6–10, 2020, Eindhoven, Netherlands, 2020.
[10] K. Helms, "Do You Have to Pee? A Design Space for Intimate and Somatic Data," in ACM Conference on Designing Interactive Systems (DIS 2019), June 23–28, 2019, San Diego, CA, USA, 2019, pp. 1209-1222.
[11] M. Balaam et al., "Emotion Work in Experience-Centred Design," in CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland, UK, 2019.
[12] K. Helms, "Entangled Reflections on Designing with Leaky Breastfeeding Bodies," in Proceedings of the 2021 Designing Interactive Systems Conference (DIS ’21), Virtual Event, 2021.
[13] V. Tsaknaki et al., "“Feeling the Sensor Feeling you”: A Soma Design Exploration on Sensing Non-habitual Breathing," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21), 2021, pp. 1-16.
[14] M. Alfaras et al., "From Biodata to Somadata," in CHI ’20, April 25–30, 2020, Honolulu, HI, USA, 2020.
[15] P. Ferreira et al., "From nomadic work to nomadic leisure practice: A study of long-term bike touring," Proceedings of the ACM on Human-Computer Interaction, vol. 3, 2019.
[16] J. Miniotaitė, V. Pakulytė and Y. Fernaeus, "Gentle Gestures of Control: On the Somatic Sensibilities of an IoT Remote App," Diseña, no. 20, pp. 1-16, 2022.
[17] K. Helms, M. L. J. Søndergaard and N. Campo Woytuk, "Scaling Bodily Fluids For Utopian Fabulations," in Proceedings of the 9th Bi-Annual Nordic Design Research Society Conference: Matters of Scale, 2021.
[18] C. Windlin, "Shape and Being Shaped: Sketching with Haptics in Soma Design," Doctoral thesis, Stockholm: KTH Royal Institute of Technology, TRITA-EECS-AVL, 2023:34, 2023.
[19] C. Windlin et al., "Soma Bits - Mediating Technology to Orchestrate Bodily Experiences," in Proceedings of the 4th Biennial Research Through Design Conference, 19–22/03/2019, 2019.
[20] P. Tennent et al., "Soma Design and Sensory Misalignment," in 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020, 2020.
[21] M. Sahlgren et al., "The Smart Data Layer," in Papers from the 2018 AAAI Spring Symposium on Artificial Intelligence for the Internet of Everything, 2018.
[22] K. Helms and Y. Fernaeus, "Troubling Care: Four Orientations for Wickedness in Design," in ACM Conference on Designing Interactive Systems (DIS 2021), 2021, pp. 789-801.