HCI

Digital Phenotyping - EMOCHI'22


Posted on April 13, 2022, 6:41 p.m.


New Sensors and Features for Digital Phenotyping on Off-the-Shelf Smart Devices

Jiwan Kim and Ian Oakley
CHI'22 Workshop: The Future of Emotion in Human-Computer Interaction

EMOCHI'22 Digital Phenotyping.pdf

Abstract

New forms of digital healthcare are being enabled by smartphones and other smart wearables. These widely used devices can provide rich sets of sensor data regarding the behaviors of their users, and this data can be used to power various forms of medical monitoring or diagnosis. Digital phenotyping is one approach in this area that primarily uses smartphone data to detect or recognize cognitive, behavioral, or affective states and traits. It is typically based on data from sensor measurements (e.g., inertial or touch sensors), activity or use logs, and analysis of user-generated content. While these data sources are valuable, we argue they do not leverage the full potential of modern smart devices: the sensing capabilities of current smartphones go considerably beyond the feature sets that have been previously studied. In our work, we focus on using the advanced capabilities of off-the-shelf smart devices to generate new digital biomarkers that can support the detection of affect. In this paper, we discuss two examples: eye gaze (using a smartphone eye-tracker) and breathing rate (using inaudible active sonar). We believe that extending the scope of affective digital phenotyping to include these new sensing modalities will increase the reliability and robustness of the emotional state detection it enables.

Introduction

Digital phenotyping, defined as the “moment-by-moment, in situ quantification of the individual-level human phenotype using data from personal digital devices” [2], exploits data captured from smart devices’ sensor channels to assess human states and behaviors. Using these data, digital phenotyping has been shown to accurately detect or predict a wide range of cognitive traits, affective states, and behaviors, including mood disorders, stress, and substance abuse. It has also been shown to be valuable in a range of tasks, from prevention to early diagnosis of clinical conditions through monitoring and treatment. The wide range of states that can be detected and monitored, and the wide variety of purposes this information can serve, promises to revolutionize mental well-being and healthcare, moving from a paradigm of formal testing in clinical settings to one of continuous assessment in daily life. By enabling fine-grained tracking and monitoring, digital phenotyping may also enable customized and timely interventions that have the potential to support improved mental health and well-being.

The types of digital biomarkers [1] informing current phenotyping efforts are varied and include: location data; phone/app utilization data; social media data; touch screen input data; device motion data; data from physiological sensors (e.g., heart rate monitors); and voice analysis. While this data is rich, it also often suffers from privacy issues (e.g., location, activity, social media use, voice), or is relatively coarse (e.g., inter-key-press time) or sparsely captured (e.g., heart rate). As such, there are many situations in which biomarkers are not viable (e.g., privacy is sacrificed) or simply not observed (e.g., due to infrequent phone use). For digital phenotyping to achieve its full potential, there is a need both to increase the available set of biomarkers and to increase the richness of information that can be gained from each one. One way of addressing these concerns is to develop new digital biomarkers based on emerging mobile sensing channels, such as eye-tracking [6] or sonar [3]. These sensor channels can provide entirely new biomarkers (e.g., from gaze patterns) or complement existing biomarkers (e.g., sonar data during key tap events), and we argue they have strong potential to increase both the available use scenarios and the diagnostic salience of digital phenotyping systems.
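To make the above concrete, the sketch below illustrates how a handful of these conventional biomarker channels might be collapsed into a per-day feature vector for a downstream state classifier. This is an illustrative assumption about such a pipeline, not our implementation; all field names, units, and windowing choices are hypothetical.

```python
# A minimal, hypothetical sketch (not our implementation) of aggregating
# common smartphone biomarker streams into a fixed-length daily feature
# vector for a downstream classifier. All field names are illustrative.
from dataclasses import dataclass
from statistics import mean, pstdev
from typing import List

@dataclass
class DayLog:
    key_press_intervals_s: List[float]  # inter-key-press times from touch input
    screen_unlocks: int                 # phone utilization
    heart_rates_bpm: List[float]        # sparse physiological samples
    places_visited: int                 # coarse location summary

def feature_vector(day: DayLog) -> List[float]:
    """Collapse one day of raw biomarker streams into fixed-length features."""
    taps = day.key_press_intervals_s
    hr = day.heart_rates_bpm
    return [
        mean(taps) if taps else 0.0,             # typing speed proxy
        pstdev(taps) if len(taps) > 1 else 0.0,  # typing regularity
        float(day.screen_unlocks),
        mean(hr) if hr else 0.0,
        float(day.places_visited),
    ]

# Example: one day's summary, ready for a downstream affect/state classifier.
print(feature_vector(DayLog([0.21, 0.34, 0.18], 42, [61.0, 74.0], 3)))
```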

Gaze-Based Affect Detection during SNS Use

Online Social Networking Services (SNS) allow users to interact (e.g., comment, like, post, message) with other users, activities which evoke diverse affective experiences, including both positive and negative emotions (e.g., envy, social comparison, appearance comparison). To mitigate those negative experiences, our group has developed affect monitoring systems based on sensor data captured during SNS use, specifically for Facebook and Instagram, two currently popular services. In our previous work [6], we were able to detect users’ emotions as binary classes with high accuracy for both valence (94.16%) and arousal (92.28%) by using motion, touch, and eye-tracking data in relatively controlled lab settings. Eye-gaze features, in particular, showed peak performance, and we see considerable value in developing these further. Specifically, our prior work used raw features provided by Apple’s ARKit (https://developer.apple.com/augmentedreality/arkit/). These represent gaze as a series of discrete coefficients describing gaze direction (e.g., look left, look right) rather than as a more traditional representation in which a gaze point is calculated. We plan to extend this prior work with a new study that uses a more traditional gaze-point representation and seeks to determine the performance of features derived from it (e.g., saccade speed, saccade count, and fixation duration) for affect detection.
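As an illustration of the gaze-point features we plan to study, the sketch below derives fixation and saccade statistics from a gaze-point time series using a simple velocity-threshold (I-VT) classifier. The sampling rate, velocity threshold, and coordinate scaling are illustrative assumptions rather than values from our studies.

```python
# A minimal sketch, under assumed data formats, of gaze-point feature
# extraction (fixation count/duration, saccade count/speed) using a simple
# velocity-threshold (I-VT) classifier. The 60 Hz sampling rate, 30 deg/s
# threshold, and degree scaling are illustrative assumptions.
import numpy as np

def gaze_features(points, hz=60.0, saccade_thresh_deg_s=30.0, deg_per_unit=1.0):
    """points: (N, 2) gaze coordinates sampled at `hz` Hz; returns a feature dict."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return {"fixation_count": 0, "mean_fixation_duration_s": 0.0,
                "saccade_count": 0, "mean_saccade_speed_deg_s": 0.0}

    # Angular speed between consecutive samples (deg/s under the assumed scaling).
    speeds = np.linalg.norm(np.diff(pts, axis=0), axis=1) * deg_per_unit * hz
    is_saccade = speeds > saccade_thresh_deg_s

    # Group the sample-level labels into runs of consecutive saccade/fixation samples.
    changes = np.flatnonzero(np.diff(is_saccade.astype(int)))
    run_starts = np.concatenate(([0], changes + 1))
    run_labels = is_saccade[run_starts]
    run_lengths = np.diff(np.concatenate((run_starts, [len(is_saccade)])))

    fixation_runs = run_lengths[~run_labels]
    return {
        "fixation_count": int((~run_labels).sum()),
        "mean_fixation_duration_s": float(fixation_runs.mean() / hz) if fixation_runs.size else 0.0,
        "saccade_count": int(run_labels.sum()),
        "mean_saccade_speed_deg_s": float(speeds[is_saccade].mean()) if is_saccade.any() else 0.0,
    }
```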

Breathing Rate Monitoring using Active Sonar

In addition to this work on gaze, we see further opportunities to expand the scope of digital phenotyping by using entirely novel input modalities. Specifically, we identify sonar as a promising input modality in this area. Numerous prior demonstrations have shown that smartphone sonar is capable of supporting functionality as diverse as back-of-device finger tracking [8], mid-air gesture recognition [9], and breathing monitoring [7, 10]. Recently, our group also proposed a novel approach to natural and unencumbered identification of the touching finger on an unmodified smartwatch using sonar. We believe breathing rate detection has particular relevance for digital phenotyping. Prior work has suggested that sonar-based breathing rate detection is robust to various environmental changes and also to changes in smartphone orientation [10], which suggests it can perform well during daily life activities. Indeed, prior work has shown that sonar systems can be effective in various real-world situations, such as detecting opioid overdoses [4] and sleep apnea [5]. Based on these results, we suggest that breathing rate detection can be a valuable new feature to support digital phenotyping approaches for detecting affect. We believe it can also be applied to the scenario of affect detection during social media use.
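To illustrate the final step of such a pipeline, the sketch below estimates breaths per minute from a signal assumed to already track chest motion (e.g., the demodulated phase or amplitude of reflections of an inaudible 18–20 kHz tone). The demodulation front end and the frequency band limits are our assumptions and do not reproduce the specific pipelines of [7, 10].

```python
# A minimal sketch, under stated assumptions, of the last stage of sonar-based
# breathing monitoring: given a 1-D signal that tracks chest motion (assumed to
# come from earlier acoustic demodulation), estimate breaths per minute from
# the dominant low-frequency component. Band limits are illustrative.
import numpy as np

def breathing_rate_bpm(chest_signal, fs_hz, lo_hz=0.1, hi_hz=0.7):
    """chest_signal: 1-D array sampled at fs_hz Hz; returns breaths per minute."""
    x = np.asarray(chest_signal, dtype=float)
    x = x - x.mean()                              # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)    # plausible breathing band
    if not band.any():
        return 0.0
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return float(peak_hz * 60.0)

# Example with a synthetic 0.25 Hz (15 breaths/min) chest-motion trace.
fs = 20.0
t = np.arange(0, 60, 1.0 / fs)
sim = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)
print(round(breathing_rate_bpm(sim, fs), 1))      # approx. 15.0
```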

Conclusion

In conclusion, we argue that the novel sensing modalities available on smart devices, such as eye gaze and active sonar, will improve the quality of predictions that can be achieved when using digital phenotyping approaches to monitor affective states. The systems these approaches enable will be able to assess users’ emotional states during daily life activities, a capability we suggest has strong potential to improve both individual and societal well-being. For example, increased awareness of affective states may support better mental well-being and alleviate stress, anxiety, and non-clinical depression. Additionally, interventions such as behavior recommendations and affect prediction will empower individuals to manage their mental health and reduce the occurrence of clinical mental disorders such as depression.

REFERENCES

[1] Paul Dagum. 2018. Digital biomarkers of cognitive function. npj Digital Medicine 1, 1 (28 Mar 2018), 10. https://doi.org/10.1038/s41746-018-0018-4.
[2] Kit Huckvale, Svetha Venkatesh, and Helen Christensen. 2019. Toward clinical digital phenotyping: a timely opportunity to consider purpose, quality, and safety. npj Digital Medicine 2, 1 (06 Sep 2019), 88. https://doi.org/10.1038/s41746-019-0166-1.
[3] Rajalakshmi Nandakumar and Shyamnath Gollakota. 2017. Unleashing the Power of Active Sonar. IEEE Pervasive Computing 16, 1 (Jan. 2017), 11–15. https://doi.org/10.1109/MPRV.2017.15.
[4] Rajalakshmi Nandakumar, Shyamnath Gollakota, and Jacob E Sunshine. 2019. Opioid overdose detection using smartphones. Sci. Transl. Med. 11, 474 (Jan. 2019), eaau8914.
[5] Rajalakshmi Nandakumar, Shyamnath Gollakota, and Nathaniel Watson. 2015. Contactless Sleep Apnea Detection on Smartphones. In Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services (Florence, Italy) (MobiSys ’15). Association for Computing Machinery, New York, NY, USA, 45–57. https://doi.org/10.1145/2742647.2742674.
[6] Mintra Ruensuk, Eunyong Cheon, Hwajung Hong, and Ian Oakley. 2020. How Do You Feel Online: Exploiting Smartphone Sensors to Detect Transitory Emotions during Social Media Use. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 4, 4, Article 150 (Dec. 2020), 32 pages. https://doi.org/10.1145/3432223.
[7] Xingzhe Song, Boyuan Yang, Ge Yang, Ruirong Chen, Erick Forno, Wei Chen, and Wei Gao. 2020. SpiroSonic: Monitoring Human Lung Function via Acoustic Sensing on Commodity Smartphones. In Proceedings of the 26th Annual International Conference on Mobile Computing and Networking (London, United Kingdom) (MobiCom ’20). Association for Computing Machinery, New York, NY, USA, Article 52, 14 pages. https://doi.org/10.1145/3372224.3419209.
[8] Ke Sun, Ting Zhao, Wei Wang, and Lei Xie. 2018. VSkin: Sensing Touch Gestures on Surfaces of Mobile Devices Using Acoustic Signals. In Proceedings of the 24th Annual International Conference on Mobile Computing and Networking (New Delhi, India) (MobiCom ’18). Association for Computing Machinery, New York, NY, USA, 591–605. https://doi.org/10.1145/3241539.3241568.
[9] Wei Wang, Alex X. Liu, and Ke Sun. 2016. Device-Free Gesture Tracking Using Acoustic Signals. In Proceedings of the 22nd Annual International Conference on Mobile Computing and Networking (New York City, New York) (MobiCom ’16). Association for Computing Machinery, New York, NY, USA, 82–94. https://doi.org/10.1145/2973750.2973764.
[10] Xuyu Wang, Runze Huang, Chao Yang, and Shiwen Mao. 2021. Smartphone Sonar-Based Contact-Free Respiration Rate Monitoring. ACM Trans. Comput. Healthcare 2, 2, Article 15 (Feb. 2021), 26 pages. https://doi.org/10.1145/3436822.