Julian Steil (PhD Student)

MSc Julian Steil

Address
Max-Planck-Institut für Informatik
Saarland Informatics Campus
Phone
+49 681 9325 2000
Fax
+49 681 9325 2099

Personal Information

Education

I received my Bachelor’s degree in Computer Science from Saarland University in February 2013. Instead of continuing with the Master’s program in Computer Science, I specialized in the fields of image acquisition, analysis, and synthesis, which require profound scientific knowledge of computer science, mathematics, physics, engineering, and cognitive science. In December 2014 I received an interdisciplinary Master’s degree in Visual Computing from Saarland University and started my PhD in the Perceptual User Interfaces group.

 

Research Interests

  • Mobile Eye Tracking
  • Image Processing and Computer Vision
  • Machine Learning and Pattern Recognition
  • Human-Computer Interaction

Publications

Steil, J. (2019). Mobile Eye Tracking for Everyone. Universität des Saarlandes, Saarbrücken.
Abstract
Eye tracking and gaze-based human-computer interfaces have become a practical modality in desktop settings, since remote eye tracking is efficient and affordable. However, remote eye tracking remains constrained to indoor, laboratory-like conditions, in which lighting and user position need to be controlled. Mobile eye tracking has the potential to overcome these limitations and to allow people to move around freely and to use eye tracking on a daily basis during their everyday routine. However, mobile eye tracking currently faces two fundamental challenges that prevent it from being practically usable and that, consequently, have to be addressed before mobile eye tracking can truly be used by everyone: Mobile eye tracking needs to be advanced and made fully functional in unconstrained environments, and it needs to be made socially acceptable. Numerous sensing and analysis methods were initially developed for remote eye tracking and have been successfully applied for decades. Unfortunately, these methods are limited in terms of functionality and correctness, or even unsuitable for application in mobile eye tracking. Therefore, the majority of fundamental definitions, eye tracking methods, and gaze estimation approaches cannot be borrowed from remote eye tracking without adaptation. For example, the definitions of specific eye movements, like classical fixations, need to be extended to mobile settings where natural user and head motion are omnipresent. Corresponding analytical methods need to be adjusted or completely reimplemented based on novel approaches encoding the human gaze behaviour. Apart from these technical challenges, an entirely new, and yet under-explored, topic required for the breakthrough of mobile eye tracking as everyday technology is the overcoming of social obstacles. A first crucial key issue to defuse social objections is the building of acceptance towards mobile eye tracking. 
Hence, it is essential to replace the bulky appearance of current head-mounted eye trackers with an unobtrusive, appealing, and trendy design. The second high-priority theme of increasing importance for everyone is privacy and its protection, given that research and industry have not focused on or taken care of this problem at all. To establish true confidence, future devices have to find a fine balance between protecting users’ and bystanders’ privacy and attracting and convincing users of their necessity, utility, and potential with useful and beneficial features. The solution of technical challenges and social obstacles is the prerequisite for the development of a variety of novel and exciting applications in order to establish mobile eye tracking as a new paradigm, which eases our everyday life. This thesis addresses core technical challenges of mobile eye tracking that currently prevent it from being widely adopted. Specifically, this thesis proves that 3D data used for the calibration of mobile eye trackers improves gaze estimation and significantly reduces the parallax error. Further, it presents the first effective fixation detection method for head-mounted devices that is robust against the prevalence of user and gaze target motion. In order to achieve social acceptability, this thesis proposes an innovative and unobtrusive design for future mobile eye tracking devices and builds the first prototype with fully frame-embedded eye cameras combined with a calibration-free deep-trained appearance-based gaze estimation approach. To protect users’ and bystanders’ privacy in the presence of head-mounted eye trackers, this thesis presents another first-of-its-kind prototype. It is able to identify privacy-sensitive situations to automatically enable and disable the eye tracker’s first-person camera by means of a mechanical shutter, leveraging the combination of deep scene and eye movement features.
Nevertheless, solving technical challenges and social obstacles alone is not sufficient to make mobile eye tracking attractive for the masses. The key to success is the development of convincingly useful, innovative, and essential applications. To extend the protection of users’ privacy on the software side as well, this thesis presents the first privacy-aware VR gaze interface using differential privacy. This method adds noise to recorded eye tracking data so that privacy-sensitive information like a user’s gender or identity is protected without impeding the utility of the data itself. In addition, the first large-scale online survey is conducted to understand users’ concerns with eye tracking. To develop and evaluate novel applications, this thesis presents the first publicly available long-term eye tracking datasets. They are used to show the unsupervised detection of users’ activities from eye movements alone using novel and efficient video-based encoding approaches as well as to propose the first proof-of-concept method to forecast users’ attentive behaviour during everyday mobile interactions from phone-integrated and body-worn sensors. This opens up possibilities for the development of a variety of novel and exciting applications. With more advanced features, accompanied by technological progress and sensor miniaturisation, eye tracking is increasingly integrated into conventional glasses as well as virtual and augmented reality (VR/AR) head-mounted displays, becoming an integral component of mobile interfaces. This thesis paves the way for the development of socially acceptable, privacy-aware, but highly functional mobile eye tracking devices and novel applications, so that mobile eye tracking can develop its full potential to become an everyday technology for everyone.
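The differential privacy approach mentioned in the abstract can be pictured with a minimal Laplace-mechanism sketch. This is a generic illustration of the underlying idea, not the exact mechanism from the thesis; the gaze feature, the sensitivity bound, and the epsilon value below are hypothetical.

```python
import math
import random

def laplace_noise(scale, rng=random):
    # Draw Laplace(0, scale) noise via the inverse-CDF method.
    u = rng.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(value, sensitivity, epsilon, rng=random):
    # Standard epsilon-DP Laplace mechanism: noise scale = sensitivity / epsilon.
    return value + laplace_noise(sensitivity / epsilon, rng)

# Hypothetical aggregate gaze feature: mean fixation duration in milliseconds.
durations = [180.0, 220.0, 250.0, 210.0]
true_mean = sum(durations) / len(durations)

# Illustrative bound: assume one participant can shift the mean by at most
# 100 / n ms; that bound is the sensitivity of the mean query.
sensitivity = 100.0 / len(durations)
private_mean = laplace_mechanism(true_mean, sensitivity, epsilon=1.0)
```

Smaller epsilon values inject more noise and give stronger privacy; the utility/privacy trade-off studied in the thesis corresponds to choosing epsilon so the perturbed feature still supports the intended analysis.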
BibTeX
@phdthesis{Steilphd2019, TITLE = {Mobile Eye Tracking for Everyone}, AUTHOR = {Steil, Julian}, LANGUAGE = {eng}, DOI = {10.22028/D291-30004}, SCHOOL = {Universit{\"a}t des Saarlandes}, ADDRESS = {Saarbr{\"u}cken}, YEAR = {2019}, MARGINALMARK = {$\bullet$}, DATE = {2019}, }
Endnote
%0 Thesis %A Steil, Julian %Y Bulling, Andreas %A referee: Krüger, Antonio %A referee: Kasneci, Enkelejda %+ Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society International Max Planck Research School, MPI for Informatics, Max Planck Society Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society External Organizations External Organizations %T Mobile Eye Tracking for Everyone : %G eng %U http://hdl.handle.net/21.11116/0000-0005-652F-6 %R 10.22028/D291-30004 %I Universität des Saarlandes %C Saarbrücken %D 2019 %P 272 p. %V phd %9 phd %U https://publikationen.sulb.uni-saarland.de/handle/20.500.11880/28498
Steil, J., Tonsen, M., Sugano, Y., & Bulling, A. (2019). InvisibleEye: Fully Embedded Mobile Eye Tracking Using Appearance-Based Gaze Estimation. GetMobile, 23(2). doi:10.1145/3372300.3372307
BibTeX
@article{Steil_2019GetMobile, TITLE = {{InvisibleEye}: {F}ully Embedded Mobile Eye Tracking Using Appearance-Based Gaze Estimation}, AUTHOR = {Steil, Julian and Tonsen, Marc and Sugano, Yusuke and Bulling, Andreas}, LANGUAGE = {eng}, ISSN = {2375-0529}, DOI = {10.1145/3372300.3372307}, PUBLISHER = {ACM}, ADDRESS = {New York, NY}, YEAR = {2019}, MARGINALMARK = {$\bullet$}, DATE = {2019}, JOURNAL = {GetMobile}, VOLUME = {23}, NUMBER = {2}, PAGES = {30--34}, }
Endnote
%0 Journal Article %A Steil, Julian %A Tonsen, Marc %A Sugano, Yusuke %A Bulling, Andreas %+ Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society External Organizations External Organizations %T InvisibleEye: Fully Embedded Mobile Eye Tracking Using Appearance-Based Gaze Estimation : %G eng %U http://hdl.handle.net/21.11116/0000-0005-6586-2 %R 10.1145/3372300.3372307 %7 2019 %D 2019 %J GetMobile %O Mobile Computing and Communications %V 23 %N 2 %& 30 %P 30 - 34 %I ACM %C New York, NY %@ false
Steil, J., Hagestedt, I., Huang, M. X., & Bulling, A. (2019). Privacy-Aware Eye Tracking Using Differential Privacy. In Proceedings ETRA 2019. Denver, CO, USA: ACM. doi:10.1145/3314111.3319915
BibTeX
@inproceedings{steil19_etra2, TITLE = {Privacy-Aware Eye Tracking Using Differential Privacy}, AUTHOR = {Steil, Julian and Hagestedt, Inken and Huang, Michael Xuelin and Bulling, Andreas}, LANGUAGE = {eng}, ISBN = {978-1-4503-6709-7}, DOI = {10.1145/3314111.3319915}, PUBLISHER = {ACM}, YEAR = {2019}, MARGINALMARK = {$\bullet$}, DATE = {2019}, BOOKTITLE = {Proceedings ETRA 2019}, EDITOR = {Krejtz, Krzysztof and Sharif, Bonita}, EID = {27}, ADDRESS = {Denver, CO, USA}, }
Endnote
%0 Conference Proceedings %A Steil, Julian %A Hagestedt, Inken %A Huang, Michael Xuelin %A Bulling, Andreas %+ Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society External Organizations Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society External Organizations %T Privacy-Aware Eye Tracking Using Differential Privacy : %G eng %U http://hdl.handle.net/21.11116/0000-0003-2BCC-8 %R 10.1145/3314111.3319915 %D 2019 %B 11th ACM Symposium on Eye Tracking Research & Applications %Z date of event: 2019-06-25 - 2019-06-28 %C Denver, CO, USA %B Proceedings ETRA 2019 %E Krejtz, Krzysztof; Sharif, Bonita %Z sequence number: 27 %I ACM %@ 978-1-4503-6709-7
Steil, J., Koelle, M., Heuten, W., Boll, S., & Bulling, A. (2019). PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features. In Proceedings ETRA 2019. Denver, CO, USA: ACM. doi:10.1145/3314111.3319913
BibTeX
@inproceedings{steil19_etra, TITLE = {{PrivacEye}: {P}rivacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features}, AUTHOR = {Steil, Julian and Koelle, Marion and Heuten, Wilko and Boll, Susanne and Bulling, Andreas}, LANGUAGE = {eng}, ISBN = {978-1-4503-6709-7}, DOI = {10.1145/3314111.3319913}, PUBLISHER = {ACM}, YEAR = {2019}, MARGINALMARK = {$\bullet$}, DATE = {2019}, BOOKTITLE = {Proceedings ETRA 2019}, EDITOR = {Krejtz, Krzysztof and Sharif, Bonita}, EID = {26}, ADDRESS = {Denver, CO, USA}, }
Endnote
%0 Conference Proceedings %A Steil, Julian %A Koelle, Marion %A Heuten, Wilko %A Boll, Susanne %A Bulling, Andreas %+ Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society External Organizations External Organizations External Organizations External Organizations %T PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features : %G eng %U http://hdl.handle.net/21.11116/0000-0003-2BC0-4 %R 10.1145/3314111.3319913 %D 2019 %B 11th ACM Symposium on Eye Tracking Research & Applications %Z date of event: 2019-06-25 - 2019-06-28 %C Denver, CO, USA %B Proceedings ETRA 2019 %E Krejtz, Krzysztof; Sharif, Bonita %Z sequence number: 26 %I ACM %@ 978-1-4503-6709-7
Steil, J., Huang, M. X., & Bulling, A. (2018). Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets. In Proceedings ETRA 2018. Warsaw, Poland: ACM. doi:10.1145/3204493.3204538
BibTeX
@inproceedings{steil18_etra, TITLE = {Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets}, AUTHOR = {Steil, Julian and Huang, Michael Xuelin and Bulling, Andreas}, LANGUAGE = {eng}, ISBN = {978-1-4503-5706-7}, DOI = {10.1145/3204493.3204538}, PUBLISHER = {ACM}, YEAR = {2018}, MARGINALMARK = {$\bullet$}, DATE = {2018}, BOOKTITLE = {Proceedings ETRA 2018}, PAGES = {1--9}, EID = {23}, ADDRESS = {Warsaw, Poland}, }
Endnote
%0 Conference Proceedings %A Steil, Julian %A Huang, Michael Xuelin %A Bulling, Andreas %+ Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society %T Fixation Detection for Head-Mounted Eye Tracking Based on Visual Similarity of Gaze Targets : %G eng %U http://hdl.handle.net/21.11116/0000-0001-1DAC-E %R 10.1145/3204493.3204538 %D 2018 %B ACM Symposium on Eye Tracking Research & Applications %Z date of event: 2018-06-14 - 2018-06-17 %C Warsaw, Poland %B Proceedings ETRA 2018 %P 1 - 9 %Z sequence number: 23 %I ACM %@ 978-1-4503-5706-7
Steil, J., Koelle, M., Heuten, W., Boll, S., & Bulling, A. (2018). PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis. Retrieved from http://arxiv.org/abs/1801.04457
(arXiv: 1801.04457)
Abstract
As first-person cameras in head-mounted displays become increasingly prevalent, so does the problem of infringing user and bystander privacy. To address this challenge, we present PrivacEye, a proof-of-concept system that detects privacy-sensitive everyday situations and automatically enables and disables the first-person camera using a mechanical shutter. To close the shutter, PrivacEye detects sensitive situations from first-person camera videos using an end-to-end deep-learning model. To open the shutter without visual input, PrivacEye uses a separate, smaller eye camera to detect changes in users' eye movements to gauge changes in the "privacy level" of the current situation. We evaluate PrivacEye on a dataset of first-person videos recorded in the daily life of 17 participants that they annotated with privacy sensitivity levels. We discuss the strengths and weaknesses of our proof-of-concept system based on a quantitative technical evaluation as well as qualitative insights from semi-structured interviews.
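The open/close logic described in this abstract can be sketched as a simple two-state controller. The boolean inputs below stand in for PrivacEye's actual classifiers (a scene-image model while the shutter is open, an eye-movement change detector while it is closed); they are placeholders for illustration only.

```python
def next_shutter_state(shutter_closed, scene_is_sensitive, privacy_level_changed):
    """Return True if the mechanical shutter should be closed in the next step.

    shutter_closed:        current state of the shutter
    scene_is_sensitive:    scene-camera classifier verdict (only valid while open)
    privacy_level_changed: eye-movement-based change detector (used while closed)
    """
    if not shutter_closed:
        # Shutter open: close as soon as the scene classifier flags
        # a privacy-sensitive situation.
        return scene_is_sensitive
    # Shutter closed: no scene image is available, so reopen only when the
    # eye-movement features suggest the privacy level of the situation changed.
    return not privacy_level_changed
```

For example, an open shutter stays open in a non-sensitive scene, while a closed shutter reopens only once the eye-movement detector signals a change in the situation.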
BibTeX
@online{steil2018_arxiv, TITLE = {{PrivacEye}: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis}, AUTHOR = {Steil, Julian and Koelle, Marion and Heuten, Wilko and Boll, Susanne and Bulling, Andreas}, LANGUAGE = {eng}, URL = {http://arxiv.org/abs/1801.04457}, EPRINT = {1801.04457}, EPRINTTYPE = {arXiv}, YEAR = {2018}, MARGINALMARK = {$\bullet$}, }
Endnote
%0 Report %A Steil, Julian %A Koelle, Marion %A Heuten, Wilko %A Boll, Susanne %A Bulling, Andreas %+ Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society External Organizations External Organizations External Organizations Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society %T PrivacEye: Privacy-Preserving First-Person Vision Using Image Features and Eye Movement Analysis : %G eng %U http://hdl.handle.net/21.11116/0000-0001-1840-C %U http://arxiv.org/abs/1801.04457 %D 2018 %K Computer Science, Human-Computer Interaction, cs.HC
Steil, J., Müller, P., Sugano, Y., & Bulling, A. (2018). Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors. In MobileHCI 2018, 20th International Conference on Human-Computer Interaction with Mobile Devices and Services. Barcelona, Spain: ACM. doi:10.1145/3229434.3229439
BibTeX
@inproceedings{steil_MobileHCI2018, TITLE = {Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors}, AUTHOR = {Steil, Julian and M{\"u}ller, Philipp and Sugano, Yusuke and Bulling, Andreas}, LANGUAGE = {eng}, ISBN = {978-1-4503-5898-9}, DOI = {10.1145/3229434.3229439}, PUBLISHER = {ACM}, YEAR = {2018}, MARGINALMARK = {$\bullet$}, DATE = {2018}, BOOKTITLE = {MobileHCI 2018, 20th International Conference on Human-Computer Interaction with Mobile Devices and Services}, PAGES = {1--13}, EID = {1}, ADDRESS = {Barcelona, Spain}, }
Endnote
%0 Conference Proceedings %A Steil, Julian %A Müller, Philipp %A Sugano, Yusuke %A Bulling, Andreas %+ Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society External Organizations Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society %T Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors : %G eng %U http://hdl.handle.net/21.11116/0000-0001-1834-A %R 10.1145/3229434.3229439 %D 2018 %B 20th International Conference on Human-Computer Interaction with Mobile Devices and Services %Z date of event: 2018-09-03 - 2018-09-06 %C Barcelona, Spain %B MobileHCI 2018 %P 1 - 13 %Z sequence number: 1 %I ACM %@ 978-1-4503-5898-9
Tonsen, M., Steil, J., Sugano, Y., & Bulling, A. (2017). InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1(3). doi:10.1145/3130971
BibTeX
@article{tonsen17_imwut, TITLE = {{InvisibleEye}: {M}obile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation}, AUTHOR = {Tonsen, Marc and Steil, Julian and Sugano, Yusuke and Bulling, Andreas}, LANGUAGE = {eng}, ISSN = {2474-9567}, DOI = {10.1145/3130971}, PUBLISHER = {ACM}, ADDRESS = {New York, NY}, YEAR = {2017}, MARGINALMARK = {$\bullet$}, JOURNAL = {Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies}, VOLUME = {1}, NUMBER = {3}, EID = {106}, }
Endnote
%0 Journal Article %A Tonsen, Marc %A Steil, Julian %A Sugano, Yusuke %A Bulling, Andreas %+ Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society External Organizations Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society %T InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation : %G eng %U http://hdl.handle.net/11858/00-001M-0000-002D-D0F4-A %R 10.1145/3130971 %7 2017 %D 2017 %J Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies %O IMWUT %V 1 %N 3 %Z sequence number: 106 %I ACM %C New York, NY %@ false
Mansouryar, M., Steil, J., Sugano, Y., & Bulling, A. (2016). 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers. In Proceedings ETRA 2016. Charleston, SC, USA: ACM. doi:10.1145/2857491.2857530
Steil, J., & Bulling, A. (2015). Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models. In UbiComp 2015, ACM International Joint Conference on Pervasive and Ubiquitous Computing. Osaka, Japan: ACM. doi:10.1145/2750858.2807520
Steil, J. (2014). Discovery of Eye Movement Patterns in Long-term Visual Behaviour Using Topic Models (Master's thesis). Universität des Saarlandes, Saarbrücken.