Intuitive user interface (UI) for the Tactonom

ESR15

Objectives

Research on 2D multi-line tactile devices continuously advances, but the design of tactile UIs and their interaction possibilities does not progress likewise. Following a user-centered design methodology, the fellow will perform a contextual inquiry (CI) analysis to collect efficiency reference values for common working practice (e.g. with single-line Braille displays) as well as general requirements. An accompanying longitudinal experiment will give further insight into haptic stimulation and perception. Usability tests will be applied to validate the UI design and the interaction methods with end users. Based on the results, strategies for presenting different media types (diagrams, tables, maps, etc.) combined with condensing, translating, iteratively presenting or navigating their information content will be developed.

Expected Results

Intuitive user interface for tactile display based on state-of-the-art understanding of haptics.

Placement

Host institution: Inventivio GmbH

Enrolment (doctoral degree): Karlsruhe Institute of Technology

Supervisors

Klaus-Peter Hars, Rainer Stiefelhagen

Presentation of ESR15

PhD defense: To be announced

My name is Gaspar Ramôa, and I am from Portugal. I hold a master’s degree in Computer Science and Engineering from the University of Beira Interior, Portugal. I am passionate about health and fitness. Most of my personal and technical skills were acquired over the last five years, as a young swimming and modern triathlon athlete and as a university student. During my bachelor’s and master’s studies, I worked in the fields of Robotics, Artificial Intelligence and Computer Vision. For my master’s dissertation, entitled “Artificial Vision for Humans”, I built a portable low-powered system to help visually impaired people navigate indoor environments, through which I developed a particular ability to build user interface systems for blind people. I have always had a great desire to pursue a PhD in a European country other than my own, and thanks to the INTUITIVE H2020-MSCA-ITN project, I can finally fulfil my dream. I am confident that I can complete the research project successfully: I believe in my ability to work hard to achieve goals, I believe in my suitable background in computer engineering, and I am motivated to improve visually impaired people’s quality of life.

Abstract of PhD goals

Enabling blind and visually impaired (BVI) people to perceive and interact with two-dimensional data is a complex challenge. Blind users cannot perceive graphical information in visual user interfaces and can only access text information linearly with assistive technologies. 2D refreshable audio-tactile devices, an emerging technology, are capable of presenting two-dimensional data to blind persons. While hardware development for this technology has advanced, user interface development has not progressed likewise. There is therefore a need to improve these devices’ user interfaces to enable BVI persons to access and use two-dimensional information.


To accomplish this, it is necessary to systematically explore and validate all key aspects of audio-tactile user interfaces. This can only be done with blind test users, and it needs to be grounded in a challenging application domain. Public places such as train and bus stations represent such a domain, since there is strong interest in enabling blind persons to access and use public places independently. The user interface aspects include, among others, the design of widgets, dynamic audio-tactile interactions for navigation, and camera-based pointing and gesture recognition mechanisms. The main topic domains are Audio Pinpoint Navigation, Audio-Tactile Line Chart Exploration, and Audio-Tactile UI for Travel Planning. Their development follows a human-centred design approach, including contextual inquiry analysis, brainstorming sessions, quantitative questionnaires (NASA TLX and the System Usability Scale, SUS) and usability tests.
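To make the evaluation step concrete, the following minimal Python sketch shows how SUS responses are typically turned into the standard 0–100 score; the function name and the example responses are illustrative assumptions, not part of the project’s software.

    def sus_score(responses):
        """Compute the System Usability Scale score (0-100) from ten
        responses, each on a 1-5 Likert scale. Odd-numbered items
        contribute (response - 1), even-numbered items contribute
        (5 - response); the summed contributions are scaled by 2.5."""
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 responses")
        total = 0
        for item, r in enumerate(responses, start=1):
            if not 1 <= r <= 5:
                raise ValueError("responses must be on a 1-5 scale")
            total += (r - 1) if item % 2 == 1 else (5 - r)
        return total * 2.5

    # Hypothetical responses from one participant (items 1-10)
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0

A commonly cited benchmark is that a SUS score of roughly 68 corresponds to average usability, which gives a quick reference point when comparing prototype iterations.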

The results of this dissertation are expected to improve the foundation for the user interfaces of emerging 2D refreshable tactile displays.

Results

D5.5 Classification of 2D refreshable tactile user interfaces
Literature review of user interfaces and interaction methods for 2D refreshable tactile devices

D5.6 Design principles and basic version of the intuitive tactile user interface
Software prototype of the new user interface, enabling interaction between the Tactonom and a computer

D5.7 Audio-Tactile Design Pattern Repository
Repository of design patterns for combining audio output with tactile output

Journal Article – Upcoming
Ramôa, G.; Schmidt, V.; König, P.
SONOICE! A Sonar-voice dynamic navigation UI for pinpointing elements in 2D tactile readers

Conference Article
Ramôa, G.; Moured, O.; Schwarz, T.; Müller, K.; Stiefelhagen, R.
Enabling People with Blindness to Distinguish Lines of Mathematical Charts with Audio-Tactile Graphic Readers
PErvasive Technologies Related to Assistive Environments, PETRA, 2023
DOI: https://doi.org/10.1145/3594806.3594818

Conference Article
Ramôa, G.; Moured, O.; Schwarz, T.; Müller, K.; Stiefelhagen, R.
Display and Use of Station Floor Plans on 2D Pin Matrix Displays for Blind and Visually Impaired People
PErvasive Technologies Related to Assistive Environments, PETRA, 2023
DOI: https://doi.org/10.1145/3594806

Conference Article
Ramôa, G.
Classification of 2D Refreshable Tactile User Interfaces
ICCHP-AAATE, 2022
DOI: https://doi.org/10.35011/icchp-aaate22-p1-24

Journal Article
Ramôa, G.; Schmidt, V.; König, P.
Developing Dynamic Audio Navigation UIs to Pinpoint Elements in Tactile Graphics
Multimodal Technol. Interact., 2022
DOI: https://doi.org/10.3390/mti6120113