My Ph.D. research concerns XR technology, where X stands for Virtual (V), Augmented (A), or Mixed (M) and R stands for Reality, with a particular focus on Mixed Reality (MR).
Referring to Milgram's taxonomy, a “Mixed Reality (MR) environment can be everything that merges digital and real content, e.g., a multisensory environment, so the level of immersivity is different depending on the size of the display”. Starting from this broad definition, I investigated issues related to extending the “typical” interaction modes of MR (based on gestures, e.g., touch or mid-air tap, depending on the MR device) by integrating the manipulation of digitally enhanced physical elements (“tangibles”, in HCI jargon) to control and explore the MR environment and generate tactile feedback. We call this multimodal interaction paradigm Tangible Mixed Reality.
MR applications on HoloLens already include the perception of “real-life elements”, i.e., the view of the surrounding space onto which digital elements (text, images, holograms, or animations) can be dynamically superimposed. Adding tangibles, and the possibility of manipulating them, involves the tactile sense, leading to a multisensory experience that can make the use of the MR application more engaging and stimulating, particularly for persons with cognitive disabilities (CD).
Generalizing these concepts and looking at them from a broader perspective helped me identify three key design dimensions for Tangible MR applications devoted to people with cognitive disabilities: embodiment, immersivity, and virtuality continuum.
The focus and challenge of my current and future research is how to guide the design process of MR applications (and, more broadly, XR applications) so as to balance these factors effectively for people with CD, and how to extend existing technologies to enable the integration of different degrees of immersivity and embodiment.