hosted by
publicationslist.org
Elias Giannopoulos


hlias.giannopoulos@gmail.com

Journal articles

2010
N de la Pena, P Weil, J Llobera, A E Giannopoulos, A Pomes, B Spanlang, D Friedman, M V Sanchez-Vives, M Slater (2010)  Immersive Journalism: Immersive Virtual Reality for the First-Person Experience of News   Presence - Teleoperators and Virtual Environments 19: 4. 291-301
Abstract: This paper introduces the concept and discusses the implications of immersive journalism, which is the production of news in a form in which people can gain first-person experiences of the events or situations described in news stories. The fundamental idea of immersive journalism is to allow the participant, typically represented as a digital avatar, to actually enter a virtually recreated scenario representing the news story. The sense of presence obtained through an immersive system (whether a Cave or head-tracked head-mounted displays [HMD]) and online virtual worlds (such as video games) affords the participant unprecedented access to the sights and sounds, and possibly feelings and emotions, that accompany the news. This paper surveys current approaches to immersive journalism and the theoretical background supporting claims regarding avatar experience in immersive systems. We also provide a specific demonstration: giving participants the experience of being in an interrogation room in an offshore prison. By both describing current approaches and demonstrating an immersive journalism experience, we open a new avenue for research into how presence can be utilized in the field of news and nonfiction.
Elias Giannopoulos, Zheng Wang, Angelika Peer, Martin Buss, Mel Slater (2010)  Comparison of people's responses to real and virtual handshakes within a virtual environment   Brain Research Bulletin, Nov
Abstract: In this paper we present a method for evaluating a haptic device that simulates human handshakes interfaced via a metal rod. We provide an overview of the haptic demonstrator and the control algorithm used for delivering realistic handshakes. For the evaluation of this handshake demonstrator we introduce a 'ground truth' approach, in which we compare the robot handshakes with handshakes operated by a human via the same metal rod. To this end, an experiment was carried out in which participants entered a virtual environment, a virtual cocktail party, and were asked to perform a number of handshakes, either with the robot operating the metal rod under one of two control algorithms (a basic one for comparison, or the proposed new advanced one), or with a human operating the metal rod. The virtual environment was represented only through audio and haptics, without any visual representation, i.e. the subjects participated blindfolded. Each handshake was evaluated through subjective scoring. The results of the study show that the demonstrator operating under the proposed new control scheme was rated as significantly more human-like than the demonstrator operating under the basic algorithm, and also that the real human handshake was rated as more human-like than both types of robot handshakes. Although the difference between the advanced robot and the human handshake was significant, the effect sizes were not very different, indicating substantial confusion among participants between the advanced robot and the human-operated handshakes.

Conference papers

2010
Bernhard Spanlang, Jean-Marie Normand, Elias Giannopoulos, Mel Slater (2010)  A First Person Avatar System with Haptic Feedback   In: ACM Symposium on Virtual Reality Software and Technology (VRST) Edited by: George Baciu, Rynson Lau, Ming Li, Taku Komura, Qunsheng Peng. 47-50
Abstract: We describe a system that shows how to substitute a person's body in virtual reality with a virtual body (or avatar). The avatar is seen from a first-person perspective, moves as the person moves, and the system generates touch on the real person's body when the avatar is touched. Such replacement of the person's real body by a virtual body requires a wide field-of-view head-mounted display, real-time whole-body tracking, and tactile feedback. We show how to achieve this with a variety of off-the-shelf hardware and software, together with custom systems for real-time avatar rendering and collision detection. We present an overview of the system and describe some of its components in detail. We provide examples of how such a system is being used in some of our current experimental studies of embodiment.
Bernhard Spanlang, Jean-Marie Normand, Elias Giannopoulos, Mel Slater (2010)  GPU based Detection and Mapping of Collisions for Haptic Rendering in Immersive Virtual Reality   In: IEEE International Symposium on Haptic Audio Visual Environments and Games 41-44  
Abstract: We present a method that maps collisions onto a dynamic deformable virtual character, designed to be used for tactile haptic rendering in Immersive Virtual Reality (IVR). Our method computes exact intersections by relying on the use of programmable graphics hardware. Based on interference tests between deformable meshes (an avatar controlled by a human participant) and a few hundred collider objects, our method gives coherent haptic feedback to the participant. We use GPU textures to map surface regions of the avatar to haptic actuators. We illustrate our approach using a vest composed of vibrators for haptic rendering, and we show that our method achieves collision detection at rates well over 1 kHz on good-quality deformable avatar meshes, which makes it suitable for video games and virtual training applications.