Precision position tracking in virtual reality environments using sensor networks
2007 IEEE International Symposium on Industrial Electronics, 2007•ieeexplore.ieee.org
In an immersive interactive virtual reality (VR) environment, a real human can be incorporated into a virtual 3D scene to navigate a robotic device within that scene. This has useful applications in rehabilitation. The non-destructive nature of VR makes it an ideal testbed for many applications and a prime candidate for rehabilitation robotics simulation. The key challenge is to accurately localise the movement of the object in reality and map its corresponding position in 3D VR. To solve the localisation problem we have formed an online vision sensor network, which tracks the object's real Euclidean position and sends the information back to the VR scene. A precision position tracking (PPT) system has been installed to track the object. We have previously presented a solution to the sensor relevance establishment problem, in which the most relevant sensing action is selected from a group of sensors. In this paper we apply the same technique to the VR system. The problem can be broken down into two steps. In step one, the relevant sensor type is discovered based on the IEEE 1451.4 Transducer Electronic Data Sheet (TEDS) description model. TEDS is used to discover the sensor types, their geographical locations, and additional information such as uncertainty measurement functions and the information fusion rules needed to fuse multi-sensor data. In step two, the most useful sensor information is obtained using the Kullback-Leibler Divergence (KLD) method. In this study we conduct two experiments that address the localisation problem. In the first experiment, a 3D VR environment is created using the real-time distributed robotics software Player/Stage/Gazebo, and a simulated PPT camera system is used to localise a simulated autonomous mobile robot within that environment. In the second experiment, a real user is placed in a cave-like 3D VR environment, and a real PPT camera system is used to localise the user's physical actions in reality. The physical actions of the real user are then used to control the robotic device in VR.
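Step two above ranks sensing actions by KLD. A minimal sketch of such a selection rule, assuming a discrete belief over position cells; the camera names and belief values are hypothetical illustrations, not data from the paper:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D_KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def most_informative_sensor(prior, posteriors):
    """Pick the sensor whose posterior belief diverges most from the prior,
    i.e. the sensing action expected to yield the largest information gain."""
    gains = {name: kl_divergence(post, prior) for name, post in posteriors.items()}
    return max(gains, key=gains.get), gains

# Hypothetical uniform prior belief over 4 discrete position cells
prior = [0.25, 0.25, 0.25, 0.25]
posteriors = {
    "ppt_cam_1": [0.70, 0.10, 0.10, 0.10],  # sharply localises the target
    "ppt_cam_2": [0.30, 0.30, 0.20, 0.20],  # barely changes the belief
}
best, gains = most_informative_sensor(prior, posteriors)
```

Here `ppt_cam_1` would be selected, since its reading concentrates the belief the most relative to the uniform prior. The paper's actual method additionally fuses multi-sensor data using rules discovered via TEDS, which this sketch omits.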