Interaction with a Five-Fingered Haptic Controller for Manipulating Objects in Virtual Reality



Introduction
Virtual Reality (VR) technology allows users to experience physical interactions in a virtually created environment. Head-mounted displays (HMDs) such as the HTC Vive and Oculus Rift track the user's head position and orientation by working with external sensors, so the image and sound can change according to the user's movement. A VR controller can be used to track the position of the user's hand, and more detailed interactions are possible using input and output components such as buttons, joysticks, touchpads, and vibration actuators mounted on the controller. Haptic hand interaction is important for object manipulation in virtual reality. Because haptic rendering provides a realistic feeling of touching, grasping, and manipulating by giving force feedback and vibration, much research is underway to apply haptic interaction to VR hand controllers.
Haptic devices can be classified into three categories (ground-based, body-based, and ungrounded) according to the force reference system [1], [2]. Ground-based devices are effective at conveying the weight of a virtual object without burdening the user with the device's weight [3]-[5]. Because of their limited workspace, however, they are not suitable as VR haptic devices for users who move around freely. A body-based haptic device has an exoskeleton mechanism worn on the arm and presents multiple degrees of freedom (DOF) with actuators [6], [7]. Exoskeletal haptic devices show the possibility of a VR controller attached to the hand, with the advantages of portability, range of motion, and fingertip tracking [8]. Ungrounded hand-held devices are suitable as general virtual reality controllers because they are easy to hold and operate with both hands. Whereas existing VR controllers only receive finger input and generate vibration, hand-held controllers have recently been studied that give force feedback to each finger. The CLAW haptic controller provides articulated movement and force feedback to the user's index finger, which enables three interactions: grasping a virtual object, touching virtual surfaces, and triggering [9]. The TORC haptic controller renders the haptic feel of squeezing, shearing, or turning a virtual object as the user slides the thumb on its trackpad [10].
In this paper, we use hardware with pressure sensors that collect pressure data from each finger, linear actuators that apply separate force feedback to each finger, and a vibration motor, in order to overcome the limitation of existing VR controllers that provide only vibration. We developed and evaluated an interface system to control a five-fingered VR haptic controller that supports force feedback on each of the user's five fingers. This paper is organized as follows: the next chapter describes the design of the virtual reality haptic controller interface, and the third chapter discusses the user evaluation of the five-fingered haptic controller interface and future work.

VR Haptic Controller Interface Design
The haptic controller system consists of the haptic controller hardware, the haptic controller interface, and the game engine interface, as shown in Fig. 1. The haptic controller hardware used in this study has the form of a hand-held device so that it is easy to hold and move freely with both hands [11]. The haptic controller consists of five finger haptic modules, one vibration module, a processor, and a network module. Each finger haptic module combines a pressure sensor and a linear actuator, as shown in Fig. 2. The processor collects data from the five pressure sensors and controls the positions of the five linear actuators and the vibration actuator. The haptic controller hardware sends pressure data to the haptic controller interface module via serial communication through the network module. The haptic controller interface analyzes the pressure data and adjusts the linear actuators accordingly. The game engine interface module maps the linear movement of the haptic controller to the movement of the skeleton-based hand model in virtual reality. When the virtual hand touches a virtual object, the physical interaction is calculated by the physics engine in Unity and transmitted to the haptic controller interface module. The haptic controller interface module then sends the positions of the linear actuators to the haptic controller hardware based on the physical interaction received from the game engine interface module. As the actuator positions are controlled according to the physics calculations, the user can feel the rigidity of the virtual object.

International Journal of Computer and Communication Engineering
As shown in Fig. 3, the haptic controller interface module measures the pressure generated by pressing the finger haptic module and controls the movement of the linear actuators and the vibration module. Its operation can be divided into three interactions: press, hold, and release. When the user squeezes the haptic controller to hold a virtual object, the pressure value from the sensor in the finger haptic module increases, and the pressure analysis module analyzes and classifies it. In the 'press' interaction, the actuator control module calculates a displacement in proportion to the pressure value: when the user presses with a strong force, the linear actuator moves in quickly, and when the user presses with a weak force, it moves in slowly. The material factor represents the physical properties of the surrounding medium; for example, interaction in water requires relatively more force than interaction in air to produce the same displacement. In the 'press' interaction, if the user's virtual finger touches a virtual object, the linear actuator moves in no further, even if force continues to be applied. After grabbing the object, the user maintains a constant pressure without exerting a large force; the pressure analysis module classifies this as the 'hold' interaction and stops the linear actuator movement. When the user releases the virtual object, the pressure value disappears, the interaction is converted to 'release', and the linear actuator returns to its initial position.

The Unity game engine is used to visualize the virtual objects and environment. The game engine interface module supports developers in creating virtual reality content with the haptic controller in Unity, as shown in Fig. 4. The haptic controller hardware is designed to have linear motion, while the virtual hand is composed of a skeleton structure with several joints.
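The press/hold/release classification described above can be sketched as a simple rule over consecutive pressure readings. The thresholds and the "roughly constant pressure means hold" heuristic are illustrative assumptions; the paper does not specify the classifier's actual parameters.

```python
# Hedged sketch of the pressure analysis module's three-way classification.
# PRESS_THRESHOLD and HOLD_DELTA are assumed values, not from the paper.

PRESS_THRESHOLD = 50   # assumed minimum reading that counts as touching
HOLD_DELTA = 10        # assumed max change that still counts as "constant"

def classify(prev_pressure: int, pressure: int) -> str:
    """Classify one finger's interaction from two consecutive readings."""
    if pressure < PRESS_THRESHOLD:
        # Pressure has (nearly) disappeared: 'release', and the linear
        # actuator returns to its initial position.
        return "release"
    if abs(pressure - prev_pressure) <= HOLD_DELTA:
        # Roughly constant pressure: the user is holding the object, so
        # linear actuator movement is stopped.
        return "hold"
    # Changing pressure above the threshold: the user is pressing, and
    # actuator displacement is computed in proportion to the pressure.
    return "press"
```

In practice such a classifier would likely filter the sensor signal over a short window rather than compare only two samples, but the state logic is the same.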
When the user grabs the hand-held controller by bending the fingers in reality, the virtual hand needs to be shown holding the virtual object to increase the sense of reality. Because the haptic controller's finger haptic modules move linearly, the haptic controller interface module sends the position of each finger haptic module to the linear actuator movement module of the game engine interface, and the skeleton-based virtual hand mapping module converts the displacement of the linear module into the angles of the finger joints, as shown in Fig. 5. Physics colliders are placed on the virtual hand to detect contact with virtual objects, as shown in Fig. 6. As the user grabs a virtual object, the physics collider set on the object and the colliders on the virtual hand come into contact. The object collision detection module checks the contacted collider and the contact position on the virtual object. The physics-based interaction is then performed according to the correlation between the properties of the object and the physics colliders in contact.
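The conversion from linear displacement to finger-joint angles can be sketched as a normalized interpolation toward full flexion. The joint names (MCP, PIP, DIP) and maximum flexion angles below are assumptions for illustration; in the actual system this mapping runs inside the Unity game engine interface.

```python
# Hedged sketch of the skeleton-based virtual hand mapping: one finger's
# linear actuator travel is converted into angles for its three joints.
# MAX_STROKE_MM and MAX_ANGLES are assumed values, not from the paper.

MAX_STROKE_MM = 10.0

# Assumed full-flexion angle (degrees) per joint, knuckle to fingertip.
MAX_ANGLES = {"mcp": 70.0, "pip": 90.0, "dip": 60.0}

def displacement_to_joint_angles(displacement_mm: float) -> dict[str, float]:
    """Map actuator travel to joint angles of the skeleton-based hand.

    A fully retracted actuator (0 mm) gives a straight finger; a fully
    advanced actuator bends every joint to its maximum angle.
    """
    t = max(0.0, min(1.0, displacement_mm / MAX_STROKE_MM))
    return {joint: t * angle for joint, angle in MAX_ANGLES.items()}
```

A real hand model would likely use per-joint curves rather than a single shared interpolation factor, but a linear map is enough to make the virtual fingers visibly track the hardware.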

User Study and Discussion
An experiment was conducted to measure user satisfaction with the five-fingered haptic interface. Two systems, one using the VIVE controller and one using the proposed controller, were compared. The experimental system used a prototype under construction, as shown in Fig. 2. Since the controller does not yet include position tracking, the experiment was designed with the position of the virtual hand fixed. In the first system, which uses a VIVE controller, the fingers move together using the single trigger button; when a virtual object is held, the fingers stop and the controller vibrates.

In the second system, using the proposed controller, each finger moves separately according to the movement of the user's five fingers, as shown in Fig. 7. When the user touches a virtual object, the touching finger stops and the controller vibrates. Ten students conducted experiments with the two systems and answered a survey. Fig. 8 shows the results. The average interaction preference for the first system was 6.1, while the average for the second system was 8.3. Most students felt it was more realistic to touch objects with the virtual hand in the second system. This paper represents the initial stage of interaction research on manipulating virtual objects in a virtual environment with five fingers. The results are difficult to generalize because the experiment was conducted with a small number of participants. However, the subjects responded that manipulation of the hand felt more realistic when they could visually confirm that the five fingers moved and were controlled separately. In future research, we plan to further develop the five-fingered hardware and study hand interaction using all five fingers.

Conflict of Interest
The authors declare no conflict of interest.

Author Contributions
SangHun Nam designed the interface system and wrote the paper; Jiyong Lee programmed the interface modules; Ginam Ko designed the haptic controller hardware.