Physics based object grasping in Unity

This post briefly describes my journey towards the ultimate method for physics-based dexterous object manipulation in the Unity or Unreal game engines.

In short, so far I have been unsuccessful.

There is a body of published work on this topic [1,2]. Most of the research I have come across so far does not use a purely physics-based approach. However, there are a few groups out there who are coming pretty close to purely physics-based grasping [3,4].

The general idea

The virtual physics hand (flesh coloured) and the user's tracked hand (green cuboids) interact with each other via a set of spring-dampers.
Image adapted from [5]

The image above shows the general grasping principle in action, which consists of three components:
1. The user's tracked hand and fingers (green cuboids), not visible to the user
2. The virtual physics hand, visible to the user
3. A virtual, physics-enabled (rigid body) object

The user's tracked hand (kinematic) and the physics hand interact via a set of spring-dampers, which connect each finger to its virtual counterpart and allow the physics hand to realistically interact and collide with virtual objects in a compliant manner. In the example image above, the user's hand has inter-penetrated the virtual object, but the physics hand has conformed to the object's surface.
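The coupling itself is just a spring-damper force law pulling the physics finger towards the tracked finger. Here is a minimal 1D sketch in Python (illustrative only, not Unity code; the mass, gains and time step are arbitrary values I chose for this example):

```python
# Illustrative 1D sketch of the spring-damper coupling between a tracked
# (kinematic) finger and its simulated (dynamic) counterpart.
# Mass, gains and time step are arbitrary; a real engine solves this per joint in 3D.

def simulate(target, steps=2000, dt=0.001, k=400.0, c=40.0, m=0.1):
    """Pull a dynamic body toward a fixed tracked target via a spring-damper."""
    x, v = 0.0, 0.0  # position and velocity of the physics finger
    for _ in range(steps):
        force = k * (target - x) - c * v  # spring toward target, plus damping
        v += (force / m) * dt             # semi-implicit Euler integration
        x += v * dt
    return x

# When unobstructed, the physics finger settles onto the tracked pose.
print(round(simulate(0.5), 3))
```

If a collider blocks the path, the same spring force becomes the compliant contact force instead of snapping the hand through the object, which is exactly the behaviour in the image above.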

Although this approach results in fairly realistic hand-object behaviour, it does not actually achieve stable, physics-based grasping of the object. Instead, most solutions resort to heuristics (rules) to create the illusion of stable physics-based grasping, but never actually achieve it.

For a physics-based grasp, the grasping and frictional forces between the virtual fingertips and the virtual object must be simulated, which is much more complicated than applying a few simple if-else statements.
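To give a flavour of what such a simulation has to check: in the simplest Coulomb friction model, a two-finger pinch only holds an object while the friction available at the contacts can support the tangential load, i.e. |F_t| ≤ μ·F_n at each contact. A toy Python check (the friction coefficient and forces are made-up numbers, not measured values):

```python
def pinch_holds(grip_force, object_weight, mu=0.8, n_contacts=2):
    """Coulomb friction check for a simple two-finger pinch grasp.

    Each fingertip presses with `grip_force` (the normal force); friction at
    the contacts must together support the object's weight (tangential load).
    """
    max_friction = n_contacts * mu * grip_force
    return max_friction >= object_weight

print(pinch_holds(grip_force=2.0, object_weight=1.5))  # enough friction: True
print(pinch_holds(grip_force=0.5, object_weight=1.5))  # object slips: False
```

A real engine has to resolve this per contact, per time step, together with torques about the grasp axis, which is why a handful of if-else rules cannot substitute for it.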

Below I will post my:
> Tested approaches
> Results, and
> Suggestions for future improvements

  1. A Hinge Joint approach
    17th June 2020
    This first attempt creates two hinge joints at the contact points between the tips of the digits and the target object.

[1] Borst, C. W., & Indugula, A. P. (2005). Realistic Virtual Grasping. Proceedings of the IEEE Virtual Reality, 1–9.

[2] Nasim, K., & Kim, Y. J. (2016). Physics-based Interactive Virtual Grasping. Proceedings of HCI, 1–7.

[3] Höll, M., Oberweger, M., Arth, C., & Lepetit, V. (2018). Efficient Physics-Based Implementation for Realistic Hand-Object Interaction in Virtual Reality (pp. 175–182). Presented at the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), IEEE.

[4] Nasim, K., & Kim, Y. J. (2016). Physics-based Interactive Virtual Grasping. Proceedings of HCI, 1–7.

[5] Verschoor, M., Lobo, D., & Otaduy, M. A. (2018). Soft Hand Simulation for Smooth and Robust Natural Interaction, 1–8.


TacTile App on iOS

This article describes an experimental setup to study the effects of different sensory modalities on the perception of touch. Using a psychophysical approach, we question the role different sensory inputs play in the ability to distinguish differently textured surfaces. Using motion capture of the hand and fingers, together with force sensors measuring the surface reaction forces on the textured objects, we are able to capture a rich set of behavioural data as well as user responses to the tactile task.

Tactile feedback is one of the least studied areas when it comes to designing physical interactions with our virtual environments. Whether it is through the screen of a phone or in virtual and augmented reality, vibro-tactile feedback is often seen as an optional feature, not essential to the experience.

However, in the last five years there have been several great developments towards more advanced vibro-tactile stimuli, such as the Oculus Touch controllers or the Apple Taptic Engine (Force Touch). These systems use linear actuators, as opposed to rotary motors, which have a shorter ramp-up/ramp-down phase and as a result can deliver effective vibro-tactile stimuli faster.

In this project I present an iOS app (TacTile) I have been developing to study multi-sensory roughness perception: the perception of surface roughness through vision, audio and touch.

The idea is that the perception of how rough or smooth a surface feels is differentially dependent on the afferent visual, auditory and vibratory neural signals to the brain.

Through the app we can control the level of each signal modality in order to “make” a surface feel more smooth or rough (diagram 1).
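One standard way to formalise this kind of multi-sensory weighting, which I assume here purely for illustration (it is not necessarily the model behind the app), is reliability-weighted cue combination: each modality's roughness estimate is weighted by the inverse of its variance, so the most reliable sense dominates the percept.

```python
def combine_cues(estimates, variances):
    """Reliability-weighted (maximum-likelihood) combination of cue estimates."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

# Hypothetical roughness estimates (arbitrary units) from vision, audio, touch.
# Touch has the smallest variance here, so it pulls the combined percept
# towards its own estimate.
print(round(combine_cues([0.4, 0.6, 0.8], [0.04, 0.04, 0.01]), 3))  # prints 0.7
```

Under this model, attenuating one modality's signal in the app effectively raises its variance, shifting the perceived roughness towards the remaining cues.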

Upper limb target tracking task

Upper limb functional deficits, including increased muscle tone and the experience of pain and weakness during directed arm motion, are common in patients following acquired brain injury. Stroke is the most prevalent cause of disability in the UK and US, and is most common in the elderly population.

Upper limb movement deficits also occur in younger people, often due to peripheral nerve injury from traffic accidents, as well as in military personnel. A loss of functional upper limb motor control can have a bigger impact on this population, as most of these patients are still of working age.

The overarching goal of this project is to use accessible technologies, such as motion capture and virtual reality, to design an assessment and training tool to support standard rehabilitation interventions in clinic and at home. We aim to achieve this by quantifying movement performance parameters to inform patients about their recovery and clinicians about the effects of their interventions.

However, this study narrows the focus to test the feasibility of such a setup. As a result, before testing patients, we aim to answer a few simpler questions:
Can we use a few kinematic parameters, such as positional error and velocity, to show the effects of a mechanical elbow constraint on healthy participants? And is such an artificial elbow restriction representative of the behaviour of patients, who often have difficulty extending their arms outward during reaching movements?

Patient movement range and quality are limited. To simulate this limitation, we used a constraint condition in a within-subject target tracking design to test the effects of limited motion on healthy participants. As the constraint, we used a reinforced elbow splint to limit elbow flexion-extension motions. We expected this intervention to have a negative effect on movement performance, shown in a lower velocity and a higher positional tracking error compared to the unconstrained, control condition. Each condition was blocked and randomised across participants. Target motions were oval in shape, moved at an average velocity of 10 cm/s and followed the two-thirds power law. We programmed the target trajectories in either a frontal or transverse plane within a limited space of a 30x30x30 cm cube. We present data collected from 11 healthy postgraduate students, who were asked to follow the target motions as accurately and as quickly as possible.
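For reference, the two-thirds power law ties the target's tangential velocity to the local curvature of its path, v(t) = K·κ(t)^(−1/3), so the target slows down in the tightly curved ends of the oval. A sketch of such a speed profile for an ellipse (the semi-axes and gain K below are illustrative, not the study's actual values):

```python
import math

def ellipse_curvature(a, b, t):
    """Curvature of the ellipse x = a*cos(t), y = b*sin(t) at parameter t."""
    num = a * b
    den = (a**2 * math.sin(t)**2 + b**2 * math.cos(t)**2) ** 1.5
    return num / den

def target_speed(a, b, t, K=0.1):
    """Two-thirds power law: v = K * curvature^(-1/3)."""
    return K * ellipse_curvature(a, b, t) ** (-1.0 / 3.0)

# The target moves slower where the oval curves most: at t = 0 (the end of
# the major axis) curvature is higher than at t = pi/2, so speed is lower.
a, b = 0.15, 0.10  # illustrative semi-axes in metres
print(target_speed(a, b, 0.0) < target_speed(a, b, math.pi / 2))  # prints True
```

In practice K is chosen so the speed averages the desired 10 cm/s over one lap of the oval.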

upper limb exercise.png

Figure 1: Upper limb movement assessment setup with marker-based motion capture cameras and a head-mounted display (HMD) to present moving targets in an immersive virtual environment.

As shown in figures 1 and 2, we used a target tracking protocol to assess movement range and tracking performance by extracting kinematic parameters (figure 3), such as velocity, acceleration and positional accuracy.
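Velocity (and, by a second pass, acceleration) can be extracted from the recorded positions with simple finite differences. A sketch with hypothetical data (a constant-velocity path, not real capture data):

```python
import math

def tangential_velocity(positions, dt):
    """Speed at each interior sample from 3D positions via central differences."""
    speeds = []
    for i in range(1, len(positions) - 1):
        dx = [(positions[i + 1][k] - positions[i - 1][k]) / (2 * dt)
              for k in range(3)]
        speeds.append(math.sqrt(sum(c * c for c in dx)))
    return speeds

# Hypothetical hand path moving at a constant 0.1 m/s along x, sampled at 100 Hz.
dt = 0.01
path = [(0.1 * i * dt, 0.0, 0.0) for i in range(5)]
print([round(s, 3) for s in tangential_velocity(path, dt)])  # [0.1, 0.1, 0.1]
```

With real, noisy marker data the differences would typically be smoothed (e.g. low-pass filtered) before differentiation.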

upper limb exercise vr

Figure 2: Virtual environment with target and hand positions represented by simple geometrical shapes (cubes). The background objects and landscape serve as visual reference points. 

upper limb data analysis.png

Figure 3A: The positional error is defined as the distance between the centre of the hand (yellow frame) and the centre of the target (red cube). B: Excluding the initial error period, defined as the first two seconds, this error was averaged to represent the overall error for the individual trial.
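A sketch of that error metric in Python (the hand and target samples below are hypothetical, not the study's recordings):

```python
import math

def mean_positional_error(hand, target, dt, skip_seconds=2.0):
    """Mean hand-target distance per trial, excluding the initial error period."""
    start = int(skip_seconds / dt)  # drop the first `skip_seconds` of samples
    dists = [math.dist(h, t) for h, t in zip(hand[start:], target[start:])]
    return sum(dists) / len(dists)

# Hypothetical trial sampled at 10 Hz: the hand starts far from the target
# (initial error period), then tracks it with a steady 5 cm offset.
dt = 0.1
target = [(0.0, 0.0, 0.0)] * 40
hand = [(1.0, 0.0, 0.0)] * 20 + [(0.05, 0.0, 0.0)] * 20
print(round(mean_positional_error(hand, target, dt), 3))  # prints 0.05
```

Dropping the first two seconds keeps the large catch-up error at trial onset from dominating the trial average.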

Contrary to our expectations, we found a significantly reduced positional error in the constrained condition compared to the control (n = 11, M = 67.8 mm, SD = 4.8 mm, p < 0.05). For tangential velocity we did not find any difference (figure 4).

Figure 4A: Position error averaged over all trials and participants (n=11) with a significant difference between the unconstrained (M = 76.16, SD = 11.24) and constrained (M = 67.76, SD = 4.79)  conditions, p < 0.05. B: Average tangential velocity (n=11) for the unconstrained (M = 0.07, SD = 0.08) and constrained (M = 1.1, SD = 0.09)  conditions, p > 0.05.

We can positively answer the first of our two initial questions: yes, we can use kinematic parameters to show a difference due to our intervention. However, the effect is opposite to our initial expectations, because the constrained condition showed a significantly lower positional error. This also answers our second question: our elbow constraint is not representative of the movement deficits observed in the patient population.



upper limb results 3

Figure A1: A: Typical path taken to a target (point-to-point) by a participant compared to a straight line. B: Tangential velocity profile for a point-to-point motion.

In Orbit


This is yet another space-based game for the lone wolves out there. I have only brought it as far as the basics, but my grand vision goes much further than the usual space battle game. My idea is to give every space player the opportunity to build their own home, on asteroids that is. Players can also set up asteroid base stations with mining operations, among other things. Ultimately, you will be able to create a small community within your network of asteroids and do the usual: trading, negotiating etc. And if all else fails, you can always go to war, as usual.

Strike Base REVAMPED.png

In Orbit (image taken from within the game)

At the beginning of the game, you escape a dying home planet in your spacecraft with nothing more than a bunch of basic tools, an AI assistant and a worker robot, who can help you with the usual maintenance and, once you occupy your asteroid of choice, with mining and building.

Strike Base REVAMPED 2.png

Once you find a place, you can set up a base station, a small town, a defence perimeter and a factory, provided you have collected enough material resources.

Strike Base REVAMPED 5.png

Every asteroid will have unique qualities, such as natural resources, rare materials, as well as a “life span”. Furthermore, the resources will be either easy or more difficult to harvest, depending on where they lie within the asteroid.

Strike Base REVAMPED 4.png

Space battles are inevitable, but they are not the centrepiece of the game mechanics. The main focus is to build, trade, explore and discover asteroids, planets and hidden gems within the vastness of the universe.