Using RUIS in Unity 5


An upcoming Kinect 2 + Oculus Rift DK2 demo created with RUIS and Unity 5 (no release date yet).

Unity 5 allows the use of Kinect 2 and Razer Hydra without a Pro license, so it makes sense to update your RUIS project to Unity 5. In May we will put out an official RUIS for Unity 5 release. Meanwhile, you can use the guide below to upgrade RUIS to work in Unity 5.

How to update RUIS 1.07 to Unity 5
Download the RUISunity1071_Unity5.unitypackage from here:

Update instructions
1. Create backup of your project.
2. If you have modified any RUIS scripts, prefabs, or scenes, create duplicates of them first, because RUIS files with the original names will be overwritten by the RUIS update.
3. Install Unity 5. If you are a Windows user and intend to use Kinect 1, you should install the 32-bit Editor (Additional Downloads, For Windows link):
4. Open your project with Unity 5, and upgrade the project when Unity asks you to do so.
5. When “API Update Required” appears, choose the option “I Made a Backup, Go Ahead!”.
6. The project should now be open. Ignore any errors in the Console; open the RUISunity1071_Unity5.unitypackage file in Explorer/Finder and import everything.
7. After importing, delete the following files and bundles in Assets\Plugins: KinectForUnity.dll, OculusPlugin.dll, sixense.dll, and sixense.bundle. Do NOT touch any files in the Android, Metro, OculusPlugin.bundle, x86, and x86_64 subfolders.
8. Everything should work now, unless some of your own scripts or assets are broken.

P.S. If anyone knows a math library for C# that supports (or can be easily modified to support) in-place matrix operations for multiplication, addition, inversion, and transposition of 4-by-4 matrices, let me know! The current matrix library that we are using (and Unity’s Matrix4x4) allocates new memory for every matrix operation, which causes very frequent garbage collection and results in a noticeable performance loss when Kinect rotation smoothing is enabled in RUIS.
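For reference, the allocation-free pattern we are after can be sketched with NumPy-style `out` parameters (a Python illustration of the idea, not an existing C# library; the function names are ours):

```python
import numpy as np

# Preallocate 4x4 buffers once and reuse them every frame, so no
# per-operation garbage is generated -- the behavior we'd want from
# a C# matrix library.
OUT = np.empty((4, 4))

def mul_into(a, b, out):
    # NumPy writes the product into the caller's buffer instead of
    # allocating a new matrix; out must not alias a or b for matmul.
    np.matmul(a, b, out=out)
    return out

def add_inplace(a, b):
    # a += b without allocating a temporary.
    np.add(a, b, out=a)
    return a
```

Transposition is cheap in this model too, since a transpose can be a view over the same storage rather than a copy.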

Posted in RUIS

RUIS 1.07 released! Fist gesture and more

Aalto University’s course for virtual reality started earlier this month, and to get things rolling we have just released the latest version of RUIS for Unity. RUIS 1.07 adds new features and fixes many issues of our previous release, which we admittedly rushed out to be in time for the Spatial User Interaction 2014 conference in October. Among other things, the bug fixes of RUIS 1.07 restore positional head tracking functionality for Oculus Rift DK1 using Kinect 1/2, PS Move, and Razer Hydra.

For Kinect 2 we have added avatar joint filtering and a fist gesture that can be used to grab and manipulate objects. Developers with Oculus Rift but without Kinect get to choose whether the Rift’s orientation rotates only the avatar head, the whole body, or the walk forward direction.
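That orientation choice can be pictured as routing the Rift's yaw to different parts of the character. A minimal sketch (Python for illustration; the enum and exact routing behavior here are our assumptions, not RUIS' actual API):

```python
from enum import Enum

class RiftRotationTarget(Enum):
    HEAD_ONLY = "head"        # only the avatar head turns with the Rift
    WHOLE_BODY = "body"       # the whole body follows the Rift's yaw
    WALK_DIRECTION = "walk"   # the walk-forward direction follows the Rift

def apply_rift_yaw(yaw_degrees, target):
    # The head always follows the Rift; the chosen target decides
    # what else inherits the yaw.
    pose = {"head_yaw": yaw_degrees, "body_yaw": 0.0, "walk_yaw": 0.0}
    if target is RiftRotationTarget.WHOLE_BODY:
        pose["body_yaw"] = yaw_degrees
        pose["walk_yaw"] = yaw_degrees  # the body carries the walk direction
    elif target is RiftRotationTarget.WALK_DIRECTION:
        pose["walk_yaw"] = yaw_degrees
    return pose
```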

Kinect 2 tracked player grabs the hammer with a fist gesture.

Unity Free users can now use Oculus Rift, but they need to import OculusUnityIntegration.unitypackage and overwrite the existing files. Due to the early state of the Oculus Unity integration, there are also other considerations that you can check in the “Known issues” section of our readme. These issues, including “judder” in the Unity Editor, should be alleviated as new versions of the Oculus SDK are released.

We’ve performed appropriate testing this time, and RUIS for Unity 1.07 holds together well and is shaping up nicely. There is still jaggedness in the motion of Kinect-controlled avatars and motion controllers, which seems to be related to Unity’s frame updates and the irregular device refresh rate. We hope to fix that in the next release.

Posted in RUIS

Photos from 2014 Virtual Reality Course

Below you see VR applications created by my students with our RUIS for Unity toolkit. Most of the applications featured Oculus Rift DK1, Kinect, and PlayStation Move controllers.

This year and in 2015, the virtual reality course is organized at Aalto University under the name Experimental User Interfaces. The next course run starts in January 2015.

Wheelchair Hero

An empowering first-person game using Oculus Rift, where the player controls a wheelchair by spinning wheels that have PS Move controllers attached to them.

Flying Game

A two-player co-operative game where the player with Oculus Rift and a rifle sneaks around a city, using a laser to light up targets which can be destroyed by the second player, who pilots an attack helicopter.

Virtual Curling

A two-player co-operative curling simulator where one player is the curler who “throws” the stones, and the other player acts as a sweeper, affecting the trajectory of the stones as they slide on the ice.


The player’s avatar is a giant cyber-lizard, who uses his claws and “laser breath” to destroy skyscrapers while fighting human soldiers, tanks, and helicopters.


A co-operative two-player game in the spirit of Super Monkey Ball and Marble Madness; one player controls a size-varying ball from a first-person Oculus Rift view, while the other player uses a god-view to help him advance through obstacle courses.


Two players compete to see who can travel further on a snowy path filled with dangers.

Everything you see above was created by students with little or no prior experience in creating VR applications. Five of the six applications featured two different display systems: Oculus Rift and two stereo 3D screens (for the audience and/or the second player).

In the course the students were free to create any kind of application, and for some reason everyone chose to develop games :-)

Posted in Teaching

Oculus Rift DK2 and Kinect 2 support added!

Head over to the download section to get the latest RUIS for Unity version!


We have also added a process for calibrating the transformation matrix between several different sensor pairs (see the above image). This makes it possible to use Kinect 1, Kinect 2 (Windows 8 only), Oculus DK2, and PS Move in the same coordinate system, even if the individual sensors are some distance apart or oriented in different directions (their view frustums need to partially overlap, though). In other words, if you have calibrated Oculus Rift DK2 and Kinect 2, the Kinect 2 avatar’s head and body are correctly aligned with the head-tracked position of the Oculus Rift DK2 when you are using RUIS prefabs, and you will see your whole body in virtual reality!
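The result of such a calibration is a homogeneous 4x4 transform per sensor pair, and applying it to tracked points is straightforward. A sketch of the math (Python for illustration; the function names are ours, not RUIS'):

```python
import numpy as np

def make_calibration(rotation_3x3, translation_3):
    """Build a 4x4 homogeneous transform that maps one sensor's
    coordinate frame into the shared 'master' coordinate system."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3   # relative orientation of the sensor
    T[:3, 3] = translation_3   # relative position of the sensor
    return T

def to_master_frame(T, point_3):
    # Promote to homogeneous coordinates, transform, then drop w.
    p = np.append(point_3, 1.0)
    return (T @ p)[:3]
```

Once every sensor's points pass through its own transform, a Kinect-tracked head and a Rift-tracked head land at the same spot in the shared frame.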

There are still some issues that will be fixed for the next RUIS release. For example, Kinect 2 joint data is not smoothed yet, and the joints have a noticeable amount of jitter. The “Known issues” section in RUIS’ readme file lists a few more rough edges.
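Joint smoothing of this kind is often just a low-pass filter, for instance an exponentially weighted average per joint (an illustrative Python sketch, not the filter RUIS will actually use):

```python
def smooth_joint(prev_filtered, measured, alpha=0.5):
    """Exponentially weighted moving average of one joint position.

    alpha in (0, 1]: closer to 1 is responsive but jittery,
    closer to 0 is smooth but laggy.
    """
    return tuple((1.0 - alpha) * p + alpha * m
                 for p, m in zip(prev_filtered, measured))
```

The trade-off is the usual one: the more jitter you filter out, the more latency you add on top of the sensor's own.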


Posted in RUIS

Oculus Rift Developer Kit 2 has arrived!

We just received our Oculus Rift Development Kit 2 (DK2) head-mounted-display and are thrilled to report our experiences with it.

Rift DK2 in front left, its position tracking camera in front center, and two Kinect 2 sensors behind them.

The DK2 comes bundled with an infrared webcam that tracks the Rift’s position (and most likely helps to correct yaw drift in orientation as well). My first question upon unboxing the DK2 was “Where the infrared LEDs at??”

So I pointed Kinect 2’s infrared camera at it, and took the below picture:


The LEDs appear as overexposed white blobs in the infrared image.

It seems that the LEDs sit below the Rift’s exterior, which is made of a (special?) plastic that lets the IR spectrum through but absorbs visible light, hiding the nasty insides.

DK2 demo experiences

After solving the LED mystery, we tried the following demos:

The Oculus demo scene is best for checking out the tracking and image quality, as the scene is peaceful and its 3D objects are simple and elegant. Cyber Space is a virtual amusement park ride for those of us who want to explore our cyber-sickness limits, Horse World Online is only for the most hard-core horsie fans, and Chilling Space has a calm atmosphere (though we didn’t notice how positional tracking was employed).

DK2 has been out for a relatively short time, and I’m not aware of any killer apps for it yet. Personally I’m looking forward to the DK2 version of the Senza Peso opera.

In many ways Oculus Rift DK2 is superior to DK1: head position tracking is responsive and accurate, which is integral to immersion and to minimizing nausea. While the screen-door (pixel border) effect is still noticeable, it’s a minor nuisance given the major improvements in other areas. DK2’s resolution is higher, its OLED display produces a better color range, and its image is crisp because motion blur from slow pixel response times has been reduced (except for blacks). Tracking latency is low, as it should be, and the low-persistence technique really seems to do the trick, considerably reducing cyber-sickness.


Meant for each other?

That’s it for now, we’ll get back to combining DK2 with Kinect 2! It’s wonderful stuff, keep your eyes on us!

Posted in Virtual Reality

VR2014 conference highlights: Tactical Haptics and more

I participated in the IEEE Virtual Reality 2014 conference, held March 29 – April 2 in Minneapolis. Eager beavers can jump straight to the link below for a list of the best papers and demos at the conference:

VR works better with drugs (pain relief). Timothy Leary approves.

The biggest VR2014 highlight for me was trying out Tactical Haptics‘ Reactive Grip prototype:

The sense of touch is one of the major senses and perhaps the most challenging one to provide with convincing virtual sensations. Currently, haptic feedback is missing from most virtual reality applications. Reactive Grip could change that for many applications: it is cheap and simple haptic technology that could be integrated into any number of modern game controller variations. The handle of Reactive Grip utilizes four sliding contactor plates whose movement conveys the sense of inertia of a virtual object. Examples include gun recoil (kickback), the struggle of a fish caught on a fishing rod, or the hit of a sword against another virtual object. I tried a bunch of demos that included those examples. Tactical Haptics has close ties with Sixense, and in the prototype motion tracking is handled via Razer Hydra controllers.

Reactive Grip has its limitations: because it is a game controller, Tactical Haptics’ device can only approximate sensations from rigid, hand-held objects such as virtual gun grips, fishing rods, steering wheels, and other tools. For most games and applications this should be enough, though. Reactive Grip is a mechanical device, and I wonder how robust it can be; traditionally, haptic devices break easily.

The funny thing is that if you close your eyes while using the controllers, the haptic feedback alone doesn’t convey what you are doing in the virtual world, due to the vagueness and low fidelity of the haptic effect. But when combined with audiovisual cues, the different perceptions merge together gracefully, providing more immersion than any of the cues alone. Most importantly, the haptic feedback doesn’t contradict the audiovisual cues but rather supports them.

Tactical Haptics ran a Kickstarter campaign last autumn that unfortunately didn’t reach its goal. People really need to try this controller to see its potential. Tactical Haptics’ invention could, for the first time, bring haptic feedback to the masses, especially if one of the major console manufacturers were to adopt it.

The acquisition of Oculus VR by Facebook was a big news topic throughout the conference. As such, it was a pity that we didn’t get to see the Crystal Cove or DK2 prototypes of the Rift. Vicon was in talks with Oculus VR about bringing DK2 to the conference, but Oculus had canceled public demonstrations of DK2 due to the Facebook buyout. That’s what I heard, anyway. Palmer Luckey was also supposed to participate in the conference, but apparently the Facebook acquisition and the related death threats against Oculus staff got in the way.

Several times I heard Oculus’ HMD referred to as “Facebook Rift” or “FaceRift”. Perhaps there was slight bitterness in the air regarding the two-billion-dollar buyout? This is understandable, as VR traditionally hasn’t been a very lucrative business, and suddenly seasoned VR researchers and practitioners see a VR company go from zero to hero in less than two years.

I talked to a person who had tried Sony’s Morpheus, DK2, and Valve’s prototype. In his opinion DK2 and Morpheus were very close to each other performance-wise. He liked Valve’s prototype best, though, because of its wide positional tracking, implemented with camera-based inside-out tracking of fiducial markers. With Michael Abrash joining Oculus, hopefully the good features of Valve’s prototype will carry over to future Oculus HMDs.

The University of Minnesota presented a bunch of their VR-related projects to the conference audience. The most interesting one was a high-resolution, wide-FOV HMD built from an iPad mini and a 3D-printed frame. In their demo up to 6–8 people wore the HMDs, inhabiting the same virtual place simultaneously while being tracked over a large area with a commercial optical tracker.

The HMD utilized high-quality glass optics (~$40 apiece) to spread the iPad mini’s 2048-by-1536 resolution over a FOV similar to the Oculus Rift’s. Needless to say, the image was much crisper than the Rift’s, whereas the iPad’s orientation tracking was slightly less responsive than that of the Rift. Overall, I was very impressed with this HMD!

No virtual prostate examinations this year, but at least we got to probe some chest cavities.

After the conference it was time to get back to basics.

P.S. I also visited the Kinect 2 Developer Preview Program Breakfast, co-organized with Microsoft’s Build conference in San Francisco. Microsoft hopes to start selling Kinect 2 for Windows in the summer, and those of us with the developer preview version should get a Kinect 2 Unity plugin even before that.

Posted in Uncategorized

RUIS for Unity 1.05 released

Last week Oculus VR was acquired by Facebook for 2 billion dollars, the biggest move the virtual reality industry has seen. I speculate that this was at least partially influenced by Sony finally getting serious about head-mounted displays with their Morpheus HMD.

Another news piece of (almost) similar proportions is that the latest version of RUIS for Unity is out :-) The Oculus Rift package has been updated to version 0.2.5c and several bugs have been fixed. So what can you do with RUIS for Unity? Use RUIS’ Wand prefabs to easily add interaction via input devices like Razer Hydra, Kinect, and PlayStation Move to your application; configure multiple mono or stereo displays in Unity through RUIS’ DisplayManager; or use the MecanimBlendedCharacter prefab to blend real-time Kinect body tracking with Mecanim animation of your 3D avatar.

Speaking of Oculus Rift, apparently some people experience 150 ms of latency in certain applications built with Unity. Jason Jerald found out that this can be remedied by commenting out the following line inside the SetMaximumVisualQuality() function of the OVRCameraController.cs script:

//QualitySettings.vSyncCount = 1; //comment out this line.

Thanks to Zach Wendt for this information!

In my next update I’ll bring you news from IEEE Virtual Reality 2014 conference. Stay tuned!

Posted in RUIS

Look ahead into year 2014: Virtual reality & RUIS

The year 2014 looks very promising for virtual reality: a new version of Oculus Rift is coming out, along with a plethora of VR peripherals like the Sixense STEM and Virtuix Omni. Developing applications that use these devices means that middleware and software toolkits like RUIS will have an even more important role in the future, as developers want to combine different devices or develop at higher levels of abstraction.

Valve and Sony are working on their own head-mounted displays, and who knows what surprises this year has in store for us! [update: it seems Valve is not making their own HMD after all] VR gaming is far from becoming mainstream, however, and I suspect it will be 2015 at the earliest before indie developers start to make serious profit with games that exclusively require VR peripherals. I don’t expect established game companies to develop big-budget VR-only games in the near future. What about games that have both a traditional UI and a VR user interface, then? I have my reservations: getting two interfaces to work in one game while sharing gameplay mechanics requires a lot of work and is likely to dilute both experiences, if not botch at least one of them altogether.

New virtual reality course

Starting this January, we will run our virtual reality course for the 4th time at Aalto University (we started organizing it in 2011). Student teams will develop virtual reality applications using Oculus Rift, Kinect, PS Move, and other peripherals. Check out the projects from the previous year. Any interested Aalto University students should keep an eye on the course homepage, and note the new course name: Experimental User Interfaces. I also have access to Kinect 2, which will be supported in a future version of RUIS for Unity.


And speaking of further development of RUIS: since autumn we’ve been working at our own pace to improve RUIS for Unity, with the aim of releasing it in the Unity Asset Store. We have been improving documentation, adding essential features, fixing bugs, and making RUIS easier to use. Work has been slower than we anticipated and we missed our planned release date, as I’m busy writing publications for my PhD and Mikael has been focusing on his master’s thesis. It’s coming, however, with all the features that we used to combine Oculus Rift with Kinect, PS Move, and Razer Hydra in our TurboTuscany demo. And then some :-)

P.S. In 2013 we got some nice coverage: I was interviewed for a Gamasutra article about holodecks, the Spanish technology site Xataka featured our TurboTuscany demo, and videos of that demo have so far gathered almost 30,000 views (part 1, part 2).

Posted in RUIS

Lessons learned while developing TurboTuscany

Below I sum up a few points that we learned while developing the TurboTuscany demo. Some of our findings are consequential, while others are common knowledge if you have developed for Razer Hydra, Kinect, or PS Move before.


Latencies of the devices we used, smallest first:
Oculus Rift < Razer Hydra < PS Move < Kinect

Body tracking with Kinect has easily noticeable lag, plenty of jitter, and the tracking fails often. Nevertheless, Kinect adds a lot to the immersion and is fun to play around with.

Of all the positional head tracking methods available in our TurboTuscany demo, PS Move is the best compromise: a big tracking volume (almost as big as Kinect’s) and accurate tracking (though not as accurate as Razer Hydra’s). Therefore the best experience of our demo is achieved with Oculus Rift + Kinect + PS Move. Occlusion of the Move controller from the PS Eye’s view is a problem for positional tracking, though (not for rotational).

The second best head tracking is achieved with the combination of Oculus Rift, Kinect, and Razer Hydra. This comes with the added cumbersomeness of having to wear a Hydra controller around the waist.

My personal opinion is that VR systems with a virtual body should track the user’s head, hands, and forward direction (chest/waist) separately, so that the user can look in a different direction than the one where they are pointing a hand-held tool or weapon, while walking in yet another direction. In the TurboTuscany demo we achieve this with the combination of Oculus Rift, Kinect, and Hydra/Move.

Latency requirements for positional head tracking

The relatively low latency of Razer Hydra’s position tracking should be low enough for many HMD use cases. If you’re viewing objects up close, however, the Hydra’s latency becomes apparent when moving your head. Unless STEM has some new optimization tricks, it will most likely have a different (higher?) latency than Hydra because it’s wireless.

If head position tracking latency is less than or equal to that of Oculus Rift’s rotational tracking, it should be good enough for most HMD applications. Since this is not a scientific paper that I’m writing here, I won’t cite the earlier research that suggests latency requirements in milliseconds.

Because we had positional head tracking set up to track the point between the eyes, we first set Oculus Rift’s “Eye Center Position” to (0,0,0); this parameter determines a small translation that follows the orientation of the Rift. But we found out that the latency of our positional head tracking was apparent when moving the head close (within 0.5 meters) to objects, even with Razer Hydra. Therefore we ended up setting “Eye Center Position” to the default (0, 0.15, 0.09), and viewing close objects while moving became much more natural. Thus, our positional head tracking has a “virtual” component that follows the Rift’s orientation.
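In effect, the rendered eye position becomes the tracked head position plus an orientation-dependent offset. A sketch of the math (Python for illustration; the function name is ours):

```python
import numpy as np

# The default "Eye Center Position" offset mentioned above, in meters.
EYE_CENTER_OFFSET = np.array([0.0, 0.15, 0.09])

def eye_position(tracked_head_pos, rift_rotation_3x3):
    # The offset rotates with the Rift's low-latency orientation tracking,
    # adding a "virtual" positional component on top of the laggier
    # positional tracker (Hydra/Move/Kinect).
    return np.asarray(tracked_head_pos) + rift_rotation_3x3 @ EYE_CENTER_OFFSET
```

Because head rotations near objects are dominated by this low-latency virtual component, the positional tracker's lag is much less apparent.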

And now for something completely different… We had lots of bugs in grandma while implementing the Kinect-controlled avatar for the TurboTuscany demo; below are some results :-)






Posted in RUIS

Oculus Rift + Kinect + Razer Hydra + PS Move demo released!

Over the past months we’ve been adding new features as well as Oculus Rift and Razer Hydra support to RUIS for Unity. Our just-released TurboTuscany demo showcases the new capabilities of RUIS:

Video part 2:

TurboTuscany features a 1st person view with a Kinect controlled full-body avatar and 4 methods for six-degrees-of-freedom (6DOF) head tracking:

  • Oculus Rift + Kinect
  • Oculus Rift + Razer Hydra
  • Oculus Rift + Razer Hydra + Kinect
  • Oculus Rift + PlayStation Move (+ Kinect)

Head tracking with Razer Hydra

It makes a difference to see your own body in the virtual world; it affects the sense of presence. Those of you with Kinect: try pushing and kicking stuff, or even climbing the ladder with your hands. You can take steps freely inside Kinect’s range, and when you need to go further, just use a wireless controller to walk or run like you would in any normal game. We blend your Kinect-captured pose with Mecanim animation, so while you’re running and your feet follow a run animation clip, you can still flail your hands and upper body around as you like.
(Kinect users will need to install the 32-bit version of OpenNI. See the readme that comes with the demo for other details.)
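The per-joint blending of tracked pose and animation can be pictured as a weighted mix of the two. A simplified sketch (Python, positions only; a real implementation would interpolate joint rotations too, and the weighting scheme here is our own illustration, not RUIS' exact one):

```python
def blend_pose(kinect_pose, anim_pose, weights):
    """Blend a Kinect-tracked pose with a Mecanim-style animation pose.

    weights[joint] = 1.0 means fully Kinect-driven (e.g. the arms while
    running), 0.0 means fully animation-driven (e.g. the legs that
    follow the run clip). Poses map joint names to (x, y, z) positions.
    """
    return {joint: tuple(w * k + (1.0 - w) * a
                         for k, a in zip(kinect_pose[joint], anim_pose[joint]))
            for joint, w in weights.items()}
```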

Positional head tracking with Kinect alone is quite rough, so try the Razer+Kinect or PS Move option if you can.

Minimum requirements

  • Oculus Rift
  • Windows operating system: Vista, Windows 7, Windows 8 (see comment section for details)

Supported input devices

  • Razer Hydra
  • ASUS Xtion Pro, Kinect for Xbox (Kinect for Windows?)
  • PlayStation Move and PS Navigation controllers ( software and PS3 required)
  • Gamepad (any Unity compatible gamepad or joystick)
  • Mouse and keyboard

This demo should be pretty fun to try out even with just a mouse and keyboard. There are several physics-based activities, and we’ve hidden a bunch of Easter eggs in the scene.

Download links:
1080p version:

Within a month or two we will release a new version of RUIS for Unity with all the features that we used to create the TurboTuscany demo.

Posted in RUIS