However, all of these solutions have two problems: they require separate hardware, and they don't feel natural. With its latest project, Microsoft Research plans to put an end to that. MRTouch uses the HoloLens' depth-sensing cameras to bring touch controls to surfaces that have no touch hardware at all. It brings the gestures from mobile devices into 3D space, throwing in up to 10-finger support for complex and natural control. Users can reach out and define areas for virtual interfaces on physical objects, with surprising precision and multi-touch support. It works on most flat surfaces, from tables to walls and floors. It's not hard to see the applications, especially once it's refined to include collaborative tools.
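In broad strokes, depth-camera touch detection boils down to measuring how far a fingertip hovers above a fitted surface plane. The sketch below is not Microsoft's implementation; it's a minimal illustration of the idea, assuming a depth camera that yields 3D points in millimetres and a surface plane already fitted from earlier frames.

```python
import numpy as np

# Hypothetical sketch: classify depth samples as "touches" when they sit
# within a small band just above a fitted surface plane (n . x + d = 0).

def detect_touches(points, normal, offset, max_height_mm=10.0):
    """Return the points hovering less than max_height_mm above the plane."""
    normal = normal / np.linalg.norm(normal)
    heights = points @ normal + offset          # signed distance to the plane, mm
    mask = (heights > 0) & (heights < max_height_mm)
    return points[mask]

# Example: a table plane z = 0 (normal pointing up) and three candidate points.
pts = np.array([[0.0, 0.0, 4.0],    # 4 mm above the table  -> touch
                [5.0, 5.0, 60.0],   # hovering hand          -> ignored
                [9.0, 1.0, 8.0]])   # 8 mm above             -> touch
touches = detect_touches(pts, normal=np.array([0.0, 0.0, 1.0]), offset=0.0)
print(len(touches))  # -> 2
```

A real system would add plane fitting, fingertip segmentation, and temporal filtering on top of this, but the core test, distance to a plane against a threshold, stays the same.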

Still Early Days

In a demo video, Microsoft showcased the manipulation of 3D objects, web page scrolling, photo viewing, painting, and even an early Minecraft concept. Importantly, the user can have several of these areas active at once, emulating a multi-monitor workflow for increased productivity. Despite this, it's obvious the project is in its teething stage. There's still a significant amount of latency, which would be a problem for typing or gaming. However, with hardware from the HoloLens 2 and some refinement, it's not hard to imagine it becoming a completely viable solution. Currently, MRTouch has an average positional error of 5.4mm and a 95% accuracy rate on 16mm buttons. Microsoft says that's already comparable to capacitive touchscreens, and we look forward to further improvements in framerate and app compatibility.
