Delicode and faceshift showcase Natural Interaction with two new concepts in interaction and connectivity that bridge the real and digital worlds
TEL AVIV – 12 March, 2013 – OpenNI®, an industry-led organization formed to certify and promote the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware, announced today that it will demonstrate how the tools used in full-body interaction can be applied to broader markets such as security, robotics, retail, digital signage and more during the 2013 Game Developers Conference.
OpenNI, along with partners Delicode and faceshift, will demonstrate Natural Interaction development at Booth #201 at GDC 2013, March 27-29 in San Francisco.
Delicode, the award-winning concept design and software house, will be on hand to demonstrate Kinetic Stories. This charming, interactive, gesture-based storybook opens up a whole new world of enjoyment for kids, allowing them to literally step inside a book on screen and guide their character through an unfolding story.
Kinetic Stories was born from Delicode’s original and main product, NI mate, one of the OpenNI Arena’s most downloaded programs. NI mate takes real-time motion-capture data from a 3D sensor and converts it into MIDI (Musical Instrument Digital Interface) and OSC (Open Sound Control) messages. NI mate simplifies the process of creating a motion-controlled program, allowing developers to build sensor-based games like Kinetic Stories more easily.
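To make the OSC side of that conversion concrete, here is a minimal sketch in Python that encodes a single tracked-joint position as an OSC 1.0 message (null-padded address, type-tag string, big-endian 32-bit floats). The address pattern `/joint/head` is a hypothetical example for illustration; NI mate's actual OSC address scheme is configurable and not specified here.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_pad(address.encode("ascii"))              # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for value in floats:
        msg += struct.pack(">f", value)                 # big-endian float32
    return msg

# Hypothetical head-joint position (x, y, z in metres), NI mate-style:
packet = osc_message("/joint/head", 0.1, 1.6, 2.0)
```

The resulting `packet` is what would be handed to a UDP socket aimed at the receiving application (a DAW, game engine, or visual-programming environment listening for OSC).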
faceshift joins OpenNI at GDC to demonstrate its latest software, faceshift 1.1, the next generation of the company’s technology, which uses the OpenNI 2.0 platform to change the way people interact with each other in virtual worlds. faceshift’s markerless motion-capture system uses the PrimeSense short-range 3D sensor, the Carmine 1.09, to capture real-time facial expressions, eye gaze and head pose, allowing users to animate virtual characters for film and game production. At GDC, the faceshift team will show its latest upgrades, including improved lip sync, better eye tracking and several user interface refinements.
“The game development industry is a pioneer in adopting new technologies that then spread to other markets. Accordingly, Natural Interaction is expanding to new markets and use cases,” said Inon Beracha, CEO of PrimeSense, a founder of OpenNI. “Developers can easily take the existing full-body interaction tools and apply them to other uses such as interactive storybooks, avatar creation and motion tracking for security.”
At GDC, OpenNI will also demonstrate PrimeSense NiTE™, the most advanced and robust 3D computer-vision middleware available today, boasting a thin host footprint, minimal CPU load and multiplatform support.
This middleware gives applications working with 3D sensors a clear user-control API, whether the control is hand-based or full-body. It provides functions such as hand locating and tracking, scene analysis (separating users from the background), accurate skeleton joint tracking, recognition of various gestures and more.
OpenNI is the largest 3D-sensing development framework and community. Its open-source SDK is the recognized standard for developing computer-vision middleware and 3D solutions. The OpenNI community provides developers with a full range of software tools, along with a vibrant ecosystem for effective collaboration and promotion.
The OpenNI consortium was established in November 2010 as a not-for-profit organization whose purpose is to promote and standardize the compatibility and interoperability of Natural Interaction (NI) devices, applications and middleware. For additional information, please visit http://www.openni.org or follow on Twitter at @OpenNI.