Apple was today awarded a patent for touch-free, motion-sensing gestures within 3D GUI concepts, just as a 3D mapping patent picked up through its acquisition of PrimeSense was formally reassigned to the company.
We first told you about Apple’s patent application for a “Sensor Based Display Environment” back in January of 2012, but today the United States Patent and Trademark Office officially awarded Apple the patent (via AppleToolbox).
As pictured in the patent drawings above, Apple’s patent describes 3D user interfaces that utilize “onboard sensors to automatically determine and display perspective projection of the 3D display environment based on the orientation data without the user physically (e.g., touching) the display.” In one scenario described in the patent, gestures “made a distance above a touch sensitive display” are detected by proximity sensors built into a device.
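To make the idea concrete, here is a minimal sketch of how orientation data could drive a perspective projection of an on-screen 3D environment, in the spirit of what the patent describes. The patent does not publish code; the class name MotionPerspectiveController, the use of CoreMotion, and the transform values are all illustrative assumptions, not Apple's implementation.

```swift
import UIKit
import CoreMotion

// Illustrative only: tilt a layer's perspective transform using device
// orientation data from CoreMotion, so the 3D scene shifts without touch.
final class MotionPerspectiveController {
    private let motionManager = CMMotionManager()
    private weak var sceneLayer: CALayer?   // hypothetical layer hosting the 3D scene

    init(sceneLayer: CALayer) {
        self.sceneLayer = sceneLayer
    }

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self, let motion, let layer = self.sceneLayer else { return }

            // Map device pitch and roll (radians) onto a simple perspective transform.
            var transform = CATransform3DIdentity
            transform.m34 = -1.0 / 500.0   // assumed perspective term, chosen arbitrarily
            transform = CATransform3DRotate(transform,
                                            CGFloat(motion.attitude.pitch),
                                            1, 0, 0)   // tilt around the x-axis
            transform = CATransform3DRotate(transform,
                                            CGFloat(motion.attitude.roll),
                                            0, 1, 0)   // tilt around the y-axis
            layer.transform = transform
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

The hover-gesture side of the patent would layer proximity sensing on top of this, detecting fingers above the screen rather than on it, but that hardware isn't exposed in public iOS APIs, so it isn't sketched here.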
It’s not the first time Apple has explored 3D UI concepts or the use of sensors to detect Kinect-like gestures, and rumors recently surfaced that Apple is working on glasses-free 3D display technology for the iPhone. That’s another hint the company may be getting closer to introducing such a feature.
Speaking of Kinect-like gestures, Apple has completed its first patent reassignment from its purchase of Israel-based PrimeSense, the company behind the technology in Microsoft’s Kinect sensor. The patent, for a “Lens array projector” (via AI), covers methods of light projection for mapping environments in 3D.
It’s unclear exactly how far along Apple is with its work on 3D user interfaces and motion-sensing gestures, which have yet to appear in its products, but the company has also hinted at its interest in virtual reality recently.
Earlier this month we spotted a number of job listings on the company’s site seeking developers with experience in virtual reality and motion-sensing technologies like Oculus Rift and Leap Motion to build VR and augmented reality experiences for future products.