Sigma R&D has won first prize in a gesture challenge by showing just how much extra talent, from sign language translation to lightsaber fun, can be unlocked in a Kinect. Normally the Microsoft device can only track the body and full mitt movements, but the research company's custom software lets a Kinect or similar sensor follow individual fingers, turning a user's hand into a far more finely tuned controller. To prove it, the company handed a subject a virtual lightsaber, tracking his swordsmanship and using his thumb extension to switch the blade on and off. The system even detected a passing gesture, seamlessly transferring the virtual weapon from one person to another. The same tech was also used to read sign language, displaying the intended letters on screen for a quick translation. The SDK is due in the fall, and we can't wait to finally get our hands on a Jedi weapon that isn't dangerous or plasticky. To see it for yourself, check out the videos after the break. http://www.engadget.com/2012/07/25/s...sign-language/
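
For the curious, here's a rough idea of how a thumb-extension toggle like the one in the lightsaber demo could work once per-finger positions are available. Sigma R&D hasn't published its algorithm, so everything below (the HandFrame layout, the thumb_extended test, the 1.4 ratio threshold, the SaberToggle class) is a hypothetical sketch in Python, not the company's actual code.

```python
# Hypothetical sketch of a thumb-extension toggle for a virtual lightsaber.
# Assumes a finger-tracking SDK delivers 3D joint positions per frame;
# the data layout and threshold are invented for illustration.
from dataclasses import dataclass
from math import dist


@dataclass
class HandFrame:
    """One frame of hypothetical finger-tracking output (3D points, meters)."""
    palm: tuple[float, float, float]
    thumb_tip: tuple[float, float, float]
    middle_knuckle: tuple[float, float, float]  # used to normalize for hand size


def thumb_extended(frame: HandFrame, ratio_threshold: float = 1.4) -> bool:
    """Treat the thumb as extended when its tip sits far from the palm,
    measured relative to the palm-to-knuckle distance so the test stays
    invariant to how close the hand is to the sensor."""
    hand_size = dist(frame.palm, frame.middle_knuckle)
    return dist(frame.palm, frame.thumb_tip) > ratio_threshold * hand_size


class SaberToggle:
    """Edge-triggered switch: flips the saber state once per thumb extension,
    rather than on every frame the thumb happens to be out."""

    def __init__(self) -> None:
        self.saber_on = False
        self._was_extended = False

    def update(self, frame: HandFrame) -> bool:
        extended = thumb_extended(frame)
        if extended and not self._was_extended:  # fire on the rising edge only
            self.saber_on = not self.saber_on
        self._was_extended = extended
        return self.saber_on
```

The edge trigger matters: without it, a held-out thumb would flicker the blade on and off every frame instead of toggling it once, which is presumably why demos like this key off the gesture's onset rather than its steady state.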