Google’s Soli Uses Gesture Technology to Immerse K–12 Students
Author: eli.zimmerman_9856
Mon, 08/26/2019 – 13:29
In movies, the fastest way to convey a futuristic world is by giving characters the power to control their environment with a simple wave of the hand.
Gesture control isn’t ubiquitous yet, but the technology is progressing. In January, the Federal Communications Commission approved Google’s Project Soli, a sensing technology that uses miniature radar to detect touchless gestures.
“With the FCC approvals, Google can develop the gesture technology at a faster pace,” says Thadhani Jagdish, senior research analyst in the information and communications technology practice at Grand View Research. The goal, he says, is to eventually “transform an individual’s hand into a universal remote control.”
MORE FROM EDTECH: Check out what experts see as the future of classroom technology in K–12.
Microsoft Kinect Puts Content Control in Users’ Hands
Gesture technology is already being used in diverse applications, Jagdish notes. In South Africa’s O.R. Tambo International Airport, for example, a coffee company installed a machine that analyzes travelers’ facial gestures and dispenses a cup of coffee if it detects someone yawning.
Some Samsung TVs support motion control for changing channels, playing games and browsing the internet. And an input device from Leap Motion, a company that develops hand-tracking software, lets users control a computer or laptop with hand movements.
In schools, these motion-sensing devices can be useful in spaces that rely on shared screens and collaboration platforms.
Rich Radke, a professor of electrical, computer and systems engineering at Rensselaer Polytechnic Institute, says staff there use Microsoft Kinect, which lets them point at a screen to change its content.