Byron Spice | Wednesday, May 8, 2019
SCS had a strong showing this week at CHI 2019, the Association for Computing Machinery's Conference on Human Factors in Computing Systems. Here are highlights from the stories we wrote about some of the CMU research featured at the conference.
Show Your Hands: Smartwatches Sense Hand Activity
We're used to smartwatches and smartphones that sense what our bodies are doing, but what about our hands? It turns out that smartwatches, with a few tweaks, can detect a surprising number of things your hands are doing, including typing on a keyboard, washing dishes, petting a dog, playing the piano or using scissors.
Knit 1, Purl 2: Assembly Instructions for a Robot?
CMU researchers have used computationally controlled knitting machines to create plush toys and other knitted objects actuated by tendons. It's an approach they say might someday be used to cost-effectively make soft robots and wearable technologies.
Suitcase, Wayfinding App Help Blind People Navigate Airports
Robotics Institute researchers have teamed up with Pittsburgh International Airport to develop two tools that help people with visual disabilities navigate airport terminals safely and independently. The first, a smart rolling suitcase, sounds alarms when users are headed for a collision. The second is a navigation app that provides turn-by-turn audio instructions for reaching a departure gate, a restroom or a restaurant.
CMU Researchers Make Transformational AI Seem "Unremarkable"
Physicians making life-and-death decisions don't give much thought to how artificial intelligence might help them. And that's how CMU researchers say clinical AI tools should be designed: so doctors don't need to think about them. They call this "Unremarkable AI."
Byron Spice | 412-268-9068 | bspice@cs.cmu.edu