CSHL neuroscientist Anthony Zador shows how evolution and animal brains can be a rich source of inspiration for machine learning, especially to help AI tackle some enormously difficult problems, like doing the dishes.
The majority of soft robots today rely on external power and control, keeping them tethered to off-board systems or rigged with hard components. Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Caltech have developed soft robotic systems, inspired by origami, that can move and change shape in response to external stimuli, paving the way for fully untethered soft robots.
An experiment with a water-saving 'smart' faucet shows potential for reducing water use. The catch? Unbeknownst to study participants, the faucet's smarts came from its human controller.
In the blink of an eye, the human visual system can process an object, determining whether it's a cup or a sock within milliseconds, and with seemingly little effort. It's well-established that an object's shape is a critical visual cue to help the eyes and brain perform this trick. A new study, however, finds that while the outer shape of an object is important for rapid recognition, the object's inner 'skeleton' may play an even more important role.
Researchers at the UW have used machine learning to develop a system that monitors factory and warehouse workers and tells them, in real time, how ergonomic their jobs are.
UC Berkeley neuroscientists have created interactive maps that can predict where different categories of words activate the brain. Their latest map is focused on what happens in the brain when you read stories.
EPFL scientists are investigating new ways to provide visual signals to the blind by directly stimulating the optic nerve. Their preliminary study uses a new type of neural electrode and shows that it can deliver distinct signals.
Machine learning algorithms can sometimes do a better job with a little help from human expertise, at least in the field of materials science.
Is the way we bark out orders to digital assistants like Siri, Alexa and Google Assistant making us less polite? Prompted by growing concerns, two Brigham Young University information systems researchers decided to find out.
A collaborative team at the Wyss Institute, Harvard SEAS, and the University of Nebraska Omaha reports in Science that it has developed the first portable exosuit that can assist hip extension during both walking and running. The team successfully tested the wearable robot in uneven outdoor environments while wearers walked uphill and both walked and ran at different speeds.