Friday, January 11, 2013

Machine Perception Lab Shows Robotic One-Year-Old


Jan. 9, 2013 — The world is getting a long-awaited first glimpse at a new humanoid robot in action mimicking the expressions of a one-year-old child. The robot will be used in studies on sensory-motor and social development – how babies “learn” to control their bodies and to interact with other people.
Different faces of Diego-san: video of robo-toddler shows him demonstrating different facial expressions, using 27 moving parts in the head alone.

Diego-san’s hardware was developed by leading robot manufacturers: the head by Hanson Robotics, and the body by Japan’s Kokoro Co. The project is led by University of California, San Diego full research scientist Javier Movellan.
Movellan directs the Institute for Neural Computation's Machine Perception Laboratory, based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2). The Diego-san project is also a joint collaboration with the Early Play and Development Laboratory of professor Dan Messinger at the University of Miami, and with professor Emo Todorov's Movement Control Laboratory at the University of Washington.
Movellan and his colleagues are developing the software that allows Diego-san to learn to control his body and to learn to interact with people.
"We've made good progress developing new algorithms for motor control, and they have been presented at robotics conferences, but generally on the motor-control side, we really appreciate the difficulties faced by the human brain when controlling the human body," said Movellan, reporting even more progress on the social-interaction side. "We developed machine-learning methods to analyze face-to-face interaction between mothers and infants, to extract the underlying social controller used by infants, and to port it to Diego-san. We then analyzed the resulting interaction between Diego-san and adults." Full details and results of that research are being submitted for publication in a top scientific journal.
While photos and videos of the robot have been presented at scientific conferences in robotics and in infant development, the general public is getting a first peek at Diego-san’s expressive face in action. On January 6, David Hanson (of Hanson Robotics) posted a new video on YouTube at http://www.youtube.com/watch?feature=player_embedded&v=knRyDcnUc4U#!.
“This robotic baby boy was built with funding from the National Science Foundation and serves cognitive A.I. and human-robot interaction research,” wrote Hanson. “With high definition cameras in the eyes, Diego San sees people, gestures, expressions, and uses A.I. modeled on human babies, to learn from people, the way that a baby hypothetically would. The facial expressions are important to establish a relationship, and communicate intuitively to people.”
Diego-san is the next step in the development of “emotionally relevant” robotics, building on Hanson’s previous work with the Machine Perception Lab, such as the emotionally responsive Albert Einstein head (see video on Calit2 YouTube channel).
The video of the oversized android infant was picked up by the popular online technology magazine, Gizmag, with a Jan. 7 article titled “UCSD’s robot baby Diego-san appears on video for the first time,” written by Jason Falconer.
In his article, Falconer writes that Diego-san is “actually much larger than a standard one year old – mainly because miniaturizing the parts would have been too costly. It stands about 4 feet 3 inches (130cm) tall and weighs 66 pounds (30kg), and its body has a total of 44 pneumatic joints. Its head alone contains about 27 moving parts.”
The robot is a product of the “Developing Social Robots” project launched in 2008. As outlined in the proposal, the goal of the project was “to make progress on computational problems that elude the most sophisticated computers and Artificial Intelligence approaches, but that infants solve seamlessly during their first year of life.”
For that reason, the robot’s sensors and actuators were built to approximate the levels of complexity of human infants, including actuators to replicate dynamics similar to those of human muscles. The technology should allow Diego-san to learn and autonomously develop sensory-motor and communicative skills typical of one-year-old infants.
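As a rough illustration of what "dynamics similar to those of human muscles" means in practice, the toy model below simulates a single compliant joint in which force, not position, is commanded. This is a sketch under our own simplifying assumptions, not Diego-san's actual pneumatic hardware or control software.

    # Illustrative sketch only: a toy model of a compliant, muscle-like joint,
    # not Diego-san's actual actuators. A pneumatic actuator behaves less like
    # a stiff position servo and more like a spring-damper whose force is
    # commanded, so position emerges from the dynamics rather than being set
    # directly.
    import numpy as np

    def simulate_joint(force_cmd, mass=0.5, stiffness=8.0, damping=1.5,
                       dt=0.001, steps=3000):
        """Integrate a 1-DoF joint driven by a commanded force plus
        passive spring-damper terms (simple forward Euler)."""
        pos, vel = 0.0, 0.0
        trajectory = []
        for i in range(steps):
            f = force_cmd(i * dt) - stiffness * pos - damping * vel
            acc = f / mass
            vel += acc * dt
            pos += vel * dt
            trajectory.append(pos)
        return np.array(trajectory)

    # Step command: the joint settles gradually instead of snapping to a
    # target -- the kind of compliant behaviour infant-like control must learn
    # to exploit.
    traj = simulate_joint(lambda t: 2.0 if t > 0.5 else 0.0)
    print("final position ~", round(traj[-1], 3))  # ~ force / stiffness = 0.25

The point of the example is simply that with compliant actuators the controller has to reason about forces and resulting motion, much as an infant's brain does, rather than dictating joint angles outright.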
"Its main goal is to try and understand the development of sensory motor intelligence from a computational point of view," explained principal investigator Movellan in a 2010 Q&A with the Japan-based PlasticPals blog. "It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby’s brain faces when learning to move its own body and use it to interact with the physical and social worlds."
The researchers are interested in studying Diego-san’s interaction with the physical world via reaching, grasping, etc., and with the social world through pointing, smiling and other gestures or facial expressions.
As outlined in the original proposal to the NSF, the project is “grounded in developmental research with human infants, using motion capture and computer vision technology to characterize the statistics of early physical and social interaction. An important goal is to foster the conceptual shifts needed to rigorously think, explore, and formalize intelligent architectures that learn and develop autonomously by interaction with the physical and social worlds.”
According to UCSD's Movellan, the expression recognition technology his team developed for Diego-san has spawned a startup called Machine Perception Technologies (MPT). The company is currently looking for undergraduate interns and postgraduate programmers. "We like UCSD students because they tend to have a strong background in machine learning."
The project may also open new avenues to the computational study of infant development and potentially offer new clues for the understanding of developmental disorders such as autism and Williams syndrome.
As noted in the Gizmag article, Diego-san won’t be the only child-like robot for long. This spring Swiss researchers will demonstrate their nearly 4-foot-tall Roboy robot toddler (with a face selected via a Facebook contest!).

Scientists Use a Shortcut to Get Clues from the Deep Ocean

By the A.M. Costa Rica staff
January 10, 2013

Out of sight of land in the Pacific, another group of scientists is trying to make sense of undersea activity in that part of the world.

This is the scientific complement of the "JOIDES Resolution" research vessel, and they are taking a shortcut.

The "JOIDES Resolution" is a frequent visitor to Costa Rica's Pacific ports, and has participated in a series of scientific exploration off the coast.

The main area of study is the activity beneath the sea floor that can cause horizontal movements of up to 8 centimeters (a bit more than 3 inches) a year as the earth pushes up magma.

That research relates to the forces that pushed up Costa Rica's mountains, energize its volcanoes and continue to rattle the country periodically. Unlike many scientific expeditions, this one is transparent, with an educational dimension, so daily activities become discussion topics.

This expedition, No. 345, is headed by Jonathan Snow of the University of Houston, Texas, and Kathy Gillis of the University of Victoria, Canada.

The drilling ship "JOIDES Resolution" is now on station above what is known as Hess Deep Rift about 2 degrees above the equator. Previous expeditions sought to sample the rock deep under the sea floor by drilling to it. But the presence of the rift, sort of an undersea canyon, provides a shortcut, the expedition explained in a recent summary. The deep cut in the ocean floor allows the crew to avoid a lot of drilling.

The rock surface at the bottom of the rift is estimated to be just about a million years old, a real youngster in geological terms. And this is what the scientific crew calls a tectonic window.
The rift is not far from the intersection of the Pacific, Nazca and Cocos tectonic plates.

The Cocos is the one that keeps pushing under and against the lighter Caribbean plate to bring earthquakes to Costa Rica.

Expedition No. 345 photo
Scientists examine rock core from Hess Deep

The summary of Expedition No. 345 describes its goal this way:

"The objective of this project is to sample for the first time, primitive magmatic rocks of the lower crust in the oceanic Pacific. These samples will help scientists seek to understand the manufacturing process of the oceanic crust at a fast-spreading rift, but also to document the effect of cooling the young crust by seawater, and thus the importance of chemical exchanges between the crust and the ocean. They control the chemical evolution of the oceanic crust before recycling into the mantle via subduction zones, and play a fundamental role in geochemical cycles across the globe."

As magma is pushed to the surface, it cools into basalt and an igneous rock called gabbro, said the summary.

A previous expedition obtained samples of gabbro from deep drilling, also off the isthmus in the Pacific.

A member of the expedition, Jean-Luc Berenguer, reported via the "JOIDES Resolution" Web site this week that the drilling team had brought up core samples of rock from the Hess Deep, which is about 5,000 meters (about 16,400 feet) down.

In fact, the scientists have been demonstrating the force of the water by lowering polystyrene objects, which are compressed to a fraction of their original size in the deep ocean. This is done mostly as a demonstration for schoolchildren who are following the work electronically.
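For a sense of scale, a back-of-the-envelope hydrostatic pressure calculation (ours, not taken from the expedition's materials) shows why air-filled foam gets crushed at that depth:

    # Rough estimate of hydrostatic pressure at the Hess Deep floor:
    # p = rho * g * h, assuming a typical seawater density.
    rho = 1025.0      # kg/m^3, approximate seawater density
    g = 9.81          # m/s^2
    depth = 5000.0    # m, approximate depth cited for the Hess Deep floor

    pressure_pa = rho * g * depth
    print(f"pressure ~ {pressure_pa / 1e6:.0f} MPa "
          f"(~ {pressure_pa / 101325:.0f} atmospheres)")
    # ~50 MPa, roughly 500 times atmospheric pressure -- enough to squeeze
    # the air out of polystyrene foam and shrink it dramatically.

At roughly 500 atmospheres, the trapped air in polystyrene is compressed to a tiny fraction of its surface volume, which is why the souvenirs come back up so much smaller.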