
Friday, February 28, 2014

Showers that reuse your soapy water.


The OrbSys shower claims to save up to 90 percent of the water you'd normally use to get yourself clean by catching and sending it right back to you. Don't worry -- the dirty, soapy water goes through a purification process first, before it emerges as drinking quality at a higher-than-average pressure. And because the used water only needs to be reheated slightly, the shower saves up to 80 percent in energy, or about $1,350 per year from the average utility bill. Swedish tech company Orbital Systems developed the shower with NASA's Johnson Space Center, having been inspired by the designs used in space missions, where fresh water is obviously limited.
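The claimed savings are easy to sanity-check with a back-of-the-envelope calculation. The flow rate and shower length below are typical assumed figures, not Orbital Systems specifications:

```python
# Rough estimate of the water a recycling shower could save.
# All inputs are illustrative assumptions, not manufacturer specs.
STANDARD_FLOW_LPM = 10.0   # litres per minute for a typical shower head
SHOWER_MINUTES = 10        # length of an average shower
WATER_SAVING = 0.90        # claimed: up to 90% less water

standard_litres = STANDARD_FLOW_LPM * SHOWER_MINUTES
recycled_litres = standard_litres * (1 - WATER_SAVING)

print(f"Typical shower:   {standard_litres:.0f} L")
print(f"Recycling shower: {recycled_litres:.0f} L")
```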

Thursday, February 27, 2014

Firefighting Robot Paints 3D Image for Rescuers


Recent headlines regarding autonomous robots suggest that smart machines have a license to kill. But a new project from engineers at the University of California, San Diego suggests a different reality.
The UC engineers have built a pack of tiny autonomous robots that could help save the lives of both fire victims and firefighters.
These lifesaving robots, which look a lot like small Segways, were designed for mobility, agility and reconnaissance. As the first to enter burning buildings, they can serve as scouts for firefighters arriving at the scene of an emergency.
The robots are equipped with infrared and red-green-blue (RGB) cameras, which they use to record temperatures, detect volatile gases and check for structural integrity - all while also searching for victims.
Using on-board software systems, the bots turn the information they gather into 3D maps, which can be viewed by firefighters in real time.
The robots' Segway-like structure, which includes an actuated center leg, even lets them climb stairs and overcome large obstacles. When working collaboratively, the bots can provide firefighters with a highly detailed map of an entire structure.
"These robot scouts will be small, inexpensive, agile and autonomous," said Thomas Bewley, professor of mechanical engineering at the Jacobs School of Engineering at UC San Diego.
"Firefighters arriving at the scene of a fire have a thousand things to do. To be useful, the robotic scouts need to work like well-trained hunting dogs, dispatching quickly and working together to achieve complex goals while making all necessary low-level decisions themselves along the way to get the job done."
And this pack of robots isn't the only machine helping first responders. A research team at the University of Sheffield in the U.K. recently developed a "tactile helmet" that lets firefighters sense what's going on around them, even in total darkness.
And the European Space Agency recently funded the development of an autonomous reconnaissance robot designed to gather sensory information about hazardous disaster scenes. [See also: Killer Robots Condemned in New UN Report].

Your Cellphone Could Be a Sonar Device


Submarines have used sonar for decades. Bats and dolphins have used it for millions of years. And thanks to a little math, humans could soon be echolocating with their mobile phones.
At the École Polytechnique Fédérale de Lausanne (EPFL), in Switzerland, experts in signal processing discovered a mathematical technique that allows ordinary microphones to "see" the shape of a room by picking up ultrasonic pulses as they bounce off the walls. The work was published in this week's edition of the journal Proceedings of the National Academy of Sciences (PNAS).
Microphone echolocation is harder than it sounds. Ambient noise in any room interferes with the sounds used to locate the walls, and the echoes sometimes bounce more than once. There is also the added challenge of figuring out which echoes are bouncing off which wall. [See also: "How Bats Stay on Target: Bio Sonar"]
Bats have had millions of years to evolve specialized neural circuits to fine-tune their echolocation abilities, said Ivan Dokmanic, a doctoral researcher and lead author of the PNAS paper. He added that humans can echolocate too, though not as precisely. (Some blind people have demonstrated this ability.)
One reason echolocation is easier for bats and humans than it is for computers is that bats and humans have skulls that filter the sound. Tracking where a sound originates is easier for humans because people's two ears hear slightly different things. This allows humans to pinpoint the origin of a sound.
To enable echolocation in mobile devices, Dokmanic investigated the math behind echolocation. What he found was that it's possible to treat the echoes of sounds emitted by a speaker as sources, rather than as waves bouncing off of something.
It's kind of like what happens when you look into a mirror: Your eyes see a reflection, but there's the illusion that another person who looks just like you is standing at precisely the same distance from the mirror.
That's what Dokmanic did with sound. He assumed that each echo was a source, and created a kind of grid, called a matrix, of distances. Using some advanced math, he was then able to create an algorithm that could group the echoes in the correct way to deduce the shape of a room.
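The mirror analogy translates directly into geometry: a first-order echo off a wall travels exactly as far as the straight line from the wall's "image" of the source to the microphone. A toy sketch of that idea (the room geometry here is invented, and this is only the first-order 2D case, not the paper's full matrix algorithm):

```python
import math

def image_source(src, wall_x):
    """Mirror a 2D source position across a vertical wall at x = wall_x."""
    x, y = src
    return (2.0 * wall_x - x, y)

speaker = (0.0, 0.0)
mic = (2.0, 0.0)
img = image_source(speaker, wall_x=3.0)   # mirror image sits at (6.0, 0.0)

# A first-order echo (speaker -> wall -> mic) travels exactly as far as the
# straight line from the image source to the microphone.
echo_path = math.dist(img, mic)

# At roughly 343 m/s, that tells you when the echo should arrive.
echo_delay = echo_path / 343.0
print(img, echo_path, echo_delay)
```

Grouping measured echoes so that their implied image-source distances are mutually consistent is what the matrix of distances makes possible.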
First, the team experimented with an ordinary room at the EPFL, using a set of microphones and a laptop computer to test whether the algorithm worked. It did, and their next step was to test their program in the real world. So they went to a cathedral and tested it there.
"It was really the opposite environment," Dokmanic said, adding that unlike a controlled lab setting, a cathedral has a lot of ambient noise and the space isn't perfectly square.
The algorithm worked there too, showing that the echolocation scheme could detect the cathedral's walls.
"The innovation is in the way that they process the signal to calculate the shape of the room," said Tommaso Melodia, an associate professor of electrical engineering at the University at Buffalo who was not involved in the study.  
Martin Vetterli, professor of communications systems at EPFL and a co-author of the paper, said that mobile phones could be used to locate people more precisely. One problem with getting anyone's precise location on the phone is that only certain frequencies penetrate building walls, so GPS signals are sometimes useless.
Moreover, GPS is not always precise — if there's a lot of interference, it's not uncommon for a phone to say it can't locate you more precisely than within a half mile. Wi-Fi could work, but it depends on the existence of a local network.
Echolocation partly solves that problem, because it can measure the distance from where a user is standing to the walls of an individual room, and send that more precise information to tell the network exactly where that person is located. Instead of knowing where someone is within a city block, you'd be able to see that he or she is inside a room of a certain size or is surrounded by walls that give an intersection a certain shape.
One other issue is the distance between the two microphones on a mobile phone. Many mobile phones have two mics — the directional mic is used when the phone is pressed to your head during a call, and the other is used for cancelling out ambient noise.
The two microphones on a phone calculate distance by triangulation – measuring the small gap between the times at which an echo reaches each microphone. The distance between the microphones is the base of a triangle, and the difference in the echoes' arrival times tells you the length of the other two sides.
But these two microphones usually aren't very far apart on phones, so calculating the distance to a source that's far away is harder to do.
One solution, Vetterli said, might be to use people's tendency to walk with their phones in order to help echolocate walls more accurately.
Since you can't make phones much bigger, it is simpler to have the phone take measurements from more than one spot as the user walks with it, so the base of the triangle is longer, he said.
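That triangulation boils down to comparing arrival times, and the sketch below shows why the baseline matters. The geometry and mic spacings are invented for illustration: with a phone-sized spacing the time difference is tiny, while sampling from two points along a short walk makes it far larger and easier to measure.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def tdoa(source, mic_a, mic_b):
    """Time-difference-of-arrival of one sound at two mic positions (seconds)."""
    return (math.dist(source, mic_a) - math.dist(source, mic_b)) / SPEED_OF_SOUND

echo = (3.0, 1.0)  # an echo arriving from a wall a few metres away

# Phone-sized baseline: two mics 10 cm apart.
t_phone = tdoa(echo, (0.0, 0.05), (0.0, -0.05))

# "Walking" baseline: the same phone sampled at two spots 1 m apart.
t_walk = tdoa(echo, (0.0, 0.5), (0.0, -0.5))

print(f"{abs(t_phone) * 1e6:.1f} us vs {abs(t_walk) * 1e6:.1f} us")
```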

Pencil Pusher

U.S. businesses use about 21 million tons (19 million metric tons) of paper every year -- 175 pounds of paper for each American, according to the Clean Air Council. This has led to office recycling programs, "please think before you print" e-mail signatures and printers that offer double-sided printing. Now a trio of Chinese inventors hopes to add another device to the cubicle environment: the P&P Office Waste Paper Processor, which turns paper destined for recycling into pencils. The machine, looking a bit like a three-hole punch crossed with an electric pencil sharpener, was a finalist in the 2010 Lite-On Awards, an international competition that seeks to stimulate and nurture innovation.
Here's how the pencil-making gadget works: You insert wastepaper into a feed slot. The machine draws the paper in, rolls and compresses it, and then inserts a piece of lead from a storage chamber located in the top of the device. A small amount of glue is added before -- voilà -- a pencil slides out from a hole on the side. It's not clear how many pieces of paper form a single pencil, but you figure the average office worker could generate a decent supply of pencils in a month.
And that seems to be the biggest drawback to the pencil-producing gadget. How many No. 2 pencils can an office really use, given that most workers take notes on their tablet PCs or laptops? And how much glue and lead core do you need to buy to keep up with the overflowing paper recycle bin? Too much, we would suspect, which is why you may never see this gadget in your office supplies catalog.

Wednesday, February 26, 2014

SR-71 Blackbird: How to fly the world's fastest plane

During the Cold War, the US Air Force operated the world's fastest air-breathing aircraft - the Lockheed SR-71 Blackbird. It was a plane which flew at the edge of space; so high that most other jet engines would seize because of the lack of air. A plane that flew so fast that its airframe heated and grew during flight. A plane that, if needed, could outrun missiles launched to bring it down.
The Lockheed SR-71 was a product of airplane maker Lockheed's Skunk Works, a secretive project which came up with some of the world's most advanced aircraft. It was designed after the loss of a U-2 spyplane over the Soviet Union in 1960 – a plane thought to fly too high to be shot down. The Blackbird would fly even higher, and at speeds of Mach 3.3 it would be fast enough to outrun any missile fired at it.
From 1966 until its last mission in 1989, the Lockheed SR-71 Blackbird flew thousands of missions around the globe, photographing military installations from China to Egypt, the Arctic Circle to North Korea. 
Colonel Rich Graham flew the Blackbird from 1974 until the mid-1980s, first as a mission pilot and then as a trainer. He later took command of all Blackbird detachments – in California, Mildenhall in the UK and at Kadena on the Japanese island of Okinawa. He has also written several books about the aircraft.
Here he tells BBC Future about what made the SR-71 such a remarkable plane.

Electric scooter design that makes a hole lot of sense

Scooters are a firm favourite in the developing world – but they are noisy and polluting, and can be dangerously overloaded. In many parts of the world, the scooter is one of the most popular ways to get around. City streets are crowded with the small two-wheeled vehicles, sometimes carrying one passenger, but often carrying two or more people, their children – and their cargo. Small motorcycles can be a vital form of transport for farmers to get their produce to market, or parents to get their children to school or medical care. So, drivers and their families, pets, livestock and boxes and bags often have to balance precariously on two wheels.
More and more scooters are taking to the streets of South American cities – faster than new cars, in some cases. In many Asian cities they’re the dominant form of transport. They can be cheap to own and operate, and they can be faster than the car in congested cities. But scooters are also a source of urban pollution, in terms of both emissions and noise. Their small engines burn a combination of oil and gasoline relatively inefficiently compared to an average modern car, and routine maintenance isn’t always carried out.
They are also dangerous – even deadly. A scooter is designed for one, maybe two people. If it is loaded beyond that the centre of gravity is shifted higher and balance becomes much more difficult when weaving in and out of traffic.
Lit Motors, a small company based in San Francisco, California, that I profiled in my last article, believes it might have a solution. By going back to the drawing board and designing an electric scooter for carrying cargo, it has come up with a design that it says should be safer, more efficient, more useful, but no more expensive. These are big goals for a small machine.
The first thing you notice when you look at the new cargo scooter is actually an absence, or a gap, in the middle of the machine. Right in the centre, approximately where the engine would sit underneath the driver, there is a square hole which can accommodate boxes up to 50x50x50cm (20x20x20 inches). The driver sits on a saddle right at the back. The total cargo capacity is estimated to be up to 90kg (198lbs), not including the weight of the rider. 
All in the lean
A key factor in the redesign was making the scooter electrically powered. The motor is placed inside the rear wheel, says Ryan James, Lit Motors’ chief marketing officer.
“That frees up the vehicle architecture to make it whatever we want it to be, and it also greatly simplifies the drivetrain making it cheaper to produce, cheaper to sell, and cheaper to the consumer,” he says.
Rechargeable batteries run along the base of the scooter, keeping the centre of gravity low, which is key to a stable, and therefore safer, ride. The batteries hold enough charge for around 50 miles, or 80km, and top speed will be around 50mph (80kph).
The other standout part of the design is two handlebars, which seem to sprout out of the top of the bike.  
“We have a unique, tank-style steering,” James explains. There is a handlebar on either side; instead of twisting around a central pivot, they are pushed or pulled in tandem to move the front wheel. “Like any other two-wheeled vehicle, when you’re moving you’re actually turning the front wheel very little. The turn is all in the lean.”
For the cargo scooter to have an impact, it will have to be cheap enough for people to afford it. So is it possible to get the price of an innovative electric bike that low?
“We think we can,” says James. “We went over to India a couple of years ago and conducted market research, met with three of the largest scooter and motorcycle manufacturers, and we were able to determine how much we could produce this for. We should be able to get this comparable with similar price-point scooters.” In reality that means a target price of around $5,000.
Early production will be determined by demand, according to the company. It says it is prepared to produce anywhere from 10 to a few hundred at its facility in San Francisco at first, and then increase the numbers. Longer term, the company may license the design to manufacturers around the world.
The cargo scooter is still very much at the development stage, and I was shown an early prototype (see video). The company already has ambitious plans, though. Their next goal is to make the scooter fold in half, so you end up with the two wheels much closer together and the bike’s footprint halved.
“That will be really big in places like China, where the norm is not to leave your scooter locked down in the street, but to bring it up into your apartment with you,” says James.
Lit plans to release the machine later on this year – using a Kickstarter campaign, of course.

Tuesday, February 25, 2014

How Dogs Know What You're Feeling


When you hear a friend’s voice, you immediately picture her, even if you can’t see her. And from the tone of her speech, you quickly gauge if she’s happy or sad. You can do all of this because your human brain has a “voice area.” Now, scientists using brain scanners and a crew of eager dogs have discovered that dog brains, too, have dedicated voice areas. The finding helps explain how canines can be so attuned to their owners’ feelings.
“It’s absolutely brilliant, groundbreaking research,” says Pascal Belin, a neuroscientist at the University of Glasgow in the United Kingdom, who was part of the team that identified the voice areas in the human brain in 2000. “They’ve made the first comparative study using nonprimates of the cerebral processing of voices, and they’ve done it with a noninvasive technique by training dogs to lie in a scanner.”
The scientists behind the discovery had previously shown that humans can readily distinguish between dogs’ happy and sad barks. “Dogs and humans share a similar social environment,” says Attila Andics, a neuroscientist in a research group at the Hungarian Academy of Sciences at Eötvös Loránd University in Budapest and the lead author of the new study. “So we wondered if dogs also get some social information from human voices.”
To find out, Andics and his colleagues decided to scan the canine brain to see how it processes different types of sounds, including voices, barks, and natural noises. In humans, the voice area is activated when we hear others speak, helping us recognize a speaker’s identity and pick up on the emotional content in her voice. If dogs had voice areas, it could mean that these abilities aren’t limited to humans and other primates.
So the team trained 11 dogs to lie motionless in a functional magnetic resonance imaging brain scanner, while wearing headphones to deliver the sounds and protect their ears. “They loved doing this,” Andics says, adding that the pooches’ owners were there to reward them with treats and petting. The scanner captured images of the dogs’ brain activity while they listened to nearly 200 dog and human sounds, including whines, cries, playful barks, and laughs. The scientists also scanned the brains of 22 human subjects who listened to the same set of sounds. Both dogs and humans were awake during the scans.
The images revealed that dog brains have voice areas and that they process voices in the same way that human brains do, the team reports online today in Current Biology. And because these voice areas are found in similar locations in the brains of both dogs and humans, the scientists suggest that they likely evolved at least 100 million years ago, when humans and dogs last shared a common ancestor, an insectivore. Indeed, some think that brain areas for processing vocal sounds could be discovered in more species.
Still, when voice areas were first discovered in humans, they were thought to be special and somehow tied specifically to the evolution of language. “So what are they doing in dog brains?” Andics asks.  
The answer lies, he thinks, in what the scans also revealed: Striking similarities in how dog and human brains process emotionally laden sounds. Happy sounds, such as an infant’s giggle, made the primary auditory cortex of both species light up more than did unhappy sounds, such as a man’s harsh cough. “It shows that dogs and humans have similar brain mechanisms for processing the social meaning of sound,” Andics says, noting that other research has shown that dogs “respond to the way we say something rather than to what we say.” The similarity in auditory processing, he adds, “helps explain why vocal communication between the two species is so successful.”
But there were differences, too. The researchers discovered that in dogs, 48% of their auditory brain regions respond more strongly to environmental sounds, such as a car engine, than to voices. In humans, in contrast, a mere 3% of their sound-sensitive brain regions lit up more for the nonvocal sounds. “It shows how very strongly attuned the human auditory cortex is to vocal sounds,” Andics says. “In dogs, it’s more heterogeneous.”
Yet it is the similarity in how dogs and humans process the emotional information in voices that other researchers find most intriguing. “They’ve confirmed what any dog owner knows—that their pooches are sensitive to one’s tone of voice,” says John Marzluff, a wildlife biologist at the University of Washington, Seattle. Even more important, he adds, is that the study “confronts us with the realization that our wonderful brain is in many ways a product of our distant evolutionary past.”

Living, breathing running shoes.



If everything goes according to Shamees Aden's plan, you may one day never need to buy another pair of running shoes. The designer and biotech researcher unveiled her product concept, a collaboration with a University of Southern Denmark professor, at London's Wearable Futures conference. The shoes are 3D-printed using protocells -- molecules that are not alive but can be combined to create living organisms -- and conform to the wearer's foot like a second skin. After a run, they'd need basic care, like a houseplant, and they could also respond to pressure exerted by the wearer, inflating or deflating as needed to better cushion the foot. The technology needed to create a fully functioning prototype, however, is still about 20 years away.

A pen that lets you draw 3D stick figures.


The toy company WobbleWorks put up a Kickstarter page back in February to fund its 3Doodler pen, a fun device that allows doodles to expand beyond 2D surfaces. The goal was to raise $30,000 to launch their product. By the campaign's end, however, the company had garnered over $2.3 million in donations. The pen works by heating up and dispensing a thin plastic filament that cools down quickly to create hardened structures. It's like drawing with hot glue, and each one-foot stick of plastic can provide 11 feet worth of doodles. 3Doodlers are still not quite ready for mass production yet, but they are available for pre-order.

The phone you can take apart like Legos.


Like playing with LEGOs, a modular smartphone is a make-it-yourself device consisting of an endoskeleton base and modules that attach to create a custom phone. Don't care about having a camera? Swap it out for a larger battery. Want to update your display without getting an entirely new phone? You could do that, too.
Motorola has been collaborating on a mission to make these devices a reality with Dave Hakkens, creator of a similar initiative called Phoneblocks, since this past fall. Project Ara would result in less electronic waste with devices that last a lot longer. Leading 3D printer manufacturer 3D Systems also recently got involved in an effort to improve the phones' blocky aesthetics.


Monday, February 24, 2014

Will drones become the future of farming?



Pilotless drones have had the biggest impact on the battlefield, serving as eyes in the sky and even as attack aircraft. But could they find another role, helping farmers boost food production?

The popular image of drones is as expensive pieces of military hardware which can be used for spying – or even going on the offensive.
But that's changing as robots become more integrated with our everyday lives. Farming is one of the new frontiers, as food production has become more automated – everything from GPS-guided tractors to automated milking machines – and drones are starting to be incorporated into what's known as precision agriculture.
Robots are being used to survey crops and help farmers manage the water and chemicals they use in vast fields. Chris Anderson, the former editor-in-chief of Wired magazine, recently switched careers to move into drone manufacturing. He co-founded 3D Robotics, which is building drones in Mexico and the US which may one day keep a beady electronic eye on the food being grown for our tables.
BBC Future visited 3D Robotics' workshop in San Diego, California.

Spinning wind turbines spark clockwork lightning



The spinning blades of wind turbines don't just generate useful electricity. They also trigger bolts of lightning.
Little is known about these spontaneous bolts because turbines are hard to monitor in stormy conditions. But now Joan Montanyà of the Polytechnic University of Catalonia in Barcelona, Spain, and his colleagues have captured the first high-speed video of wind turbines firing lightning. They also used a lightning detection network to map how the bolts propagate in three dimensions.
The team found that, under certain atmospheric conditions, rotating turbine blades can produce a bolt each time one of the blades is at its highest position. Strikes often propagate up to 2 kilometres upwards.
Unexpectedly, the video revealed three turbines close together producing lightning at the same time. "It's surprising to see so much activity from turbines in a small area," says Montanyà.

Struck through the blade

Famously, in 2011 a UK wind turbine exploded in high winds that forced the blades to move against their brakes, generating enough friction to turn it into a fireball. But turbines are more vulnerable to lightning.
The frequent lightning bolts could be damaging turbine blades, which are made of carbon-reinforced plastic. They are designed to resist occasional electrical discharges, but regular bolts could degrade them faster. "Wind turbines are exposed to a lot of electrical discharges and can easily be stressed," says Montanyà.
Aircraft are also often struck by lightning. But they are better protected than turbines – their aluminium fuselage conducts electricity around them. This is not an option for turbine blades because the extra weight would reduce their efficiency.
Another reason that lightning bolts pose a bigger problem for wind turbines is that they are hard to inspect and repair. Their blades sit on a tall structure, making them difficult to access. By contrast, when an aircraft is struck by lightning, it is checked and repaired the next time it lands.
Montanyà hopes to find ways to cut the number of lightning bolts generated by turbines. The challenge is to find out why they happen so often. He suspects the spinning tip of the blade becomes charged through friction with the air. "Many YouTube videos have documented spark-ups from turbines but the effect responsible needs to be confirmed," says Montanyà.

Spider-drones weave high-rise structures out of cables



SPIDER-LIKE, the drone spools cable behind it as it zips between supports. It is weaving a structure high above where ordinary building equipment can easily reach.
This is construction as envisioned by roboticists and architects at the Swiss Federal Institute of Technology (ETH) in Zurich. As well as these web-like designs, the team is teaching drones to build towers from foam bricks.
"Flying machines have an unlimited workspace – they can go anywhere," says Federico Augugliaro, who is leading the robotics side of ETH's Aerial Construction project. "There is no physical connection with the ground, so they can move construction elements to any location, and fly in and around existing structures."
Each quadcopter drone is equipped with a spool of strong plastic cable that runs out behind it as it flies. One end of the cable can be secured by making several turns around a pole. The drones are positioned and directed autonomously from the ground by a central computer fitted with a camera that watches them as they fly. For example, to loop cables around each other, the computer directs two drones to fly through certain points at an exact time. In this way, the fleet can tie complicated knots and form large, regularly repeating patterns strung between fixed structures.
Augugliaro's team revealed the work this week at a robotics conference in Tokyo, Japan.
"Something possible would be a structure like a bridge or a connection between existing buildings," says Ammar Mirjan, Augugliaro's counterpart on the architectural side of the project. Mirjan is working with the roboticists to help make sure their work will be useful for architecture and construction. "If you had skyscrapers, you could connect them," he says.
The drones could make building much easier, says roboticist Koushil Sreenath at Carnegie Mellon University in Pittsburgh, Pennsylvania. "You just program the structure you want, press play and when you come back your structure is done," he says. "Our current construction is limited, but with aerial robots those limitations go away."
The ETH researchers are not the only group writing drones into the future of construction. At the MIT Media Lab in Cambridge, Massachusetts, Neri Oxman and her team are using robots suspended on cables to build structures. And at the University of Pennsylvania, the General Robotics Automation Sensing and Perception Lab is using drones with robotic clamps to build towers of magnetic blocks.

Termite robots build castles with no human help


A shoe-sized robot, shaped like a VW Beetle and built by a 3D printer, scuttles in circles on a Harvard lab bench. Its hooked wheels, good for climbing and grasping, also let it trundle on the flat. As I watch, it scoops a styrofoam block on to its back and then scrabbles across a layer of already deposited blocks to flip the new one into place. An impressive feat – especially given that it does this without human control, using simple rules about its environment to build a whole structure.
The robot is making a tower – like a termite might.
"If you want to build underwater, if you want to build a Mars base, it's going to be very difficult, dangerous and expensive to send people," says Justin Werfel of the Wyss Institute for Biologically Inspired Engineering at Harvard University. "But if you could send a team of robots to go build the habitat as the first step – that's the really long-term vision."
To spur his swarm into action, Werfel gives the robots a mathematical model of the structure to be built, say, a pyramid. Each robot uses that model to calculate where it will place the next block it picks up, moving on to another spot if its planned drop-off has already been completed by another bot. It uses nothing but basic ultrasound and infrared sensors, as well as an internal accelerometer, to figure out how many blocks it has climbed, and where it is in relation to the structure it is building.
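That decision rule is simple enough to sketch. The grid, helper names, and completion check below are all invented for illustration; the real TERMES robots also enforce structural constraints, such as only climbing single-block steps:

```python
# Shared target structure: True marks cells where a block belongs.
TARGET = [
    [True, True, True],
    [True, False, True],
]

def next_placement(built, target):
    """One robot's rule: scan the target model and claim the first unfilled
    cell. If another robot got there first, the scan simply moves on."""
    for r, row in enumerate(target):
        for c, wanted in enumerate(row):
            if wanted and not built[r][c]:
                built[r][c] = True
                return (r, c)
    return None  # structure complete

built = [[False] * 3 for _ in range(2)]
placements = []
# Simulate robots taking turns placing blocks until the structure is done.
while (cell := next_placement(built, TARGET)) is not None:
    placements.append(cell)
print(placements)
```

No robot needs a global plan or any communication with the others; the shared target model and local occupancy checks are enough to finish the structure.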
As well as pyramids, the bots can build castles and towers.

Build by numbers

Neri Oxman of the Media Lab at the Massachusetts Institute of Technology, who works on robotic architecture, says the concept is very promising. "This work promotes a truly decentralised construction system offering robust and customised designs," she says. "It paves the way for emergent design based on environmental sensing and represents an important step towards enabling the shift from swarm-based construction to swarm-based design."
The robots – part of Harvard's termite-inspired TERMES project – are never going to build anything outside an academic setting, but they show that controlling robots using simple, distributed rules does work. "You have a simple, cheap, expendable robot, and then you throw a bunch of them at a system and you don't care if they break," says Kirstin Petersen, who designed and built the robots. "You can get really far with simple robots."
Petersen and Werfel are not the only roboticists who see value in the swarms. The Laboratory of Intelligent Systems at the Swiss Federal Institute of Technology in Lausanne has developed a swarm of small aircraft that communicate, position themselves and find targets using nothing but sound, without any central control system. Instead, the Swiss researchers give each flying robot the ability to make decisions based on its local conditions. At Heriot-Watt University in Edinburgh, UK, roboticists are working on a swarm which can repair damaged coral reefs.
In the short term, Werfel says TERMES-like robots could be used to build levees out of sandbags, working through night and day without human help to build flood protections. "The scenarios where you want to use robots rather than humans are described by the three Ds – dirty, dangerous and dull," he says.
The group has already used a modified version of the TERMES robots to drag and stack bags of rice to make a wall.

Necklace projectors will throw emails onto the floor


Not ready to don a Google Glass headset? An alternative way to access smartphone content could hang around your neck.
A digital device disguised as a necklace or brooch could one day project email, tweets and text alerts onto nearby surfaces, allowing you to open them with hand gestures.
"The projector gives you a window into the virtual world that you carry around like a flashlight, as a way of serendipitously accessing information," says Christian Winkler at the University of Ulm in Germany. His team's Ambient Mobile Pervasive Display generates a green "SMS" graphic that is projected ahead of the user (see video above). To find out who the message is from, you hold your palm in front of you and the sender's name is projected onto it. To read the message, you make a gesture, such as a subtle swipe, and the text is displayed on your hand.
The idea is that most functions that normally require a screen can be performed in this way. So, to access your running distance from a fitness app, or a football score, you could customise gestures that will project the results on your hand. The team thinks the system will be especially useful in navigation, projecting arrows on the ground, and for location-aware adverts.

Getting focused

Switching from a projection on the ground to a hand image isn't easy because the projector has to alter its focal length instantaneously. But a Kinect-style 3D sensor solves the problem by calculating the distance to the ground and to the user's hand. The projector uses this information to refocus. The 3D sensor is also used to recognise gesture controls.
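Once the depth sensor supplies the subject distance, the refocusing step is ordinary lens arithmetic. A sketch using the thin-lens equation (the 10 mm focal length is an invented pico-projector figure, not from the Ulm prototype):

```python
def lens_to_panel_distance(focal_length_m, surface_distance_m):
    """Thin-lens equation, 1/f = 1/d_surface + 1/d_panel, solved for d_panel:
    how far the lens must sit from the projector's image panel to focus on a
    surface at the measured depth."""
    return 1.0 / (1.0 / focal_length_m - 1.0 / surface_distance_m)

f = 0.010  # hypothetical 10 mm projector lens

d_ground = lens_to_panel_distance(f, 2.0)  # projecting onto the ground, ~2 m
d_palm = lens_to_panel_distance(f, 0.4)    # snapping focus to a raised palm

# Refocusing between the two means shifting the lens by well under a millimetre.
shift_mm = (d_palm - d_ground) * 1000.0
print(f"{shift_mm:.2f} mm")
```

The shift is tiny, which is why a fast actuator driven by the depth reading can make the jump feel instantaneous.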
In a test system – which the team is confident can be scaled down – a laptop carried in a backpack controlled the projector and sent emails and Facebook updates to the device so that it could display them. A consumer version could be feasible in two years for indoor use and later for outdoor use, Winkler says, as LED-based projectors improve in brightness, use less power and shrink in size. New camera and smartphone models already come equipped with built-in projectors.
William Coggshall, an analyst based in Menlo Park, California, who has studied the future of projector technology, is cautious about how useful the proposed system will actually be. "The size and power of a device that could project onto a sidewalk in daylight may make it pretty clunky," he says. "And the hand is not a very flat or uniform screen so a message of any meaningful length might not be very legible."
The system will be presented at the annual computer-human interaction conference in Toronto, Canada, in April.