Sunday 26 April 2015

3D-printed model of London's buildings lets you see the future

It's a tabletop version of London colour-coded for architects. In this view of the interactive model of the city, unveiled yesterday at The Building Centre in London, red and yellow sections represent sightlines – restricted areas where construction cannot take place to preserve views of famous landmarks such as St Paul's Cathedral.
The 12.5-metre-long reconstruction, built by design company Pipers over five months, reproduces more than 85 square kilometres of the city. One-tenth of its 170,000 buildings were constructed recently enough to have existing digital versions that could be used to fabricate scaled-down copies with a 3D printer. Older buildings, however, had to be laser-cut or fashioned by hand, at a cost of about £250,000, before the city's landscape was assembled with the help of Ordnance Survey, the UK's mapping agency. The final model contains many elements that are accurate down to the nearest centimetre.
(Images: NLA/Paul Raftery; Pic 2: Nils Jorgensen/REX Shutterstock)
Not everything on the map exists in the actual city – at least not yet. Tall buildings for which planning permissions have been issued have been rendered bright white, to distinguish them from their existing, dove-grey neighbours. There are currently 263 of these future constructions.
The interactive model has been designed for visitors to explore how the city will adapt and change. A projection and lighting system controlled with a touchscreen can be used to focus on certain features, for example sightlines or historic events, by displaying digital animations across the surface. The Great Fire of 1666 can be depicted in one view, where the financial district in the centre of the capital appears to burn. Key facts about upcoming projects to reshape the city can also be brought up.

Tiny robots climb walls carrying more than 100 times their weight

Mighty things come in small packages. The little robots in this video can haul objects that weigh more than 100 times as much as they do.
The super-strong bots – built by mechanical engineers at Stanford University in California – will be presented next month at the International Conference on Robotics and Automation in Seattle, Washington.
The secret is in the adhesives on the robots' feet. Their design is inspired by geckos, which have climbing skills that are legendary in the animal kingdom. The adhesives are covered in minute rubber spikes that grip firmly onto the wall as the robot climbs. When pressure is applied, the spikes bend, increasing their surface area and thus their stickiness. When the robot picks its foot back up, the spikes straighten out again and detach easily.
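The load-dependent grip described above can be caricatured in a toy model: treat the fraction of spike tips in contact as rising with the applied shear load, so the available adhesion grows as the foot is loaded and vanishes when it is lifted. This sketch is purely illustrative – the functional form and the stiffness and adhesion_per_area values are invented, not measurements from the Stanford adhesives.

```python
import math

def contact_area(shear_load, max_area=1.0, stiffness=0.5):
    """Fraction of spike tips in contact, rising with applied shear load.

    At zero load the spikes stand upright and barely touch the wall;
    under load they bend flat and the contact fraction saturates at max_area.
    """
    return max_area * (1.0 - math.exp(-stiffness * shear_load))

def grip_force(shear_load, adhesion_per_area=50.0):
    """Available adhesive force (arbitrary units) at a given shear load."""
    return adhesion_per_area * contact_area(shear_load)

# Grip is near zero when the foot is unloaded, so lifting it off is easy;
# grip rises steeply as load is applied, so the climbing foot holds firm.
for load in [0.0, 1.0, 5.0, 10.0]:
    print(f"shear load {load:4.1f} -> grip {grip_force(load):5.1f}")
```

The key qualitative point matches the description in the text: the same foot is both sticky under load and easy to peel off when unloaded, because the spikes only engage when pressed.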
The bots also move in a style that is borrowed from biology. Like an inchworm, the robot advances one pad at a time: one pad scooches it forward while the other stays in place to support the heavy load. This gait helps the robot avoid falling when it misses a step, and lets it park without using up precious power.

Heavy lifting

All this adds up to robots with serious power. For example, one 9-gram bot can hoist more than a kilogram as it climbs. In this video it's carrying StickyBot, the Stanford lab's first ever robot gecko, built in 2006.
Another tiny climbing bot weighs just 20 milligrams but can carry 500 milligrams, a load about the size of a small paper clip. Engineer Elliot Hawkes built the bot under a microscope, using tweezers to put the parts together.
The most impressive feat of strength comes from a ground bot nicknamed μTug. Although it weighs just 12 grams, it can drag a weight that's 2000 times heavier – "the same as you pulling around a blue whale", explains David Christensen, who works in the same lab.
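The three strength figures quoted above can be checked with some quick arithmetic – the numbers below are taken directly from the text, with loads converted to grams:

```python
# Carrying ratios for the three robots described above (all masses in grams).
robots = {
    "9 g climber":   {"robot": 9.0,   "load": 1000.0},        # hoists "more than a kilogram"
    "20 mg climber": {"robot": 0.020, "load": 0.500},         # carries 500 milligrams
    "μTug":          {"robot": 12.0,  "load": 12.0 * 2000.0}, # drags 2000 times its weight
}

for name, m in robots.items():
    ratio = m["load"] / m["robot"]
    print(f"{name}: {m['load']:.3f} g load, ratio {ratio:.0f}x body weight")
```

The 9-gram climber's ratio works out to roughly 111 times its body weight, consistent with the "more than 100 times" claim in the headline, while μTug's 2000-fold ratio implies a dragged load of about 24 kilograms.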
In future, the team thinks that machines like these could be useful for hauling heavy things in factories or on construction sites. They could also be useful in emergencies: for example, one might carry a rope ladder up to a person trapped on a high floor in a burning building.
But for tasks like these, the engineers may have to start attaching their adhesives to robots that are even larger – and thus more powerful. "If you leave yourself a little more room, you can do some pretty amazing things," says Christensen.

Human cruise control app steers people on their way

Where are you going? (Image: Ingrid Rasmussen/Plainpicture)
Electrodes attached to the legs can guide people wherever you want them to go via an app. Welcome to the bizarre world of electro-stimulation.
For a few days last summer, a handful of students walked through a park behind the University of Hannover in Germany. Each walked solo, but followed the same route as the others: made the same turns, walked the same distance. This was odd, because none of them knew where they were going.
Instead, their steps were steered from a phone 10 paces behind them, which sent signals via Bluetooth to electrodes attached to their legs. These stimulated the students' muscles, guiding their steps without any conscious effort.
Max Pfeiffer of the University of Hannover was the driver. His project directs electrical current into the students' sartorius, the longest muscle in the human body, which runs from the inside of the knee to the top of the outer thigh. When it contracts, it pulls the leg out and away from the body. To steer his test subjects left, Pfeiffer would zap their left sartorius, opening their gait and guiding them in that direction.
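The steering principle described above – zap the left sartorius to open the gait leftwards, the right to go right – can be sketched as a simple controller. The function name, the deadband threshold and the heading-based interface below are hypothetical illustrations, not details of Pfeiffer's actual system, which was driven manually:

```python
def steering_command(current_heading, target_heading, deadband=5.0):
    """Return which sartorius to stimulate, given headings in degrees.

    The heading error is wrapped into [-180, 180) so the controller
    always turns the shorter way round; within the deadband no
    stimulation is applied and the walker continues straight.
    """
    error = (target_heading - current_heading + 180.0) % 360.0 - 180.0
    if error < -deadband:
        return "left"    # stimulate left sartorius: gait opens to the left
    if error > deadband:
        return "right"   # stimulate right sartorius: gait opens to the right
    return None          # on course: no stimulation

print(steering_command(90, 90))   # on course -> None
print(steering_command(90, 40))   # need to turn left -> "left"
```

A navigation app built on this idea would simply feed the bearing to the next waypoint in as target_heading, pulsing the electrodes whenever the walker drifts off course.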
Pfeiffer hopes his system will free people's minds up for other things as they navigate the world, allowing them to focus on their conversation or enjoy their surroundings. Tourists could keep their eyes on the sights while being imperceptibly guided around the city.
Acceptance may be the biggest problem, although it is possible that the rise of wearable computing might help. Pfeiffer says the electrode's current causes a tingling sensation that diminishes the more someone uses the system. Volunteers said they were comfortable with the system taking control of their leg muscles, but only if they felt they could take control back.
One of the students compared the feeling to cruise control in a car, where the driver can take control back when they want it. "Changes in direction happened subconsciously," said another.
Pfeiffer steered students manually, but the plan is to build the mechanism into other apps. Navigation apps, for instance, could steer people along their route automatically, meaning they never have to look at their phone or think about where they are going.
"When I use Google Maps and I navigate somewhere, I am always pulling my mobile out of my pocket to check," he says. "We want to remove this step out of the navigation process so you just say 'I want to go there', and you end up there."
The system could also be used to direct crowds, not just individuals. "Imagine visitors to a large sports stadium or theatre being guided to their place, or being evacuated from the stadium in the most efficient way in the case of an emergency," the team write in a paper that will be presented at the CHI conference in Seoul, South Korea, next week.
Evan Peck of Bucknell University in Pennsylvania says Pfeiffer's system will stop us being chained to our smartphones. "We're developing all this really wonderful tech and almost all of it demands our attention," he says. "We build a navigation system, but then we have to stare at it."
"Their goal is letting you use your attention on what you want to use it on," says Peck. "It's in your hands now."