Duke University melds two rats' thoughts over the internet; we're not sure Spock would approve

Some would say the internet already lets us share every minute detail of our thoughts, much to our followers' dismay. Duke University isn't deterred by our behavior -- if anything, it just took oversharing literally by connecting two rats' minds in an experiment, first in a lab and ultimately online. Electrodes attached to the brain of a host "encoder" rat in Brazil processed the motor-oriented mental activity for a desired behavior, such as pressing a lever on cue, and converted it into a signal that was then received by a "decoder" rat as far away as Duke's US campus. The majority of the time, the decoder rat performed the same action as the encoder. Researchers also found that rewarding the encoder alongside the decoder created a virtuous loop, as treating the first rat for a job well done focused its attention and improved the signal strength.
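
The study's actual decoding pipeline is far more involved than we can squeeze in here, but the gist -- boil the encoder's motor-cortex firing rate down to a one-bit cue and turn that into stimulation for the decoder -- looks roughly like the sketch below. The firing-rate threshold, lever labels and pulse counts are ours, not Duke's.

```python
# Minimal sketch of the encoder -> decoder relay described above.
# All numbers and labels here are illustrative; the real study uses
# spike sorting and intracortical microstimulation, not this toy code.

def encode(firing_rates, threshold=30.0):
    """Collapse the encoder rat's motor-cortex activity into a 1-bit cue.

    firing_rates: spikes/sec from the electrodes over the trial window.
    Returns 'LEFT' if mean activity exceeds the threshold, else 'RIGHT'.
    """
    mean_rate = sum(firing_rates) / len(firing_rates)
    return "LEFT" if mean_rate > threshold else "RIGHT"


def stimulate(cue):
    """Map the transmitted cue to a stimulation pattern for the decoder rat."""
    pulses = {"LEFT": 20, "RIGHT": 5}   # pulse counts are made up
    return f"deliver {pulses[cue]} microstimulation pulses"


if __name__ == "__main__":
    trial = [42.0, 31.5, 38.2, 27.9]    # pretend electrode readings
    cue = encode(trial)                  # 'LEFT' here
    print(cue, "->", stimulate(cue))
```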

We're not sure that Vulcans would endorse this kind of mind meld, though: apart from immediately depriving the decoder rat of self-control, prolonged testing led to the same rodent developing additional sympathetic reactions to the encoder. There are also concerns that the test was too binary and didn't reflect the complexity of the whole brain. All the same, Duke's study is proof enough that we can export brainwaves in a meaningful way.

3D? Feh. MIT has already moved on to 4D printing (video)

The bad news: just as much of the world is starting to get excited about the prospects of 3D printing, science is moving on to the world of 4D. The good news: in the future, you might not have to assemble that Ikea chair yourself. "4D printing" is the term scientists are using to refer to a technology that MIT's Skylar Tibbits talked up during a recent TED appearance. The fourth "D" here is time, referring to an object that, once printed, is capable of changing shape (over time, naturally).

"Essentially the printing is nothing new," Tibbits told the BBC. "It is about what happens after." So far the concept has been demonstrated with thin strands of plastic, which, once added to water, form into a predetermined shape, using energy from the absorption. Suggested future applications involve furniture, pipes, bikes and buildings. First, however, scientists will have to demonstrate the technology on a larger structure, of course, and they'll explore the possibility of other energy sources, like heat, sound and vibration.

Sony Ericsson Windows Phone prototype slides onto eBay, wants us to call her Julie

eBay is as close as it comes to a genuine Aladdin's cave, and we've seen plenty of ancient rarities, prototypes, sci-fi weaponry, and the odd killer robot go under its gavel. One of the latest artifacts of interest comes from eBay's Netherlands site, which is hosting an auction for a Windows Phone prototype slider known to her friends as Julie (or Jolie, depending on where you look in the listing) from the now-defunct Sony Ericsson partnership. The phone that never was, from the company that is no longer, is allegedly one of only seven units made, and is touted as having an 8-megapixel shooter and 16 gigs of storage. Some digging through the XDA Developers' forum suggests the handset's old Windows Phone 7 ROM is basically non-functional, so don't expect to plug in your SIM and stroll out the door with a usable device. If that doesn't put you off, however, there's no exorbitant entry price, and bids remain sensible for now. Head to the listing below for more pictures and to get in on the action, but bear in mind the only shipping options are for Europe. Nothing a PM with an outrageous offer won't rectify, surely.

Update: The seller has been in contact to let us know that international shipping is now available, and while the WP7 ROM running on the handset is by no means a final build, there are no issues with voice calling, the camera or Bluetooth.

Stretchable, serpentine lithium-ion battery works at three times its size

While we've seen more than a few flexible batteries in our day, they're not usually that great at withstanding tugs and pulls. A team-up between Northwestern University and the University of Illinois could give lithium-ion batteries that extreme elasticity with few of the drawbacks you'd expect. To make a stretchable battery that still maintains a typical density, researchers built electrode interconnects from serpentine metal wires that have even more wavy wires inside; the wires don't require much space in normal use, but will unfurl in an ordered sequence as they're pulled to their limits. The result is a prototype battery that can expand to three times its normal size, but can still last for eight to nine hours. It could also charge wirelessly, and thus would be wearable under the skin as well as over -- imagine fully powered implants where an external battery is impractical or unsightly. There's no word yet on whether there will be refined versions coming to real-world products, but we hope any developments arrive quickly enough to give stretchable electronics a viable power source.
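
If you're wondering how wavy wires translate into a 300 percent stretch, the back-of-the-envelope answer is arc length: a serpentine interconnect is much longer than the gap it spans, so the battery can elongate until the wire pulls taut. The toy calculation below uses a plain sine-wave serpentine with made-up dimensions, not Northwestern's nested "pop-up" geometry, so treat the numbers as illustrative only.

```python
import numpy as np

# Rough estimate of how much slack a serpentine interconnect hides.
# Model one interconnect as a sine wave y = A*sin(2*pi*x/P); it can be
# stretched until its straight span equals its arc length. Amplitude and
# period are arbitrary values chosen only to illustrate the idea.

def serpentine_stretch_ratio(amplitude, period, n=100_000):
    x = np.linspace(0.0, period, n)
    y = amplitude * np.sin(2 * np.pi * x / period)
    arc_length = np.sum(np.hypot(np.diff(x), np.diff(y)))
    return arc_length / period   # taut length divided by resting span

print(serpentine_stretch_ratio(amplitude=0.65, period=1.0))   # roughly 2.9x
print(serpentine_stretch_ratio(amplitude=0.30, period=1.0))   # roughly 1.6x
```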

Researchers use digital holography to help firefighters see through flames (video)

Firefighters already use infrared cameras to find people in burning buildings, but the technology can't distinguish between a person's heat and that of the surrounding fire. That's because a zoom lens is needed to concentrate the infrared rays in a way that enables the apparatus to form a human-readable image. Fortunately, a team of researchers from the Italian Institute of Optics has developed a system that ditches the lens in favor of digital holography that produces detailed 3D images in the darkness. The hardware isn't out of short trousers just yet, but the team is planning to develop a portable version for field work -- and chief Pietro Ferraro hopes that the idea will be co-opted by the aerospace and biomedical industries, too. Curious to see what all the fuss is about? Head on past the break for a video.
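
Digital holography, in a nutshell, records an interference pattern and refocuses it in software instead of through a lens. We don't have the institute's actual processing chain, but a textbook angular-spectrum reconstruction -- the kind of math such a system leans on -- looks something like this, with placeholder values for wavelength, pixel pitch and refocus distance:

```python
import numpy as np

# Generic angular-spectrum reconstruction of a recorded hologram.
# This is a textbook illustration, not the institute's pipeline; the
# wavelength, pixel pitch and refocus distance below are placeholders.

def reconstruct(hologram, wavelength, pixel_pitch, distance):
    """Numerically refocus a recorded pattern at a given distance."""
    ny, nx = hologram.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Transfer function of free-space propagation (evanescent terms clipped).
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi * distance / wavelength * np.sqrt(np.clip(arg, 0, None)))
    field = np.fft.ifft2(np.fft.fft2(hologram) * H)
    return np.abs(field)   # intensity-like image at the refocused plane

if __name__ == "__main__":
    fake_hologram = np.random.rand(256, 256)          # stand-in for sensor data
    img = reconstruct(fake_hologram, wavelength=10.6e-6,   # far-infrared band
                      pixel_pitch=20e-6, distance=0.5)
    print(img.shape, round(float(img.mean()), 3))
```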

DARPA trying again to develop a high-speed VTOL aircraft

If at first your unmanned aerial vehicles don't succeed... try, try again? After a series of unsuccessful tests with the Boeing X-50 Dragonfly and Groen Heliplane, the US government is once again trying to develop a high-speed, vertical takeoff-and-landing (VTOL) aircraft. DARPA just announced the VTOL X-Plane program, a 52-month, $130 million project with one mission: to build an aircraft that can exceed 300 knots, achieve a hover efficiency of 75 percent or better, and hit a cruise lift-to-drag ratio of 10 or more.

In layman's terms, such an aircraft would be faster than a traditional helicopter, but still have better hover efficiency than a modern high-speed 'copter. Sounds like a sensible idea, right? The thing is, DARPA doesn't know yet how such a thing would look: for now, the agency is merely soliciting proposals, with a particular emphasis on smaller, non-traditional companies nimble enough to develop products quickly. So if you've got any good ideas, may as well head on over to the source link, we guess, and try your luck.
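
For the spreadsheet-inclined, those three published targets boil down to a quick screening check like the one below; the candidate numbers are invented for illustration, not real DARPA submissions.

```python
# The three VTOL X-Plane targets quoted above, as a simple screening check.
# The example candidates' numbers are made up for illustration.

TARGETS = {
    "top_speed_knots": 300,      # must exceed
    "hover_efficiency": 0.75,    # must meet or beat
    "cruise_lift_to_drag": 10,   # must meet or beat
}

def meets_targets(top_speed_knots, hover_efficiency, cruise_lift_to_drag):
    return (top_speed_knots > TARGETS["top_speed_knots"]
            and hover_efficiency >= TARGETS["hover_efficiency"]
            and cruise_lift_to_drag >= TARGETS["cruise_lift_to_drag"])

# A conventional helicopter-like concept: hovers well, too slow and draggy.
print(meets_targets(top_speed_knots=170, hover_efficiency=0.80,
                    cruise_lift_to_drag=4.5))    # False
# A hypothetical tilt-rotor-style concept that clears all three bars.
print(meets_targets(top_speed_knots=330, hover_efficiency=0.76,
                    cruise_lift_to_drag=11.0))   # True
```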

MIT algorithm teaches robots to think outside of the mechanical box (video)

Although robots are getting better at adapting to the real world, they still tend to tackle challenges with a fixed set of alternatives that can quickly become impractical as objects (and more advanced robots) complicate the situation. Two MIT students, Jennifer Barry and Annie Holladay, have developed fresh algorithms that could help robot arms improvise. Barry's method tells the robot about an object's nature, focusing its attention on the most effective interactions -- sliding a plate until it's more easily picked up, for example. Holladay, meanwhile, turns collision detection on its head to funnel an object into place, such as balancing a delicate object with a free arm before setting that object down. Although both approaches currently require plugging in pre-existing object data, their creators ultimately want more flexible code that determines those qualities on the spot and reacts accordingly. Long-term development could nudge us closer to robots with truly general-purpose code -- a welcome relief from the one-track minds the machines often have today.
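
Neither algorithm is spelled out above, but the flavor of Barry's object-aware planning -- fall back to a repositioning move like sliding when a straight grasp won't work -- can be sketched as a tiny decision routine. The object properties and action names below are our own stand-ins, not MIT's code.

```python
# Toy illustration of object-aware manipulation planning: if a direct grasp
# isn't feasible, insert a repositioning action (slide to the table edge)
# first. The object fields and actions are hypothetical, not MIT's API.

from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    graspable_in_place: bool   # can the gripper reach around it where it sits?
    slidable: bool             # is sliding along the surface allowed?

def plan(obj: Obj):
    if obj.graspable_in_place:
        return [f"grasp {obj.name}"]
    if obj.slidable:
        # Reposition first so the gripper can get underneath the object.
        return [f"slide {obj.name} to table edge", f"grasp {obj.name}"]
    return [f"ask for help with {obj.name}"]

print(plan(Obj("mug", graspable_in_place=True, slidable=True)))
print(plan(Obj("plate", graspable_in_place=False, slidable=True)))
print(plan(Obj("glued block", graspable_in_place=False, slidable=False)))
```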

Nexus One launched into space on CubeSat, becomes first PhoneSat in orbit (video)

Google's Nexus One has dreamt of space travel for a while now, but on Monday it was finally launched into orbit aboard a CubeSat dubbed STRaND-1, which was developed by Surrey Satellite Technology and the University of Surrey's Surrey Space Centre. STRaND-1 now holds the honor of being the first PhoneSat, and the first UK CubeSat, to make it into orbit. Alongside the HTC-made handset are an attitude and orbit control system, two propulsion setups and a Linux-based computer with a "high-speed" processor. After the Tux-friendly rig conducts a battery of tests, it'll relinquish control of many of the satellite's functions to the smartphone, which still runs Android.

Not only will the mission test how commercial, off-the-shelf tech can survive in the vacuum of space and conduct experiments, but it'll squeeze in some fun courtesy of apps developed by winners of a competition held last year. An app called 360 will let folks back on terra firma request their own snapshots of Earth taken with the phone's shooter and pin them to a map. Ridley Scott might like to say no one can hear you scream in space, but another application loaded onto the device will put that to the test by playing user-submitted shrieks and recording them with the handset's microphone as they play back. Hit the break for more details and a brief video overview of the satellite, or jab the more coverage links to partake in the app shenanigans.

Researchers craft transparent, flexible image sensor that could turn any screen into a camera

CCD sensors have long ruled the digital imaging roost, but a team of researchers at Johannes Kepler University in Linz, Austria has concocted flat, flexible and transparent image sensors that could eventually change things up. Made from a flexible polymer film suffused with fluorescent particles, the prototypes catch only a specific wavelength of light and shoot it to an array of sensors that surround the sheet's edge. At that point, the rig calculates where light entered the polymer by measuring how much it has diminished during its travel time, and then composes an image from that data. It's said the process is similar to how a CT scan functions, but uses visible light instead of X-rays. Not only is the membrane relatively inexpensive and potentially disposable, but the solution is a world first, to boot. "To our knowledge, we are the first to present an image sensor that is fully transparent -- no integrated microstructures, such as circuits -- and is flexible and scalable at the same time," said Oliver Bimber, co-author of the group's paper.
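
The core measurement is surprisingly simple: the fluorescent glow fades predictably as it travels through the polymer, so the ratio of intensities reaching two opposite edge sensors reveals where along that line the light entered. Here's a one-dimensional toy version of that trick -- the attenuation coefficient is a made-up number, and the real device solves a full, CT-style 2D reconstruction.

```python
import math

# One-dimensional toy of the edge-sensor trick: light entering the film at
# position x is attenuated exponentially on its way to sensors at both ends,
# so the intensity ratio reveals x. The attenuation coefficient is invented.

MU = 2.3      # attenuation coefficient, 1/metre (hypothetical)
L = 0.10      # film width between the two edge sensors, metres

def sensor_readings(x, i0=1.0):
    """Intensities reaching the left (x=0) and right (x=L) edge sensors."""
    return i0 * math.exp(-MU * x), i0 * math.exp(-MU * (L - x))

def locate(i_left, i_right):
    """Recover the entry position from the two readings."""
    return 0.5 * (L - math.log(i_left / i_right) / MU)

left, right = sensor_readings(x=0.031)
print(round(locate(left, right), 3))   # recovers ~0.031
```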

As of now, the setup only snaps black and white images with a resolution of 32 x 32 pixels, but there are plans to boost its fidelity by leveraging higher quality photodiodes (or even composite photos). Also, color photographs could be achieved by using several sheets that capture different hues of light. So, what's this all mean for practical applications? Researchers believe its prime use lies in layering the film on TV screens and other displays to offer gesture controls without pesky, additional cameras. In addition, objects can be imbued with sensor capabilities if wrapped with the layer, and even CCDs could benefit from having a slice of the polymer slapped on them to take photos at different exposures. Hit the second source link for the scientific nitty-gritty, or head past the break for a glimpse at the setup's photos.

Alt-week 2.24.13

Alt-week peels back the covers on some of the more curious sci-tech stories from the last seven days.

The discovery of what is hoped to be the Higgs boson was an exciting time for anyone with a curious mind. It turns out that the price of knowledge is often a heavy one. Without putting too much of a negative spin on it... that teeny-weeny boson could predict bad news. On a lighter -- or is that darker -- note, other areas of science and technology bravely march ever-onward with the goal of a better understanding of life, the universe, and tattoos. This is alt-week.

SpiderSense suit lets you know when danger is near

Know that feeling when someone wanders too far into your personal space? The University of Illinois' Victor Mateevitsi does, which is why he's built a suit that does the job to a far greater degree of accuracy. SpiderSense is a onesie that uses a series of microphones to send and receive ultrasonic signals from the space around you, like high-frequency radar. When the outfit senses something approaching, a robotic arm corresponding to the microphone exerts pressure on your skin, pointing you in the direction of the danger. Mateevitsi tested the gear by blindfolding researchers and asking them to throw a cardboard ninja star whenever (and wherever) they sensed a threat -- with positive results 95 percent of the time. SpiderSense will get its first public showing at Stuttgart's Augmented Human conference in March, and it's hoped that the hardware will eventually help blind people get around more easily.
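
The sensing loop is basically sonar: time an ultrasonic ping's round trip, turn that into a distance, and press harder on the wearer's skin the closer the obstacle gets. A bare-bones version of that mapping is below -- the range limit and pressure scale are our inventions, not Mateevitsi's numbers.

```python
# Bare-bones version of the SpiderSense loop: ultrasonic time-of-flight to
# distance, then distance to actuator pressure. All numbers are invented.

SPEED_OF_SOUND = 343.0    # m/s in air at roughly 20 C
MAX_RANGE = 2.0           # ignore echoes beyond this distance, metres

def echo_to_distance(round_trip_seconds):
    """Convert an ultrasonic echo's round-trip time to one-way distance."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

def pressure_command(distance_m, max_pressure=1.0):
    """Push harder the closer the obstacle: zero beyond range, max at contact."""
    if distance_m >= MAX_RANGE:
        return 0.0
    return max_pressure * (1.0 - distance_m / MAX_RANGE)

for t in (0.5e-3, 2.0e-3, 12.0e-3):          # example echo times, seconds
    d = echo_to_distance(t)
    print(f"{d:.2f} m -> pressure {pressure_command(d):.2f}")
```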

[Image Credit: Lance Long]

Microchip implant lets blind patients see light, skip the glasses

An eye-implanted chip from Retina Implant has restored patients' ability to discern light during its latest trial, according to German researchers. The device works in a similar fashion to the newly FDA-approved Argus II retinal prosthesis to return limited vision in patients with photoreceptor cell diseases like retinitis pigmentosa. Unlike that system, however, light is picked up via 1,500 pixels on a retinal implant instead of an eyeglass-mounted camera. The signal is boosted by a coil implanted in the skin behind the ear and sent back to so-called bipolar cells still active on the retina, which in turn send an image to the brain through regular neural circuits. A small battery mounted behind the ear -- the only external sign of the device -- contains controls for brightness and contrast. The recent trial let eight out of nine patients see to varying degrees, with three in the study even able to read letters and see the faces of family members. Given that the Argus II finally crossed the FDA's bionic eye barrier, hopefully we won't have to wait nearly as long for research like this to become a product.
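
Those brightness and contrast controls amount to a simple remap of the 1,500-pixel signal before it drives the electrodes. The snippet below is generic, textbook image math standing in for that adjustment -- it's not Retina Implant's firmware, and the 39 x 39 grid only approximates the chip's pixel count.

```python
import numpy as np

# Generic brightness/contrast remap of a small pixel array, standing in for
# the external control unit's two knobs. This is textbook image math, not
# Retina Implant's firmware; the 39x39 grid only approximates 1,500 pixels.

def adjust(pixels, brightness=0.0, contrast=1.0):
    """Scale around mid-grey (contrast), then shift (brightness), clamp to [0, 1]."""
    return np.clip((pixels - 0.5) * contrast + 0.5 + brightness, 0.0, 1.0)

frame = np.random.rand(39, 39)               # stand-in for the implant's output
boosted = adjust(frame, brightness=0.1, contrast=1.5)
print(frame.mean().round(2), boosted.mean().round(2))
```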

Cornell scientists 3D print ears with help from rat tails and cow ears

Science! A team of bioengineers and physicians over at Cornell University recently detailed their work to 3D print lifelike ears that may be used to treat birth defects like microtia and assist those who have lost or damaged an ear due to an accident or cancer. The product, which is "practically identical to the human ear," according to the school, was created using 3D printing and gels made from living cells -- collagen was gathered from rat tails and cartilage cells were taken from cows' ears. The whole process is quite quick, according to associate professor Lawrence Bonassar, who co-authored the report on the matter:

"It takes half a day to design the mold, a day or so to print it, 30 minutes to inject the gel, and we can remove the ear 15 minutes later. We trim the ear and then let it culture for several days in nourishing cell culture media before it is implanted."

The team is looking to implant the first ear in around three years, if all goes well.

NASA's Kepler telescope spots smallest planet to date, no aliens

NASA's Kepler telescope is permanently on the lookout for celestial objects of interest, and its latest discovery is a small one. A small planet, to be exact -- in fact, the smallest it's encountered during its search. Kepler-37b is a tad larger than our heavenly dance partner, the Moon, and whizzes round a star much like our Sun, with two larger planets in its system for company. NASA's issuing back pats all round, as finding Kepler-37b has highlighted "the precision of the Kepler instrument" (although admittedly, the star's behavior was favorable), and suggests there are many more humble worlds of similar size awaiting our detection. It's unlikely any aliens call Kepler-37b home: it's thought to be rocky, with no atmosphere, and it hugs its sun in a 13-day orbit, meaning its surface temperature is terribly high. Still, it's an achievement for Kepler, no doubt, but what we really want it to find is a planet home to beings who can explain the plot-line of Prometheus. We're still a little confused.
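
That 13-day orbit is what makes the place so toasty: plug the period into Kepler's third law and, assuming a roughly Sun-like star, the planet sits only about a tenth of the Earth-Sun distance from its host. Treating Kepler-37 as exactly one solar mass keeps this a rough estimate -- the star is actually a touch smaller than the Sun.

```python
import math

# Rough orbital radius for a 13-day period around a Sun-like star, via
# Kepler's third law: a^3 = G*M * P^2 / (4*pi^2). Treating Kepler-37 as
# exactly one solar mass is an approximation; the star is a bit smaller.

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
AU = 1.496e11            # metres

P = 13 * 86400.0         # orbital period, seconds
a = (G * M_SUN * P**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
print(round(a / AU, 2), "AU")   # about 0.11 AU, versus 1.0 AU for Earth
```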

MIT imaging chip blends photos with and without flash, keeps detail in noise reduction

Mobile image processing in itself isn't special when even high dynamic range shooting is virtually instant, at least with NVIDIA's new Tegras. A new low-power MIT chip, however, may prove its worth by being a jack of all trades that works faster than software. It can apply HDR to photos and videos through near-immediate exposure bracketing, but it can also produce natural-looking flash images by combining the lit photo with an unassisted shot to fill in missing detail. Researchers further claim to have automatic noise reduction that safeguards detail through bilateral filtering, an established technique that uses brightness detection to avoid blurring edges. If you're wondering whether or not MIT's work will venture beyond the labs, don't -- the project was financed by contract manufacturing giant Foxconn, and it's already catching the eye of Microsoft Research. As long as Foxconn maintains interest through to production, pristine mobile photography won't be limited to a handful of devices.
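
MIT hasn't published the chip's internals here, but the flash/no-flash trick it bakes into silicon has a well-known software analogue: denoise the dim ambient shot with a bilateral filter and re-inject fine detail from the flash exposure. The grayscale sketch below, built on OpenCV's bilateral filter with arbitrary settings and synthetic test images, shows the general recipe rather than the chip's actual pipeline.

```python
import cv2
import numpy as np

# Software analogue of the flash/no-flash blend described above (a classic
# "detail transfer" recipe), not the MIT chip's actual pipeline. Filter
# settings and the synthetic test images are placeholders.

def blend_flash_no_flash(ambient_gray, flash_gray, eps=0.02):
    """Keep the ambient shot's lighting, borrow fine detail from the flash shot."""
    ambient = ambient_gray.astype(np.float32) / 255.0
    flash = flash_gray.astype(np.float32) / 255.0
    # Bilateral filtering smooths noise while preserving edges.
    ambient_base = cv2.bilateralFilter(ambient, d=9, sigmaColor=0.1, sigmaSpace=7)
    flash_base = cv2.bilateralFilter(flash, d=9, sigmaColor=0.1, sigmaSpace=7)
    detail = (flash + eps) / (flash_base + eps)   # high-frequency detail layer
    return np.clip(ambient_base * detail, 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = np.tile(np.linspace(0, 255, 128, dtype=np.float32), (128, 1))
    ambient = np.clip(scene * 0.4 + rng.normal(0, 20, scene.shape), 0, 255).astype(np.uint8)  # dim, noisy
    flash = np.clip(scene + rng.normal(0, 2, scene.shape), 0, 255).astype(np.uint8)           # bright, clean
    out = blend_flash_no_flash(ambient, flash)
    print(out.shape, round(float(out.min()), 3), round(float(out.max()), 3))
```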
