Researchers from Cornell’s Creative Machines Lab wondered what would happen if two chatbots engaged in conversation with one another. The result: it is awkkkkwaarrddd. I won’t spoil the topics that do come up, so go on and watch the bizarre back-and-forth above.
So I just got back from the midnight screening of Inception and I’ve been trying to come up with the best words to describe it. It’s a brilliant, well-thought-out, and visually impressive film that will surely make you think. It’s better if you go into the film knowing as little plot detail as possible, so I will keep it at that for now. From all of the trailers released you must know that the movie focuses on dreams and dream states. Director Christopher Nolan (The Dark Knight) conducted intensive research during the making of the film to make its foundational theories believably sound. The video above is a brief albeit stimulating glimpse into Nolan’s research. Note: There are no spoilers contained in this video, just talk about dreams, REM sleep, and so on. The really good mind-bendy stuff comes in at 3:35. (And no, that is not Sarah Palin, contrary to popular belief.)
Last week a six-man crew (composed of Russian, French, Italian, and Chinese engineers, doctors, and one astronaut) embarked on a simulated journey to Mars. The team will be locked up in a contained space for 520 days to “simulate the psychological stress of a real space mission” to Mars. The simulation includes a 250-day flight to Mars, a 30-day exploration period, and a 240-day return to planet Earth. Though the crew will not experience weightlessness, the journey will be as close to the real thing as possible. In fact, crew members will only shower once every ten days, and communications with the “outside world” will be delayed by up to forty minutes. All in the name of research! Watch the video above to learn more about the mission to Mars in Moscow. Godspeed, friends!
British scientist Dr. Mark Gasson of Reading University has implanted an RFID chip inside his hand. Today RFID chips are commonly used to tag animals and keep track of them, but they also have some uses when implanted in humans. The chip sitting inside Gasson’s hand gives him secure access to the University and acts like an internal fingerprint reader for his cell phone (the phone recognizes the chip and only allows him to access its features). But with such advances in technology, says Gasson, also come risks. And here’s the kicker: Gasson has infected the RFID chip in his hand with a computer virus. The risk? After much research, Gasson concludes that a computer virus can spread and infect other medical implants inside the body, such as pacemakers. He goes on to propose that such an “infection” could potentially spread to other people! Yikes–so is this how the zombie infestation breaks out? Listen to Dr. Gasson explain his groundbreaking research in the video above.
Researchers at the University of Tokyo have created an advanced motion-tracking camera designed to attach to a mobile device like a cell phone. It allows for “in-air typing”: Project Natal-style gestures that type out text messages or emails by detecting finger movement. The demo above shows how something like this could replace both physical and virtual (touch) keyboards on mobile devices.
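For the curious, the core idea of “in-air typing” can be boiled down to mapping a tracked fingertip position onto a virtual key grid. The Tokyo team hasn’t published their pipeline here, so everything below (the coordinate system, key layout, and cell size) is an illustrative assumption, not their actual method:

```python
# Illustrative sketch of in-air typing: map a tracked fingertip
# position to a key on a virtual keyboard grid. The key layout,
# cell size, and coordinates are assumptions for the sake of the
# example, not details of the actual research prototype.

VIRTUAL_KEYS = [
    ["q", "w", "e", "r", "t", "y", "u", "i", "o", "p"],
    ["a", "s", "d", "f", "g", "h", "j", "k", "l", ";"],
    ["z", "x", "c", "v", "b", "n", "m", ",", ".", "/"],
]
KEY_SIZE = 20.0  # mm per key cell in the imagined tracking plane

def fingertip_to_key(x_mm, y_mm):
    """Return the key under a fingertip at (x_mm, y_mm), or None if off-grid."""
    col = int(x_mm // KEY_SIZE)
    row = int(y_mm // KEY_SIZE)
    if 0 <= row < len(VIRTUAL_KEYS) and 0 <= col < len(VIRTUAL_KEYS[row]):
        return VIRTUAL_KEYS[row][col]
    return None

# A downward "tap" gesture would trigger the lookup; here we just
# simulate three tracked fingertip positions.
taps = [(45.0, 5.0), (75.0, 25.0), (130.0, 45.0)]
print("".join(fingertip_to_key(x, y) or "?" for x, y in taps))
```

The real system would also have to decide *when* a keypress happens (e.g., detecting a tapping motion along the depth axis), which is the genuinely hard part the camera’s finger tracking is meant to solve.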
A bunch of bright minds at the Hasso Plattner Institute in Germany have been working on this research project they call “multi-toe interaction.” Basically it’s a multitouch floor that can recognize a person based on their shoe pattern. But I’ll let the masterminds explain:
The key factor of the shown design is that it is based on frustrated total internal reflection sensing. FTIR allows it to identify and track users based on their sole patterns. The floor recognizes foot postures, distinguishes users who interact from people walking by, and enables high-precision interaction. In addition, the floor can approximate users’ head positions based on the pressure profile in the soles and it extracts enough details from soles to allow users to play first person shooters by balancing their feet.
So precise! Not so sure if this can ever be practically implemented, but it’s always good to see unique implementations of a multitouch interface. Even if it involves stinky feet.
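To make the “recognize users by their soles” idea a little more concrete: at its simplest, identification means comparing a fresh sole contact image against enrolled templates and picking the best match. The institute’s actual FTIR imaging and matching are far more sophisticated; this toy nearest-neighbor comparison over binary pressure grids is purely illustrative:

```python
# Toy sketch of sole-pattern identification: compare an observed
# binary "pressure image" against enrolled templates and return the
# closest match. The real FTIR floor extracts far richer features;
# the metric and data here are illustrative assumptions only.

def similarity(a, b):
    """Fraction of cells where two equally-sized binary pressure images agree."""
    matches = sum(1 for ra, rb in zip(a, b)
                    for ca, cb in zip(ra, rb) if ca == cb)
    return matches / (len(a) * len(a[0]))

def identify(observed, templates):
    """Return the enrolled user whose sole template best matches `observed`."""
    return max(templates, key=lambda user: similarity(observed, templates[user]))

# Tiny 4x4 "sole prints" for two enrolled users (1 = pressure).
templates = {
    "alice": [[1,1,0,0],[1,1,0,0],[0,0,1,1],[0,0,1,1]],
    "bob":   [[0,1,1,0],[0,1,1,0],[0,1,1,0],[0,1,1,0]],
}
observed = [[1,1,0,0],[1,0,0,0],[0,0,1,1],[0,0,1,1]]  # a noisy reading
print(identify(observed, templates))
```

The interesting engineering problem the researchers describe goes well beyond this: distinguishing deliberate interaction from people merely walking by, and even inferring head position from the pressure profile across the sole.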
Microsoft’s Surface table is fairly large and very expensive, two factors that don’t mesh well with the general consuming public. Microsoft gets that, so they’ve gone ahead and created a prototype version of their multitouch table called Mobile Surface. Like its older brother, Mobile Surface uses a projector/camera combo that allows you to interact with on-screen images. The difference here is that the image can be projected onto any surface (making it portable) and it allows for in-air manipulation. For example, as seen in the video above, you can play the drums without physically touching the tabletop. Mobile Surface links up to a secondary device, like a cell phone or laptop, to indicate what you’re interacting with. Pretty neat if you ask me. Currently Mobile Surface is a Microsoft Research project, and Microsoft did not comment on a potential mainstream release.
A team of robotics researchers led by Radu Horaud began a project this year called “Humanoids with Auditory and Visual Abilities in Populated Spaces” (HUMAVIPS). The goal? To give robots “social skills” by mimicking the “cocktail party effect,” which is “the human ability to focus attention on just one person in the midst of other people, voices and background noise.” Natural interaction between humans and robots, huh? If you were ever on the fence about Judgement Day being upon us…now what say you?
PhD student Fabian Hemmert asks, “How can we make digital content graspable for us, for humans?” How do we get from the physical to the digital? He introduces three conceptual cell phone models (“mobile-shaped phone boxes,” really) that could theoretically help bridge this gap that has yet to be conquered. The “weight-shifting mobile” uses an iron weight that moves around, giving the user a sense of physical mass. For example, when you’re exploring your way through a city using Google Maps, the weight can point you in the right direction as it shifts the center of gravity inside the phone, making you feel which way to turn. Next up is the “shape-changing mobile.” Say you have a wide collection of eBooks on your device. If you’re reading a short twenty-page story, the device will make itself thin; if you’re reading a lengthy Harry Potter novel, the device will expand, making the experience more realistic, as if you were holding a thick book. It can also change its shape to customize the grasp, have the device lean toward you when in use, etc. The “living mobile phone” is downright strange: this model has a “breath” and a “heartbeat.” It’s meant to make the device seem organic in your pocket. In standby mode, the phone “breathes” up and down; when you receive an alert, its “heart rate” increases and the up-and-down motions speed up. To calm it down? “Pat it behind the ears.”
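The weight-shifting navigation idea is simple enough to sketch: given the bearing toward the next turn, shift the internal weight toward that side so the phone’s center of gravity nudges your hand. The actuator, its travel range, and the coordinate convention below are all assumptions for illustration; Hemmert’s prototype details aren’t specified here:

```python
import math

# Illustrative sketch of the "weight-shifting mobile" navigation idea:
# given the bearing toward the next turn, relative to the direction the
# phone is facing, move the internal weight toward that side. The
# actuator travel range and coordinate convention are assumptions.

WEIGHT_TRAVEL_MM = 15.0  # how far the imagined weight can move off-center

def weight_offset(bearing_deg):
    """Return an (x, y) offset in mm for the internal weight.

    bearing_deg: 0 = straight ahead, 90 = turn right, -90 = turn left.
    x is the left/right axis, y is the forward/back axis.
    """
    rad = math.radians(bearing_deg)
    return (WEIGHT_TRAVEL_MM * math.sin(rad), WEIGHT_TRAVEL_MM * math.cos(rad))

print(weight_offset(90))   # weight fully to the right: "turn right"
print(weight_offset(0))    # weight pushed forward: "keep going straight"
```

The appeal of the concept is that the cue is entirely non-visual: you could keep the phone in your pocket and still feel which way to turn.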
Hemmert’s ideas are innovative and interesting, though I feel the implementation of such technologies will remain far-out research assignments for the foreseeable future. He ends the demonstration with this postulation: it’s not that humans should get more technical in the future, but rather that technology should become a bit more human. Intriguing, intellectual, and plain ol’ creepy if you ask me.
Out of Microsoft and the University of California, Berkeley comes the Pictionaire touchscreen table. It uses integrated overhead cameras to capture physical objects and convert them into digital ones that can be manipulated. The Microsoft Surface-esque table is almost six feet long, begging for an intuitive collaborative experience. Creating digital copies of “physical artifacts” couldn’t be simpler. You place an object (say, a notebook with doodles) onto the surface, the table recognizes its presence, the overhead camera snaps a picture of it, and all you have to do is drag and drop the corner edge of the object to a new area on the table. Voila–now you have a digital copy of the notebook that can be manipulated in a variety of ways. You can drag the digital copy onto paper-sized whiteboards and use a marker to add annotations. The surface can even act as a light table to create hand-drawn copies of things. Wireless keyboards and mice are supported for text input and image search. Though Pictionaire remains a research project, it’s good to see Microsoft and company working toward a more practical experience with the Microsoft Surface table.
Russell Turnbull lost the eyesight in his right eye when a squirt of ammonia shot into it as he intervened in a fight between two men. The chemical caused significant damage to his right eye, and he was diagnosed with a condition called Limbal Stem Cell Deficiency (LSCD). LSCD is very painful and requires many trips to the hospital for care. Because he had this rare condition, doctors used him as a guinea pig in stem cell trials. Fifteen years after the frightful incident, Turnbull was miraculously cured of his blindness thanks to the stem cells. The treatment involved taking small samples of stem cells from his healthy eye’s cornea, growing them in the lab, and implanting them into his damaged eye. Only eight weeks after the operation, Turnbull regained full eyesight in his right eye. He remains “one of eight patients with impaired vision who have been treated successfully by surgeons at the North East England Stem Cell Institute.”
Though this treatment surely won’t cure all blind people, it is a huge step in the right direction for science and health. Stem cell research is a controversial topic in today’s heated political and religious debates. No matter what side you stand on, you cannot deny that Turnbull’s miraculous story and the advances in science and technology behind it point to a brighter future where incurable diseases become curable ones.
Imagine a world in which you can drink as much alcohol as you like without getting out-of-control drunk and without worrying about feeling hungover the next morning. A new synthetic alcohol is being developed by a research team at Imperial College London, led by Professor David Nutt. The alcohol, which is being developed from chemicals found in Valium, serves the same function as today’s alcohol (inducing relaxation, well-being, etc.) except that it has NO effect on other parts of the brain, thus eliminating potentially dangerous side effects such as mood swings and addiction. In addition, the researchers claim that a simple antidote can quickly flush out your system, removing the possibility of feeling hungover.
You can drink as much alcohol as you want. You reach a pleasant state of drunkenness and, no matter how much more you drink, you never leave that state. Now it’s time to drive home; pop a pill and you’re sober as a judge. Sounds great, right? According to the researchers, since the substance is tasteless and colorless, it can replace the alcohol content in beer, wine, and other alcoholic drinks without issue. So why isn’t this in place right now? Unfortunately, Professor Nutt has been unsuccessful in finding a test country in which to market his discovery, and he does not have the financial backing to move forward in the process. Also, since the liquor industry has shown little interest in change, the substance is effectively on death watch. What a damn shame.