
Self-levitating displays: Mid-air virtual objects

An interactive swarm of flying 3D pixels (voxels) developed at Queen's University's Human Media Lab is set to revolutionize the way people interact with virtual reality. The system, called BitDrones, allows users to explore virtual 3D information by interacting with physical, self-levitating displays.






Queen's professor Roel Vertegaal and his students are unveiling the BitDrones system on Monday, Nov. 9 at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina. BitDrones is the first step towards creating interactive self-levitating programmable matter -- materials capable of changing their 3D shape in a programmable fashion -- using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.
"BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality," says Dr. Vertegaal. "It is a first step towards allowing people to interact with virtual 3D objects as real physical objects."
Dr. Vertegaal and his team at the Human Media Lab created three types of BitDrones, each representing a self-levitating display of a distinct resolution. "PixelDrones" are equipped with one LED and a small dot matrix display. "ShapeDrones" are augmented with a lightweight mesh and a 3D printed geometric frame, and serve as building blocks for complex 3D models. "DisplayDrones" are fitted with a curved, flexible, high-resolution touchscreen, a forward-facing video camera, and an Android smartphone board. All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time via motion capture technology. The system also tracks the user's hand motion and touch, allowing users to manipulate the voxels in space.
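The three drone types and the marker-based tracking could be modeled along the following lines. This is only an illustrative sketch, not the Human Media Lab's actual software: the class names, the 10 cm touch threshold, and the idea of pairing motion-capture poses with a simple proximity test are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum, auto
import math

class DroneType(Enum):
    PIXEL = auto()    # one LED plus a small dot matrix display
    SHAPE = auto()    # lightweight mesh over a 3D printed frame
    DISPLAY = auto()  # curved touchscreen, camera, Android board

@dataclass
class BitDrone:
    drone_id: int
    kind: DroneType
    position: tuple  # (x, y, z) in metres, as reported by motion capture

def nearest_drone(hand_pos, drones, reach=0.10):
    """Return the drone within `reach` metres of the tracked hand, if any.

    The motion capture system reports both drone and hand positions;
    a touch interaction fires when the hand comes close to a voxel.
    """
    def dist(d):
        return math.dist(hand_pos, d.position)
    candidate = min(drones, key=dist, default=None)
    if candidate is not None and dist(candidate) <= reach:
        return candidate
    return None
```

In this sketch a "touch" is simply proximity between the tracked hand and the nearest voxel; a real system would add debouncing and gesture classification on top.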
"We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset," says Dr. Vertegaal.
Dr. Vertegaal and his team describe a number of possible applications for this technology. In one scenario, users could physically explore a file folder by touching the folder's associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it. Files in this wheel are browsed by physically swiping drones to the left or right.
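The "horizontal wheel" layout described above can be sketched as a small geometry routine: each file drone is assigned a slot on a circle below the folder's PixelDrone, and a swipe rotates the wheel by one slot. The function names, the y-up coordinate convention, and the radius/drop parameters are hypothetical choices for illustration, not details from the BitDrones system.

```python
import math

def wheel_positions(center, radius, count, phase=0.0, drop=0.3):
    """Place `count` file drones on a horizontal circle of `radius` metres,
    `drop` metres below the folder drone at `center` (y is up)."""
    cx, cy, cz = center
    positions = []
    for i in range(count):
        angle = phase + 2 * math.pi * i / count
        positions.append((cx + radius * math.cos(angle),
                          cy - drop,
                          cz + radius * math.sin(angle)))
    return positions

def swipe(phase, count, direction):
    """Rotate the wheel one file slot; direction is +1 or -1."""
    return phase + direction * 2 * math.pi / count
```

After each swipe, the controller would send the drones the freshly computed targets, so the whole wheel appears to spin beneath the open folder.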




Users would also be able to manipulate ShapeDrones as building blocks for a real-time 3D model. Finally, the BitDrones system supports remote telepresence: a remote user can appear locally through a DisplayDrone running Skype. The DisplayDrone would automatically track and replicate the remote user's head movements, letting that user virtually inspect a location and making it easier for the local user to follow the remote user's actions.
While the system currently supports only dozens of comparatively large drones, 2.5 to 5 inches in size, the team at the Human Media Lab is working to scale it up to thousands of drones measuring no more than half an inch, allowing users to render more seamless, higher-resolution programmable matter.
Story Source:
The above post is reprinted from materials provided by Queen's University.




