Google glasses coming to stores this year

A number of anonymous Google employees are reporting that the company is currently developing Android-powered glasses that can provide a heads-up display to the wearer and connect over wireless data services. The glasses will purportedly work like a wearable version of the Google Goggles app, providing real-time information on a user's location via GPS and motion sensors. Even more surprising, the same sources say these "Google glasses" could be available to the public by the end of this year.
The Google glasses have apparently been in development for quite some time at Google's secretive Project X lab, where the company pursues its more outlandish projects, such as robots and space elevators. Anonymous employees have indicated that this is strictly an experimental program, though Google may explore business applications depending on how successful the product is.
Aside from a few buttons on the side, the glasses are said to resemble a regular pair of eyeglasses, with a design similar to the Oakley Thumps. The glasses will feature a low-resolution camera on the front that gathers information to relay to a small screen built into one side of the lenses. The screen will not be transparent; instead it will be located to the side of the frame, giving an augmented-reality feel without obscuring the wearer's view. The camera will also be able to take pictures and will have a built-in flash.
Using either WiFi or a 3G/4G connection, the device will tap into Google's cloud and relay information to the user about their surroundings, including nearby locations or friends and objects they look at. The glasses will also work like a smartphone, allowing users to make calls, use certain apps, and connect with friends.
Actually controlling the glasses will be somewhat unusual, as reading through information on the display will require the user to tilt their head to scroll and click. Sources at Google have noted that this motion is much easier to use than it sounds, and is barely noticeable to others.
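Google hasn't explained how this head-tilt control works under the hood, but since the glasses are reportedly Android-powered, one plausible approach can be sketched with Android's standard sensor APIs: read the rotation-vector sensor, track changes in head pitch, and translate them into scroll offsets. Everything in the sketch below — the HeadTiltScroller class, the ScrollTarget callback, the jitter threshold, and the scroll gain — is hypothetical illustration, not Google's actual implementation.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

/**
 * Hypothetical sketch: map head pitch from Android's rotation-vector sensor
 * to scroll offsets. Register an instance via SensorManager.registerListener()
 * with a Sensor.TYPE_ROTATION_VECTOR sensor.
 */
public class HeadTiltScroller implements SensorEventListener {
    /** Minimal stand-in for whatever view the heads-up display scrolls. */
    public interface ScrollTarget {
        void scrollBy(float pixels);
    }

    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];
    private Float lastPitch = null; // null until the first reading arrives
    private final ScrollTarget target;

    public HeadTiltScroller(ScrollTarget target) {
        this.target = target;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;

        // Convert the rotation vector into an orientation; orientation[1] is
        // pitch in radians, which changes as the wearer tilts their head.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float pitch = orientation[1];

        if (lastPitch != null) {
            float delta = pitch - lastPitch;
            // Ignore tiny jitters so ordinary head movement doesn't scroll.
            if (Math.abs(delta) > 0.02f) {
                target.scrollBy(delta * 800f); // arbitrary gain: radians -> pixels
            }
        }
        lastPitch = pitch;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* no-op */ }
}
```

A real implementation would need smoothing and some deliberate activation gesture so ordinary head movement doesn't scroll the display, which may be part of why Google's sources insist the motion is subtler than it sounds.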
Unnamed employees told The New York Times that the new Google glasses are expected to be priced much like a current smartphone (in the US$250 to $600 range) and are aimed at a 2012 release.

