
Canon’s 250-megapixel sensor can read the side of a plane from 11 miles away

Each time a new, higher-resolution sensor is introduced, there is speculation that we've reached the theoretical limit of what is possible with today's technology. As pixels get smaller, image quality suffers: smaller pixels mean a lower native ISO and, in turn, worse low-light performance. Small pixels also start to be affected by the diffraction of light as it passes through the camera's aperture. Pushing both constraints to the limit, Canon has created a stunning 250-megapixel sensor prototype that could be used in a DSLR.
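To see why diffraction matters at this pixel size, consider the standard Airy disk approximation (diameter ≈ 2.44 · wavelength · f-number). The sketch below is an illustration, not Canon's analysis; the green-light wavelength and the comparison pixel pitches are assumptions.

```python
# Rough estimate of when diffraction starts to limit a given pixel pitch.
# Airy disk diameter d = 2.44 * wavelength * f_number (green light, ~550 nm).
WAVELENGTH_NM = 550

def airy_disk_um(f_number, wavelength_nm=WAVELENGTH_NM):
    """Diameter of the Airy disk in microns at a given f-number."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

def diffraction_limited_f_number(pixel_pitch_um, wavelength_nm=WAVELENGTH_NM):
    """f-number at which the Airy disk grows to match the pixel pitch."""
    return pixel_pitch_um / (2.44 * (wavelength_nm / 1000.0))

# ~1.5 um pixel (this sensor) vs. ~4.1 um pixel (a typical 20MP full-frame DSLR)
for pitch in (1.5, 4.1):
    f = diffraction_limited_f_number(pitch)
    print(f"{pitch} um pixel: diffraction-limited near f/{f:.1f}")
```

By this crude measure a 1.5-micron pixel is diffraction-limited at around f/1.1, i.e. at essentially any practical aperture, which is why such small pixels were long considered impractical for large sensors.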




It is an APS-H-size sensor, about 80% of the length and width of a full-frame sensor and slightly larger than the popular APS-C format. Given its 19,580 x 12,600 resolution, each pixel measures about 1.5 microns, almost the same as in an iPhone 6. You could therefore imagine the new sensor as an array of roughly 30 perfectly aligned, very high-speed smartphone sensors. It demonstrates that speed by reading out over 1 billion pixels per second, enough to capture 250MP video at 5 fps.
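Those figures hang together arithmetically. The quick sanity check below assumes an APS-H sensor width of about 28.1 mm (Canon has not published the exact dimensions of this prototype):

```python
# Sanity-check the megapixel count, pixel pitch, and readout rate quoted above.
# The 28.1 mm APS-H width is an assumption for illustration.
width_px, height_px = 19580, 12600
sensor_width_mm = 28.1
fps = 5

total_mp = width_px * height_px / 1e6                 # ~247 MP
pixel_pitch_um = sensor_width_mm * 1000 / width_px    # ~1.4 um
throughput_gpx = total_mp * fps / 1000                # ~1.23 Gpixel/s at 5 fps

print(f"{total_mp:.0f} MP, {pixel_pitch_um:.2f} um pitch, "
      f"{throughput_gpx:.2f} Gpixel/s")
```

The readout works out to roughly 1.23 billion pixels per second, consistent with the "over 1 billion" claim.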
Canon's feat of using the sensor, coupled with a prototype camera, to read the lettering on the side of an airplane from over 11 miles away is quite impressive. However, since Canon hasn't disclosed anything about the camera or lens used, it is hard to tell how much of that magic comes from the sensor itself. For example, DARPA's ARGUS-IS can perform similarly amazing visual feats using a massive array of traditional smartphone-quality sensors.




This may be exactly what Lytro and the lightfield photography business needs


While Canon plans to eventually market the new sensor to specialized customers in surveillance, measuring instruments, and industrial equipment, it may also help pave the way for practical lightfield cameras. The big problem with lightfield products to date, like those from Lytro, is how much resolution they give up to gain the ability to capture dimensionality and alter focus in post-production.
Lytro founder Ren Ng has made it clear that the company was built on the premise that sensor resolution would continue to improve, making that tradeoff more reasonable. Depending on how much of the lightfield a camera is designed to capture, it can cut the native sensor resolution by anywhere from 10% to 90%. With a 250MP native resolution, though, even a 90% reduction would yield a very respectable 25MP image.
Source: ExtremeTech
