HAPPY BIRTHDAY, William Kennedy Laurie Dickson!

 

[Image: William Kennedy Laurie Dickson]

What the what?!  That’s William Kennedy Laurie Dickson, the guy who invented the Kinetoscope, among other completely awesome stuff!  Today is Billy Boy’s birthday!  Happy Birthday, William Kennedy Laurie Dickson!

Dickson was one of Edison’s “muckers,” the guys who did all of Edison’s work for him.  What a d-bag he was, that Edison!

Check out the Happy Birthday, William Kennedy Laurie Dickson Official Birthday Post!

 

Holy Terminator Eyes! An LED Contact Lens That Gives Your Eyes A Display Overlay!

[Image: LED contact lens concept]

Can you imagine contact lenses that give you a see-through display that connects via Bluetooth to your iPhone?  Maybe something that lets you get news stories as they pop up, see email notifications in your vision, or perhaps even something actually useful?  Researchers at the University of Washington have developed a proof of concept of this exact scenario — albeit in the eye of a rabbit.  But if Bugs Bunny can see like the Terminator, with images and text, then where’s the limit?  I submit it’s the SKY!

From the team’s paper in the Journal of Micromechanics and Microengineering, quoted in the University of Washington’s press release:

We present the design, construction and in vivo rabbit testing of a wirelessly powered contact lens display. The display consists of an antenna, a 500 × 500 µm² silicon power harvesting and radio integrated circuit, metal interconnects, insulation layers and a 750 × 750 µm² transparent sapphire chip containing a custom-designed micro-light emitting diode with peak emission at 475 nm, all integrated onto a contact lens. The display can be powered wirelessly from ~1 m in free space and ~2 cm in vivo on a rabbit. The display was tested on live, anesthetized rabbits with no observed adverse effect. In order to extend display capabilities, design and fabrication of micro-Fresnel lenses on a contact lens are presented to move toward a multipixel display that can be worn in the form of a contact lens. Contact lenses with integrated micro-Fresnel lenses were also tested on live rabbits and showed no adverse effect.

[Image: the display lens in a rabbit’s eye]

Let’s hit some key points here:

  • Part of the purpose of this most recent test was to verify the safety of the device in a live subject.
  • Scientists tested a real, live, working video contact lens display on a real, live, BREATHING AND POOPING RABBIT (that’s what in vivo means: basically, not diced up into dead tissue).
  • The device is wirelessly powered, and everything needed is integrated into the tiny contact lens.
  • No bad effects were observed on the rabbit, which was anesthetized.
  • The contact lens had one pixel, but the next phase is micro-Fresnel multi-pixel display lenses, which were also tested on the bunnies with no apparent bad effects.

[Image: LED contact lens detail]

This is, by all accounts, AMAZING!  Can you imagine the implications of having a see-through display in your vision?!  From my lighting designer mind, I see things like photometric or spectrophotometric data just updating as you look at something.  I hate to be the one to state this, but you KNOW the Defense Department is going to get their hands on this if they haven’t already — and we’ll see the next round of soldiers equipped with instant range-finding and targeting displays right there in their vision as if it were nothing at all.  SEAL Team 6, for example, was rumored to be wearing night vision contact lenses on the raid on Osama bin Laden’s compound in Abbottabad, Pakistan.  A rumor, of course, but is it really that inconceivable that something along those lines is possible?  I think not!

[Image: night vision contact lenses]

 

We’re still quite a bit away from the kinds of retina display technology we see in the movies — for example, in Mission: Impossible 4, when Josh Holloway was in the train station scanning people’s faces as they passed by — but that technology is definitely going to be hitting our wallets in the next decade.  Call it intuition, call it a gut feeling, I don’t know.  But the interface is already there, Edward Snowden has made us very aware of that — and if it’s not already out there by now, I have to believe it’s not far behind.

[Image: retina display scanning]

We already have license plate scanning cameras that police drive around with as they do their patrols.  We have data systems that can mine faces and scan people instantly as they pass by the sensors.  What’s to say that soon we can’t have a device you purchase at the local high-end electronics retailer that lets you shop anywhere, and while you’re looking at things in the store, you get a display of the current price on Amazon versus what you’re seeing at Target?  Amazing thought, huh!

From an excellent article in IEEE Spectrum back in 2009, when the thought of monitoring someone’s blood glucose was reason enough for developing a technology like the one being tested today:

[Image: IEEE Spectrum’s bionic eye illustration]

These lenses don’t need to be very complex to be useful. Even a lens with a single pixel could aid people with impaired hearing or be incorporated as an indicator into computer games. With more colors and resolution, the repertoire could be expanded to include displaying text, translating speech into captions in real time, or offering visual cues from a navigation system. With basic image processing and Internet access, a contact-lens display could unlock whole new worlds of visual information, unfettered by the constraints of a physical display.

Besides visual enhancement, noninvasive monitoring of the wearer’s biomarkers and health indicators could be a huge future market. We’ve built several simple sensors that can detect the concentration of a molecule, such as glucose. Sensors built onto lenses would let diabetic wearers keep tabs on blood-sugar levels without needing to prick a finger. The glucose detectors we’re evaluating now are a mere glimmer of what will be possible in the next 5 to 10 years. Contact lenses are worn daily by more than a hundred million people, and they are one of the only disposable, mass-market products that remain in contact, through fluids, with the interior of the body for an extended period of time. When you get a blood test, your doctor is probably measuring many of the same biomarkers that are found in the live cells on the surface of your eye—and in concentrations that correlate closely with the levels in your bloodstream. An appropriately configured contact lens could monitor cholesterol, sodium, and potassium levels, to name a few potential targets. Coupled with a wireless data transmitter, the lens could relay information to medics or nurses instantly, without needles or laboratory chemistry, and with a much lower chance of mix-ups.

Three fundamental challenges stand in the way of building a multipurpose contact lens. First, the processes for making many of the lens’s parts and subsystems are incompatible with one another and with the fragile polymer of the lens. To get around this problem, my colleagues and I make all our devices from scratch. To fabricate the components for silicon circuits and LEDs, we use high temperatures and corrosive chemicals, which means we can’t manufacture them directly onto a lens. That leads to the second challenge, which is that all the key components of the lens need to be miniaturized and integrated onto about 1.5 square centimeters of a flexible, transparent polymer. We haven’t fully solved that problem yet, but we have so far developed our own specialized assembly process, which enables us to integrate several different kinds of components onto a lens. Last but not least, the whole contraption needs to be completely safe for the eye. Take an LED, for example. Most red LEDs are made of aluminum gallium arsenide, which is toxic. So before an LED can go into the eye, it must be enveloped in a biocompatible substance.

[Image: Terminator vision]

More from the press release at the University of Washington:

At the moment, the contact lens device contains only a single pixel of information, but the researchers say it is proof of the concept that the device could be worn by a person. Eventually it could display short emails and other messages directly before a wearer’s eyes.

“This is the first time we have been able to wirelessly power and control the display in a live eye,” said Babak Parviz, an author and UW associate professor of electrical engineering. Among his coauthors are Brian Otis, associate professor of electrical engineering, and Andrew Lingley, a graduate student.

“Looking through a completed lens, you would see what the display is generating superimposed on the world outside,” Parviz explained during a 2008 interview.

The researchers’ findings were published Nov. 22 in the Journal of Micromechanics and Microengineering.

Perhaps the best-known science fiction character to use such a display is the Terminator, and for almost seven years Parviz and others have worked on trying to make the display a reality.

Building the lenses required researchers to make circuits from metal only a few nanometers thick, about one-thousandth the width of a human hair. They built light-emitting diodes (LEDs) one-third of a millimeter in diameter. And to help focus the images, the researchers made arrays of tiny lenses that were put into the contacts.

The contact lens has an antenna to take power from an external source, as well as an integrated circuit to store this energy and transfer it to a transparent sapphire chip containing a single blue LED.

Otis called this successful wireless transmission to a lens “an extremely exciting project … that presents huge opportunities for health-care platforms.” The team is working on a way to monitor a diabetic patient’s glucose level using the lenses.

Check this out, it’s three minutes’ worth of awesomesauce — some of this project from back in 2011:

GAH!  What an awesome project!

[Image: contact lens designs]

Crazy Friday Science: New “Dua’s Layer” Discovered in Human Eyes, Ophthalmology Changed Forever

From May 28, 2013 onward, the study of the human eye will forever be changed.  A doctor named Harminder S. Dua, Professor of Ophthalmology and Visual Sciences at the University of Nottingham, has discovered a new layer of the cornea that lies between the corneal stroma and Descemet’s membrane.  Like so:

[Image: diagram of Dua’s Layer]

“Now hold on there cowboy, what’s the cornea?!”

The cornea is the clear covering over the iris, pupil, and anterior chamber – basically the spot in front of the eye’s lens.  It’s one of the body’s most nerve-filled tissues, and the anterior chamber behind it is filled with fluid for light transmission.  Check this out, it’s an excellent visual description of the cornea and the anterior and vitreous chambers – for reference, Dua’s Layer sits at the rear of the cornea (the side closest to the iris), just in front of Descemet’s membrane:

[Image: the three main layers of the eye]

 

What Dr. Dua has discovered is a layer within the cornea that seems to have something to do with corneal failures where misshaping takes place.  These kinds of diseases are thought to be caused by the cornea becoming waterlogged, perhaps via a tear in this new Dua’s Layer.  They give the person afflicted a cone-shaped cornea, which can be corrected with glasses, contacts, or in extreme cases, corneal surgery.  I’ve never seen anything quite like this before, so I’m guessing you haven’t either:

[Images: eyes with keratoconus]

from http://thesclerallenscenter.com/wp-content/uploads/2010/10/IMG_8964.jpg

A tear in the newly discovered Dua’s Layer is now thought to be behind complications of this crazy degenerative keratoconus, which looks very annoying and painful to me.  Keratoconus causes pretty awful headaches and eye strain for the people afflicted, which nobody wants.  But this discovery is being heralded as a potential game changer for corneal diseases and degenerative conditions.  From Sci News:

“This is a major discovery that will mean that ophthalmology textbooks will literally need to be re-written. Having identified this new and distinct layer deep in the tissue of the cornea, we can now exploit its presence to make operations much safer and simpler for patients,” said Dr Harminder Dua, Professor of Ophthalmology and Visual Sciences at the University of Nottingham and lead author of a paper published in the journal Ophthalmology.

“From a clinical perspective, there are many diseases that affect the back of the cornea which clinicians across the world are already beginning to relate to the presence, absence or tear in this layer.”

The human cornea is the clear protective lens on the front of the eye through which light enters the eye. Scientists previously believed the cornea to be comprised of five layers, from front to back, the corneal epithelium, Bowman’s layer, the corneal stroma, Descemet’s membrane and the corneal endothelium.

…and from Science Daily:

The scientists proved the existence of the layer by simulating human corneal transplants and grafts on eyes donated for research purposes to eye banks located in Bristol and Manchester.

During this surgery, tiny bubbles of air were injected into the cornea to gently separate the different layers. The scientists then subjected the separated layers to electron microscopy, allowing them to study them at many thousand times their actual size.

Understanding the properties and location of the new Dua’s layer could help surgeons to better identify where in the cornea these bubbles are occurring and take appropriate measures during the operation. If they are able to inject a bubble next to the Dua’s layer, its strength means that it is less prone to tearing, meaning a better outcome for the patient.

The discovery will have an impact on advancing understanding of a number of diseases of the cornea, including acute hydrops, Descematocele and pre-Descemet’s dystrophies.

The scientists now believe that corneal hydrops, a bulging of the cornea caused by fluid build up that occurs in patients with keratoconus (conical deformity of the cornea), is caused by a tear in the Dua layer, through which water from inside the eye rushes in and causes waterlogging.

This is the first time I’ve ever researched keratoconus – I have a good friend who has retinitis pigmentosa, another degenerative disease of the eye (in that case, the retina), but the conical cornea is quite an odd phenomenon.  Have you ever had, or do you know anyone who has had, this disease?  I found some information on keratoconus diagnosis and treatment at WebMD:

Keratoconus changes vision in two ways:

  • As the cornea changes from a ball shape to a cone shape, the smooth surface becomes slightly wavy. This is called irregular astigmatism.
  • As the front of the cornea expands, vision becomes more nearsighted. That is, only nearby objects can be seen clearly. Anything too far away will look like a blur.

An eye doctor may notice symptoms during an eye exam. You may also mention symptoms that could be caused by keratoconus. These include:

  • Sudden change of vision in just one eye
  • Double vision when looking with just one eye
  • Objects both near and far looking distorted
  • Bright lights looking like they have halos around them
  • Lights streaking
  • Seeing triple ghost images

To be sure you have keratoconus, your doctor needs to measure the curvature of the cornea. There are several different ways this can be done.

One instrument, called a keratometer, shines a pattern of light onto the cornea. The shape of the reflection tells the doctor how the eye is curved. There are also computerized instruments that make three-dimensional “maps” of the cornea.
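For the curious, the arithmetic behind a keratometer reading is pretty simple: corneal power in diopters is (n − 1) / r, where r is the measured radius of curvature and n is the standard keratometric index of 1.3375.  Here’s a toy sketch of that conversion – the radius values are made-up examples, not clinical data:

```python
# Toy sketch of the arithmetic behind a keratometer reading.
# Uses the standard keratometric index of 1.3375; radii are made-up examples.

KERATOMETRIC_INDEX = 1.3375

def corneal_power_diopters(radius_mm: float) -> float:
    """Convert a corneal radius of curvature (mm) into refractive power (diopters)."""
    radius_m = radius_mm / 1000.0
    return (KERATOMETRIC_INDEX - 1.0) / radius_m

print(round(corneal_power_diopters(7.8), 1))  # ~43.3 D, a roughly normal cornea
print(round(corneal_power_diopters(6.0), 1))  # ~56.3 D, steep, as in a cone-shaped cornea
```

A steeper (smaller-radius) cornea means more power and more distortion, which is why the cone shape wrecks vision.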

How Is Keratoconus Treated?
Treatment usually starts with new eyeglasses – with mild cases, new eyeglasses can usually make vision clear again.  If eyeglasses don’t provide adequate vision, then contact lenses may be recommended.  Eventually, though, it will probably be necessary to use contact lenses or seek other treatments to strengthen the cornea and improve vision.

A last resort is a cornea transplant.  This involves removing the center of the cornea and replacing it with a donor cornea that is stitched into place.

Congratulations to Dr. Harminder Dua and his team at the University of Nottingham for this amazing discovery!
Keep up the excellent game-changing work, good sir!

[Image: Dr. Harminder Dua]

Check out the abstract at the journal Ophthalmology.

[Image: keratoconus vs. normal cornea]

from http://www.centralohioeyecare.com/user-files/PageImage206991.jpg

Thanks to Wikipedia on Keratoconus and Dua’s Layer!

What If We Used Trees to Light Our Streets Instead of Electric Lamps?

[Image: glowing plant]

That’s the question the core team behind a Kickstarter campaign to create illuminating plant life wants answered, and they want it answered NOW!

So what exactly is going on here? From the Kickstarter campaign website on the Glowing Plants:

We are using Synthetic Biology techniques and Genome Compiler’s software to insert bioluminescence genes into Arabidopsis, a small flowering plant and member of the mustard family, to make a plant that visibly glows in the dark (it is inedible).

Funds raised will be used to print the DNA sequences we have designed using Genome Compiler and to transform the plants by inserting these sequences into the plant and then growing the resultant plant in the lab.

Printing DNA costs a minimum of 25 cents per base pair and our sequences are about 10,000 base pairs long. We plan to print a number of sequences so that we can test the results of trying different promoters – this will allow us to optimize the result. We will be printing our DNA with Cambrian Genomics who have developed a revolutionary laser printing system that massively reduces the cost of DNA synthesis.

Transforming the plant will initially be done using the Agrobacterium method.  Our printed DNA will be inserted into a special type of bacteria which can insert its DNA into the plant.  Flowers of the plant are then dipped into a solution containing the transformed bacteria. The bacteria injects our DNA into the cell nucleus of the flowers which pass it onto their seeds which we can grow until they glow!  You can see this process in action in our video.

Once we have proven the designs work we will then insert the same gene sequence into the plant using a gene gun.  This is more complicated, as there’s a risk the gene sequence gets scrambled, but the result will be unregulated by the USDA and thus suitable for release.

Funds raised will also be used to support our work to develop an open policy framework for DIY Bio work involving recombinant DNA.  This framework will provide guidelines to help others  who are inspired by this project navigate the regulatory and social challenges inherent in community based synthetic biology.  The framework will include recommendations for what kinds of projects are safe for DIY Bio enthusiasts and recommendations for the processes which should be put in place (such as getting experts to review the plans).

So far, as of this writing, the campaign has raised over 700% of its goal.  The campaign ends tomorrow, June 7, 2013, but they’ve already raised almost $500,000!  The initial goal?  Only $65,000.
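A quick sanity check on the campaign’s printing math, since they quote a minimum of 25 cents per base pair and roughly 10,000 base pairs per sequence – the variant count below is my own illustrative assumption, not theirs:

```python
# The campaign's own figures: 25 cents per base pair, sequences ~10,000 bp long.
cost_per_base_pair = 0.25   # USD, the quoted minimum
sequence_length = 10_000    # base pairs

cost_per_sequence = cost_per_base_pair * sequence_length
print(f"${cost_per_sequence:,.0f} per printed sequence")             # $2,500

# Testing multiple promoters multiplies the bill; ten variants is my own
# illustrative assumption, not a number from the campaign.
print(f"${10 * cost_per_sequence:,.0f} for ten promoter variants")   # $25,000
```

You can see why a $65,000 goal made sense for printing a batch of candidate sequences plus lab work.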

Some commentary I found interesting, from the Glowing Plant website (at www.glowingplant.com) – what do you think of a GMO plant like this?  They plainly state that the plant is not edible and not made for food:

Aren’t GMOs evil?  Luckily, that’s one question we don’t typically tend to get – although some people have definitely told us as much.

Like it or not, biology is the science of the 21st century, the way the steam engine dominated the first half of the 20th century. And just as there was a backlash against steam technology – it was going to put everybody out of work, and cows were going to drop dead in fright at the sight of a 20 mph steam train – there is a lot of Fear, Uncertainty and Doubt about genetic engineering and genetically modified organisms. To the point that creations like the vitamin fortified “Golden Rice” are now banned from countries where they could be saving thousands of lives. I’m sure that the first humans to discover fire were feared and reviled by their neighbors. And I’m sure those fire makers were concerned that their invention might “fall in the wrong hands”.

As with all technology, genetic engineering is not inherently good or bad – it all depends how you apply it. Science fiction stories are full of the hypothetical abuses of genetic engineering. Then again, they are also full of Midichlorians, and nobody takes those seriously. More down-to-earth: yes, genetic engineering has been used to create quasi-monopolies on seeds and herbicides. But it is also being used to produce insulin and hundreds of other lifesaving drugs, develop cures for inherited diseases through gene therapy, and to make sure the next billion members of humanity will have enough to eat.

Monoculture and loss of crop diversity may be a really bad idea, ecologically speaking. And depriving farmers of the right to save and replant seed could arguably be called evil. But those are the products of a screwed up agroindustrial system, not the inevitable consequence of GMOs. As for the health concerns with GMOs – well, we’re not creating a food crop here, but as a scientist I would rate eating a tomato with fish genes about as dangerous as eating a fish-and-tomato dinner – and far less risky than eating a new tropical fruit I’ve never seen before.

When it comes to synthetic biology and DIYbio, I feel we’re standing alongside those early fire makers, discussing whether only the village elders should be allowed to handle fire, or whether we should teach everyone how to deal with it safely. Luckily, we know how that decision turned out…

The team:

[Image: the Glowing Plants team]

What do you think of this Kickstarter?  Is it a good thing?  Is it a bad thing?  How do you feel about GMOs that aren’t food based?  Leave a reply below!

LIDAR Helps Scientists Add Mass to Dinosaurs

…and all of it without having to use the strawberry milkshake protein powder that I got from Walmart.  That stuff was horrible!!!

One of my favorite laser publications, Optics.org, posted this awesome article – dinosaur skeletons, LIDAR, and estimating the mass of dinosaurs when they were alive.  The article is pretty cool, check it out here.

From the article:

A team at the University of Manchester has developed a new method for doing so that shows promise, by applying a lidar scanning technique to one of the largest mounted dinosaur skeletons in the world. The findings are published in Biology Letters.

Starting from the principle that the best estimates of dinosaur mass come from a volumetric approach, whereby a model of the animal is created and its mass then calculated via its density, the team scanned a complete skeleton using a lidar scanner supplied by Z+F, specialists in laser scanning and data capture.
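The volumetric principle itself boils down to mass = volume × density.  Here’s a minimal sketch of that final step, assuming the scan and a body-shape reconstruction have already produced a volume – the numbers are illustrative placeholders, not the Manchester team’s published data:

```python
# Minimal sketch of the volumetric method's final step: once a lidar scan and a
# body-shape reconstruction give you a volume, mass is just volume x density.
# Numbers below are illustrative placeholders, not the published results.

def estimated_mass_kg(body_volume_m3: float, density_kg_m3: float = 1000.0) -> float:
    """Mass from a reconstructed body volume; density defaults to roughly water."""
    return body_volume_m3 * density_kg_m3

volume_m3 = 23.0  # pretend the scan-plus-reconstruction produced this
print(estimated_mass_kg(volume_m3))          # 23000.0 kg at water density
print(estimated_mass_kg(volume_m3, 800.0))   # a lower assumed density -> 18400.0 kg
```

The hard science is all in getting a trustworthy volume and density, which is exactly where the lidar scan earns its keep.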

I had to know more about this LIDAR business – LIDAR means Light Detection and Ranging.  From Wikipedia:

In general there are two kinds of lidar detection schema: “incoherent” or direct energy detection (which is principally an amplitude measurement) and coherent detection (which is best for Doppler, or phase-sensitive, measurements). Coherent systems generally use optical heterodyne detection, which, being more sensitive than direct detection, allows them to operate at a much lower power, but at the expense of more complex transceiver requirements.

In both coherent and incoherent LIDAR, there are two types of pulse models: micropulse lidar systems and high energy systems. Micropulse systems have developed as a result of the ever increasing amount of computer power available combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often “eye-safe,” meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering and densities of clouds, cloud particle properties (extinction coefficient, backscatter coefficient, depolarization), temperature, pressure, wind, humidity, trace gas concentration (ozone, methane, nitrous oxide, etc.).[1]

There are several major components to a LIDAR system:

  1. Laser — 600–1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since they can be focused and easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe. Eye-safety is often a requirement for most applications. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced and so these wavelengths are generally used at longer ranges and lower accuracies. They are also used for military applications as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping lidars generally use 1064 nm diode pumped YAG lasers, while bathymetric systems generally use 532 nm frequency doubled diode pumped YAG lasers because 532 nm penetrates water with much less attenuation than does 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the LIDAR receiver detectors and electronics have sufficient bandwidth.[1]
  2. Scanner and optics — How fast images can be developed is also affected by the speed at which they are scanned. There are several options to scan the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, a dual axis scanner (see Laser scanning). Optic choices affect the angular resolution and range that can be detected. A hole mirror or a beam splitter are options to collect a return signal.
  3. Photodetector and receiver electronics — Two main photodetector technologies are used in lidars: solid state photodetectors, such as silicon avalanche photodiodes, or photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a LIDAR design.
  4. Position and navigation systems — LIDAR sensors that are mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU).
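To make the “ranging” part concrete: the scanner times each pulse’s round trip, and, as item 1 above notes, shorter pulses mean finer target resolution.  A quick sketch of that arithmetic, with illustrative numbers:

```python
# Basic lidar ranging arithmetic: distance from a pulse's round-trip time, and
# the range resolution set by pulse length. Values here are illustrative.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_m(round_trip_s: float) -> float:
    """The pulse travels out and back, hence the divide-by-two."""
    return C * round_trip_s / 2.0

def range_resolution_m(pulse_s: float) -> float:
    """Two surfaces closer than c*tau/2 smear into a single return."""
    return C * pulse_s / 2.0

print(range_m(667e-9))           # ~100 m target for a 667 ns round trip
print(range_resolution_m(5e-9))  # ~0.75 m resolution for a 5 ns pulse
```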

This scanning technology is actually pretty widely used all over the place – along with terrestrial map data from suppliers, the GPS companies’ survey vans are mostly fitted with LIDAR scanners.  These scanners are actually pretty cool – the company listed in the article, Z+F UK, has some particularly interesting looking devices!  Also, Radiohead apparently used lots of LIDAR capture to film their House of Cards video.  Here’s a bit of them doing some scanning work:

Crazy.  Also, if you’re one of those nerds like me who likes to comb through the images and content on places like NOAA and see the output from satellites at the various observation stations, check out the LIDAR stuff at the USGS (US Geological Survey) website.

Crazy Friday Science: Mini-Interview with Sonja Franke-Arnold on Rotary Photon Drag

I wrote an article about a paper I read in the journal Science a few weeks ago – the article was about “Rotary Photon Drag Enhanced by a Slow-Light Medium.”  I got two handfuls of emails about the article, so I got in contact with one of the original paper’s authors, Sonja Franke-Arnold.  When you have questions, it’s best to go to the source!

JimOnLight.com:  Hi Sonja, welcome to JimOnLight.com! I’m very interested in your research, and we’ve gotten a lot of interesting response to the post I wrote on your paper, “Rotary Photon Drag Enhanced by a Slow-Light Medium.”  Can you take a moment and give us a bare-bones layperson’s look at what you and your team have discovered? What exactly has happened here in your experiment?

Sonja Franke-Arnold:  We were wondering what the world looks like through a spinning window!  About 200 years ago Augustin-Jean Fresnel predicted that light can be dragged if it travels through a moving medium. If you were to spin a window faster and faster, the image would actually be slightly rotated as the light is dragged along with the window. However, this effect is normally only some millionth of a degree and imperceptible to the eye.

We managed to increase the image rotation by a factor of about a million to an easily noticeable rotation of up to 5 degrees. This happened by slowing the light down to roughly the speed of sound during its passage through the “window” (in fact a ruby crystal). The light therefore spent a longer time in the ruby rod and could be dragged far enough to result in an observable image rotation.
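Just to put rough numbers on that enhancement: the image rotation is simply the window’s spin rate multiplied by the time the light spends inside it.  A quick sketch, where the rod length and spin rate are my own illustrative guesses rather than the paper’s actual parameters:

```python
# Rough numbers for slow-light-enhanced rotary photon drag: image rotation is
# (window spin rate) x (time the light spends inside the medium). Rod length
# and spin rate are my own illustrative guesses, not the paper's parameters.

C = 299_792_458.0   # speed of light in vacuum, m/s
v_group = 340.0     # light slowed to roughly the speed of sound, m/s
rod_length = 0.10   # ruby rod length, m (assumed)
spin_rate = 30.0    # window spin rate, revolutions per second (assumed)

dwell_time = rod_length / v_group              # ~0.29 ms inside the rod
rotation = spin_rate * 360.0 * dwell_time      # degrees of image rotation
print(f"slow light: {rotation:.1f} degrees")   # ~3.2 degrees, easily visible

# Without slow light, the dwell time is rod_length / C -- about a million
# times shorter, hence the "millionth of a degree" unenhanced effect.
print(f"no slow light: {spin_rate * 360.0 * rod_length / C:.1e} degrees")
```

With those guesses the numbers land right where the paper’s result does: a few degrees with slow light, a millionth of a degree without.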

JimOnLight.com:  Can you explain the significance of the wavelength of light you used? Why was 532nm (green) used for the experiment?

Sonja Franke-Arnold:  This wavelength excites a transition within the ruby crystal (the same that is also used in ruby lasers). Light at 532nm is absorbed and excites an atomic level with a very long (20 millisecond) lifetime. This allows us to “store” the energy of the photon as an internal excitation of the rotating ruby crystal – generating slow light.

JimOnLight.com:  Tell me about the significance of the shape of the coherent beam in the experiment – was the shaped beam simply to observe a change in the image, or was a different purpose considered?

Sonja Franke-Arnold:  We used an elliptical light beam for two reasons, one of which is to define the image rotation angle, as you suggested. The elliptical beam travelling through the spinning ruby rod also plays an important part in making the slow light itself: at any particular position of the ruby, the elliptical light – spinning with respect to the ruby – looks like an intensity modulation. The varying intensity produces a large group index of about one million, which slows the light down from the speed of light to roughly the speed of sound – a method pioneered by our co-worker Robert Boyd.

JimOnLight.com:  Could you give a few examples of uses for this discovery? How can the general populace relate to what this discovery really means for light and photonics?

Sonja Franke-Arnold:  For me, the main highlight was that we managed to observe a 200 year old puzzle – that images are indeed dragged along with rotating windows. We are now working on possible applications in quantum information processing: our image rotation preserves not only the intensity but also the phase of the light and could therefore be used to store and rotate quantum images. Access to the angle of an image could allow a new form of image coding protocol.

Thanks so much, Sonja!  Very cool paper for those of us nerds out here!

The Anti-Laser – Scientists Discover How to Cancel Out a Laser Beam

Whoa – a laser story that doesn’t involve someone mounting a man-killing laser on top of some kind of vehicle?!  SAY IT AIN’T SO!

Professor Douglas Stone and his team of Yale scientists have discovered a way to get a material to almost completely absorb laser light.  They’ve developed this thing – more of a material, really – called a CPA, or Coherent Perfect Absorber.  What it seems the team has done is take the Law of Conservation of Energy and use it to their advantage.  Do we all remember the Law of Conservation of Energy?

Energy cannot be created or destroyed – it can only change form.

So what the scientists have done here, in layman’s terms, is figure out a way to get laser light absorbed into a medium: the beams bounce around a little silicon chamber until their energy changes form into heat.  Right now the team says they can capture 99.4% of the light through absorption, but their Coherent Perfect Absorber will potentially be able to capture 99.99% of the laser light shone into the CPA.
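For some intuition on the “bounces around until it’s heat” part: if each pass through the absorber eats a fixed fraction of whatever light remains, the survivors decay geometrically.  The real CPA relies on interference between coherent beams, so this is only the energy bookkeeping, and the per-pass fraction below is my own made-up number, not Yale’s:

```python
# Energy bookkeeping for light rattling around an absorbing cavity: if each
# pass eats a fixed fraction of what's left, the remainder decays geometrically.
# The per-pass absorption here is a made-up illustrative number, not Yale's.

per_pass_absorption = 0.20
target_absorbed = 0.994  # the 99.4% figure from the article

remaining = 1.0
passes = 0
while (1.0 - remaining) < target_absorbed:
    remaining *= (1.0 - per_pass_absorption)
    passes += 1

print(f"{1.0 - remaining:.1%} absorbed after {passes} passes")  # 99.4% after 23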

Why this is significant: silicon is already widely used in the semiconductor industry, and this new technology from Yale and Douglas Stone’s team has potentially many, many uses in computing – the hope is that this tech can be used to make microswitches and other types of computer components.  Hey, using light instead of electrons?  Awesome!

Very cool!

Thanks CTV, BBC, PopSci, and NewScientist!

FujiFilm is Developing A New Light Diffusing Film for LEDs

I found this press release interesting, and now I’ve been trying to find out more information about the product.  FujiFilm (yeah, that one) is working on a light diffusing film for LED sources – they’re set to release some prototypes in the late summer/fall according to a few of my sources, but we’ll see what happens.

This is a little teaser – believe me, I’ve been looking all over to find info on this product!

From the release:

Japan’s Fujifilm Corp. plans to enter the market for LED lighting materials with a new kind of light diffusing film which is thinner and offers higher illumination intensity than products now available.

The light diffusing film acts to spread out the light from the LED bulbs so they are not so bright. These diffusers are now typically made from acrylic resin materials which are milky in color and around 2mm thick. The thickness means they are hard to bend, and the milky color means that some of the light is wasted.

Fujifilm’s new light diffusing film is made by coating a polyethylene terephthalate (PET) resin sheet with an orderly array of micrograins of a light-dispersing material. The film is only 0.3mm thick so it is lighter and easier to bend and shape, plus it has around 30 per cent higher illumination intensity, which translates into a significant improvement in the energy efficiency of the overall LED lighting system.

Fujifilm will begin manufacturing prototypes this summer for distribution to LED lighting makers, homebuilders and contractors.

The Fujifilm Holdings Corp. (TSE:4901) unit hopes to begin commercial sales during fiscal 2011 and intends to develop this area of business into a major new revenue source.

Happy Birthday, Francis Robbins Upton!

Francis Robbins Upton!  Happy Birthday, dude!

That guy is straight out of Deadwood!

Yes, I like to also recognize obscure yet related industry people on JimOnLight.com – they are the people behind the people, the people who were doing what we all strive to do now: carving the path.

Francis Robbins Upton was a mathematician, physicist, and an employee of Thomas Edison’s Menlo Park laboratory back in the 1870s.  Francis was the general manager and a partner of an Edison venture called Edison Lamp Works.  The guy was an intelligent scientist who worked on the watt-hour meter, the electric light, and engineering dynamos – and who apparently had lots of interesting arguments/spats with Edison himself.  From an article about Francis Upton at the School of Mathematics and Statistics at St. Andrews University:

Edison liked and respected Upton, for the latter had acquired a brilliantly profound store of knowledge. And under Edison’s guidance he soon gained the necessary experience to make theory and practice meet. It was always edifying to listen to their arguments, and often a group of us would gather round and drink in every word that was spoken. Reasoning and sparrings between Edison and Upton often led to new experiments …

A totally random bit of information on Francis Robbins Upton is that he was the guy who invented the electric fire alarm/detector.  That’s a big deal, right?  Well sure!  However, this fact often goes overlooked because of some dumb ass at the US Patent Office in the late 1800s who misspelled the title of Upton’s patent.  Officially, the patent for his device was called the “Portable Electric Tire Alarm.”  Lame.  Sorry that people suck, Francis!

Francis also developed something called “Nature’s Farter.”  Yeah, you read that right.  Upton invented a device that had something to do with vibrating a circular tube to produce a constant fart sound.  I think this is hilarious – a guy with Upton’s mathematics prowess having a sense of humor!  The United States Government, however, had no sense of humor.  Francis Upton actually got arrested for his invention, because the government found it “rude.”  Lame again.

Happy Birthday, Francis!

Thanks, Wikipedia and GAP!

NASA’s Flying Lady with Long Distance Eyes, SOFIA

NASA has many telescopes in play, optical and otherwise, in a variety of forms.  We have the Hubble Space Telescope that peers at celestial bodies in several ways, ground-based telescopes that track the stars, radio telescopes that listen for whispers among the stars, and several other ways of watching and tracking the sky and beyond.  NASA has been working on a new one for a while (at least a few years) – this time it’s a far-infrared vision system mounted on a modified 747SP.

Meet SOFIA – NASA’s Stratospheric Observatory for Infrared Astronomy:

See that big gaping hole in the side of that aircraft?  That’s the telescope.  SOFIA flies around and tracks planets, stars, and other space stuff – at least when it’s operational.  That’s the plan.  Right now, the claim to fame is that SOFIA’s big open cavity is the largest ever flown.  The telescope is fully exposed, and NASA is making sure that all is copasetic with the design and equipment before doing any of the really cool stuff.

SOFIA’s main gear is a German-made, 2.5 meter far-infrared telescope capable of seeing between 0.3 and 1600 microns and weighing in at 34,000 pounds.  She’s going to be looking for planet formation in nearby star systems, planetary composition, dynamic activity in the Milky Way, and ultra-luminous infrared galaxies, among her other work.  SOFIA’s got a big task, and it is super cool to me that NASA is taking this to the skies.
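For a sense of scale, here’s the standard Rayleigh diffraction limit applied to that 2.5 meter mirror across SOFIA’s stated wavelength range – a back-of-the-envelope sketch, not NASA’s published performance numbers:

```python
# Rayleigh diffraction limit, theta ~= 1.22 * wavelength / aperture, for
# SOFIA's 2.5 m mirror. A back-of-the-envelope sketch, not NASA's specs.

import math

APERTURE_M = 2.5  # SOFIA's effective mirror diameter

def resolution_arcsec(wavelength_um: float) -> float:
    """Smallest resolvable angle at a given wavelength, in arcseconds."""
    theta_rad = 1.22 * (wavelength_um * 1e-6) / APERTURE_M
    return math.degrees(theta_rad) * 3600.0

for wl in (0.3, 30.0, 1600.0):  # spanning the stated 0.3-1600 micron range
    print(f"{wl:7.1f} um -> {resolution_arcsec(wl):8.2f} arcsec")
```

The longer the wavelength, the blurrier the view from a fixed aperture, which is part of why a big, flying, above-the-water-vapor mirror is worth the trouble for far-infrared work.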

Besides looking at the universe from a new angle, what I like best about SOFIA is that she’s not at all trying to blow up missiles, enemy troops, tanks, planes, or any of that other nonsense crap.  SOFIA is trying to scope out things that could help us find answers.  LOVE IT!

Here’s a few videos of SOFIA – the first is a NASA “Mission Update” video:

The second video is an air-to-air video of SOFIA in flight:

Last video – an animation of the SOFIA aircraft and some of its inner workings:

Be sure to check out the SOFIA mission page at NASA, and the Dryden Flight Research Center site.