Latest Breakthroughs in Electro-Optics and Lasers

Researchers from around the world will present the latest breakthroughs in electro-optics, lasers and the application of light waves at the 2009 Conference on Lasers and Electro-Optics/International Quantum Electronics Conference (CLEO/IQEC) May 31 to June 5 at the Baltimore Convention Center in Baltimore.

HEALING WITH LIGHT

Star Trek scanners that fix injuries with beams of light may not be science fiction after all. A new optical technology that lines up living cells and controls their movements has opened the door to better artificial tissues and wounds that heal faster with less scarring.

For years, scientists have used the energy in laser light to drill microscopic holes or, acting as optical tweezers and traps, to direct and maneuver small pieces of matter. Guiding entire cells, though, has proven difficult because the lasers used for manipulation tend to damage the structural units of living organisms.

Now Aristide Dogariu and colleagues at the University of Central Florida in Orlando have developed an optical procedure that does not harm cells but does affect their skeletons – an ensemble of slender rods made of an abundant protein called actin. The actin rods are constantly growing and shrinking inside cells. The direction in which they grow changes the cell's membrane shape and dictates where the cell moves.

Dogariu and colleagues use the polarization of optical waves to create a field around the cells in which the growing actin rods line up like compass needles in the Earth's magnetic field. These optical fields can be used to guide large groups of cells to line up and move in the same direction.

The technique could be useful for cancer assays, which test the motility of cells, or as a non-invasive, non-toxic boost for regenerative medicine. Though cells have complicated and intriguing mechanisms to sense and communicate where an injury occurs, the possibility of using photonic scaffolds to stimulate and guide cell motility, and so accelerate tissue repair, is now quite promising.

Presentation CMMM2; Monday, June 1, 4 – 4:15 p.m.

VEHICLES THAT DRIVE THEMSELVES

The thought of a car or truck that can drive itself is at once exciting and frightening. Autonomous vehicle navigation, as the technology is known, may make life more convenient if it allows people to kick back and enjoy a good book or movie while their cars guide themselves through rush-hour traffic. But what happens if it starts to rain or if traffic suddenly picks up? If the technology is to work at all, it will have to be completely safe on all roads, at all speeds, and in all weather. Therein lies the challenge: if cars and trucks are to drive autonomously, they will need futuristic sensors and advanced computing capabilities to respond to ever-changing road conditions.

Perhaps the most extreme example of ever-changing conditions is a war zone, where roads may be reduced to rubble and vehicles are natural targets of attack. Rolling out fleets of self-navigating vehicles for the military is an enticing idea because it could keep thousands of troops out of harm's way. But will it be possible for these vehicles to operate in war zones? This question was the inspiration for a recent Defense Advanced Research Projects Agency (DARPA) contest aimed at spurring the development of such technologies.

Held at a former Air Force base in Victorville, Calif., in late 2007, the DARPA Urban Challenge offered a $3.5 million purse to competitors who could design the fastest and safest vehicles that could traverse a 60-mile urban course in moving traffic in less than six hours. The contestant vehicles were unmanned and had to complete a simulated military supply mission, maneuvering through a mock city environment, avoiding obstacles, merging into moving traffic, navigating traffic circles, and negotiating intersections -- all while conforming to California driving rules. Of the 89 international teams that entered the challenge, only six finished in the allotted time.

Wende Zhang of General Motors was part of the team that designed the winning vehicle, which finished with the fastest time -- an average speed of approximately 13 miles per hour. The GM team drew upon technology already offered in some of its vehicles that can assist in parking or detect lane markers and trigger alarms if the driver drifts too close to the shoulder of the road. For the DARPA challenge, they developed a more sophisticated package of sensors that included GPS coupled with a camera and a laser-ranging LIDAR system to guide and correct the vehicle's route through the city. In Baltimore, Zhang will present GM's patented new methods for detecting lanes and correcting a vehicle's route, which helped the team win the challenge.
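The patented methods themselves are not spelled out in the talk summary, but the general idea of fusing several sensors' views of the lane can be sketched in a few lines. The sketch below is a hypothetical illustration only: the sensor names, the fixed weights, and the simple weighted average and proportional steering rule are assumptions made for the example, not GM's algorithm.

```python
# Hypothetical sketch of combining GPS, camera, and LIDAR estimates of the
# vehicle's lateral position into one lane-keeping correction. All names,
# weights, and gains below are illustrative assumptions.

def fuse_lane_center(gps_offset_m, camera_offset_m, lidar_offset_m,
                     weights=(0.2, 0.4, 0.4)):
    """Return a fused estimate of the lateral offset (meters) from lane center.

    Each argument is one sensor's estimate of how far the vehicle sits from
    the lane center; positive values mean the vehicle has drifted right.
    """
    estimates = (gps_offset_m, camera_offset_m, lidar_offset_m)
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)


def steering_correction(fused_offset_m, gain=0.5, max_angle_deg=5.0):
    """Map the fused offset to a bounded steering correction in degrees."""
    angle = -gain * fused_offset_m          # steer back toward lane center
    return max(-max_angle_deg, min(max_angle_deg, angle))


if __name__ == "__main__":
    offset = fuse_lane_center(gps_offset_m=0.40, camera_offset_m=0.30,
                              lidar_offset_m=0.35)
    print(f"fused offset: {offset:+.2f} m, "
          f"correction: {steering_correction(offset):+.2f} deg")
```

A production system would replace the fixed weights with something like a filter that weighs each sensor by its moment-to-moment reliability, which matters precisely in the rain-and-rubble conditions described above.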

Despite the win, don't look for robotic chauffeurs immediately. The technology must prove reliable in many different road, weather and lighting conditions. Still, says Zhang, a commercially viable autonomous driving product may be available in the next decade.

Presentation PThB1; Thursday, June 4, 2:15 – 2:45 p.m.

FLEXIBLE MONITORS FOR FUTURE BATTLEFIELDS

Among the technological demands of an increasingly sophisticated U.S. military force is the need for futuristic computer displays. While existing flat-panel, light-emitting diode (LED) displays are good for most commercial purposes, they may not be optimized for the modern battlefield; they could be too heavy and too fragile, for instance. Making them more durable with protective aluminum and plexiglass casing would only add bulk and weight. Their energy consumption is also an issue in the field, where U.S. soldiers have to maximize precious battery power.

Flexible displays are an attractive alternative to existing liquid crystal display (LCD) models because they would be lighter and more durable, consume less power, and could ultimately be rolled up and stuffed in a pocket between uses. The technology needed to make such displays already exists. It is based on arraying pixels of individual red, green, and blue LEDs on top of electronic circuitry fabricated on flexible plastic substrates. A number of laboratories in the U.S. have already made experimental versions of such flexible displays.

The key challenge, says Eric Forsythe of the U.S. Army Research Laboratory, is to improve the size, weight, and energy efficiency of these experimental displays and to find a design that can be easily manufactured. In Baltimore, Forsythe will discuss the latest research on organic LEDs and the U.S. Army's progress toward pilot-scale production of flexible displays with improved efficiency. Currently, the lab has a small experimental display with 320 x 240 pixel resolution built on a flexible material known as polyethylene naphthalate. He estimates that within a couple of years, a more manufacture-friendly model of a PDA-like flexible display will exist.

Presentation PThA1; Thursday, June 4, 10:30 – 11 a.m.

WORLD'S HIGHEST-RESOLUTION COMMERCIAL SATELLITE

Since the early 1960s, super powerful spy satellites have been the stuff of the military and intelligence communities. Now two U.S. companies have launched commercial imaging satellites that offer the same sort of space-based images of the Earth to the public. One of these companies, GeoEye of Dulles, Va., launched a multi-million dollar satellite last year, and it's the highest-resolution commercial imaging satellite in the world.

From its vantage point 425 miles up in space, the 4,300-pound GeoEye-1 satellite orbits the Earth and focuses its powerful lens on the surface below, snapping electronic images that can resolve objects on the ground as small as 41 cm (16 inches) across. That's approximately the size of home plate on a baseball diamond. These images are typically processed and sold to the military for mapping and to companies like Google, which makes them available to the public through its Google Earth platform. (Because of federal regulations, the publicly available images are of slightly lower resolution -- approximately 50 cm.)
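As a rough sanity check on those figures (my own back-of-the-envelope arithmetic, not a GeoEye specification), resolving 41 cm from 425 miles up corresponds to an angular resolution of well under a microradian:

```python
# Back-of-the-envelope check of the figures in the article: the angle that
# a 41 cm object on the ground subtends when viewed from 425 miles up.
# (Pure geometry; real image quality also depends on the optics, the
# atmosphere, and ground processing.)

import math

ALTITUDE_M = 425 * 1609.344        # orbital altitude: 425 miles in meters
GROUND_RESOLUTION_M = 0.41         # 41 cm ground resolution

angle_rad = GROUND_RESOLUTION_M / ALTITUDE_M
print(f"{angle_rad * 1e6:.2f} microradians "
      f"(~{math.degrees(angle_rad) * 3600:.2f} arcseconds)")
# ~0.60 microradians, or roughly 0.12 arcseconds -- about the apparent
# size of home plate seen from 425 miles away, as the article notes.
```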

In Baltimore at next week's CLEO/IQEC, GeoEye's Systems Engineering Director Michael Madden will describe some of the satellite's key features, such as the fact that it is the first commercial satellite with military-grade star trackers, which, along with GPS, make the satellite's imagery highly accurate -- an important quality for making precise maps. He will also preview GeoEye-2, a satellite expected to launch around 2012 with a ground resolution twice as fine as that of GeoEye-1.

These powerful public eyes in the sky have already had an impact. Madden says, for instance, that a researcher at the University of California, San Diego is using satellite imagery to search for the tomb of Genghis Khan in Mongolia. A few months ago, one of the enduring photos taken during U.S. President Barack Obama's inauguration was the image captured by GeoEye-1 of the National Mall in Washington, D.C., which showed throngs of people crowded together. In March 2009, the GeoEye-1 satellite captured a close-up image of a North Korean missile sitting on the launch pad just 25 minutes before launch. GeoEye-1 also provided a look at the annual Cherry Blossom Festival held in Washington, D.C. The space photo was detailed enough to resolve individual trees, ripples on the Potomac River, and the people and cars crowded along the Tidal Basin, the area in downtown Washington where the festival takes place.

Presentation PWB4; Wednesday, June 3, 6:15 – 6:45 p.m.

QUIETER MIRRORS

In physics, many subtle phenomena can be studied by allowing waves to interfere with each other. In an interferometer, light waves travel by two different paths, directed from place to place by strategically placed mirrors, and converge at a detector, where they produce a striped interference pattern. The pattern can be read out to learn details of the journey taken by the waves. Interferometry is used in many endeavors, such as navigation, optical clocks, encryption, and the attempt to observe gravitational waves. The quality of the interferometer depends on the positions of the mirrors being precisely stable. Unfortunately, when experiments are carried out at room temperature, even the smallest amount of heat present will agitate the mirrors; a century ago, Albert Einstein showed how thermal energy brings about just this kind of random jiggling, known as Brownian motion.
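For readers who want the standard formula behind that striped pattern: if the two beams have intensities I1 and I2 and their paths differ in length by ΔL at wavelength λ, the intensity at the detector follows the textbook two-beam interference relation (general optics, not specific to Kimble's setup):

```latex
I(\Delta L) = I_1 + I_2 + 2\sqrt{I_1 I_2}\,
              \cos\!\left(\frac{2\pi\,\Delta L}{\lambda}\right)
```

Because the path difference enters divided by the wavelength, a mirror that jitters by even a few nanometers shifts the fringes measurably, which is why thermal motion of the mirrors limits the instrument's precision.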

H. Jeff Kimble of Caltech will describe a new effort to counteract thermal noise and improve the sensitivity of interferometers. He and his colleagues argue that the very slight movement of a mirror's surface caused by thermal noise is accompanied by correlated changes in other physical parameters, such as the mirror's index of refraction. These correlated changes can be exploited to compensate, in a coordinated way, for the deleterious effects of the mirror surface's motion.

Presentation CWI1; Wednesday, June 3, 4:45 – 5:15 p.m.

TERAHERTZ MODULATORS

Scientists have for the first time devised a multi-pixel modulator for light waves at terahertz (THz, or 10^12 Hz) frequencies. The formal study of THz radiation, which can be described as far-infrared light, dates back many years but has become increasingly widespread since around 1990, when efficient methods for generating and detecting the radiation became available. Expected applications include biological spectroscopy and imaging buried structures in semiconductors.

Rice University physicist Daniel Mittleman and his colleagues at Sandia and Los Alamos National Labs use a metamaterial to turn a stream of THz waves off and on. The device is called a metamaterial because it consists of an array of microscopic split metal rings. The rings can be controlled by nearby electrodes; modulating the rings' capacitance, in turn, modulates the radiation; that is, the THz light (sometimes called T rays) can be switched so as to pass through or not. The modulator consists of 16 pixels in a 4 x 4 array. Mittleman reports that this is the first time the wavefront of a THz beam has been under electrical control, which matters because THz wavelengths may be well suited to imaging, and controlling light across a whole plane, rather than just along a line, is a first step toward making that possible. The switching speed, about 1 MHz, isn't fast compared to today's quickest data transmissions. But, Mittleman says, high bandwidth is not necessary for many of the imaging tasks that will be carried out by T rays. A larger 32 x 32 pixel array is now being designed.
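Conceptually, the modulator behaves like a programmable 4 x 4 on/off mask for the THz beam. The short sketch below is a simplification written for illustration, with made-up voltages and a hard on/off threshold rather than the device's real electrical response:

```python
# A simplified picture of a multi-pixel THz modulator: each of the 16
# pixels in the 4 x 4 array either passes or blocks the beam, so the
# transmitted wavefront is the incident intensity times a programmable
# 0/1 mask. The real device modulates the split rings' capacitance.

import numpy as np

def pixel_mask(control_voltages, threshold=0.5):
    """Convert a 4 x 4 array of control voltages into a 0/1 transmission mask."""
    return (np.asarray(control_voltages) > threshold).astype(float)

def transmitted_wavefront(incident_intensity, mask):
    """Apply the pixel mask to an incident THz intensity pattern."""
    return np.asarray(incident_intensity) * mask

if __name__ == "__main__":
    volts = np.random.rand(4, 4)     # whatever the drive electronics apply
    beam = np.ones((4, 4))           # uniform incident THz beam
    print(transmitted_wavefront(beam, pixel_mask(volts)))
```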

Presentation CThX2; Thursday, June 4, 2:45 – 3 p.m.

WORLD'S HIGHEST-RESOLUTION PROJECTOR

If one were to stack 16 of the world's best high-definition projectors side by side and on top of one another, the combined projected image would contain 33 megapixels. This is the resolution achieved by the world's highest-resolution projector, soon to be unveiled by the company Evans & Sutherland (E&S) of Salt Lake City, Utah.

Most projectors contain two-dimensional arrays of pixels, tic-tac-toe arrangements of tiny microelectromechanical systems (MEMS) devices that each light up with a particular color. Because fabricating 33 million of these devices is a tricky endeavor, the E&S projector uses only a single column of 4,000 pixels, powered by a beam of laser light. This rapidly changing vertical stripe of colors is swept across the screen faster than the eye can see, so spectators see the illusion of a projected 2-D image.
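The numbers hang together if one assumes the 2:1 image is built by sweeping that 4,000-pixel column across roughly twice as many horizontal positions as it has vertical pixels. The quick check below is my own arithmetic using the figures quoted in the article, and it lands near 33 megapixels either way:

```python
# Quick arithmetic check of the figures in the article, assuming the 2:1
# image is built by sweeping the 4,000-pixel column across twice as many
# horizontal positions as it has vertical pixels.

COLUMN_PIXELS = 4_000
ASPECT_RATIO = 2            # image is twice as wide as it is high
HD_PIXELS = 1920 * 1080     # one conventional high-definition projector

swept_columns = COLUMN_PIXELS * ASPECT_RATIO     # ~8,000 column positions
total_pixels = COLUMN_PIXELS * swept_columns     # ~32 million pixels

print(f"swept image:      ~{total_pixels / 1e6:.0f} megapixels")
print(f"16 HD projectors: ~{16 * HD_PIXELS / 1e6:.0f} megapixels")
# Both come out near the 33 megapixels quoted in the article.
```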

To create this projector, which has twice the resolution of any that currently exists, the company had to develop powerful fiber lasers. These lasers, discussed in Forrest Williams' talk, may have uses in other projects, such as making anti-counterfeit identifiers or projecting artificial stars into the night sky to calibrate astronomical instruments.

The projector, which creates a 2:1 image (twice as wide as it is high), will be marketed to planetariums and to simulation and training companies that currently wire multiple projectors together to display large images.

Presentation PThA2; Thursday, June 4, 11 – 11:30 a.m.

PICOSECOND OSCILLOSCOPE

An oscilloscope is a device for displaying signals that are too fast to be seen by the human eye. Typically the signal consists of a voltage level that changes quickly moment by moment (over millisecond to nanosecond timescales). What is seen on the screen of the scope is a waveform whose value is graphed along the vertical axis as a function of time along the horizontal axis. An electron beam, aimed at a phosphorescent screen, is swept horizontally, providing a light trace on the screen, while, simultaneously, the instantaneous voltage of the input signal deflects the electron beam up or down, creating the visible waveform. The dynamic range of this whole process is the range of voltage values that can be displayed; the other important feature is the time resolution: how fine a time scale can be achieved.

Conventional analog television displays use comparable technology. A trace is swept horizontally across the screen, but instead of deflecting the beam up and down, the beam is interrupted or allowed to proceed toward the phosphor screen, where the trace shows up as a bright or dark spot. The beam is then scanned across the screen again and again in a raster pattern to build up a complete image -- at 30 or 60 frames per second, too quickly for the human eye to notice.
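Stripped of the electron beam and phosphor, the scope's job reduces to a simple mapping: sample index (time) becomes the horizontal coordinate and voltage becomes the vertical one. The sketch below is just that mapping in code, with an arbitrary screen size and voltage range chosen for illustration:

```python
# The scope's display, reduced to arithmetic: sample index (time) maps to
# the horizontal screen coordinate, voltage maps to the vertical one. The
# screen size and voltage range here are arbitrary illustration values.

import math

def trace_points(samples, v_min=-1.0, v_max=1.0, width=800, height=600):
    """Map a sequence of voltage samples to (x, y) screen coordinates."""
    n = len(samples)
    points = []
    for i, v in enumerate(samples):
        x = int(i * (width - 1) / max(n - 1, 1))              # horizontal sweep
        y = int((v - v_min) / (v_max - v_min) * (height - 1))
        points.append((x, height - 1 - y))                    # larger v plots higher
    return points

if __name__ == "__main__":
    sine = [math.sin(2 * math.pi * i / 100) for i in range(200)]
    print(trace_points(sine)[:5])
```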

For performing high-end physics, ordinary oscilloscopes and televisions aren't fast enough, and the deflection of a beam used to display an image or a short-lived signal requires a different technology, which sometimes goes by the name "streak camera." Because the electrons comprising the beam are charged particles that repel one another, the signals they carry suffer unavoidable blurring where the signal strength is strongest, thereby limiting the useful dynamic range. John Heebner and colleagues at Lawrence Livermore National Lab (LLNL) recently devised a solid-state all-optical streak camera, the first to attain a time resolution near 1 picosecond while simultaneously preserving a wide dynamic range of 3000:1. In this camera, the beam being deflected consists not of charged electrons but of uncharged photons, which do not suffer from the limitations of conventional streak cameras.

He achieves an unprecedented deflection rate of a light beam by sending it through an ordinary planar waveguide whose optical properties can be nearly instantaneously modified by a separate pump laser beam incident from above. A sequential array of "transient" prisms is created by first allowing the pump beam to pass through a serrated mask. When the pump beam is properly synchronized to the signal beam to be recorded, time-of-flight at the speed of light does the rest. Because later portions of the signal encounter more prisms, that part of the signal is deflected by a greater amount than the earlier portions of the signal that had already advanced through the waveguide before the prisms turned on. The prisms persist for the duration of the sweep and disappear in time for the process to start again with the next trace. Each deflected light trace is then focused onto an array of camera pixels. The light level detected on the array thus preserves a recording, over time, of the light beam's intensity.
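The essential relationship is that a slice of the signal arriving later in the sweep passes through more transient prisms and therefore lands farther along the camera array. The toy model below is my own illustration of that time-to-position mapping; the sweep duration, prism count, and deflection per prism are invented values, not LLNL's parameters:

```python
# Toy model of the SLIDER geometry (my own illustration, not LLNL code):
# a slice of the signal arriving later in the sweep has passed through
# more of the transient prisms, so it is deflected onto a farther pixel.

def prisms_seen(arrival_time_ps, sweep_ps=100.0, n_prisms=50):
    """Number of transient prisms a slice arriving at the given time encounters."""
    fraction = min(max(arrival_time_ps / sweep_ps, 0.0), 1.0)
    return int(fraction * n_prisms)

def pixel_position(arrival_time_ps, deflection_per_prism_px=4):
    """Camera pixel on which a slice arriving at the given time lands."""
    return prisms_seen(arrival_time_ps) * deflection_per_prism_px

if __name__ == "__main__":
    for t_ps in (0.0, 25.0, 50.0, 100.0):      # picoseconds into the sweep
        print(f"t = {t_ps:5.1f} ps  ->  pixel {pixel_position(t_ps)}")
```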

Heebner's device, which he calls serrated light illumination for deflection encoded recording (or SLIDER), can even be used to study short bursts of light in the X-ray region of the light spectrum. This is accomplished by first encoding the X-ray signal onto an optical beam using an optical device (a Fabry-Perot cavity) that can be modulated at picosecond timescales. This makes SLIDER potentially valuable for monitoring the brilliant bursts of X-rays streaming from fusion targets at the collision point where the multiple laser beams of LLNL's National Ignition Facility (NIF) come together.

The benefit of the device is that it enables the recording of very fast phenomena. As the world's fastest light deflector, it can be used as a picosecond oscilloscope or for observing transient events like the miniature fusion reactions at NIF.

Presentation CThW1; Thursday, June 4, 2:30 – 3 p.m.
