Optics 101

Beam Divergence

The beam divergence of an electromagnetic beam is a measure of how quickly the beam's diameter or radius increases with distance from the antenna or optical aperture through which the beam emerges.

For a diffraction-limited beam, the divergence is inversely proportional to the beam size at the aperture: the divergence can be reduced by increasing the beam size at the aperture, and vice versa.
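This inverse relationship can be illustrated with a short sketch. The numbers below (a 1550 nm wavelength and a few waist sizes) are illustrative assumptions, not values from the article; the formula used is the diffraction-limited Gaussian-beam divergence discussed later in this article.

```python
import math

# Illustrative sketch: for a diffraction-limited Gaussian beam the
# divergence scales as 1 / waist_radius, so enlarging the beam at the
# aperture reduces its divergence. All numbers are made up for the demo.
wavelength = 1550e-9  # metres (illustrative telecom wavelength)

for waist_mm in (0.5, 1.0, 2.0):
    waist = waist_mm * 1e-3                      # millimetres -> metres
    theta = wavelength / (math.pi * waist)       # half-angle, radians
    print(f"waist {waist_mm} mm -> divergence {theta * 1e3:.3f} mrad")
```

Doubling the beam radius at the aperture halves the far-field divergence.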

Low beam divergence is critical for free-space optical communications and beam pointing. Beams with very small divergence, i.e. an approximately constant beam radius over a significant propagation distance, are called collimated beams.

A beam with considerably larger divergence than the diffraction limit is said to have poor beam quality. Beam divergence is usually determined by measuring the beam caustic, i.e. the beam radius at various positions along the propagation axis, using a beam profiler.

Like other electromagnetic beams, laser beams diverge; their divergence is typically measured in degrees or milliradians.
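Converting between the two units is straightforward. The 1.2 mrad value below is an illustrative assumption, not a figure from the article:

```python
import math

# Convert a divergence angle from milliradians to degrees.
theta_mrad = 1.2                     # divergence in milliradians (example)
theta_rad = theta_mrad * 1e-3        # milliradians -> radians
theta_deg = math.degrees(theta_rad)  # radians -> degrees

print(f"{theta_mrad} mrad = {theta_deg:.4f} degrees")
```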

The divergence of a laser beam is directly proportional to the laser wavelength and inversely proportional to the diameter of the laser beam. The divergence of good-quality laser beams is usually modeled using Gaussian beams. A Gaussian laser beam is said to be diffraction-limited when its radial beam divergence θ is given by

         θ = λ/(πω)

where λ is the wavelength of the laser and ω is the radius of the laser beam at its narrowest point, referred to as the beam waist.
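The formula above can be checked numerically, together with the standard Gaussian-beam radius formula w(z) = ω·sqrt(1 + (z/z_R)²), which shows the beam spreading at the angle θ far from the waist. The example values (a 633 nm helium-neon wavelength and a 0.5 mm waist) are illustrative assumptions:

```python
import math

# Diffraction-limited divergence of a Gaussian beam: theta = lambda/(pi*w0).
# Example values are illustrative (633 nm HeNe line, 0.5 mm waist radius).
wavelength = 633e-9  # metres
w0 = 0.5e-3          # beam radius at the waist, metres

theta = wavelength / (math.pi * w0)  # half-angle divergence, radians
z_R = math.pi * w0**2 / wavelength   # Rayleigh range, metres

# Beam radius at distance z from the waist: w(z) = w0*sqrt(1 + (z/z_R)^2).
z = 10.0  # metres, far beyond the Rayleigh range
w_z = w0 * math.sqrt(1 + (z / z_R) ** 2)

# Far from the waist the radius grows almost linearly, roughly theta * z.
print(f"theta = {theta * 1e3:.3f} mrad, w(10 m) = {w_z * 1e3:.2f} mm")
```

Note that far beyond the Rayleigh range, w(z) ≈ θ·z, which is exactly what "divergence" measures.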

Beam divergence can also be derived from the complex amplitude profile of the beam measured in a single plane, which can be obtained using a Shack-Hartmann wavefront sensor.


