Last week, I did a little experiment in interferometry: I masked the objective lens of my digital camera with a double-slit aperture and photographed a scene with street lights at different distances. Faraway lights showed a very nice interference pattern, as seen in this detail of the photo:
In the comments thread of the post, a question came up that I hadn't thought about before: how come an interference pattern can be seen in the first place, given that the source of light looks white? After all, the broad spectrum of wavelengths from a white light should result in a superposition of the interference patterns for each wavelength, and thus in a blurring of the fringes.
One reason for the appearance of the interference pattern is the spectrum of the street lights I had photographed: it is not a broad, continuous spectrum, but has a few bright lines. On the other hand, as I could convince myself, even for a broader spectrum there is enough contrast between the main central spot and the satellite spots for the pattern to be clearly distinguishable.
When searching for information about the spectral features of typical street lights, I came across a great website, The CD-ROM Spectroscope, by astronomer Joachim Köppen. It explains, for example, how to see the spectra of street lamps simply using a CD-ROM. Following the instructions on this site, I could check that the street lights are some variant of mercury vapour lamp - their spectrum looks pretty much like this photo taken from Joachim Köppen's site, with bright lines in the green, orange, and red:
Spectrum of a mercury vapour lamp as seen using a CD ROM as a diffraction grating; taken from: How to see the spectra of street lamps
I then set out to figure out the diffraction pattern to be expected from the double-slit aperture in my experiment for light with different wavelengths. Actually, there is an analytical formula for the diffraction pattern of a double slit in the so-called far-field or Fraunhofer limit, i.e. for asymptotically large distances from the aperture. When a lens is used to map the interference pattern into the focal plane, as in my experiment with the camera, this limit is automatically reached. The interference pattern then depends only on the angle away from the optical axis. For two slits of width w a distance d apart, the intensity pattern for light with wavelength λ is described by the formula

I(α) ∝ [sin(πw sin α/λ) / (πw sin α/λ)]² · cos²(πd sin α/λ)
where α is the angle away from the optical axis. The first factor is the diffraction profile of a single slit, and the second factor describes the two-beam interference.
Actually, instead of using this formula, I found it quite instructive to play around with a small program that calculates the intensity pattern by summing up the amplitudes of spherical waves emerging from the two slits, according to Huygens' principle, and squaring the result. Thus, I calculated the intensity pattern for two slits, each w = 0.5 mm wide, with a separation of d = 1 mm, as seen on a screen at a distance of D = 10 m, and for different wavelengths. The result, which is pretty much the same as that from the analytical formula, is shown here:
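My little program was written in perl and isn't reproduced here; a minimal Python sketch of the same idea (superposing the complex amplitudes of spherical waves from point sources spread across the two slits, then squaring) could look like this. The number of point sources per slit and the screen sampling are arbitrary choices:

```python
import cmath
import math

def double_slit_intensity(x, wavelength, w=0.5e-3, d=1.0e-3, D=10.0, n=100):
    """Huygens construction: superpose the amplitudes of spherical waves
    from n point sources across each slit, then square the magnitude."""
    k = 2 * math.pi / wavelength
    # n evenly spaced point sources across each of the two slits,
    # whose centres sit at -d/2 and +d/2
    sources = [(-d/2 - w/2) + w * i / (n - 1) for i in range(n)] \
            + [(+d/2 - w/2) + w * i / (n - 1) for i in range(n)]
    amp = 0j
    for s in sources:
        r = math.sqrt(D**2 + (x - s)**2)   # distance source -> screen point
        amp += cmath.exp(1j * k * r) / r   # spherical wave ~ exp(ikr)/r
    return abs(amp)**2

# intensity across the screen (D = 10 m) for green light, 550 nm
xs = [i * 1e-4 for i in range(-200, 201)]          # -20 mm .. +20 mm
pattern = [double_slit_intensity(x, 550e-9) for x in xs]
peak = xs[max(range(len(xs)), key=pattern.__getitem__)]
print(f"central maximum at x = {peak*1e3:.1f} mm")  # prints: central maximum at x = 0.0 mm
```

Scanning over several wavelengths then reproduces the plotted curves; the fringe spacing on the screen comes out as λD/d, about 5.5 mm for green light with these numbers.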
Calculated double-slit intensity patterns for four different wavelengths (click for larger view or the eps file)
The grey curve gives the sum of the intensities for the four different wavelengths. One sees that the pattern gets tighter for shorter wavelengths and that, as a result, the satellite peaks in the sum are broadened. However, there is still enough contrast between the central maximum and the first pair of satellite peaks for the fringe pattern to be visible. All further peaks, on the other hand, are washed out.
Now, to obtain the diffraction pattern of the street lights, one would have to sum up the diffraction patterns for the different wavelengths, weighted with the spectral density of the mercury vapour lamp. Even then, there is enough contrast for the three fringes of the interference pattern to be distinguishable, and even for the solar spectrum, one could expect sufficient contrast to see interference.
One final question, however, remains: according to the plot of the diffraction pattern for different wavelengths, one could expect the satellite peaks to look coloured: more blueish on the side towards the central peak, and more reddish on the outside. However, in the photo above, there are no colours. How come?
My guess is that this is due to the pixel resolution of my camera. Actually, what is recorded on the chip of the camera is the convolution of the diffraction pattern with the pixel characteristics of the CCD chip. One pixel of the photo corresponds to an angle of about 1/2 arc minute, and the peaks are about two arc minutes wide, corresponding to four pixels. This binning is too coarse to clearly distinguish colours, and hence the satellite peaks look white as well.
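These numbers can be checked quickly. Assuming a representative green wavelength of 550 nm and the slit separation d = 1 mm from above, the angular fringe spacing is λ/d:

```python
import math

lam = 550e-9     # assumed representative green wavelength
d = 1.0e-3       # slit separation from the experiment above
pixel = 0.5      # camera pixel scale in arc minutes (from the post)

fringe = lam / d                              # angular fringe spacing, radians
fringe_arcmin = math.degrees(fringe) * 60
print(f"fringe spacing: {fringe_arcmin:.1f} arcmin "
      f"~ {fringe_arcmin / pixel:.1f} pixels")
# prints: fringe spacing: 1.9 arcmin ~ 3.8 pixels
```

So a full fringe spans only about four pixels, and the colour gradient across a satellite peak falls within one or two pixels.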
Amazing what one can do with modern-day household equipment, isn't it? Which reminds me to dump a link to my earlier post on Interference.
Tsk, tsk, missing out on citations :)
I wonder if one of the newer high-pixel-density cameras with a telephoto or supertelephoto lens would be able to resolve colors.
"White" light is really quite narrow-band (e.g., nothing at all like "white" noise).
Yes, a superposition of the fringes appropriate to each wavelength! I haven't seen mention, however, of an obvious implication of that: a delicate form of chromatic aberration based on the different sizes of the diffraction disk for different wavelengths. So even a perfect optical system should show a reddish-to-yellow-green-fringed Airy disc at high magnification, due to the red-light disc being larger than the blue one, and so on in between. I've never seen anything about that! Have any of you?
ReplyDeleteIt wold be significant, really, with the D-disk for 660 nm (rich red) being 1.5 the diameter of the 440 nm (rich blue) - no petty difference.
BTW if I could promote an OT-ish diversion after an on-T comment: at my blog, there's a cute but genuinely perplexing relativity paradox for those who like such things. See also sci.physics etc.
PS: The main post talks about the case of sources with several discrete spectral lines. I meant to refer to the general case, of "white light" (like stars, in practice) seen in a telescope. Actual line sources, though, would appear about the same as that anyway, since a few color discs would be superposed. Not much difference, except some more discrete borders between colors.
Do street lamps also have an infrared component? Most digital cameras are sensitive to some bands in the near infrared. For example, you can see the light from a remote control with a digital camera, but not the unaided eye.
ReplyDeleteYou also have to take into account that inexpensive digital cameras use a single sensor and a set of filters to extract color information. The exact algorithm, which is optimized to produce decent photographs of the usual stuff, can shuffle positions and colors by a pixel or two.
Still, it's kind of amazing the stuff we get to play with these days. I remember buying a diffraction grating from Edmund's back in the 1960s. A friend of mine had a sheet of mylar for some project with radio antennas. That was pretty exotic then, but quite mundane now.
Hello,
street "mercury" lamps have a spectrum more or less similar to fluorescent lamps. The mercury lamp is a small (some centimetres) medium- to high-pressure arc inside; the elliptic outer glass bulb is coated inside with some fluorescent powder similar to the coating in fluorescent lamps. Due to the much higher pressure in the mercury burner, the lines are rather broad; some red/infrared comes from the glowing electrodes.
Older lamps can contain a filament inside ("compound lamps") which is used as a resistor for the mercury arc; in this case you don't need a special choke or transformer to operate the arc.
Recently the use of sodium lamps has become rather popular; formerly, those were used at crossings only. (There is a mistake in one of the links above, speaking of "orange" sodium lines. The 589 nm sodium line is indeed a doublet, but a rather close one, not resolved in a tinkerer's spectrometer. And of course it is yellow!)
I remember an American physics textbook named "Waves" from the sixties, which was sold with a set of optical color filters, a diffraction replica, polarisation filters, lambda/4 filters and some more. I bought that book, although I did not need it, more or less to get that set to play with.
Regards
Georg
Hi Stefan,
This post certainly confirms the scientist in you; for once a mystery was raised, you were not content until you found the answer. Then again, when we look at the original double-slit experiment as conducted by Young, there was no differentiation of wavelength, as the source was sunlight. In fact there weren't even two slits, just sunlight directed by a mirror through a pinhole in a window shutter, then passing lengthwise over a thin card onto a screen. So there were no slits; rather, the beam was separated by the card. Actually, it should have been called the split-beam rather than the two-slit experiment. Young's result was alternating bands of coloured light, not black and white bands.
Just as a query about that program you used: it has a ".pl" extension, which I tracked down to mean it was written in Perl, which was developed for Unix and comes as standard on Linux. Looking around, it doesn't appear to be very Windows-friendly, although it is for Mac. This may be a simple 2k program, yet not for dumb old Vista :-)
Best,
Phil
Always nice to understand the compositional arrangement of information one can deduce from transmission of planned spaceflight by Messenger.
Scientists additionally took their first look at the chemical composition of the planet's surface. The tiny craft probed the composition of Mercury's thin atmosphere, sampled charged particles (ions) near the planet, and demonstrated new links between both sets of observations and materials on Mercury's surface. The results are reported in a series of 11 papers published in a special section of Science magazine July 4. MESSENGER Settles Old Debates and Makes New Discoveries at Mercury. So "diffraction of light" becomes interesting while considering the calibration? :)
Hi Neil,
"So even a perfect optical system should show, a reddish-to-yellow-green-fringed Airy disc at high magnification due to the red-light disc being larger than the blue, and so on in between. I've never seen anything about that! Have any of you?" Good point! I am also not aware of "coloured" rims of the central spot in the Airy disk, but the effect should be real? Maybe the intensity typically is too small to perceive colour?
Actually, something I haven't done yet, but have in mind, is to calculate the spectrum in the diffraction pattern at a certain distance away from the optical axis, given the spectrum of the source. That's not so difficult to do numerically for, say, a Planck spectrum for the source and the analytically solvable diffraction patterns of the Airy pattern or the slit diffraction. The question then would be, what is the perceived colour of the resulting spectra?
Hi Kaleberg,
I don't know about the infrared component of street lights, but it may be partially filtered by the glass casing anyway? I wasn't aware that digital cameras are sensitive to the infrared.
"You also have to take into account that inexpensive digital cameras use a single sensor and a set of filters to extract color information. The exact algorithm, which is optimized to produce decent photographs of the usual stuff, can shuffle positions and colors by a pixel or two." Ah, thanks for the explanation. I wasn't aware of any technical details of what's going on in the camera with respect to colour determination. That's what I had in mind when writing about the "convolution of the diffraction pattern with the pixel characteristics" of the camera.
Hi Georg,
thanks for the explanations about "mercury vapour" lamps!
Hi Phil,
thanks for the comment and link about Young's original experiment!
The small program, I wrote it in perl because for such small tasks, I get results fast, and I know the language. There may be better suited languages.
Hi Arun,
""White" light is really quite narrow-band (e.g., nothing at all like "white" noise)." Indeed... that's probably the reason that even with sunlight (from a pinhole) or starlight, diffraction patterns don't look coloured?
Best Stefan.
Bee - And don't forget that almost all digicams use Bayer masks (filters) to generate the final color image. I believe only the Foveon sensor directly registers the correct color at each pixel rather than interpolating off a mask.