Understanding Fourier Optics: Object Distance & Image Formation Explained

In summary, when imaging a distant object, the (FT)^2 at the rear focal plane is the image of the object.
  • #1
fisico30
Dear Forum,
Consider an object, illuminated with coherent light, that is located very far away from a positive lens.
The object is considered to be at "optical infinity" (its distance is simply very large compared to the focal length of the lens).
In that case, the image of the object (real, small, inverted) will form on the back focal plane of the lens (to be picky, on a plane that is infinitesimally close to the focal plane).

Here is my confusion: what forms on the back focal plane, the image of the object or the modulus squared of the FT of the object?

It would seem that both the FT and the image lie in the same plane, or at least that the FT falls within the depth of focus...

thanks
fisico30
 
  • #2
I'm not sure what you're asking: the field at the rear focal plane is the FT of the field at the entrance pupil. For example, a distant star provides a plane wave at the pupil, so the field is a 'circle function' (i.e. |U| = 1 inside the circle and |U| = 0 outside). The FT is then an Airy pattern.
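
If it helps, that statement is easy to check numerically. A minimal sketch in Python/NumPy (the grid size and pupil radius are arbitrary choices, not tied to any physical units):

[code]
import numpy as np

# Circular pupil ("circle function"): |U| = 1 inside the circle, 0 outside.
N = 512                          # grid size (arbitrary)
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 32**2).astype(float)   # radius of 32 samples (arbitrary)

# A distant point source fills the pupil with a unit-amplitude plane wave,
# so the field at the rear focal plane is just the FT of the pupil.
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
airy = np.abs(field)**2          # what a square-law detector records: the Airy pattern
[/code]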

The image can often be considered as the object convolved with the point spread function (alternatively, the FT of the image is the FT of the object times the pupil function). For incoherent imaging (that is, intensity detection rather than field detection), the FT of the image intensity is the FT of the object intensity times the autocorrelation of the pupil function (the optical transfer function).

This is covered in more detail in, for example, "Introduction to Fourier Optics" by Goodman, in Chapter 6.
 
  • #3
So, take an object at a finite distance d_o from a positive lens of focal length f.

The image will form on a plane at a distance d_i (using the simple lens equation).
Don't we find the modulus squared of the FT of the object on the rear focal plane of the lens at distance f?
I thought that was what Goodman's book said...

Sure, the image is not perfect: it is the convolution of the object with the PSF, which is the FT of the entrance pupil (here the lens aperture itself, a circular aperture)...
thanks
fisico30
 
  • #4
If I understand you correctly, that's true if the object distance is equal to the focal length. That is, placing the object a distance f in front of the lens produces the FT of the object at the rear focal plane of the lens. The image is then at infinity...

If the object is located at an arbitrary distance d_o, then the rear focal plane has the FT of the object times a quadratic phase factor that goes like exp[i(k/2f)(1 - d_o/f)(x^2 + y^2)]. Again, whether you see the FT or |FT|^2 depends on the detector.
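
To see why the detector doesn't care about that phase factor, here is a minimal 1-D sketch in Python/NumPy (the object is a made-up double slit and the constants inside the phase are placeholders; the only thing that matters is that its modulus is 1):

[code]
import numpy as np

# Toy 1-D object: a double slit, arbitrary units
N = 1024
x = np.linspace(-1, 1, N)
obj = ((np.abs(x - 0.1) < 0.02) | (np.abs(x + 0.1) < 0.02)).astype(float)

ft = np.fft.fftshift(np.fft.fft(obj))

# Quadratic phase of the form exp[i*(k/2f)*(1 - d_o/f)*u^2], with placeholder constants
u = np.arange(N) - N // 2
phase = np.exp(1j * 1e-4 * (1 - 5.0) * u**2)

print(np.allclose(np.abs(ft * phase)**2, np.abs(ft)**2))   # True: intensity is unchanged
[/code]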
 
  • #5
Ok thanks,
I think you are seeing what I mean: no matter what the object distance d_o is, we always (and only) find the (FT)^2, using a square-law detector, on the rear focal plane of the lens.

If d_o becomes very large (not at infinity, but far enough that d_o/f is very large), we still find the (FT)^2 on the rear focal plane (we don't worry about the extra phase term since we are recording intensity), and the image forms on (or infinitesimally close to) the rear focal plane, where there is always the (FT)^2...

The dilemma is: does the (FT)^2 look like the image of the object? Surely not, but both appear to form on the same plane or almost...

thanks
fisico30
 
  • #6
Ah- now I see what you are getting at. Interesting...

Ok, the (simplified!) impulse response:

[tex] h(x_{i},y_{i},x_{o},y_{o})\cong \int P(x,y) exp[i\frac{k}{2}(\frac{1}{d_{o}}+\frac{1}{d_{i}}-\frac{1}{f})(x^{2}+y^{2})] exp[-ik[(\frac{x_{o}}{d_{o}}+\frac{x_{i}}{d_{i}})x + (\frac{y_{o}}{d_{o}}+\frac{y_{i}}{d_{i}})y]]dx dy [/tex]

and the image field is given by the usual convolution:

[tex]U_{i}(x_{i},y_{i}) = \int h(x_{i}, y_{i},x_{o},y_{o})U_{o}(x_{o},y_{o})dx_{o}dy_{o}[/tex]

Using the lens law (1/d_o + 1/d_i = 1/f), setting d_o to infinity so that d_i = f, gives:

[tex] h(x_{i},y_{i},x_{o},y_{o})\cong \int P(x,y)exp[-ik(\frac{x_{i}}{f}x + \frac{y_{i}}{f}y)]dx dy [/tex]

which reproduces the result you get when the object is at the front focal plane, *when the object at the front focal plane is a spatially-transmitting object illuminated by a plane wave*. So imaging a distant object is equivalent to imaging the FT of the object located at the front focal plane of the lens. Which I think I understand, because the far-field diffraction pattern of an object is the FT of the object.
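
Here's a minimal numerical sketch of that limit (Python/NumPy, everything in pixel units, object and pupil sizes made up). Since h reduces to the FT of the pupil when d_o goes to infinity, the convolution U_i = h * U_o is easiest to do in the frequency domain, by multiplying the object spectrum by the pupil:

[code]
import numpy as np

N = 256
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x)

# Toy object field: a couple of bright points (stand-in for a distant object)
U_o = np.zeros((N, N))
U_o[N // 2, N // 2] = 1.0
U_o[N // 2 + 20, N // 2 - 15] = 1.0

# Ideal circular pupil P(x, y); the radius (in frequency samples) is arbitrary
pupil = (X**2 + Y**2 <= 20**2).astype(float)

# h = FT of the pupil, so  U_i = h * U_o  <=>  spectrum of U_i = pupil * spectrum of U_o
spec = np.fft.fftshift(np.fft.fft2(U_o))
U_i = np.fft.ifft2(np.fft.ifftshift(spec * pupil))   # coherent image field
I_i = np.abs(U_i)**2                                  # intensity on a square-law detector
[/code]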

That's a pretty cool observation you made...
 
  • #7
Cool thanks.
So you say that:

" So imaging a distant object is equivalent to imaging the FT of the object located at the front focal plane of the lens."
I am going to spend some time on Goodman to convince myself even more.
The farther away an object is, the fewer spatial frequencies are captured by the lens aperture...

On a different note: consider an object, say a small train, and a positive lens. Assume no aberrations (a perfect world). The lens will form an image of the train, according to the lens equation, on a specific plane located at d_i.
What if we illuminated the train with coherent illumination, like a laser? I think we would see an image of the train.
What if we illuminated the object with incoherent light (white light)? We would still see the image of the train.
The two images (aside from the colors) should be identical, correct?

The imaging processes are different: in coherent imaging each point in the object is a delta which becomes a point spread function in the image plane. Here we are talking about the E field. We sum all the superimposed PSFs (all the E fields) to get the total E field, and finally square the total E field to get the irradiance (an E-field convolution).
In the incoherent case, we first square each PSF and then sum them all together (an intensity convolution).
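
To make sure I have the two recipes straight, here is a minimal 1-D sketch (Python/NumPy, toy object and a made-up sinc-shaped amplitude PSF, nothing tied to a real lens):

[code]
import numpy as np

N = 512
x = np.arange(N) - N // 2

# Toy object field (two narrow bars) and a toy amplitude PSF
U_o = ((np.abs(x - 15) < 3) | (np.abs(x + 15) < 3)).astype(float)
h = np.sinc(x / 10.0)

# Coherent: convolve the FIELDS, then square the total field
I_coherent = np.abs(np.convolve(U_o, h, mode="same"))**2

# Incoherent: square the PSF and the object first, then convolve the INTENSITIES
I_incoherent = np.convolve(U_o**2, np.abs(h)**2, mode="same")

# The two intensity patterns form in the same plane but are generally not identical
# (the coherent image shows ringing and interference near the edges).
[/code]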

Again, ignoring aberrations and speckle, could we get images that are identical in shape? I know they will both form in the same plane...

thanks
fisico30
 
  • #8
I think the term 'coherent' is being used to refer to a few unrelated things.

Laser light is very temporally coherent but not very spatially coherent (the speckle). Detectors can be coherent or incoherent as well, depending on whether they detect the phase or not.

A coherent detector will see the image as E = h * U (E = image field, h the PSF, U the source field), and so the recorded intensity is I = |E|^2 = |h*U|^2. In Fourier space, e = Hu, and the spectrum of the recorded intensity is the autocorrelation of e = Hu with itself. The coherent transfer function H is the pupil function: H = 1 for frequencies inside the circle function and H = 0 outside (for an ideal circular aperture). The spatial frequency where H drops to zero is the 'cutoff frequency'.

An incoherent detector, by contrast, can only measure I = |h|^2 * |U|^2, or in terms of spatial frequencies, i = (H*H)(u*u). The incoherent transfer function is the autocorrelation of the pupil function H*H, and for an ideal circular aperture the cutoff frequency is twice the cutoff frequency for the coherent case. However, the contrast at all frequencies is less than 1 (except for the DC component).
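
A 1-D sketch of the two transfer functions (Python/NumPy; the pupil half-width in frequency samples is an arbitrary choice):

[code]
import numpy as np

Nf = 1024
f = np.arange(Nf) - Nf // 2              # spatial-frequency axis, in samples

# Coherent transfer function: the pupil itself (ideal 1-D aperture)
cutoff = 100                              # coherent cutoff frequency (arbitrary)
H = (np.abs(f) <= cutoff).astype(float)

# Incoherent transfer function (OTF): autocorrelation of the pupil, 1 at DC
OTF = np.correlate(H, H, mode="same")
OTF /= OTF.max()

print(OTF[np.abs(f) == cutoff])           # ~0.5: contrast already reduced at the coherent cutoff
print(OTF[np.abs(f) == 2 * cutoff])       # ~0  : the incoherent cutoff is twice the coherent one
[/code]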

So it's not fair to ask, "Is coherent imaging better or worse than incoherent imaging?" Incoherent imaging has a higher cutoff frequency ('increased resolution'), but the contrast is lower. The two images are quantitatively different, and in the visible spectrum we don't have coherent detectors anyway. I worked with a millimeter-wave imaging system once (30 GHz), and that was a coherent detector system.

And again, this use of 'coherent' is unrelated to the coherence state of the illumination. Laser light is usually a poor choice for wide-field imaging because of the speckle, so a 'spatial filter' (a pinhole) is used to increase the spatial coherence, decreasing the speckle. Light scattering applications (and fluorescence correlation spectroscopy), AFAIK, *love* speckle because it increases the signal-to-noise ratio.
 
  • #9
Actually, I think a laser is pretty coherent both temporally and spatially.
Speckle is a manifestation of how coherent a laser is: when it reflects off a rough surface we get all those tiny interference points (speckle), which would not show up if the illumination were incoherent...

Now is speckle due to temporal or spatial coherence? I will have to investigate...
thanks
fisico30
 
  • #10
The speckle due to reflection from a rough surface has its origin in the spatial coherence of the incident light. When you look at a single speckle, you are looking at light reflected from parts of the surface that are spatially separated from one another: each point in the speckle pattern is formed by the constructive and destructive interference of the light scattered from all the other points of the surface.
If the light were only temporally coherent, you couldn't see any speckle, because at a given time the light scattered by different points on the surface couldn't interfere.
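
This is easy to reproduce numerically. A rough sketch (Python/NumPy, arbitrary grid and aperture sizes; here the loss of spatial coherence is mimicked crudely by averaging many statistically independent realisations):

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 256
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2 <= 60**2).astype(float)      # illuminated patch of the rough surface

# Fully spatially coherent case: one fixed random phase screen (the rough surface)
screen = np.exp(1j * 2 * np.pi * rng.random((N, N)))
speckle = np.abs(np.fft.fft2(aperture * screen))**2  # far field: fully developed speckle

# Loss of coherence, mimicked by averaging many independent realisations:
# the interference terms average away and the speckle contrast collapses
avg = np.zeros((N, N))
M = 100
for _ in range(M):
    screen = np.exp(1j * 2 * np.pi * rng.random((N, N)))
    avg += np.abs(np.fft.fft2(aperture * screen))**2
avg /= M

contrast = lambda I: I.std() / I.mean()
print(contrast(speckle), contrast(avg))   # ~1 for coherent speckle, ~1/sqrt(M) after averaging
[/code]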
 

Related to Understanding Fourier Optics: Object Distance & Image Formation Explained

1. What is Fourier optics?

Fourier optics is a branch of optics that deals with the mathematical analysis and manipulation of light waves. It is based on the principles of Fourier analysis, which is used to break down complex waveforms into simpler components. In Fourier optics, this concept is applied to the analysis and synthesis of light waves passing through optical systems.

2. How does object distance affect image formation?

In Fourier optics, object distance refers to the distance between an object and a lens or optical system. This distance plays a crucial role in determining the characteristics of the resulting image. The object distance affects image formation by determining the angle at which light rays from the object enter the lens, which in turn affects the magnification and orientation of the image.

3. What is the relationship between object distance and image distance?

The object distance and image distance are linked by the thin lens equation: 1/object distance + 1/image distance = 1/focal length, where the focal length is a fixed property of the lens or optical system. For an object beyond the focal length, moving the object farther away brings the image plane in toward the focal plane, and vice versa.
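
A quick numerical illustration of this (Python, with made-up focal length and object distances):

[code]
# Thin lens equation: 1/d_o + 1/d_i = 1/f   =>   d_i = 1 / (1/f - 1/d_o)
f = 0.10                                    # focal length in metres (made-up)
for d_o in (0.15, 0.5, 2.0, 100.0):         # object distances in metres (made-up)
    d_i = 1.0 / (1.0 / f - 1.0 / d_o)
    print(f"d_o = {d_o:6.2f} m  ->  d_i = {d_i:.4f} m")
# As d_o grows, d_i approaches f: a distant object images onto the rear focal plane.
[/code]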

4. How does Fourier optics explain image formation?

Fourier optics uses mathematical tools such as the Fourier transform to analyze and manipulate light waves. In the context of image formation, the Fourier transform breaks down the complex wavefronts of light coming from an object into simpler spatial-frequency components, which are then recombined to form the image. This process is based on the principle that an imaging system acts as a filter on these components: the spectrum of the object is multiplied by the system's transfer function, and the image is built from the components that pass through.

5. What are some real-world applications of Fourier optics?

Fourier optics has a wide range of applications in various fields, including microscopy, astronomy, and telecommunications. In microscopy, Fourier optics is used to improve the resolution and clarity of images obtained from microscopes. In astronomy, it is used to enhance the quality of images captured by telescopes. In telecommunications, Fourier optics is used in the design of optical communication systems, such as fiber optic networks.
