
Wanted: a definition for near-field and far-field photometry

Status
Not open for further replies.

RadLight

Electrical
Jan 20, 2003
17
Could someone please clarify what near-field and far-field photometry are and when each is appropriate? I am measuring luminaires and would like to understand the differences so I do not make erroneous reports.
 
Near field pertains to the fundamental-order diffraction of a field. In the far field, this disappears and you are viewing the higher-order maxima of the field. The far field is the Fourier transform of the near field. This is why an image through binoculars darkens when an object in front is within the near field: at that point you are gathering fewer of the higher modes. It is also why telescopes resolve better the larger they are.
 
One classical definition of far field is the regime in which a source behaves as a point source, whereas in the near field the source behaves as an extended source.

TTFN
 
Thanks canyoncruz for your input; however, could you please give me a bit more clarity on these orders of diffraction and how they affect the measurement of light output from a luminaire? How could I determine that I was in the near field, and how can I calculate the real light output from the luminaire if I might only be seeing the fundamental order of diffraction in the near field? Are there any good references? Prior to your information, I was under the impression that near field meant closer than 5× the luminous dimension of the source, and that the inverse-square law for illumination applies only to distances outside this range, which are thus in the far field. Your input would be greatly appreciated.
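The 5× rule of thumb and the inverse-square law mentioned above can be sketched numerically. A minimal sketch in Python; the luminaire dimension and intensity below are assumed illustrative values, not measured data:

```python
# Rule of thumb: the far field begins at roughly five times the largest
# luminous dimension of the source. All values here are assumptions.
luminous_dimension_m = 0.6                      # largest luminous dimension
far_field_start_m = 5 * luminous_dimension_m    # rule-of-thumb boundary (3.0 m)

def illuminance(intensity_cd: float, distance_m: float) -> float:
    """Inverse-square law E = I / d^2, valid only in the far field."""
    return intensity_cd / distance_m ** 2

intensity_cd = 1000.0   # assumed luminous intensity toward the detector

for d in (1.0, far_field_start_m, 6.0):
    note = "far field" if d >= far_field_start_m else "near field (law unreliable)"
    print(f"d = {d:.1f} m: E = {illuminance(intensity_cd, d):.1f} lx  [{note}]")
```

Inside the boundary the computed illuminance should not be trusted; the extended-source behaviour discussed later in the thread takes over there.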
 

Photometric science treats light as if it were propagating in straight lines, like rays. Diffraction is not considered. Therefore, "near-field" and "far-field" mean something totally different in photometry science than in diffraction science. As such, IRstuff has given the proper definition, and your prior impression was correct.

The best way to measure the transition point between near field and far field is to measure the power as a function of distance from your luminaire. (Your detector must be much smaller than your luminaire; if it is not, you can put an aperture in front of it.) The boundary between near field and far field is the distance at which the power received by the detector begins to fall as 1/r^2.

CV
 
That's essentially the point at which the source looks like a point source to your detector, or equivalently, the point at which the field of view of the detector exceeds the physical dimensions of the source.

Prior to that, each element of the source still obeys 1/r^2, but as you move back the detector sees more of the extended source, which exactly compensates for the 1/r^2 drop. This is the "constant brightness" law for extended sources.
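The compensation described above can be illustrated with a toy model: a detector with a fixed field of view (FOV) looking at a uniform disk source. While the FOV footprint lands entirely on the source, the visible patch grows as r², exactly cancelling the 1/r² fall-off; once the FOV exceeds the source, the received power follows the inverse-square law. All numbers here are assumptions for illustration:

```python
import math

source_radius_m = 1.0                   # assumed uniform disk source
fov_half_angle = math.radians(5.0)      # assumed detector field of view

def received_power(r: float, luminance: float = 1.0) -> float:
    """Power collected at distance r: visible source area over r^2."""
    patch_radius = r * math.tan(fov_half_angle)            # FOV footprint radius
    visible_area = math.pi * min(patch_radius, source_radius_m) ** 2
    return luminance * visible_area / r ** 2

near = received_power(2.0)       # FOV inside the source: constant-brightness regime
also_near = received_power(4.0)  # same power despite doubling the distance
far = received_power(50.0)       # FOV larger than source: inverse-square regime
print(f"{near:.4f} {also_near:.4f} {far:.6f}")
```

In this model the transition falls where the source just fills the detector's field of view, which is the same criterion IRstuff gives above.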

TTFN
 
