Photometry treats light as if it propagated in straight lines, like rays; diffraction is not considered. Therefore, "near-field" and "far-field" mean something entirely different in photometry than they do in diffraction theory. As such, IRstuff has given the proper definition, and your prior impression was correct.
The best way to find the transition point between near and far field is to measure the power as a function of distance from your luminaire. (Your detector must be much smaller than your luminaire; if it isn't, you can put an aperture in front of it.) The boundary between near field and far field is the distance beyond which the power received by the detector falls off as 1/r^2.
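As a rough illustration of that procedure, here is a minimal Python sketch that estimates where the 1/r^2 roll-off begins by looking at the local slope of log(power) versus log(distance). The distance and power arrays are made-up placeholder numbers, not real measurements; substitute your own data.

```python
import numpy as np

# Hypothetical measurements: detector distance (m) and received power (arbitrary units).
# These numbers are illustrative only -- replace them with your own data.
r = np.array([0.2, 0.3, 0.5, 0.8, 1.2, 2.0, 3.0, 5.0, 8.0])
power = np.array([120.0, 95.0, 70.0, 42.0, 21.0, 7.9, 3.5, 1.26, 0.49])

# Local slope of log(power) vs log(r); pure inverse-square behaviour gives a slope of -2.
slope = np.gradient(np.log(power), np.log(r))

# Take the first distance where the local slope is within 0.1 of -2
# as a rough estimate of the near/far-field boundary.
tol = 0.1
candidates = r[np.abs(slope + 2.0) < tol]
if candidates.size:
    print(f"Approximate near/far-field boundary: {candidates[0]:.2f} m")
else:
    print("No clear 1/r^2 region found; extend the measurement range.")
```

In practice you would take many more points and confirm that the slope stays at -2 beyond the estimated boundary, rather than relying on a single threshold crossing.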
CV