You could use the sig gen and the spectrum analyzer: make two identical probes, place them 1.5 inches apart, and trim the probes to maximize the gain. Keep the probes sitting on foam, away from other objects. You'll get about -27 dB coupling between them at 1.5 inch spacing if the probes are 0 dBi gain. Here's a link to a space-loss calculator; input 14 GHz and 1.5 inches and it gives the loss (most of these calculators use miles for distance since they're meant for radar, but converting 1.5 inches to miles works fine).
Knowing the "space loss", if you measure exactly that value, your probes are 0 dBi. If you measure 2 dB less loss than that value, your probes are +1 dBi each, and so on (the extra 2 dB of signal splits evenly between two identical probes).
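Here's a minimal sketch of that arithmetic in Python, assuming the standard Friis free-space path loss formula; the -25 dB coupling number is a hypothetical example measurement, not from the text:

```python
import math

def fspl_db(freq_hz, dist_m):
    """Friis free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    wavelength = 3e8 / freq_hz
    return 20 * math.log10(4 * math.pi * dist_m / wavelength)

# 14 GHz at 1.5 inch spacing
loss = fspl_db(14e9, 1.5 * 0.0254)
print(f"space loss: {loss:.1f} dB")   # about 27 dB

# Hypothetical measured coupling (S21) between the two probes, in dB.
# The gains of the two probes sum to the measured coupling plus the loss.
measured_s21_db = -25.0
gain_per_probe = (measured_s21_db + loss) / 2   # identical probes split it evenly
print(f"gain per probe: {gain_per_probe:.1f} dBi")
```

With a -25 dB measurement against a 27 dB space loss, each probe comes out at about +1 dBi, matching the "2 dB less" example above.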
It's a legitimate way to determine antenna gain that's used industry-wide in the antenna world. It'll help convince people if you do the calibration, then repeat the measurement at 1.5", 2", 2.5", and 3", use the space-loss equation to work out the gain from each result, and see how close the numbers come to each other.
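The expected coupling for 0 dBi probes at each of those spacings can be tabulated with the same Friis formula (a quick sketch, same assumptions as before):

```python
import math

def fspl_db(freq_hz, dist_m):
    """Friis free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    wavelength = 3e8 / freq_hz
    return 20 * math.log10(4 * math.pi * dist_m / wavelength)

# Expected coupling (0 dBi probes) at each spacing in the consistency check.
# Note that doubling the distance adds about 6 dB of loss.
for inches in (1.5, 2.0, 2.5, 3.0):
    loss = fspl_db(14e9, inches * 0.0254)
    print(f'{inches}": expect about {-loss:.1f} dB coupling')
```

If the gains you back out at all four spacings agree, that's strong evidence the calibration is good.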
The orientation of the cables on the two probes can change results a bit (roughly ±1 to 2 dB). If you get the probes too close to each other, it'll hurt the accuracy; you may be able to go down to 3/4 inch apart and still have good results.
If you really need accuracy, make 3 probes. It's called the "three-antenna test method", and it lets you determine the exact gain of each probe. Three coupling measurements (1-2, 1-3, 2-3) and three unknowns (gain1, gain2, gain3) give you three equations in three unknowns, which is solvable for the individual gain of each probe. It'll look more impressive to management. Either way will work, though.
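The three-equations-three-unknowns solve is simple enough to do by hand, but here's a sketch; the three coupling values are hypothetical example measurements:

```python
import math

def fspl_db(freq_hz, dist_m):
    """Friis free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    wavelength = 3e8 / freq_hz
    return 20 * math.log10(4 * math.pi * dist_m / wavelength)

loss = fspl_db(14e9, 1.5 * 0.0254)   # same 1.5" spacing for all three pairs

# Hypothetical measured couplings (S21, dB) for the three probe pairs
c12, c13, c23 = -25.0, -26.0, -27.0

# Each measurement gives a gain sum: Gi + Gj = Cij + loss
s12, s13, s23 = c12 + loss, c13 + loss, c23 + loss

# Solve the three equations for the individual gains (dBi)
g1 = (s12 + s13 - s23) / 2
g2 = (s12 + s23 - s13) / 2
g3 = (s13 + s23 - s12) / 2
print(f"g1={g1:.1f} dBi, g2={g2:.1f} dBi, g3={g3:.1f} dBi")
```

Unlike the two-identical-probe method, this doesn't assume any two probes are the same, so small build-to-build differences fall out of the math.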
kch