It might prove informative to test the connector at several currents, from (say) 10 A up to full rating. There will undoubtedly be some thermal effects that increase the resistance of the connector in service at rated current compared with a measurement taken at a low current.
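To get a rough feel for the size of that thermal effect, copper's resistance rises by about 0.393% per degree C, so a connector running a few tens of degrees above ambient will read several percent higher than a cold measurement. A minimal sketch (the temperature coefficient is the standard value for annealed copper; the resistance and temperatures are hypothetical numbers, not measurements):

```python
# Estimate resistance rise of a copper connector with temperature:
# R(T) = R0 * (1 + alpha * (T - T0)), alpha ~ 0.00393 per deg C for copper.
ALPHA_CU = 0.00393  # per deg C, standard value for annealed copper

def resistance_at_temp(r0_ohm, t0_c, t_c, alpha=ALPHA_CU):
    """Resistance at temperature t_c, given r0_ohm measured at t0_c."""
    return r0_ohm * (1 + alpha * (t_c - t0_c))

# Hypothetical example: 50 micro-ohm measured at 20 deg C,
# connector running at 50 deg C under load.
r_hot = resistance_at_temp(50e-6, 20.0, 50.0)
print(f"{r_hot * 1e6:.2f} micro-ohm")  # roughly 12% above the cold value
```

This is why a low-current bench measurement tends to understate the voltage drop you will see at full rated current.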
What access do you have to test equipment? Are you in the test lab of a major organisation, or playing with this on the dining room table? It helps to know roughly what level of resources you have access to.
A 200A DC source should be fairly easy to hire, or you could make a simple AC source yourself. Use a couple of turns of heavy cable passed through a toroidal transformer core and close the loop through the connector. Leave the original secondary open circuit. A variac would be a useful way to control the primary voltage, thus allowing you to regulate the current. There should be virtually no reactive component of the connector impedance so the measurement should be simple enough without worrying about complex impedances: R=V/I.
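With that setup the arithmetic is trivial, but it may still be worth writing down: take the RMS voltage across the connector and the RMS current through it, and divide. A minimal sketch with hypothetical meter readings (11 mV at 200 A):

```python
# Connector resistance from RMS readings taken with the
# toroid-plus-variac setup described above. Because the connector
# impedance is essentially resistive, R = V / I applies directly.

def connector_resistance(v_rms, i_rms):
    """Resistance in ohms from RMS volts across and RMS amps through the connector."""
    if i_rms <= 0:
        raise ValueError("current must be positive")
    return v_rms / i_rms

# Hypothetical readings: 11 mV across the connector at 200 A.
r = connector_resistance(11e-3, 200.0)
print(f"{r * 1e6:.1f} micro-ohm")
```

Repeating this at several current settings (via the variac) gives you the resistance-versus-current curve suggested in the first paragraph.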
----------------------------------
If we learn from our mistakes I'm getting a great education!
Since the voltage drop is the ohmic resistance times the current, you may choose to measure the resistance instead. For low-resistance measurement the best way is the 4-wire (Kelvin) method, or an ohmmeter that measures that way. This link may be helpful to you:
Nowadays there are portable, battery-powered ohmmeters; if you get a good Ducter you may not need to inject currents higher than 10 A.
So, once you have measured the resistance, to calculate the voltage drop you just multiply the current through the connector by the resistance. To a first approximation the resistance is independent of the value of the current.
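The multiplication above can be sketched as follows, along with the power dissipated in the connector, which is often the number you actually care about for heating. The resistance and rated current here are hypothetical placeholders for whatever your 4-wire measurement gives:

```python
# Voltage drop (and heat dissipation) across the connector at rated
# current, given a low-current 4-wire resistance measurement.

def voltage_drop(i_amp, r_ohm):
    """Ohm's law: volts dropped for i_amp through r_ohm."""
    return i_amp * r_ohm

r_measured = 55e-6  # hypothetical 4-wire / Ducter reading, ohms
i_rated = 200.0     # hypothetical rated current, A

v = voltage_drop(i_rated, r_measured)
p = v * i_rated     # watts dissipated in the connector at rated current
print(f"drop = {v * 1e3:.1f} mV, dissipation = {p:.2f} W")
```

Note the caveat from the first reply: at rated current the connector runs hot, so the real drop will be somewhat higher than this cold-measurement estimate.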