In reading your latest inquiry, I may be a little confused as to exactly where you are in the design, installation, or testing process, or what you are after. That said, I think it is normally best practice to spell out quite early (say, in the contract documents), in terms as unambiguous as possible and for the benefit of all parties including bidders, exactly how pipelines and appurtenances are to be designed, installed, and yes, tested. While AWWA C600 is a long-standing and well-vetted standard, field testing philosophies around the world vary all over the map, and the vast majority of these differing practices are probably successful with DIP.
In this regard, I believe various ISO and EN standards that allow much higher standard pressure ratings for DIP than AWWA design [e.g., working pressures up to 40 bar (~580 psi) or more] also publish a somewhat different default field test pressure philosophy. Instead of a direct multiplier like AWWA's, they say the test pressure should be the maximum allowable operating pressure of the system plus 5 bar (~73 psi). While this gives results not much different from AWWA for much common lower-pressure work, it is obvious that at very high working pressures a constant 5 bar default addition works out, in back-calculated effect, to a much smaller multiplier on the operating pressure than AWWA's. It could be argued this is reasonable if any greater transient pressures imposed on the system (e.g., due to surge) are a function of common flow velocity rather than pressure level, and/or if better surge prediction tools and dependable surge-prevention hardware are available for pipelines now than when the older standards and guidelines were established and older pipelines built.
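To see the "shrinking effective multiplier" point in numbers, here is a quick back-of-the-envelope comparison. This is purely illustrative, not the text of either standard; the function names and the sample pressures are my own, and I've simplified AWWA to a flat 1.5× rule for the comparison.

```python
# Illustrative only: an AWWA-style 1.5x multiplier vs. an EN-style
# "operating pressure + 5 bar" default test pressure rule.
BAR_TO_PSI = 14.5038  # approximate conversion

def awwa_style_test_pressure(working_psi):
    """Simplified AWWA-style rule: 1.5 times working pressure."""
    return 1.5 * working_psi

def en_style_test_pressure(maop_psi):
    """EN-style default: max allowable operating pressure + 5 bar (~72.5 psi)."""
    return maop_psi + 5 * BAR_TO_PSI

for p in (60, 100, 200, 580):
    en = en_style_test_pressure(p)
    print(f"{p:4d} psi working: AWWA-style {awwa_style_test_pressure(p):6.1f} psi, "
          f"EN-style {en:6.1f} psi (effective multiplier {en / p:.2f}x)")
```

At 100 psi working pressure the two come out within roughly 25 psi of each other, but at 580 psi the constant 5 bar addition amounts to an effective multiplier of only about 1.13× versus AWWA's 1.5×.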
I would suspect that the most common ductile iron pipelines (across the hundreds of thousands of miles of ductile iron pipe out there in distribution systems) operate in the normal (steady) working pressure range of 60-100 psi, and it may not be a coincidence that so many municipal projects (most perhaps also without huge elevation differences) have for many years specified that their pipes be field tested to, e.g., 150 psi (which happens to be 1.5 x 100 psi). I suspect at least the early drafters/versions of this standard may have kept this in mind. Over the years, and for whatever reasons (boosting of pressures to extend service areas, greater needed water delivery, more and longer transmission mains over rougher terrain to more remote sources, or realizations that greater pressures might occur?), I think I have seen more projects with higher specified test pressures.
Now, I must also note that the clause you quoted from AWWA C600-05 is immediately followed by another, rather common-sense (a chain is only as strong as its weakest link) clause:
“5.2.1.1.2 The test pressure shall not exceed the thrust restraint design
pressures or 1.5 times the pressure rating of the pipe or joint, whichever is less (as
specified by the manufacturer).”
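In other words, that clause is just a "take the lesser of two caps" check. A trivial sketch, with made-up example values for the thrust restraint design pressure and the manufacturer's rating:

```python
# Sketch of the "whichever is less" cap in Sec. 5.2.1.1.2.
# The numeric values below are hypothetical, for illustration only.
def max_allowed_test_pressure(thrust_restraint_design_psi, rated_psi):
    """Test pressure may not exceed the lesser of the thrust restraint
    design pressure or 1.5x the pipe/joint pressure rating."""
    return min(thrust_restraint_design_psi, 1.5 * rated_psi)

# Example: 300 psi thrust restraint design, 250 psi rated pipe.
# 1.5 x 250 = 375 psi, so the 300 psi thrust restraint limit governs.
print(max_allowed_test_pressure(300, 250))
```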
I believe these are good reasons (among others) to provide as much information as possible, in consistent terms, in the contract documents, so that such determinations can be made and all parties are on the same page. A transmission pipeline with widely varying elevations (which of course results in a non-level hydraulic grade line) obviously has an "operating pressure" that varies along its length and is not constant, whether the contents are in motion or the line is bullheaded off (e.g., behind a closed valve, or a test bulkhead at the bottom for a static/hydrostatic test).
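For anyone wanting to see how much the pressure varies along such a line during a static test, a fresh-water column runs about 0.433 psi per foot of elevation. The profile and target pressure below are made up for illustration; the point is just that gauging 150 psi at a low-point bulkhead leaves much less pressure on the pipe at the high points.

```python
# Hypothetical example: static (no-flow) pressure along a pipeline profile
# during a hydrostatic test pressurized at the low point.
PSI_PER_FT_WATER = 0.433  # approximate, fresh water

def static_test_pressure(gauge_psi_at_low_point, low_point_elev_ft, elev_ft):
    """Pressure at a given elevation, referenced to the low-point gauge."""
    return gauge_psi_at_low_point - PSI_PER_FT_WATER * (elev_ft - low_point_elev_ft)

# Made-up profile (station name -> elevation in feet):
profile = {"low-point bulkhead": 100, "mid-slope": 220, "high point": 350}
for name, z in profile.items():
    p = static_test_pressure(150, 100, z)
    print(f"{name:18s} ({z:3d} ft): {p:6.1f} psi")
```

With a 250 ft rise, the high point sees only about 42 psi while the bulkhead gauge reads 150 psi, which is exactly why the specified test pressure, and where it is measured, need to be nailed down in the contract documents.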
All that being said (and while DIP may have some more security with regard to over-pressurization than other pipes), I believe a well-run field test of any pipeline, to at least the pressure levels not likely to be exceeded in service, probably benefits, and perhaps provides some level of protection to, all parties, including the pipe supplier. Have a good weekend!