
Limit State Design


Mccoy (Geotechnical)
Nov 9, 2000
Hello to all!
Some news from the geotech world in Europe.
LSD in geotechnical engineering, with its probabilistic outlook, is being enacted into national law in many European countries, after its principles were summarized in Eurocode 7.
In Italy the new regulations have been issued, with a transitional period to comply. It's a bit of a revolution compared with the concept of allowable stress design. Geotechnical design will have to use either fully probabilistic methods, with distribution functions instead of deterministic values, or semiprobabilistic methods adopting parameter values, called "characteristic values", which represent "a cautious estimate of the mean value" of the soil property.
Characteristic values are also indicated as the 5th percentile of the parameter distribution. Reference is also made to Bayesian methods as a suggested way to find characteristic values.
A few question marks will inevitably arise:
How shall we define the characteristic value of a single soil sample sent to the lab and put into a triaxial cell? We have only single phi and c' (or Su) values. The same goes for a single SPT test in a homogeneous layer.
This approach has been heavily criticized, but it has now become part of the codes.
Any comments ?
 

Here is the relevant EC7 section:


(1) P Design values of ground properties, Xd, shall either be derived from characteristic values, Xk, using the equation:

Xd = Xk / γm    (2.1)

where γm is the safety factor for the ground property; or shall be assessed directly.

(2) P The selection of characteristic values for soil and rock properties shall be based on the results of laboratory and field tests. Account shall be taken of the possible differences between the properties measured in the tests and the soil and rock properties governing the behaviour of the geotechnical structure due to factors such as:
- presence of fissures, which may play a different role in the test and in the geotechnical structure;
- time effects;
- the brittleness or ductility of the soil and rock tested.

(3) A conversion factor shall be applied where necessary to convert the
laboratory and field test results into values which can be assumed to
represent the behaviour of the soil and rock in the ground.

(4) P Selection of characteristic values of soil and rock properties
shall take account of the following:
- geological and other background information, such as data from previous projects;
- the variabilities of the property values;
- the extent of the zone of ground governing the behaviour of the geotechnical structure at the limit state being considered;
- the influence of workmanship on artificially placed or improved soils;
- the effect of construction activities on the properties of in-situ ground.

(5) P The characteristic value of a soil or rock parameter shall be selected as a cautious estimate of the value affecting the occurrence of the limit state.

(6) The extent of the zone of ground governing the behaviour of a geotechnical structure at a limit state is usually much larger than the extent of the zone in a soil or rock test and consequently the governing parameter is often a mean value over a certain surface or volume of the ground. The characteristic value is a cautious estimate of this mean value.
The governing zone of ground may also depend on the behaviour of the supported structure. For instance, when considering a bearing resistance ultimate limit state for a building resting on several footings, the governing parameter is the mean strength over each individual zone of ground under a footing, if the building is unable to resist a local failure. If instead the building is stiff and strong enough, the governing parameter may be the mean of these mean values over the entire zone or part of the zone of ground under the building.
Statistical methods may be employed in the selection of characteristic values for ground properties. Such methods should allow a priori knowledge of comparable experience with ground properties to be taken into account, for example by means of Bayesian statistical methods.
If statistical methods are used, the characteristic value should be derived such that the calculated probability of a worse value governing the occurrence of a limit state is not greater than 5 %.

(7) P Characteristic values may be lower values, which are less than the most probable values, or upper values, which are greater. For each calculation, the most unfavourable combination of lower and upper values for independent parameters shall be used.

(8) P The selection of characteristic values shall take account of the uncertainties in geometrical data and in the calculation model unless they are allowed for directly or in the calculation model.

(9) P For verification in persistent and transient situations of ultimate limit states, the numerical values of partial factors for ground properties given in Table 2.1 for the cases A, B and C are generally appropriate to be used with the partial factors for actions for the same cases for conventional design situations. For accidental situations all numerical values of partial factors shall be taken equal to [1.0].

(10) P For ultimate limit states in which soil strength acts in an unfavourable manner, the value of γm adopted shall be less than [1.0].

(11) The degree to which soil strength will be mobilised at the limit state may be taken into account by adopting design values which are less than the upper characteristic values divided by factors γm less than [1.0].

(12) P The partial factors for the resistance of a pile or an anchorage, determined on the basis of soil strength parameters, pile driving formulae or load tests, or anchorage tests, are given in sections 7 and 8.

(13) P For serviceability limit states all values of γm are equal to [1.0].

(14) P Design values of ground properties may also be derived by methods other than the use of partial factors. The partial factors set out in Table 2.1 indicate the level of safety considered appropriate for conventional designs. These shall be used as guidance to the required level of safety when the method of partial factors is not used.

(15) Where design values for ultimate limit state calculations are assessed directly, they should be selected such that a more adverse value is extremely unlikely to affect the occurrence of the limit state.
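Numerically, equation (2.1) is trivial; the subtlety is what it does to derived quantities. A minimal sketch, with an assumed characteristic friction angle of 30 degrees and an assumed partial factor of 1.25 applied to tan(phi), as is common practice for friction angles:

```python
import math

# EC7 eq. (2.1): design value Xd derived from characteristic value Xk.
# phi_k = 30 deg and gamma_m = 1.25 are illustrative numbers only; the factor
# is applied to tan(phi), the usual convention for friction angles.
def design_value(x_k, gamma_m):
    return x_k / gamma_m

phi_k = 30.0
tan_phi_d = design_value(math.tan(math.radians(phi_k)), 1.25)
phi_d = math.degrees(math.atan(tan_phi_d))
print(round(phi_d, 1))   # design friction angle, degrees
```

Note that a 1.25 factor on tan(phi) reduces the angle itself by only about 5 degrees here, yet bearing capacity factors computed from it drop much more sharply.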
 
I have seen your other thread and thought it was better to answer here.
EC7 generated hopes that geotechnical design would be uniform throughout Europe, but in fact it is an empty shell: it gives only general guidance on the method, while "national" coefficients will be determined by each member nation (for example, a driven pile will not be designed the same way in the UK and in France), which means that the final results will not change a lot.
I will give a simple example: driven concrete piles are very popular in the UK, in Spain and in Belgium, but not in France, because the design parameters are so conservative that they prevent them from being a competitive product.
Since France will be entitled to choose its own national parameters for each type of pile, this type of pile will stay where it is.
 
BigHarvey,
I was aware that some freedom had been left in the coefficients; your example is interesting. I gather the reason for that is to let member states adjust the code to specific national situations and practices (at the same time, alas, disrupting uniformity of application).
Have you had any problems with the definition of characteristic values, and what is the general feeling about this aspect of EC7 in your community?
 
Mccoy - I'll have to print out and digest your point. This is a topic, I guess, of LRFD - something that Focht3 and VAD are passionate about. I submit, as I did in the open thread, a list of links. See Kulhawy and Phoon on reliability in geotechnical engineering, and the Washington State GeoManual, Chapter 8, specifically.
It seems to me that this is just a fancy way of sending geotechnical engineers to mimic the structural types, especially since all the 'factors' are calibrated. But, as we know, the 'old' correlations were from many sites, while the new calibrations are, at times, from a single site that may or may not be applicable to all sites.
I like the old way - I really don't see any positive reason to change, other than that the newer method is trying to make geotechnical engineering more of a science than an "art". But we are, in many instances, still providing degree-1 parameters to degree-3 equations, and we are seeing geotechnical engineers forgetting any semblance of 'reasonable' answers (see how many pile load test results are now reported to 2 decimal places!).
[cheers]


Gordon A. Fenton, D.V. Griffiths, and W. Cavers
Can. Geotech. J./Rev. can. geotech. 42(5): 1422-1436 (2005)

(Specifically)
Deep Foundations
168.166.124.22/RDT/reports/Ri03030/or06010.pdf
by Kulhawy and Phoon
Paper_PPT_WS02/HONJO_partial_factors-JCSS.pdf
by Phoon
 
Up to now we do not have any problems in France, since national coefficients for EC7 are not yet available (committees have just been formed). EC7 is not yet applied and we still use DTU 13-2 for private works and government "fascicules" for public works. EC7 will only apply to private works. Government bodies will adapt the "fascicules" to make them close to EC7, but it won't be EC7 anyway!
 
In Italy national legislation gave 18 months to learn and digest; after that, compliance is mandatory!

One of the reasons I'm digging into the topic is because I'm organizing courses to teach the new methods to geologists in Italy (there are no specialized foundation engineering schools here and most engineers will let geologists do the geotech report).

As far as I can see, there are strong reactions against the lower-bound concept (characteristic values) from both the academic and the practitioners' worlds.

Main flaws outlined: it disrupts technical judgement and experience-acquired wisdom, requires too many samples, and disrupts uniformity.

The main drawback to me is the latter: as long as there is no accurate definition of the lower bound to be adopted, uniformity cannot be reached.
At the same time, the lack of well-defined rules gives the engineer some freedom of choice (I might choose a 25th-percentile lower bound, or a 5th percentile, or minus one sigma, according to my judgment about the field situation).
Also, EC7 suggests the use of Bayesian methods, which give importance to prior knowledge and experience (the posterior distribution is calculated from the field data AND the known data for the material).
Methods to figure out a lower bound for a single test abound: from six-sigma to literature tables (great ones in the Kulhawy articles BigH pointed out) to Bayesian analysis.
And static penetrometers can yield as many data as we wish.
Uncertainty about lower-bound values applies exclusively to soil parameters; structural materials have a well-defined 5th-percentile lower bound (resistances) and 95th-percentile upper bound (loads and actions).
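For the single-sample question raised at the start of the thread, the Bayesian route is the natural one: combine prior knowledge (comparable experience, literature values) with the lone test result. A minimal normal-normal conjugate sketch; every number below is an illustrative assumption, not a recommendation:

```python
# Normal-normal Bayesian update: prior knowledge of a soil parameter combined
# with a single test result. All numbers below are illustrative assumptions.
def bayes_update(prior_mean, prior_sd, obs, obs_sd):
    w = prior_sd ** 2 / (prior_sd ** 2 + obs_sd ** 2)   # weight of the observation
    post_mean = prior_mean + w * (obs - prior_mean)
    post_sd = (prior_sd ** 2 * obs_sd ** 2 / (prior_sd ** 2 + obs_sd ** 2)) ** 0.5
    return post_mean, post_sd

# Assumed prior for phi' from comparable experience: 32 deg +/- 3 deg.
# A single triaxial test gives 28 deg; measurement sd taken as 2 deg (assumed).
m, s = bayes_update(32.0, 3.0, 28.0, 2.0)
x_k = m - 1.645 * s   # a cautious (approx. 5%) estimate from the posterior
print(round(m, 2), round(s, 2), round(x_k, 2))
```

The single test pulls the estimate down from the prior, but the prior keeps one bad (or good) sample from dominating, which is exactly the behaviour the code drafters seem to have had in mind.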

As far as I've seen, LRFD would appear a bit different from LSD: the former focuses on partial safety factors, the latter on semiprobabilistic methods (and consequently on upper- or lower-bound values).
 
I've just read the FHWA report, the second from the end of BigH's list.
It makes a detailed comparison of LRFD (or LSD; the differences are explained) in Canada, Denmark, Germany and France.
Also, characteristic values are usually defined as the 5th percentile of the distribution of the mean,
but the report is misleading in other parts, when it speaks of "a value of 5 percent below the mean".
In France characteristic values are THE 5th percentile, and this raises legitimate doubts about overly conservative results. The Germans use extensive databases.
The Italians have not been interviewed but, I assure you, don't have the faintest idea what is going on.
To me EC7 is quite clear: a "cautious estimate of the mean value" of soil parameters means that you go as low as the 5th percentile of the MEAN, not of the whole distribution, which, if the data are scattered, would make for an overkill situation.
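The distinction matters numerically. A quick sketch with made-up Su data, using the normal approximation (a t-factor would be more correct for six samples; 1.645 is used here only to contrast the two readings of "5th percentile"):

```python
import statistics as st

# Six made-up undrained strength values Su (kPa) from one homogeneous layer:
su = [42.0, 55.0, 48.0, 61.0, 45.0, 52.0]
n = len(su)
mean, sd = st.mean(su), st.stdev(su)

# "5th percentile of the whole distribution" (normal approximation):
low_dist = mean - 1.645 * sd
# "5th percentile of the MEAN": the standard error shrinks with sqrt(n):
low_mean = mean - 1.645 * sd / n ** 0.5

print(round(low_dist, 1), round(low_mean, 1))
```

With scattered data the whole-distribution reading lops off far more strength than the distribution-of-the-mean reading, which is the overkill situation described above.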
 
I don't know about a "full probabilistic" analysis in place of a deterministic analysis, but I will say I've been using parametric analyses in conjunction with conventional deterministic analyses for quite a while on critical elements of a project. The method I use is described by Duncan (Virginia Tech).

The pros? You get to use the deterministic methods everyone is already familiar with, while also considering the variability of the materials we're working with, i.e., soils. With this method you can actually quantify (a range, at least) the probability of failure, which provides a good comparison of the iterations you run. It supplements the deterministic methods and adds to your ability to make good decisions.

However, I agree with BigH's point in that we need to make sure these "codes" that are supposed to help protect the public do not impede or hamper our ability to use sound judgment at all times.

 
MRM,
Nice to know there is someone out there working with probabilistic methods. Your idea of joining deterministic and probabilistic approaches is also very sound.
I gather the method described by Duncan is the FOSM method, or Taylor series approximation, also proposed by USACE, by Wolff, Christian & Baecher, and others.
Do you use it for all kinds of calcs?

My doubt about the method is that, in the presence of nonlinear models, it may err substantially as far as the variance is concerned.

I haven't seen any comparison between FOSM and analytical, exact calculations, or quasi-exact results obtained by Monte Carlo simulations, in the geotech field. Maybe you know of some.

NASA made a study of a flow equation where the mean came out as a good approximation, but the standard deviation was a poor approximation, even with the second-order method (SOSM).

I made a study of some nonlinear equations used in industrial hygiene, and FOSM performed poorly; even SOSM performed poorly when the input data had a high coefficient of variation. That means you get a distorted picture of variability and consequently of reliability. But I have no data for common geotechnical analyses. My suspicions are strong, though, even if everyone seems to like FOSM (Kulhawy recommends it as well).
Until they show me results similar to exact or quasi-exact methods, I'll remain skeptical.
Full probabilistic methods are Monte Carlo and its variations. Or, if you are a mathematical genius, analytical methods on the distributions themselves.
Semiprobabilistic methods are the same as deterministic analysis, only the resistance parameters are reduced by a "cautious" amount (or increased, in the case of loads), and partial safety factors are applied.
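In code form, the semiprobabilistic recipe is literally the deterministic one with factored inputs. A sketch using Reissner's Nq bearing capacity factor; the 32-degree characteristic angle and 1.25 partial factor are assumed numbers:

```python
import math

# Semiprobabilistic sketch: the deterministic bearing-capacity factor is kept,
# but the strength parameter entering it is a factored characteristic value.
# phi_k = 32 deg and gamma_phi = 1.25 are illustrative assumptions.
def nq(phi_deg):
    """Reissner's bearing capacity factor Nq."""
    phi = math.radians(phi_deg)
    return math.exp(math.pi * math.tan(phi)) * math.tan(math.pi / 4 + phi / 2) ** 2

phi_k, gamma_phi = 32.0, 1.25
# The partial factor is applied to tan(phi), as is usual for friction angles:
phi_d = math.degrees(math.atan(math.tan(math.radians(phi_k)) / gamma_phi))

print(round(nq(phi_k), 1), round(nq(phi_d), 1))   # characteristic vs factored
```

Note how a modest factor on tan(phi) roughly halves Nq: the nonlinearity of the model is where the apparent over-conservatism mentioned below comes from.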
I've started to apply these methods, but you have to get used to them; at first they seem over-conservative. Besides, communication between structural and geotechnical engineers is usually poor, and structurals don't like to give you loads and moments before you give them a bearing capacity, which in turn would need the loads and moments to be calculated exactly.
A classic catch-22 situation!
 
"Reliability Based Design in Civil Engineering",
by Milton E. Harr, McGraw-Hill, 1987 has many practical examples in all areas of Civil Engineering including geotechnical.

McCoy, if you are using LRFD (or at least running the calculations to check against a more conventional analysis), how do you go about it? In practice, have you set up spreadsheets to analyze the data and plug in the various parameters?
 
jheidt,
the book you cited is the one that introduced me to the topic, and in my opinion it remains perhaps the best.

I tried all the procedures outlined by Harr:

PEM (Rosenblueth's point-estimate method) is not too convenient and is awkward with multiple variables; it has recently been improved by Christian (I haven't tried the improved version).

FOSM, or Taylor series approximation, is an approximation whose performance should be checked. I tried to apply it to bearing capacity; it becomes cumbersome if you take more than 2 variables, and covariance (between phi and c) is not easy to implement. Maybe it is best suited to simple settlement analyses, assigning uncertainty only to the elastic modulus. Simple spreadsheets are OK. There are now more efficient numerical differentiation algorithms than the ones given by Harr, USACE and Duncan. The benefit is moderate use of CPU memory, especially in FEM applications.
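For concreteness, a bare-bones FOSM routine with central-difference derivatives, applied to a toy settlement model s = qB/E. Inputs are treated as independent, and all the numbers are assumed for illustration:

```python
# Bare-bones FOSM: mean and standard deviation of f(x1..xn) propagated to
# first order, with derivatives taken by central differences.
# Inputs are assumed independent (no covariance terms).
def fosm(f, means, sds, h=1e-6):
    mu = f(*means)
    var = 0.0
    for i, s in enumerate(sds):
        up, dn = list(means), list(means)
        up[i] += h
        dn[i] -= h
        dfdx = (f(*up) - f(*dn)) / (2 * h)   # central-difference derivative
        var += (dfdx * s) ** 2
    return mu, var ** 0.5

# Toy settlement model s = q*B/E (pressure q, width B, modulus E; assumed values,
# with uncertainty assigned only to q and E):
settle = lambda q, B, E: q * B / E
mu, sd = fosm(settle, [100.0, 2.0, 10000.0], [15.0, 0.0, 2500.0])
print(round(mu, 4), round(sd, 4))   # first-order mean and sd of settlement
```

The same dozen lines replace the tabular Taylor-series bookkeeping in Harr's and Duncan's worked examples, which is what makes the spreadsheet route practical.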
The following links illustrate the method with some performance checks:

Monte Carlo is very efficient and you can apply it to all models, CPU power being the only limit.
The best way to go is to use one of the two commercial packages available, Crystal Ball or @RISK. They are Excel plug-ins, so you build your Excel spreadsheet and from there declare variables, assign distributions and correlations between variables, with almost no limit to complexity. These packages also employ the Latin hypercube technique, an improvement on Monte Carlo that increases its efficiency.
It works very well; the only difficulty I had in applying it was in slope stability, because of the cumbersome models, and there you have to keep it simple.
Suited to important projects, it makes up a full probabilistic analysis.
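For readers without the commercial plug-ins, the Latin hypercube idea can be sketched with the Python standard library: one draw per equal-probability stratum, then shuffle. The limit state and all the statistics below are invented for illustration:

```python
import random
from statistics import NormalDist

random.seed(1)

def lhs_normal(mu, sigma, n):
    """One Latin-hypercube sample of size n from N(mu, sigma): one draw per stratum."""
    u = [(i + random.random()) / n for i in range(n)]   # one point per stratum
    random.shuffle(u)                                   # break the ordering
    nd = NormalDist(mu, sigma)
    return [nd.inv_cdf(p) for p in u]

n = 20000
c = lhs_normal(20.0, 5.0, n)     # undrained strength Su, kPa (assumed stats)
q = lhs_normal(100.0, 10.0, n)   # applied bearing pressure, kPa (assumed stats)
# Toy limit state: a strip footing "fails" when resistance 5.14*Su < pressure q
pf = sum(5.14 * ci < qi for ci, qi in zip(c, q)) / n
print(round(pf, 3))   # estimated probability of failure
```

Stratifying the marginals this way covers the tails far more evenly than plain random sampling for the same sample size, which is exactly the efficiency gain the commercial packages advertise.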

Semiprobabilistic analyses used in LRFD or LSD take lower- or upper-bound values, such as the 5th or 95th percentile, as point estimates of the probability distributions. Actually, I'm not sure about LRFD; it's supposed to use only calibrated safety factors and expected values of the parameters. I might be wrong.
In geotech, after Eurocode 7, as I said above, there is much controversy about what the more appropriate conservative estimate of soil strength parameters would be.
 
McCoy:

Thanks for the references, although the first one comes up as a damaged file. Perhaps you could check the link one more time?

I too used Harr's book as my first introduction to the topic of RBD. I was looking at a way to apply it to my unit labor cost database for construction estimating. I still don't have a user-friendly product, but it does give good insight into unit costs vs. unit quantity.

Harr's book is out of print, but it might be found at some of the used book websites for those looking for a copy.
 
McCoy,
Yes, you're right. It's the FOSM method, which uses Taylor's series, that I was talking about. I agree with you that it's not exact and that it's not perfect. There is definitely merit in computing the probability of an event occurring using a method that actually looks at all the possible combinations in which failure could result, as the Monte Carlo method does.

I've had decent luck with it as an approximate method for my purposes, though. It's been a while, but I believe Duncan runs through a little Monte Carlo vs. FOSM example in one of his papers. For the examples he was working with, the FOSM method gave results comparable to those of the Monte Carlo method. And the extent to which it varied from the Monte Carlo method was, in my best recollection, well within the operator-input parameter errors that would be expected. The errors I refer to are of the "random" variety. It still holds true: garbage in, garbage out, in any method used.

Now, to be honest, I was not aware of a simple program that uses the Monte Carlo method or something similar until you brought it up. I may have to look into the Excel plug-ins you mentioned. It may be even simpler for all I know. Are they available for free download by any chance?

However, one reason to think a more robust method may not particularly matter for me is the way I use the FOSM method: I like to think of the results (a probability of an event occurring) as a relative value rather than an absolute value.

For example, if I calculate (using FOSM) that the probability of a foundation reaching or exceeding 2 inches of settlement with one site preparation method is 0.5%, while a second site preparation method has a 4% probability of meeting or exceeding 2 inches of settlement, I can look at the increased chance of unsatisfactory performance with the second method. I then have that tool available in deciding whether it's even worthwhile to mention such an apparently risky site-prep scheme to a client. I would hope that since one operator (me) performed both analyses, the bias or latent errors present would be approximately equivalent for both iterations.
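Under a normal assumption for the predicted settlement, that comparison is one line per option. The means and standard deviations below are invented simply to reproduce figures close to the 0.5% and 4% in the example:

```python
from statistics import NormalDist

# Exceedance probabilities for two site-prep options, assuming normally
# distributed predicted settlement. Means/SDs (inches) are invented numbers
# chosen to roughly reproduce the 0.5% / 4% example above.
def p_exceed(limit, mean, sd):
    return 1.0 - NormalDist(mean, sd).cdf(limit)

p1 = p_exceed(2.0, 1.0, 0.39)   # option 1: mean 1.0 in, sd 0.39 in
p2 = p_exceed(2.0, 1.3, 0.40)   # option 2: mean 1.3 in, sd 0.40 in
print(round(100 * p1, 2), round(100 * p2, 2))   # percentages
```

Used this way, the absolute values matter less than the ratio between the two options, consistent with the relative-value reading described above.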

 
jheidt,
if you do a Google search with the following keywords:
nasa fosm sosm
you should find that link first on the list.
Which labor cost equation do you use, and what are the input variables?

MRM,
I would be curious to see Duncan's verifications. If the equation is linear or quasi-linear, FOSM gives good results; after all, you are approximating a straight line with another line (around the mean value). But approximating an nth-order polynomial with a 1st-order one (a straight line) may give problems, especially as far as the second moment (variance) goes.
Christian and Baecher also gave an example of good performance in their book, but I saw no practical example with the usual formulas we use in geotechnics.
I'm really not against FOSM/SOSM; it's just that in the computer era you have much more powerful methods available. I think Crystal Ball and @RISK have by now a consolidated reliability as random number generators.
The Crystal Ball site gives you access to free trial downloads.
It sure is simpler, and you can go into very complex detail. For example, I tried a settlement equation with Schmertmann's multi-layer method, using as input variables all the individual Ei's plus q (the load), correlating each layer's E to the preceding layer's modulus, and also correlating each Ei to the load (you load more if you know the soil is good).
As for the site preparation comparison, use of the method for comparative purposes should be OK, but only if the variables' COVs are the same. When speaking of P(failure) you are speaking of the extreme percentiles, the distribution's tails: regions very sensitive to even small alterations in the standard deviation.
The NASA studies concluded that FOSM and SOSM are not very good for reliability purposes, because the tails of the output distribution are not reproduced accurately. Conversely, FOSM/SOSM is good for robust design, where the mean is the governing parameter.
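That tail sensitivity is easy to demonstrate. For a normal safety margin with mean 3 and standard deviation 1 (reliability index beta = 3), misjudging the standard deviation by just 10% more than doubles the computed failure probability (numbers illustrative):

```python
from statistics import NormalDist

# Failure probability of a normal safety margin M: P(M < 0).
# Base case has mean 3, sd 1 (beta = 3); the second case overestimates the
# sd by 10%, leaving the mean-centred percentiles almost unchanged.
def pf(mean, sd):
    return NormalDist(mean, sd).cdf(0.0)

base = pf(3.0, 1.0)
inflated = pf(3.0, 1.1)
print(round(inflated / base, 2))   # ratio of the two failure probabilities
```

This is why an approximate variance from FOSM/SOSM can be harmless for a mean-governed (robust) design yet badly distort a tail-governed reliability estimate.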
 