
FEA software for very large structural analysis with defects


Alonso84 (Industrial)
Nov 5, 2015
Hi all,

I need some help (hope this is the right place to post).
I need to decide on the right FEA software for an upcoming project.
Project: structural analysis of a very large asset. The model would have between 100 and 500 million degrees of freedom, with simulations of at least 50 load cases. Hence I need a fast solver, the fastest. Simulia Abaqus, Ansys, MSC Apex, MSC Nastran, Femap, other?
By the way, beam software would not work, because I need to model defects and damage on the structure: corrosion, cracks, etc.
As you can see, the main issue is computational speed.
(University support, license cost and computer core power are not an issue.)
Thanks in advance!!
Alonso
 

A lot depends on the actual properties of the model. What are the components ? Beams, plates, shells ? Is there an elastic foundation of some sort ? What sort of analysis is required: linear static, natural frequencies, dynamic response, buckling, etc ? Depending on your requirements, you may well need a special, customized product. A couple of images of the structure would be helpful to evaluate the complexity of the problem.
 
Hi

I would look at how different software packages scale when they run over several cores. I think that Ansys, Abaqus and the different versions of Nastran all do it. I work with Autodesk Nastran and it uses more than one core, but it also depends on the solution type. Linear statics usually doesn't scale well. Based on what you mention it may be nonlinear statics, and that often scales better.
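One cheap way to get a feel for that scaling before committing to a package is to time a factorization at different thread counts. A minimal sketch, not tied to any of the solvers mentioned: it uses numpy's LAPACK-backed Cholesky plus the threadpoolctl package, and a random SPD matrix as a stand-in for a stiffness matrix.

    # Rough scaling check: time a dense Cholesky factorization while
    # capping the number of BLAS threads. A real stiffness matrix is
    # sparse, but the diminishing-returns-per-core trend is the point.
    import time
    import numpy as np
    from threadpoolctl import threadpool_limits

    n = 4000
    rng = np.random.default_rng(0)
    a = rng.standard_normal((n, n))
    k = a @ a.T + n * np.eye(n)      # random symmetric positive definite matrix

    for threads in (1, 2, 4, 8):
        with threadpool_limits(limits=threads):
            t0 = time.perf_counter()
            np.linalg.cholesky(k)
            dt = time.perf_counter() - t0
        print(f"{threads} thread(s): {dt:.2f} s")

Commercial sparse direct solvers behave differently in detail, but the flattening speed-up curve per added core is the thing to look for.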

Forget Femap since it is not a solver.

If cost really is no issue, why not try a supercomputer cluster? I would also talk to some of the vendors if cost is no issue. And if it is very nonlinear, why not try the explicit road?

I would also look at the modelling approach. I very often model entire buildings or bridges and I have never used several hundred million DOF's. If you want to be very detailed, does the entire structure have to be detailed?

Good Luck

Thomas
 
Thank you ThomasH and ShellsPlatesMeshes for your kind answers,

ShellsPlatesMeshes:
- What are the components ? Beams, plates, shells ?
* I need 3D basically. The idea is to model a virtual replica of the asset, a Digital Twin.

- Is there an elastic foundation of some sort ?
* No

- What sort of analysis is required: linear static, natural frequencies, dynamic response, buckling, etc ?
* I require: steady-state and dynamic linear elasticity, vibrations and modal analysis (a small sketch of that kind of eigensolve follows after this list), and contact analysis. No buckling or nonlinear analysis for the entire structure. In a second stage of the project I would do buckling analysis in specific regions (sub-models).

- A couple of images of the structure would be helpful to evaluate the complexity of the problem.
* The structure will be an entire oil rig. 80k tons.
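
For the modal part of that requirement list, here is a minimal sketch of the kind of generalized eigensolve involved, using scipy on a toy tridiagonal stiffness matrix and a unit lumped mass matrix. Both matrices are stand-ins; a real model would use the K and M assembled by the FE package.

    # Toy generalized eigensolve of the modal-analysis kind: K v = w^2 M v.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigsh

    n = 2000
    kmat = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsc()
    mmat = sp.identity(n, format="csc")   # stand-in lumped mass matrix

    # shift-invert about 0 pulls out the lowest modes first
    w2, modes = eigsh(kmat, k=6, M=mmat, sigma=0.0)
    print(np.sqrt(w2) / (2.0 * np.pi))    # natural frequencies (units depend on K, M)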

ThomasH:
- I work with Autodesk Nastran and it uses more than one core but it also depends on the solution type. If you use linear statics they usually don't scale well.
* I do need steady-state and dynamic linear elasticity analysis. As many cores as possible. Ideas?

- If cost really is no issue, why not try a supercomputer cluster? I would also talk to some of the vendors if cost is no issue.
* Yes, that is a good idea. I've never talked to vendors. What can they offer that the standard software providers cannot? Other more powerful software packages, or just the same ones with more cores, or a supercomputer in the cloud, or something like that?

- And if it is very nonlinear, why not try the explicit road?
* I don't understand this suggestion.

- I would also look at the modelling approach. I very often model entire buildings or bridges and I have never used several hundred million DOF's. If you want to be very detailed, does the entire structure have to be detailed?
* Yes, it has to be highly detailed. For an oil rig with waves, wind and all the other loads it receives, and considering the risk of failure (environmental disaster), I cannot risk not modelling the entire structure and solving it completely, and many times over: I'm thinking about 50 load cases at least, without buckling or nonlinear analysis to begin with (see the load-case sketch below).
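
One point worth noting on those 50 linear load cases: with a direct solver, the stiffness matrix is factorized once and each additional load case is only a back-substitution, so 50 cases cost far less than 50 independent solves. A minimal sketch with scipy's sparse LU, using a toy 1D matrix and made-up unit loads just to show the pattern:

    # Linear statics with many load cases: factorize K once, then each
    # load case is a cheap back-substitution.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import splu

    n = 100_000
    kmat = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsc()

    lu = splu(kmat)                  # expensive step, done once
    for case in range(50):           # 50 load cases, one triangular solve each
        f = np.zeros(n)
        f[case] = 1.0                # hypothetical unit load for this case
        u = lu.solve(f)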

Thank you very much!!
Alonso
 
Hi

It sounds like primarily linear analysis and dynamics. It should scale fairly well, depending on the software. Contact can be nonlinear, but hopefully not too bad.

What I meant when I mentioned vendors was software vendors. They can usually give information regarding their software, provided that you ask the correct questions. But beware of the sales guy [smile]. A model of the size you mention can be of interest as a demo for their software.

If it were very nonlinear, an explicit solver might be a good idea. They are also good for really large problems, but in your case the "traditional" implicit approach is probably better.
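
To put a number on why explicit is a poor fit here: the stable time step of an explicit solver is limited by the time a stress wave takes to cross the smallest element. A back-of-envelope sketch, assuming steel properties and a 10 mm smallest element (the element size is a made-up figure):

    # Why explicit is painful for quasi-static work on a big model: the
    # stable time step is roughly (smallest element size) / (wave speed).
    E = 210e9                        # steel Young's modulus, Pa
    rho = 7850.0                     # steel density, kg/m^3
    h = 0.010                        # assumed smallest element size, m

    c = (E / rho) ** 0.5             # dilatational wave speed, ~5200 m/s
    dt = h / c                       # stable time step, ~2e-6 s
    print(f"c = {c:.0f} m/s, dt = {dt:.1e} s, "
          f"{1.0 / dt:.1e} steps per simulated second")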

You say that it is an entire oil rig. Since there are several of those around I wonder: has this been done before? Like I mentioned, I often model entire buildings or bridges and I have never used hundreds of millions of DOF's. There are so many methods available to avoid something like that. The first obvious one is to use the appropriate elements: model beams as beams. Modelling the beams with volume elements across a full oil rig is probably not a sensible approach.

I was at a seminar a few years back where a couple of guys gave a presentation regarding a model of an oil rig. I don't remember the model being particularly large, but the number of load cases was huge, more than 50. The presentation was actually about how to handle a large number of load cases in FEA; they had special software for managing the load cases.

A model with 100+ million DOF's and 50 load cases: there is an obvious risk of a drowning accident, because you will drown in data.
One question (at the risk of being rude which is not my intention), have you worked with this type of modelling before?

Thomas
 
Hi again

Something that occurred to me a few days ago.

If you stick to this solution approach, don't forget the postprocessing. You have a huge model, and the solution time may be long depending on a lot of things. But when you have all the data you will need to post-process it, and that requires a postprocessor that can handle it. Don't forget that part; it may end up governing the model size.
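A rough estimate of the raw result volume makes the point. Assuming the upper end of the DOF range mentioned earlier, 50 load cases, and double precision, displacements alone come to:

    # Back-of-envelope result volume: displacements only, float64.
    ndof = 500e6                     # upper end of the range mentioned
    cases = 50
    bytes_per_value = 8              # double precision

    total = ndof * cases * bytes_per_value
    print(f"displacements only: {total / 1e9:.0f} GB")
    # ~200 GB before stresses and strains, which are stored per element
    # integration point and typically multiply this several times over.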

And Greg has a valid point.
If the aim is a digital twin you need a lot of data, at least if you want an identical twin [smile]. I did something like that recently when I modeled a column to determine the strength after an accident. It was a single column and we had a laser scan. But the analysis was not straightforward; it required some iterations to reproduce the deformed geometry. And I have probably not reproduced the exact accident, but hopefully something valid enough for a conclusion.

Good Luck

Thomas
 
Hi all,

Thanks for the conversation. I've been testing Ansys, but it hits a wall when the model gets too big.
I'm modeling an FPSO, by the way: the whole vessel in its real condition (with cracks, corrosion, etc.). So I have a coarse mesh in the global model (t10, per DNV-RP-C206, for those familiar with DNV standards) and several sub-models with fine mesh (t3 to t1 refinement).

But I really need to go to the next step: simulate the entire model in full detail (condition-based modeling). I have looked at a solution provider on the web, but haven't dared to test it yet. You can check out their website and tell me if you know about it:
And, any other options to build a 1000 MDOF (million-DOF) condition-based model?

Thanks!
[Attached image: DNV_R_coarse_model_fine_mesh_t3kcht.png]
 
That company appears to just be automating what good FEA users already do manually. They are passing it off as improving the solution of huge models, but their algorithms simply reduce the size of the model, without telling you, by converting things to superelements and submodels automatically.

You could do this in nearly any of the FEA packages that you mentioned, and you would also have the benefit of using rigorously tested codes, rather than a black box where anything goes.
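
For reference, the superelement trick itself is just static (Guyan) condensation, and for linear statics it is exact. A minimal dense numpy sketch, using an arbitrary toy matrix and an arbitrary master/slave split:

    # Static condensation, the core of a superelement: eliminate the
    # interior (slave) DOFs s, keep the boundary (master) DOFs m.
    import numpy as np

    rng = np.random.default_rng(1)
    a = rng.standard_normal((8, 8))
    k = a @ a.T + 8 * np.eye(8)      # toy SPD "stiffness" matrix
    m = [0, 1, 2]                    # retained (master) DOFs
    s = [3, 4, 5, 6, 7]              # condensed-out (slave) DOFs

    kmm, kms = k[np.ix_(m, m)], k[np.ix_(m, s)]
    ksm, kss = k[np.ix_(s, m)], k[np.ix_(s, s)]
    k_red = kmm - kms @ np.linalg.solve(kss, ksm)   # reduced stiffness

    # check: with loads only on the masters, the reduced model matches
    f = np.zeros(8)
    f[m] = 1.0
    u_full = np.linalg.solve(k, f)
    u_red = np.linalg.solve(k_red, f[m])
    assert np.allclose(u_full[m], u_red)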
 
Have a look at p-element solvers. Instead of increasing mesh density, they increase the order of the element, like QUAD4s to QUAD8s.
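
The intuition, sketched outside of FEA: for a smooth solution field, raising the polynomial order converges much faster than shrinking linear segments. A toy comparison using plain interpolation of a smooth function (an analogy only, not an actual p-element solve):

    # h- vs p-refinement in miniature.
    import numpy as np

    def f(x):
        return np.sin(2.0 * np.pi * x)

    x = np.linspace(0.0, 1.0, 2001)

    # h-refinement: more piecewise-linear segments
    for n in (5, 9, 17):
        xs = np.linspace(0.0, 1.0, n)
        err = np.max(np.abs(np.interp(x, xs, f(xs)) - f(x)))
        print(f"linear, {n:2d} nodes: max error {err:.1e}")

    # p-refinement: one span, increasing polynomial degree
    for p in (3, 5, 7):
        c = np.polynomial.chebyshev.chebfit(x, f(x), p)
        err = np.max(np.abs(np.polynomial.chebyshev.chebval(x, c) - f(x)))
        print(f"degree {p} fit  : max error {err:.1e}")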

another day in paradise, or is paradise one day closer ?
 
While I do not have any specific suggestions in terms of which code to choose, I want to make a comment:

As mentioned above, management of the data (transfer, visualization, storage, etc.) can easily get out of hand within a very short period. I would want to have these conversations with vendors (LSTC/Dassault Systemes/ANSYS/Siemens/Altair/MSC/..) before doing any testing. Testing of the code can come later, once they have given you a quick demo or two and have answered your questions. I hope your management has blessed you with their support (in terms of buy-in, appropriate resources, checkpoints, etc.).
