bigmanrunning
Mechanical
- Mar 22, 2011
- 10
I've found that gzip (DEFLATE) compression reduces most .odb file sizes by a factor of two or more, depending on model size. However, my CPFE simulations with 1-2 million DOF sometimes push 10k-20k increments in a step, so I end up with very large .odb files that can't be reduced further, given the memory usage during the simulation and the number of variables I need for post-processing. Individually these files aren't terrible, but in studies with 100+ simulations, storing all that data becomes a pain.
I'm sure I'm not the only one with this issue. Does anyone have a Python script, or know of a method, for retroactively rewriting .odb files with some stride on the frames, and perhaps dropping select field variables?
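For what it's worth, here's a minimal sketch of the frame-selection logic such a thinning script would need. The helper below is hypothetical (plain Python, no Abaqus required), and just decides which increments survive a stride-based rewrite; the actual reading and rewriting of the .odb would have to run under Abaqus's bundled interpreter (`abaqus python`) using the `odbAccess` module (`openOdb` etc.), looping over steps and frames and copying only the kept frames and wanted field outputs into a fresh database.

```python
def frames_to_keep(n_frames, stride, keep_first=True, keep_last=True):
    """Return sorted frame indices to retain when thinning a step.

    Hypothetical helper: given n_frames increments in a step, keep
    every `stride`-th frame, optionally forcing the first and last
    increments to survive so the step endpoints are preserved.
    """
    keep = set(range(0, n_frames, stride))
    if keep_first and n_frames > 0:
        keep.add(0)
    if keep_last and n_frames > 0:
        keep.add(n_frames - 1)
    return sorted(keep)


# e.g. a 20k-increment step thinned by 10x keeps ~2000 frames
kept = frames_to_keep(20000, 10)
```

Dropping select variables would then just be a whitelist check on each frame's field output names before copying.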