Qwerty11111
Mechanical
- Jan 9, 2014
I have an odb file which is ~1.6 GB in size. It contains 1 step with 7 frames. I am writing a GUI plug-in which needs to loop through all the steps and all the frames. A cut-down version of the code I use to do this is as follows:
odb = openOdb(path=self.owner.odbNameKw.getValue())
nstep = len(odb.steps)
for i in range(nstep):
    stepKey = odb.steps.keys()[i]   # key of the i-th step
    nframes = len(odb.steps[stepKey].frames)
    for j in range(nframes):
        currentFrame = odb.steps[stepKey].frames[j]
odb.close()
At the end of the procedure, there is an additional ~1.6 GB held in memory. Calling odb.close() does not release it. I have tried various del statements and also garbage collection, none of which work. The memory increases each time the 'currentFrame = ' line is executed. I have also tried using 'del currentFrame', but this doesn't work either.
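For reference, the cleanup attempts looked roughly like this (it reuses the variables from the snippet above; the exact placement of the del and gc.collect() calls varied between attempts):

import gc

for j in range(nframes):
    currentFrame = odb.steps[stepKey].frames[j]
    # ... read whatever is needed from the frame ...
    del currentFrame      # explicitly drop the reference to the frame
gc.collect()              # force a full garbage-collection pass
odb.close()               # close the odb; the memory is still not released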
Any help is much appreciated, as I will need to open files up to 10 GB eventually.