
Huge File Size: Difficult to work on a heavy surface data file


ShuklaPK (Automotive)
Jul 4, 2016
Hi

I am using NX 10.0

I have to measure around 200 points on scanned surface data of a vehicle. The scan is available to me in *.stl format, and the file size is more than 2 GB.

Concerns:

1. The file takes more than 30 minutes to import into NX 10.0.
2. Once the file is open, it is too slow to work on. (I am on a 64-bit system with 64 GB RAM and a 12-core CPU.)

Also, I can't divide the scanned file into multiple files due to some measurement constraints.

Can anyone suggest a method by which I can reduce the size of the file or make it lightweight?

Thanks in Advance

Pratik
 

NX has Edit -> Facet Body -> Decimate, which can do the job, but my choice would be to use something with more control, for example GOM Inspect or Meshmixer, and reduce the mesh size there.
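
If you would rather script the reduction outside NX, here is a minimal sketch using the open-source Open3D library (just one option among several); the file names and the target triangle count are placeholders you would need to adjust for your scan:

import open3d as o3d

# Load the heavy scan (file name is a placeholder).
mesh = o3d.io.read_triangle_mesh("vehicle_scan.stl")
print("Original triangles:", len(mesh.triangles))

# Quadric decimation down to a target triangle budget; pick a target
# that still preserves the regions you need to measure.
decimated = mesh.simplify_quadric_decimation(target_number_of_triangles=500_000)
decimated.remove_degenerate_triangles()
decimated.compute_triangle_normals()  # STL export needs facet normals
print("Decimated triangles:", len(decimated.triangles))

o3d.io.write_triangle_mesh("vehicle_scan_decimated.stl", decimated)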
 
To me the numbers sound a bit strange.
Your scan data is 2 GB and you only need 200 points; there are probably more than 2 million points in that cloud (roughly 1 used per 10,000).
Are we doing the correct thing here?

Is this the complete vehicle, or is it a sub-region?
Is there a reason you cannot split the cloud in NX?
Is this NX 11, which has enhanced capabilities for handling very large clouds?
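
For reference, a binary STL stores an 80-byte header, a 4-byte facet count, and 50 bytes per facet, so a few lines of Python can report how many triangles are actually in the file. This is only an illustrative check, it assumes the scan is a binary (not ASCII) STL, and the file name is a placeholder:

import os
import struct

path = "vehicle_scan.stl"  # placeholder file name
with open(path, "rb") as f:
    f.seek(80)                                   # skip the 80-byte header
    (n_facets,) = struct.unpack("<I", f.read(4))

size = os.path.getsize(path)
print("Facets (header):   ", n_facets)
print("Facets (file size):", (size - 84) // 50)  # should match for a binary STL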

regards,
Tomas
 
Follow-up question: if you are doing alignment and GD&T evaluation, I would strongly suggest that you use GOM Inspect rather than NX.
 
Thanks a lot, petulf. The 'Decimate Facet Body' command worked, and the file size is now low enough to easily work with.

Regards
Pratik Shukla
 