Tolerance Design for Lathe Spindle Accuracy as Affected by the Bearing System


jeremymc (Student), May 2, 2023
Hi, I was given a lathe design by my professor to analyze.

The lathe uses two tapered roller bearings of different sizes in a back-to-back arrangement, spaced 340 mm apart.
The spindle will then be fitted with a test bar and its runout calculated against the lathe bed.
The spindle is required to meet accuracy code G6 from ISO 1708:1989, "Acceptance conditions for general purpose parallel lathes — Testing of the accuracy".
The standard states that the runout shall not exceed 0.005 mm over a measuring length of 100 mm (from the spindle nose).

I'm allowed to assume a perfect bed and test bar, so that only the bearing seats (on the spindle), the housing bores (in the headstock), and the bearings themselves contribute to the spindle's accuracy.

I'm wondering whether there is any book/reference/paper on how to choose the correct bearing precision class to meet that standard, and how to design the tolerances for the bearing seats and housings.
I tried modelling the system as lines, but I'm not sure whether that is accurate enough (a rough sketch of what I mean is below).
We would also like to figure out, through that model, how different tolerance values affect the runout, and finally whether the preload affects or invalidates the whole calculation.
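
Here is a rough sketch of the line model I mean, in Python. The worst-case phasing of the two bearing errors, the 60 mm nose overhang, and the per-bearing error budgets are just placeholder assumptions on my part; it only shows how the two bearing errors and the span/overhang geometry combine at the measuring point.

```python
# Minimal sketch of the "spindle as a rigid line" model.
# Assumptions: worst-case phasing of the two bearing errors, a rigid
# spindle, and an assumed nose overhang (not a value from the design).

L_SPAN   = 340.0   # mm, bearing-to-bearing distance (from the design)
OVERHANG = 60.0    # mm, front bearing to spindle nose -- assumed value
TEST_LEN = 100.0   # mm, measuring length per ISO 1708 code G6
G6_LIMIT = 0.005   # mm, allowed runout over that length

def runout_at(z, e_front, e_rear):
    """Worst-case radial error a distance z beyond the front bearing.

    e_front / e_rear are the total radial error motions at each bearing
    (bearing runout plus seat and housing eccentricity), taken with
    opposite phase so that their effects add."""
    return e_front + (e_front + e_rear) * z / L_SPAN

# Hypothetical error budgets, mm -- replace with values from the bearing
# class tables and the chosen seat/housing tolerances.
e_front, e_rear = 0.002, 0.003

z = OVERHANG + TEST_LEN
r = runout_at(z, e_front, e_rear)
print(f"predicted runout at {z:.0f} mm past the front bearing: {r:.4f} mm "
      f"(G6 limit {G6_LIMIT} mm -> {'OK' if r <= G6_LIMIT else 'too large'})")
```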

Feel free to ask for more details of the bearing system. Any references on similar topics are appreciated, thank you.
 
 https://files.engineering.com/getfile.aspx?folder=668ff02b-b2f1-469d-80bc-12b20cdd8252&file=Screenshot_2023-05-02_151830.png

As BrianE22 stated above, the only approach I know of (perhaps others know of other resources) is to use the manufacturer's published design rules for tapered roller bearings.

Just to add some extra info: non-tapered bearings have a much more straightforward approach. "Rolling Bearing Analysis" by Harris is, as far as I know, the bible for such things, though it contains a lot of information. Ch. 3, "Interference Fitting and Clearance," is probably of most interest to you. Below is a table from that chapter (originally from ANSI/ABMA), which applies specifically to ABEC-1 bearings. Also below is the chapter's reference list, which cites the ANSI/ABMA standards for bearings; those may or may not cover higher-precision classes (I've never looked into it). I tend to use the ABEC-1 table shown without problems, but my applications run at nowhere near lathe spindle speeds.

[Attached image: Ch_3_Table_eri0ak.png (shaft/housing fit table for ABEC-1 bearings, Harris Ch. 3)]

[Attached image: Ch_3_References_l8juul.png (Ch. 3 reference list, ANSI/ABMA standards)]
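
If it helps, here is a rough sketch of the fit-stack check those tables feed into. The deviation values in it are placeholders, not numbers from the ANSI/ABMA table, so substitute the real ones for your bore size and precision class.

```python
# Rough fit-stack sketch: combine a shaft-seat tolerance band with the
# bearing bore tolerance band to get the resulting interference range.
# Deviations are (lower, upper) in mm from the nominal diameter; the
# numbers below are placeholders -- take real values from the ANSI/ABMA
# (or ISO 492 / manufacturer) tables for your size and precision class.

def fit_range(shaft_dev, bore_dev):
    """Positive result = interference, negative = clearance (mm)."""
    min_fit = shaft_dev[0] - bore_dev[1]   # smallest shaft against largest bore
    max_fit = shaft_dev[1] - bore_dev[0]   # largest shaft against smallest bore
    return min_fit, max_fit

shaft_seat   = (+0.002, +0.015)   # placeholder shaft-seat deviations, mm
bearing_bore = (-0.012,  0.000)   # placeholder bore deviations, mm

lo, hi = fit_range(shaft_seat, bearing_bore)
print(f"fit: {lo:+.3f} mm to {hi:+.3f} mm "
      f"({'always interference' if lo >= 0 else 'can go to clearance'})")
```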
 
And to just close the loop on your overall question:

Runout is not primarily determined by bearing fits (though improper fits will certainly cause problems). The type of chuck used (3-jaw vs. 4-jaw) and the features of the lathe that allow adjustment and calibration are what really reduce runout. The check is typically done with a dial indicator (which I assume you will use for your tests); the readings taken over the length of the test bar tell you what adjustments need to be made (this is where 4-jaw chucks really shine, link below). Such adjustment/calibration is done on all fabrication machinery; on a mill it is called "tramming", and it ensures the cut is square to the workpiece. A quick sketch of how to interpret those readings follows the link.

Link
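
The sketch below splits two indicator readings along the test bar into an offset component and a tilt component; the readings are made-up example numbers. Roughly speaking, a constant reading along the bar points at pure eccentricity, while a reading that grows with distance points at tilt, i.e. the bearing seats or housing bores not being coaxial.

```python
# Quick sketch: split two runout readings along the test bar into an
# offset (eccentricity) term and a tilt term. Example numbers only.

def split_offset_and_tilt(runout_near, runout_far, span_mm):
    """Return (offset_mm, tilt_mm_per_100mm) from a reading near the nose
    and another reading span_mm further along the test bar."""
    offset = runout_near
    tilt_per_100 = (runout_far - runout_near) / span_mm * 100.0
    return offset, tilt_per_100

offset, tilt = split_offset_and_tilt(0.003, 0.007, 100.0)  # mm, example readings
print(f"offset: {offset:.3f} mm, tilt: {tilt:.3f} mm per 100 mm")
```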
 