Eng-Tips is the largest engineering community on the Internet

For all the old geezers 5

Status
Not open for further replies.

Baldy217

Mechanical
Jun 7, 2007
41
Not too sure where to post this, so I'll do it here.

If you remember the days when calculators took a whole room, well this thread's for you.



So what was it like having to do all the calculations by hand? I certainly can't imagine my life without a calculator, probably like most engineers.

Nowadays, people are complaining that we take computers for granted. Some say that many new grads rely too much on them and don't really grasp the theory they are applying.


So my question is: did they have that same argument about calculators back then too?
 
The problem with any "tool" or instrument or computer is that it has to be welcomed by the operators, not just understood.

The best and most idiot-proof equipment in the world stands no chance of surviving if the operators think it will cost jobs or impact their perks, or even if they just don't like the bloke from the office (aka the "suit", i.e. anyone in cleaner overalls than theirs) trying to tell them how they can do a better job with his new-fangled kit.

It is surprising how many such innovations suddenly get flooded by fire hoses, scalded by bursting steam lines, hit by fork-lift trucks where no fork-lift should be, crushed by heavy weights falling on them, etc.

I went through an exercise at a lubes facility where the plant manager wanted to improve the de-asphalter control, but every time he set up the control parameters the operators would come in and set them back to where they were used to having them.



JMW
 
Sure, it was. Every generation has a discontinuity with the previous one(s). The REAL difference is that at this particular point in time, there are a multitude of contemporaneous generational gaps.

My grandfather was a dirt farmer who never even heard of computers.
My father learned to program computers using patch cords.
Somewhere in between were punch cards.
I learned simple programming tools like QBasic
My son is like, "What, you have to type in all those commands, where are the drag&drop icons?"

Not much different than when the Jacquard loom automatically cranked out different weaves and patterns, while the older generation then lamented the loss of artistry and the human touch. The time scale difference is that it took almost 100 years for the next set of major advances to occur, while we're getting major advances in less than 20 years.

Anyone remember the RL02 removable disks? I used them in 1982. They were gigantic, about 16-in diameter, and held a piddly 5 MB. You can't even find a flash drive that holds less than 128 MB in the store now. You can easily buy a 500 GB HD for less than the cost of the RL02 platters alone. Just this single change has completely revamped the notions of what to store and whether we care. We've got processors in coffee machines that are more powerful and have more storage than the navigation computer on Apollo 11. Contrast that with the time lapse between the Jacquard loom and the Hollerith punch card of nearly 100 yrs.

TTFN

FAQ731-376
 
Computer programs purchased from the work of computer nerds are not my idea of comprehensive structural analysis. Complacency can be bred into analysis by trusting that all the details have been addressed. Some modern-day disasters have been the result of so-called computer routines that overlooked some finer details, like connections, wind/snow/rain effects, thermal effects, fatigue, material defects, etc.

In aviation design we had pre-release design reviews staffed by a number of grey beards with a few cycles of experience under their belts. They posed pointed questions that got the design engineers thinking along avenues not considered before.

I remember when early composite fan blades were fabricated and bonded with questionable mid planes. The outer zones were face to face bonded, but the mid plane was the bonded juncture of edges. Typically, in bird strike research, failure was at the midplane, and some engineers even claimed that it was a good thing. I pointed out that all bonded zones should be face to face, not edge to edge. This is the kind of dialogue that comes out of thoughtful considerations of the details, and computer programs are liable to overlook such finer details.
 
Jistre
It's made of vinyl and is covered in a spiral groove, one on each side, made by recording sounds.
By spinning it at about 78rpm and resting a needle in the groove, the vibrations created in the needle are converted to sound in this big trumpet thing like a megaphone.
This is a great device, especially when it is the Ink Spots singing "Whispering Grass", one of my favourites.


JMW
 
Even the Jaz isn't THAT old, and that was 100 MB on a disk that was probably 25 times smaller and 100 times lighter.

As for software, I'm not convinced that argument is valid. Software is a tool. The fact that someone uses a staple gun as a hammer shouldn't be blamed on the staple gun. Things are left off simply because the projects are really too complex for humans to deal with. I work on a BIG PROGRAM, where there are guys whose sole job is "gap analysis," to determine which ORD requirements weren't flowed or correctly flowed downward, and there are TONS of them. And this is from an organization CERTIFIED to have the highest possible and best levels of systems engineering processes.

Software does what it's programmed to do, usually... But we don't ever seem to get the time to tally the checklist of lessons learned so that we can use it for the next project.

TTFN

FAQ731-376
 
In looking through some old technical books, one thing I have noticed is there was more of an attempt to get an analytical solution by whatever means possible. This might include drastic approximations in the problem set-up. One drawback of scholastic-type problems was considering a problem "solved" when no numbers had actually been generated. For example, you can "solve" a partial differential equation with the solution being a double series with terms of Bessel functions, etc. Actually turning your "solution" into a stress would then be about on par with running an FEA solution in the first place, which makes it hard to consider that the original problem was actually solved in a meaningful way.
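To make the point concrete: even a "closed-form" series answer still demands real numerical work before it yields a number. A minimal Python sketch (a hypothetical illustration, not from the thread) evaluating the Bessel function J0 from its power series, the kind of term such a series solution is built from:

```python
import math

def bessel_j0(x, terms=25):
    """Evaluate J0(x) from its power series:
    sum over m of (-1)^m * (x/2)^(2m) / (m!)^2."""
    total = 0.0
    for m in range(terms):
        total += (-1) ** m * (x / 2) ** (2 * m) / math.factorial(m) ** 2
    return total

# Even with the series "solved" on paper, getting an actual
# number out still takes computation:
print(bessel_j0(2.4048))  # near the first zero of J0, so ~0
```

The "analytical" solution has merely deferred the arithmetic, which was the original poster's point.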

On the significant-digit issue, I remember that being emphasized at various times in school, but those pushing the issue tend to be about as bad as those generating 8-digit decimals. The problems with that approach are multiple. The significant digits in your solution are not just dependent on the significant digits in your original numbers, but on the functional relationships and the accuracy of the functional relationships as well. You can in fact wind up with more significant digits (or fewer) than what you started with. I vaguely recall one problem from Thermodynamics class where we were using properties read off of a graph, and the uncertainty was amplified such that when we were all done, we couldn't even tell which way the fluid was flowing in the system.
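The thermodynamics anecdote can be sketched in a few lines. Assuming (hypothetically) two pressures read off a graph to about 1%, where the quantity of interest is their difference, a worst-case interval shows the sign being lost entirely:

```python
# Two graph-read pressures, each good to +/- 2 (about 1%):
p1, p2 = 201.0, 199.0
err = 2.0

# The quantity that decides flow direction is the difference:
dp = p1 - p2                      # nominal: +2.0
dp_lo = (p1 - err) - (p2 + err)   # worst case low: -2.0
dp_hi = (p1 + err) - (p2 - err)   # worst case high: +6.0

# 1% inputs have produced a result whose sign is unknown --
# exactly the "which way is the fluid flowing?" situation.
print(dp, dp_lo, dp_hi)
```

This is why counting significant digits in the inputs alone is not enough; the functional relationship (here, subtraction of nearly equal values) governs the uncertainty of the result.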
 
I don't see that as a problem generated by the people, per se. 200 years ago, all physics problems that could be solved at all could be solved analytically. The tallest buildings 200 yrs ago, aside from the pyramids, were not more than a few stories tall. One could contemplate such a building being analyzed with hand solutions.

When I was in college, MOS transistors could be well represented by a rather simple parabolic equation run on an HP25. 10 yrs later, that equation was no longer applicable, because of short and narrow channel effects. Body effect became a bigger contributor to the non-ideality of the transistor model. Likewise, the venerable Ebers-Moll model lasted until the Gummel-Poon model was required to properly model all the parasitic effects that then dominated the bipolar transistor model.

In electro-optics, we used a rather simplistic analytical model 20 yrs ago to determine resolution, detection, recognition, and identification ranges for sensors, all in closed-form algebraic approximations. The optics performance was treated as diffraction-limited and not a serious impact on performance. We modeled the diffraction blur as being well less than 1 pixel in size. Today, the diffraction blur is spread over 4 or 9 pixels. The optics can no longer be treated as ideal. Today, we have to use a numerical approximation model that has little analytical traceability to the "real world." That's also partly because we know more about how the eye-brain interface works, so the resolution model of the eye is substantially more complicated than it was 20 yrs ago.

I would contend, therefore, that the mere progress of the technology and materials that we use, and the change in scale of the physical designs, all collude to require us to use models that can't be solved on a calculator, or even a spreadsheet. That's the cost of doing business with the latest technology or sensors; we have to use complicated models to accurately reflect the actual performance of the systems we analyze.

TTFN

FAQ731-376
 
I don't think anyone has said that we should have stopped advancing with 1980's technology (I've never understood why the Amish are happy to use advanced technology from the 19th century but find 20th century stuff evil). I'm saying that whatever technology we use, we should understand the underlying assumptions and do our best to uncover any hidden limitations. 30 years ago it was so much work to get a believable answer to a real-world problem that people tended to make darn sure they started driving the nail with some sort of hammer instead of the side of a stapler. Today I see a lot less care in tool selection.

David
 
I used a slide rule in high school; calculators weren't allowed because not everyone could afford them. I bought a TI SR-10 in 1973 for $90.

I think young engineers today place too much emphasis on computers. They seem too eager to create a model for even the simplest things. There's nothing wrong with that, but the danger is that they're not thinking through the problem. A computer should be a tool, not a crutch. It shouldn't be a substitute for learning how to discern things.

When most things were done by hand, we spent more time thinking about the approach to the analysis or design.
 
Too tolerant of mistakes?
Bridgebuster's comments suggest that when you do it the hard way you have to be rigorous, because you can't afford to make mistakes. With computers it doesn't matter if you get to the end before you find the mistake, because it is so easy to correct it and start over. The problem is when this "tolerance" leads to mistakes not being discovered; perhaps we don't fear mistakes as much as we should.

Consider the spell checker. We type away and then correct the spellings the spell checker points out to us. We will miss the mistaken use of "their" for "there" because we no longer have the right discipline. Before spell checkers, when we wrote directly to the page, we had to get it right the first time, because too many mistakes meant starting over and even a few mistakes meant getting out the Tipp-Ex.
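The their/there blind spot follows directly from how a word-list spell checker works: it only asks whether each word exists, not whether it is the right word. A toy sketch (hypothetical word list, just to illustrate):

```python
# A toy word-list spell checker: flags only words absent
# from the dictionary, never wrong-but-valid words.
dictionary = {"the", "book", "is", "over", "their", "there"}

def misspelled(sentence):
    """Return the words not found in the word list."""
    return [w for w in sentence.lower().split() if w not in dictionary]

print(misspelled("the book is over there"))   # []
print(misspelled("the book is over their"))   # [] -- wrong word, correctly spelled
print(misspelled("the book is over theer"))   # ['theer']
```

Catching the middle case takes grammar-aware checking (or the old discipline of proofreading), which is exactly the point being made.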

So, is Word with spell checker better than the old ways? It is certainly more productive but have we learned to use it correctly or lazily?

JMW
 
With or without a spell checker, Word (or any other WP) is essential for me. I am not able to put down my thoughts coherently in a linear fashion (i.e., start at the beginning; finish at the end). I write chunks of prose (manuals, design docs, etc) in a random order and then stitch them together in a WP. Without a WP I would sit at the beginning with writer's block.

- Steve
 
I really think it boils down to ubiquity and the understanding of the calculation process - having a feel for where you should end up. The more esoteric the machine or program, the less likely anyone other than the designer will know its limitations. A standard calculator can be given to most folks with an understanding of math, and within a few seconds they can perform many types of calculations with great accuracy and precision. Hand that person an RPN calculator and they stall for a little while, but catch up fairly quickly.

With both of these tools, an understanding of the order of operations of math is required to do many multi-step calculations. With that understanding, they can use parentheses and the RPN stack to keep things straight.
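The RPN stack mentioned above is simple enough to sketch in a few lines of Python (a hypothetical toy evaluator, not any particular calculator's firmware): operands are pushed, and each operator pops two values and pushes the result, so no parentheses are ever needed.

```python
import operator

def rpn_eval(tokens):
    """Evaluate a postfix (RPN) expression given as a
    space-separated string, e.g. '3 4 + 2 *'."""
    ops = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    stack = []
    for tok in tokens.split():
        if tok in ops:
            b = stack.pop()          # second operand
            a = stack.pop()          # first operand
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()

# (3 + 4) * 2 needs no parentheses in postfix form:
print(rpn_eval("3 4 + 2 *"))  # 14.0
```

The stack makes the order of operations explicit: whatever is on top was computed most recently, which is why experienced RPN users rarely need to think about precedence rules at all.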

Now, put them on Excel to do some calcs. There is a quirk in Excel where the program handles one rule of the order of operations differently than what is generally accepted. Here's where things get interesting. Without knowing that quirk, and using the same rules they used with their calculators, they will get different answers with Excel than what their calculators gave them. If they don't check it with their calculators and they have no feel for what the answer should be, they will blissfully carry on believing that their answer is perfectly correct.
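The quirk most commonly cited in this context (the thread's links to the original discussion did not survive) is unary minus versus exponentiation: Excel gives the leading minus higher precedence, so `=-2^2` evaluates to 4, while the generally accepted convention (and most calculators and languages) reads it as the negation of 2 squared. Python follows the standard convention, which makes the comparison easy to show:

```python
# Standard convention (and Python): exponentiation binds
# tighter than unary minus, so this reads as -(2**2).
print(-2 ** 2)    # -4

# Excel's formula engine gives unary minus higher precedence,
# so =-2^2 evaluates as (-2)^2. Reproducing Excel's reading:
print((-2) ** 2)  # 4
```

Someone checking an Excel sheet against a calculator would hit exactly the silent discrepancy described above.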

The Garbage In/Garbage Out rule ALWAYS applies, but when the volume of input gets larger (sophisticated programs), or the rules change (unexpected or unknown program assumptions or quirks), it is much easier for garbage to get in. This is why a much greater understanding of what is going on is required when using more sophisticated tools.


(FYI: for a very lively discussion on the Excel quirk, see these threads:



If you "heard" it on the internet, it's guilty until proven innocent. - DCS
 
SomptingGuy,
25 years ago, you would have put your thoughts on a piece of paper, probably written an outline, then pieced them together for the secretary to type up (hopefully on a Wang). If your paper was not put together well enough you would have seen first-hand "garbage in, garbage out".
That would have forced a different technique of putting your paper together.
 
Well it was 22 years ago, but yes, she did have a Wang. There was so much wasted time trying to get bits rearranged when they didn't read quite right. Plus she wasn't keen on typing little bits here and there - it had to be considered finished.

- Steve
 
"Well it was 22 years ago, but yes, she did have a Wang"

LOL, sounds like a bad New Orleans story.
 
I think that you're all arguing different things here. Spell checking isn't quite on the same level. Everyone knows that you cannot get a 100% success rate with manual checking without infinite time. It's one thing to proof a 5-page memo manually; it's another to proof a 500-page document manually.

As for calculator usage, I think that you're failing to see that the models in use today can't be checked with a calculator, at least, not in any plausible period of time. For a simple beam, yes, you can. For a 100-story building, good luck with that.

For those in college, are you expecting them to spend 6 yrs to get a BS? Because that's what it would take for them to learn and understand every bit of any complex modelling approach. 50 yrs ago, you spent 4 years in college and got a BS, with no computers, no calculators, no finite element analysis, etc. Today, you spend 4 yrs in college and get a BS, but with computers, calculators, finite element analysis, intro to programming, spreadsheets, etc. Something has to give or people have to spend more time in school.

Much of what I've read here smacks of Luddism. I can imagine the guild-masters of the mid-19th century demanding that textile workers put in the required years of apprenticeship in the weavers guild so that they can understand the intricacies and complexities of weaving, to prepare them to work in textile factories.

While I understand the arguments and partially agree, the modern world of education and work is a different place. My son, a freshman in high school taking honors biology, is being taught stuff we didn't even learn in college. Likewise, he's already completed a class in Java. And he's expected to take at least 4 AP classes, while I graduated when 3 AP classes was already beyond the norm. In order to get all this stuff in, something gets left out. It used to be that Latin was considered a mandatory requirement for a high school grad; it got cut out because there were other things that had higher priority.

What it all boils down to, is that to progress, you stand on the shoulders of giants. To do that, you have to build on, and not redo, what was done before. No one expects you to re-derive the Laplace transform.
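For anyone who has forgotten what is being waved off here, the derivation in question is only a few lines of standard textbook material (shown as a refresher, not taken from the thread):

```latex
% Definition of the one-sided Laplace transform:
%   \mathcal{L}\{f(t)\}(s) = \int_0^\infty e^{-st} f(t)\,dt
% For f(t) = e^{at}, with \operatorname{Re}(s) > a:
\mathcal{L}\{e^{at}\}(s)
  = \int_0^\infty e^{-st} e^{at}\,dt
  = \int_0^\infty e^{-(s-a)t}\,dt
  = \left[\frac{-e^{-(s-a)t}}{s-a}\right]_0^\infty
  = \frac{1}{s-a}
```

Short as it is, the point stands: engineers look the result up in a table and build on it rather than re-deriving it each time.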

Everyone blithely talks about using a calculator to check the math, but no one has argued that you need to understand the intricacies of binary and BCD arithmetic to use a calculator. Why not? Isn't that just as important as understanding how a slide rule works? So shouldn't everyone be using slide rules to verify the math? Is your company going to pay you your salary to verify that Algor's or Mathcad's algorithms are all correct and validated?

TTFN

FAQ731-376
 
"No one expects you to re-derive the Laplace transform."

Um, I seem to recall doing this at Uni.

In fact the derivation of various standard equations was the emphasis in a lot of my university classes.

Maybe it's a UK v US thing.

Then again maybe that wasn't your point IRstuff.

What you say makes a lot of sense; in fact I had further evidence just this morning that I may be a Luddite.

At the same time, though, the point about fundamentally knowing what you're doing, or at least realizing you don't know what you're doing, is a fair one.

KENAT, probably the least qualified checker you'll ever meet...
 