Amplifier driving SAR ADC 1

Status
Not open for further replies.

sanlog

Electrical
Aug 25, 2006
11
Does anyone know of a "true" 16-bit SAR ADC that has an on-chip amplifier? It seems like all of the converters require an external amplifier. Given that the data converter speed is known, the load is known, the input voltage level is known, the power supply voltage is known, and the noise is known...what's left? And the data sheets always show an optimum amplifier part number.

Driving these huge switched cap loads would sure be easier if they simply gave you a pin where you hang a cap!

 
Ha! I know it hurts.. [lol] You bumped me outta top spot in Electronics.[cry]

And thanks.

Oh and those are rough numbers as there are sooo many variations. (No multiplexing included) They just jumped out of my head and often first thoughts in EE are relatively correct (if you have a lot of experience in a particular area) [otherwise bad results]. I then went back and looked at them again and couldn't really change them. (seemed good still)

Keith Cress
Flamin Systems, Inc.-
 
Those numbers from Keith look pretty solid to me for "best practice" and are certainly realistic for a well-thought-out layout.

As to noise, the worst of the noise will come from the digital microprocessor part of the circuitry and will be virtually asynchronous from what the a/d itself is doing. Other sources of noise will be either truly wideband white noise, or perhaps mains-frequency related, if there are remote incoming analog inputs. All of which will look random and asynchronous to the a/d.

Simple oversampling will work fine if the inputs are slowly varying. But the software can be made a bit cleverer than that if it has to deal with sudden large step changes in amplitude. Do the best you can with the hardware, and then the final result will really come down to the software.
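One way to make the software "a bit cleverer" about step changes, as a rough Python sketch (the window length and step threshold here are illustrative, not from this thread): average over a sliding window, but flush the window whenever the input jumps by more than a threshold, so the average settles on the new level instead of lagging through the step.

```python
from collections import deque

class StepAwareAverager:
    """Sliding-window average that restarts on large input steps."""

    def __init__(self, window=16, threshold=100):
        # window: samples averaged; threshold: step size (in counts)
        # that invalidates the history. Both are illustrative values.
        self.threshold = threshold
        self.samples = deque(maxlen=window)

    def update(self, sample):
        # A large step means the old samples no longer describe the
        # signal, so discard them rather than averaging across the edge.
        if self.samples and abs(sample - self.samples[-1]) > self.threshold:
            self.samples.clear()
        self.samples.append(sample)
        return sum(self.samples) / len(self.samples)
```

For slowly varying inputs this behaves like plain oversampling; after a step larger than the threshold it responds immediately.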
 
So, based on Keith's numbers,

cost (in dollars) =~ 0.00516 * 1.83^resolution (in bits)

(with a correlation coefficient of about 0.99).
I just had to do a curve fit on that [glasses]
I'll leave it as an exercise to the reader to plug in 24 bit resolution...
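For anyone who doesn't want to do the exercise by hand, the curve fit above is quick to evaluate in Python (coefficients taken straight from the fit; they're a joke rule of thumb, not a pricing model):

```python
def adc_cost_dollars(bits):
    """Tongue-in-cheek cost estimate from the curve fit above."""
    return 0.00516 * 1.83 ** bits

for bits in (8, 12, 16, 24):
    print(f"{bits:2d} bits ~ ${adc_cost_dollars(bits):,.2f}")
```

Plugging in 24 bits lands a little above ten thousand dollars.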

Now, how to factor in samples per second and number of channels into the equation, and we might have a genuinely useful rule of thumb.

More seriously, I read a paper years ago outlining the limits of A to D conversion resolution and speed based on thermal noise and information theory. Some part specs today exceed the limits of physics. And that's not even accounting for things like clocking noise and power supply noise. It's always wise to separate your needs in terms of accuracy, resolution, and repeatability.
 
HA! Wow, surprising correlation!

Yeah, I stayed away from the multiplexing as that is a huge can of worms all by itself.

Bit specs are highly suspicious to me nowadays.

hmmm 24bits = $10,267.... sounds about right! :)

Keith Cress
Flamin Systems, Inc.-
 
Give me enough time (and money), and I'll give you all of the accuracy you can handle [:D]

Dan - Owner
 
Guys, I have a question.

If you knew the spectrum of the noise roughly, could you not build an FIR digital filter and get the bits back with much less oversampling than, say, 1024 times? The noise in this case, because it is interference from other logic, would be high in frequency, say more than 1 kHz.
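A minimal sketch of that idea in Python, assuming a 100 kS/s sample rate, a 1 kHz cutoff, and interference well above the cutoff (all of those numbers are illustrative): a windowed-sinc lowpass FIR built with NumPy alone, applied to a slow signal buried in 20 kHz "logic" noise.

```python
import numpy as np

def lowpass_fir(cutoff_norm, ntaps=63):
    """Windowed-sinc lowpass FIR; cutoff_norm = cutoff / sample rate."""
    n = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff_norm * n) * np.hamming(ntaps)
    return h / h.sum()  # normalize for unity DC gain

fs = 100_000.0                      # assumed sample rate, S/s
taps = lowpass_fir(1_000.0 / fs)    # 1 kHz cutoff

# Slow 50 Hz signal plus high-frequency interference from "other logic".
t = np.arange(4096) / fs
signal = 0.5 * np.sin(2 * np.pi * 50 * t)
noise = 0.2 * np.sin(2 * np.pi * 20_000 * t)
filtered = np.convolve(signal + noise, taps, mode="same")
```

With 63 taps the 20 kHz component is attenuated by tens of dB while the 50 Hz signal passes essentially untouched, which is the "get the bits back without 1024x oversampling" argument in miniature. Of course this only works if the interference really does sit above the signal band.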
 
One old way to counter some of the processor noise is to use an FM clock signal for the processor. This will mean using a separate external VCO clock oscillator plus some sort of modulator to vary its frequency; this modulator can be analogue cyclic, but better is a VCO driven by the processor itself from a pseudo-random look-up table. Not all processors can tolerate a wide enough range of clock rate to make this technique usable.
 
I think the discussion on noise has gotten off track. I may be wrong, but I think we are discussing several types of noise:

1. White noise from the source.
2. Deterministic noise from the source.
3. White noise from the ADC.
4. Deterministic noise from the ADC.
5. Deterministic noise from the microcontroller.

I think the ADC deterministic noise is carried in the specifications for DNL and INL (i.e. quantization noise). And I think there is very little white noise coming from the microcontroller.

Given all of the above, oversampling (which I think is really ensemble averaging) only increases the ADC resolution in the presence of white noise; either from the source, the ADC, or even the microcontroller if the noise exists.

Actually, SAR converters have a problem with oversampling. You don't get equal noise density of 1s and 0s right at the bit transitions; a requirement for averaging to work. This is not an issue for a delta-sigma ADC, which is why oversampling is a term readily applied to delta-sigma converters. And why delta-sigma converters achieve higher bit resolution at lower conversion rates. So there is an upper limit to "bit extension" if you ensemble average with a SAR.

There is a very interesting paper I read a while back that discusses the above in great detail. If anyone is interested I can go back in and pull up the title.

Nevertheless, oversampling does not remove deterministic noise when that noise falls below the averaging frequency.

This brings us to the last post about using a filter. In effect a filter is simply a "better" way to remove the deterministic noise than averaging, because the filter can have a much higher order of rolloff for the deterministic element.

Given all of the above, can you buy a 16-bit ADC/microcontroller whose deterministic noise is below the LSB size?

I don't expect the ADC to solve the source noise issues or even the ADC/microcontroller white noise. This can be done to a limited extent by ensemble averaging. But if running the microcontroller simultaneously with the 16 bit ADC creates a deterministic noise within the ADC, I think I am screwed unless I know the noise spectrum of the code. If I could "deterministically" (is that a word?) account for the noise, then one could use a high order filter provided the noise doesn't come at a very low frequency (i.e. 1Hz clock in software).


Boy! That's a lot of writing! But, buried in there is a question.

 