Ryan Allybaum
Electrical
- Sep 3, 2018
I have to work with a linear image sensor, a 1x2048 pixel array, and sample its video signal with an ADC (obviously). Am I dependent on the start-of-scan signal from the image sensor to synchronize my ADC sample rate? Otherwise, would I have to oversample in order to get a good analysis of the signal? E.g., every single ADC sample must be timed precisely with the peak of each pixel's analog voltage reading.
Is this supposed to be guesswork using delays inside of a microcontroller? Can an ADC sampling at the same frequency as the analog signal successfully sample uniformly? (Any offset from the peak could simply be gained up in software.)
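To put numbers on my worry, here's a quick sketch of the drift math (in C, with made-up example frequencies, not my actual sensor's specs): it shows how far a free-running ADC clock wanders from the pixel centers over one 2048-pixel line when it runs at nominally the same rate but isn't phase-locked to the sensor's pixel clock.

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical example values -- substitute the sensor's datasheet numbers. */
    const double pixel_clock_hz = 1.0e6;    /* sensor shifts out 1 pixel per microsecond */
    const double adc_clock_hz   = 1.0001e6; /* "same" ADC rate, but 100 ppm off (no sync) */
    const int    num_pixels     = 2048;     /* 1x2048 linear array */

    const double pixel_period = 1.0 / pixel_clock_hz;
    const double adc_period   = 1.0 / adc_clock_hz;

    /* Assume the first sample lands exactly on the first pixel's peak;
     * every subsequent sample accumulates the period mismatch. */
    for (int i = 0; i < num_pixels; i += 512) {
        double drift = i * (adc_period - pixel_period);
        printf("pixel %4d: sample lands %+.1f ns from pixel center (%+.1f%% of a pixel)\n",
               i, drift * 1e9, 100.0 * drift / pixel_period);
    }

    double total_drift = (num_pixels - 1) * (adc_period - pixel_period);
    printf("end of line: %+.1f ns total drift\n", total_drift * 1e9);
    return 0;
}
```

If my arithmetic is right, even a 100 ppm clock mismatch accumulates to roughly a fifth of a pixel period by the end of the line, which is what makes me suspect the sampling has to be tied to the sensor's start-scan/pixel clock rather than done open-loop with MCU delays.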