Hi, I'm retrofitting my old analog scanning electron microscopes with digitizers for display on a PC (see my post of Feb. 9, "Do Nyquist/filtering requirements hold for raster video digitizing?", and today's re: SNR of CRTs, LCDs, and film).

In the earlier discussion a reader suggested that delta-sigma converters were not a good choice because they are not phase linear. I haven't been able to find any direct reference to this problem, but in searching I took a more careful look at settling times of D/S converters. The implications are pretty startling.

Here's an example part I was seriously considering. It has a nominal word rate of 5 MSPS and a nominal bandwidth of 2.45 MHz: a sample every 200 ns. But full-step settling time to 0.001% is 47 samples, or 9.4 µs out! And the curve is steep: 1% at 6.4 µs, 10% at about 5.4 µs out. The impulse response graph is slightly better, but not much: at least 10% error over about 1 µs.

I guess I get my "intuitive" sense of rise/settling time vs. bandwidth solely from working with oscilloscopes. It boggles my mind how anything with a 9.4 µs settling time (implying a 'scope bandwidth of about 100 kHz) can be said to have a bandwidth of 2.45 MHz. Can this long settling time be understood in terms of lack of phase linearity?

It seems like these D/S converters require a totally different paradigm for understanding, and it seems they are totally unsuited for my application. The bandwidth of the analog front end of an SEM is nominally 5 MHz, and much of the appeal of SEM images lies in sharp edges and high contrast. Ultimately that bandwidth is not needed: by scanning more slowly, 1-2 MHz digitizing is fine, and even 100 kHz would be OK, except that the image refresh rate would be intolerably slow. So now, rather than looking at this 16-bit 5 MSPS D/S part, I'm looking at a 14-bit 10 MSPS SAR part.

It's a tough decision. At a refresh rate of 3-5 1Kx1K frames/sec I can live with considerable artifacts, and in turn live with a 10-second frame time for saving "final" images to disk. But I was planning on saving images of 4Kx4K or more as well, and 160 seconds is really getting "out there".

Agree? Comments?

-Jeff
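For intuition about how a 2.45 MHz bandwidth can coexist with a 9.4 µs settling figure, here is a minimal numpy sketch of a generic linear-phase FIR decimation filter. This is a hypothetical stand-in for the converter's internal filter, not the actual ADS1606 design: the group delay is fixed by the filter's symmetry alone, while "settling" measures how long the step response keeps rippling after that delay.

```python
import numpy as np

# Hypothetical 53-tap linear-phase lowpass, standing in for the converter's
# internal decimation filter (NOT the actual ADS1606 coefficients).
ntaps = 53
n = np.arange(ntaps) - (ntaps - 1) / 2
h = np.sinc(0.49 * n) * np.blackman(ntaps)   # windowed-sinc, near brick-wall
h /= h.sum()                                 # unity DC gain

step = np.ones(200)
y = np.convolve(step, h)[:200]               # full-scale step response

group_delay = (ntaps - 1) // 2               # 26 samples, set by symmetry alone
# "Settling time": last sample still more than 0.1% away from the final value.
settled = int(np.flatnonzero(np.abs(y - 1.0) > 1e-3)[-1] + 1)
```

The point of the sketch is that `settled` comes out strictly larger than `group_delay`: bandwidth and group delay say nothing about how long the ripple takes to die inside a tight error band.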
Delta-Sigma: bad choice for image digitizing? Horrible settling time! Not phase linear?
Started by ●February 16, 2005
Reply by ●February 16, 2005
jeff miller wrote:
> [original post quoted in full; snipped]

The filter that you were hoping -- and I doubt -- will suppress aliasing has long latency, but that doesn't affect the delay between samples. Think of the filter as a conveyor: samples come out as fast as they go in. It takes some time for the output stream to begin after the input starts, and it keeps coming for a while after the input stops. The major effect I see is a lateral shift in the displayed image.

D/S converters are death to servo systems, where delay is usually fatal. They are just dandy for canned video, where you can think of the latency as a slow response to the START button, and I don't see how they'll fail you. Think it through, though. My hunch shouldn't be taken as a guarantee.

Jerry
--
Engineering is the art of making what you want from things you can get.
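Jerry's conveyor picture can be checked numerically: a symmetric (linear-phase) FIR only shifts an in-band scan line sideways, and the shift is removable by discarding the pipeline-fill samples. A sketch with a hypothetical 21-tap moving average standing in for the converter's digital filter:

```python
import numpy as np

# One scan line with a sharp edge at pixel 40, standing in for the SEM video.
line = np.zeros(100)
line[40:] = 1.0

# Hypothetical 21-tap symmetric FIR (a simple moving average), standing in
# for the ADC's linear-phase digital filter.
ntaps = 21
h = np.ones(ntaps) / ntaps
delay = (ntaps - 1) // 2            # group delay of a symmetric FIR

out = np.convolve(line, h)          # conveyor: one sample out per sample in
aligned = out[delay:delay + 100]    # discard the pipeline-fill transient

# After removing the delay, the (blurred) edge sits back at pixel 40.
edge = int(np.argmin(np.abs(aligned - 0.5)))
```

The edge is softened by the filter but not displaced once the constant delay is subtracted, which is exactly the "lateral shift" effect and its cure.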
Reply by ●February 16, 2005
Jerry Avins wrote:
> jeff miller wrote:
>> [original post snipped]
>
> The filter that you were hoping -- and I doubt -- will suppress aliasing has long latency, but that doesn't affect the delay between samples. Think of the filter as a conveyor. Samples come out as fast as they go in. [...] My hunch shouldn't be taken as a guarantee.

The converter will only work well if it has good phase linearity -- i.e. the output should be a delayed version of the input and not be otherwise smeared out. If your ADC fits this, then the pure delay, as Jerry said, doesn't matter much.

Depending on how much you want to dink with things, you can improve the resolution of your SAR converter by adding a dither signal to your data and oversampling when you want higher resolution. If you dither over a goodly number of LSBs, then you'll average out quantization noise, electrical noise, and even some of the differential nonlinearity of the ADC. While you wouldn't want to calibrate on it, you could probably claim significantly more than 14 bits of resolution (_not_ precision) by doing this.

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
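Tim's dither-and-average trick is easy to demonstrate with an idealized quantizer model. Only the 14-bit step size comes from the thread; the input level, dither amplitude, and averaging factor are made up for illustration. For a DC input, averaging repeated conversions gains nothing without dither, because the quantization error is identical every time; with a few LSBs of dither it averages out:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, bits=14):
    """Ideal mid-tread quantizer over [0, 1): a toy model of a 14-bit SAR ADC."""
    lsb = 1.0 / (1 << bits)
    return np.round(x / lsb) * lsb

true_value = 0.123456789          # DC level sitting between code boundaries
n = 4096                          # oversampling (averaging) factor

# Without dither, averaging repeated conversions of a DC input gains nothing:
plain = quantize(np.full(n, true_value)).mean()

# With several LSBs of random dither (subtracted again after quantizing),
# the quantization error averages toward zero:
lsb = 1.0 / (1 << 14)
dither = rng.uniform(-4 * lsb, 4 * lsb, n)
dithered = (quantize(true_value + dither) - dither).mean()
```

With the dither, the averaged result lands far closer to the true level than one LSB, which is the "more resolution than precision" effect Tim describes.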
Reply by ●February 17, 2005
Tim Wescott wrote:
> Jerry Avins wrote:
>> jeff miller wrote:
>>> [original post snipped]
>>
>> [Jerry's reply snipped]
>
> The converter will only work well if it has good phase linearity -- i.e. the output should be a delayed version of the input and not be otherwise smeared out. If your ADC fits this, then the pure delay, as Jerry said, doesn't matter much.
>
> Depending on how much you want to dink with things, you can improve the resolution of your SAR converter by adding a dither signal to your data and oversampling when you want higher resolution. If you dither over a goodly number of LSBs, then you'll average out quantization noise, electrical noise, and even some of the differential nonlinearity of the ADC. While you wouldn't want to calibrate on it, you could probably claim significantly more than 14 bits of resolution (_not_ precision) by doing this.
I don't have a problem with the propagation or "group delay" per se, but it seems to me the settling time spec is a different issue. The docs warn that it's important to consider the settling time for a large step in input; presumably group delay is independent of step size. And whereas it is obvious that the group/propagation delay is 26 samples from the impulse response graph (and the spec table), the settling time to 0.001% is 47 samples. At 27 samples, settling error is 10%; at 32 samples it is still 1%, and at 40 samples 0.1%. The graph of settling times goes off scale (>10%) at 26 or fewer samples, presumably because it's nonsensical to speak of settling time before the signal has propagated through the group delay.

Focusing on the group delay of 26 samples and the 1% settling error at 32 samples, I have to conclude that the discrepancy (between 26 and 32 samples, for example) indicates it's a totally different issue. If the input toggles every 6 samples (a 416.7 kHz square wave at 5 MSPS), it seems to me that according to these graphs I can never expect better than 1% accuracy from a 16-bit converter: it will always be too busy "settling" from the last step.

The impulse response graph also indicates horrible ringing (as bad as 20%) for a few samples to either side of an impulse (and really bad ringing for many samples more to either side), which is fairly consistent with the settling time issue (although 10% doesn't equal 20%; I could be one sample off in my interpretation of the graphs). And it's symmetrical about the 26th sample: somehow the D/S technique and/or the digital filter "sniffs out" the impulse and starts bouncing around many cycles before the impulse has propagated. I don't see any other way of interpreting this than to expect ghosting around sharp edges in my images.

I've uploaded the data sheet to my virtual server as www.hibytes.com/ads1606.pdf; study page 22 with a critical eye.

Some web searching turns up parts with special solutions to this problem, but I think they only work for channel-switching applications: channel switching being a special case of input stepping, with the special feature that such stepping can be anticipated on a channel-switch command, whereas a step in the input signal can't be anticipated.

-Jeff
Reply by ●February 17, 2005
Tim Wescott wrote:
> The converter will only work well if it has good phase linearity -- i.e. the output should be a delayed version of the input and not be otherwise smeared out. If your ADC fits this, then the pure delay, as Jerry said, doesn't matter much.

In theory that's all well and good. But in reality you are almost always dealing with a smeared-out version of the input. That's the function of the anti-aliasing filter: to smear the signal, and what that smear looks like is called an impulse response. There is already a smearing of the input coming from the electron microscope, and there is apparently some capability to control what that smear looks like. But that's still theory; let's look at the real problem:

If I understand it correctly (and I may not), the electron microscope has an inherent bandlimit of 5 MHz (i.e. it can only follow the surface so fast). It sounds like you can control the speed, so that the slower you go, the more detail you can pick up. The actual spatial frequency capability has not yet been defined, but there must be a limit. The OP is planning to use a device that samples at 5 MHz. The D/S converter internally samples at 40 MHz, which is well above Nyquist for this application. There is an inherent filtering in the D/S process, and the OP is assuming that this will take care of aliasing of frequencies in the 2.5-5.0 MHz range. This is a reasonable assumption. The only question is: does the device have linear phase? A quick Google search reveals that makers of many D/S converters do claim that their devices are linear phase. But it doesn't have to be. If the device doesn't make that claim, I would be suspicious.

I'm assuming that this is essentially an image processing problem; that is, the output is designed for consumption by human eyeballs, so the process should be designed for that. Human eyes can only see about 6 bits' worth of grayscale levels, so it doesn't really make a lot of sense to gather 16 bits of data for that application. On the other hand, if the goal is to geometrically reproduce the scanned surface, then higher resolution would be good.

-jim

> [Tim's dithering suggestion snipped]

----== Posted via Newsfeeds.Com - Unlimited-Uncensored-Secure Usenet News==----
http://www.newsfeeds.com The #1 Newsgroup Service in the World! 120,000+ Newsgroups
----= East and West-Coast Server Farms - Total Privacy via Encryption =----
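The rates jim and Jeff mention hang together arithmetically. A quick check of the oversampling ratio and of Jeff's 160-second figure (the 10 s per 1Kx1K "final" frame is taken from his post; the pixel rate is inferred from it, not specified anywhere):

```python
word_rate = 5e6                 # converter output word rate, samples/s
osr = 40e6 / word_rate          # internal modulator rate over word rate

frame_1k = 1024 * 1024          # pixels per 1K x 1K frame
frame_4k = 4096 * 4096          # pixels per 4K x 4K frame

pixel_rate = frame_1k / 10.0    # ~105k pixels/s implied by a 10 s "final" frame
t_4k = frame_4k / pixel_rate    # seconds for a 4K x 4K frame at that rate
```

An oversampling ratio of 8 and a 160-second 4Kx4K frame both fall straight out of the numbers quoted in the thread.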
Reply by ●February 17, 2005
Hi Jeff,

The spec sheet for the device says it is linear phase, so that is covered. The delay is not important; your audience will never perceive the delay. The only thing I would question is: do you need 16 bits? It does say this device is good for medical imaging, but they need the extra resolution because the object there is to extract information the eye can't see. But maybe you want that too.

-jim

jeff miller wrote:
> [earlier thread quoted in full; snipped]
Reply by ●February 17, 2005
jeff miller wrote:
...
> I don't have a problem with the propagation or "group delay" per se, but it seems to me the settling time spec is a different issue. [...] The graph of settling times goes off scale (>10%) at 26 or fewer samples, presumably because it's nonsensical to speak of settling time before the signal has propagated through the group delay.

Trust me. It's OK. Here's what happens: every transversal (the typical FIR structure) filter has a start-up transient. (For that matter, so do analog filters. The caps have to charge, etc.) What the spec sheet calls the settling time is the transient duration. Once the pipeline is filled, the output is normal, but delayed. The quality of what comes out after you stop getting new data deteriorates as the pipeline empties. So what? Just scan an extra line.

Jerry
--
Engineering is the art of making what you want from things you can get.
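Jerry's pipeline claim — garbage while the taps fill, then an exact delayed copy — holds for a symmetric FIR driven by an in-band signal. A sketch in which both the filter and the test signal are invented for illustration:

```python
import numpy as np

# Hypothetical 53-tap linear-phase lowpass (not the actual converter filter).
ntaps = 53
n = np.arange(ntaps) - (ntaps - 1) / 2
h = np.sinc(0.45 * n) * np.hamming(ntaps)
h /= h.sum()
delay = (ntaps - 1) // 2

# An in-band input: a sinusoid well below the filter cutoff.
t = np.arange(500)
x = np.sin(2 * np.pi * 0.02 * t)

y = np.convolve(x, h)[:500]

# While the pipeline fills, the output bears little relation to the input...
startup_err = float(np.abs(y[:delay] - x[:delay]).max())
# ...but once filled, it is just the input delayed by `delay` samples:
steady_err = float(np.abs(y[ntaps:] - x[ntaps - delay:500 - delay]).max())
```

The steady-state error is down at the filter's passband-ripple level, while the start-up transient is order unity, which is why "scan an extra line" is the whole fix.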
Reply by ●February 17, 2005
jim wrote:
> Tim Wescott wrote:
>> The converter will only work well if it has good phase linearity -- i.e. the output should be a delayed version of the input and not be otherwise smeared out. [...]
>
> In theory that's all well and good. But in reality you are almost always dealing with a smeared-out version of the input. That's the function of the anti-aliasing filter: to smear the signal. [...] Human eyes can only see about 6 bits' worth of grayscale levels, so it doesn't really make a lot of sense to gather 16 bits of data for that application. On the other hand, if the goal is to geometrically reproduce the scanned surface, then higher resolution would be good.

Except that in video processing one often dispenses with the anti-aliasing filter -- in image processing circles such a filter would be done spatially and would be known as "soft focus". In general you can get a nicer-looking image by focusing down smaller than your effective detector size (i.e. you want your blur spot to be on the order of your detector pitch).

This creates all sorts of interesting difficulties for anything that involves interpolating between pixels, but it's what looks best.

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by ●February 17, 2005
jeff miller wrote:
-snip-
> And it's symmetrical about the 26th sample: somehow the D/S technique and/or the digital filter "sniffs out" the impulse and starts bouncing around many cycles before the impulse has propagated. I don't see any other way of interpreting this than to expect ghosting around sharp edges in my images.

It's not "somehow" -- it sounds like the digital filter implements a sinc(x) or [sinc(x)]^n function that's offset by 26 samples. So when the impulse hits, the output will start ringing up, hit a maximum at sample #26, then die down until it gets to sample #50 or so. This is just grand for making a brick-wall filter in the frequency domain, but any time you have a sharp cutoff in the frequency domain you have a corresponding ring in the time domain.

You may be able to diminish this effect somewhat by following your D/S converter with a Gaussian filter, but you'll have a distinct tradeoff between frequency response and nice settling, which would translate to less ringing but a longer settling time on the "bulk" signal. I don't know if you'd end up with an improvement in settling time this way or not.

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
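The Gaussian post-filter tradeoff Tim describes can be sketched directly: a truncated-sinc (brick-wall) filter overshoots roughly 9% on a step (the classic Gibbs figure), and a modest Gaussian smoother after it knocks the ringing down while widening the edge. Tap counts and the Gaussian width below are arbitrary choices for illustration:

```python
import numpy as np

# Sharp-cutoff filter: truncated sinc, the classic source of Gibbs ringing.
ntaps = 101
n = np.arange(ntaps) - (ntaps - 1) / 2
h_brick = np.sinc(0.5 * n)
h_brick /= h_brick.sum()

step = np.ones(400)
y_brick = np.convolve(step, h_brick)[:400]

# Gaussian post-filter (sigma = 4 samples): wider edge, far less ringing.
m = np.arange(-15, 16)
g = np.exp(-0.5 * (m / 4.0) ** 2)
g /= g.sum()
y_smooth = np.convolve(y_brick, g, mode="same")

overshoot_brick = float(y_brick.max() - 1.0)
overshoot_smooth = float(y_smooth[:380].max() - 1.0)
```

The Gaussian stage suppresses the ringing by orders of magnitude, at the cost of a slower edge; whether the combined settling time improves depends on the error band you care about, as Tim says.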
Reply by ●February 17, 2005
Tim Wescott wrote:
> Except that in video processing one often dispenses with the anti-aliasing filter -- in image processing circles such a filter would be done spatially and would be known as "soft focus". [...]

If you're talking about an image acquired by an array of detectors, they dispense with the anti-aliasing filter simply because they have no way to implement it. That's not the situation the OP is in. He has an analog signal in an electrical circuit that he is sampling.

-jim