DSPRelated.com
Forums

interface to ADC and DAC

Started by hellodsp July 30, 2005
Hi all,

In building a high-speed communications system, I have some general
thoughts on synchronizing the DSP with the DAC and ADC, and I would like
to have your opinion.

In transmit mode, I think there are two ways:

1. Set up an interrupt from a timer, with the transmitter task inside
the ISR.
2. Each time the DAC successfully reads from the DSP parallel port, it
raises an interrupt, and the DSP processes and writes the next sample
to the I/O port.

Which scheme is better? I feel 2 is probably better.

In receive mode, I can still think of two ways:

1. Interrupt per sample. This consumes a lot of DSP cycles for handling
the ISR, but it's truly real-time.
2. Use DMA to move a block of data into memory and then process. This
saves DSP cycles, but one might need dual buffers to keep processing
latency small. With dual buffers, is there any problem doing filtering
across the buffer boundary?
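Filtering across a block boundary is not a problem as long as the filter's state (the last few input samples) is carried from one block to the next. Here is a minimal sketch of that idea in Python, assuming block-based processing as in option 2; the coefficients and block sizes are arbitrary placeholders, not values from the thread:

```python
# Streaming FIR across block boundaries by carrying filter state.
# 'history' holds the last len(coeffs)-1 samples of the previous
# block, so each block's output is seamless with the one before it.
import numpy as np

def fir_block(block, coeffs, history):
    """Filter one DMA block; returns the outputs and the new history."""
    x = np.concatenate([history, block])
    y = np.convolve(x, coeffs, mode="valid")   # exactly len(block) outputs
    new_history = x[-(len(coeffs) - 1):]       # tail carried to next block
    return y, new_history

coeffs = np.array([0.25, 0.5, 0.25])           # placeholder taps
signal = np.arange(16.0)

# Process in two blocks of 8, carrying state across the boundary.
hist = np.zeros(len(coeffs) - 1)
y1, hist = fir_block(signal[:8], coeffs, hist)
y2, hist = fir_block(signal[8:], coeffs, hist)
blockwise = np.concatenate([y1, y2])

# One-shot reference over the whole signal (zero initial state):
reference = np.convolve(np.concatenate([np.zeros(len(coeffs) - 1), signal]),
                        coeffs, mode="valid")
```

The block-by-block result matches the one-shot filter exactly, so the dual-buffer boundary introduces no error provided the history samples are preserved.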

Michael


This message was sent using the Comp.DSP web interface on
www.DSPRelated.com
hellodsp wrote:

> Hi all,
>
> In building a high-speed communications system, I have some general
> thoughts on synchronizing the DSP with the DAC and ADC, and I would
> like to have your opinion.
>
> In transmit mode, I think there are two ways:
>
> 1. Set up an interrupt from a timer, with the transmitter task inside
> the ISR.
> 2. Each time the DAC successfully reads from the DSP parallel port, it
> raises an interrupt, and the DSP processes and writes the next sample
> to the I/O port.
>
> Which scheme is better? I feel 2 is probably better.
Define "better". This usually means you need to investigate the cost
and performance impact of your various decisions, and decide which one
fits your product the best.

If you're setting up DMA channels anyway, why not load an output DMA
buffer from the DSP, and use DMA to write to the DAC?

In control systems for all but the finickiest plants I usually just
write the DAC any old time (actually I write the DAC just as soon as
the control computation gets done). This would cause all sorts of
subtle problems in audio and communications systems where
synchronization is much more important than speed.
> In receive mode, I can still think of two ways:
>
> 1. Interrupt per sample. This consumes a lot of DSP cycles for
> handling the ISR, but it's truly real-time.
Look on the web for definitions of "real time". Real time does not mean real fast. Real time means that you have hard deadlines that must be met -- but if the task is to maintain the level of the lake behind a dam, the hard deadline for adjusting the spill gates may be every day at midnight.
> 2. Use DMA to move a block of data into memory and then process. This
> saves DSP cycles, but one might need dual buffers to keep processing
> latency small. With dual buffers, is there any problem doing filtering
> across the buffer boundary?
Or use DMA to move single ADC reads into a circular buffer -- if your
DMA hardware is up to it.

How important is latency in your particular application? 1 sample
time? 2? 5? 10? 100? I sometimes work with imaging systems where a
frame time (1/30th of a second, or well over 10^6 ADC sample times) is
perfectly acceptable. I often work on control systems where you want
to keep the latency to within 1 sample time.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
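The circular-buffer arrangement Tim suggests can be sketched as a producer/consumer pair: the DMA engine deposits single ADC reads at a write index, and the foreground loop drains whatever has arrived each time it runs, so latency is bounded by the loop period rather than a fixed block size. A toy Python simulation (buffer size and sample values are illustrative, not from the thread):

```python
# Producer/consumer sketch of DMA-into-circular-buffer reception.
BUF_SIZE = 64            # power of two so the wrap is a cheap mask
buf = [0] * BUF_SIZE
wr = 0                   # advanced by the "DMA" (producer)
rd = 0                   # advanced by the processing loop (consumer)

def dma_write(sample):
    """Stand-in for the DMA engine storing one ADC read."""
    global wr
    buf[wr % BUF_SIZE] = sample
    wr += 1

def drain():
    """Foreground loop: consume every sample that has arrived so far."""
    global rd
    out = []
    while rd < wr:
        out.append(buf[rd % BUF_SIZE])
        rd += 1
    return out

# "DMA" delivers 10 samples between runs of the processing loop.
for s in range(10):
    dma_write(s)
processed = drain()
```

On real hardware the read/write indices come from the DMA controller's registers, and the buffer must be sized so the producer never laps the consumer between foreground-loop runs.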
Tim,

Thanks for replying to my post. In my system, I need to keep the
latency below 100 samples, but a packet of data could contain several
thousand samples. On the transmit side, I can't afford the large buffer
and latency of storing a block and then using DMA to output it to the
DAC, so I might use a timer to generate an interrupt for each write to
the DAC. On the receive side, I will probably use sample-based
interrupts (I guess the same situation as yours), but I'm worried about
the overhead of the ISR.
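The ISR-overhead worry can be checked with a quick budget calculation: divide the CPU cycles available per sample by the measured cost of one interrupt. All figures below are illustrative assumptions, not numbers from this thread; substitute your own clock, sample rate, and measured ISR cost:

```python
# Back-of-envelope estimate of per-sample interrupt overhead.
cpu_hz = 150e6          # assumed DSP clock
fs = 1e6                # assumed sample rate for a high-speed link
isr_cycles = 60         # assumed context save/restore + handler body

cycles_per_sample = cpu_hz / fs            # cycle budget per sample
overhead = isr_cycles / cycles_per_sample  # fraction lost to the ISR
```

With these assumed numbers, 60 of the 150 cycles per sample (40%) go to interrupt handling before any signal processing happens, which is why per-sample interrupts become untenable as the sample rate approaches the CPU clock divided by the ISR cost.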

Michael

