Hi everybody,

Here is another simple question for you: I'd like to use a delay function that would let me create delays of x ns. First, is this feasible? What is the minimum delay I can achieve on the TMS320C5402 with a 100 MHz CPU clock?

I'm also wondering: if I write the instruction " for(i=1;i>0;i--); ", how much time does it take?

Once more, thanks everybody!

Jerome
TMS320C5402: a simple question about delay
Started by ●May 26, 2004
Reply by ●May 26, 2004
jerome_lapeyre_mirande@hotmail.com (french_student) writes:

> Hi everybody,
>
> Here is another simple question for you : I'd like to use a delay
> function which would allow me to make delays of x ns. First, is it
> feasible ? What's the minimum delay i can do with the TMS320C5402
> working with a 100 MHz CPU clock?
> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
> much time does it take?
>
> Once more time, thanks everybody!
>
> Jerome

Jerome,

If your delay time is a multiple of the sample period, then you can simply buffer the samples. If it is not, then you can use a fractional delay filter, which is basically an all-pass filter with a group delay equal to the desired time delay.

--
Randy Yates
Sony Ericsson Mobile Communications
Research Triangle Park, NC, USA
randy.yates@sonyericsson.com, 919-472-1124
Reply by ●May 26, 2004
french_student wrote:

> Hi everybody,
>
> Here is another simple question for you : I'd like to use a delay
> function which would allow me to make delays of x ns. First, is it
> feasible ? What's the minimum delay i can do with the TMS320C5402
> working with a 100 MHz CPU clock?
> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
> much time does it take?
>
> Once more time, thanks everybody!
>
> Jerome

How much memory can you use for a buffer? Store an incoming sample in the (circular) buffer, and read it out N cycles later. At 100 MHz, the delay is 10(N+C) ns, where C is the constant number of overhead cycles.

Jerry
--
Engineering is the art of making what you want from things you can get.
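Jerry's circular-buffer scheme can be sketched in a few lines of portable C. This is a minimal sketch, not code from the thread: the buffer length, sample type, and function name are illustrative assumptions.

```c
#define BUF_LEN 1024u      /* assumed maximum delay, in samples */

static short buf[BUF_LEN]; /* circular buffer, zero-initialized */
static unsigned wr = 0;    /* write index, wraps modulo BUF_LEN */

/* Store the incoming sample and return the sample from n samples
   ago (n < BUF_LEN).  For the first n calls the output is the
   buffer's initial contents, i.e. zero. */
short delay_sample(short in, unsigned n)
{
    buf[wr] = in;                                /* write newest sample */
    unsigned rd = (wr + BUF_LEN - n) % BUF_LEN;  /* index n samples back */
    wr = (wr + 1) % BUF_LEN;
    return buf[rd];
}
```

On a DSP you would typically replace the `%` with a power-of-two mask (or the C54x's circular addressing modes), but the logic is the same: the delay in samples is just the distance between the read and write indices.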
Reply by ●May 26, 2004
On Wed, 26 May 2004 11:14:54 -0400, Jerry Avins <jya@ieee.org> wrote:

> french_student wrote:
>
>> Hi everybody,
>>
>> Here is another simple question for you : I'd like to use a delay
>> function which would allow me to make delays of x ns. First, is it
>> feasible ? What's the minimum delay i can do with the TMS320C5402
>> working with a 100 MHz CPU clock?
>> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
>> much time does it take?
>>
>> Once more time, thanks everybody!
>>
>> Jerome
>
> How much memory can you use for a buffer? Store an incoming sample in
> the (circular) buffer, and read it out N cycles later. At 100 MHz, the
> delay is 10(N+C) ns, where C is the constant number of overhead cycles.

From reading his code snippet, he only wants to waste a small, precise amount of time by using a do-nothing loop, rather than delay an actual signal.

FWIW, I've done some stuff with the 5402, and a SWAG at the time of that countdown-from-1-to-0 loop, running at 100 MHz, is about 10 or 20 ns. If you really need delays like that, there are better ways to do it. For short delays, a few NOPs or a short loop like you have would be okay (I'd rather use assembly-code NOPs or an assembly-code loop, as their timing won't change with different compile options, whereas it might with C code). For much longer delays, it would be better to set up a timer interrupt and let the processor go off and do something useful.

> Jerry

-----
http://mindspring.com/~benbradley
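Ben's warning about compile options is worth illustrating: a plain empty `for` loop may be deleted entirely by an optimizing compiler. A `volatile` loop counter at least keeps the loop in the compiled output, though the cost per iteration still varies by compiler and options. This is a generic C sketch, not C5402-cycle-accurate code; the function name is illustrative.

```c
/* Busy-wait for roughly n loop iterations.  'volatile' stops the
   compiler from optimizing the empty loop away, but the cycles per
   iteration still depend on the compiler and its options -- calibrate
   on the target, or use assembly NOPs / a hardware timer when the
   delay must be exact. */
void delay_loop(unsigned long n)
{
    volatile unsigned long i;
    for (i = n; i > 0; i--)
        ;   /* do nothing, just burn cycles */
}
```

An assembly-language loop (or a run of NOPs) sidesteps the compiler entirely, which is why Ben prefers it when the delay must be precise.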
Reply by ●May 26, 2004
Ben Bradley wrote:

> From reading his code snippet, he only wants to waste a small,
> precise amount of time by using a do-nothing loop, rather than delay
> an actual signal.
>
> FWIW, I've done some stuff with the 5402, and a SWAG at the time of
> that countdown-from-1-to-0 loop, running at 100MHz, is about 10 or
> 20ns. If you really need delays like that, there are better ways to do
> it. For short delays, a few nop's or a short loop like you have would
> be okay (I'd rather use assembly-code nop's or an assembly-code loop,
> as its timing won't change with different compile options, whereas it
> might with C code). For much longer delays, it would be better to set
> up a timer interrupt and let the processor go off and do something
> useful.

A general bit of unsolicited advice: software delays are almost never just waiting for time to pass - they are more often allowing time for an event to occur. It's better to monitor the event and proceed after it has occurred than it is to wait until the event has had time to happen. These days the only reason I ever use software delays is when generating reset pulses or when forced to write "bit banging" code (like generating I2C pulses using digital I/O lines). But whenever I DO use a delay, I make sure there's not a better way to do it.

--
Jim Thomas            Principal Applications Engineer      Bittware, Inc
jthomas@bittware.com  http://www.bittware.com              (703) 779-7770
To understand recursion, one must first understand recursion.
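Jim's "monitor the event, don't just wait for it" advice can be sketched as a poll-with-timeout. This is an illustrative sketch: in real firmware the flag would be set by an ISR or read from a status register; here it is an ordinary variable so the idea can be shown portably.

```c
#include <stdbool.h>

/* Wait until *flag becomes true, giving up after max_polls tries.
   Returns true if the event occurred, false on timeout.  Polling the
   actual event (instead of sleeping a fixed time and hoping) proceeds
   as soon as the event happens and detects the case where it never
   does. */
bool wait_for_event(volatile bool *flag, unsigned long max_polls)
{
    while (max_polls--) {
        if (*flag)
            return true;   /* event happened: proceed immediately */
    }
    return false;          /* timed out: caller handles the error */
}
```

Compared with a blind delay, this is both faster in the common case (no over-waiting) and safer in the failure case (a timeout is reported instead of silently proceeding).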
Reply by ●May 26, 2004
"Randy Yates" <randy.yates@sonyericsson.com> wrote in message news:xxp1xl7b78b.fsf@usrts005.corpusers.net...

> jerome_lapeyre_mirande@hotmail.com (french_student) writes:
>
>> Hi everybody,
>>
>> Here is another simple question for you : I'd like to use a delay
>> function which would allow me to make delays of x ns. First, is it
>> feasible ? What's the minimum delay i can do with the TMS320C5402
>> working with a 100 MHz CPU clock?
>> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
>> much time does it take?
>>
>> Once more time, thanks everybody!
>>
>> Jerome
>
> Jerome,
>
> If your delay time is a multiple of the sample period, then you can
> simply buffer the samples. If it is not, then you can use a fractional
> delay filter, which is basically an all-pass filter with a group delay
> equal to the desired time delay.

Randy

I strongly suspect the OP wasn't trying to implement a delay line for signal operations, but wanted something more along the lines of an x ns delay in software before performing other actions. BTW, Jim's recommendations should be read before my comments, because they address the broader topic.

To answer the original questions: yes, it's possible to make delays of x ns. The minimum delay with a 100 MHz CPU clock is 10 ns - I don't believe this is DSP dependent. If you are trying to get delays at the resolution of the processor clock in software and you care about accuracy, you should re-think your approach.

>> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
>> much time does it take?

This depends on your compiler. My recommendation is to benchmark it on your emulator. If you do want to implement delays using such empty loops, I recommend using macros and inline assembly instructions to get better accuracy. The best option is to use a timer, which I'm sure your DSP contains.

Cheers
Bhaskar

> --
> Randy Yates
> Sony Ericsson Mobile Communications
> Research Triangle Park, NC, USA
> randy.yates@sonyericsson.com, 919-472-1124
Reply by ●May 26, 2004
"Bhaskar Thiagarajan" <bhaskart@deja.com> writes:

> "Randy Yates" <randy.yates@sonyericsson.com> wrote in message
> news:xxp1xl7b78b.fsf@usrts005.corpusers.net...
>
>> jerome_lapeyre_mirande@hotmail.com (french_student) writes:
>>
>>> Hi everybody,
>>>
>>> Here is another simple question for you : I'd like to use a delay
>>> function which would allow me to make delays of x ns. First, is it
>>> feasible ? What's the minimum delay i can do with the TMS320C5402
>>> working with a 100 MHz CPU clock?
>>> I'm wondering : if I write this instruction " for(i=1;i>0;i--); " how
>>> much time does it take?
>>>
>>> Once more time, thanks everybody!
>>>
>>> Jerome
>>
>> Jerome,
>>
>> If your delay time is a multiple of the sample period, then you can
>> simply buffer the samples. If it is not, then you can use a fractional
>> delay filter, which is basically an all-pass filter with a group delay
>> equal to the desired time delay.
>
> Randy
> I strongly suspect, the OP wasn't trying to implement a delay line for
> signal operations but more on the lines of a x ns delay in software before
> performing other actions.

Ayup - I missed that. Sorry! (I gotta start READING these posts before I respond to them...)

--
%  Randy Yates                  % "So now it's getting late,
%% Fuquay-Varina, NC            %    and those who hesitate
%%% 919-577-9882                %    got no one..."
%%%% <yates@ieee.org>           % 'Waterfall', *Face The Music*, ELO
http://home.earthlink.net/~yatescr
Reply by ●May 27, 2004
Hi everybody,

Thanks for your answers.

Apparently I wasn't clear enough. I don't want to add a delay to an incoming signal; I just want a short delay before the program carries on. As my program is in C, and as I'd like it to stay that way for the moment, I'd rather use a C instruction. So you say a classic loop isn't accurate and you advise using the timer. But is it hard to configure? What would the instructions be? Do I really need to use the timer?

You also say the duration of the classic loop "for(...)" depends on the compiler. Could you explain why? I thought it only depended on the DSP's CPU clock.

I think that's all for this subject. But don't worry, I have other questions!! ;:)

Thanks everyone.

Jerome
Reply by ●May 27, 2004
french_student wrote:

> Hi everybody,
>
> Thanks for your answers.
>
> Apparently i wasn't clear enough. I don't want to add a delay to an
> incoming signal, i just want to do a short delay before keeping on the
> program. As my prog is in C, and as i'd like it to remain if possible
> for the moment, i'd rather use a C instruction. So you say a classic
> loop isn't accurate and you advice to use the timer. But is it hard to
> configure? What would be the instructions?

You need to read the manual to get an understanding of the processor architecture. With that, the rest is easy.

> Do i really need to use the timer?

No, but you need to read the assembler output each time to see what it does.

> You also say the duration of the classic loop "for(...)" depends on
> the compiler. Could you explain why? I thought it only depended on the
> DSP's CPU clock.

Different compilers and different optimization levels with the same compiler can translate the same C expression to different instruction sequences. A good optimizing compiler might simply eliminate an empty loop altogether.

> I think that's all for this subject. But don't worry i have other
> questions!! ;:)

Don't worry; we have other answers. Here's one: you can write your delay loop in assembler, then call that from C. With a little thought, you can write it to accept a passed delay argument. By determining the actual delay when the argument is zero, you can compensate for that minimum time and have the routine delay the actual number of clock cycles you call for. There will be a minimum, of course, but you will know it.

Jerry
--
Engineering is the art of making what you want from things you can get.
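Jerry's calibration idea - measure the fixed overhead once, then subtract it from every request - can be sketched as follows. The two calibration constants here are hypothetical placeholders; on real hardware you would measure them (e.g. with the on-chip timer) for your assembler delay routine.

```c
/* Hypothetical calibration numbers, to be measured once on the
   target: the fixed call/return overhead of the delay routine
   (its cost when the argument is zero) and the cost of one loop
   iteration, both in CPU cycles. */
#define OVERHEAD_CYCLES  8UL   /* assumed: cost of delay_loop(0) */
#define CYCLES_PER_ITER  4UL   /* assumed: cost per iteration   */

/* Convert a requested delay in CPU cycles into a loop count that
   compensates for the fixed overhead.  Requests at or below the
   overhead are clamped to zero iterations: the routine's minimum
   delay is OVERHEAD_CYCLES, as Jerry notes, but now it is known. */
unsigned long iterations_for(unsigned long target_cycles)
{
    if (target_cycles <= OVERHEAD_CYCLES)
        return 0;  /* can't delay less than the fixed minimum */
    return (target_cycles - OVERHEAD_CYCLES) / CYCLES_PER_ITER;
}
```

At 100 MHz one cycle is 10 ns, so a requested delay in nanoseconds divides by 10 to give `target_cycles` before calling this helper.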
Reply by ●June 1, 2004
Hi everybody,

Thanks for all your answers. It's really interesting. For the moment, I don't think I need a delay, but if one day I do, I'll try the things you advised.

Thanks again.

Jerome

>> Hi everybody,
>>
>> Thanks for your answers.
>>
>> Apparently i wasn't clear enough. I don't want to add a delay to an
>> incoming signal, i just want to do a short delay before keeping on the
>> program. As my prog is in C, and as i'd like it to remain if possible
>> for the moment, i'd rather use a C instruction. So you say a classic
>> loop isn't accurate and you advice to use the timer. But is it hard to
>> configure? What would be the instructions?
>
> You need to read the manual to get an understanding of the processor
> architecture. With that, the rest is easy.
>
>> Do i really need to use the timer?
>
> No, but you need to read the assembler output each time to see what it
> does.
>
>> You also say the duration of the classic loop "for(...)" depends on
>> the compiler. Could you explain why? I thought it only depended on the
>> DSP's CPU clock.
>
> Different compilers and different optimization levels with the same
> compiler can translate the same C expression to different instruction
> sequences. A good optimizing compiler might simply eliminate an empty
> loop altogether.
>
>> I think that's all for this subject. But don't worry i have other
>> questions!! ;:)
>
> Don't worry; we have other answers. Here's one: you can write your delay
> loop in assembler, then call that from C. With a little thought, you can
> write it to accept a passed delay argument. By determining the actual
> delay when the argument is zero, you can compensate for that minimum
> time and have the routine delay the actual number of clock cycles you
> call for. There will be a minimum, of course, but you will know it.
>
> Jerry