Reply by Jeff Brower November 1, 2006
Abhishek Dixit-

> I am working on a video decoding algorithm on a TMS320C6416 DSK at 600
> MHz.
>
> On enabling and viewing the clock while profiling, I find that it takes
> nearly 20 megacycles to decode a single frame.
> But the real time taken to decode a single frame with profiling enabled
> comes out to a few seconds.
>
> When I disable the clock, it decodes 3 frames per second even though the
> CPU clock is supposed to be 600 MHz.
> I have checked the CLKMODE0 and CLKMODE1 pins, which are at 0 and 1
> respectively.
>
> Can somebody help me understand why the real time for decoding is so much
> higher than expected?
> We should be able to decode 30 frames per second, but we are getting only
> 3 frames, and with the clock enabled it drops to less than a frame per
> second.
> What could be the possible reasons for such a mismatch between the cycles
> taken for decoding and the elapsed time?
> Or is there anything wrong with the cycle measurement itself?

Profiling consumes C64x CPU resources, as does any RTDX communication with the host.
The point of profiling is to learn which code takes what amount of time relative to
other code. I think of it as "percentage execution time". If you want fastest
real-time performance, all host I/O (printf, logprintf, getch, etc.) has to be turned
off, all RTDX has to be off, and "Run Free" should be selected in the CCS Debug
menu. In that case, if you need to measure specific processing intervals (like why
certain decode functions are still running too slowly), then you can use an on-chip
timer (TIMER1, manually) or add DSP code to toggle GPIO lines at the start/end of the
"code of interest" and measure those lines on a digital scope.
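
For reference, here is a minimal sketch of the on-chip timer approach. The Timer1
register addresses, control-bit positions, and the CPU/8 tick rate are assumptions
taken from the C621x/C64x timer peripheral map, so verify them against the C6416
datasheet before trusting the numbers:

/* Minimal sketch: measure a decode interval with on-chip Timer1.
 * ASSUMPTIONS: register addresses, control bits, and the CPU/8 internal
 * tick rate come from the C621x/C64x timer map -- check the C6416
 * datasheet before relying on this. */

#define TIMER1_CTL  (*(volatile unsigned int *)0x01980000)  /* control  */
#define TIMER1_PRD  (*(volatile unsigned int *)0x01980004)  /* period   */
#define TIMER1_CNT  (*(volatile unsigned int *)0x01980008)  /* counter  */

/* Assumed control bits: GO = bit 6, HLD = bit 7 (1 = run),
 * CLKSRC = bit 9 (1 = internal clock). */
#define TIMER1_RUN  ((1u << 6) | (1u << 7) | (1u << 9))

volatile unsigned int decode_cycles;  /* read in a watch/memory window  */

extern void decode_one_frame(void);   /* placeholder "code of interest" */

static void timer1_start(void)
{
    TIMER1_CTL = 0;             /* hold the timer                    */
    TIMER1_PRD = 0xFFFFFFFF;    /* free-running, maximum period      */
    TIMER1_CTL = TIMER1_RUN;    /* internal clock, release hold, go  */
}

void measure_decode(void)
{
    unsigned int t0, t1;

    timer1_start();

    t0 = TIMER1_CNT;
    decode_one_frame();
    t1 = TIMER1_CNT;

    /* Timer ticks at CPU/8, so scale back to CPU cycles. Store the
     * result instead of printf'ing it so host I/O doesn't perturb the
     * run; halt afterwards and read decode_cycles. */
    decode_cycles = (t1 - t0) * 8u;
}

The GPIO-toggle version works the same way -- drive a pin high before the call, low
after it, and read the pulse width on the scope.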

-Jeff
Reply by Abhishek Dixit November 1, 2006
Hi All,

I am working on a video decoding algorithm on a TMS320C6416 DSK at 600
MHz.

On enabling and viewing the clock while profiling, I find that it takes
nearly 20 megacycles to decode a single frame.
But the real time taken to decode a single frame with profiling enabled
comes out to a few seconds.

When I disable the clock, it decodes 3 frames per second even though the
CPU clock is supposed to be 600 MHz.
I have checked the CLKMODE0 and CLKMODE1 pins, which are at 0 and 1
respectively.

Can somebody help me understand why the real time for decoding is so much
higher than expected?
We should be able to decode 30 frames per second, but we are getting only
3 frames, and with the clock enabled it drops to less than a frame per
second.
What could be the possible reasons for such a mismatch between the cycles
taken for decoding and the elapsed time?
Or is there anything wrong with the cycle measurement itself?

Thanks and Regards

Abhishek