DSPRelated.com
Forums

TMS320VC33 Dropping Sync Serial Port Interrupt

Started by kk7p...@wavecable.com November 30, 2007
I have a system running that uses the Serial Port interrupt. It is the only interrupt enabled in the system. All processes are launched from the interrupt service routine. There are 4 interrupts spaced about 2.5 usec apart, followed by a 10 usec period with no interrupts. Thus we have about 200,000 interrupts/sec. The ISR itself, with overhead for call and exit, has a worst-case execution time of less than 200 nsec, or about 10% of the overall CPU load. The interrupts can never nest (although the routines are re-entrant and would not cause a crash if they were).

The 'VC33 is typically running at about 80% CPU load. The system can be configured by the user to consume about 90% of the CPU steady-state. User interaction can result in peak loads approaching 100%, but only for a few microseconds.

I find that upon rare occasion the DSP stops servicing interrupts. This happens under heavy user interaction contrived to stress the system to its limits. It usually takes minutes of focused effort to cause, although it may take only 10 or 20 seconds, and sometimes it will run for hours under these conditions. Under normal conditions, it runs until turned off. It runs reliably under heavy user interaction - but not contrived interaction - for days on end with no problems. I have not yet been able to determine whether the stack is blowing up or the interrupt is simply dropped. I can't be sure because Code Composer plus the emulator is so unstable on my computers that I cannot use the emulator. But I did determine that interrupt service stops.

I have a 512-word space set aside for the stack. Under normal operation I don't believe the stack will ever exceed a depth of 30 or 40 words.
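One emulator-free way to test the stack-overflow theory is to pre-fill the stack region with a known pattern at startup and scan for the high-water mark from the background loop. A minimal sketch (the array stands in for the real 512-word stack block; names, pattern, and sizes are illustrative, not from the original system):

```c
#include <assert.h>

#define STACK_WORDS  512
#define FILL_PATTERN 0xDEADBEEFUL

/* Stand-in for the 512-word stack region; on the 'C33 this would be
   the actual memory block the SP points into at reset. */
static unsigned long stack_area[STACK_WORDS];

/* Call once at startup, before interrupts are enabled. */
void paint_stack(void)
{
    int i;
    for (i = 0; i < STACK_WORDS; i++)
        stack_area[i] = FILL_PATTERN;
}

/* The 'C3x stack grows upward (SP increments on push), so the deepest
   excursion is the highest index whose pattern was overwritten.  Scan
   from the top down; the first disturbed word marks the high-water
   depth in words. */
int stack_high_water(void)
{
    int i;
    for (i = STACK_WORDS - 1; i >= 0; i--)
        if (stack_area[i] != FILL_PATTERN)
            return i + 1;   /* deepest stack depth used, in words */
    return 0;               /* stack region never touched */
}
```

Checking the returned depth periodically against the 512-word limit (or against the expected 30-40 words) would show whether the lockup coincides with an unusually deep excursion.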

I am not directly accessing the IE register after system initialization. I had one place in the code where I intentionally stopped interrupts (by clearing the GIE bit in the ST register) but quickly re-enabled them, and many places in the code where I set the GIE bit just to be sure it is set.

My question is whether there is a known mechanism or set of conditions whereby the interrupt (IF bit?) from the serial port system can be accidentally masked or missed, and if so if there is a workaround.

Thank you for your insights.

Lyle Johnson
Hello Lyle

If the serial port is internally generating frame syncs and is underrun,
it sees this as a fault and halts. That means no more frame syncs and
no more interrupts. You can poll for these bits, but in the end
you will find that an asynchronous CPU read (or write) of the serial
port will often get things going again. This also does not clear an
interrupt.

Hope this helps,
Keith
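The recovery Keith describes can be wrapped in a background-loop watchdog: the ISR bumps a counter, and if the counter ever stops advancing between checks, the loop issues the asynchronous access to kick the port back to life. A sketch of the logic (the kick is stubbed out so it can run off-target; the commented register address is the C3x serial-port-0 data-receive location, an assumption to verify against the 'VC33 data sheet):

```c
#include <assert.h>

volatile unsigned long isr_count = 0;   /* incremented by the serial-port ISR */
static unsigned long kicks = 0;         /* number of recovery accesses issued */

/* The recovery access itself.  On the target this would be the
   asynchronous read of the data-receive register; here it only counts,
   so the stall logic can be exercised off-target. */
static void sport_kick(void)
{
    /* On the 'C33, roughly:  (void)*(volatile unsigned long *)0x80804CUL; */
    kicks++;
}

/* Call from the background loop at a period comfortably longer than
   the ~5 usec interrupt spacing (say, once a millisecond).  Returns 1
   if the port looked stalled and a kick was issued, 0 otherwise. */
static int sport_watchdog(unsigned long *last_seen)
{
    if (isr_count == *last_seen) {      /* no interrupt since last check */
        sport_kick();
        return 1;
    }
    *last_seen = isr_count;
    return 0;
}
```

Even if the kick does not cure the underlying underrun, counting how often it fires would distinguish a halted serial port from a blown stack.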