Reply by Steve Holle January 14, 2004
I finally got it to work with the help of many suggestions and code
from ADI and this group.  ADI supplied me with a modified bootloader
and I had to upload the code with the bus width set to 16-bit in the
SYSCON register.  I also had to lower the timeout value on the Loader
to around 100 to prevent the ColdFire from generating a bus error.
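
For anyone who hits the same bus-error problem, the host-side change
boils down to something like the sketch below.  The register address,
the ready test, and the exact retry count are placeholders for our
board, not the actual ADI host-boot code, so treat it as an outline:

/*
 * Rough sketch only -- HOSTPORT_DATA, host_port_ready(), and the retry
 * bound are placeholders.  The two points that mattered for us: feed
 * the image 16 bits at a time (to match the 16-bit host bus width set
 * in SYSCON), and bound the wait loop (around 100 polls) so a stalled
 * handshake can't hold the ColdFire bus long enough to cause a bus
 * error.
 */
#include <stdint.h>

#define HOSTPORT_DATA  ((volatile uint16_t *)0x30000000UL)  /* placeholder address   */
#define LOADER_TIMEOUT 100                                   /* polls per 16-bit word */

extern int host_port_ready(void);  /* board-specific: DSP ready for next word? */

/* Returns 0 on success, -1 if the DSP stops accepting data. */
int upload_boot_image(const uint16_t *image, unsigned long nwords)
{
    unsigned long i;
    int t;

    for (i = 0; i < nwords; i++) {
        for (t = 0; t < LOADER_TIMEOUT; t++) {   /* bounded wait, never forever */
            if (host_port_ready())
                break;
        }
        if (t == LOADER_TIMEOUT)
            return -1;                           /* give up instead of hanging the bus */

        *HOSTPORT_DATA = image[i];               /* one 16-bit boot word */
    }
    return 0;
}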

Thanks everyone for your help and suggestions.
Reply by Georgi Beloev January 13, 2004
Hi Steve,

Make sure you have TYPE(PM RAM) and WIDTH(48) at the line that describes the
SRAM in the Memory section of your LDF file.
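
Something along these lines, from memory -- the segment name and the
address range are just placeholders for your board; it's the TYPE and
WIDTH qualifiers that matter:

MEMORY
{
    /* placeholder name and range -- only TYPE(PM RAM) and WIDTH(48) matter */
    seg_ext_pmco { TYPE(PM RAM) START(0x00020000) END(0x0005FFFF) WIDTH(48) }
}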

Note: this is from a rather old project I worked on; the syntax may have
changed.

Hope this helps,
-- Georgi

"Steve Holle" <sholle@link-comm.com> wrote in message
news:ba83847d.0401121519.6393558f@posting.google.com...
> I'm so close I can taste it. Just one more step.
> [snip]
Reply by Ron Huizen January 13, 2004
You probably need to modify the bootloader kernel to set up your SRAM
correctly.  The bootloader code isn't that convoluted (except the last bit
of it), so you should be able to see where to add whatever settings you
need for the SRAM.

-----
Ron Huizen
BittWare

"Steve Holle" <sholle@link-comm.com> wrote in message
news:ba83847d.0401121519.6393558f@posting.google.com...
> I'm so close I can taste it. Just one more step.
> [snip]
Reply by Steve Holle January 12, 2004
I'm so close I can taste it.  Just one more step.

We have a system comprised of an ADSP-21065L with two 256kx16 SRAM
connected as a 256kx32 SRAM.  This system also has a 32-bit Host Port
connected to a Motorola 32-bit Coldfire processor bus.  I am able to
boot using the Host Port as long as I don't use the external SRAM in
my program.  I've created a simple app that just turns on 4 LEDs
connected to Flag lines to demonstrate boot completion.  I can run the
program without and with external SRAM using the Summit ICE BMD.  I
can upload the non-SRAM program using the Host Port and it works fine,
but when I load code into SRAM the program fails.

I've compared the code loaded into internal memory (0x8000-) and
external memory (0x20000-) two ways: loading from the BMD and dumping
memory, and loading from the Host Port and then using the BMD to dump
memory.  The internal memory looks the same either way, but the
external memory is not correct.  The problem seems to be some mismatch
in packing, but I don't know where I need to change it.  Do I need to
change it in the ADI host bootloader code?  The bootloader seems to
produce 16-bit words, so it appears to expect 16 bits at a time, but
how does the bootloader know how to pack the data in external SRAM?

Any advice would be greatly appreciated.