Default Clock Speed

Added by Marc Lichtman over 13 years ago

Hey All,

What speed does an out-of-the-box MityDSP-L138 run at? I was under the impression it was 456 MHz, but the provided .tcf says 300 MHz. Any information would be much appreciated.

-Marc


Replies (4)

RE: Default Clock Speed - Added by Gregory Gluszek over 13 years ago

Hi Marc,

The boards default to running at 300 MHz, though they can be run at 456 MHz. You can change this in the DSP in two ways. The first is static: double-click the .tcf file in your project, go to "Global Settings," right-click, and select Properties; there you can set the DSP speed. The second is to add the following in main() in your DSP code:

GBL_setFrequency(tcDspFirmware::get_cpu_clock()/1000); /* GBL_setFrequency() takes KHz; get_cpu_clock() returns Hz */
CLK_reconfig(); /* recompute the DSP/BIOS clock settings from the new frequency */

This will automatically detect the system clock rate and make necessary adjustments in the DSP.
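For context, here's a minimal sketch of how those two calls might sit in a DSP application's main(). The include path for tcDspFirmware is a guess on my part; check the MityDSP SDK for the real header:

#include <std.h>   /* DSP/BIOS base types */
#include <gbl.h>   /* GBL_setFrequency() */
#include <clk.h>   /* CLK_reconfig() */

#include "core/cl_dsp_firmware.h"   /* assumed header name for tcDspFirmware */

Void main(Void)
{
    /* GBL_setFrequency() takes the CPU rate in KHz; get_cpu_clock()
     * is assumed to return Hz, hence the divide by 1000. */
    GBL_setFrequency(tcDspFirmware::get_cpu_clock() / 1000);
    CLK_reconfig();

    /* DSP/BIOS starts its scheduler after main() returns. */
}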

You can change the system clock by logging into the MityDSP and running the following command:

root@mityomapl138:~# echo 456000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed
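As a sanity check, assuming the kernel exposes the standard cpufreq sysfs interface, you can list the supported rates, make sure the userspace governor is active (scaling_setspeed only works under it), and read back the result:

root@mityomapl138:~# cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies
root@mityomapl138:~# echo userspace > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
root@mityomapl138:~# echo 456000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed
root@mityomapl138:~# cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq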

One other thing to be aware of when changing the clock: if your system has an FPGA and you're using it, the EMIFA clock rate is derived from the system clock and will be different from the 300 MHz we typically use here.

-Greg

RE: Default Clock Speed - Added by Michael Williamson over 13 years ago

Hi,

I want to clarify a couple of things here.

The DSP and ARM clocks are driven by the same system clock, so modifying the ARM clock also modifies the DSP clock.

The MityDSP modules (i.e., the OMAP-L138/AM1808/AM1810 variants) come up at 300 MHz primarily because the on-board power management IC (PMIC) defaults the core voltage to 1.2 volts after power-up or reset. In order to run above 375 MHz, the core voltage must be raised to 1.3 volts. This is done automatically by the ARM in Linux via an I2C request to the PMIC when you request the speed change with the "echo 456000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed" command mentioned above. We use the same factory images for 300 MHz and 456 MHz qualified parts, so we default the speed to 300 MHz in our bootloader stages and sample kernel images.

The modification of the .tcf file, or the GBL_setFrequency/CLK_reconfig calls above, simply provides DSP/BIOS with the correct "tick" divisors the RTOS uses for its various calls. E.g., it needs to know the actual CPU rate in order to sleep properly in TSK_sleep() calls or to time out in a SEM_pend(..., timeout) call. These modifications do not actually change the clock rate of the DSP; they only modify the divisors the OS uses to compute time intervals.
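To make that concrete, here's a small sketch (assuming a 1 ms DSP/BIOS tick period; the semaphore is hypothetical). The tick counts below only correspond to real milliseconds if GBL has been given the true CPU rate:

#include <std.h>
#include <tsk.h>
#include <sem.h>

extern SEM_Obj mySem;   /* hypothetical semaphore created in the .tcf */

Void workerTask(Void)
{
    for (;;) {
        /* 500 ticks is ~500 ms, but only if the tick divisors were
         * computed from the actual CPU rate. */
        TSK_sleep(500);

        /* Same for timeouts: this gives up after ~100 ms at the
         * correct rate, and after something else entirely if the
         * RTOS thinks the CPU runs at a different speed. */
        SEM_pend(&mySem, 100);
    }
}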

Hope this helps.

-Mike

RE: Default Clock Speed - Added by Marc Lichtman over 13 years ago

Will GBL_setFrequency() actually change the clock, or just feed the RTOS the correct numbers like you were talking about?

My goal is to set the OMAP to 375 MHz, preferably using the DSP and CCS.

RE: Default Clock Speed - Added by Michael Williamson over 13 years ago

No, it won't change the clocks. It will just feed the RTOS the correct numbers.

You really should set the frequency using the ARM unless you are not running any OS on it at all. The Linux OS needs to know about changes made to the PLL multiplier/divisor settings, as they may cascade into several peripheral clocks you may be using (e.g., the SPI bus clocks, UART clocks, I2C clocks, the LCD framebuffer driver, etc.). The Linux drivers have notifiers that get called on system clock changes and recompute local peripheral divisor settings in order to maintain a specified data rate (e.g., the baud rate for a UART, the pixel clock and scan timings for the LCD, the clock rate for I2C, etc.). You should be able to use "echo 375000 > ..." to accomplish a 375 MHz rate.
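For what it's worth, this is roughly the mechanism those drivers use. A bare-bones sketch of a cpufreq transition notifier (illustrative only, not code from our drivers):

#include <linux/cpufreq.h>
#include <linux/module.h>

static int clk_change_notify(struct notifier_block *nb,
                             unsigned long action, void *data)
{
    struct cpufreq_freqs *freqs = data;

    /* After the change completes, a driver would recompute its local
     * divisors from the new rate (freqs->new is in kHz). */
    if (action == CPUFREQ_POSTCHANGE)
        pr_info("cpu clock: %u -> %u kHz\n", freqs->old, freqs->new);

    return NOTIFY_OK;
}

static struct notifier_block clk_nb = {
    .notifier_call = clk_change_notify,
};

static int __init clk_watch_init(void)
{
    return cpufreq_register_notifier(&clk_nb, CPUFREQ_TRANSITION_NOTIFIER);
}

static void __exit clk_watch_exit(void)
{
    cpufreq_unregister_notifier(&clk_nb, CPUFREQ_TRANSITION_NOTIFIER);
}

module_init(clk_watch_init);
module_exit(clk_watch_exit);
MODULE_LICENSE("GPL");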

If you start messing with the system clocks on the DSP (possible, as it has access to the system configuration registers affecting the PLLs), you may find that peripherals the ARM is using will stop functioning.

If you are just trying to write DSP code and test/debug it via a JTAG emulator (and not use the ARM yet), then I would advise stopping the ARM in U-Boot and altering the divisor settings by modifying the GEL file macros and executing them from CCS. This takes a little practice, but it works reasonably well until you want to integrate DSP loading/execution with the ARM via DSPLINK.
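If you go the GEL route, the macros live in the .gel file loaded with your CCS target configuration. A bare-bones hotmenu sketch follows; I've deliberately left out the actual PLL0 write sequence (bypass, multiplier/divisor writes, lock delay), so crib that from the PLL macros in the factory GEL file and the OMAP-L138 TRM:

hotmenu Set_CoreClock_375MHz()
{
    GEL_TextOut("Reprogramming PLL0 for 375 MHz...\n");
    /* PLL0 bypass, multiplier/divisor writes, and lock delay go here,
       copied from the PLL setup macros in the factory GEL file.
       Remember the core voltage constraint if you go above 375 MHz. */
    GEL_TextOut("PLL0 reprogrammed.\n");
}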

-Mike
