Hi,
I am profiling an algorithm on an nRF52840. As the algorithm uses more memory, it takes more time. On larger timescales, where the algorithm might take 90 seconds to compute, the in/out time reported by SystemView says roughly 18 seconds. As of now I'm running bare metal (no OS) at 64 MHz, and SysTick is not enabled. Is there anything that might be causing this (a timer overflow somewhere)?
I call
SEGGER_SYSVIEW_Conf();
SEGGER_SYSVIEW_OnIdle();
in main(), and then use
SYS_TRIGGER(SYS_SENSOR_NOISE_MODEL);
SYS_RELEASE(SYS_SENSOR_NOISE_MODEL);
as my entry and exit points for the algorithm.