DSO firmware version 3.64

It’s not my intention to push you away, but if we keep the discussions at a higher level, it is likely to be more productive (from my point of view at least).

A single ADC interrupt may require upwards of 70 cycles just for saving and restoring context. At a sampling rate of 1 MS/s we would then consume close to 100% of CPU time on ADC interrupt overhead alone. We also have to account for all other interrupts. These include a millisecond interval timer used for general timekeeping, scanning button input states (so that the Nano appears responsive at all times) and a few other housekeeping tasks. When connected to USB it gets a lot worse, as we also have to service inbound interrupt requests for the SD card file system. The combined service time for all interrupts determines the minimum ADC cycle time. DMA is the performance enabler here, not the problem.

Using DMA with a 4k buffer provides a window for real-time concurrent access to the acquisition buffer, and we need this in order not to lose samples.

At the 5 ms timebase we sample at 50 kHz and so barely capture the 10 kHz ripple (5 samples per cycle). Step down one TD however and you should be ok.

In this case I think you need to go back and check your settings (fast mode, zero sensitivity and correct trigger level) as this should be within the capability of the Nano/firmware.

You cannot have a large buffer AND a short cycle at the same time, so this is a trade-off. The min/max samples, however, will be in the acquisition buffer and so can be targeted by triggers.