DSO firmware version 3.64

Hi guys,

First of all, I wish to say that I don’t like the “I’m … you are …” tone. If you all agree, please let’s avoid this in the future.

I hope we can all agree on this:

  • The DSO Nano is a fine and useful product, but it has more limitations than we originally expected for large TD (TD >= 5 ms). The sample frequency is too low, there are no LP filters, and there is no way to see high-frequency components. I put a demonstration on page 9 with an OO/Excel file where you can see 6 kHz appear as 1 kHz if you sample at 5 kHz, because the 6 kHz component aliases down to 6 kHz - 5 kHz = 1 kHz (see the sketch after this list).

  • The BenF software is a big improvement over the original one, and we all respect that work. A long time ago I made an FFT (and oscilloscope) PC sound-card app. I made the first version in just a few days, but needed a few months to make it good. The PC is much easier to program and test on than this hardware. I can only imagine how difficult it is to do the same with the hardware limitations of the Nano. Testing also requires a lot of time because you need to transfer the program to the Nano, … This is big and hard work. I’m aware of that, and I’m sure others are too.
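To illustrate the aliasing point from the first bullet, here is a minimal sketch of my own (not from the firmware or the OO/Excel file): it prints the samples of a 6 kHz and a 1 kHz sine taken at 5 kHz, and the two columns come out identical, which is why the scope cannot tell them apart.

```c
/* Sketch only: a 6 kHz sine sampled at 5 kHz produces exactly the same
 * sample values as a 1 kHz sine, because 6 kHz - 5 kHz = 1 kHz falls
 * below the Nyquist limit of 2.5 kHz.  Compile with: cc alias.c -lm
 */
#include <stdio.h>
#include <math.h>

#define FS 5000.0   /* sample rate in Hz            */
#define N  10       /* number of samples to print   */

int main(void)
{
    for (int n = 0; n < N; n++) {
        double t  = n / FS;
        double s6 = sin(2.0 * M_PI * 6000.0 * t);   /* 6 kHz input  */
        double s1 = sin(2.0 * M_PI * 1000.0 * t);   /* 1 kHz alias  */
        printf("n=%d   6kHz = % .6f   1kHz = % .6f\n", n, s6, s1);
    }
    return 0;
}
```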

The main question, from my point of view, is: can this be made better for larger TD? My intuition says it can. But maybe it cannot. I cannot be 100% sure.

Unfortunately, I really do not have time to go even a bit deeper into this. So I will just put forward some more ideas and let BenF or somebody else think about them further. If they cannot be implemented, or are too hard to implement, that’s OK. Otherwise, maybe they can help.

  1. First, I was surprised that IRQ switching time requires so much overhead. I guess BenF has checked that. If that is true, where do we stand with sampling at 100 kHz, or even 50 kHz? Maybe that could work, even at the cost of very short battery life.
    50 kHz IRQ sampling may look the same as the current TD=5 ms in fast mode, but it is not: with Avg, Min and Max we get an LP filter (Avg), we can see high-frequency components (Min, Max), and the buffer is effectively 10x bigger.

  2. If I understand correctly, the main limitation is the IRQ switching time. If #1 cannot work, let’s consider another, more complicated approach:

  • Let’s use DMA, but on a very small buffer. Actually two small buffers: while DMA fills one of them, the DSO processes the other.
  • Data from the small buffer is transferred to the main, bigger CIRCULAR buffer used for displaying, saving, … During this transfer, more than one value from the small buffer corresponds to one tuple (Avg, Min, Max) in the big/main buffer. While transferring from the “small” to the “big” buffer, the program looks for the trigger and performs the other calculations (a rough sketch of this follows at the end of this post).

How long does this small buffer have to be? Will the whole “small” buffer correspond to just one tuple in the big (main) buffer, or to more?
I will leave these questions to be considered. Maybe the small buffer will turn out bigger than the main buffer. This needs some calculations and consideration.
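Just to make idea #2 a bit more concrete, here is a minimal sketch in C with hypothetical names (this is not BenF’s code and not the real STM32 DMA setup): the DMA controller fills one small buffer while the CPU folds the other one into (Avg, Min, Max) tuples in the big circular buffer and checks for the trigger on the way.

```c
/* Sketch only: ping-pong small buffers filled by DMA, decimated into
 * (avg, min, max) tuples in a big circular buffer.  All names and sizes
 * are assumptions for illustration, not the actual firmware API.
 */
#include <stdint.h>

#define SMALL_LEN  64        /* raw ADC samples per small buffer     */
#define DECIMATE   16        /* raw samples folded into one tuple    */
#define MAIN_LEN   4096      /* tuples in the main circular buffer   */

typedef struct {
    uint16_t avg;
    uint16_t min;
    uint16_t max;
} tuple_t;

static uint16_t small_buf[2][SMALL_LEN];  /* ping-pong DMA buffers    */
static tuple_t  main_buf[MAIN_LEN];       /* circular display buffer  */
static uint32_t main_head;                /* next write index         */
static uint16_t trig_level = 2048;        /* assumed mid-scale level  */

/* Called when DMA has finished filling small_buf[idx]; the other
 * small buffer is already being filled in the background. */
void small_buffer_ready(int idx)
{
    const uint16_t *raw = small_buf[idx];
    static uint16_t prev_avg;

    for (int i = 0; i < SMALL_LEN; i += DECIMATE) {
        uint32_t sum = 0;
        uint16_t lo = 0xFFFF, hi = 0;

        for (int j = 0; j < DECIMATE; j++) {
            uint16_t s = raw[i + j];
            sum += s;
            if (s < lo) lo = s;
            if (s > hi) hi = s;
        }

        tuple_t *t = &main_buf[main_head];
        t->avg = (uint16_t)(sum / DECIMATE);  /* Avg acts as an LP filter   */
        t->min = lo;                          /* Min/Max keep the high-     */
        t->max = hi;                          /* frequency excursions       */

        /* simple rising-edge trigger check on the averaged value */
        if (prev_avg < trig_level && t->avg >= trig_level) {
            /* trigger found at main_head -- hypothetical hook goes here */
        }
        prev_avg = t->avg;

        main_head = (main_head + 1) % MAIN_LEN;
    }
}
```

With SMALL_LEN = 64 and DECIMATE = 16, one small buffer produces only 4 tuples, so the small buffer stays far smaller than the main buffer; whether those numbers leave enough CPU time between DMA interrupts is exactly the calculation that still has to be done.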