There appears to be a major problem with the way signal scaling is handled. Try measuring a constant DC signal, alter the Y position of the trace, and see the reported amplitude of the signal change.
Yeah, I see what you’re talking about. But it was less than 5% off my Fluke. Oh yeah, I have a Fluke for that kind of measurement. Sorry, the smarta$$ in me just pops out; you may be talking about something I didn’t see. Just FYI: Firmware 2.6, Sys 1.34, App 2.33, tested with a 9 V battery and the Fluke.
And I forgot to mention that I was looking primarily at the Vdc indication for the variation.
I suggested checking a DC signal because it should have a constant (and easy to verify with a meter) Vdc, regardless of timebase, range, ypos etc.
Maybe the problem isn’t as noticeable with higher voltages, but viewing a 200mV signal at 50mV/div, the reported Vdc varies from ~180mV to ~400mV as I increase Ypos.
OK, point taken. I don’t have a function generator to test with, but I had planned to use this for, among other things, audio work, and that kind of variation would be a big correction factor to constantly account for.
I noticed that the function generator shows a wide variation too, but since the documentation isn’t there yet, this might be due in part to your findings. I’d be interested in any input on this.
Has this been resolved?
I do see a variation according to the Y position (higher Y -> higher Vdc).
Will check the code when I can.
Could this be caused by bad calibration, especially the gain-correction parameter?
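One way a bad gain correction could produce exactly this symptom: if the gain factor gets applied to the raw ADC value *before* the Y-position offset is subtracted, the offset itself gets scaled, so the reported Vdc climbs as ypos increases even for a constant input. This is just a hedged sketch of that hypothesis; the function names and numbers below are illustrative, not the actual firmware code:

```python
# Hypothetical sketch of an order-of-operations bug in the Vdc readout.
# All names and constants here are assumptions, not the real firmware.

GAIN = 1.25              # assumed gain-correction calibration factor
VOLTS_PER_COUNT = 0.001  # assumed 1 mV per ADC count

def vdc_buggy(raw, ypos):
    # bug: gain scales the ypos offset along with the signal
    return (raw * GAIN - ypos) * VOLTS_PER_COUNT

def vdc_fixed(raw, ypos):
    # correct: remove the ypos offset first, then apply the gain
    return (raw - ypos) * GAIN * VOLTS_PER_COUNT

# A constant 200-count DC signal viewed at two different trace positions:
# the fixed version reports the same Vdc both times, the buggy version
# reports a higher Vdc at the higher ypos.
print(vdc_fixed(200 + 50, 50), vdc_fixed(200 + 150, 150))
print(vdc_buggy(200 + 50, 50), vdc_buggy(200 + 150, 150))
```

If the real code does something like this, it would also explain why the error is worse at sensitive ranges like 50 mV/div, since the ypos offset is large relative to the signal there.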
btw, you can get a version of the app with some other fixed readings (rms, vpp, etc) here: viewtopic.php?f=22&t=2957