DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes)

Moderators: lily.li, violet, jeremy882, crail.lyu969

Wildcat
Elementary-1
Posts: 166
Joined: Fri Jun 22, 2012 1:29 pm
Are you a staff member of seeedstudio?: no
Which products/projects are your favorite?: DSO Quad

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by Wildcat » Sun Mar 06, 2016 8:48 am

thenaughtyfantasy wrote: Thanks a lot Wildcat. Still I'm limited to my multimeter's accuracy this way... and I wouldn't say I have even a good entry-level multimeter (although I will be buying a good one in the near future). After this I guess I have to do the hardware calibration to minimize overshoot and rise time. Or should I do this first? Now I remember something about calibrating at 50V not being possible due to the clamping diodes of HW v2.81!?!
Unless you have a REAL crappy multimeter (for example with limited ranges), it's probably accurate enough. The 8-bit readings in the Quad do not offer great accuracy, even compared with cheap DMMs. That's why it's important to calibrate it to get the most out of it. What the Quad voltage meters lack in accuracy they make up for in bandwidth and the ability to measure complex waveforms, while generally providing enough accuracy for most purposes.
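To put rough numbers on that comparison, here is a quick sketch (the DMM count is taken from the typical entry-level meter discussed later in the thread; the comparison is illustrative only):

```python
# An 8-bit ADC resolves 2**8 = 256 discrete levels across its window.
# Even an entry-level 2000-count DMM resolves roughly 8x more levels
# per range, which is why the Quad's meters trail cheap DMMs on accuracy.
adc_levels = 2 ** 8
dmm_counts = 2000
print(adc_levels)               # 256
print(dmm_counts / adc_levels)  # 7.8125
```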

Hardware calibration (high frequency compensation) is not in any way related to the DC software calibration. Doesn't matter which one you do first.

If your device still has the clamping diodes at the analog inputs, I would strongly recommend removing these. Not only will they limit the voltage applied to the inputs, but if what you measure is of a low enough impedance (as it very often is) it can short the diodes and result in a dead channel, or even possibly blow a circuit trace and/or damage the circuitry you are measuring. The HW 2.81 device I purchased from SEEED had these already removed when I got it, presumably at SEEED's request to their supplier. It did, however, still have them at the digital inputs.

I could go into a very long, drawn out discussion about the input circuitry of the Quad. I don't wish to do that at this time, suffice it to say that clamping diodes at the input, without any resistive buffering is NOT a good idea...

hannes-diethelm
Pre-kindergarten
Posts: 2
Joined: Tue Mar 08, 2016 3:16 am

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by hannes-diethelm » Tue Mar 08, 2016 4:37 am

Hello,

First of all, many thanks to Wildcat for maintaining the software. It is much better than the original one! I just bought a DSO203, and the original software is quite buggy, so I searched for an alternative and found yours. It works great, and I haven't found any bugs so far! For the moment, though, I'm using V5.0 and the original FPGA.
Wildcat wrote:The issue I ran into was parasitic bit toggling on ch A when using the updated FPGA (see bottom screenshot). Ch B was not affected. The problem was finally isolated to the ADC chip: heating the IC would make this disappear, while cooling it made it worse. It should be noted though that when switching back to the previous FPGA versions, the noise was not seen. Apparently the added functions in the updated FPGA somehow seemed to be causing this.
This is typical when you make changes inside the FPGA. I assume the timing was already critical before you made the changes. If your changes add some additional delay, this can happen.

Now if the output of the ADC changes, for example from 1000 0000 to 0111 1111 (128 to 127), and bit 5 is a bit too slow, the sampled value can be 0101 1111 (95). Heating or cooling the ADC changes these delays, so the effect can get better or worse. The same goes for swapping the ADC chip.
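The arithmetic above can be checked with a small sketch (the function name is mine, just to illustrate the mechanism):

```python
def sample_with_slow_bit(prev_code, new_code, slow_bit):
    # At the sampling edge, the slow bit still carries its previous
    # value while all the other bits have settled to the new code.
    mask = 1 << slow_bit
    return (new_code & ~mask) | (prev_code & mask)

# ADC output moves from 128 (1000 0000) to 127 (0111 1111), bit 5 lags:
print(sample_with_slow_bit(0b10000000, 0b01111111, 5))  # 95 (0101 1111)
```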

I just reverse engineered the FPGA code a bit. MCI seems to be the master clock from the CPU (72 MHz). The ADC clocks are given by:
assign CKA = MCI;
assign CKB = (Ctrl_Link[1]) ? !MCI : MCI;

In the DP_RAM, the ADC Data is sampled on the rising edge of MCI.

This all seems fine, but assign CKB = (Ctrl_Link[1]) ? !MCI : MCI; adds some additional delay on CKB. Just give it a try and use:
assign CKA = (Ctrl_Link[1]) ? !MCI : MCI;
to have the same delay on CKA. I looked through the code and it seems Ctrl_Link[1] is always zero.

The conditional infers a mux, which adds additional delay. It is possible that all the delays together are so big that the FPGA samples the signal one rising edge later, so adding more delay could fix the bug.

It might also be that CHA and CHB are exchanged somewhere by accident, so trying:
assign CKB = MCI;
removes the delay on CHB, which could really be CHA... ;-)

Normally, such timing issues are detected and, if possible, fixed by the FPGA tool, but the current SDC file has no information about the external delay of the ADC, so the tool assumes zero delay. Then it may or may not work, depending on chance and temperature! ;-)
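For reference, constraining the ADC data pins would look something like this in SDC. This is a hypothetical sketch only: the port name ad_data and the delay figures are assumptions, not values from the actual design files or the ADC datasheet.

```tcl
# Hypothetical sketch: ad_data and the nanosecond figures are assumed.
# Tell the timing tool how late the ADC data can arrive after the MCI
# edge, so input paths are analyzed instead of treated as zero-delay.
set_input_delay -clock MCI -max 7.0 [get_ports ad_data*]
set_input_delay -clock MCI -min 2.0 [get_ports ad_data*]
```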

By the way: which tool are you using to synthesize the FPGA? I could improve the SDC file and run some FPGA simulations to see if there really is a problem. And which compiler are you using for the C code?

Regards,
Hennes

Wildcat
Elementary-1
Posts: 166
Joined: Fri Jun 22, 2012 1:29 pm
Are you a staff member of seeedstudio?: no
Which products/projects are your favorite?: DSO Quad

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by Wildcat » Tue Mar 08, 2016 12:15 pm

hannes-diethelm wrote:This is typical when you make changes inside the FPGA. I assume the timing was already critical before you made the changes. If your changes add some additional delay, this can happen.

Now if the output of the ADC changes, for example from 1000 0000 to 0111 1111 (128 to 127), and bit 5 is a bit too slow, the sampled value can be 0101 1111 (95). Heating or cooling the ADC changes these delays, so the effect can get better or worse. The same goes for swapping the ADC chip.
This is what I originally assumed before doing countless builds, however the problem has proven frustratingly difficult to pinpoint.
I just reverse engineered the FPGA code a bit. MCI seems to be the master clock from the CPU (72 MHz). The ADC clocks are given by:
assign CKA = MCI;
assign CKB = (Ctrl_Link[1]) ? !MCI : MCI;

In the DP_RAM, the ADC Data is sampled on the rising edge of MCI.

This all seems fine, but assign CKB = (Ctrl_Link[1]) ? !MCI : MCI; adds some additional delay on CKB. Just give it a try and use:
assign CKA = (Ctrl_Link[1]) ? !MCI : MCI;
to have the same delay on CKA. I looked through the code and it seems Ctrl_Link[1] is always zero.
I did try that; while it may have changed things a bit (just about anything you changed affected it somewhat), it did not fix the problem. The reason for the Ctrl_Link conditional is to support 144 MHz sampling, which requires the ADC to be set up with out-of-phase clocks.
The conditional infers a mux, which adds additional delay. It is possible that all the delays together are so big that the FPGA samples the signal one rising edge later, so adding more delay could fix the bug.
Some of the early builds while trying to solve this looked as if that could have been the case, as post-synthesis timing showed a shortfall of about one MCI clock period. However, other compiles with everything stripped down to keep timings in line still did not solve the problem.
It might also be that CHA and CHB are exchanged somewhere by accident, so trying:
assign CKB = MCI;
removes the delay on CHB, which could really be CHA... ;-)

Normally, such timing issues are detected and, if possible, fixed by the FPGA tool, but the current SDC file has no information about the external delay of the ADC, so the tool assumes zero delay. Then it may or may not work, depending on chance and temperature! ;-)
I didn't add delay with the constraints file, but after observing that reading on the falling rather than the rising edge made no difference whatsoever to the noise, I pretty much dismissed the possibility of input timing issues. Still, it wouldn't hurt to try; it would be great if such a simple fix worked...
By the way: which tool are you using to synthesize the FPGA? I could improve the SDC file and run some FPGA simulations to see if there really is a problem. And which compiler are you using for the C code?
I'm using Lattice's iCEcube2, release 2015.04.27409, with the Synopsys synthesizer, and GCC 4.6.1 from CodeSourcery (CodeBench Lite) for the C compiler.

I spent a good amount of time trying to solve this problem before changing the ADC chip. I eventually came up with some builds that actually worked pretty well, but while some minimized the noise to barely noticeable levels, they never completely eliminated it.

Here are some notes from my experience with this issue that may be of help:

First of all, the issue was confirmed to come from the ADC by swapping the A channel up to bits 8-15 and the B channel down to bits 0-7 at the input ports of the FPGA. This caused the problem to show up on the B channel instead (the A channel from the ADC was now being sent to the B display). The noise itself did not change at all; it was exactly the same.

The problem seems to be intermittent in nature; in other words, it's not a continuous, every-cycle event but a random occasional blip. Of course, if something is "on the edge" it can act that way, but it's unusually persistent, occurring to a varying extent at all clock speeds and with a wide variety of completely different FPGA configurations.

Most of the builds I tried did not show the effect at all in normal modes, due to its intermittent nature. However, full speed mode stores any such event. At slow timebases, the cumulative effect of having, say, 60,000 samples taken, stored and displayed results in near certainty of capturing at least one event for each displayed sample, which looks like a continuous string of noise.
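That "near certainty" follows from simple probability. The per-sample glitch rate below is an assumed illustrative figure, not a measured one:

```python
# With an assumed per-sample glitch probability of 1 in 10,000,
# a 60,000-sample full-speed capture almost certainly contains events.
p_glitch = 1e-4
n_samples = 60_000
p_at_least_one = 1 - (1 - p_glitch) ** n_samples
print(round(p_at_least_one, 4))  # ~0.9975
```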

Some bit toggling at various levels could be observed with certain configurations. These would respond to changes in timing, and would diminish and eventually disappear when reducing the clock speed (keep in mind that in normal modes the timebase sets the clock speed; it can also be adjusted in full speed mode). However, there was one "level" at which the noise occurred that did NOT respond to reducing the clock speed and proved frustratingly persistent.

Many configurations were tried, with different functions clocking on different edges. For example, reading the input on the falling edge rather than the rising edge had no effect: the exact same bits in ADC channel A behaved in the exact same way.

The one thing that seemed to cause the most trouble was increasing XTthreshold from 16 bits to 32 bits to account for the faster clock used with full speed mode at slow timebases. This seemed to come from the increased logic complexity of comparing several 32-bit registers instead of 16-bit ones.

The final FPGA I published is NOT optimized to minimize this problem, but rather optimized to function properly. Final timing analysis for that one looks good; the only exceptions are some inter-clock relationships with unspecified constraints, between the main clocks and program/FPGA control transfers, which occur too seldom to be an issue. Many builds were found that almost eliminated the noise, but frustratingly each needed a compromise somewhere (most notably in the time-based triggering function, which needed to be changed to 32 bits) and for that reason could not be used.

None of this made any sense to me. At first, I was sure the issue was timing but no matter what I tried, the results were always random and unpredictable.

After the ADC was changed, none of the builds showed ANY trace of this whatsoever.

At this point I'm leaning more towards some hardware issue being the culprit, possibly the added gate count in the FPGA causing increased noise on the supply lines that the ADC can't cope with, or something like that.

I hope you can find something I overlooked that solves the problem when using this with the ADCs that come with the units. It also appears that some units work just fine... At this point, though, I have no way to reproduce the issue for testing, as my unit no longer has the problem, so it's up to you if you wish to pursue it.

Thanks for taking an interest in this, I believe this is a potentially very useful mode and well worth the effort to make it right. Let me know if I can help in any way.

By the way, you wouldn't by any chance have the means to program the iCE65 chips used in the earlier devices? These were made by SiliconBlue, which was bought by Lattice, but Lattice does not appear to want to support the old chips. Although the 65 version can be selected, the compiler comes back with a missing DEV file. An earlier iCEcube version from SiliconBlue only has Synopsys available for synthesis and will not take the current license available from Lattice.

hannes-diethelm
Pre-kindergarten
Posts: 2
Joined: Tue Mar 08, 2016 3:16 am

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by hannes-diethelm » Wed Mar 09, 2016 4:31 am

I see you did a lot of tests!
Wildcat wrote: I did try that; while it may have changed things a bit (just about anything you changed affected it somewhat), it did not fix the problem. The reason for the Ctrl_Link conditional is to support 144 MHz sampling, which requires the ADC to be set up with out-of-phase clocks.
I was thinking this could be for double the sample rate using the two channels combined. Is this supported by the software, and by the bandwidth of the analog input?

I'm still waiting to be able to download iCEcube2... I hate these stupid sites where you need to register and then wait... :evil:

So, as you describe, it might be that this problem also exists with the original FPGA image, but there is no way to see it, because not enough samples are displayed to catch it in a reasonable time?
Wildcat wrote: Some bit toggling at various levels could be observed with certain configurations. These would respond to changes in timing, and would diminish and eventually disappear when reducing the clock speed (keep in mind that in normal modes the timebase sets the clock speed; it can also be adjusted in full speed mode). However, there was one "level" at which the noise occurred that did NOT respond to reducing the clock speed and proved frustratingly persistent.
What do you mean by "level"? At which clock speeds did the problem disappear?

It might also really be the ADC. The HWD9288 is just a cheap copy of the AD9288; they even copied the datasheet images! :shock: The FPGA gates run on 1.2V, but this supply is generated from the 2.8V rail that the CPU and the ADC also run on. I've also seen that the ADC is specified to run from 2.7 to 3.6V, so the supply is just 100mV above the minimum. This might be a problem, especially on a switched supply, which is possibly slow and not so stable.

No, I don't have anything for the iCE65 chips. At my company we use Altera FPGAs. Maybe you could just ask Lattice support for a license for the old tool?

thenaughtyfantasy
Pre-kindergarten
Posts: 42
Joined: Thu Apr 09, 2015 9:48 am

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by thenaughtyfantasy » Wed Mar 09, 2016 6:25 am

Wildcat wrote:
thenaughtyfantasy wrote: Thanks a lot Wildcat. Still I'm limited to my multimeter's accuracy this way... and I wouldn't say I have even a good entry-level multimeter (although I will be buying a good one in the near future). After this I guess I have to do the hardware calibration to minimize overshoot and rise time. Or should I do this first? Now I remember something about calibrating at 50V not being possible due to the clamping diodes of HW v2.81!?!
Unless you have a REAL crappy multimeter (for example with limited ranges), it's probably accurate enough. The 8-bit readings in the Quad do not offer great accuracy, even compared with cheap DMMs. That's why it's important to calibrate it to get the most out of it. What the Quad voltage meters lack in accuracy they make up for in bandwidth and the ability to measure complex waveforms, while generally providing enough accuracy for most purposes.

Hardware calibration (high frequency compensation) is not in any way related to the DC software calibration. Doesn't matter which one you do first.

If your device still has the clamping diodes at the analog inputs, I would strongly recommend removing these. Not only will they limit the voltage applied to the inputs, but if what you measure is of a low enough impedance (as it very often is) it can short the diodes and result in a dead channel, or even possibly blow a circuit trace and/or damage the circuitry you are measuring. The HW 2.81 device I purchased from SEEED had these already removed when I got it, presumably at SEEED's request to their supplier. It did, however, still have them at the digital inputs.

I could go into a very long, drawn out discussion about the input circuitry of the Quad. I don't wish to do that at this time, suffice it to say that clamping diodes at the input, without any resistive buffering is NOT a good idea...
It is really crappy for accuracy and precision, but really good for 8 GBP :p No auto-range, no lead compensation, only 2000 counts I believe. The next one I'm buying is going to be about 100 GBP, 6000 counts minimum (looking at Brymen ones). I understand about the ADC; with only 8 bits you can't do much accuracy-wise :p

I checked the hardware; the clamping diodes are missing, which is good. I didn't even need to take the shield off, just lift it up a bit and I could just barely check. Maybe all HW v2.81 units have them removed, but who knows.

I managed to do a calibration. Not the best way though; I'll probably redo it. I used two 18650 batteries in series feeding a power module with selectable 5V and 3.3V (on-board regulators), and used two variable resistors as potential dividers, with the output of one connected to the input of the other so I could make finer adjustments. It was really hard to pinpoint the ADC to the center at low voltage ranges, probably due to noise from the power module regulators (dumb me ;p). For the higher ranges I used a power supply connected to a boost converter to go up to 35V. For the last range I couldn't generate a high enough voltage, so I just grounded the probe (I'm not sure if this is the correct thing to do).

Also, every time I save a calibration for low-battery and full-battery, does it overwrite the old one automatically? Or do I need to delete the old calibration files manually?

Wildcat
Elementary-1
Posts: 166
Joined: Fri Jun 22, 2012 1:29 pm
Are you a staff member of seeedstudio?: no
Which products/projects are your favorite?: DSO Quad

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by Wildcat » Wed Mar 09, 2016 12:10 pm

hannes-diethelm wrote:I was thinking this could be for double the sample rate using the two channels combined. Is this supported by the software, and by the bandwidth of the analog input?
144 MHz sampling is not currently implemented in the software. It's supported in all FPGA versions, even very early ones. I set it up not too long ago to see how well it worked. It works, but it is extremely complex and tedious to set up, with special timebases, both time and level interleaving compensation needed, and the meters needing to be adapted to it. It is only of any benefit at the 0.1uS/div timebase, since at 0.2 you need to halve the sampling rate, so you might as well just use a non-interleaved mode. The display is already pretty good at the ~10MHz max anyway; not much to gain from such an elaborate function, in my opinion.
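As a rough illustration of what the interleaved mode does (a sketch of the general technique, not the actual FPGA implementation):

```python
def interleave(samples_a, samples_b):
    # Channels A and B sample the same input on opposite clock edges
    # (180 degrees apart), so merging them alternately doubles the
    # effective sample rate: 2 x 72 MS/s = 144 MS/s.
    merged = []
    for a, b in zip(samples_a, samples_b):
        merged.extend([a, b])
    return merged

print(interleave([10, 30, 50], [20, 40, 60]))  # [10, 20, 30, 40, 50, 60]
```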

The noise problem may have been there with previous FPGAs, but the added gate functions, particularly the 32-bit compares for the time triggering, make it much worse. Plus, the nature of full speed sampling "records" every little blip, so the problem really shows up with it.
What do you mean by "level"? At which clock speeds did the problem disappear?
By a certain "level" I mean a level shift on the vertical display where a more significant bit takes over lesser ones (e.g. a shift from b01111 to b10000). There was one such noise "spot" about 3/4 of the way up (actually 1/4 of the way up in ADC terms, since the FPGA inverts the input from the ADC) that acted differently: it seemed to always be there, even with the sampling rate reduced way down. If a particular configuration otherwise created a lot of noise (at other "levels"), that noise would generally disappear below 18 or 9 MS/s.
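The reason noise concentrates at such "levels" is that a carry boundary flips many output bits at once, so a single lagging bit can land the sample far from either code. A quick check (8-bit codes assumed):

```python
def bits_flipped(code_a, code_b):
    # Count the output bits that differ between two ADC codes.
    return bin(code_a ^ code_b).count("1")

# Adjacent codes across the major mid-scale boundary flip all 8 bits:
print(bits_flipped(0b01111111, 0b10000000))  # 8
# An ordinary step flips only one bit:
print(bits_flipped(0b01111110, 0b01111111))  # 1
```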

As far as the HWD9288 is concerned, I had to replace another one of those a couple of years back, on another device that produced nothing but garbage on one channel at the two fastest timebases. And yes, I saw the photocopied datasheets... Pretty blatant of them!
Last edited by Wildcat on Wed Mar 09, 2016 12:15 pm, edited 1 time in total.

Wildcat
Elementary-1
Posts: 166
Joined: Fri Jun 22, 2012 1:29 pm
Are you a staff member of seeedstudio?: no
Which products/projects are your favorite?: DSO Quad

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by Wildcat » Wed Mar 09, 2016 12:13 pm

thenaughtyfantasy wrote:Also, every time I save a calibration for low-battery and full-battery, does it overwrite the old one automatically? Or do I need to delete the old calibration files manually?
You don't need to delete the old ones; the new values overwrite them.

z76
Pre-kindergarten
Posts: 3
Joined: Tue Dec 09, 2014 2:58 am

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by z76 » Wed Mar 09, 2016 10:36 pm

Thanks for an amazing firmware, Wildcat. I'm mostly a hobbyist and this is my first feature-rich scope. I'm learning a lot thanks to your efforts!

Here's a feature request I've been thinking about: What if there was a mode where the scope boots in the same condition as it was when it was turned off? (Technically the config 0 file would be saved whenever parameters were adjusted, perhaps after a timeout to avoid excessive writes.)
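A minimal sketch of that timeout idea (the class and names are mine, purely to illustrate; the real firmware would do this in C against its own settings code):

```python
import time

class DeferredConfigSaver:
    # Sketch of the proposed behavior: mark settings dirty on every
    # adjustment, but only write config 0 once they have been stable
    # for `delay` seconds, so knob-twirling doesn't hammer the flash.
    def __init__(self, save_fn, delay=2.0):
        self.save_fn = save_fn
        self.delay = delay
        self.dirty_since = None

    def mark_changed(self):
        # Restart the timeout on every parameter adjustment.
        self.dirty_since = time.monotonic()

    def poll(self):
        # Call periodically from the main loop; saves at most once
        # per quiet period.
        if (self.dirty_since is not None
                and time.monotonic() - self.dirty_since >= self.delay):
            self.save_fn()
            self.dirty_since = None
```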

I try to remember to save my config before I power-off but sometimes I forget and spend a lot of time trying to restore whatever settings I had before. Thanks for reading -Zach

Wildcat
Elementary-1
Posts: 166
Joined: Fri Jun 22, 2012 1:29 pm
Are you a staff member of seeedstudio?: no
Which products/projects are your favorite?: DSO Quad

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by Wildcat » Thu Mar 10, 2016 1:55 pm

z76 wrote:Thanks for an amazing firmware, Wildcat. I'm mostly a hobbyist and this is my first feature-rich scope. I'm learning a lot thanks to your efforts!

Here's a feature request I've been thinking about: What if there was a mode where the scope boots in the same condition as it was when it was turned off? (Technically the config 0 file would be saved whenever parameters were adjusted, perhaps after a timeout to avoid excessive writes.)

I try to remember to save my config before I power-off but sometimes I forget and spend a lot of time trying to restore whatever settings I had before. Thanks for reading -Zach
Well, I've done that myself on more than one occasion. If I can come up with a simple way of monitoring the parameters that doesn't use a lot of memory, I might do that. Working with only 48 KILObytes of RAM for the entire program, things have to be done very efficiently. There's very little memory left to spare...

It will be a while before another update though; I have little free time right now. I'm also waiting to see how the new FPGA I posted works out.
Last edited by Wildcat on Fri Mar 11, 2016 7:56 am, edited 1 time in total.

z76
Pre-kindergarten
Posts: 3
Joined: Tue Dec 09, 2014 2:58 am

Re: DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes

Post by z76 » Thu Mar 10, 2016 8:33 pm

Thanks for considering the request, Wildcat. I understand that there's no timeline for features like these. Take care.
