DSO203 GCC APP - Community Edition (2.51+SmTech1.8+Fixes)

You will also want to install app1.hex. After copying the 2 FPGA files, disconnect the device and shut it down, then boot into the DFU again, reconnect, and copy app1.hex.



It doesn’t really matter whether you install the FPGA pair or the HEX first, but the 2 FPGA files must be installed together, without rebooting between them. The first of the two (the ADR file) tells the DFU where to install the FPGA code, so it must be loaded before the FPGA data. Addressing for the HEX file is built in, so it does not need an ADR file.

Just finished another program/FPGA update for HW V2.81; however, in the course of development I ran into a hardware issue, and I need help finding out how prevalent it is. The update adds a “full speed” buffer mode which samples at 72MS/s for all timebases.



The issue I ran into was parasitic bit toggling on ch A when using the updated FPGA (see bottom screenshot). Ch B was not affected. The problem was finally isolated to the ADC chip: heating the IC would make the toggling disappear, while cooling it made it worse. It should be noted, though, that when switching back to the previous FPGA versions the noise was not seen. Apparently the added functions in the updated FPGA were somehow provoking it.



However, after replacing the ADC IC with an Analog Devices unit the problem completely disappeared, showing no traces of the noise whatsoever when used with the new FPGA.



Hopefully this was an isolated problem with my particular device, and not common with the ADCs that come with the later V2.81 units. Again, my device showed no signs of this until the new FPGA update was used; changing back to earlier FPGA versions made everything work as it did before, while replacing the chip totally eliminated the problem.



So until I can get some feedback from other users on this, I would have to consider it a “test” version to establish the viability of posting any future FPGA updates.





WITH THIS IN MIND:



When used on HW 2.81 with accompanying revised FPGA V1.1:



Added full speed oversampling buffer mode: samples at the full 72MS/s at all times, providing ~10MHz bandwidth without aliasing for all timebases, except at the very slowest (50mS/div and longer) where the rate is gradually reduced to maintain a maximum 60,000x OS ratio. This combines features of both digital and analog scopes. Cycle with the right toggle center button: Single window > Averaging > 8x OS > Full speed sampling > Large buffer. The buffer notification will turn blue to indicate the mode. If desired, the sampling speed for this mode can be changed from the default 72MS/s in steps down to 2MS/s by holding button 2 for more than 3 sec while in full speed mode with the menu on the Time/Div adjust item. Change with the left toggle and save with boot config 0. Additional info is included in the user’s guide.
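
For the curious, here is roughly how the rate reduction at the slowest timebases works out. This is a sketch only, assuming ~30 buffer samples per division; the numbers and names are illustrative, not the actual firmware code:

/* Sketch: how the 60,000x oversampling cap translates into a reduced
 * sample rate at the slowest timebases. Assumes ~30 samples per division. */
#include <stdio.h>

#define FULL_RATE       72000000.0   /* full speed rate, 72MS/s */
#define MAX_OS_RATIO    60000.0      /* maximum samples combined per stored point */
#define SAMPLES_PER_DIV 30.0         /* assumed buffer samples per division */

static double effective_rate(double sec_per_div)
{
    double os_ratio = FULL_RATE * sec_per_div / SAMPLES_PER_DIV;
    if (os_ratio <= MAX_OS_RATIO)
        return FULL_RATE;            /* full 72MS/s for faster timebases */
    return MAX_OS_RATIO * SAMPLES_PER_DIV / sec_per_div;   /* reduced when cap is hit */
}

int main(void)
{
    const double tb[] = { 20e-3, 50e-3, 100e-3, 500e-3 };  /* s/div */
    for (int i = 0; i < 4; i++)
        printf("%3.0fmS/div -> %4.1fMS/s\n", tb[i] * 1e3, effective_rate(tb[i]) / 1e6);
    return 0;
}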



By its very nature, this mode will capture and display any noise, no matter how intermittent, whether it’s generated from within the scope or from an outside source. For example, an intermittent ground strap connection between the display frame and the gen out jack was also causing accumulated noise.



Once the ADC was replaced and the ground repaired, this worked quite well, increasing the sensitivity of the instrument in a very useful way. For example, when used at timebases of around 1mS/div and slower, where the display can keep up with incoming data, the mode provides continuous ~10MHz bandwidth sampling without any “dead time” or aliasing, with the ability to capture any signal or noise pulse, no matter how narrow or intermittent.





Other fixes with this update (effective when used on all HW device versions):



-Fixed analog channels being offset from each other, visible at the fastest 0.1uS/div timebase when used with certain ADCs found in later devices (shows as a 4-5 pixel offset due to interpolation). This was present in all versions from 3.3 and later due to the removal of the interlace mode reset; for some reason all SYS versions set interlace on boot-up. ADCs used on earlier devices do not seem to be affected, so this went unnoticed.



-Added horizontal trace thickness adjust: with the menu on backlight adjust, press the RIGHT toggle (button 6) to toggle the thickness adjustment menu, and change with the left toggle (2 steps with full speed OS, 3 with all others). Note that “vertical bright adjust” (see the user guide for more info) can also be toggled this way while in the backlight menu, using the LEFT toggle center button. Save settings in boot menu 0. Display changes from these, as well as from the ADC offset adjust, can now be observed while changing them.



-Added frequency display as an alternative to delta time for the Time cursors. Highlight the T cursor display as a sub menu item (T1 > T2 > Display), and toggle between time and the corresponding frequency with the left toggle (see the small example after this list of fixes).



-Fixed the UART generator function at times transmitting random buffer contents when trying to load a non-existent file; also fixed the file save function's number display prematurely increasing its value if the file already exists.



-Trigger delay function is now disabled in oversampling modes.



-User guide has been updated.
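
The frequency readout mentioned in the Time cursor item above is just the reciprocal of the cursor delta time. A quick illustration with made-up cursor values (not the firmware's display code):

/* Sketch: time-cursor delta converted to the corresponding frequency.
 * Cursor values are made up; variable names are illustrative only. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double t1 = 1.25e-3;                         /* T1 cursor, seconds */
    double t2 = 3.75e-3;                         /* T2 cursor, seconds */
    double dt = fabs(t2 - t1);                   /* delta time */
    double freq = (dt > 0.0) ? 1.0 / dt : 0.0;   /* corresponding frequency */

    printf("dt = %.3fmS  f = %.1fHz\n", dt * 1e3, freq);   /* 2.500mS -> 400.0Hz */
    return 0;
}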





50 nS overshoot pulses captured on a 10Hz square wave without aliasing in full speed buffer mode. On the right, the same pulse is expanded to 0.1uS/div.





Trace can now be adjusted for horizontal thickness.





Parasitic bit toggling in the ADC while in full speed OS mode on ch A. Ch B not affected. After replacing the IC, everything worked properly…

Hi WildCat, congratulations on your good work. I use it every day and find it very useful;

however, I can’t find the pre-trigger function; in other words, it doesn't seem possible to move the trigger point in time.

It would be very useful to have, because I’m triggering on a signal and I want to see what happens before it in SINGLE mode.



Do you have in mind to implement such a function?

Loaded Wildcat 5.1 easily on Windows 7 with the DSO Quad plugged into a USB 2.0 port. I wanted to reproduce Wildcat's test of the ADC but do not know what the source of the sine wave is. From the picture of the screen, it looks like the wave generator on the DSO203 was not used. I want to do exactly the same, just in case.

All modes have 150 samples (5 divisions, or a bit less than half a screen width) of pre-trigger.



The actual trigger point is at the vertical orange trigger cursor, the one that moves when shifting XPOS. Anything to the left of this cursor is the pre-trigger section. If the window is shifted all the way to the left with XPOS, the orange trigger cursor will appear to move to the right towards the middle of the screen. The window and cursor (along with the waveform if triggered) will slow down at the very end to provide fine positioning.



In Single mode, the buffer defaults to the large buffer format, even when the buffer is set to single window. In large buffer mode, the window can be shifted along the length of the entire buffer with the XPOS control, some 10 window-widths, so if it is shifted to the right, the trigger point along with the pre-trigger section may not be visible unless you bring the window all the way back to the start at the extreme left. The small rectangle at the bottom of the screen will show the position of the window in relation to the buffer.



If the buffer is set to one of the oversampling or averaging modes, the buffer length will be one window’s width plus the pre-trigger section, even in single mode. In this case the window can only be shifted to the right a little ways, just enough to bring the trigger point to the left of the screen.
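
To put rough numbers on the window/buffer relationship (the sizes here are assumptions for illustration only, not the exact firmware values):

/* Sketch of the buffer geometry: a large buffer roughly 10 windows long,
 * a 150-sample pre-trigger section, and a display window moved by XPOS.
 * Window and buffer sizes are assumed for illustration only. */
#include <stdio.h>

#define WINDOW_SAMPLES  360                      /* assumed samples per displayed window */
#define BUFFER_SAMPLES  (10 * WINDOW_SAMPLES)    /* "some 10 windows" large buffer */
#define PRETRIG_SAMPLES 150                      /* fixed pre-trigger depth */

int main(void)
{
    int trigger_index = PRETRIG_SAMPLES;         /* trigger lands 150 samples into the buffer */

    /* XPOS moves the window start anywhere along the buffer */
    for (int window_start = 0; window_start <= BUFFER_SAMPLES - WINDOW_SAMPLES;
         window_start += 2 * WINDOW_SAMPLES) {
        int cursor_pos = trigger_index - window_start;   /* orange cursor position in the window */
        if (cursor_pos >= 0 && cursor_pos < WINDOW_SAMPLES)
            printf("window at %4d: trigger cursor visible at sample %d\n",
                   window_start, cursor_pos);
        else
            printf("window at %4d: trigger and pre-trigger are off screen (shift left)\n",
                   window_start);
    }
    return 0;
}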



So in summary, to see the pre-trigger section, simply shift the window all the way to the left with XPOS, until the orange trigger cursor moves to the right towards the center of the screen.

The wave frequency was about 50Hz; however, this is not critical. The noise was more obvious at slow timebases (2mS/div in this case) and occurred at specific levels, as can be seen in the screenshot. It seems as though the wrong bits were being engaged as the ADC processed the incoming analog data.



Note, however, that this was only observed on hardware V2.81 devices when the accompanying FPGA was also loaded, so that the full speed mode could be engaged. While it was most severe with full speed mode engaged, it was also observed to a lesser degree in the normal modes.



So this test is only relevant if you have installed both Wildcat 5.1 and the FPGA 1.1 included with the update. Note that this FPGA is ONLY FOR HARDWARE V2.81.

Wildcat,



I am brand new to the dso203 and am still playing around with the system and the instructions in the V5.1 upgrade. I do have a new Ver 2.81 with the new Wildcat V5.1, including the new FPGA 1.1 that came with V5.1. I want to make sure that I am testing the exact same thing you were when you got the ADC error. I was wondering if you had a more standardized test that people can do to check this. I was thinking you could set up a test using the wave generator in the dso203 with the A and B channels connected to it, then save the config file and upload it to this forum so we could download it, run it, and then upload a picture to the forum for you to see.



P.S. Having lots of fun with Wildcat version 5.1, comparing it to full o-scopes at work. Much better features than the original software in the dso203. Thanks for the hard work you have put into it.

Just realized I only have 2 leads max for the dso203. Can the test be done with only one channel and the wave generator?

Wildcat,



I noticed something similar, but erratic and not as pronounced. It is hard to capture in a screenshot because the imperfections in the trace pattern move around and come and go. When I switch to the new blue buffer mode, the effects are more pronounced.



Didn’t try adjusting the room temperature.



This is with WC5.1 and custom FPGA V1.1, Hw 2.81.

Yes, you can do one channel at a time, and you can use the built-in generator. You don’t have to do anything special; just see if you are getting the type of noise that’s in the bottom screenshot while observing any waveform in full speed buffer mode (press the right toggle center button until the bottom of the screen area inside the orange rectangular bar turns blue). If the noise is there you will see it. It may do it on one channel and not the other, and it may not be the same as mine.



It will be more obvious at slower timebases, say slower than 200uS/div. Cooling the device down to ~45 degrees F (5 degrees C) made it MUCH worse, while heating it with a hair dryer to a bit above room temp made it go away.



Using the scope for a period of time while in full speed mode will certainly reveal it, if the problem is there.



Hopefully this was just a bad chip in my device and everyone else’s units are OK…

Yes, this looks like the noise; the key is that it occurs at the same vertical level on the wave trace. Try it in full speed mode with the same waveform/timebase.

Yes, in the faster buffer mode (blue) it shows up more pronounced.

OK, well it looks like your device is responding exactly like mine was. The height is a bit different, but that’s likely because the ADC offset was adjusted to a different level on yours (looks like yours is set to around the default of 54).



I was hoping mine was an exception, but now with 2 out of 2 doing the exact same thing, it looks more like this is “normal” for these ADCs. I would like to get some more feedback to make sure ALL units behave in exactly the same way. During the course of troubleshooting this issue, I developed a few software workarounds that minimized the noise, at least at warm room temperatures (mine was EXTREMELY sensitive to temp; for example, it was a bit pronounced when first turned on, but gradually got better as the circuitry warmed up from being powered).



While the best solution would be to replace the ADC (which, by the way, also solved the premature clipping at the top of the screen, a problem with mine and apparently some others), I suspect not many have the resources or the willingness to do this. Since I no longer have a unit that exhibits the problem, I can now only guess at what was causing it. Perhaps the increased gate load in the FPGA was causing noise on the supply lines that the ADC couldn’t cope with. The extreme sensitivity to temp has me baffled, though. This usually indicates a defective chip, but with the complexity of modern processors, anything is possible.



The software workarounds included shifting the A channel to the B ADC when channel B wasn’t being used, totally eliminating the problem when only ch A or only ch B was in use, as well as reducing the number of gates in the FPGA by cutting down as many functions as possible and simplifying things. Before I finally took mine apart to look into the matter, I came up with an APP/FPGA version that minimized the problem pretty well, if not eliminated it altogether. It did, however, have some issues with time-based triggering, occasionally showing some instability with some wave types while in full speed mode. Solving this would have meant adding gates, which increased the parasitic noise, but the version was nonetheless quite usable, with the triggering instability occurring only on rare occasions.



Will see if I can bring this older version up to date with the latest changes when I get some more free time.



Thanks for checking this out. Also thanks to anyone else in advance for verifying if their units behave in the same way.

Thank you for the explanation. Today I noticed the orange cursor can move back and forth in the first quarter of the screen; however, is there any hardware limitation that restricts the pre-trigger to the first 150 samples?

Wildcat,



I just tried WC5.1 w/FPGA V1.1 on a different HW2.81 device. Results from this second unit in two buffer modes are attached. Didn’t try placing it in the fridge, just another room temperature test.

150 samples is a bit more than 1 division short of halfway across the screen. There’s no hardware limitation. The default is set in the FPGA at 150 and it can be changed by the program, but the program continually resets the trigger, and every time it does the FPGA resets the pre-trigger to 150. All of this can be changed. I always thought 150 samples of pre-trigger was enough, but I suppose it could be useful to make it adjustable.
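
If it were made adjustable, the idea would roughly be the following. This is a purely hypothetical sketch; none of the names below come from the actual firmware or FPGA, and the stubs only stand in for the real hardware access:

/* Hypothetical sketch: an adjustable pre-trigger depth would have to be
 * re-sent to the FPGA after every trigger re-arm, since re-arming restores
 * the 150-sample default. All names and stubs here are invented. */
#include <stdio.h>

#define PRETRIG_DEFAULT 150

static int fpga_pretrig = PRETRIG_DEFAULT;   /* value currently held by the "FPGA" */
static int user_pretrig = 300;               /* user-selected depth, e.g. from a menu */

static void fpga_arm_trigger(void)           /* stub: re-arm the trigger */
{
    fpga_pretrig = PRETRIG_DEFAULT;          /* re-arming restores the default */
}

static void fpga_write_pretrig(int samples)  /* stub: write the depth setting */
{
    fpga_pretrig = samples;
}

static void rearm_trigger(void)
{
    fpga_arm_trigger();
    fpga_write_pretrig(user_pretrig);        /* re-send the chosen depth each time */
}

int main(void)
{
    rearm_trigger();
    printf("pre-trigger depth now %d samples\n", fpga_pretrig);
    return 0;
}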

Thanks for checking this other one. It does not appear to have the problem; I would check it with a full scale waveform, though, just to be sure it doesn’t show up at the top or bottom. It doesn’t matter what voltage scale you’re on, it’s all the same to the ADC; it basically sees what you see on the screen.

Wildcat,

I was wondering which AD9288 grade from Analog Devices you used as a replacement on your DSO. Was it another 40, or did you use a faster rated chip like the 80 or 100? Are they all 48-pin LQFP packaging?



Also, I found a photo of an old PCB with 2MB flash from an eBay seller. The photo seems to indicate an option for the AD9218 (10-bit A/D). Is the AD9218 pin compatible with the AD9288 and supported by the DSO’s firmware?

I used an 80MS/s chip for the replacement. All speed versions are the same, just different grades. The 10-bit option is not compatible with the hardware; that is only advertising hype. They do make a pin-compatible 10-bit version, but the extra bit ports would be unused if installed in the Quad.



I don’t believe the speed rating of the chip is a factor in this, however, since the parasitic noise was still there even with the sampling rate brought down to 2MS/s, though somewhat less prominent. I chose an 80M part for the replacement simply because it’s proper engineering practice, but the 40M devices seem to work OK at 72M.



What I would be curious to know, though, is the difference, if any, between the ADC versions used in the 2 devices you tested. The schematic for HW 2.81 specifies a HWD9288 from Chengdu Sino (CSMT), which is what I had in my device. Another, earlier HW V2.70 device I have might also have had one of these, or possibly something from another source (I can’t remember), while the early V2.60 versions have Analog Devices chips.

Single Channel A. Looks good to me. Let me know if this is the sample you are looking for, from the new dso203 with the new FPGA chip, Wildcat 5.1, and the new FPGA 1.1.
