hannes-diethelm wrote: This is typical when you make changes inside the FPGA. I assume the timing was already critical before your changes; if they add some additional delay, this can happen.
Now if the output of the ADC changes, for example from 1000 0000 to 0111 1111 (128 to 127), and bit 5 is a bit too slow, the sampled value can be 0101 1111 (95). Heating or cooling the ADC changes these delays, so the effect can get better or worse. The same applies if you change the ADC.
This is what I originally assumed before doing countless builds; however, the problem has proven frustratingly difficult to pinpoint.
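The mechanism is easy to see in simulation. A throwaway sketch (hypothetical names and delays, not from the actual code):

module skew_tb;
  reg        clk = 0;
  reg  [7:0] adc = 8'b1000_0000;  // 128
  wire [7:0] pin;
  reg  [7:0] sampled;

  // every data bit arrives promptly except bit 5, which is 2 ns late
  assign    pin[7:6] = adc[7:6];
  assign #2 pin[5]   = adc[5];
  assign    pin[4:0] = adc[4:0];

  always @(posedge clk) sampled <= pin;

  initial begin
    #10 adc = 8'b0111_1111;       // 127
    #1  clk = 1;                  // sample before bit 5 has settled
    #1  $display("%b (%0d)", sampled, sampled); // prints 01011111 (95)
    $finish;
  end
endmodule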
hannes-diethelm wrote: I just reverse engineered the FPGA code a bit. MCI seems to be the master clock from the CPU (72 MHz). The ADC clocks are given by:
assign CKA = MCI;
assign CKB = (Ctrl_Link) ? !MCI : MCI;
In the DP_RAM, the ADC Data is sampled on the rising edge of MCI.
This all seems fine, but assign CKB = (Ctrl_Link) ? !MCI : MCI; adds some additional delay on CKB. Just give it a try and use:
assign CKA = (Ctrl_Link) ? !MCI : MCI;
to have the same delay on CKA. I looked through the code and it seems Ctrl_Link is always zero.
I did try that; while it may have changed things a bit (just about anything you changed affected it somewhat), it did not fix the problem. The reason for the Ctrl_Link conditional is to provide support for 144 MHz sampling, which requires the ADC to be set up with out-of-phase clocks.
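For the record, the matched-delay test amounted to running both clock outputs through the same mux (note this makes them in-phase when Ctrl_Link is set, so it could only ever serve as a test, not as a fix):

assign CKA = (Ctrl_Link) ? !MCI : MCI;
assign CKB = (Ctrl_Link) ? !MCI : MCI;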
hannes-diethelm wrote: The code infers a mux, which adds some additional delay. It is possible that the total delay is so large that the FPGA samples the signal one rising edge later, so adding even more delay could fix the bug.
Some of the early builds made while trying to solve this looked as if that could have been the case, as post-synthesis timing showed a shortfall of around one MCI clock period. However, other compiles with everything stripped down to keep the timing in line still did not solve the problem.
hannes-diethelm wrote: It might also be that CHA and CHB are exchanged somewhere by accident, so trying:
assign CKB = MCI;
removes the delay on CHB, which could be CHA...
hannes-diethelm wrote: Normally such timing issues are tested and, if possible, fixed by the FPGA tool, but the actual SDC file has no information about the external delay of the ADC, so it assumes zero delay. Whether it works then comes down to chance and temperature!
I didn't add any delay in the constraints file, but after observing that reading on the falling rather than the rising edge made no difference whatsoever to the noise, I had more or less dismissed the possibility of input timing issues. Still, it wouldn't hurt to try; it would be great if such a simple fix worked...
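For anyone who wants to experiment, constraints along these lines would at least tell the tools about the external path (the port names and delay values here are assumptions; the real numbers would have to come from the ADC datasheet and the board traces):

# 72 MHz master clock from the CPU
create_clock -name MCI -period 13.888 [get_ports MCI]
# external ADC clock-to-data delay on the channel A data pins
set_input_delay -clock MCI -max 7.0 [get_ports {ADC_A[*]}]
set_input_delay -clock MCI -min 1.0 [get_ports {ADC_A[*]}]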
hannes-diethelm wrote: By the way: which tool are you using to synthesize the FPGA? I could improve the SDC file and do some FPGA simulations to see if there really is a problem. And which compiler are you using for the C code?
I'm using Lattice's iCEcube2, release 2015.04.27409, with the Synopsys synthesizer, and GCC 4.6.1 from CodeSourcery (CodeBench Lite) as the C compiler.
I spent a good amount of time trying to solve this problem before changing the ADC chip. I eventually came up with some builds that actually worked pretty well; some minimized the noise to barely noticeable levels, but none completely eliminated it.
Here are some notes from my experience with this issue that may be of help:
First of all, the issue was confirmed to be coming from the ADC by swapping the A channel up to bits 8-15 and the B channel down to 0-7 at the input ports of the FPGA. This caused the problem to show up on the B channel instead (the ADC's A channel was now being sent to the B display). The noise was not changed at all; it was exactly the same.
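Roughly, the swap test looked like this (signal names assumed, not the real ones):

// normal routing: assign cha_data = adc_pins[7:0]; assign chb_data = adc_pins[15:8];
assign cha_data = adc_pins[15:8]; // ADC's B channel now feeds the A processing path
assign chb_data = adc_pins[7:0];  // ADC's A channel now feeds the B processing path

Since the noise followed the ADC's A pins to the B display completely unchanged, it must already be present at the FPGA input pins rather than created in the channel processing logic.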
The problem seems to be intermittent in nature; in other words, it's not a continuous, every-cycle event but a random, occasional blip. Of course, something that is "on the edge" can act that way, but it is unusually persistent, occurring to a varying extent at all clock speeds and with a wide variety of completely different FPGA configurations.
Most of the builds I tried did not show the effect at all in normal modes, due to its intermittent nature. However, full speed mode stores any such event. At slow timebases, the cumulative effect of having, say, 60,000 samples taken, stored and displayed means you are nearly certain to capture at least one event for each displayed sample, resulting in what looks like a continuous string of noise.
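To put a number on it: even if a bad capture occurred only once in every 10,000 samples, the chance of catching at least one among 60,000 is 1 - (1 - 1/10000)^60000, or about 99.8%, so virtually every displayed sample would contain one.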
While some bit toggling at various levels could be observed with some configurations, and this toggling would respond to timing changes and would diminish and eventually disappear when the clock speed was reduced (keep in mind that in normal modes the timebase sets the clock speed; it can also be adjusted in full speed mode), there was one "level" at which the noise occurred that did NOT respond to reducing the clock speed and proved frustratingly persistent.
Many configurations were tried, with different functions clocked on different edges. For example, reading the input on the falling edge rather than the rising edge had no effect whatsoever, with the exact same bits in ADC channel A misbehaving in the exact same way.
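For example, one variant simply flipped the capture edge (a sketch with assumed names):

always @(negedge MCI)      // instead of @(posedge MCI)
    sampled_a <= cha_data;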
The one thing that seemed to cause the most trouble was increasing XTthreshold from 16 bits to 32 bits to account for the faster clock used with full speed mode at slow timebases. This seemed to come from the increased complexity of comparing several 32-bit registers rather than 16-bit ones.
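In sketch form, the change just widened the counter/threshold compare (XTthreshold is the real name; the counter name here is assumed):

reg  [31:0] XTcount;      // was [15:0]
reg  [31:0] XTthreshold;  // was [15:0]
wire xt_hit = (XTcount >= XTthreshold); // 32-bit compare doubles the carry/compare chain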
The final FPGA build I published is NOT optimized to minimize this problem, but rather optimized to function properly. The final timing analysis for it looks good; the only exceptions are some inter-clock relationships with unspecified constraints, between the main clocks and the program/FPGA control transfers, which occur only occasionally, too seldom to be an issue. Many builds almost eliminated the noise but, frustratingly, each required a compromise somewhere (most notably in the time-based triggering function, which needed to be changed to 32 bits) and for that reason could not be used.
None of this made any sense to me. At first I was sure the issue was timing, but no matter what I tried, the results were random and unpredictable.
After the ADC was changed, none of the builds showed ANY trace of this whatsoever.
At this point I'm leaning more towards some hardware issue being the culprit, possibly the added gate count in the FPGA causing increased noise on the supply lines that the ADC can't cope with, or something like that.
I hope you can find something I overlooked that can solve the problem of using this mode with the ADCs that come with the units. It also appears that some units work just fine... At this point, though, I have no way to reproduce the issue for testing, as my unit no longer has the problem, so it's up to you if you wish to pursue it.
Thanks for taking an interest in this, I believe this is a potentially very useful mode and well worth the effort to make it right. Let me know if I can help in any way.
By the way, you wouldn't by any chance have the means to program the iCE65 chips used in the earlier devices? These were made by SiliconBlue, which was bought by Lattice, but Lattice does not appear to want to support the old chips. Although the 65 version can be selected, the compiler comes back with a missing DEV file. An earlier version of iCEcube from SiliconBlue only has Synopsys available for synthesis and will not accept the current license available from Lattice.