I have upgraded my Quad with the new firmware and FPGA. The calibration now seems to work much better than before, so I repeated the measurement I made earlier to see if the offset between the two halves of the ADC is still a problem when hiding a channel: http://www.seeedstudio.com/forum/viewtopic.php?p=6828#p6828
There is a bug with the new firmware.
When hiding one channel, the display is stretched in the horizontal direction (2 times more samples).
I made the following acquisitions with a 150 kHz sine.
I have noticed this too. It appears to only occur on ranges from 2uS down to 0.1uS. The doubling of the sample rate is really only significant at these higher speeds because the extra samples are beyond the display resolution at slower speeds. I would actually favour the doubling of the displayed horizontal width as long as the timebase readout was corrected to match it. This would give us a fastest timebase of 50nS/div, which is justifiable with the higher sample rate.
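Here is a rough sketch of the correction I mean (the struct and names are made up for illustration, not taken from the Quad source): when one channel is hidden and both ADCs interleave on the remaining one, the effective sample rate doubles, so the displayed time/div has to be halved for the readout and the stretched trace to agree, e.g. 0.1uS/div becomes 50nS/div.

```c
#include <stdint.h>

/* Hypothetical timebase state - purely illustrative names. */
typedef struct {
    uint32_t ns_per_div;   /* nominal timebase, e.g. 100 for 0.1uS/div */
    int      interleaved;  /* 1 when one channel is hidden and both ADCs sample the other */
} timebase_t;

static uint32_t displayed_ns_per_div(const timebase_t *tb)
{
    /* Twice the samples across the same screen width means each division
     * now spans half the time, so halve the readout in interleave mode. */
    return tb->interleaved ? tb->ns_per_div / 2u : tb->ns_per_div;
}
```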
I also get the poor reconstruction of the waveform as shown on your lower picture. I have matched the gains of the two channels but can’t get rid of it. Maybe there is more at play here?
I would love to see averaging of the samples at these higher rates so we could reconstruct the waveforms of fast repetitive events that are up near the sample rate.
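Just to be concrete about what I mean by averaging, here is a minimal sketch (the buffer length and names are assumptions on my part, not Quad code): accumulate several triggered acquisitions of a repetitive signal and divide, so uncorrelated noise averages down while the repeating waveform remains.

```c
#include <stdint.h>
#include <stddef.h>

#define RECORD_LEN 4096   /* assumed record length, purely illustrative */

/* Average n_acq triggered acquisitions of the same repetitive event. */
void average_acquisitions(const uint8_t acq[][RECORD_LEN], size_t n_acq,
                          uint8_t out[RECORD_LEN])
{
    for (size_t i = 0; i < RECORD_LEN; i++) {
        uint32_t sum = 0;
        for (size_t k = 0; k < n_acq; k++)
            sum += acq[k][i];
        out[i] = (uint8_t)(sum / n_acq);   /* n_acq assumed > 0 */
    }
}
```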
My next guess about this interleave noise concerns DC offset differences between channels A & B. When you adjust the gain calibration, I suspect you are changing the programmed reference voltage for the associated ADC to correct for any DC front-end offset. ADC channel A is adjusted to the channel A front end, and the same for channel B. But when channel A is interleaved across both ADCs A & B, it is quite likely that the channel B gain calibration is not right for ADC B while it is digitising the channel A front end.
A way to test this would be to set the calibration gain adjustment of channel B to match the gain adjustments for channel A. Now input the signal on channel A and see if the interleave noise disappears.
Maybe someone could look at the source code and see whether the channel A calibration values are used to set the programmed reference voltage for both ADCs while in interleave mode, and likewise when channel B is active. A sketch of the check I have in mind is below.
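This is purely hypothetical pseudocode of the behaviour I would look for (none of these names come from the real source): in interleave mode only one front end is being sampled, so both ADCs should take that channel's calibration; in normal mode each ADC follows its own front end.

```c
#include <stdint.h>

/* Hypothetical per-channel calibration - illustrative only. */
typedef struct {
    uint16_t ref_mv;   /* programmed reference voltage for the ADC */
    int16_t  offset;   /* DC offset correction */
} cal_t;

extern cal_t cal_ch_a, cal_ch_b;                  /* per-channel calibration */
extern void adc_apply_cal(int adc, const cal_t *c);

void apply_calibration(int active_channel, int interleaved)
{
    if (interleaved) {
        /* Only one front end is sampled; both ADCs need its calibration. */
        const cal_t *c = (active_channel == 0) ? &cal_ch_a : &cal_ch_b;
        adc_apply_cal(0, c);
        adc_apply_cal(1, c);
    } else {
        /* Normal dual-channel mode: each ADC follows its own front end. */
        adc_apply_cal(0, &cal_ch_a);
        adc_apply_cal(1, &cal_ch_b);
    }
}
```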
I have had another look at the waveform interleave. There is something not right about the way the samples are drawn. I cannot post an image but I will attempt to explain what I mean. If the channels were simply different in offset or gain then I would expect to see horizontal steps differing for each sample. The actual waveform has one sample period with a horizontal line and the next sample with a sloping line, and so on. This looks more like a software drawing error. What do others think?
Lygra - the problem is not apparent when the inputs are grounded. You simply get the flat trace you would expect. This really rules out an offset difference between channels.
fdufnews - it might be related to sample ordering, but I would not expect the sloping line if it were just that. I would expect flat steps at two different levels. A sketch of the reconstruction I have in mind follows.
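Here is roughly how I picture the interleaved record being rebuilt (the buffer layout is my assumption, not taken from the source). If the pairs were merely swapped or mis-spaced, a gain or offset mismatch would still only give flat steps at two levels, not the flat-then-sloping pattern described above.

```c
#include <stdint.h>
#include <stddef.h>

/* Assumed layout: ADC A and ADC B sample the same input half a sample
 * period apart, so the drawn record must alternate them in time order. */
void deinterleave(const uint8_t *adc_a, const uint8_t *adc_b,
                  size_t n, uint8_t *out /* length 2*n */)
{
    for (size_t i = 0; i < n; i++) {
        out[2 * i]     = adc_a[i];  /* sample taken at t = i*T       */
        out[2 * i + 1] = adc_b[i];  /* sample taken at t = i*T + T/2 */
    }
}
```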
Further experiments show that it is proportional to the difference between successive samples; it is worst on the 0.1uS scale and most apparent on faster rising edges. E.g. if I view a 100nS rising edge, the “sawtooth” steps are about 0.5div deep when the whole rising edge is 6div. If I then slow the rising edge to 500nS, the “sawtooth” is much less pronounced, around 0.2div (almost acceptable). This was all done with a 1.2V pulse on the 0.2V/div scale.