x1/x10 probe and Nano V2 Question

I have a question about setting this probe up with the Nano.
(I apologize in advance if I don't say something correctly; this is new to me.)

Yesterday I received a few Nano V2 units that I plan to use for my business. They will be used to
set up car audio systems correctly (30 Hz to 20 kHz, with output voltages varying from roughly 2 V to 50 V on the amplifier side).

Every unit has the BNC adapter and an x1/x10 probe. While setting up the Nanos, I can change the attenuation
setting on the Nano from x1 to x10; does this need to work in conjunction with the switch on the probe?

Do I even need the x10 setting when working in the range I described above?

Thank you in advance…

I don’t currently have a V2 unit, but the general answer is to always match the probe’s switch position with the Nano’s “OT - Probe Att” value.

Your application may not require the 10x setting: the main benefits of a 10x probe are an increased usable input range and reduced circuit loading, which matters mostly in circuits that are sensitive to high-impedance loading.
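Not from the thread, just a quick illustration of why the two settings have to match: the scope multiplies whatever it measures at the BNC by the probe attenuation you tell it, so a mismatch skews every reading by a factor of ten. The numbers and function name below are illustrative only, not taken from the Nano firmware.

```python
# Illustrative sketch (not Nano firmware code): how probe attenuation
# affects the voltage the scope displays.

def displayed_voltage(actual_volts, probe_switch, scope_att_setting):
    """Return what the scope would display for a given input.

    probe_switch:      physical attenuation of the probe (1 or 10)
    scope_att_setting: the "Probe Att" value configured on the scope (1 or 10)
    """
    at_bnc = actual_volts / probe_switch   # the probe divides the signal down
    return at_bnc * scope_att_setting      # the scope scales the reading back up

# Matched settings: the reading is correct.
print(displayed_voltage(20.0, probe_switch=10, scope_att_setting=10))  # 20.0 V

# Mismatched settings: the reading is off by a factor of 10.
print(displayed_voltage(20.0, probe_switch=10, scope_att_setting=1))   # 2.0 V
print(displayed_voltage(20.0, probe_switch=1,  scope_att_setting=10))  # 200.0 V
```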

Whether you are using the 1x or 10x probe, never attach the probe to a voltage that exceeds the current V/Div range setting, and that includes while the Nano is turned off: when powered off, the input circuitry defaults to its lowest (most sensitive) range setting … burn-baby-burn :astonished:

You can’t even safely attach a 100 mV signal with the Nano turned off, because that could violate the absolute maximum input-pin specifications of the internal chip. :confused:

Thanks… So, I will be working in the 2 V to 50 V range 98% of the time (never less than 1-2 V)
and in the frequency range of 30 Hz to 20 kHz, which the Nano should be more than able to handle.
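One thing worth checking before settling on 1x vs 10x (my own rough sketch, not advice from the thread): a 50 V RMS sine wave peaks well above 50 V, so compare the peak value against the maximum input your unit's manual lists for each probe setting. The input limits in the snippet below are placeholders, not the Nano V2's actual specifications.

```python
# Rough range check. The MAX_INPUT_* limits are hypothetical placeholders;
# substitute the figures from your own unit's manual.
import math

def sine_peak(v_rms):
    """Peak voltage of a sine wave with the given RMS value."""
    return v_rms * math.sqrt(2)

MAX_INPUT_1X_VPK  = 40.0   # hypothetical max peak input with a 1x probe
MAX_INPUT_10X_VPK = 400.0  # hypothetical max peak input with a 10x probe

v_rms = 50.0             # top of the stated working range
v_pk = sine_peak(v_rms)  # about 70.7 V peak for a 50 V RMS sine

print(f"50 V RMS sine peaks at about {v_pk:.1f} V")
print("fits on 1x probe: ", v_pk <= MAX_INPUT_1X_VPK)
print("fits on 10x probe:", v_pk <= MAX_INPUT_10X_VPK)
```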

So BEFORE I calibrate the probe, I should make sure the attenuation is set to x10 on the Nano?
They should be in sync (x10 on both the Nano and the probe)?