On Mon, Sep 16, 2019 at 12:22 AM, Reginald Beardsley wrote:
I am trying to understand what's going on but don't see it. What is wrong with the following reasoning?
The trigger and ramp periods are both approximately 10 us, differing by 10^-6 of a period, i.e. 10 ps. So per sample the ramp shifts 10 ps with respect to the trigger, while at 1 ps/div the time base adds only 10 fs per sample to that shift. If my view is correct, the time base accuracy plays no role in your display.
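The per-sample arithmetic above can be sanity-checked with a few lines. The 1000-sample trace and 10-division screen are assumptions on my part (they are consistent with the 10 fs figure, but the post does not state them):

```python
PERIOD = 10e-6          # trigger and ramp period: 10 us
REL_DIFF = 1e-6         # fractional period difference: 10^-6

# Ramp shift with respect to the trigger, per sample.
slip_per_sample = REL_DIFF * PERIOD
assert abs(slip_per_sample - 10e-12) < 1e-18      # 10 ps

SEC_PER_DIV = 1e-12     # time base setting: 1 ps/div
DIVS = 10               # assumed divisions across the screen
SAMPLES = 1000          # assumed samples per trace

# Delay the time base itself adds from one sample to the next.
timebase_step = SEC_PER_DIV * DIVS / SAMPLES
assert abs(timebase_step - 10e-15) < 1e-21        # 10 fs

# The time-base increment is ~1000x smaller than the period-mismatch slip,
# which is why time-base accuracy would not matter on this display.
assert abs(slip_per_sample / timebase_step - 1000.0) < 1e-6
```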
Another concern is the following. The total time shift during one trace is 10 ns, only 1/1000 of a period, so the display shows only a very small portion of the ramp. You measure at 2 mV/div, so the full screen spans 20 mV. If that 20 mV difference during one trace corresponds to 0.001 of the total ramp amplitude, the total ramp amplitude would be 20 V, obviously way too high for the sampling head. Apart from that, the displayed ramp would nearly always be vertically off screen.
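The implied-amplitude argument works out the same way numerically. Again, the 1000-sample trace and 10-division screen are my assumptions, not stated in the post:

```python
PERIOD = 10e-6                     # ramp period: 10 us
slip_per_sample = 1e-6 * PERIOD    # 10 ps per sample, from the period mismatch
SAMPLES = 1000                     # assumed samples per trace

total_slip = slip_per_sample * SAMPLES     # 10 ns of ramp swept per trace
fraction_of_ramp = total_slip / PERIOD     # only 1/1000 of the ramp period
assert abs(fraction_of_ramp - 1e-3) < 1e-12

# At 2 mV/div over an assumed 10 divisions, the screen spans 20 mV.
screen_span = 2e-3 * 10

# If 20 mV corresponds to 0.001 of the ramp amplitude, the full ramp
# would have to be 20 V -- far too large for a sampling head.
implied_amplitude = screen_span / fraction_of_ramp
assert abs(implied_amplitude - 20.0) < 1e-9
```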
Something must be wrong in my reasoning, but what?