Thanks for taking the time to help me along with this investigation. I may have failed to state my findings clearly.
Specifications for the B6201 probe call for a 100K input resistance at low frequencies. This comes from R130, a 90K series resistor in the probe head, plus R265, an 11.02K resistor in the DC-LF amplifier, which is paralleled by R220, a 100K resistor in the bias path.
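For anyone checking my arithmetic, here is a quick sketch (Python, using the resistor values above) confirming that these three resistors do add up to essentially 100K:

```python
# Expected low-frequency input resistance of the B6201 probe:
# R130 (90K series) in series with R265 (11.02K) paralleled by R220 (100K).
def parallel(a, b):
    return a * b / (a + b)

r130 = 90e3
r265 = 11.02e3
r220 = 100e3

r_in = r130 + parallel(r265, r220)
print(round(r_in))  # ~99,926 ohms, i.e. essentially 100K
```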
My source for the 1 kHz signal was the terminated output of a 50-ohm generator, so the approximate 850K a.c. input impedance of the Fluke meter should have had a minimal effect when measuring the voltage at this point.
I roughly determined the impedance of the meter by placing a resistance decade box in series with the source and meter and adjusting the decade until the meter reading fell to one-half. Of course this "resistance" could be non-linear and level-dependent; I did not check the Fluke manual.
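The idea behind that half-deflection trick, as a sketch: the series decade and the meter form a simple divider, and the reading drops to exactly half when the decade setting equals the meter's input resistance.

```python
# Half-deflection method: with series resistance r_series between source
# and meter, the meter reads V = r_meter / (r_meter + r_series) times the
# open-circuit voltage, which is 0.5 exactly when r_series = r_meter.
def divider(r_meter, r_series):
    return r_meter / (r_meter + r_series)

r_meter = 868e3                       # value I arrived at for the Fluke at 1 kHz
print(divider(r_meter, r_meter))      # 0.5 -- half reading at the null point
```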
My reasoning was as follows:
With a 900K resistor between the source and the B6201 probe, the voltage at the probe should have been 0.1 times the source voltage, a 10:1 division.
With the 910K resistor I had available, the expected division was 0.099, or 10.1:1.
What I observed was a division of 0.0446, corresponding to a combined meter-paralleled-with-probe resistance of 42.5K. Adjusting for the assumed 868K of the meter, this suggested the input impedance of the probe was 1/(1/42.5 - 1/868) = 44.7K instead of the expected 100K.
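The same back-calculation as a sketch, in case anyone wants to rerun it with different assumed meter loading:

```python
# Back out the probe input resistance from the observed division ratio.
r_series = 910e3        # series resistor actually used
division = 0.0446       # observed V_probe / V_source

# Combined (meter || probe) resistance from the divider equation
# division = r_combined / (r_combined + r_series)
r_combined = division * r_series / (1 - division)   # ~42.5K

# Remove the assumed 868K meter loading to get the probe alone
r_meter = 868e3
r_probe = 1 / (1 / r_combined - 1 / r_meter)        # ~44.7K, not 100K
print(round(r_combined), round(r_probe))
```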
But then I got to thinking about the probe's decreasing input impedance as a function of frequency and examined Figure 1.1 in the manual. At 1 MHz the probe input impedance is only about 50K, shunted by a reactance of -j500 ohms. It appears compensating capacitors C1 and C2 in the probe head might not be set correctly. But from Figure 1.1, it does not appear one should expect a flat frequency response with the 10X (voltage-divider) attenuator, because the input impedance of the main probe falls sharply with frequency.
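To put a number on how severe that loading is, here is a sketch (my own arithmetic, not from the manual) converting the -j500 ohm figure at 1 MHz into an equivalent shunt capacitance, and showing that the reactance completely dominates the parallel combination:

```python
import math

# Figure 1.1 gives roughly 50K shunted by -j500 ohms at 1 MHz.
f = 1e6
x = 500.0
r = 50e3

# Equivalent shunt capacitance implied by that reactance: X = 1/(2*pi*f*C)
c = 1 / (2 * math.pi * f * x)                       # ~318 pF

# Magnitude of 50K in parallel with -j500: the reactance dominates,
# so |Z| is barely below 500 ohms at 1 MHz.
z = (r * complex(0, -x)) / (r + complex(0, -x))
print(round(c * 1e12), round(abs(z), 1))
```

With the input impedance collapsing from 100K toward a few hundred ohms, a fixed 910K series divider clearly cannot give a flat response versus frequency.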
Is my reasoning correct here?
I thank all who have responded for their useful comments.