On the PS5010, the voltage and current ramping I referred to is the programming of the current and voltage limit DACs that takes place when the user requests an output change, either through the front-panel ON/OFF switch or through remote control. This action takes place in both ON > OFF and OFF > ON operations. If you are monitoring the DAC outputs with a scope when the output is toggled, you will see this programming action. Its purpose is to protect the output relay. Because the supply will normally go into CC mode during this transition, the software does not monitor the loop status during this time, to prevent reporting of the mode change.
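The transition described above can be sketched in a few lines. This is purely illustrative: the names, step size, and frame structure are my assumptions, not the actual PS5010 firmware (which is period assembly code), but it shows the idea of stepping the limit DACs rather than slamming them, while status reporting is suppressed.

```python
# Hedged sketch of the ramped DAC programming on output toggle.
# `step`, function names, and the suppression flag are illustrative assumptions.

def ramp_dac(current_code: int, target_code: int, step: int = 16):
    """Yield intermediate DAC codes so the output ramps instead of stepping."""
    direction = 1 if target_code > current_code else -1
    code = current_code
    while code != target_code:
        code += direction * min(step, abs(target_code - code))
        yield code

def toggle_output(dac_code: int, target_code: int) -> int:
    """Ramp in both OFF->ON and ON->OFF directions, protecting the relay."""
    suppress_mode_reporting = True   # supply normally enters CC during the ramp
    for code in ramp_dac(dac_code, target_code):
        pass                         # write_dac(code) on real hardware
    suppress_mode_reporting = False  # resume loop-status (CC/CV) reporting
    return target_code
```

The key point is simply that the DAC is walked through intermediate codes in both directions, and the CC/CV status is ignored until the ramp completes.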
If you need to draw a small amount of current externally to get regulation, then the internal minimum load is not sufficient to keep the output regulated. The cause could be in the supply itself, or possibly (??) C-E leakage around the pass transistor in the mainframe. The latter would explain different amounts of the effect in different mainframe slots.
For the PS5004 – I designed all of it. For some fun, get a schematic and quickly try to find the precision 16 bit DAC needed to support the output resolution. If you are looking for a large “brick”, you won’t find it. 16 bit DACs were available at the time, but cost about 2.5 times the proposed manufacturing cost target for the entire built and tested instrument. The DAC I designed is a gated charge pump, using simple digital counter logic to set the duty factor of the current gate clock. BTW, Sony independently came up with the same idea for the 16 bit DACs needed in CD players about the same time.
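The principle behind that gated charge pump can be shown numerically. This is a toy model under my own assumptions (an N-bit counter gating a fixed current for `code` clocks out of each 2^N-clock frame), not the PS5004 circuit itself; the point is that the filtered average is proportional to the duty factor, so counter logic alone sets the analog value.

```python
# Minimal numeric sketch of a duty-factor (gated charge pump) DAC.
# Bit width, names, and full-scale value are illustrative assumptions.

def duty_factor_dac(code: int, bits: int = 8, full_scale: float = 1.0) -> float:
    """Return the averaged (filtered) output for a given counter code.

    The counter opens the current gate for `code` clocks in each frame of
    2**bits clocks, so the average charge delivered per frame, and hence
    the filtered output, is full_scale * code / 2**bits.
    """
    frame = 1 << bits
    gated_clocks = sum(1 for t in range(frame) if t < code)  # gate open `code` of `frame` clocks
    return full_scale * gated_clocks / frame
```

Resolution then comes from counter length and clock stability rather than from a precision resistor ladder, which is why no large DAC "brick" appears on the schematic.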
I believe the part of the cal procedure you are referring to asks you to set the output V to max, then back down one count (if my memory is correct – that was designed 40 years ago). What you are doing is setting the full scale of the fine span of the DAC. Rather than take the full 16 bit resolution in one span, there are two charge pumps, scaled – I believe by 200 counts (each is 8 bits; the fine is 1/200th the coarse; they don’t map 1:1 with the digital pot coarse-fine range). Since calibration requires use of the digital knobs to set ranges, and the display is a volt meter reading the actual output (not the programmed value as in the PS5010), the only way to set known values with the knobs is at the extremes. So “0” is easy to set – turn both knobs to the left several turns and you can calibrate out the offsets in the system. Turn either or both knobs to the right several turns and you are at full scale – it is now possible to set the full-span gain of the coarse current pump. Turning only the fine control down one “click” reduces the coarse stage DAC count by 1 and sets the fine DAC to its max scale. Now the user can calibrate the full scale of the fine stage. The order might not be the same as what I wrote in the cal procedure 40 years ago, but the process is the same.
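The coarse/fine arithmetic behind that procedure can be sketched as follows. The 1/200 weighting and the three trim parameters are taken from my recollection above; everything else (names, 8-bit coarse range) is an assumption for illustration, not the actual PS5004 firmware or schematic.

```python
# Hedged sketch of the two-pump coarse/fine output arithmetic.
# `offset`, `coarse_gain`, and `fine_gain` stand in for the three cal trims.

def dac_output(coarse: int, fine: int,
               coarse_gain: float = 1.0, fine_gain: float = 1.0,
               offset: float = 0.0) -> float:
    """Output in coarse-count units: coarse plus a fine stage weighted 1/200."""
    return offset + coarse_gain * coarse + fine_gain * (fine / 200.0)

# Calibration walk-through, per the description above (illustrative):
# 1. Knobs full left:  coarse=0, fine=0 -> trim `offset` so the meter reads 0.
# 2. Knobs full right: coarse and fine at max -> trim `coarse_gain` at full scale.
# 3. One fine "click" down: coarse drops by 1, fine jumps to its max ->
#    trim `fine_gain` so the output lands just below full scale.
```

This shows why only the extremes are usable as cal points: with the display reading the actual output, those are the only knob positions whose digital codes are known without ambiguity.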