This LCR meter works a treat for "regular" resistor values, but shows unstable results for values over 10 MΩ. These high-R measurements should be done at low frequency (100 Hz) to minimise the low-pass filtering effect that stray capacitance has across such high resistances. Needless to say, these resistors are measured inside a properly earthed, all-metal enclosure, with the maximum number of samples averaged.
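As a rough sanity check (assuming on the order of 10 pF of stray capacitance across the DUT and its wiring; that figure is a guess, not something taken from the meter's specs):

$$f_c = \frac{1}{2\pi R C} = \frac{1}{2\pi \cdot 10\,\mathrm{M\Omega} \cdot 10\,\mathrm{pF}} \approx 1.6\ \mathrm{kHz}$$

so at 100 Hz a 10 MΩ resistor still looks mostly resistive, but at 100 MΩ the corner drops to roughly 160 Hz and the capacitance starts to dominate even at the lowest test frequency.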
But that is not enough. The LCR meter gives each individual sample only a very short time slot, which minimises timing/phase errors while it alternates between the I and U measurements. These samples are probably not fully synchronised with the 100 Hz sine signal (and that signal itself is certainly not synchronised with the omnipresent AC mains hum).
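To illustrate why the length of the measurement window matters here, a small numpy sketch (this is not the meter's firmware, and all numbers in it are invented for the example): a lock-in style amplitude estimate of the 100 Hz test signal is badly disturbed by 50 Hz hum when the window covers a fraction of a cycle, but the hum averages out once the window spans a whole number of both the test and the mains periods.

```python
# Sketch only: synchronous (lock-in style) detection of a 100 Hz test signal
# in the presence of 50 Hz mains hum.  Sample rate, amplitudes and window
# lengths are assumptions, not values from the actual meter.
import numpy as np

fs      = 10_000              # sample rate, Hz (assumed)
f_test  = 100.0               # test sine, Hz
f_mains = 50.0                # mains hum, Hz

t  = np.arange(0, 1.0, 1 / fs)
x  = 1.0 * np.sin(2 * np.pi * f_test * t + 0.3)    # wanted signal, amplitude 1.0
x += 0.5 * np.sin(2 * np.pi * f_mains * t + 1.1)   # mains pickup

def lockin_amplitude(samples, times):
    """Amplitude of the f_test component via multiplication with a
    reference sine/cosine and averaging (synchronous detection)."""
    i = 2 * np.mean(samples * np.sin(2 * np.pi * f_test * times))
    q = 2 * np.mean(samples * np.cos(2 * np.pi * f_test * times))
    return np.hypot(i, q)

n_short = int(0.003 * fs)   #   3 ms: not a whole number of 100 Hz or 50 Hz cycles
n_long  = int(0.100 * fs)   # 100 ms: 10 cycles of 100 Hz and 5 cycles of 50 Hz

print("3 ms window:  ", lockin_amplitude(x[:n_short], t[:n_short]))  # off, hum leaks in
print("100 ms window:", lockin_amplitude(x[:n_long],  t[:n_long]))   # ~1.0, hum averages out
```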
Compensating for this would require a much longer time slot. And wouldn't it be nice to add an extra "zero" choice to the three sine frequencies? With the sine output switched off (and the DUT still in situ), the meter could then measure the actual noise level and take it into account once the sine signal is restored. This and an extended time slot could probably be realised in a future software version.
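A rough sketch of that "zero" idea (again just numpy with invented numbers, not a proposal for the actual firmware): measure the RMS level once with the sine output off, then subtract it in power from the reading taken with the sine on, which assumes the noise is uncorrelated with the test signal.

```python
# Sketch only: baseline-noise ("zero" frequency) measurement and power
# subtraction.  Noise level, signal amplitude and sample counts are made up.
import numpy as np

rng = np.random.default_rng(0)

def rms(v):
    return np.sqrt(np.mean(v ** 2))

n = 10_000
noise       = 0.2 * rng.standard_normal(n)                  # stands in for hum + amplifier noise
test_signal = 1.0 * np.sin(2 * np.pi * np.arange(n) / 100)  # the 100 Hz measurement signal

v_zero = rms(noise)                 # "zero" measurement: sine off, DUT still connected
v_meas = rms(test_signal + noise)   # normal measurement: sine on

# Noise-corrected estimate of the signal RMS (power subtraction)
v_corrected = np.sqrt(max(v_meas ** 2 - v_zero ** 2, 0.0))

print("raw reading:    ", v_meas)        # biased high by the noise
print("noise-corrected:", v_corrected)   # close to 1/sqrt(2) ≈ 0.707
```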
Or am I off topic here? Perhaps measuring pure resistance at these values is better done with a DC source and a DC millivolt meter than with a sine-signal-based LCR meter?
