Hi guys,
I am using an LCR meter with a test voltage of 0.3 Vrms. The only available test frequency is 100 Hz, but the capacitor I want to test is specified at 120 Hz.
Although the two frequencies seem pretty close, I would like to know how much the reading would differ if the LCR meter could measure at 120 Hz.
Do you know of any formula or correction factor I can use to compute the capacitance at 120 Hz from the values I am getting at 100 Hz? (See my rough sketch below for what I mean.)
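To show where my head is at, here is a quick sketch of my own reasoning, assuming an ideal capacitor (which I know a real electrolytic is not), with a made-up nominal value just for illustration:

```python
import math

# Assumed nominal capacitance, just for illustration (not a measured value)
C = 100e-6  # 100 uF

for f in (100.0, 120.0):
    Xc = 1.0 / (2.0 * math.pi * f * C)  # reactance of an ideal capacitor
    print(f"f = {f:5.1f} Hz -> Xc = {Xc:.3f} ohm")

# For an ideal capacitor the capacitance itself does not depend on frequency,
# only the reactance does. What I don't know is how much a real capacitor's
# effective C and D shift between 100 Hz and 120 Hz.
```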
Also, if you could explain how the LCR meter measures capacitance and dissipation factor (i.e., does it measure impedance, current, or a time constant and then convert that to capacitance?), that would be greatly appreciated.
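Here is roughly what I imagine happens inside the meter, just my own guess assuming it measures a complex impedance and then fits a series Rs-Cs model (the function and the example impedance below are made up, not from any particular instrument):

```python
import math

def series_c_and_d(Z: complex, f: float):
    """My guess: treat the DUT as a series Rs + Cs and pull Cs and D
    out of the measured complex impedance Z at frequency f."""
    w = 2.0 * math.pi * f
    Rs = Z.real
    Xs = Z.imag            # negative for a capacitive load
    Cs = -1.0 / (w * Xs)   # series-equivalent capacitance
    D = Rs / abs(Xs)       # dissipation factor = ESR / |Xc|
    return Cs, D

# Made-up example impedance at 100 Hz, just to exercise the formulas
Z_meas = complex(0.5, -15.9)   # ohms
Cs, D = series_c_and_d(Z_meas, 100.0)
print(f"Cs = {Cs*1e6:.1f} uF, D = {D:.4f}")
```

Is that anywhere near how real meters do it, or do they work from the measured current and phase directly?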
Thanks!