    How does LCR meter measure capacitance?

    Hi guys,

    I am using an LCR meter with a test voltage of 0.3Vrms. The only available test frequency is 100Hz, but the capacitor I want to test is specified at 120Hz.

    Although this seems pretty close, I just want to know how big the difference would be if the LCR meter could read it at 120Hz.

    Do you guys know any formula or correction factor that I can use to compute the capacitance at 120Hz with the values I am getting at 100Hz?

    Also, if you could share how the LCR meter measures capacitance and dissipation factor (I mean, does it measure impedance, current, or a time constant and then convert to capacitance?), that would be greatly appreciated.
    Thanks!

    #2
    Re: How does LCR meter measure capacitance?

    Hi, the capacitance of a capacitor is not going to change in any important way when the test frequency changes, especially over a gap as small as 100Hz to 120Hz.
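
    As a quick sanity check, here's a little Python sketch (the 100uF value and the reactance numbers are made-up examples). For an ideal capacitor the reactance changes with frequency, but the capacitance you compute back from it with C = 1/(2*pi*f*Xc) comes out the same at 100Hz and 120Hz, so there is no correction factor to apply:

    Code:
    import math

    def capacitance_from_reactance(x_c, freq_hz):
        # Back out capacitance from capacitive reactance: C = 1 / (2*pi*f*Xc)
        return 1.0 / (2.0 * math.pi * freq_hz * x_c)

    # Hypothetical part: an ideal 100uF capacitor.
    c_true = 100e-6

    # Its reactance at each test frequency:
    xc_100 = 1.0 / (2.0 * math.pi * 100.0 * c_true)   # ~15.92 ohms
    xc_120 = 1.0 / (2.0 * math.pi * 120.0 * c_true)   # ~13.26 ohms

    # Converting each reactance back with its own frequency gives the same C:
    print(capacitance_from_reactance(xc_100, 100.0))  # ~1e-04 F (100uF)
    print(capacitance_from_reactance(xc_120, 120.0))  # ~1e-04 F (100uF)

    Real electrolytics do typically droop a little as frequency goes up, but over a 100Hz-to-120Hz step that effect is tiny compared with the part's tolerance.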

    120Hz often appears in capacitor specs because it's the ripple frequency of full-wave-rectified 60Hz US mains power. The meter's designers would have chosen 100Hz because it's best suited to their measurement technique.

    Only the manufacturer of your meter would know exactly how it works, and they are probably not all the same. With most of these kinds of things, all that matters is that the instrument is reasonably accurate. Like a cell phone, you don't need to know exactly what's inside it to use it.
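
    For the "how does it measure" part: I can't say what your particular meter does internally, but the usual idea is to drive the part with the test voltage, measure the resulting current as a phasor (magnitude and phase), form the complex impedance Z = V/I, and then convert that to capacitance and dissipation factor using a series (or parallel) equivalent model. Here's a rough Python sketch of just that last conversion step, assuming a series RC model; the 0.3Vrms drive matches your meter, but the current reading is a made-up number:

    Code:
    import cmath
    import math

    def series_rc_from_impedance(z, freq_hz):
        # Series-equivalent model: Z = Rs + j*Xs with Xs = -1/(2*pi*f*Cs).
        # Returns (Cs, D) where D = Rs/|Xs| is the dissipation factor.
        rs, xs = z.real, z.imag                  # xs is negative for a capacitive part
        cs = -1.0 / (2.0 * math.pi * freq_hz * xs)
        d = rs / abs(xs)
        return cs, d

    freq = 100.0
    v = cmath.rect(0.3, 0.0)                     # 0.3Vrms drive, phase reference
    i = cmath.rect(0.0188, math.radians(89.0))   # current leads by ~89 deg (capacitive)

    z = v / i                                    # complex impedance seen by the meter
    cs, d = series_rc_from_impedance(z, freq)
    print("Cs = %.1f uF, D = %.4f" % (cs * 1e6, d))   # roughly 99.8 uF, D ~ 0.0175

    Whether the meter gets that impedance from an auto-balancing bridge, a simple I-V measurement, or something else is up to the manufacturer, but the conversion to Cs and D at the end is just that bit of arithmetic.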

    Regards
    Bob
    It is a good shrubbery. I like the laurels particularly...
