February 16, 2016
I have a question about how camera frame rate (PAL vs. NTSC) relates to mains power frequency (50 Hz vs. 60 Hz), and more specifically whether IP cameras are affected by the power frequency at all.
I was told recently that because IP cameras are "digital", the power frequency does not affect the frame rate, but I don't fully understand how that is possible.
I think this may come down to where/how the IP camera's A/D converter gets its timing. My understanding is that an IP camera basically works like this: LENS > CCD/CMOS > ADC > DSP, where the lens focuses the image onto the sensor, the sensor's analog output is converted to digital by the ADC, and the DSP then compresses and encodes it.
The A/D converter samples the analog signal voltage and compares it to its reference voltage, but I think that reference is derived from the camera's power source frequency (50 or 60 Hz), which would still carry that frequency even after conversion to DC, right?
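To put rough numbers on what I'm picturing (these figures are just my own illustration, not from any spec sheet): mains-powered lighting pulses at twice the line frequency, so if the sensor's frame timing doesn't line up with those pulses, the brightness would drift from frame to frame, something like this:

# Purely illustrative figures, not taken from any camera's documentation.
mains_hz = 60.0                  # U.S. line frequency
light_pulse_hz = 2 * mains_hz    # lighting brightness pulses at 120 Hz
frame_rate_fps = 25.0            # a PAL-style 25 fps capture

# How many light pulses fall inside each frame interval; a non-integer
# value means each frame catches a different slice of the pulse cycle.
pulses_per_frame = light_pulse_hz / frame_rate_fps            # 4.8
# The mismatch shows up as a slow rolling flicker at the beat frequency.
beat_hz = abs(light_pulse_hz - round(pulses_per_frame) * frame_rate_fps)
print(f"{pulses_per_frame} pulses per frame -> ~{beat_hz} Hz flicker")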
I may have no idea what I am talking about, but any help is appreciated.
Thanks
I found a site that explains the issue a little. Most cameras shipped from Asia (as most are) come set to 50 Hz by default. Since U.S. power runs at 60 Hz, you can see the flickering effect under artificial lighting. Set your cameras to 60 Hz (if you are in the U.S.) to avoid the flicker. In a lot of cameras you will find this under an 'Anti Flicker' setting.
Check this link -
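As a rough way to see why the setting matters (illustrative numbers only, not from any manufacturer's documentation): the anti-flicker setting essentially keeps the exposure time at a whole multiple of the light pulse period, along these lines:

# Rough sketch with assumed numbers; real cameras pick exposures
# automatically once the anti-flicker (50/60 Hz) setting is chosen.

def flicker_period_s(mains_hz):
    """Mains-powered lighting pulses at twice the line frequency."""
    return 1.0 / (2.0 * mains_hz)

def is_flicker_safe(exposure_s, mains_hz, tol=0.02):
    """True if the exposure spans a whole number of light pulses."""
    pulses = exposure_s / flicker_period_s(mains_hz)
    return round(pulses) >= 1 and abs(pulses - round(pulses)) < tol

for denom in (50, 60, 100, 120):
    exposure = 1.0 / denom
    print(f"1/{denom} s  50 Hz safe: {is_flicker_safe(exposure, 50)}  "
          f"60 Hz safe: {is_flicker_safe(exposure, 60)}")

With those numbers, an exposure like 1/100 s lines up with 50 Hz lighting but not with 60 Hz lighting, which is the flicker you see until the camera is switched to the 60 Hz setting.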