January 7, 2015
I have four 3 MP IP bullet cameras (IPOB-EL3MPIR50), and all exhibit video smearing on the main stream when on-camera motion detection is enabled.
What I'm trying to do is use different settings for the General stream type, with very low bit and frame rates, and switch to the maximum for both when motion is detected. I haven't found any combination of CBR vs. VBR, bit rates, or other video settings that works around this, other than disabling motion detection entirely.
All cams are running the same firmware version:
2.420.0002.0.R, build : 2014-06-21
I've (hopefully) attached a small screen capture showing the problem.
Thanks,
Jim
The issue that you're having is a latency issue. Decreasing the I-frame rate (on the Encode page of the DVR's menu) should help with this issue.
You can set different bit rates and frames per second by going to the Encode page and changing the Type setting to Regular and then MD.
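For anyone who would rather script this than click through the menus: the sketch below builds setConfig requests in the style of Dahua's HTTP CGI interface (configManager.cgi), which these cameras generally expose. The exact parameter names (Encode[ch].MainFormat[idx].Video.*) and the mapping of stream index 0 to the Regular profile and 1 to the MD profile are assumptions based on Dahua's CGI convention, so check them against your camera's API guide before relying on this.

```python
# Hypothetical sketch: build Dahua-style CGI URLs to set per-stream-type
# encode parameters. Parameter names are assumptions; verify against the
# camera's HTTP API documentation.

def encode_config_url(host, channel, stream_index, bit_rate_kbps, fps):
    """Build a setConfig URL for one stream profile.

    On Dahua firmware, stream_index 0 is commonly the Regular/General
    profile and 1 the motion-detect (MD) profile (assumption).
    """
    prefix = f"Encode[{channel}].MainFormat[{stream_index}].Video"
    params = [
        f"{prefix}.BitRate={bit_rate_kbps}",
        f"{prefix}.FPS={fps}",
    ]
    return ("http://" + host + "/cgi-bin/configManager.cgi?action=setConfig&"
            + "&".join(params))

# Low-rate Regular profile and full-rate MD profile for channel 0:
regular = encode_config_url("192.168.1.108", 0, 0, bit_rate_kbps=512, fps=5)
md = encode_config_url("192.168.1.108", 0, 1, bit_rate_kbps=4096, fps=20)
```

You would then issue each URL with an HTTP GET using the camera's admin credentials (typically HTTP digest auth on these units).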
January 7, 2015
Zeke Richey said
The issue that you're having is a latency issue. Decreasing the I-frame rate (on the Encode page of the DVR's menu) should help with this issue. You can set different bit rates and frames per second by going to the Encode page and changing the Type setting to Regular and then MD.
Thanks, but I'm not sure how fiddling with the I-frame rate would fix that, when this just looks like a straight-up encoder bug.
Over the past week I've tried a large number of settings, and regardless of bit rate, I-frame frequency, VBR/CBR, etc., the larger the area of motion, the more corrupt the stream becomes. In practice this means close-in motion, such as a person approaching the camera, gets smeared for multiple seconds, to the point where I can't identify them.
Jim