Jan Ekholm
2014-05-16 20:36:26 UTC
Hi,
I'm seeing the errors below in my code, which works like a proxy: it receives a stream from a remote
camera, optionally does some processing on it, and then multicasts it onward. The multicasting is
done using a PassiveServerMediaSubSession, an H264VideoRTPSink and an H264VideoStreamFramer,
in addition to what MediaSubsession::initiate() sets up. I see a lot of:
MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (693). 1023 bytes of trailing data will be dropped!
MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (794). 959 bytes of trailing data will be dropped!
MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (490). 1329 bytes of trailing data will be dropped!
MultiFramedRTPSource::doGetNextFrame1(): The total received frame size exceeds the client's buffer size (1269). 685 bytes of trailing data will be dropped!
The final recipient does not get any data (tested with testRTSPClient).
If I change the H264VideoStreamFramer to an H264VideoStreamDiscreteFramer, none of the above
errors appear and testRTSPClient does get data, but the stream is not playable by, say, VLC.
If I have openRTSP save the stream and then stream the saved file with testH264VideoStreamer,
I get nothing streamed at all. So the data seems to be corrupt somehow.
Anyway, how are the buffers *really* set? I've set OutPacketBuffer::maxSize to all kinds of huge
values, even directly in the RTPSink code, but it has no effect. The buffer handling is a little
clumsy, if I may be honest. Plenty of email threads say to set the buffer size via some
bufferSize parameter on the sink (but this error comes from a source) or via OutPacketBuffer::maxSize,
yet neither works for me.
--
Jan Ekholm
***@d-pointer.com