Discussion:
Proper use of StreamReplicator
Jan Ekholm
2014-10-16 19:19:45 UTC
Permalink
Hi,

I've used Live555 for some prototypes for a while now, and it's working quite well so far. My use cases
are to act as a central video hub for a number of remote surveillance cameras as well as locally connected
USB cameras, and to serve H264/MJPEG streams using unicast and multicast. Those scenarios more or
less work ok. The code isn't too pretty, but Live555 is quite hard to use.

Now I need to save the streams to disk too. There is the handy StreamReplicator class that should allow
me to save the streams to disk as well as stream them to clients, but I haven't really understood how to use
it correctly. From what I've understood, I need to create one StreamReplicator for the source stream and
then call replicator->createStreamReplica() once for the streaming and once for the saving. Well, this does not work
at all, so I'm doing something wrong.

The class that handles a local USB camera and unicasts MJPEG is basically:

class LocalMJpegUnicastServerMediaSubsession : public OnDemandServerMediaSubsession {
    ...
protected:

    virtual FramedSource* createNewStreamSource(unsigned clientSessionId, unsigned& estBitrate);

    virtual RTPSink* createNewRTPSink(Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource);

private:
    CameraParameters m_cameraParameters;
    StreamReplicator* m_replicator;
    FileSink* m_saveSink;
};


FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    // create and initialize a source for the camera. This is a JPEGVideoSource subclass that captures, encodes and delivers JPEG frames.
    // It works fine as long as I do not try to use StreamReplicator.
    MJpegFramedSource* source = MJpegFramedSource::createNew( envir() );
    source->initialize( m_cameraParameters );

    m_replicator = StreamReplicator::createNew( envir(), source, False );

    return m_replicator->createStreamReplica();
}

RTPSink* LocalMJpegUnicastServerMediaSubsession::createNewRTPSink (Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
    return JPEGVideoRTPSink::createNew( envir(), rtpGroupsock );
}

When I use this ServerMediaSubsession and connect a client the call sequence I see is:

LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()

However, nothing is delivered to the network; it is as if the stream never starts. I never see a call to
MJpegFramedSource::doGetNextFrame(), which is the overridden method for, well, getting the next
frame. No errors, no crashes and no data. If I try to save the stream I do get something
saved, but I have not analyzed the file yet. The amount of data looks correct, though (a lot of
data very fast). To start saving I use code like this (simplified):

void LocalMJpegUnicastServerMediaSubsession::startSavingStream (const std::string& filename) {
    FramedSource* source = m_replicator->createStreamReplica();
    m_saveSink = FileSink::createNew( envir(), filename.c_str(), bufferSize );
    m_saveSink->startPlaying( *source, 0, 0 );
}

So I get no stream but a saved file.

If I change my createNewStreamSource() back to the below, it works fine for streaming:

FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    MJpegFramedSource* source = MJpegFramedSource::createNew( envir() );
    source->initialize( m_cameraParameters );
    return source;
}

In this case there is no replicator at all, and I cannot save the stream. Trying to later add a replicator to the
source and then use that with the FileSink leads to infinite recursion in doGetNextFrame(). But as I've understood
that I cannot use a replicator like this, that behavior is perhaps to be expected. In this case I get a stream but no saved
file.

So, how would one properly use StreamReplicator here so that I get both a stream and a saved file? Later I also need
to be able to save streams from local cameras that are multicast, as well as from remote proxied cameras.
--
Jan Ekholm
***@d-pointer.com
Ross Finlayson
2014-10-17 01:34:58 UTC
Permalink
Post by Jan Ekholm
I've used Live555 for some prototypes for a while now, and it's working quite well so far. My use cases
are to act as a central video hub for a number of remote surveillance cameras as well as locally connected
USB cameras, and to serve H264/MJPEG streams using unicast and multicast. Those scenarios more or
less work ok. The code isn't too pretty, but Live555 is quite hard to use.
Yes, it's hard to use, but that's mainly because it's used to build complex systems. It's also intended for experienced systems programmers; not for the 'faint of heart' :-) Finally, everyone should remember that Live Networks, Inc. is not a charity, and I make no money by helping people with the software 'for free' on this mailing list; it's just something that I do as a public service.
Post by Jan Ekholm
Now I need to save the streams to disk too. There is the handy StreamReplicator class that should allow
me to save the streams to disk as well as stream them to clients, but I haven't really understood how to use
it correctly. From what I've understood, I need to create one StreamReplicator for the source stream and
then call replicator->createStreamReplica() once for the streaming and once for the saving. Well, this does not work
at all, so I'm doing something wrong.
class LocalMJpegUnicastServerMediaSubsession : public OnDemandServerMediaSubsession {
First, because you're streaming from live sources (rather than prerecorded files), make sure that your "LocalMJpegUnicastServerMediaSubsession" constructor sets the "reuseFirstSource" parameter - in the "OnDemandServerMediaSubsession" constructor - to True.
Post by Jan Ekholm
FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
// create and initialize a source for the camera. This is a JPEGVideoSource subclass that captures, encodes and delivers JPEG frames
// it works fine as long as I do not try to use StreamReplicator
MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
source->initialize( m_cameraParameters );
m_replicator = StreamReplicator::createNew( envir(), source, False );
I would replace this line with:
if (m_replicator == NULL) {
    m_replicator = StreamReplicator::createNew( envir(), source, False );
    startSavingStream(yourFileName);
}
and, of course, initialize "m_replicator" to NULL in your constructor.
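Putting the suggestion together, the intended control flow can be modeled with a small self-contained sketch. CameraSource, Replicator and Subsession below are illustrative stand-ins, not the real live555 classes: the point is only that the source and its replicator are created once, on the first call, and every later call just asks the single replicator for another replica.

```cpp
// Hedged sketch of the "create once, replicate many" pattern described above.
// All class names here are stand-ins for illustration only.
struct CameraSource { int id; };

struct Replicator {
    CameraSource* fSource;
    int fNumReplicas;
    explicit Replicator(CameraSource* s) : fSource(s), fNumReplicas(0) {}
    int createStreamReplica() { return ++fNumReplicas; } // replica "handle"
};

struct Subsession {
    Replicator* m_replicator;   // initialized to NULL in the constructor
    int m_sourcesCreated;       // instrumentation for the sketch
    Subsession() : m_replicator(0), m_sourcesCreated(0) {}

    int createNewStreamSource() {
        if (m_replicator == 0) {            // first call only
            CameraSource* source = new CameraSource();
            ++m_sourcesCreated;
            m_replicator = new Replicator(source);
            // here a replica would also be wired to a FileSink for recording
        }
        return m_replicator->createStreamReplica();
    }
};
```

The guard ensures that repeated createNewStreamSource() calls (the RTSP server makes several, as discussed below) never create a second camera source or replicator.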
Post by Jan Ekholm
return m_replicator->createStreamReplica();
This is correct.
Post by Jan Ekholm
RTPSink* LocalMJpegUnicastServerMediaSubsession::createNewRTPSink (Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic, FramedSource* inputSource) {
return JPEGVideoRTPSink::createNew( envir(), rtpGroupsock );
}
This is correct.
Post by Jan Ekholm
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
FYI, at this point, you should also be seeing:
~JPEGVideoRTPSink()
~StreamReplica()
Post by Jan Ekholm
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
The reason for this is that the first "createNewStreamSource()"/"createNewRTPSink()" calls are to create 'dummy' objects that the RTSP server uses to get the stream's SDP description (for the RTSP 'DESCRIBE' command). These two objects are then deleted (thus the "FYI" above). Then, 'real' source and sink objects are created.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Jan Ekholm
2014-10-17 09:16:38 UTC
Permalink
Ok, I think I've found the issue. The culprit here is the StreamReplica class, which is
a FramedSource but not a JPEG video source, i.e. it does not return True for:

virtual Boolean isJPEGVideoSource() const;

This is what is checked in:

Boolean JPEGVideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
    return source.isJPEGVideoSource();
}

which is checked from:

Boolean MediaSink::startPlaying(MediaSource& source,
                                afterPlayingFunc* afterFunc,
                                void* afterClientData) {
    // Make sure we're not already being played:
    if (fSource != NULL) {
        envir().setResultMsg("This sink is already being played");
        return False;
    }

    // HERE
    // Make sure our source is compatible:
    if (!sourceIsCompatibleWithUs(source)) {
        envir().setResultMsg("MediaSink::startPlaying(): source is not compatible!");
        return False;
    }
    ...

To me it seems as if StreamReplica should forward all the methods below to the replicated source?

// Test for specific types of source:
virtual Boolean isFramedSource() const;
virtual Boolean isRTPSource() const;
virtual Boolean isMPEG1or2VideoStreamFramer() const;
virtual Boolean isMPEG4VideoStreamFramer() const;
virtual Boolean isH264VideoStreamFramer() const;
virtual Boolean isH265VideoStreamFramer() const;
virtual Boolean isDVVideoStreamFramer() const;
virtual Boolean isJPEGVideoSource() const;
virtual Boolean isAMRAudioSource() const;

But if I redefine those methods, it all crashes later in:

unsigned JPEGVideoRTPSink::specialHeaderSize() const {
    // Our source is known to be a JPEGVideoSource
    JPEGVideoSource* source = (JPEGVideoSource*)fSource;
    if (source == NULL) return 0; // sanity check
    ...

Here the source is a StreamReplica and not a JPEGVideoSource, so the cast will be bogus.
It would perhaps be better to use dynamic_cast<> here; then the sanity check would work. As the
StreamReplica class is internal to StreamReplicator, specialHeaderSize() cannot check for it when
casting and do special handling for that case.
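The difference between the two casts can be shown with a self-contained sketch; the three structs below are minimal stand-ins for the live555 class hierarchy, not the real types:

```cpp
// Stand-ins for the classes discussed above (names are illustrative only).
struct MediaSource { virtual ~MediaSource() {} };
struct JPEGVideoSource : MediaSource {};
struct StreamReplica : MediaSource {};

// What JPEGVideoRTPSink::specialHeaderSize() effectively does today:
// the C-style cast always "succeeds", even when the dynamic type is wrong,
// so the NULL sanity check never fires. Dereferencing the result is then
// undefined behavior.
JPEGVideoSource* unsafeCast(MediaSource* fSource) {
    return (JPEGVideoSource*)fSource;
}

// The suggested fix: dynamic_cast checks the dynamic type at runtime and
// returns NULL on a mismatch, so "if (source == NULL) return 0;" would work.
JPEGVideoSource* checkedCast(MediaSource* fSource) {
    return dynamic_cast<JPEGVideoSource*>(fSource);
}
```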

So, for MJPEG there are a few issues.
--
Jan Ekholm
***@d-pointer.com
Jan Ekholm
2014-10-17 09:16:54 UTC
Permalink
Post by Ross Finlayson
Post by Jan Ekholm
I've used Live555 for some prototypes for a while now, and it's working quite well so far. My use cases
are to act as a central video hub for a number of remote surveillance cameras as well as locally connected
USB cameras, and to serve H264/MJPEG streams using unicast and multicast. Those scenarios more or
less work ok. The code isn't too pretty, but Live555 is quite hard to use.
Yes, it's hard to use, but that's mainly because it's used to build complex systems. It's also intended for experienced systems programmers; not for the 'faint of heart' :-) Finally, everyone should remember that Live Networks, Inc. is not a charity, and I make no money by helping people with the software 'for free' on this mailing list; it's just something that I do as a public service.
Indeed. You should make it easier to support you, perhaps with a sponsorship program where you
can pay some predefined amounts as appreciation for the support and the software. Make it easy to just click
a few buttons. Invoices etc. are much more work for a customer, and that is what scared me off last spring when
I asked how I could pay for the help you'd given me. I still don't mind paying, but we in the rest of the world
are not really accustomed to US software consulting prices, so I at least cannot pay for weeks of work
(that's what I get in a whole year).
Post by Ross Finlayson
Post by Jan Ekholm
Now I need to save the streams to disk too. There is the handy StreamReplicator class that should allow
me to save the streams to disk as well as stream them to clients, but I haven't really understood how to use
it correctly. From what I've understood, I need to create one StreamReplicator for the source stream and
then call replicator->createStreamReplica() once for the streaming and once for the saving. Well, this does not work
at all, so I'm doing something wrong.
class LocalMJpegUnicastServerMediaSubsession : public OnDemandServerMediaSubsession {
First, because you're streaming from live sources (rather than prerecorded files), make sure that your "LocalMJpegUnicastServerMediaSubsession" constructor sets the "reuseFirstSource" parameter - in the "OnDemandServerMediaSubsession" constructor - to True.
It does set it to true.
Post by Ross Finlayson
Post by Jan Ekholm
FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
// create and initialize a source for the camera. This is a JPEGVideoSource subclass that captures, encodes and delivers JPEG frames
// it works fine as long as I do not try to use StreamReplicator
MJpegFramedSource *source = MJpegFramedSource::createNew( envir() );
source->initialize( m_cameraParameters );
m_replicator = StreamReplicator::createNew( envir(), source, False );
if (m_replicator == NULL) {
m_replicator = StreamReplicator::createNew( envir(), source, False );
startSavingStream(yourFileName);
}
and, of course, initialize "m_replicator" to NULL in your constructor.
Ok. I've never been really sure whether the source should be created in createNewStreamSource() along with
the replicator, or whether it is better to create both in the constructor and have createNewStreamSource() just return
replicator->createStreamReplica(). If I create the source in the constructor and reuse it, there really is no
difference; it still does not work. I do think the replicator does not work in a more complex situation like
this.
Post by Ross Finlayson
Post by Jan Ekholm
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
~JPEGVideoRTPSink()
~StreamReplica()
Yes, I see JPEGVideoRTPSink being destroyed three times, as createNewStreamSource() is called three times:
first with clientSessionID=0 and the other times with valid large ids. No idea why three times; perhaps VLC, which
acts as the client, does something interesting. Trying with testRTSPClient I only see two calls. Perhaps VLC retries
something as there is no data. The output from testRTSPClient is:

./testProgs/testRTSPClient rtsp://192.168.1.12:8554/camera0
Opening connection to 192.168.1.12, port 8554...
...remote connection opened
Sending request: DESCRIBE rtsp://192.168.1.12:8554/camera0 RTSP/1.0
CSeq: 2
User-Agent: ./testProgs/testRTSPClient (LIVE555 Streaming Media v2014.10.07)
Accept: application/sdp


Received 562 new bytes of response data.
Received a complete DESCRIBE response:
RTSP/1.0 200 OK
CSeq: 2
Date: Fri, Oct 17 2014 07:48:44 GMT
Content-Base: rtsp://192.168.1.12:8554/camera0/
Content-Type: application/sdp
Content-Length: 396

v=0
o=- 1413532111476134 1 IN IP4 192.168.1.12
s=Local unicast MJPEG camera
i=Camera
t=0 0
a=tool:LIVE555 Streaming Media v2014.10.07
a=type:broadcast
a=control:*
a=source-filter: incl IN IP4 * 192.168.1.12
a=rtcp-unicast: reflection
a=range:npt=0-
a=x-qt-text-nam:Local unicast MJPEG camera
a=x-qt-text-inf:Camera
m=video 0 RTP/AVP 26
c=IN IP4 0.0.0.0
b=AS:200
a=control:track1

[URL:"rtsp://192.168.1.12:8554/camera0/"]: Got a SDP description:
v=0
o=- 1413532111476134 1 IN IP4 192.168.1.12
s=Local unicast MJPEG camera
i=Camera
t=0 0
a=tool:LIVE555 Streaming Media v2014.10.07
a=type:broadcast
a=control:*
a=source-filter: incl IN IP4 * 192.168.1.12
a=rtcp-unicast: reflection
a=range:npt=0-
a=x-qt-text-nam:Local unicast MJPEG camera
a=x-qt-text-inf:Camera
m=video 0 RTP/AVP 26
c=IN IP4 0.0.0.0
b=AS:200
a=control:track1

[URL:"rtsp://192.168.1.12:8554/camera0/"]: Initiated the "video/JPEG" subsession (client ports 56224-56225)
Sending request: SETUP rtsp://192.168.1.12:8554/camera0/track1 RTSP/1.0
CSeq: 3
User-Agent: ./testProgs/testRTSPClient (LIVE555 Streaming Media v2014.10.07)
Transport: RTP/AVP;unicast;client_port=56224-56225


Received 214 new bytes of response data.
Received a complete SETUP response:
RTSP/1.0 200 OK
CSeq: 3
Date: Fri, Oct 17 2014 07:48:44 GMT
Transport: RTP/AVP;unicast;destination=192.168.1.12;source=192.168.1.12;client_port=56224-56225;server_port=6970-6971
Session: 476829A6;timeout=65


[URL:"rtsp://192.168.1.12:8554/camera0/"]: Set up the "video/JPEG" subsession (client ports 56224-56225)
[URL:"rtsp://192.168.1.12:8554/camera0/"]: Created a data sink for the "video/JPEG" subsession
Sending request: PLAY rtsp://192.168.1.12:8554/camera0/ RTSP/1.0
CSeq: 4
User-Agent: ./testProgs/testRTSPClient (LIVE555 Streaming Media v2014.10.07)
Session: 476829A6
Range: npt=0.000-


Received 187 new bytes of response data.
Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 4
Date: Fri, Oct 17 2014 07:48:44 GMT
Range: npt=0.000-
Session: 476829A6
RTP-Info: url=rtsp://192.168.1.12:8554/camera0/track1;seq=52977;rtptime=2830850619


[URL:"rtsp://192.168.1.12:8554/camera0/"]: Started playing session...
^C
Post by Ross Finlayson
Post by Jan Ekholm
LocalMJpegUnicastServerMediaSubsession::createNewStreamSource()
LocalMJpegUnicastServerMediaSubsession::createNewRTPSink()
The reason for this is that the first "createNewStreamSource()"/"createNewRTPSink()" calls are to create 'dummy' objects that the RTSP server uses to get the stream's SDP description (for the RTSP 'DESCRIBE' command). These two objects are then deleted (thus the "FYI" above). Then, 'real' source and sink objects are created.
Yes, that seems to be normal behavior.

So far there doesn't seem to be any error in what I'm trying to do, but the stream simply is not playing. Is there
really a startPlaying() being called for the replicated stream too?
--
Jan Ekholm
***@d-pointer.com
Ross Finlayson
2014-10-17 13:48:13 UTC
Permalink
Post by Jan Ekholm
Yes, I see JPEGVideoRTPSink being destroyed three times as createNewStreamSource() is called three times.
First with a clientSessionID=0 and the other times with valid large ids. No idea why three times, perhaps VLC which
acts as the client does something interesting. Trying with testRTSPClient I only see two calls. Perhaps VLC retries
something as there is no data.
Yes, VLC first requests regular RTP/RTCP-over-UDP streaming, but then - if it receives no data within a certain period of time - tries again, this time requesting RTP/RTCP-over-TCP streaming.
Post by Jan Ekholm
So far there doesn't seem to be any error in what I'm trying to do, but the stream simply is not playing. Is there
really a startPlaying() being called for the replicated stream too?
Yes, there should be - it’s done in the “OnDemandServerMediaSubsession” code, when the server handles the RTSP “PLAY” command.

Unfortunately, I can’t figure out why it’s not working for you. You’re going to have to work through the “StreamReplica” code, checking that "StreamReplica::doGetNextFrame()” gets called, as it should.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Jan Ekholm
2014-10-17 15:02:53 UTC
Permalink
Post by Ross Finlayson
Post by Jan Ekholm
Yes, I see JPEGVideoRTPSink being destroyed three times as createNewStreamSource() is called three times.
First with a clientSessionID=0 and the other times with valid large ids. No idea why three times, perhaps VLC which
acts as the client does something interesting. Trying with testRTSPClient I only see two calls. Perhaps VLC retries
something as there is no data.
Yes, VLC first requests regular RTP/RTCP-over-UDP streaming, but then - if it receives no data within a certain period of time - tries again, this time requesting RTP/RTCP-over-TCP streaming.
That explains it, as there definitely was no data being sent.
Post by Ross Finlayson
Post by Jan Ekholm
So far there doesn't seem to be any error in what I'm trying to do, but the stream simply is not playing. Is there
really a startPlaying() being called for the replicated stream too?
Yes, there should be - it’s done in the “OnDemandServerMediaSubsession” code, when the server handles the RTSP “PLAY” command.
Unfortunately, I can’t figure out why it’s not working for you. You’re going to have to work through the “StreamReplica” code, checking that "StreamReplica::doGetNextFrame()” gets called, as it should.
I found the culprit; did that mail not come through? I had some hassles with non-responding email servers, so
perhaps it was never sent properly. Below I've copied the message.

---------------

Ok, I think I've found the issue. The culprit here is the StreamReplica class, which is
a FramedSource but not a JPEG video source, i.e. it does not return True for:

virtual Boolean isJPEGVideoSource() const;

This is what is checked in:

Boolean JPEGVideoRTPSink::sourceIsCompatibleWithUs(MediaSource& source) {
    return source.isJPEGVideoSource();
}

which is checked from:

Boolean MediaSink::startPlaying(MediaSource& source,
                                afterPlayingFunc* afterFunc,
                                void* afterClientData) {
    // Make sure we're not already being played:
    if (fSource != NULL) {
        envir().setResultMsg("This sink is already being played");
        return False;
    }

    // Make sure our source is compatible:
    if (!sourceIsCompatibleWithUs(source)) {
        envir().setResultMsg("MediaSink::startPlaying(): source is not compatible!");
        return False;
    }
    ...

To me it seems as if StreamReplica perhaps should forward all the methods below to the replicated source?

// Test for specific types of source:
virtual Boolean isFramedSource() const;
virtual Boolean isRTPSource() const;
virtual Boolean isMPEG1or2VideoStreamFramer() const;
virtual Boolean isMPEG4VideoStreamFramer() const;
virtual Boolean isH264VideoStreamFramer() const;
virtual Boolean isH265VideoStreamFramer() const;
virtual Boolean isDVVideoStreamFramer() const;
virtual Boolean isJPEGVideoSource() const;
virtual Boolean isAMRAudioSource() const;

But if I redefine those methods, it all crashes later in:

unsigned JPEGVideoRTPSink::specialHeaderSize() const {
    // Our source is known to be a JPEGVideoSource
    JPEGVideoSource* source = (JPEGVideoSource*)fSource;
    if (source == NULL) return 0; // sanity check
    ...

Here fSource is a StreamReplica and not a JPEGVideoSource, so the cast will be bogus.
It would perhaps be better to use dynamic_cast<> here; then the sanity check would work. As the
StreamReplica class is internal to StreamReplicator, specialHeaderSize() cannot check for it when
casting and do special handling for that case.

So, for MJPEG there are a few issues.
--
Jan Ekholm
***@d-pointer.com
Ross Finlayson
2014-10-17 18:48:40 UTC
Permalink
Post by Jan Ekholm
Ok, I think I've found the issue. The culprit here is the StreamReplica class which is
virtual Boolean isJPEGVideoSource();
OK, so yes - that’s the problem.

The data that you feed to “JPEGVideoRTPSink” MUST BE a subclass of “JPEGVideoSource”. It can’t just redefine “isJPEGVideoSource()” to return True (or just do some type casting hack). The reason for this is that “JPEGVideoRTPSink” needs to know the “type”, “qFactor”, “width”, and “height” of the frames that it receives (so it can pack these values into the appropriate fields of the outgoing RTP packet).

So, you’ll need to define your own subclass of “JPEGVideoSource” - e.g., called “ReplicaJPEGVideoSource”. This must take as input another “FramedSource” object (a “StreamReplica”, in your case), and must implement the following (pure) virtual functions: “doGetNextFrame()”, “type()”, “qFactor()”, “width()”, “height()”.

Implementing “doGetNextFrame()” is easy; just call “getNextFrame()” on the input (“StreamReplica”) object.

To implement the other virtual functions (“type()”, “qFactor()”, “width()”, “height()”), you’ll need to have these four parameters added to each frame of data somehow. I.e., you’ll need to modify your original JPEG video source object - i.e., the one that you feed into the “StreamReplicator” - to add a header at the start (or at the end) that contains these four values.

These four values will presumably also be useful to the other replica - the one that you feed into a “FileSink”.
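The per-frame header scheme described above might look like the following sketch. The 4-byte layout and the function names are assumptions for illustration, not a live555 or RFC 2435 wire format; the real code would live in the camera source and in the replica-side wrapper.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical per-frame header carrying the four RTP/JPEG parameters.
struct JpegFrameParams {
    uint8_t type, qFactor, width, height;  // width/height e.g. in 8-pixel units
};

// Producer side (the original JPEG source): prepend the parameters.
std::vector<uint8_t> addParamHeader(const JpegFrameParams& p,
                                    const uint8_t* jpeg, size_t len) {
    std::vector<uint8_t> out;
    out.reserve(4 + len);
    out.push_back(p.type);
    out.push_back(p.qFactor);
    out.push_back(p.width);
    out.push_back(p.height);
    out.insert(out.end(), jpeg, jpeg + len);
    return out;
}

// Consumer side (e.g. a "ReplicaJPEGVideoSource"): strip the header,
// remember the parameters, and expose the raw JPEG payload.
size_t stripParamHeader(const std::vector<uint8_t>& frame,
                        JpegFrameParams& p, const uint8_t*& payload) {
    p.type = frame[0]; p.qFactor = frame[1];
    p.width = frame[2]; p.height = frame[3];
    payload = frame.data() + 4;
    return frame.size() - 4;
}
```

A wrapper's type()/qFactor()/width()/height() accessors would then simply return the values remembered from the most recently received frame.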

Your “ReplicaJPEGVideoSource” class should also make sure that its destructor calls “Medium::close()” on the input source (a “StreamReplica”), and should also reimplement the “doStopGettingFrames()” virtual function to call “stopGettingFrames()” on the input source. (Note the implementation of “FramedFilter”, which does the same thing. In fact, you *might* try having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”, but I’m not sure whether or not that will work. (I’m wary of multiple inheritance in C++, and haven’t used it at all in any of the LIVE555 code so far.))

Finally, you’ll need to modify your implementation of “createNewStreamSource()” to not just return a new “StreamReplica”, but instead to feed this “StreamReplica” into a new “ReplicaJPEGVideoSource” object, and then return a pointer to this new “ReplicaJPEGVideoSource” object.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Ross Finlayson
2014-10-17 22:09:45 UTC
Permalink
Post by Ross Finlayson
In fact, you *might* try having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”, but I’m not sure whether or not that will work. (I’m wary of multiple inheritance in C++, and haven’t used it at all in any of the LIVE555 code so far.))
FYI, I looked into this, and unfortunately that hack (having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”) won’t work, because of the ‘Diamond Problem’. Because both “JPEGVideoSource” and “FramedFilter” inherit (non-virtually) from “FramedSource”, you can’t multiply-inherit from “JPEGVideoSource” and “FramedFilter”, otherwise you’d end up with two copies of “FramedSource”, which would probably be bad.

So, unfortunately you’re going to have to duplicate some of the existing functionality of “FramedFilter” in your new “ReplicaJPEGVideoSource” class.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Jan Ekholm
2014-10-18 09:12:40 UTC
Permalink
In fact, you *might* try having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”, but I’m not sure whether or not that will work. (I’m wary of multiple inheritance in C++, and haven’t used it at all in any of the LIVE555 code so far.))
FYI, I looked into this, and unfortunately that hack (having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”) won’t work, because of the ‘Diamond Problem’. Because both “JPEGVideoSource” and “FramedFilter” inherit (non-virtually) from “FramedSource”, you can’t multiply-inherit from “JPEGVideoSource” and “FramedFilter”, otherwise you’d end up with two copies of “FramedSource”, which would probably be bad.
So, unfortunately you’re going to have to duplicate some of the existing functionality of “FramedFilter” in your new “ReplicaJPEGVideoSource” class.
I tend to avoid multiple inheritance too, unless one class is a totally abstract interface base class. I ran into the same
issue when replicating H264, as the StreamReplica does not satisfy isH264VideoStreamFramer(). I'll take a shot at
making a proxy class for at least H264, and perhaps JPEG too. I have not found a working way to actually get the JPEG
video replayed from disk, so I may end up having to discard the idea of saving JPEG video to disk for later streaming.

Yes, I know JPEG video is far from ideal, but in this case it's a question of a demo that would prerecord some minutes
of live camera footage and then replay said material over and over again. H264 tends to drop frames and have glitches
even when run over localhost, while JPEG video looks best.
--
Jan Ekholm
***@d-pointer.com
Jan Ekholm
2014-10-18 18:49:25 UTC
Permalink
The data that you feed to “JPEGVideoRTPSink” MUST BE a subclass of “JPEGVideoSource”. It can’t just redefine “isJPEGVideoSource()” to return True (or just do some type casting hack). The reason for this is that “JPEGVideoRTPSink” needs to know the “type”, “qFactor”, “width”, and “height” of the frames that it receives (so it can pack these values into the appropriate fields of the outgoing RTP packet).
So, you’ll need to define your own subclass of “JPEGVideoSource” - e.g., called “ReplicaJPEGVideoSource”. This must take as input another “FramedSource” object (a “StreamReplica”, in your case), and must implement the following (pure) virtual functions: “doGetNextFrame()”, “type()”, “qFactor()”, “width()”, “height()”.
Implementing “doGetNextFrame()” is easy; just call “getNextFrame()” on the input (“StreamReplica”) object.
To implement the other virtual functions (“type()”, “qFactor()”, “width()”, “height()”), you’ll need to have these four parameters added to each frame of data somehow. I.e., you’ll need to modify your original JPEG video source object - i.e., the one that you feed into the “StreamReplicator” - to add a header at the start (or at the end) that contains these four values.
These four values will presumably also be useful to the other replica - the one that you feed into a “FileSink”.
Your “ReplicaJPEGVideoSource” class should also make sure that its destructor calls “Medium::close()” on the input source (a “StreamReplica”), and should also reimplement the “doStopGettingFrames()” virtual function to call “stopGettingFrames()” on the input source. (Note the implementation of “FramedFilter”, which does the same thing. In fact, you *might* try having your “ReplicaJPEGVideoSource” class inherit from both “JPEGVideoSource” and “FramedFilter”, but I’m not sure whether or not that will work. (I’m wary of multiple inheritance in C++, and haven’t used it at all in any of the LIVE555 code so far.))
Finally, you’ll need to modify your implementation of “createNewStreamSource()” to not just return a new “StreamReplica”, but instead to feed this “StreamReplica” into a new “ReplicaJPEGVideoSource” object, and then return a pointer to this new “ReplicaJPEGVideoSource” object.
That approach could work, but there are some obstacles along the way. First, let me paste the class I use; see the discussion after the class:

class ReplicaJPEGVideoSource : public JPEGVideoSource {

public:

    ReplicaJPEGVideoSource (FramedSource* replica, MJpegFramedSource* source, UsageEnvironment& env)
        : JPEGVideoSource(env), m_replica(replica), m_source(source) {
    }

    virtual ~ReplicaJPEGVideoSource () {
        Medium::close( m_replica );
    }

    virtual u_int8_t type () {
        return m_source->type();
    }

    virtual u_int8_t qFactor () {
        return m_source->qFactor();
    }

    virtual u_int8_t width () {
        return m_source->width();
    }

    virtual u_int8_t height () {
        return m_source->height();
    }

    virtual void doGetNextFrame () {
        //m_source->doGetNextFrame();

        //m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );
    }

    virtual void doStopGettingFrames () {
        // TODO: is this needed?
        JPEGVideoSource::doStopGettingFrames();

        // TODO: should this be the source or the replica?
        m_source->stopGettingFrames();
    }

protected:

    FramedSource* m_replica;
    MJpegFramedSource* m_source;
};


The problems here arise in doGetNextFrame(). You said to just call getNextFrame() on the replica, i.e. a FramedSource created by
this code:

FramedSource* LocalMJpegUnicastServerMediaSubsession::createNewStreamSource (unsigned clientSessionID, unsigned& estBitRate) {
    // create and initialize a source for the camera
    MJpegFramedSource* source = MJpegFramedSource::createNew( envir() );
    if ( ! source->initialize( m_cameraParameters ) ) {
        return 0;
    }

    if ( m_replicator == 0 ) {
        m_replicator = StreamReplicator::createNew( envir(), source, False );
    }

    return new ReplicaJPEGVideoSource( m_replicator->createStreamReplica(), source, envir() );
}

The replica is wrapped in the above class, as per the instructions. However, doGetNextFrame() cannot simply call
getNextFrame(), as that requires a set of parameters that are not accessible:

m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );

fAfterGettingClientData and fOnCloseClientData are private and not accessible to my wrapper. I also cannot override getNextFrame()
in my wrapper and save that data, as the method is not virtual. If I instead assume you made a typo and meant:

m_replica->doGetNextFrame();

then this crashes when my MJpegFramedSource delivers the frame by copying raw data to fTo, as fTo has not been set; it seems to be NULL
or some random value. So apparently this extra proxying layer makes FramedSource::getNextFrame() get called for the wrong FramedSource,
and the data is saved in the wrong instance.

As a test I did get the Live555 code to work without any extra proxying layer, but the bad typecasts in the original code make it quite
ugly. This was doable by lifting out StreamReplica, having it implement the suitable isXYZ() methods, and going through all the places with
hardcoded C-style casts, replacing them with dynamic_cast<> plus a check for a StreamReplica. However, this does not work for H264 streams, as
isFramedSource() is a private member of FramedSource. Why has it been inherited as private?

Anyway, currently stream replicating does not work at all and it does not seem to be easy to do. The StreamReplicator class works in
trivial examples but breaks in real world code.
--
Jan Ekholm
***@d-pointer.com
Ross Finlayson
2014-10-18 22:15:44 UTC
Permalink
Post by Jan Ekholm
The replica is wrapped in the above class, as per instructions. However, doGetNextFrame() can not simply call
m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );
The fAfterGettingClientData and fOnCloseClientData are private and not accessible to my wrapper. I can also not override getNextFrame()
m_replica->doGetNextFrame();
No, I meant “getNextFrame()” - i.e., the regular call that an object makes to get data from an upstream “FramedSource”. You need to provide your own ‘after getting’ and ‘on close’ functions and data. There are numerous examples of this in the code.
Post by Jan Ekholm
Anyway, currently stream replicating does not work at all and it does not seem to be easy to do. The StreamReplicator class works in
trivial examples but breaks in real world code.
No, it works just fine. It just needs to be used properly.

In any case, I’ve pretty much used up all the free help I can give you on your project right now.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Jan Ekholm
2014-10-19 09:12:36 UTC
Permalink
Post by Jan Ekholm
The replica is wrapped in the above class, as per instructions. However, doGetNextFrame() can not simply call
m_replica->getNextFrame( fTo, fMaxSize, fAfterGettingFunc, fAfterGettingClientData, fOnCloseFunc, fOnCloseClientData );
The fAfterGettingClientData and fOnCloseClientData are private and not accessible to my wrapper. I can also not override getNextFrame()
m_replica->doGetNextFrame();
No, I meant “getNextFrame()” - i.e., the regular call that an object makes to get data from an upstream “FramedSource”. You need to provide your own ‘after getting’ and ‘on close’ functions and data. There are numerous examples of this in the code.
Yes, there are a lot of uses in the code, but none of them work as a proxy in front of another proxy. I have absolutely no idea what my class is
supposed to do in this case. getNextFrame() is a central method, used in many different ways across the entire library.
Post by Jan Ekholm
Anyway, currently stream replicating does not work at all and it does not seem to be easy to do. The StreamReplicator class works in
trivial examples but breaks in real world code.
No, it works just fine. It just needs to be used properly.
Well, it's escalated quickly from "just use replicator->createStreamReplica()" to creating new components with intricate
knowledge of the internal workings of Live555.
In any case, I’ve pretty much used up all the free help I can give you on your project right now.
Understood. I thank you for all your help.
--
Jan Ekholm
***@d-pointer.com
Ross Finlayson
2014-10-19 19:09:50 UTC
Permalink
Post by Jan Ekholm
Post by Ross Finlayson
Post by Jan Ekholm
Anyway, currently stream replicating does not work at all and it does not seem to be easy to do. The StreamReplicator class works in
trivial examples but breaks in real world code.
No, it works just fine. It just needs to be used properly.
Well, it's escalated quickly from "just use replicator->createStreamReplica()" to creating new components with intricate
knowledge of the internal workings of Live555.
Unfortunately the problem in this case was that although “StreamReplicator” works well for replicating ‘frames’ (opaque chunks of data), your application (as we eventually discovered) is a bit more complex, because you need to replicate not just ‘frames’, but ‘frames-with-extra-parameters’. (The ‘extra parameters’ are needed to construct a “JPEGVideoSource” or a “H264VideoStreamFramer” object, both of which you’ll need in order to transmit MJPEG or H.264 video (respectively), because of the special nature of those video formats.) That’s what makes things more complex in your case.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
