GStreamer Element Dependencies: How to connect playbin to hlssink?
I'm new to GStreamer and I've been trying to build some simple pipelines using gst-launch-1.0. I'm having difficulty figuring out which elements need to go together. For example, I want to stream a webm video to an HTML5 video tag inside a browser. To open and read the file I think I need to use playbin:



gst-launch-1.0 playbin uri=file:///home/ubuntu/g-streamer-lively/skyrim.webm ...


Then to make the file available I use hlssink:



! hlssink max-files=5  playlist-root=http://10.12.9.3/ location=/var/www/html/hlssink.webm


Then in my browser I use the video tag:



<video src="http://10.12.9.3/hlssink.webm" controls>


So this pipeline won't start and I get the following error:



WARNING: erroneous pipeline: could not link playbin0 to hlssink0


So it tells me I can't link these two elements. Fine, but where do I look to find what other elements I need to include in my pipeline to make this work?



Thanks in advance.
Tags: video, video-streaming, html5, gstreamer
asked Aug 3 '14 at 1:25 by Przemek Lach
1 Answer
          You should stop and read the GStreamer documentation a bit to understand how it works. GStreamer is a framework for building data-processing graphs. You can find the manual here: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/index.html



          Playbin is a special element (a bin) that assembles an internal pipeline for you. It is designed to create a complete playback pipeline, so it does not expose any pads for linking; that's why you can't link it to hlssink.



          You can use other elements to build a pipeline that transcodes your webm input for hlssink. For example, uridecodebin is another bin that automatically creates a decoding pipeline for whatever media you give it (provided you have the right plugins installed). Something like:



          gst-launch-1.0 uridecodebin uri=<youruri> name=decbin ! queue ! videoconvert ! x264enc ! mpegtsmux name=muxer ! hlssink decbin. ! queue ! audioconvert ! faac ! muxer.


          I haven't tested it, but something like that should work to convert your input to MPEG-TS using H.264 and AAC, for example, and pass the data to hlssink. (The name= properties let a gst-launch line branch: writing decbin. starts a new chain from another pad of uridecodebin, and ending a chain with ! muxer. links it into the element named muxer.)



          To check what elements you have on your system, use the gst-inspect-1.0 tool. With no argument it lists all elements; given an element name, it prints the details of that element, like:



          gst-inspect-1.0 hlssink


          EDIT: fixing the pipeline you proposed in the comments:



          gst-launch-1.0 uridecodebin uri=file:///home/ubuntu/g-streamer-lively/skyrim.webm name=decbin ! queue ! videoconvert ! x264enc ! mpegtsmux name=muxer ! hlssink max-files=5 playlist-root=10.12.9.3 location=/var/www/html/hlssink decbin. ! queue ! audioconvert ! avenc_aac ! muxer.
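          Note that hlssink does not produce a single playable file: it writes a series of short MPEG-TS segment files plus an .m3u8 playlist that references them. As an illustration only (segment names and durations depend on hlssink's properties), a generated playlist looks roughly like:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:15
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:15.0,
segment00000.ts
#EXTINF:15.0,
segment00001.ts
#EXT-X-ENDLIST
```

          The browser's <video> tag should then point at the playlist URL (something like http://10.12.9.3/playlist.m3u8), not at a .webm file; Safari plays HLS natively, while most other browsers need a JavaScript helper such as hls.js.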
          • So I have read through the documentation before but I still have a hard time figuring out the order of elements. Maybe it's because I don't have a background in video/audio editing so I don't really understand the process. If I just want to stream a .webm file why do I have to do things like videoconvert, x264enc etc. I can serve the .webm file directly to my browser via apache, why do I need to use these extra gstreamer elements to simply move the file through a simple pipeline?
            – Przemek Lach
            Aug 3 '14 at 20:04










          • I tried to use your pipeline and I had to change a couple of things: gst-launch-1.0 uridecodebin uri=file:///home/ubuntu/g-streamer-lively/skyrim.webm name=decbin ! queue ! videoconvert ! x264enc ! mpegtsmux name=muxer ! hlssink decbin. ! queue ! audioconvert ! avenc_aac ! muxer. ! hlssink max-files=5 playlist-root=10.12.9.3 location=/var/www/html/hlssink.webm. Now when I try to start I get error: WARNING: erroneous pipeline: link without source element.
            – Przemek Lach
            Aug 3 '14 at 20:05












          • You're not "moving" a file through the pipeline; it gets processed at each node. When you use uridecodebin it demuxes and decodes your input file; assuming you have both video and audio, you get 2 outputs from it. Each one is re-encoded in a new format and muxed into MPEG-TS, which is what is usually served over the HLS protocol that you wanted to use.
            – thiagoss
            Aug 4 '14 at 0:03










          • Your pipeline is wrong: it creates two hlssink instances, and from the parameters you are selecting I'm not sure you understand what HLS is and how it works. HLS creates a playlist of small files to be downloaded and played sequentially.
            – thiagoss
            Aug 4 '14 at 0:06










          • Hi thanks for the edit. I tried your new pipeline and I get the following error: ERROR: from element /GstPipeline:pipeline0/avenc_aac:avenc_aac0: Codec is experimental, but settings don't allow encoders to produce output of experimental quality. I tried to figure out where I can set 'experimental quality' but wasn't able to find it. Am I missing a flag or something?
            – Przemek Lach
            Aug 10 '14 at 19:06
          answered Aug 3 '14 at 17:10 by thiagoss (edited Aug 4 '14 at 0:08)