How do i append two video files data to a source buffer using media source api?

I have two videos named v11.webm and v12.webm.

What I want is for these two videos to play back to back seamlessly, without any gap.

I am following the Media Source API approach of appending data to a source buffer.

I am referring to the demo given at this link.

I have modified that example, removed the part that chunks the video, and instead tried to append the data to the source buffer file by file.

My code is as follows:

<script>

var video = document.querySelector('video');

window.MediaSource = window.MediaSource || window.WebKitMediaSource;
if (!!!window.MediaSource) {
  alert('MediaSource API is not available');
}

var mediaSource = new MediaSource();

video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('webkitsourceopen', function(e) {

    var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');  

    for (var i = 1; i <= 2; i++)
    {
        (function(i){

          GET('v1'+i+'.webm', function(uInt8Array) {
              var file = new Blob([uInt8Array], {type: 'video/webm'});

              var reader = new FileReader();
              reader.onload = function(e) {
                sourceBuffer.append(new Uint8Array(e.target.result));            
              };
              reader.readAsArrayBuffer(file);

          });
        })(i);
    }

}, false);

mediaSource.addEventListener('webkitsourceended', function(e) {
  console.log('mediaSource readyState: ' + this.readyState);
}, false);

function GET(url, callback) {
 // alert(url);
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.responseType = 'arraybuffer';
  xhr.send();

  xhr.onload = function(e) {
    if (xhr.status != 200) {
      alert("Unexpected status code " + xhr.status + " for " + url);
      return false;
    }
    callback(new Uint8Array(xhr.response));
  };
}
</script>

Right now the code is not working as desired.

There is inconsistent mixing of the v11.webm and v12.webm data.

It is not running seamlessly.

Perhaps a bit late, but I was able to figure this out. Your new video is overwriting the old one, because they both begin at time 0. You have to specify that your new video begins at time X before appending it, so your 'webkitsourceopen' event function should be:

/* forget the sourceBuffer variable, we'll just manipulate mediaSource */
mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');

/* it seems OK to set the initial duration to 0 */
var duration = 0;
var totalVideos = 2;

/* use this type of loop to ensure that a single video
   is downloaded and appended before moving on to the next video;
   MediaSource seems picky about these being in order */
var i = 0;
(function readChunk_(i) {

    /* once every file has been appended, close the stream */
    if (i == totalVideos) {
        mediaSource.endOfStream();
        return;
    }

    /* the GET function already returns a Uint8Array.
       the demo you linked reads it with a FileReader in order to manipulate it;
       you just want to append it immediately */
    GET('v1' + (i + 1) + '.webm', function(uint8Array) {

        /* assuming your videos are put together correctly
           (i.e. duration is correct), set the timestamp offset
           to the length of the total video appended so far */
        mediaSource.sourceBuffers[0].timestampOffset = duration;

        mediaSource.sourceBuffers[0].append(uint8Array);

        /* set the new total length */
        duration = mediaSource.duration;

        readChunk_(++i);
    });
})(i);

Now if only MediaSource wasn't so frustratingly picky about the structure of the videos it accepts. I have yet to find a single sample .webm that works besides the same one used in Eric Bidelman's Demo you linked.
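
One quick sanity check, before blaming the file itself, is whether the browser even accepts the exact MIME type and codec string (a minimal sketch, not part of the original demo):

if (!window.MediaSource || !MediaSource.isTypeSupported('video/webm; codecs="vorbis,vp8"')) {
  console.warn('video/webm; codecs="vorbis,vp8" is not supported in this browser');
}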

EDIT: After doing more testing, the way I set duration may not be correct. If you seem to get exponential duration growth after each append, try setting the timestampOffset to 0 and not changing it. I have no idea why that seems to fix it, and it may be a problem with how I'm generating the webm files.
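
For reference, that variant only changes the offset handling inside the GET callback above; a sketch under the same assumptions about the files:

/* keep timestampOffset pinned at 0 instead of advancing it to the running duration */
mediaSource.sourceBuffers[0].timestampOffset = 0;
mediaSource.sourceBuffers[0].append(uint8Array);
readChunk_(++i);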

What I'm missing in your code is: mediaSource.endOfStream();
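
A minimal sketch of where that call could go in your reader.onload handler (assuming two files, as in the question, and the prefixed append() you are already using):

reader.onload = function(e) {
  sourceBuffer.append(new Uint8Array(e.target.result));
  // after the last file has gone into the buffer, close the stream
  if (i == 2) {
    mediaSource.endOfStream();
  }
};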

Can you elaborate on the inconsistent mixing issue?

I found a perfect example that solves this problem in a simple and short way.

I am using three static files, but you can also append data coming from sockets or any API.

<!DOCTYPE html>
<html>

<head>
</head>

<body>
  <br>
  <video controls="true" autoplay="true"></video>

  <script>
    (async() => {


      const mediaSource = new MediaSource();

      const video = document.querySelector("video");

      // video.oncanplay = e => video.play();

      const urls = ["https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4", "https://raw.githubusercontent.com/w3c/web-platform-tests/master/media-source/mp4/test.mp4","https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4"];

      const request = url => fetch(url).then(response => response.arrayBuffer());

      // `urls.reverse()` stops at `.currentTime` : `9`
      const files = await Promise.all(urls.map(request));

      /*
       `.webm` files
       Uncaught DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
       Uncaught DOMException: Failed to set the 'timestampOffset' property on 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
      */
      // const mimeCodec = "video/webm; codecs=opus";
      // https://stackoverflow.com/questions/14108536/how-do-i-append-two-video-files-data-to-a-source-buffer-using-media-source-api/
      const mimeCodec = "video/mp4; codecs=avc1.42E01E, mp4a.40.2";


      const media = await Promise.all(files.map(file => {
        return new Promise(resolve => {
          let media = document.createElement("video");
          let blobURL = URL.createObjectURL(new Blob([file]));
          media.onloadedmetadata = async e => {
            resolve({
              mediaDuration: media.duration,
              mediaBuffer: file
            })
          }
          media.src = blobURL;
        })
      }));

      console.log(media);

      mediaSource.addEventListener("sourceopen", sourceOpen);

      video.src = URL.createObjectURL(mediaSource);

      async function sourceOpen(event) {

        if (MediaSource.isTypeSupported(mimeCodec)) {
          const sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);

          for (let chunk of media) {
            await new Promise(resolve => {
              sourceBuffer.appendBuffer(chunk.mediaBuffer);
              sourceBuffer.onupdateend = e => {
                sourceBuffer.onupdateend = null;
                sourceBuffer.timestampOffset += chunk.mediaDuration;
                console.log(mediaSource.duration);
                resolve()
              }
            })

          }

          mediaSource.endOfStream();

        }  
        else {
          console.warn(mimeCodec + " not supported");
        }
      };

    })()
  </script>


</body>

</html>

Just set the sourceBuffer's mode to 'sequence' (default seems to be 'segments')

From the doc: https://developer.mozilla.org/en-US/docs/Web/API/SourceBuffer/mode

sequence: The order in which the segments are appended to the SourceBuffer determines the order in which they are played. Segment timestamps are generated automatically for the segments that observe this order.

In my app, I just set it after adding the source buffer to the media source:

// Create media source for the stream, and create the source buffer when ready
let self = this;
this._mediaSource = new MediaSource();
this._mediaSource.addEventListener('sourceopen', function () {
  self._sourceBuffer = self._mediaSource.addSourceBuffer(environment.recordingMimeType);
  self._sourceBuffer.mode = 'sequence'; // This is the relevant part
  self._sourceBuffer.addEventListener('error', function (ev) {
    console.error("Source buffer error ??");
    console.error(ev);
  });
});
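
With 'sequence' mode set, the segments can then simply be appended in the order they should play, without ever touching timestampOffset. A minimal sketch of such an append queue (assuming the segments arrive as ArrayBuffers in an array called chunks, a name used here only for illustration):

async function appendInOrder(sourceBuffer, chunks) {
  for (const chunk of chunks) {
    // wait for the previous append to finish before queuing the next one
    await new Promise(resolve => {
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.appendBuffer(chunk);
    });
  }
}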

Comments
  • Were you ever able to figure this one out? The MediaSource Spec says that such a thing is possible using Timestamp Offsets ( dvcs.w3.org/hg/html-media/raw-file/6d127e69c9f8/media-source/… ), but I haven't been able to find exactly how to set such an offset.
  • If you use ffprobe to check out the video you can see why the video is being denied, probably a codec not matching.
  • Thanks nir for replying. I am unsure about where I should check for mediaSource.endOfStream(). Inconsistent mixing means that the sound from video12.webm comes for a few seconds, then the sound from video11.webm starts, and more importantly the video gets stuck. Ideally video11.webm should play first, then video12.webm.
  • In the demo you've provided they call endOfStream(). I think they refer to the same problem in the function readChunk_(i); take a look at their comments.
  • I had a look at their code and added a similar condition inside the reader.onload function: if (i == 2) { mediaSource.endOfStream(); } else { if (video.paused) { video.play(); } } But now only the second video, v12.webm, gets played; v11.webm is skipped.