VideoCompositor
Static Member Summary

| Static Public Members | |
| --- | --- |
| public static get Effects: * | |
| public static get FragmentShaders: * | |
| public static get VertexShaders: * | |
Static Method Summary

| Static Public Methods | |
| --- | --- |
| public static calculatePlaylistDuration(playlist: Object): number | Calculate the duration of the passed playlist. |
| public static calculateTrackDuration(track: Array): number | Calculate the duration of the passed playlist track. |
| public static renderPlaylist(playlist: Object, canvas: Canvas, currentTime: number) | Render a graphical representation of a playlist to a canvas. |
| public static validatePlaylist(playlist: Object) | Validate that the playlist is correct and playable. |
Constructor Summary

| Public Constructor | |
| --- | --- |
| public constructor(canvas: Canvas, audioCtx: AudioContext) | Instantiate the VideoCompositor using the passed canvas to render to. |
Member Summary

| Public Members | |
| --- | --- |
| public get currentTime: number | Get how far through the playlist has been played. |
| public set currentTime(time: number) | Sets the current time through the playlist. |
| public set playbackRate: * | Sets the playback rate of the video compositor. |
| public get playbackRate: * | Gets the playback rate. |
| public set playlist(playlist: Object) | Set the playlist object to be played. |
| public get playlist: * | Get the playlist object. |
| public get postPlayTime: * | |
| public set postPlayTime: * | Sets how long MediaSources will exist for after they have been played. |
| public set preloadTime: * | Sets how far in the future to look for preloading MediaSources. |
| public get preloadTime: * | |
Method Summary

| Public Methods | |
| --- | --- |
| public addEventListener(type: String, func: Function) | This adds event listeners to the video compositor. |
| public getAudioContext(): AudioContext | Returns the audio context that was either passed into the constructor or created internally. |
| public getAudioNodeForTrack(track: Array): GainNode | Gets an audio bus for the given playlist track. |
| public pause() | Pause playback of the playlist. |
| public play() | Play the playlist. |
| public preload() | Starts the underlying video/image elements pre-loading. |
| public registerMediaSourceListener(mediaSourceID: String, mediaSourceListener: Object) | This method allows you to create listeners for events on a specific MediaSource. |
| public removeEventListener(type: String, func: Function): boolean | This removes event listeners from the video compositor that were added using addEventListener. |
| public unregisterMediaSourceListener(mediaSourceID: String, mediaSourceListener: Object): boolean | This method allows you to remove a listener from a specific MediaSource. |
Static Public Members
public static get Effects: * source
public static get FragmentShaders: * source
public static get VertexShaders: * source
Static Public Methods
public static calculatePlaylistDuration(playlist: Object): number source
Calculate the duration of the passed playlist.
Will return the time that the last media source in the longest track stops playing.
Params:
| Name | Type | Attribute | Description |
| playlist | Object | | This is a playlist object. |
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}],
[{type:"video", start:6, duration:4, src:"video3.mp4", id:"video3"}]
]
}
var playlistDuration = VideoCompositor.calculatePlaylistDuration(playlist);
//playlistDuration === 10
public static calculateTrackDuration(track: Array): number source
Calculate the duration of the passed playlist track.
Will return the time that the last media source in the track stops playing.
Params:
| Name | Type | Attribute | Description |
| track | Array | | A track, i.e. an array of MediaSourceReferences (typically a track from a playlist object). |
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}],
[{type:"video", start:6, duration:4, src:"video3.mp4", id:"video3"}]
]
}
var track0Duration = VideoCompositor.calculateTrackDuration(playlist.tracks[0]);
var track1Duration = VideoCompositor.calculateTrackDuration(playlist.tracks[1]);
//track0Duration === 8
//track1Duration === 10
TODO:
- Because MediaSourceReferences are stored in order, this could be implemented far more efficiently.
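The computation itself can be sketched in plain JavaScript: a track's duration is the largest end time (start + duration) among its MediaSourceReferences. This is an illustrative sketch, not the library's actual implementation.

```javascript
// Sketch: compute a track's duration as the largest end time
// (start + duration) of its MediaSourceReferences.
function trackDurationSketch(track) {
    return track.reduce(function (max, msRef) {
        var end = msRef.start + msRef.duration;
        return end > max ? end : max;
    }, 0);
}

var track = [
    {type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},
    {type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}
];
console.log(trackDurationSketch(track)); // 8
```

As the TODO above notes, if MediaSourceReferences are guaranteed to be in order, inspecting only the last element of the track would give the same answer without the scan.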
public static renderPlaylist(playlist: Object, canvas: Canvas, currentTime: number) source
Render a graphical representation of a playlist to a canvas.
This function is useful for rendering a graphical display of a playlist to check MediaSourceReferences are aligned on tracks as you'd expect. It can also be called in an update loop with the currentTime of a VideoCompositor instance passed in to create a live timeline viewer.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:2, duration:4, src:"video2.mp4", id:"video2"}],
]
}
var visualisationCanvas = document.getElementById("vis-canvas");
VideoCompositor.renderPlaylist(playlist, visualisationCanvas, 0);
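The live timeline viewer mentioned above can be built by wrapping renderPlaylist in a requestAnimationFrame loop that re-draws with the compositor's currentTime every frame. A minimal sketch (the function name and parameters here are illustrative, not part of the API):

```javascript
// Sketch: redraw the playlist visualisation every frame so the
// playhead tracks the compositor's currentTime.
function startTimelineViewer(videoCompositor, playlist, visualisationCanvas) {
    function update() {
        VideoCompositor.renderPlaylist(playlist, visualisationCanvas, videoCompositor.currentTime);
        requestAnimationFrame(update);
    }
    requestAnimationFrame(update);
}
```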
public static validatePlaylist(playlist: Object) source
Validate that the playlist is correct and playable.
This static function will analyze a playlist and check for common errors. On encountering an error it will throw an exception. The errors it currently checks for are:
Error 1. MediaSourceReferences have a unique ID.
Error 2. The playlist media sources have all the expected properties.
Error 3. MediaSourceReferences in a single track are sequential.
Error 4. MediaSourceReferences in a single track don't overlap.
Params:
| Name | Type | Attribute | Description |
| playlist | Object | | This is a playlist object. |
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:2, duration:4, src:"video2.mp4", id:"video2"}],
]
}
VideoCompositor.validatePlaylist(playlist);
//Will throw error 4 because MediaSourceReferences video1 and video2 overlap by 2 seconds.
TODO:
- Better coverage of possible errors in a playlist.
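The overlap check (error 4) amounts to comparing each MediaSourceReference's end time with the start of its successor on the same track, assuming references are already sequential (error 3). A rough sketch of that check, not the library's actual implementation:

```javascript
// Sketch: throw if any two consecutive MediaSourceReferences on a
// track overlap (previous start + duration exceeds the next start).
function checkTrackOverlaps(playlist) {
    playlist.tracks.forEach(function (track) {
        for (var i = 1; i < track.length; i++) {
            var prevEnd = track[i - 1].start + track[i - 1].duration;
            if (prevEnd > track[i].start) {
                throw new Error("MediaSourceReferences " + track[i - 1].id +
                    " and " + track[i].id + " overlap");
            }
        }
    });
}
```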
Public Constructors
public constructor(canvas: Canvas, audioCtx: AudioContext) source
Instantiate the VideoCompositor using the passed canvas to render to.
You can also pass an AudioContext for use when calling getAudioNodeForTrack. If one is not provided a context will be created internally and be accessible via the getAudioContext function.
Params:
| Name | Type | Attribute | Description |
| canvas | Canvas | | The canvas element to render to. |
| audioCtx | AudioContext | optional | The AudioContext to create AudioNodes with. |
Example:
var canvas = document.getElementById('canvas');
var audioCtx = new AudioContext();
var videoCompositor = new VideoCompositor(canvas, audioCtx);
Public Members
public get currentTime: number: * source
Get how far through the playlist has been played.
Getting this value will give the current playhead position. Can be used for updating timelines.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
var time = videoCompositor.currentTime;
//time === 0
public set currentTime(time: number): * source
Sets the current time through the playlist.
Setting this is how you seek through the content. Should be frame accurate, but probably slow.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.currentTime = 3; //Skip three seconds in.
videoCompositor.play();
public set playbackRate: * source
Sets the playback rate of the video compositor. Must be greater than 0.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.playbackRate = 2.0; //Play at double speed
videoCompositor.play();
public get playbackRate: * source
Gets the playback rate.
Example:
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
console.log(videoCompositor.playbackRate); // will print 1.0
public set playlist(playlist: Object): * source
Set the playlist object to be played.
Playlist objects describe a sequence of media sources to be played along with effects to be applied to them. They can be modified as they are being played to create dynamic or user customizable content.
At the top level a playlist consists of tracks and effects. A track is an array of MediaSourceReferences. A MediaSourceReference is an object which describes a piece of media to be played; the three fundamental MediaSourceReference types are "video", "image", and "canvas". Internally MediaSourceReferences are used to create MediaSources, which are objects that contain the underlying HTML element as well as handling loading and rendering of that element to the output canvas.
The order in which simultaneous individual tracks get rendered is determined by their index in the overall tracks list. A track at index 0 will be rendered after a track at index 1.
Effects are objects consisting of GLSL vertex and fragment shaders, and a list of MediaSource ID's for them to be applied to. Effects get applied independently to any MediaSources in their input list.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video.mp4", id:"video"}]
]
}
var canvas = document.getElementById("canvas");
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.play();
A playlist where two videos play back-to-back on a single track:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video.mp4", id:"video"}, {type:"video", start:4, duration:4, src:"video1.mp4", id:"video1"}]
]
}
A playlist where playback begins part-way into the source video, via sourceStart:
var playlist = {
tracks:[
[{type:"video", start:0, sourceStart:10, duration:4, src:"video.mp4", id:"video"}]
]
}
A two-track composite where the built-in green-screen effect is applied to the top video track:
var playlist = {
tracks:[
[{type:"video", start:0, duration:10, src:"video.mp4", id:"gs-video"}],
[{type:"image", start:0, duration:10, src:"background.png", id:"background"}]
],
effects:{
"green-screen":{ //A unique ID for this effect.
"inputs":["gs-video"], //The id of the video to apply the effect to.
"effect": VideoCompositor.Effects.GREENSCREEN //Use the built-in greenscreen effect shader.
}
}
}
A cross-fade between two overlapping tracks using the built-in fade-out and fade-in effects:
var playlist = {
tracks:[
[{type:"video", start:0, duration:10, src:"video1.mp4", id:"video1"}],
[ {type:"video", start:8, duration:10, src:"video2.mp4", id:"video2"}]
],
effects:{
"fade-out":{ //A unique ID for this effect.
"inputs":["video1"], //The id of the video to apply the effect to.
"effect": VideoCompositor.Effects.FADEOUT2SEC //Use the built-in fade-out effect shader.
},
"fade-in":{ //A unique ID for this effect.
"inputs":["video2"], //The id of the video to apply the effect to.
"effect": VideoCompositor.Effects.FADEIN2SEC //Use the built-in fade-in effect shader.
}
}
}
public get postPlayTime: * source
Gets how long MediaSources will exist for after they have been played.
public get preloadTime: * source
Gets how far in the future to look for preloading MediaSources.
Public Methods
public addEventListener(type: String, func: Function) source
This adds event listeners to the video compositor. Events directed at the underlying canvas are transparently passed through, while events that target a video-like element are handled within the VideoCompositor. Currently the VideoCompositor only handles a limited number of video-like events ("play", "pause", "ended").
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.addEventListener("play", function(){console.log("Started playing")});
videoCompositor.addEventListener("pause", function(){console.log("Paused")});
videoCompositor.addEventListener("ended", function(){console.log("Finished playing")});
videoCompositor.play();
public getAudioContext(): AudioContext source
Returns the audio context that was either passed into the constructor or created internally.
Return:
| AudioContext | The audio context used to create any nodes required by calls to getAudioNodeForTrack |
Example:
var audioCtx = new AudioContext();
var videoCompositor = new VideoCompositor(canvas, audioCtx);
var returnedAudioContext = videoCompositor.getAudioContext();
//returnedAudioContext and audioCtx are the same object.
var videoCompositor = new VideoCompositor(canvas); //Don't pass in an audio context
var audioCtx = videoCompositor.getAudioContext();
//audioCtx was created inside the VideoCompositor constructor
public getAudioNodeForTrack(track: Array): GainNode source
Gets an audio bus for the given playlist track.
In some instances you may want to feed the audio output of the media sources from a given track into a web audio API context. This function provides a mechanism for acquiring an audio GainNode which represents a "bus" of a given track.
Note: In order for the media sources on a track to play correctly, once you have an AudioNode for the track you must connect it to the audio context's destination (even if you want to mute them, you must set the gain to 0 and then connect them to the output).
Params:
| Name | Type | Attribute | Description |
| track | Array | | A track, i.e. an array of MediaSourceReferences (typically a track from a playlist object). |
Return:
| GainNode | this is a web audio GainNode which has the output of any audio producing media sources from the passed track played out of it. |
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var audioCtx = new AudioContext();
var canvas = document.getElementById("canvas");
var videoCompositor = new VideoCompositor(canvas, audioCtx);
videoCompositor.playlist = playlist;
var trackGainNode = videoCompositor.getAudioNodeForTrack(playlist.tracks[0]);
trackGainNode.gain.value = 0.0; // Mute the track
trackGainNode.connect(audioCtx.destination); // Must still be connected for the track to play correctly
public pause() source
Pause playback of the playlist. Call play() to resume playing.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.play();
setTimeout(function(){ videoCompositor.pause(); }, 3000); //pause after 3 seconds
public play() source
Play the playlist. If a pause() has been called previously playback will resume from that point.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
videoCompositor.play();
public preload() source
Starts the underlying video/image elements pre-loading. Behavior is not guaranteed and depends on how the browser treats video pre-loading under the hood.
Example:
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"}]
]
}
videoCompositor.preload();
//now when play is called it should start quicker.
public registerMediaSourceListener(mediaSourceID: String, mediaSourceListener: Object) source
This method allows you to create listeners for events on a specific MediaSource.
To use this you must pass an object which has one or more of the following function properties: play, pause, seek, isReady, load, destroy, render.
These functions get called when the corresponding action on the MediaSource happens. In the case of the render listener it will be called every time a frame is drawn, so the function should aim to return as quickly as possible to avoid hanging the render loop.
The use-case for this is synchronising external actions to a specific media source, such as subtitle rendering or animations on a canvasMediaSource.
The listeners get passed a reference to the internal MediaSource object and sometimes extra data relevant to that specific action ("seek" gets the time being seeked to, "render" gets the shader's rendering parameters).
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
var videoListener = {
render: function(mediaSource, renderParameters){
//This will get called every frame.
var time = renderParameters.progress * mediaSource.duration;
console.log('Progress through ID', mediaSource.id, ':', time);
},
seek:function(mediaSource, seekTime){
//This function will get called on seek
console.log("Seeking ID", mediaSource.id, "to :", seekTime);
},
play:function(mediaSource){
//This function will get called on play
console.log("Playing ID", mediaSource.id);
},
}
videoCompositor.registerMediaSourceListener("video1", videoListener);
videoCompositor.play();
public removeEventListener(type: String, func: Function): boolean source
This removes event listeners from the video compositor that were added using addEventListener.
Example:
var playlist = {
tracks:[
[{type:"video", start:0, duration:4, src:"video1.mp4", id:"video1"},{type:"video", start:4, duration:4, src:"video2.mp4", id:"video2"}]
]
}
var canvas = document.getElementById('canvas');
var videoCompositor = new VideoCompositor(canvas);
videoCompositor.playlist = playlist;
var playingCallback = function(){console.log("playing");};
videoCompositor.addEventListener("play", playingCallback);
videoCompositor.play();
videoCompositor.removeEventListener("play", playingCallback);