Class: module:VideoContext


new module:VideoContext(canvas, initErrorCallback, options)

Initialise the VideoContext and render to the specified canvas. A second parameter can be passed to the constructor: a function that gets called if the VideoContext fails to initialise.

Parameters:
Name Type Description
canvas Canvas

the canvas element to render the output to.

initErrorCallback function

a callback for if initialising the canvas failed.

options Object

a number of custom options which can be set on the VideoContext; generally best left as the defaults.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, function(){console.error("Sorry, your browser doesn't support WebGL");});
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();

Members

currentTime

Get how far through the internal timeline playback has progressed.

Getting this value will give the current playhead position. Can be used for updating timelines.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();
setTimeout(function(){console.log(ctx.currentTime);},1000); //should print roughly 1.0

currentTime

Set the progress through the internal timeline. Setting this can be used as a way to implement a scrubbable timeline.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();

destination

Get the final node in the render graph which represents the canvas to display content on to.

This property is read-only and there can only ever be one destination node. Other nodes can connect to this node, but it cannot be connected to anything else.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);

duration

Get the time at which the last node in the current internal timeline finishes playing.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
console.log(ctx.duration); //prints 0

var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);

console.log(ctx.duration); //prints 10

ctx.play();

element

Get the canvas that the VideoContext is using.

Source:
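Example
A minimal sketch, assuming a canvas element with id "canvas" is present on the page:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
console.log(ctx.element === canvasElement); //true - the context renders to the canvas it was given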

playbackRate

Return the current playbackRate of the video context.

Source:
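Example
A minimal sketch reading the playback rate back after it has been set:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
ctx.playbackRate = 2;
console.log(ctx.playbackRate); //prints 2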

playbackRate

Set the playback rate of the VideoContext instance. This will alter the playback speed of all media elements played through the VideoContext.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);
videoNode.connect(ctx.destination);
ctx.playbackRate = 2;
ctx.play(); // Double playback rate means this will finish playing in 5 seconds.

state

Get the current state.

This will be either

  • VideoContext.STATE.PLAYING: current sources on timeline are active
  • VideoContext.STATE.PAUSED: all sources are paused
  • VideoContext.STATE.STALLED: one or more sources is unable to play
  • VideoContext.STATE.ENDED: all sources have finished playing
  • VideoContext.STATE.BROKEN: the render graph is in a broken state
Source:
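Example
A minimal sketch reading the state back; play() moves the context into the PLAYING state, and the "ended" callback fires once it reaches ENDED:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.registerCallback("ended", function(){
    console.log(ctx.state === VideoContext.STATE.ENDED); //true - all sources have finished playing
});
ctx.play();
console.log(ctx.state === VideoContext.STATE.PLAYING); //true - playback is now active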

Methods

canvas(src) → {CanvasNode}

Create a new node representing a canvas source

Parameters:
Name Type Description
src Canvas

The canvas element to create the canvas node from.

Source:
Returns:

A new canvas node.

Type
CanvasNode
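Example
A minimal sketch, assuming a second canvas element with id "source-canvas" is used as the source (the id is illustrative):
var canvasElement = document.getElementById("canvas");
var sourceCanvas = document.getElementById("source-canvas");
var ctx = new VideoContext(canvasElement);
var canvasNode = ctx.canvas(sourceCanvas);
canvasNode.connect(ctx.destination);
canvasNode.start(0);
canvasNode.stop(10);
ctx.play();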

compositor(definition) → {CompositingNode}

Create a new compositing node.

Compositing nodes are used for operations such as combining multiple video sources into a single track/connection for further processing in the graph.

A compositing node is slightly different to other processing nodes in that it only has one input in its definition but can have unlimited connections made to it. The shader in the definition is run for each input in turn, drawing each one to the output buffer. This means there can be no interaction between the separate inputs to a compositing node, as they are individually processed in separate shader passes.

Parameters:
Name Type Description
definition Object

this is an object defining the shaders, inputs, and properties of the compositing node to create. Built-in definitions can be found by accessing VideoContext.DEFINITIONS.

Source:
Returns:

A new compositing node created from the passed definition.

Type
CompositingNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);

//A simple compositing node definition which just renders all the inputs to the output buffer.
var combineDefinition = {
    vertexShader : "\
        attribute vec2 a_position;\
        attribute vec2 a_texCoord;\
        varying vec2 v_texCoord;\
        void main() {\
            gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
            v_texCoord = a_texCoord;\
        }",
    fragmentShader : "\
        precision mediump float;\
        uniform sampler2D u_image;\
        uniform float a;\
        varying vec2 v_texCoord;\
        varying float v_progress;\
        void main(){\
            vec4 color = texture2D(u_image, v_texCoord);\
            gl_FragColor = color;\
        }",
    properties:{
        "a":{type:"uniform", value:0.0},
    },
    inputs:["u_image"]
};
//Create the node, passing in the definition.
var trackNode = ctx.compositor(combineDefinition);

//create two videos which will play back to back
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(10);
videoNode2.stop(20);

//Connect the nodes to the combine node. This will give a single connection representing the two videos which can
//be connected to other effects such as LUTs, chromakeyers, etc.
videoNode1.connect(trackNode);
videoNode2.connect(trackNode);

//Don't do anything exciting, just connect it to the output.
trackNode.connect(ctx.destination);

createCanvasSourceNode()

Source:

createCompositingNode()

Source:

createEffectNode()

Source:

createImageSourceNode()

Source:

createTransitionNode()

Source:

createVideoSourceNode()

Source:

effect(definition) → {EffectNode}

Create a new effect node.

Parameters:
Name Type Description
definition Object

this is an object defining the shaders, inputs, and properties of the effect node to create. Built-in definitions can be found by accessing VideoContext.DEFINITIONS.

Source:
Returns:

A new effect node created from the passed definition

Type
EffectNode
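Example
A minimal sketch applying a built-in effect definition; VideoContext.DEFINITIONS.MONOCHROME is assumed here to be one of the available definitions:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.start(0);
videoNode.stop(10);

//create the effect node and insert it between the source and the destination
var monochromeEffect = ctx.effect(VideoContext.DEFINITIONS.MONOCHROME);
videoNode.connect(monochromeEffect);
monochromeEffect.connect(ctx.destination);

ctx.play();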

image(src, preloadTime, imageElementAttributes) → {ImageNode}

Create a new node representing an image source

Parameters:
Name Type Attributes Default Description
src string | Image

The URL or image element to create the image node from.

preloadTime number <optional>
4

How long before the node is to be displayed to attempt to load it.

imageElementAttributes Object <optional>

Any attributes to be given to the underlying image element.

Source:
Returns:

A new image node.

Type
ImageNode
Examples
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image("image.png");

var canvasElement = document.getElementById("canvas");
var imageElement = document.getElementById("image");
var ctx = new VideoContext(canvasElement);
var imageNode = ctx.image(imageElement);

pause()

Pause playback of the VideoContext

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.currentTime = 10; // seek 10 seconds in
ctx.play();
setTimeout(function(){ctx.pause();}, 1000); //pause playback after roughly one second.

play()

Start the VideoContext playing

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);
ctx.play();

registerCallback(type, func)

Register a callback to listen to one of the following events: "stalled", "update", "ended", "content", "nocontent".

"stalled" happens any time playback is stopped due to unavailable data for playing assets (i.e. a video is still loading). "update" is called any time a frame is rendered to the screen. "ended" is called once playback has finished (i.e. ctx.currentTime == ctx.duration). "content" is called at the start of a time region where there is content playing out of one or more source nodes. "nocontent" is called at the start of any time region where the VideoContext is still playing, but there are currently no actively playing sources.

Parameters:
Name Type Description
type String

the event to register against ("stalled", "update", "ended", "content", or "nocontent").

func function

the callback to register.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
ctx.registerCallback("stalled", function(){console.log("Playback stalled");});
ctx.registerCallback("update", function(){console.log("new frame");});
ctx.registerCallback("ended", function(){console.log("Playback ended");});

registerTimelineCallback(time, func, ordering)

Register a callback to happen at a specific point in time.

Parameters:
Name Type Default Description
time number

the time at which to trigger the callback.

func function

the callback to register.

ordering number 0

the order in which to call the callback if more than one is registered for the same time.

Source:
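Example
A minimal sketch registering a callback against the timeline; it fires when the playhead reaches the 10 second mark:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(20);
ctx.registerTimelineCallback(10, function(){console.log("Reached 10 seconds");});
ctx.play();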

reset()

Destroy all nodes in the graph and reset the timeline. After calling this any created nodes will be unusable.

Source:
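Example
A minimal sketch clearing down a context so a new graph can be built; nodes created before the reset cannot be reused:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");
videoNode.connect(ctx.destination);
videoNode.start(0);
videoNode.stop(10);

//tear down the graph and timeline
ctx.reset();

//videoNode is now unusable; build a fresh graph instead
var newVideoNode = ctx.video("other-video.mp4");
newVideoNode.connect(ctx.destination);
newVideoNode.start(0);
newVideoNode.stop(10);
ctx.play();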

transition(definition) → {TransitionNode}

Create a new transition node.

Transition nodes are a type of effect node which have parameters that can be changed as events on the timeline.

For example, a transition node which cross-fades between two videos could have a "mix" property which sets the progress through the transition. Rather than having to write your own code to adjust this property at specific points in time, a transition node has a "transition" function which takes a startTime, stopTime, targetValue, and a propertyName (which here would be "mix"). This will linearly interpolate the property from the current value to targetValue between the startTime and stopTime.

Parameters:
Name Type Description
definition Object

this is an object defining the shaders, inputs, and properties of the transition node to create.

Source:
Returns:

A new transition node created from the passed definition.

Type
TransitionNode
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);

//A simple cross-fade node definition which cross-fades between two videos based on the mix property.
var crossfadeDefinition = {
    vertexShader : "\
       attribute vec2 a_position;\
       attribute vec2 a_texCoord;\
       varying vec2 v_texCoord;\
       void main() {\
           gl_Position = vec4(vec2(2.0,2.0)*a_position-vec2(1.0, 1.0), 0.0, 1.0);\
           v_texCoord = a_texCoord;\
        }",
    fragmentShader : "\
        precision mediump float;\
        uniform sampler2D u_image_a;\
        uniform sampler2D u_image_b;\
        uniform float mix;\
        varying vec2 v_texCoord;\
        varying float v_mix;\
        void main(){\
            vec4 color_a = texture2D(u_image_a, v_texCoord);\
            vec4 color_b = texture2D(u_image_b, v_texCoord);\
            color_a[0] *= mix;\
            color_a[1] *= mix;\
            color_a[2] *= mix;\
            color_a[3] *= mix;\
            color_b[0] *= (1.0 - mix);\
            color_b[1] *= (1.0 - mix);\
            color_b[2] *= (1.0 - mix);\
            color_b[3] *= (1.0 - mix);\
            gl_FragColor = color_a + color_b;\
        }",
    properties:{
        "mix":{type:"uniform", value:0.0},
    },
    inputs:["u_image_a","u_image_b"]
};

//Create the node, passing in the definition.
var transitionNode = ctx.transition(crossfadeDefinition);

//create two videos which will overlap by two seconds
var videoNode1 = ctx.video("video1.mp4");
videoNode1.start(0);
videoNode1.stop(10);
var videoNode2 = ctx.video("video2.mp4");
videoNode2.start(8);
videoNode2.stop(18);

//Connect the nodes to the transition node.
videoNode1.connect(transitionNode);
videoNode2.connect(transitionNode);

//Set-up a transition which happens at the crossover point of the playback of the two videos
transitionNode.transition(8,10,1.0,"mix");

//Connect the transition node to the output
transitionNode.connect(ctx.destination);

//start playback
ctx.play();

unregisterCallback(func)

Remove a previously registered callback.

Parameters:
Name Type Description
func function

the callback to remove.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);

//the callback
var updateCallback = function(){console.log("new frame")};

//register the callback
ctx.registerCallback("update", updateCallback);
//then unregister it
ctx.unregisterCallback(updateCallback);

unregisterTimelineCallback(func)

Unregister a callback which happens at a specific point in time.

Parameters:
Name Type Description
func function

the callback to unregister.

Source:
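Example
A minimal sketch mirroring the unregisterCallback example; the same function reference must be passed in to remove it:
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);

//the callback
var timelineCallback = function(){console.log("Reached 10 seconds");};

//register the callback to fire at 10 seconds
ctx.registerTimelineCallback(10, timelineCallback);
//then unregister it
ctx.unregisterTimelineCallback(timelineCallback);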

update(dt)

This allows manual calling of the update loop of the VideoContext.

Parameters:
Name Type Description
dt Number

The difference in seconds between this and the previous calling of update.

Source:
Example
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement, undefined, {"manualUpdate" : true});

var previousTime;
function update(time){
    if (previousTime === undefined) previousTime = time;
    var dt = (time - previousTime)/1000;
    ctx.update(dt);
    previousTime = time;
    requestAnimationFrame(update);
}
requestAnimationFrame(update); //start the loop so update receives a timestamp

video(src) → {VideoNode}

Create a new node representing a video source

Parameters:
Name Type Description
src string | Video

The URL or video element to create the video from.

Source:
Returns:

A new video node.

Type
VideoNode
Examples
var canvasElement = document.getElementById("canvas");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video("video.mp4");

var canvasElement = document.getElementById("canvas");
var videoElement = document.getElementById("video");
var ctx = new VideoContext(canvasElement);
var videoNode = ctx.video(videoElement);