Technologies for the Near Future

SMIL

Read about the W3C's SMIL Activity at the W3C Web site

What is it?
The Synchronized Multimedia Integration Language is an XML-based language that allows authors to express relationships over time between different objects, such as a sequence of images, some text, and video and audio files. A simple example is a changing sequence of images specified to occur alongside an audio track - in other words, a narrated slideshow. A more complex example is a set of video and audio segments linked by hyperlinks that swap between them, with text captions that can be turned on or off.
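For instance, the narrated slideshow just described could be written as follows. This is a minimal sketch in SMIL 1.0; the file names, region name and durations are invented for illustration:

  <smil>
    <head>
      <layout>
        <!-- a single region in which the slides are displayed -->
        <region id="slides" top="0" left="0" width="320" height="240"/>
      </layout>
    </head>
    <body>
      <!-- par: play the narration and the slide sequence at the same time -->
      <par>
        <audio src="narration.wav"/>
        <!-- seq: show each slide in turn, ten seconds apiece -->
        <seq>
          <img src="slide1.jpg" region="slides" dur="10s"/>
          <img src="slide2.jpg" region="slides" dur="10s"/>
          <img src="slide3.jpg" region="slides" dur="10s"/>
        </seq>
      </par>
    </body>
  </smil>
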
How does it work?
SMIL provides a set of tags that look like HTML tags and specify what kind of objects to play, whether to play them one after another or in parallel, and so on. The objects themselves are not included in the markup but are referenced by URI, so any kind of media can be played, given an appropriate player or plugin.
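The more complex example mentioned earlier - video segments with hyperlinks between them and optional captions - uses the same pattern. The fragment below is a sketch only (part1.rm, part2.smil, captions.rt and the region names are invented, and the regions are assumed to be declared in the document's layout):

  <par>
    <!-- the video is a hyperlink: activating it jumps to the next segment -->
    <a href="part2.smil">
      <video src="part1.rm" region="main"/>
    </a>
    <!-- the caption stream is played only when captions are turned on in the player -->
    <switch>
      <textstream src="captions.rt" region="text" system-captions="on"/>
    </switch>
  </par>
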
Who is using it?
A number of free and commercial players and authoring tools are available for SMIL, including those from RealNetworks, Microsoft, and Apple, and the GRiNS player and authoring tool from Oratrix.
What does it look like?
There are existing SMIL presentations available, including the demonstration piece for described, captioned video from WGBH's National Center for Accessible Media (NCAM) in the US, which plays in RealPlayer or a compatible player.
Where is it going?
The new draft version of SMIL, SMIL-Boston, is already supported by some tools. It adds further accessibility features, declarative animation (a faster way to achieve simple dynamic changes than scripting them), and event triggers that allow more complex interactivity through scripting.
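The exact syntax was still settling at the time of writing, but a rough sketch of the kind of markup SMIL-Boston proposed looks like this (the element and attribute names follow one of the working drafts, and the ids, file names and region names are invented):

  <par>
    <img id="playButton" src="play.gif" region="controls"/>
    <!-- event trigger: the video begins when the image is activated (clicked) -->
    <video src="clip.rm" region="main" begin="playButton.activateEvent"/>
    <!-- declarative animation: slide the button 100 pixels to the right over two seconds -->
    <animate targetElement="playButton" attributeName="left" from="0" to="100" dur="2s"/>
  </par>
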

This page was produced by Charles McCathieNevile of W3C as part of a presentation for online@RMIT. All responsibility for errors rests with the author.