It's About Time: Link Streams as Continuous Metadata

Kevin Page,
Don Cruickshank and David De Roure


Intelligence, Agents, Multimedia
Department of Electronics and Computer Science

University of Southampton

Time for what?

How do we stream?

Let's examine a couple of fundamental details about streamed media

Distributed Multimedia Applications

We're talking in terms of distributed multimedia applications. Adcock et al. identified four resources required to support such applications:
  1. Explicit support for temporal (streamed) media
    Delivering streamed media successfully places extra requirements on the protocol stack, from the network layer up to the application
  2. The ability to specify and reserve a required quality of service (QoS)
  3. Synchronisation within and between multimedia elements
    The presentation must now co-ordinate its elements temporally
  4. Presentation and communication between groups of collaborating users
    i.e. multicast

How do current systems shape up?

Current Infrastructure (1)

The pieces fall into place:

Current Infrastructure (2)

The pieces fall into place:

Current Infrastructure (3)

The pieces fall into place:

Where next? (1)

Where next? (2)

A Simple Scenario (1)

A seminar presentation:
Maybe at a university, maybe a conference presentation, maybe a class lecture

A Simple Scenario (2)

Maybe something like...

HyStream screenshot

A dream come true?!

Metadata / Mediadata Relationships

How can we categorise metadata in relation to its associated media content, which we refer to as mediadata?
Should we stream the metadata, or can we simply pre-load it?
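One way to make this question concrete: metadata with no binding to the media timeline can be pre-loaded, while temporally bound metadata (such as a link stream) suggests streamed delivery. The sketch below is a hypothetical illustration only; the dict representation and the `delivery_strategy` function are assumptions, not a classification from the paper.

```python
# Hypothetical sketch: choose a delivery strategy for a metadata item
# based on whether it is bound to a point in the media timeline.
def delivery_strategy(metadata):
    """Pre-load metadata independent of the media timeline;
    stream metadata bound to a point in that timeline."""
    if metadata.get("timestamp") is None:
        return "pre-load"   # e.g. title, author: deliver before playback
    return "stream"         # e.g. a temporal link: deliver in sync
```

In practice the answer is rarely this clean; the deck returns to this trade-off when defining continuous metadata.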

What is Continuous Metadata?

The Temporal Linking Service (1)

The Temporal Linking Service (2)

The "HyStream" TLS Client

HyStream screenshot

Demoing tomorrow!

TLTP+FOHM (1)

TLTP+FOHM (2)

We have a demo!

A Generic Framework for Continuous Metadata (1)

A Generic Framework for Continuous Metadata (2)

Next we'll consider framework elements in a simple unicast version

Sources and Flows

Presentation Points

The presentation point is the node where a user views a combination of media and metadata flows.
There is no reason a presentation point should be the convergence of only a single mediadata flow and a single metadata flow; it should pull together and synchronise as many metadata flows as the user requests.
This places requirements on both the metadata flow itself and the information encoded within it.
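The synchronising behaviour of a presentation point can be sketched as a timestamp-ordered merge of flows. This is a minimal illustration under assumed representations: each flow is a sorted sequence of (timestamp, payload) pairs, and `merge_flows` is a hypothetical name, not part of HyStream or the TLS.

```python
import heapq

# Minimal sketch of a presentation point converging one mediadata flow
# with any number of metadata flows. Each flow is assumed to be a
# sequence of (timestamp_seconds, payload) pairs, sorted by timestamp.
def merge_flows(media_flow, *metadata_flows):
    """Yield events from all flows as one timestamp-ordered timeline."""
    return list(heapq.merge(media_flow, *metadata_flows, key=lambda ev: ev[0]))

video = [(0.0, "frame-0"), (1.0, "frame-1"), (2.0, "frame-2")]
links = [(0.5, "link: speaker bio"), (1.5, "link: slide notes")]
captions = [(1.0, "caption: Welcome")]

# Events interleave by timestamp, so each metadata item can be
# presented at the moment its mediadata is rendered.
timeline = merge_flows(video, links, captions)
```

A real presentation point would be driven by playback time rather than merging eagerly, and late-joining flows would need buffering; `heapq.merge` only stands in for that scheduling.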

Filters and Control

Expanding into Multicast

To fulfil the fourth requirement (presentation and communication between groups of collaborating users), the framework should support multicast.

Let's elaborate on this multicast framework with an example

Multicast Framework Example

Multicast diagram


Summary

More at my continuous metadata page