Interactive television is here to stay

It may look odd, in 2017, to give a paper a title that starts with the words “interactive television”, one of the oldest word pairs in broadcasting and multimedia.

The reason is that, in spite of their long acquaintance, interactivity and television remain two words that, put together, make an oxymoron, unless one thinks that having broadcast and broadband on the same device makes television interactive.

So why the title? Because interactive television is no longer an oxymoron: there is an existence proof that it exists.

Let’s roll the time axis back to November 2013, when the European Commission funded the BRIDging the Gap for Enhanced broadcasT (BRIDGET) project, whose goal was to open new dimensions for multimedia content creation and consumption by enhancing broadcast programmes with bridgets: links from a TV programme to external interactive media elements such as web pages, images, audio clips, video and synthetic 3D models.

Some members of the BRIDGET project proposed to MPEG the development of a standard that eventually became ISO/IEC 23000-18 – Multimedia Linking Application Format (MLAF).

What is MLAF about? It is a data format that describes, in a standard way,

  • The source content (the TV programme): content ID, start and end time of bridget validity, metadata (typically connected to production), etc.
  • The bridget (the link; in capital letters, it is also the name of the project): metadata, information on how the bridget is presented to the viewer (during its validity), etc.
  • The destination content (where the bridget takes the viewer): content ID, start and end time of the destination content, metadata, etc.
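
To make the three parts concrete, here is a minimal sketch, as Python data classes, of the information a bridget description carries. The field names are assumptions chosen for illustration; the actual MLAF standard defines its own schema.

    # Illustrative model of the three parts of a bridget description.
    # Field names are assumptions for illustration, not the MLAF schema.
    from dataclasses import dataclass, field

    @dataclass
    class SourceContent:
        content_id: str    # ID of the TV programme
        valid_from: float  # start of bridget validity (seconds)
        valid_to: float    # end of bridget validity (seconds)
        metadata: dict = field(default_factory=dict)

    @dataclass
    class DestinationContent:
        content_id: str    # ID/URL of the linked media
        start: float       # start time within the destination content
        end: float         # end time within the destination content
        metadata: dict = field(default_factory=dict)

    @dataclass
    class Bridget:
        source: SourceContent
        destination: DestinationContent
        presentation: dict  # how the bridget is shown during its validity
        metadata: dict = field(default_factory=dict)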

Interactivity is at the level of content, not at the level of infrastructure. Of course, without infrastructure there will be no interactivity, but infrastructure alone does not provide it.

A basic workflow of a TV programme enriched with bridgets is:

  • A human editor creates a bridget tied to a time interval of a TV programme
  • That bridget becomes available to the TV viewer during that interval
  • If the viewer so decides, they can access the additional information the editor has chosen to make available.

Therefore we need two main elements:

  1. An authoring tool to create bridgets
  2. A bridget “player”.

Interactive television is here because the elements required to implement the bridget scenario exist and can be deployed in the real world. They are called TVBridge and are shown in the figure below.


In the figure one sees that:

  1. At the studio side
    1. The TV programme is uploaded to the broadcast server and to the Bridget Authoring Tool
    2. The Authoring Tool computes the audio fingerprints of the programme and uploads them to the Audio Fingerprint Server
    3. A human editor creates the bridgets and deploys them to a web server
  2. At the viewer side (a minimal code sketch follows this list)
    1. A Bridget App listens to the audio coming from the TV set, computes the instantaneous fingerprints and sends them to the Audio Fingerprint Server
    2. The Audio Fingerprint Server identifies the programme and the time, and sends the data to the Bridget App
    3. The Bridget App requests the list of bridgets for the programme from the Bridget and Media Server and presents bridget icons at the appropriate times
    4. If the viewer taps an icon, the destination media are presented
    5. If the viewer likes it, the bridget can be posted to a social network.
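
Here is a minimal sketch, in Python, of that viewer-side loop. The server URLs, endpoint paths and helper functions are hypothetical, chosen only to illustrate the flow; they are not TVBridge’s actual API.

    # Hypothetical sketch of the viewer-side flow: capture audio, fingerprint
    # it, identify the programme and time, then show the valid bridgets.
    import time
    import requests  # widely used HTTP client

    FP_SERVER = "https://fingerprint.example.com"    # hypothetical URL
    BRIDGET_SERVER = "https://bridgets.example.com"  # hypothetical URL

    def sync_and_show(capture_audio, compute_fingerprints, show_icon):
        while True:
            audio = capture_audio(seconds=5)  # audio from the device microphone
            fp = compute_fingerprints(audio)
            # 1. Ask the Audio Fingerprint Server which programme this is
            #    and at what playback time.
            match = requests.post(f"{FP_SERVER}/identify", json={"fp": fp}).json()
            # 2. Fetch the bridget list for that programme.
            bridgets = requests.get(
                f"{BRIDGET_SERVER}/bridgets/{match['program_id']}").json()
            # 3. Present icons for the bridgets valid at the identified time.
            for b in bridgets:
                if b["valid_from"] <= match["time"] <= b["valid_to"]:
                    show_icon(b)
            time.sleep(5)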

TVBridge utilises a specific technology (audio fingerprinting) to get an external device (a second screen) in sync with what is going on on the TV set. Depending on the context, other technologies can also be employed to do that job.
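
For readers curious how such synchronisation can work, here is a sketch of spectral-peak audio fingerprinting, a common technique for this job. The frame size, peak count and hashing scheme are assumptions chosen for clarity, not TVBridge’s actual algorithm.

    # Sketch of spectral-peak audio fingerprinting (assumed parameters).
    import numpy as np

    def fingerprint(samples, frame=4096, hop=2048, peaks_per_frame=3):
        """Return (hash, frame_index) pairs for a mono audio signal."""
        hashes = []
        window = np.hanning(frame)
        n_frames = max(0, (len(samples) - frame) // hop + 1)
        prev_peaks = []
        for i in range(n_frames):
            chunk = samples[i * hop : i * hop + frame] * window
            spectrum = np.abs(np.fft.rfft(chunk))
            # Keep the strongest spectral peaks of this frame.
            peaks = np.argsort(spectrum)[-peaks_per_frame:]
            # Hash (previous peak, current peak) bin pairs; pairing across
            # frames makes the hashes robust to volume changes and noise.
            for p0 in prev_peaks:
                for p1 in peaks:
                    hashes.append(((int(p0) << 16) | int(p1), i))
            prev_peaks = list(peaks)
        return hashes

The server stores such hashes for every broadcast programme; the app sends the hashes of what it hears, and the programme and time offset whose stored hashes align most consistently identify what is on screen.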

What does TVBridge look like? Let’s have a look at the following screenshot of the web app.

At the centre the TV programme to be bridgeted can be seen. At the top right there is a video navigation bar based on the identified shots. Below that there are tabs to preview the bridgets, to generate a zip file with all the bridgets and to deploy the bridgets to the web server. At the bottom all the bridgets that have already been created are displayed (in the figure only the first can be seen).

How are bridgets created? This is shown in the following figure.

The first line contains the bridget title, followed by tags, description and layout. The layout field indicates how the bridget information is displayed in the mobile app.

Finally, here is an example of the end-user experience of a bridget from the Visit London programme.

All this looks very nice, but you may ask: “How can I benefit from a system like TVBridge?” The answer is easy, because so many applications are possible. Here is a first list:

  • Education: while a lecturer or a documentary talks about something, a bridget pops up. The student/viewer can tap the bridget and learn more about the topic
  • News: something important happens in a place that is not well known. By tapping the bridget, media information about the place becomes available
  • Advertisements: Follow up to product placement, user-centric advertising, user-selectable product ads
  • Access to multilanguage content
  • Subscription based bridgets
  • Commentator tracks to a program
  • Programme analytics
  • And many more

Interactive television is not only here, but here to stay. Some broadcasters are busy adapting TVBridge to their TV programmes. Yes, technology is important because it enables interactive television, but television, even with bridgets, remains in the hands of creatives.

If you do not want to miss the train, please contact:

+39 335 612 11 59

info@wimlabs.com

www.wimlabs.com
