TV PLAYER REPORT TEST TOOL

Document Description

This document accompanies the release of the Kantar Spring library implementation test tool. It covers:

  • The heartbeats sent from the Spring libraries
  • How to verify the accuracy of your implementation
  • The Kantar Spring Testing Tool that provides logstream access for implementation validation

 

Heartbeats sent from the libraries

 

This section briefly explains what the heartbeats sent from the libraries should look like. A concrete example of a viewing session is used.

The content stream is started and the first request is transmitted:

 

Please use the record layout descriptions below for reference.

  • counting domain: example.2cnt.net. Here, the “Sitename” is “example”.
  • pl = player
  • plv = player version
  • sx = width of the stream window
  • sy = height of the stream window
  • stream = stream name
  • cq = contentId
  • uid = unique ID of the viewing process
  • pst = play state (list of viewing intervals on the stream)
  • dur = stream length in seconds (set up by the client)
  • vt = view time in seconds (visual contact with the stream)

  • First play state: 0+0+mbeswh
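
As an illustration only (not part of the official tooling), a captured heartbeat request can be pulled apart into the fields listed above with a few lines of script. The sketch below assumes the parameters arrive as a plain URL query string; the example URL is hypothetical, and only the parameter names and the counting domain come from this document.

    # Sketch: extract the documented fields from a captured heartbeat URL.
    # The '+' characters inside "pst" are kept literal on purpose, so the
    # query string is split by hand instead of using parse_qs().
    from urllib.parse import urlparse

    FIELDS = ["pl", "plv", "sx", "sy", "stream", "cq", "uid", "pst", "dur", "vt"]

    def parse_heartbeat(url):
        query = urlparse(url).query
        pairs = dict(p.split("=", 1) for p in query.split("&") if "=" in p)
        return {name: pairs[name] for name in FIELDS if name in pairs}

    # Hypothetical request against the counting domain "example.2cnt.net"
    example = ("http://example.2cnt.net/?pl=MyPlayer&plv=1.0&sx=1280&sy=720"
               "&stream=od&cq=12345&uid=3f3tv5p&pst=0+0+mbeswh&dur=1800&vt=0")
    print(parse_heartbeat(example))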

 

The actual record output should look similar to below:

After viewing 2 min of the stream:

The output records should look similar to the records below; note the time of the “heartbeat” records (21, 41, 61 seconds, etc.).

Stopping the stream after 2:00 min:

Last play state: 1+120+mbeswj = 120 sec playtime.

 

Note that the "uid" (uid=3f3tv5p) and stream name "stream=od" remained the same during the whole view sequence. This should always be the case when the implementation is correct.

If the "uid" or stream name changes, there is something wrong with the implementation and more than one view is being counted for this single view sequence.
The above example is a generic one.
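
If you capture the heartbeat requests of one view sequence (for example with a debugging proxy), the rule above can be checked with a short script. This is only a sketch; the parsed records are hypothetical and would typically come from a parser like the one shown earlier.

    # Sketch: confirm that "uid" and "stream" stay constant across all
    # heartbeats captured for a single view sequence.
    def check_single_view(records):
        problems = []
        for field in ("uid", "stream"):
            values = {r.get(field) for r in records}
            if len(values) != 1:
                problems.append(f"'{field}' changed during the view sequence: {values}")
        return problems

    # Example with the values used in this document (uid=3f3tv5p, stream=od)
    records = [{"uid": "3f3tv5p", "stream": "od", "pst": "0+0+mbeswh"},
               {"uid": "3f3tv5p", "stream": "od", "pst": "1+120+mbeswj"}]
    print(check_single_view(records) or "OK: one view sequence, one view counted")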

For the BARB TV Player project, you should see heartbeats sent out regularly at 0, 3, 10, 20, 30, 40, 50, 60, 120, 180, ... seconds.
NOTE: There may be 1 or 2 seconds added to every heartbeat due to the internal workings of the library.
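
To make the expected cadence concrete, the sketch below compares observed heartbeat times against the BARB schedule quoted above, allowing for the 1-2 seconds of slack mentioned in the note. The observed timestamps are made up for illustration.

    # Sketch: compare observed heartbeat times (seconds since stream start)
    # with the BARB TV Player schedule, allowing up to 2 seconds of slack.
    EXPECTED = [0, 3, 10, 20, 30, 40, 50, 60, 120, 180]  # continues per the project spec
    SLACK = 2  # seconds

    def check_cadence(observed):
        problems = []
        for expected, actual in zip(EXPECTED, observed):
            if not expected <= actual <= expected + SLACK:
                problems.append(f"expected a heartbeat around {expected}s, saw {actual}s")
        if len(observed) < len(EXPECTED):
            problems.append("fewer heartbeats were observed than expected")
        return problems

    # Hypothetical capture in which every heartbeat arrives one second late
    print(check_cadence([1, 4, 11, 21, 31, 41, 51, 61, 121, 181]) or "cadence OK")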

Verifying the accuracy of your implementation

Introduction

There are three steps involved in verifying your implementation.

  1. Observe the streams and verify the content of heartbeats (recommended)
  2. Kantar scenario testing, using a staging release of your player (recommended)
  3. Kantar Spring confirms that the streams are received and that the content of the streams is as expected (mandatory)

Step 1

The log stream test tool provides a real-time quality assessment of your integration.

Where the implementation does not meet the project measurement requirements, the tool will give appropriate warnings to highlight the variable(s) that need to be redefined.

Once no warnings or errors are returned, the implementation is verified as correct, i.e. correctly mapped variables are being sent as part of every request to the Kantar Media Spring measurement systems, and user views from your app are being correctly received and processed.

Step 2

Once the log stream test tool verification is complete, your implementation can move to the next QA stage.

This recommended second stage involves making a staging release of your player available to Kantar Media so that we may run scenario streams.

We recommend using TestFlight (https://testflightapp.com/) for this purpose and can provide the device UDIDs or equivalent. Other methods or products for providing access to a pre-release version of your implementation are, of course, also accepted.

Scenario testing replicates a number of standard user actions so we may confirm how your player behaves in each case and, importantly, how our library acts accordingly, i.e. that it is correctly capturing all possible user interactions.

Typically, this step is completed in 3-4 working days.

Step 3

This last stage involves a Kantar Media Spring sense check to confirm the content of the streams is as expected.

While variables may be correctly mapped as part of your implementation, we undertake a human eye review in order to verify the data collected.

This is important because each broadcaster’s player app may work in subtly different ways and, in many cases, is subject to customization.

Moreover, it is essential that we confirm that stream interruptions are being correctly reported.

Examples of this include:

  • No pre- or mid- or post-rolls are being measured as part of the programme stream
  • No false-ending implementations are affecting the measurement, for example when the duration of a stream is reported as higher than it actually is, or when the measurement is stopped before the player reaches the full duration of the stream (see the sketch below).
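
As a rough illustration of the second bullet (this is not a Kantar tool), a sanity check on a test scenario in which the stream was played to its real end could compare the reported duration and the final play state against the known length of the asset. The tolerance value and the record values are assumptions.

    # Sketch: flag possible "false ending" patterns for a test run in which
    # the stream was played to its real end.
    #   last_playtime - playtime in seconds taken from the final play state (pst)
    #   reported_dur  - the "dur" value sent by the player (stream length in seconds)
    #   actual_dur    - the real length of the asset, known independently
    TOLERANCE = 2  # seconds; assumed, matching the heartbeat slack noted earlier

    def false_ending_checks(last_playtime, reported_dur, actual_dur):
        problems = []
        if reported_dur > actual_dur + TOLERANCE:
            problems.append("'dur' is reported as higher than the stream really is")
        if last_playtime + TOLERANCE < actual_dur:
            problems.append("measurement stopped before the full duration was reached")
        return problems

    # Example: a 1800 s programme reported as 1950 s and stopped at 1700 s
    print(false_ending_checks(last_playtime=1700, reported_dur=1950, actual_dur=1800))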

Typically, this step is scheduled and completed within a single working day.

Final sign-off

Once all 3 steps are complete, Kantar Media will sign off your implementation, confirming that all aspects are as expected.

Your implementation may then be scheduled for live publication.

Kantar Spring Testing Tool

Introduction

The tool provides test teams with access to Kantar Spring servers for the purpose of validating Kantar Spring library implementations for mobile platforms. It connects to the logstream on the measurement servers in real-time, filtering data traffic based on the values entered in the form to display your unique test.

The tool will:

  • Parse the logstream
  • Validate JSON objects (the URI-strings sent to our systems)
  • Throw exceptions in case of errors (e.g. in the JSON structure or in the order in which requests are sent)

Each app should be tested with at least two devices; this is to ensure that different devices are uniquely identified. You may use an emulator as one of these devices – the tool will work for either emulator or real mobile device tests.
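
For orientation only, the filtering the tool performs can be pictured as follows: captured records are narrowed down to those containing your device ID, and a second test device should produce a separate, non-overlapping set of records. The record handling below is an assumption, not the tool's actual implementation.

    # Sketch (illustration only): filter captured log lines down to those
    # containing a given device ID.
    def filter_by_device(log_lines, device_id):
        return [line for line in log_lines if device_id in line]

    # When testing with two devices, their filtered record sets should not
    # overlap; otherwise the devices are not being uniquely identified.
    def devices_uniquely_identified(log_lines, id_a, id_b):
        hits_a = set(filter_by_device(log_lines, id_a))
        hits_b = set(filter_by_device(log_lines, id_b))
        return bool(hits_a) and bool(hits_b) and not (hits_a & hits_b)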

Procedure Overview

Getting started

 

You will need the device ID of your test device and optionally the name of your app (<application>).

  • For iOS, please use:
    • iOS 5 or lower: the MAC address
    • iOS 6 or 7: the Apple Advertising ID
  • For Android, please use the Android-ID (AID) or Device-ID (DID).

 

You can read your device ID (MAC address, Advertising ID, AID, DID) with the help of free apps that can be downloaded from the respective app stores.

  • iOS: App “my device info” at Apple App Store
  • Android: App “ID” at Google Play
  • Windows Phone: Tool “device ID” at Windows Phone Marketplace

 

Your browser settings:

  • Please use an HTML5-capable browser
  • Please enable JavaScript
  • We recommend using a modern, up-to-date browser such as Firefox or Chrome

Please ensure that WebSockets is not being blocked by a firewall.
http://en.wikipedia.org/wiki/WebSocket

Beginning the test

Log on to the web interface here:
http://mobiletest.2cnt.net/doc/html5/mobile.html
Enter the following on the initial screen (see the illustration below):

  • Field 1 - your application name (<application>) – OPTIONAL
  • Field 2 - your device ID – MANDATORY
    NOTE: for iOS, please enter the MAC address in upper-case letters only!
  • Field 3 - your Site Code (a.k.a. Sitename) – MANDATORY

Then select the operating system of your app: Android, iOS, …
Click the "Start test!" button.






Step 1

 

