Media Performance Data

The SBC Core system monitors packet loss and jitter that exceeds the jitter buffer's capacity using the playout time series. The playout time series consists of 31 quality measurements, each representing a consecutive time period. Taken as a whole, the measurements show how the playout buffer experienced jitter and packet loss over those consecutive periods. Within each period, the quality is classified as one of four possible values:

  • Good
  • Acceptable
  • Poor
  • Unacceptable

Whenever the playout buffer has no data to play due to packet loss or excessive jitter, the SBC tracks the duration of the missing data over a time period. The total duration of the missing data over a time period is compared against three programmable thresholds (THRESHOLD0, THRESHOLD1, and THRESHOLD2) to classify the performance during the period. The threshold comparison is listed in the table below.

Threshold Comparison

If the duration of the missing data is               Quality is considered
Less than or equal to THRESHOLD0                     Good
Greater than THRESHOLD0 and less than THRESHOLD1     Acceptable
Greater than THRESHOLD1 and less than THRESHOLD2     Poor
Greater than THRESHOLD2                              Unacceptable
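The threshold comparison above can be sketched as a small classification function. This is an illustrative sketch, not SBC code; the threshold values shown are the ones from the worked example later in this section, and the handling of a duration exactly equal to a threshold (which the table leaves unspecified) is an assumption.

```python
# Illustrative sketch of the per-period quality classification.
# Threshold values below are example settings, not defaults.
THRESHOLD0 = 0    # ms
THRESHOLD1 = 200  # ms
THRESHOLD2 = 400  # ms

def classify_period(missing_ms: int) -> str:
    """Map the total missing-data duration in one period to a quality value.

    Boundary handling (duration exactly equal to THRESHOLD1/THRESHOLD2)
    is an assumption; the documented table only specifies strict ranges.
    """
    if missing_ms <= THRESHOLD0:
        return "Good"
    if missing_ms < THRESHOLD1:
        return "Acceptable"
    if missing_ms < THRESHOLD2:
        return "Poor"
    return "Unacceptable"
```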

The time series provides an approximate indication of where in the call packet problems occur, which is essential for diagnosing call problems. For example, it can distinguish a single large outage from packet issues distributed continuously throughout the call.

Because the period length is fixed, the duration of a call determines the number of time intervals used for collecting data. With the default period of 20 seconds, a short call of 1-30 seconds produces data for one or two periods, whereas a 10-minute call has data for 30 periods. Calls that last longer than 31 time periods retain data only for the last 31 periods of the call (older data is discarded). If you want data at a more granular level, you can configure a shorter time period, but this reduces how much of a longer call you can monitor (since only the last 31 time periods are recorded).
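The relationship between call length, period length, and the 31-period cap can be sketched as follows. The function name and rounding behavior are illustrative assumptions, not SBC parameters.

```python
import math

# Only the most recent 31 periods are retained by the SBC.
MAX_PERIODS = 31

def periods_recorded(call_seconds: float, period_seconds: float = 20.0) -> int:
    """Sketch: number of periods for which data is kept for a call.

    A partially elapsed period is counted as one period (an assumption);
    periods beyond the most recent 31 are discarded.
    """
    elapsed = math.ceil(call_seconds / period_seconds)
    return min(max(elapsed, 1), MAX_PERIODS)
```

For example, a 25-second call spans two periods, a 10-minute call spans 30, and an hour-long call still yields only the last 31.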

Configuring the Playout Time Series Period and Thresholds

Playout Buffer Sizing Chart

Codec          Playout Buffer   Number of   Frame Size   Total Size
               Length (ms)      Frames      (bytes)      (bytes)
G.711          400              40          80           3200
G.711 Side B   400              40          80           3200
G.726          400              40          60           2400
G.729          500              50          10           500
G.723          1500             50          24           1200
iLBC 20ms      500              20          50           950
iLBC 30ms      600              25          32           1000
AMR/EFR        500              25          32           875
EVRC/EVRC-B    500              25          22           550
G.722          400              40          80           3200
G.722.1        400              20          80           1600
G.722.2        400              20          62           1230
Opus           400              40          300          12000
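For several codecs in the chart (for example, G.711 and G.729), the column values follow a simple relationship: the number of frames is the buffer length divided by the frame duration, and the total size is frames times frame size. The following sketch illustrates that relationship; the frame duration argument is an assumption supplied by the caller, and some rows in the chart do not reduce to this pure product (they may include additional overhead).

```python
def buffer_totals(buffer_ms: int, frame_ms: int, frame_bytes: int):
    """Sketch: derive the frame count and total buffer size for one codec row.

    Assumes total size = number of frames x frame size, which matches rows
    such as G.711 (400 ms / 10 ms = 40 frames; 40 x 80 = 3200 bytes) but
    not every row in the chart.
    """
    frames = buffer_ms // frame_ms
    return frames, frames * frame_bytes
```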

To configure the playout time series parameters, you set the thresholds to detect a certain percentage of missing data within a time period. For example, to configure a 20-second time period where between one and two percent of missing data is considered Poor quality, and more than two percent of missing data is considered Unacceptable:

  • Calculate the duration of the percentages of the 20 seconds:
    • 1 percent of 20 seconds = 0.2 seconds (200 msec)
    • 2 percent of 20 seconds = 0.4 seconds (400 msec)
  • Assign these values (in milliseconds) to playoutTimeseriesThreshold1 and playoutTimeseriesThreshold2. playoutTimeseriesThreshold0 is generally left at 0 (the default).
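The percentage-to-milliseconds conversion in the steps above can be sketched as:

```python
def thresholds_ms(period_ms: int, pct1: float, pct2: float):
    """Sketch: convert a period length and two loss percentages into
    threshold values in milliseconds, as in the worked example
    (1% and 2% of a 20,000 ms period). Function name is illustrative.
    """
    return round(period_ms * pct1 / 100), round(period_ms * pct2 / 100)
```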

The following CLI commands illustrate the example above.

set system dspPad playoutTimeseriesPeriod 20000
set system dspPad playoutTimeseriesThreshold1 200
set system dspPad playoutTimeseriesThreshold2 400