Media Performance Data
The SBC Core system monitors packet loss and jitter that exceeds the jitter buffer's capacity using the playout time series. The playout time series consists of 31 quality measurements, each representing a consecutive time period. Taken as a whole, the measurements show how the playout buffer experienced jitter and packet loss over consecutive periods. Within each period, the quality is classified into one of four values:
- Good
- Acceptable
- Poor
- Unacceptable
Whenever the playout buffer has no data to play because of packet loss or excessive jitter, the SBC tracks the duration of the missing data within each time period. The total missing-data duration for the period is compared against three programmable thresholds (THRESHOLD0, THRESHOLD1, and THRESHOLD2) to classify the performance during that period. The threshold comparison is listed in the table below.
Threshold Comparison
If the duration of the missing data is | Quality is considered |
---|---|
Less than or equal to THRESHOLD0 | Good |
Greater than THRESHOLD0 and less than or equal to THRESHOLD1 | Acceptable |
Greater than THRESHOLD1 and less than or equal to THRESHOLD2 | Poor |
Greater than THRESHOLD2 | Unacceptable |
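The comparison in the table above can be sketched as a simple classification function. This is an illustrative sketch, not SBC code; the threshold values shown are example settings in milliseconds, not defaults.

```python
# Illustrative threshold values (milliseconds); on the SBC these are the
# programmable THRESHOLD0/THRESHOLD1/THRESHOLD2 parameters.
THRESHOLD0 = 0
THRESHOLD1 = 200
THRESHOLD2 = 400

def classify_period(missing_ms: int) -> str:
    """Classify one playout period by its total missing-data duration."""
    if missing_ms <= THRESHOLD0:
        return "Good"
    elif missing_ms <= THRESHOLD1:
        return "Acceptable"
    elif missing_ms <= THRESHOLD2:
        return "Poor"
    return "Unacceptable"
```

Note that each boundary value falls into the lower (better) bucket, so every possible duration maps to exactly one quality value.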
The time series provides an approximate indication of when problems occur during a call, which is essential for diagnosing call problems, for example, distinguishing a single large outage from packet issues distributed continuously throughout the call.
Since the period is fixed, the call duration determines how many time periods of data are collected. With the default period of 20 seconds, a short call of 1-30 seconds produces data for one or two periods, whereas a 10-minute call produces data for 30 periods. Calls that last longer than 31 time periods retain data only for the last 31 periods of the call (older data is discarded). If you wish to obtain data at a more granular level, you can configure a shorter time period, but this limits your ability to monitor longer calls (since only the last 31 time periods are recorded).
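The discard behavior described above, keeping only the newest 31 periods, can be modeled with a fixed-length buffer. This is a conceptual sketch, not SBC code:

```python
from collections import deque

PERIOD_MS = 20_000   # default period: 20 seconds
MAX_PERIODS = 31     # the time series keeps only the newest 31 entries

# A deque with maxlen models the discard: once 31 periods are stored,
# appending a new measurement silently drops the oldest one.
series = deque(maxlen=MAX_PERIODS)

call_seconds = 12 * 60                            # a 12-minute call
num_periods = call_seconds * 1000 // PERIOD_MS    # 36 periods elapse
for p in range(num_periods):
    series.append(f"period-{p}")

# Only the last 31 periods survive; periods 0-4 were discarded.
assert len(series) == 31
assert series[0] == "period-5"
```

A 12-minute call spans 36 default periods, so the report covers only its final 10 minutes and 20 seconds.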
Configuring the Playout Time Series Period and Thresholds
Playout Buffer Sizing Chart
Codec | Playout Buffer (ms) | Number of Frames | Frame Size (bytes) | Total Size (bytes) |
---|---|---|---|---|
G.711 | 400 | 40 | 80 | 3200 |
G.711 Side B | 400 | 40 | 80 | 3200 |
G.726 | 400 | 40 | 60 | 2400 |
G.729 | 500 | 50 | 10 | 500 |
G.723 | 1500 | 50 | 24 | 1200 |
iLBC 20ms | 500 | 20 | 50 | 950 |
iLBC 30ms | 600 | 25 | 32 | 1000 |
AMR/EFR | 500 | 25 | 32 | 875 |
EVRC/EVRC-B | 500 | 25 | 22 | 550 |
G.722 | 400 | 40 | 80 | 3200 |
G.722.1 | 400 | 20 | 80 | 1600 |
G.722.2 | 400 | 20 | 62 | 1230 |
Opus | 400 | 40 | 300 | 12000 |
To configure the playout time series parameters, you set the thresholds to detect a certain percentage of missing data within a time period. For example, to configure a 20-second time period where between one and two percent of missing data is considered Poor quality, and more than two percent of missing data is considered Unacceptable:
- Calculate the duration of the percentages of the 20 seconds:
- 1 percent of 20 seconds = 0.2 seconds (200 msec)
- 2 percent of 20 seconds = 0.4 seconds (400 msec)
- Assign these values (in milliseconds) to playoutTimeseriesThreshold1 and playoutTimeseriesThreshold2. The playoutTimeseriesThreshold0 parameter is generally set to 0 (default).
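The percentage-to-milliseconds arithmetic above can be sketched as follows. The helper function is hypothetical; only the parameter values mirror the example:

```python
# Convert a percentage of the period into a millisecond threshold value.
def threshold_ms(period_ms: int, percent: float) -> int:
    return round(period_ms * percent / 100)

period_ms = 20_000                  # 20-second time period
t1 = threshold_ms(period_ms, 1)     # 1% of 20 s -> 200 ms
t2 = threshold_ms(period_ms, 2)     # 2% of 20 s -> 400 ms
assert (t1, t2) == (200, 400)
```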
The following CLI commands illustrate the example above.
set system dspPad playoutTimeseriesPeriod 20000
set system dspPad playoutTimeseriesThreshold1 200
set system dspPad playoutTimeseriesThreshold2 400