The SBC Core system monitors packet loss and jitter exceeding the jitter buffer's capacity using the playout time series. The playout time series consists of 31 quality measurements, each representing a consecutive time period. Taken as a whole, the measurements show how the playout buffer experienced jitter and packet loss over the course of the call. Within each time period, the quality is classified into one of four possible values:
Whenever the playout buffer has no data to play due to packet loss or excessive jitter, the SBC tracks the duration of the missing data over the time period. The total duration of missing data in each period is compared against three programmable thresholds (THRESHOLD0, THRESHOLD1, and THRESHOLD2) to classify performance during that period. The threshold comparison is listed in the table below.
Table 1: Threshold Comparison
If the duration of the missing data is | Quality is considered |
---|---|
Less than or equal to THRESHOLD0 | Good |
Greater than THRESHOLD0 and less than or equal to THRESHOLD1 | Acceptable |
Greater than THRESHOLD1 and less than or equal to THRESHOLD2 | Poor |
Greater than THRESHOLD2 | Unacceptable |
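The classification above can be sketched as follows; the threshold defaults used here are illustrative placeholders, not product values:

```python
def classify_period(missing_ms, t0=0, t1=200, t2=400):
    """Classify playout quality for one time period.

    missing_ms: total duration of missing data in the period (ms).
    t0/t1/t2 stand in for THRESHOLD0/1/2; the defaults are
    illustrative assumptions, not values from the SBC documentation.
    """
    if missing_ms <= t0:
        return "Good"
    if missing_ms <= t1:
        return "Acceptable"
    if missing_ms <= t2:
        return "Poor"
    return "Unacceptable"
```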
The time series provides an approximate indication of where in the call problems arise, which helps determine their nature precisely: for example, whether the call suffered a large single-event outage or a continuous series of packet issues distributed throughout the call.
Since the time period is fixed, the duration of a call affects the number of time period intervals used for collecting data. With the default time period of 20 seconds, a short call of 1-30 seconds produces data for one or two time periods, whereas a 10-minute call has data for 30 time periods. Calls that last more than 31 time periods retain data only for the last 31 time periods of the call (older data is discarded). To obtain data at a more granular level, you can configure a shorter time period; however, this limits how much of a longer call can be monitored (since only the last 31 time periods are recorded).
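The relationship between call duration and retained periods can be sketched with a simple model (assuming the call starts at a period boundary; the function name is illustrative):

```python
import math

def periods_recorded(call_seconds, period_seconds=20, max_periods=31):
    """Return how many time periods of data survive for a call.

    A partial final period still produces a measurement, and only
    the most recent max_periods entries are retained (older data
    is discarded).
    """
    elapsed = math.ceil(call_seconds / period_seconds)
    return min(max(elapsed, 1), max_periods)
```

With the 20-second default, a 30-second call yields two periods, a 10-minute call yields 30, and anything longer than about 10.3 minutes caps out at 31.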
Figure 2: Playout Buffer Sizing Chart
Codec | Playout Buffer (ms) | Number of Frames | Frame Size (bytes) | Total Size (bytes) |
---|---|---|---|---|
G.711 | 400 | 40 | 80 | 3200 |
G.711 Side B | 400 | 40 | 80 | 3200 |
G.726 | 400 | 40 | 60 | 2400 |
G.729 | 500 | 50 | 10 | 500 |
G.723 | 1500 | 50 | 24 | 1200 |
iLBC 20ms | 500 | 20 | 50 | 950 |
iLBC 30ms | 600 | 25 | 32 | 1000 |
AMR/EFR | 500 | 25 | 32 | 875 |
EVRC/EVRC-B | 500 | 25 | 22 | 550 |
G.722 | 400 | 40 | 80 | 3200 |
G.722.1 | 400 | 20 | 80 | 1600 |
G.722.2 | 400 | 20 | 62 | 1230 |
Opus | 400 | 40 | 300 | 12000 |
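For most rows in the chart, the total size is simply the number of frames multiplied by the frame size (a few codecs, such as iLBC and AMR, reserve slightly less than this product). A minimal sketch of that relationship:

```python
def playout_buffer_bytes(num_frames, frame_bytes):
    """Upper bound on buffer storage: frames held x bytes per frame."""
    return num_frames * frame_bytes
```

For example, G.711 holds 40 frames of 80 bytes for a 3200-byte buffer, and G.729 holds 50 frames of 10 bytes for 500 bytes.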
To configure the playout time series parameters, set the thresholds to detect a certain percentage of missing data within a time period. For example, to configure a 20-second time period where between one and two percent of missing data is considered Poor quality, and more than two percent of missing data is considered Unacceptable, set playoutTimeseriesThreshold1 and playoutTimeseriesThreshold2 accordingly. playoutTimeseriesThreshold0 is generally set to 0 (the default). The following CLI commands illustrate the example above.
set system dspPad playoutTimeseriesPeriod 20000
set system dspPad playoutTimeseriesThreshold1 200
set system dspPad playoutTimeseriesThreshold2 400
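The threshold values follow directly from the period length: each threshold is a percentage of the period expressed in milliseconds. A quick sanity check (the helper name is illustrative):

```python
def threshold_ms(period_ms, percent):
    """Missing-data threshold for a given percentage of the period."""
    return period_ms * percent // 100

# 1% of a 20000 ms period is 200 ms and 2% is 400 ms, matching the
# threshold values configured in the CLI example above.
```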