Whenever the playout buffer has no data to play due to packet loss or excessive jitter, the resulting gap is recorded as missing data in the current time period.
The time series provides an approximate indication of where in the call packet problems arise, which helps to diagnose call problems precisely. For example, it can distinguish a single large outage from a continuous series of packet issues distributed throughout the call.
Since the time period is fixed, the duration of the call affects the number of time period intervals used for collecting data. With the default time period of 20 seconds, a short call of 1-30 seconds produces data for one or two time periods, whereas a 10-minute call has data for the last 30 time periods. Calls lasting more than 31 time periods retain data only for the last 31 time periods of the call (older data is discarded). To obtain data at a more granular level, you can configure a shorter time period; however, this limits your ability to monitor longer calls, since only the last 31 time periods are recorded.
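The fixed-size, last-31-periods behavior described above is effectively a sliding window. As an illustrative sketch (not the SBC's actual implementation), a bounded double-ended queue models how older periods are discarded once the limit is reached:

```python
from collections import deque

# Illustrative sketch only: model the retention of the most recent
# 31 time periods, with older periods discarded automatically.
MAX_PERIODS = 31

window = deque(maxlen=MAX_PERIODS)

# Simulate a long call spanning 40 time periods (indices 0..39).
for period_index in range(40):
    window.append(period_index)

print(len(window))  # 31 periods retained
print(window[0])    # oldest retained period is index 9; 0..8 were discarded
```

With a 20-second period, 40 periods corresponds to a call of roughly 13 minutes, of which only the final 31 periods (about 10 minutes) remain available.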
To configure the playout time series parameters, you set thresholds that detect a given percentage of missing data within a time period. For example, to configure a 20-second time period in which between one and two percent of missing data is considered Poor quality and more than two percent is considered Unacceptable, set playoutTimeseriesThreshold1 and playoutTimeseriesThreshold2 accordingly. The playoutTimeseriesThreshold0 parameter is generally set to 0 (the default). The following CLI commands illustrate the example above.
% set system dspPad playoutTimeseriesPeriod 20000
% set system dspPad playoutTimeseriesThreshold1 200
% set system dspPad playoutTimeseriesThreshold2 400
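The threshold values appear to be expressed in milliseconds of missing data per period: 200 ms out of a 20000 ms period is 1%, and 400 ms is 2%, matching the percentages in the example. Under that assumption, the resulting classification can be sketched as follows (this is an illustration of the mapping, not the SBC's actual code):

```python
# Assumption: threshold values are milliseconds of missing playout
# data within one time period (200 / 20000 = 1%, 400 / 20000 = 2%).
PERIOD_MS = 20000
THRESHOLD1_MS = 200   # 1% of the period: Poor quality begins here
THRESHOLD2_MS = 400   # 2% of the period: above this is Unacceptable

def classify(missing_ms: int) -> str:
    """Map missing playout time in one period to a quality label."""
    if missing_ms > THRESHOLD2_MS:
        return "Unacceptable"   # more than 2% missing
    if missing_ms >= THRESHOLD1_MS:
        return "Poor"           # between 1% and 2% missing
    return "Good"               # less than 1% missing

print(classify(100))   # 0.5% missing
print(classify(300))   # 1.5% missing
print(classify(500))   # 2.5% missing
```

Shortening playoutTimeseriesPeriod while keeping the same percentages would require scaling the threshold values down proportionally.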