Minimizing Jitter in TDM over PSN Networks-Statistical Approach

Date Issued: 01-02-2020
Author(s): Manivasakan, R. (Indian Institute of Technology, Madras); Syed, Junaid; Yennapu, Surya Meghana
DOI: 10.1109/ICITIIT49094.2020.9071544
Abstract
In many communication networks, controlling the packet delay variation (PDV) in packet-switched networks (PSNs) or the jitter in TDM networks is challenging. In PSNs, controlling PDV is crucial for real-time (e.g., telephony) and near-real-time (e.g., streaming audio, video, VoD) applications. Conventionally, many methods have been adopted, among which the most primitive form of traffic regulation is the leaky bucket algorithm, wherein bursty sources in the Internet are 'regulated' or 'shaped'. Other schemes adopt a 'deterministic' approach to reducing jitter. In this paper, we adopt a method of dictating the service distribution (marginal and joint) to control the output PDV or jitter. We rely heavily on the result in [1], which showed that positive correlations in service intervals reduce the mean variance of the queue length, while the converse holds for negative correlations. We show analytically that, for a negatively correlated queue, the inter-departure interval (IDI) variance decreases as the strength of the correlation increases (and vice versa for positive correlation). However, this comes at the cost of increased delay (waiting time plus service time). Note that the IDI variance appears as PDV or jitter at the receiver/downstream node. To demonstrate the efficacy of these findings, we apply them to the streaming-audio and TDM-over-PSN problems. It is demonstrated that the jitter is minimized at the receiver.
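For readers unfamiliar with the leaky bucket regulation the abstract mentions, the following is a minimal sketch, assuming a finite packet buffer drained at a constant rate; the class name, parameters, and time units are illustrative and not taken from the paper.

```python
from collections import deque

class LeakyBucketShaper:
    """Minimal leaky-bucket sketch: packets enter a finite buffer and are
    drained at a constant rate, smoothing bursty input. Names and
    parameters are illustrative only."""

    def __init__(self, rate, capacity):
        self.rate = rate          # packets released per unit time
        self.capacity = capacity  # buffer size in packets
        self.buf = deque()
        self.next_out = 0.0       # earliest time the next packet may leave

    def arrive(self, pkt, now):
        """Buffer an arriving packet; drop it on overflow."""
        if len(self.buf) >= self.capacity:
            return False                      # bucket overflow: packet dropped
        if not self.buf:
            self.next_out = max(self.next_out, now)
        self.buf.append(pkt)
        return True

    def drain(self, now):
        """Emit every packet whose scheduled release time has passed."""
        out = []
        while self.buf and self.next_out <= now:
            out.append((self.next_out, self.buf.popleft()))
            self.next_out += 1.0 / self.rate  # constant-rate spacing
        return out

shaper = LeakyBucketShaper(rate=100.0, capacity=32)
shaper.arrive("pkt-0", now=0.001)
print(shaper.drain(now=0.05))   # departures spaced at least 1/rate apart
```

Spacing departures at least 1/rate apart removes the burstiness (and hence much of the PDV) of the input, at the cost of buffering delay and possible loss on overflow.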
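The statistical approach of dictating a correlated service distribution can be probed with a small simulation. The sketch below is exploratory only: it assumes Poisson arrivals and an AR(1)-driven lognormal service process, neither of which is specified in the abstract, and `simulate` and its parameters are hypothetical names. It measures the IDI variance and the mean delay (waiting plus service time) as the lag-1 service correlation rho varies.

```python
# Exploratory sketch: a FIFO single-server queue whose consecutive service
# times are correlated. The AR(1)/lognormal construction is an assumption
# for illustration, not the paper's actual service distribution.
import math
import random
import statistics

def simulate(rho, n=50_000, lam=1.0, mean_svc=0.8, seed=7):
    """Return (IDI variance, mean delay) for lag-1 service correlation rho."""
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)      # stationary AR(1) state, N(0, 1)
    arrival = depart = last_dep = 0.0
    idis, delays = [], []
    for _ in range(n):
        arrival += rng.expovariate(lam)                  # Poisson arrivals
        x = rho * x + math.sqrt(1 - rho * rho) * rng.gauss(0.0, 1.0)
        # Positive, correlated service time with mean mean_svc
        # (lognormal transform of the AR(1) state; an assumption).
        svc = mean_svc * math.exp(0.25 * x - 0.25 ** 2 / 2)
        depart = max(arrival, depart) + svc              # FIFO single server
        idis.append(depart - last_dep)                   # inter-departure interval
        last_dep = depart
        delays.append(depart - arrival)                  # waiting + service
    return statistics.pvariance(idis), statistics.fmean(delays)

for rho in (-0.6, 0.0, 0.6):
    v, d = simulate(rho)
    print(f"rho={rho:+.1f}: IDI variance={v:.3f}, mean delay={d:.3f}")
```

Under these assumptions, sweeping rho from negative to positive lets one observe the trade-off the abstract describes: service-time correlation moves the IDI variance (seen downstream as PDV or jitter) in one direction and the mean delay in the other.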