IEEE 802.3 Ethernet and OIF/CEI specify the required characteristics of high-speed serial link building blocks so that interoperability among components supplied by different parties is guaranteed. The three major building blocks of a high-speed serial link are the transmitter, the channel, and the receiver. The input signal to a receiver is degraded by ISI, jitter, noise, and other impairments. Receiver interference tolerance (ITOL) is the ability to operate correctly with such a degraded signal, which makes the interference tolerance test one of the most critical compliance items.
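
As an illustration of the kind of degraded signal an ITOL test exposes a receiver to, the following is a minimal sketch (not from the paper, with all parameter values chosen purely for illustration) that builds an ideal PAM-4 waveform, then adds ISI via a simple low-pass channel, random edge jitter, and Gaussian noise:

```python
# Minimal sketch of receiver-input degradation: ideal PAM-4 waveform,
# ISI from a single-pole low-pass channel, random jitter, Gaussian noise.
# All numeric values below are illustrative assumptions, not standard limits.
import numpy as np

rng = np.random.default_rng(0)
n_sym, sps = 200, 16                       # symbols, samples per symbol (assumed)
levels = np.array([-3, -1, 1, 3]) / 3.0    # normalized PAM-4 levels
syms = rng.choice(levels, n_sym)

# Random jitter: perturb each symbol's edge position by a Gaussian offset.
jitter_rms = 0.05 * sps                    # assumed 0.05 UI RMS jitter
edges = np.arange(n_sym) * sps + rng.normal(0, jitter_rms, n_sym)

t = np.arange(n_sym * sps)
tx = np.zeros_like(t, dtype=float)
for e, s in zip(edges, syms):              # build the jittered PAM-4 waveform
    tx[(t >= e) & (t < e + sps)] = s

# ISI: convolve with a single-pole low-pass channel impulse response.
h = np.exp(-np.arange(4 * sps) / (1.5 * sps))
h /= h.sum()
rx = np.convolve(tx, h)[: len(tx)]

rx += rng.normal(0, 0.03, rx.shape)        # additive Gaussian noise (assumed sigma)
print(f"signal span at TX {tx.ptp():.2f} -> at RX {rx.ptp():.2f}")
```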

Development of the 106G Ethernet [1] and 112G OIF/CEI [2] standards began in 2017, and both are now in the finalization phase. Although the receiver interference tolerance test and its methodology have been part of these standards since the 25+ Gbps/lane serial link generation [3][4][5], one critical change to a parameter value, used in converting measured PAM-4 jitter to the reference TX jitter model, was made in 2021 to the latest standard drafts [1][2]. By their nature, the standard documents do not describe the reason for the change or the associated technical discussion. This paper analyzes the mathematical and theoretical issue in the previous interference tolerance test standards [3][4] and discusses two methods for estimating the reference TX jitter parameters [6][7] that solve or alleviate the issue. While the final change adopted in the standards may be less comprehensive than these proposals, it is an improvement, and the history behind it should benefit practitioners, especially when they face challenges in literally following the standards.
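
To make the conversion step concrete, the following is a minimal sketch of the well-known dual-Dirac jitter decomposition, which is one common way measured jitter is mapped onto a reference jitter model; it is an assumption for illustration, not the standards' exact procedure or the estimation methods of [6][7]. Two total-jitter readings at different BER targets are inverted into random (sigma_RJ) and deterministic (DJ_dd) components via TJ(BER) = 2·Q(BER)·sigma_RJ + DJ_dd:

```python
# Dual-Dirac sketch (illustrative assumption, not the standard's procedure):
# invert two total-jitter measurements at different BERs into the model
# parameters sigma_RJ and DJ_dd, where TJ(BER) = 2*Q(BER)*sigma_RJ + DJ_dd.
import numpy as np
from scipy.special import erfcinv

def q_scale(ber: float) -> float:
    """Gaussian Q-scale factor for a given BER (NRZ-style assumption)."""
    return np.sqrt(2.0) * erfcinv(2.0 * ber)

def dual_dirac_fit(tj1, ber1, tj2, ber2):
    """Solve the 2x2 linear system for (sigma_RJ, DJ_dd), all values in UI."""
    q1, q2 = q_scale(ber1), q_scale(ber2)
    sigma = (tj1 - tj2) / (2.0 * (q1 - q2))
    dj = tj1 - 2.0 * q1 * sigma
    return sigma, dj

# Illustrative numbers only (UI): measured total jitter at two BER targets.
sigma, dj = dual_dirac_fit(tj1=0.30, ber1=1e-12, tj2=0.26, ber2=1e-9)
print(f"sigma_RJ = {sigma:.4f} UI, DJ_dd = {dj:.4f} UI")
```

The paper's point is that the specific parameter value used in this kind of measured-jitter-to-model conversion is what changed in the 2021 drafts; sketches like the one above only show where such a parameter enters the math.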

The paper referenced here received the Best Paper Award at DesignCon 2022. To read the entire DesignCon 2022 paper, download the PDF.