How do the different NDT protocols report “Jitter”?

“Jitter” is an often requested metric, but it has multiple definitions, which can make it confusing across user communities. Section 1.1, "Terminology", of IETF RFC 3393 discusses the two most common meanings of “jitter” and, to avoid confusion, uses the more precise term “delay variation”:

"Jitter" commonly has two meanings: The first meaning is the variation of a signal with respect to some clock signal, where the arrival time of the signal is expected to coincide with the arrival of the clock signal. This meaning is used with reference to synchronous signals and might be used to measure the quality of circuit emulation, for example. There is also a metric called "wander" used in this context.

The second meaning has to do with the variation of a metric (e.g., delay) with respect to some reference metric (e.g., average delay or minimum delay). This meaning is frequently used by computer scientists and frequently (but not always) refers to variation in delay.

... we will avoid the term "jitter" whenever possible and stick to delay variation which is more precise.

The web100 NDT client reported “jitter”, but the value was simply a client-side estimate calculated as “MaxRTT - MinRTT”.
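
To make that concrete, here is a minimal sketch of the same max-minus-min spread computed over a set of RTT samples. The function name and sample values are hypothetical; this is not the web100 client's actual code.

```go
package main

import "fmt"

// jitterEstimate reproduces the web100-style "jitter" figure: the
// spread between the largest and smallest observed round trip times.
// Illustrative only; not the web100 client's implementation.
func jitterEstimate(rttsMs []float64) float64 {
	if len(rttsMs) == 0 {
		return 0
	}
	minRTT, maxRTT := rttsMs[0], rttsMs[0]
	for _, rtt := range rttsMs[1:] {
		if rtt < minRTT {
			minRTT = rtt
		}
		if rtt > maxRTT {
			maxRTT = rtt
		}
	}
	return maxRTT - minRTT
}

func main() {
	samples := []float64{42.1, 40.8, 55.3, 41.5} // hypothetical RTT samples in ms
	fmt.Printf("jitter estimate: %.1f ms\n", jitterEstimate(samples))
}
```

Note that a single outlier RTT dominates this estimate for the rest of the test, which is one reason it was a poor proxy for delay variation.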

ndt5 does not report jitter. We removed it to avoid confusion.

ndt7 will provide the Round Trip Time Variation (RTTVar), as defined in the Linux kernel. RTTVar measures variation in round trip time, which is frequently what people mean when they ask for jitter.
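
For readers interested in the mechanics, the kernel's RTT variation tracking follows the exponentially weighted moving averages described in RFC 6298. The sketch below is a simplified version of that smoothing, assuming RTT samples in milliseconds; the type name and sample values are illustrative, and the kernel's actual tcp_rtt_estimator() includes further refinements (for example, mdev_max tracking).

```go
package main

import (
	"fmt"
	"math"
)

// rttEstimator tracks a smoothed RTT and its variation using the
// RFC 6298 moving averages that the Linux kernel's estimator is
// based on. A simplified sketch, not the kernel implementation.
type rttEstimator struct {
	srtt   float64 // smoothed round trip time (ms)
	rttvar float64 // round trip time variation (ms)
	seeded bool
}

func (e *rttEstimator) update(sampleMs float64) {
	if !e.seeded {
		// First measurement: SRTT = R, RTTVAR = R/2 (RFC 6298, section 2.2).
		e.srtt = sampleMs
		e.rttvar = sampleMs / 2
		e.seeded = true
		return
	}
	// RTTVAR = 3/4 * RTTVAR + 1/4 * |SRTT - R'|
	e.rttvar = 0.75*e.rttvar + 0.25*math.Abs(e.srtt-sampleMs)
	// SRTT = 7/8 * SRTT + 1/8 * R'
	e.srtt = 0.875*e.srtt + 0.125*sampleMs
}

func main() {
	var e rttEstimator
	for _, rtt := range []float64{40, 42, 55, 41} { // hypothetical RTT samples in ms
		e.update(rtt)
	}
	fmt.Printf("srtt=%.2f ms rttvar=%.2f ms\n", e.srtt, e.rttvar)
}
```

Unlike the old max-minus-min figure, this moving average decays over time, so a single RTT spike raises RTTVar only temporarily.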