I'm doing some experiments with UDP and CompactRIOs, characterizing the average/probable latency of UDP packet streams on a network with varying amounts of artificial congestion (generated with iperf3). I'm running several tests with different payload sizes (20, 50, 100, 200, 500, and 1000 bytes). My question is:
From 20 to 100 bytes, I saw average latency increase as a function of both packet size and link congestion, as expected. However, at UDP payloads of 200 bytes and above, the arrival latency of the UDP packets dropped sharply. Is there a known reason for this behavior? I haven't been able to find anything about it via Google or on these forums.
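For context, here's a minimal Python sketch of the kind of probe I'm describing: an echo-based round-trip measurement swept over the payload sizes above. The host address, port, and sample count are placeholders, and my actual test runs on the cRIO target, so treat this as illustrative only.

```python
import socket
import struct
import time

# Placeholders -- not my actual test configuration.
HOST, PORT = "192.168.1.10", 5005   # address of a UDP echo server
PAYLOAD_SIZES = [20, 50, 100, 200, 500, 1000]
SAMPLES = 1000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(1.0)  # treat packets not echoed within 1 s as lost

for size in PAYLOAD_SIZES:
    rtts = []
    for _ in range(SAMPLES):
        # First 8 bytes carry the send timestamp; the rest is zero padding
        # to reach the target payload size.
        payload = struct.pack("!d", time.monotonic()).ljust(size, b"\x00")
        sock.sendto(payload, (HOST, PORT))
        try:
            data, _ = sock.recvfrom(2048)
        except socket.timeout:
            continue  # dropped packet; tracked separately in the real test
        (t_send,) = struct.unpack("!d", data[:8])
        rtts.append(time.monotonic() - t_send)
    if rtts:
        print(f"{size:5d} B: avg RTT = {1000 * sum(rtts) / len(rtts):.3f} ms "
              f"over {len(rtts)}/{SAMPLES} packets")
```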