6. Transmission Control Protocol (TCP)

Why TCP Acts As It Does


TCP Tuning

Bandwidth Delay Product (BDP)
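The BDP is the path bandwidth multiplied by the round-trip time: the amount of data that must be in flight to keep the pipe full, and hence the minimum useful TCP buffer size. A worked sketch in Python, with an assumed 10 Gbps path at 50 ms RTT:

# BDP = bandwidth x round-trip time = data in flight needed to fill the pipe.
bandwidth_bps = 10e9  # assumption: 10 Gbps path
rtt_seconds = 0.050   # assumption: 50 ms round-trip time

bdp_bytes = bandwidth_bps * rtt_seconds / 8
print(f"BDP = {bdp_bytes / 1e6:.1f} MB")  # 62.5 MB: TCP buffers should be at least this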

Examples of packet pacing

4 streams into a 12 Gbps disk system

No FQ pacing: 640 GB in 577 seconds (≈ 8.9 Gbps)

With no pacing, the 100 Gbps Ethernet interface sends packets in short bursts at the full 100 Gbps line rate. Since our target rate is only 12 Gbps, we can spread the packets out more evenly in time and possibly avoid bursts overflowing one or more queues along the path to the receiver.


3 Gbps per stream FQ pacing (4 × 3 Gbps = 12 Gbps, matching the disk system): 640 GB in 487 seconds (≈ 10.5 Gbps)

Notice that when we pace packets at the sender to spread them out in time, the transfer is much smoother and there are no more TCP retransmissions.
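On Linux, pacing like this is typically enabled with the fq qdisc (for example, tc qdisc replace dev eth0 root fq), optionally combined with a per-socket rate cap. A minimal Python sketch of the per-socket cap, assuming Linux; the SO_MAX_PACING_RATE constant is spelled out numerically because not every Python build exposes it:

import socket

SO_MAX_PACING_RATE = 47  # Linux-specific socket option; value is in bytes per second

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# Pace this socket to 3 Gbps (375 MB/s), matching the 4 x 3 Gbps example above.
sock.setsockopt(socket.SOL_SOCKET, SO_MAX_PACING_RATE, 3 * 10**9 // 8)

sock.connect(("receiver.example.org", 5001))  # hypothetical receiver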


Other ways to tune (a command sketch follows this list)

Ethernet driver ring buffers

Ethernet Offloading

Ethernet flow control – sometimes mitigates buffer issues

Storage system (formerly “disks”) configuration

MTU (IP maximum transmission unit), the IP-level counterpart of the Ethernet frame size; jumbo frames reduce per-packet overhead
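A rough sketch of where these knobs live on a Linux sender, shelling out to the standard ethtool and ip tools from Python. The interface name eth0 and every value below are assumptions for illustration; query your NIC's limits first (ethtool -g eth0) and test one change at a time:

import subprocess

def run(cmd):
    """Print and execute one tuning command; all of these require root."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

IFACE = "eth0"  # assumption: substitute your interface name

# Enlarge the driver ring buffers (hardware limits vary; see `ethtool -g eth0`).
run(["ethtool", "-G", IFACE, "rx", "4096", "tx", "4096"])

# Enable stateless offloads so the NIC handles segmentation and reassembly.
run(["ethtool", "-K", IFACE, "tso", "on", "gso", "on", "gro", "on"])

# Ethernet flow control (pause frames); sometimes mitigates buffer overruns.
run(["ethtool", "-A", IFACE, "rx", "on", "tx", "on"])

# Jumbo frames: raise the MTU only if every hop on the path supports it.
run(["ip", "link", "set", "dev", IFACE, "mtu", "9000"])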

Sometimes Less Is More

If you experience poor performance due to queue loss, sending less data in flight can actually improve throughput
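One concrete form of "less", as a minimal sketch: cap the socket's send buffer so TCP can never put more data in flight than a shallow queue can absorb. The 2 MB cap is an assumption and should be sized near the path's BDP; note that on Linux, setting SO_SNDBUF explicitly also disables buffer auto-tuning for that socket.

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

# Cap the send buffer BEFORE connecting; this bounds the data TCP keeps in flight.
# 2 MB is an assumed figure: size it near the path BDP, not the OS maximum.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 2 * 1024 * 1024)

sock.connect(("receiver.example.org", 5001))  # hypothetical receiver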

Parallelism

Currently, a typical server-grade computer can fill a 10 Gigabit Ethernet link

On 100 Gigabit Ethernet, transfer speeds above 80 Gbps have been recorded

In practice, working 100G flows tend to run at 30 Gbps or less

Most transfers over 10 Gbps use multiple connections

Globus GridFTP can open several hundred connections at once.
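A minimal sketch of parallelism, assuming a hypothetical receiver: each stream gets its own TCP connection and therefore its own congestion window, so a loss on one stream does not stall the whole transfer. Host, port, stream count, and data sizes are all assumptions for illustration:

import socket
import threading

HOST, PORT = "receiver.example.org", 5001  # hypothetical receiver
STREAMS = 4                                # assumption: as in the pacing example
BYTES_PER_STREAM = 64 * 1024 * 1024        # assumption: 64 MB share per stream

def send_stream(stream_id):
    """Send one stream's share of the data over its own TCP connection."""
    buf = bytes(1024 * 1024)  # 1 MB buffer of zeroes as test payload
    with socket.create_connection((HOST, PORT)) as sock:
        sent = 0
        while sent < BYTES_PER_STREAM:
            sock.sendall(buf)
            sent += len(buf)

threads = [threading.Thread(target=send_stream, args=(i,)) for i in range(STREAMS)]
for t in threads:
    t.start()
for t in threads:
    t.join()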