Recent work has shown that retransmissions can cause heavy-tailed transmission delays even when packet sizes are light-tailed. Moreover, the impact of heavy-tailed delays persists even when packets are of finite size. The key question we study in this paper is whether the use of coding techniques to transmit information can mitigate these delays. To investigate this problem, we consider an important communication channel, the Binary Erasure Channel, in which each transmitted bit is either received successfully or lost (an erasure). This model is a good abstraction not only of the wireless channel but also of higher-layer links, where erasure errors can occur. Many coding schemes, known as erasure codes, have been designed for this channel. Specifically, we focus on fixed-rate coding, where decoding is said to be successful if a certain fraction of the codeword is received correctly. We study two different scenarios: (I) a codeword of length Lc is retransmitted as a unit until the receiver successfully receives more than βLc bits in a single transmission; (II) all successfully received bits from every (re)transmission are buffered at the receiver according to their positions in the codeword, and the transmission completes once the received bits become decodable for the first time. Our study reveals that intricate and surprising relationships exist between coding complexity and transmission delay/throughput. From a theoretical perspective, our results provide a benchmark for quantifying the tradeoff between coding complexity and transmission throughput for receivers that use memory to buffer (re)transmissions until success and those that do not buffer intermediate transmissions. © 2011 IEEE.
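The two scenarios above can be sketched as a minimal Monte Carlo simulation, assuming i.i.d. bit erasures with probability p_erase and decodability once a fraction β of the codeword's positions has been received; the function and parameter names are illustrative, not from the paper:

```python
import random

def scenario_I(Lc, beta, p_erase, rng):
    """Memoryless receiver: retransmit the whole length-Lc codeword
    until a single transmission delivers at least beta*Lc bits.
    Returns the number of transmissions needed."""
    need = beta * Lc
    attempts = 0
    while True:
        attempts += 1
        # Count bits surviving i.i.d. erasures in this transmission.
        received = sum(1 for _ in range(Lc) if rng.random() > p_erase)
        if received >= need:
            return attempts

def scenario_II(Lc, beta, p_erase, rng):
    """Receiver with memory: buffer successfully received bits by
    their codeword positions across (re)transmissions; stop once
    beta*Lc distinct positions have arrived (first decodable point).
    Returns the number of transmissions needed."""
    need = beta * Lc
    got = set()          # distinct codeword positions received so far
    attempts = 0
    while len(got) < need:
        attempts += 1
        for pos in range(Lc):
            if rng.random() > p_erase:
                got.add(pos)
    return attempts
```

For instance, with Lc = 100, β = 0.8 and p_erase = 0.3, a single transmission delivers about 70 bits on average, so scenario I rarely succeeds in one shot and needs many retransmissions, while scenario II typically completes within a few transmissions because partial progress is never discarded; this is the kind of delay gap between the buffering and non-buffering receivers that the paper quantifies.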