Article ID: | iaor20011828 |
Country: | United Kingdom |
Volume: | 358 |
Issue: | 1773 |
Start Page Number: | 2209 |
End Page Number: | 2215 |
Publication Date: | Aug 2000 |
Journal: | Philosophical Transactions of the Royal Society of London |
Authors: | Crowcroft J. |
Keywords: | service, internet |
The art and science of tele-traffic modelling is quite mature. On the other hand, Internet traffic seems to defy all attempts to capture its essence in simple models. This is not so surprising when we consider that the Internet consists of a large number of self-organizing systems, each evolved almost independently, which is quite a different way to construct a network from the ground-up design associated with telecommunications. IP routing, TCP congestion control, Real-time Transport Protocol (RTP) playout and loss adaptation, Web caching and load balancing, and user behaviour are all involved in a system of massive complexity. In this paper, we survey some of these mechanisms and some of the attempts to bring this unruly bunch of schemes into a more coherent whole. We argue that these attempts are misguided, and that the strength of the Internet design is in the loose organization of these components. As the commercial investors turn their eyes on the Internet with a view to pricing, we argue that they should take extreme care not to propose mechanisms that kill the goose that lays the golden egg.
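To give a flavour of the kind of self-organizing mechanism the abstract lists, the sketch below shows the classic additive-increase/multiplicative-decrease (AIMD) rule at the heart of TCP congestion control. It is an illustrative simplification, not code from the paper: the unit-segment increase, the halving factor, and the single simulated loss event are assumptions chosen to make the sawtooth behaviour visible.

```python
# Illustrative sketch of TCP-style AIMD congestion control.
# Assumptions (not from the paper): window measured in whole segments,
# additive increase of 1 per round trip, multiplicative decrease of 1/2.

def aimd_step(cwnd, loss):
    """Apply one round trip of additive increase, multiplicative decrease."""
    if loss:
        return max(1.0, cwnd / 2)  # halve the window when loss is detected
    return cwnd + 1.0              # otherwise grow linearly by one segment

cwnd = 1.0
trace = []
for rtt in range(8):
    loss = (rtt == 5)  # pretend a single loss event occurs at round trip 5
    cwnd = aimd_step(cwnd, loss)
    trace.append(cwnd)

print(trace)  # the familiar sawtooth: grow, halve on loss, grow again
```

Each sender runs this rule independently against whatever loss it observes, with no central coordination, which is precisely the "self-organizing" character the abstract contrasts with top-to-bottom telecommunications design.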