Re: Three states for a binary bit (was Re: TCP (and SCTP) sucks on high speed networks)
Guys - the real problem is that you keep acting like congestion is a property of the network. It isn't. It is a property of the sources' behavior (trying to put 10 lbs of s*** in a 5 lb bag). Temporary queueing delay is only a symptom, or more precisely an epiphenomenon.

Dealing with excessive demand from sources can be done in many ways:

- the load can be spread out (traffic engineering),
- the sources can negotiate for a share (make a market in transient capacity),
- the load can be dropped (use a more lossy compression algorithm),
- the load can be deferred (shift traffic to less used time periods), and
- one can give the network builder an incentive to increase capacity (adequate provisioning, sometimes disparagingly called "overprovisioning" by those who think one can operate links at 95% capacity, in violation of Little's theorem).

By focusing design effort on a particular implementation, one implicitly adopts its assumptions. In this case (trying to overload ECN with fairness and reliable detection of congestion), you tend to bind the architectural accidents of today's Internet (single-path routing, no traffic engineering, no market making, no load shifting to other times, no adaptive coding) into a permanent solution. And worse, that solution is limited.

So before you call for "more complexity in the net", try thinking about "more intelligence at the endpoints". Only if you have given that serious consideration, AND tried to deploy end-to-end solutions (which take several years), should you dare to impose centralized and application-ignorant solutions. It is lazy and arrogant to presume that the network designer knows what the users need in a resource allocation algorithm, such as managing "congestion". At most, the network can detect congestion.
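The queueing-theory intuition behind the 95%-capacity remark can be sketched with the standard M/M/1 model (a simplifying assumption, not anything from the original message): mean time in system is W = 1/(mu - lambda), which blows up as utilization rho = lambda/mu approaches 1. The service rate below is an arbitrary illustrative value.

```python
# Hedged sketch (M/M/1 assumption, illustrative numbers): mean delay
# W = 1 / (mu - lambda) grows without bound as utilization approaches 1.

def mm1_mean_delay(service_rate: float, arrival_rate: float) -> float:
    """Mean time in system (queueing + service) for an M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable when arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

mu = 100.0  # packets/second the link can serve (assumed value)
for rho in (0.5, 0.8, 0.95, 0.99):
    w = mm1_mean_delay(mu, rho * mu)
    print(f"utilization {rho:.0%}: mean delay {w * 1000:.1f} ms")
```

At 50% utilization the mean delay is 20 ms; at 95% it is ten times that, which is why "adequate provisioning" keeps links well below saturation.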