> Gavin's already questioned your history, so I'll let it go. I see
> HTTP 1.1 as the natural extension of an IETF notion of creating
> protocols by slapping extra headers onto information. HTTP was wise
> to reuse the same header infrastructure that had worked for prior
> protocols, but that doesn't make HTTP a brilliant fundamental
> architecture.

No, but generalizing the behaviour of other application-layer agreements (protocols and APIs) certainly does. Many application protocols retrieve stuff: FTP, IMAP, NNTP, even SMTP (VRFY and EXPN, for example). Generalizing those into "GET", plus a URI to identify what is being "gotten", was a stroke of brilliance. Economies of scale, which previously failed to materialize because everybody had their own notion of "retrieve" (and none of them worked with anybody else's), suddenly emerged, and companies such as Akamai were born.

getFile(), getName(), getArticle(), getStockQuote() were what we had *before* the Web. I can't see any reason to go back after seeing what can be achieved with GET.

MB
--
Mark Baker, Chief Science Officer, Planetfred, Inc.
Ottawa, Ontario, CANADA.        mbaker@p...
http://www.markbaker.ca         http://www.planetfred.com
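[A sketch of the point being argued, not part of the original post. The handler names and return strings below are hypothetical stand-ins: the idea is that one generic get(uri) subsumes the per-protocol getFile()/getArticle()-style APIs, with the URI's scheme selecting the mechanism and its path naming the resource.]

```python
from urllib.parse import urlparse

# Hypothetical per-protocol fetchers. Before the Web, each of these was a
# separate, mutually incompatible API (getFile, getArticle, ...); here they
# are just plumbing behind one uniform interface.
def _fetch_ftp(parsed):
    # Stand-in for an FTP RETR of parsed.path from parsed.netloc.
    return f"file {parsed.path} from {parsed.netloc}"

def _fetch_news(parsed):
    # Stand-in for an NNTP ARTICLE retrieval.
    return f"article {parsed.path.lstrip('/')} from {parsed.netloc}"

HANDLERS = {"ftp": _fetch_ftp, "news": _fetch_news}

def get(uri):
    """One generic GET: the URI identifies what is being 'gotten';
    the scheme picks which protocol machinery does the work."""
    parsed = urlparse(uri)
    handler = HANDLERS.get(parsed.scheme)
    if handler is None:
        raise ValueError(f"no handler for scheme {parsed.scheme!r}")
    return handler(parsed)
```

Callers, caches, and intermediaries only ever see get(uri); that uniformity is what lets economies of scale (shared caches, CDNs) emerge.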