
  • From: David Brownell <david-b@p...>
  • To: xml-dev@l...
  • Date: Thu, 18 Jan 2001 07:31:42 -0800

A decently designed binary data encoding protocol will
take less space to encode and transmit, and less time
to decode, than a decently designed text-based one.

Just think:  writing an integer as four bytes takes
one instruction, maybe two if byteswapping is needed.
But writing "32768" takes more bytes, and more time
to encode/decode.  Text compression costs even more.
There is no way to recover those differences.
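The size and cost difference above can be sketched in a few lines. This is an illustration, not anyone's actual protocol: it contrasts a fixed-width four-byte binary encoding (one store plus an optional byteswap at the machine level) with a decimal text encoding that needs a digit-by-digit conversion loop on both ends.

```python
import struct

def encode_binary(n):
    # Fixed-width big-endian 32-bit encoding: always 4 bytes.
    return struct.pack(">i", n)

def decode_binary(b):
    return struct.unpack(">i", b)[0]

def encode_text(n):
    # Decimal text encoding: length varies with magnitude,
    # and decoding requires parsing digits back into an int.
    return str(n).encode("ascii")

def decode_text(b):
    return int(b.decode("ascii"))
```

For 32768 the text form already costs five bytes against four, and a value near 2**31 costs ten; the binary form never grows.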


Of course those "decently designed" binary protocols
rely on things like contextualized encodings.  If you
demand self-descriptive data, then it's easy to get to
the point where manipulating type data costs so much
that encoding/decoding differences are in the noise.
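The contextual-versus-self-descriptive tradeoff can be sketched too. This is a hypothetical tag scheme, not any real wire format: a contextual encoding assumes sender and receiver share a schema and ships only raw value bytes, while a self-describing one prefixes every value with a type tag and length, adding bytes and a dispatch step on every decode.

```python
import struct

def encode_contextual(n):
    # Receiver knows from the schema that an int comes next,
    # so only the 4 value bytes go on the wire.
    return struct.pack(">i", n)

TAG_INT = 0x01  # hypothetical type tag for this sketch

def encode_tagged(n):
    # Self-describing: type tag + length byte + value bytes.
    payload = struct.pack(">i", n)
    return bytes([TAG_INT, len(payload)]) + payload

def decode_tagged(b):
    # Decoder must inspect the tag before it can touch the value.
    tag, length = b[0], b[1]
    if tag != TAG_INT:
        raise ValueError("unexpected type tag")
    return struct.unpack(">i", b[2:2 + length])[0]
```

Here every integer costs 6 bytes instead of 4 plus a per-field type dispatch, which is the overhead that can swamp the raw encode/decode savings.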

And some programming environments basically don't
support binary data worth speaking of.  So there are
going to be tradeoffs to consider.

- Dave



