I have tested them on James Clark's test archive, Jon Bosak's OT, and the
XML spec. (I used xmlwf -d, which is James' great "canonical XML" converter,
and diff to check for differences.) Compression gains are modest (2% to 6%),
but these gains seem to survive gzip (deflate) compression (and may
indeed improve compressibility). It is easy to create documents which
exhibit 80% or more compression, but such documents are artificial. I would
be surprised if the general rate of compression for well-suited documents
were much more than 25%. These rates are small, but this kind of compression
does not seem to disrupt further compression to binary forms. Furthermore,
the compressed data is still editable text, and may be accepted by
conforming WebSGML applications.
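
As a rough check of the interaction with deflate, a snippet along the
following lines (Python) can compare the raw and gzipped sizes of an
original document and its STX-encoded form; the file names here are
placeholders for illustration, not part of the distribution.

# Compare raw and deflate-compressed sizes of the two forms.
# "doc.xml" and "doc.stx" are illustrative file names only.
import gzip

for path in ("doc.xml", "doc.stx"):
    data = open(path, "rb").read()
    print(path, len(data), "bytes raw,",
          len(gzip.compress(data)), "bytes gzipped")
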
This kind of compression is suited for use as an end-to-end compression:
with the ubiquity of intermediate point-to-point compression (e.g.
modem-to-modem), it is possible that binary end-to-end compression methods
are no more effective than this.
I am interested in getting test results from WWW documents which use
namespaces heavily, in particular database dumps and RDF.
Please email me if you are interested in trying them out. The source code
is under the Mozilla license. The current version of the software should
handle most character encodings on UNIX hosts (not Shift-JIS): EBCDIC and
wide encodings are passed through without alteration. On Win32, the
16-bit-wide encodings and Shift-JIS will have problems.
Example of STX encoding.
========================
<?xml?>
<top>
<p>blah</p>
<p>blah<i>blah</i>blah</p>
<p>blah</p>
<p/>
<p>blah<hr></hr><hr></hr></p>
<p id="p1">blah</p>
</top>
will be compressed as:
<?stx><?xml?>
<top>
<p>blah</>
<>blah<i>blah</i>blah</>
<>blah</>
<p/>
<p>blah<hr></><></></>
<p id="p1">blah</>
</>
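
The expansion rules can be read off the example: an empty end-tag </>
closes whatever element is currently open, and an empty start-tag <>
repeats the name of the most recently closed element (much like SGML
SHORTTAG minimization). The following Python sketch decodes the example
on that reading; it is illustrative only, not the shipped decoder, and
it ignores comments, CDATA sections and ">" inside attribute values.

# Minimal STX decoder sketch (assumptions as described above).
import re

TOKEN = re.compile(r'<[^>]*>|[^<]+')

def stx_decode(stx):
    open_stack = []      # names of currently open elements
    last_closed = None   # name of the most recently closed element
    out = []
    for tok in TOKEN.findall(stx):
        if tok == '<?stx>':
            continue                      # drop the STX marker
        if tok == '</>':                  # empty end-tag
            last_closed = open_stack.pop()
            out.append('</%s>' % last_closed)
        elif tok == '<>':                 # empty start-tag
            open_stack.append(last_closed)
            out.append('<%s>' % last_closed)
        elif tok.startswith('</'):        # explicit end-tag
            last_closed = tok[2:-1].strip()
            open_stack.pop()
            out.append(tok)
        elif tok.startswith('<?') or tok.endswith('/>'):
            out.append(tok)               # PIs and empty-element tags
        elif tok.startswith('<'):         # explicit start-tag
            open_stack.append(tok[1:-1].split()[0])
            out.append(tok)
        else:
            out.append(tok)               # character data
    return ''.join(out)

Running stx_decode on the compressed text above reproduces the original
document exactly.
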
Rick Jelliffe