I wrote the following two functions with the purpose of compressing/decompressing string values within a Delphi application:
implementation

uses
  ZLib;

function ZCompressString(aText: string; aCompressionLevel: TZCompressionLevel): string;
var
  strInput,
  strOutput: TStringStream;
  Zipper: TZCompressionStream;
begin
  Result := '';
  strInput := TStringStream.Create(aText);
  strOutput := TStringStream.Create;
  try
    Zipper := TZCompressionStream.Create(strOutput, aCompressionLevel);
    try
      // Compresses the input as it copies it into strOutput
      Zipper.CopyFrom(strInput, strInput.Size);
    finally
      Zipper.Free; // flushes any remaining compressed data
    end;
    Result := strOutput.DataString;
  finally
    strInput.Free;
    strOutput.Free;
  end;
end;
function ZDecompressString(aText: string): string;
var
  strInput,
  strOutput: TStringStream;
  Unzipper: TZDecompressionStream;
begin
  Result := '';
  strInput := TStringStream.Create(aText);
  strOutput := TStringStream.Create;
  try
    Unzipper := TZDecompressionStream.Create(strInput);
    try
      // A Count of 0 copies the whole stream; a decompression stream
      // cannot report its decompressed size up front
      strOutput.CopyFrom(Unzipper, 0);
    finally
      Unzipper.Free;
    end;
    Result := strOutput.DataString;
  finally
    strInput.Free;
    strOutput.Free;
  end;
end;
The main advantage of the above functions over the ZCompressStr and ZDecompressStr routines shipped with ZLib.pas is that you avoid potential data loss when handling Unicode <-> ANSI conversions. In other words, the above functions work both in ANSI Delphi versions (prior to Delphi 2009) and in Unicode Delphi versions (Delphi 2009, 2010, XE and later).
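For example, a round trip with these functions (assuming they are declared in the unit's interface section) might look like this; the sample JSON value is just an illustration:

```delphi
var
  Original, Packed, Restored: string;
begin
  Original := '{"name":"John","age":30}';
  Packed   := ZCompressString(Original, zcMax); // zcMax = best compression
  Restored := ZDecompressString(Packed);
  Assert(Restored = Original);
end;
```

TZCompressionLevel also offers zcNone, zcFastest and zcDefault if you want to trade compression ratio for speed.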
Note that you need to include ZLib in the uses clause. ZLib.pas is a high-level wrapper around the zlib library created by Jean-loup Gailly and Mark Adler for data compression.
In addition, using the TZCompressionStream and TZDecompressionStream classes directly, you can compress and decompress any TStream (files, memory buffers, and so on). Keep in mind that the output is raw zlib-format data, not a ZIP archive.
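As a sketch of the stream classes in action (the procedure name and file names here are hypothetical, not part of ZLib.pas), compressing one file into another could look like:

```delphi
procedure CompressFile(const InFileName, OutFileName: string);
var
  InFile, OutFile: TFileStream;
  Zipper: TZCompressionStream;
begin
  InFile := TFileStream.Create(InFileName, fmOpenRead);
  try
    OutFile := TFileStream.Create(OutFileName, fmCreate);
    try
      Zipper := TZCompressionStream.Create(OutFile, zcDefault);
      try
        // Compresses the file contents as they are copied
        Zipper.CopyFrom(InFile, InFile.Size);
      finally
        Zipper.Free; // flushes the remaining compressed data to OutFile
      end;
    finally
      OutFile.Free;
    end;
  finally
    InFile.Free;
  end;
end;
```

Decompression works the same way in reverse: wrap the input file in a TZDecompressionStream and CopyFrom it into the output stream with a Count of 0.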
What did (do) I need this for? Well, the applicability of data compression is wide...
A real life example? I needed to store JSON strings in a MySQL database. To optimize resources, I compressed every JSON string before inserting it into the database; after retrieval, I decompressed each string back to its original value. The compression ratio was huge: on average, each JSON value shrank from ~9000 characters to ~300. This is a considerable saving: my table contains more than one million rows. Do the math yourself! :-)