Discussion: Data compression APIs?
Mitch Zollinger
2012-01-26 01:11:57 UTC
In designing the technology to satisfy the use case doc posted
previously (http://www.w3.org/wiki/NetflixWebCryptoUseCase), it appears
that we need to support compression / decompression APIs.

The reason is that for our "MsgSec" protocol we encrypt every message
going over the wire, and since encrypted data is effectively
incompressible, compression has to happen before encryption. Standard
mechanisms like HTTP gzip compression therefore will not work.

What we're looking for is something along the lines of:

function compress(data, algorithm)
function uncompress(data, algorithm)

where algorithm is one of the standard ones (gzip, bzip2, etc.)
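
To make this concrete, here is a rough sketch of how MsgSec would use such an API. (compress / uncompress are the functions sketched above; encrypt, decrypt, sendOverWire and the key handling are placeholder names for illustration only, not part of any proposal.)

function sendSecureMessage(key, payloadBytes) {
    // Compress first: ciphertext looks random and will not compress,
    // so compressing after encryption would gain nothing.
    var compressed = compress(payloadBytes, "gzip");
    var ciphertext = encrypt(key, compressed);
    sendOverWire(ciphertext);
}

function receiveSecureMessage(key, ciphertext) {
    // Reverse the pipeline on the receiving side.
    var compressed = decrypt(key, ciphertext);
    return uncompress(compressed, "gzip");
}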

Is this the right forum for looking at this type of functionality?

Mitch
Richard L. Barnes
2012-01-26 02:55:36 UTC
You're exactly right about the need to compress before encrypting, but my inclination would be to say that a crypto API is probably not the right place for compression/decompression.

In general, one of the goals of crypto software design is to be validatable (e.g., for FIPS 140-2 validation [1]), and the addition of unnecessary features within the "crypto perimeter" tends to make that validation harder. For example, Firefox contains a FIPS-accredited crypto module [2], so one could imagine that if the bridge to the Javascript API were implemented properly, this accreditation might be extended to the Javascript layer as well. This would be more difficult if functions were introduced at the Javascript layer beyond what is present in the FIPS-accredited module. (I do note that NSS includes a copy of zlib [3], but it's not immediately clear whether that's within the accredited part.)

--Richard


[1] <http://csrc.nist.gov/publications/fips/fips140-2/fips1402.pdf>
[2] <http://support.mozilla.org/en-US/kb/Configuring%20Firefox%20for%20FIPS%20140-2>
[3] <http://hg.mozilla.org/mozilla-central/file/8eab8fdaa675/security/nss/lib/zlib>
Henry B. Hotz
2012-01-26 03:14:11 UTC
While I can appreciate the considerations you raise, I think the complexity is just something we need to deal with if we are going to exchange any significant quantity of encrypted data.

TLS compresses data, and there are still FIPS 140-certified implementations, aren't there?
------------------------------------------------------
The opinions expressed in this message are mine,
not those of Caltech, JPL, NASA, or the US Government.
Henry.B.Hotz-***@public.gmane.org, or hbhotz-***@public.gmane.org
Mark Watson
2012-01-26 04:36:06 UTC
I imagine that any JS "Web Compression" API would be technically completely independent of the Web Crypto APIs (not least because it can be so, as well as for the reasons below).

We raised it here because one of the strongest use-cases for a Web Compression API would be compression before encryption and so the rationale for such a compression API is much stronger when a Web Crypto API exists. Absent the Web Crypto API there are fewer and weaker arguments for a Web Compression API.

So the questions to this group are: Does this make sense? Where should it be progressed? Is there existing work in this area we should look at first?

...Mark


Jarred Nicholls
2012-01-26 05:03:31 UTC
Post by Mark Watson
I imagine that any JS "Web Compression" API would be *technically* completely independent of the Web Crypto APIs (not least because it can be so, as well as for the reasons below).
We raised it here because one of the strongest use-cases for a Web Compression API would be compression before encryption and so the rationale for such a compression API is much stronger when a Web Crypto API exists. Absent the Web Crypto API there are fewer and weaker arguments for a Web Compression API.
So the questions to this group are: Does this make sense?
Yes, absolutely.
Post by Mark Watson
Where should it be progressed?
public-webapps, most likely? But I agree that its major use case doesn't
exist yet, which could make it less attractive for others to progress
until that use case is tangible. The crypto APIs are still functional
without the existence of compression APIs, obviously. Since they are
orthogonal, we can progress the crypto APIs and then introduce the ideas
around compression APIs, with a strong use case, shortly thereafter.
Thoughts?
Post by Mark Watson
Is there existing work in this area we should look at first?
...Mark
Channy Yun
2012-01-26 05:04:45 UTC
FYI, from a slightly different angle:

http://www.russellbeattie.com/blog/we-need-a-standard-zipped-html-file-format

In fact, most app packaging involves both encryption and compression.

Channy
---------------------
Tech Evangelist : Web 2.0, Web Standards, Open Source and Firefox
http://channy.creation.net
Harry Halpin
2012-01-30 10:52:17 UTC
Post by Mitch Zollinger
In designing the technology that satisfies the use case doc posted
previously (http://www.w3.org/wiki/NetflixWebCryptoUseCase) it would
appear that we need to support compression / decompression APIs.
The reason is that for our "MsgSec" protocol, we encrypt every message
going over the wire and since encrypted data cannot be compressed, the
compression has to happen before encryption. Therefore standard things
like HTTP gzip compression will not work.
function compress(data, algorithm)
function uncompress(data, algorithm)
where algorithm is one of the standard ones (gzip, bzip2, etc.)
So, while I could see this being outside the scope of the WG, I would be
OK with it being in scope as long as we don't get feature bloat.

So, is there anything *beyond* a simple compress/uncompress that is needed?
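
One hypothetical example of what "beyond simple" could look like (a sketch only, not a proposal) would be an options argument for algorithm-specific settings:

function compress(data, algorithm, options)

e.g. compress(data, "gzip", { level: 9 }) to trade speed for output size. If nothing beyond that is needed, the surface stays small.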

Note that there is some experience (albeit in a separate field: W3C
Widgets, Google Gadgets, etc.) of even simple gzip being surprisingly
difficult to standardize. However, since there are crypto ramifications
to compression, I'm game for it being in scope. How many people *want*
this? Any objections?
Post by Mitch Zollinger
Is this the right forum for looking at this type of functionality?
Mitch
Tom Ritter
2012-01-30 13:36:41 UTC
Post by Harry Halpin
How many people *want* this?
+1
Mitch Zollinger
2012-01-30 17:23:56 UTC
+1, Netflix (obviously!)