Sliced cube data compression

Pack data into a cube to compress, then slice to decompress

The easiest way to describe this is with an example.

Say I want to represent the numerals 0 to 9 on a 4 x 4 dot matrix display. Storing each numeral uncompressed requires 16 bits, 160 bits all up. Equivalently, the whole set can be stored as a 10 x 4 x 4 3D matrix.

Representation of a ‘0’:

0110

1001

1001

0110

Now let's say I stack four 4 x 4 matrices together, making a three-dimensional 4 x 4 x 4 matrix. I can slice this matrix 12 ways: 4 front vertical slices, 4 horizontal slices and 4 side vertical slices.
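A minimal sketch of those 12 slices using NumPy indexing (the labels 'front', 'horizontal' and 'side' are just my names for the three axes; the cube contents are random stand-in data):

```python
import numpy as np

# A 4 x 4 x 4 bit cube; random 0/1 contents stand in for packed numeral data.
rng = np.random.default_rng(0)
cube = rng.integers(0, 2, size=(4, 4, 4))

# Four 4 x 4 slices along each of the three axes: 12 slices in total.
front = [cube[i, :, :] for i in range(4)]  # front-to-back vertical slices
horiz = [cube[:, j, :] for j in range(4)]  # top-to-bottom horizontal slices
side = [cube[:, :, k] for k in range(4)]   # left-to-right vertical slices

slices = front + horiz + side
print(len(slices))  # 12
```

Note that all 12 slices are views into the same 64 bits, which is why only 'nice' data lets many slices carry independently useful patterns.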

So I hypothesise that if I’m really tricky I can take 4 of the 4 x 4 matrices and stack them (front vertically) so that 6 of the 8 slices in the other two directions reveal the other six 4 x 4 matrices.

If the data I wanted to store was ‘nice’, I could store a maximum of 12 x 4 x 4 (192 bits) on a 4 x 4 x 4 (64 bit) array; a compression factor of three. The less ‘nice’ the data is, the fewer slices would contain the information I want to store.

But I also need to store ordering information. There’s no point in storing numerals 0 to 9 all mixed up if I can’t retrieve them in an ordered form. So I take the slices out in a particular order (front vertical slice to back vertical slice, then top horizontal slice to bottom horizontal slice, then left vertical slice to right vertical slice) and look up the order in which the data is reconstructed.

So I form a ‘footer’ where the ordering information can be looked up. Each number from 0 to 9 can be represented by 4 bits (another 40 bits). So now my original 160 bits can be compressed to 104 bits.
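A quick sanity check of that arithmetic, assuming the ideal case where all ten numerals pack into one cube:

```python
# Bit accounting for the numeral example, following the post's arithmetic.
uncompressed = 10 * 4 * 4        # ten numerals at 16 bits each -> 160 bits
cube_bits = 4 * 4 * 4            # one packed 4 x 4 x 4 cube   -> 64 bits
footer_bits = 10 * 4             # 4 ordering bits per numeral -> 40 bits
compressed = cube_bits + footer_bits
print(uncompressed, compressed)  # 160 104
```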

An additional tricky thing I could do is rotate my 4 x 4 arrays to make them slot together more easily. The four orientations of rotation (0, 90, 180 and 270 degrees) could also be stored with the ‘footer’ information, although this will increase the footer size (from 40 to 60 bits in this example).
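A sketch of the rotation idea, using `np.rot90` to generate the four orientations and counting the enlarged footer (the '0' pattern is the one drawn above; the 2-bits-per-numeral encoding is my assumption of how the orientations would be indexed):

```python
import numpy as np

# The '0' numeral from the 4 x 4 dot-matrix example above.
zero = np.array([[0, 1, 1, 0],
                 [1, 0, 0, 1],
                 [1, 0, 0, 1],
                 [0, 1, 1, 0]])

# The four orientations (0, 90, 180 and 270 degrees) a packer could try
# when slotting slices into the cube; 2 bits per numeral index them.
rotations = [np.rot90(zero, k) for k in range(4)]

# Footer grows from 4 to 6 bits per numeral (4 ordering bits + 2 rotation bits).
footer_bits = 10 * (4 + 2)
print(footer_bits)  # 60
```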

The 4 x 4 matrix is just an example. Any sized matrix could be used.

To use this as a compression technique, data could be analysed and packed into 3D matrices of the most effective size. A ‘header’ could contain the matrix size.

I imagine compression would be ‘computationally expensive’, but decompression ‘computationally cheap’.

Note: this is just a thought experiment, I haven’t actually tried it.

-- xaviergisz,
Jan 04 2006

3D ambigram

A manifestation of this idea. 3D ambigrams were invented by Douglas Hofstadter. [xaviergisz, Jan 07 2007]

Reminds me of the old chestnut of sorting an image’s pixel data and then run-length encoding that data... Decompression algorithm is left as an exercise for the reader.

-- Dub,
Jan 07 2007

Not really. LZW builds up a tree out of overlapping linear fragments. It has nothing to do with two-dimensional or three-dimensional compression.

-- jutta,
Jan 08 2007
