Product: Camera: Digital: Image Format
Penrose Pixels   (+5)
Aperiodic arrangement of image elements

Pixels. Computer displays and digital cameras have them. They are almost always arranged in a regular rectangular grid.

One of the failings of this is that pointing a camera at a computer display, or in fact at anything with a fine, regular pattern, will produce the aliasing known as Moiré. To counteract this, most cameras have an anti-aliasing filter which simply blurs the image slightly before it reaches the sensor. Manufacturing limits mean these filters are not perfect, so most of them are a tradeoff between Moiré suppression and a loss of overall sharpness in the image.
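
The effect is easy to see in a toy numpy sketch (numbers invented for illustration): a grating finer than the pixel grid can resolve comes off the sensor as exactly the samples of a much coarser banding pattern.

```python
import numpy as np

# A fine grating at 9 cycles/unit, "photographed" by a sensor with
# 10 pixels/unit (so the Nyquist limit is 5 cycles/unit).
n = 10
x = np.arange(n) / n
samples = np.sin(2 * np.pi * 9 * x)

# The samples are exactly those of a slow 1-cycle pattern (sign
# flipped): the fine detail masquerades as coarse Moire banding.
alias = -np.sin(2 * np.pi * 1 * x)
print(np.allclose(samples, alias))  # True
```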

I propose a fundamental re-think of pixel positions by exploiting the fascinating properties of Penrose tiling. These non-periodic tile patterns use a simple set of pieces, but the resulting arrangement has no translational symmetry, that is, it never repeats.

Slightly more processing power would be needed to turn the Penrose Pixel image into a conventional one; similarly, a Penrose Pixel display would need to pre-process the video feed heading to it. Both tasks should be well within the capabilities of modern hardware.

The resulting pseudo-random arrangement of pixels would effectively have spatial dithering built-in, so even without an antialiasing filter Moiré would not be visible.
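
A rough numpy sketch of the dithering claim, with pseudo-random jitter standing in for the actual Penrose geometry: on a regular grid a fine pattern and its alias produce identical samples, but jittering the sample sites breaks that coincidence, so the mismatch shows up as spread-out noise rather than coherent banding.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
grid = np.arange(n) / 10.0              # regular grid, 10 samples/unit
jitter = rng.uniform(-0.03, 0.03, n)    # ~30% of the pixel pitch

fine = lambda t: np.sin(2 * np.pi * 9 * t)   # fine 9-cycle pattern
coarse = lambda t: -np.sin(2 * np.pi * t)    # its regular-grid alias

# Regularly sampled, the two patterns give identical values...
print(np.allclose(fine(grid), coarse(grid)))  # True

# ...but at jittered sites they disagree, so coherent Moire banding
# cannot form; the disagreement is spread out as broadband noise.
err = fine(grid + jitter) - coarse(grid + jitter)
print(np.abs(err).max() > 0.3)  # True
```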
-- mitxela, May 17 2018

Penrose tiling
[mitxela, May 17 2018]

Could you achieve the same thing more simply by just using regular pixels but randomly offsetting each one by a small amount? It seems to me that mapping from normal pixels to Penrose pixels would be a nightmare, but a screen (or camera sensor) in which each pixel was offset in X and Y by a random amount up to (say) 10% of its size might achieve the same result.
-- MaxwellBuchanan, May 17 2018

If the display had the same Penrose tiling as the camera sensor, the mapping wouldn't be an issue (how to turn the non-regular 2D "grid" into a 1D string of 1s and 0s is another matter...).
However, that would mean that ALL cameras and displays would need to have exactly the same tiling; no "More Megapixels!!" and such.
-- neutrinos_shadow, May 17 2018

I'm pretty sure what the main text is describing can only be accurate some of the time. Too many irregular edges of things getting imaged won't exactly match the irregularities of the camera or screen tiling system.

The simplest solution is actually to do super-high-resolution pixels, such that they are so small as to be unnoticeable by the eye. In this case the edges at that scale can be jaggy, but since the jags are too small to be noticed, the problem is basically solved.
-- Vernon, May 17 2018

[Max] It's true that adding spatial jitter would have the same effect, but in order to reconstruct the image it helps if the jitter is pseudorandom, so the offsets can all be calculated from a starting seed. Penrose tiling seemed like a cool way to achieve that.

[neut] Potentially cameras with more megapixels could just continue to extend the pattern edges.

[Vern] This is mostly true, and certainly just upping the resolution is a more practical solution, but Moiré in particular will never be eliminated this way. It is the same effect as getting a beat frequency when two high-pitched sounds are mixed.
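
The beat-frequency analogy can be checked directly (tone frequencies arbitrary): summing two nearby tones is identical to a carrier at the mean frequency modulated by a slow envelope, heard as a beat at the difference frequency.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 2000)
f1, f2 = 440.0, 443.0                    # two close high-pitched tones

two_tones = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Product form: carrier at the mean frequency (441.5 Hz) times a slow
# envelope at (f2 - f1)/2 = 1.5 Hz, perceived as a 3 Hz beat.
carrier = np.sin(2 * np.pi * (f1 + f2) / 2 * t)
envelope = np.cos(2 * np.pi * (f2 - f1) / 2 * t)
print(np.allclose(two_tones, 2 * carrier * envelope))  # True
```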
-- mitxela, May 17 2018

[bigs] Are you saying the idea wouldn't work?
-- mitxela, May 17 2018

Storing images in this way would make any kind of image processing algorithms a nightmare. Also, it would make zooming into or otherwise re-scaling an image onscreen nearly impossible.
-- notexactly, May 18 2018

When your screen seems to shine like a big piece of sky, that's a moiré
When you've jittery pics 'cos they're not non-periodic, that's a moiré
When it stops making sense 'cos there's wave interference, that's a moiré
When those shimmering lines look like mad porcupines, that's a moiré

etc., etc...
-- hippo, May 18 2018

^Damn, you beat me to it
-- not_morrison_rm, May 18 2018

^ and me, but mine wouldn't have been as good.

I'm going to claim pre-emptive delegation as a skill now.
-- Loris, May 18 2018

How about cameras?
-- Dub, May 18 2018

So, yes - camera sensors could have a Penrose pixel pattern too, but I think this idea would only work if everything, camera sensors, image file formats, display screens, etc., had exactly the same Penrose pattern.
-- hippo, May 19 2018

But that would make all image processing (even just displaying images at different sizes or locations on the same screen) virtually impossible. You wouldn't even be able to display text on the screen without first printing it out and taking a photo of it, unless you developed some algorithm to convert normal pixels to Penrose pixels. So I'm against that form of the idea. The way I read it, though, was that the camera and the display would do the conversion transparently (though I guess this would still require the same algorithm, but at least I as a software developer wouldn't have to implement it), so that all current software would still work.
-- notexactly, May 19 2018

Aliasing is a fundamental phenomenon of information theory. Strange sampling schemes won't help you avoid it.

"Just blur the image slightly" is provably the correct answer, even if it is stated somewhat simplistically here.
-- Wrongfellow, May 19 2018

[bigs] I think you are doing that thing where you drink wine and start talking about water wheels.

//Aliasing is a fundamental phenomenon of information theory. Strange sampling schemes won't help you avoid it.// Now this is a really interesting point, because that's what people used to believe, but under certain conditions signals can be reconstructed from fewer sampling points than Shannon-Nyquist would suggest. Look up "compressive sampling". In short, the aliasing is often characteristic, and so long as the signal is not too complex, the aliasing can be used to reconstruct the true signal.
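
A toy illustration of that point, with random sample times standing in for any aperiodic scheme and all parameters invented: a single tone well above the average-rate Nyquist limit can still be located, because at irregular sample times its aliases never add up coherently. (This is the "not too complex" condition: here the signal is a single sparse tone.)

```python
import numpy as np

rng = np.random.default_rng(1)
f_true = 130                              # Hz, above the 100 Hz limit
t = np.sort(rng.uniform(0.0, 1.0, 200))   # 200 *random* times in 1 s
y = np.cos(2 * np.pi * f_true * t)

# Regular sampling at 200 Hz would fold 130 Hz down to 70 Hz.  With
# random times, correlating against each candidate frequency (a
# matched filter, i.e. a brute-force nonuniform DFT) still peaks at
# the true tone: the aliasing energy is smeared into incoherent noise.
freqs = np.arange(200)
power = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t) @ y)
best = int(freqs[np.argmax(power)])
print(best)  # 130
```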

For this idea, though, we are not removing the aliasing, just smearing it out, in the same way that decimation filters use dithering to reduce quantization noise. The noise is still there, but instead of distinct peaks it has been smoothed out.
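
The quantization analogy in miniature (an illustrative toy, not any particular converter): rounding a constant 0.3-LSB signal without dither always gives 0, a purely systematic error, while adding half an LSB of uniform dither before rounding makes the average come out right, leaving only noise.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
signal = np.full(n, 0.3)                 # constant 0.3 LSB input

plain = np.round(signal)                 # no dither: always rounds to 0
dithered = np.round(signal + rng.uniform(-0.5, 0.5, n))

print(plain.mean())      # 0.0 -- a fixed distortion, not noise
print(dithered.mean())   # ~0.3 -- the error is now zero-mean noise
```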

[hippo] if only I could bun your anno...

[notexactly] Resampling from one format to another might be complex, but computers are good at it. Since the Penrose pattern is predictable it shouldn't be too hard. I imagine a Penrose Pixel display would be able to accept ordinary, rectangular pixel video input and have dedicated hardware on board to process it.
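
A crude sketch of that resampling step, with pseudo-random sites standing in for the real Penrose geometry and nearest-neighbour lookup standing in for whatever cleverer kernel dedicated hardware would use:

```python
import numpy as np

rng = np.random.default_rng(3)

# 500 scattered sample sites with known image values (a smooth test
# pattern); pseudo-random here, but any fixed Penrose layout would do.
pts = rng.uniform(0.0, 1.0, size=(500, 2))
vals = np.sin(3 * pts[:, 0]) * np.cos(3 * pts[:, 1])

# Resample onto a regular 16x16 grid: each output pixel simply takes
# the value of the nearest scattered sample.
g = np.linspace(0.0, 1.0, 16)
gx, gy = np.meshgrid(g, g)
grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
d2 = ((grid[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
resampled = vals[np.argmin(d2, axis=1)].reshape(16, 16)

# Compare against the true pattern on the grid: the reconstruction
# error stays small because the sites densely cover the image.
err = np.abs(resampled - np.sin(3 * gx) * np.cos(3 * gy)).max()
```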
-- mitxela, May 19 2018

I had a very quick look, and I don't think it contradicts my point so much as augments it.

Wikipedia: //A common goal of the engineering field of signal processing is to reconstruct a signal from a series of sampling measurements. In general, this task is impossible because there is no way to reconstruct a signal during the times that the signal is not measured. Nevertheless, with prior knowledge or assumptions about the signal, it turns out to be possible to perfectly reconstruct a signal from a series of measurements. //

The key is the "with prior knowledge" bit. Shannon and Nyquist's work has never been invalidated. Recent progress has all been about clever ways of using a priori information to kinda side-step it.
-- Wrongfellow, May 19 2018
