Reply by Ron N May 27, 2008
On May 26, 10:41 pm, "Green Xenon [Radium]" <gluceg...@excite.com>
wrote:
> It is an inadequate amount of pixels-per-area that causes blocking.
Inadequate pixel density does not have to cause blocking unless you display those pixels as adjacent rectangles with sharp edges of sufficient difference in luminance or color. If you do, then even a very high pixel-per-area count image and display might still look "blocky" under a good magnifier. IMHO. YMMV.
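To make that reconstruction point concrete, here is a minimal NumPy sketch (the 16x16 ramp, the x16 upsampling factor, and the helper names are arbitrary choices for illustration, not anything from the thread): the very same low-resolution samples look blocky when each sample is shown as a flat rectangle, and far less blocky when a simple bilinear interpolator is used instead.

import numpy as np

def nearest_upsample(img, factor):
    """Zero-order hold: every sample becomes a flat rectangle with sharp edges."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def bilinear_upsample(img, factor):
    """Bilinear reconstruction: linear interpolation between neighboring samples."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# Low-resolution "capture": a smooth diagonal ramp sampled on a 16x16 grid.
lo = np.fromfunction(lambda y, x: (x + y) / 30.0, (16, 16))

blocky = nearest_upsample(lo, 16)    # visible rectangles with sharp edges
smooth = bilinear_upsample(lo, 16)   # same samples, much less visible "blocking"

# The sharp-edged reconstruction has pixel-to-pixel jumps about 16x larger.
print(np.abs(np.diff(blocky, axis=1)).max(), np.abs(np.diff(smooth, axis=1)).max())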
Reply by Green Xenon [Radium] May 27, 2008
Ron N wrote:


> On May 26, 10:13 am, "Green Xenon [Radium]" <gluceg...@excite.com>
> wrote:
>> You say pixelation/jaggies have nothing to do with aliasing. Then why
>> does an insufficient pixelXpixel resolution cause an image to pixelate?
> Pixels can become visible because of the addition of high-frequency
> edges during the imaging process.
Even smooth, too-high-frequency, sine-wave-like spatial signals can cause pixels to become visible as well, right? The high-frequency signal does not have to have sharp components [i.e. waveforms other than a sine] to cause 'blocking'.
> It depends on whether you are talking about the pixel resolution
> of the capture, the compressed storage format, the decompressed
> image format, or the display format, and/or any conversion or
> filtering processes between any of the above, and also between
> the display and the back of the viewer's eyeballs.
I am definitely talking about the compressed storage format and decompressed image format. Maybe also the capture.
Reply by Green Xenon [Radium] May 27, 2008
bharat pathak wrote:
> Hi,
>
>    Let's look at it from a sampling theorem perspective. Any
>    ADC has 2 stages.
>
>    1. first  stage that performs time  discretization.
>    2. second stage that performs value discretization.
>
>    Time discretization will tell you how fine a pixel can be:
>    basically the resolution of the image, like 512x512 pixels
>    or 1920x1080 pixels. One aspect of an image looking coarse
>    or fine can come from this. This is exactly like your
>    digital camera specifying 5 megapixels or 10 megapixels.
>    If you are not making hoardings out of your photos, then
>    you really don't have to buy a 10-megapixel camera. So
>    this pixel density makes sense only when you say what size
>    of display in inches you are going to display or print the
>    image at, and what distance you will be viewing it from.
>
>    The second aspect is pixel depth (or value quantization).
>    Let us assume that a grayscale pixel could be represented
>    using an 8-bit depth. If instead you represent it using a
>    4-bit depth, the image is coarsely quantized and this causes
>    contouring artifacts. The artifacts that you see on YouTube
>    fall under this category: originally the video would look
>    fine, but due to compression you see blocking artifacts
>    (which you refer to as pixelation/jagginess).
>
> Hope this clears things up a bit.
>
> Bharat Pathak
>
> Arithos Designs
> www.Arithos.com
>
> DSP Design Consultancy and Training Company.
http://www.usu.edu/sanderso/multinet/depth.html

"Pixel Depth refers to the number of colors possible on screen."

Sorry to be such a pain, but an inadequate pixel depth does not cause 'blocking'; it causes there to be fewer colors available. It is an inadequate amount of pixels-per-area that causes blocking.

Once again, sorry if you feel my responses are annoying. I've got a neurological disorder which I would like to discuss briefly. I am not trying to make excuses for any of my posts, but I don't want readers to wrongly assume that I troll/spam the net just for attention, nor do I want them to think I am lazy or unwilling to do my own research. I am really interested in the stuff I post about and hope that the readers will not get upset at me.

I have a neurological disability called Asperger's Syndrome. I would like to give you all some information about my disability. The reason I am posting this message about Asperger's is to help avoid any potential misunderstandings [though it's probably too late]. I have been diagnosed with Asperger's Syndrome (AS). AS is a neurological condition that causes significant impairment in social interactions. People with AS see the world differently and this can often bring them into conflict with conventional ways of thinking. They have difficulty in reading body language and interpreting subtle cues. In my situation, I have significant difficulty with natural conversation, reading social cues, and maintaining eye contact. This can lead to a great deal of misunderstanding about my intent or my behavior. For example, I may not always know what to say in social situations, so I may look away or may not say anything. I also may not always respond quickly when asked direct questions, but if given time I am able to express my ideas. AS also decreases my attention span and makes it difficult for me to do as much mental work for as long as most others can. It also impairs my short-term memory and ability to retain information. On the internet, the cyber-equivalent of my disability is probably noticed. I do apologize profusely for any inconvenience it causes.

Thank you very much in advance for your understanding, cooperation, and assistance.
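For what it's worth, here is a small NumPy sketch of the distinction being drawn here (the gradient test image, the 8x downsampling, and the 4-bit depth are arbitrary illustration choices): cutting pixels-per-area and showing each sample as a flat square gives blocking on a fixed grid, while keeping full resolution and cutting pixel depth gives banding/contouring, a different artifact.

import numpy as np

# Smooth full-depth grayscale gradient, 256x256, values in [0, 1].
img = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))

# 1) Inadequate pixels-per-area: keep only every 8th sample, then show each
#    sample as a flat 8x8 square.  Discontinuities now land on a block grid.
low_res = img[::8, ::8]
blocking = np.repeat(np.repeat(low_res, 8, axis=0), 8, axis=1)

# 2) Inadequate pixel depth: keep full resolution but quantize to 4 bits
#    (16 levels).  Smooth areas break into flat bands (contouring).
levels = 16
banding = np.round(img * (levels - 1)) / (levels - 1)

print("distinct gray levels at full depth:", len(np.unique(img)))
print("distinct gray levels at 4 bits    :", len(np.unique(banding)))
# Where the jumps sit: every 8 columns for blocking, at band edges for contouring.
print("first jump columns, low resolution:", np.flatnonzero(np.abs(np.diff(blocking[0])) > 0)[:4])
print("first jump columns, low bit depth :", np.flatnonzero(np.abs(np.diff(banding[0])) > 0)[:4])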
Reply by bharat pathak May 26, 2008
Hi,

   Let's look at it from a sampling theorem perspective. Any
   ADC has 2 stages.

   1. first  stage that performs time  discretization.
   2. second stage that performs value discretization.

   Time discretization will tell you how fine a pixel can be:
   basically the resolution of the image, like 512x512 pixels
   or 1920x1080 pixels. One aspect of an image looking coarse
   or fine can come from this. This is exactly like your
   digital camera specifying 5 megapixels or 10 megapixels.
   If you are not making hoardings out of your photos, then
   you really don't have to buy a 10-megapixel camera. So
   this pixel density makes sense only when you say what size
   of display in inches you are going to display or print the
   image at, and what distance you will be viewing it from.

   The second aspect is pixel depth (or value quantization).
   Let us assume that a grayscale pixel could be represented
   using an 8-bit depth. If instead you represent it using a
   4-bit depth, the image is coarsely quantized and this causes
   contouring artifacts. The artifacts that you see on YouTube
   fall under this category: originally the video would look
   fine, but due to compression you see blocking artifacts
   (which you refer to as pixelation/jagginess).
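A rough NumPy sketch of the compression side of this, under the assumption that the codec, like JPEG/MPEG-style coders, quantizes 8x8 DCT coefficients; the test ramp, the DCT matrix, and the quantization step below are illustrative, not any real codec's tables. Coarse coefficient quantization flattens each block, so the discontinuities land exactly on the 8-pixel block grid, which is the "blocking" being described.

import numpy as np

N = 8
k = np.arange(N)
# Orthonormal DCT-II basis matrix (rows are frequencies, columns are samples).
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
C[0, :] = np.sqrt(1.0 / N)

def blockwise_dct_quantize(img, step):
    """DCT each 8x8 block, quantize the coefficients with a uniform step, invert."""
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(0, h, N):
        for x in range(0, w, N):
            block = img[y:y + N, x:x + N]
            coef = C @ block @ C.T                    # forward 2-D DCT
            coef_q = np.round(coef / step) * step     # coarse quantization
            out[y:y + N, x:x + N] = C.T @ coef_q @ C  # inverse 2-D DCT
    return out

# Smooth test image: a gentle 2-D ramp, 64x64, values in [0, 1].
img = np.fromfunction(lambda y, x: (x + 2 * y) / 192.0, (64, 64))

rough = blockwise_dct_quantize(img, step=0.25)   # heavy quantization

# The reconstructed discontinuities line up with the 8-pixel block grid.
col_jumps = np.abs(np.diff(rough, axis=1)).mean(axis=0)
boundaries = np.arange(7, 63, 8)
print("mean jump at block boundaries:", col_jumps[boundaries].mean())
print("mean jump inside blocks      :", np.delete(col_jumps, boundaries).mean())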

Hope this clears things up a bit.

Bharat Pathak

Arithos Designs
www.Arithos.com

DSP Design Consultancy and Training Company.




Reply by Ron N May 26, 2008
On May 26, 10:13 am, "Green Xenon [Radium]" <gluceg...@excite.com>
wrote:
> You say pixelation/jaggies have nothing to do with aliasing. Then why
> does an insufficient pixelXpixel resolution cause an image to pixelate?
Pixels can become visible because of the addition of high-frequency edges during the imaging process. It depends on whether you are talking about the pixel resolution of the capture, the compressed storage format, the decompressed image format, or the display format, and/or any conversion or filtering processes between any of the above, and also between the display and the back of the viewer's eyeballs. IMHO. YMMV.

--
rhn A.T nicholson d.0.t C-o-M
Reply by Green Xenon [Radium] May 26, 2008
bharat pathak wrote:


> No, it is not a concept of aliasing. It is a concept of
> imaging. When you are zooming into the image you will
> see a pixelated image (or jaggies). Zooming in is the
> process of interpolation by means of pixel duplication.
> So whenever we interpolate data, new frequencies get
> created, called "images", by means of "zero insertion";
> now we need to filter the "images". If improper filtering
> is applied, this causes jagginess or pixelation. Hence
> a solution to overcome this problem is a technique
> called DCDi (patented by Faroudja; now the patent is
> with STM). The full form of DCDi is direction-correlated
> diagonal interpolation. Search Google and you will hit a
> page which explains DCDi. Hence jagginess or pixelation
> is not a concept of aliasing but a concept of "imaging".
You say pixelation/jaggies have nothing to do with aliasing. Then why does an insufficient pixelXpixel resolution cause an image to pixelate?

My monitor has a pixelXpixel resolution of 1280 x 1024. Many of the compressed videos on YouTube, as well as many movies generally available online, have an insufficient pixelXpixel resolution, and this causes pixelation/jaggies. Why does it cause pixelation/jaggies if pixelation/jaggies have nothing to do with aliasing?

AFAIK, a higher pixelXpixel resolution allows for higher spatial frequencies without spatial distortion.

http://en.wikipedia.org/wiki/Spatial_frequency
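A brief NumPy sketch of the Nyquist point behind this question (the spatial frequency, grid size, and decimation factor are made-up numbers, and a 1-D slice stands in for a 2-D image): a spatial sinusoid above half the pixel sampling rate does not just look coarse, it shows up at a different, lower spatial frequency after sampling, which is what spatial aliasing means.

import numpy as np

fs_fine = 256                     # "dense" reference grid, samples per unit length
x_fine = np.arange(fs_fine) / fs_fine

f_true = 100.0                    # cycles per unit length in the scene
scene = np.sin(2 * np.pi * f_true * x_fine)   # a 1-D slice of a spatial pattern

# Capture it with only 128 pixels per unit length (Nyquist limit = 64 < 100).
decim = 2
sampled = scene[::decim]
fs_coarse = fs_fine // decim

spectrum = np.abs(np.fft.rfft(sampled))
f_apparent = np.argmax(spectrum) * fs_coarse / len(sampled)
print("spatial frequency in the scene:", f_true)
print("frequency seen after sampling :", f_apparent)            # 28.0, the alias
print("predicted folded frequency    :", abs(fs_coarse - f_true))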
Reply by Green Xenon [Radium] May 26, 2008
bharat pathak wrote:


> An example of spatial aliasing is the cart wheels
> in a movie: when captured by a motion-picture camera
> at 24 Hz they seem to roll in the opposite direction,
> and then back in the same direction.
That's temporal aliasing, not spatial aliasing.
Reply by bharat pathak May 26, 2008
An example of spatial aliasing is the cart wheels
in a movie: when captured by a motion-picture camera
at 24 Hz they seem to roll in the opposite direction,
and then back in the same direction.

This is a phenomenon of temporal frequencies getting
aliased. When they get aliased, it shows up in the 2D
image sequence. Some companies claim that they can
correct such artifacts; even if possible, it will
require a lot of hardware to achieve, like frame
buffers, and the advantage you gain will be small
compared with just living with the phenomenon.
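A tiny Python sketch of that wagon-wheel effect (the 24 fps frame rate is from the post; the spoke count and wheel speeds are invented for illustration): once the spoke-passing frequency exceeds half the frame rate, the apparent rotation folds back and can even reverse.

frame_rate = 24.0        # frames per second (film), as in the post
spokes = 8               # invented: number of identical spokes on the wheel

def apparent_rate(rev_per_sec):
    """Apparent spoke-passing frequency (Hz) after sampling at the frame rate."""
    spoke_freq = spokes * rev_per_sec
    # Fold into (-frame_rate/2, +frame_rate/2], the only range a 24 fps
    # sampler can represent.  Negative means the wheel appears to roll backwards.
    return ((spoke_freq + frame_rate / 2) % frame_rate) - frame_rate / 2

for rev_per_sec in (0.5, 1.4, 2.8, 3.2):
    print(rev_per_sec, "rev/s ->",
          spokes * rev_per_sec, "spoke passes/s ->",
          round(apparent_rate(rev_per_sec), 2), "Hz apparent")
# 0.5 rev/s (4.0 passes/s)  ->  +4.0 : looks correct
# 1.4 rev/s (11.2 passes/s) -> +11.2 : still correct, just under Nyquist (12)
# 2.8 rev/s (22.4 passes/s) ->  -1.6 : appears to roll slowly backwards
# 3.2 rev/s (25.6 passes/s) ->  +1.6 : appears to creep forward again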

Regards
Bharat Pathak

Arithos Designs
www.Arithos.com


Reply by bharat pathak May 26, 2008
> Are pixelation/jaggies caused by spatial aliasing?
No, it is not a concept of aliasing. It is a concept of imaging. When you are zooming into the image you will see a pixelated image (or jaggies). Zooming in is the process of interpolation by means of pixel duplication. So whenever we interpolate data, new frequencies get created, called "images", by means of "zero insertion"; now we need to filter the "images". If improper filtering is applied, this causes jagginess or pixelation. Hence a solution to overcome this problem is a technique called DCDi (patented by Faroudja; now the patent is with STM). The full form of DCDi is direction-correlated diagonal interpolation. Search Google and you will hit a page which explains DCDi. Hence jagginess or pixelation is not a concept of aliasing but a concept of "imaging".

Regards
Bharat Pathak

Arithos Designs
www.Arithos.com

DSP Design Consultancy and Training Company.
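A rough 1-D NumPy sketch of the "imaging" idea described above (the test signal, lengths, and the crude brick-wall low-pass are arbitrary choices): zero insertion repeats the spectrum, pixel duplication only partly suppresses those repeats, and a proper interpolation filter removes them; the leftover image energy is what shows up as jaggies.

import numpy as np

n = np.arange(64)
x = np.sin(2 * np.pi * 5 * n / 64)     # low-frequency signal: 5 cycles in 64 samples

L = 4                                  # upsampling factor
up = np.zeros(len(x) * L)
up[::L] = x                            # zero insertion: the spectrum now repeats ("images")

hold = np.repeat(x, L)                 # pixel duplication = zero insertion + zero-order hold

# An idealized interpolation filter: keep only the original baseband.
spec = np.fft.fft(up)
cutoff = len(x) // 2
lp = np.zeros_like(spec)
lp[:cutoff] = spec[:cutoff] * L              # positive baseband bins (gain L restores amplitude)
lp[-cutoff + 1:] = spec[-cutoff + 1:] * L    # matching negative-frequency bins
filtered = np.fft.ifft(lp).real

def image_energy(sig):
    """Fraction of energy sitting above the original baseband (the 'images')."""
    s = np.abs(np.fft.fft(sig)) ** 2
    return s[cutoff:-cutoff + 1].sum() / s.sum()

print("image energy, zero insertion   :", image_energy(up))        # 0.75: images at full strength
print("image energy, pixel duplication:", image_energy(hold))      # small but nonzero: leftover images = jaggies
print("image energy, proper filtering :", image_energy(filtered))  # essentially zero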
Reply by Green Xenon [Radium] May 26, 2008
bharat pathak wrote:


> Pixelation and jaggies, I think, are similar. Moire patterns
> are caused mainly when we do interlaced-to-progressive
> conversion. When doing I2P we take the help of motion: if
> pixels are static we do temporal interpolation, and if things
> are moving we do spatial interpolation. Sometimes the motion
> detection makes errors and, in areas where there is no motion,
> declares motion, and thus spatial deinterlacing takes place.
> This is mainly a problem in high-frequency areas: in the
> spatial domain we try to create info even though it is
> missing, and hence this creates moire patterns. I have a ppt
> explaining deinterlacing in simple terms; if interested, send
> me mail at bharat at arithos dot com.
Are pixelation/jaggies caused by spatial aliasing?
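For reference, here is a simplified NumPy sketch of the motion-adaptive deinterlacing described in the quoted post (the field size, threshold, synthetic fields, and function name are invented for illustration; real deinterlacers are far more elaborate): where the field-to-field difference is small, the missing lines are taken temporally, and where it is large they are interpolated spatially, which is exactly where the misdetections mentioned above can create artifacts.

import numpy as np

def deinterlace(prev_field, curr_field, top_field_first=True, thresh=0.05):
    """Rebuild a full frame from the current field, filling the missing lines."""
    h, w = curr_field.shape
    frame = np.zeros((2 * h, w))
    keep = slice(0, None, 2) if top_field_first else slice(1, None, 2)
    fill = slice(1, None, 2) if top_field_first else slice(0, None, 2)
    frame[keep] = curr_field

    # Spatial candidate ("bob"): average of the lines above and below the gap.
    below = np.vstack([curr_field[1:], curr_field[-1:]])
    spatial = 0.5 * (curr_field + below)

    # Temporal candidate ("weave"): the co-sited lines from the previous field.
    temporal = prev_field

    # Crude motion detector: a large field-to-field difference means motion.
    motion = np.abs(curr_field - prev_field) > thresh
    frame[fill] = np.where(motion, spatial, temporal)
    return frame

# Toy fields: a static gradient plus a bright bar that moves between fields.
h, w = 8, 16
base = np.tile(np.linspace(0.0, 1.0, w), (h, 1))
prev_field = base.copy(); prev_field[:, 3] = 1.0     # bar at column 3
curr_field = base.copy(); curr_field[:, 8] = 1.0     # bar has moved to column 8

frame = deinterlace(prev_field, curr_field)
print(frame.shape)   # (16, 16): a full progressive frame built from one field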