
Which is the most successful scheme to reduce blocky artifacts in low-bit-rate JPEG images?

Started by walala November 16, 2003
Dear all,

After more reading, and with the answers I got here, I now have a clearer
picture of this interesting problem:

Given an image encoding/decoding system, a hardware implementation will
have round-off or finite-word-length error; and even with perfect
floating-point arithmetic, there are still blocking artifacts.

I don't know whether the two can be treated in a unified way... I want to
ask: which scheme is currently the best for combating blocking artifacts?

I searched the IEEE library, and there seem to be 1000+ papers on this
topic... There are a number of solutions, each with many variants... After
reading, I gather that every method has its advantages and disadvantages.
Currently I favor one scheme: use a low-pass filter, but do edge detection
to distinguish edge from non-edge regions and treat them differently... How
does this sound? I have also heard about POCS, but am not sure about it...
It took me several days to understand one technique and program it for
evaluation; at this rate it may take 3000+ days to find the best
solution...
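
For concreteness, the edge-adaptive idea might look like the following
Python/NumPy sketch (the Sobel edge test, the 3x3 box filter, and the
threshold value are illustrative assumptions, not a tuned deblocker):

    import numpy as np
    from scipy import ndimage

    def edge_adaptive_deblock(img, edge_thresh=30.0):
        # Low-pass everywhere, then keep the original pixels wherever the
        # gradient magnitude says there is an edge worth preserving.
        img = img.astype(float)
        gx = ndimage.sobel(img, axis=1)
        gy = ndimage.sobel(img, axis=0)
        edges = np.hypot(gx, gy) > edge_thresh          # crude edge map
        smoothed = ndimage.uniform_filter(img, size=3)  # 3x3 low-pass
        return np.where(edges, img, smoothed)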

Anybody give me some pointers? Thanks a lot in advance!

-Walala


Hi,

> I don't know whether the two can be treated in a unified way... I want to
> ask: which scheme is currently the best for combating blocking artifacts?
An encoding scheme that doesn't use blocks... There are many. JPEG2000 is one of them. You can even write up your own simplistic wavelet coder without much hassle (my students are doing it this semester... ;-)
> Currently I favor one scheme: use a low-pass filter, but do edge detection
> to distinguish edge from non-edge regions and treat them differently... How
> does this sound? I have also heard about POCS, but am not sure about it...
> It took me several days to understand one technique and program it for
> evaluation; at this rate it may take 3000+ days to find the best solution...
There is no "best" solution. There are several, and the right one depends
on your preferences, on the input data and its type, and on which errors
you are willing to tolerate. JPEG artifacts look different than wavelet
artefacts (e.g. JPEG2000).

So long,
  Thomas
Thomas Richter wrote:
> Hi,
>
>> I don't know whether the two can be treated in a unified way... I want to
>> ask: which scheme is currently the best for combating blocking artifacts?
>
> An encoding scheme that doesn't use blocks... There are many. JPEG2000 is
> one of them. You can even write up your own simplistic wavelet coder
> without much hassle (my students are doing it this semester... ;-)
I did a little JPEG2000-style codec in MATLAB at Uni... not too bad. We
weren't allowed to use the wavelet toolbox :-P

Ben
--
I'm not just a number. To many, I'm known as a String...
"Ben Pope" <spam@hotmail.com> wrote in message
news:bpa7i3$1lg4jl$1@ID-191149.news.uni-berlin.de...
> Thomas Richter wrote:
> > Hi,
> >
> >> I don't know whether the two can be treated in a unified way... I want
> >> to ask: which scheme is currently the best for combating blocking
> >> artifacts?
> >
> > An encoding scheme that doesn't use blocks... There are many. JPEG2000
> > is one of them. You can even write up your own simplistic wavelet coder
> > without much hassle (my students are doing it this semester... ;-)
>
> I did a little JPEG2000-style codec in MATLAB at Uni... not too bad. We
> weren't allowed to use the wavelet toolbox :-P
Dear Ben,

I have heard a lot of buzz about JPEG 2000 and am very interested in trying
it out... I have been looking around for MATLAB code but found none...
Could you please let me take a look at your code, if that is possible and
if it is not secret...?

Thank you very much for your answer!

Best,
Michael
"Thomas Richter" <thor@cleopatra.math.tu-berlin.de> wrote in message
news:bpa4p6$6nh$3@mamenchi.zrz.TU-Berlin.DE...
> Hi,
>
> > I don't know whether the two can be treated in a unified way... I want
> > to ask: which scheme is currently the best for combating blocking
> > artifacts?
>
> An encoding scheme that doesn't use blocks... There are many. JPEG2000 is
> one of them. You can even write up your own simplistic wavelet coder
> without much hassle (my students are doing it this semester... ;-)
>
> > Currently I favor one scheme: use a low-pass filter, but do edge
> > detection to distinguish edge from non-edge regions and treat them
> > differently... How does this sound? I have also heard about POCS, but
> > am not sure about it... It took me several days to understand one
> > technique and program it for evaluation; at this rate it may take 3000+
> > days to find the best solution...
>
> There is no "best" solution. There are several, and the right one depends
> on your preferences, on the input data and its type, and on which errors
> you are willing to tolerate. JPEG artifacts look different than wavelet
> artefacts (e.g. JPEG2000).
>
> So long,
>   Thomas
Dear Prof. Thomas,

Thanks for your answer. Is there any survey showing which solutions are at
least good compared with the others, for low-bit-rate JPEG blocking
artifacts? I guess for now I'd better look at JPEG artifacts...

Best,
-Walala
walala wrote:
> Dear Ben,
>
> I have heard a lot of buzz about JPEG 2000 and am very interested in
> trying it out... I have been looking around for MATLAB code but found
> none... Could you please let me take a look at your code, if that is
> possible and if it is not secret...?
>
> Thank you very much for your answer!
Can't find the code now, but you basically have to do this:

You need two filters, the simplest pair being low [1 1] and high [-1 1]
(you'll need to normalise those, so divide by root 2).

Then you create yourself 4 small images with a combination of those filters
(the two dimensions are linearly separable, so you can filter one way, then
filter the transposed image):

CA = Low, Low
CH = High, Low
CV = Low, High
CD = High, High

You then subsample each of those images (just grab every other pixel). If
you rebuild the image as:

 ___________
|  CA |  CH |
|_____|_____|
|  CV |  CD |
|_____|_____|

you'll find it takes up the same amount of space, but most of the energy
will be in the top-left square. You can then recurse into that area and do
the same thing:

 ___________
|__|__|  CH |
|__|__|_____|
|  CV |  CD |
|_____|_____|

The more you recurse, the more you push the energy to the top left of the
image. You can then compress it much better, since you effectively have a
sparse representation. What you usually do is dead-band quantise the
wavelet coefficients (a dead band that's 1-2 buckets wide is usual), then
entropy code them (arithmetic, Huffman, etc.) to compress them losslessly.

The decoding is pretty much the reverse: when you get back to your CA, CH,
CV, CD you just upsample, filter as before for each quadrant, and add all
the images together to form your decoded image.

Ben
--
I'm not just a number. To many, I'm known as a String...
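
To make the recipe above concrete, here is a minimal one-level sketch in
Python/NumPy (a translation of the idea, not Ben's MATLAB code; the
dead-band step size is made up):

    import numpy as np

    def haar2d_level(x):
        # One level of the scheme above: low = (a + b)/sqrt(2),
        # high = (b - a)/sqrt(2) on adjacent pairs, filtering and
        # subsampling columns first, then rows (the two dimensions are
        # separable). x must have even height and width.
        def split(y, axis):
            a = np.take(y, range(0, y.shape[axis], 2), axis=axis)
            b = np.take(y, range(1, y.shape[axis], 2), axis=axis)
            return (a + b) / np.sqrt(2), (b - a) / np.sqrt(2)
        lo, hi = split(x, axis=1)
        ca, cv = split(lo, axis=0)
        ch, cd = split(hi, axis=0)
        return ca, ch, cv, cd

    def haar2d_inverse(ca, ch, cv, cd):
        # Exact inverse: a = (lo - hi)/sqrt(2), b = (lo + hi)/sqrt(2).
        def merge(lo, hi, axis):
            shape = list(lo.shape)
            shape[axis] *= 2
            y = np.empty(shape)
            even = [slice(None)] * 2; even[axis] = slice(0, None, 2)
            odd = [slice(None)] * 2; odd[axis] = slice(1, None, 2)
            y[tuple(even)] = (lo - hi) / np.sqrt(2)
            y[tuple(odd)] = (lo + hi) / np.sqrt(2)
            return y
        lo = merge(ca, cv, axis=0)
        hi = merge(ch, cd, axis=0)
        return merge(lo, hi, axis=1)

    # Round trip, plus an illustrative dead-band quantiser:
    img = np.random.rand(64, 64)
    ca, ch, cv, cd = haar2d_level(img)
    assert np.allclose(haar2d_inverse(ca, ch, cv, cd), img)
    q = 0.05                                  # made-up step size
    dead = lambda c: np.where(np.abs(c) < q, 0.0, np.round(c / q) * q)
    approx = haar2d_inverse(ca, dead(ch), dead(cv), dead(cd))

Recursing on CA (instead of reconstructing) builds the multi-level pyramid
shown in the diagrams.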
"walala" <mizhael@yahoo.com> wrote

> Anybody give me some pointers? Thanks a lot in advance!
Try http://citeseer.nj.nec.com/kim98fixedpoint.html

BTW, if you are using a coarse enough DCT approximation that reconstruction
gets you blocks, you are better off using a transform like binDCT... a fast
integer DCT approximation, built entirely from ladder structures and
therefore invertible (lossless).

Marco
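
The ladder-structure property is easy to demonstrate in isolation. Below is
a generic two-channel ladder (lifting) pair in Python, not the actual
binDCT factorization; the multiplier values 0.5 and 0.25 are made up for
illustration:

    def forward_pair(a, b):
        # Each step adds a rounded function of the *other* channel, so it
        # can be undone exactly despite the integer rounding.
        d = b - int(round(0.5 * a))      # predict step
        s = a + int(round(0.25 * d))     # update step
        return s, d

    def inverse_pair(s, d):
        a = s - int(round(0.25 * d))     # undo update (same d available)
        b = d + int(round(0.5 * a))      # undo predict (a now recovered)
        return a, b

    for a, b in [(7, 3), (-120, 45), (255, 0)]:
        assert inverse_pair(*forward_pair(a, b)) == (a, b)  # lossless

binDCT chains steps of exactly this shape (with dyadic multipliers) to
approximate the DCT while keeping the integer transform perfectly
invertible.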
"Marco Al" <m.f.al@student.utwente.nl>

> "walala" <mizhael@yahoo.com> wrote > > > Anybody give me some pointers? Thanks a lot in advance! > > Try http://citeseer.nj.nec.com/kim98fixedpoint.html
Oops, replied in the wrong thread. I should say that if you need deblocking
to deal with artifacts from the transform even without further quantization
of the coefficients, then you are doing something very, very wrong.

That said, if you get your transform to work right, start doing lossy
coding with it, and want to implement deblocking for that... I was
impressed by the simplicity of
http://www.iti.gr/db.php/en/publications/details/445.html

Basically it just takes a weighted average of a coefficient and the
corresponding coefficients in the 8 neighbouring blocks, and then clips the
result to the quantization boundaries. For something so simple and fast,
the results in the paper looked pretty good.

Marco
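
A rough Python/NumPy sketch of that scheme, assuming the decoder exposes
the dequantised DCT coefficients as an (H, W, 8, 8) array; the centre
weight and the uniform neighbour weighting are guesses for illustration,
not the weights from the paper:

    import numpy as np

    def deblock_coeffs(C, Q, w_center=0.6):
        # C: (H, W, 8, 8) dequantised DCT coefficients, one 8x8 block per
        # (H, W) position. Q: (8, 8) quantiser step sizes.
        H, W = C.shape[:2]
        P = np.pad(C, ((1, 1), (1, 1), (0, 0), (0, 0)), mode="edge")
        neigh = np.zeros_like(C)
        for di in range(3):              # sum the 8 neighbouring blocks
            for dj in range(3):
                if (di, dj) != (1, 1):
                    neigh += P[di:di + H, dj:dj + W]
        smoothed = w_center * C + (1.0 - w_center) * neigh / 8.0
        # Clip each coefficient back to its quantisation cell so the
        # result stays consistent with what was actually decoded.
        return np.clip(smoothed, C - Q / 2.0, C + Q / 2.0)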
Hi,

> I have heard a lot of buzz about JPEG 2000 and am very interested in
> trying it out... I have been looking around for MATLAB code but found
> none... Could you please let me take a look at your code, if that is
> possible and if it is not secret...?
A very nice version for Java can be found here: http://jj2000.epfl.ch/

So long,
  Thomas
In comp.compression walala <mizhael@yahoo.com> wrote:

> Thanks for your answer. Is there any survey showing which solutions are
> at least good compared with the others, for low-bit-rate JPEG blocking
> artifacts? I guess for now I'd better look at JPEG artifacts...
Well, a lot of tests were made by the JPEG folks at the ISO when JPEG2000
was launched; back then it wasn't clear whether the next standard would be
DCT- or wavelet-based. The result is known... Unfortunately, I don't think
those internal tests are publicly available.

Concerning the traditional JPEG blocking artefacts: there are also means
against them. The basic idea is to make smart use of the DCT DC
coefficients and post-process them. This is of course a kind of
"workaround" that seems to work fine for natural images. This kind of
de-blocking only works for the blocking artefacts created at high
compression rates, though, not for general JPEG DCT artefacts.

So long,
  Thomas