DSPRelated.com
Forums

(image process) are there any guidelines on how to design good filters to enhance an image?

Started by walala October 18, 2003
dear all,

I am facing the following problem and need your help:

In our experiments we have some special images that need to be processed. My
task is to look at these images and see how to improve/enhance them. In fact
we have no idea how much enhancement we can get.

So what I've done in the past month is to play with different kinds of
filters on the images, using PSNR as the judging metric. It turns out that a
3x3 Gaussian filter gives particularly high enhancement to the images,
compared with other filters provided by Matlab, such as "laplacian" and
"averaging"...
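For reference, the PSNR used as the metric throughout this thread is just
10*log10(peak^2 / MSE). A minimal pure-Python sketch (the function and
variable names are illustrative, not from the thread):

```python
# Sketch: PSNR between a reference image and a processed image,
# both given as 2-D lists of 8-bit pixel values.
import math

def psnr(reference, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    n = 0
    sq_err = 0.0
    for row_ref, row_proc in zip(reference, processed):
        for r, p in zip(row_ref, row_proc):
            sq_err += (r - p) ** 2
            n += 1
    mse = sq_err / n
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak * peak / mse)
```

For a toy 2x2 pair differing by one gray level in two pixels, this gives
about 51.1 dB, which is in the range the poster reports for a good filter.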

So my focus shifted to Gaussian-like filters: after playing with many of
them, it turns out that there are other non-Gaussian filters (numerically
similar to a Gaussian) which performed better than the Gaussian filter. So I
designed a "stupid" search program to look for the best one. After running
for one week, it gave a small 3x3 filter that could be called the best...
with the best PSNR...

But this procedure seems very ad hoc, and the best filter for one image
changes for another image. In reality there is no source image available to
compute PSNR against, nor a week on a supercomputer to search for another
filter.

Are there any guidelines on how to design good filters that are applicable
to most (if not all) images? Discussion should cover two cases: a) a source
image is available, so PSNR can be computed to compare results; b) no source
image is available, only the reconstructed images, so no PSNR can be
computed - how to design a "good" filter for this case?

I believe there are some great treatments of this; the problem is just that
I am a layman and don't even know where to find them...

Please give me some pointers!

Thanks a lot!

-Walala


"walala" <mizhael@yahoo.com> wrote in message news:<bmslat$l3c$1@mozo.cc.purdue.edu>...
> [original post quoted in full -- snipped]
Welcome to the world of practical signal processing! You have learned a
couple of lessons that professors and researchers who have spent decades
working on such matters need not even begin to comprehend:

- A filter or other processor that is highly optimized for one case need
  not be any better (and can in fact perform worse) than standard, "naive"
  filters in cases that were not included in the optimization training set.

- Trying to optimize performance on a known source (where you know what
  image has been degraded and try to recover it from noise) is a completely
  different problem from recovering an unknown image in noise.

My advice is to find as many image processing tools and learn as many
tricks as you possibly can. I used the book on image processing by Gonzalez
and Woods when I first did these kinds of things. There may be other,
better books available these days, I don't know. I would also learn as much
about implementations of these techniques as possible. Perhaps implementing
a naive median filter or histogram equalization does the job for you. There
may even be free image processing software packages around that you can use
and modify.

My point is that there may not be an easy way to find your filters. It may
be that what you need is some sort of interactive processing tool, where a
human does some manipulation of the images by means of computer software.
More often than not, the "scientist" can only provide the proper software
and some general guidelines on how to do the job, leaving the details to be
decided by the human operator who actually does the analysis work.

Rune
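The naive median filter Rune mentions is only a few lines in any language.
A sketch in plain Python, assuming images are 2-D lists of pixel values;
leaving the border pixels unchanged is a simplification here, not the only
option (replication or mirroring at the edges is also common):

```python
# Sketch of a naive 3x3 median filter. Border pixels are copied through
# unchanged for simplicity.

def median_filter_3x3(image):
    """Return a median-filtered copy of a 2-D list of pixel values."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]  # start from a copy
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [image[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            window.sort()
            out[y][x] = window[4]  # middle of the 9 sorted values
    return out
```

This is the classic nonlinear tool for impulse ("salt and pepper") noise:
an isolated outlier pixel is replaced by the neighborhood median and simply
disappears, while straight edges survive better than under linear smoothing.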
"Rune Allnor" <allnor@tele.ntnu.no> wrote in message
news:f56893ae.0310190614.66c0ab94@posting.google.com...
> [quoted text snipped]
Rune,

Thanks for your posting... You mean all experts do filter design by ad hoc
search and trial and error?

What I need is pointers/guidelines on systematically designing a filter
based on the input. Although the source is unknown, the reconstructed input
still has some statistical properties. If a mathematical analytical model
can be built from parameters extracted from the input, then a filter can be
adapted to those parameters for that particular input.

Since it has already turned out that the "best" filter for one source is
"near Gaussian" and small (3x3), I believe the "best" filter for another
source would also be "near Gaussian" but with adjusted numbers... So are
there any pointers/guidelines on mathematical modeling for the derivation
of such a filter?

Thanks,
Walala

walala wrote:

> You mean all experts do filter design by ad hoc search and trial and
> error? [...] Since it turns out already that the "best" filter for a
> source is "near Gaussian" and it is small, 3x3, I believe the "best"
> filter for another source would also be "near Gaussian" but with adjusted
> numbers.... So are there any pointers/guidelines on mathematical modeling
> for the derivation of such a filter?
There's something that doesn't make sense here. You say that you have a
filter designed by trial and error that is near Gaussian and 3x3. If you
are talking about filters with integer values, there really aren't that
many that fit that description - a person could write them all down in a
matter of minutes. If you're implementing this experiment with
floating-point values, you are probably splitting hairs that aren't visible
to the human eye.

But that said, given the limited dynamic range of images and the
non-linearity of the quantization process, you might also want to consider
non-linear filters for noise removal. A median filter would be a good place
to start your investigation.

-jim

-----= Posted via Newsfeeds.Com, Uncensored Usenet News =-----
http://www.newsfeeds.com - The #1 Newsgroup Service in the World!
-----== Over 100,000 Newsgroups - 19 Different Servers! =-----
walala wrote:
> Are there any guidelines on how to design good filters that applicable to
> most (if not all) images? Discussion should be in two cases: a) with
> source image and can get PSNR to compare results; b) with no source image
> and only have the reconstructed images, so no PSNR can be computed, how
> to design a "good" filter fit into this case?
You need to first work out your metric... you say you use PSNR, but without
a source image... so how can you possibly measure PSNR?

You need to classify what part of the image is most important to you, and
may have to throw the rest away - this may mean multiple filters to pick
out multiple details. You may want to classify your images in terms of some
statistical measure and then manipulate them to fit that measure... You
need to be able to classify your distortion in order to work out how to
remove it, or at least minimise it.

For example, if you are only interested in the vertical edges of the image,
just pick that out with [-1, 0, 1]... the image won't be like the original,
but that doesn't matter - your required detail is as plain as day.

Ben
--
I'm not just a number. To many, I'm known as a String...
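Ben's [-1, 0, 1] example is a horizontal gradient that responds to vertical
edges. A pure-Python sketch (the function name is illustrative; the border
columns are simply zeroed here):

```python
# Sketch: correlate each row with [-1, 0, 1], i.e. right neighbor minus
# left neighbor. Vertical edges produce large responses; flat regions and
# horizontal edges produce zero.

def vertical_edges(image):
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(1, w - 1):
            out[y][x] = image[y][x + 1] - image[y][x - 1]
    return out
```

On a row containing a step such as [0, 0, 100, 100], the output is large
exactly at the step and zero elsewhere, which is Ben's point: the result is
nothing like the original image, but the detail of interest stands out.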
"jim" <"N0sp"@m.sjedging@mwt.net> wrote in message
news:3f92c911_5@corp.newsgroups.com...
> [jim's reply quoted in full -- snipped]
Dear Jim,

Thanks for your comments. Yeah, I am doing a "floating-point" search by
trial and error... so the search space is huge. I just let it search for a
few days...

As I said, I first tried Matlab's filters and found the strange thing that
the Gaussian filter did the job particularly well while the other filters
failed badly. So I decided to search following the Gaussian's pattern:

    [a b a;
     b c b;
     a b a]

This is my pattern, so I essentially have only three unknowns... and I
search from -1 to 1 with finite precision... I finally got:

    -0.0496    0.1408   -0.0496
     0.1408    0.6355    0.1408
    -0.0496    0.1408   -0.0496

which resembles a Gaussian but with changed numbers... If I change the
input source, it is again this kind of structure that performs best, but
with changed numbers. For example, -0.0496 changed to -0.025, 0.1408
changed to 0.1525, 0.6355 changed to 0.4935...

Is this a low-pass, high-pass, or other filter?

I believe my filter should be adaptive; hopefully the numbers in this
filter can change according to some underlying properties of the input
source. How do I do that? Are all filter designs ad hoc?

I have been strongly encouraged by the fact that only this kind of
structured filter works; other types such as "averaging" and "laplacian" do
not work... interesting... I think!

Can experts give me more opinions?
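The brute-force search described above can be sketched as a coarse grid
sweep over the three free parameters (a, b, c) of the symmetric kernel.
This is only an illustration of the procedure, with hypothetical helper
names; a real run would use a much finer step than shown, which is exactly
why the poster's search took a week:

```python
# Sketch: exhaustive grid search over symmetric 3x3 kernels
# [a b a; b c b; a b a], keeping the PSNR-best one.
import math

def apply_3x3(image, k):
    """Convolve a 2-D list with a 3x3 kernel; borders copied unchanged."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(k[dy + 1][dx + 1] * image[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def psnr(ref, proc, peak=255.0):
    err = [(r - p) ** 2 for rr, pr in zip(ref, proc) for r, p in zip(rr, pr)]
    mse = sum(err) / len(err)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

def grid_search(source, degraded, step=0.1):
    """Sweep a, b, c over [-1, 1] in the given step; return best kernel."""
    best, best_k = -1.0, None
    steps = int(round(2 / step)) + 1
    for i in range(steps):
        a = -1 + i * step
        for j in range(steps):
            b = -1 + j * step
            for m in range(steps):
                c = -1 + m * step
                k = [[a, b, a], [b, c, b], [a, b, a]]
                score = psnr(source, apply_3x3(degraded, k))
                if score > best:
                    best, best_k = score, k
    return best_k, best
```

The cost is cubic in the number of grid points per parameter, which makes
the week-long runtime plausible and also shows why this does not scale to
per-image adaptation without a source image to score against.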

walala wrote:

> Thanks for your comments. Yeah, I am doing "floating-point" search and
> trial & error... so the search space is huge. I just try to search for a
> few days...
>
> Because as I told you, I first tried Matlab's filters, and found the
> strange thing that the Gaussian filter did the job particularly well
> while other filters failed badly.
>
> So I decided to search following Gaussian's pattern:
>
>     [a b a;
>      b c b;
>      a b a]
>
> Here is my pattern, so I essentially only have three unknowns... and I
> search from -1 to 1 with finite precision...
Ok, so the filter must be symmetrical - this is not exactly the same as
Gaussian. What you haven't yet explained is how you are determining that
one filter works better than another.
> I finally got:
>
>     -0.0496    0.1408   -0.0496
>      0.1408    0.6355    0.1408
>     -0.0496    0.1408   -0.0496
That looks good.
> which resembles Gaussian but with changed numbers... if I change the
> input source, it is again this kind of structure that performs best but
> with changed numbers. For example, -0.0496 changed to -0.025, 0.1408
> changed to 0.1525, 0.6355 changed to 0.4935...
When you say "performs best", how are you arriving at that conclusion?
> Is this a low-pass, high-pass, or other filter?
It's low pass. It's not really Gaussian, with the negative values -
somewhere halfway between an unsharp mask and a Gaussian.
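jim's classification can be checked numerically: evaluate the kernel's gain
at DC (a flat image) and at the highest representable spatial frequency (a
+1/-1 checkerboard). A low-pass filter passes DC at roughly unit gain and
attenuates the checkerboard. A short sketch using the kernel quoted above:

```python
# Quick sanity check: kernel gain at DC vs. at the Nyquist checkerboard.

def gain(kernel, pattern):
    """Sum of kernel * pattern over the 3x3 support."""
    return sum(kernel[y][x] * pattern[y][x]
               for y in range(3) for x in range(3))

k = [[-0.0496, 0.1408, -0.0496],
     [ 0.1408, 0.6355,  0.1408],
     [-0.0496, 0.1408, -0.0496]]

flat  = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]       # DC
check = [[1, -1, 1], [-1, 1, -1], [1, -1, 1]]   # Nyquist in both axes

dc = gain(k, flat)    # ~1.0003: overall brightness is preserved
ny = gain(k, check)   # ~-0.126: the highest frequencies are attenuated
```

The coefficients sum to almost exactly 1 (brightness-preserving) while the
checkerboard response is small, consistent with "low pass"; the negative
corners are what pull it away from a true Gaussian toward an unsharp-mask
flavor.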
> I believe my filter should be adaptive, hopefully the numbers in this
> filter can change according to some underlying properties in the input
> source. How to do that?
Well, it sounds like you are already doing it by brute force. Maybe if you
explain what the input source is and how you determine that the filter
"works", then someone might suggest ways to optimize your search.
> All filter designs are ad hoc?
>
> I have been strongly encouraged by the fact that only this kind of
> structured filter can work, other types such as "averaging", "laplacian"
> do not work.... interesting... I think!
No, not very interesting or true. What exactly are you trying to
accomplish? What works depends on what you are trying to do.

-jim
> Ok, so the filter must be symmetrical - this is not exactly the same as
> Gaussian. What you haven't yet explained is how you are determining that
> one filter works better than another.
The filter is supposed to work on input images and enhance them. I have
several images, but not all. So I tried a few images on hand, and the
"Gaussian-like" filter (like the following one) has great PSNR compared
with "averaging" or other filters (e.g. 39 dB vs. 10 dB)... since I have
the source images, I can compute PSNR and compare, right?
> > I finally got:
> >
> >     -0.0496    0.1408   -0.0496
> >      0.1408    0.6355    0.1408
> >     -0.0496    0.1408   -0.0496
>
> It's low pass. It's not really Gaussian with the negative values.
> Somewhere halfway between an unsharp mask and Gaussian.
Ok, I see.
> > I believe my filter should be adaptive, hopefully the numbers in this
> > filter can change according to some underlying properties in the input
> > source. How to do that?
>
> Well, it sounds like you are already doing it by brute force. Maybe if
> you explain what the input source is and how you determine that the
> filter "works", then someone might suggest ways to optimize your search.
The input images are some JPEG images having block artifacts. As you know,
JPEG images have block artifacts at low bit rates. Someone else in my group
did some experiments to pre-filter the images to try to get rid of the
artifacts, but that also introduced other artifacts. So now it is a
combined artifact... and to be frank, all the processing techniques looked
the same to me on the images, because they are nearly the same - but in
terms of PSNR they differ a lot... So I did trial and error and found one
with both good PSNR and good looks (but still ugly, though)...

But I cannot do a trial-and-error search for every input, right? Because
after this "training" stage, I don't have source images available any more,
right? My filter should adapt by itself...
"walala" <mizhael@yahoo.com> wrote in message news:<bmuae2$pbe$1@mozo.cc.purdue.edu>...
> You mean all experts do filter design by ad hoc search and trial and error?
Eh... I can only answer for myself and the people I have worked with... but
yes, quite a lot is trial and error, sometimes completely ad hoc, other
times based on experience or educated guesses. When presented with a
problem, there is usually more than one way of doing things. In my opinion,
one part of the job is to know as many such tools and tricks and methods as
possible; the other part is to evaluate which methods work how well, and
when.

If you look in the only book I know of that teaches data processing,

  Yilmaz: Seismic Data Processing, SEG, 1987

you will find a lot of more or less standard flow charts for doing various
standard processing tasks. But in a seismic processing lab the analysts
spend days and weeks trying to fine-tune the processing parameters. Some of
these guys are truly artists.

Actually, someone I worked with did a test where several different
processing teams got the same data set and were assigned the same
processing tasks. When the results came back, you almost had to know in
advance that these images were supposed to show the same geological
features. They almost looked like data from different oil fields.

Data processing is a very subjective discipline.
> What I need is pointers/guidelines on systematically designing a filter
> based on the input. Although the source is unknown, the reconstructed
> input still has some statistical property. If some mathematical
> analytical model can be made based on the parameters extracted from the
> input, then a filter can be adapted to those parameters for that
> particular input.
Hmmm... this depends on the application. Be aware that by choosing this
analytical model, you put constraints on the resulting images. If you
somehow get an image that does not fit that model, you may be in trouble.

If, for instance, you work with medical images and search for soft tissue
injuries, a processing algorithm could use known information about e.g. the
skeleton, and use deviations from the "expected" image to assess tissue
damage. But what if there is severe damage to the bones as well? If the
processing algorithm assumes an undamaged skeleton, the automatic
processing routine will get lost.

But then, if your images are sufficiently regular, like what could be
expected in certain quality control tasks on a production line, such
variations need not be a problem at all.
> Since it turns out already that the "best" filter for a source is "near
> Gaussian" and it is small, 3x3, I believe the "best" filter for another
> source would also be "near Gaussian" but with adjusted numbers.... So is
> there any pointers/guidelines on mathematical modeling for derivation of
> such kind of filter?
Wiener filters come to mind... again, there are plenty of filters and
tricks available for image processing. Whether you can do stuff completely
automatically or need a human operator in the loop depends on the
application.

Rune
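One concrete instance of the Wiener filtering Rune mentions is the locally
adaptive estimator (the idea behind Matlab's wiener2): estimate a local
mean and variance in each window, then pull each pixel toward the local
mean in proportion to how much of the local variance looks like noise. A
pure-Python sketch, assuming the noise variance is known and using a 3x3
window with borders copied unchanged:

```python
# Sketch of a locally adaptive (wiener2-style) filter:
#   out = mean + max(var - noise_var, 0) / var * (pixel - mean)
# Flat regions (var ~ noise_var) get smoothed hard; busy regions
# (var >> noise_var) are left nearly untouched, preserving edges.

def wiener_3x3(image, noise_var):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]  # borders left unchanged for simplicity
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = [image[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            mean = sum(win) / 9.0
            var = sum((v - mean) ** 2 for v in win) / 9.0
            signal_var = max(var - noise_var, 0.0)
            gain = signal_var / var if var > 0 else 0.0
            out[y][x] = mean + gain * (image[y][x] - mean)
    return out
```

This is the kind of "filter whose numbers adapt to the local statistics of
the input" the original poster is asking for, and it needs no source image
at run time - only an estimate of the noise variance.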
"Ben Pope" <spam@hotmail.com> wrote in message
news:bmuqba$qdnh9$1@ID-191149.news.uni-berlin.de...
> walala wrote:
> > Are there any guidelines on how to design good filters... [snipped]
>
> You need to first work out your metric... you say you use PSNR, but
> without a source image... so how can you possibly measure PSNR?
Ben,

Thanks for your posting. That's the problem. The filter is supposed to work
on all images. In this "design" stage, I have a few source images to test
PSNR... later on, the filter is supposed to adapt to the input and do the
filtering blindly... at that time, no PSNR can be measured.

So what I want to know is exactly how to design a filter that can
adaptively work on images to achieve a satisfactory PSNR (maybe not the
best, but at least working for most cases) after the initial training
stage.