DSPRelated.com
Forums

Kalman Assumption

Started by Cagdas Ozgenc April 26, 2010
On Apr 27, 4:29 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote:
>
> > Indeed, the first step in applying someone's "optimal" formulation is
> > deciding if their "optimal" comes within the bounds of your "good enough".
>
> Ehh... I would rate that as the *second* step. The first item on
> my list would be to find out in what sense an 'optimal' filter is
> optimal:
>
> - Error magnitude?
> - Operational robustness?
> - Computational efficiency?
> - Ease of implementation?
> - Economy?
> - Balancing all the above?
>
> Rune
Of course you may discuss many aspects regarding optimality. But within the context of estimation theory it usually means an unbiased estimate with the lowest variance among all possible estimators.
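As an aside, the fact that the derivation never invokes a particular noise distribution can be seen directly in the recursion. Below is a minimal sketch, assuming a scalar random-walk model x[k] = x[k-1] + w[k] with measurement y[k] = x[k] + v[k]; the function name and parameters are illustrative only, not from any library. Note that the gain is computed purely from the variances q and r; the shape of the noise distributions never enters.

# Minimal sketch (not from the thread): a scalar Kalman filter for the
# random-walk model  x[k] = x[k-1] + w[k],  y[k] = x[k] + v[k].
# The recursion uses only the noise variances q and r -- nothing about
# the *shape* of the noise distributions appears anywhere.
import numpy as np

def scalar_kalman(y, q, r, x0=0.0, p0=1.0):
    """Filter the measurement sequence y; q, r are the process and
    measurement noise variances (assumed known)."""
    x, p = x0, p0
    estimates = []
    for yk in y:
        # Predict
        p = p + q              # state variance grows by the process noise
        # Update
        k = p / (p + r)        # Kalman gain: a ratio of variances only
        x = x + k * (yk - x)   # correct the prediction with the innovation
        p = (1.0 - k) * p      # posterior variance
        estimates.append(x)
    return np.array(estimates)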
On 27 apr, 16:11, Cagdas Ozgenc <cagdas.ozg...@gmail.com> wrote:
> On Apr 27, 4:29 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote:
> > > Indeed, the first step in applying someone's "optimal" formulation is
> > > deciding if their "optimal" comes within the bounds of your "good enough".
> >
> > Ehh... I would rate that as the *second* step. The first item on
> > my list would be to find out in what sense an 'optimal' filter is
> > optimal:
...
> Of course you may discuss many aspects regarding optimality. But
> within the context of estimation theory it usually means an unbiased
> estimate with the lowest variance among all possible estimators.
Sure. But that's not the important question. The important questions are the second and third items in my list, robustness and computational efficiency, in that order. There is no need to use a very efficient estimator if it is so sensitive to model errors that any hint of model mismatch throws it off.

Rune
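To make the robustness point concrete, here is a small simulation sketch. The assumptions are made up for illustration: a first-order AR state with true coefficient 0.95, and a filter deliberately designed with the wrong coefficient 0.7. The mismatched filter still runs, but its error is visibly worse than the matched one -- and this is a mild, stationary mismatch; structural errors can do far more damage.

# Illustrative sketch (model and numbers are assumptions, not from the
# thread): how a model mismatch degrades a Kalman filter.  The true state
# follows x[k] = a_true*x[k-1] + w[k]; the filter uses a_model instead.
import numpy as np

def run_filter(y, a, q, r):
    x, p = 0.0, 1.0
    out = []
    for yk in y:
        x, p = a * x, a * a * p + q     # predict with the (possibly wrong) model
        k = p / (p + r)
        x, p = x + k * (yk - x), (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
a_true, q, r, n = 0.95, 0.1, 1.0, 5000
w = rng.normal(0, np.sqrt(q), n)
v = rng.normal(0, np.sqrt(r), n)
x = np.zeros(n)
for k in range(1, n):
    x[k] = a_true * x[k - 1] + w[k]
y = x + v

for a_model in (0.95, 0.7):
    err = x - run_filter(y, a_model, q, r)
    print(f"a_model={a_model}: RMS error = {np.sqrt(np.mean(err**2)):.3f}")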
Cagdas Ozgenc wrote:
> On Apr 27, 6:54 am, Tim Wescott <t...@seemywebsite.now> wrote:
>> Peter K. wrote:
>>> On 26 Apr, 21:52, HardySpicer <gyansor...@gmail.com> wrote:
>>>> Don't think so. You can design an H infinity linear Kalman filter
>>>> which is only a slight modification and you don't even need to know
>>>> what the covariance matrices are at all.
>>>> H infinity will give you the minimum of the maximum error.
>>> As Tim says, the Kalman filter is the optimal linear filter for
>>> minimizing the average estimation error. Reformulations using H-
>>> infinity techniques do not give an optimal linear filter in this
>>> sense.
>>> As you say, though, H-nifty (sic) gives the optimal in terms of
>>> minimizing the worst case estimation error... which may or may not
>>> give "better" results than the Kalman approach.
>>> Depending on the application, neither "optimal" approach may give
>>> exactly what the user is after... their idea of "optimal" may be
>>> different from what the mathematical formulations give.
>> Indeed, the first step in applying someone's "optimal" formulation is
>> deciding if their "optimal" comes within the bounds of your "good enough".
>>
>> --
>> Tim Wescott
>> Control system and signal processing consulting
>> www.wescottdesign.com
>
> Bottom line: without the Gaussian distribution assumption the
> optimality condition doesn't hold. But it is still the best linear
> estimator, just not the best overall estimator. Right?
Right.

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
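A quick numerical illustration of that "best linear, not best overall" distinction (a toy example with made-up numbers, not from the thread): estimate a constant from n noisy samples. With Gaussian noise the sample mean -- the best linear unbiased estimator -- is also the best overall. With heavy-tailed Laplace noise of the same variance, the sample median, a nonlinear estimator, has visibly lower variance.

# Toy example (assumptions mine): best linear estimator vs. a nonlinear one
# under Gaussian and Laplace noise of equal variance.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples, truth = 20000, 25, 3.0

for name, sampler in [("gaussian", lambda s: rng.normal(truth, 1.0, s)),
                      ("laplace",  lambda s: rng.laplace(truth, 1.0 / np.sqrt(2), s))]:
    data = sampler((n_trials, n_samples))          # both noises have unit variance
    var_mean   = np.var(data.mean(axis=1))         # linear estimator
    var_median = np.var(np.median(data, axis=1))   # nonlinear estimator
    print(f"{name:8s}  var(mean)={var_mean:.4f}  var(median)={var_median:.4f}")

Under the Gaussian the mean wins; under the Laplace the median wins, even though the mean is still the best *linear* choice in both cases.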
Frnak McKenney wrote:
> On Mon, 26 Apr 2010 19:54:35 -0700, Tim Wescott <tim@seemywebsite.now> wrote:
>
>> Indeed, the first step in applying someone's "optimal" formulation is
>> deciding if their "optimal" comes within the bounds of your "good enough".
>
> Tim,
>
> A phrase that deserves repeating in a variety of contexts. Mind if
> I steal it?
>
> Frank "TaglinesRUs" McKenney
Attribute it to me, and otherwise it's all yours.

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
Rune Allnor wrote:
> On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote:
>
>> Indeed, the first step in applying someone's "optimal" formulation is
>> deciding if their "optimal" comes within the bounds of your "good enough".
>
> Ehh... I would rate that as the *second* step. The first item on
> my list would be to find out in what sense an 'optimal' filter is
> optimal:
>
> - Error magnitude?
> - Operational robustness?
> - Computational efficiency?
> - Ease of implementation?
> - Economy?
> - Balancing all the above?
That doesn't sound as nifty.

Besides, evaluating all of those is (IMHO) an essential part of deciding if someone's "optimal" is within my bounds of "good enough". Sometimes the best _practical_ solution is really crappy _technically_, because sometimes the variable in most urgent need of optimization is engineering time, or weight, or power consumption, or some other parameter that just doesn't find its way into the usual elegant formulations of "optimal".

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote:
> HardySpicer wrote:
> > On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote:
> >> Cagdas Ozgenc wrote:
> >>> Hello,
> >>> In Kalman filtering does the process noise have to be Gaussian, or
> >>> would any uncorrelated covariance-stationary noise satisfy the
> >>> requirements?
> >>> When I follow the derivations of the filter I haven't encountered any
> >>> requirements on a Gaussian distribution, but in many sources the
> >>> Gaussian tag seems to go together with it.
> >> The Kalman filter is only guaranteed to be optimal when:
> >>
> >> * The modeled system is linear.
> >> * Any time-varying behavior of the system is known.
> >> * The noise (process and measurement) is Gaussian.
> >> * The noise's time-dependent behavior is known
> >>   (note that this means the noise doesn't have to be stationary --
> >>   just that its time-dependent behavior is known).
> >> * The model exactly matches reality.
> >>
> >> None of these requirements can be met in reality, but the math is at its
> >> most tractable when you assume them. Often the Gaussian noise
> >> assumption comes the closest to being true -- but not always.
> >>
> >> If your system matches all of the above assumptions _except_ the
> >> Gaussian noise assumption, then the Kalman filter that you design will
> >> have the lowest error variance of any possible _linear_ filter, but
> >> there may be nonlinear filters with better (perhaps significantly
> >> better) performance.
> > Don't think so. You can design an H infinity linear Kalman filter
> > which is only a slight modification and you don't even need to know
> > what the covariance matrices are at all.
> > H infinity will give you the minimum of the maximum error.
>
> But strictly speaking the H-infinity filter isn't a Kalman filter. It's
> certainly not what Rudi Kalman cooked up. It is a state-space state
> estimator, and is one of the broader family of "Kalmanesque" filters,
> however.
>
> And the H-infinity filter won't minimize the error variance -- it
> minimizes the min-max error, by definition.
>
> --
> Tim Wescott
> Control system and signal processing consulting
> www.wescottdesign.com
Who says that minimum mean-square error is the best? That's just one convenient criterion. For example, the optimal control problem with a Kalman filter is pretty bad. It doesn't even have integral action. Simple PID gives better results on many occasions.

Hardy
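A toy example of how the two criteria in this subthread part ways (numbers made up for illustration, not from the thread): approximate a batch of values by a single constant. The mean minimizes the mean-square error; the midrange minimizes the worst-case error; each loses on the other's scorecard.

# Toy illustration (assumptions mine): MMSE and min-max pick different answers.
import numpy as np

values = np.array([0.0, 1.0, 2.0, 10.0])
mean     = values.mean()                        # the MMSE choice
midrange = 0.5 * (values.min() + values.max())  # the min-max choice

for name, est in [("mean", mean), ("midrange", midrange)]:
    err = values - est
    print(f"{name:8s}: mse={np.mean(err**2):6.2f}  max|err|={np.max(np.abs(err)):5.2f}")

The mean gives the lower mean-square error, the midrange the lower worst-case error; neither is "best" until you say best at what.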

HardySpicer wrote:


> Who says that minimum mean-square error is the best? That's just one
> convenient criterion.
> For example, the optimal control problem with a Kalman filter is
> pretty bad. It doesn't even have integral action.
> Simple PID gives better results on many occasions.
"Of all the idiots the most insufferable are those who are not completely deprived mind" Francois de La Rochefoucauld
On 27 apr, 17:25, Tim Wescott <t...@seemywebsite.now> wrote:
> Rune Allnor wrote:
> > On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote:
> >> Indeed, the first step
...
> > Ehh... I would rate that as the *second* step.
...
> That doesn't sound as nifty.
Suffice it to say that rhetoric has never been among my fortes.

Rune
HardySpicer wrote:
> On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote:
>> HardySpicer wrote:
...
>> But strictly speaking the H-infinity filter isn't a Kalman filter. It's
>> certainly not what Rudi Kalman cooked up. It is a state-space state
>> estimator, and is one of the broader family of "Kalmanesque" filters,
>> however.
>>
>> And the H-infinity filter won't minimize the error variance -- it
>> minimizes the min-max error, by definition.
>>
>> --
>> Tim Wescott
>> Control system and signal processing consulting
>> www.wescottdesign.com
>
> Who says that minimum mean-square error is the best? That's just one
> convenient criterion.
Not me! I made the point in another branch of this thread -- my "optimum" may well not be your "optimum". Indeed, my "optimum" may be a horrendous failure to fall inside the bounds of your "good enough". Minimum mean-square error certainly makes the math easy, though.
> For example, the optimal control problem with a Kalman filter is
> pretty bad. It doesn't even have integral action.
> Simple PID gives better results on many occasions.
OTOH, if you model the plant as having an uncontrolled integrator and you track that integrator with your Kalman filter, you suddenly have an 'I' term.

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
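A sketch of that trick, with an illustrative first-order plant chosen for this example (not from the thread): augment the state with an uncontrolled integrator that stands in for a constant input disturbance, let the Kalman filter estimate it, and subtract the estimate in the control law. The disturbance estimate accumulates steady-state error exactly the way an integral term does.

# Illustrative sketch (plant and numbers are assumptions): plant
#   x_dot = -x + u + d,  d an unknown constant disturbance, only x measured.
# Augmented state [x, d]^T with d modeled as an uncontrolled integrator
# (d_dot = 0, driven only by process noise).
import numpy as np

dt = 0.01
A = np.array([[-1.0, 1.0],    # x_dot = -x + d + u
              [ 0.0, 0.0]])   # d_dot =  0
B = np.array([[1.0],
              [0.0]])
C = np.array([[1.0, 0.0]])    # we still measure only x
F = np.eye(2) + dt * A        # crude forward-Euler discretization
G = dt * B

q = np.diag([1e-4, 1e-4])     # process noise covariance (lets d_hat adapt)
r = np.array([[1e-2]])        # measurement noise covariance

rng = np.random.default_rng(0)
d_true = 2.0
x = np.zeros((2, 1)); x[1, 0] = d_true     # true augmented state
x_hat = np.zeros((2, 1))
P = np.eye(2)

for k in range(3000):
    u = -2.0 * x_hat[0, 0] - x_hat[1, 0]   # state feedback + disturbance cancellation
    # True plant (d stays constant):
    x = F @ x + G * u
    y = C @ x + rng.normal(0, np.sqrt(r[0, 0]))
    # Kalman predict/update on the augmented model:
    x_hat = F @ x_hat + G * u
    P = F @ P @ F.T + q
    S = C @ P @ C.T + r
    K = P @ C.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (y - C @ x_hat)
    P = (np.eye(2) - K @ C) @ P

print("true disturbance:", d_true, " estimated:", float(x_hat[1, 0]))

The estimated disturbance settles near the true value, and feeding it back removes the steady-state offset that a plain proportional state-feedback law would leave behind.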
On Apr 28, 6:43 am, Tim Wescott <t...@seemywebsite.now> wrote:
> HardySpicer wrote:
> > On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote:
...
> > Who says that minimum mean-square error is the best? That's just one
> > convenient criterion.
>
> Not me! I made the point in another branch of this thread -- my
> "optimum" may well not be your "optimum". Indeed, my "optimum" may be a
> horrendous failure to fall inside the bounds of your "good enough".
>
> Minimum mean-square error certainly makes the math easy, though.
>
> > For example, the optimal control problem with a Kalman filter is
> > pretty bad. It doesn't even have integral action.
> > Simple PID gives better results on many occasions.
>
> OTOH, if you model the plant as having an uncontrolled integrator and
> you track that integrator with your Kalman filter, you suddenly have
> an 'I' term.
>
> --
> Tim Wescott
> Control system and signal processing consulting
> www.wescottdesign.com
That's right, and that's what people did, but it doesn't come out naturally, whereas it does in H-infinity control. Kalman filters are not robust to changes in the plant either.

Hardy