Reply by Tim Wescott April 28, 2010
HardySpicer wrote:
> On Apr 28, 8:04 am, Tim Wescott <t...@seemywebsite.now> wrote: >> HardySpicer wrote: >>> On Apr 28, 6:43 am, Tim Wescott <t...@seemywebsite.now> wrote: >>>> HardySpicer wrote: >>>>> On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote: >>>>>> HardySpicer wrote: >>>>>>> On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: >>>>>>>> Cagdas Ozgenc wrote: >>>>>>>>> Hello, >>>>>>>>> In Kalman filtering does the process noise have to be Gaussian or >>>>>>>>> would any uncorrelated covariance stationary noise satisfy the >>>>>>>>> requirements? >>>>>>>>> When I follow the derivations of the filter I haven't encountered any >>>>>>>>> requirements on Gaussian distribution, but in many sources Gaussian >>>>>>>>> tag seems to go together. >>>>>>>> The Kalman filter is only guaranteed to be optimal when: >>>>>>>> * The modeled system is linear. >>>>>>>> * Any time-varying behavior of the system is known. >>>>>>>> * The noise (process and measurement) is Gaussian. >>>>>>>> * The noise's time-dependent behavior is known >>>>>>>> (note that this means the noise doesn't have to be stationary -- >>>>>>>> just that it's time-dependent behavior is known). >>>>>>>> * The model exactly matches reality. >>>>>>>> None of these requirements can be met in reality, but the math is at its >>>>>>>> most tractable when you assume them. Often the Gaussian noise >>>>>>>> assumption comes the closest to being true -- but not always. >>>>>>>> If your system matches all of the above assumptions _except_ the >>>>>>>> Gaussian noise assumption, then the Kalman filter that you design will >>>>>>>> have the lowest error variance of any possible _linear_ filter, but >>>>>>>> there may be nonlinear filters with better (perhaps significantly >>>>>>>> better) performance. >>>>>>> Don't think so. You can design an H infinity linear Kalman filter >>>>>>> which is only a slight modification and you don't even need to know >>>>>>> what the covariance matrices are at all. >>>>>>> H infinity will give you the minimum of the maximum error. >>>>>> But strictly speaking the H-infinity filter isn't a Kalman filter. It's >>>>>> certainly not what Rudi Kalman cooked up. It is a state-space state >>>>>> estimator, and is one of the broader family of "Kalmanesque" filters, >>>>>> however. >>>>>> And the H-infinity filter won't minimize the error variance -- it >>>>>> minimizes the min-max error, by definition. >>>>>> -- >>>>>> Tim Wescott >>>>>> Control system and signal processing consultingwww.wescottdesign.com >>>>> Who says that minimum mean-square error is the best? That's just one >>>>> convenient criterion. >>>> Not me! I made the point in another branch of this thread -- my >>>> "optimum" may well not be your "optimum". Indeed, my "optimum" may be a >>>> horrendous failure to fall inside the bounds of your "good enough". >>>> Minimum mean-square error certainly makes the math easy, though. >>>>> For example, the optimal control problem with a Kalman filter is >>>>> pretty bad. It doesn't even have integral action. >>>>> Simple PID gives better results for many occasions. >>>> OTOH, if you model the plant as having an uncontrolled integrator and >>>> you track that integrator with your Kalman, you suddenly have an 'I' term. >>>> -- >>>> Tim Wescott >>>> Control system and signal processing consultingwww.wescottdesign.com >>> That's right and what people did, but it doesn't come out naturally, >>> whereas it does in H infinity control. >>> Kalman filters are not robust to changes in the plant either. 
>> No, and H-infinity filters are. The biggest drawback from the >> perspective of my current project is that H-infinity filters require a >> lot of computation at design time, and I'm working on an extended Kalman >> filter (it's actually morphed into a hybrid extended-unscented filter), >> for which the filter must compute the gains -- essentially doing a >> design cycle -- at each iteration. The gain computation is easy with a >> Kalman-Kalman, but extracting all the eigenvalues for an >> H-infinity-Kalman is _expensive_. >> >> -- >> Tim Wescott >> Control system and signal processing consultingwww.wescottdesign.com > > Always suspicious about extended Kalman filters since they are not > guaranteed to converge. > I would do a separate estimation of the plant with say a Volterra type > LMS estimator and use that in some way to feed an estimator of the > states.
This particular Kalman was pretty strongly dependent on 3-D angles; it worked OK as an extended Kalman, but really started to shine when it got turned into an unscented Kalman. I didn't consider doing the Volterra series, because what's a few more terms in a really severely nonlinear transform like 3-D angles? But the unscented version is working like dynamite -- and not in the sense that it's blowing up in my face. -- Tim Wescott Control system and signal processing consulting www.wescottdesign.com
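The reason the unscented filter copes better with a severely nonlinear map such as 3-D angles is that it propagates a small set of sigma points through the exact transform instead of linearizing it. A minimal Python sketch of the unscented transform follows; the scaled-sigma-point defaults and the toy angle-to-direction function are illustrative assumptions, not anything from the project discussed above.

import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using the standard 2n+1 scaled sigma points."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)      # matrix square root of the scaled covariance
    sigmas = np.vstack([mean, mean + L.T, mean - L.T])
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))     # weights for the mean
    Wc = Wm.copy()                               # weights for the covariance
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigmas])         # exact nonlinear map -- no Jacobian needed
    y_mean = Wm @ Y
    y_cov = (Wc[:, None] * (Y - y_mean)).T @ (Y - y_mean)
    return y_mean, y_cov

# Toy stand-in for a "3-D angle" nonlinearity: yaw/pitch to a unit direction vector.
angles_to_dir = lambda a: np.array([np.cos(a[1]) * np.cos(a[0]),
                                    np.cos(a[1]) * np.sin(a[0]),
                                    np.sin(a[1])])

m, C = np.array([0.4, 1.2]), np.diag([0.05, 0.05])
dir_mean, dir_cov = unscented_transform(m, C, angles_to_dir)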
Reply by HardySpicer April 28, 2010
On Apr 28, 8:04 am, Tim Wescott <t...@seemywebsite.now> wrote:
> HardySpicer wrote: > > On Apr 28, 6:43 am, Tim Wescott <t...@seemywebsite.now> wrote: > >> HardySpicer wrote: > >>> On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote: > >>>> HardySpicer wrote: > >>>>> On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: > >>>>>> Cagdas Ozgenc wrote: > >>>>>>> Hello, > >>>>>>> In Kalman filtering does the process noise have to be Gaussian or > >>>>>>> would any uncorrelated covariance stationary noise satisfy the > >>>>>>> requirements? > >>>>>>> When I follow the derivations of the filter I haven't encountered any > >>>>>>> requirements on Gaussian distribution, but in many sources Gaussian > >>>>>>> tag seems to go together. > >>>>>> The Kalman filter is only guaranteed to be optimal when: > >>>>>> * The modeled system is linear. > >>>>>> * Any time-varying behavior of the system is known. > >>>>>> * The noise (process and measurement) is Gaussian. > >>>>>> * The noise's time-dependent behavior is known > >>>>>>   (note that this means the noise doesn't have to be stationary -- > >>>>>>   just that it's time-dependent behavior is known). > >>>>>> * The model exactly matches reality. > >>>>>> None of these requirements can be met in reality, but the math is at its > >>>>>> most tractable when you assume them. Often the Gaussian noise > >>>>>> assumption comes the closest to being true -- but not always. > >>>>>> If your system matches all of the above assumptions _except_ the > >>>>>> Gaussian noise assumption, then the Kalman filter that you design will > >>>>>> have the lowest error variance of any possible _linear_ filter, but > >>>>>> there may be nonlinear filters with better (perhaps significantly > >>>>>> better) performance. > >>>>> Don't think so. You can design an H infinity linear Kalman filter > >>>>> which is only a slight modification and you don't even need to know > >>>>> what the covariance matrices are at all. > >>>>> H infinity will give you the minimum of the maximum error. > >>>> But strictly speaking the H-infinity filter isn't a Kalman filter. It's > >>>> certainly not what Rudi Kalman cooked up. It is a state-space state > >>>> estimator, and is one of the broader family of "Kalmanesque" filters, > >>>> however. > >>>> And the H-infinity filter won't minimize the error variance -- it > >>>> minimizes the min-max error, by definition. > >>>> -- > >>>> Tim Wescott > >>>> Control system and signal processing consultingwww.wescottdesign.com > >>> Who says that minimum mean-square error is the best? That's just one > >>> convenient criterion. > >> Not me! I made the point in another branch of this thread -- my > >> "optimum" may well not be your "optimum". Indeed, my "optimum" may be a > >> horrendous failure to fall inside the bounds of your "good enough". > > >> Minimum mean-square error certainly makes the math easy, though. > > >>> For example, the optimal control problem with a Kalman filter is > >>> pretty bad. It doesn't even have integral action. > >>> Simple PID gives better results for many occasions. > >> OTOH, if you model the plant as having an uncontrolled integrator and > >> you track that integrator with your Kalman, you suddenly have an 'I' term. > > >> -- > >> Tim Wescott > >> Control system and signal processing consultingwww.wescottdesign.com > > > That's right and what people did, but it doesn't come out naturally, > > whereas it does in H infinity control. 
> > Kalman filters are not robust to changes in the plant either. > > No, and H-infinity filters are. The biggest drawback from the > perspective of my current project is that H-infinity filters require a > lot of computation at design time, and I'm working on an extended Kalman > filter (it's actually morphed into a hybrid extended-unscented filter), > for which the filter must compute the gains -- essentially doing a > design cycle -- at each iteration. The gain computation is easy with a > Kalman-Kalman, but extracting all the eigenvalues for an > H-infinity-Kalman is _expensive_. > > -- > Tim Wescott > Control system and signal processing consultingwww.wescottdesign.com
Always suspicious about extended Kalman filters since they are not guaranteed to converge. I would do a separate estimation of the plant with say a Volterra type LMS estimator and use that in some way to feed an estimator of the states. Hardy
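A Volterra-series LMS plant estimator of the sort Hardy suggests can be sketched in a few lines; the second-order kernel structure, memory length, and step size below are illustrative assumptions rather than anything specified in the thread.

import numpy as np

def volterra2_lms(x, d, memory=3, mu=0.01):
    """Identify a plant from its input x and measured output d using a
    second-order (linear + quadratic) Volterra model adapted by plain LMS."""
    N = memory

    def regressor(buf):
        # Linear taps plus every cross-product buf[i] * buf[j] with i <= j.
        quad = [buf[i] * buf[j] for i in range(N) for j in range(i, N)]
        return np.concatenate([buf, quad])

    w = np.zeros(N + N * (N + 1) // 2)   # one weight per regressor entry
    y = np.zeros(len(d))
    buf = np.zeros(N)
    for k in range(len(x)):
        buf = np.roll(buf, 1)
        buf[0] = x[k]
        phi = regressor(buf)
        y[k] = w @ phi                   # model output
        e = d[k] - y[k]                  # estimation error
        w += mu * e * phi                # LMS weight update
    return w, y

The adapted weights (or a model refit from them) could then parameterize whatever state estimator sits downstream, which is roughly the split Hardy describes.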
Reply by Tim Wescott April 27, 2010
HardySpicer wrote:
> On Apr 28, 6:43 am, Tim Wescott <t...@seemywebsite.now> wrote: >> HardySpicer wrote: >>> On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote: >>>> HardySpicer wrote: >>>>> On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: >>>>>> Cagdas Ozgenc wrote: >>>>>>> Hello, >>>>>>> In Kalman filtering does the process noise have to be Gaussian or >>>>>>> would any uncorrelated covariance stationary noise satisfy the >>>>>>> requirements? >>>>>>> When I follow the derivations of the filter I haven't encountered any >>>>>>> requirements on Gaussian distribution, but in many sources Gaussian >>>>>>> tag seems to go together. >>>>>> The Kalman filter is only guaranteed to be optimal when: >>>>>> * The modeled system is linear. >>>>>> * Any time-varying behavior of the system is known. >>>>>> * The noise (process and measurement) is Gaussian. >>>>>> * The noise's time-dependent behavior is known >>>>>> (note that this means the noise doesn't have to be stationary -- >>>>>> just that it's time-dependent behavior is known). >>>>>> * The model exactly matches reality. >>>>>> None of these requirements can be met in reality, but the math is at its >>>>>> most tractable when you assume them. Often the Gaussian noise >>>>>> assumption comes the closest to being true -- but not always. >>>>>> If your system matches all of the above assumptions _except_ the >>>>>> Gaussian noise assumption, then the Kalman filter that you design will >>>>>> have the lowest error variance of any possible _linear_ filter, but >>>>>> there may be nonlinear filters with better (perhaps significantly >>>>>> better) performance. >>>>> Don't think so. You can design an H infinity linear Kalman filter >>>>> which is only a slight modification and you don't even need to know >>>>> what the covariance matrices are at all. >>>>> H infinity will give you the minimum of the maximum error. >>>> But strictly speaking the H-infinity filter isn't a Kalman filter. It's >>>> certainly not what Rudi Kalman cooked up. It is a state-space state >>>> estimator, and is one of the broader family of "Kalmanesque" filters, >>>> however. >>>> And the H-infinity filter won't minimize the error variance -- it >>>> minimizes the min-max error, by definition. >>>> -- >>>> Tim Wescott >>>> Control system and signal processing consultingwww.wescottdesign.com >>> Who says that minimum mean-square error is the best? That's just one >>> convenient criterion. >> Not me! I made the point in another branch of this thread -- my >> "optimum" may well not be your "optimum". Indeed, my "optimum" may be a >> horrendous failure to fall inside the bounds of your "good enough". >> >> Minimum mean-square error certainly makes the math easy, though. >> >>> For example, the optimal control problem with a Kalman filter is >>> pretty bad. It doesn't even have integral action. >>> Simple PID gives better results for many occasions. >> OTOH, if you model the plant as having an uncontrolled integrator and >> you track that integrator with your Kalman, you suddenly have an 'I' term. >> >> -- >> Tim Wescott >> Control system and signal processing consultingwww.wescottdesign.com > > That's right and what people did, but it doesn't come out naturally, > whereas it does in H infinity control. > Kalman filters are not robust to changes in the plant either.
No, and H-infinity filters are. The biggest drawback from the perspective of my current project is that H-infinity filters require a lot of computation at design time, and I'm working on an extended Kalman filter (it's actually morphed into a hybrid extended-unscented filter), for which the filter must compute the gains -- essentially doing a design cycle -- at each iteration. The gain computation is easy with a Kalman-Kalman, but extracting all the eigenvalues for an H-infinity-Kalman is _expensive_. -- Tim Wescott Control system and signal processing consulting www.wescottdesign.com
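The per-iteration gain computation Tim refers to is the standard predict/update recursion, in which the gain is rebuilt from the covariance at every step. A minimal linear sketch in Python, using the usual textbook matrix names (none of this reflects the actual project's model):

import numpy as np

def kalman_step(x, P, u, z, A, B, C, Q, R):
    """One predict/update cycle; the gain K is recomputed from P at every step."""
    # Predict: propagate the state estimate and its covariance through the model.
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: innovation covariance and Kalman gain, then correct with measurement z.
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x_new)) - K @ C) @ P_pred
    return x_new, P_new

For a plain (or extended) Kalman filter this costs a few small matrix products and one inverse of the measurement-sized S per step, which is why it is tolerable inside the loop; redoing an H-infinity design at each step would be much heavier, as Tim notes.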
Reply by HardySpicer April 27, 2010
On Apr 28, 6:43 am, Tim Wescott <t...@seemywebsite.now> wrote:
> HardySpicer wrote: > > On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote: > >> HardySpicer wrote: > >>> On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: > >>>> Cagdas Ozgenc wrote: > >>>>> Hello, > >>>>> In Kalman filtering does the process noise have to be Gaussian or > >>>>> would any uncorrelated covariance stationary noise satisfy the > >>>>> requirements? > >>>>> When I follow the derivations of the filter I haven't encountered any > >>>>> requirements on Gaussian distribution, but in many sources Gaussian > >>>>> tag seems to go together. > >>>> The Kalman filter is only guaranteed to be optimal when: > >>>> * The modeled system is linear. > >>>> * Any time-varying behavior of the system is known. > >>>> * The noise (process and measurement) is Gaussian. > >>>> * The noise's time-dependent behavior is known > >>>>   (note that this means the noise doesn't have to be stationary -- > >>>>   just that it's time-dependent behavior is known). > >>>> * The model exactly matches reality. > >>>> None of these requirements can be met in reality, but the math is at its > >>>> most tractable when you assume them. Often the Gaussian noise > >>>> assumption comes the closest to being true -- but not always. > >>>> If your system matches all of the above assumptions _except_ the > >>>> Gaussian noise assumption, then the Kalman filter that you design will > >>>> have the lowest error variance of any possible _linear_ filter, but > >>>> there may be nonlinear filters with better (perhaps significantly > >>>> better) performance. > >>> Don't think so. You can design an H infinity linear Kalman filter > >>> which is only a slight modification and you don't even need to know > >>> what the covariance matrices are at all. > >>> H infinity will give you the minimum of the maximum error. > >> But strictly speaking the H-infinity filter isn't a Kalman filter. It's > >> certainly not what Rudi Kalman cooked up. It is a state-space state > >> estimator, and is one of the broader family of "Kalmanesque" filters, > >> however. > > >> And the H-infinity filter won't minimize the error variance -- it > >> minimizes the min-max error, by definition. > > >> -- > >> Tim Wescott > >> Control system and signal processing consultingwww.wescottdesign.com > > > Who says that minimum mean-square error is the best? That's just one > > convenient criterion. > > Not me! I made the point in another branch of this thread -- my > "optimum" may well not be your "optimum". Indeed, my "optimum" may be a > horrendous failure to fall inside the bounds of your "good enough". > > Minimum mean-square error certainly makes the math easy, though. > > > For example, the optimal control problem with a Kalman filter is > > pretty bad. It doesn't even have integral action. > > Simple PID gives better results for many occasions. > > OTOH, if you model the plant as having an uncontrolled integrator and > you track that integrator with your Kalman, you suddenly have an 'I' term. > > -- > Tim Wescott > Control system and signal processing consultingwww.wescottdesign.com
That's right and what people did, but it doesn't come out naturally, whereas it does in H infinity control. Kalman filters are not robust to changes in the plant either. Hardy
Reply by Tim Wescott April 27, 2010
HardySpicer wrote:
> On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote: >> HardySpicer wrote: >>> On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: >>>> Cagdas Ozgenc wrote: >>>>> Hello, >>>>> In Kalman filtering does the process noise have to be Gaussian or >>>>> would any uncorrelated covariance stationary noise satisfy the >>>>> requirements? >>>>> When I follow the derivations of the filter I haven't encountered any >>>>> requirements on Gaussian distribution, but in many sources Gaussian >>>>> tag seems to go together. >>>> The Kalman filter is only guaranteed to be optimal when: >>>> * The modeled system is linear. >>>> * Any time-varying behavior of the system is known. >>>> * The noise (process and measurement) is Gaussian. >>>> * The noise's time-dependent behavior is known >>>> (note that this means the noise doesn't have to be stationary -- >>>> just that it's time-dependent behavior is known). >>>> * The model exactly matches reality. >>>> None of these requirements can be met in reality, but the math is at its >>>> most tractable when you assume them. Often the Gaussian noise >>>> assumption comes the closest to being true -- but not always. >>>> If your system matches all of the above assumptions _except_ the >>>> Gaussian noise assumption, then the Kalman filter that you design will >>>> have the lowest error variance of any possible _linear_ filter, but >>>> there may be nonlinear filters with better (perhaps significantly >>>> better) performance. >>> Don't think so. You can design an H infinity linear Kalman filter >>> which is only a slight modification and you don't even need to know >>> what the covariance matrices are at all. >>> H infinity will give you the minimum of the maximum error. >> But strictly speaking the H-infinity filter isn't a Kalman filter. It's >> certainly not what Rudi Kalman cooked up. It is a state-space state >> estimator, and is one of the broader family of "Kalmanesque" filters, >> however. >> >> And the H-infinity filter won't minimize the error variance -- it >> minimizes the min-max error, by definition. >> >> -- >> Tim Wescott >> Control system and signal processing consultingwww.wescottdesign.com > > Who says that minimum mean-square error is the best? That's just one > convenient criterion.
Not me! I made the point in another branch of this thread -- my "optimum" may well not be your "optimum". Indeed, my "optimum" may be a horrendous failure to fall inside the bounds of your "good enough". Minimum mean-square error certainly makes the math easy, though.
> For example, the optimal control problem with a Kalman filter is > pretty bad. It doesn't even have integral action. > Simple PID gives better results for many occasions.
OTOH, if you model the plant as having an uncontrolled integrator and you track that integrator with your Kalman, you suddenly have an 'I' term. -- Tim Wescott Control system and signal processing consulting www.wescottdesign.com
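To make the uncontrolled-integrator trick concrete: augment the state with a constant (integrated) disturbance that the control input does not drive, and let the Kalman filter estimate it. A minimal sketch with a made-up one-state plant; the numbers are purely illustrative.

import numpy as np

# Hypothetical one-state plant x[k+1] = a*x[k] + b*(u[k] + d[k]),
# where d is an unknown, slowly varying input disturbance.
a, b = 0.95, 0.1

# Model d as an extra state the control cannot reach: d[k+1] = d[k].
# The Kalman filter then tracks d, and subtracting d_hat from the commanded
# input gives the loop its 'I' action.
A_aug = np.array([[a,   b],
                  [0.0, 1.0]])    # augmented state = [x, d]
B_aug = np.array([[b],
                  [0.0]])         # control only drives x
C_aug = np.array([[1.0, 0.0]])    # we measure x alone

# (A_aug, C_aug) is observable, so the estimate of d can converge:
O = np.vstack([C_aug, C_aug @ A_aug])
assert np.linalg.matrix_rank(O) == 2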
Reply by Rune Allnor April 27, 2010
On 27 apr, 17:25, Tim Wescott <t...@seemywebsite.now> wrote:
> Rune Allnor wrote: > > On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote: > > >> Indeed, the first step
...
> > Ehh... I would rate that as the *second* step.
...
> That doesn't sound as nifty.
Suffice it to say that rhetoric has never been among my fortes. Rune
Reply by Vladimir Vassilevsky April 27, 2010

HardySpicer wrote:


> Who says that minimum mean-square error is the best? That's just one > convenient criterion. > For example, the optimal control problem with a Kalman filter is > pretty bad. It doesn't even have integral action. > Simple PID gives better results for many occasions.
"Of all the idiots the most insufferable are those who are not completely deprived mind" Francois de La Rochefoucauld
Reply by HardySpicer April 27, 2010
On Apr 27, 2:53 pm, Tim Wescott <t...@seemywebsite.now> wrote:
> HardySpicer wrote: > > On Apr 27, 4:40 am, Tim Wescott <t...@seemywebsite.now> wrote: > >> Cagdas Ozgenc wrote: > >>> Hello, > >>> In Kalman filtering does the process noise have to be Gaussian or > >>> would any uncorrelated covariance stationary noise satisfy the > >>> requirements? > >>> When I follow the derivations of the filter I haven't encountered any > >>> requirements on Gaussian distribution, but in many sources Gaussian > >>> tag seems to go together. > >> The Kalman filter is only guaranteed to be optimal when: > > >> * The modeled system is linear. > >> * Any time-varying behavior of the system is known. > >> * The noise (process and measurement) is Gaussian. > >> * The noise's time-dependent behavior is known > >>   (note that this means the noise doesn't have to be stationary -- > >>   just that it's time-dependent behavior is known). > >> * The model exactly matches reality. > > >> None of these requirements can be met in reality, but the math is at its > >> most tractable when you assume them. Often the Gaussian noise > >> assumption comes the closest to being true -- but not always. > > >> If your system matches all of the above assumptions _except_ the > >> Gaussian noise assumption, then the Kalman filter that you design will > >> have the lowest error variance of any possible _linear_ filter, but > >> there may be nonlinear filters with better (perhaps significantly > >> better) performance. > > > Don't think so. You can design an H infinity linear Kalman filter > > which is only a slight modification and you don't even need to know > > what the covariance matrices are at all. > > H infinity will give you the minimum of the maximum error. > > But strictly speaking the H-infinity filter isn't a Kalman filter. It's > certainly not what Rudi Kalman cooked up. It is a state-space state > estimator, and is one of the broader family of "Kalmanesque" filters, > however. > > And the H-infinity filter won't minimize the error variance -- it > minimizes the min-max error, by definition. > > -- > Tim Wescott > Control system and signal processing consultingwww.wescottdesign.com
Who says that minimum mean-square error is the best? That's just one convenient criterion. For example, the optimal control problem with a Kalman filter is pretty bad. It doesn't even have integral action. Simple PID gives better results for many occasions. Hardy
Reply by Tim Wescott April 27, 2010
Rune Allnor wrote:
> On 27 apr, 04:54, Tim Wescott <t...@seemywebsite.now> wrote: > >> Indeed, the first step in applying someone's "optimal" formulation is >> deciding if their "optimal" comes within the bounds of your "good enough". > > Ehh... I would rate that as the *second* step. The first item on > my list would be find out in what sense an 'optimal' filter is > optimal: > > - Error magnitude? > - Operational robustness? > - Computational efficiency? > - Ease of implementation? > - Economy? > - Balancing all the above?
That doesn't sound as nifty. Besides, evaluating all of those is (IMHO) an essential part of deciding if someone's "optimal" is within my bounds of "good enough". Sometimes the best _practical_ solution is really crappy _technically_, because sometimes the variable in most urgent need of optimization is engineering time, or weight, or power consumption, or some other parameter that just doesn't find its way into the usual elegant formulations of "optimal". -- Tim Wescott Control system and signal processing consulting www.wescottdesign.com
Reply by Tim Wescott April 27, 2010
Frnak McKenney wrote:
> On Mon, 26 Apr 2010 19:54:35 -0700, Tim Wescott <tim@seemywebsite.now> wrote: > >> Indeed, the first step in applying someone's "optimal" formulation is >> deciding if their "optimal" comes within the bounds of your "good enough". > > Tim, > > A phrase that deserves repeating in a variety of contexts. Mind if > I steal it? > > > Frank "TaglinesRUs" McKenney
Attribute it to me, and otherwise it's all yours. -- Tim Wescott Control system and signal processing consulting www.wescottdesign.com