Reply by HardySpicer January 9, 2009
On Jan 9, 4:08 pm, Tim Wescott <t...@seemywebsite.com> wrote:
> On Thu, 08 Jan 2009 11:21:21 -0800, HardySpicer wrote:
> > [...original post snipped...]
>
> This is a solved problem if you bear in mind that a steady state Kalman
> filter is really just a Wiener filter with a fancy name.
>
> Then web search accordingly.
>
> --
> Tim Wescott
Trouble with Kalman filters is that you need to solve a Riccati equation for the gain matrix. With polynomial or LMS-type adaptive Wiener filters the computational load is far lower.

H.
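For what it's worth, the LMS route really is cheap. A sketch of the scalar-output case (tap count, step size and data below are made up for illustration): each update is O(N), with no Riccati equation in sight.

```python
import numpy as np

# Minimal LMS adaptive filter sketch (illustrative, not from any paper).
# Each iteration costs O(N) -- no Riccati equation, unlike the
# steady-state Kalman gain computation.
rng = np.random.default_rng(0)
N = 4                                # number of taps / regressors (assumed)
w_true = np.array([0.5, -1.0, 2.0, 0.25])  # "unknown" system to identify
w = np.zeros(N)                      # adaptive weight vector W
mu = 0.05                            # step size (must satisfy the usual stability bound)

for _ in range(5000):
    x = rng.standard_normal(N)       # regressor vector X
    d = w_true @ x                   # desired output d = W'X (noise-free for clarity)
    e = d - w @ x                    # a priori error
    w += mu * e * x                  # LMS update: W <- W + mu * e * X

print(np.round(w, 3))                # converges toward w_true
```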
Reply by steveu January 8, 2009
> This is a solved problem if you bear in mind that a steady state Kalman
> filter is really just a Wiener filter with a fancy name.
>
> Then web search accordingly.
>
> --
> Tim Wescott
I'm OK with the adaptive filtering, but I'm still puzzled. What makes "Kalman" a fancier name than "Wiener"? :-\

Steve
Reply by Tim Wescott January 8, 2009
On Thu, 08 Jan 2009 11:21:21 -0800, HardySpicer wrote:

> For the standard SISO Wiener filter we minimize the cost J
>
> [...original post snipped...]
>
> Now in the multidimensional Wiener filter we have a term X'WW'X which
> needs differentiating wrt the matrix W.
>
> We also have terms X'Wd and dW'X which need differentiating wrt the
> matrix W.
>
> H.
This is a solved problem if you bear in mind that a steady state Kalman filter is really just a Wiener filter with a fancy name.

Then web search accordingly.

--
Tim Wescott
Control systems and communications consulting
http://www.wescottdesign.com

Need to learn how to apply control theory in your embedded system?
"Applied Control Theory for Embedded Systems" by Tim Wescott
Elsevier/Newnes, http://www.wescottdesign.com/actfes/actfes.html
Reply by HardySpicer January 8, 2009
On Jan 9, 10:36 am, HardySpicer <gyansor...@gmail.com> wrote:
> On Jan 9, 8:21 am, HardySpicer <gyansor...@gmail.com> wrote:
> > [...original post snipped...]
>
> [...snip...]
>
> using this and substituting
>
> ||-WX+d||^2
>
> I get
>
> -2WXX' + 2dX' = 0
>
> [...snip...]
>
> Now does W'=W ie is the filter weight matrix symmetric??
>
> H.
Oops, it's right. I had minimized ||-WX+d||^2 when it should have been ||-W'X+d||^2. All OK!
Reply by HardySpicer January 8, 2009
On Jan 9, 10:36 am, HardySpicer <gyansor...@gmail.com> wrote:
> On Jan 9, 8:21 am, HardySpicer <gyansor...@gmail.com> wrote:
> > [...original post snipped...]
>
> [...snip...]
>
> That if you differentiate the norm squared
>
> ||AX+b||^2
>
> you get
>
> 2AXX' + 2bX'
>
> [...snip...]
>
> H.
Sorry - I should have said: that result is for differentiating the norm squared ||AX+b||^2 with respect to A.
Reply by HardySpicer January 8, 2009
On Jan 9, 8:21 am, HardySpicer <gyansor...@gmail.com> wrote:
> For the standard SISO Wiener filter we minimize the cost J
>
> [...original post snipped...]
>
> H.
Actually I just found this result from

www.che.iitm.ac.in/~naras/ch544/matrix.pdf

If you differentiate the norm squared

||AX+b||^2

you get

2AXX' + 2bX'

Using this and substituting

||-WX+d||^2

I get

-2WXX' + 2dX' = 0

Now since E[XX'] = R, the correlation matrix, I get

WR = E[dX']                    (1)

or

W = Rdx R^-1

which is not the same result as in the paper. However, R is symmetric, so from (1)

W'R = E[Xd']

W' = R^-1 Rxd    (Rxd is the cross-correlation matrix between X and d)

which is more like it. Now does W'=W, i.e. is the filter weight matrix symmetric??

H.
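FWIW, the stationarity condition W R = E[dX'] for the cost E||d - WX||^2 as written above is easy to check with a quick Monte-Carlo experiment. Dimensions, sample count and noise level below are illustrative.

```python
import numpy as np

# Monte-Carlo check of the normal equation W R = E[d X'] for the cost
# J = E||d - W X||^2 (the convention used in the derivation above).
# Dimensions, sample count and noise level are illustrative.
rng = np.random.default_rng(1)
n_x, n_d, n_samp = 3, 2, 100_000
W_true = rng.standard_normal((n_d, n_x))        # "unknown" filter to recover

X = rng.standard_normal((n_x, n_samp))          # regressor samples as columns
D = W_true @ X + 0.1 * rng.standard_normal((n_d, n_samp))  # desired + noise

R = (X @ X.T) / n_samp                          # sample estimate of E[X X']
Rdx = (D @ X.T) / n_samp                        # sample estimate of E[d X']

W = Rdx @ np.linalg.inv(R)                      # solve W R = E[d X']
print(np.round(W - W_true, 2))                  # close to the zero matrix
```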
Reply by HardySpicer January 8, 2009
For the standard SISO Wiener filter we minimize the cost J

J=E[e^2]=E(d-W'X)^2

where W is a vector of weights and X is a vector of regressors (d is
the desired output). Also, ' denotes transpose.
We do this by differentiating wrt the weight vector W and arrive at the
standard Wiener solution.

However, in the case where W is a (possibly asymmetric) matrix and d is
a vector (X is still a vector), we have to differentiate wrt a matrix
and the error is a vector.

J=e'e = (d-W'X)'(d-W'X)

i.e. we need dJ/dW (J is still a scalar).

I have a paper that just says the answer has the same form, but with no
derivation! Differentiating wrt a matrix, however, is slightly
different.

------------------------------------------------------
For example...
Differentiation of a scalar wrt a vector of the quadratic form

y = x'Ax where A is a matrix and x a vector


dy/dx = Ax+A'x = 2Ax if A is symmetric.
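That identity is easy to sanity-check numerically, e.g. by central differences on a random non-symmetric A (illustrative code, not from any reference):

```python
import numpy as np

# Finite-difference check that d(x'Ax)/dx = (A + A')x  (= 2Ax when A symmetric).
rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))      # deliberately non-symmetric
x = rng.standard_normal(n)

y = lambda v: v @ A @ v              # scalar quadratic form x'Ax
eps = 1e-6
I = np.eye(n)
grad_fd = np.array([(y(x + eps * I[i]) - y(x - eps * I[i])) / (2 * eps)
                    for i in range(n)])
grad_analytic = (A + A.T) @ x

# Near zero: y is quadratic, so central differences are nearly exact.
print(np.max(np.abs(grad_fd - grad_analytic)))
```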




Now in the multidimensional Wiener filter we have a term
X'WW'X which needs differentiating wrt the matrix W.

We also have  terms X'Wd and dW'X which need differentiating wrt the
matrix W.


H.
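The matrix derivative can also be settled numerically. Expanding J = d'd - 2X'Wd + X'WW'X (both cross terms are equal scalars), the standard identities d(X'Wd)/dW = Xd' and d(X'WW'X)/dW = 2XX'W give dJ/dW = -2Xd' + 2XX'W, so setting the expectation to zero yields E[XX']W = E[Xd'], the same form as the vector case. A finite-difference check with random data (dimensions are illustrative):

```python
import numpy as np

# Finite-difference check of dJ/dW for J = (d - W'X)'(d - W'X),
# where W is n x m, X is n x 1, d is m x 1.
# Candidate gradient from matrix-calculus identities: -2 X d' + 2 X X' W.
rng = np.random.default_rng(3)
n, m = 4, 3
W = rng.standard_normal((n, m))
X = rng.standard_normal(n)
d = rng.standard_normal(m)

def J(Wm):
    e = d - Wm.T @ X                 # error vector e = d - W'X
    return e @ e                     # scalar cost e'e

grad_analytic = -2 * np.outer(X, d) + 2 * np.outer(X, X) @ W

eps = 1e-6
grad_fd = np.zeros_like(W)
for i in range(n):
    for j in range(m):
        E = np.zeros_like(W)
        E[i, j] = eps
        grad_fd[i, j] = (J(W + E) - J(W - E)) / (2 * eps)

# Near zero: J is quadratic in W, so central differences are nearly exact.
print(np.max(np.abs(grad_fd - grad_analytic)))
```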