Hindawi Publishing Corporation
Mathematical Problems in Engineering
Volume 2011, Article ID 385870, 5 pages
doi:10.1155/2011/385870

Research Article

A Representation of Nonhomogeneous Quadratic Forms with Application to the Least Squares Solution

Czesław Stepniak
Institute of Mathematics, University of Rzeszów, Rejtana 16 A, 35-959 Rzeszów, Poland

Correspondence should be addressed to Czesław Stepniak, stepniak@umcs.lublin.pl

Received 5 June 2011; Revised 16 July 2011; Accepted 16 July 2011

Academic Editor: Piermarco Cannarsa

Copyright © 2011 Czesław Stepniak. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The least squares problem appears, among others, in linear models, and it refers to an inconsistent system of linear equations. A crucial question is how to reduce the least squares solution in such a system to the usual solution in a consistent one. Traditionally, this is achieved by differential calculus. We present a purely algebraic approach to this problem based on some identities for nonhomogeneous quadratic forms.

1. Introduction and Notation

The least squares problem appears, among others, in linear models, and it refers to an inconsistent system $Ax = b$ of linear equations. Formally, it reduces to minimizing the nonhomogeneous quadratic form

    $f(x) = (Ax - b)^T (Ax - b)$.    (1.1)

The classical approach to the problem, presented in such well-known books as Scheffé [1, Chapter 1], Rao [2, pages 222 and 223], and Rao and Toutenburg [3, pages 20–23], uses differential calculus and leads to the so-called normal equation $A^T A x = A^T b$, which is consistent. The aim of this note is to present some useful algebraic identities for nonhomogeneous quadratic forms that lead directly to the normal equation.

Traditional vector-matrix notation will be used. Among others, if $M$ is a matrix, then $M^T$, $R(M)$, and $r(M)$ stand for its transpose, range (column space), and rank. Moreover, $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space represented by column vectors.

2. Background

Any system of linear equations may be presented in vector-matrix form as

    $Ax = b$,    (2.1)

where $A$ is a given $n \times p$ matrix, $b$ is a given vector in $\mathbb{R}^n$, and $x \in \mathbb{R}^p$ is an unknown vector. It is well known that (2.1) is consistent if and only if $b$ belongs to the range $R(A)$.

If (2.1) is inconsistent, one can seek a vector $x$ minimizing the norm $\|Ax - b\|$ or, equivalently, its square $(Ax - b)^T (Ax - b)$. The Least Squares Solution (LSS) of (2.1) is defined as a vector $x_0 \in \mathbb{R}^p$ such that

    $(Ax_0 - b)^T (Ax_0 - b) \le (Ax - b)^T (Ax - b)$ for all $x \in \mathbb{R}^p$.    (2.2)

A crucial problem is how to reduce the LSS of the inconsistent equation (2.1) to the usual solution of a consistent one. Formally, the least squares problem deals with minimizing the nonhomogeneous quadratic form $f(x) = (Ax - b)^T (Ax - b)$. Traditionally, this problem is solved by differential calculus and leads to the normal equation $A^T A x = A^T b$. In the next section, we will present some useful algebraic identities for nonhomogeneous quadratic forms. They yield the inequality (2.2) directly.
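As a quick numerical illustration of this setup (a sketch added here, not part of the original derivation), the following Python snippet builds a small inconsistent system, solves the normal equation $A^T A x = A^T b$ mentioned above, and spot-checks the defining inequality (2.2). The particular matrix $A$, vector $b$, and the helper name `f` are illustrative choices only.

```python
# Minimal illustration of the least squares problem (2.1)-(2.2); the data are
# arbitrary examples, not taken from the article.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # n = 3, p = 2, rank(A) = 2
b = np.array([1.0, 2.0, 0.0])   # b lies outside R(A), so Ax = b is inconsistent

def f(x):
    """The nonhomogeneous quadratic form f(x) = (Ax - b)^T (Ax - b)."""
    r = A @ x - b
    return r @ r

# Usual solution of the (consistent) normal equation A^T A x = A^T b ...
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
# ... coincides with the least squares solution of the inconsistent system.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_ne, x_ls)

# Spot check of the defining inequality (2.2): f(x_ne) <= f(x) for other x.
rng = np.random.default_rng(0)
assert all(f(x_ne) <= f(x_ne + rng.standard_normal(2)) + 1e-9 for _ in range(1000))
```

Any full-rank example behaves in the same way; the point is only that the solution of the consistent normal equation attains the minimum in (2.2), which is exactly the reduction studied below.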
3. Identities and Inequalities for Nonhomogeneous Quadratic Forms

The usual, that is, homogeneous, quadratic form is a real function $f_M(x) = x^T M x$ defined on $\mathbb{R}^p$. In this note, we shall also consider nonhomogeneous quadratic forms of the type

    $f_{M,a}(x) = (x - a)^T M (x - a)$,    (3.1)

where $M$ is a symmetric $p \times p$ matrix and $a$ is a vector in $\mathbb{R}^p$.

Some inequalities for nonhomogeneous quadratic forms may be found in Stepniak [4]. Let us recall one of these results, which is very useful in nonhomogeneous linear estimation.

Lemma 3.1. For any symmetric nonnegative definite matrices $M_1$ and $M_2$ of order $p$, the condition

    $(x - x_1)^T M_1 (x - x_1) + c_1 \ge (x - x_2)^T M_2 (x - x_2) + c_2$    (3.2)

for some $c_1, c_2 \in \mathbb{R}$, $x_1, x_2 \in \mathbb{R}^p$ and all $x \in \mathbb{R}^p$ implies that $M_1 - M_2$ is nonnegative definite and $c_1 - c_2 \ge 0$.

Now we will present an identity which may serve as a convenient tool in finding the LSS of (2.1). For convenience, we will start from the case $r(A) = p$, leaving the singular case $r(A) < p$ to Section 5.

Proposition 3.2. For an arbitrary $n \times p$ matrix $A$ of rank $p$ and an arbitrary vector $b \in \mathbb{R}^n$,

    $(Ax - b)^T (Ax - b) = (A^T A x - A^T b)^T (A^T A)^{-1} (A^T A x - A^T b) + b^T \left( I_n - A (A^T A)^{-1} A^T \right) b$.    (3.3)

Proof. Let us start from the trivial identity

    $I_n = A (A^T A)^{-1} A^T + \left( I_n - A (A^T A)^{-1} A^T \right)$.    (3.4)

We only need to premultiply this identity by $(Ax - b)^T$, postmultiply it by $Ax - b$, and then collect the terms to get (3.3).

4. Least Squares and Usual Solutions: Nonsingular Case

As above, we consider an inconsistent equation $Ax = b$, where $A$ is an $n \times p$ matrix of rank $p$. We are interested in the LSS of this equation.

Theorem 4.1. A vector $x \in \mathbb{R}^p$ is a Least Squares Solution of the inconsistent equation $Ax = b$ if and only if it is the usual solution of the consistent equation

    $A^T A x = A^T b$.    (4.1)

Proof. Consistency of (4.1) follows from the fact that $R(A^T A) = R(A^T)$. We note that the second component on the right side of the identity (3.3) does not depend on $x$. Thus, we only need to minimize the first one. Since $A^T A$, and in consequence $(A^T A)^{-1}$, is positive definite, this component is minimal if and only if $A^T A x - A^T b = 0$. This completes the proof.

5. General Case

If $r(A) < p$, then the matrix $A^T A$ is singular, and, therefore, the identity (3.3) is not applicable. However, as we will show, it remains true if we replace $(A^T A)^{-1}$ by an arbitrary generalized inverse $(A^T A)^-$. There are many papers and several books on generalized inverses, among others Bapat [5], Ben-Israel and Greville [6], Campbell and Meyer [7], Pringle and Rayner [8], and Rao and Mitra [9]. A recent paper by Stepniak [10] may serve as a brief and self-contained introduction to this field.

Let us recall that, for a given $n \times p$ matrix $A$, its generalized inverse $A^-$ is defined as an arbitrary $p \times n$ matrix $G$ satisfying the condition $AGA = A$. A key result in this section is stated as follows.

Proposition 5.1. For an arbitrary $n \times p$ matrix $A$ and an arbitrary vector $b \in \mathbb{R}^n$,

    $(Ax - b)^T (Ax - b) = (A^T A x - A^T b)^T (A^T A)^- (A^T A x - A^T b) + b^T \left( I_n - A (A^T A)^- A^T \right) b$,    (5.1)

where $^-$ denotes an arbitrary generalized inverse.

Proof. The idea of the proof is the same as in Proposition 3.2. Since $R(A^T A) = R(A^T)$, one can replace the vector $A^T b$ by $A^T A c$ for some $c$.

Now we will apply the identity (5.1) to the least squares problem.

Theorem 5.2. For an arbitrary $n \times p$ matrix $A$ and an arbitrary vector $b \in \mathbb{R}^n$,

(i) a vector $x \in \mathbb{R}^p$ is a Least Squares Solution of the equation $Ax = b$ if and only if it is the usual solution of the (consistent) equation $A^T A x = A^T b$;

(ii) the lower bound of $(Ax - b)^T (Ax - b)$ is equal to $b^T \left( I_n - A (A^T A)^- A^T \right) b$, and it does not depend on the choice of generalized inverse $(A^T A)^-$.

Proof. By setting $A^T b = A^T A c$, the first component on the right side of (5.1) reduces to $(x - c)^T A^T A (x - c)$, which is nonnegative and equals zero if and only if $A^T A (x - c) = 0$ or, equivalently, $A^T A x = A^T b$. Since the second component does not depend on $x$, this is indeed the total minimum. The same setting shows that the lower bound does not depend on the choice of generalized inverse.
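For readers who wish to check Proposition 5.1 and Theorem 5.2 numerically, the following Python sketch (an illustration added here, not part of the article) verifies identity (5.1) for a rank-deficient $A$ with two different generalized inverses of $A^T A$: the Moore-Penrose inverse computed by numpy, and a second generalized inverse constructed from it. It also confirms that the lower bound in Theorem 5.2(ii) does not depend on that choice. The random data, the matrix `W`, and the helper name `rhs` are illustrative assumptions only.

```python
# Numerical spot check of identity (5.1) and Theorem 5.2(ii) in the singular
# case; a sketch with arbitrary example data, not part of the original proof.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
A[:, 3] = A[:, 0] + A[:, 1]          # force rank(A) = 3 < p = 4
b = rng.standard_normal(6)
x = rng.standard_normal(4)           # an arbitrary x, not necessarily an LSS

M = A.T @ A
G1 = np.linalg.pinv(M)               # Moore-Penrose inverse: one choice of (A^T A)^-
# Any G of the form pinv(M) + (I - pinv(M) M) W also satisfies M G M = M,
# hence is another generalized inverse (W arbitrary).
W = rng.standard_normal((4, 4))
G2 = G1 + (np.eye(4) - G1 @ M) @ W

def rhs(G):
    """Right-hand side of identity (5.1) for a given generalized inverse G."""
    v = M @ x - A.T @ b
    return v @ G @ v + b @ (np.eye(6) - A @ G @ A.T) @ b

lhs = (A @ x - b) @ (A @ x - b)
assert np.allclose(lhs, rhs(G1)) and np.allclose(lhs, rhs(G2))

# Theorem 5.2(ii): the minimum of (Ax - b)^T (Ax - b) equals
# b^T (I_n - A (A^T A)^- A^T) b, independent of the choice of g-inverse.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
fmin = (A @ x_ls - b) @ (A @ x_ls - b)
assert np.allclose(fmin, b @ (np.eye(6) - A @ G1 @ A.T) @ b)
```

The observed invariance is expected, since $A (A^T A)^- A^T$ is the orthogonal projector onto $R(A)$ for every choice of generalized inverse, which is exactly what makes the second component of (5.1) independent of that choice.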
Acknowledgment

The author thanks a referee for his or her useful suggestions concerning the presentation of this paper.

References

[1] H. Scheffé, The Analysis of Variance, John Wiley & Sons, New York, NY, USA, 1959.
[2] C. R. Rao, Linear Statistical Inference and Its Applications, Wiley Series in Probability and Mathematical Statistics, John Wiley & Sons, New York, NY, USA, 2nd edition, 1973.
[3] C. R. Rao and H. Toutenburg, Linear Models: Least Squares and Alternatives, Springer Series in Statistics, Springer, New York, NY, USA, 1995.
[4] C. Stepniak, "Admissible linear estimators in mixed linear models," Journal of Multivariate Analysis, vol. 31, no. 1, pp. 90–106, 1989.
[5] R. B. Bapat, Linear Algebra and Linear Models, Universitext, Springer, New York, NY, USA, 2nd edition, 2000.
[6] A. Ben-Israel and T. N. E. Greville, Generalized Inverses: Theory and Applications, CMS Books in Mathematics/Ouvrages de Mathématiques de la SMC, 15, Springer, New York, NY, USA, 2nd edition, 2003.
[7] S. L. Campbell and C. D. Meyer, Jr., Generalized Inverses of Linear Transformations, Dover, New York, NY, USA, 1991.
[8] R. M. Pringle and A. A. Rayner, Generalized Inverses of Matrices with Applications in Statistics, Griffin, London, UK, 1971.
[9] C. R. Rao and S. K. Mitra, Generalized Inverse of Matrices and Its Applications, John Wiley & Sons, New York, NY, USA, 1971.
[10] C. Stepniak, "Through a generalized inverse," Demonstratio Mathematica, vol. 41, no. 2, pp. 291–296, 2008.