
Least Squares Regression Algorithm

Least Squares via QR Decomposition

Topic
Algorithmic Programming
Difficulty
L4

Problem Details

Give an algorithm for linear least squares regression (i.e., solve $\min_\beta \|y - X\beta\|^2$).

Every invertible $n \times n$ matrix $A$ can be written as $A = QR$, where $Q$ is orthogonal ($Q^T Q = I$) and $R$ is upper triangular.
One constructs $Q$ by applying Gram-Schmidt to the columns of $A$, and $R$ arises from the combination coefficients.
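As a concrete illustration of this construction, here is a minimal sketch of classical Gram-Schmidt QR in Python with NumPy (the function name `gram_schmidt_qr` is my own; classical Gram-Schmidt can lose orthogonality on ill-conditioned inputs, so production code typically uses Householder reflections instead):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization of A via classical Gram-Schmidt on its columns.

    Returns Q (orthonormal columns) and R (upper triangular) with A = Q R.
    Assumes the columns of A are linearly independent.
    """
    n, p = A.shape
    Q = np.zeros((n, p))
    R = np.zeros((p, p))
    for j in range(p):
        v = A[:, j].astype(float)  # astype makes a copy we can modify
        for i in range(j):
            # R[i, j] is the coefficient of column j along q_i
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        # What remains of v is orthogonal to q_0 ... q_{j-1}
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R
```

The entries of $R$ are exactly the "combination coefficients" mentioned above: column $j$ of $A$ equals $\sum_{i \le j} R_{ij} q_i$.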

Least squares: if $X$ is $n \times p$, we can factor $X = QR$ (where $Q$ is $n \times p$ with orthonormal columns and $R$ is $p \times p$) to solve $X\beta \approx y$. Then $R\beta = Q^T y$.

Question: Give an algorithm for linear least squares regression.

Solution

Standard approach:

  • Compute the QR decomposition of the design matrix $X$: $X = QR$ ($Q$ has orthonormal columns, $R$ is upper triangular).
  • By an equivalent transformation of the normal equations:
$$\hat\beta = \arg\min_\beta \|y - X\beta\|^2 \quad\Rightarrow\quad R\hat\beta = Q^T y.$$
  • Solve this upper-triangular system by back-substitution to obtain $\hat\beta$.

Compared with solving $(X^T X)\hat\beta = X^T y$ directly, this is numerically more stable.


Original Explanation

We have $Y = X\beta + \epsilon$. Minimize $\|Y - X\beta\|^2$. The normal equations are $(X^T X)\hat\beta = X^T Y$. Instead of inverting $X^T X$, factor $X = QR$ and solve $R\hat\beta = Q^T Y$.