Ordinary least-squares fit

  1. (6 points)

    Implement a routine which makes a least-squares fit of a given data set, {x_i, y_i, δy_i | i = 1…n}, with a linear combination of given functions, F(x) = ∑_{k=1…m} c_k f_k(x).

    The subroutine must calculate the vector of the coefficients, c_k, and the covariance matrix. The fitting functions {f_k(x)} can be programmed, e.g., as an array of functions.
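    One possible shape of such a routine (a hypothetical Python sketch; the name `lsfit`, the use of numpy, and the normal-equations approach are illustrative assumptions, not a prescribed implementation):

```python
import numpy as np

def lsfit(fs, x, y, dy):
    """Weighted linear least-squares fit of y(x) ~ sum_k c_k f_k(x).

    fs       : list of m fitting functions f_k(x)
    x, y, dy : data arrays of length n (dy are the uncertainties)
    Returns the coefficient vector c and its covariance matrix.
    """
    x, y, dy = map(np.asarray, (x, y, dy))
    # Weighted design matrix: A[i, k] = f_k(x_i) / dy_i, b[i] = y_i / dy_i
    A = np.column_stack([f(x) / dy for f in fs])
    b = y / dy
    # Solve the normal equations A^T A c = A^T b
    ATA = A.T @ A
    c = np.linalg.solve(ATA, A.T @ b)
    # Covariance matrix of the coefficients: (A^T A)^{-1}
    cov = np.linalg.inv(ATA)
    return c, cov
```

    The fitting functions can then be supplied, e.g., as `fs = [lambda t: np.ones_like(t), lambda t: t]` for a straight-line fit.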

    Make up some interesting fits and check that the reported uncertainties of the fitting coefficients are reasonable. For example, fit data generated from a known function with small added perturbations and check that the fitted coefficients agree with the true values within the reported uncertainties.
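    One such check might look like this (a self-contained Python sketch with made-up data; all names and the numpy calls are illustrative assumptions): fit a straight line a + b·x to data generated from y = 1 + 2x with small fixed perturbations, and verify that the fitted coefficients land within a few reported uncertainties of the true values.

```python
import numpy as np

# Made-up data: y = 1 + 2x plus small fixed "measurement" perturbations
x  = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y  = 1.0 + 2.0 * x + np.array([0.1, -0.05, 0.02, -0.08, 0.04])
dy = np.full_like(x, 0.1)

# Weighted design matrix for f_1(x) = 1, f_2(x) = x
A = np.column_stack([np.ones_like(x) / dy, x / dy])
b = y / dy

c, *_ = np.linalg.lstsq(A, b, rcond=None)
cov = np.linalg.inv(A.T @ A)   # covariance of the coefficients
dc = np.sqrt(np.diag(cov))     # their uncertainties

print("c  =", c)
print("dc =", dc)
```

    Here one would check that |c[0] − 1| and |c[1] − 2| are of the order of dc[0] and dc[1], respectively.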

  2. (3 points) Linear least-squares solution using thin singular-value decomposition.

    Here, by the thin singular-value decomposition we shall understand a representation of a tall n×m (n > m) matrix A in the form A = UΣV^T, where U is an n×m matrix with orthonormal columns (U^T U = 1), Σ is a square m×m diagonal matrix with non-negative real numbers on the diagonal (called the singular values of the matrix A), and V is a square m×m orthogonal matrix (V^T V = 1).

    The singular-value decomposition can be found by diagonalising the matrix A^T A: A^T A = V D V^T, where D is a diagonal matrix with the eigenvalues of A^T A on the diagonal and V is the matrix of the corresponding eigenvectors. Indeed, it is easy to check (do it) that the sought decomposition can then be constructed as A = UΣV^T with Σ = D^(1/2) and U = A V D^(-1/2).
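    This construction can be sketched as follows (an illustrative Python snippet; numpy's symmetric eigensolver `numpy.linalg.eigh` stands in for one's own Jacobi eigenvalue routine):

```python
import numpy as np

def svd_via_ata(A):
    """Thin SVD A = U S V^T of a tall n-by-m matrix A (n >= m),
    built from the eigendecomposition A^T A = V D V^T."""
    D, V = np.linalg.eigh(A.T @ A)  # eigenvalues D, eigenvectors V
    s = np.sqrt(D)                  # singular values: Sigma = D^(1/2)
    U = A @ V / s                   # U = A V D^(-1/2), column-wise division
    return U, s, V
```

    Note that this assumes A has full column rank, so that all eigenvalues of A^T A are strictly positive and the division by s is safe.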

    The singular-value decomposition can be used to solve the linear least-squares problem Ax = b. Indeed, inserting the decomposition into the equation gives UΣV^T x = b. Multiplying from the left with U^T and using the orthonormality of the columns of U, one gets the projected equation ΣV^T x = U^T b. This is a square m×m system which is easily solved: first solve the diagonal system Σy = U^T b for y, then obtain x as x = Vy.
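    The solve step then amounts to a couple of lines (an illustrative sketch; `numpy.linalg.svd` is used here for self-containedness, where one would plug in one's own decomposition):

```python
import numpy as np

def lstsq_svd(A, b):
    """Least-squares solution of A x ~= b via the thin SVD A = U S V^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Projected equation: S V^T x = U^T b; solve the diagonal system for y
    y = (U.T @ b) / s
    # Back-substitute: x = V y
    return Vt.T @ y
```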

    1. Implement a function that makes the singular-value decomposition of a (tall) matrix A by diagonalising A^T A (with your implementation of the Jacobi eigenvalue algorithm), A^T A = V D V^T (where D is a diagonal matrix with the eigenvalues of A^T A and V is the matrix of the corresponding eigenvectors), and then using A = UΣV^T where U = A V D^(-1/2) and Σ = D^(1/2).
    2. Implement a function that solves the linear least-squares problem Ax = b using your implementation of the singular-value decomposition.
    3. Implement a function that makes an ordinary least-squares fit using your implementation of the singular-value decomposition.
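    Putting the pieces together, an SVD-based fit might look like this (a hypothetical sketch, again using numpy's SVD as a stand-in). A useful observation: the covariance matrix (A^T A)^{-1} = V Σ^(-2) V^T comes almost for free from the decomposition:

```python
import numpy as np

def lsfit_svd(fs, x, y, dy):
    """Ordinary least-squares fit via SVD, returning coefficients and covariance."""
    x, y, dy = map(np.asarray, (x, y, dy))
    # Weighted design matrix A[i, k] = f_k(x_i) / dy_i
    A = np.column_stack([f(x) / dy for f in fs])
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Solve S V^T c = U^T (y/dy) for the coefficients
    c = Vt.T @ ((U.T @ (y / dy)) / s)
    # Covariance: (A^T A)^{-1} = V Sigma^{-2} V^T
    cov = Vt.T @ np.diag(s**-2.0) @ Vt
    return c, cov
```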
  3. (1 point)

    Make up two experimental points with error bars, {x_i, y_i ± Δy_i | i = 1, 2}. Assume that the theoretical curve is a straight line. Make an ordinary least-squares fit of your experimental data with a linear function y(x) = a + bx. Make a plot of your results. On the plot, illustrate the probable distribution of the variable y as a function of x (under the assumption that y_i, a, and b are normally distributed, if you need that).

    Hint: the variation of y with respect to variations of a and b, as a function of x, is given by
    δy = δa + δb x.
    Therefore
    (δy)^2 = (δa)^2 + (δb)^2 x^2 + 2 δa δb x.
    Averaging over normally distributed a and b this translates into
    Δy^2(x) = Δa^2 + Δb^2 x^2 + 2<δa δb> x,
    where <δa δb> is the covariance of a and b.
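    With the covariance matrix at hand, the band Δy(x) can be evaluated pointwise for plotting (an illustrative sketch; `cov` is assumed to be the 2×2 covariance matrix of (a, b) returned by the fit, with cov[0,0] = Δa^2, cov[1,1] = Δb^2, cov[0,1] = <δa δb>):

```python
import numpy as np

def yband(xs, a, b, cov):
    """Central value and 1-sigma band of y(x) = a + b x,
    using Delta_y^2(x) = Delta_a^2 + Delta_b^2 x^2 + 2 <da db> x."""
    xs = np.asarray(xs)
    yc = a + b * xs
    dy = np.sqrt(cov[0, 0] + cov[1, 1] * xs**2 + 2 * cov[0, 1] * xs)
    return yc, dy
```

    The plot would then show the central line yc together with the curves yc ± dy.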

  4. (0 points)

    Make up an interesting exercise here.