

(Ron Jones worked with me in 1985 for his Ph.D. He retired recently after nearly 40 years at Sandia National Labs in Albuquerque and now has a chance to return to the problem he studied in his thesis. -- CBM)

By Rondall Jones

My interest is in automatically solving difficult linear systems, A*x = b. Such systems often arise, for example, in "inverse problems" in which the analyst is trying to reverse the effects of natural smoothing processes such as heat dissipation, optical blurring, or indirect sensing. These problems exhibit "ill-conditioning", which means that the solution is overly sensitive to insignificant changes in the observations, which are given in the right-hand-side vector, b.

Here is a graphic showing this behavior using a common test matrix, a 31 x 31 Hilbert matrix, with the blue line being the ideal solution that one would hope a solver could compute. The jagged red line shows the result of a traditional solver on this problem. In fact this graph is extremely mild: the magnitude of the oscillations often measures in the millions, not just a little larger than the true solution.

Traditionally, analysts have approached this issue in the linear algebraic system context by appending equations to A*x = b that request each solution value, x(i), to be zero. One then weights these conditioning equations using a parameter usually called "lambda". What we have to solve then is this expanded linear system:

   [A; lambda*I] * x = [b; 0]

If we decompose A into its Singular Value Decomposition, A = U * S * V', and multiply both sides by the transpose of the augmented left-hand side, the resulting solution (writing p for lambda) is

   x = V * inv(S^2 + p^2*I) * S * U' * b

Then x is computed in the usual SVD manner, with the change that each singular value S(i) is replaced by S(i) + p^2/S(i). It is convenient in the following discussion to represent this as

   x = V * PCV

instead of the usual

   x = V * inv(S) * U' * b

where PCV = inv(S^2 + p^2*I) * S * U' * b is what we call the Picard Condition Vector.
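As a minimal sketch of the expanded linear system described above (plain Python; the 2 x 2 nearly singular matrix, right-hand side, and lambda value are made-up illustrations, not from the original post): solving the augmented system in the least-squares sense is the same as solving the regularized normal equations (A'*A + lambda^2*I)*x = A'*b.

```python
lam = 0.1  # the weighting parameter "lambda" (illustrative value)

# A small, nearly singular test matrix and a consistent right-hand side,
# chosen so the ideal solution is x = [1, 1].
A = [[1.0, 1.0],
     [1.0, 1.0001]]
b = [2.0, 2.0001]

# Normal equations of the augmented system [A; lam*I] * x = [b; 0]:
# M = A'*A + lam^2*I,  rhs = A'*b.
M = [[A[0][0]**2 + A[1][0]**2 + lam**2,
      A[0][0]*A[0][1] + A[1][0]*A[1][1]],
     [A[0][0]*A[0][1] + A[1][0]*A[1][1],
      A[0][1]**2 + A[1][1]**2 + lam**2]]
rhs = [A[0][0]*b[0] + A[1][0]*b[1],
       A[0][1]*b[0] + A[1][1]*b[1]]

# Solve the 2 x 2 system by Cramer's rule.
det = M[0][0]*M[1][1] - M[0][1]*M[1][0]
x = [(rhs[0]*M[1][1] - M[0][1]*rhs[1]) / det,
     (M[0][0]*rhs[1] - rhs[0]*M[1][0]) / det]

print(x)  # a smooth solution close to [1, 1], slightly shrunk by the weighting
```

Without the lam**2 terms this 2 x 2 system is nearly singular; the appended rows make it comfortably solvable at the cost of a small bias toward zero.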

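The replacement of each singular value S(i) by S(i) + p^2/S(i) can be checked term by term, since S/(S^2 + p^2) is algebraically identical to 1/(S + p^2/S). A small sketch (plain Python; the numeric values are arbitrary illustrations):

```python
# Illustrative values (assumptions): a tiny singular value s, weight p,
# and a coefficient c standing in for the corresponding entry of U'*b.
s, p, c = 0.001, 0.1, 2.0

plain   = c / s                    # ordinary SVD term: huge when s is tiny
damped1 = s / (s**2 + p**2) * c    # term from x = V*inv(S^2 + p^2*I)*S*U'*b
damped2 = c / (s + p**2 / s)       # same term, dividing by s + p^2/s instead of s

print(plain, damped1, damped2)
```

The tiny singular value turns a coefficient of 2000 into one of about 0.2, which is exactly the mechanism that suppresses the wild oscillations in the Hilbert matrix example.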