Summary in my own words
The Jacobi method is based on solving for every variable locally with respect to the
other variables; one iteration corresponds to solving for every variable once. It is easy to
understand and implement, but convergence is slow.
Jacobi Method.
The Jacobi method is a method of solving a matrix equation on a
matrix that has no zeros along its main diagonal. Each diagonal element is solved for,
an approximate value is plugged in, and the process is iterated until it converges.
The Jacobi method is easily derived by examining each of the n equations in the linear
system Ax = b in isolation: solving the i-th equation for x_i, with the other components
held at their current values, gives the update x_i^(k+1) = (b_i - Σ_{j≠i} a_ij x_j^(k)) / a_ii.
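As a minimal sketch of this component-wise update (function name, tolerance, and iteration cap are illustrative, not from the notes), the Jacobi method can be written in NumPy as:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Jacobi iteration for Ax = b; assumes A has a nonzero diagonal."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    D = np.diag(A)                 # diagonal entries a_ii
    R = A - np.diagflat(D)         # off-diagonal part of A
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D    # every component uses only the old iterate
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x
```

Note that every component of x_new is computed from the previous iterate x, which is what makes one pass over all n equations a single Jacobi iteration.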
The Gauss-Seidel method is similar to the Jacobi method except that it uses updated
values as soon as they are available. It generally converges faster than the Jacobi method,
although still relatively slowly.
Gauss-Seidel Method: an Iterative Method for Linear Systems.
The method is an improved version
of the Jacobi method. It is defined on matrices with non-zero diagonals, but convergence is only
guaranteed if the matrix is either diagonally dominant, or symmetric and (semi) positive definite.
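A sketch of the Gauss-Seidel update, differing from Jacobi only in that each component immediately uses the values computed earlier in the same sweep (names and tolerances are illustrative):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Gauss-Seidel iteration for Ax = b; assumes A has a nonzero diagonal."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # components 0..i-1 of x are already updated this sweep;
            # components i+1..n-1 still hold the previous iterate
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x
    return x
```

Updating x in place is exactly the "uses updated values as soon as they are available" behavior described above.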
The successive over-relaxation method can be derived from the Gauss-Seidel method by
introducing an extrapolation parameter ω. This method can converge faster than
Gauss-Seidel by an order of magnitude.
California State University, East Bay
Numerical Analysis: Iterative Techniques for Solving Linear Systems.
♣ Successive Over-relaxation Method (SOR). This method of solving a linear system of equations
Ax = b is derived by extrapolating the Gauss-Seidel method. The extrapolation takes the form of a
weighted average between the previous iterate and the computed Gauss-Seidel iterate, applied
successively to each component.
If ω = 1, the SOR method simplifies to the Gauss-Seidel method. A theorem due to Kahan (1958) shows
that SOR fails to converge if ω is outside of the interval (0, 2). In general, it is not possible to compute
in advance the value of ω that will maximize the rate of convergence of SOR.
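A sketch of SOR as the weighted average just described: each component is set to (1 − ω) times its current value plus ω times its Gauss-Seidel value (the parameter default is illustrative, since the optimal ω is generally unknown in advance):

```python
import numpy as np

def sor(A, b, omega=1.25, tol=1e-10, max_iter=500):
    """SOR iteration for Ax = b. Setting omega = 1 recovers Gauss-Seidel;
    by Kahan's theorem, convergence requires 0 < omega < 2."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            gs = (b[i] - s) / A[i, i]               # Gauss-Seidel value for x_i
            x[i] = (1 - omega) * x[i] + omega * gs  # weighted average
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x
    return x
```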
Finally, the symmetric successive over-relaxation method is useful as a pre-conditioner for
non-stationary methods. However, it has no advantage over the successive over-relaxation
method as a stand-alone iterative method.
Neumann Lemma. If A is an n × n matrix with ρ(A) < 1, then I − A is invertible and
(I − A)^(-1) = I + A + A^2 + ... = Σ_{k=0}^∞ A^k.
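As a quick numerical check of the Neumann series (the matrix below is my own illustrative example), the partial sums of I + A + A² + ... approach (I − A)⁻¹ when ρ(A) < 1:

```python
import numpy as np

A = np.array([[0.2, 0.1],
              [0.1, 0.3]])
assert max(abs(np.linalg.eigvals(A))) < 1        # rho(A) < 1

S = np.eye(2)                                    # partial sum I + A + A^2 + ...
term = np.eye(2)
for _ in range(100):
    term = term @ A
    S += term

print(np.allclose(S, np.linalg.inv(np.eye(2) - A)))   # prints True
```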
Symmetric Successive Over-relaxation Method (SSOR).
If the matrix A is symmetric, then
the Symmetric Successive Over-relaxation method combines two SOR sweeps in such a way
that the resulting iteration matrix is similar to a symmetric matrix. Specifically, the first sweep is
carried out as in SOR, but in the second sweep the unknowns are updated in the reverse order. That is,
SSOR is a forward SOR sweep followed by a backward SOR sweep. The similarity of the SSOR iteration
matrix to a symmetric matrix permits the application of SSOR as a pre-conditioner for other iterative
schemes for symmetric matrices. Indeed, this is the primary motivation for SSOR, since its convergence
rate with an optimal ω is usually slower than the convergence rate of SOR with an optimal ω.
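The forward-then-backward structure can be sketched as follows (a minimal illustration; names and parameter defaults are my own, and in practice SSOR is used to build a pre-conditioner rather than run as a stand-alone solver):

```python
import numpy as np

def ssor(A, b, omega=1.25, tol=1e-10, max_iter=500):
    """One SSOR iteration = a forward SOR sweep followed by a backward
    SOR sweep; intended here only to illustrate the sweep structure."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    n = len(b)
    x = np.zeros_like(b)

    def sweep(order):
        for i in order:
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
            x[i] += omega * ((b[i] - s) / A[i, i] - x[i])

    for _ in range(max_iter):
        x_old = x.copy()
        sweep(range(n))              # forward sweep, as in plain SOR
        sweep(range(n - 1, -1, -1))  # backward sweep: unknowns in reverse order
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x
    return x
```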

