The Convergence Analysis of Parallel Alternating Two-stage Iterative Algorithm for Linear Complementarity Problem
Abstract
In this paper, the authors present a parallel alternating two-stage iterative algorithm, together with some new relaxation variants, for solving the linear complementarity problem. When the coefficient matrix is a monotone matrix or an H-matrix, they establish the global convergence of the algorithm. The algorithm has lower computational complexity and faster convergence, and is especially well suited to parallel computation on large-scale problems.
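The abstract does not spell out the iteration itself, so the following is only an illustrative sketch of a generic two-stage method for the linear complementarity problem LCP(M, q): find z ≥ 0 with Mz + q ≥ 0 and zᵀ(Mz + q) = 0. The outer stage uses a splitting M = F − G; each outer step poses LCP(F, q − Gz^k), which the inner stage solves inexactly with a few projected Jacobi sweeps. The specific splitting (F taken as the lower triangle of M), the number of inner sweeps, and all function names are assumptions made for illustration, not the authors' algorithm.

```python
import numpy as np

def projected_jacobi_sweeps(F, rhs, z0, sweeps):
    """Inner stage: a few projected Jacobi sweeps on LCP(F, rhs).

    Each sweep sets z_i = max(0, -(rhs_i + sum_{j != i} F_ij z_j) / F_ii),
    using the values from the previous sweep (Jacobi, not Gauss-Seidel).
    """
    d = np.diag(F)
    z = z0.copy()
    for _ in range(sweeps):
        r = rhs + F @ z - d * z          # rhs_i + sum_{j != i} F_ij z_j
        z = np.maximum(0.0, -r / d)      # projection onto z >= 0
    return z

def two_stage_lcp(M, q, inner_sweeps=3, tol=1e-10, max_outer=500):
    """Illustrative two-stage iteration for LCP(M, q).

    Outer splitting M = F - G with F = tril(M) (an arbitrary choice for
    this sketch); each outer step solves LCP(F, q - G z^k) inexactly
    via the inner projected sweeps above.
    """
    n = len(q)
    z = np.zeros(n)
    F = np.tril(M)       # outer splitting: M = F - G
    G = F - M
    for _ in range(max_outer):
        rhs = q - G @ z  # outer subproblem: LCP(F, q - G z^k)
        z_new = projected_jacobi_sweeps(F, rhs, z, inner_sweeps)
        if np.linalg.norm(z_new - z, np.inf) < tol:
            return z_new
        z = z_new
    return z
```

At a fixed point, Fz* + (q − Gz*) = Mz* + q, so solving the outer subproblems reproduces the original LCP; the parallel, alternating, and relaxation refinements studied in the paper would modify how the splittings are chosen and distributed across processors.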
Keywords
linear complementarity problem; alternating two-stage method; parallel computation; two-stage iteration; convergence
DOI
10.12783/dtetr/mcemic2016/9529