Hello all,

@amartinhuertas and I have been working on a Geometric MultiGrid (GMG) solver/preconditioner for Gridap. To solve the coarsest-level linear system in parallel, we decided to use MUMPS through a `PETScLinearSolver` from GridapPETSc.
While profiling the resulting algorithm for a Poisson problem, where we use a preconditioned Conjugate Gradient (CG) solver with our GMG preconditioner, we found that a surprisingly large fraction of the time is spent in the method `gridap_petsc_gc()`, which is called from within `solve!()` at every CG iteration.
Because of this, we think it would be advantageous to introduce a flag/kwarg/setting that deactivates these automatic GC calls scattered throughout the code. The idea is to give an advanced user/developer the ability to call the GC manually, at whatever points are optimal for their application.
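To make the proposal a bit more concrete, here is a minimal sketch of what the opt-out could look like from the user side. The `GridapPETSc.automatic_gc` flag is purely hypothetical (it is exactly what is being proposed); `gridap_petsc_gc()` and `PETScLinearSolver` are the existing names mentioned above, and the `GridapPETSc.with` block is the usual way of driving PETSc from GridapPETSc:

```julia
using GridapPETSc

# Hypothetical switch (not in the current API): if false, solve!() and related
# methods would skip their internal gridap_petsc_gc() calls.
GridapPETSc.automatic_gc[] = false

options = "-ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps"
GridapPETSc.with(args=split(options)) do
  solver = PETScLinearSolver()
  # ... assemble the problem and run the preconditioned CG with the GMG
  # preconditioner, using `solver` for the coarsest level ...

  # With the automatic calls disabled, the user decides when PETSc objects are
  # released, e.g. once per outer solve instead of once per CG iteration.
  GridapPETSc.gridap_petsc_gc()
end
```

A per-solver keyword (e.g. something like `PETScLinearSolver(; automatic_gc=false)`, also hypothetical) would be an alternative if finer granularity than a global setting is preferred.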
Thanks!