Eigen: Is it possible to create a LeastSquareDiagonalPreconditioner-like preconditioner if I can only compute Aty...
I want to solve the least-squares-like system A^t * A * x = -A^t * b (I'm implementing the Gauss-Newton method for a special problem).
I wrote special routines which allow me to compute the A * x and A^t * y products. With such routines it's easy to use matrix-free solvers, thanks to Eigen.
But my approach does not converge as well as Eigen::LeastSquaresConjugateGradient. I made a small test, and it looks like LeastSquareDiagonalPreconditioner speeds up convergence a lot.
My question is: how can I use LeastSquareDiagonalPreconditioner, or implement my own preconditioner, if I can only compute matrix products?
(I'm not very good at understanding preconditioning/conjugate-gradient stuff, by the way.)
EDIT
For clarity: I want to use the matrix-free solvers from Eigen with my product routines.
EDIT 2
The matrix-vector products are obtained by using forward- and reverse-mode autodiff on some objective functions.
c++ linear-algebra eigen least-squares eigen3
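For readers unfamiliar with the setup: solving the Gauss-Newton normal equations with only product routines amounts to running conjugate gradient on the normal equations (often called CGNR). Below is a minimal self-contained sketch, assuming two black-box callbacks `Aprod` (computes A*x) and `Atprod` (computes A^t*y); the function and type names are illustrative, not Eigen API.

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

using Vec = std::vector<double>;
using Prod = std::function<Vec(const Vec&)>;

static double dot(const Vec& a, const Vec& b) {
    double s = 0.0;
    for (size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
    return s;
}

// Solve (A^t A) x = A^t b by CG on the normal equations (CGNR),
// using only the two product callbacks -- A itself is never formed.
Vec cgnr(const Prod& Aprod, const Prod& Atprod, const Vec& b,
         size_t n, int maxIter = 100, double tol = 1e-12) {
    Vec x(n, 0.0);
    Vec r = b;             // residual of A x = b (x starts at 0)
    Vec s = Atprod(r);     // residual of the normal equations
    Vec p = s;
    double snorm2 = dot(s, s);
    for (int it = 0; it < maxIter && snorm2 > tol; ++it) {
        Vec q = Aprod(p);
        double alpha = snorm2 / dot(q, q);
        for (size_t i = 0; i < n; ++i) x[i] += alpha * p[i];
        for (size_t i = 0; i < r.size(); ++i) r[i] -= alpha * q[i];
        s = Atprod(r);
        double snorm2New = dot(s, s);
        double beta = snorm2New / snorm2;
        for (size_t i = 0; i < n; ++i) p[i] = s[i] + beta * p[i];
        snorm2 = snorm2New;
    }
    return x;
}
```

This is essentially what Eigen's LeastSquaresConjugateGradient does internally, minus the preconditioning step that the question is about.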
In my understanding, the preconditioning here simply consists in replacing an equation Ax=b by AtAx=Atb, to get a system with a symmetric positive-definite matrix, which allows iterative methods to be used. You already have such a symmetric matrix. It's difficult to answer without more details about your implementation of the Gauss-Newton method.
– Damien
2 days ago
I don't understand whether you want to use Eigen, only a subset of Eigen functions, or no Eigen functions at all.
– Damien
2 days ago
@Damien thank you for the comment! I edited the post: I want to use the Eigen solvers with my product routines.
– Dark_Daiver
2 days ago
Is the actual question how to compute the diagonal of A^t*A, or, if you have that, how to use it as a preconditioner? The second problem is answered by @ggael; for the first problem you'd need to describe more about A (and maybe it will be more a math question than a programming question).
– chtz
yesterday
I don't see any obvious solution except computing the squared norm of each A*Unit(i) -- maybe you can accelerate computing these, since you know that only a single element is non-zero each time.
– chtz
22 hours ago
edited yesterday
asked 2 days ago
Dark_Daiver
1 Answer
accepted
The easiest might be to implement your own preconditioner class inheriting from DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically, you need to compute:
m_invdiag(j) = 1./mat.col(j).squaredNorm();
for every column j, using a strategy similar to what you already implemented for the product operators.
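To make the factorize step concrete when the matrix is only available through products: the j-th diagonal entry of A^t*A equals the squared norm of A*Unit(j), so, as suggested in the comments, it can be recovered with n applications of the A*x routine. A hedged standalone sketch follows; `computeInvDiag` and `applyJacobi` are illustrative names, not Eigen API.

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

using Vec = std::vector<double>;
using Prod = std::function<Vec(const Vec&)>;

// diag(A^t A)_j = ||A e_j||^2, so the inverse diagonal used by a
// least-squares diagonal (Jacobi) preconditioner can be built from
// n applications of the A*x routine alone.
Vec computeInvDiag(const Prod& Aprod, size_t n) {
    Vec invdiag(n);
    for (size_t j = 0; j < n; ++j) {
        Vec e(n, 0.0);
        e[j] = 1.0;                   // j-th unit vector
        Vec col = Aprod(e);           // j-th column of A
        double d = 0.0;
        for (double v : col) d += v * v;          // squared column norm
        invdiag[j] = (d > 0.0) ? 1.0 / d : 1.0;   // guard empty columns
    }
    return invdiag;
}

// The preconditioner's solve step is just a component-wise scaling,
// matching what DiagonalPreconditioner does with m_invdiag.
Vec applyJacobi(const Vec& invdiag, const Vec& r) {
    Vec z(r.size());
    for (size_t i = 0; i < r.size(); ++i) z[i] = invdiag[i] * r[i];
    return z;
}
```

Note the n extra products are a one-time setup cost; if your autodiff routine is cheap for sparse inputs (only one non-zero per unit vector), this setup can often be accelerated, as chtz points out in the comments.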
answered yesterday
ggael