Eigen: Is it possible to create LeastSquareDiagonalPreconditioner-like conditioner if I can only compute Aty...











I want to solve a least-squares-like system A^t * A * x = -A^t * x (I'm implementing the Gauss-Newton method for a special problem).



I wrote special routines which allow me to compute the A * x and A^t * y products. With such routines it is easy to use matrix-free solvers, thanks to Eigen.



But my approach does not converge as well as Eigen::LeastSquaresConjugateGradient. I made a small test, and it looks like the LeastSquareDiagonalPreconditioner speeds up convergence a lot.



My question is: how can I use LeastSquareDiagonalPreconditioner or implement my own preconditioner if I can only compute matrix products?
I'm not very good at understanding preconditioning/conjugate gradient stuff, by the way.



EDIT



For clarity: I want to use matrix-free solvers from Eigen with my product routines.



EDIT 2



The matrix-vector products were obtained by using forward and reverse mode autodiff on some objective functions.
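(A minimal, hypothetical sketch of the setup described above, not the poster's actual code: two product routines, here stand-ins named applyA and applyAt backed by an explicit demo matrix only to keep the snippet self-contained, compose into the single normal-equations operator a matrix-free conjugate gradient needs.)

    #include <Eigen/Dense>
    #include <iostream>

    // Stand-ins for the autodiff-based product routines; in the real code these
    // would be the forward-mode (A * x) and reverse-mode (A^t * y) products.
    static const Eigen::MatrixXd A_demo =
        (Eigen::MatrixXd(3, 2) << 1, 2, 3, 4, 5, 6).finished();

    Eigen::VectorXd applyA (const Eigen::VectorXd& x) { return A_demo * x; }             // A * x
    Eigen::VectorXd applyAt(const Eigen::VectorXd& y) { return A_demo.transpose() * y; } // A^t * y

    // One application of the Gauss-Newton normal-equations operator A^t * A to x;
    // this composition is all a matrix-free CG iteration requires.
    Eigen::VectorXd applyAtA(const Eigen::VectorXd& x) { return applyAt(applyA(x)); }

    int main()
    {
        Eigen::VectorXd x = Eigen::VectorXd::Ones(2);
        std::cout << applyAtA(x).transpose() << std::endl;   // prints "79 100"
    }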










c++ linear-algebra eigen least-squares eigen3






asked 2 days ago by Dark_Daiver, edited yesterday
  • In my understanding, the preconditioner simply consists in replacing an equation Ax=b by AtAx=Atb, to get an equation with a symmetric positive-definite matrix. This allows one to use iterative methods. You already have such a symmetric matrix. It is difficult to answer without more details about your implementation of the Gauss-Newton method
    – Damien
    2 days ago










  • I don't understand whether you want to use Eigen, only a subset of Eigen functions, or no Eigen functions at all
    – Damien
    2 days ago










  • @Damien thank you for the comment! I edited the post: I want to use Eigen solvers with my product routines
    – Dark_Daiver
    2 days ago






  • Is the actual question how to compute the diagonal of A^t*A or, if you have that, how to use it as a preconditioner? The second problem is answered by @ggael; for the first problem you'd need to describe more about A (and maybe it will be more of a math question than a programming question).
    – chtz
    yesterday






  • I don't see any obvious solution except computing the squared norm of each A*Unit(i) -- maybe you can accelerate computing these, since you know that only a single element is non-zero each time.
    – chtz
    22 hours ago
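(A short sketch of the suggestion in the last comment, reusing the hypothetical applyA routine from the earlier snippet: the j-th diagonal entry of A^t*A equals ||A * e_j||^2, so the whole diagonal can be recovered with one extra product per column.)

    #include <Eigen/Dense>

    Eigen::VectorXd applyA(const Eigen::VectorXd& x);   // hypothetical A * x routine, as above

    // diag(A^t * A)(j) == squared norm of the j-th column of A == ||A * Unit(j)||^2.
    Eigen::VectorXd diagonalOfAtA(Eigen::Index n)        // n = number of columns of A
    {
        Eigen::VectorXd diag(n);
        for (Eigen::Index j = 0; j < n; ++j)
            diag(j) = applyA(Eigen::VectorXd::Unit(n, j)).squaredNorm();
        return diag;
    }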


















1 Answer
Accepted answer (answered yesterday by ggael):
The easiest might be to implement your own preconditioner class inheriting DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically you need to compute:



 m_invdiag(j) = 1./mat.col(j).squaredNorm();


for all columns j, using a strategy similar to what you already implemented for the product operators.
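(A minimal sketch of what this answer suggests, not a verbatim Eigen implementation: it assumes the protected m_invdiag / m_isInitialized members of Eigen::DiagonalPreconditioner, as present in current Eigen releases. Since a matrix-free operator has no mat.col(j), the diagonal is precomputed separately, for example with the unit-vector trick from the comments, and handed to the preconditioner, so factorize() has nothing left to do.)

    #include <Eigen/IterativeLinearSolvers>

    class MyLeastSquaresDiagonalPreconditioner
        : public Eigen::DiagonalPreconditioner<double>
    {
        typedef Eigen::DiagonalPreconditioner<double> Base;

    public:
        MyLeastSquaresDiagonalPreconditioner() : Base() {}

        // Supply diag(A^t * A), computed beforehand with the product routines.
        void setDiagonal(const Eigen::VectorXd& diagAtA)
        {
            m_invdiag.resize(diagAtA.size());
            for (Eigen::Index j = 0; j < diagAtA.size(); ++j)
                m_invdiag(j) = diagAtA(j) > 0.0 ? 1.0 / diagAtA(j) : 1.0;
            m_isInitialized = true;
        }

        // The iterative solver calls these with the (matrix-free) operator type;
        // the inverse diagonal is already stored, so they are no-ops here.
        template<typename MatType>
        MyLeastSquaresDiagonalPreconditioner& analyzePattern(const MatType&) { return *this; }
        template<typename MatType>
        MyLeastSquaresDiagonalPreconditioner& factorize(const MatType&)      { return *this; }
        template<typename MatType>
        MyLeastSquaresDiagonalPreconditioner& compute(const MatType&)        { return *this; }
    };

Such a class can then be passed as the preconditioner template argument of, e.g., Eigen::ConjugateGradient over the matrix-free normal-equations operator, with solver.preconditioner().setDiagonal(...) called before solve(); the inherited solve() applies the stored inverse diagonal.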





