Python (multiprocessing): How to pass a dictionary as the argument of a worker process initializer function?












I'm using a function to initialize the worker processes of a process pool. The function takes a single argument, a dictionary. When the pool is created and the initializer is called for each worker process, I get an error about the wrong number of arguments:



TypeError: _init_worker() takes 1 positional argument but 2 were given


The process initializer function being used:



def _init_worker(shared_arrays):
    _global_shared_arrays = shared_arrays


The initializer is being called in the normal way for each worker process:



with multiprocessing.Pool(processes=_NUMBER_OF_WORKER_PROCESSES,
                          initializer=_init_worker,
                          initargs=(arrays_dict)) as pool:


I think this has something to do with how the dictionary is passed as the argument: the error always reports the number of items in the dictionary as the number of positional arguments given, as if the dictionary's keys are being passed rather than the dictionary itself. Stepping through in the debugger confirms this. If the dictionary has a single item, only its key reaches the initializer function; if it has several items, the error above is raised, with the number of items reported as the number of positional arguments. So somehow the keys of the dictionary are being passed as individual arguments instead of the dictionary itself.



Can anyone see what I'm doing wrong here? Thanks in advance for your suggestions.







python python-multiprocessing






asked Nov 19 '18 at 14:44
James Adams
• add a comma to make initargs a tuple: initargs=(arrays_dict,)
  – Jacky, Nov 19 '18 at 14:59
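A quick interpreter check (added for illustration, not part of the original thread) showing why the trailing comma matters: parentheses alone are just grouping, and it is the comma that makes a one-element tuple.

arrays_dict = {"a": [1, 2], "b": [3, 4]}

print(type((arrays_dict)))   # <class 'dict'>  - plain parentheses, still just the dict
print(type((arrays_dict,)))  # <class 'tuple'> - the trailing comma builds a 1-tuple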














2 Answers






If you look at the documentation for multiprocessing.Pool, you will see the following:

    If initializer is not None then each worker process will
    call initializer(*initargs) when it starts.

As you can see, the arguments for the initializer function are unpacked by the * operator. So your custom init function should be ready to accept more than one argument in case you pass it a dict with more than one element, or else it will fail.

Something like this: def _init_worker(*shared_arrays)







edited Nov 19 '18 at 15:02
answered Nov 19 '18 at 14:56
eladm26
• Thank you for this clarification. I remedied the issue by surrounding the dictionary argument with brackets; it's now being unpacked as expected. As suggested by @Jacky in the comment above, using a comma after the argument also works, as this turns the argument into a tuple.
  – James Adams, Nov 19 '18 at 15:06
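For illustration, here is a minimal self-contained sketch (the worker function and the array contents are invented, not taken from the original post) of the tuple-based fix: wrapping the dictionary in a one-element tuple so that initializer(*initargs) unpacks back to exactly one dictionary argument.

import multiprocessing

_global_shared_arrays = None

def _init_worker(shared_arrays):
    # Runs once in every worker process. The global statement is needed so the
    # assignment rebinds the module-level name rather than creating a local.
    global _global_shared_arrays
    _global_shared_arrays = shared_arrays

def _sum_array(key):
    # Hypothetical task function: sum the array stored under the given key.
    return key, sum(_global_shared_arrays[key])

if __name__ == "__main__":
    arrays_dict = {"a": [1, 2, 3], "b": [10, 20]}

    # Pool calls initializer(*initargs), so initargs must be a tuple of arguments.
    # (arrays_dict,) unpacks back to the single dict; (arrays_dict) is just the
    # dict itself, whose keys would be spread out as separate arguments.
    with multiprocessing.Pool(processes=2,
                              initializer=_init_worker,
                              initargs=(arrays_dict,)) as pool:
        print(pool.map(_sum_array, sorted(arrays_dict)))
        # [('a', 6), ('b', 30)]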







initargs will be unpacked, so you must pass a tuple, like

initargs=(arrays_dict,)
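Because the initargs tuple maps position-by-position onto the initializer's parameters, the same pattern extends to more than one piece of per-worker state. A hypothetical variation (the lock parameter is an illustration, not something from the original question):

import multiprocessing

_global_shared_arrays = None
_global_lock = None

def _init_worker(shared_arrays, lock):
    # Each element of the initargs tuple arrives as one positional argument.
    global _global_shared_arrays, _global_lock
    _global_shared_arrays = shared_arrays
    _global_lock = lock

if __name__ == "__main__":
    arrays_dict = {"a": [1, 2, 3]}
    lock = multiprocessing.Lock()

    with multiprocessing.Pool(processes=2,
                              initializer=_init_worker,
                              initargs=(arrays_dict, lock)) as pool:
        pass  # submit work with pool.map / pool.apply_async as usual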






answered Nov 19 '18 at 15:00
pxe