java.lang.Long cannot be cast to java.lang.Double ERROR when using MAX()












Since the Cloud Dataprep update yesterday (19/11/2018), I get an error every time I use the MAX() function, either alone or in a pivot.



Some notes:


  • I used the MAX() function on another dataset and it worked, so MAX() itself works.

  • I didn't have this issue before yesterday's Dataprep update; the flow was working.

  • I tried many times to edit the recipe to isolate the issue, but it always comes back
    to the MAX() function.

  • The columns I'm using MAX() on are of type INT. I tried converting INT -> FLOAT -> INT
    to make sure they are INT before using MAX(), but I keep getting the same issue
    (a sketch of those steps is below).
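
For reference, the conversion steps looked roughly like this in the Wrangle recipe (my_int_column is a placeholder name, and I'm quoting the syntax from memory, so it may not match your Dataprep version exactly):

settype col: my_int_column type: 'Float'
settype col: my_int_column type: 'Integer'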


Here is the log:



java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Double
at com.trifacta.google.dataflow.functions.MaxCombineFn.binaryOperation(MaxCombineFn.java:18)
at com.trifacta.google.dataflow.functions.BinaryOperationCombineFn.addInput(BinaryOperationCombineFn.java:60)
at org.apache.beam.sdk.transforms.CombineFns$ComposedCombineFn.addInput(CombineFns.java:295)
at org.apache.beam.sdk.transforms.CombineFns$ComposedCombineFn.addInput(CombineFns.java:212)
at org.apache.beam.runners.core.GlobalCombineFnRunners$CombineFnRunner.addInput(GlobalCombineFnRunners.java:109)
at com.google.cloud.dataflow.worker.PartialGroupByKeyParDoFns$ValueCombiner.add(PartialGroupByKeyParDoFns.java:163)
at com.google.cloud.dataflow.worker.PartialGroupByKeyParDoFns$ValueCombiner.add(PartialGroupByKeyParDoFns.java:141)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$CombiningGroupingTable$1.add(GroupingTables.java:385)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$GroupingTableBase.put(GroupingTables.java:230)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingTables$GroupingTableBase.put(GroupingTables.java:210)
at com.google.cloud.dataflow.worker.util.common.worker.SimplePartialGroupByKeyParDoFn.processElement(SimplePartialGroupByKeyParDoFn.java:35)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:71)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:128)
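
The trace suggests a combiner written against Double receiving values boxed as Long (which is how an INT column would arrive). Here is a minimal Java sketch of that mechanism, purely hypothetical and not Trifacta's actual code:

// Hypothetical sketch of the failure mechanism, not Trifacta's implementation.
// A max combiner that assumes Double inputs breaks when an INT column value
// arrives boxed as java.lang.Long.
public class MaxCastSketch {
    static Double max(Object a, Object b) {
        // Casting a boxed Long to Double throws
        // java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Double
        return Math.max((Double) a, (Double) b);
    }

    public static void main(String[] args) {
        Object fromIntColumn = 42L;    // INT value, boxed as a Long
        Object fromFloatColumn = 3.5;  // FLOAT value, boxed as a Double
        System.out.println(max(fromFloatColumn, fromIntColumn)); // throws here
    }
}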









google-cloud-dataprep






edited Nov 21 '18 at 3:54 by Iñigo
asked Nov 20 '18 at 9:23 by Fontain













  • I edited my answer so you know that the issue is now fixed.

    – Iñigo
    Nov 21 '18 at 8:14































1 Answer
































I'm with Google Cloud Platform Support.



This is an internal issue that happened after the update on the 19th (as you said). We know about it and are working alongside the Trifacta team, since this is a third-party product developed and managed by them.



There is a Public Issue regarding this; feel free to add any info you feel is needed.



EDIT: The issue is fixed now. Could you try again and tell me if it worked?































edited Nov 21 '18 at 8:13
answered Nov 20 '18 at 15:58 by Iñigo