In Keras, how to apply softmax function on each row of the weight matrix?
from keras.models import Model
from keras.layers import Input, Dense

a = Input(shape=(3,))
b = Dense(2, use_bias=False)(a)
model = Model(inputs=a, outputs=b)


Suppose that the weights of the Dense layer in the above code are [[2, 3], [3, 1], [-1, 1]]. If we give [[2, 1, 3]] as an input to the model, then the output will be:



[[4. 10.]]   (plain matrix product, no softmax)



But I want to apply the softmax function to each row of the weight matrix of the Dense layer, so that the output will be:



[[1.7763 4.2237]]   (each row of the weights passed through softmax first)



How can I do this?
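For reference, both outputs above can be worked out with plain NumPy from the weights stated in the question (this is just the arithmetic being asked about, not a Keras solution):

```python
import numpy as np

w = np.array([[2., 3.], [3., 1.], [-1., 1.]])  # Dense kernel from the question
x = np.array([[2., 1., 3.]])                   # model input

# Without softmax, Dense(use_bias=False) computes a plain matrix product:
print(x @ w)  # [[ 4. 10.]]

# Desired behaviour: softmax each row of w first, then multiply:
soft_w = np.exp(w) / np.exp(w).sum(axis=-1, keepdims=True)
print(x @ soft_w)  # approximately [[1.7763 4.2237]]
```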

  • You mean you want the softmax to be applied to the weights of the Dense layer and not to its output, right?
    – today
    Nov 13 at 7:29










  • @today Yes, exactly.
    – zxcv
    Nov 13 at 9:59
Tags: python, machine-learning, keras, keras-layer, softmax
edited Nov 14 at 5:04
asked Nov 13 at 3:41 by zxcv
1 Answer
One way to achieve what you are looking for is to define a custom layer by subclassing the Dense layer and overriding its call method:

from keras import backend as K
from keras.layers import Dense

class CustomDense(Dense):
    def __init__(self, units, **kwargs):
        super(CustomDense, self).__init__(units, **kwargs)

    def call(self, inputs):
        # Apply softmax to each row of the kernel before the matrix product
        output = K.dot(inputs, K.softmax(self.kernel, axis=-1))
        if self.use_bias:
            output = K.bias_add(output, self.bias, data_format='channels_last')
        if self.activation is not None:
            output = self.activation(output)
        return output


Test to make sure it works:

from keras.models import Sequential
import numpy as np

model = Sequential()
model.add(CustomDense(2, use_bias=False, input_shape=(3,)))
model.compile(loss='mse', optimizer='adam')

w = np.array([[2, 3], [3, 1], [1, -1]])
inp = np.array([[2, 1, 3]])

model.layers[0].set_weights([w])
print(model.predict(inp))
# [[4.0610714 1.9389288]]


Verify it using numpy:

soft_w = np.exp(w) / np.sum(np.exp(w), axis=-1, keepdims=True)
print(np.dot(inp, soft_w))
# [[4.06107115 1.93892885]]
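As a side observation (not part of the original answer): each softmaxed row of the kernel sums to 1, so the entries of the output always sum to the sum of the inputs. That gives a quick sanity check for any weights:

```python
import numpy as np

w = np.array([[2., 3.], [3., 1.], [1., -1.]])  # weights from the test above
inp = np.array([[2., 1., 3.]])

soft_w = np.exp(w) / np.exp(w).sum(axis=-1, keepdims=True)
out = inp @ soft_w

# Each row of soft_w sums to 1, so out.sum() equals inp.sum() (= 6 here).
print(round(out.sum(), 6))  # 6.0
```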
answered Nov 13 at 10:58 by today