Strange loss curves when BatchNormalization is used in Keras
Part of the code:

from keras.applications.mobilenet_v2 import MobileNetV2
from keras.layers import Flatten, Dense, BatchNormalization, Activation
from keras.models import Model

# IMG_SIZE and CHANNELS are defined elsewhere in my script
mobilenetv2 = MobileNetV2(input_shape=(IMG_SIZE, IMG_SIZE, CHANNELS),
                          alpha=1.0,
                          depth_multiplier=1,
                          include_top=False,  # classes is ignored when include_top=False
                          weights='imagenet',
                          input_tensor=None,
                          pooling=None,
                          classes=12)

# Freeze the pretrained base
for layer in mobilenetv2.layers:
    layer.trainable = False

# New head: Dense (no bias) -> BatchNorm -> ReLU, twice, then softmax
last = mobilenetv2.layers[-1].output
x = Flatten()(last)
x = Dense(120, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)
x = Dense(84, use_bias=False)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)
preds = Dense(12, activation='softmax')(x)
model = Model(inputs=mobilenetv2.input, outputs=preds)
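The model is then compiled and trained roughly along these lines (a minimal sketch; the optimizer, epochs, batch size, and the train/validation arrays are placeholders, not my exact settings):

# Sketch only: X_train, y_train, X_val, y_val stand in for my actual data.
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

history = model.fit(X_train, y_train,
                    validation_data=(X_val, y_val),
                    epochs=30,
                    batch_size=32)

# The curves below are plotted from history.history['loss'] and
# history.history['val_loss'].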
But the loss curves: [plot of training/validation loss not reproduced here]
Are the above curves normal? I did not use dropout layers because I was asked to compare dropout with BatchNormalization, but the curves look strange. Is my code right, or is something missing?
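For context, the dropout variant I am comparing against replaces each BatchNormalization with a Dropout layer, along these lines (a sketch only; the 0.5 rate is an assumed placeholder, not a tuned value):

from keras.layers import Dropout

# Hypothetical dropout head used for the comparison
x = Flatten()(last)
x = Dense(120, activation='relu')(x)
x = Dropout(0.5)(x)
x = Dense(84, activation='relu')(x)
x = Dropout(0.5)(x)
preds_dropout = Dense(12, activation='softmax')(x)
model_dropout = Model(inputs=mobilenetv2.input, outputs=preds_dropout)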
Thanks
keras keras-layer batch-normalization
asked Nov 17 '18 at 14:35 by BAE
Maybe this answer is relevant. – today, Nov 17 '18 at 15:06