From d92cd27cf4afd1239156aefe675ede80de2939f9 Mon Sep 17 00:00:00 2001
From: Jiaming Song
Date: Thu, 21 Mar 2019 16:36:09 +0800
Subject: [PATCH] Add files via upload

Some writing style fixes.
---
 keras/4.4-overfitting-and-underfitting.ipynb | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/keras/4.4-overfitting-and-underfitting.ipynb b/keras/4.4-overfitting-and-underfitting.ipynb
index d28bebd..c2a801b 100644
--- a/keras/4.4-overfitting-and-underfitting.ipynb
+++ b/keras/4.4-overfitting-and-underfitting.ipynb
@@ -409,7 +409,7 @@
     "the \"L2 norm\" of the weights). L2 regularization is also called _weight decay_ in the context of neural networks. Don't let the different \n",
     "name confuse you: weight decay is mathematically the exact same as L2 regularization.\n",
     "\n",
-    "In Analytics-zoo Keras API, weight regularization is added by passing _weight regularizer instances_ to layers as keyword arguments. Let's add L2 weight \n",
+    "In the Keras API of Analytics Zoo, weight regularization is added by passing _weight regularizer instances_ to layers as keyword arguments. Let's add L2 weight \n",
     "regularization to our movie review classification network:"
    ]
   },
   {
@@ -518,7 +518,7 @@
     "As you can see, the model with L2 regularization (dots) has become much more resistant to overfitting than the reference model (crosses), \n",
     "even though both models have the same number of parameters.\n",
     "\n",
-    "As alternatives to L2 regularization, you could use one of the following Analytics-zoo Keras API weight regularizers: "
+    "As alternatives to L2 regularization, you could use one of the following weight regularizers from the Keras API of Analytics Zoo: "
    ]
   },
   {
@@ -568,7 +568,7 @@
     "The core idea is that introducing noise in the output values of a layer can break up happenstance patterns that are not significant (what \n",
     "Hinton refers to as \"conspiracies\"), which the network would start memorizing if no noise was present. \n",
     "\n",
-    "In Analytics-zoo Keras API you can introduce dropout in a network via the `Dropout` layer, which gets applied to the output of layer right before it, e.g.:"
+    "In the Keras API of Analytics Zoo, you can introduce dropout in a network via the `Dropout` layer, which is applied to the output of the layer right before it, e.g.:"
    ]
   },
   {
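
For reference while reviewing the first hunk, here is a minimal sketch of the usage the reworded sentence describes: attaching an L2 penalty by passing a regularizer instance to a layer as a keyword argument. The `W_regularizer` keyword and the `L2Regularizer` class are assumptions based on the BigDL backend that Analytics Zoo's Keras-style API uses; verify the names against your installed version.

```python
# A minimal sketch, assuming Analytics Zoo's Keras-style API backed by BigDL.
# `W_regularizer` and `L2Regularizer` are BigDL-style names; check them
# against the Analytics Zoo version you are running.
from zoo.pipeline.api.keras.models import Sequential
from zoo.pipeline.api.keras.layers import Dense
from bigdl.optim.optimizer import L2Regularizer

l2_model = Sequential()
# L2 penalty of 0.001 on each hidden layer's weight matrix
l2_model.add(Dense(16, W_regularizer=L2Regularizer(0.001),
                   activation="relu", input_shape=(10000,)))
l2_model.add(Dense(16, W_regularizer=L2Regularizer(0.001),
                   activation="relu"))
l2_model.add(Dense(1, activation="sigmoid"))
```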
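
The second hunk mentions alternative weight regularizers. A hedged illustration, assuming BigDL-style `L1Regularizer` and `L1L2Regularizer` classes exist alongside `L2Regularizer`; confirm the exact names and signatures in your installation before relying on them.

```python
# Hypothetical illustration of the alternative regularizers the notebook lists,
# assuming BigDL-style classes; names and signatures may differ by version.
from bigdl.optim.optimizer import L1Regularizer, L1L2Regularizer

L1Regularizer(0.001)           # L1 penalty on the weights
L1L2Regularizer(0.001, 0.001)  # combined L1 and L2 penalties
```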
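
The third hunk describes the `Dropout` layer. A minimal sketch of dropout in a stack of `Dense` layers, following the Keras-style convention the notebook uses; the import paths are assumed from Analytics Zoo's `zoo.pipeline.api.keras` namespace.

```python
# A minimal sketch of dropout applied to the output of the preceding layer,
# assuming Analytics Zoo's Keras-style `Dropout` layer.
from zoo.pipeline.api.keras.models import Sequential
from zoo.pipeline.api.keras.layers import Dense, Dropout

dpt_model = Sequential()
dpt_model.add(Dense(16, activation="relu", input_shape=(10000,)))
dpt_model.add(Dropout(0.5))  # randomly zero 50% of the previous layer's outputs during training
dpt_model.add(Dense(16, activation="relu"))
dpt_model.add(Dropout(0.5))
dpt_model.add(Dense(1, activation="sigmoid"))
```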