Commit

Add files via upload
some writing style fix
Jiaming Song authored Mar 21, 2019
1 parent fb2f9ec commit d92cd27
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions keras/4.4-overfitting-and-underfitting.ipynb
@@ -409,7 +409,7 @@
"the \"L2 norm\" of the weights). L2 regularization is also called _weight decay_ in the context of neural networks. Don't let the different \n",
"name confuse you: weight decay is mathematically the exact same as L2 regularization.\n",
"\n",
"In Analytics-zoo Keras API, weight regularization is added by passing _weight regularizer instances_ to layers as keyword arguments. Let's add L2 weight \n",
"In Keras API of Analytics Zoo, weight regularization is added by passing _weight regularizer instances_ to layers as keyword arguments. Let's add L2 weight \n",
"regularization to our movie review classification network:"
]
},
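
A minimal sketch of the pattern the hunk above describes: an L2 weight-regularizer instance passed to a layer as a keyword argument. It is written against standard Keras (`keras.regularizers.l2` with the `kernel_regularizer` argument); the Analytics Zoo Keras API follows the same pass-a-regularizer-instance pattern, but its exact import paths and keyword names are not shown in this diff, and the 16-unit movie-review architecture is assumed from the surrounding notebook.

from keras import models, layers, regularizers

# Movie-review classifier with an L2 penalty on each Dense layer's weights.
# l2(0.001) adds 0.001 * weight ** 2 to the total loss for every weight coefficient.
l2_model = models.Sequential()
l2_model.add(layers.Dense(16, kernel_regularizer=regularizers.l2(0.001),
                          activation='relu', input_shape=(10000,)))
l2_model.add(layers.Dense(16, kernel_regularizer=regularizers.l2(0.001),
                          activation='relu'))
l2_model.add(layers.Dense(1, activation='sigmoid'))

l2_model.compile(optimizer='rmsprop',
                 loss='binary_crossentropy',
                 metrics=['accuracy'])

Because the penalty is only added at training time, the training loss of such a model will be noticeably higher than its loss at test time.
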
@@ -518,7 +518,7 @@
"As you can see, the model with L2 regularization (dots) has become much more resistant to overfitting than the reference model (crosses), \n",
"even though both models have the same number of parameters.\n",
"\n",
"As alternatives to L2 regularization, you could use one of the following Analytics-zoo Keras API weight regularizers: "
"As alternatives to L2 regularization, you could use one of the following Keras API of Analytics Zoo weight regularizers: "
]
},
{
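
The alternatives referenced in the hunk above, sketched with standard Keras regularizers (these calls exist in `keras.regularizers`; the equivalent names in the Analytics Zoo Keras API are not shown in this diff):

from keras import regularizers

# L1 regularization: penalty proportional to the absolute value of each weight.
regularizers.l1(0.001)

# Simultaneous L1 and L2 regularization.
regularizers.l1_l2(l1=0.001, l2=0.001)
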
@@ -568,7 +568,7 @@
"The core idea is that introducing noise in the output values of a layer can break up happenstance patterns that are not significant (what \n",
"Hinton refers to as \"conspiracies\"), which the network would start memorizing if no noise was present. \n",
"\n",
"In Analytics-zoo Keras API you can introduce dropout in a network via the `Dropout` layer, which gets applied to the output of layer right before it, e.g.:"
"In Keras API of Analytics Zoo you can introduce dropout in a network via the `Dropout` layer, which gets applied to the output of layer right before it, e.g.:"
]
},
{
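A minimal sketch of the dropout pattern the last hunk describes, again in standard Keras: a `Dropout` layer is inserted after a `Dense` layer and randomly zeroes a fraction of that layer's output during training. The 0.5 rate and the 16-unit movie-review architecture are assumptions taken from the surrounding notebook, not from this diff.

from keras import models, layers

dpt_model = models.Sequential()
dpt_model.add(layers.Dense(16, activation='relu', input_shape=(10000,)))
dpt_model.add(layers.Dropout(0.5))   # zeroes 50% of the previous layer's outputs during training
dpt_model.add(layers.Dense(16, activation='relu'))
dpt_model.add(layers.Dropout(0.5))
dpt_model.add(layers.Dense(1, activation='sigmoid'))

dpt_model.compile(optimizer='rmsprop',
                  loss='binary_crossentropy',
                  metrics=['accuracy'])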
