Unverified commit 05f27429 authored by linbo.jin, committed by GitHub

Fix notebook text

test data -> training data
parent 15b33360
@@ -142,7 +142,7 @@
"\n",
"In other words, our model would *overfit* to the training data. Learning how to deal with overfitting is important. Although it's often possible to achieve high accuracy on the *training set*, what we really want is to develop models that generalize well to *testing data* (or data they haven't seen before).\n",
"\n",
-"The opposite of overfitting is *underfitting*. Underfitting occurs when there is still room for improvement on the test data if you continue to train for more epochs. This means the network has not yet learned all the relevant patterns in the training data. \n",
+"The opposite of overfitting is *underfitting*. Underfitting occurs when there is still room for improvement on the training data if you continue to train for more epochs. This means the network has not yet learned all the relevant patterns in the training data. \n",
"\n",
"If you train for too long though, the model will start to overfit and learn patterns from the training data that don't generalize to the test data. We need to strike a balance. Understanding how to train for an appropriate number of epochs as we'll explore below is a useful skill.\n",
"\n",
......
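The notebook text above recommends training for an appropriate number of epochs and stopping before the model overfits. A common way to pick that point is early stopping: watch the validation loss and halt once it stops improving for a few epochs. The sketch below is illustrative only (the loss values, function name, and `patience` parameter are assumptions, not taken from the notebook), but it captures the same balance the paragraph describes.

```python
# Minimal early-stopping sketch (hypothetical helper, not from the notebook):
# stop training once validation loss fails to improve for `patience`
# consecutive epochs.

def early_stopping_epoch(val_losses, patience=2):
    """Return the 0-indexed epoch at which training would stop, i.e. when
    `patience` epochs have passed without a new best validation loss."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses) - 1  # never triggered: train the full schedule

# Validation loss falls, then rises as the model starts to overfit.
losses = [0.90, 0.60, 0.45, 0.40, 0.42, 0.47, 0.55]
print(early_stopping_epoch(losses))  # stops after epoch 5
```

In a Keras notebook like this one, the same idea is typically expressed with the `tf.keras.callbacks.EarlyStopping` callback rather than a hand-rolled loop.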