Lesson 0:
- An incredibly flexible function (neural networks), all-purpose parameter fitting (backpropagation), and fast, scalable computation (GPUs) together made deep learning possible.
- Prefixing or suffixing a function with ?? in a Jupyter notebook shows its documentation and source; a single ? shows just the docstring (see the sketch after this list).
- Shift + Enter runs the current cell in a Jupyter notebook.
- Try things with your own data
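A quick demo of the notebook help shortcut noted above; numpy is just an example here, any importable function works:

```python
# Run inside a Jupyter/IPython cell
import numpy as np

np.argmax?    # a single '?' shows the docstring
np.argmax??   # '??' (prefix also works) shows the docstring plus the source
```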
Lesson 1:
- Deep learning needs three things: an infinitely flexible function, all-purpose parameter fitting and tuning, and computation that is fast and scalable.
- The neural network is that infinitely flexible function: by the universal approximation theorem it can approximate essentially any function.
- Gradient descent and backpropagation are the parameter-tuning part (a minimal gradient-descent example is sketched after this list).
- NVIDIA GPUs support CUDA, which runs deep learning computations efficiently.
- AWS setup: the p2 instance type provides a GPU for training; t2.micro is a free-tier instance for getting started.
- AMI (Amazon Machine Image): a snapshot of an instance at a given point in time, used to launch preconfigured machines.
- Literate programming: notebooks interleave prose, runnable code, and results.
- Cmd + Shift + P opens the command palette in a Jupyter notebook, a searchable list of commands and their keyboard shortcuts.
- tmux: a terminal multiplexer; it keeps remote sessions (e.g., a running Jupyter server) alive after you disconnect.
- Train, validation, and test datasets; also a small sample dataset for quick iteration (a split sketch follows this list).
- Jupyter magic functions, e.g. %matplotlib inline (examples after this list).
- Pretrained model: a model whose parameters have already been learned on a large dataset such as ImageNet.
- VGG16: a 16-layer convolutional network pretrained on ImageNet, used as the base model in this lesson (loading sketch after this list).
- keras.json and .theanorc are the configuration files for Keras and Theano (backend, image ordering, CPU vs. GPU device); see the sketch after this list.
- Batches and mini-batches: data is processed in small groups rather than one example or the whole dataset at a time (mini-batch sketch after this list).
- The GPU has limited memory, and transferring data to it is a costly operation, which is why data is fed in mini-batches.
- Finetuning replaces the last layer of a pretrained network with a new layer for the current classes and retrains it (sketch after this list).
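A minimal gradient-descent sketch for the tuning note above: fitting the slope and intercept of a line by stepping against the gradient of the mean squared error. All names and values here are illustrative.

```python
import numpy as np

# Toy data: y = 3x + 2 plus noise
x = np.random.rand(100)
y = 3 * x + 2 + 0.1 * np.random.randn(100)

a, b = 0.0, 0.0   # parameters to fit
lr = 0.1          # learning rate

for _ in range(1000):
    err = (a * x + b) - y
    a -= lr * 2 * (err * x).mean()   # d(MSE)/da
    b -= lr * 2 * err.mean()         # d(MSE)/db

print(a, b)   # should end up close to 3 and 2
```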
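One way to build the train/valid split mentioned above, as a sketch only: the data/dogscats path and the train/&lt;class&gt;/*.jpg layout follow the course's convention but are assumptions here.

```python
import os, random, shutil
from glob import glob

data_dir = 'data/dogscats'   # assumed layout: data/dogscats/train/<class>/*.jpg
files = glob(os.path.join(data_dir, 'train', '*', '*.jpg'))
random.shuffle(files)

# Move ~10% of the training images into a validation set, keeping class subdirs
for path in files[:len(files) // 10]:
    dest = path.replace(os.sep + 'train' + os.sep, os.sep + 'valid' + os.sep)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.move(path, dest)
```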
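A few standard IPython magic functions, for the magic-function note above:

```python
# Run inside a Jupyter cell

# Render matplotlib plots inline in the notebook:
%matplotlib inline

# Time a single statement:
%time sum(range(10**6))

# List the variables defined in the current session:
%who
```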
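Loading the pretrained VGG16 mentioned above. This is a sketch using the keras.applications API rather than the course's Vgg16 wrapper class, and 'cat.jpg' is a placeholder filename:

```python
import numpy as np
from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
from keras.preprocessing import image

model = VGG16(weights='imagenet')   # downloads the learned ImageNet parameters

img = image.load_img('cat.jpg', target_size=(224, 224))   # placeholder image
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
print(decode_predictions(model.predict(x), top=3))   # top-3 ImageNet classes
```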
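The configuration files noted above, shown as a sketch that prints the current Keras config; the example values in the comment are what the course's Theano setup expects and may differ by Keras version:

```python
import json, os

path = os.path.expanduser('~/.keras/keras.json')
with open(path) as f:
    print(json.load(f))

# For the course's Theano setup this file typically looks like:
#   {"backend": "theano", "image_dim_ordering": "th",
#    "floatx": "float32", "epsilon": 1e-07}
# .theanorc (in the home directory) selects the device, e.g. device = gpu
```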
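A sketch of mini-batch iteration for the batch notes above: the dataset is sliced into chunks, and each chunk is what gets transferred to the GPU and processed in one step.

```python
import numpy as np

X = np.random.rand(1000, 784)   # toy dataset: 1000 flattened examples
batch_size = 64

for start in range(0, len(X), batch_size):
    batch = X[start:start + batch_size]   # one mini-batch
    # ...transfer this batch to the GPU and run one training step...
    print(batch.shape)
```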
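A finetuning sketch matching the last note: drop VGG16's ImageNet classifier head, freeze the convolutional layers, and train a new softmax layer for the current classes. The 2-class head is an assumption mirroring the dogs-vs-cats task.

```python
from keras.applications.vgg16 import VGG16
from keras.layers import Dense, Flatten
from keras.models import Model

base = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
for layer in base.layers:
    layer.trainable = False               # keep the pretrained features fixed

x = Flatten()(base.output)
out = Dense(2, activation='softmax')(x)   # new head: 2 classes (e.g., dogs vs. cats)

model = Model(inputs=base.input, outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(...) on the new dataset now trains only the new head
```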