Deep Learning
Reference
- https://www.cse.iitm.ac.in/~miteshk/CS7015/Slides/Handout/
- https://www.cse.iitm.ac.in/~miteshk/CS6910.html
Content
- Multilayer Perceptrons (MLPs)
- Multilayer Network of Sigmoid Neurons
- Feedforward Neural Networks and Backpropagation
- GD and Variants
- PCA
- Autoencoders
- Regularization
    - \(l_{2}\) Regularization
    - Dataset Augmentation
    - Parameter Sharing and Tying
    - Adding Noise to Inputs
    - Adding Noise to Outputs
    - Early Stopping
    - Ensemble Methods
    - Dropout
- Neural Network Training Tips
    - Unsupervised Pre-training
    - Changing Activation Functions
    - Changing Initialization Strategies
    - Batch Normalization
    - Batch Normalization in Detail
- NLP Basics
    - Normalizing Textual Data in NLP
        - Regular Expressions (RE)
        - Tokenization
        - Lemmatization
        - Stemming
        - Stopword Removal
        - Parts of Speech (POS) Tagging
    - Text Representation in NLP
        - One-Hot Encoding
        - Bag of Words (BOW)
        - N-Grams
    - Text Embedding Techniques in NLP
        - Word Embedding
            - Word2Vec (SkipGram, Continuous Bag of Words - CBOW)
            - GloVe (Global Vectors for Word Representation)
            - fastText
        - Pre-Trained Embedding
            - ELMo (Embeddings from Language Models)
            - BERT (Bidirectional Encoder Representations from Transformers)
        - Document Embedding - Doc2Vec
- CNN
- RNN
- LSTM and GRU
- On Activation Functions
- Encoder Decoder Models
- Transformers
- Paper Reading
Exercise
- FNN from scratch
- CNN from scratch
    - https://github.com/vzhou842/cnn-from-scratch
- RNN from scratch
- LSTM from scratch
    - https://github.com/CallMeTwitch/Neural-Network-Zoo/blob/main/LongShortTermMemoryNetwork.py
- Transformer from scratch
- CNN for NLP Tasks
- RNN for NLP Tasks
- Transformers for NLP Tasks