Set of notebooks associated with Chapter 4 of the book.

## 🔖 Outline

To be added

## 🗒️ Notebooks

Set of notebooks associated with the chapter.

1.**[One Pipeline Many Classifiers](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/01_OnePipeline_ManyClassifiers.ipynb)**: Here we demonstrate text classification using various algorithms such as Naive Bayes, Logistic Regression, and Support Vector Machines.
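As a rough sketch of this approach (toy data and default hyperparameters, not the notebook's exact code), the same scikit-learn pipeline can be refit with different classifiers:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

# Toy data; the notebook works with a real labelled corpus.
texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]

# One feature-extraction step, several interchangeable classifiers.
for clf in (MultinomialNB(), LogisticRegression(), LinearSVC()):
    pipe = Pipeline([("vect", CountVectorizer()), ("clf", clf)])
    pipe.fit(texts, labels)
    print(type(clf).__name__, pipe.predict(["loved it, great movie"]))
```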
2.**[Doc2Vec for Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/02_Doc2Vec_Example.ipynb)**: Here we demonstrate how to train your own Doc2Vec embedding and use it for text classification.
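A minimal gensim sketch of the idea, assuming the gensim 4.x API and a toy corpus in place of the notebook's dataset:

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from sklearn.linear_model import LogisticRegression

# Toy corpus; each document gets a tag so Doc2Vec can learn a vector per document.
docs = [("great movie", 1), ("terrible plot", 0), ("loved it", 1), ("waste of time", 0)]
tagged = [TaggedDocument(words=text.split(), tags=[i]) for i, (text, _) in enumerate(docs)]

model = Doc2Vec(tagged, vector_size=50, min_count=1, epochs=40)

# Infer a vector per document and train any classifier on top of the embeddings.
X = [model.infer_vector(text.split()) for text, _ in docs]
y = [label for _, label in docs]
clf = LogisticRegression().fit(X, y)
print(clf.predict([model.infer_vector("loved the movie".split())]))
```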
5.**[NNs for Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/05_DeepNN_Example.ipynb)**: Here we demonstrate text classification using pre-trained and custom word embeddings with various Deep Learning Models.
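A compact Keras sketch of the architecture, with placeholder sizes and a randomly initialised embedding layer (the notebook also shows how to plug in pre-trained vectors):

```python
import numpy as np
from tensorflow.keras.layers import Dense, Embedding, LSTM
from tensorflow.keras.models import Sequential

vocab_size, max_len = 10000, 100  # placeholder sizes

model = Sequential([
    # To use pre-trained word vectors, initialise this layer from an
    # embedding matrix (e.g. via a Constant initializer) and optionally freeze it.
    Embedding(input_dim=vocab_size, output_dim=128),
    LSTM(64),
    Dense(1, activation="sigmoid"),  # binary sentiment head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy integer-encoded batch, just to show the expected input shape.
X = np.random.randint(0, vocab_size, size=(32, max_len))
y = np.random.randint(0, 2, size=(32,))
model.fit(X, y, epochs=1, verbose=0)
```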
6.**[BERT: Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/06_BERT_IMDB_Sentiment_Classification.ipynb)**: Here we demonstrate how to fine-tune a pre-trained BERT model (in PyTorch) on IMDB reviews to predict their sentiment using the HuggingFace Transformers library.
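A bare-bones sketch of one fine-tuning step with the Transformers library (a toy batch stands in for IMDB, and this uses the current transformers/PyTorch API rather than the notebook's exact training script):

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Toy batch standing in for IMDB reviews.
texts = ["A wonderful, moving film.", "Two hours of my life I want back."]
labels = torch.tensor([1, 0])

enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**enc, labels=labels)   # returns loss and logits
outputs.loss.backward()                 # one fine-tuning step
optimizer.step()

print(outputs.logits.argmax(dim=-1))    # predicted sentiment per review
```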
7.**[BERT: Text Classification using Ktrain](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/07_BERT_Sentiment_Classification_IMDB_ktrain.ipynb)**: Here we demonstrate how we can use BERT to predict the sentiment of movie reviews using the ktrain library.
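A condensed, illustrative sketch of the ktrain workflow; the variables `train_texts`, `train_labels`, `val_texts`, and `val_labels` are placeholders for already-loaded IMDB data, and the calls shown are from ktrain's `text` module as an assumption, not the notebook's exact code:

```python
import ktrain
from ktrain import text

# Preprocess raw texts for BERT (tokenisation, padding) in one call.
(x_train, y_train), (x_val, y_val), preproc = text.texts_from_array(
    x_train=train_texts, y_train=train_labels,
    x_test=val_texts, y_test=val_labels,
    class_names=["neg", "pos"], preprocess_mode="bert", maxlen=256)

model = text.text_classifier("bert", train_data=(x_train, y_train), preproc=preproc)
learner = ktrain.get_learner(model, train_data=(x_train, y_train),
                             val_data=(x_val, y_val), batch_size=6)
learner.fit_onecycle(2e-5, 1)  # one epoch at a typical BERT fine-tuning learning rate
```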
8.**[LIME-1](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/08_LimeDemo.ipynb)**: Here we demonstrate how to interpret the predictions of a logistic regression model using LIME.
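A small end-to-end sketch with the lime library, using a toy TF-IDF plus logistic regression model in place of the notebook's classifier:

```python
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy model; the notebook trains on a real corpus.
texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]
pipe = make_pipeline(TfidfVectorizer(), LogisticRegression())
pipe.fit(texts, labels)

explainer = LimeTextExplainer(class_names=["negative", "positive"])
# LIME perturbs the input text and asks the model for class probabilities.
exp = explainer.explain_instance("loved the movie, great plot",
                                 pipe.predict_proba, num_features=4)
print(exp.as_list())  # (word, weight) pairs driving the prediction
```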
9.**[LIME-2](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/09_Lime_RNN.ipynb)**: Here we demonstrate how to interpret predictions of an RNN model using LIME.

10.**[SHAP](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/10_ShapDemo.ipynb)**: Here we demonstrate how to interpret ML and DL text classification models using SHAP.
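One illustrative way to apply SHAP to a bag-of-words linear model (toy data; the appropriate explainer depends on the model type, and this is not the notebook's exact code):

```python
import shap
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy bag-of-words model; the notebook covers both classic ML and DL models.
texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]
vect = TfidfVectorizer()
X = vect.fit_transform(texts).toarray()
clf = LogisticRegression().fit(X, labels)

# Attribute the prediction to individual word features.
explainer = shap.LinearExplainer(clf, X)
sample = vect.transform(["loved it, great movie"]).toarray()
shap_values = explainer.shap_values(sample)
print(dict(zip(vect.get_feature_names_out(), shap_values[0])))
```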
11.**[Spam Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/11_SpamClassification.ipynb)**: Here we demonstrate how to classify a text message as SPAM or HAM using pre-trained models from the fastai library.
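A short sketch using the current fastai (v2) text API, which may differ from the fastai version used in the notebook; the toy dataframe stands in for the SMS spam dataset:

```python
import pandas as pd
from fastai.text.all import AWD_LSTM, TextDataLoaders, accuracy, text_classifier_learner

# Toy dataframe standing in for the SMS spam dataset used in the notebook.
df = pd.DataFrame({
    "text": ["WIN a free prize now!!!", "Are we still on for lunch?",
             "URGENT: claim your reward", "See you at 7"],
    "label": ["spam", "ham", "spam", "ham"],
})

# Fine-tune a pre-trained AWD_LSTM language model as a text classifier.
dls = TextDataLoaders.from_df(df, text_col="text", label_col="label", valid_pct=0.5, bs=2)
learn = text_classifier_learner(dls, AWD_LSTM, metrics=accuracy)
learn.fine_tune(1)
print(learn.predict("Claim your free prize today"))
```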