With a slight delay of a week, here's the third installment in the text classification series. This one covers text classification using a fine-tuned BERT model. The whole tutorial is a Jupyter Notebook hosted on Google Colab, so you can start doing data science right away.

Notebooks: https://github.com/adam0ling/twitter_sentiment

Social links:
* github: https://github.com/adam0ling
* discord: https://discord.gg/sy4yS4m
* twitter: https://twitter.com/adam0ling
* instagram: https://www.instagram.com/adam0ling/

Timestamps:
Intro: 00:00
Showcase: 00:25
Google Colab: 00:58
Notebook Overview: 01:43
Imports: 02:00
Data: 03:06
Data Preparation: 06:33
Modelling: 17:19
First Iteration Results: 21:49
Model Saving: 23:54
Neuron Weights and Loss: 26:00
Second Training Iteration: 29:35
Second Training Results: 33:13
BERT vs. Other Models: 36:45
Modifying BERT: 39:56
Outro: 41:42

---

"Tokyo Music Walker - Way Home" is licensed under Creative Commons (CC BY).
Music promoted by BreakingCopyright: https://bit.ly/tokyo-walker-way-home

Tags: machine learning, deep learning, text classification, neural network, tutorial, natural language processing, logistic regression, google colaboratory, logistic regression example, nlp tutorial, data science, twitter analysis python, south park, artificial intelligence, machine learning tutorial, machine learning python, dropout layer deep learning, text classification tensorflow, BERT, python, jupyter notebook tutorial, tf hub, tensorflow tutorial, gpt 3, sentiment analysis