OpenAI has the entire AI community debating its decision not to release the fully trained version of its powerful new text-generation model, GPT-2. I'm going to explain how GPT-2 works using code, math, and animations. We'll discuss its potential applications (both good and bad) and ways of preventing misuse, and at the end of the video I'll give my take on whether OpenAI was justified in withholding the model. The transformer architecture is quickly replacing recurrent networks for sequence learning, and OpenAI's GPT-2 is the latest example of using it at scale. Enjoy!

Code for this video:
https://github.com/openai/gpt-2

Please subscribe! And like. And comment. That's what keeps me going.

Want more education? Connect with me here:
Twitter: https://twitter.com/sirajraval
Instagram: https://www.instagram.com/sirajraval
Facebook: https://www.facebook.com/sirajology

More learning resources:
https://medium.com/@asierarranz/i-have-created-a-website-to-query-the-gpt-2-openai-model-11dd30e1c8b0
https://blog.openai.com/better-language-models/
http://jalammar.github.io/illustrated-transformer/
https://mchromiak.github.io/articles/2017/Sep/12/Transformer-Attention-is-all-you-need/#.XHVUts9KiLI

Web demo of GPT-2: https://www.askskynet.com/
Gradient descent: https://www.youtube.com/watch?v=XdM6ER7zTLk&t=774s
Fakebox: https://machinebox.io/docs/fakebox
Privacy tools: https://github.com/OpenMined/PySyft/tree/master/examples/tutorials

Join us at the School of AI: https://theschool.ai/
Join us in the Wizards Slack channel: http://wizards.herokuapp.com/
Please support me on Patreon: https://www.patreon.com/user?u=3191693
Sign up for my newsletter for exciting updates in the field of AI: https://goo.gl/FZzJ5w
Hit the Join button above to become a member of my channel for access to exclusive content!
Join my AI community: http://chatgptschool.io/
Sign up for my AI sports betting bot, WagerGPT! (500 spots available): https://www.wagergpt.xyz
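The core of the transformer idea mentioned above is self-attention: every token's representation is updated as a weighted mix of all tokens in the sequence, with the weights computed from learned query/key/value projections. Here's a minimal NumPy sketch of scaled dot-product self-attention (the matrix names and sizes are illustrative, not GPT-2's actual dimensions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X:            (seq_len, d_model) token embeddings
    Wq, Wk, Wv:   (d_model, d_k) learned projection matrices
    Returns:      (seq_len, d_k) attention outputs
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # mix values by attention weight

# Toy example with random embeddings and weights.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

GPT-2 stacks many of these attention layers (with multiple heads, causal masking, and feed-forward sublayers on top), which is what lets it model long-range dependencies without the sequential bottleneck of a recurrent network.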

Tags: OpenAI, Siraj Raval, text generation, generator, machine learning, NLP, natural language processing, transformer, Google, AI, artificial intelligence, deep learning, research, gradient descent, GPT2, GPT-2, generative modeling, fake news, speech generation, Python, code, math, mathematics, distribution, data