English | Size: 873.29 MB
Genre: eLearning
What you’ll learn
How to implement state-of-the-art text generation AI models
Background information about GPT-Neo, a state-of-the-art text generation NLP model
How to use Happy Transformer — a Python library for implementing NLP Transformer models (see the sketch after this list)
How to train/implement GPT-2
How to implement different text generation algorithms
How to fetch data using Hugging Face’s Datasets library
How to train GPT-Neo using Happy Transformer
How to create a web app with 100% Python using Anvil
How to host a Transformer model on Paperspace
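To give a concrete sense of the Happy Transformer workflow, here is a minimal generation sketch. It assumes the happytransformer package is installed; the checkpoint name and the sampling settings shown are illustrative choices, not the course's exact values.

    # pip install happytransformer
    from happytransformer import HappyGeneration, GENSettings

    # Load a small GPT-Neo checkpoint released by EleutherAI (125M parameters).
    happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")

    # Generation settings: toggling do_sample, num_beams, top_k, and top_p
    # switches between greedy, beam-search, and sampling-based algorithms.
    args = GENSettings(do_sample=True, top_k=50, temperature=0.7, max_length=60)

    result = happy_gen.generate_text("Artificial intelligence will", args=args)
    print(result.text)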
GPT-3 is a state-of-the-art text generation natural language processing (NLP) model created by OpenAI. You can use it to generate text that reads as though a human wrote it.
This course will cover how to create a web app that uses an open-source version of GPT-3 called GPT-Neo with 100% Python. That’s right, no HTML, JavaScript, CSS, or any other programming language is required. Just 100% Python! (A sketch of the client-side code is shown below.)
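To make that concrete, here is a minimal sketch of an Anvil client-side form, assuming a form named Form1 with a text box, a label, and a button wired up in Anvil's drag-and-drop designer; the component names and the 'generate' server function are placeholders that pair with the backend sketch under Technologies below.

    # Client-side Anvil form code (written and run inside the Anvil editor).
    from ._anvil_designer import Form1Template
    from anvil import *
    import anvil.server

    class Form1(Form1Template):
        def __init__(self, **properties):
            self.init_components(**properties)

        def generate_button_click(self, **event_args):
            # Component names and the 'generate' server function are illustrative.
            prompt = self.prompt_box.text
            self.output_label.text = anvil.server.call("generate", prompt)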
You will learn how to:
Implement GPT-Neo (and GPT-2) with Happy Transformer
Train GPT-Neo to generate unique text for a specific domain (see the training sketch after this list)
Create a web app using 100% Python with Anvil!
Host your language model using Google Colab and Paperspace
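Here is a rough sketch of what fetching data and fine-tuning look like together, assuming the datasets and happytransformer packages; the dataset name ("ag_news"), the 1% slice, and the training hyperparameters are illustrative placeholders rather than the course's exact choices.

    # pip install happytransformer datasets
    from datasets import load_dataset
    from happytransformer import HappyGeneration, GENTrainArgs

    # Fetch a public dataset with Hugging Face's Datasets library.
    dataset = load_dataset("ag_news", split="train[:1%]")

    # Happy Transformer trains from a plain text file, so write one passage per line.
    with open("train.txt", "w", encoding="utf-8") as f:
        for row in dataset:
            f.write(row["text"].replace("\n", " ") + "\n")

    happy_gen = HappyGeneration("GPT-NEO", "EleutherAI/gpt-neo-125M")
    train_args = GENTrainArgs(num_train_epochs=1, learning_rate=5e-5)
    happy_gen.train("train.txt", args=train_args)

    # Save the fine-tuned weights so a Colab/Paperspace backend can load them later.
    happy_gen.save("model/")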
Installations:
NONE!!! All of the tools we use in this tutorial are web-based. They include Google Colab, Anvil, and Paperspace. So regardless of whether you’re on Mac, Windows, or Linux, you will not have to worry about downloading any software.
Technologies:
Model: GPT-Neo — an open-source version of GPT-3 created by EleutherAI
Framework: Happy Transformer — an open-source Python package that allows us to implement and train GPT-Neo with just a few lines of code
Web technologies: Anvil — a website that allows us to develop web apps using Python
Backend technologies: We’ll cover how to use both Google Colab and Paperspace to host the model (a minimal hosting sketch follows below). Anvil automatically handles hosting the web app itself.
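Here is a minimal backend sketch using Anvil's Uplink, which is how a Colab notebook or a Paperspace machine can serve the model to the web app; the model path, the Uplink key, the generation settings, and the function name 'generate' (matching the client-side sketch above) are placeholders for illustration.

    # pip install anvil-uplink happytransformer   (run in Google Colab or on Paperspace)
    import anvil.server
    from happytransformer import HappyGeneration, GENSettings

    # Load the fine-tuned GPT-Neo model saved earlier (path is illustrative).
    happy_gen = HappyGeneration("GPT-NEO", "model/")

    # Connect this notebook/VM to the Anvil app; the key comes from the Anvil editor.
    anvil.server.connect("YOUR-UPLINK-KEY")

    @anvil.server.callable
    def generate(prompt):
        # Called from the Anvil client; returns generated text to the web app.
        args = GENSettings(do_sample=True, top_k=50, max_length=60)
        return happy_gen.generate_text(prompt, args=args).text

    # Keep the process alive so the web app can keep calling generate().
    anvil.server.wait_forever()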
About the instructor:
My name is Eric Fillion, and I’m from Canada. I’m on a mission to make state-of-the-art advances in NLP accessible through open-source tools and educational content. In early 2020, I led a team that launched an open-source Python package called Happy Transformer. Happy Transformer allows programmers to implement and train state-of-the-art Transformer models with just a few lines of code. Since its release, it has won awards and has been downloaded over 13k times.
Requirements:
A basic understanding of Python
A Google account — for Google Colab
Who this course is for:
Python developers interested in AI and NLP
nitro.download/view/30E3F6DFFD46536/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part1.rar
nitro.download/view/1462108F3F2DE35/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part2.rar
nitro.download/view/A4DBF76DE3DFDB3/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part3.rar
rapidgator.net/file/c5a4913055212ebfa410ecfc1a9cbfd2/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part1.rar.html
rapidgator.net/file/cc520bda9f170b028ff8158531b69b67/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part2.rar.html
rapidgator.net/file/aac86c3f835939336b62b32f86f493cd/Create-a-Text-Generation-Web-App-with-100-Python-NLP.part3.rar.html
If any links die or you have a problem unrarring, send a request to
forms.gle/e557HbjJ5vatekDV9