
Learning Deep Learning: From Perceptron to Large Language Models
English | Tutorial | Size: 2.76 GB
13+ Hours of Video Instruction
A complete guide to deep learning for artificial intelligence
Deep learning (DL) is a key component of today’s exciting advances in machine learning and artificial intelligence. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this video course is ideal for developers, data scientists, analysts, and others, including those with no prior machine learning or statistics experience.
After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Magnus Ekman shows how to use them to build advanced architectures, including the Transformer. He describes how these concepts are used to build modern networks for computer vision and natural language processing (NLP), including large language models and multimodal networks. The code repository is located at github.com/NVDLI/LDL.
About the Instructor:
Magnus Ekman, Ph.D., is a director of architecture at NVIDIA Corporation. His doctorate is in computer engineering, and he is the inventor of multiple patents. He was first exposed to artificial neural networks in the late nineties in his native country, Sweden. After some dabbling in evolutionary computation, he ended up focusing on computer architecture and relocated to Silicon Valley. He previously worked on processor design and R&D at Sun Microsystems and Samsung Research America. In his current role at NVIDIA, he leads an engineering team working on CPU performance and power efficiency for chips used to run artificial intelligence (AI) applications.
As the deep learning (DL) and AI fields exploded in the past few years, fueled by NVIDIA’s GPU technology and CUDA, Dr. Ekman found himself in the middle of a company expanding beyond computer graphics into becoming an AI powerhouse. As a part of that journey, he challenged himself to stay up-to-date with the most recent developments in the field. He partnered with the NVIDIA Deep Learning Institute (DLI) and wrote the book Learning Deep Learning, which was published by Pearson in 2021. He is thrilled to now follow up with this video, which is based on the book but also contains additional and updated content about the most recent developments in large language models.
Skill Level:
Beginner to Intermediate
Learn How To:
Apply core concepts of perceptrons, gradient-based learning, sigmoid neurons, and backpropagation (see the sketch after this list)
Use DL frameworks to more easily develop complex and useful neural networks
Utilize convolutional neural networks (CNNs) to perform image classification and analysis
Apply recurrent neural networks (RNNs) and long short-term memory (LSTM) to text and other variable-length sequences
Build a natural language translation application using sequence-to-sequence networks based on the transformer architecture
Use the transformer architecture for other natural language processing (NLP) tasks and engineer prompts for large language models (LLMs)
Combine image and text data to build multimodal networks, including an image captioning application
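
For a concrete preview of the first item above, the sketch below trains a tiny two-layer sigmoid network on the XOR problem with hand-written backpropagation in NumPy. It is not taken from the course or its repository; the layer sizes, learning rate, and epoch count are arbitrary choices made for illustration only.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# XOR training data: four examples, two inputs each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters of a 4-unit hidden layer and a 1-unit output layer.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))
b2 = np.zeros((1, 1))

lr = 1.0  # learning rate (arbitrary for this sketch)

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: gradients of a squared-error loss, using the
    # sigmoid derivative s'(z) = s(z) * (1 - s(z)).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 3))  # outputs should approach [0, 1, 1, 0]

In practice, the DL frameworks mentioned in the second item replace this hand-written backward pass with automatic differentiation.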
DOWNLOAD:
RAPIDGATOR:
rapidgator.net/file/1a218c1dc062875f42b41d6ed16b149c/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part1.rar.html
rapidgator.net/file/3ff2d2ed07f54fecb0ac448be0beaa0e/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part2.rar.html
rapidgator.net/file/df7bc437f263e2bc711f5b8bdade0703/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part3.rar.html
NITROFLARE:
nitroflare.com/view/FFE7EA89AABAF61/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part1.rar
nitroflare.com/view/643643A3B272092/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part2.rar
nitroflare.com/view/3A156F1337AD663/Learning_Deep_Learning_From_Perceptron_to_Large_Language_Models.part3.rar