
Learning Deep Learning: From Perceptron to Large Language Models

Posted By: IrGens

Learning Deep Learning: From Perceptron to Large Language Models
ISBN: 0138177651 | .MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 13h 23m | 2.76 GB
Instructor: Magnus Ekman

The Sneak Peek program provides early access to Pearson video products and is exclusively available to Safari subscribers. Content for titles in this program is made available throughout the development cycle, so products may not be complete, edited, or finalized; video post-production editing may still be in progress.

Introduction

Learning Deep Learning: Introduction

Lesson 1: Deep Learning Introduction

Topics
1.1 Deep Learning and Its History
1.2 Prerequisites

Lesson 2: Neural Network Fundamentals I

Topics
2.1 The Perceptron and Its Learning Algorithm
2.2 Programming Example: Perceptron
2.3 Understanding the Bias Term
2.4 Matrix and Vector Notation for Neural Networks
2.5 Perceptron Limitations
2.6 Solving a Learning Problem with Gradient Descent
2.7 Computing the Gradient with the Chain Rule
2.8 The Backpropagation Algorithm
2.9 Programming Example: Learning the XOR Function
2.10 What Activation Function to Use
2.11 Lesson 2 Summary
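
As an illustration of the kind of material Lesson 2 covers, here is a minimal sketch (not taken from the course) of the classic perceptron learning rule trained on the NAND function. The choice of NAND, the learning rate, and the epoch count are arbitrary placeholder choices for this sketch.

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # all 2-bit inputs
y = np.array([1, 1, 1, 0])                       # NAND targets (linearly separable)
w = np.zeros(2)                                  # weights
b = 0.0                                          # bias term (see topic 2.3)
lr = 0.1                                         # learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b > 0 else 0  # threshold activation
        error = target - pred
        w += lr * error * xi                      # perceptron learning rule
        b += lr * error

print(w, b)  # weights and bias defining a separating line for NAND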

Lesson 3: Neural Network Fundamentals II

Topics
3.1 Datasets and Generalization
3.2 Multiclass Classification
3.3 Programming Example: Digit Classification with Python
3.4 DL Frameworks
3.5 Programming Example: Digit Classification with TensorFlow
3.6 Programming Example: Digit Classification with PyTorch
3.7 Avoiding Saturating Neurons and Vanishing Gradients—Part I
3.8 Avoiding Saturating Neurons and Vanishing Gradients—Part II
3.9 Variations on Gradient Descent
3.10 Programming Example: Improved Digit Classification with TensorFlow
3.11 Programming Example: Improved Digit Classification with PyTorch
3.12 Problem Types, Output Units, and Loss Functions
3.13 Regularization Techniques
3.14 Programming Example: Regression Problem with TensorFlow
3.15 Programming Example: Regression Problem with PyTorch
3.16 Lesson 3 Summary
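
For a flavor of Lesson 3's digit-classification examples, here is a minimal tf.keras sketch of an MNIST classifier. The layer sizes, optimizer, and epoch count are placeholder choices for this sketch, not values taken from the course.

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(25, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),  # one output per digit class
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))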

Lesson 4: Convolutional Neural Networks (CNN) and Image Classification

Topics
4.1 The CIFAR-10 Dataset
4.2 Convolutional Layer
4.3 Building a Convolutional Neural Network
4.4 Programming Example: Image Classification Using CNN with TensorFlow
4.5 Programming Example: Image Classification Using CNN with PyTorch
4.6 AlexNet
4.7 VGGNet
4.8 GoogLeNet
4.9 ResNet
4.10 Programming Example: Using a Pretrained Network with TensorFlow
4.11 Programming Example: Using a Pretrained Network with PyTorch
4.12 Transfer Learning
4.13 Efficient CNNs
4.14 Lesson 4 Summary
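
As a rough sketch of the CIFAR-10 classifiers Lesson 4 builds (not course material), here is a small convolutional network in PyTorch. Channel counts and kernel sizes are arbitrary placeholder choices.

import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3-channel CIFAR-10 input
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

model = SmallCNN()
dummy = torch.randn(1, 3, 32, 32)   # one fake CIFAR-10-sized image
print(model(dummy).shape)           # torch.Size([1, 10])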

Lesson 5: Recurrent Neural Networks (RNN) and Time Series Prediction

Topics
5.1 Problem Types Involving Sequential Data
5.2 Recurrent Neural Networks
5.3 Programming Example: Forecasting Book Sales with TensorFlow
5.4 Programming Example: Forecasting Book Sales with PyTorch
5.5 Backpropagation Through Time and Keeping Gradients Healthy
5.6 Long Short-Term Memory
5.7 Autoregression and Beam Search
5.8 Programming Example: Text Autocompletion with TensorFlow
5.9 Programming Example: Text Autocompletion with PyTorch
5.10 Lesson 5 Summary
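
In the spirit of Lesson 5's time-series forecasting examples, here is a minimal sketch (not from the course) of a single-layer LSTM predicting the next value of a sequence. A synthetic sine wave stands in for the book-sales data, and the hidden size, learning rate, and step count are placeholder choices.

import torch
import torch.nn as nn

seq = torch.sin(torch.linspace(0, 20, 200))   # synthetic series
x = seq[:-1].reshape(1, -1, 1)                # (batch, time, features)
y = seq[1:].reshape(1, -1, 1)                 # next-step targets

class Forecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(x)
        return self.out(h)

model = Forecaster()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
for step in range(200):                        # short training loop
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print(loss.item())                             # training error on the sine wave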

Lesson 6: Neural Language Models and Word Embeddings

Topics
6.1 Language Models
6.2 Word Embeddings
6.3 Programming Example: Language Model and Word Embeddings with TensorFlow
6.4 Programming Example: Language Model and Word Embeddings with PyTorch
6.5 Word2vec
6.6 Programming Example: Using Pretrained GloVe Embeddings
6.7 Handling Out-of-Vocabulary Words with Wordpieces
6.8 Lesson 6 Summary
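
Related to topic 6.6, here is a minimal sketch (not from the course) of loading pretrained GloVe vectors from a text file and comparing two words with cosine similarity. The file name is a placeholder; the glove.6B files are distributed by the Stanford NLP group.

import numpy as np

def load_glove(path):
    embeddings = {}
    with open(path, encoding='utf-8') as f:
        for line in f:
            parts = line.rstrip().split(' ')                        # word v1 v2 ...
            embeddings[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return embeddings

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

vectors = load_glove('glove.6B.100d.txt')   # placeholder path
print(cosine_similarity(vectors['king'], vectors['queen']))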

Lesson 7: Encoder–Decoder Networks, Attention, Transformers, and Neural Machine Translation

Topics
7.1 Encoder–Decoder Network for Neural Machine Translation
7.2 Programming Example: Neural Machine Translation with TensorFlow
7.3 Programming Example: Neural Machine Translation with PyTorch
7.4 Attention
7.5 The Transformer
7.6 Programming Example: Machine Translation Using Transformer with TensorFlow
7.7 Programming Example: Machine Translation Using Transformer with PyTorch
7.8 Lesson 7 Summary
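
As a pointer to the core operation behind Lesson 7's attention and Transformer topics, here is a sketch (not from the course) of scaled dot-product attention in NumPy. The shapes are arbitrary, and a real Transformer adds multiple heads, learned projections, masking, and positional information.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity of queries and keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of the values

Q = np.random.randn(4, 8)   # 4 query positions, dimension 8
K = np.random.randn(6, 8)   # 6 key positions
V = np.random.randn(6, 8)
print(scaled_dot_product_attention(Q, K, V).shape)   # (4, 8)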

Lesson 8: Large Language Models

Topics
8.1 Overview of BERT
8.2 Overview of GPT
8.3 From GPT to GPT-4
8.4 Handling Chat History
8.5 Prompt Tuning
8.6 Retrieving Data and Using Tools
8.7 Open Datasets and Models
8.8 Demo: Large Language Model Prompting
8.9 Lesson 8 Summary
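
In the spirit of Lesson 8's open-models and prompting topics, here is a sketch (not the course's own demo) of prompting a small open model with the Hugging Face transformers pipeline. GPT-2 is used here only because it is small and freely available; the course's demo may use different models and tooling.

from transformers import pipeline

generator = pipeline('text-generation', model='gpt2')
prompt = 'A perceptron is'
result = generator(prompt, max_new_tokens=30, do_sample=True)
print(result[0]['generated_text'])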

Lesson 9: Multi-Modal Networks and Image Captioning

Topics
9.1 Multimodal Learning
9.2 Programming Example: Multimodal Classification with TensorFlow
9.3 Programming Example: Multimodal Classification with PyTorch
9.4 Image Captioning with Attention
9.5 Programming Example: Image Captioning with TensorFlow
9.6 Programming Example: Image Captioning with PyTorch
9.7 Multimodal Large Language Models
9.8 Lesson 9 Summary

Lesson 10: Multi-Task Learning and Computer Vision Beyond Classification

Topics
10.1 Multitask Learning
10.2 Programming Example: Multitask Learning with TensorFlow
10.3 Programming Example: Multitask Learning with PyTorch
10.4 Object Detection with R-CNN
10.5 Improved Object Detection with Fast and Faster R-CNN
10.6 Segmentation with Deconvolution Network and U-Net
10.7 Instance Segmentation with Mask R-CNN
10.8 Lesson 10 Summary

Lesson 11: Applying Deep Learning

Topics
11.1 Ethical AI and Data Ethics
11.2 Process for Tuning a Network
11.3 Further Studies

Summary

Learning Deep Learning: Summary

