Intro to NLP
03. Structured Languages (1:31)
04. Grammar (0:35)
05. Unstructured Text (1:17)
06. Counting Words
07. Context Is Everything (1:34)
08. NLP and Pipelines (0:48)
09. How NLP Pipelines Work (1:11)
10. Text Processing (1:56)
11. Feature Extraction (2:48)
12. Modeling (0:54)
Text Processing
02. Coding Exercises
03. Introduction to GPU Workspaces
04. Workspaces Best Practices
05. Workspace
06. Capturing Text Data (1:18)
07. Cleaning (6:24)
08. Normalization (2:19)
09. Tokenization (2:11)
10. Stop Word Removal (1:12)
11. Part-of-Speech Tagging (1:23)
12. Named Entity Recognition (0:52)
13. Stemming and Lemmatization (3:37)
14. Summary (0:47)
Spam Classifier with Naive Bayes
01. Intro (0:22)
02. Guess the Person (2:01)
03. Known and Inferred (0:49)
04. Guess the Person Now (5:04)
05. Bayes Theorem (2:04)
06. Quiz: False Positives (1:17)
07. Solution: False Positives (4:38)
08. Bayesian Learning 1 (1:26)
08.2 Bayesian Learning 1
09. Bayesian Learning 2 (1:04)
10. Bayesian Learning 3 (2:10)
11. Naive Bayes Algorithm 1 (4:06)
12. Naive Bayes Algorithm 2 (1:10)
13. Building a Spam Classifier
14. Project
15. Spam Classifier – Workspace
16. Outro
Part of Speech Tagging with HMMs
01. Intro (0:23)
02. Part of Speech Tagging (1:21)
03. Lookup Table (1:11)
04. Bigrams (2:18)
05. When bigrams won’t work (0:55)
06. Hidden Markov Models (3:07)
07. Quiz: How many paths (3:18)
08. Solution: How many paths (1:24)
09. Quiz: How many paths now (0:54)
10. Quiz: Which path is more likely (0:23)
11. Solution: Which path is more likely (0:17)
12. Viterbi Algorithm Idea (0:50)
13. Viterbi Algorithm (4:12)
14. Further Reading
15. Outro (0:14)
Project Part of Speech Tagging
01. Lesson Plan – Week 3
02. Introduction
03. Workspace Part of Speech Tagging
Project Description – Part of Speech Tagging
Project Rubric – Part of Speech Tagging
Feature Extraction and Embeddings
01. Feature Extraction (0:42)
02. Bag of Words (3:17)
03. TF-IDF (1:38)
04. One-Hot Encoding (0:55)
05. Word Embeddings (1:02)
06. Word2Vec (2:19)
07. GloVe (3:01)
08. Embeddings for Deep Learning (3:38)
09. t-SNE (1:31)
10. Summary (0:23)
Topic Modeling
01. Intro (0:53)
02. References
03. Bag of Words (1:23)
04. Latent Variables (1:30)
05. Matrix Multiplication (2:11)
06. Matrices (2:29)
07. Quiz: Picking Topics (1:53)
08. Solution: Picking Topics (1:44)
09. Beta Distributions (3:17)
10. Dirichlet Distributions (2:10)
11. Latent Dirichlet Allocation (0:40)
12. Sample a Topic (1:22)
12.2 Sample a Topic (1:22)
13. Sample a Word (1:54)
13.2 Sample a Word (1:54)
14. Combining the Models (4:43)
15. Outro (0:23)
16. Notebook Topic Modeling
17. [SOLUTION] Topic Modeling
18. Next Steps
Sentiment Analysis
01. Intro (1:03)
02. Sentiment Analysis with a Regular Classifier (2:28)
05. Sentiment Analysis with RNN (0:19)
06. Notebook Sentiment Analysis with an RNN
07. [SOLUTION] Sentiment Analysis with an RNN
08. Optional Material
09. Outro
Sequence to Sequence
01. Introducing Jay Alammar
02. Previous Material
03. Jay Introduction (4:02)
04. Applications
05. Architectures
06. Architectures in More Depth
07. Outro
Deep Learning Attention
01. Introduction to Attention (4:21)
02. Sequence to Sequence Recap (2:55)
03. Encoding – Attention Overview (1:21)
04. Decoding – Attention Overview (2:31)
05. Attention Overview
06. Attention Encoder
07. Attention Decoder (2:28)
08. Attention Encoder & Decoder
09. Bahdanau and Luong Attention (4:37)
10. Multiplicative Attention (6:09)
11. Additive Attention
12. Additive and Multiplicative Attention
13. Computer Vision Applications
14. NLP Application: Google Neural Machine Translation
15. Other Attention Methods (4:16)
16. The Transformer and Self-Attention (4:25)
17. Notebook Attention Basics
18. [SOLUTION] Attention Basics
19. Outro
RNN Keras Lab
01. Intro
02. Machine Translation
02.2 Machine Translation
03. Deciphering Code with character-level RNNs
04. [SOLUTION] Deciphering code with character-level RNNs
05. Congratulations!
Cloud Computing Setup Instructions
01. Overview
02. Create an AWS Account
03. Get Access to GPU Instances
04. Launch Your Instance
05. Remotely Connecting to Your Instance
Project Machine Translation
01. Introduction to GPU Workspaces
02. Workspaces: Best Practices
03. NLP Machine Translation Workspace
04. Project: Machine Translation
Project Description – Project: Machine Translation
Project Rubric – Project: Machine Translation
Intro to Voice User Interfaces
01. Welcome to Voice User Interfaces!
02. VUI Overview
03. VUI Applications
(Optional) Alexa History Skill
Speech Recognition
01. Intro
02. Challenges in ASR
03. Signal Analysis
04. References: Signal Analysis
05. Quiz: FFT
06. Feature Extraction with MFCC
07. References: Feature Extraction
08. Quiz: MFCC
09. Phonetics
10. References: Phonetics
11. Quiz: Phonetics
12. Voice Data Lab Introduction
13. Lab: Voice Data
14. Acoustic Models and the Trouble with Time
15. HMMs in Speech Recognition
16. Language Models
17. N-Grams
18. Quiz: N-Grams
19. References: Traditional ASR
20. A New Paradigm
21. Deep Neural Networks as Speech Models
22. Connectionist Temporal Classification (CTC)
23. References: Deep Neural Network ASR
24. Outro
Project DNN Speech Recognizer
01. Overview
02. Introduction to GPU Workspaces
03. Workspaces Best Practices
04. Tasks
05. VUI Speech Recognizer Workspace
Project Description – Project DNN Speech Recognizer
Project Rubric – Project DNN Speech Recognizer
Recurrent Neural Networks
01. Introducing
02. RNN Introduction
03. RNN History
04. RNN Applications
05. Feedforward Neural Network – Reminder
05.2 Feedforward Neural Network – Reminder
06. The Feedforward Process
06.2 The Feedforward Process
07. Feedforward Quiz
07.2 Feedforward Quiz
08. Backpropagation – Theory
09. Backpropagation – Example (part a)
10. Backpropagation – Example (part b)
11. Backpropagation Quiz
12. RNN (part a)
13. RNN (part b)
14. RNN – Unfolded Model
15. Unfolded Model Quiz
16. RNN – Example
17. Backpropagation Through Time (part a)
18. Backpropagation Through Time (part b)
19. Backpropagation Through Time (part c)
20. BPTT Quiz 1
21. BPTT Quiz 2
22. BPTT Quiz 3
23. Some more math
24. RNN Summary
25. From RNN to LSTM
26. Wrap Up
06.3 The Feedforward Process
08.2 Backpropagation – Theory
Long Short-Term Memory Networks (LSTM)
01. Intro to LSTM
02. RNN vs LSTM
03. Basics of LSTM
04. Architecture of LSTM
05. The Learn Gate
06. The Forget Gate
07. The Remember Gate
08. The Use Gate
09. Putting it All Together
10. Quiz
11. Other architectures
12. Outro LSTM
Keras
01. Intro
02. Keras
03. Pre-Lab: Student Admissions in Keras
04. Lab: Student Admissions in Keras
05. Optimizers in Keras
06. Mini Project Intro
07. Pre-Lab: IMDB Data in Keras
08. Lab: IMDB Data in Keras
Sentiment Prediction RNN
01. Intro
02. Sentiment RNN
03. Data Preprocessing (3:23)
04. Creating Testing Sets (1:14)
05. Building the RNN
06. Training the Network
07. Solutions
Embeddings and Word2Vec
01. Additional NLP Lessons
02. Embeddings Intro
03. Implementing Word2Vec
04. Subsampling Solution (4:21)
05. Making Batches (3:24)
06. Batches Solution (1:45)
07. Building the Network
08. Negative Sampling
09. Building the Network Solution
10. Training Results (3:13)
Project Part of Speech Tagging
01. Introduction
Project Description – Part of Speech Tagging
Project Rubric – Part of Speech Tagging
LLMs Module: Introduction to Large Language Models
Introduction to the course (2:21)
What are LLMs (2:56)
How large is an LLM (2:56)
General purpose models (1:09)
Pretraining and fine tuning (2:39)
What can LLMs be used for (3:18)
LLMs Module: The Transformer Architecture
Deep learning recap (2:32)
The problem with RNNs (3:35)
The solution: attention is all you need (2:50)
The transformer architecture (1:02)
Input embeddings (2:54)
Multiheaded attention (3:59)
Feedforward layer (2:38)
Masked multihead attention (1:24)
Predicting the final outputs (1:44)
LLMs Module: Getting Started With GPT Models
What does GPT mean (1:28)
The development of ChatGPT (2:28)
OpenAI API (2:58)
Generating text (2:25)
Customizing GPT output
Keyword text summarization (3:47)
Coding a simple chatbot (6:17)
Introduction to LangChain in Python (1:27)
LangChain
Adding custom data to our chatbot (5:21)
LLMs Module: Hugging Face Transformers
Hugging Face package (2:41)
The transformer pipeline (5:50)
Pretrained tokenizers (9:01)
Special tokens (2:54)
Hugging Face and PyTorch/TensorFlow (4:32)
Saving and loading models
LLMs Module: Question and Answer Models With BERT
GPT vs BERT
BERT architecture (4:40)
Loading the model and tokenizer (1:48)
BERT embeddings (3:43)
Creating a QA bot (8:41)
BERT, RoBERTa, DistilBERT (3:06)
LLMs Module: Text Classification With XLNet
GPT vs BERT vs XLNet (4:17)
Preprocessing our data (9:58)
XLNet Embeddings (4:25)
Fine tuning XLNet (3:54)
Evaluating our model
LangChain Module: Introduction
Introduction to the course (4:54)
Business applications of LangChain (5:23)
What makes LangChain powerful (4:33)
What does the course cover (5:33)
LangChain Module: Tokens, Models, and Prices
LangChain Module: Setting Up the Environment
Setting up a custom Anaconda environment for Jupyter integration (3:42)
Obtaining an OpenAI API key
LangChain Module: The OpenAI API
First steps
System, user, and assistant roles
Creating a sarcastic chatbot
Temperature, max tokens, and streaming
LangChain Module: Model Inputs
The LangChain framework
ChatOpenAI (6:25)
System and human messages (4:30)
AI messages (5:08)
Prompt templates and prompt values (5:23)
Chat prompt templates and chat prompt values (6:06)
Few-shot chat message prompt templates (6:23)
LLMChain
LangChain Module: Message History and Chatbot Memory
Chat message history (6:01)
Conversation buffer memory: Implementing the setup (3:49)
Conversation buffer memory: Configuring the chain (6:38)
Conversation buffer window memory (4:02)
Conversation summary memory
Combined memory
LangChain Module: Output Parsers
String output parser (2:44)
Comma-separated list output parser (3:16)
Datetime output parser (2:48)
LangChain Module: LangChain Expression Language (LCEL)
Piping a prompt, model, and an output parser (6:52)
Batching (4:36)
Streaming (4:18)
The Runnable and RunnableSequence classes (4:53)
Piping chains and the RunnablePassthrough class (7:32)
Graphing Runnables (2:15)
RunnableParallel (6:24)
Piping a RunnableParallel with other Runnables (5:32)
RunnableLambda (5:24)
The chain decorator (4:22)
Adding memory to a chain, Part 1: Implementing the setup (4:02)
RunnablePassthrough with additional keys (5:25)
Itemgetter (3:25)
Adding memory to a chain, Part 2: Creating the chain
LangChain Module: Retrieval Augmented Generation (RAG)
How to integrate custom data into an LLM (4:03)
Introduction to RAG (3:40)
Introduction to document loading and splitting (3:56)
Introduction to document embedding (6:46)
Introduction to document storing, retrieval, and generation (3:49)
Indexing: Document loading with PyPDFLoader (7:11)
Indexing: Document loading with Docx2txtLoader (2:25)
Indexing: Document splitting with character text splitter – Theory
Indexing: Document splitting with character text splitter – Code-along (5:20)
Indexing: Document splitting with Markdown header text splitter (5:53)
Indexing: Text embedding with OpenAI (6:00)
Indexing: Creating a Chroma vectorstore (5:42)
Indexing: Inspecting and managing documents in a vectorstore (4:22)
Retrieval: Similarity search (5:29)
Retrieval: Maximal Marginal Relevance (MMR) search (6:48)
Retrieval: Vectorstore-backed retriever (3:30)
Generation: Stuffing documents (4:23)
Generation: Generating a response (3:41)
LangChain Module: Tools and Agents
Introduction to reasoning chatbots (3:06)
Tools, toolkits, agents, and agent executors (6:41)
Creating a Wikipedia tool and piping it to a chain (6:03)
Creating a retriever and a custom tool (5:38)
LangChain hub (4:07)
Creating a tool calling agent and an agent executor (5:40)
AgentAction and AgentFinish (4:38)
Vector Databases Module: Introduction
Introduction to the course (2:59)
Database comparison: SQL, NoSQL, and Vector
Understanding vector databases (4:17)
Vector Databases Module: Basics of Vector Space and High-Dimensional Data
Introduction to vector space
Distance metrics in vector space
Vector embeddings walkthrough
Vector Databases Module: Introduction to The Pinecone Vector Database
Vector databases comparison (6:59)
Pinecone registration walkthrough and creating an index (3:39)
Connecting to Pinecone using Python (2:56)
Pinecone solution
Creating and deleting a Pinecone index using Python (3:26)
Upserting data to a Pinecone vector database (3:51)
Getting to know the fine web dataset and loading it into Jupyter (2:07)
Upserting data from a text file and using an embedding algorithm (5:21)
Vector Databases Module: Semantic Search with Pinecone and Custom (Case Study)
Introduction to semantic search (3:45)
Introduction to the case study: smart search for data science courses (5:07)
Getting to know the data for the case study
Data loading and preprocessing (4:28)
Pinecone Python APIs and connecting to the Pinecone server (4:17)
Embedding Algorithms
Embedding the data and upserting the files to Pinecone (3:28)
Similarity search and querying the data (4:27)
How to update and change your vector database
Data preprocessing and embedding for courses with section data (4:11)
Assignment 2
Upserting the new updated files to Pinecone (2:10)
Similarity search and querying courses and sections data
Assignment 3
Using the BERT embedding algorithm (3:44)
Vector database for recommendation engines (3:41)
Vector database for semantic image search
Vector database for biomedical research