AI Deep Dive: From Machine Learning to NLP Mastery
These videos cover a broad range of topics in artificial intelligence, with an emphasis on both theory and practical application. The main areas are machine learning and NLP techniques: an in-depth look at contemporary methods in machine learning and NLP, including adversarial attacks, imitation attacks, self-attention mechanisms, and Transformer networks.
-
自然語言處理 (Natural Language Processing, NLP)
This series covers many facets of natural language processing (NLP), from adversarial attacks to understanding and applying models. It opens with the three-part lecture「自然語言處理上的對抗式攻擊」(adversarial attacks on NLP), taught by teaching assistant 姜成翰, which examines security issues and vulnerabilities in NLP systems. It then moves on to「自然語言處理上的模仿攻擊和後門攻擊」(imitation and backdoor attacks on NLP), which focuses on more sophisticated and covert attack techniques. Next, a series of videos from 莫烦Python introduces language models and attention mechanisms, explaining how machines understand language, from sentence vectors to word vectors, and also looks at how search engines work and at the foundations of NLP. StatQuest's "Word Embedding and Word2Vec, Clearly Explained!!!" gives a clear explanation of word embeddings and Word2Vec. Finally, codebasics' "NLP Tutorial For Beginners In Python" series goes from the basics to practical applications, covering key NLP topics including regular expressions, the main categories of NLP techniques, common NLP tasks, the NLP pipeline, a comparison of spaCy and NLTK, and core techniques such as tokenization, stemming, part-of-speech tagging, and named entity recognition. Together, these videos give beginners a comprehensive and in-depth learning path through NLP.
-
【機器學習2022】自然語言處理上的對抗式攻擊 (由姜成翰助教講授) - Part 1 - Hung-yi Lee
-
【機器學習2022】自然語言處理上的對抗式攻擊 (由姜成翰助教講授) - Part 2 - Hung-yi Lee
-
【機器學習2022】自然語言處理上的對抗式攻擊 (由姜成翰助教講授) - Part 3 - Hung-yi Lee
-
【機器學習2022】自然語言處理上的模仿攻擊 (Imitation Attack) 以及後門攻擊 (Backdoor Attack) (由姜成翰助教講授) - Hung-yi Lee
-
肩膀上的眺望 - 预训练语言模型【莫烦Python NLP 自然语言处理教学】
In today's AI work we often spend a great deal of time and compute training a high-quality model; this is a way of turning resources and time into knowledge stored in the form of a model. So how can we reuse it? 莫烦Python NLP tutorial index: https://mofanpy.com/tutorials/machine-learning/nlp/
-
请注意 注意力 - 自然语言处理注意力机制 【莫烦Python】
What does attention on top of attention look like? Understanding the attention mechanism in machine learning (NLP) from a biological perspective. 莫烦Python NLP tutorial index: https://mofanpy.com/tutorials/machine-learning/nlp/
-
请注意用词-语言模型的注意力【莫烦Python】
How do humans observe things and focus their attention on one of them? What about machines? AI NLP Python tutorials: https://www.youtube.com/watch?v=YXAQIRP2HXE&list=PLXO45tsB95cJDXCl_LjPSR3TWVXoPWU0Q
-
机器这样理解语言-句向量 - 莫烦Python
In human-machine dialogue, how does a machine understand a sentence? Sentence vectors... How machines understand words: https://www.youtube.com/watch?v=2fhChTLQDfY More AI explainers from 莫烦Python: https://mofanpy.com Support 莫烦: https://mofanpy.com/support/
-
机器是这样理解语言的 - 词向量 - 莫烦Python
Language is a discipline of its own: an ever-changing human language can take a lifetime to master. Learning a language usually starts with learning the meaning of words, and only when words are put together does language acquire meaning. In this video we look at how a computer understands words. 莫烦Python NLP tutorial index: https://mofanpy.com/tutorials/machine-learning/nlp/
-
你天天用的搜索引擎是怎样工作的?【莫烦Python】
"莫烦Python" NLP 教学目录: https://mofanpy.com/tutorials/machine-learning/nlp/生活中我们已经离不开互联网,而互联网也离不开一门技术,这种技术在早期的互联网中发挥着决定性的作用,它连接着人与人,人与网。它,就是我们的搜索引擎。更多请关注【莫烦Pyt...
-
计算机怎么理解人类语言?什么是自然语言处理 NLP 技术?【莫烦Python】
"莫烦Python" NLP 教学目录: https://mofanpy.com/tutorials/machine-learning/nlp/支持莫烦做更好的视频: https://mofanpy.com/support/计算机读懂语言,在如今已经不是什么新鲜的事情了,不过你有没有想过计算机是如何读懂人类语言...
-
Word Embedding and Word2Vec, Clearly Explained!!! - StatQuest
Words are great, but if we want to use them as input to a neural network, we have to convert them to numbers. One of the most popular methods for assigning n...
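To make the idea concrete, here is a minimal sketch of training word vectors with gensim's Word2Vec. The toy corpus and parameter values are illustrative assumptions, not taken from the video.

```python
# Minimal Word2Vec sketch with gensim (toy corpus and parameters are illustrative).
from gensim.models import Word2Vec

sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "pets"],
]

# vector_size: embedding dimension; window: context size; sg=1 selects skip-gram.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["cat"][:5])            # first few dimensions of the learned vector
print(model.wv.most_similar("cat"))   # nearest neighbours in the embedding space
```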
-
Introduction | NLP Tutorial For Beginners In Python - Season 1 Episode 1 - codebasics
We are starting a Natural Language Processing tutorial series for beginners in Python. This will be the first episode of season 1. In this introduction video we will discuss the following topics, ⭐️ Timestamps ⭐️ 00:00 Introduction on NLP playlist 00:45 Four unique things about this tutorial series 02:07 My favorite NLP book 02:57 NLP in real life 08:37 NLP career roles with salaries
-
Why NLP is booming right now? | NLP Tutorial For Beginners In Python - S1 E2 - codebasics
NLP has been around in academia for more than 50 years. In industry, however, it has only become popular in the last 5 to 10 years. In this video I will discuss 5 reasons why natural language processing is booming right now.
-
Regex For NLP: NLP Tutorial For Beginners In Python - S1 E3 - codebasics
NLP problems are solved either with a heuristic/rule-based approach or with machine learning. Regex offers a powerful rule-based approach where you match patterns in your text to pull out useful information for a given NLP task. In this video I will discuss two use cases, (1) a customer service chatbot and (2) information extraction, and show how regular expressions can help solve some of these simple tasks. Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/py/blob... #nlpregexpython #NLP #regexnlp #nlpregextutorial #regexinchatbot #regex
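As a small illustration of the rule-based approach described above, here is a hedged sketch using Python's re module; the sample text and patterns are made up for this example and are not from the codebasics notebook.

```python
# Rule-based information extraction with regular expressions (illustrative patterns).
import re

text = "Hi, I am Peter. Reach me at (123) 456-7890 or peter@example.com. Order id: 4123987."

phone = re.search(r"\(\d{3}\)\s?\d{3}-\d{4}", text)                 # US-style phone number
email = re.search(r"[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.\w+", text)   # simple email pattern
order = re.search(r"[Oo]rder id:\s*(\d+)", text)                    # capture the order number

print(phone.group(), email.group(), order.group(1))
```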
-
Three Category Of Techniques for NLP : NLP Tutorial For Beginners In Python - S1 E4 - codebasics
codebasics
-
NLP Tasks: NLP Tutorial For Beginners In Python - S1 E5 - codebasics
We are going to discuss 10 different types of NLP tasks along with some technical details. We will also talk about some real life use cases of these NLP tasks. NLP Playlist: • NLP Tutorial Python Practical NLP Book In India: https://www.shroffpublishers.com/book... Practical NLP Book Link For USA: https://amzn.to/3Aoeocm #nlptasks #nlpusecases #nlptutorial #nlpforbeginners #nlp
-
NLP Pipeline: NLP Tutorial For Beginners In Python - S1 E6 - codebasics
An end-to-end NLP project consists of many steps, and these steps together form an NLP pipeline. The pipeline has various stages such as data acquisition, data cleaning, pre-processing, model building, deployment, and monitoring and updating. We will use a real-life example and go through the various stages of an NLP pipeline in this video. Complete NLP Playlist: • NLP Tutorial Python #nlp #nlppipeline #nlptutorial #nlpbeginnertutorial #nlppython
-
Spacy vs NLTK: NLP Tutorial For Beginners In Python - S1 E7 - codebasics
spaCy and NLTK are two of the most widely used Python NLP libraries. In this video we compare the two and discuss where each one fits in an NLP project. Complete NLP Playlist: • NLP Tutorial Python Practical NLP Book In India: https://www.shroffpublishers.com/book... Practical NLP Book Link For USA: https://amzn.to/3Aoeocm 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #nltk #nltktutorial #spacyvsnltk #nltkinpython #nltkpython #nltkpythontutorial
-
Tokenization in Spacy: NLP Tutorial For Beginners - S1 E8 - codebasics
Word and sentence tokenization can be done easily using the spacy library in python. In this NLP tutorial, we will cover tokenization and a few related topics. NLP platform: https://www.firstlanguage.in/ ⭐️ Timestamps ⭐️ 00:00 What is tokenization 02:35 Install spacy 02:49 Coding starts 03:23 Basic English word tokenization 14:15 Span object 15:00 Token attributes 18:40 Grab emails from the student information doc 23:58 Tokenization in Hindi 26:13 Customize tokenization rule 29:52 Sentence tokenization (or segmentation) 33:15 Exercise Code: https://github.com/codebasics/nlp-tut... Exercise: In the above code, go to the end and you will find exercises Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #spacytutorialnlp #wordtokenization #tokenizerspacy #tokenizationnlp #wordtokenizerspacy #tokenizationandspacy #spacynlp
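A minimal sketch of word and sentence tokenization in spaCy, assuming spaCy is installed; the sample text is invented for illustration.

```python
# Word and sentence tokenization with spaCy (sample text is illustrative).
import spacy

nlp = spacy.blank("en")                       # blank English pipeline: tokenizer only
nlp.add_pipe("sentencizer")                   # rule-based sentence segmentation

doc = nlp("Hulk loves chaat from Delhi. Thor loves pizza from New York. Contact: thor@avengers.com")

print([token.text for token in doc])          # word tokens
print([sent.text for sent in doc.sents])      # sentence tokens
print([t.text for t in doc if t.like_email])  # token attribute: looks like an email
```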
-
Language Processing Pipeline in Spacy: NLP Tutorial For Beginners - S1 E9 - codebasics
In this tutorial, we will cover the language processing pipeline in spaCy. Pre-trained pipelines provide components such as a tagger, a parser, and an NER component out of the box; using them you can detect parts of speech and named entities, perform sentence segmentation, etc. NLP platform: https://www.firstlanguage.in/ ⭐️ Timestamps ⭐️ 00:00 Introduction 00:11 Blank NLP pipeline 02:33 Download pre-trained pipeline 08:29 Named entity recognition 13:59 Language processing pipeline in French 16:16 Add the component to a blank pipeline Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #pipeinspacy #nlppipeline #languageprocessingpipeline #createpipelines #spacypipeline #customspacypipeline
-
Stemming and Lemmatization: NLP Tutorial For Beginners - S1 E10 - codebasics
Stemming and lemmatization are two popular techniques for reducing a given word to its base form. Stemming uses a fixed set of rules to remove suffixes and prefixes, whereas lemmatization uses linguistic knowledge to arrive at the correct base word. Stemming is demonstrated in NLTK (spaCy doesn't support stemming), while the lemmatization code is written in spaCy. NLP platform: https://www.firstlanguage.in/ Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #nlptutorialpython #naturallanguageprocessingstemming #nlpstemming #nlpstemmingtutorial #stemming #lemmatization
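A small sketch contrasting the two approaches, assuming NLTK and spaCy (with the en_core_web_sm model) are installed; the example words are chosen for illustration.

```python
# Stemming with NLTK vs. lemmatization with spaCy (word list is illustrative).
import spacy
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
words = ["eating", "ate", "adjustable", "ability", "meeting"]
print([stemmer.stem(w) for w in words])       # rule-based: 'ate' stays 'ate', 'ability' -> 'abil'

nlp = spacy.load("en_core_web_sm")            # pretrained pipeline provides lemmas
doc = nlp("He ate while eating a better meal")
print([(t.text, t.lemma_) for t in doc])      # dictionary-based: 'ate' -> 'eat'
```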
-
Part Of Speech POS Tagging: NLP Tutorial For Beginners - S1 E11 - codebasics
Part-of-speech (POS) tagging is used to tag parts of speech while building an NLP application. In this video, we will cover the basics of POS first and then write code in spaCy. NLP platform: https://www.firstlanguage.in/ Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python ⭐️ Timestamps ⭐️ 00:00 What is Part of Speech (POS) 09:52 Coding in Spacy for POS 15:28 Tags in Spacy 18:37 Remove unnecessary tokens from Microsoft earning report using POS tags 24:22 Exercise 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #postagging #partsofspeechtagging #typesofpostagging #rulebasedpostagging #partofspeechtag #partofspeechinspacy #posnlptutorial
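A brief illustrative sketch of POS tagging in spaCy, assuming the en_core_web_sm model is downloaded; the sentence and the filtering choice are assumptions, loosely inspired by the earnings-report example in the timestamps.

```python
# Part-of-speech tagging with spaCy (assumes the en_core_web_sm model is installed).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Microsoft reported strong quarterly earnings on Tuesday")

for token in doc:
    # pos_ is the coarse universal tag, tag_ the fine-grained one
    print(token.text, token.pos_, token.tag_, spacy.explain(token.tag_))

# Keep only tokens likely to carry content, filtering by POS tag
content = [t.text for t in doc if t.pos_ not in ("PUNCT", "DET", "ADP")]
print(content)
```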
-
Named Entity Recognition (NER): NLP Tutorial For Beginners - S1 E12 - codebasics
Named Entity Recognition, also known as NER, is an NLP technique used to identify specific entities such as a person, product, location, or amount of money in text. It has many useful real-life applications such as document search, recommendations, customer-support ticket routing, and many more. In this video, I will give a very simple explanation of NER, and then we will write code in spaCy to detect entities. Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python ⭐️ Timestamps ⭐️ 00:00 Introduction 00:13 Application of NER in Real Life 06:49 Coding: NER in Spacy 17:47 Approaches to Build a NER System 🔖Hashtags🔖 #namedentityrecognition #ner #nernlp #whatisner #applicationofner #nerinspacy #spacy #nersystem #spacyner #nertutorial
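A minimal NER sketch in spaCy, assuming the en_core_web_sm model; the sample sentence is invented.

```python
# Named entity recognition with spaCy (assumes en_core_web_sm is installed).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Tesla is looking to buy a U.K. startup for $6 million, Reuters reported in London.")

for ent in doc.ents:
    # ent.label_ is the entity type, e.g. ORG, GPE, MONEY
    print(ent.text, ent.label_, spacy.explain(ent.label_))
```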
-
-
元學習 Meta Learning
This series focuses on different aspects of meta learning and offers an in-depth look at this complex field. It opens with Professor Hung-yi Lee's lecture「各種奇葩的元學習用法」(unusual uses of meta learning), which introduces distinctive applications and methods of meta learning within machine learning. The series then moves into two parts,「元學習跟機器學習一樣也是三個步驟」(meta learning, like machine learning, is three steps) and「萬物皆可 Meta」(everything can be Meta), which dig into the fundamental theory and practice of meta learning. It also includes the two-part supplementary lecture "More about Meta Learning", taught by teaching assistant 陳建成, which analyzes more advanced concepts and techniques. In addition, the three-part "Meta Learning - Gradient Descent as LSTM" series and the "Meta Learning – Metric-based" series present concrete realizations of meta learning from the gradient-descent and metric-based perspectives, respectively. Finally, the four-part "More about Auto-encoder" series takes a deeper look at auto-encoder techniques and details. Together these videos offer rich knowledge and practical guidance for learners interested in meta learning and other advanced machine-learning topics.
-
【機器學習 2022】各種奇葩的元學習 (Meta Learning) 用法 - Hung-yi Lee
-
【機器學習2021】元學習 Meta Learning (一) - 元學習跟機器學習一樣也是三個步驟 - Hung-yi Lee
https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/meta_v3.pdf
-
【機器學習2021】元學習 Meta Learning (二) - 萬物皆可 Meta - Hung-yi Lee
https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/meta_v3.pdf
-
[TA 補充課] More about Meta Learning (由助教陳建成同學講授) (1/2)
slides: http://speech.ee.ntu.edu.tw/~tlkagk/courses/ML2020/Meta_learning_and_more.pdf
-
[TA 補充課] More about Meta Learning (由助教陳建成同學講授) (2/2)
slides: http://speech.ee.ntu.edu.tw/~tlkagk/courses/ML2020/Meta_learning_and_more.pdf
-
Meta Learning - Gradient Descent as LSTM (1/3) - Hung-yi Lee
-
Meta Learning - Gradient Descent as LSTM (2/3) - Hung-yi Lee
-
Meta Learning - Gradient Descent as LSTM (3/3) - Hung-yi Lee
-
Meta Learning – Metric-based (1/3)
-
Meta Learning – Metric-based (2/3)
-
Meta Learning – Metric-based (3/3)
-
Meta Learning - Train+Test as RNN
-
More about Auto-encoder (1/4)
-
More about Auto-encoder (2/4)
-
More about Auto-encoder (3/4)
-
More about Auto-encoder (4/4)
-
-
Transformer
This series centres on the Transformer architecture and its variants, and is a valuable resource for anyone who wants a deeper understanding of Transformers and self-attention. It opens with Professor Hung-yi Lee's two-part lecture「Transformer (上)」and「Transformer (下)」, which explains the basic structure and workings of the Transformer. Teaching assistant 紀伯翰 then digs deeper into the Transformer and its variants in the supplementary lecture "Transformer and its variant". Professor Lee's two further lectures,「自注意力機制 (Self-attention) (上)」and「(下)」, focus on the self-attention mechanism, the core component of the Transformer model. StatQuest's three videos, "Attention for Neural Networks, Clearly Explained!!!", "Transformer Neural Networks, ChatGPT's foundation, Clearly Explained!!!" and "Decoder-Only Transformers, ChatGPTs specific Transformer, Clearly Explained!!!", explain attention and Transformer networks in an accessible way. Together they give learners clear explanations and deep insight into these complex concepts.
-
【機器學習2021】Transformer (上) - Hung-yi Lee
slides: https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/seq2seq_v9.pdf
-
【機器學習2021】Transformer (下) - Hung-yi Lee
slides: https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/seq2seq_v9.pdf
-
[TA 補充課] Transformer and its variant (由助教紀伯翰同學講授)
slides: https://docs.google.com/presentation/d/1saF8O0TFQDTLmLpoeOPdsQylXimxyGp5yOMFNexEXkg/edit?usp=sharing
-
【機器學習2021】自注意力機制 (Self-attention) (上) - Hung-yi Lee
ML2021 week3 3/12 Self-attention. slides: https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/self_v7.pdf
-
【機器學習2021】自注意力機制 (Self-attention) (下) - Hung-yi Lee
slides: https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/self_v7.pdf
-
Attention for Neural Networks, Clearly Explained!!! - StatQuest
Attention is one of the most important concepts behind Transformers and Large Language Models, like ChatGPT. However, it's not that complicated. In this Stat...
-
Transformer Neural Networks, ChatGPT's foundation, Clearly Explained!!! - StatQuest
Transformer Neural Networks are the heart of pretty much everything exciting in AI right now. ChatGPT, Google Translate and many other cool things, are based...
-
Decoder-Only Transformers, ChatGPTs specific Transformer, Clearly Explained!!! - StatQuest
Transformers are taking over AI right now, and quite possibly their most famous use is in ChatGPT. ChatGPT uses a specific type of Transformer called a Decod...
-
ELMO, BERT, GPT - Hung-yi Lee
source of the image at the end of the video: https://twitter.com/gregd_nlp/status/1096244878600818693
-
-
機器終身學習 (Life Long Learning, LL)
This series explores lifelong learning (Life Long Learning, LL) in machines, and in particular strategies for dealing with catastrophic forgetting. It is an important resource for anyone who wants to understand how to make AI systems keep learning and adapting to new information more effectively. The series opens with Professor Hung-yi Lee's two-part lecture: part one,「機器終身學習 (Life Long Learning, LL) (一)」, introduces what catastrophic forgetting is and how it affects today's AI systems, while part two digs into different methods and strategies for overcoming it. In addition, the supplementary lecture "More about Lifelong Learning", taught by teaching assistant 楊舒涵, goes further into advanced concepts and techniques of lifelong learning. These videos give learners the theoretical background and practical strategies needed to design and build effective lifelong-learning systems.
-
【機器學習2021】機器終身學習 (Life Long Learning, LL) (一) - 為什麼今日的人工智慧無法成為天網?災難性遺忘(Catastrophic Forgetting) - Hung-yi Lee
https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/life_v2.pdf
-
【機器學習2021】機器終身學習 (Life Long Learning, LL) (二) - 災難性遺忘(Catastrophic Forgetting)的克服之道 - Hung-yi Lee
https://speech.ee.ntu.edu.tw/~hylee/ml/ml2021-course-data/life_v2.pdf
-
[TA 補充課] More about Lifelong Learning (由助教楊舒涵同學講授)
slides: http://speech.ee.ntu.edu.tw/~tlkagk/courses/ML2020/MoreLL.pdf
-
-
遺傳演算法(Genetic Algorithm)
This series focuses on genetic algorithms and evolution strategies, offering a complete introduction from the basics to practical applications. The lectures by 莫烦Python and Noureddin Sadawi cover the fundamental ideas behind genetic algorithms, the microbial genetic algorithm, evolution strategies, and their use in machine learning, with examples such as sentence matching and the travelling salesman problem (TSP). They also cover neuroevolution and the NEAT (NeuroEvolution of Augmenting Topologies) algorithm applied to both supervised and reinforcement learning. Noureddin Sadawi's series of 30 short videos examines the different ingredients of a genetic algorithm, including selection methods and crossover and mutation operators, and demonstrates these concepts with Java implementations; it is a valuable resource for anyone who wants to understand genetic algorithms and how they are applied to complex problems. Overall, these videos provide a thorough understanding of genetic algorithms and evolution strategies, helping learners master these powerful tools and apply them to real machine-learning problems.
-
什么是遗传算法? What is Genetic Algorithm? - 莫烦Python
This time we step into another area of machine learning: using the theory of evolution to solve complex problems. The genetic algorithm is a branch of evolutionary algorithms; it brings Darwin's theory of evolution into the computer. Support 莫烦Python: https://mofanpy.com/support/
-
#1.1 进化算法简介 (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial)
Evolutionary algorithms encompass many algorithms, such as the genetic algorithm (Genetic Algorithm), evolution strategy (Evolution Strategy), neuroevolution (NeuroEvolution), and more. If you like this, please star my Tutorial code on Github.
-
#2.1 遗传算法 Genetic Algorithm (进化算法教学教程 Tutorial)
A simple genetic algorithm implemented in Python. If you like this, please star my Tutorial code on Github. Code: https://github.com/MorvanZhou/Evolutionary-Algorithm
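For readers who want a feel for what such an implementation involves, here is a generic, hedged sketch of a genetic algorithm in Python (selection, crossover, mutation); it is not 莫烦's actual code, and the fitness function and parameters are illustrative.

```python
# Minimal genetic algorithm sketch: evolve a bit string toward all ones (generic illustration).
import random

POP_SIZE, DNA_LEN, GENERATIONS = 50, 20, 100
MUTATION_RATE, CROSS_RATE = 0.01, 0.8

def fitness(dna):                      # fitness = number of 1s in the string
    return sum(dna)

def select(pop):                       # fitness-proportional (roulette wheel) selection
    weights = [fitness(ind) + 1e-6 for ind in pop]
    return random.choices(pop, weights=weights, k=POP_SIZE)

def crossover(parent, pop):            # single-point crossover with a random mate
    if random.random() < CROSS_RATE:
        mate = random.choice(pop)
        point = random.randint(1, DNA_LEN - 1)
        return parent[:point] + mate[point:]
    return parent[:]

def mutate(dna):                       # flip each bit with a small probability
    return [1 - b if random.random() < MUTATION_RATE else b for b in dna]

pop = [[random.randint(0, 1) for _ in range(DNA_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(p, pop)) for p in select(pop)]
print("best fitness:", max(fitness(ind) for ind in pop))
```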
-
#2.2 遗传算法 例子: 句子配对 (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial)
A sentence-matching exercise solved with a genetic algorithm. If you like this, please star my Tutorial code on Github. Code: https://github.com/MorvanZhou/Evolutionary-Algorithm
-
#2.3 遗传算法 例子: 旅行商人 TSP (机器学习 Genetic Algorithm 进化算法 Evolutionary Algorithm 教程教学 tutorial)
Using a genetic algorithm to solve the travelling salesman problem, i.e. finding the shortest route. If you like this, please star my Tutorial code on Github. Code: https://github.com/MorvanZhou/Evolutionary-Algorithm
-
#2.4 微生物遗传算法 Microbial Genetic Algorithm (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial)
The Microbial GA neatly handles the elitism issue in genetic algorithms; we look at it through a simple example. If you like this, please star my Tutorial code on Github.
-
什么是进化策略 What is Evolution Strategy? - 莫烦Python
Evolution is a gift from nature, and we can bring that gift into the computer and let it solve problems through evolution too. Following on from the genetic algorithm, this video introduces another evolution-based optimization approach: the Evolution Strategy.
-
#3.1 进化策略 Evolution Strategy (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
ES in one sentence: make babies in the program, kill the badly behaved babies, and let the well-behaved babies keep making babies. At first hearing this sounds exactly like the logic of a GA; the differences are revealed in the videos that follow. If you like this, please star my Tutorial code on Github.
-
#3.2 (1+1)ES 简单有效的进化策略 (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
(1+1)-ES is one form of evolution strategy, and among the many variants it is a fairly convenient and effective one. (1+1)-ES in one sentence: a contest between one parent and one child. If you like this, please star my Tutorial code on Github.
-
#3.3 进化策略+梯度下降=Natural ES (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
Natural ES can be viewed as a kind of fitness-guided gradient descent. NES in one sentence: make babies, and use the gradient from the good babies to help find the direction to move in. If you like this, please star my Tutorial code on Github.
-
什么是神经网络进化? What is Neuro-Evolution? - 莫烦Python
After a long build-up in this evolutionary-algorithm series, we finally arrive at the most advanced technique. Machine learning and deep learning are often inseparable from neural networks, and combining evolution with neural networks has seen breakthroughs in recent years. Evolution strategy intro: https://www.youtube.com/watch?v=Etj_gclFFFo
-
#4.1 神经网络进化 NeuroEvolution (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
Neural networks are a rapidly developing branch of machine learning, and backpropagation-based networks have been pushed to peak after peak. But don't let backpropagation narrow your view: there is another way of using neural networks for machine learning, called NeuroEvolution. Arguably this kind of network is closer to biological neural systems...
-
#4.2 NEAT 监督学习 Supervised learning (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
This time we use NEAT for a supervised-learning example, walking through the example from the neat-python site: evolving a neural network to predict XOR. Code: https://github.com/MorvanZhou/Evolutionary-Algorithm
-
#4.3 NEAT 强化学习 Reinforcement Learning (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
This time we use NEAT for reinforcement learning. There is no backpropagation here, only a neural network that keeps evolving (perhaps even to the point of ruling over humanity!!). (Just kidding; whenever evolution inside a computer brings sci-fi to mind, I get excited!)
-
#4.4 OpenAI ES 大规模强化学习 Reinforcement learning (机器学习 进化算法 Evolutionary Algorithm 教程教学 tutorial) - 莫烦Python
We have seen NEAT evolve a robot that can balance a pole. This time we use another evolutionary algorithm, the Evolution Strategy (abbreviated ES from here on), for large-scale reinforcement learning. If your computer has multiple cores, the simulation can even be parallelized across them. ES for reinforcement learning in one sentence: spawn babies near yourself, and let...
-
Genetic Algorithm in Artificial Intelligence - The Math of Intelligence (Week 9)
Evolutionary/genetic algorithms are somewhat of a mystery to many in the machine learning discipline. You don't see papers regularly published using them but...
-
Genetic Algorithms 1/30: Introduction .. Searching and the Search Space
In this series I give a practical introduction to genetic algorithms
-
Genetic Algorithms 2/30: the Methodology of a Genetic Algorithm - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. More here: https://www.softlight.tech/
-
Genetic Algorithms 3/30: Outline of the Basic Genetic Algorithm - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. More here: https://www.softlight.tech/
-
Genetic Algorithms 4/30: Genetic Operators .. for Binary Representation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. More here: https://www.softlight.tech/
-
Genetic Algorithms 5/30: The Microbial Genetic Algorithm - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. More here: https://www.softlight.tech/
-
Genetic Algorithms 6/30: Example Problem .. Dashed Line Detection - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 7/30: The Fitness Function - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 8/30: Binary Genetic Algorithm .. Java Code 1/2 - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 9/30: Binary Genetic Algorithm .. Java Code 2/2 - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 10/30: More on Representation .. the Knapsack Problem - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 11/30: Permutation Representation .. the Travelling Salesman Problem - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 12/30: Value Representation (e.g. Weights of ANN) - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 13/30: Tree Representation .. (e.g. Function Approximation) - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 14/30: The Roulette Wheel Selection Method - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 15/30: Java Implementation of the Roulette Wheel Selection Method - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 16/30: the Rank Selection Method - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 17/30: the Steady State Selection Method - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 18/30: Order One Crossover - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 19/30: Java Implementation of Order One Crossover - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 20/30: Partially Mapped Crossover - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 21/30: Cycle Crossover - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 22/30: Insert Mutation with Java Implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 23/30: Swap Mutation with Java Implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 24/30: Inversion Mutation with Java Implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 25/30: Scramble Mutation with Java Implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 26/30: Example Problem .. Dashed Line Detection - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 27/30: Evolving the Population .. with java implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 28/30: The Fitness Function .. with java implementation - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 29/30: Full Java Implementation of Permutation GA 1/2 - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
Genetic Algorithms 30/30: Full Java Implementation of Permutation GA 2/2 - Noureddin Sadawi
In this series I give a practical introduction to genetic algorithms. https://www.softlight.tech/
-
-
Information Retrieval (IR)
This series, presented by Victor Lavrenko, gives a comprehensive introduction to the key principles and techniques of information retrieval (IR). It starts from fundamental laws of text, such as Zipf's law, the Zipf-Mandelbrot law, and Benford's law, and discusses their importance in text processing. It then explains the bag-of-words model and the vector space model, including query and document vectors, the dot product, Euclidean distance, and inverse document frequency (IDF). The series goes further into applications of the vector space model, such as cosine similarity, the Jaccard coefficient, the Minkowski norm, and the chi-squared distance. Lavrenko also covers morphological variation, stemming algorithms, character n-grams, and query-based stemming. Turning to synonymy and polysemy, he introduces the MeSH thesaurus, WordNet, and statistical synonyms. The videos also cover extensions of the vector space model, the role of user interaction in information retrieval, and relevance feedback and pseudo-relevance feedback. Finally, the series looks at classifiers, large-margin classification, the passive-aggressive algorithm, support vector machines (SVMs), and learning to rank for information retrieval. These videos are a valuable resource for students and practitioners who want to understand the core concepts and techniques of information retrieval.
-
IR2.1 Three fundamental laws of text - Victor Lavrenko
-
IR2.2 Zipf's law - Victor Lavrenko
-
IR2.3 Zipf-Mandelbrot law - Victor Lavrenko
-
IR2.4 Benford's law - Victor Lavrenko
-
IR2.5 Heaps' law - Victor Lavrenko
-
IR2.6 Clumping - Victor Lavrenko
-
IR2.7 Applying the laws - Victor Lavrenko
-
IR2.8 Capture-recapture with search engines - Victor Lavrenko
-
IR2.9 Summary - Victor Lavrenko
-
IR3.1 Bag-of-words matching - Victor Lavrenko
-
IR3.2 Overview of the vector space model - Victor Lavrenko
-
IR3.3 Query and document vectors - Victor Lavrenko
-
IR3.4 Dot product and Euclidean distance - Victor Lavrenko
-
IR3.5 Term weighting - Victor Lavrenko
-
IR3.6 Inverse document frequency (idf) - Victor Lavrenko
-
IR3.7 Feature selection with tf-idf - Victor Lavrenko
-
IR3.8 Pivoted document length normalization - Victor Lavrenko (https://www.youtube.com/watch?v=tfIOG5mrAmg&list=PLBv09BD7ez_77rla9ZYx-OAdgo2r9USm4&index=8)
-
IR3.9 State-of-the-art retrieval formula - Victor Lavrenko
-
IR3.10 tf-idf weighted sum - Victor Lavrenko
-
IR3.11 Cosine similarity and Jacquard coefficient - Victor Lavrenko
-
IR3.12 Cosine similarity with tf-idf weights - Victor Lavrenko
-
IR3.13 Minkowski norm and chi-squared distance - Victor Lavrenko
-
IR3.14 Phrases and multi-word features - Victor Lavrenko
-
IR3.15 Applications of the vector-space model - Victor Lavrenko
-
IR4.1 Vocabulary mismatch in IR - Victor Lavrenko
-
IR4.2 Causes of vocabulary mismatch - Victor Lavrenko
-
IR4.3 How to tokenize text - Victor Lavrenko
-
IR4.4 Tokenizing Asian languages - Victor Lavrenko
-
IR4.5 Morphological variation - Victor Lavrenko
-
IR4.6 Stemming algorithms - Victor Lavrenko
-
IR4.7 Porter and Krovetz stemmer - Victor Lavrenko
-
IR4.8 Character n-grams - Victor Lavrenko
-
IR4.9 Query-based stemming - Victor Lavrenko
-
IR4.10 Removing stopwords - Victor Lavrenko
-
IR4.11 Synonymy and polysemy in IR - Victor Lavrenko
-
IR4.12 MeSH thesaurus - Victor Lavrenko
-
IR4.13 Finding synonyms in Wordnet - Victor Lavrenko
-
IR4.14 Statistical synonyms - Victor Lavrenko
-
IR4.15 Examples of statistical synonyms - Victor Lavrenko
-
IR4.16 Cosine similarity and the correlation coefficient - Victor Lavrenko
-
IR4.17 Generalised vector space model - Victor Lavrenko
-
IR4.18 Synonym expansion: Google tilde - Victor Lavrenko
-
IR4.19 User interaction in information retrieval - Victor Lavrenko
-
IR4.20 Relevance Feedback - Victor Lavrenko
-
IR4.21 Rocchio feedback algorithm - Victor Lavrenko
-
IR4.22 Rocchio algorithm illustration - Victor Lavrenko
-
IR4.23 Pseudo-relevance feedback - Victor Lavrenko
-
IR4.24 Illustration of pseudo-relevance feedback - Victor Lavrenko
-
IR4.25 Why pseudo feedback works - Victor Lavrenko
-
IR20.1 Centroid classifier - Victor Lavrenko
-
IR20.2 Large margin classification - Victor Lavrenko
-
IR20.3 Passive-aggressive algorithm (PA) - Victor Lavrenko
-
IR20.4 Convergence of the PA algorithm - Victor Lavrenko
-
IR20.5 SVM explained visually - Victor Lavrenko
-
IR20.6 Sequential minimal optimization (SMO) - Victor Lavrenko
-
IR20.7 Learning to rank for Information Retrieval - Victor Lavrenko
-
IR20.8 Learning to rank with an SVM - Victor Lavrenko
-
IR20.9 Learning to rank: features - Victor Lavrenko
-
IR20.10 Learning to rank with click data - Victor Lavrenko
-
IR20.11 Summary - Victor Lavrenko
-
Relevance model 1: Bernoulli sets vs. multinomial urns - Victor Lavrenko
Relevance model is the language model of the relevant class. In this video we look at the difference between the multinomial model (the one used in relevance models) and the multiple-Bernoulli model, which forms the basis for the classical probabilistic models.
-
Relevance model 2: the sampling game - Victor Lavrenko
How can we estimate the language model of the relevant class if we have no examples of relevant documents? We play a sampling game as follows. The relevance model is an urn with unknown parameters. We draw several samples from it and observe the query. What is the probability of observing the word "w" on the next draw?
-
Relevance model 3: probability for a set of words - Victor Lavrenko
How do we estimate a joint probability of observing a set of words? We cannot count the frequency (it'll be zero for a large set), and should not assume that the words are independent (pointless). Instead, we assume the words are exchangeable (order-invariant), and apply the de-Finetti theorem, which provides a convenient form for the joint distribution.
-
Relevance model 4: Bayesian interpretation - Victor Lavrenko
Another way to interpret the relevance model is via Bayesian estimation: the relevance model could be one of a large set of urns. We know what the urns are, but don't know which one is correct, so we compute the posterior probability for each candidate urn, and combine them via Bayesian (conditional) expectation.
-
Relevance model 5: summary of assumptions - Victor Lavrenko
The relevance model ranking is based on the probability ranking principle (PRP). It uses the background (corpus) model as a language model for the non-relevant class (just like the classical model), but has a novel estimate for the relevance model. The estimate is based on the sampling game and the de-Finetti theorem with Dirac spikes as the prior.
-
Relevance model 6: cross-language estimation - Victor Lavrenko
Can we "translate" an English query into a Chinese relevance model? Yes, we just need to have access to a parallel corpus.
-
Relevance model 7: ranking functions - Victor Lavrenko
The probability ranking principle (PRP) seems like the obvious ranking function for relevance models, but when we use it in the form of the odds ratio (as in the classical model), we bias our rankings towards the wrong type of document. A better approach is to use the Kullback-Leibler divergence between the relevance model and each of the document models.
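In symbols, a sketch of the KL-divergence ranking the video describes, with notation assumed here (R the relevance model, D a document language model, V the vocabulary); the last step drops a document-independent constant.

```latex
% Rank documents by (negative) KL divergence from the relevance model R to the document model D
\mathrm{score}(D) \;=\; -\,\mathrm{KL}\!\left(R \,\|\, D\right)
  \;=\; -\sum_{w \in V} P(w \mid R)\,\log\frac{P(w \mid R)}{P(w \mid D)}
  \;\stackrel{\mathrm{rank}}{=}\; \sum_{w \in V} P(w \mid R)\,\log P(w \mid D)
```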
-
-
進階自然語言處理 (Advanced NLP)
This advanced natural language processing (NLP) series from codebasics covers several key techniques for representing and processing text, from basic representations such as label encoding and one-hot encoding to richer ones such as bag of words (BOW), bag of n-grams, and TF-IDF. The tutorials explain how each technique works and where it is used in NLP. The series also introduces word embeddings, a more advanced text representation, with hands-on examples in spaCy and Gensim such as news classification. It further covers fastText, including how to train custom word vectors and do text classification with it. Finally, the series introduces chatbots, currently one of the most popular NLP applications. These videos give beginners a thorough and in-depth resource for learning advanced NLP techniques.
-
Text Representation Basics: NLP Tutorial For Beginners - Season 2 Episode 1 - codebasics
Machine learning models only understand numbers; they can't work directly on text. In this video, we will discuss some machine learning basics such as: 1) What are features in machine learning? 2) What is feature engineering? 3) How is text representation used in NLP to extract features from text? We will go into detail about text representation in future videos. Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #textrepresentationnlp #nlpfeatureengineering #textrepresentation #textrepresentationspacy
-
Text Representation: Label & One Hot Encoding: NLP Tutorial For Beginners - S2 E2 - codebasics
Label encoding and one hot encoding are primitive ways of representing text as numbers. These methods are not popular in NLP but it is important that we understand these techniques along with their shortcomings. In this video, we will cover how both of these methods work. Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #onehotencodingnlp #onehotencoding #onehotencodingpython #textrepresentationnlp #textrepresentation
-
Text Representation Using Bag Of Words (BOW): NLP Tutorial For Beginners - S2 E3 - codebasics
Bag of words (a.k.a. BOW) is a technique used for text representation in natural language processing. In this NLP tutorial, we will go over how a bag of words works and also write some code for email classification that uses a bag of words and the Naive Bayes classifier in machine learning. Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... ⭐️ Timestamps ⭐️ 00:00 Theory 08:00 Coding Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #nlpbagofwords #bagofwords #bagofwordsexample #bagofwordsusingnlp #bagofwordsnlp
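A minimal bag-of-words sketch with scikit-learn's CountVectorizer; the toy corpus is illustrative and not from the video.

```python
# Bag-of-words representation with scikit-learn (toy corpus; illustrative only).
from sklearn.feature_extraction.text import CountVectorizer

corpus = [
    "Thor ate pizza",
    "Loki ate pizza",
    "Loki is tall and Thor is tall",
]

v = CountVectorizer()
bow = v.fit_transform(corpus)           # sparse document-term count matrix

print(v.get_feature_names_out())        # learned vocabulary
print(bow.toarray())                    # each row is a document vector of word counts
```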
-
Stop Words: NLP Tutorial For Beginners - S2 E4 - codebasics
Stop words are frequently used words in a language, such as "a", "the", "so", etc. (in English), that do not add a lot of value to certain NLP tasks. Hence it is better to remove them during the pre-processing stage. In this video, we will cover, ⭐️ Timestamps ⭐️ 00:00 Theory 05:00 Coding Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/KirandeepMarala/nl... Complete NLP Playlist: • NLP Tutorial Python Practical NLP Book In India: https://www.shroffpublishers.com/book... Practical NLP Book Link For USA: https://amzn.to/3Aoeocm 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #nlpstopwords #stopwords #stopwordspython #stopwordsinnlp
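A short illustrative sketch of stop-word removal using spaCy's built-in stop-word list (the sample sentence is invented; the same can be done with NLTK's stopwords corpus).

```python
# Removing stop words with spaCy's stop word list (sample sentence is illustrative).
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("We just opened our wings, the flying part is coming soon")

# Keep tokens that are neither stop words nor punctuation
filtered = [token.text for token in doc if not token.is_stop and not token.is_punct]
print(filtered)
```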
-
Text Representation Using Bag Of n-grams: NLP Tutorial For Beginners - S2 E5 - codebasics
Bag of n-grams is a text representation technique in NLP that uses a group of words to vectorize a given text. Bag of words is a particular case of bag of n-grams (with n=1). In this video, I will explain in straightforward language how bag of n-grams work along with coding in spacy and sklearn. Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python ⭐️ Timestamps ⭐️ 00:00 Theory: What is a bag of n-grams 07:31 Coding: Bag of n-gram demo using sklearn CountVectorizer 17:06 Coding: News categories classification problem 18:50 Coding: Handle class imbalance 23:07 Coding: Train a model using raw text 32:20 Coding: Train a model using preprocessed text Practical NLP Book In India: https://www.shroffpublishers.com/book... Practical NLP Book Link For USA: https://amzn.to/3Aoeocm Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #nlptextrepresentation #nlpngram #ngramsmodels #ngrams #bigrams
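The same CountVectorizer can produce a bag of n-grams by setting ngram_range; a minimal sketch with illustrative toy sentences follows.

```python
# Bag of n-grams: CountVectorizer with ngram_range=(1, 2) adds bigrams (illustrative).
from sklearn.feature_extraction.text import CountVectorizer

v = CountVectorizer(ngram_range=(1, 2))
X = v.fit_transform(["Thor ate pizza", "Loki ate pizza"])
print(v.get_feature_names_out())   # unigrams plus bigrams such as 'ate pizza'
print(X.toarray())
```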
-
Text Representation Using TF-IDF: NLP Tutorial For Beginners - S2 E6 - codebasics
TF-IDF (term frequency, inverse document frequency) is a text representation technique in NLP that tackles the word count influence of common English words such as the, is etc (stop words) and some other generic words that are not stop words but can appear in any document. The idea is to give a high score to terms that are really relevant to a given document. In this video, I will explain TF-IDF in a very simple manner such that even a high school student can understand it easily 😊 Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python ⭐️ Timestamps ⭐️ 00:00 What is TF-IDF 11:32 Limitations of TF-IDF 12:17 Coding: sklearn TfidfVectorizer 21:38 Coding: Ecommerce item category classification using tf-idf Stackoverflow question on usage of log in tf-idf formula: https://stackoverflow.com/questions/2.... #tfidf #naturallanguageprocessing #textanalytics
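A minimal TF-IDF sketch with scikit-learn's TfidfVectorizer; the toy corpus is illustrative.

```python
# TF-IDF representation with scikit-learn (toy corpus; illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "iphone sale on amazon",
    "new iphone announced",
    "dhoni scored a century",
]

v = TfidfVectorizer()
X = v.fit_transform(corpus)

# Words appearing in many documents (e.g. 'iphone') get a lower idf than rare ones.
for word, idx in v.vocabulary_.items():
    print(word, round(v.idf_[idx], 3))
```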
-
Text Representation Using Word Embeddings: NLP Tutorial For Beginners - S2 E7 - codebasics
Word embeddings have revolutionized NLP in the last few years. Word2vec, GloVe, and fastText are a few popular word embedding techniques. Newer transformer-based word and document embedding techniques such as BERT, GPT, and ELMo are further advancing how accurately text can be represented as dense vectors. In this video, we will have an overview of word embeddings. There is no coding in this video. Full tutorial on Word2vec: • What is Word2Vec? A Simple Explanatio... Complete NLP Playlist: • NLP Tutorial Python Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. 🔖Hashtags🔖 #nlp #nlptutorial #textrepresentation #wordembeddings #nlptextrepresentation #nlpwordembeddings
-
Word vectors in Spacy overview: NLP Tutorial For Beginners - S2 E8 - codebasics
Spacy has inbuilt word embeddings that one can use by loading an appropriate model. For example for the English model, you need to load either a midsize or a large model to access these word embeddings. In this video, I will show you how to access these embeddings or word vectors in the Spacy NLP library. Code: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python Spacy English model page: https://spacy.io/models/en Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #wordvectorsspacy #spacywordvectors
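A hedged sketch of accessing spaCy word vectors; it assumes a medium or large English model (e.g. en_core_web_lg) has been downloaded, and the tokens are illustrative.

```python
# Accessing word vectors in spaCy (requires a md or lg English model, e.g. en_core_web_lg).
import spacy

nlp = spacy.load("en_core_web_lg")
doc = nlp("dog cat banana asdfgh")

for token in doc:
    # has_vector is False for out-of-vocabulary tokens such as 'asdfgh'
    print(token.text, token.has_vector, token.vector_norm)

print(nlp("dog").similarity(nlp("cat")))   # cosine similarity between the two vectors
```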
-
News classification using Spacy word vectors: NLP Tutorial For Beginners - S2 E9 - codebasics
Spacy's English large and medium size models are shipped with Glove word embeddings. In this tutorial, we will do news classification (fake vs real) using these word embeddings. We will train the Multinomial Naive Bayes classifier and KNN classifier to train the model and compare their performance. Code: https://github.com/codebasics/nlp-tut... Exercise: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python Dataset credit: https://www.kaggle.com/datasets/clmen... Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #spacytutorial #spacytutorialnlp #wordvectorsspacy #spacywordvectors
-
Word vectors in Gensim overview: NLP Tutorial For Beginners - S2 E10 - codebasics
Gensim is an NLP library where you can access some prebuilt word embeddings such as word2vec model trained on Google news corpus or GloVe model trained on data from twitter. In this video, we will get an overview of word vectors in gensim, explore some of the useful apis etc. Code: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. Need help building software or data analytics/AI solutions? My company https://www.atliq.com/ can help. Click on the Contact button on that website. 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #gensimtutorial #gensimtutorialnlp #wordvectorsgensim
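A minimal sketch using gensim's downloader API to load the Google News word2vec model mentioned above; the model name is real, but the example words are illustrative and the first call downloads a large file.

```python
# Loading pretrained word vectors through gensim's downloader API.
import gensim.downloader as api

wv = api.load("word2vec-google-news-300")     # large download on first use

print(wv.similarity("great", "good"))
print(wv.most_similar("king", topn=5))
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))  # ~ 'queen'
```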
-
News classification using Gensim word vectors: NLP Tutorial For Beginners - S2 E11 - codebasics
Using Gensim's word2vec model, we will classify news articles as fake vs real in this video. Code: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses. Need help building software or data analytics/AI solutions? My company https://www.atliq.com/ can help. Click on the Contact button on that website. 🔖Hashtags🔖 #nlp #nlptutorial #nlppython #gensimtutorial #gensimtutorialnlp #wordvectorsgensim #gensimwordvectors
-
fastText Tutorial | Train Custom Word Vectors in fastText | NLP Tutorial For Beginners - S2 E12 - codebasics
fastText is a word embedding technique similar to word2vec with one key difference. It uses the character n grams instead of words to train a neural network to produce word embeddings or word vectors. In this video, we will cover the following topics, ⭐️ Timestamps ⭐️ 00:00 What is fastText? 08:06 fastText installation 10:42 fastText pre-trained models 18:51 Generate word embeddings for Indian food recipes Code: https://github.com/codebasics/nlp-tut... Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlptutorial #nlppython #spacytutorialnlp #fasttexttutorial #wordembeddingsnlp #fasttexttutorialnlp #fasttext #nlpfasttexttutorial #nlpfasttext #fasttextnlp #textclassification #fasttextpythontutorial #fasttextclassification #fasttextwithpython
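A hedged sketch of training custom word vectors with the fasttext Python package; the file name recipes.txt is a placeholder for your own preprocessed corpus, not a file from the tutorial.

```python
# Training custom fastText word vectors (file name 'recipes.txt' is a placeholder).
import fasttext

# Unsupervised training; each line of recipes.txt is expected to be one preprocessed sentence.
model = fasttext.train_unsupervised("recipes.txt", model="skipgram", dim=100)

print(model.get_nearest_neighbors("paneer"))   # similar words via subword-aware embeddings
model.save_model("recipes_vectors.bin")
```

For supervised text classification (the topic of the next episode), the same package provides fasttext.train_supervised, which expects labeled lines in its __label__ format.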
-
fastText tutorial | Text Classification Using fastText | NLP Tutorial For Beginners - S2 E13 - codebasics
fastText is a word embedding technique similar to word2vec with one key difference. It uses character n grams instead of words to train a neural network to produce word embeddings or word vectors. In this NLP tutorial, we will do text classification for e-commerce items using fastText library Code: https://github.com/codebasics/nlp-tut... Exercise: will be available at some point in next few days after video post Complete NLP Playlist: • NLP Tutorial Python 🔖Hashtags🔖 #nlppython #nlptutorial #spacytutorialnlp #nlpfasttextclassification #textclassification #fasttext #fasttexttextclassification #fasttextexplained #fasttextclassification #fasttextpythontutorial #fasttextnlptutorial #fasttextexample #fastTexttutorial #wordembeddingsnlp
-
Introduction to Chatbots | NLP Tutorial | S3 E1 - codebasics
In this video, we will discuss: 1) Types of chatbots: (a) flow-based (goal-oriented dialogue) chatbots and (b) open-ended chatbots. 2) Implementation options (Dialogflow, RASA, Amazon Lex, IBM Watson Assistant, etc.) and custom development using the OpenAI API, LLMs, Hugging Face, etc. Complete NLP Playlist: • NLP Tutorial Python
-
End-to-End NLP Project | Build a Chatbot in Dialogflow | NLP Tutorial | S3 E2 - codebasics
Description: In this video, we will build a chatbot in Dialogflow for a food delivery system. It will be an end-to-end project covering Dialogflow basics, building a backend in python and Fastapi, interactions with MySQL database, and much more. We will cover Dialogflow fundamentals such as intents, entities, contexts, etc. Source code: https://codebasics.io/resources/end-t... Timestamps: 00:00 - 01.53: Introduction 01.54 - 4.23: Problem statement 4.24 - 10. 00: Scope of Work 10.01 - 12.23: Using ChatGPT for Solution Design & Architecture 12.24 - 14.00: Finalizing Chatbot Building Steps 14.00 - 17.14: Reason for choosing DialogFlow 17.30 - 20.27: Dialogflow setup 20.28 - 24.16: Dialogflow intents 24.17 - 28.09: Dialogflow entities 28.10 - 52.45: Dialogflow setup 52.46 - 01.04.48: Dialogflow contexts 01.04.50 - 01.06.22 : Fulfillment 01.06.23 - 01.10.14 : Database setup 01.10.15 - 01.15.10 : Backend setup 01.15.11 - 02:24:48: FastAPI Python Backend coding 02.24.49 - 02.49.38 - Website integration 02.49.29 - 02.55.04 Exercise and next steps 02.55.05 - Happy ending!
-