DeepLearning.AI

Natural Language Processing with Attention Models

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into Portuguese using an encoder-decoder attention model,
b) Build a Transformer model to summarize text,
c) Use T5 and BERT models to perform question-answering.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text!

Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Skills: PyTorch (Machine Learning Library), Embeddings
Intermediate level course

Featured Reviews

RJ

4.0 · Reviewed on Sep 28, 2020

Not up to expectations. Some topics need more explanation; they were difficult to understand, and examples might have helped!

JH

5.0 · Reviewed on Oct 4, 2020

Could the instructors perhaps make a video explaining the ungraded labs? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks.

SB

5.0 · Reviewed on Dec 31, 2020

One of the best courses I have ever taken. The course provides in-depth learning of Transformers, from the creators of the Transformer.

AT

4.0 · Reviewed on Oct 14, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.

LL

5.0 · Reviewed on Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the entire projects can be downloaded easily.

MN

4.0 · Reviewed on Mar 27, 2021

The course covers cutting-edge content and the exercises are well paced. I found the Transformer lessons a bit difficult to understand.

DS

5.0 · Reviewed on Apr 28, 2023

The course is great; you should definitely check it out. It will give you deep insight into natural language processing.

SB

5.0 · Reviewed on Nov 20, 2020

The course is very comprehensive and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

MS

4.0 · Reviewed on Oct 2, 2020

Good course; it covers everything, I guess. The only downside for me is the Trax portion; I would have preferred it in TensorFlow, maybe. But still, great job.

SN

5.0 · Reviewed on Apr 9, 2023

This is a very useful course for learning attention techniques. The teacher explained everything very clearly. The assignments were really exciting.

ND

5.0 · Reviewed on Sep 23, 2021

It's a great way to get started with state-of-the-art NLP techniques; following the recommended papers is extremely useful.

WZ

5.0 · Reviewed on Dec 28, 2023

This NLP Specialization is very well designed. I refreshed what I learned at school years ago, and learned new things here.

All Reviews

Showing: 20 / 263

Xu Ouyang · 1.0 · Reviewed on Sep 26, 2020
Lucas Fernandes · 2.0 · Reviewed on Sep 27, 2020
Shikhin Mehrotra · 1.0 · Reviewed on Sep 28, 2020
Konstantinos Krommydas · 2.0 · Reviewed on Oct 5, 2020
Boris Kabakov · 1.0 · Reviewed on Sep 25, 2020
Ryan Baten · 2.0 · Reviewed on Oct 6, 2020
Vincent Fritsch · 1.0 · Reviewed on Nov 27, 2020
Ravi Shankar Karedla · 3.0 · Reviewed on Oct 6, 2020
D. Refaeli · 1.0 · Reviewed on Mar 22, 2021
Eitan Israeli · 2.0 · Reviewed on Oct 2, 2020
Han-Chung Lee · 2.0 · Reviewed on Oct 4, 2020
Jeremy Ong Chun Hooi · 5.0 · Reviewed on Oct 5, 2020
Paul Jay Ledbetter III · 1.0 · Reviewed on Nov 3, 2020
Muhammad Maiz Ghauri · 1.0 · Reviewed on Dec 5, 2020
Brooke Fujita · 1.0 · Reviewed on Nov 8, 2020
Siddharth Shukla · 1.0 · Reviewed on Sep 19, 2021
Logan Markewich · 3.0 · Reviewed on Apr 15, 2021
Jesús Díaz Martín · 2.0 · Reviewed on Nov 11, 2020
Jorge Antonio Chan-Lau · 3.0 · Reviewed on Oct 28, 2020
Raviteja Reddy Ganta · 3.0 · Reviewed on Oct 14, 2020