Learner reviews and feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4
1,082 ratings

Course Overview

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into Portuguese using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text! Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed Course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top Reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both of the LSH attention layer ungraded labs difficult to understand. Thanks

SB

Nov 20, 2020

The course is very comprehensive and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

Filter by:

101 - 125 of 260 Reviews for Natural Language Processing with Attention Models

By wei z

Dec 28, 2023

This NLP specialization is very well designed. It refreshed the AI material I learned at school years ago, and I learned new things here.

By Nicolas F V D

Sep 23, 2021

It's a great way to get started with state-of-the-art NLP techniques; following the recommended papers is extremely useful.

By Dmitri

Dec 20, 2020

Great course! I understood a lot and got valuable experience working with state-of-the-art architectures 👍

By Umberto S

Apr 17, 2021

Really practical course. It covers what seems to be the SOTA in NLP, touching on Transformers, BERT, T5, and Reformers. So I think it's worth it.

By Olawale S

Mar 18, 2024

I would give this class 6 stars if possible. Topics were well explained and at a good pace. Thanks to the team @DLAI

By Ovidio M M

Apr 17, 2021

This is a highly recommendable course for understanding state-of-the-art NLP techniques and models using neural networks.

By Shahin Z

Oct 27, 2020

Everything was great.

Slides & notebooks/exercise were amazing

The content is superb and very up-to-date.

By rijal g

Jul 6, 2024

From this course, I got the first principles of the attention model and the foundations of large language models!

By Eliude V

Jun 6, 2024

Excellent course, the content is a 10/10. You really do learn what they set out to teach 👍🏽

By Ruiliang L

May 31, 2021

The course is good. It would be great if we could download the PowerPoint slides and the Jupyter notebook files.

By Ajay G

Jul 21, 2021

Nice course to get the details of attention with the latest state-of-the-art deep learning models.

By Ricardo A

Apr 26, 2024

I recommend the NLP Attention Models course, as it now incorporates TensorFlow for the assignments.

By Martin P

Mar 22, 2021

Great course with great lecturers. The lecturers clearly showed how far NLP research has come.

By Syed M F R

Oct 10, 2020

Loved the last week of the course; it stood out amongst the other 15 weeks of the specialization.

By Hoang V N

Jun 2, 2025

Easy and straightforward explanations; this is one of the first courses I could understand.

By Snehasish S

Aug 24, 2023

Very good course for understanding the concepts of the Transformer model and transfer learning.

By B S

Oct 19, 2020

Critical for keeping abreast of state-of-the-art models in NLP and new frameworks like Trax.

By Pranjal K

Dec 5, 2023

Amazing. I got the hard topics thanks to the very clear descriptions. Huge thanks and shoutout :)

By Kam K

Aug 18, 2021

I liked the BERT sections and the references to the theory behind positional encoders.

By Yun-Chen L

Nov 18, 2020

It's a good course; you can learn a lot of models and basic concepts like attention.

By Björn R

Oct 17, 2020

This course made the latest technology of NLP easy to understand and implement.

By Madhur G

Oct 13, 2020

It is great. It helps us to learn and implement the latest NLP architectures.

By Frankenstyle

Dec 30, 2020

One of the most comprehensive NLP courses around, with challenging quizzes.

By Bas v d R

Oct 20, 2020

Incredibly interesting course showing state-of-the-art language modelling

By Susan M

May 25, 2023

Excellence in integrity of instruction including methods and community.