
Learner reviews and feedback for Natural Language Processing with Sequence Models by DeepLearning.AI

4.5
1,175 ratings

About the Course

In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper...
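For readers unfamiliar with the ‘Siamese’ setup mentioned above, the sketch below shows the general idea in TensorFlow/Keras for illustration only (the assignments themselves have used Trax and, per a 2024 review below, more recently TensorFlow): two questions are encoded by the same shared encoder and compared by cosine similarity. The vocabulary size, layer sizes, and layer choices are assumptions, not the course's assignment code.

```python
# Minimal sketch of a "Siamese" duplicate-question model: illustrative only,
# not the course's assignment code. Vocabulary size and layer sizes are made up.
import tensorflow as tf

vocab_size, embed_dim, lstm_units = 10_000, 128, 128

# Shared encoder: both questions go through the *same* weights.
encoder = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(lstm_units),
    tf.keras.layers.Lambda(lambda v: tf.math.l2_normalize(v, axis=-1)),
])

q1 = tf.keras.Input(shape=(None,), dtype="int32")  # tokenized question 1
q2 = tf.keras.Input(shape=(None,), dtype="int32")  # tokenized question 2

# Cosine similarity of the two normalized encodings; duplicates should score near 1.
similarity = tf.keras.layers.Dot(axes=1)([encoder(q1), encoder(q2)])
model = tf.keras.Model(inputs=[q1, q2], outputs=similarity)
```

Because the two branches share weights, questions with the same meaning tend to map to nearby vectors; such models are typically trained with a triplet-style loss rather than plain classification.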

Top reviews

AB

Nov 11, 2021

This is the third course of the NLP Specialization. This was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.

SA

Sep 27, 2020

Overall it was a great course. A little bit weak in theory, but I think it was sufficient for practical purposes. The question-duplication detection model was very cool. I enjoyed it a lot.

Filter by:

26 - 50 of 241 Reviews for Natural Language Processing with Sequence Models

By Julio W

Jun 28, 2021

I learned to hate Trax in this course. In the assignments, Trax is used only for toy problems, and then we use a precomputed model. Even fastnp is used in a very slow mode. Why learn NLP in an obscure and really badly documented framework if we end up using precomputed models anyway?

Moreover, when I tried to replicate the results on my own machine (or even in Colab), it did not work, because Trax changes a lot between versions. Again, why use a framework that is not stable in a course?

In my opinion, using a new and obscure framework to teach new concepts, only because you love it, is (at least) antipedagogical.

By Jorge A C

Oct 13, 2020

As other course reviewers noted, this course did not do much to build the intuition underlying the methods used. The video lectures were short, and the explanations, though concise, were convoluted and not clear at all. For a real understanding of what sequence models are capable of, I recommend watching the lecture videos of Stanford CS224N.

By DANG M K

Aug 29, 2020

This course material is not good compared to the Deep Learning Specialization. I hope the instructor will write things down to explain the details, not just read from the slides.

By bdug

Apr 23, 2021

I was disappointed by this course:

I did not like the use of Trax at all. At our level (students), we need a well-established and well-documented library like Keras or PyTorch to illustrate the concepts. Trax is badly documented. And since installation of the Trax version used in the assignments fails in Google Colab (!!), I had a hard time reproducing the assignments there.

Week 3 is just a scam since it says "go and read this blog" or "watch this video in another specialization". At that moment I simply felt robbed.

By Dimitry I

Apr 14, 2021

Very superficial course, just like the rest in the specialization. Quizzes and assignments are a joke. Didn't want to give negative feedback at first, but now that I am doing course #4 in the specialization, which covers material I don't know much about (Attention), I've realized how bad these courses are. Very sad.

By John Y

Jan 6, 2022

This was another great course. I had previously put learning or reviewing classes on my to-do list, and I was happy to see that covered here. I enjoyed learning about data manipulation, sampling, the iteration/generation process, and Trax. At first I was a little hesitant about learning a new library like Trax, but I found Łukasz's talk helpful and convincing. I feel Trax does simplify the coding process quite nicely. The homework seemed repetitive, but I found that approach very useful because I think the intent was to help us get familiar with the coding process and Trax more quickly. I previously completed the DL Specialization and appreciated this course very much. Imo, someone new to DL and RNNs might find this course confusing because the concepts are not explained in as much depth as in the DL course.

By Tay J

Sep 1, 2024

Didn't like that the labs are in TensorFlow instead of PyTorch, but that is a personal preference. Otherwise, it is a great course, especially the last week, in which the basics of metric learning are smoothly explained.

By Baurjan S

Sep 26, 2020

Great course as usual. I tried the Siamese models but got very different results. I will need to study the conceptual side and the implementation behind them more. But overall, I am glad I got to touch LSTMs.

By Sudharsan

Aug 16, 2020

Learning about the Trax library and solving practical problems with it was really interesting. The Siamese network architecture was a great thing to learn.

By Zoltan S

Aug 1, 2020

This is an excellent course with some cutting-edge material, and also an introduction to a new deep learning framework, Trax.

By Jerry C

Oct 11, 2020

Great course! However, the assignments are too much hand-holding, step by step... I'd prefer the assignments to let students think more for themselves when implementing functions, etc. (and only unhide hints or seek help on Slack when struggling for a long time).

By Swakkhar S

Sep 23, 2020

The first two courses were much better. This one introduces Trax, which is great. However, the material of this course is already covered in the 5th course of the Deep Learning Specialization. On the whole, a great course and a great effort by the team.

By JJ Y

Sep 26, 2020

Sequence models are heavy subjects, and it would be unrealistic to expect a 4-week course to go into all the depths of RNNs, GRUs, LSTMs, etc. This course does a great job covering important types of neural networks and showing their applications. However, the labs and assignments could have done more in (a) helping us look a little deeper into the implementations of different NN building components, and (b) aligning better with the lecture videos.

Really good examples: the Week 1 labs and assignment illustrate implementations of some of the basic layer classes and outline the overall flow of NN training with Trax. The Week 4 labs and assignment illustrate the implementation of the loss layer based on the triplet loss function (a short sketch of the triplet-loss idea follows this review).

Not so good examples: Week 1 spends a whole video explaining gradient calculation in Trax, yet there is no illustration of how it is integrated into backpropagation in Trax. The Week 2 videos and the labs/assignment are more disjoint: there is a video explaining the scan() function, but it does not show up in the assignment at all.
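Since several reviews mention the triplet loss used to train the Siamese model, here is a minimal sketch of the idea, again in TensorFlow for illustration; the function name and margin value are assumptions, and this is not necessarily the exact formulation used in the course's assignment.

```python
# Minimal sketch of a triplet-style loss for Siamese encodings: illustrative only,
# not necessarily the exact formulation used in the course's assignment.
import tensorflow as tf

def triplet_loss(anchor, positive, negative, margin=0.25):
    """anchor, positive, negative: (batch, d) L2-normalized encodings.
    `margin` is an assumed hyperparameter, not a value taken from the course."""
    pos_sim = tf.reduce_sum(anchor * positive, axis=-1)  # similarity to a true duplicate
    neg_sim = tf.reduce_sum(anchor * negative, axis=-1)  # similarity to a non-duplicate
    # Hinge: penalize only when the non-duplicate is not at least `margin` less similar.
    return tf.reduce_mean(tf.maximum(0.0, margin - pos_sim + neg_sim))
```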

By Sina M

May 14, 2023

Compared to prior DeepLearning.AI courses, the lecturers were very robotic and unnatural. The explanations were much less clear, and less effort was made to explain the intuitions behind the formulas.

By Benjamin W

Jul 13, 2023

I cannot give a good rating for this specialization. The teaching style seems antiquated to me. The instructors read text like an AI from a teleprompter (you can even see their eyes moving). Sometimes the audio quality is not good ("plop" noises because a bad microphone or headset was used). The Jupyter Notebooks for the assignments often hang. Moreover, I think the quality of the code could be better. Be warned: all the exercises are based on Trax, a Python library that is no longer maintained. So you will learn it here and never use it again. I'm working through the Specialization because my employer paid for it. I can't say it's very motivating. Despite my negative review, I learned something from it.

By Kota M

Aug 21, 2021

Sadly, the quality of the material is much lower than in the previous two courses. The assignments repeatedly ask you to implement data generators with a lot of for-loops. We should focus more on the network architecture rather than on Python programming. That being said, the implementation is not good either; learners will have to learn to program anyway.

By Patrick C

Dec 23, 2020

Assignments are very difficult to complete because of inaccurate information (off-by-one errors on indices and other sloppy mistakes). You also don't learn much from them because almost all the code is already provided. It would be much better if they built up your understanding from first principles instead of rushing through fill-in-the-blank problems.

By Mostafa E

Dec 13, 2020

The course did well in explaining the concepts of RNNs... but it may in fact have provided less knowledge than the NLP course in the Deep Learning Specialization.

I was looking forward to seeing more details on how translation works using LSTMs, a look at some famous LSTM networks such as GNMT, and an explanation of accuracy measures such as the BLEU score.

By Artem R

Dec 1, 2020

The course could be completed without watching the videos, just by using the hints and comments in the assignments. The videos are short and shallow, and the choice of deep learning framework (Trax) is questionable; I won't use it in production.

Although the course is 4 weeks long, it could be completed in 4 days. I don't feel it was worth the time.

By George L

Mar 21, 2021

Compared with the Deep Learning Specialization, this specialization was designed in a way that nobody can understand. Although the assignments could be easy at times, the point is missed when people cannot really understand and learn. Bad teacher. Andrew Ng, please!

By Youran W

Dec 3, 2020

All the assignments are extremely similar.

By Xinlong L

Aug 22, 2021

I did not enjoy the course at all. It looks like the instructor is just reading material rather than really teaching. He just focused on reading and did not explain anything. I took Andrew's Deep Learning Specialization, and that course was really great, but I am so disappointed in this course. deeplearning.ai, please do strict quality control on your courses; otherwise it harms your brand.

By Yanting H

Oct 13, 2020

Oversimplified illustrations of all the core definitions, and it makes no sense to use Trax instead of a popular framework like TensorFlow or PyTorch for the assignments. Also, the design of the assignments is weak; you can barely learn anything from filling in the blanks.

By Ngacim

Nov 27, 2020

1) The course videos just throw out various terms, and you need to Google them to understand what they mean.

2) The assignments try their best to explain concepts in a way that often seems redundant.

By Emanuel D

Jan 26, 2021

For me, it is very disappointing; time is spent on irrelevant things, like Python syntax and generators in the first week. Video tutorials on how to use Trax are missing.