Learner reviews and feedback for Natural Language Processing with Sequence Models by deeplearning.ai

4.5 stars
910 ratings
184 reviews

About the course

In Course 3 of the Natural Language Processing Specialization, you will: a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets, b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model, c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
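
For readers wondering what these models look like in code, the sketch below is a minimal, illustrative Siamese LSTM encoder written with Trax, the library this course uses. It is not the course's own notebook code, and the names and hyperparameters (Siamese, vocab_size, d_model) are assumptions chosen for the example.

    from trax import layers as tl
    from trax.fastmath import numpy as fastnp

    def Siamese(vocab_size=40000, d_model=128):
        """Two weight-sharing LSTM encoders that map questions to unit-length vectors."""
        def normalize(x):
            # L2-normalize each encoding so cosine similarity reduces to a dot product.
            return x / fastnp.sqrt(fastnp.sum(x * x, axis=-1, keepdims=True))

        q_processor = tl.Serial(
            tl.Embedding(vocab_size=vocab_size, d_feature=d_model),  # token ids -> dense vectors
            tl.LSTM(n_units=d_model),                                # encode the token sequence
            tl.Mean(axis=1),                                         # average over time steps
            tl.Fn('Normalize', normalize),                           # unit-length question vector
        )
        # Applying the same sub-network to both questions is what makes it 'Siamese':
        # the two branches share weights, so similar questions land near each other.
        return tl.Parallel(q_processor, q_processor)

A model along these lines is trained with a triplet-style loss so that duplicate questions end up with higher cosine similarity than unrelated ones.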

Top reviews

SA

Sep 27, 2020

Overall it was a great course. A little bit weak in theory, but I think it was sufficient for practical purposes. The question-duplication detection was a very cool model. I enjoyed it a lot.

AU

Nov 11, 2021

This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.

176 - 192 of 192 Reviews for Natural Language Processing with Sequence Models

by Rajaseharan R

Mar 9, 2022

Too much focus on the data generator in the assignments. There should be a library function in Trax to do it; you might have to do some data preparation beforehand, but the generator should be a standard library function. Also, I had hoped to learn a bit more in depth about entity labelling.

by Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology; you need to understand the methodology before following the course material. Also, the introduction to Trax is fine, but I would prefer a version of the assignments in TensorFlow.

by Irakli S

Jul 2, 2022

Good videos; however, the assignments weren't up to par with them. Often I had to implement things that weren't closely related to the videos, while the material that actually appeared in the videos was already implemented.

by A V A

Nov 13, 2020

Good course teaching the applications of LSTMs/GRUs in language generation, NER, and matching duplicate questions using Siamese networks. It would have been more helpful if the topics had more depth.

by J N B P

Mar 16, 2021

This course is good for practical knowledge, with really good projects, but it lags in the theoretical part; you must already be familiar with the concepts to get the most out of it.

by Nguyen B L

Jul 5, 2021

I am now confused by too many deep learning frameworks. Also, some of the content repeats the Deep Learning Specialization.

by shinichiro i

Apr 24, 2021

I just want them to use Keras, since I have no inclination to study a shiny new framework such as Trax.

by YuLin D

Jul 22, 2022

Great course! But it would be better if it used TensorFlow or PyTorch; Trax is not very friendly to Mac users.

by martin k

Apr 26, 2021

Lectures are quite good, but the assignments are really bad. Not helpful at all.

by Deleted A

Jan 3, 2021

The assignments were easy and similar to one another. I learned less than expected.

by Alberto S

Nov 1, 2020

The content is interesting, but some details are under-explained.

by Ashim M

Nov 22, 2020

Would have been better with a better-documented library.

by Mahsa S

Mar 26, 2021

I would prefer to learn more about NLP in PyTorch.

by Leon V

Sep 28, 2020

Grader output could be more useful.

by Greg D

Dec 24, 2020

Spends a lot of time going over tedious implementation details rather than teaching interesting NLP topics and nuances, especially in the assignments. Introduction to Trax seems to be the only saving grace, one bonus star :)))).

For a course that lists Andrew Ng's course as suggested background, this is a big step (read: fall) down.

by Miguel Á C T

Mar 19, 2021

The course is good as an example of code that executes tasks correctly; that is, you can see how neural networks are defined and used in Trax. However, from a pedagogical point of view, I find it quite weak. Concepts are poorly explained and notebooks consist of little more than copying and pasting previously displayed code.

by Manikandan N

Jul 26, 2022

Useless. I would strongly recommend that you don't enroll in this course. None of the content is properly covered. For your kind notice, Andrew Ng is not the instructor of this course.