SA
27 Sep 2020
Overall it was a great course. A little weak on theory, but I think it was sufficient for practical purposes. The question-duplication detection model was very cool; I enjoyed it a lot.
AU
11 Nov 2021
This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.
by Ahnaf A K
•6 Aug 2020
It was a bit repetitive of the 'Sequence Models' course from the Deep Learning Specialization, the only difference being the implementation in TRAX.
by Nishank L
•14 Nov 2021
The assignments are good. Could we have them in PyTorch? Or better: could a person choose their own language and build the entire code in it?!
by Osama A O
•19 Oct 2020
Great course, although it would have been better if the assignments were implemented in Keras or PyTorch. Otherwise, definitely worth it!
by Matthew P
•7 Jan 2021
Great information, but some of the assignments had errors and there wasn't much interaction from the TAs on Slack or the forum.
by Marc G
•10 Feb 2022
Great course! I would have liked Keras/TensorFlow 2.x or PyTorch to be used instead of Trax, which is not as widely used.
by Manuela D
•14 Mar 2022
Interesting and well explained, although the assignment exercises are difficult to understand and focus only on Trax.
by Mohsen A F
•24 Oct 2020
The clarity of exposition was superb! One star less for using TRAX; I would rather have used Keras or TensorFlow.
by Saurabh K
•24 May 2021
A little more detail on the dimensions of the inputs and outputs of the sequence models would have been welcome.
by Mridul G
•14 Jul 2021
The course is very good, but it is not complete in itself. The way the course was taught, and everything else, is good.
by Hair P
•20 Nov 2020
Overall the content was great. Please make sure that errors in the notebooks are corrected.
by RAHUL K
•18 Sep 2020
The course is designed quite well to build an in-depth understanding of sequence models.
by Steve H
•3 Apr 2021
Excellent course, but probably worth doing the deep learning specialisation first!
by Ke Z
•24 Feb 2021
I don't like using TRAX. If it used TensorFlow, I would give it 5 stars.
by Alireza S
•11 Dec 2021
I would prefer that the lecturer use TensorFlow instead of Trax for the exercises.
by kerolos E
•23 Mar 2022
Almost perfect. More explanation of the implementation is needed.
by Vitalii S
•21 Jan 2021
Good information, but some assignments were an embarrassment.
by Nikita M
•7 Dec 2020
Not as good as the original courses by Andrew.
by Gonzalo A M
•14 Jan 2021
It was good, but it could be better.
by Ruiwen W
•1 Aug 2020
Some errors in the assignments.
by V B
•24 Sep 2020
NA
by Yaron K
•29 Apr 2022
The fourth week, on Siamese networks, was well done. The weeks on RNNs, GRUs, and LSTMs basically gave the equations and some intuition, but most of the emphasis was on building a model with them using Google's TRAX deep learning framework, which the lecturers believe to be better than TensorFlow 2. At least when it comes to debugging, it isn't: make the smallest error (say, with shape parameters) and you get a mass of error messages that don't really help. For shape errors, at least, there is no excuse for this, since all that is needed is a check on the first batch of the first epoch that pinpoints exactly where the shape discrepancy is.
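The fail-fast check this reviewer is asking for can be written independently of the framework. Below is a minimal sketch, assuming a Python generator that yields (inputs, targets) NumPy arrays; the batch_size and max_len parameters are illustrative, not taken from the course notebooks.

    import numpy as np

    def check_first_batch(data_generator, batch_size, max_len):
        # Pull the first batch of the first epoch and fail fast with a readable
        # message instead of a framework stack trace deep inside training.
        inputs, targets = next(data_generator)
        assert inputs.shape == (batch_size, max_len), (
            f"inputs: expected {(batch_size, max_len)}, got {inputs.shape}")
        assert targets.shape[0] == batch_size, (
            f"targets: expected a batch of {batch_size}, got {targets.shape[0]}")
        print("first-batch shapes OK:", inputs.shape, targets.shape)

    # Toy usage with a dummy generator; the shapes are arbitrary for the demo.
    dummy = iter([(np.zeros((32, 64), dtype=np.int32), np.zeros(32, dtype=np.int32))])
    check_first_batch(dummy, batch_size=32, max_len=64)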
by Amlan C
•9 Oct 2020
Despite the theoretical underpinnings, I do not feel this course lets you write an NER algorithm on your own. The majority of these courses use data supplied by Coursera, and the same goes for the models. In real life we have to either create this data ourselves or use some open-source data, for example from Kaggle. I think it would be better to orient the course around publicly available data, with models trained by students that can be used for actual analysis.
by Maury S
•8 Mar 2021
Like some of the other courses in this specialization, this one has promise but so far comes off as a somewhat careless effort compared to the usual quality of content from Andrew Ng. The lecturers are OK but not great, and it is unclear what the role of Lukasz Kaiser is beyond reading introductions to many of the lectures. There is a strange focus on simplifying with the Google Trax framework at the cost of not really teaching the underlying maths.
by Petru R
•13 Apr 2022
The course requires a solid background in deep learning; it does not explain LSTMs in detail, or how the programming keeps the weights of the two branches of the Siamese network identical.
Does Trax provide other ways of generating training data for Siamese networks besides writing a custom function?
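On the question of identical weights: the usual trick is to pass the same layer object to both branches, so there is only one set of parameters to update. Below is a minimal sketch in the style of trax.layers (Serial, Parallel, Embedding, LSTM, Mean); the exact model in the course notebook may differ, so treat the signatures and dimensions as assumptions.

    from trax import layers as tl

    def Siamese(vocab_size, d_model=128):
        # One branch: token ids -> embeddings -> LSTM -> mean over time steps.
        branch = tl.Serial(
            tl.Embedding(vocab_size, d_model),
            tl.LSTM(d_model),
            tl.Mean(axis=1),
        )
        # The same `branch` object is used twice, so both questions are encoded
        # with one shared set of weights; the two sides cannot diverge in training.
        return tl.Parallel(branch, branch)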
by Business D
•14 Dec 2020
I regret the lack of proper guidance in the coding exercises, compounded by the incomplete documentation of the trax library. I also feel we could build models with greater performance; an accuracy of 0.54 for identifying question duplicates doesn't seem to be state of the art...
You could do better!