I would love some pointers to additional references for each video. Also, the instructor keeps saying that the math behind backprop is hard; what about an optional video covering it? Otherwise, awesome!
I highly appreciated the interviews at the end of some weeks. I am currently trying to transition from a research background in Systems/Computational Biology to work professionally in deep learning :)
by Ivan M•
The course is fantastic, but I did Andrew Ng's Machine Learning course before and I miss some things here.
First, this course is more direct and faster than the other one, and some basic concepts are not explained here, so I recommend doing Machine Learning first. Also, I miss the little questions inside each video (especially the ones that ask about ideas that are about to be explained and make you think a little more). They have been folded into the test at the end of the week, which now has 10 questions instead of 5. I also miss the lecture notes after the videos, which helped with the hardest concepts. The whole Machine Learning course seemed more inspiring than this one. As a little detail, I preferred the sans-serif font in the Machine Learning course slides to the one used here.
The other thing I don't like is the Jupyter notebooks. I get the point, and they should be a good tool for coding, learning, and evaluating the exercises, but I prefer the PDFs and the downloadable programming files. In the Machine Learning course you had a lot of structured Matlab/Octave files on your computer that you could easily reuse later. Here you have a document mixing text and code, and it is not clear where all the code files are or how to download them for later use. Also, I like to program in my own environment, with my preferred text editor (with autocompletion, color schemes, keyboard shortcuts...). Here you must use a basic online editor that is also hard to navigate with the keyboard, because the text parts are editable and selectable too, and you must jump from one part to another to move through the document. And you need to do so, because screens are only so big, and the explanations and other functions are long and end up far away from the code once you start programming. It's a very awkward way of working.
The programming exercises are very guided and you just fill in little snippets of code, which is not hard to do. They have to "cheat" by giving you only half the information you need for a formula, to make you think a little more; otherwise it would be too easy. But the whole structure of the program is already done, and although everything is very detailed in the comments, the fact that you don't program all of it doesn't help you understand the key concepts explained in the slides.
You must know some Python to be comfortable when coding, because there is no explicit material about the language syntax (in the Machine Learning course there was a video with a quick tour of Matlab/Octave, plus more optional short videos to learn the basics in less than one hour).
Anyway, the course materials are great and up to date, and the role of derivatives in the learning process is explained clearly here (I didn't understand their importance in the other course). I love the interviews with the Heroes of Deep Learning, which give you insight into how things are now and how they were before, explained by the people who invented the functions and tools we use today.
Andrew Ng is a great teacher.
by Stephen K•
Tying your shoelaces is easy... if you have two hands. Some reviewers say this course is easy too. But you will be confronted with multiplying matrices and some differentiation. More than anything, I found it difficult to keep track of the different matrices, and particularly their dimensions, which, as you will see if you take this course, is vital. There's also a lot of notation to overcome. You will need to understand some Python, particularly how to extract values from tuples or dictionaries, and being familiar with user-defined functions will also help. So, easy?
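To illustrate the dimension-tracking point: here is a minimal NumPy sketch (the layer sizes and variable names are made up for illustration, not taken from the course) showing how each layer's weight matrix shape must line up with the previous layer's activations.

```python
import numpy as np

# Hypothetical layer sizes: 4 input features, 3 hidden units, 1 output unit.
layer_dims = [4, 3, 1]
m = 5  # number of training examples (stored as columns)

rng = np.random.default_rng(0)
X = rng.standard_normal((layer_dims[0], m))  # shape (n_0, m)

A = X
for l in range(1, len(layer_dims)):
    W = rng.standard_normal((layer_dims[l], layer_dims[l - 1]))  # (n_l, n_{l-1})
    b = np.zeros((layer_dims[l], 1))                             # (n_l, 1), broadcast over m
    Z = W @ A + b        # (n_l, n_{l-1}) @ (n_{l-1}, m) -> (n_l, m)
    assert Z.shape == (layer_dims[l], m), "dimension mismatch caught early"
    A = np.tanh(Z)

print(A.shape)  # (1, 5): one value per example
```

Sprinkling shape asserts like this after each matrix product is a cheap way to catch the dimension mistakes the review is talking about.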
The course starts with a 0-layer neural network and builds up to a deep neural network. It's a nice way to ease yourself into what is clearly a complicated subject. The downside (at least for me) was that each week I was hit by yet more new notation, and I felt that some of what I'd been taught the previous week (and was clinging on to by my fingertips) was almost redundant. It made my head spin. Nonetheless, I persevered and passed the course.
So, I've gained an appreciation of roughly how a neural network works. I could not build a neural network from scratch without massive recourse to my notes and assignments, and plenty of time. Is this how people build neural networks, or do they use libraries (TensorFlow, Keras, etc.) to make the job much easier? Or can I use the final assignment as a template and apply it to many problems? I don't know.
I thought the notes were quite poor. By the end, there is a mountain of writing on most slides. I scribbled more notes to explain Andrew's notes; otherwise a week later they'd be as clear as Aramaic. However, Andrew repeats and explains well what's happening. He has a calm and reassuring manner, which I really liked.
People have complained about assignments being too easy. Not for me. I thought they were a good way to reinforce the lectures, and provided a means to see how a neural network could be built in practice. The assignments are more like lectures with your participation than traditional assignments. This is a plus point, in my view.
Finally, I'm still blown away how just a 'simple' logistic regression with sigmoid activation function can predict cats from random images so well. I've done the course, but it's like magic!
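As a sketch of the idea in that last paragraph (the weights and "pixel" values below are invented toy numbers, not the course's actual cat classifier): logistic regression is just a weighted sum squashed through a sigmoid, thresholded at 0.5.

```python
import numpy as np

def sigmoid(z):
    """Squash any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, X):
    """Predicted 'cat' label (0/1) for each column (example) of X."""
    A = sigmoid(w.T @ X + b)      # probabilities, shape (1, m)
    return (A > 0.5).astype(int)  # threshold at 0.5

# Toy data: 3 "pixels" per image, 4 images as columns, made-up weights.
X = np.array([[0.9, 0.1, 0.8, 0.2],
              [0.7, 0.2, 0.9, 0.1],
              [0.8, 0.3, 0.7, 0.2]])
w = np.array([[1.0], [1.0], [1.0]])
b = -1.5

print(predict(w, b, X))  # [[1 0 1 0]]
```

Of course the real trick (and what the assignments teach) is learning `w` and `b` from data by gradient descent rather than guessing them.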
by David R•
Overall the courses in the specialization are great and provide a great introduction to these topics, as well as practical experience. Many topics are explained clearly, with valuable insight from field practitioners, and you are given quizzes and code exercises that help deepen your understanding of how to implement the concepts in the videos. I would recommend taking them after the initial Andrew Ng ML course by Stanford, unless you have prior background in this topic.
There are a few shortcomings:
1 - the video editing is poor and sloppy. It's not too bad, but it can sometimes be a bit annoying.
2 - most of the exercises are too easy, and are almost copy-paste. I need to go over them and create variations of them in order to strengthen my practical skills. Some exercises are quite challenging though (especially in courses 4 and 5), and I need to go over them just to really nail them down, as things scale up quickly. Course 3 has no exercises as it's more theoretical. Some exercises have bugs, so make sure to look at the discussion board for tips (the final exercise has a huge bug that was super annoying).
3 - there are no summary readings; you have to (re)watch the videos to check something, which is annoying. This is partially offset because the exercises themselves usually contain a lot of textual summary, with equations.
4 - the 3rd course was a bit less interesting in my opinion, but I did learn some stuff from it. So in the end it’s worth it.
5 - Slide graphics and Andrew's handwriting could be improved.
6 - the online Coursera Jupyter notebook environment was a bit slow, and sometimes gets stuck.
Again overall - highly recommended
by Halil D•
Learning from reliable resources is crucial. Andrew Ng is ranked #3 in the field of Deep Learning on Google Scholar, in terms of number of citations. Therefore, being able to learn from a person like him is an extremely valuable chance. I learned a lot, but would like to point out the things that should be improved:
• There is a lot of redundant repetition. It kills the flow and creates a serious mess.
• Assignments focus only on filling in a few missing lines in the cells. Therefore, they cannot evaluate whether you "understand the big picture" and "can build a model on your own".
• Sometimes terms/concepts are not clearly explained, OR not explained at the right time. Example: a new term, "activations", comes up in a video and you wonder what it is. You only learn what it actually means, maybe in the next video, by your own inference.
Advice for learners: before starting a programming assignment, download its whole folder (you cannot download a folder directly, but you can download it file by file and recreate the original structure) and work on your own computer. This way, you avoid the "kernel disconnection" risk of the online version, and you can also replace the notes in the "Markdown" cells with your own summary. When you complete the programming assignment, you just need to copy the code from the "Code" cells back into the online version and submit.
by Shrihan D•
Fantastic course, great for newbies to get into machine learning; however, some prior experience with basic statistical learning algorithms (linear regression, logistic regression), basic linear algebra (vectors, matrices, matrix multiplication), and multivariable calculus (chain rule, partial derivatives) is required to extract as much as possible from this course. For the programming exercises, you need to know the fundamentals of Python programming (OOP is not necessary, and the course teaches you NumPy as you go along). The programming exercise in the final week went a little bit over my head with the caching of forward propagation values, but it was nevertheless a great course. On to course 2!
by Akif E S•
I think that when writing the helper functions, the expected outputs should be based on the same test and train data we use; the mismatch causes some misunderstandings. I know that when we don't use the asserts it takes time to see the output, but I think that is a sacrifice worth making.
Also, for students who know calculus well, the optional videos could be much more detailed, for example covering the dZ computation or the concepts of deep learning via calculus.
Apart from these two remarks, I think this was a really good course. My thanks to everyone who prepared these courses.
My best wishes.
by Nowroz I•
I loved this course as it explains the intuition behind the methods used in deep learning. As I have no problem with calculus and linear algebra, I was able to calculate the derivatives by myself. People who are not accustomed to working with NumPy may find the assignments overwhelming. Hence, my suggestion would be to learn NumPy (only the basics will do) before starting this course.
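In the spirit of that NumPy suggestion, here is a minimal sketch of the handful of basics the assignments lean on (broadcasting, matrix vs element-wise products, axis-wise sums); the array sizes are arbitrary examples.

```python
import numpy as np

# Broadcasting: a (3, 1) column added to a (3, 4) matrix expands automatically.
A = np.ones((3, 4))
b = np.array([[1.0], [2.0], [3.0]])
print((A + b).shape)        # (3, 4)

# Element-wise vs matrix product: a frequent source of assignment bugs.
W = np.arange(6).reshape(2, 3)
X = np.arange(12).reshape(3, 4)
print((W @ X).shape)        # (2, 4): matrix product
print((X * X).shape)        # (3, 4): element-wise square

# Axis-wise reductions with keepdims preserve the 2-D shape for broadcasting later.
print(X.sum(axis=1, keepdims=True).shape)  # (3, 1)
```

Knowing just these four patterns covers most of what the notebooks ask you to write.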
I give four stars because the course is great and so are the programming assignments. But I think the programming assignments were sometimes a little condescending and easy. Don't get me wrong, there were moments when I didn't know what to do, but there were also a lot of times when the whole procedure was spelled out.
This course really clarified my concepts of deep learning and how a neural network actually works.
by Shravan V•
The course exercises were very well thought out and well designed. However, the instructions were not crystal clear, which led me to errors in the notebook. In week 4's last assignment, it wasn't made clear that the function definitions I had written in the preceding assignment should not be cut and pasted into the notebook, and that the grading system would use its own function definitions; this caused grading errors in my submission. It took many hours to figure out what was wrong, with the help of one very helpful person (Paul Mielke) on the forum.
Andrew Ng's handwriting is TERRIBLE. He should either practice writing more clearly, or use slides.
I would have appreciated written lecture notes; having to take notes on the fly was hard, as I was sometimes watching the lectures on the train or during dialysis (one arm is disabled).
Is it really necessary for the deeplearning.ai logo to take up so much of the screen in the videos?
Just a comment on one important shortcoming of online instruction: as a professor who teaches statistics, I find it interesting to see the loss in learning that students experience through the absence of individualized feedback. One learns far more when one can talk to the teacher(s), and I guess this high-volume, high-throughput style of teaching limits what can be taught online.
by Omar A•
If you take this course after ML by Andrew, you will see exactly the same material that was covered there in 1 week expanded into 4 weeks, except using Python instead of Octave or Matlab.
If you have a calculus background, I expect you will find the lectures' elementary approaches, designed to avoid math and calculus, tedious.
The programming exercises in this course are very easy and below the level of the excellent first experience with the ML course.
There is no easy way to get the lecture slides, and there are no reading sections in this course. It feels like this course was made as a systematic approach to getting things done, without actual care for understanding the theories and concepts.
The good news: if you have no previous knowledge of neural networks and only elementary Python skills, then this course is an excellent way for you to start.
The content is great and I learned a lot. Certainly there could be a lot more feedback from the instructors in the forum; my feeling is that the students are really left on their own. Good from one point of view (because you really have no choice but to bang your head against the problem for days until you understand or give up), bad from another (it takes a lot longer to clarify difficult points). Fortunately the forum is populated by very clever students who take the time to answer questions. As a beginner, I learned the broad strokes and intuitions for neural networks in this course, but the details of certain formulas are still very obscure, and I was hoping for a better explanation of those.
by Trevor M•
The info is really good, but there's a lot of hand-holding in the assignments where it matters, and then no help afterwards.
The assignments might as well be a follow-along, one-day seminar, as opposed to bona fide challenging assignments. I can only hope that the later assignments get better as the material becomes more challenging.
I loved the assignments for the Machine Learning course with Andrew Ng (in Matlab), but these assignments are far too trivial, essentially just "fill in the blank". Perhaps, given that I've already taken that course, I should be looking for something more challenging than this one. The lectures, on the other hand, are very good.
by veit s•
Programming assignments are too easy, mostly copy and paste.
by Anne R•
The programming assignments provided a good framework in order to practice coding the main functions in a neural network. This was helpful to understand the matrix operations underlying the forward and backward processing in a general L layer network. Without a previous background in linear algebra and in neural networks however this course would be challenging and maybe very frustrating due to the limited debug information available.
The course videos need to be a lot more focused on the details being conveyed. The verbal and visual discussion and explanation provided is, in my opinion, not effective. The slides are cluttered and contain many errors, the verbal portion is like a casual conversation that repeats quite a bit, and the transcript provided for those who get tired of the repetition contains many transcription errors. I would recommend paying someone to correct the transcripts, to help those who prefer this way of working through the course material.
by Tracy B•
The notation used in the course was horrible and correct math notation should be used even if the course is not intended for math students.
I also feel this course should not be labeled as intermediate skill level. This was a very beginner level course. I have a PhD in applied math and was simply looking for knowledge in deep learning since my doctoral work was in a different field. It was very clear that I am WAY behind the target audience of this course. That's not necessarily a negative reflection on the course, but I still didn't find it very useful and feel like it should be labeled as a beginner level course.
by Jérôme B•
To me, this is a failed attempt at simplifying these concepts. After spending hours trying to figure it out, I now find the algorithm behind a neural network very simple, and I can easily explain it to someone. But in this course I had to figure out by myself what the point of those hundreds of lines of math was. So: very interesting concepts, but the "transmitting style" wasn't for me.
by Ofer B•
Very abstract, and the examples are not as concrete as they could be. I'd use better visuals to ensure that the concepts in each video are understood 100% visually.
by Muhammad A•
A great attempt, but it failed to provide complete details, specifically about the project files and their loading mechanism.
by Francis J•
Too easy; suitable as an entry-level class.
by Tobias G•
Too little detail; the mathematics is missing.
by Gaetano P•
The course is well structured and the explanation is linear and mostly clear, but:
1- In 2020 I expect a course like this to apply relatively modern teaching standards, for example avoiding handwritten text. What is the purpose of writing on the screen when you could use animations to connect concepts more clearly during the lessons?
2- I don't expect errors to just be flagged before the video starts. Re-upload the video! Errors like that during long formulas and explanations just kill the learning. It is pointless to note before the video that you will make an error later in it. Just correct it IN the video.
3- If you can't explain the calculus in depth, just do it with someone else's help. You cannot exclude calculus.
4- The only thing I've learned in this course is vectorization (thank you). The rest is just copying the formula given during the explanation (handwritten on the screen...) and pasting it during the exam. I didn't learn how to apply a neural network because during the "exams" it was already built. I expected assignments that would make me build and create every piece of the network; instead it was all already done, and all I had to do was repeat what Andrew says in the video. This is NOT learning. You would need an assignment per video for that kind of thing; you can't just move on after writing some formulas on the screen and pretend you have "explained it", because nothing seems explained to me. Why should I use those methods or formulas instead of others? Nothing is explained.
by David B•
This course is really quite bad. I'm not sure why the rating is so high; probably because they only prompt people who completed the course to rate it.
The main problem with the course is that it spends the majority of its time describing a byzantine set of notation while avoiding actually helping you understand how to apply the concepts you're learning. So you learn that a^[l](i) is the activation vector for layer "l" and example "i", but then you get to the Python portion and, big surprise, none of that information is even slightly useful.
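For what it's worth, the notation does map onto the Python fairly directly; a small sketch (array sizes here are made up) of how a^[l](i) corresponds to one column of a layer's activation matrix:

```python
import numpy as np

# Hypothetical activations for one layer: 3 units, m = 4 examples.
# In the course notation, "A^[l]" stacks the vectors a^[l](i), i = 1..m, as columns.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))   # A plays the role of A^[l], shape (n_l, m)

i = 2                              # course indexing is 1-based
a_i = A[:, i - 1:i]                # column i: the vector a^[l](i), shape (3, 1)
print(a_i.shape)                   # (3, 1)
```

The slicing `A[:, i - 1:i]` (rather than `A[:, i - 1]`) keeps the result a 2-D column vector, which matches the notation and broadcasts correctly.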
Even worse, the course hasn't chosen its audience. If you're good at math you'll be annoyed by the math explanations. If you're good at programming you'll be annoyed by the programming explanations. Rather than isolating that material in a way that lets people skip the parts they already understand, you get a really basic explanation of everything all globbed together.
Anyway, I'll still try to hack through this thing to finish it, I'm just letting you know that if you're underwhelmed, you're not alone.
by Richard R•
Meh. I don't know why we are spending so much time in Week 2 talking about the math and how to avoid FOR loops when he STILL hasn't given any kind of overview of why we do this math or how we're going to use it to identify cats in pictures. Instead, we're just yakking on about math, math, math with NO context whatsoever. If I wanted a math class, I would have taken a deep-in-the-weeds math class. I expected a higher level of instruction for this higher level of abstraction, but instead it seems that he just wants to talk about math and how to use vectors in NumPy. Zzzzzzzz.
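For readers wondering what the "no FOR loops" fuss is about: a small sketch (the sizes are arbitrary) comparing an explicit Python loop with a vectorized NumPy dot product, which is the point Week 2 is driving at.

```python
import time
import numpy as np

m = 100_000
rng = np.random.default_rng(0)
w = rng.standard_normal(m)
x = rng.standard_normal(m)

# Explicit Python loop: one multiply-add per iteration, interpreted.
t0 = time.perf_counter()
total = 0.0
for i in range(m):
    total += w[i] * x[i]
loop_time = time.perf_counter() - t0

# Vectorized dot product: one call into optimized compiled code.
t0 = time.perf_counter()
total_vec = w @ x
vec_time = time.perf_counter() - t0

print(np.isclose(total, total_vec))  # True: same answer
print(vec_time < loop_time)          # True: the vectorized version is much faster
```

The speedup is typically orders of magnitude, which is why every assignment in the course insists on writing the math as matrix operations instead of loops.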