Logistic Regression with NumPy and Python
11,792 already enrolled
Welcome to this project-based course on Logistic Regression with NumPy and Python. In this project, you will do all the machine learning without using any of the popular machine learning libraries such as scikit-learn and statsmodels. The aim of this project is to implement all of the machinery of logistic regression yourself, including gradient descent and the cost function, so that you gain a deeper understanding of the fundamentals. By the time you complete this project, you will be able to build a logistic regression model using Python and NumPy, conduct basic exploratory data analysis, and implement gradient descent from scratch. The prerequisites for this project are prior programming experience in Python and a basic understanding of machine learning theory.

This course runs on Coursera's hands-on project platform, Rhyme. On Rhyme, you work on projects directly in your browser: you get instant access to pre-configured cloud desktops containing all of the software and data you need, so you can focus purely on learning. For this project, you'll get instant access to a cloud desktop with Python, Jupyter, NumPy, and Seaborn pre-installed.
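Since the whole point of the project is writing this machinery by hand, a minimal sketch of what you will build might look like the following. The function names, toy data, and hyperparameters are illustrative assumptions for this page, not the course's actual notebook:

```python
import numpy as np

def sigmoid(z):
    """Logistic function: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, theta):
    """Binary cross-entropy cost, averaged over the m training examples."""
    m = len(y)
    h = sigmoid(X @ theta)
    eps = 1e-12  # guard against log(0)
    return -(y @ np.log(h + eps) + (1 - y) @ np.log(1 - h + eps)) / m

def gradient_descent(X, y, theta, lr=0.1, iters=2000):
    """Batch gradient descent: repeatedly step theta against the cost gradient."""
    m = len(y)
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ theta) - y) / m  # gradient of the cost w.r.t. theta
        theta = theta - lr * grad
    return theta

# Toy, linearly separable data: a bias column plus two features (illustrative only)
X = np.array([[1.0, 0.5, 1.5],
              [1.0, 1.0, 1.0],
              [1.0, 1.5, 0.5],
              [1.0, 3.0, 3.5]])
y = np.array([0.0, 0.0, 1.0, 1.0])

theta = gradient_descent(X, y, np.zeros(X.shape[1]))
print("cost:", cost(X, y, theta))
print("predictions:", (sigmoid(X @ theta) >= 0.5).astype(int))
```

The course walks through each of these pieces step by step; the sketch above just shows how few moving parts there are when everything is expressed as NumPy array operations.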
Your instructor will guide you step by step, via a split-screen video in your workspace:
Your workspace is a cloud desktop right in your browser; no download is required.
by PP, Apr 3, 2020
Thank You... Very nice and valuable knowledge provided.
by MS, Apr 1, 2020
The problem was that Rhyme could not run for more than the allotted time; I had many errors along the way, because of which I couldn't complete my whole code in the given time.
by CB, May 23, 2020
It's a good course. The instructor is good. A lot of concepts were clarified, and there was enough practice.
by MV, Nov 7, 2021
Explains all the basic components of gradient descent well. Exactly as advertised.