Deep-Dive into Tensorflow Activation Functions

Offered by
Coursera Project Network
In this Guided Project, you will:

Learn when, where, why, and how to use different activation functions, and which situations call for each

Code each activation function from scratch in Python

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

You've learned how to use Tensorflow. You've learned its most important functions, how to design and implement sequential and functional models, and you've completed several test projects. What's next? It's time for a deep dive into activation functions: the essential function at every node and layer of a neural network, deciding whether the node fires or not and, in most cases, adding the element of non-linearity. In this 2-hour project-based course, you will join me in a deep dive into an exhaustive list of activation functions usable in Tensorflow and other frameworks. I will explain how each activation function works, describe the differences, pros, and cons of each, and demonstrate each function in use, both from scratch and within Tensorflow. Join me and deepen your AI & machine learning knowledge while earning a certificate to boost your resume in the process! Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
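
To make the "fire or not fire" idea concrete, here is a minimal sketch (assuming TensorFlow 2.x with eager execution; the input values and weights are illustrative, not taken from the course) of how an activation reshapes a node's purely linear pre-activation:

    import tensorflow as tf

    # Pre-activation: a weighted sum of the inputs plus a bias (purely linear).
    x = tf.constant([[-2.0], [-0.5], [0.0], [0.5], [2.0]])  # five sample inputs
    w = tf.constant([[1.5]])
    b = tf.constant([0.25])
    z = tf.matmul(x, w) + b

    # The activation decides how strongly each node "fires"; a nonlinear choice
    # is what keeps stacked layers from collapsing into a single linear map.
    print("linear :", tf.keras.activations.linear(z).numpy().ravel())
    print("relu   :", tf.nn.relu(z).numpy().ravel())
    print("sigmoid:", tf.math.sigmoid(z).numpy().ravel())
    print("tanh   :", tf.math.tanh(z).numpy().ravel())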

Skills you will develop

  • Neural Network Activation Functions
  • Deep Learning
  • Artificial Neural Network
  • Python Programming
  • Tensorflow

Learn step-by-step

In a video that plays in a split screen with your work area, your instructor will guide you through these steps:

  1. Review the Activation Functions, Their Properties & the Principle of Nonlinearity

  2. Implementing Linear and Binary Step Activations

  3. Implementing Ridge-based Activation Functions (the ReLU family)

  4. Implementing Variations of ReLU & the Swish Family of Non-Monotonic Activations

  5. Implementing Radial-based Activation Functions (the RBF family); one representative function from each family is sketched after this list
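
As a taste of those steps, here is a minimal from-scratch sketch of one representative function from each family covered above (assuming TensorFlow 2.x; the leaky-ReLU alpha and Swish beta are illustrative defaults, and the exact formulations used in the course may differ):

    import tensorflow as tf

    def linear(x):
        # Identity: passes the pre-activation through unchanged.
        return x

    def binary_step(x):
        # Fires 1 for non-negative inputs, 0 otherwise.
        return tf.where(x >= 0.0, 1.0, 0.0)

    def relu(x):
        # Ridge-based: zero for negative inputs, identity for positive ones.
        return tf.maximum(0.0, x)

    def leaky_relu(x, alpha=0.01):
        # ReLU variation: a small negative-side slope avoids "dead" units.
        return tf.where(x >= 0.0, x, alpha * x)

    def swish(x, beta=1.0):
        # Non-monotonic: x * sigmoid(beta * x); with beta=1 this is tf.nn.silu.
        return x * tf.math.sigmoid(beta * x)

    def gaussian_rbf(x):
        # Radial-based: peaks at zero and decays symmetrically on both sides.
        return tf.exp(-tf.square(x))

    x = tf.linspace(-3.0, 3.0, 7)
    for fn in (linear, binary_step, relu, leaky_relu, swish, gaussian_rbf):
        print(f"{fn.__name__:12s}", fn(x).numpy().round(3))

As the description notes, each function is also demonstrated within Tensorflow itself; built-ins such as tf.nn.relu and tf.nn.silu are the natural counterparts of the from-scratch versions here.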

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required.

In a split-screen video, your instructor guides you step-by-step.

Frequently Asked Questions

More questions? Visit the Learner Help Center.