Back to Generating New Recipes using GPT-2

Learner Reviews & Feedback for Generating New Recipes using GPT-2 by Coursera Project Network

4.3 stars
32 ratings
4 reviews

About the Course

In this 2-hour-long project, you will learn how to preprocess a text dataset of recipes and split it into training and validation sets. You will learn how to use the HuggingFace library to fine-tune a deep generative model, and specifically how to train such a model on Google Colab. Finally, you will learn how to use GPT-2 effectively to create realistic and unique recipes from lists of ingredients based on the aforementioned dataset. This project aims to teach you how to fine-tune a large-scale model and to convey the sheer magnitude of resources it takes for these models to learn. You will also learn about knowledge distillation and its efficacy in use cases such as this one. Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
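For readers curious about what this workflow looks like in practice, here is a minimal sketch of fine-tuning GPT-2 on recipe text with the Hugging Face Trainer API and then generating a recipe from a list of ingredients. The recipe strings, the `<INGR> ... <RECIPE> ...` prompt format, and all hyperparameters are hypothetical placeholders for illustration, not the course's actual dataset, prompt format, or settings.

```python
# Minimal sketch: fine-tune GPT-2 on recipe strings, then generate from a prompt.
# Assumes the `transformers` and `torch` packages are installed (e.g. on Google Colab).
import torch
from torch.utils.data import Dataset, random_split
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          Trainer, TrainingArguments,
                          DataCollatorForLanguageModeling)

class RecipeDataset(Dataset):
    """Tokenizes one recipe string per example for causal-LM fine-tuning."""
    def __init__(self, texts, tokenizer, max_length=256):
        self.examples = [
            tokenizer(t, truncation=True, max_length=max_length)["input_ids"]
            for t in texts
        ]
    def __len__(self):
        return len(self.examples)
    def __getitem__(self, idx):
        return {"input_ids": self.examples[idx]}

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical recipes; in the project these would come from the preprocessed dataset.
texts = [
    "<INGR> flour, eggs, milk <RECIPE> Whisk everything together and fry in a pan.",
    "<INGR> rice, onion, peas <RECIPE> Saute the onion, add rice and peas, simmer.",
]
dataset = RecipeDataset(texts, tokenizer)
n_val = max(1, int(0.1 * len(dataset)))            # small validation split
train_set, val_set = random_split(dataset, [len(dataset) - n_val, n_val])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-recipes",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=train_set,
    eval_dataset=val_set,
    # mlm=False gives standard causal language modeling (next-token prediction).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()

# Generate a recipe from a list of ingredients.
prompt = tokenizer("<INGR> chicken, garlic, lemon <RECIPE>", return_tensors="pt")
out = model.generate(**prompt, max_length=120, do_sample=True, top_p=0.95,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

On a real recipe corpus you would load thousands of examples rather than two, and the course itself uses Google Colab GPUs precisely because full fine-tuning at that scale is resource-intensive.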

Top Reviews

Filter by:

1 - 4 of 4 Reviews for Generating New Recipes using GPT-2

By Raj R

Aug 25, 2020

Excellent hands-on experience on how to use big models like GPT-2.

By Samar M

Aug 30, 2020

Thank you for your efforts, but the resources for this course are not available once the course has been completed!

By Jorge G

Feb 25, 2021

I do not recommend taking this type of course. Take one and pass it; however, after a few days I tried to review the material, and to my surprise it asked me to pay again to be able to review it. Of course, Coursera gives me a small discount for having already paid previously. It is very easy to download the videos and difficult to get hold of the rest of the material, but with some ingenuity it is possible. So I recommend uploading the videos to YouTube and keeping them private for when you want to consult them (you avoid legal problems and can share with friends), and then requesting a refund.

By Mayur S

Sep 24, 2020

One of my primary motivations for opting for this course was to understand how to fine-tune GPT-2. I was a bit disappointed.