Welcome to this course on TensorFlow Lite. You've seen a lot of deep learning algorithms run, maybe on your laptop or in the Cloud, but there's something magical about getting these algorithms, maybe a model that you've trained, to run in your hand, on your smartphone, or on a lightweight embedded processor like an Arduino or a Raspberry Pi that you can buy for some tens of US dollars. So in this four-week course, you'll learn to build these models and have them run at the edge.

There is something really magical about having models running on mobile devices, away from the computer. You'll see demos in this course of image classification and object detection on a smartphone. But I think it's even more exciting when you get smaller than the smartphone and you build smart embedded systems that respond to visual input or respond to audio input. I think that is really cool.

Actually, the first time someone showed me a [inaudible] that was trained and running on a smartphone, pointing around and recognizing different objects, there was a magical moment in that. There was one funny thing though. I was really excited when I could point it at a banana and it would tell me, banana. But then later on my friends told me, I know that's a banana. Why do I need it to tell me it's a banana? That's the thing with technologists: we get excited when the technology works the way we do. It's a good reminder from people in the outside world that we're just building technology that's catching up with what real people can do. Like, I recognized it was a banana.

But I think one of the things that becomes really important from that is when you can take those models and do transfer learning on them. One of the things that made it doubly magical for me was when I had that mobile app that could spot a banana, and then I did transfer learning on the model with a bunch of flowers that I didn't recognize. Then I could actually point my mobile device at one of those flowers and it would say, this is a [inaudible], which is a type of iris apparently, which I didn't actually realize. So it's able to learn things like that. Now it's seeing and classifying things that I don't know, and it's going beyond me, which I thought was really cool.

Yeah, I think these technologies open up room for a lot of creativity to come up with brand new applications, like flower recognition and many others, that I hope maybe some of the learners watching this will come up with. Yeah. Beyond running these things on mobile, there are also other edge devices, like lightweight embedded processors. Earlier today I saw you building one of your robots with a screwdriver.

This guy? So this is a self-driving car. It's built with a Raspberry Pi right here. So the Raspberry Pi runs TensorFlow Lite, and it's running a thing called Donkey Car. The idea behind this one is that you can effectively train a car by manually driving it around a course and capturing frames as you're manually driving it. Then you label those frames with the details from the controller: how hard was I accelerating? Was I turning left? Was I turning right? Those become the labels, and the camera frames become the features. Then you can train a model off of that, deploy the model to the device, and then as long as you put the device on the same track, it can self-drive around that track, which is really exciting. It's rough around the edges, as you can see, but it's beginning to work really well.

No, that's really cool.
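To make the idea concrete, here is a minimal sketch of the "frames as features, controller readings as labels" setup described above. It is not the actual Donkey Car codebase; the image shape, layer sizes, and variable names (`frames`, `controls`) are illustrative assumptions, just showing how one might train a small Keras model to predict steering and throttle from camera frames.

```python
import tensorflow as tf

# Hypothetical sketch: a small CNN that maps a camera frame to
# [steering, throttle], trained on frames captured while driving manually.
def build_driving_model(input_shape=(120, 160, 3)):
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Rescaling(1.0 / 255)(inputs)            # normalize pixels
    x = tf.keras.layers.Conv2D(24, 5, strides=2, activation="relu")(x)
    x = tf.keras.layers.Conv2D(32, 5, strides=2, activation="relu")(x)
    x = tf.keras.layers.Conv2D(64, 3, strides=2, activation="relu")(x)
    x = tf.keras.layers.Flatten()(x)
    x = tf.keras.layers.Dense(64, activation="relu")(x)
    outputs = tf.keras.layers.Dense(2)(x)                        # [steering, throttle]
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# frames:   (N, 120, 160, 3) images recorded while driving the course by hand
# controls: (N, 2) steering and throttle values recorded from the controller
# model = build_driving_model()
# model.fit(frames, controls, epochs=10)
```

Once trained, the model would be converted to TensorFlow Lite and deployed to the Raspberry Pi on the car, which is the deployment flow discussed next.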
I did my PhD in robotics and machine learning, so I love seeing stuff like this bringing the two together. In this course, will learners be able to run a simulator of this vehicle as well, even if they haven't built one of these at home?

Yeah, pretty much. I mean, they won't be able to simulate the vehicle directly if they haven't built one, but with a Raspberry Pi, they'll be able to deploy the software to it and try it all out. If you don't have a Raspberry Pi, one of the things that we have in the course is that when somebody builds a model and converts it to TensorFlow Lite, they can actually run it in Colab with a TensorFlow Lite interpreter. So we're trying to make it as easy as possible for people to try many of these models and many mobile scenarios. In some scenarios, like a live camera feed for example, you're going to need a camera. But on the whole, we just want to be able to show people how you can take your model, convert it, deploy it, and run it on something like this, or your Android or your iOS device.

Cool. That's nice. So I think one of the most exciting trends in computing has been the rise of Cloud computing, but then also the rise of Edge computing, where edge means your smartphones and devices like these, where the data is collected and where you can also do the computation. Cloud computing has a certain Moore's Law, Edge computing has a different Moore's Law at a different rate, and the network between them is also getting faster, but at yet a different rate. Because of these three different rates of improvement, it feels like my teams are re-balancing almost every year what we do in the Cloud and what we do at the edge, and in order to deploy machine learning models, knowing how to do the Edge computation is a very important part. So in this course, you'll learn how to do these Edge deployments. Let's get started by going on to the next video.
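As a reference for the workflow mentioned above, here is a minimal sketch of converting a Keras model to TensorFlow Lite and running it with the TF Lite Interpreter, for example in a Colab notebook. The tiny stand-in model and the zero-valued input are placeholders so the example runs end to end; in the course you would use your own trained model and real input data.

```python
import numpy as np
import tensorflow as tf

# Stand-in model so the example is self-contained; replace with your trained model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Convert the Keras model to the TensorFlow Lite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the converted model into the TF Lite interpreter and run one inference.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's expected shape and dtype.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]["index"]))
```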