What you will learn
Prepare models for battery-operated devices
Execute models on Android and iOS platforms
Deploy models on embedded systems like Raspberry Pi and microcontrollers
About this Course
Bringing a machine learning model into the real world involves a lot more than just modeling. This Specialization will teach you how to navigate various deployment scenarios and use data more effectively to train your model.
This second course teaches you how to run your machine learning models in mobile applications. You'll learn how to prepare models for lower-powered, battery-operated devices, then execute models on both Android and iOS platforms. Finally, you'll explore how to deploy on embedded systems using TensorFlow on Raspberry Pi and microcontrollers.
This Specialization builds upon our TensorFlow in Practice Specialization. If you are new to TensorFlow, we recommend that you take the TensorFlow in Practice Specialization first. To develop a deeper, foundational understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.
Syllabus
Device-based models with TensorFlow Lite
Welcome to this course on TensorFlow Lite, an exciting technology that lets you put your models directly into people's hands. You'll start with a deep dive into the technology and how it works, learning how to optimize your models for mobile use, where battery power and processing power become important factors. You'll then look at building applications on Android and iOS that use models, and you'll see how to use the TensorFlow Lite Interpreter in these environments. You'll wrap up the course with a look at embedded systems and microcontrollers, running your models on Raspberry Pi and SparkFun Edge boards.
Don't worry if you don't have access to the hardware -- for the most part you'll be able to do everything in emulated environments. So, let's get started by looking at what TensorFlow Lite is and how it works!
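As a preview of the workflow this first week covers, here is a minimal sketch of converting a trained Keras model to the TensorFlow Lite format with default optimizations applied. The model architecture and the output filename are placeholders, not the course's actual example:

```python
import tensorflow as tf

# Placeholder: stand-in for whatever trained tf.keras model you build in the course.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert the Keras model to the TensorFlow Lite flat-buffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Enable the default optimizations (e.g. post-training quantization)
# to shrink the model for battery- and memory-constrained devices.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Write the converted model so it can be bundled with an Android or iOS app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```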
Building the TensorFlow model on iOS
The other popular mobile operating system is, of course, iOS. So this week you'll do very similar tasks to last week -- learning how to take models and run them on iOS. You'll need some Swift programming background for iOS to fully understand everything we go through, but even if you don't have that expertise, I think this week's content is something you'll find fun to explore -- and you'll learn how to build a variety of ML applications that run on this important operating system!
TensorFlow Lite on devices
Now that you've looked at TensorFlow Lite and explored building apps on Android and iOS that use it, the next and final step is to explore embedded systems like Raspberry Pi, and learn how to get your models running on them. The nice thing is that the Pi is a full Linux system, so it can run Python, allowing you to use either the full TensorFlow for training and inference, or just the interpreter for inference. I'd recommend the latter, as training on a Pi can be slow!
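To give a sense of what inference-only deployment on the Pi looks like, here is a minimal sketch using the standalone TensorFlow Lite interpreter. The model path and the dummy input are illustrative assumptions; with the full TensorFlow installed, tf.lite.Interpreter can be used the same way:

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter  # from the lightweight tflite-runtime package

# Load a converted .tflite model; "model.tflite" is a placeholder path.
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Build a dummy input matching the model's expected shape and dtype.
input_data = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

# Run inference and read back the result.
interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output)
```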