What you will learn
Create custom layers in Keras.
Use custom layers in Keras models.
Course overview
In this 1-hour long project-based course, you will learn how to create a custom layer in Keras, and create a model using the custom layer. In this project, we will create a simplified version of a Parametric ReLU layer, and use it in a neural network model. Then we will use the neural network to solve a multi-class classification problem. We will also compare our activation layer with the more commonly used ReLU activation layer.
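To make the project concrete, here is a minimal sketch of what a simplified Parametric ReLU layer could look like using the TensorFlow 2.x Keras API. The class name SimplePReLU and the choice of a single shared slope parameter are illustrative assumptions, not necessarily the exact layer built in the course.

import tensorflow as tf

class SimplePReLU(tf.keras.layers.Layer):
    # Simplified Parametric ReLU: f(x) = x for x > 0 and alpha * x otherwise,
    # with one trainable scalar alpha shared across all units (an illustrative
    # simplification; the course's layer may parameterize this differently).
    def build(self, input_shape):
        self.alpha = self.add_weight(
            name="alpha", shape=(1,), initializer="zeros", trainable=True)

    def call(self, inputs):
        # max(0, x) + alpha * min(0, x)
        return tf.nn.relu(inputs) + self.alpha * tf.minimum(inputs, 0.0)

Because alpha is trainable, the negative-side slope is learned from the data during training, whereas a plain ReLU fixes it at zero.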
This course runs on Coursera’s hands-on project platform called Rhyme. On Rhyme, you do projects in a hands-on manner in your browser. You will get instant access to pre-configured cloud desktops containing all of the software and data you need for the project. Everything is already set up directly in your Internet browser so you can just focus on learning. For this project, you’ll get instant access to a cloud desktop with Python, Jupyter, and TensorFlow pre-installed.
Prerequisites:
To be successful in this project, you should be familiar with Python programming, neural networks, and Keras.
Notes:
– You will be able to access the cloud desktop 5 times. However, you can watch the instructional videos as many times as you want.
– This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
Syllabus
Create Custom Layers in Keras
Welcome to the course Create Custom Layers in Keras! In this 1-hour long project-based course, you will learn how to create a custom layer in Keras and create a model using the custom layer. We will create a simplified version of a Parametric ReLU layer and use it in a neural network model. Then we will use the neural network to solve a multi-class classification problem. We will also compare our activation layer with the more commonly used ReLU activation layer.
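For orientation, a hedged sketch of how such a custom layer might be dropped into a model for multi-class classification, reusing the SimplePReLU layer sketched above. The 784-dimensional input, layer widths, and 10 output classes are placeholder assumptions rather than the course's actual dataset and architecture.

from tensorflow import keras

# Placeholder architecture: input shape, layer sizes, and class count are assumed.
model = keras.Sequential([
    keras.Input(shape=(784,)),
    keras.layers.Dense(64),
    SimplePReLU(),                                 # custom activation layer
    keras.layers.Dense(64),
    SimplePReLU(),
    keras.layers.Dense(10, activation="softmax"),  # multi-class output
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5, validation_split=0.2)  # once data is loaded

Comparing this model against an identical one that uses keras.layers.ReLU() in place of SimplePReLU() is one straightforward way to carry out the ReLU comparison described in the project.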
Course project
Introduction and Importing Libraries
Importing and Visualizing Data
Creating a Custom Layer
Creating the Model
Model Training
Comparison with ReLU