Part 10 (5 points, coding task)
The Rectified Linear Unit, or “ReLU”, is one of the most commonly used functions in deep learning. It is defined as
\text{ReLU}(x)= \max \left\{ 0,x \right\} .
In this part, you are asked to use PyTorch to build a ReLU class named My_ReLU that subclasses nn.Module.
A successful class works in the following ways:
- The initialization of a My_ReLU object does not take any input.
- Suppose we have a My_ReLU object called activation0. When we call activation0(x) with an input x that is a tensor of arbitrary dimension and shape, we get an output y from the element-wise ReLU activation on x.