2025 USA-NA-AIO Round 1, Problem 2, Part 10

Part 10 (5 points, coding task)

The Rectified Linear Unit, or “ReLU”, is one of the most commonly used functions in deep learning. It is defined as

\text{ReLU}(x)= \max \left\{ 0,x \right\} .

In this part, you are asked to use PyTorch to build a ReLU class named My_ReLU that subclasses nn.Module.

A successful class works in the following ways:

  • The initialization of an object in My_ReLU does not take any input.

  • Suppose we have a My_ReLU object called activation0. When we call activation0(x) with an input tensor x of arbitrary dimension and shape, we get an output y from the element-wise ReLU activation on x.

### WRITE YOUR SOLUTION HERE ###

import torch
import torch.nn as nn

class My_ReLU(nn.Module):
    def __init__(self):
        # Initialization takes no arguments; just set up the parent nn.Module.
        super().__init__()

    def forward(self, x):
        # Element-wise ReLU: max(0, x), preserving the shape of x.
        return torch.max(torch.zeros_like(x), x)
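As a quick sanity check, the class can be exercised on a small tensor. This is a minimal usage sketch (the object name activation0 and the sample values are illustrative, not part of the required interface beyond what the bullets above specify):

```python
import torch
import torch.nn as nn

class My_ReLU(nn.Module):
    """ReLU as an nn.Module subclass; __init__ takes no arguments."""
    def __init__(self):
        super().__init__()

    def forward(self, x):
        # Element-wise max(0, x), for any shape of x.
        return torch.max(torch.zeros_like(x), x)

activation0 = My_ReLU()
x = torch.tensor([[-1.0, 2.0], [0.5, -3.0]])
y = activation0(x)  # negatives clamped to 0, non-negatives unchanged
print(y.tolist())  # [[0.0, 2.0], [0.5, 0.0]]
```

Calling activation0(x) works because nn.Module defines __call__, which dispatches to forward; torch.zeros_like guarantees the comparison tensor matches x in shape, dtype, and device.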

""" END OF THIS PART """