2025 USA-NA-AIO Round 1, Problem 2, Part 13

Part 13 (15 points, coding task)

In this part, we use the training dataset constructed in Part 12 to train the model defined in Part 11.

  • Use mean-squared error (MSE) as the loss function.

  • Use Adam as the optimization algorithm.

  • Do whole-batch training in each epoch.

  • After every 10 epochs, print the following sentence:

    Epoch: XXX. Loss: XXX.

    The loss value should be printed with 4 decimal places.

  • After training completes, generate an epoch-vs-MSE-loss plot. Set the x-label to epoch and the y-label to MSE loss. (A minimal sketch of such a loop follows this list.)
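
To make these requirements concrete, here is a minimal self-contained sketch of one possible loop, not the official solution. The Sequential model and the toy x_train/y_train are stand-ins added so the sketch runs on its own; in the actual part, My_MLP_Model comes from Part 11 and the dataset from Part 12.

import torch
import matplotlib.pyplot as plt

# Stand-ins (assumptions) so this sketch is runnable by itself; replace with
# My_MLP_Model from Part 11 and the real x_train/y_train from Part 12.
model = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
x_train = torch.linspace(-1, 1, 100).reshape(-1, 1)
y_train = x_train ** 2

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

losses = []
for epoch in range(500):
    optimizer.zero_grad()                    # clear gradients from the previous epoch
    loss = loss_fn(model(x_train), y_train)  # whole-batch forward pass
    loss.backward()                          # backpropagate
    optimizer.step()                         # one parameter update per epoch
    losses.append(loss.item())
    if epoch % 10 == 0:
        print(f"Epoch: {epoch}. Loss: {loss.item():.4f}.")  # 4 decimal places

plt.plot(range(500), losses)
plt.xlabel("epoch")
plt.ylabel("MSE loss")
plt.show()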

# HYPERPARAMETERS
''' DO NOT CHANGE ANYTHING IN THIS CODE CELL '''

hidden_features1 = 32
hidden_features2 = 16

num_epochs = 500
learning_rate = 1e-3

### WRITE YOUR SOLUTION HERE ###

import torch
import matplotlib.pyplot as plt

my_mlp_model = My_MLP_Model(1, hidden_features1, hidden_features2, 1)
optimizer = torch.optim.Adam(my_mlp_model.parameters(), lr=learning_rate)
loss_fn = torch.nn.MSELoss()

loss_list_plot = []

for epoch in range(num_epochs):
    # Whole-batch training: one forward/backward pass over the full
    # training set per epoch.
    optimizer.zero_grad()
    y_pred = my_mlp_model(x_train.reshape(-1, 1))
    loss = loss_fn(y_pred, y_train.reshape(-1, 1))
    loss.backward()
    optimizer.step()

    # Record the loss every epoch so the plot's x-axis is the true epoch index.
    loss_list_plot.append(loss.item())

    if epoch % 10 == 0:
        print(f"Epoch: {epoch}. Loss: {loss.item():.4f}.")

plt.plot(range(num_epochs), loss_list_plot)
plt.xlabel("epoch")
plt.ylabel("MSE loss")
plt.show()
""" END OF THIS PART """

My solution

import torch
import torch.nn as nn
import torch.optim as optim
import matplotlib.pyplot as plt

mymodel = My_MLP_Model(1, 3, 3, 1)
criterion = nn.MSELoss()
optimizer = optim.Adam(params=mymodel.parameters(), lr=0.07)

epochs = 1000
losses = []
for i in range(epochs):
    # Whole-batch pass; reshape to (N, 1) to match the model's in_features=1,
    # as in the solution above.
    y_pred = mymodel(x_train.reshape(-1, 1))
    loss = criterion(y_pred, y_train.reshape(-1, 1))
    if i % 10 == 0:
        print(f"Epoch: {i}. Loss: {loss.item():.4f}.")  # required format, 4 decimals

    losses.append(loss.item())
    optimizer.zero_grad()   # clear gradients before backprop
    loss.backward()
    optimizer.step()

plt.plot(losses)
plt.xlabel("epoch")
plt.ylabel("MSE loss")
plt.show()
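
One caveat when comparing the two solutions: weight initialization is random, so the loss curves vary from run to run. Seeding PyTorch's RNG before constructing the model makes a run repeatable; this is standard practice, not part of the task:

import torch

torch.manual_seed(0)   # fix the RNG so weight init (and hence the loss curve) is reproducible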