PyTorch Neural Network Design

Explore layers, optimizers, and loss functions.

1. Which PyTorch class is used to define a fully connected (dense) layer?
2. Select all non-linear activation functions from the following PyTorch modules:
3. PyTorch's autograd system automatically computes gradients for all tensor operations by default.
4. What does the abbreviation 'nn' stand for in torch.nn?
5. Which activation function is commonly used in the output layer for multi-class classification tasks?
6. Which of the following are optimizers provided by torch.optim?
7. In PyTorch, parameters of a model created with nn.Module have requires_grad=True by default.
8. What method must be overridden in a custom nn.Module subclass to define the forward computation?
9. Which PyTorch function is used to save only the learned parameters of a model (not the architecture)?
10. Which of the following are necessary steps in a standard PyTorch training loop (select all that apply)?
11. nn.Sequential is a PyTorch container that applies modules in the order they are passed, without requiring a custom forward() method.
12. What is the full name of the activation function commonly abbreviated as ReLU?
13. Which method is called on the loss tensor to compute gradients during backpropagation?
14. Which of these are valid loss functions in torch.nn for regression tasks?
15. PyTorch's autograd can compute gradients for operations on both CPU and GPU tensors.
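For readers reviewing these topics, here is a minimal sketch that touches most of the concepts the questions cover: a custom nn.Module with an overridden forward(), ReLU activation, default requires_grad on parameters, a standard training loop (zero_grad, backward, step), and saving only the learned parameters via state_dict(). The model name, sizes, and filename are illustrative, not part of the quiz.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Hypothetical toy model for illustration.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)   # nn.Linear: fully connected (dense) layer
        self.fc2 = nn.Linear(8, 3)

    def forward(self, x):            # forward() is the method to override
        x = torch.relu(self.fc1(x))  # ReLU: Rectified Linear Unit
        return self.fc2(x)           # raw logits; CrossEntropyLoss applies softmax

model = TinyNet()

# Parameters created inside an nn.Module have requires_grad=True by default.
assert all(p.requires_grad for p in model.parameters())

criterion = nn.CrossEntropyLoss()    # common choice for multi-class classification
optimizer = optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(16, 4)               # toy inputs
y = torch.randint(0, 3, (16,))       # toy class labels

# Standard training-loop steps:
for _ in range(3):
    optimizer.zero_grad()            # clear accumulated gradients
    loss = criterion(model(x), y)    # forward pass + loss
    loss.backward()                  # autograd computes gradients
    optimizer.step()                 # update parameters

# Save only the learned parameters, not the architecture.
torch.save(model.state_dict(), "tiny_net.pt")
```

The same loop works unchanged on GPU tensors after moving the model and data with `.to("cuda")`, since autograd tracks operations on both CPU and GPU tensors.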
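As a companion to the custom-module approach, a sketch of nn.Sequential, which applies modules in the order they are passed without requiring a custom forward() method. The layer sizes here are arbitrary examples.

```python
import torch
import torch.nn as nn

# nn.Sequential chains modules in order; no forward() override needed.
seq = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),           # a non-linear activation module
    nn.Linear(8, 3),
    nn.Softmax(dim=1),   # softmax output for multi-class classification
)

out = seq(torch.randn(2, 4))
# Softmax rows are probability distributions, so each sums to 1.
assert torch.allclose(out.sum(dim=1), torch.ones(2))
```

Note that when training with nn.CrossEntropyLoss, the final Softmax layer is omitted, since that loss expects raw logits.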