PythonFan.org
PyTorch Neural Network Design
Explore layers, optimizers, and loss functions.
1. Which PyTorch class is used to define a fully connected (dense) layer?
nn.Conv2d
nn.Linear
nn.MaxPool2d
nn.RNN
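The class in question here is `nn.Linear`. A minimal sketch of how it maps input features to outputs (the layer sizes below are arbitrary examples):

```python
import torch
import torch.nn as nn

# nn.Linear implements a fully connected (dense) layer: y = x @ W.T + b
layer = nn.Linear(in_features=4, out_features=2)

x = torch.randn(3, 4)   # a batch of 3 samples with 4 features each
y = layer(x)
print(y.shape)          # torch.Size([3, 2])
```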
2. Select all non-linear activation functions from the following PyTorch modules:
nn.ReLU
nn.Sigmoid
nn.Identity
nn.Tanh
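`nn.ReLU`, `nn.Sigmoid`, and `nn.Tanh` are non-linear, while `nn.Identity` simply returns its input unchanged. A quick sketch comparing them on the same tensor:

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, 0.0, 2.0])

print(nn.ReLU()(x))      # tensor([0., 0., 2.])  -- clamps negatives to 0
print(nn.Sigmoid()(x))   # squashes values into (0, 1)
print(nn.Tanh()(x))      # squashes values into (-1, 1)
print(nn.Identity()(x))  # tensor([-2., 0., 2.]) -- unchanged, hence linear
```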
3. PyTorch's autograd system automatically computes gradients for all tensor operations by default.
True
False
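Whether autograd tracks an operation depends on the tensor's `requires_grad` flag, which a plain tensor does not set. A minimal sketch for checking this statement:

```python
import torch

# Plain tensors default to requires_grad=False; gradients are only
# tracked when requires_grad=True is set (explicitly, or by nn.Module).
a = torch.tensor(3.0)
print(a.requires_grad)        # False

x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
y.backward()                  # computes dy/dx = 2x
print(x.grad)                 # tensor(6.)
```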
4. What does the abbreviation 'nn' stand for in torch.nn?
5. Which activation function is commonly used in the output layer for multi-class classification tasks?
ReLU
Sigmoid
Softmax
Tanh
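Softmax turns raw logits into a probability distribution over classes, which is why it fits multi-class output layers. A minimal sketch:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.1])   # raw scores for 3 classes
probs = torch.softmax(logits, dim=0)

print(probs)            # non-negative probabilities
print(probs.sum())      # tensor(1.) -- they sum to 1
print(probs.argmax())   # tensor(0) -- the highest logit wins
```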
6. Which of the following are optimizers provided by torch.optim?
Adam
SGD
CrossEntropyLoss
RMSprop
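`Adam`, `SGD`, and `RMSprop` live in `torch.optim`; `CrossEntropyLoss` is a loss function from `torch.nn`. A quick sketch (learning rates are arbitrary examples):

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 1)

# All three optimizers share the same constructor pattern.
adam    = torch.optim.Adam(model.parameters(), lr=1e-3)
sgd     = torch.optim.SGD(model.parameters(), lr=1e-2)
rmsprop = torch.optim.RMSprop(model.parameters(), lr=1e-3)

# CrossEntropyLoss, by contrast, comes from torch.nn, not torch.optim.
print(hasattr(torch.optim, "CrossEntropyLoss"))  # False
print(hasattr(torch.nn, "CrossEntropyLoss"))     # True
```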
7. In PyTorch, parameters of a model created with nn.Module have requires_grad=True by default.
True
False
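This one is easy to verify: parameters registered by an `nn.Module` are created trainable. A sketch:

```python
import torch.nn as nn

layer = nn.Linear(3, 1)

# Every registered parameter (weight and bias here) has
# requires_grad=True out of the box.
print([p.requires_grad for p in layer.parameters()])   # [True, True]
```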
8. What method must be overridden in a custom nn.Module subclass to define the forward computation?
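The method in question is `forward()`; calling the module like a function (`model(x)`) dispatches to it. A minimal sketch using a hypothetical `TinyNet` class:

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):          # TinyNet is an illustrative name
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):          # override forward, never __call__
        return torch.relu(self.fc(x))

model = TinyNet()
out = model(torch.randn(1, 4))     # invokes forward() under the hood
print(out.shape)                   # torch.Size([1, 2])
```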
9. Which PyTorch function is used to save only the learned parameters of a model (not the architecture)?
torch.save(model, 'model.pth')
torch.save(model.state_dict(), 'model.pth')
model.save('model.pth')
torch.export.save(model, 'model.pth')
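`torch.save(model.state_dict(), ...)` serializes only the parameter tensors, not the class definition. A sketch that round-trips a state dict through an in-memory buffer standing in for the `'model.pth'` file:

```python
import io
import torch
import torch.nn as nn

model = nn.Linear(2, 2)

# Save only the learned parameters (weights and biases).
buffer = io.BytesIO()                 # stand-in for a 'model.pth' file
torch.save(model.state_dict(), buffer)

# Restoring requires an instance with the same architecture.
buffer.seek(0)
clone = nn.Linear(2, 2)
clone.load_state_dict(torch.load(buffer))

print(torch.equal(clone.weight, model.weight))   # True
```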
10. Which of the following are necessary steps in a standard PyTorch training loop (select all that apply)?
Forward pass (compute predictions)
Backward pass (compute gradients)
Zero gradients (reset optimizer state)
Update weights (optimizer step)
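All four steps apply. A minimal sketch of the loop on a toy regression problem (the learning rate and step count are arbitrary choices):

```python
import torch
import torch.nn as nn

model = nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.tensor([[1.0], [2.0]])
y = torch.tensor([[2.0], [4.0]])   # target relationship: y = 2x

for _ in range(500):
    optimizer.zero_grad()          # zero gradients
    pred = model(x)                # forward pass
    loss = loss_fn(pred, y)        # compute the loss
    loss.backward()                # backward pass
    optimizer.step()               # update weights

print(loss_fn(model(x), y).item())  # close to 0 after training
```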
11. nn.Sequential is a PyTorch container that applies modules in the order they are passed, without requiring a custom forward() method.
True
False
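`nn.Sequential` chains its modules in the order given, so no custom `forward()` is needed. A sketch:

```python
import torch
import torch.nn as nn

# Modules run in order: Linear -> ReLU -> Linear.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

out = model(torch.randn(5, 4))
print(out.shape)   # torch.Size([5, 2])
```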
12. What is the full name of the activation function commonly abbreviated as ReLU?
13. Which method is called on the loss tensor to compute gradients during backpropagation?
loss.grad()
loss.backward()
optimizer.backward()
model.backward()
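The call goes on the loss tensor itself: `loss.backward()` populates `.grad` on every parameter that contributed to the loss. A sketch:

```python
import torch
import torch.nn as nn

model = nn.Linear(2, 1)
print(model.weight.grad)   # None -- no gradients computed yet

loss = nn.MSELoss()(model(torch.randn(4, 2)), torch.randn(4, 1))
loss.backward()            # fills in .grad for weight and bias

print(model.weight.grad.shape)   # torch.Size([1, 2])
```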
14. Which of these are valid loss functions in torch.nn for regression tasks?
MSELoss
CrossEntropyLoss
L1Loss
BCELoss
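`MSELoss` and `L1Loss` are the regression losses here; `CrossEntropyLoss` and `BCELoss` target classification. A sketch with hand-picked values so the results are easy to verify by hand:

```python
import torch
import torch.nn as nn

pred   = torch.tensor([2.5, 0.0])
target = torch.tensor([3.0, -1.0])

mse = nn.MSELoss()(pred, target)   # mean((pred - target)^2)
l1  = nn.L1Loss()(pred, target)    # mean(|pred - target|)

print(mse)   # tensor(0.6250): (0.25 + 1.0) / 2
print(l1)    # tensor(0.7500): (0.5 + 1.0) / 2
```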
15. PyTorch's autograd can compute gradients for operations on both CPU and GPU tensors.
True
False