Live Quiz Arena
🎁 1 Free Round Daily
Question
Science
Which risk increases when a neural network's activation functions saturate?
A) Vanishing gradient during backpropagation ✓
B) Exploding gradients during feedforward
C) Increased model generalization performance
D) Faster convergence during network training
💡 Explanation
Vanishing gradients arise when activation functions saturate: in the saturated regime the derivative becomes near-zero, so gradient descent can barely update the weights. The backpropagated error signal therefore diminishes rather than strengthening as it passes through each layer, which stalls training even on powerful hardware.
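A minimal sketch of the effect, using the sigmoid as an example of a saturating activation (function names here are illustrative, not from any particular library):

```python
import numpy as np

def sigmoid(x):
    """Classic saturating activation: flattens out for large |x|."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    """Derivative of the sigmoid: s * (1 - s)."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Near x = 0 the derivative is at its maximum of 0.25.
# Deep in the saturated regime (large |x|) it collapses toward zero.
print(sigmoid_deriv(0.0))    # 0.25
print(sigmoid_deriv(10.0))   # ~4.5e-05

# Backpropagation multiplies these local derivatives layer by layer,
# so a chain of saturated units shrinks the error signal geometrically:
print(sigmoid_deriv(10.0) ** 10)  # effectively zero
```

Running this shows why a deep stack of saturated units receives almost no learning signal: each layer multiplies the gradient by a factor far below one.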
🏆 Up to £1,000 monthly prize pool
Ready for the live challenge? Join the next global round now.
*Terms apply. Skill-based competition.
Related Questions
Browse Science →
- Which outcome results when high humidity reduces molecular hydrogen bonding in nylon fibers?
- Which outcome results when a catalyst with strong product affinity is introduced to a reversible reaction at equilibrium?
- Which outcome occurs when gravitational and inertial reaction tensors show equivalence?
- Which process enables 3D imaging in birefringent crystals?
- Which outcome results when free-fall ends inside a spacecraft?
- Which consequence results when thin-film interference affects solar cell performance?
