VibraXX
Live Quiz Arena
Question
Language & Communication

A deep neural network's performance degrades due to adversarial noise injected during training. Which consequence dominates if the network lacks explicit denoising layers?

A) Overfitting to training examples accelerates.
B) Vanishing gradients become more problematic.
C) Feature maps become overly sparse.
D) Robustness to perturbations decreases sharply.

💡 Explanation

Without explicit denoising layers, adversarial noise injected during training corrupts the learned features directly, because the network has no mechanism to separate signal from noise. The dominant consequence is therefore a sharp decrease in robustness to perturbations (D). Overfitting (A) is a separate concern, and vanishing gradients (B) can occur independently of noise; neither is the direct result of missing denoising.
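The effect can be illustrated with a toy sketch: a fixed linear feature map stands in for a trained network, and a simple moving-average filter stands in for a denoising layer (both are illustrative assumptions; real systems use learned denoising layers or adversarial training). Measuring how far the features shift under a noise perturbation, with and without the denoiser, shows why the missing denoising stage costs robustness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "network": a fixed random linear feature map
# (an illustration, not a trained model).
W = rng.normal(size=(16, 64))

def features(x):
    return W @ x

def denoise(x, k=5):
    # Simple moving-average denoiser (assumed stand-in for a
    # learned denoising layer).
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

x = np.sin(np.linspace(0, 4 * np.pi, 64))   # clean input signal
noise = 0.5 * rng.normal(size=64)           # injected perturbation

# Sensitivity = feature-space shift caused by the perturbation.
raw_shift = np.linalg.norm(features(x + noise) - features(x))
denoised_shift = np.linalg.norm(
    features(denoise(x + noise)) - features(denoise(x))
)

print(raw_shift, denoised_shift)
```

Smoothing damps the high-frequency component of the noise before it reaches the feature map, so the denoised feature shift is markedly smaller; without that stage, the perturbation propagates into the features at full strength.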

