VibraXX
Live Quiz Arena
Question
Logic & Puzzles

A simple neural network adjusts its synaptic weights during learning using gradient descent with a decaying learning-rate schedule. If the learning rate decreases too slowly, which outcome occurs?

A) Weights converge with oscillations
B) The network learns local minima
C) Learning generalizes effectively to data
D) Synaptic weights stabilize instantly

💡 Explanation

The network learns local minima (B). When the learning-rate schedule decays too slowly, the updates keep the search circling the same basin of the error surface, and even momentum stalls rather than carrying the weights out of a local minimum. The network therefore fails to reach the optimal solution, rather than converging quickly or generalizing well.
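How fast the learning rate decays changes the behavior of gradient descent, and that effect is easy to see on a toy problem. Below is a minimal sketch: plain gradient descent on the 1-D quadratic f(w) = w², where the quadratic, the two schedules, and the step counts are illustrative assumptions, not part of the original question.

```python
# Illustrative sketch (not from the question): gradient descent on
# f(w) = w**2, whose gradient is 2*w, under two learning-rate schedules.

def descend(schedule, steps=50, w0=1.0):
    """Run gradient descent on f(w) = w**2; return the weight trajectory."""
    w, path = w0, [w0]
    for t in range(steps):
        w -= schedule(t) * 2 * w   # gradient step with the scheduled rate
        path.append(w)
    return path

# Slowly decaying rate: the step stays large, so each update overshoots
# the minimum and the weight oscillates in sign while shrinking.
slow = descend(lambda t: 0.9 / (1 + 0.01 * t))

# Rapidly decaying rate: the steps shrink so fast that the weight stalls
# well short of the true minimum at w = 0.
fast = descend(lambda t: 0.9 * 0.1 ** t)

print("slow decay, final w:", slow[-1])
print("fast decay, final w:", fast[-1])
print("sign flips under slow decay:",
      sum(1 for a, b in zip(slow, slow[1:]) if a * b < 0))
```

Running this shows the slow-decay run reaching the minimum only through persistent oscillation, while the fast-decay run freezes far from it; which failure mode a given quiz labels "too slowly" depends on the convention it assumes.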

