Question
Why does a recurrent neural network (RNN) struggle with long-range dependencies in syntax analysis compared to a Transformer network?
A) RNNs utilize fixed positional encodings.
B) RNNs have smaller hidden state dimensions.
C) RNNs suffer from vanishing gradients. ✓
D) RNNs lack attention mechanisms entirely.
💡 Explanation
RNNs struggle because they suffer from the vanishing gradient problem, which hinders their ability to learn relationships across long sequences. As gradients are backpropagated through many time steps they shrink exponentially, so the error signal from a distant token barely reaches the parameters that need to change, and the network cannot accurately model long-range syntactic dependencies. Transformers sidestep this because self-attention connects any two positions in the sequence directly, rather than passing information step by step through a recurrent state.
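As an illustrative sketch (not part of the quiz material), the NumPy snippet below makes the vanishing gradient concrete: it runs a simple tanh RNN forward for 50 steps, then backpropagates a unit gradient through time and prints how quickly its norm decays. All names, sizes, and the 0.1 weight scale are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 32
steps = 50
# Recurrent weight matrix scaled so its spectral radius is below 1,
# the regime in which backpropagated gradients vanish.
W = rng.standard_normal((hidden, hidden)) * 0.1

# Forward pass: h_t = tanh(W h_{t-1} + x_t), storing each hidden state.
h = np.zeros(hidden)
x = rng.standard_normal((steps, hidden))
states = []
for t in range(steps):
    h = np.tanh(W @ h + x[t])
    states.append(h)

# Backward pass: dL/dh_{t-1} = W^T diag(1 - h_t^2) dL/dh_t,
# starting from a unit gradient at the final step.
grad = np.ones(hidden)
norms = []
for h_t in reversed(states):
    grad = W.T @ ((1.0 - h_t ** 2) * grad)
    norms.append(np.linalg.norm(grad))

print(f"gradient norm after  1 step : {norms[0]:.3e}")
print(f"gradient norm after 25 steps: {norms[24]:.3e}")
print(f"gradient norm after 50 steps: {norms[-1]:.3e}")
```

Running this shows the gradient norm collapsing by many orders of magnitude over 50 steps, which is why information from early tokens has almost no influence on the loss, whereas attention gives every pair of positions a direct gradient path.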
Related Questions
- Why does a human interpreter struggle to accurately convey sarcasm from a speaker using low acoustic variability in their intonation during a remote video call?
- A patient exhibits impaired speech production but intact comprehension after a stroke; why does double dissociation with Wernicke's aphasia strengthen structure-function inference?
- A Mandarin Chinese speaker from Beijing struggles to be understood in Guangzhou despite using correct vocabulary. Which mechanism explains this communication breakdown?
- A social media platform restricts character count; which mechanism promotes emoji adoption in informal communication?
- Why does understanding abstract concepts like 'grasping an idea' benefit from embodied experiences?
- Why does discourse degrade when 'given' information is presented as 'new' within a spoken weather report?
