Question
Why does a chatbot utilizing a Markov chain exhibit incoherence more readily than one using a transformer network?
A) Markov chains use lexical semantics
B) Transformers manage long-range dependencies better
C) Markov chains lack long-range context ✓
D) Transformers ignore surface-level features
💡 Explanation
A Markov chain predicts the next word from only the preceding n words, discarding all earlier context, so coherence breaks down whenever a dependency spans more than n words. A transformer's attention mechanism instead relates every word to every other word in the input, capturing long-range dependencies rather than only local relationships.
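The limitation is easy to see in code. Below is a minimal sketch of an n-gram Markov text generator in Python; the toy corpus, seed, and function names are hypothetical, chosen only to illustrate the mechanism. Each sampling step conditions on the current n-word state alone, so any dependency that spans more than n words is invisible to the model.

```python
import random
from collections import defaultdict

def build_markov_model(tokens, n=2):
    """Map each n-gram of preceding words to the words observed after it."""
    model = defaultdict(list)
    for i in range(len(tokens) - n):
        state = tuple(tokens[i:i + n])       # only the last n words are kept
        model[state].append(tokens[i + n])   # everything earlier is forgotten
    return model

def generate(model, seed, length=20):
    """Sample a continuation; each step sees nothing beyond the n-word state."""
    state = tuple(seed)
    output = list(seed)
    for _ in range(length):
        candidates = model.get(state)
        if not candidates:                   # no continuation observed for this state
            break
        next_word = random.choice(candidates)
        output.append(next_word)
        state = state[1:] + (next_word,)     # slide the window; no long-range memory
    return " ".join(output)

# Hypothetical toy corpus for illustration only.
corpus = "the cat sat on the mat and the cat saw the dog on the mat".split()
model = build_markov_model(corpus)
print(generate(model, seed=["the", "cat"]))
```

A transformer, by contrast, computes attention weights over every position in the input, so the word chosen at one step can be informed by a word hundreds of tokens earlier; nothing analogous exists in the sliding n-word state above.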
Related Questions
- Why does an ambiguous sentence parsed by a statistical machine translation system sometimes yield a semantically incoherent translation?
- In Tokyo Japanese, if a speaker utters the phrase "hashi" (bridge) immediately before "ga" (subject marker) during casual conversation, which consequence follows?
- Why does a proto-language reconstruction become less certain as the time depth increases?
- Which consequence results when a laryngeal consonant weakens through lenition?
- In American Sign Language, why does assimilation impact the articulation of sequential signs?
- If a computational linguist aims to enhance a speech recognition system's resilience to accents using machine learning, which consequence follows?
