To the people who are thinking about other languages: "lexical decoding" (recognition of a word during reading) was the strongest predictor of reading comprehension, as opposed to phonological decoding.
Restating the highlighted result: Gc ("Comprehension-Knowledge") had the strongest effect on both lexical and phonological decoding. Knowing a word makes it easier to comprehend while reading. This is probably completely obvious, but the broader point is that rich conversations with students that teach them lots of words will improve their reading.
Only partially supported interpretation/application - All this business about phonics will only take you so far if the adults in a kid's life (including their teachers) are not talking to them richly about a lot of stuff. Asking teachers to do a lot of rote repetition risks cutting out the really important part of school where students are actually building vocabulary. Teachers who use and teach large vocabularies may be unexpectedly more effective at teaching reading.
Somewhat off-topic but related - I've been doing a lot of AI-assisted coding lately, multiple projects at the same time. I'm basically in 15-hour loops of reviewing code/output and writing prompts, switching to the next project while Claude (and others) work. I've noticed something almost like dyslexia emerging: I'm thinking of the prompt/instructions while typing, then I look at what I typed and it's not what I was thinking - sometimes it's a combination of two different thoughts/prompts actually intended for separate projects. It's so weird - I can instantly recognize it's wrong when re-reading the prompt, and I still have my intended prompt/instructions in my head. I've never been diagnosed with dyslexia or had anything similar happen before AI seemingly captured my brain with the promise of delivering on some of my dozens of dormant projects/ideas. Maybe I need a break...
[Not an expert] I met someone once who got very expensive glasses with a carefully calibrated prism to perfectly align both eyes - like when you are exhausted and get double vision. His eyes were fine, but with the glasses he could read all day as opposed to only a few hours. He only used them when his eyes started to tire. He said that anyone could have them made, and he was very curious how many poor readers could benefit, since his tired eyes just made him not want to read anymore.
In fact, your comment itself is an example of the variety of visual experiences that exist that I hadn't even thought of. When you say "like when you are exhausted and get double vision," that itself sounds alien to me. I have been exhausted many times but never associated it with seeing double. I have only experienced involuntary double vision in certain circumstances during my recovery from a severe TBI.
In my mid-forties now, reading without good reading glasses is absolutely awful. I can see some people struggling to enjoy reading if they weren't aware their eyesight was the issue.
Something to check if you are getting older and not enjoying reading as much lately!
Same. Having aging eyes has increased my empathy. When I can't read restaurant menus, or dosage information on a bottle, or see which direction the battery is supposed to go, or find the right button on some tiny remote (and then inevitably fumble and guess when my glasses aren't at hand)... I've learned a lot about what navigating the world might be like for others.
It would be interesting to see a study comparing languages where writing encodes sounds, like English, versus languages where writing encodes meaning, like Chinese. And also how a person's visual and auditory capabilities relate to reading, because languages like English need both, I think.
I’m learning Japanese, and I’ve started learning Chinese characters, both their meanings and how to read them. Reading them feels different than English... I have a hypothesis that our brains work differently when processing symbols that encode meanings as opposed to just sounds. English requires an extra step, where characters are translated into sounds and then into words.
With Chinese characters, you are immediately looking at the meaning; you don't need translation into sounds. This feels like a more cognitively efficient process to me, even though I have to memorize more characters to recognize them.
The other major branch of the Sino-Tibetan languages is traditionally written in alphabetic scripts like Tibetan. If there's a meaningful practical difference associated with what kind of writing system you use, it's not obvious to me what it would be. Modern English is much less phonetic than classical Tibetan as well.
You should think of it as "the step from single sounds to syllables," and the way to do that is to begin with the easy syllables like "tu", "mi", "el" (not unlike the multiplication or addition tables) before moving to longer ones. [And note that M alone is not "em"; it has to be "m" when learning to read - a common pedagogical mistake! M + I makes "mi", not "emi", so M must be "m".] At least that's how children have been taught in Finnish schools since sometime before the 1980s, and since then almost everyone learns to read during the first school semester. Also, one simple and efficient protection against dyslexia is to play Graphogame (or similar) to get a lot of repetition with the sound-letter correspondences while learning to read (for various reasons, some brains take longer to build the necessary connections, and you want to avoid the negative effects of learning more slowly than your peers if you can).
"Reading is almost phonetic" is a largely meaningless phrase. There are some orthographies that are more regular than others. But, indeed, the very confusion people love to talk about with English only works if it is phonetic, but ambiguous.
And just solving for one form of ambiguity does not, necessarily, help. Consider contronyms. Words that are literally their own opposites.
I'm convinced the main thing lost in getting kids to read, is that too many mistake interaction with the words as automatic. It isn't. Taking apart a word symbol by symbol and putting it back together in a different form is the entire point of teaching how to read. And if you don't teach kids to do that with words, are you surprised when they can't do it with equations?
I think you misunderstand. In a largely phonetic language, almost everyone learns to read in one school semester, after which it's a fully solved problem - no spelling bees or anything. Peculiarly, you don't need spelling bees either when learning English later. ("Contronyms" and "words" are orthogonal to reading as they apply to spoken language too (and it's very much automatic).)
It's always seemed crazy to me when people talk about "the cause" of dyslexia. There are so many brain processes involved in reading, that any number of issues could be a cause of someone's dyslexia. It's like saying "the cause" of water drainage issues is beavers.
They all activate different regions of the brain and clearly are being processed in different ways.
https://pmc.ncbi.nlm.nih.gov/articles/PMC2782536/
Anyway, from my experience with my daughter, the step from single letters to syllables is difficult.
Is bad grammar one of the reasons, even though the title suggests there is just one?
But the details of the new study seem to support exactly that original idea.
Perhaps a little more detail on why, and on what kinds of smart; but it was a pretty broad set of mental skills that mattered.
"Since the 1990s, the phonological deficit hypothesis has been the dominant explanation favored by researchers" https://en.wikipedia.org/wiki/Phonological_deficit_hypothesi...