Discussion about this post

Neural Foundry

Solid breakdown of how selective data presentation works. The "up to" framing is brutal once you see it – it technically allows for any distribution where at least one student hit the max gain, which could mean most kids saw way less. I've reviewed enough education pilots to know that when methodology details are missing and numbers are announced at holiday timing, someone upstream made a calculation about attention spans. What bugs me is the opportunity cost: parents who actually want to support learning at home get handed a platform without the context to judge whether it solves their kid's actual bottlenecks. If the phonics gains are real and sustained, great – but the way they're communicated undermines rather than builds the trust needed for harder conversations down the road when results are mixed.

Wonder Out Loud

Thanks for this article. This has been niggling at me too. The data tells us children learned the skills that were taught. That’s encouraging. What it doesn’t yet tell us is whether those skills lead to stronger reading and writing over time. That’s not an attack on teachers or families - it’s the next question any serious education system should ask.

4 more comments...
