I was getting that too all morning (about 4:30 - 8:00 EST)... just stopped about 10 minutes ago. During the outage Google Reader still had all the new stories from overnight, though.
We're up against some MzScheme memory limit. Usually it's not a problem because we lazily load items from disk. Eventually, though, the news process runs out of memory and starts GCing excessively. We have other software that notices this and restarts it, but this time it missed the problem.
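Roughly, the monitoring side works like the sketch below. This is just an illustrative sketch in plain MzScheme/Racket, not the actual watchdog, and restart-news-process! is a made-up hook standing in for whatever actually does the restart.

    ;; Sketch only: poll the heap size and call a restart hook when it
    ;; crosses a threshold (excessive GC tends to follow once the heap
    ;; nears the limit).  restart-news-process! is hypothetical.
    (define memory-limit (* 2 1024 1024 1024))   ; ~2 GB soft limit

    (define (start-memory-watchdog restart-news-process!)
      (thread
       (lambda ()
         (let loop ()
           (sleep 60)                            ; check once a minute
           (when (> (current-memory-use) memory-limit)
             (restart-news-process!))
           (loop)))))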
Is running out of memory a bug in MzScheme, or is it e.g. Hacker News not removing references to items not recently used (which could be lazily loaded from disk later if needed again)?
What I meant to ask was: is this an issue with MzScheme or with the Hacker News code? I'm asking because I have a project I'm writing in Arc on MzScheme, and it would be helpful for me to know if MzScheme has a garbage collection bug that was biting you ;-/
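By "removing references" I mean something like the sketch below: items live behind weak boxes so the GC is free to reclaim them, and a cache miss just reloads from disk. (Just a plain Racket/MzScheme sketch; load-item-from-disk is a made-up placeholder, not anything from the HN code.)

    ;; Sketch: id -> weak box of item; the GC may clear a box whose item
    ;; nothing else references, and we lazily reload from disk on a miss.
    (define item-cache (make-hash))

    (define (get-item id load-item-from-disk)
      (define boxed (hash-ref item-cache id #f))
      (or (and boxed (weak-box-value boxed))     ; still in memory?
          (let ([item (load-item-from-disk id)])
            (hash-set! item-cache id (make-weak-box item))
            item)))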
It's not a bug in MzScheme. The fundamental problem is that the server is a 32 bit machine with 4 GB of memory, only about 2.5 GB of which we can use. Really we should just upgrade to a 64 bit server.
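If you want to check what your own project is running against, something like this works in recent Racket/MzScheme (the 'word mode of system-type may not exist in very old versions):

    ;; Print the word size (32 or 64) and the current heap use in MB.
    (printf "word size: ~a bits\n" (system-type 'word))
    (printf "heap in use: ~a MB\n"
            (quotient (current-memory-use) (* 1024 1024)))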
Is something wrong with specific posts, with my account, or is some JavaScript mess-up causing posts not to load completely?