wyntarvox ([personal profile] wyntarvox) wrote in [site community profile] dw_dev, 2009-05-20 07:20 pm

S2 Max Recursion and Comment Threads

I understand why the max recursion limit exists, but right now its handling of comment threads is pretty ugly.

For example, if a comment thread exceeds the limit, the layer dies as expected. However, there's no obvious indication beforehand that this is about to happen, so someone doing something as innocent as commenting effectively ruins the remainder of the comment threads on the entry. You can still see the rest of the comments if you view the entry in a site scheme, but that's not immediately obvious to Joe User.

I'll be the first to admit that I'm no programming expert, so I may be asking the impossible... Can we kill only the function that hits the recursion limit, in a way that lets the remaining comment threads on the entry still get printed? It was also mentioned to me that perhaps we could look into printing the comment threads iteratively rather than recursively (a rough sketch of that idea follows). At the least, can/should we prevent people from making the nth comment that would hit the recursion limit and therefore break the rest of that entry's comments?
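To illustrate the iterative idea, here's a minimal sketch in Python (not S2, and the Comment structure is purely hypothetical): an explicit stack replaces the call stack, so thread depth is bounded by memory rather than by a recursion limit.

    from dataclasses import dataclass, field

    @dataclass
    class Comment:
        body: str
        replies: list = field(default_factory=list)

    def print_thread(root):
        # Explicit stack instead of recursive calls: depth is bounded
        # by memory, not by the interpreter's recursion limit.
        stack = [(root, 0)]
        while stack:
            comment, depth = stack.pop()
            print("  " * depth + comment.body)
            # Push replies in reverse so siblings print in original order.
            for reply in reversed(comment.replies):
                stack.append((reply, depth + 1))

    print_thread(Comment("top", [Comment("first reply", [Comment("nested reply")])]))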

[personal profile] jimmyb 2009-05-21 01:15 am (UTC)
Well, of course, not dying at all would be much nicer, but if it HAS to die... let it be "friendly".

What sort of strain does loading that many comments at a time put on the server, though? If any...

[staff profile] mark 2009-05-21 01:46 am (UTC)
Tons of strain. :)

It's mostly loading the rows from talktext2, though, since that's potentially a LOT of data off the disks. If you have 250 comments on the page, all reasonably long (a couple kilobytes each), you're talking about loading a megabyte of data...
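Back of the envelope: call it roughly 4 KB per comment, and 250 comments × 4 KB ≈ 1,000 KB, so about a megabyte read off disk for a single page render, before any markup or metadata.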

Which is a lot. But, caching! You might mention caching! That should fix this problem, right?

Well, sure, to some extent. There's a compounding problem in popular communities where people are editing and posting new comments a lot, leading to some of the caches getting cleared fairly often.
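Roughly this pattern, if you sketch it in Python with hypothetical names (the real code lives in the Perl backend): any write to a thread throws away the whole cached copy, so busy communities keep missing the cache and going back to disk.

    cache = {}

    def get_comments(entry_id, load_from_db):
        # Return cached comment text for an entry, loading on a miss.
        if entry_id not in cache:
            cache[entry_id] = load_from_db(entry_id)  # the expensive talktext2 read
        return cache[entry_id]

    def on_comment_change(entry_id):
        # Any new or edited comment invalidates the whole cached entry,
        # so a busy thread rarely stays cached for long.
        cache.pop(entry_id, None)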

Of course, the counter-argument is: computers have gotten a lot faster since this limit was instituted a few years ago. We now have so much RAM available that we could, in theory, maintain huge cache hit rates and maybe give people everything they want and a bunny rabbit...

But I dunno. We'll have to play with it over time. I'll be gone for the next month, but I'm amenable to revisiting old LJ performance decisions to see if we can get around them.

[personal profile] jimmyb 2009-05-21 01:49 am (UTC)
Well, yes, YOU may have tons of RAM. But I've only got 2 GB in my box, so that's why I ask about the strain on the servers. :)