Which makes sense, since it's a constantly updated site: regenerating all of the files (to update "what's new" lists and such) every time an article is written would be major overkill.
While I'm sure you guys already do this, proper caching can have a similar effect to a completely static site in terms of performance.
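To make that concrete, here's a minimal sketch of full-page caching with a TTL, using only the Python standard library. render_article() and the 60-second TTL are hypothetical placeholders, not anyone's actual setup:

    import time

    _cache: dict[str, tuple[float, str]] = {}  # slug -> (timestamp, html)
    TTL_SECONDS = 60

    def render_article(slug: str) -> str:
        # Stand-in for the expensive queries and templating.
        return f"<html><body>Article: {slug}</body></html>"

    def get_page(slug: str) -> str:
        now = time.monotonic()
        hit = _cache.get(slug)
        if hit is not None and now - hit[0] < TTL_SECONDS:
            return hit[1]  # cached HTML: no queries, no templating
        html = render_article(slug)
        _cache[slug] = (now, html)
        return html

In practice the cache would live in something like memcached or a CDN rather than a process-local dict, but the effect is the same: most requests never touch the database.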
I don't understand. Why not start with caching, then do all those things you mentioned when the site bogs down again? It'll be, after all, the same queries and the same rendering post-cache; the same techniques to improve them will still work with a cache in front.
So why not start with caching, especially if it buys you a large chunk of time to prepare the other stuff?
I read a lot of people saying caching should be a last resort, but it really depends on what you're doing. Often, caching is the easiest thing to implement for certain problems.
I've built sites with a signup page that had a lot of dynamic statistics on it. I could have spent plenty of time optimizing the calls, but it was far easier to just cache the whole page, and it's faster. Clearing the cache is usually not that bad either.
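Roughly what I mean, as a sketch; compute_signup_stats() and the TTL are made up for illustration. Cache the rendered stats block, and clear it explicitly wherever the underlying data changes:

    import time

    _stats_cache = None  # becomes a (timestamp, html) tuple once populated
    STATS_TTL = 300      # arbitrary; slightly stale stats on a signup page are fine

    def compute_signup_stats() -> str:
        # Stand-in for several slow aggregate queries.
        return "<div>12,345 users and counting</div>"

    def signup_stats_html() -> str:
        global _stats_cache
        now = time.monotonic()
        if _stats_cache is not None and now - _stats_cache[0] < STATS_TTL:
            return _stats_cache[1]
        html = compute_signup_stats()
        _stats_cache = (now, html)
        return html

    def clear_signup_stats() -> None:
        # "Clearing the cache" is one call after whatever write changes the stats.
        global _stats_cache
        _stats_cache = None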
If you're just slinging static HTML, there's really not that big of a difference, and if it's a frequently accessed resource where a difference might be expected to matter, it will be cached regardless.
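For what it's worth, even a plain static-file server can lean on that by sending a Cache-Control header so browsers and CDNs keep their own copy. A toy illustration, not anyone's real config (the 5-minute max-age is arbitrary):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class CachedStaticHandler(SimpleHTTPRequestHandler):
        def end_headers(self) -> None:
            # Let browsers and intermediaries reuse static responses briefly.
            self.send_header("Cache-Control", "public, max-age=300")
            super().end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), CachedStaticHandler).serve_forever()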
Once it's cached, though, loading new pages or new data is trivial.
It really depends on the use case: if I'm navigating my banking website, I'd accept a longer load time up front if it made navigating between accounts really fast. If I'm checking my power bill, I'm probably only going to view one page, so I just want it loaded ASAP.
So what's the thought process for choosing between that and cached dynamic pages? The first visitor doesn't have to pay the latency; serving a stale page while caching the new one also works (rough sketch below).
Otherwise static is 100% the way to go, if only to reduce how much code you have to deploy.
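Here's a rough, hypothetical sketch of the stale-page idea above: visitors always get whatever HTML is already cached, and a background thread rebuilds it once the TTL has passed. render_page() and the numbers are invented for illustration:

    import threading
    import time

    _cache = {}          # path -> (timestamp, html)
    _refreshing = set()  # paths currently being rebuilt in the background
    _lock = threading.Lock()
    TTL = 60.0

    def render_page(path: str) -> str:
        time.sleep(0.5)  # pretend this is an expensive dynamic render
        return f"<html><body>{path} rendered at {time.time():.0f}</body></html>"

    def _refresh(path: str) -> None:
        html = render_page(path)
        with _lock:
            _cache[path] = (time.monotonic(), html)
            _refreshing.discard(path)

    def get_page(path: str) -> str:
        with _lock:
            hit = _cache.get(path)
            if hit is not None:
                is_stale = time.monotonic() - hit[0] >= TTL
                if is_stale and path not in _refreshing:
                    _refreshing.add(path)
                    threading.Thread(target=_refresh, args=(path,), daemon=True).start()
                return hit[1]  # fresh or stale, nobody waits on a rebuild
        # Cold cache: only the very first visitor ever pays the render cost.
        html = render_page(path)
        with _lock:
            _cache[path] = (time.monotonic(), html)
        return html

In real deployments you'd usually get this from a reverse proxy or CDN's stale-while-revalidate support rather than hand-rolling it, but the control flow is the same.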