Tutorial: How fast should a dynamically generated web page be created?



Question:

I have a number of data-driven, web-based applications that serve both internal and public users, and I would like to gauge how quickly you would expect a page to be generated (in milliseconds) in order to maintain user satisfaction and scalability.

So, how fast does a page have to be created to maintain a fast site?

The sites are developed in classic ASP, with a SQL Server backend generating XML recordsets that I render using XSLT. Not the most efficient technique, and pages take between 7ms and 120ms to create (i.e. the Timer interval between the first line of code and the 'Response.Write'), depending on the complexity of the page. The slower pages are due to the database running bigger, more complex queries. Even if I rewrote all the classic ASP as ASP.NET, there would not be any significant improvement to the overall page render speed.
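To make the measurement above concrete, here is a minimal sketch in JavaScript of timing only the generation step. The `renderPage` function is a hypothetical stand-in for the page's ASP logic; in classic ASP you would bracket the same interval with VBScript's `Timer` instead of `Date.now()`:

```javascript
// Hypothetical render function standing in for the page's server-side logic.
function renderPage() {
  const rows = [];
  for (let i = 0; i < 1000; i++) {
    rows.push("<tr><td>Row " + i + "</td></tr>");
  }
  return "<table>" + rows.join("") + "</table>";
}

// Time the generation step only, mirroring the interval between the
// first line of code and the Response.Write described in the question.
const start = Date.now();
const html = renderPage();
const elapsedMs = Date.now() - start;

console.log("Generated " + html.length + " chars in " + elapsedMs + "ms");
```

Measuring this interval per page type is what lets you tell whether the slow pages are slow in the rendering code or in the database queries feeding it.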

I've often heard Jeff say he wants SO to be the fastest site, and his blogs have discussed optimisation of his code and database, but how far do you have to go in optimising your code? Is shaving off milliseconds by using a StringBuilder instead of String + String a good use of my time?

[Clarification]

At what point do you start to think "this page is taking too long to create"? Is it over 20ms, over 200ms, or is it OK for a page to take over a second to build? What are your target times?


Solution:1

This depends entirely on your audience and targets. I've worked on apps with a target 'onload' event of under 4 seconds, and on apps where the time on the server was expected to be under 1ms. It can go either way, but whatever you do, be aware that any performance optimisations you make at the server-side level are likely to be dwarfed by network performance (still the major bottleneck on the web) and by perceived load time.

Yahoo has some excellent guidelines for general website performance, especially in the area of perceived load time.

Hopefully you're already smart enough to cache what you can and to do the little things, like avoiding massive chains of Response.Write calls...


Solution:2

Users don't care how fast you prepare your data; they only care about the actual loading time of the page.

If you have a lot of overhead in rendering, your users will experience your site as slow. In classic ASP, string concatenation is considered very bad practice: once a string grows past a certain length, repeatedly appending to it becomes a real burden on the server.

Using an array (in JScript) or a .NET StringBuilder can improve rendering time significantly, as well as shedding unnecessary CPU usage, which lets your server handle more traffic. I would say that kind of obvious optimisation is well worth doing.
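The array pattern the answer refers to can be sketched in JavaScript as below: build the page fragments in an array and join them once at the end, instead of growing one string. Note this is an illustration of the pattern, not a benchmark claim: modern JavaScript engines optimise `+=` heavily, but in classic ASP's VBScript, repeated `&` concatenation genuinely degrades as the string grows, which is where the join approach pays off:

```javascript
const N = 20000;

// Naive approach: grow one string by repeated concatenation.
let concat = "";
for (let i = 0; i < N; i++) {
  concat += "<li>Item " + i + "</li>";
}

// Array approach: collect fragments, then join once.
// This avoids repeatedly copying an ever-larger string.
const parts = [];
for (let i = 0; i < N; i++) {
  parts.push("<li>Item " + i + "</li>");
}
const joined = parts.join("");

console.log("both builds produced " + joined.length + " chars");
```

Both approaches produce identical output; the difference is purely in how much intermediate copying the runtime has to do.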


Solution:3

A very interesting screencast on this topic can be found here: link text .

Although it is made by a Rails guy, it is perfectly applicable to other frameworks.


Solution:4

If you can shave off milliseconds by just changing one thing, go for it!

You might want to have a look into caching database requests as well.
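Caching database requests can be as simple as an in-memory map with a time-to-live. The sketch below illustrates the idea; `runQuery`, the TTL value, and the SQL string are all illustrative assumptions, with `runQuery` standing in for a real (expensive) database round trip:

```javascript
// Minimal in-memory query cache with a time-to-live (TTL).
const cache = new Map();
const TTL_MS = 60 * 1000; // serve cached results for up to a minute

let queryCount = 0;
function runQuery(sql) {
  queryCount++; // pretend this is an expensive database round trip
  return "results for: " + sql;
}

function cachedQuery(sql) {
  const hit = cache.get(sql);
  if (hit && Date.now() - hit.at < TTL_MS) {
    return hit.value; // fresh enough: skip the database entirely
  }
  const value = runQuery(sql);
  cache.set(sql, { value: value, at: Date.now() });
  return value;
}

cachedQuery("SELECT * FROM news");
cachedQuery("SELECT * FROM news"); // second call is served from the cache
console.log("database hit " + queryCount + " time(s)");
```

For the question's slower pages, whose cost comes from big queries rather than rendering, this kind of caching tends to buy far more than micro-optimising string handling.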


Solution:5

One factor that affects how users perceive server response time is how often they request a new page. If you're presenting (say) a page with lots of information that the user is going to spend some time reading, a longer "rendering" time is okay. In contrast, if the person is quickly navigating through pages, they will want a near-instantaneous response.

For example, on a news site you will probably be okay if the next page takes a full second or two, since you're going to spend 30 seconds reading it.

On the other hand, if you're browsing through an interactive map, you probably want the response to be less than a second.

