Constantly Querying Server via JavaScript - Good Idea?


I've got a small website that has about 5-10 administrators. I've set it up to monitor what each administrator is doing (adding items, removing items, etc.). I have a list within our admin panel that shows the previous 10 activities performed by the collective administration. Today, I decided to make this self-updating every 30 seconds.

My question is simple: is there any problem doing this? I'm calling a small bit of text with each request, and the request is likely only running on 3 or 4 computers at a time (reflecting the number of concurrent-administrators logged in).

    $(document).ready(function () {
      setInterval(activity, 30000); // pass the function itself, not a string
    });

    function activity() {
      $("#recent_activity").load("../home/login #recent_activity .data");
    }

This produces the following (or similar, only with 10 rows) with each request:

    <table>
      <tbody>
        <tr>
          <td><p>jsampson</p></td>
          <td><p>logged out</p></td>
          <td><p>28 minutes 27 seconds ago</p></td>
        </tr>
        <tr>
          <td><p>jdoe</p></td>
          <td><p>logged in</p></td>
          <td><p>29 minutes 45 seconds ago</p></td>
        </tr>
      </tbody>
    </table>


3-4 users every 30 seconds isn't very much at all. Even 300 users at that rate wouldn't be much at all.


You can cache this as well, and that would be advisable, especially if the query that generates the page is computationally heavy. Of course, take into account how much lag you can tolerate in displaying the most recent content.


You should cache this and only update the cache every 30 seconds.
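
A minimal sketch of that idea, with the clock injected so the cache is easy to test. The names (`makeCache`, `loadActivity`) are hypothetical, not from the original answer; the point is that the expensive query runs at most once per TTL window, no matter how many clients poll:

```javascript
// Time-based cache: re-run the expensive loader only when the cached
// entry is older than ttlMs. `now` returns the current time in ms.
function makeCache(loadActivity, ttlMs, now) {
  let cached = null;
  let loadedAt = -Infinity;
  return function get() {
    if (now() - loadedAt >= ttlMs) {
      cached = loadActivity(); // the real query would run here
      loadedAt = now();
    }
    return cached;
  };
}
```

With a 30-second TTL, ten administrators polling every 30 seconds still trigger roughly one query per interval instead of ten.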


No, there shouldn't be any problem at all. I do the same thing at 1 minute intervals for a notification system I wrote on my company's intranet portal. Any web server should be able to handle this, honestly.

It's really no worse (in fact, significantly better) than them, say, refreshing their browser every 30 seconds... Considering the significantly smaller data transfer, it would likely be something on the scale of 10-20 times better than refreshing it... or, about the same bandwidth of refreshing once every 5-10 minutes. :-)


No issue at all in my opinion. Sites that run on a much bigger scale (Betfair, for instance) use hundreds of XHR calls per minute per connected client. Obviously they have much bigger hardware infrastructures, but the browser copes fine.

I have sites that use a smaller interval, and they scale to a few hundred concurrent users running from a single web server.


I don't think it would pose a problem.


I would say it would primarily depend on how expensive that query is.

Although it is a low number of users now, will it always be so?


As altCognito pointed out -- the web traffic from this isn't likely to be an issue.

The only thing I'd check is whether the database load required for this will be an issue. That is, if this is fed by a query that takes some time to run, it'll cause problems. If that's the case, I'd recommend adding some caching to the data, or maintaining the data for this in memory instead of the DB (only loading from the DB on startup, and adding things to this in-server-memory list as they happen).
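
The in-memory list the answer describes could look something like this sketch (names and structure are illustrative, assuming a Node-style server): seed the array from the DB once at startup, append events as they happen, and let polling requests read the array without touching the database.

```javascript
// Hypothetical in-memory activity log, newest entry first.
// Polling requests serve this array directly; the DB is only
// read once at startup to seed it.
const MAX_ENTRIES = 10;
const recentActivity = [];

function recordActivity(user, action, when) {
  recentActivity.unshift({ user: user, action: action, when: when });
  if (recentActivity.length > MAX_ENTRIES) {
    recentActivity.pop(); // drop the oldest entry beyond the last 10
  }
}
```

Each admin action calls `recordActivity(...)` once; the 30-second poll then becomes a trivial array read.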


Weigh it against the alternative. If each user was refreshing the page every 30 seconds, loading the whole page, the amount of server side processing and traffic generated would be much greater than just refreshing the "interesting parts."

This is what AJAX was made for.


Did you see the Google Wave preview...? For such a small number this is not an issue, especially as the administrators will know about this. (It's not like you're putting some burden on some visitor's CPU or mobile internet connection.)


That many users are not going to bring down your server.

If you are really concerned about performance, here are two pieces of advice:

  • Use JSON for sending data to the client; it will be lighter than formatted HTML.
  • Use a fixed date for history data and compute the relative time on the client; this allows you to cache the history data.

    {"user":"jsampson","action":"logged out","date":"20090622T170822"}
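
The client-side half of that advice might look like the following sketch, which parses the compact UTC timestamp format from the example above and renders it as relative time. The parsing and wording are assumptions for illustration, not part of the original answer:

```javascript
// Parse a compact UTC timestamp like "20090622T170822" into ms since epoch.
function parseCompactDate(s) {
  return Date.UTC(
    Number(s.slice(0, 4)),      // year
    Number(s.slice(4, 6)) - 1,  // month (0-based in JavaScript)
    Number(s.slice(6, 8)),      // day
    Number(s.slice(9, 11)),     // hours (index 8 is the "T")
    Number(s.slice(11, 13)),    // minutes
    Number(s.slice(13, 15))     // seconds
  );
}

// Render "N minutes M seconds ago" relative to nowMs, so the server can
// send the same fixed timestamp every time and the payload stays cacheable.
function relativeTime(dateString, nowMs) {
  const elapsed = Math.floor((nowMs - parseCompactDate(dateString)) / 1000);
  const minutes = Math.floor(elapsed / 60);
  const seconds = elapsed % 60;
  return minutes + " minutes " + seconds + " seconds ago";
}
```

Because the timestamp never changes for a given event, the JSON response is identical between polls until new activity occurs, which is exactly what makes it cacheable.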


Yeah, this shouldn't be a problem in the slightest.

That being said, if you're concerned about the amount of data that's sent back you can always make the call send back a simple flag IF there's new data and then go fetch the data if that's the case.
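
The flag-then-fetch idea can be sketched with the transport injected, so the logic is testable; in the real page `getFlag` and `getData` would be the AJAX calls (both names are hypothetical). Only when the cheap flag request reports new data is the heavier payload requested:

```javascript
// One polling cycle: ask the server for a cheap "anything new?" flag,
// and only fetch and render the full payload when the flag is set.
function pollOnce(getFlag, getData, render) {
  if (getFlag()) {
    render(getData());
  }
}
```

On the page this would run on the same timer as before, e.g. `setInterval(function () { pollOnce(checkFlag, fetchActivity, updateTable); }, 30000);`.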

But yeah, with that number of users you shouldn't have any problems. A custom RoR-based message board I frequent uses a similar technique to see if threads have been updated while you're reading it and it has upwards of 100 users hitting it concurrently without any problem.


There are several different ways to emulate server push over HTTP. Such techniques recently got a fancy name: Comet.

If your server configuration allows for indefinitely running scripts, you could use an iframe to implement the panel and use chunked transfers (e.g. via PHP's flush()) to create a persistent HTTP connection. Such a solution should have the least overhead if the time interval between consecutive messages is short. For long intervals, client-side polling is preferable as no TCP connection has to be maintained.


I think what you did is great.

If I were doing this project, I'd start with what you have and add a setTimeout() event to increment the minutes/seconds display every second.

The users would perceive the display to be real-time, and they'd probably never hit the page refresh.

The danger with updating only every 30 seconds is some people will reflexively hit refresh for "the latest" every time they turn their attention toward it.

Also consider specially marking anything with a time of less than five minutes, and color-coding logged in versus logged out. It'll be easier for people to scan, because they'll be able to pick out changes without reading all the text.
