Tutorial: Why not just use AJAX for page requests to load the page content?



Question:

Many web pages reload all of their content just to change a small amount of information.

Now I would like to know: why shouldn't developers just use AJAX for the main page requests?

On my own web page, I would like to handle the main requests with just AJAX, but I don't know of any specific cons to this approach.

Does anybody have an idea why someone shouldn't use AJAX this much?


Solution:1

Search engines, crawlers/spiders, browsers with no JavaScript, screen readers, and other consumers of the content will not be very happy with it.

You can provide tons of AJAX behavior on top of your website if you already support standard server-side navigation for the full content. Have a look at progressive enhancement (SO) and progressive enhancement (wiki).


Solution:2

The whole premise really is that with AJAX you don't need to reload the whole page to update a small percentage of that webpage. This saves bandwidth and is usually much quicker than reloading the whole page.

But if you are using AJAX to load the whole page, this is in fact counterproductive. You have to write customised routines to deal with the callback of the AJAX data. It's a whole lot of extra work for little to no increase in performance.

General rule for where to use AJAX: if you're updating more than 50% of your page, just reload; otherwise, use AJAX.
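The 50% rule of thumb above can be sketched as a tiny helper. This is only an illustration of the heuristic, not an API from the answer; `shouldFullReload` and `refresh` are hypothetical names, and the `partial=1` query flag is an invented convention.

```javascript
// Hypothetical helper for the rule of thumb above: reload when more
// than half of the page changes, otherwise patch it in place with AJAX.
function shouldFullReload(changedFraction) {
  return changedFraction > 0.5;
}

// Sketch of how you might branch on it (the "reload" case would be
// window.location = url in a browser; the "ajax" case a fetch + DOM patch).
function refresh(changedFraction, url) {
  if (shouldFullReload(changedFraction)) {
    return { action: "reload", url: url };
  }
  return { action: "ajax", url: url + "?partial=1" };
}
```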


Solution:3

I'll give you one very good reason.

If you turn off javascript in the browser it won't work.


Solution:4

The biggest con is users who have JavaScript disabled. Your website simply won't work for them.


Solution:5

Aside from the answers already posted, using AJAX can have ugly side effects on browser control, such as the stop button not working.


Solution:6

One thing is that you want content to have a static URL: you want people to be able to link to your pages, bookmark them, etc.

If everything is ajaxified, this could be tricky and/or tedious.
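One way to keep AJAX-loaded content linkable is to give every fragment a stable URL and push it into the address bar as you load it. The sketch below assumes a made-up URL scheme like `/articles/42`; `urlForContent` is a hypothetical helper, not something from the answer.

```javascript
// Hypothetical: map a content fragment to a stable, bookmarkable path.
function urlForContent(section, id) {
  return "/" + encodeURIComponent(section) + "/" + encodeURIComponent(String(id));
}

// In a browser you would pair each AJAX load with the History API so the
// address bar stays shareable (illustrative only):
//   history.pushState({ section: section, id: id }, "", urlForContent(section, id));
```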


Solution:7

Well, if you want to AJAX-load new pages, the way Gmail works, I suggest your links be normal A HREF links that point to a true, fully rendered page URL, and also use an onclick event that stops the normal link navigation and makes your AJAX calls instead. The problem here is that you'll be doing almost double the coding unless you architect this all very well.

This way, the normal non-JS links load the full page, and the JS calls load only the new parts of the page. This means spider indexing works again, too.
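One way to cut down the "double coding" the answer warns about is to serve both paths from the same URL and derive the partial endpoint by a fixed convention. The `partial=1` query flag below is an invented convention for illustration, not part of any real framework.

```javascript
// Hypothetical convention: the AJAX endpoint is the normal page URL
// plus a "partial=1" flag the server uses to skip the page shell.
function partialUrl(fullUrl) {
  return fullUrl + (fullUrl.includes("?") ? "&" : "?") + "partial=1";
}
```

With this, the href stays a real page for non-JS users and spiders, while the onclick handler fetches `partialUrl(link.href)` and swaps only the content region.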


Solution:8

Well, you can always add the onclick event unobtrusively using jQuery and stop the normal URL handling.

Eg:

HTML

<a id="ajaxify-this" href="my-working-url">Click me to do AJAXy stuff if you have javascript</a>  

then Javascript

$(document).ready(function() {
    $("#ajaxify-this").click(function(e) {
        updateContent(); // do something AJAXy with the page
        return false;    // stop the click from causing navigation
    });
});


Solution:9

I use only JavaScript and EJS as the template engine for my own website. One step closer to SOFEA/SOUI.

Search engines, crawlers/spiders, browsers without JavaScript, and screen readers dislike it, right. But I follow the mainstream ;)

