Are you ready for AJAX risks?

It's a cool collection of technologies, but there's downside to those slick user interfaces. Here's how to be best prepared.

Page 4 of 5

Reach

Reach risk is as much a marketing issue as a technical one. The problem with AJAX is that not everyone can use it. Even if an AJAX application supports the majority of browser variants, there is still a segment of users who will not have JavaScript enabled in their browsers. This might be because they work in a tightly controlled corporate environment where security is paramount, or simply because they turned it off to avoid pop-ups and other intrusive dynamic behaviors. Between 3% and 10% of the general public has JavaScript disabled at any given time.

Reach is also affected by every other risk mentioned here. Ranking lower in search engine results pages (SERPs) affects reach because fewer people are exposed to the site. Losing users because the interface is too new or innovative naturally affects reach, as does losing people to browser upgrades that break Web site functionality. The only way to totally minimize reach risk is to eliminate all but the most basic, correctly formatted HTML.

Monetization

Internet marketers are also quickly realizing that AJAX throws a popular Web site revenue model into disarray. Although it's true that Google AdSense uses a CPC (cost per click) model, many other advertising-driven sites use the CPM (cost per thousand impressions) model, which rewards advertisers for mere page views. The idea is that the value of advertising has more to do with branding and recognition than with direct conversions. Whether or not this is true, under CPM an average click-through is expensive, because ads generally get low click-through rates (sometimes 0.1% or less). AJAX creates a problem for CPM because if hyperlinks trigger an XHR instead of a full page load, the ad does not register another impression. The advertiser still reaps the benefits, but the Web site loses revenue. Simply implementing a trigger to refresh the ad on a page event (such as an XHR) might not be a fair way to solve the problem, either: disagreements are bound to surface about what kind of request should fairly trigger an impression, and the magic of XHR and JavaScript might seem too ambiguous for advertisers wary of impression fraud. Such an event system also lacks a directly comparable baseline across Web sites; if one site loads more content on each XHR or uses more pagination than another, its impression count can be artificially inflated.
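To make the ad-refresh idea concrete, here is a minimal sketch of counting an ad impression on each XHR content swap. The function and parameter names (makeImpressionCounter, recordImpression, adSlotId) are illustrative assumptions, not a real ad-network API:

```javascript
// Hypothetical sketch: register a new ad impression each time an XHR
// content swap completes. recordImpression stands in for a real call
// to the ad server; all names here are assumptions for illustration.
function makeImpressionCounter(recordImpression) {
  var count = 0;
  return {
    // Call this from the XHR success handler, after new content is shown.
    contentSwapped: function (adSlotId) {
      count++;
      recordImpression(adSlotId);
    },
    impressions: function () {
      return count;
    }
  };
}
```

Even with a counter like this, the fairness questions above remain: the site, not the ad network, decides which XHRs call contentSwapped().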

Risk Assessment and Best Practices

The number of variables involved in evaluating the role of AJAX in your project can be a bit overwhelming. The important thing to remember is that all software projects have risk, and AJAX is no different in this regard. We have already discussed some of these risks; following are a few strategies for reducing overall risk.

Use a Specialized AJAX Framework or Component

Save time by leaving browser compatibility and optimization issues to the people who know them best. Well-optimized third-party AJAX frameworks and components are available that have already solved many of the cross-browser issues, and many are maintained quite aggressively with regular updates. This can be a cost- and time-saving approach well worth any newly introduced risk. Judge a framework or tool by how long it has been in continuous development and by the quality of support available, and balance that against the degree of dependence you are prepared to build on it.

AJAX Framework and Component Suite Examples

Dojo, open-source

Prototype, open-source

DWR, open-source

Nitobi, commercial

Telerik, commercial

Progressive Enhancement and Unobtrusive JavaScript

Progressive Enhancement (PE) can be an excellent way to build AJAX applications that function well even when the client browser can't execute the JavaScript and perform the XHRs. PE differs from Graceful Degradation: in the latter, we build rich functionality first and then add some mechanism for degrading the page so that it at least looks okay in incompatible browsers. PE is sometimes also referred to as Hijax.

  •  PE essentially means that you should write your application in such a way that it functions without JavaScript.

  •  Layer on JavaScript functionality after the application is working.

  •  Make all basic content accessible to all browsers.

  •  Make all basic functionality accessible to all browsers.

  •  Be sure enhanced layout is provided by externally linked CSS.

  •  Provide enhanced behaviors with unobtrusive, externally linked JavaScript.

  •  See that end-user browser preferences are respected.

In PE, we begin by writing the application with a traditional post-back architecture and then incrementally enhancing it to include unobtrusive event handlers (not using embedded HTML events, but in externally referenced JavaScript) linked to XHR calls as a means for retrieving information. The server can then return a portion of the page instead of the entire page. This page fragment can then be inserted into the currently loaded page without the need for a page refresh.

When a user visits the page with a browser that doesn't support JavaScript, the XHR code is ignored, and the traditional model continues to function perfectly. It's the opposite paradigm of Graceful Degradation. By abstracting out the server-side API [application programming interface], it's possible to build both versions with relatively little effort, but some planning is required.

This has benefits for accessibility (by supporting a non-JavaScript browser), as well as Search Engine Optimization (by supporting bookmarkable links to all content).

Following is an example of unobtrusive enhancement to a hyperlink. In the first code snippet, we show a hard link to a dynamic page containing customer information.

<a href="showCustomerDetails.php">Show Customer Details</a>

In the next snippet, we see the same link, only now we intercept the click and execute an AJAX request for the same information. By calling our showCustomerDetails.php page with the parameter contentOnly=true, we tell it to output only the content, without any of the page formatting. After the AJAX request returns the content, we can use DHTML to place it on the page.

<a href="showCustomerDetails.php"
onclick="returnAjaxContent('showCustomerDetails.php?contentOnly=true', myDomNode); return false;">
Show Customer Details
</a>


When a user without JavaScript clicks the link, the contents of the onclick attribute are ignored, and the page showCustomerDetails.php loads normally. If the user has JavaScript, the page is not loaded (because of the return false at the end of the onclick); instead, the AJAX request fires, using the returnAjaxContent() method that we just made up but that would handle the XHR in this example.
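Because returnAjaxContent() is left undefined in the article, here is one minimal sketch of what it might look like, assuming a classic XMLHttpRequest and a target DOM node. This is an illustration only, not the article's implementation:

```javascript
// Hypothetical sketch of returnAjaxContent(): fetch a page fragment via
// XHR and insert it into the given DOM node. Error handling and older
// ActiveX fallbacks for IE6 are omitted for brevity.
function returnAjaxContent(url, targetNode) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    // readyState 4 = request complete; status 200 = OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Insert the returned page fragment without a full page refresh
      targetNode.innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true); // asynchronous GET
  xhr.send(null);
}
```

Note that the server must actually honor contentOnly=true and return only the fragment; otherwise the full page, headers and all, would be injected into the node.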

What's even more preferable, and more in keeping with the progressive enhancement methodology, is to remove all inline JavaScript completely. In our example here, we can apply a unique CSS class to the link instead of using the onclick attribute:

<a href="showCustomerDetails.php" class="AJAXDetails">
Show Customer Details
</a>


Then, in our onload event when the page is downloaded to the browser, execute something like the following in externally referenced JavaScript to attach the event to the hyperlink:

function attachCustomerDetailsEvent() {
  // Find every anchor on the page and attach the handler to those
  // marked with the AJAXDetails class.
  var docLinks = document.getElementsByTagName("a");
  for (var a = 0; a < docLinks.length; a++) {
    if (docLinks[a].className.match("AJAXDetails")) {
      docLinks[a].onclick = function() {
        returnAjaxContent('showCustomerDetails.php?contentOnly=true', myDomNode);
        return false; // prevent the default full-page navigation
      };
    }
  }
}


This loops through all the <a> tags on the page, finds the ones marked with the class AJAXDetails, and attaches the event. This code is totally unobtrusive to a browser without JavaScript.

Google Sitemaps

Google has provided a way of helping it find the entirety of our sites for indexing: developers can define an XML-based Sitemap containing information such as the URLs of important pages, when they were last updated, and how often they change.

Google Sitemaps are helpful in situations where it is difficult to access all areas of a Web site strictly through the browseable interface. They can also help the search engine find orphaned pages and pages behind Web forms.

If an application uses unique URLs to construct Web page states, Sitemap XML can be a useful tool for helping Google find all important content, though it is no guarantee that it will. It also has the advantage of being one of the few SEO techniques actually sanctioned by Google.

Many free tools are available to assist with generating a Google Sitemap file, but one is easily created by hand if you can crawl your Web site and supply information about its important areas. Following is an example of a Google Sitemap XML file:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
   <url>
      <loc>http://www.nitobi.com/</loc>
      <lastmod>2007-10-01</lastmod>
      <priority>1.0</priority>
   </url>
   <url>
      <loc>http://www.nitobi.com/products/</loc>
      <lastmod>2005-10-03T12:00:00+00:00</lastmod>
      <changefreq>weekly</changefreq>
   </url>
   <url>
      <loc>http://www.nitobi.com/news/</loc>
   </url>

</urlset>

The loc tag provides the URL of the page. lastmod describes when it was last updated, changefreq gives Google an idea of how often the content changes, and priority is a number between 0 and 1 that indicates the page's importance relative to the rest of your site. In general, it's not advantageous to set every page to 1.0: priority is relative within your own site, so doing so will not increase your ranking overall. New articles or pages should typically receive a higher priority than, for example, a relatively static home page.
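The Sitemap format above is simple enough to generate from a list of page records. The following sketch builds the XML as a string; the function name buildSitemap and the input record shape are assumptions for illustration (a real generator should also XML-escape the URLs):

```javascript
// Hypothetical sketch: build a minimal Sitemap XML string from an array
// of page records. Only loc is required; lastmod, changefreq, and
// priority are optional, matching the example file above.
function buildSitemap(pages) {
  var xml = '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n';
  for (var i = 0; i < pages.length; i++) {
    var p = pages[i];
    xml += '  <url>\n';
    xml += '    <loc>' + p.loc + '</loc>\n';
    if (p.lastmod)    { xml += '    <lastmod>' + p.lastmod + '</lastmod>\n'; }
    if (p.changefreq) { xml += '    <changefreq>' + p.changefreq + '</changefreq>\n'; }
    if (p.priority)   { xml += '    <priority>' + p.priority + '</priority>\n'; }
    xml += '  </url>\n';
  }
  return xml + '</urlset>';
}
```

Regenerating the file from your site's own page database on a schedule keeps the Sitemap current without manual editing.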

After a Sitemap file has been created, Google must be made aware of it, which can be done through Google's Webmaster Tools. Within a short time, the file will be downloaded and then re-downloaded at regular intervals, so be sure to keep it up-to-date.
