Are you ready for AJAX risks?

It's a cool collection of technologies, but there's a downside to those slick user interfaces. Here's how to be best prepared.


Trainability

On the public Web, application users are generally not trainable because they start off with a weak relationship to the vendor. How trainable your audience is depends on the nature of that relationship, on users' own motivation to learn, on the depth of training required, and, of course, on their attention span. Training for a Web application might include on-site demonstrations, embedded Flash movie tutorials, or printed instructions. In a consumer-targeted application, switching costs are generally low, and users are poorly motivated to acclimate to a new interface or workflow. Factors that affect trainability include the following:

  •  Strength of the relationship -- Employees are much more likely to be motivated to learn a new workflow than strangers on the Web. Existing customers are also more likely to take the time to learn than new sales leads.

  •  Payoff for the user -- People are more motivated to learn if there is a payoff, such as getting free access to a valuable service, being entertained, or getting to keep their job. If the payoff is ambiguous or not valuable enough, users are less motivated to learn.

  •  Difficulty of the task -- More difficult tasks require a greater commitment to learn.

In the enterprise, we generally have more influence over our users than in consumer-vendor relationships. In other words, our ability to get users to learn a new interface is stronger. That said, the importance of getting user acceptance can't be overstated. End-user rejection is one of the major causes of software project failure (see Patterns of Software Systems Failure and Success, by Capers Jones; International Thomson Computer Press, 1996).

Legal

Web accessibility is an issue that links the legal environment to the technical world of Web application design. In the U.S., Section 508 dictates how government organizations can build software, and it limits the use of rich Internet applications (RIAs) -- at least to the extent that they must still support assistive devices such as text-to-speech software. There are ways of building accessible AJAX applications, but some corporations might believe that, because they are in the private sector, they are immune to lawsuits. In fact, there have been efforts to sue private corporations over inaccessible Web sites under the Americans with Disabilities Act (ADA), such as the widely publicized Target Corp. Web site case in 2006. Accessibility will become an increasingly topical issue as RIAs become the norm. Fortunately, key organizations are attempting to address the issue with updated legislation and software solutions.
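One of those ways, sketched below, is to mark the region that an XHR callback updates as a WAI-ARIA live region, so that text-to-speech software announces the injected content. This is only a minimal illustration: the element ID, the /news URL, and the degree of ARIA support in any given browser and assistive device are all assumptions.

    <!-- Live region: assistive technology announces content injected here -->
    <div id="news" aria-live="polite"></div>

    <script type="text/javascript">
    function loadNews() {
        var xhr = new XMLHttpRequest();  // older IE would need an ActiveX fallback
        xhr.open("GET", "/news", true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                // Writing into the live region is what triggers the announcement
                document.getElementById("news").innerHTML = xhr.responseText;
            }
        };
        xhr.send(null);
    }
    </script>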

Section 508

Section 508 of the Rehabilitation Act requires that U.S. government organizations use computer software and hardware that meets clearly defined standards of accessibility. Although Section 508 doesn't require private sector companies to conform to the standards, it does provide strong motivation by requiring Federal agencies to use vendors that best meet the standards.

Telecommunications Act

Unlike Section 508, Section 255 of the Telecommunications Act does apply to the private sector. It requires that telecommunications products and services be accessible whenever that is "readily achievable" -- a vague and wide-reaching standard.

ADA

The Americans with Disabilities Act (ADA) requires accessibility in the provision of public services and employment. It empowers employees to ask for "reasonable accommodations" throughout the enterprise, including intranet sites, software, and hardware. The ADA has also been applied to the Web sites of organizations and businesses -- as in the Target lawsuit -- raising concern across the country about suddenly heightened legal exposure.

Marketing Risks

All organizations should be concerned about marketing. Internet marketing has spawned a new breed of marketers who must understand search engine optimization and Web site monetization, as well as the target audience and its cultural and technological attributes. All the other risks mentioned here ultimately become marketing risks, because they affect an organization's ability to conduct its business online.

Search Engine Accessibility

Many organizations rely heavily on search-engine rankings for their business, and anything that might hurt those rankings is deemed unacceptable. Many marketers worry that using AJAX on a corporate site will mean pages no longer turn up in search-engine results pages (SERPs). This is a real and important consideration. It's also important to note that nobody but the search engines' insiders (the Google engineers) knows exactly how their technologies work. They don't want us to know, probably because knowing would give us an unfair advantage over people who are trying to make good Web sites and deserve good rankings, too. Google's modus operandi has always been to reward people who make Web sites for users, not for search engines. Unfortunately, in practice, this isn't even close to being true. Search engine optimization (SEO) is a veritable minefield of do's and don'ts, many of which could sink a Web site for good.

Before we look at this in more detail, we should begin with a brief overview. Search engines use special programs called bots to scour the Web and index its contents. Each engine uses different techniques for finding new sites and weighting their importance. Some allow people to submit specific sites, and even specific hyperlinks, for indexing. Others rely on the organic evolution of inbound links -- direct links from other sites that are already in the search engine's index -- to "point" the bots in the right direction. The problem with bots is that they are not proper Web browsers. Google, for example, previously used an antiquated Lynx browser to scour Web pages, meaning it was unable to evaluate JavaScript and read the results. More recently, Google appears to have upgraded its crawler technology to use a Mozilla variant (the same engine that Firefox uses), and there is evidence that the Google crawler (a.k.a. Googlebot) is now capable of clicking JavaScript-loaded hyperlinks and executing the code behind them.
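The practical implication is to keep real URLs in the markup even when JavaScript handles the click, so that a bot with no JavaScript support still has something to follow. A minimal sketch (the loadPage function and the URLs are illustrative, not part of any particular framework):

    <!-- Opaque to a script-less crawler: there is no real URL to follow -->
    <a href="javascript:void(0)" onclick="loadPage('about')">About</a>

    <!-- Crawlable: the href works without JavaScript, and returning false
         lets the script intercept the click in browsers that run it -->
    <a href="/about.html" onclick="loadPage('about'); return false;">About</a>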

With Google using Mozilla, common sense points to the likelihood that Googlebot can indeed interpret JavaScript, but that doesn't necessarily make AJAX search-engine-accessible. For a page to turn up in Google SERPs, it must have a unique URL, which means that content loaded as part of an XHR request is not directly indexable. Even if Google captures the text resulting from an XHR, it cannot direct people to that application state through a simple hyperlink. This hurts SERPs.
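A common workaround is to mirror each XHR-loaded state in the URL fragment, which at least gives the state a bookmarkable, linkable address -- though whether an engine will index such URLs is another question. A sketch, where loadArticleViaXhr stands in for whatever XHR routine actually fetches the content:

    // Record the current AJAX state in the fragment so the view has a
    // URL that can be bookmarked and shared, e.g. /news.html#article-42
    function showArticle(id) {
        window.location.hash = "#article-" + id;
        loadArticleViaXhr(id);  // illustrative XHR loader, not shown
    }

    // Restore the state when a visitor arrives via such a link
    window.onload = function () {
        var match = window.location.hash.match(/^#article-(\w+)$/);
        if (match) {
            loadArticleViaXhr(match[1]);
        }
    };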

Google is not the only search engine, however, and other engines (MSN Search and Yahoo) are reportedly even less forgiving when it comes to JavaScript. That doesn't necessarily mean a site must be AJAX- or JavaScript-free, because bots are actually good at skipping over what they don't understand. And if an application sits behind the firewall or a log-in, SERPs won't matter and all of this can be disregarded. It does, however, reinforce that using AJAX to draw in key content is perilous whenever SERPs on that content are important.
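One defensive pattern here is progressive enhancement (sometimes called "Hijax"): serve the key content as ordinary, indexable pages and let JavaScript hijack the links in browsers that can run it. The sketch below assumes a container with the ID "articles" and reuses the illustrative loadArticleViaXhr from above:

    // Every article link points at a real, indexable page; the script
    // merely intercepts the click in browsers that can run it
    window.onload = function () {
        var container = document.getElementById("articles");
        var links = container.getElementsByTagName("a");
        for (var i = 0; i < links.length; i++) {
            links[i].onclick = function () {
                loadArticleViaXhr(this.href);  // rich path for JS users
                return false;                  // cancel the full page load
            };
        }
    };

Bots and script-less browsers follow the plain hyperlinks; everyone else gets the AJAX experience.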

The allure of a richer user experience might tempt developers to try one of many so-called black-hat techniques to trick the search engines into indexing the site. A site caught using them can land on a permanent blacklist. Some examples of black-hat techniques follow:

  •  Cloaking -- Detecting a search engine's user-agent string (such as Googlebot's) and redirecting it to a mirror site that is search-engine accessible (a sketch of this pattern follows the list).

  •  Invisible text -- Hiding content in invisible places on the page (hidden SPANs, or elements absolutely positioned off the screen) purely to improve SERPs.

  •  Duplicate content -- Setting up mirror pages with the same content but little or no JavaScript, in the hope of getting that content indexed while most visitors are directed to the "real" version. This is sometimes combined with cloaking.
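For recognition rather than imitation, the cloaking referred to above usually amounts to nothing more than server-side user-agent sniffing. The sketch below expresses the pattern as a Node.js handler purely for concreteness -- the platform, URLs, and markup are all assumptions -- and it shows exactly the behavior that can get a site blacklisted:

    // What NOT to do: serve crawlers different content than human visitors.
    var http = require("http");

    http.createServer(function (req, res) {
        var ua = req.headers["user-agent"] || "";
        if (/Googlebot/i.test(ua)) {
            // The bot is shunted to a static, script-free mirror
            res.writeHead(302, { "Location": "/static-mirror.html" });
            res.end();
        } else {
            res.writeHead(200, { "Content-Type": "text/html" });
            res.end("<html><!-- rich AJAX version for people --></html>");
        }
    }).listen(8080);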

Given the current state of Googlebot technology, some factors increase the risk of search engine inaccessibility:

  •  AJAX is used for primary navigation (navigation between major areas of a site).

  •  The application is content-driven and SERPs are important.

  •  The links that search engine bots follow cannot be indexed usefully -- the URLs cannot be displayed by browsers without some sort of redirection.
