5. Use HTML5 and other emerging standards
Not every method of dealing with Web outages is centered on the hardware or the connection between the site and the user. New standards, especially HTML5, have built-in mechanisms for making a site more reliable. Many of those mechanisms use programming techniques that make site-to-site and site-to-browser communication more robust.
"HTML5 is a very important advance in browser capabilities," says Michael Gordon, the chief strategy officer and co-founder of Limelight Networks Inc., a CDN provider based in Tempe, Ariz. Features likely to be important for enterprises, he says, include the canvas tag, which provides dynamic rendering of bitmap images (think Flash-like 2D drawings) that will "significantly advance user interfaces"; the postMessage API, which will allow one Web server to communicate with another through a user's browser; and the client-side storage API, which will allow Web applications to store files on a user client.
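The client-side storage API bears most directly on outage resilience: an application can cache data on the user's machine and keep serving it when the origin server is unreachable. Below is a minimal sketch of that pattern. The function names and cache key are illustrative, and an in-memory stand-in is substituted where a browser's `localStorage` is not available:

```javascript
// Sketch of an outage-tolerant cache built on HTML5 client-side storage.
// In a browser this would be window.localStorage; the in-memory shim
// below is a stand-in so the pattern can run anywhere.
const storage = (typeof localStorage !== 'undefined') ? localStorage : (() => {
  const map = new Map();
  return {
    getItem: (k) => (map.has(k) ? map.get(k) : null),
    setItem: (k, v) => { map.set(k, String(v)); },
  };
})();

// Save a successful server response so it can be replayed offline.
function cacheResponse(key, data) {
  storage.setItem(key, JSON.stringify({ data, savedAt: Date.now() }));
}

// fetchFn is whatever normally talks to the server; if it throws --
// the outage case -- fall back to the last cached copy.
function fetchWithFallback(key, fetchFn) {
  try {
    const data = fetchFn();
    cacheResponse(key, data);
    return { data, fromCache: false };
  } catch (err) {
    const cached = storage.getItem(key);
    if (cached === null) throw err; // no cached copy either
    return { data: JSON.parse(cached).data, fromCache: true };
  }
}
```

When the server call fails, the user sees the last good data instead of an error page, which is exactly the kind of desktop-like behavior Gordon describes.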
In general, Web programming will become more like desktop programming, where data exchange, interface elements and APIs are more stable, and emerging technologies go through a rigorous testing process. One example of this is OpenID, which provides desktop-like functionality (in this case, letting users log in to one site with credentials from another) to streamline development. The reusable code and predictable structure of OpenID, OpenSocial, OAuth and other Web standards will make the Web more reliable in the long run.
Not everyone agrees that these standards will promote better Internet uptime, however. Clearly, new standards encourage better programming methods, but they may also lead to even more Web applications and put greater strain on the network.
"HTML5 will allow Web application developers to build richer desktop-like applications, and we will continue to see less dependence on operating systems and more dependence on the Web and Web browsers" to perform the most common tasks, such as managing data shared between Web applications, says Web developer Crosland. All of this, in turn, could ironically make the Web even more congested.
In the end, most experts -- including analysts Skorupa and Staten -- insist that 100% uptime for every site on the public Internet is not necessarily the goal, and that site operators should still plan for occasional outages.
"You can't ensure your site will never go down," says Forrester's Staten. "Each company has to find the right balance between best efforts for availability and the cost of doing so."
John Brandon is a veteran of the computing industry, having worked as an IT manager for 10 years and as a tech journalist for another 10. He has written more than 2,500 articles and is a regular Computerworld contributor.