In the beginning, there were hits. Today, hits are largely discredited as a measure of Web site traffic, since they count individual files served up. A single Web page can account for a dozen or more hits if it has a lot of photos, while a text-only page could generate just a single hit.
These days, the Weberati talk of metrics such as page views, ad impressions and unique users. But don't be fooled by precise-sounding terminology and numbers. There are so many ways to define and count Web visits that traffic measurement is as much an art as it is a science.
For example, what counts as a page view? Is it when a Web page is first requested? When content has completely finished loading? Or when a tracking pixel—a tiny file placed on a page specifically for counting page views—is called? Such distinctions are important to Internet ad buyers, because the numbers can differ depending on the definition used. Consider the impatient user who requests a page but then hits Back or surfs elsewhere before that page—and its ad—loads.
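To make the distinction concrete, here's a bare-bones sketch of pixel-based counting in Python, using only the standard library. The /pixel.gif path and the in-memory counter are hypothetical illustrations, not any vendor's actual mechanism: a view is tallied only when the browser gets far enough to request the pixel, so an abandoned page load never registers.

```python
# Minimal sketch of tracking-pixel counting (standard library only).
# The endpoint name and counter are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

# A 1-by-1 transparent GIF, the classic "tracking pixel" payload.
PIXEL_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
             b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
             b"\x00\x00\x02\x02D\x01\x00;")

page_views = 0  # incremented only when the pixel is actually requested


class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        global page_views
        if self.path.startswith("/pixel.gif"):
            page_views += 1  # the page loaded far enough to call the pixel
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Cache-Control", "no-store")  # so cached copies don't hide repeat views
            self.end_headers()
            self.wfile.write(PIXEL_GIF)
        else:
            self.send_response(404)
            self.end_headers()


if __name__ == "__main__":
    HTTPServer(("", 8000), PixelHandler).serve_forever()
```

A page that embeds an image tag pointing at /pixel.gif would bump the counter once per load; counting at first request or at full page completion would give different numbers, which is exactly the definitional gap ad buyers worry about.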
Search engines complicate the problem. Their automated software "robots" scour the Internet and index sites. IT staffs monitoring server load may need to factor in robot activity for capacity planning, but site operators must filter it out to get an accurate count of how many real people are visiting a site.
Finding and discarding activity of known robots such as Mountain View, Calif.-based Google Inc.'s is only one step in factoring out Web crawlers, notes George Ivie, executive director of New York-based Media Rating Council Inc., a trade organization seeking to develop and enforce audience measurement standards. Ideally, he says, analysis would also check for obvious automated activity, such as a visitor from the same IP address who clicks through 10 pages per second.
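A rough sketch of what such filtering might look like, assuming log records have already been parsed into (IP address, user agent, timestamp) tuples; the bot list and the ten-pages-per-second threshold are illustrative choices, not an industry standard:

```python
# Rough sketch of filtering robot traffic out of parsed log records.
# Record format (ip, user_agent, unix_timestamp) and thresholds are assumptions.
from collections import defaultdict

KNOWN_BOT_SUBSTRINGS = ("googlebot", "slurp", "crawler", "spider", "robot")


def is_known_robot(user_agent: str) -> bool:
    """Step one: drop hits whose user-agent string identifies a known crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOT_SUBSTRINGS)


def flag_fast_clickers(records, max_pages_per_second=10):
    """Step two: flag IPs whose request rate looks automated (e.g. 10 pages/sec)."""
    by_ip = defaultdict(list)
    for ip, _ua, ts in records:
        by_ip[ip].append(ts)
    suspicious = set()
    for ip, times in by_ip.items():
        times.sort()
        for i, start in enumerate(times):
            window = [t for t in times[i:] if t - start < 1.0]
            if len(window) > max_pages_per_second:
                suspicious.add(ip)
                break
    return suspicious


def human_records(records):
    """Keep only traffic that passes both checks."""
    bots = flag_fast_clickers(records)
    return [r for r in records if not is_known_robot(r[1]) and r[0] not in bots]
```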
Trickier still are internal users. Should IT staffers at Seattle-based Amazon.com Inc. be counted as visitors if they're testing an updated part of the site? Probably not. But what about the receptionist who surfs to buy a gift?
Soft Numbers
One of the softest Web numbers is the tally of unique visitors per month. For sites that require registration and log-in, it's fairly simple. But the rest must depend on other devices, ranging from analyzing server log files to using cookies, which are small pieces of data stored in a user's browser that can be accessed by a Web site the next time that user visits.
Web site operators usually get information about site traffic from their own server logs, an outside online advertising company such as New York-based DoubleClick Inc., or a third-party rating service. Major sites typically use a combination of sources.
"We track all the page views internally that we get. We also double-check it with our ad server, DART," says Jim Candor, vice president for new media at AccuWeather Inc. in State College, Pa., which recently announced that it had surpassed 1 billion page views. DART is an ad-serving technology from DoubleClick that lets online staff set up when and where ads appear on a site; it also measures how many people view each ad.
Numbers from AccuWeather's server logs showed only a "slight discrepancy" with the DART figures, within a percentage point or two, Candor says. How did AccuWeather tally up 1 billion pages viewed over the site's history? "We track each type of page internally; we put a 1-by-1 spotlight [tracking] pixel internally," he says. The count began at the site's December 1997 launch.
Besides outside rating services, log file analysis is quite useful, says Jeff Julian, president and publisher of IDG.net, a Computerworld.com sister site, because it lets him see what people do after they arrive at a site. Server logs usually record each visitor's domain or IP address, browser type and files requested. Web site staff can then use commercial log analysis software or home-brewed code to sift through the raw data and pull together the statistics they're seeking.
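As a sketch of the home-brewed route, the following Python parses Apache-style combined-format log lines and tallies requests by page, client IP and browser type; the regular expression assumes that specific format and would need adjusting for other servers.

```python
# Sketch of "home-brewed" log analysis over Apache combined-format lines.
# The field layout is an assumption; real log formats vary by server config.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)


def summarize(log_path):
    pages = Counter()      # requests per URL path
    visitors = Counter()   # requests per client IP (a rough proxy for visitors)
    agents = Counter()     # browser and robot user-agent strings
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.match(line)
            if not m or not m.group("status").startswith("2"):
                continue   # skip malformed lines and non-2xx responses
            pages[m.group("path")] += 1
            visitors[m.group("ip")] += 1
            agents[m.group("agent")] += 1
    return pages, visitors, agents


if __name__ == "__main__":
    pages, visitors, agents = summarize("access.log")
    print("Top pages:", pages.most_common(5))
    print("Distinct IPs:", len(visitors))
```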
Sites that don't require user registration use various techniques to estimate how many unique visitors—different individuals—are arriving each month. Some check to see whether there's an existing cookie; if not, the first-time visitor gets a cookie with a unique user ID. Then, if the user returns, the site knows he was there before.
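In code, the check is simple. This sketch assumes a cookie named uid with a one-year lifetime (both arbitrary choices for illustration): a request carrying no ID gets a fresh one, and a returning request reuses it, so the set of IDs approximates unique visitors.

```python
# Hedged sketch of cookie-based unique-visitor counting.
# The cookie name "uid" and its lifetime are illustrative assumptions.
import uuid
from http.cookies import SimpleCookie

unique_visitors = set()  # in practice this would live in a database


def identify_visitor(cookie_header: str):
    """Return (user_id, Set-Cookie header or None) for one request."""
    jar = SimpleCookie(cookie_header or "")
    if "uid" in jar:
        uid = jar["uid"].value           # returning visitor: reuse the existing ID
        set_cookie = None
    else:
        uid = uuid.uuid4().hex           # first visit: assign a fresh ID
        set_cookie = f"uid={uid}; Max-Age=31536000; Path=/"
    unique_visitors.add(uid)
    return uid, set_cookie


# A first request carries no cookie, so a new ID is issued; the follow-up
# request presents that ID and is not counted as a new visitor.
uid, header = identify_visitor("")
identify_visitor(f"uid={uid}")
print(len(unique_visitors))  # -> 1
```

The weak spots are the same ones rating services wrestle with: users who delete cookies or share a machine get counted more than once or not at all, which is why the monthly unique-visitor figure stays one of the softest numbers on the Web.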
The New York-based Interactive Advertising Bureau recently took a first crack at developing online audience measurement guidelines, issued in January. In them, the group defines visits and page impressions and presents proposals to deal with page caching and to filter out "nonhuman activity."
Ivie says that ultimately, he would like to see both internal Web sites and outside measurement agencies submit to external auditing, just as newspapers do for circulation claims. So far, he says, Atlanta-based CNN.com is the lone major consumer site that has submitted to Media Rating Council auditing.
"The major problem is there's no accountability in the Internet environment," Ivie says. "Our members struggle trying to figure out what numbers to rely on."