The Federal Election Commission is charged with tracking, reporting on and investigating matters related to the financing of presidential and congressional election campaigns. Established in 1975 in response to the Watergate scandal, the FEC is akin to the IRS in its function and powers. A good portion of its 350 employees are analysts trained to spot holes or questionable data in campaign finance reports and then follow up on them.
Through the years, the FEC has levied millions of dollars in fines against office-seekers of all political stripes and sparred before the Supreme Court over the laws that govern its operations. Meanwhile, a big and relatively new part of the agency's charter is to provide campaign finance data to U.S. citizens via its Web site.
The FEC certainly hasn't been immune to criticism. In May, for example, The New York Times ran an editorial headlined "Crippled Election Commission," in which it took the FEC to task for behind-the-scenes wrangling over the comings and goings of various commissioners. At the time, the six-member commission had four vacant positions, which now have been filled.
James Allen manages the FEC's IT infrastructure and spoke recently with Computerworld about what it takes to keep up with the ins and outs of campaign financing from a technology standpoint. Among his comments: "I'm the nerdy data guy, not a politician." Excerpts from the interview follow:
You have a data center at the FEC's offices in Washington and also contract with an external vendor to run servers for you out of facilities in Waltham, Mass., and Herndon, Va. How do all those IT pieces work together? Here at the commission, we have a small data center — roughly 90 servers, a mixture of Unix and Windows. Our Unix systems handle Oracle databases; the Windows servers handle file shares, SQL [Server] databases, the Lotus Notes e-mail backbone and various support functions. I have a staff of four people to make sure all our internal servers are operational, have the proper storage and stay properly patched. Another group of people deals with desktop issues.
Our vendor, Savvis, hosts servers for us in Waltham that handle the public-facing aspects of what we do — the data we provide to the public. So they maintain our Web servers, database servers and application servers. They have around-the-clock staffing, which we do not have here, and they provide patching and keep the servers operational. They manage our network down to the core switch here at the FEC, and we have biweekly technical meetings to go over any issues.
In Herndon, there is a separate set of servers that acts as our back-end database — these take information from the various House and Senate committees that report on campaign finance. The public doesn't see this data as it appears here; this is the 'raw' information that our analysts take and look into. We have a T1 line to the Senate so they can file their reports securely and quickly. After the data has been cleared by our analysts — and we have a 48-hour turnaround time — we post it on the public Web site.
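As a rough sketch of the pipeline Allen describes, where raw filings land in the Herndon back end, analysts clear them, and cleared data is posted publicly within the 48-hour target, the flow might look something like the following. This is not the FEC's code; every class, function and field name here is hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the filing workflow described above; none of these
# names come from the FEC's actual systems.
REVIEW_TARGET = timedelta(hours=48)

class Filing:
    def __init__(self, committee: str, data: dict):
        self.committee = committee
        self.data = data
        self.received_at = datetime.utcnow()
        self.cleared = False

def analyst_review(filing: Filing) -> list:
    """Return any questionable fields; clear the filing if there are none."""
    issues = [field for field, value in filing.data.items() if value is None]
    if not issues:
        filing.cleared = True
    return issues

def publish(filing: Filing, public_site: list) -> bool:
    """Post a cleared filing and report whether the 48-hour target was met."""
    if not filing.cleared:
        return False
    public_site.append((filing.committee, filing.data))
    return datetime.utcnow() - filing.received_at <= REVIEW_TARGET

# Usage: a complete filing clears review and is posted.
site = []
f = Filing("Example for Congress", {"receipts": 125000.00, "disbursements": 98000.00})
print(analyst_review(f))   # [] -> nothing flagged for follow-up
print(publish(f, site))    # True -> posted within the 48-hour window
```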
Do you consider your hosted services to be an example of cloud computing? No, because we have specific servers that run our site, and we own those servers. Savvis just manages them for us. It's cloud computing only if I can connect and use a service with no regard to where that service may be emanating from. Savvis has talked to us about cloud computing, and that is very interesting to me. I can see it from a disaster recovery perspective. If I can contractually request services, and I get those services as specified in the contract in a secure manner, then that's fabulous.
But the FEC doesn't allow anything other than FEC operations on our systems. So I might have to look at a "private" cloud computing model with only the FEC on it. That might be prohibitively expensive, though.
Tell me more about your Web portal: who developed it, and who maintains it? The Web site has been around for many years — it came into being almost as soon as the Web was available, sometime in the mid- to late 1990s. The site has gone from a pretty standard, non-interactive site to something much more robust. A lot of that is due to Alec Palmer, our current CIO. He's just very Web-savvy, and he's interested in putting as much data out to the public as he can.
And he wants to put the data out there in more interesting ways than we have in the past. Before, there were flat files, and it was harder to dig for the summary information you may have wanted. Alec's the one who pushed for the map application — those beautiful bubbles on our home page. He's assembled a tremendous staff of very talented programmers, enterprise architects and others. Savvis is hosting the Web site; they give us a secure environment to work that magic. But all the programming is done [by] the FEC.
How have IT operations changed in the 30 years you've been at the FEC? It's become much more complex, and at the same time we've really reached out to provide more and different kinds of data to the public. Years ago, if you wanted to get information, you had to come to our office here in Washington to do research. Over the years, we developed a process so people could dial in over a modem and view reports; this was pre-Internet. Now it's all done via the Web, and we've vastly expanded the ability for citizens to do queries about who's getting what money, and from where.
You can really drill down into the details about which PACs or corporations are giving money to candidates, to see what the candidate really believes in. This has required us to move from a single-tier database to a multitiered system. But the graphics are definitely much better than they were.
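To make that drill-down concrete, here is a minimal, self-contained example of the kind of query a multitiered reporting system supports: which PACs gave how much to a given candidate. The schema, table names and sample data are invented for illustration and are not the FEC's actual structures.

```python
import sqlite3

# Illustrative only: the tables and columns below are made up, not the FEC's
# real schema. The point is the shape of the drill-down query itself.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE committees   (id INTEGER PRIMARY KEY, name TEXT, type TEXT);
    CREATE TABLE candidates   (id INTEGER PRIMARY KEY, name TEXT, office TEXT);
    CREATE TABLE contributions(committee_id INTEGER, candidate_id INTEGER,
                               amount REAL, given_on TEXT);
    INSERT INTO committees    VALUES (1, 'Example PAC', 'PAC');
    INSERT INTO candidates    VALUES (1, 'Jane Doe', 'House');
    INSERT INTO contributions VALUES (1, 1, 2500.00, '2008-06-30');
""")

# Drill down: total giving by each PAC to one candidate.
rows = conn.execute("""
    SELECT cm.name, SUM(co.amount) AS total
    FROM contributions co
    JOIN committees cm ON cm.id = co.committee_id
    JOIN candidates ca ON ca.id = co.candidate_id
    WHERE ca.name = ? AND cm.type = 'PAC'
    GROUP BY cm.name
    ORDER BY total DESC
""", ("Jane Doe",)).fetchall()

for name, total in rows:
    print(f"{name}: ${total:,.2f}")   # -> Example PAC: $2,500.00
```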
And 30 years ago, I could manage all the hardware myself. It's gone from a one-man operation to a three-ring circus — with a lot more knowledge and skills required.
When I first started at the FEC, the security implications also were nowhere near what they are now. You just didn't hear of anyone hacking into databases back then — it was a rare occurrence. Now we have to run virus protection software, configuration management software, firewalls, router rules, and intrusion-detection and -prevention [systems]. All of these are necessary to ensure that the information we pump out there is accurate and isn't being defaced or changed by some miscreant.
What is Savvis' role in security versus what is done internally? Savvis monitors our network very closely. They run [intrusion-detection tools] across the network and let us know if there's been any attempted scanning or anything amiss on our servers or network. We run checks internally for issues that would be within the commission itself — an employee looking somewhere they shouldn't, for example. We can detect that here.
Nothing's happened recently, but people are always scanning your network. That's why we have to monitor it so closely: it's a sewer out there. There are thousands of viruses, so we have multilayered virus protection. When something comes into our mail router, it's scanned for viruses, then it's scanned again by software that Savvis uses, and then it's scanned again when the e-mail hits our internal systems.
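The layered scanning he describes, once at the mail router, again by Savvis's software and again on the internal systems, boils down to a simple rule: a message is delivered only if every layer passes it. The sketch below illustrates that idea with stand-in scanner functions; the real products and their checks are, of course, far more involved.

```python
from typing import Callable, List

# Each scanner returns True if the message looks clean. These are hypothetical
# stand-ins for the commercial products used at each layer.
Scanner = Callable[[bytes], bool]

def mail_router_scan(message: bytes) -> bool:
    return b"EICAR" not in message               # placeholder check

def savvis_scan(message: bytes) -> bool:
    return b"malware-signature" not in message   # placeholder check

def internal_scan(message: bytes) -> bool:
    return len(message) > 0                      # placeholder check

LAYERS: List[Scanner] = [mail_router_scan, savvis_scan, internal_scan]

def deliver(message: bytes) -> bool:
    """A message reaches the inbox only if every layer passes it."""
    return all(scan(message) for scan in LAYERS)

print(deliver(b"quarterly filing attached"))   # True  -> delivered
print(deliver(b"EICAR test string"))           # False -> blocked at layer one
```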
We've been fortunate, but a huge amount of effort goes into ensuring that nothing happens, because once something does, it's very difficult to regain the trust of the public. We have not had that happen, thank God.