Linux primer for networkers

Installing, supporting and particularly diagnosing networks can involve much detective work, in addition to decent levels of frustration and caffeine. As with all investigations, the more data available to form and test hypotheses, the more likely a problem will be solved. Therefore, the more tools available for gathering that data, the better the data obtained.

Open-source programs for Linux-based systems constitute some of the best tools available for network diagnosis, performance analysis and baseline determination. However, in my experience, the majority of network administrators today avoid Linux, perhaps intimidated by the process of setting up such a system.

This is a turnaround from 13 or so years ago, when networkers chose Linux simply because networking was built into the kernel of the operating system. Those who attempted IP networking in 1993 on a Windows 3.11 box, manually installing and configuring the network interface card (NIC) and loading a third-party IP stack (probably Trumpet Winsock), understand what I'm talking about.

While many excellent open-source diagnosis tools exist for Windows (Ethereal, for example), by shunning Linux, some essential tools are left off the workbench. A screwdriver, vice grips and a lot of muscle power can remove the spring that connects a deck to a riding mower, but a spring-removal tool is a lot easier and doesn't lead to the pain of skinned knuckles. Just as there is a small learning curve associated with using a spring puller, setting up a Linux system for network monitoring can be simple with a little guidance.

Linux distributions combine the Linux kernel with open-source packages and customized scripts to create servers and workstations. The kernel is customizable and can run on old Intel 386 processors, but for performance the target network machine should have at least a 667MHz processor, based on my experience.
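
As a quick sanity check on a candidate machine, the processor speed and installed memory can be read from the /proc filesystem from any running Linux shell. The exact output varies by kernel, so treat this as a rough check rather than a benchmark:

    # Rough hardware check from a running Linux shell (output format varies by kernel)
    grep -E "model name|MHz" /proc/cpuinfo    # processor model and clock speed
    grep MemTotal /proc/meminfo               # total installed RAM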

Although distributions can be installed with a Windows-type graphical user interface, to conserve resources (one benefit of this approach is putting to work computers no one else has a use for), the installation should be text only and should not include any GUI software.

When choosing a distribution, factors such as cost, ease of installation, support and kernel version should be considered. Some add-on packages, such as Web100, require a particular kernel version because they actually modify the kernel (most packages do not).
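
When kernel version matters, it takes one command to see exactly what an installed system is running before committing to a kernel-modifying package:

    # Report the version of the running kernel
    uname -r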

Support doesn't necessarily mean a telephone number to call; most of the best support comes in the form of FAQs, Web sites, mailing lists and discussion boards. Fedora Core, for example, is a great distribution for building network monitoring devices.

Whether ordering an enterprise package through the mail or downloading and burning ISO images to CDs, the easiest installation is by booting off an install CD. Fortunately, most outcast machines nowadays do have a workable CD drive.
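
If the ISO images are downloaded rather than ordered, it is worth verifying them before burning; most distributions publish checksum files alongside the images. The filenames below are placeholders, so substitute whatever the mirror actually provides:

    # Verify downloaded images against the distribution's published checksums
    # (filenames are illustrative; use the names found on the mirror)
    sha1sum -c SHA1SUM
    md5sum FC-5-i386-disc1.iso    # compare by hand if no checksum file is supplied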

It is possible -- albeit more difficult -- to load the operating system by creating boot floppy disks and loading over the network. Regardless, the machine should be booted with whatever peripherals (such as PC Card NICs, in the case of a laptop used as a monitor) will be needed, to prevent having to add drivers manually later.

The installation methods are very similar for all distributions. After booting the installation media, graphical and textual installation method options are presented. As mentioned earlier, the text-based installation should be selected if performance or video compatibility is a concern. This is not shunning GUI machines, but the purpose here is to build remote probes and monitoring machines that might be placed in a communications room away from the data center to collect statistics. A GUI isn't necessary for the applications these machines will run.
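
How text mode is selected varies by distribution; on a Fedora Core/Red Hat-style installer, for example, it is typically chosen by typing an option at the initial boot prompt (check the distribution's install notes to be sure):

    boot: linux text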

After entering the time zone, keyboard type and other parameters that all operating system installations seem to ask for, the installation program enters the disk preparation phase. If there is an option to auto-partition, use it. If not, it's best to create at least two partitions: one for the root system and added packages (/) and one (such as /var) for collected data. By separating the data, there is no danger of a process such as tcpdump filling up the root partition with trace output. A filled root partition will often make the machine inaccessible.
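
A layout following that advice might look like the sketch below; the sizes are arbitrary examples, not recommendations. Once the system is running, df confirms that trace files landing in /var cannot exhaust the root filesystem:

    # Example partition scheme (sizes are illustrative only)
    #   /boot   100MB   kernel and boot loader
    #   /        8GB    root system and added packages
    #   swap     1GB
    #   /var    rest    packet traces, logs and other collected data

    # After installation, verify that / and /var are separate filesystems
    df -h / /var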

When deciding which packages to install, consider the applications that will run on the machine. For example, if the machine is to be used as a traffic monitor that delivers its output via a Web interface (such as ntop), then a Web server (such as Apache) should be selected. Similarly, setting up a Snort intrusion-detection machine requires MySQL to be installed. In any case, development tools and languages should always be selected. It's a letdown to start compiling software only to find that a needed compiler, such as GCC, can't be found.
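
After installation it takes only a moment to confirm that the build tools actually made it onto the box, rather than discovering their absence halfway through a compile:

    # Confirm the basic build chain is present before compiling anything
    gcc --version
    make --version
    which tar gzip patch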

Often, the package installation phase will offer "server," "desktop" or other options. This can help narrow the number of packages offered for installation. Because this machine will serve in a server-type capacity (i.e., the network administrator will use a client to access the data), selecting "server" is appropriate. However, this does not negate the need to select individual packages. While package selection varies by need, basic network support, development tools and an SSH (Secure Shell) daemon are three essentials.
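
On a Fedora Core-style system the SSH daemon can be verified and set to start at every boot with the usual service tools (other distributions use different init commands, so adjust accordingly):

    # Make sure the SSH daemon runs now and on every boot (Red Hat/Fedora style)
    chkconfig --level 345 sshd on
    service sshd start
    service sshd status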

For the onboard NIC (or NICs, since a second interface is often useful for out-of-band management), network information will have to be entered. For remote probes hopping from subnet to subnet, Dynamic Host Configuration Protocol is often easier because it avoids editing files each time the network parameters change. In larger networks, though, it may be more appropriate to assign monitors a fixed IP address and open the appropriate firewall paths through the core before deploying the probes.
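
If a fixed address is assigned, on a Red Hat/Fedora-style system it lives in the interface configuration file. The addresses below are illustrative placeholders only:

    # /etc/sysconfig/network-scripts/ifcfg-eth0  (Red Hat/Fedora layout;
    # addresses are examples only)
    DEVICE=eth0
    BOOTPROTO=static
    IPADDR=192.168.10.50
    NETMASK=255.255.255.0
    GATEWAY=192.168.10.1
    ONBOOT=yes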

The root password should be a strong password, of course, and a second account should be created with an equally strong password. Remote access will be through this second account, with "su" used only when root privileges are needed. Performing tasks as root that don't need to be performed as root can be quite dangerous to the integrity of a system if a command is executed by mistake, such as running rm -f (remove files, ignore nonexistent files, never prompt) without first running pwd (print name of current/working directory) to verify which directory the files will be removed from.
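
Creating the day-to-day account and dropping to root only when required takes a couple of commands, and the habit of checking the working directory before a destructive command costs nothing. The username and filenames here are just examples:

    # Create the everyday account (username is illustrative) and set its password
    useradd monitor
    passwd monitor

    # Log in remotely as that account, then switch to root only when required
    su -

    # Before removing files, confirm which directory the shell is actually in
    pwd
    rm -f *.pcap    # example: delete old trace files, only after checking pwd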

Once the installation begins, it can take over an hour. Depending on the distribution and the packages selected, CDs may have to be swapped if the chosen packages are spread across different discs. After the installation is complete and the CD is removed from the tray, the machine will boot into Linux for the first time.

In the next article, we'll discuss testing the system and installing and using a basic sniffer package as essential as the Phillips screwdriver: tcpdump.

Copyright © 2006 IDG Communications, Inc.
