The painful truth about age discrimination in tech

Page 2 of 3

Something has been pushing IT workers out as they hit their high-earning, low-unemployment 40s and beyond. Is it burnout or pervasive age discrimination? What are the culprits contributing to this "Logan's Run"-like marketplace?

Sure, your average IT operation is staffed by people whose answer to the question "What were you doing when the Berlin Wall fell?" is going to be "teething." But it's not purely a hatred of older people that's led to a sharp falloff in older IT workers. Here are some possible factors.

A change in the IT culture. The Net is rife with mainframe operators and Cobol pros who will tell you that they got into IT for love of the challenge or subject. It was passion-driven. Now, however, IT occupations are rigorously bound by performance metrics and other management controls that provide a healthy reality check to anyone who thought passion would be enough to sustain a 25-year career in coding.

Bean-counting. Older workers have a (not entirely) undeserved reputation for being expensive, which hurts them going and coming. If there's what the federal Bureau of Labor Statistics quaintly calls a "mass layoff event," the high-paying jobs are looked at carefully to see if the worker brings a perceived value to the organization. If not, the math is brutal: Ax two or three high-paying positions and see an immediate growth in the margins. And when it's time to hire, two entry-level workers provide -- in theory -- more bang for the buck than one expensive member of a protected class (that is, older workers for whom the government has imposed more hurdles to lay them off).

The persistent devaluation of experience and skills. Any developer can tell you that not all C or PHP or Java programmers are created equal; some are vastly more productive or creative. But until there is a way to explicitly demonstrate the productivity differential between a good programmer and a mediocre one, inexperienced or nontechnical hiring managers tend to look at resumes with an eye for youth, under the "more bang for the buck" theory: cheaper young 'uns will work longer hours and produce more code. The very concept of experience as an asset that raises productivity is a nonfactor -- much to the detriment of the developer workplace.

According to one 20-year telecommunications veteran who asked to remain anonymous, when high-tech companies began incorporating more business-oriented managers into their upper tiers, these managers were not able to accurately assess the merits of developers with know-how: "It is nearly impossible to judge quality work if you never did it yourself," he says. "The latest fad was the idiotic belief that management was generic, a skill that could be taught at school and could then be sent anywhere to do any management job."

Another way in which experience is actually seen as a flaw rather than a virtue: Hiring managers are unable to map how 10 years of experience in one programming language can inform or enhance a programmer's months of experience with a newer technology. Instead, they dismiss the decade of experience as a sign of inflexibility or being unable to keep up -- an assumption that penalizes IT pros for being present during the last 10 years of their jobs.

As former Intel CEO Craig Barrett once said, "The half-life of an engineer, software or hardware, is only a few years." With this kind of attitude at the top, there's no cultural incentive to foster a hiring strategy that rewards experience or longevity.
