Tech luminaries we lost in 2013

These 14 individuals left lasting impressions on the industry — some by creating innovative technologies, and others by building game-changing companies. And all left their marks by challenging the status quo.

In Memoriam

They ranged in age from 26 to 93 at the time of their deaths, but their time on Earth didn't matter: Each person on this somber list made an impact on technology and helped shape the world we live in today. In that sense, their lives, however long or short, were a gain for our society and an example for the rest of us to follow.

Here, we look at the accomplishments of 14 men and women, all pioneers in various facets of the technology field, who died in 2013.

Hank Asher: Color him a data genius

Most high-tech geniuses wait until college to drop out of school and start companies. Hank Asher didn't make it through high school. He also didn't start programming until the 1990s, after running a painting business and, briefly, smuggling drugs.

He created software used for "data fusion," integrating databases and mining them. He started and sold two data companies, Database Technologies and Seisint, and was trying to turn around a third, TLO, when he died. Among his products were AutoTrack, Accurint and Matrix, a data-mining tool used by law enforcement.

Asher also fought against child pornography, and contributed to relief work in Haiti. He was 61.

Aaron Swartz: Programming prodigy

Aaron Swartz was a wunderkind, at 14 contributing code to the RSS standard. Mentored by luminaries like Tim Berners-Lee and Lawrence Lessig, Swartz in his teens founded a company that would be merged into Reddit.

Swartz became an activist, hacking the legal database PACER to make public court decisions freely available. He also hacked into JSTOR, an academic database, for which he was arrested and charged with multiple felonies.

Swartz's death in January, a suicide that came as he faced trial, sparked outrage from a broad swath of political, legal and Internet activists, bitter that one with so much promise was now gone. He was 26.

Yvonne Brill: Rocket woman

We can thank Yvonne Brill for making our lives better through satellites. While working for an RCA unit, the brilliant rocket scientist in 1967 developed the hydrazine resistojet, which became a standard for how satellites are propelled through space. She worked on myriad propulsion systems, including for the first weather satellite, the Nova moon rockets and the Space Shuttle.

When she started her career in 1945, few women were in engineering; she couldn't study engineering at the University of Manitoba because the school couldn't accommodate her at an outdoor engineering camp. Her success has propelled numerous other women into science. She was 88.

Eleanor Adair: A research maverick

Eleanor R. Adair was a bold researcher. So bold, in fact, that she made herself the initial test subject in her work. At the John B. Pierce Laboratory in the 1970s, she began to study the impact of microwaves on squirrel monkeys and on humans, exposing them to their heating effects.

She also investigated whether microwave radiation is harmful to humans. Dr. Adair concluded it was not, and maintained this stance even as cell phones, held directly to people's heads, became widely used. Though others assert carcinogenic dangers from microwave radiation, her work has not been disproved. She was 86.

Kenneth Appel: Making the computer count

Once upon a time, nobody did mathematical proofs with a computer. But in 1976, Kenneth Appel, then a mathematician at the University of Illinois, decided to use a mainframe to prove the four-color theorem, which holds that four colors are all that's needed to ensure no two adjacent regions of any map share a color.

Working with a colleague, Wolfgang Haken, and an IBM 360, Appel spent 1,200 hours of computer time producing a 140-page proof with 400 supplemental diagrams. They wrote on a chalkboard, "FOUR COLORS SUFFICE." Math has never been the same. Appel was 80.
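The theorem's claim, if not its proof, can be illustrated in a few lines of code. Here a hypothetical five-region map (an invented example, not from Appel's work) is colored greedily, and the result happens to use fewer than four colors; proving that four always suffice for every possible map is what required Appel and Haken's exhaustive computer search.

```python
# A tiny illustration (not the proof): regions are nodes,
# shared borders are edges, and we color greedily.
borders = {
    "A": {"B", "C", "D"},
    "B": {"A", "C", "E"},
    "C": {"A", "B", "D", "E"},
    "D": {"A", "C", "E"},
    "E": {"B", "C", "D"},
}

def greedy_coloring(adjacency):
    """Give each region the smallest color index not used by a colored neighbor."""
    colors = {}
    for region in sorted(adjacency):
        taken = {colors[n] for n in adjacency[region] if n in colors}
        colors[region] = next(c for c in range(len(adjacency)) if c not in taken)
    return colors

coloring = greedy_coloring(borders)
# No two bordering regions end up with the same color.
assert all(coloring[r] != coloring[n] for r in borders for n in borders[r])
```

Greedy coloring does not guarantee four colors on every planar graph, which is exactly why the theorem resisted proof for a century.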

Amar Bose: A sound life

Amar Bose liked to solve problems. In 1956, he had a problem with hi-fi speakers he purchased, and figured out how to make them sound better, using reflected sound. He started his now-famous speaker company in 1964 and continued to develop new approaches to acoustics, including sketching out noise-canceling headphones during a plane flight.

Bose taught at his alma mater, MIT, from 1955 until 2001, and donated much of his personal wealth to the university. He created a revolution in sound that still resonates. So does his legacy as a teacher and a manager. He was 83.

Ken Brill: Plugged in

Data centers were a largely inchoate concept until Ken Brill decided to make uptime his full-time pursuit. Brill, who had been in the power-supply business, founded the Uptime Institute in 1993. He would use it as a platform to define how data centers should be rated, improve their uptime and encourage more efficient operations.

Brill, in effect, defined the data center, building a foundation on which cloud computing could emerge. He also highlighted how improvements in power supplies were not keeping up with Moore's Law. Brill was 69.

Doug Engelbart: Of mouse and man

Fortunes were made from Doug Engelbart's ideas; none by him. One of computing's greatest visionaries, he invented the computer mouse and significantly contributed to the development of hypertext, word processing, graphical user interfaces, networking and real-time collaboration, including videoconferencing. He displayed early forms of all of them in "the Mother of All Demos."

But he didn't commercialize his ideas; that would be left to others, including members of his lab at Stanford Research Institute, many of whom went to Xerox's Palo Alto Research Center when he lost funding. But then, Engelbart didn't want to get rich; he wanted to enrich human life. And that he did. He was 88.

Barnaby Jack: "Jackpotter"

It was almost cartoonish, what Barnaby Jack could do with technology. He made ATMs spit out cash, dubbed "jackpotting." He showed how he could hack a pacemaker from 30 feet to make it discharge enough electricity to kill its user.

Brilliant at finding flaws in embedded devices, the New Zealand-born hacker died the week before giving a talk at the Black Hat security conference on how pacemakers and other medical devices could be hacked. Creative and playful, Jack was also ethical — his famous hacks were demonstrated publicly only after the companies involved had a chance to make fixes. He was 35.

Ray Dolby: Amplification and clarification

Back when all sound was analog, cassette tapes were popular. The hiss they made when played was not. Ray Dolby silenced the hiss, with Dolby A for professional studios (1966) and Dolby B for consumers (1968).

Starting with 1971's A Clockwork Orange, Dolby also revolutionized movies and movie-going, creating surround sound by adding audio channels beyond stereo. Sound became part of the art form, and Dolby won multiple Oscars. Later, Dolby Digital became part of the DVD standard.

Dolby himself, a modest man, did not trumpet his accomplishments or his highly profitable company. We hear better for them. He was 80.

Wayne Green: Taking a byte out of life

Laptops, cell phones, e-mail, consumer computing. The potential for all of them fascinated Wayne Green, who began writing about amateur radio in the 1950s. He put together a publishing fiefdom in Peterborough, N.H., turning the setting for Our Town into a bully pulpit for a tech-centric future.

Green's most influential magazine was Byte, but he published many others, including 73, his long-running amateur radio magazine. Green, who called himself a "conspiracy factist," believed among other things that the moon landings never happened. But he knew good technology when he saw it. He was 91.

Hiroshi Yamauchi: Game man

Hiroshi Yamauchi took over his family's staid Japanese playing-card company in 1949, at age 21. He ran Nintendo until 2002, and was chairman until 2005. He turned Nintendo into one of the world's most important entertainment companies.

Though Yamauchi was not a gamer, under him Nintendo released groundbreaking games like Donkey Kong and The Legend of Zelda as well as seminal platforms like the Game Boy and the GameCube. It also began development of the Wii on his watch.

At the Wii's height, Yamauchi was Japan's richest man. He also was majority owner of the Seattle Mariners baseball team, though he never saw them play. He was 85.

Clifford Nass: Single tasker

Are you reading this while trying to do four other things? You're a sucker for irrelevancy, in the words of Clifford Nass, a Stanford sociologist who studied how people communicate. You're also becoming the norm, which worried him. He found that multitaskers have trouble concentrating and get worse as they multitask more.

Nass, who helped develop Microsoft Bob, worried that our relationship with technology was making us less intelligent. He said earlier this year, "We've got to make face-to-face time sacred." In his honor, speak face to face with that person you're about to text. He was 55.

Willis Ware: Overlooked visionary

Few saw the future of computing with the clarity of Willis Ware, a RAND engineer who said in 1966, "Every man will communicate through a computer whatever he does. It will change and reshape his life, modify his career and force him to accept a life of continuous change."

Ware worked on classified electronics during World War II, built an early digital computer with John von Neumann, then built another for RAND. Ware in 1972 recommended against secret databases, saying that people should know what data is held on them and how it is used. He was ignored. He was 93.

Copyright © 2013 IDG Communications, Inc.
