Tech luminaries we lost in 2019

We remember scientists, programmers, engineers, and business leaders who bettered the world through technology.

Image credit: FreedomMaster / Getty Images

Every now and then, dreamers find themselves at the right point in time and space to spark a revolution. Whether it’s the invention of a technology, the start of a company, or a gaze to the stars, these moments change the future and define their creators’ legacies.

But as the computer industry ages, so too do its founders. In the past 12 months, we have bid farewell to designers of algorithms, teachers of students, and connectors of countries. We will always remember the 13 luminaries featured here who shaped our technology and our culture.

Nancy Grace Roman: Mother of the Hubble telescope

Image credit: Maia Weinstock

Nancy Grace Roman looked to the stars well before she formed an astronomy club at the age of 11, and long after. After earning her Ph.D. in astronomy in 1949, she joined the Naval Research Laboratory. There, she observed that a star had changed its emission spectrum, a discovery that earned her recognition and ultimately led to her joining NASA in 1959, just a year after the organization’s founding. She became NASA’s first chief of astronomy and solar physics — and one of NASA’s first female executives — in 1961; a year later, she led NASA’s first successful astronomical mission, the Orbiting Solar Observatory-1.

In an age when most telescopes were terrestrial, Roman was a tireless champion for space astronomy. Her leadership, management, and advocacy made possible the Hubble Space Telescope, plans for which began in 1968. Finally launched in 1990, more than a decade after Roman’s retirement, the Hubble has since produced the clearest and most astounding images of our universe, revealing new insights about its nature and origin.

Among the many honors she received in her lifetime were the Women in Aerospace’s Lifetime Achievement Award, an asteroid named after her, and inclusion in Lego’s Women of NASA set, complete with a toy Hubble.

“I’ve always been curious. I just wanted to satisfy my curiosity,” said Roman, asking, “The things around us — the planets, the stars, the galaxies — what are they? How did they come to be? What’s going to happen to them?”

The answers we know and those still to come will be in part thanks to Roman, who died last Christmas at the age of 93.

Editor’s note: Roman died in December 2018, too late to be included in last year’s “Tech luminaries we lost” story.

Harlan Anderson: Digital entrepreneur

Image credit: Brian Anderson / Marcin Wichary (CC BY 2.0)

Waffling in college but needing to choose a major, Harlan E. “Andy” Anderson chose engineering physics. It taught him “to have a healthy curiosity, think logically and... go in a variety of directions, which turned out fortunately to include computers,” he wrote on his blog.

After college, those directions pointed to MIT’s Lincoln Laboratory, where Anderson encountered the Whirlwind computer. This experience influenced Anderson’s later work at MIT on SAGE, the United States’ air defense system during the Cold War.

When MIT’s Air Force contract expired, Anderson and his MIT manager Ken Olsen in 1957 founded their own company: Digital Equipment Corporation. The success of its PDP series of minicomputers (pictured above: a PDP-1) set DEC on the path to becoming the country’s second-largest computer company. But according to Anderson, DEC’s greatest accomplishment was “bringing the man/machine interaction capabilities to the commercial world.”

Anderson stayed with DEC for only nine years, leaving well before the company’s eventual decline and acquisition by Compaq, which was in turn acquired by Hewlett-Packard. In the late ’60s, Anderson served as director of technology for Time, Inc., and later provided venture capital for a number of technology startups.

Anderson published his memoir in 2009. He was 89 when he passed.

Conway Berners-Lee: Achieving Mark 1

Image credit: Maggie Jones

During World War II, Conway Berners-Lee served in the British Army, applying his education in mathematics to the Corps of Royal Electrical and Mechanical Engineers. After leaving the military with the rank of major, Berners-Lee was employed by electrical engineering firm Ferranti at its London computer center. There he worked on the Ferranti Mark 1, the world’s first commercially available general-purpose computer.

Over his career, Berners-Lee broadened the appeal and application of computers, developing agricultural software as well as text-compression algorithms that were used in early electronic medical records. After retirement, he volunteered with the National Missing Persons Helpline.

But Berners-Lee’s most significant role may have been that of a parent. While working on the Ferranti Mark 1, Berners-Lee met Mary Lee Woods, whom he married in 1954. Berners-Lee and Woods had four children, including Sir Tim Berners-Lee, inventor of the World Wide Web, who named his parents as his childhood role models.

Conway Berners-Lee was 97.

Nils Nilsson: A star developer

Image credit: Linda A. Cicero / Stanford News Service

Before there were Roombas or Boston Dynamics, there was Shakey: the first general-purpose robot, which used wireless communications and audio-video sensors to interpret its surroundings and decide how to interact with them. Shakey, named after its jerky movements, was developed from 1966 to 1972 under the co-direction of Nils J. Nilsson, a pioneer in artificial intelligence.

In 1965, Nilsson published one of the first books about neural networks, decades before the technology was economically feasible. In 1968, he contributed to the development of A*, an algorithm for determining the shortest path a robot could traverse between two points while navigating around obstacles; it is used today in everything from video games to route planning. He also co-developed the Stanford Research Institute Problem Solver (STRIPS) automatic planning system in 1971. Nilsson’s later research delved into natural language understanding, a technology that now has widespread applications, from Siri to Watson.
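For readers curious how A* works, here is a minimal Python sketch (an illustration of the idea, not Nilsson’s original formulation; the grid is invented). It finds a shortest route across a grid whose 1 cells are obstacles by always expanding the candidate whose distance traveled so far, plus an estimate of the distance remaining, is smallest:

```python
import heapq

def a_star(grid, start, goal):
    """Shortest path on a 2D grid where 1 marks an obstacle."""
    def h(cell):  # Manhattan-distance estimate of the remaining distance
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0, start)]   # entries are (g + h, g, cell)
    best_g = {start: 0}                 # cheapest known cost to each cell
    came_from = {}

    while frontier:
        _, g, cell = heapq.heappop(frontier)
        if cell == goal:                # done: walk the path backwards
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                if g + 1 < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = g + 1
                    came_from[(r, c)] = cell
                    heapq.heappush(frontier, (g + 1 + h((r, c)), g + 1, (r, c)))
    return None                         # no route exists

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
print(a_star(grid, (0, 0), (4, 3)))     # snakes around both walls
```

Because the estimate never overstates the true remaining distance, the first complete path A* finds is guaranteed to be a shortest one.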

All this work was done at Stanford Research Institute (now SRI International), an outgrowth of Nilsson’s alma mater. His career spanned decades at SRI and included teaching at Stanford University’s School of Engineering and serving as chair of the school’s Department of Computer Science.

Nilsson was 86 when he died.

William Newman: Painting the future

Image credit: George Coulouris / Marcin Wichary (CC BY-SA 4.0)

William Newman was a teacher and researcher in the fields of human-computer interaction and system design; he co-authored several textbooks and invented the first User Interface Management System, a precursor to HyperCard. But it was at Xerox’s Palo Alto Research Center (PARC) that Newman illustrated the future.

From 1973 to 1979, Newman put his Ph.D. in computer graphics to use developing several pioneering applications for Xerox’s Alto computer (pictured above): a paint program, pop-up menus, a forerunner to the PDF file format, and more. The pixel-based bitmap graphics Newman demonstrated on the Alto became the basis for almost all non-vector images used today.

“Playful, whimsical and kind, he was intellectually generous and the most effective computer programmer I have ever worked with,” wrote his friend George Coulouris, a visiting professor at the University of Cambridge Computer Laboratory.

Newman passed away at the age of 80.

Jean-Marie Hullot: Vision builder

Image credit: Françoise Brenckmann (CC BY 2.0)

Finding the original Macintosh computer’s graphical user interface rudimentary, Jean-Marie Hullot developed tools for building more sophisticated GUIs. He presented his work in a seminar at Stanford University, garnering the attention of Steve Jobs, who recruited him to join NeXT in 1986. There Hullot developed Interface Builder, an easy-to-use software development application for NeXT computers and, later, Macs and the iPhone.

Hullot left NeXT in 1996 to co-found RealNames, a company aimed at standardizing the internet namespace. He reunited with Jobs at Apple in 2001 to develop the syncing functionality for iCal and iSync, laying the groundwork for a day when users would have multiple devices — a reality he also helped create by working on iPhone OS 1.

Hullot was also passionate about capturing and preserving the natural world, having co-founded the Iris Foundation, a nature conservation nonprofit. And in 2009, Hullot, along with four other former Apple employees, founded Fotopedia, a crowdsourced photographic encyclopedia. The service closed in 2014, declaring, “We truly believe in the concept of storytelling but don’t think there is a suitable business in it yet.”

But Hullot’s vision resonated with many. Wrote software engineer Bruce Henderson, “He was a blazing intellect who saw farther, did more.”

Hullot was 65.

H. Ross Perot: IT baron

Image credit: Allan Warren (CC BY 3.0)

Henry Ross Perot may be remembered as a businessman who ran for president — but that man's business was IT, an industry that his companies revolutionized.

After completing a stint in the Navy, Perot began a short but successful career as an IBM salesman. In 1962, recognizing that many of IBM’s clients lacked the training to use their new computers efficiently, Perot founded Electronic Data Systems, an IT services firm that specialized in data processing — essentially pioneering IT outsourcing as a business model. The company’s wildly successful public offering yielded Perot hundreds of millions of dollars, leading a 1968 Fortune magazine cover story to call him “the fastest richest Texan ever.”

EDS’ success continued in subsequent decades. General Motors acquired the company in 1984, and it went on to become one of the leading firms addressing the Y2K bug before being acquired by Hewlett-Packard in 2008.

EDS wasn’t Perot’s only success. In 1979, Perot passed on the opportunity to buy Microsoft for $2 million. He learned from that mistake: Perot was the first major investor in NeXT, Steve Jobs’ post-Apple startup, purchasing 16% of the stock for $20 million in 1987. A year later, Perot founded the eponymous Perot Systems, an IT services firm to rival his former company EDS. Dell acquired Perot Systems in 2009.

Perot ran for United States president in 1992 and 1996; in the former election, he earned nearly 19% of the popular vote, potentially costing incumbent President George H. W. Bush his re-election and influencing the policies President Clinton would subsequently pursue.

Perot was 89.

Fernando Corbató: Time sharer

Image credit: MIT CSAIL

At the beginning of a career at MIT that spanned more than half a century, Fernando “Corby” Corbató saw the computing power of the school’s IBM 704 being wasted as it idled between programs. He took this opportunity to demonstrate a concept he’d been toying with: time-sharing, which let multiple users work on one computer simultaneously. Corbató outlined what would become the Compatible Time-Sharing System, or CTSS.
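The core idea fits in a few lines of Python (a toy sketch with invented user jobs, far removed from CTSS itself): give each user a short slice of the processor in turn, so everyone sees steady progress instead of waiting for the whole batch ahead of them.

```python
from collections import deque

def run_time_shared(jobs, quantum=2):
    """Interleave several users' work on one processor, round-robin."""
    ready = deque(jobs.items())
    while ready:
        user, work = ready.popleft()
        for step in work[:quantum]:      # run one short time slice
            print(f"{user}: {step}")
        if work[quantum:]:               # unfinished? rejoin the queue
            ready.append((user, work[quantum:]))

# Three hypothetical users sharing one machine:
run_time_shared({
    "alice": ["compile", "link", "test", "run"],
    "bob":   ["edit", "save"],
    "carol": ["query 1", "query 2", "query 3"],
})
```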

But “what began as a demo turned into a more and more viable system as the pieces began to fall in place,” recalled Corbató. Those pieces included email, text editors, and the first user passwords.

Corbató followed CTSS with the Multiplexed Information and Computing Service, or Multics, a collaboration among MIT, Bell Labs, and General Electric. When that project wound down, some of its alumni, including Ken Thompson, developed their own successor: UNIX.

At a time when computers could take hours or days to return results, Corbató’s work acquainted users with near-instantaneous response times. Said Corbató, “One of the things I’m proud of is having had an influence on the computing world, and that influence has been in a lot of ways.”

Corbató died at 93 of complications from diabetes.

Patrick Henry Winston: Intelligent speaker

Image credit: MIT CSAIL

Patrick Henry Winston saw artificial intelligence as having the potential to be our greatest teacher — but to do that, AI had to understand our stories.

Winston pursued this mission at MIT, where he earned his bachelor’s, master’s, and doctoral degrees, culminating in a thesis under the famed Marvin Minsky. Winston spent the rest of his life on his alma mater’s faculty, succeeding Minsky as director of MIT’s Artificial Intelligence Laboratory, which eventually became CSAIL: the Computer Science and Artificial Intelligence Laboratory. At CSAIL, Winston led Genesis, an initiative to develop an AI that could understand and interpret short stories.

He also empowered the MIT community to share their own stories with his annual lecture, “How to Speak,” brimming with strategies for oral presentations, conversations, and interviews. “He long held that great ideas weren’t enough — you had to know how to convey them,” wrote MIT colleague Randall Davis.

As much as Winston loved AI, he loved people more. He delivered his final lecture just five months before he passed at the age of 76; it concluded: “What is the greatest computing innovation of all time? It’s us.”

Danny Cohen: Pilot programmer

Image credit: Delia Heilig

First, he gave computers wings; then, he gave them a voice. These are just two of Danny Cohen’s many contributions as an early internet pioneer.

In 1967, while a doctoral student at Harvard, this Israeli-born computer scientist and pilot developed the first 3D flight simulator as a training tool, replacing the cameras and 2D models that commercial pilots historically used. The Cohen–Sutherland line-clipping algorithm he co-developed with Ivan Sutherland, which rapidly determines what portion of a line falls within the visible window, was one of the first methods to make rendering a moving perspective through 3D space practical.
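The algorithm’s trick can be shown in a short Python sketch (illustrative only; the 10×10 window in the example is invented). Each endpoint gets a 4-bit “outcode” recording which edges of the window it lies beyond, so fully off-screen lines are rejected with a single bit test and the rest are trimmed edge by edge:

```python
# Each bit of an outcode records which side(s) of the window a point lies on.
INSIDE, LEFT, RIGHT, BOTTOM, TOP = 0, 1, 2, 4, 8

def outcode(x, y, xmin, ymin, xmax, ymax):
    code = INSIDE
    if x < xmin: code |= LEFT
    elif x > xmax: code |= RIGHT
    if y < ymin: code |= BOTTOM
    elif y > ymax: code |= TOP
    return code

def clip_line(x1, y1, x2, y2, xmin, ymin, xmax, ymax):
    """Return the visible portion of a segment, or None if none is visible."""
    c1 = outcode(x1, y1, xmin, ymin, xmax, ymax)
    c2 = outcode(x2, y2, xmin, ymin, xmax, ymax)
    while True:
        if not (c1 | c2):            # both endpoints inside: accept as-is
            return (x1, y1), (x2, y2)
        if c1 & c2:                  # both beyond the same edge: reject
            return None
        c = c1 or c2                 # pick an endpoint that is outside
        if c & TOP:                  # slide it along the line to that edge
            x, y = x1 + (x2 - x1) * (ymax - y1) / (y2 - y1), ymax
        elif c & BOTTOM:
            x, y = x1 + (x2 - x1) * (ymin - y1) / (y2 - y1), ymin
        elif c & RIGHT:
            x, y = xmax, y1 + (y2 - y1) * (xmax - x1) / (x2 - x1)
        else:  # LEFT
            x, y = xmin, y1 + (y2 - y1) * (xmin - x1) / (x2 - x1)
        if c == c1:
            x1, y1, c1 = x, y, outcode(x, y, xmin, ymin, xmax, ymax)
        else:
            x2, y2, c2 = x, y, outcode(x, y, xmin, ymin, xmax, ymax)

# A segment crossing the left and right edges of a 10x10 window:
print(clip_line(-5, 3, 15, 7, 0, 0, 10, 10))  # ((0.0, 4.0), (10, 6.0))
```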

Cohen was then recruited by the Advanced Research Projects Agency (ARPA) to devise a method for transmitting speech over the ARPAnet. The result was the world’s first voice-over-internet conference call in 1978. Other prototypes he developed included a digital library, an e-commerce system, and a high-speed local-area networking system for computer clusters, a technology that would evolve into cloud computing.

For all these accomplishments, Cohen was among the first class inducted into the Internet Hall of Fame — and in 2013, an all-star cast of internet luminaries celebrated his achievements.

Cohen was 81.

Sally Floyd: Routing the way

Image credit: Carole Leita

After working as a computer systems engineer for San Francisco’s BART public transit system, Sally Floyd enrolled at the University of California, Berkeley. Upon earning a Ph.D. in computer science in 1989, she joined the Network Research Group at Lawrence Berkeley National Laboratory.

It was at that time, when internet protocols were inefficient and not widely understood, that Floyd co-developed Random Early Detection (RED), an algorithm that heads off internet congestion by having routers drop a small fraction of incoming packets before their buffers overflow, prompting senders to slow down. RED is used in almost all network routers today and is the basis for the field of Active Queue Management.
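In rough outline, RED behaves like the Python sketch below (a toy model with invented thresholds; the published design by Floyd and Van Jacobson adds refinements, such as accounting for idle periods and spacing out drops, that are omitted here):

```python
import random

class REDQueue:
    """Toy Random Early Detection queue for a router's outgoing link."""

    def __init__(self, capacity=100, min_th=20, max_th=60,
                 max_p=0.1, weight=0.02):
        self.queue = []
        self.capacity = capacity
        self.min_th, self.max_th = min_th, max_th  # early-drop thresholds
        self.max_p = max_p      # drop probability as avg approaches max_th
        self.weight = weight    # smoothing weight for the average queue size
        self.avg = 0.0

    def enqueue(self, packet):
        # Track a moving average so brief bursts aren't punished.
        self.avg = (1 - self.weight) * self.avg + self.weight * len(self.queue)
        if self.avg >= self.max_th or len(self.queue) >= self.capacity:
            return False        # sustained heavy load: drop the packet
        if self.avg >= self.min_th:
            # Moderate load: drop with probability rising toward max_p,
            # nudging a few senders to slow down before the buffer fills.
            p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
            if random.random() < p:
                return False
        self.queue.append(packet)
        return True
```

Dropping a trickle of packets early keeps every sender from backing off at once, the “global synchronization” failure RED was designed to prevent.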

Among other honors, Floyd’s work earned her the Association for Computing Machinery’s Special Interest Group on Data Communication (SIGCOMM) award. “Dr. Floyd has combined the community spirit and dedication that was present in the early Internet with the intellectual rigor that characterizes today’s research community,” said the award’s 2007 press release. At the time, Floyd was computer science’s eighth most cited researcher; today, her 1993 RED paper has been cited over 9,100 times.

Floyd later worked at the International Computer Science Institute’s Center for Internet Research from 1999 until 2007, when she retired after being diagnosed with multiple sclerosis. She died from metastatic gallbladder cancer at the age of 69.

Tarek Kamel: Connecting nations

Image credit: Henrik Ishihara / Globaljuggler (CC BY-SA 3.0)

When the Arab Spring arrived in Egypt in January 2011, Tarek Kamel was the country's minister of communications and information technology. He was also the one who armed the protesters with social media: almost twenty years earlier, Kamel had established Egypt’s first internet connection.

After earning his Ph.D. in Germany in 1992, Kamel returned to his native Egypt, where he spent seven years as a manager of the Communications and Networking Department at the Cabinet Information and Decision Support Centre. During this time, he founded the Internet Society of Egypt, putting the country online. Kamel joined the Egyptian Ministry of Communications and Information Technology upon its formation in 1999.

After departing the Egyptian government in 2011, Kamel joined the Internet Corporation for Assigned Names and Numbers (ICANN) as senior vice president of global government and IGO engagement.

“Tarek had the unique ability to bring people together and forget our differences,” wrote Göran Marby, president and CEO of ICANN, in a blog post dedicated to Kamel’s memory. Vinton Cerf, co-developer of TCP/IP, chimed in: “Our Internet community has lost a kindred spirit so devoted to the idea of a global Internet to hold and use in common.”

Kamel was 57.

Mark Hurd: Human calculator

Image credit: Oracle / Hartmann Studios (CC BY 2.0)

Mark Hurd’s corporate rise had a humble beginning: he started in 1980 as a junior sales representative at NCR Corp., which made computers, ATMs and point-of-sale systems. Hurd rose through the ranks, leading the company’s Teradata data-warehousing division before being named NCR president, COO, and finally CEO in 2003.

In 2005, he was recruited from NCR to become Hewlett-Packard’s president and CEO. Over the next five years, Hurd oversaw the layoff of 15,000 employees at the then-struggling company, as well as its 2008 acquisition of Ross Perot’s Electronic Data Systems (EDS). These and other strategic moves led the company to over five consecutive years of profit increases.

In 2010, Hurd joined Oracle as co-president, then served as co-CEO from 2014 until shortly before his passing. Hurd significantly grew not only Oracle’s cloud business, but also its employee base: hiring increased by 25% and included 1,300 students and recent graduates annually.

Sometimes called a “human calculator,” Hurd was known in the industry for his impressive memory and quick grasp of complicated information. “All of us will miss Mark’s keen mind and rare ability to analyze, simplify and solve problems quickly,” wrote Oracle co-founder and CTO Larry Ellison. “I will miss his kindness and sense of humor.”

Hurd was 62.


See also: Tech luminaries we lost in 2018 | 2017 | 2016 | 2015 | 2014

Copyright © 2019 IDG Communications, Inc.
