Martha Bird has one of the more unusual positions in the technology space: She's a business anthropologist. Bird, who ran a family farm in New Hampshire before earning an anthropology Ph.D., has developed her craft over 15-plus years at a variety of organizations: a nonprofit, a telecommunications company and an e-commerce firm.
Her niche is helping global brands optimize their systems to meet the needs of users across multiple markets. She now works in the Innovation Labs at ADP, a provider of human resources management software and services headquartered in Roseland, N.J. "I was brought in to ensure that we are addressing real human needs in the tools we build," she says. Here, she explains more about her work:
What does your job entail? My role is always about thinking about the intersections of technologies and people or, put another way, about the human-machine relationships in cultural context. ADP has embraced design thinking as a guiding UX principle and approach because we know that ultimately our tools need to meet real human needs.
I think of my work as a complement to the work already being undertaken in this area, just as understanding a user's journey must also account for the cultural landscapes (organizational culture, national culture, geography, tech infrastructure, gender) on which these journeys are mapped. One of my main areas of focus right now is designing technologies that work for people, such as conversational user interfaces and chatbots, and how to build these tools so that people can get in and get what they need without digging around and pogo-sticking across systems.
What does an anthropologist bring to work on chatbots? I pay attention to how the conversations and interactions we are developing are crafted. We [at ADP] serve different professional audiences across different geographies and different cultures and so we need to think carefully about the personality and tone of voice of our system.
We also need to be mindful that what might warrant a high-priority notification, or even how frequently a notification is sent, is culturally dependent. For instance, a German and a Brazilian might not have the same sense of urgency around time and scheduling. We call these "cultural precisions," which need to be accounted for as we build out our systems.
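The idea of culturally dependent notification behavior could be sketched as a locale-keyed policy table. Everything here is an illustrative assumption (the locale codes, thresholds and function names are invented for the sketch), not a description of ADP's actual system:

```python
# Hypothetical sketch: locale-aware notification policy.
# All values are illustrative assumptions, not real product settings.

NOTIFICATION_POLICY = {
    # locale: (minimum priority that warrants an alert, max alerts per day)
    "de-DE": ("high", 3),    # assumed preference for fewer, high-priority alerts
    "pt-BR": ("medium", 8),  # assumed tolerance for more frequent reminders
}

DEFAULT_POLICY = ("high", 5)
PRIORITY_RANK = {"low": 0, "medium": 1, "high": 2}

def should_notify(locale, priority, sent_today):
    """Decide whether to send a notification, given the user's locale."""
    min_priority, daily_cap = NOTIFICATION_POLICY.get(locale, DEFAULT_POLICY)
    if sent_today >= daily_cap:
        return False
    return PRIORITY_RANK[priority] >= PRIORITY_RANK[min_priority]
```

Under these assumed settings, the same medium-priority event would reach the Brazilian user but not the German one: `should_notify("pt-BR", "medium", 1)` returns `True`, while `should_notify("de-DE", "medium", 1)` returns `False`.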
Why are people's relationships with technology so important? Any technology development and deployment happening today not informed by systems-level thinking is going to create issues upstream. What do I mean by this? I mean that in order to solve for the challenges of any particular step in a process, you need to have an understanding of the entire ecosystem in which that process lives. You need an understanding of the technical infrastructure, compliance particularities, company culture, geographic and cultural considerations, etc.
Without understanding how each informs and influences the others, you will end up deploying systems that are brittle and linear: poorly integrated, and poorly aligned with the practices, the people and, ultimately, the place where they are implemented.
What design flaws can doom a technology? It strikes me that we're in a period that's exponentially turbocharged, and new and emerging technologies are coming fast and furious. The technology itself has outpaced our imagination on how to use it. People tend to embrace these new technologies just for the sake of embracing them. So you have a reimagination of older tools dressed up like they're new, and they're not serving any practical utility. So using the latest and greatest technology to build something that people won't use is a major design flaw.
Can you give me an example? If I could order a pizza in three clicks, I'm not sure why I'd want a chatbot to come on asking me seven questions. It's that sort of bloating for the sake of the new.
What surprising observation or insight have you made in your work? I find it interesting how practitioners learn to navigate tools that may not be serving them well, how over time they develop their own grassroots workarounds and local shortcuts. These homegrown solutions become a kind of personal accomplishment and a source of pride, so much so that making a system easier to use and more intuitive may not immediately be welcomed despite requests for such streamlining.
You've talked about how to work on cross-functional development teams efficiently. What's the recipe for that? It has been my experience that adding variety of background and expertise to the mix really helps open up new perspectives. Greater team diversity tends to introduce new ways of thinking, which in turn can have a very positive impact on product.
It's about bringing someone like me, a social scientist, onto a team heavy with developers. It's about creating a team where people respect each other's opinions and are open to that creative friction. We live in an increasingly global world and we're building our tools for a global audience, and we have to expand our teams to build products that are more representative of the people who are in the world.
Martha Bird and her mother, Patricia Pond, around the year 2000.
How do you build for a diverse audience? I'm a big proponent of spending time with the people we're designing for and asking ourselves, "What would we want, how would we use this, how would I want to be treated?" A lot comes from that, essentially trying to understand what's relevant to others.
Can this thought process be taught? It can be learned. It takes a commitment to a certain kind of vulnerability of remaining silent, so you can actually listen. It's also about the way one listens: listening disengaged from self-interest. It's about stepping back and listening with respect. I call it "active listening."
Do enough IT people do this? I think on balance it's not one of their strongest aptitudes. I think tech folks are focused on writing code and developing algorithms but would do well to be more exposed to the end users and see the psycho/emotional/utility end. I find developers are amenable to that, but they don't get enough opportunity to do it.
You've also used the term "deeply hanging out" to describe your work. What's that? It goes back to the idea of respectful active listening. I'm not entering people's spaces with a clipboard and a camera, and recording them as if they were human specimens. I'm trying to understand what their expectations are in life generally and how that actually relates to their work life.
I spend time in places where people make meaning with the tools they use. If it's a software system for practitioners, I spend time in their office or cube or wherever it is they're working and pay attention.
What are "contextually sensitive conversational cues"? In conversational interfaces, it's incredibly important to understand both your audience and your own voice, and how the tone should shift from one situation to another.
For example, the tone will differ between a positive situation (like presenting a job offer) and a potentially negative one (starting employment separation). If you don't adjust for these different audiences and scenarios, conversations can easily come across as unnatural or, worse, offensive.
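One minimal way to picture scenario-sensitive tone is a lookup from scenario to a matching voice. The scenario names, tones and templates below are hypothetical stand-ins, not real system content:

```python
# Hypothetical sketch: choosing a conversational tone per scenario.
# Scenario names and templates are illustrative assumptions only.

TONE_BY_SCENARIO = {
    "job_offer": "celebratory",
    "employment_separation": "neutral_supportive",
}

TEMPLATES = {
    "celebratory": "Great news, {name}! We have an update for you.",
    "neutral_supportive": "Hello {name}. We have some information to share with you.",
}

def open_conversation(scenario, name):
    """Pick an opening line whose tone matches the scenario."""
    # Default to the restrained tone when the scenario is unknown,
    # since an overly cheerful opener is the riskier failure mode.
    tone = TONE_BY_SCENARIO.get(scenario, "neutral_supportive")
    return TEMPLATES[tone].format(name=name)
```

The design choice worth noting is the fallback: when in doubt about the scenario, the sketch errs toward the neutral voice rather than the celebratory one, for exactly the reason Bird gives about conversations coming across as offensive.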
What's a "consistent personality" in chatbots? Before we started creating conversations, we identified the key personality attributes we wanted for our conversational user interface. They defined not only who this system was and what it was like, but also what it wasn't.
It was important that the system be proactive but never pushy, and friendly in a way that felt authentic, not forced. We refer to those attributes regularly while we create new conversations and content. We can ask ourselves, "Is this something the system would say?" If the answer is no, we adjust the conversation. That's what it's like to ensure consistent personality.
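The "Is this something the system would say?" question could be turned into a lightweight automated check on draft content. The attribute rules below are illustrative assumptions about a persona that is proactive but not pushy, friendly but not forced; they are not ADP's actual guidelines:

```python
# Hypothetical sketch: a "would the system say this?" lint for draft lines.
# The banned-phrase lists are invented examples of personality violations.

PUSHY_PHRASES = ["act now", "last chance", "you must"]
FORCED_FRIENDLY = ["bestie", "!!!"]

def fits_personality(line):
    """Return a list of personality violations found in a draft line."""
    lowered = line.lower()
    violations = []
    for phrase in PUSHY_PHRASES:
        if phrase in lowered:
            violations.append(f"pushy: {phrase!r}")
    for phrase in FORCED_FRIENDLY:
        if phrase in lowered:
            violations.append(f"forced: {phrase!r}")
    return violations
```

A draft like "Act now to claim your benefits!" would be flagged as pushy, while "Your timesheet is ready to review." passes with no violations; in practice a team would review flags by hand rather than block content automatically.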