
IBM’s Stacy Hobson wants to build tech that works for everyone

Hobson and team are focused on surfacing the potential risks of new technologies before they get built.

Why do we need responsible AI?

It wasn’t long into her career as a systems test engineer that Stacy Hobson flagged an unintelligible error message and was told to let it go because the system was working just fine. “I thought to myself, ‘Why would we create error messages that a human wouldn’t understand?’” she said.

That question inspired her to go on for a PhD in cognitive science, to learn how to advocate for more interpretable and human-centered technologies. Today, Hobson leads a team of researchers developing tools and best practices to help other IBMers and their clients anticipate the larger impacts of their work. Her office is helping expand the notion of what counts as well-functioning technology to include human and societal considerations like fairness, accessibility, and equity.

“Ultimately, I want to leave the world a better place,” she said. “I don’t want my daughters to have to think about bias or discrimination with AI – or neurotech, or quantum, or some other emergent technology. I want the world to be different for them.”

Overcoming biases to think critically about technology

The history of technology is filled with complex and sometimes unexpected outcomes. When introducing Responsible and Inclusive Technologies (R&I), the initiative Hobson leads, she opens with examples of algorithms gone awry: racist chatbots, airport screeners that are biased against transgender people, ranking systems for kidney transplants that prioritize white patients over Black patients. “The message I want to get across is that the technologies we’re creating have a real impact on people’s lives,” she said.

Over the past decade, reducing algorithmic bias has been an active area of research, with IBM releasing the industry’s first bias-mitigation software in 2018. Two years later, George Floyd’s murder by police put renewed urgency on addressing systemic racism and lifting the barriers facing Black Americans. The event triggered a period of self-reflection in the tech industry on its role and responsibility in promoting diversity and advancing social justice.

A month after Floyd’s death, IBM announced its decision to no longer sell general purpose facial recognition or analysis software. The move led Hobson to propose a larger effort investigating the societal impacts of AI and other emerging technologies. She lobbied for a research group dedicated to the development of responsible and inclusive tech at IBM Research, which she launched in 2021.

Since then, Hobson and her team of 15 research scientists and engineers, based in Yorktown Heights, N.Y., Rio de Janeiro, and Zurich, have developed a suite of critical-thinking tools. The goal is to catch problematic concepts and designs early on, before the technology is built — and in some cases, determine whether it should be built at all. One of the tools, a chatbot called Impact Bot, helps researchers communicate the impacts of technologies they're developing and identify ways of mitigating harm. Another tool, Responsible Tech Cards, helps researchers examine the technologies' intended uses and potential outcomes. The team is currently refining Responsible Tech Cards based on internal testing and feedback.

Tools like these are essential, said Hobson, to help technologists anticipate and mitigate potential harm. Engineers are trained to think that almost any problem can be solved with technology, a bias known as techno-solutionism. “When we default to this thinking, we often miss important context, like disparate impacts, social considerations, and unequal access to technology,” she said.

“This is what I want to do for the rest of my life”

Tech was not an obvious career choice in rural South Carolina where Hobson grew up, the youngest of five children. She might have never stumbled on computer science were it not for a summer enrichment program during high school. “I just knew,” she said. “This is what I wanted to do for the rest of my life.” She majored in computer science at South Carolina State University, against the advice of her mother, who was a schoolteacher. “I grew up in a way that you don’t argue with your parents,” she said. “This was the one time I pushed back. I said, ‘I want to do this!’”

At South Carolina State, a historically Black university, Hobson began to delve deeper into her interest in computing, becoming one of just two women to graduate in computer science that year. As a junior, she was awarded an IBM-sponsored GEM fellowship for underrepresented minorities in STEM, earning a full ride to Duke University for her master’s.

At Duke, she landed a series of summer internships at IBM, where she met her husband, Arthur, at a company picnic. After finishing her degree, she joined IBM’s Poughkeepsie office, where she worked until she flagged that fateful error message. After earning a joint PhD in neuroscience and cognitive science at the University of Maryland in 2007, she returned to IBM and joined IBM Research.

At Research, she has embraced timely, socially relevant problems. She arrived at the height of the financial crisis, as banks were grappling with how much debt they could safely carry; Hobson’s risk-modeling work provided answers. In the aftermath of Hurricane Sandy, she worked on data-sharing research to help government agencies better coordinate services. And in 2018, she founded an internal grassroots research initiative at IBM called Tech for Justice, which helps create tech solutions to combat discrimination.

Hobson credits her determination with getting her to where she is today as an executive at a major tech company raising two young children. “Most of what I’ve done in my life, I’ve said, ‘I want to do this, and I will do it regardless of the barriers,’” she said. “I will give myself a maximum of 24 hours to whine about it being difficult. Then I just do it.”

She brings the same doggedness to her work. Convincing people to consider the societal impacts of tech in a capitalist economy driven by short-term profits is no easy task. At IBM Research and across the company, she is known for her persistence and passion, which come across both in the boardroom and with her team.

“She tells you exactly what is wrong and what is right,” said Sara Berger, a research staff scientist who reports to Hobson.

Opening the tech silo to regular people

In two years, Hobson has brought together a tight-knit group of researchers from diverse backgrounds who share her commitment to driving change in tech. True to R&I’s inclusive mission, she is known for encouraging and supporting her reports and sharing the spotlight with them. “Stacy seems like she came straight out of a leadership book,” said Berger. “I’ll read something about how good leaders create leaders and say, ‘she does that!’”

In May, the team will present two papers at the top conference for research on human-centered technology, ACM’s Conference on Human Factors in Computing Systems (CHI). The team is also in the process of open-sourcing four of their tools, including Responsible Tech Cards and Impact Bot.

One major goal for this year, Hobson said, will be to put together a consortium of nonprofits, technologists, and everyday people to draft standards for what responsible technology should look like.

“Technology today is designed, developed, and tested by technologists in a silo,” she said. “For our work to have real impact, we need input from regular people, including those who stand to be negatively impacted and who often don’t have a say.”

Rather than have technology foisted upon communities, Hobson wants to reverse the paradigm. “It’s important to hear the perspectives of regular members of society,” she said. “We want the input to be collaborative and not extractive.”