TWO ROADS CONVERGED
BY JENNIFER DAVIES BUB P’23, ’25, ’27
Illustration by Angela Hsieh | Photography by Tom Kates

We live in a world where our daily routines are increasingly dependent on computer algorithms. Some make automated decisions that help us in our day-to-day lives, like GPS apps. Others are used to build artificial intelligence (AI) that takes over specific tasks—just ask Siri or Alexa.

 But some algorithms have wider-reaching social and cultural consequences, determining anything from mortgage and loan approvals, to the allocation of health care and social services, to medical diagnoses. When the data used to build those algorithms is incomplete, misapplied, or biased, the resulting automated decisions can create negative consequences that in many cases may affect communities that are already marginalized, underserved, or underrepresented. At the core of this problem is the fundamental truth about algorithms: Garbage in, garbage out. And likewise, bias in, bias out.

That’s why Kasia Chmielinski ’02 and Veronica Rotemberg ’02 are on a mission: to create tools and practices that encourage responsible AI development by addressing inequity and prioritizing transparency and inclusion.

 
Curious-Minded
For more than a decade, Kasia, a technologist at McKinsey & Company and an affiliate at the Berkman Klein Center for Internet and Society at Harvard University, has been developing technology and analyzing data in academia, industry, and government. Tackling projects on the leading edge, Kasia, who uses they/them pronouns, worked with product teams at Google on voice recognition technology in its infancy, and completed a tour of civil service with the United States Digital Service, an elite technology unit within the Executive Office of the President, gathering data and building tools to address critical issues, including the opioid crisis. A self-described tinkerer, Kasia sees a common thread throughout their work: the belief that “technology should help us move forward without mirroring existing systemic injustice,” and that the key to better AI lies in the data.

In 2019, Kasia reconnected with Veronica via text, catching up about life, the state of the world, and ultimately about their work. The two quickly discovered that their shared passion for the sciences and problem solving—nurtured first in Ms. Denise Labieniec’s classroom, and then at Harvard, from which they both graduated as physics majors in 2006—wasn’t the only thing they still had in common.

Veronica, a dermatologist who also has a Ph.D. in biomedical engineering, is the director of the imaging informatics program in the dermatology service at Memorial Sloan Kettering Cancer Center, and she leads the AI working group for the International Skin Imaging Collaboration (ISIC), an academic and industry group whose goal is to reduce deaths caused by melanoma. In her work and research, she found herself looking to technology to save lives through early melanoma detection. An absence of proper labeling and documentation of available data, and a historical issue getting representative data across skin tones, were proving to be barriers, particularly with regard to understanding the potential for automated diagnosis in diverse skin types.

When Kasia shared the work they had underway as the co-founder of the Data Nutrition Project (DNP)—a project that spun out of a fellowship funded through the Assembly program at the Berkman Klein Center and the MIT Media Lab—Veronica was eager to hear more.

A cross-industry, independent nonprofit, the DNP tackles the challenges of ethics and governance in AI. Kasia was looking for datasets to analyze and work with to begin developing standards, and Veronica was creating and sharing datasets in conjunction with her ISIC research. With a few fateful keystrokes, the stars began to align.

“That’s the beauty of having this network of incredibly smart people who are doing impactful work. Your private and work life can start to overlap in these meaningful ways,” Kasia recalls. “Veronica was driving a conceptual shift in thinking about representation and diversity in data in this really important way in practice. And I’m over here tinkering with these same concepts.” And then it clicked: “We realized that the things we were doing were actually two sides of the same coin.”

Veronica adds, “People who weren’t curious could have thought we were working in different fields, on different things that weren’t related…But both of us saw that there was this overlap. And instead of just moving on, we thought, ‘There’s something here. We should explore it.’”

Garbage In, Garbage Out
“When it comes to AI, you are what you eat,” Kasia says at the start of any talk or workshop, to set the stage for understanding the fundamentals of how algorithms work. “Developers train algorithms with data, and the algorithms start to see the patterns in that data, and then they can make predictions based on what they’ve learned from what we’ve fed them.” So what do you do when the data you’ve fed the algorithm doesn’t match what it’s going to face in the real world? That’s exactly what Kasia’s DNP team spends most of its time thinking about.
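The mismatch Kasia describes can be made concrete with a deliberately tiny sketch. The code below is purely illustrative (it is not DNP or ISIC code, and the labels are invented): a "model" that simply learns the most common label in its training data looks accurate on a skewed training set but does no better than a coin flip on the balanced population it meets in the real world.

```python
from collections import Counter

# Toy illustration only: a "model" that learns nothing but the
# majority label in whatever data it is trained on.
def train(labels):
    return Counter(labels).most_common(1)[0][0]

def accuracy(predicted_label, labels):
    # Fraction of examples this constant prediction gets right.
    return sum(1 for y in labels if y == predicted_label) / len(labels)

# Skewed training set: 9 benign images for every melanoma.
train_labels = ["benign"] * 90 + ["melanoma"] * 10
model = train(train_labels)            # learns to always say "benign"

print(accuracy(model, train_labels))   # 0.9 -- looks great in the lab

# A real-world population with a different balance exposes the problem.
real_world = ["benign"] * 50 + ["melanoma"] * 50
print(accuracy(model, real_world))     # 0.5 -- a coin flip
```

Real classifiers are far more sophisticated, but the failure mode is the same: the model can only reflect the data it was fed.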

“We need to make it easier for practitioners to quickly assess the viability and fitness of datasets they intend to train AI algorithms on,” says Kasia. Recognizing that the missing step in the AI development pipeline is an assessment of datasets based on standard quality measures that are both qualitative and quantitative, the DNP team devised a solution: Package these measures into an easy-to-use dataset nutrition label.

The label takes inspiration from nutritional labels on food packaging, highlighting the “ingredients” of a dataset to help shed light on whether it is healthy for a particular statistical use case. The goal is to mitigate harms caused by statistical systems (e.g., automated decision-making systems, artificial intelligence, advanced analytics) by providing at-a-glance information about the dataset, mapped to a set of common use cases.
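To make the idea concrete, a dataset nutrition label can be thought of as structured metadata shipped alongside the data. The sketch below is a hypothetical example: the field names are invented for this article and are not the Data Nutrition Project's actual label schema.

```python
import json

# Hypothetical label structure -- illustrative only, not the DNP schema.
label = {
    "name": "example-skin-lesion-images",
    "description": "Dermoscopic images labeled benign or melanoma",
    "provenance": {
        "collected_by": "participating clinics",   # who made the decisions
        "funding": "unspecified",                  # who paid for collection
    },
    "composition": {
        "n_records": 100,
        "label_balance": {"benign": 0.9, "melanoma": 0.1},
        "skin_tone_coverage": "limited; lighter tones overrepresented",
    },
    "intended_uses": ["research on lesion classification"],
    "alerts": ["class imbalance", "underrepresented subpopulations"],
}

# A practitioner could scan the label before ever training a model.
print(json.dumps(label, indent=2))
```

Like a food label, the value is in the at-a-glance warnings: a reader sees the class imbalance and the coverage gap before feeding the data to a model, not after.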

“The analogy of a food nutrition label is a great example of an agreed-upon standard built for a consumer audience,” says Kasia. It also helps shift the focus from the algorithms, the model, and the outcomes of the model as the source of the problem, and directs it to the data upon which the models depend.

To develop a label prototype, the DNP team needed working datasets that had an impact along a few different vectors: they were being used to build AI, they were about people, and ideally, they included some sensitivity to subpopulations and attributes (like race or gender).

“I also wanted to have access to the person who built the dataset so that we could ask questions of a technical and a socio-technical nature—where it came from, who made the decisions, who funded it,” Kasia says. “So when it came up that Veronica had this dataset, and she was literally publishing it for people to build AI—there’s literally a competition every year!—I thought, ‘This is perfect!’”

They began working together, and soon made a seismic discovery. “All of a sudden, it’s like you can’t unsee what you’ve seen,” says Veronica. Through the process of dissecting the sources and decisions behind the data, the need for transparency became clear. “I realized, ‘You’re the only person qualified to analyze what the potential biases are in the dataset because you built it,’” says Veronica. “And then, you become so tuned to looking for them that you are able to recognize them everywhere.”

“It’s really cool to see that realization play out from the perspective of the label, because that’s exactly what we’re trying to do,” adds Kasia. Just like we want to know what’s in food before we eat it, “before you feed [data] to your model, you want to be able to know what’s in there, and see that it’s healthy. The whole idea started from there.” The ultimate goal is to have data users recognize that every dataset has a state of health, and to think about the qualities in the data even when there is no label.

The Art and Science of Collaboration

Today, the two talk excitedly about the AI Label for Melanoma Classification, which outlines the contents of the data and its potential use cases, and flags alerts to consider as warnings—the kind of transparency that was previously nearly impossible to achieve. Their collaboration, their progress, and what comes next are all points of pride and excitement. “It was immediately obvious that it was important work as soon as it was done,” says Veronica. And Kasia praises Veronica for being willing to take the risk and follow the trail where it led: “It’s very laudable. And I’m very fortunate for that…It’s a risk to call out flaws in things that you’ve built or others have built.”

From the outset of their collaboration, Veronica says, “I was finding, with our ISIC collaborators and the support of funding from the Melanoma Research Alliance, that algorithms for melanoma detection were impacted by even more than we had originally thought.” She realized that more aspects of the datasets needed labeling than were included in the current DNP label schema: the source, certain image characteristics, and even the clinics in which the clinical data originated. “For example, we found that, on average, algorithms were able to identify melanomas in images from some clinics over 90 percent of the time, whereas in other clinics they could do so only about 1 percent of the time,” Veronica says.

“That would make a huge difference to you as a patient, if you were in the clinic where algorithms correctly identified melanomas 1 percent of the time—that’s significantly worse than random chance!”

A dataset nutrition label would ideally capture that kind of information, highlighting the known issues and providing mitigation strategies for practitioners. Without the label, you wouldn’t know if you were in a clinic that had high or low accuracy rates of identifying melanomas. “It really highlights the important work that DNP is doing to begin the conversation, and the way that collaboration with AI researchers is so key to improving the outcome for a given application,” Veronica says. Shifting the focus of her research primarily to when detection algorithms do and don’t work, and the challenges involved in deploying them, she says the necessary work of investigating potential biases—not just between clinics but across skin tones and underrepresented populations—is ongoing.

Today, the nonprofit has tremendous momentum, and Kasia and Veronica both express their excitement and gratitude that so many others who weren’t part of the initial fellowship are joining the effort, lending their passion and expertise. From practitioners, technologists, researchers, and policy experts, to artists and designers, “it’s really a labor of love,” says Kasia. With the label at the center, the work continues to raise awareness, drive change, and ultimately shift the paradigm.

“I feel privileged to have had a few moments like this in my life, where you realize you are at the forefront of something,” says Kasia. “Actually, first, it’s terrifying, because you think, ‘If we are the experts, what does that say?’ But it’s a great moment when you realize you are at the precipice of this thing, with your friend. And you’re looking around going, ‘Did we take the wrong road? I think we’re the only ones up here!’”

The two scientists also have a deep appreciation for how unique and rewarding their partnership is. “I think we both probably would have done a couple of hours of extra work just to help each other, and because we believe in the mission. But then it turned into this thing that we both really care about, and that really matters in its own right,” says Veronica. “It kind of flies in the face of a lot of the tropes about how scientific collaborations happen,” Kasia adds. “And it’s lovely, when you can have a good time and also have a good collaboration,” they say. “Because it’s not always like that. It’s a lot about titles, and money, and publishing rights, and IP—things that are very hierarchical and patriarchal, honestly. And so to be able to turn that on its head a little is really fun.”
 
Two Sides of a Coin
When Veronica and Kasia reflect on their individual paths, some essential common themes emerge: Curiosity. Conviction. Gratitude. Humility. Passion for work and for life, and synergy between the two. Courage and determination to overcome limitations, whether stated or implied. And the deeply rooted ability to be comfortable in a space that’s uncharted, disruptive, and ultimately a driver for systemic change.

Both are children of immigrants, raised with an understanding of mixed cultural experiences and a framework of expectations. They learned to embrace being “at the intersection of many things.” Kasia’s mother is an artist, and they say they were surrounded by art and music at home but always felt drawn to math and craved order. Veronica’s mom is an engineer who once pulled Kasia aside in high school to explain the pros and cons of that career, and who has remained a powerful resource and role model as Veronica pursued her passion for science and medicine.

Self-described “nerds” at Winsor, Kasia and Veronica were kindred spirits, at their best when faced with a problem to solve in the classroom, or during Math Olympiads, robotics club, and engineering competitions. They recall a defining experience at one engineering competition when, as the only all-girl team, they proved to be a formidable force. “It was so hard! We were certain we did terribly,” says Veronica, who was the team captain. Listening to all the other teams talk about how well they thought they had done, the team feared the worst. “We tried to convince our mentor to let us get on the bus and go home before the awards were presented.” When they learned they had placed second, it was eye-opening. “All the other teams were either all, or largely, male,” Kasia says. “Clearly, there was a difference in the way we saw ourselves in that environment. But we kicked their butts!”

“That was the proudest I had ever been at that point,” Veronica recalls. That kind of experience played out over and over again in her life, first in college, then in her work, and she says it taught her an important lesson about trusting her instincts and abilities: “I learned I had to stop being self-deprecating.”

Eager to help others learn from their experiences, and to support and witness the emergence of the next generation of scientists, both alums mentor students at all levels of learning, and have returned to Winsor to speak with students. Joining one of Ms. Labieniec’s classes in 2020, and a virtual assembly in 2021, they shared insights on a variety of topics including the fallibility of machines, opportunities to address bias in science and technology, and the importance of pursuing a unique career and life path. Embracing opportunities to educate and inspire others about the future of technology, Kasia also frequently joins podcasts, including Youth AI, and gives talks at programs like Girls Who Code; and is working on a children’s book to deliver the same messages to a younger audience.

Veronica, who says she benefited from the mentorship of strong women in science throughout her education and now finds great satisfaction in mentoring medical students, welcomed the opportunity to mentor Alex Gorham ’21 for her senior Independent Learning Experience (ILE) in the spring of 2021. Alex worked on a website and a cell phone app in development to help individuals upload images and review AI feedback.

“It was a great project because it was so applied,” says Veronica, and it helped Alex “get a sense of what the day-to-day life of a scientist is. The lab meetings, thinking about things that don’t go right the first, or second, or third time around.” Alex was awarded Winsor’s Madras Science Prize in June 2021, an acknowledgement of her commitment to science and her dedication and success in her ILE; Veronica beams when talking about Alex’s work ethic and accomplishments.

Full of humor and humility, the two alums also reflect on how their lives—like their work—embody two sides of the same coin, both following their passion, connected by an authentic, supportive network, but with some notable distinctions. “Veronica is on a path I would self-eject from regularly,” Kasia says. “It’s rigorous, and structural, and hierarchical.”

Veronica interjects with a laugh, “In my defense, I text Kasia to complain about those things all the time!” Kasia continues, “Honestly. She’s got kids and a partner, she’s a working mom, a superhero. My life is completely different, even though we have the same beginnings really. We went to the same schools, studied the same things—though she studied more!—in undergrad. And then, we meet again.”

“We graduated 20 years ago. It took us as long as some of the students we work with have been alive to reconnect and start working together! And all of those meandering paths are really informing what we are doing now,” Veronica says. “I don’t think I would be doing this if I weren’t also a doctor, and also an engineer, and also all the parts of all of the other things I’ve learned.”

Find What Fits
Kasia offers this advice for students interested in studying science: “I wish someone had told me that you don’t just get one shot to be a genius—make it or break it by the time you’re 21. That is something you see a lot in technology. It’s the lone-wolf trope of a guy who’s a hacker in his basement…We have this vision in our mind about how you make an impact as an individual, especially around technology and science. And I beg to differ. I think that it’s about the people you get to work with, and…[e]ach experience you have from the past helps you craft a better version of yourself for the future. Every step you take can be a better step.”

Veronica concurs. “There’s no expiration date on when you can have a career peak, or flash of brilliance, or huge impact,” she says. “Believing that there is hurts nontraditional careers and paths, and even family structures. I have two young kids at home, there’s a limit to what I can reasonably do in one day. So that just becomes part of the process, thinking about whether something is really important and needs to take priority over other projects. And deciding I’m going to do it over 10 years rather than five.” Echoing Kasia’s earlier advice, she adds, “You never know what the next step you take is going to inform, or the step after that, or what you’ll be doing 20 years from there. So the only thing you can do is put one foot in front of the other.” It illustrates that “you don’t have to choose a career path thinking ‘this is the one that is successful.’ You have to find the one that fits you.”

A final word of advice: “Go toward the thing that gives you energy,” says Kasia. It won’t always be easy, and you will have to work hard, “But if you are moving forward, constantly questioning, and doing things that give you joy and are giving back to you, that, to me, is a direction.”