As technology becomes more ingrained in daily life, domestic abusers and human trafficking perpetrators are using it in insidious new ways that can target their victims even from a distance. That’s why Nicola Dell, a computer scientist at Cornell Tech, studies technology-facilitated abuse and how to stop it. Her pioneering work helps survivors of domestic violence and human trafficking so they can regain their personal and digital safety.
Dell’s research focuses on predicting and preventing potential actions from attackers who can bypass many types of security precautions simply through their in-depth knowledge of their targets. For example, instead of tracking someone by following them in a car or on foot, attackers can now surreptitiously track their target’s every move using the tracking technologies of smartphones or other digital devices.
While it is common for perpetrators to stalk, harass, or even impersonate the people they intend to harm, this type of technology-facilitated abuse is understudied by computer scientists. The new challenges make it “a very interesting [research] space from a human–computer interaction perspective,” says Dell.
In 2018, Dell co-founded the Clinic to End Tech Abuse at Cornell Tech. The first of its kind, CETA offers free consultations to survivors of domestic violence. The center helps them discover ways their devices and accounts may be compromised, as well as steps they can take to improve and maintain their digital security and privacy. Partly as a result of her work at CETA, Dell received a 2024 MacArthur Fellowship, a “no strings attached” award of $800,000 over five years that recognizes creativity and promise for future research.
Rosanna Bellini worked for Dell as a postdoctoral researcher before becoming research director at CETA. When she met Dell in 2019, Bellini said, “she struck me as someone who was incredibly intelligent” and “whose brain works at a million miles an hour.” But Dell’s intelligence isn’t the only trait that impressed Bellini, who is also a computer scientist at New York University.
“I felt like her interests in these areas were really genuine,” she says. “There was this element of wanting [to help people] …because it was the right thing to do.”
Falling into computing
Dell was born in Zimbabwe, where she lived until leaving for university. “I wasn’t someone who wrote code at five years old,” she says. She didn’t start using computers until she was a teenager in the 1990s. When she was about 13, “there was a computer in the school library,” she says. In her high school, she was one of the first students to choose computer science. At the time, her school had only enough computers for about 10 students, out of a class of roughly 150, to take the course. “I was really lucky to be offered this option,” says Dell.
This course helped Dell discover her passion and aptitude for computing, which led her to major in the field at the University of East Anglia in Norwich, England. “In many ways, I chose computer science because it looked cool” and “it seemed like a reasonable thing to do at the time,” she says.
This choice would set her apart in a way she hadn’t expected. Because she had previously attended all-girls schools, she was unaware of the gender disparity in computer science until she moved to England. It was only then that she realized she was one of a handful of women at this university pursuing a degree in computer science. She also found that there wasn’t much awareness of the gender gap or support for women who struggled to be among the few in the field.
“Being around a lot of men, many of whom had been coding since they were little,” was intimidating, Dell says. Changing majors wasn’t really an option because the British school system essentially requires high school students to choose their major when applying to college. “I just remember feeling intimidated and then toughing it out.”

When she began a Ph.D. at the University of Washington in Seattle, she was interested in research in computer graphics and computer vision. However, when she met her advisor, the late Gaetano Borriello, everything changed. Borriello was focused on how technology could help improve the lives of the underserved, and Dell found she was drawn to designing technologies that could work well in low-income or low-resource settings.
Now, Dell’s guidance helps students and young researchers from diverse backgrounds find their place in computing while working on problems with both academic and societal impacts.
“For me personally, academia was a world I didn’t have access to before,” says Ian Solano-Kamaiko, a Ph.D. student in computer science at Cornell Tech. He worked for several years as an engineer before beginning graduate school, where Dell served as one of his two advisors.
“Pursuing a doctorate with the goal of remaining in academia, particularly at [predominantly white] elite institutions like Cornell, is an extraordinarily opaque process characterized by unspoken rules, expectations and procedures,” Solano-Kamaiko says. “It can be disorienting and difficult to navigate. In this context, Nicki has been instrumental. She helped demystify these opaque structures, advised me on strategic approaches aligned with my career goals, and always advocated for my interests throughout my Ph.D.”
Solano-Kamaiko’s research focuses on informatics in health care settings, with an emphasis on studying how personal, social and environmental factors, such as where a person was born and resides, contribute to inequities that affect community and home health care workers. If he finds himself stuck on a problem, Dell encourages him “to just pick up the pen, to keep putting one foot in front of the other,” he says. “There is this faith that everything will work out.”
Behind the scenes of technology-facilitated abuse research
Dell began researching how technology can be misused in domestic violence in 2016. She later expanded the scope of her work to include studying the misuse of technology in human trafficking. In technology design, it’s common to consider the potential users of a piece of technology and how the design can best serve them, Dell says. “But we often don’t think about adversarial design, or misuse, as we like to call it.” A completely different approach is needed “to protect yourself from someone who lives in the same house or knows your children, knows their birthdates, has access to your email accounts and can open your computer while you’re in the shower,” she says.
For example, she and her colleagues developed a new algorithm to identify applications that may be used for harassment, identity theft, fraud, theft and concealment of information. “Thanks to our work, the Google Play Store has already removed hundreds of apps for policy violations,” the researchers wrote in a 2020 conference proceeding.
Dell and her colleagues also created a new framework to analyze passwordless authentication systems. In these “passkey” services, users can unlock a device or account with a fingerprint, face scan, or PIN rather than providing a password. While these systems may be easier for legitimate users to navigate, they can also be turned against at-risk users. For example, an attacker can log into a victim’s smartphone using a known PIN and then add their own fingerprint in the device settings. Even if the victim later changes that PIN, the attacker can still access the phone without authorization.
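The attack described above boils down to a simple state-machine flaw: enrolled biometrics are stored as a separate credential list, so rotating the PIN does not revoke them. A minimal toy sketch, with class and method names invented for illustration (not any real vendor’s implementation), makes the logic concrete:

```python
class Device:
    """Toy model of a phone lock screen. Illustrative only; all names
    and behavior are invented for this sketch, not a real system."""

    def __init__(self, pin):
        self._pin = pin
        self._fingerprints = set()  # enrolled biometric credentials

    def unlock_with_pin(self, pin):
        return pin == self._pin

    def unlock_with_fingerprint(self, fp):
        return fp in self._fingerprints

    def enroll_fingerprint(self, pin, fp):
        # Enrollment requires only knowledge of the *current* PIN.
        if pin == self._pin:
            self._fingerprints.add(fp)

    def change_pin(self, old_pin, new_pin):
        # The flaw: changing the PIN does not clear enrolled fingerprints.
        if old_pin == self._pin:
            self._pin = new_pin


phone = Device(pin="1234")

# An abuser who knows the victim's PIN enrolls their own fingerprint.
phone.enroll_fingerprint(pin="1234", fp="attacker-thumb")

# The victim later changes the PIN...
phone.change_pin(old_pin="1234", new_pin="9876")

# ...but the attacker's enrolled fingerprint still unlocks the device.
assert not phone.unlock_with_pin("1234")
assert phone.unlock_with_fingerprint("attacker-thumb")
```

A fix in this toy would be to clear (or at least re-confirm) enrolled credentials whenever the knowledge factor that authorized them is changed.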
Dell and her team examined 19 passkey services in their study and found that “in the most egregious cases, faulty implementations of major services supporting passkeys allow continued illicit access with no way for a victim to restore the security of their account,” they write.
When Dell discovers such vulnerabilities, she notifies technology companies of the problems with their products and suggests potential fixes. “These requests are received differently depending on the company,” she says. “It also partly depends on how complex or difficult it is to make changes.”
One of the big challenges is negotiating the “dual-use nature” of technologies that have both legitimate uses and potential abuses, Dell says. Sometimes this dual nature can be addressed with a few careful design considerations. For example, she and her collaborators note that parental monitoring apps that track children’s locations could be misused by perpetrators of domestic violence to track adults without their knowledge. The finding comes with a clear message for technology companies, Dell says: tracking technology doesn’t have to be secret.
“If someone is tracking your location, there should be a warning,” if there isn’t one already, Dell says. Making this change does not prevent legitimate use of these applications. “Even if it’s a child, the child needs to know that ‘Mommy can see where you are,’” she says.
Balancing security against ease of use is another technology dilemma. When someone is locked out of an account, for example by entering the wrong password too many times, technology companies often offer alternative routes back in: users can regain access by answering security questions or entering an old password. While these “essentially roundabout methods” are a boon to legitimate users, Dell says, they are also easily exploited.
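The recovery-path problem above can also be sketched as a toy model. The class and field names here are invented for illustration (no real provider works exactly this way): the point is that recovery channels accept credentials, such as a security answer or an old password, that an intimate partner is especially likely to know, bypassing both the lockout and the current password.

```python
class Account:
    """Toy model of account recovery. Illustrative only; invented
    names and logic, not any real provider's implementation."""

    def __init__(self, password, security_answer):
        self._password = password
        self._old_passwords = []  # remembered for the recovery path
        self._security_answer = security_answer
        self.locked = False

    def log_in(self, password):
        return (not self.locked) and password == self._password

    def change_password(self, new_password):
        self._old_passwords.append(self._password)
        self._password = new_password

    def recover(self, security_answer=None, old_password=None):
        # Recovery bypasses the lockout and the *current* password.
        if security_answer == self._security_answer:
            self.locked = False
            return True
        if old_password in self._old_passwords:
            self.locked = False
            return True
        return False


acct = Account(password="hunter2", security_answer="Fluffy")
acct.change_password("new-secret")  # the victim rotates the password

# The old password no longer works at the front door...
assert not acct.log_in("hunter2")

# ...but an abuser who knows personal details or an old password
# can still get in through the recovery path.
assert acct.recover(security_answer="Fluffy")
assert acct.recover(old_password="hunter2")
```

For a stranger, "things only the owner would know" are a reasonable recovery factor; for an abuser in the same household, they are exactly the information the threat model assumes the attacker has.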
Dell has interviewed survivors of human trafficking and professional advocates on how technology has been used to coerce and control them and how it could be used to help them regain their digital safety and security. She also has analyzed online forum entries written by alleged intimate partner abusers detailing how they used technology to monitor survivors. This work focuses on understanding the foundations of abuse to guide conversations about how to end it, Dell notes. For example, her work involves studying how to identify crucial moments in the cycle of domestic violence where interventions could be safely applied to prevent or mitigate harm to survivors.
At CETA, Dell and her team also encourage tech professionals to give back to the community in ways that might be new to them. In opening the center, “one of the things we were trying to do was create models to encourage more volunteer technology work,” she says. Such pro bono efforts are less common in the tech sector than in other fields, such as law, she says, and she has found that students and professionals crave these opportunities. Technical volunteers are trained on topics such as domestic violence, trauma-informed care, and setting boundaries.
The center also attracts a different type of volunteer: social workers who want to expand their technological skills to better understand what to look for and how to help people mitigate harm. Through these interdisciplinary partnerships, everyone has the opportunity to develop their skills to help survivors of real-world abuse continue to regain their digital safety and security.
“Anyone,” says Dell, “can be trained to make a difference.”