
Focusing on advancing safety and sustainability, UL grows its partnership with Northwestern

The safety of a new smart TV may be straightforward to certify based on current standards, but that same TV might become a fundamentally different product after a couple of software updates. Who can certify that it is just as safe as when it was bought? What is the benchmark for artificial intelligence (AI) and machine learning safety? 

In a cooperative initiative, UL Research Institutes (ULRI) and Northwestern University are working to bring safety to the forefront of the rapidly expanding field of AI and machine learning through their leadership of the Center for Advancing Safety of Machine Intelligence (CASMI) research hub. Launched in February 2022, CASMI brings together and coordinates a wide-ranging research network focused on incorporating safety and equity more fully into machine learning.

“If used safely, AI can significantly help developers create solutions to some of society’s greatest challenges,” said ULRI Digital Safety Research Institute (DSRI) Executive Director Jill Crisman, who co-leads the CASMI collaboration for UL. “CASMI is conducting foundational research in AI and machine learning safety to help developers create safer AI solutions.”

Kris Hammond, computer science professor at Northwestern, also directs CASMI.

Kris Hammond, a Northwestern computer science professor and director of CASMI, pitched a distinctive solution: no single person or university could answer the AI safety question alone, but Northwestern’s interdisciplinary nature and wealth of research centers could provide a strong foundation for finding the right experts, questions, and areas of research. Using initial funding from UL Research Institutes, Hammond and Sarah Spurlock, former associate director of CASMI, started the Machine Learning Impact Initiative in 2019; it would become CASMI’s predecessor.

“Our goal is to make things safer,” said Hammond. “I do my core work in artificial intelligence. I believe in the technology tremendously. But I don't want it to hurt people. And so the goal is figuring out how we can actually move the technology forward and make sure it's safe, which is very much in line with what all three UL enterprise organizations are really all about.”

CASMI’s day-to-day activities focus on bringing experts together through workshops, open discussions and sponsored research to define the problems they want to solve. In a true collaboration between Northwestern and ULRI, CASMI leaders meet regularly with DSRI’s Crisman. CASMI aims to leverage Northwestern’s network and ULRI’s position as a global safety science leader to build a multidisciplinary network of expert researchers focused on the right problem areas.

“We're at the beginning of that phase where people don't quite understand what potential harms there are,” Hammond said, “and sometimes they don’t understand the nature of the difference between something that is harmful in the moment, harmful over time, harmful for an individual, harmful for society in general and even harmful for the environment.” 

For example, an AI that filters resumes may discriminate on the basis of gender and ethnicity because it was trained on historically biased data. A self-driving car that cannot recognize pedestrians can cause physical harm. An AI that predicts the likelihood of someone committing a crime while on parole may be racially biased and thus perpetuate harmful stereotypes. These human-designed machines can change the course of someone’s life and need to be evaluated for their potential harms before they are deployed.

Knowing exactly what to research and where to allocate resources is rare in the AI safety field, so CASMI is designed to assess the field’s most timely needs.

“The structure allows us to be flexible and to be able to issue that call for research proposals and see what's out there and take an assessment at that time of the things that are really important that need to be done,” Spurlock said. “It allows us to be more nimble than some long-term center funding would be, which is great in this area where the technology is changing really fast and the work that other groups are doing is constantly evolving. It was really helpful for us to not have to identify in advance everything that CASMI would do for three years, but to be able to identify that along the way.”

Maia Jacobs, assistant professor of computer science and preventive medicine, is conducting one of the eight new research projects to receive funding from CASMI’s latest call for research proposals. Her work focuses on co-designing machine learning interfaces with the patients who will use them, specifically in the realm of prenatal care. The algorithm can predict when a pregnant patient will be stressed and even suggest treatment options, but the key is helping the patient take control of their health with the algorithm’s support.

Maia Jacobs, assistant professor of computer science and preventive medicine, is leading a CASMI-funded project to improve prenatal health care.

“From a health side, it's a really important context because prenatal stress management can be really important for improving the health of both the parent as well as the child,” Jacobs explained. “But before we can make these machine learning models really useful, we need to look at how we make them easier to understand to the end user. The goal of this research is to work directly with pregnant people to co-design the interfaces of these machine learning tools so that they can be interpretable, and that means working with people who have potentially never had any prior exposure to these types of technologies.”

In its current form, AI privileges those who have technical expertise and are familiar with how the AI measures and determines success. Making it more accessible and minimizing its potential for harm will allow clinicians and patients to engage in more shared decision-making, which, said Jacobs, is “considered the gold standard of health care.”

This co-designed interface isn’t restricted to health care. “What this research would allow us to do is to look at using these similar methods to talk about machine learning and co-design machine learning with anyone regardless of their background or education,” Jacobs said. “The potential scalability for this is very high.”

When looking for sources of funding for her research, Jacobs applied to CASMI because she “felt that the mission of this project aligned very nicely with the mission of CASMI and the DSRI.” Safety can take many forms depending on the context, and Jacobs said her research can go on to inform the future of both institutions.

Following the success of the initial CASMI partnership, other connections have grown between the UL enterprise organizations and Northwestern. In March of this year, Ken Boyce, currently senior director for principal engineering at UL Solutions, joined the Executive Council of the Paula M. Trienens Institute for Sustainability and Energy at Northwestern.

“I’m excited about being able to pair our scientists and engineers working on these topics with some of the brilliant minds affiliated with the Institute,” he said. “It’s still early days for me, so I’m still understanding what that looks like, but I’m very excited about the opportunity to do that in a very real way.”

Boyce attended the Trienens Institute's Annual Research Symposium in December, where he was impressed by the scope and depth of the work being done. The Institute’s and the UL enterprise’s priorities seemed to intersect perfectly, Boyce said, and the work provided a strong foundation for bringing strategic initiatives into society at large. The research symposium signaled to Boyce that building a connection with the Trienens Institute would allow both organizations to make important and impactful strides in safety, security and sustainability.

Ken Boyce, senior director for principal engineering at UL Solutions, joined the Executive Council for the Trienens Institute in May 2023.

With decades of experience in energy storage, renewable energy technologies and energy infrastructure, Boyce works closely with the U.S. Department of Energy and U.S. national laboratories to conduct research on energy technologies. He also has experience in fire suppression and cybersecurity at UL Solutions. He has witnessed the great innovations coming out of these fields, but innovation alone isn’t enough to make these technologies contribute to a functional society.

“There’s obviously a desperate need for more circular economy practices and more sustainable approaches to the way the world consumes materials,” Boyce said. “If we get more sustainable and we’ve put people in peril in the process, we haven’t done our job the right way. So it’s one of those places where the intersection between safety and sustainability is really important in the Institute and UL Solutions effectively moving forward together in a responsible way.”

In the spirit of fostering a safe progression toward a more sustainable world, ULRI hosted its 2023 annual research symposium at Northwestern from July 31 to Aug. 2. ULRI’s premier researchers and eminent American and international experts shared their scientific insights and findings on the urgent issue of building resilience for a sustainable future, with an eye toward identifying potential collaborations and actionable discoveries.

In September 2022, ULRI announced the creation of its newest institute, the Materials Discovery Research Institute (MDRI). MDRI is creating a state-of-the-art materials laboratory in Skokie, Illinois, which will open in early 2024 and focus on the accelerated discovery of materials for renewable energy, environmental sustainability and individual and societal health. MDRI is working closely with Northwestern’s Corporate Engagement team to connect with relevant faculty leaders for mutually beneficial discussions that drive this research forward.

“In the face of global climate change and other challenges to sustainability, we’re committed to bringing a safety science perspective to the design, development and deployment of engineered technologies and practices,” said Deepa Shankar, director of partnerships at ULRI. “We want to ensure that innovation is created with safety outcomes in mind, thereby helping to ensure a resilient and sustainable future.” 

Given DSRI’s close partnership with CASMI, hosting the research symposium at Northwestern and the Kellogg Global Hub “was an easy decision and a natural fit,” Shankar said.

Deepa Shankar, director of partnerships for UL Research Institutes, has been involved in broadening ULRI’s relationship with Northwestern beyond the initial CASMI connection.

"We’d like to thank Northwestern for its ongoing partnership with ULRI, both on the symposium and through our many collaborations,” added Shankar. “ULRI and Northwestern are each global research leaders; however, by collaborating, our collective impact surpasses what we can achieve independently.”

Whether it’s in partnership with DSRI or UL Solutions, Northwestern institutes remain on the path of innovation and will continue to welcome transformative discoveries in safety and sustainability. The spirit of collaboration cuts across whichever fields the UL enterprise’s three organizations and Northwestern pursue together. In the words of Hammond: “We believe in their mission tremendously. We also have expertise in terms of who is out there, what they can do and what needs to be done. It's a genuine collaboration.”

 
