In MIT course 6.036 (Introduction to Machine Learning), students study the principles behind powerful models that help clinicians diagnose disease or assist recruiters in screening job candidates.
Now, thanks to the Social and Ethical Responsibilities of Computing (SERC) framework, these students will also stop to consider the implications of these artificial intelligence tools, which sometimes come with their share of unintended consequences.
Last winter, a team of SERC Scholars worked with instructor Leslie Kaelbling, the Panasonic Professor of Computer Science and Engineering, and the 6.036 teaching assistants to infuse the weekly labs with material covering ethical computing, data and model bias, and fairness in machine learning. The process was initiated in the fall of 2019 by Jacob Andreas, the X Consortium Assistant Professor in the Department of Electrical Engineering and Computer Science. SERC Scholars collaborate in multidisciplinary teams to help postdocs and faculty develop new course materials.
Because 6.036 is such a large course, with more than 500 students enrolled in the spring 2021 term, those students grappled with these ethical dimensions alongside their efforts to learn new computing techniques. For some, it may have been their first experience thinking critically, in an academic setting, about the potential negative effects of machine learning.
The SERC Scholars reviewed each lab to develop concrete examples and ethics-related questions to fit that week's content, and each brought a different disciplinary toolkit. Serena Booth is a graduate student in the Interactive Robotics Group of the Computer Science and Artificial Intelligence Laboratory (CSAIL). Marion Boulicault was a graduate student in the Department of Linguistics and Philosophy and is now a postdoc at the MIT Schwarzman College of Computing, where SERC is based. And Rodrigo Ochigame was a graduate student in the Program in History, Anthropology, and Science, Technology, and Society (HASTS) and is now an assistant professor at Leiden University in the Netherlands. They collaborated closely with teaching assistant Dheekshita Kumar, MEng '21, who was instrumental in developing the course material.
The trio brainstormed and iterated on each lab, working closely with the teaching assistants to fit the material into the course and advance its core learning objectives. They also helped the teaching assistants determine the best way to present the material and lead conversations on topics with social implications, such as race, gender, and surveillance.
“In a class like 6.036, we are dealing with 500 people who are not there to learn about ethics. They think they are there to learn the nuts and bolts of machine learning, like loss functions, activation functions, and things like that. The challenge before us is to motivate those students to actually participate in these discussions in an active and engaged manner. We have done this by weaving the social questions very deeply into the technical content,” Booth says.
For example, in a lab on how to represent input features for a machine learning model, they introduced different definitions of fairness, asked students to consider the pros and cons of each definition, then challenged them to think about which features should be input to a model to make it fair.
Four of the labs are now published on MIT OpenCourseWare. A new team of SERC Scholars is revising the other eight based on feedback from instructors and students, with a focus on learning objectives, filling gaps, and highlighting key concepts.
A deliberate approach
The effort in 6.036 shows how SERC aims to work with faculty, says Julie Shah, SERC associate dean and professor of aeronautics and astronautics. They adapted the SERC process for this course because of its unusual size and tight time constraints.
SERC was founded more than two years ago through the MIT Schwarzman College of Computing as a deliberate approach to bringing faculty together to co-create and launch new course material focused on social and responsible computing in a collaborative setting.
Each semester, the SERC team invites approximately a dozen faculty members to join a working group dedicated to developing new curriculum materials (there are several SERC working groups, each with a different mission). They are purposeful in whom they invite, seeking faculty members who will form fruitful partnerships in small subgroups, says SERC associate dean David Kaiser, the Germeshausen Professor of the History of Science and professor of physics.
These subgroups of two or three faculty members hone their shared interest over the course of the semester to develop new ethics-related material. But rather than one discipline serving another, the process is a two-way street: each faculty member brings new material back into their own curriculum, Shah explains. Faculty from all five MIT schools have participated in the working groups.
“Part of it involves stepping outside your normal disciplinary boundaries to build a shared language, and then trusting and collaborating with someone new outside your normal circles. That is why I think our deliberate approach has been so successful. It’s great to pilot the material and bring something new back to their own courses, but building relationships is the core. That creates something valuable for everyone,” she says.
Over the past two years, Shah and Kaiser have been impressed by the energy and enthusiasm surrounding these efforts.
They have worked with approximately 80 faculty members since the program began, and more than 2,100 students have taken courses that included new SERC material in the past year alone. Those students are not all necessarily engineers: about 500 were exposed to SERC material through courses offered in the School of Humanities, Arts, and Social Sciences; the Sloan School of Management; and the School of Architecture and Planning.
Central to SERC is the principle that ethics and social responsibility in computing should be integrated into all areas of teaching at MIT, so it becomes as relevant as the technical parts of the curriculum, Shah says. Technology, and AI in particular, now touches nearly every industry, so students of all disciplines should have training that helps them understand these tools and think deeply about their power and pitfalls.
“It’s no one else’s job to figure out why or what happens when things go wrong. It’s all of our responsibility, and we can all be equipped to do it. Let’s get used to asking those tough questions. Let’s build the muscle of being able to pause and ask, even if we can’t identify a single answer at the end of a problem set,” Kaiser says.
For the three SERC Scholars, carefully formulating ethical questions was uniquely challenging, since there was no answer key to refer to. But thinking deeply about such thorny problems helped Booth, Boulicault, and Ochigame learn, grow, and see the world through the lens of other disciplines.
They are hopeful that the students and teaching assistants of 6.036 will take these important lessons to heart, and into their future careers.
“I was inspired and excited by the process, and I learned a lot, not just the technical material, but also what you can accomplish when you collaborate across disciplines. The sheer scale of the effort was exciting. If we have a cohort of 500 students who go out into the world with a better understanding of how to think about these kinds of problems, I think we can really make a difference,” Boulicault says.