The future of data science is uncertain, clouded by concerns about autonomy, responsibility, and other ethical issues related to technology. The recent Facebook-Cambridge Analytica data scandal raised concerns about personal privacy, ethical standards for social media companies, and the spread of misinformation.
In his single-credit PHIL 293 course “Ethics for Data Sciences”, assistant professor of philosophy Taylor Davis challenges students to re-think what they believe about morality in the wake of these controversies in science and technology.
In light of advancing technologies in artificial intelligence and machine learning, Davis said it is becoming clearer that ethical training is necessary in data science.
“Broadly what is happening, especially as we see Mark Zuckerberg testify in front of Congress, is ethics pervades all of our lives as we become more reliant on technology in many ways,” Davis said.
In partnership with the Integrative Data Science Initiative, faculty are planning to expand the course to three credits with an online version and to advance ethics education through other initiatives, said Matthew Kroll, a post-doctoral researcher in philosophy.
“One of the questions I start the class with is ‘What does it mean to be ethical in the age of big data?'” Kroll said. Students discuss current events like the Facebook-Cambridge Analytica data scandal and what individuals and organizations should have done.
Kroll has also previously taught the course by giving students a “tool kit” of philosophical methods of reasoning to address these issues as well as concerns of rights and autonomy with respect to artificial intelligence and autonomous cars.
“If you’ve never taken an ethics course, you at least get to dip your toes in the water of what ethics is,” Kroll said.
By teaching students to form philosophical arguments and place issues in historical perspective, Kroll hopes to help them make better ethical decisions.
“If these students feel like in their professional lives they reach a moment or a threshold that, if there’s an ethical issue in their workplace like ‘Are we gonna sell data to a particular government interest?’ to at least say ‘I don’t think this is right’ or ‘I think this is an unethical use of user data,'” Kroll said. “If one, two, or three students do that, then I’ll feel like the class is a success.”
In the applied ethics course, Davis said, philosophy runs in the background as he instructs. Rather than following the standard philosophical method of beginning with general theoretical principles, he presents current events as case studies.
Davis created the course by adapting an engineering ethics course, substituting data science for engineering.
With this case-study-based, bottom-up approach, students begin with real-world problems and then figure out how to reason about the actions involved. By isolating actions from agents and forming arguments, they learn how ethical principles such as not causing unnecessary harm to others and respecting personal rights come into play.
Through these methods, students find truth and clarity on tricky issues that they may apply to other problems such as autonomy of self-driving cars and gene editing technologies, Kroll said.
“Machine learning algorithms perform calculations in opaque methods,” Davis said. “They create a complicated model that researchers need to understand.” Davis said this understanding would let researchers determine how to address social inequality issues affecting women and minorities, or how predictions about individual habits are made. He wants students to consider how similar people are to robots that perform human-like actions, and how that raises issues about personal agency.
“The idea is to give people that kind of training to see and identify ethical issues,” Davis said.
Students will continue to grapple with what it means to be human in the “age of big data,” Kroll said. “The ethics instruction in data science serves to empower students in their future careers.”