Symposium on the Future of Libraries
The workforce that creates the algorithms shaping our information infrastructure remains predominantly male and white. The hypothesis that this skew in gender and race affects algorithm design is supported by evidence from computer and data scientists studying the phenomenon, such as Suresh Venkatasubramanian at the University of Utah and Cathy O’Neil at Columbia University. Our study involved a cross-institutional survey of computer science students at three universities: the University of Southern California (USC), California State University, Los Angeles (Cal State LA), and Boise State University (Boise State). The three institutions have markedly different student populations. USC has a student body that is diverse in race and gender; women comprised 44% of the incoming class at USC’s Viterbi School of Engineering in 2017. Cal State LA is a Hispanic-Serving Institution where nearly 80% of students are first-generation. Boise State is more racially homogeneous but has seen increasing enrollment of non-traditional students. Our survey examined how much of an impact a diverse student body has on perceptions of algorithm bias. As computer science subject librarians and a computer science associate professor, we know that many of our students will enter careers in which they build information search and machine learning tools, and we are well positioned to reach a student population that will shape the information universe of the future.
There is concern among social scientists and computer scientists about the presence of bias in machine learning and big-data algorithms, and in recent years both journalistic and scholarly investigations have addressed it. Safiya Noble, an Assistant Professor of Communications at USC, details significant bias against women and people of color within the Google search structure in her book Algorithms of Oppression (2018). Our survey of prevailing attitudes and knowledge about algorithm bias focused on undergraduate and graduate students in computer science; its aim was to gather data on how students perceive bias within search and machine learning algorithms.
Librarians have historically been at the forefront of shaping information ethics, addressing questions of access, equity, and intellectual property rights. As the information universe becomes increasingly dominated by algorithms, computer scientists and engineers have ethical obligations to create systems that do no harm. In our discussion, we underscore that librarians, as information workers, have a role in partnering with information scientists to ensure that libraries can be spaces where communities, whether K-12 students, college students, or the general public, can optimize their search for information and expect fair treatment from automated systems. Librarians have traditionally been teachers of information literacy, the ability to locate, evaluate, and use information effectively. In our study, we propose expanding the concept of information literacy to include search literacy: an understanding of how search structures and other automated information environments are generated with human input. We present the preliminary results of our study in this session and encourage questions and discussion at the end.
Session Learning Outcomes:
Attendees will recognize examples of bias in machine learning and computer algorithms.
Attendees will give examples of computer science students’ perceptions of algorithm bias.
Attendees will summarize themes or ideas that may influence the future of libraries.
ALA Unit/Subunit: ALA, Center for the Future of Libraries
Meeting Type: Symposium on the Future of Libraries
Cost: Included with full conference registration.