Science and Technology Studies—also known as Science and Technology in Society—is a field of study that emerged during the 1960s and 1970s and began to consolidate into a distinct sub-discipline in the 1980s. The first STS program was founded at MIT in the 1970s and by 2011, 111 STS research centres had developed worldwide. STS is concerned with the history and philosophy of science and technology, the relationship between institutions and the public, policy making, and the cultural and literary representation of science and technology. The field draws scholars from a variety of disciplines including anthropology, history, political science, literary studies, and sociology.
The College of Humanities, Arts, and Social Sciences (CoHASS) recently launched a new Minor in Science, Technology and Society. This interdisciplinary field seeks to introduce undergraduate students to ways of thinking about science and technology from a range of social and humanistic perspectives. At a “technological” university with growing strengths in the humanities and social sciences, STS forms a critical part of scientific, engineering, and social science education alike. Scientists and engineers need to be equipped to think about technical problems from a range of points of view. Likewise, students of the humanities and social sciences should be able to gain a better grasp of the technosciences and how they impact our lives and society. The aim of this minor, then, is to build an intellectual bridge between the humanities, social sciences, and the natural sciences and engineering, and to provide all students with ways of thinking creatively and innovatively about science, technology, and their role in the world.
On May 7, 1959, the scientist-turned-novelist C. P. Snow delivered the annual Rede Lecture at the University of Cambridge, in which he lamented the division of intellectual life into “two cultures”, the arts and the sciences. His lecture generated widespread discussion, and in February 1962 the literary critic F. R. Leavis challenged Snow’s thesis in another Cambridge lecture. Prior to Snow’s lecture, the arts tended to be regarded as traditional and conservative while science was typically presented as dynamic and progressive. In the wake of the “two cultures” debate, science became an increasingly professionalised and institutionalised discipline.
Consequently, the arts and sciences are often perceived as mutually exclusive. However, as Associate Professor Park Hyung Wook (History) argues, neither field functions in isolation from the other, and STS scholars seek to identify the ways in which they connect, bringing the two fields together into a comprehensive whole. Associate Professor Hallam Stevens (History) states that this synergy is crucial for tackling the “big problems facing the world, [including] climate change, pandemic disease, artificial intelligence, and the growth of cities”.
Prof. Stevens adds that while these global problems do require technical solutions, they are “human problems at the same time”. Global problems cannot be resolved through technology alone. To craft the most effective, comprehensive, and ethical solutions, emerging technologies and their potential impact on human civilisation must be scrutinised. STS scholars argue that we must collaborate in cross-disciplinary teams during the process of research and development in order to anticipate an array of ethical, economic and social consequences and minimise risks to human health.
One example of a difficult scenario is the deployment of autonomous vehicles. While self-driving cars offer greater convenience and represent significant gains in efficiency, the on-board software sometimes struggles to distinguish between humans and objects, which has resulted in various crashes and even deaths. This issue raises the question: where does our responsibility lie? Our response necessitates a shift beyond the narrow engineering perspective, which is primarily concerned with problem solving, to consider the ethics, legal responsibilities, cultural impact, and social repercussions of new technology.
Another example is Prof. Park’s research into ageing. While ageing is a natural biological process, humans have long sought to disrupt and alter it, which changes what we consider to be “normal”. He argues that old age has become “irrelevant in the contemporary postmodern world”. This claim is supported by the fact that societies systematically craft guidelines and standards that older people are expected to abide by in order to slow the ageing process. What are the ethical implications of these guidelines once they have been established as the norm? Or should we simply accept the biological effects of ageing and resist the urge to mask the fact that we are advancing in years?
Prof. Stevens teaches Science, Technology, and Science Fiction, which examines how science fiction has influenced recent developments in science and technology, including fields such as virtual reality, biotechnology, and nanotechnology, as well as the Internet and communications technologies. The concept of virtual reality, for example, was first imagined in novels such as Neuromancer (1984) by William Gibson. Our hopes, fears, and understandings of these domains are written into the texts. Prof. Stevens notes: “one of the places where people actually think through the consequences of these technological innovations is in fiction, whether that be in video games, movies or novels. It’s an equally relevant way of thinking through consequences”. Literature plays a vital role in considering the social and cultural effects of emergent technologies. Analysis of these forms helps us to arrive at a deeper understanding of the ways in which our culture comes to terms with the dangers, threats, and complexities of contemporary technologies.
Associate Professor Daniel Jernigan (English) teaches a course on Science and Literature that begins by examining some of the ways in which contemporary advancements in science and technology continue to shape and reshape society, but then moves beyond a focus on science fiction to consider science’s influence on literature more generally. Prof. Jernigan is especially interested in the way in which developments in twentieth-century physics and mathematics have provided new metaphors for reconsidering humanity’s understanding of itself, even as contemporary authors attempt to accurately and compellingly represent the world we live in, together with all of its manifest quirks. The Russian novelist Vladimir Nabokov once suggested that “A writer should have the precision of a poet and the imagination of a scientist”. Nabokov’s largely tongue-in-cheek proposition comes alive in Prof. Jernigan’s course, as the oddities of special and general relativity, mathematical incompleteness, quantum uncertainty and chaotic systems become important literary metaphors for understanding the human condition in a postmodern world.
Dr. Melvin Chen is a philosopher of Artificial Intelligence (AI) whose prior research on the philosophical foundations of AI, causal epistemology, creative cognition, and the ethics of care has led him to his current role as the PI of an interdisciplinary medical AI research project, generously funded by an intramural ACE grant. His collaborator is Associate Professor Chew Lock Yue (Physics). While there are reasons in favour of implementing medical AI technology in the healthcare domain, Dr. Chen argues that two deficits in state-of-the-art medical AI systems must first be addressed: the causality deficit and the care deficit. In attempting to address the causality deficit, Dr. Chen and Prof. Chew aim to develop an algorithm that approximates causal reasoning in the medical domain. Their longer-term aim will be to see how far they can go in addressing the care deficit, especially in the context of care robots.
Dr. Chen teaches semester-length courses on the philosophy of AI, creative cognition and the philosophy of imagination, and logic and causal reasoning at undergraduate level. He also offers seminars on the ethics of care and the philosophical foundations of AI ethics at postgraduate level. His philosophical scepticism is neatly balanced by the can-do attitude of his collaborator Assoc. Prof. Chew, and they hope between them to achieve modest success in putting the ‘care’ back into ‘healthcare’ through the appropriate implementation of assistive medical AI technologies. In addition, Dr. Chen sees the history of the AI research tradition as one of noble failure, in which light is continually being shed (through the failed aspirations of AI researchers) on precisely what it is that makes us human: an ability to make sense of and care for our fellow human beings, an understanding of the facts of human experience and the nature of mortality, and an appropriate situation and embodiment in complex networks of social and affective relations.
Sociology contributes numerous modules to the minor, ranging from the body and genomic science through to medicine and technological development. These modules are united by a concern with the unprecedented array of challenges brought about by advancements in science and technology. They question how scientific knowledge is created, disseminated and adopted, and investigate how discoveries and inventions are made and accepted by broader society. Students on these modules explore the role of universities, research institutes, industry and business in developing new scientific modalities and making use of new technologies.
Body, Self and Society invites students to investigate the complex relationship between the body, the self, and society. Rather than treating the body as simply a biological entity, this course helps us to see how it is also socially, culturally, and politically constructed. Health, Medicine and Society, meanwhile, moves beyond biomedical approaches to sickness to investigate the social contexts of illness, including the impact of gender, ethnicity, sexuality, class, religion, and ageing. Together these courses make an invaluable contribution to the new minor by focusing attention on science and society.
Assistant Professor Michael Stanley-Baker (History) is an expert in Chinese Medicine and Religion who stresses the importance of the humanities in the field of medicine. Doctor-patient interaction, for example, is not simply a matter of biological knowledge but is, as Prof. Stanley-Baker states, “a collaboration between doctor and patient”. Although the symptoms experienced by a patient may align with the symptoms identified in theory, his or her experience is ultimately subjective and unique.
Prof. Stanley-Baker points out that “medicine is not only a science, but also a culture” and requires skills in cultural translation. Doctors often forget this in their heavy reliance on clinical data and statistics, remaining unaware that “there are all kinds of hidden social values inherent in their production and evaluation”. What is perceived to be objective evidence is, in truth, highly subjective and steeped in cultural values and biases that affect the way the evidence is created and recorded.
Prof. Stanley-Baker states: “Every area of medicine speaks a specific language, which is sometimes mistaken for the truth. It’s not truth, it’s culture. And we need to be conscious when doctors speak across one culture to someone else’s culture, especially in multicultural, multilingual Singapore. Doctors from different cultures all have different ways of understanding disease and have to be very good at translating their knowledge to patients”.
Humanistic understanding is an especially important resource for helping the public grasp how the biomedical field handles the uncertainty of medical practice. Such understanding is severely lacking in China, where doctors and nurses sometimes have to guard against the potential for violent attacks. Humanities research can raise health literacy and help individuals gain a greater understanding of the anomalies and uncertainties that are always present in the practice of medicine. “An understanding of the human condition is needed to be humane,” Prof. Stanley-Baker insists. Prof. Park agrees and expresses concern that doctors tend to regard their patients as machines that respond uniformly to a given ailment or condition.
“What does it mean for a computer to know?” Prof. Stevens wonders. Knowing how to drive a car after reading a manual is distinct from the act of driving, which highlights the importance of the body. Prof. Stevens notes that AI is disembodied, which limits its possibilities. To replicate the human, we might have to put an AI inside a human body, and this relationship between machine and body must be carefully analysed and discussed. It is our responsibility to enter the world with an awareness of how seemingly minor decisions about the uses of science and technology can carry ramifications for language, culture, politics and power.