RESPONSIBILITY IN THE CONTEXT OF SCIENCE, TECHNOLOGY AND SOCIETY RELATIONS. NEW FORMS FOR AN OLD ISSUE?

Abstract. Responsibility is currently an important issue in scientific research and technology development. This paper discusses the rise of responsibility from the perspective of science-society relationships. The first section briefly illustrates the significance of responsibility in two emerging technological fields and in EU science policy. The second section examines the notion of responsibility. The third part explores the changing features of science-society relationships and the consequent transformations of responsibility in science and technology. Drawing on this analysis, the paper argues that responsibility is better conceived as a changing element in the science and technology landscape rather than a new one.

Keywords: responsibility, science and society, uncertainty, ELSI, participation, technology assessment

Introduction

The last twenty years have witnessed the movement of responsibility from philosophers' speculation into scientific research and technological development. This process has led to the progressive institutionalization of concepts like 'responsible research and innovation' and 'responsible governance of science and technology' in practice and policy. This paper is an attempt to frame this development in a broader perspective on science-society relationships. From such a point of view, responsibility is better conceived as a changing element in the science and technology landscape rather than a new one. This inquiry into responsibility proceeds in three steps.
The first section of the paper will briefly illustrate the significance of responsibility in two emerging technological fields (nanotechnology and synthetic biology) and in the broader context of EU science policy, where Responsible Research and Innovation (RRI) is now an important, constituent part of the new Horizon 2020 Framework Programme for Research and Innovation. The second section will discuss the notion of responsibility, so as to identify some central characteristics for further discussion. More precisely, responsibility is here defined as an 'assignment process' of tasks by and amongst different social actors. Starting from this definition, we will connect the issue of responsibility to issues of social (socio-technical) order and, more specifically, to the division of techno-scientific labour in society. The third part will link the latter aspect to responsibility and discuss its changing features in the broader context of science-society relationships, with specific regard to the way these features were defined under the so-called 'traditional social contract of science' and the more recent configurations of the relationships between science, technology and society.[1]

* Simone Arnaldi, PhD, Research Fellow, University of Padova, Department of Political Science, Law and International Studies, and Adjunct Professor of Sociology, University of Trieste.

Examples of responsibility in technology policy: nanotechnology, synthetic biology and beyond

Nanotechnology and synthetic biology are prominent examples of the progressive institutionalization of responsibility in science and technology development and policy. Furthermore, nanotechnology has probably been the most important antecedent for the development of Responsible Research and Innovation as a horizontal notion across the different scientific and technological fields in EU science policy (Grunwald, 2014).
In very broad terms,

nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers. Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering. (NNI, n.d.)

Nanotechnology is acknowledged as a key enabling technology for a wide range of technological sectors, including electronics, optical sciences and engineering, manufacturing, chemistry, materials and biotechnology, with prospected applications in medicine, environmental remediation, energy and communication, inter alia. Nanotechnology has rapidly become established as a top priority in research practice and in science and technology policy. Several indicators confirm the increasing dedication and commitment of the scientific community, the relevance for policy makers and companies, and the attention of the media and the public.

Turning to the notion of responsibility, the debate on the responsible development of nanotechnology is now at least a decade long (Grunwald, 2014), and the recent history of this field is characterized by several documents that explicitly mention 'responsibility' as a tenet of policy and R&D practices. A prominent policy example is the Code of Conduct for Responsible Nanosciences and Nanotechnologies, adopted by the European Commission in 2008 (European Commission, 2008). The Code aims to enable safe and beneficial innovation through nanotechnologies and to foster the organization of collective responsibility for the field (von Schomberg, 2007).

[1] This paper is an integration and further development of other works by the author (Arnaldi, 2014; Arnaldi et al., 2014; Arnaldi and Bianchi, 2014).
The notion of 'responsible development' functions as an overarching ethical framework for innovation, a general foundation for different principles which should inspire action (such as sustainability, inclusiveness, excellence, innovation and accountability). Alongside the Code, which was initiated by public authorities, private organizations too have launched initiatives seeking to outline and foster a 'responsible way' to develop nanotechnology. A prominent example is the Responsible Nanocode, which "aims to provide clear guidance about the expected behaviour of companies in relation to their nanotechnology activities" (NIA, n.d.) through the implementation of a set of 'principles' ranging from "board accountability" (Principle 1) and "worker health and safety" (Principle 3) to "wider social, environmental, health and ethical implications and impacts" (Principle 5).

Although it is still less visible in policy and, especially, in public debate, synthetic biology is experiencing a rapid growth that closely resembles that of nanotechnology.

Synthetic biology represents the latest phase in the development of biotechnology. […] This allows them to 'design' and 'create' micro-organisms that may perform a variety of useful tasks. At the same time these organisms are becoming increasingly more estranged from those we may find in nature. Synthetic biologists look at biology from an engineering perspective [and synbio approaches] allow the combination of multiple genes, newly constructed 'biological parts' or the use of non-natural molecules to construct new biological pathways, functions and (in the future) entire organisms, that have no blueprint in nature. (Rerimassie and Koenig, n.d.)[2]

[2] The two fields are linked. More precisely, the so-called "bottom-up" approach to nanotechnology has used "biomolecular self-assembly for the construction of artificial structures and devices, which is based on the molecular recognition properties of biological macromolecules such as DNA or proteins. […] In that sense, DNA nanotechnology can be regarded as one aspect of in vitro synthetic biology" (Jungmann et al., 2008: 1). At the same time, "[a]dvances in nanoscale fabrication, assembly, and characterization are providing the tools and materials for characterizing and emulating the smallest scale features of biology. Further, they are revealing unique physical properties that emerge at the nanoscale" (Doktycz and Simpson, 2007: 1). On this relation, see also Ball (2005).

The US Presidential Commission for the Study of Bioethical Issues (PCSBI), an advisory body to the US President on ethical aspects of medicine, science, technology, and engineering, issued a report and recommendations for synthetic biology governance and regulation (PCSBI, 2010). The Commission includes "responsible stewardship" among the principles that should govern synthetic biology. For instance, responsible stewardship is discussed in conjunction with the importance of the responsibility of scientists and with the delimitation of who the legitimate actors in scientific research are. In particular,

the risks must be identified and anticipated with systems and policies to assess and respond to them while supporting work toward potential benefits. […] Responsible conduct of synthetic biology research, like all areas of biological research, rests heavily on the behaviour of individual scientists. Creating a culture of responsibility in the synthetic biology community could do more to promote responsible stewardship in synthetic biology than any other single strategy. There are actors in the world of synthetic biology who practice outside of conventional biological or medical research settings.
This poses a new challenge regarding the need to educate and inform synthetic biologists in all communities about their responsibilities and obligations, particularly with regard to biosafety and biosecurity. (PCSBI, 2010: 133-134)

The growing emphasis on the responsible development and governance of science and technology, which is exemplified by the two domains of nanotechnology and synthetic biology, has over the years accompanied and sustained the parallel establishment of responsibility as a general feature of technology policy and development. In Europe, this gradual process has resulted in the adoption of the notion of Responsible Research and Innovation (RRI) as a cross-cutting issue under the EU Framework Programme for Research and Innovation 'Horizon 2020' (von Schomberg, 2013), representing a core value in the new research agenda of the European Union. Similarly, the 'sister concept' of 'responsible innovation' (Owen et al., 2012; 2013) has made its way into the academic debate. Responsible innovation is considered an answer to the policy and regulatory dilemmas posed by techno-scientific fields whose impacts are poorly characterized or highly uncertain. While risk-based governance and the regulatory science that supports it are challenged by the complex and uncertain nature of these phenomena, responsible innovation argues "that stewardship of science and innovation must not only include broad reflection and deliberation on their products, […] but also (and critically) the very purposes of science or innovation" (Owen et al., 2013).

Responsibility, assignments and socio-technical order

Section one has briefly presented some salient aspects of the policy and expert debate on responsibility in science and technology, as developed in two important emerging technological fields (nanotechnology and synthetic biology) and in the more general perspective of Responsible Research and Innovation (RRI).
This second section reflects on the notion of responsibility itself, starting with some indications about the way sociology has dealt with this concept. In reviewing the use and scope of the concept in sociology, Strydom (1999) assigns crucial importance to the notion of social role for the sociological understanding of responsibility. The author notes that two major perspectives can be distinguished in sociological thought. On the one hand, the traditional definition of the notion refers to individual responsibility within informal and pre-institutional contexts (e.g. friendship, family, kinship) or institutional ones (e.g. occupational roles). In both cases, responsibilities are defined and assigned "within the normative confines of a given institutional framework" (Strydom, 1999: 68). Classic notions like Durkheim's functional division of social labour or Talcott Parsons's view of responsibility as a complex of duties associated with social roles are in line with this traditional view of responsibility. The "profession" is a cornerstone of this link between roles and responsibility, and between individuals and society. In Parsons's terms, a profession can be conceived as "a category of occupational role which is organized about the mastery of and fiduciary responsibility for any important segment of a society's cultural tradition, including responsibility for its perpetuation and for its further development" (Parsons, 1959: 547).
On the other hand, Strydom identifies a post-traditional, but still individual, notion of responsibility, which refers to individuals who

possess special knowledge, abilities, judgement, power or influence in particular domains of social life, [and who, rather than] observing traditional or conventional limits, take the initiative to shift the boundaries by assuming individual responsibility for the (re)design and (re)organization of institutions and social systems themselves with a view to the constant monitoring and the reduction or avoidance of negative features and effects. (Strydom, 1999: 68-9)

For Strydom, public intellectuals or prominent individuals challenging established conventions exemplify this post-traditional category of responsibility. Finally, we can assign a distinct place to Max Weber's 'ethics of responsibility', given that Weber adopts a non-functionalist point of view, focusing instead on the agent's calculative capacity to appraise means and ends, actions and their consequences, and to (proactively) adapt actions accordingly.

Strydom criticizes these sociological approaches to responsibility on the ground that the centrality of the individual agent they presume challenges their capacity to deal adequately with the collective nature of science-based technology, which instead requires an equally collective concept of responsibility.[3] As an alternative, Strydom proposes the concept of frame to translate this 'new conception' of responsibility into sociological terms. Frames constitute discursive structures coordinating responsibility in society. For Strydom, today,

the responsibility frame occupies the central place vacated by rights and justice. It amounts to the assumption that everyone is required to assume the same responsibility and, hence, that the responsibility frame is a comprehensively determining or constraining structure.
(Strydom, 1999: 76-7)

In other words, Strydom uses responsibility as a "master frame" of contemporary society. While this choice has the merit of emphasising the importance responsibility has assumed in contemporary, technology-infused society, this perspective does not illuminate the ways responsibility (in science and technology) is concretely organized in specific social settings. To take our discussion in this direction, we refer to Armin Grunwald's general four-point reconstruction of issues of responsibility in the scientific and technical domain (Grunwald, 2014). According to this author, responsibility implies the following elements:

• someone (an actor, e.g. a scientist) assumes responsibility or is made responsible (responsibility is assigned to her/him) for
• something, such as the results of actions or decisions, e.g. avoiding adverse health effects of nanomaterials or synthetic organisms, relative to
• rules and criteria, which distinguish responsible from less responsible or irresponsible action, and relative to the
• knowledge available about the impacts and consequences of the action or decision under consideration, including meta-knowledge about the epistemological status of that knowledge and the uncertainties involved.

Such an understanding of responsibility as an 'assignment process' squarely places it in the context of social relations, broadly understood.

[3] While Strydom treats individual and collective responsibility as opposites, the non-exclusivity of responsibility implies that the fact that "one person is responsible (for something) does not mean that other people are not equally responsible" (Ladd, 1982: 9). A discussion of the individual and collective notions of responsibility is beyond the scope of this article. For the current purposes, it suffices to stress that both views of responsibility can be described according to the general interpretive framework of responsibility illustrated below.
Responsibility is indeed a responsibility to someone, either to someone who assigns responsibilities or to someone who is concerned by the agent's behaviour and decisions. From this point of view, we can distinguish three distinct moments of responsibility in social relationships: (1) assumption, i.e. the agent takes responsibility for her own (or others') behaviour, anticipating the consequences and making the behaviour consistent with this anticipation; (2) ascription, i.e. the other agent, who is the reference of the first agent's behaviour and is concerned by the consequences of her actions, can hold the first agent responsible for the consequences of her actions; (3) subjection, as the second agent can ask the first agent to answer for her actions, because they produce either harm or benefit; symmetrically, the first agent, when assuming responsibility, assumes a duty to answer to the second agent who is concerned by her actions.[4]

Assignments of responsibility affect concrete actors in concrete constellations and are the result of situated social and organizational configurations, which variously connect the three moments of responsibility with the four elements in Grunwald's reconstruction. For instance, 'rules and criteria' can define what is relevant as an object of assessment in terms of responsibility ('something'), and what knowledge is relevant for individuals, groups and organizations in such an assessment ('knowledge available'). In turn, the 'knowledge available' can either narrow or broaden what constitutes a consequence (e.g. side effects, long-term impacts, etc.), and help define new 'rules and criteria' for orienting responsibility. A general interpretative scheme of the different, pertinent levels of responsibility can be outlined along three dimensions (see Figure 1).
The first dimension concerns the micro scale of interpersonal relations, of individuals who assume and assess responsibility according to their representations of the situation, the meaning they assign to actions and consequences, and the preferences orienting their action and evaluation. The second dimension refers to the broader level of organizational configurations and policy mechanisms, which grant institutional force to specific rules and criteria, thus setting boundaries, constraints, and directions for responsible action. Finally, the third dimension regards what, in a loose sense, we may call structures, i.e. the material and discursive settings defining science, technology and society relations, which shape the general frame for discussions about responsibility.

[4] These "three moments" of responsibility are a distillation of the ongoing debate on these issues, which is too broad to be discussed in this paper. The main references for our formulation of the three moments of responsibility are Hart (1968), Schlenker et al. (1994), Pellizzoni (2004) and Vincent (2011).

Figure 1: DIMENSIONS AND LEVELS OF RESPONSIBILITY
Source: Arnaldi et al. (2014)

The next section focuses on science, technology and society relations and their transformation as a consequence of the shift from the so-called "traditional social contract of science" to the more recent reconfigurations that followed its demise. We will then discuss the implications this shift has had for responsibility, referring to the three moments of responsibility (assumption, ascription and subjection) distinguished above.

The traditional social contract of science and the meaning of responsibility

The relationship between responsibility and technological innovation can be linked to the more general relationship between science, technology and society in internally differentiated social systems. Technoscientific labour is no exception to the more general rule of the division of social labour.
From this point of view, the 'traditional social contract of science' that dominated our understanding of science-society relationships in the post-war period considers scientists, engineers and all the other societal actors involved in the research and technological development process first and foremost as professionals (Weber, 2004; see also, for example, Merton, 2000a; 2000b). As a result, responsibility in the processes of scientific research and technological development is limited to the social role or roles these actors have within the social division of labour. These roles are primarily professional ones. The professional role of the scientist, and of the other actors involved in research and development, is defined according to the more general 'place' that science is acknowledged to have in society, which is in turn outlined in terms of society's internal differentiation. Merton's (2000c) examination of the specific institutions of science as a social sub-system is an exemplary exploration of such differentiation, but it is to Vannevar Bush (1945) that most observers refer to outline the terms of what we have called the traditional 'social contract of science'.[5] According to such a view, scientific research can make society prosper by providing knowledge (Pellizzoni, 2010) and fostering technological innovation (Jasanoff, 2003), so that societal needs can be met ever more effectively. To make this feasible, politics and the public administration have to provide as much support as possible to free and independent research. This view draws a direct, linear and non-problematic relation between the benefits of basic and applied scientific research and their transfer to society through technological innovation. As a consequence of this linear relation, technoscientific work is considered self-justifying and self-referring.
Drawing on the work of van Lente (1993; 2000), we can say that this vision, which is based on a more general normative idea of scientific and technological progress, places scientists and technologists in a "protected space". Such a space is exclusive and, to some extent, independent from the rest of society, on the basis of the social mandate (Parsons's "fiduciary responsibility") to pursue scientific and technical progress. According to this view, responsible scientists are first and foremost good scientists, whose responsibility is to advance science and technology. Politicians are good politicians as long as they support scientists and technologists in fulfilling their mandate. In this traditional view, the assumption of responsibility is therefore primarily restricted to scientific and technological progress. As to the impacts of scientific and technological advances, these can always be anticipated, assessed and controlled by resorting to expert knowledge. "To reassure the public, and to keep the wheels of science and industry turning, governments have developed a series of predictive methods […] that are designed, on the whole, to facilitate management and control, even in areas of high uncertainty" (Jasanoff, 2003: 238).

[5] In his genealogy of the "social contract of science", David Guston (2000) notes that "Bush makes no mention in his report of such an idea and neither does John Steelman in his Science and Public Policy five years later. Yet commonalities between the two, despite their partisan differences, point toward a tacit understanding of four essential elements of postwar science policy: the unique partnership between the federal government and universities for the support of basic research; the integrity of scientists as the recipients of federal largesse; the easy translation of research results into economic and other benefits, and the institutional and conceptual separation between politics and science".
To put it simply: scientific knowledge enables us to predict consequences; technology provides us with the means to control them. The concept of risk has been of capital importance in asserting the idea of a rational, quantitative, rigorous, and neutral treatment of uncertainty. So-called risk governance[6] meets these criteria and is based on a clear separation between risk assessment and risk management. In the risk governance model, the former is considered a process of scientific research aimed at the identification and the objective, factual measurement of risk. The latter concerns the normative considerations about risky alternatives and the measures to be taken for their management. Risk assessment concerns facts, calculation and quantification; risk management is the place for values, acceptability, and decisions. The former is the terrain of experts; the latter is a prerogative of decision makers. In the conventional view, it is up to the experts to produce factual evidence and recommendations to address the risks identified in the assessment phase, and up to decision-makers to define the threshold of acceptable risk and to take adequate measures accordingly.

This strict separation reflects what Callon et al. (2001) have defined as the model of "double delegation", i.e. the distinction between the knowledge of nature, which is delegated to scientists, and the knowledge of socio-political collectives, which is a prerogative of politicians. This distinction therefore defines different types of responsibilities that can be assigned to the actors involved in the processes of technological innovation and, more generally, in technoscientific labour. Firstly, scientists can be held responsible for a flawed risk analysis.
This means, once again, that a scientist can be called to answer for not being a good scientist, but cannot be accused of having caused adverse effects if these are not a consequence of such shortcomings, whether intentional or unintentional. Secondly, the responsibility for having failed to follow the recommendations elaborated in the risk analysis phase can be ascribed to decision-makers, and they can be held responsible for that. Again, different roles define differentiated responsibilities also with regard to the possibility of ascribing responsibility.

[6] For a characterization of this approach and, in more general terms, of the relationship between risk, scientific knowledge and policy, see Wynne and Felt (2007) and Pellizzoni (2010).

Uncertainty and the crisis of the traditional model of science-policy relations

Recent decades have witnessed a crisis of the traditional model of the 'social contract of science' as a consequence of the changed status of uncertainty in the debate on science and technology. A similar fate subsequently befell the specific patterns of the division of social labour related to this model, and the institutional force they exerted, both in terms of behavioural compliance and of shared representations of the world (Pellizzoni and Ylönen, 2008; Pellizzoni, 2005). Accordingly, the framing of responsibility that accompanied this peculiar representation of science, technology and society relations was challenged. This section of the article describes the altered relationship between scientific knowledge, technological change and uncertainty on the one hand, and its connection with the crisis and the subsequent revision of the traditional model of science-society relations on the other.
As mentioned above, the traditional form of the 'social contract of science' defines a direct and straightforward relationship between science, technology and policy: more (expert) knowledge means less uncertainty in anticipating policy impacts; more technology provides more control over them. In recent years, however, the relation between technoscientific knowledge and policy has been acknowledged to be rather the 'reverse'. In general, and in a nutshell, we can consider this situation as the result of two distinct but converging processes: the fall of the boundary between the laboratory and the environment on the one hand, and of the boundary between science and policy on the other. The 'disappearance' of the boundary between the laboratory and the environment is a result of the recognition of the ecological nature of technology. As Luhmann points out, technology can indeed be considered as "the extensive causal closure of an operational area" (Luhmann, 1993: 87), which can function because it is possible to eliminate most of the external interference in the operation of these "rigid couplings" in a physically restricted area and for a limited period of time. However, as the German sociologist notes, this closure is necessarily imperfect and, as mentioned, is limited in time and space. In Luhmann's own words:

[t]he difficulty of bringing about these conditions for even a brief period and for only small volumes, i.e. experimentally, indicates that any transformation into consumer technologies engenders a multitude of additional problems - precisely as a consequence of the attempt to establish, and in the long term to reproduce, a difference between controlled and noncontrolled causality. […] [E]cological problems are actuated precisely by technology functioning and attaining its ends.
Although unwanted side-effects, when known, can also be understood as problems to be solved by technology, this means only that these secondary technologies can then for their part again set off ecological problems. (Luhmann, 1993: 95-6)[7]

This impossibility of closing off our manipulative knowledge of nature in the laboratory produces uncertainty rather than reducing it, once such knowledge is out in society and in the environment. As a result of this radical uncertainty, the distinction between scientific knowledge and policy that is implied by the model of double delegation disappears, and so does its operational translation into the distinction between risk assessment and risk management. In a 'stronger fashion' (Pellizzoni, 2010), nature and society are co-produced and, therefore, "the ways in which we know and represent the world (both nature and society) are inseparable from the way we choose to live in it" (Jasanoff, 2004: 2). Similarly, this perspective draws upon the consolidated understanding that reception of

scientific knowledge, technology developments, and their consequences is never, and never can be, a purely intellectual process, about reception of knowledge per se. People experience these in the form of material social relationships, interactions and interests, and thus they logically define and judge the risk, the risk information, or the scientific knowledge as part and parcel of that 'social package'. (Wynne, 1992: 281-2)

Knowledge and technology, therefore, implicitly incorporate models and worldviews (Wynne, 1995), including, first of all, the separation between society and nature, values and facts. In a 'weaker fashion', risk is not considered objectively, as a natural object, but as the result of the interaction between social processes and the natural world. It is the forms of social organization that decide which events are considered risks, while others are grouped under the category of hazards, i.e.
the possible damage is attributed to external and environmental factors that are independent from decisions, or it is neglected completely (Kermisch, 2012; Douglas and Wildavsky, 1982).

[7] We note briefly that other authoritative accounts converge with Luhmann's conclusions, although they adopt very different perspectives. To name a few: Giddens (1999), Beck (1999) and, in the science and technology domain, van de Poel (2009).

New configurations of the relationship between science, technology and society beyond the traditional model of the 'social contract of science'

Faced with the new scenario presented in the previous section, the relationship between science, technology and society has been reorganized beyond the traditional 'social contract of science' along two directions: (1) the broadening of the forms of knowledge that are considered relevant to the discussion on science and technology; (2) the definition of a wider range of criteria for the assessment of the social impacts of technological and scientific progress. Regarding the first aspect, the literature outlines distinct but convergent perspectives on the legitimacy of diverse forms of knowledge in the process of scientific research. As part of their reflection on post-normal science, Funtowicz and Ravetz observe how the interaction between the epistemic and axiological dimensions legitimizes a plurality of perspectives (epistemic, axiological and related to the interests of social actors) to manage systemic complexity and to ensure the quality of post-normal science (Funtowicz and Ravetz, 1993). The value of stakeholder participation in science is therefore seen, in this case, in epistemic rather than political terms (Pellizzoni, 2003). It is a necessary condition for the construction of robust knowledge in the context of post-normal science.
This knowledge is not just about nature, but about society as well, concerning the whole "worlds of relevance" of social actors (Limoges, 1993), including the social relations that are perceived as relevant for the definition of a techno-scientific issue. Such knowledge may be based, for example, on analogies with the previous experience of the subject, and it is not limited to specific technological processes and products, but instead concerns social actors themselves and their performance. Therefore, as indicated by Wynne and Felt (2007), social actors focus their attention and their concerns not only on research and technology and on the assessment of their consequences in the terms defined by the conceptual pair risk assessment/risk management. Rather, their concerns and foci of attention relate to the social and institutional framework of the governance of techno-science, and to the meanings, choices, and priorities that define and select possible alternatives for the development of scientific research and technological innovation. The expansion of the types of knowledge that are considered relevant to the debate on science and technology is intertwined with the broader definition of the criteria for the assessment of the impacts of technological and scientific progress. Such expansion adds ethical and social issues to considerations about economic impacts. The ELSI acronym (ethical, legal and social issues) was first used by the National Institutes of Health in the context of the Human Genome Initiative, a global initiative aimed at sequencing the human genome, and has since come to indicate any type of research about the ethical, legal and social implications that accompany scientific and technological change. Thompson (2010) noted that it would be difficult to identify a unifying feature of these diverse studies other than normativity.
If we consider, for example, nanotechnology, we can rightly include amongst ELSI studies works on disparate themes: the possible invasion of privacy by sophisticated diagnostic tools in the field of nanomedicine (Etp, 2006a; 2006b), the ethical and political aspects of military research (Altmann, 2006), the environmental and human health impacts of nanoparticles (Helland et al., 2010), and the impact of nanotechnology on the persistence and possible aggravation of inequalities in the field of international health (Arnaldi et al., 2009) or development (Meridian Institute, 2005). At the policy level, the attention given to the diversity of knowledge forms and to this broader view of the impact of scientific and technological development has resulted, essentially, in three responses: (1) the promotion of mechanisms and policies that can broadly be described as 'participatory'; (2) the creation of technology assessment organizations and approaches; (3) the combination of both. The 'participatory' label is used to describe a wide variety of experiences, and the relevant literature has attempted to give an account of the multiplicity of participatory mechanisms on the basis of several criteria. For example, Rowe and Frewer (2005) proposed a classification of participatory mechanisms based on their capacity to maximize "the relevant information (knowledge and/or opinions) from the maximum number of relevant sources and transferring this efficiently to the appropriate receivers" (Rowe and Frewer, 2005: 263). Starting from this premise, Rowe and Frewer distinguish three types of mechanisms of public involvement: public communication, public consultation, and public participation.
The first two types are characterized by one-way information flows, respectively from the sponsor of the involvement activities to the participants and from the participants to the sponsor; the third type is characterized by two-way flows, establishing a dialogue between the sponsors and the participants. However, although organizations, generally government bodies, can be an important actor in public participation, by actively promoting, for example, workshops, consensus conferences and deliberative polling, these experiences do not cover the complete set of participatory experiences and are representative only of what Bucchi calls "top-down" participatory initiatives. Participatory activities can also be autonomously initiated and conducted by civil society organizations or organized groups of citizens (bottom-up participation). Bucchi (2008) and Bucchi and Neresini (2008) note that these initiatives, ranging from local protests to community-based research, may not be formally included in the decision-making process, but they may significantly contribute to the production of scientific knowledge, thus configuring processes of informal technology assessment (Rip, 1986).

This broadening of the relevant issues and the 'participatory turn' in science-society relations assign to the social sciences and to ethics a central role in exploring the implications of technological change and in acting as "spokespersons" of the public's beliefs and values. For ethics, this latter role creates a tension between ethics as a discipline and so-called "lay ethics", i.e. the inclusion of the public's values and moral evaluations in deliberations on science and technology. For the purposes of our discussion, the former understanding of ethics can be framed under the expansion of the relevant forms of knowledge, while the latter concerns the scope of public participation in science and technology.
Besides participatory mechanisms, technology assessment is the second policy mechanism implemented as an answer to the crisis of science-society relationships as configured by the traditional model of the social contract of science. The idea of technology assessment (TA) generically refers to

a scientific, interactive and communicative process with the aim to contribute to the public and political opinion forming on science and technology related societal aspects like exploitation of potential, dealing with secondary effects, and technological risks, overcoming problems of legitimacy and technology conflicts. (Fleischer et al., 2005: 1113)

TA was originally centred on a systematic impact analysis of technology development, diffusion, or change, with particular attention to unintended, indirect or long-term consequences (Coates, quoted in van den Ende et al., 1998: 6). TA has been institutionalized through the creation of agencies typically linked to parliamentary bodies, whose aim was to inform policy, like the OTA in the US (Herdman and Jensen, 1997). Without going into details, some clearly identifiable trends marked its development from the late 1980s and early 1990s onwards (Tran and Daim, 2008): (1) the gradual loss of centrality of the internal dynamics of the techno-scientific domain, and of the creation and modification of devices and technical systems, as the exclusive object and reference of the analysis; (2) the progressive limitation of the previously dominant, if not exclusive, role of scientific and technical expertise in the assessment process; and, finally, (3) the acknowledgement that the ambition to systematically assess the development of a technology and its impacts is inescapably challenged by uncertainty. As a result, interest in the social and institutional context of innovation has dramatically increased.
A number of other social actors then made their appearance in TA processes in addition to the scientists. These actors have a role in orienting technology policies or, more generally, innovation processes, but they are also simple users, citizens or representatives of concerned or affected social groups. In this case, technology assessment becomes a close relative of deliberative democracy, understood as a "process based on the public discussion among free and equal individuals" (Pellizzoni, 2005: 14) preceding the decision.

The changed forms of responsibility beyond the traditional social contract of science

The previous sections have illustrated the changing configurations of science, technology and society relations before and after the crisis of what we have called the traditional model of the 'social contract of science'. In line with the goal we set for this paper, we will now discuss what implications this change had for the way the responsibility of the actors involved in the development of science and technology, particularly decision-makers and scientists, is understood. In the traditional social contract of science, the meaning of responsibility was linked primarily to a broader understanding of the relationship between science and society marked by the separation imposed by the "double delegation" model (Callon et al., 2001). Albeit with different roles, this model identifies scientific experts and decision-makers as the relevant actors and relegates the general public to a purely passive role. Recalling the 'three moments of responsibility' that characterize our understanding of the notion, we can make the following broad statements about responsibility in the traditional social contract of science. Firstly, the 'assumption of responsibility' is considered mainly, if not exclusively, in terms of the professional role of the scientist. In other words, the scientist must be, first and foremost, a good scientist.
In addition, she/he may be called upon to answer for mistakes and pitfalls in risk assessment. Again, she/he can be called to answer for not being a good scientist, but cannot be accused of any adverse effect that is not a consequence of these shortcomings, whether intentional or not. The ethos of 'speaking truth to power' follows this logic (Jasanoff, 2003). By contrast, in the configurations of the relationship between science, technology and society emerging from the criticism of the traditional social contract of science, the ways in which responsibility is understood change significantly, as do their basic premises (see Table 1 for a summary). Indeed, the main point that we have tried to highlight in this part of the work is the overcoming of the division between science and society that is postulated by the double delegation. Science and society, facts and values, are viewed as a connected and inextricable whole. We have also illustrated the consequences that arise from this change: (1) the expansion of the forms of knowledge that are considered relevant to decisions on science and technology; (2) the broadening of the criteria for assessing the social implications of science and technology; (3) the promotion, as a consequence of the first two points, of participatory mechanisms for the elaboration and assessment of technology-related choices. In this new framework, the first aspect to note is the widened range of relevant actors in the decision-making process. The public and, especially, civil society organizations cease to be merely passive recipients of the impacts of technology (policies), as in the framework of double delegation. Second, the assumption of responsibility by the scientist is broadened to include also the ethical, legal and social implications of technology and innovation: it no longer concerns only aspects that (can be) relegated to the phase of risk management, or to the post hoc impacts of a neutral technology.
Together with the actors related to them, these aspects instead form an indivisible whole with research and technological development. Eventually, scientists may be ascribed responsibility for the unexpected and indirect effects of their work, in a context in which radical uncertainty characterizes the consequences of technology. For the same consequences, scientists can also be held responsible, if not legally (in the sense of liability), then at least morally (in the sense of blameworthiness).

Table 1: THE CHANGING FORMS OF RESPONSIBILITY

                                 Traditional configuration        New configuration
Science-society relations        Separation                       Connection
Assumption of responsibility     Role                             Extended impacts
Ascription of responsibility     Role                             ELSI
Subjection to responsibility     Science and technology           Society
Actors                           Scientists and decision makers   Scientists, decision makers, publics
Uncertainty                      Controllable (risk)              Radical

Source: Arnaldi and Bianchi (2014)

Closing remarks: new forms of an old issue?

The paper has discussed the issue of responsibility in science and technology. After presenting the initial emphasis on the importance of this notion in current research and technology development practices and policies, we examined the features of responsibility and defined the notion as an "assignment process" occurring in society across interactions, institutions and policies, from the micro- to the macro-level. We then focused on the changing configurations of science-society relations, their determinants and their consequences for responsibility. We illustrated the features of the traditional model of the 'social contract of science', such as the linear relation between science, innovation and societal benefits, the direct relationship between scientific knowledge and policy, and the technocratic perspective on managing science, technology and their boundaries with society.
We then explained how the radical uncertainty surrounding technoscience and its impacts challenged and renewed the configurations of science-society relations. This change came along with the broadening of the range of actors and forms of knowledge relevant for science and technology policy. Such a broadened view has been accompanied by the diffusion of new mechanisms intended to ensure the engagement of these heterogeneous actors in decisions about science and technology, as well as by the widening of the criteria for assessing techno-scientific impacts on society, all the way to the inclusion of ELSI in such evaluations. The adoption of this comparative perspective highlighted how responsibility changed accordingly, expanding the range of its subjects, i.e. the social actors who assume or are ascribed one or more responsibilities, and of its objects, i.e. the consequences of decisions that actors can be called to answer for. Through this analysis, the paper suggests that responsibility is varied and diversified, and depends on the broader division of techno-scientific labour in society that frames it. Although the two perspectives on responsibility we have discussed (in the traditional 'social contract of science' and in the renewed forms that followed its demise) should be considered only as models, as ideal types, they clearly highlight the key point that the choice is not between responsible and irresponsible science and technology, but rather between different versions of responsibility, which different social actors call for and enact, and which need to be examined, understood, and coordinated.

BIBLIOGRAPHY

Altmann, Juergen (2006): Military Nanotechnology: Potential Applications and Preventive Arms Control. London: Routledge.
Arnaldi, Simone, Arianna Ferrari, Paolo Magaudda and Francesca Marin (2014): Introduction: Nanotechnology and the Quest for Responsibility.
In Simone Arnaldi, Arianna Ferrari, Paolo Magaudda and Francesca Marin (eds.), Responsibility in nanotechnology development, 1-17. Dordrecht: Springer.
Arnaldi, Simone (2014): ¿Qué tan laxa debe ser? La regulación de la nanotecnología en la opinión de los stakeholders italianos. Mundo Nano. Revista Interdisciplinaria en Nanociencias y Nanotecnología.
Arnaldi, Simone and Luca Bianchi (2014): Responsabilità e innovazione tecnologica. Un possibile contributo dell'analisi sociologica. In Massimo Chiocca and Luca Valli (eds.), L'innovazione responsabile. Volume III. Strumenti, 88-118. Roma: Retecamere.
Arnaldi, Simone, Mariassunta Piccinni and Piera Poletti (eds.) (2009): Small Divides, Big Challenges? Nanotechnology and Human Health. Studies in Ethics, Law and Technology 3 (3). Available on http://www.bepress.com/selt/vol3/iss3/, 6. 4. 2014.
Ball, Philip (2005): Synthetic biology for nanotechnology. Nanotechnology 16: R1-R8.
Beck, Ulrich (1999): L'epoca delle conseguenze secondarie e la politicizzazione della modernità. In Ulrich Beck, Anthony Giddens and Scott Lash, Modernizzazione Riflessiva. Politica, tradizione ed estetica nell'ordine sociale della modernità, 29-99. Trieste: Asterios.
Bucchi, Massimiano and Federico Neresini (2008): Science and public participation. In Edward J. Hackett, Olga Amsterdamska and Michael J. Lynch (eds.), Handbook of Science and Technology Studies, 449-473. Cambridge, Mass.: MIT Press.
Bucchi, Massimiano (2008): Scegliere il mondo che vogliamo. Cittadini, politica e tecnoscienza. Bologna: Il Mulino.
Bush, Vannevar (1945): Science. The Endless Frontier. Washington, D. C.: US Government Printing Office.
Callon, Michel, Pierre Lascoumes and Yannick Barthe (2001): Agir dans un monde incertain: Essai sur la démocratie technique. Paris: Éditions du Seuil.
Doktycz, Michael J. and Michael L. Simpson (2007): Nano-enabled synthetic biology. Molecular Systems Biology 3: Article number 125.
Douglas, Mary and Aaron Wildavsky (1982): Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers. Berkeley, CA: University of California Press.
Etp - European Technology Platform on Nanomedicine (2006a): Nanomedicine. Nanotechnology for Health. European Technology Platform, Strategic Research Agenda for Nanomedicine. Brussels: European Commission.
Etp - European Technology Platform on Nanomedicine (2006b): Strategic Research Agenda for Nanomedicine. Nanomedicine: Nanotechnology for Health. Brussels: European Commission.
European Commission (2008): Commission Recommendation of 07/02/2008 on a code of conduct for responsible nanosciences and nanotechnologies research. Available on http://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32008H0345&from=EN, 3. 4. 2014.
Fleischer, Torsten, Michael Decker and Ulrich Fiedeler (2005): Assessing emerging technologies—Methodological challenges and the case of nanotechnologies. Technological Forecasting & Social Change 72: 1112-1121.
Funtowicz, Silvio and Jerome Ravetz (1993): Science for the Post-Normal Age. Futures 25 (7): 739-755.
Giddens, Anthony (1999): Vivere in una società post-tradizionale. In Ulrich Beck, Anthony Giddens and Scott Lash, Modernizzazione Riflessiva. Politica, tradizione ed estetica nell'ordine sociale della modernità, 101-160. Trieste: Asterios.
Grunwald, Armin (2014): Responsible Research and Innovation: an emerging issue in research policy rooted in the debate on nanotechnology. In Simone Arnaldi, Arianna Ferrari, Paolo Magaudda and Francesca Marin (eds.), Responsibility in nanotechnology development, 189-203. Dordrecht: Springer.
Guston, David H. (2000): Retiring the social contract of science. Issues in Science and Technology 16 (4). Available on http://issues.org/16-4/p_guston/, 4. 4. 2014.
Hart, Herbert L. A. (1968): Punishment and Responsibility. Oxford: Clarendon Press.
Helland, Asgeir, Hans Kastenholz, Åke Thidell, Peter Arnfalk and Knut Deppert (2010): I materiali nanoparticolati e la loro regolamentazione in Europa: un'analisi delle opinioni degli stakeholder. In Simone Arnaldi and Andrea Lorenzet (eds.), Innovazioni in corso. Il dibattito sulle nanotecnologie fra diritto, etica e società, 61-82. Bologna: Il Mulino.
Herdman, Roger C. and James E. Jensen (1997): The OTA Story: The Agency Perspective. Technological Forecasting and Social Change 54: 131-143.
Jasanoff, Sheila (2004): The Idiom of Co-production. In Sheila Jasanoff (ed.), States of Knowledge. The Co-production of Science and Social Order, 1-12. London: Routledge.
Jasanoff, Sheila (2003): Technologies of Humility: Citizen Participation in Governing Science. Minerva 41: 223-244.
Jungmann, Ralph, Stephan Renner and Friedrich C. Simmel (2008): From DNA nanotechnology to synthetic biology. HFSP Journal 2 (2): 99-109.
Kermisch, Céline (2012): Risk and Responsibility: A Complex and Evolving Relationship. Science and Engineering Ethics 18 (1): 91-102.
Ladd, John (1982): Philosophical remarks on professional responsibility in organizations. International Journal of Applied Philosophy 1 (2): 58-70.
Limoges, Camille (1993): Expert knowledge and decision-making in controversy contexts. Public Understanding of Science 2 (4): 417-426.
Meridian Institute (2005): Nanotechnology and the Poor: Opportunities and Risks. Washington, DC: The Meridian Institute.
Merton, Robert K. (2000a): Il ruolo dell'intellettuale nella burocrazia pubblica. In Robert K. Merton, Teoria e struttura sociale. Volume II. Studi sulla struttura sociale e culturale, 423-449. Bologna: Il Mulino.
Merton, Robert K. (2000b): La macchina, l'operaio, il tecnico. In Robert K. Merton, Teoria e struttura sociale. Volume III. Sociologia della conoscenza e sociologia della scienza, 1075-1093. Bologna: Il Mulino.
Merton, Robert K. (2000c): Scienza e struttura sociale democratica. In Robert K.
Merton, Teoria e struttura sociale. Volume III. Sociologia della conoscenza e sociologia della scienza, 1055-1073. Bologna: Il Mulino.
NIA - Nanotechnology Industries Association (n.d.): Responsible Nano-Code. Available on http://www.nanotechia.org/activities/responsible-nano-code, 30. 12. 2013.
NNI - National Nanotechnology Initiative (n.d.): What is nanotechnology? Available on http://www.nano.gov/node/242, 3. 4. 2014.
Owen, Richard, Philip Macnaghten and Jack Stilgoe (2012): Responsible research and innovation: from science in society to science for society, with society. Science and Public Policy 6: 751-760.
Owen, Richard, Jack Stilgoe, Philip Macnaghten, Erik Fisher, Michael Gorman and David H. Guston (2013): A framework for responsible innovation. In Richard Owen, Maggie Heintz and John Bessant (eds.), Responsible Innovation, 27-50. Chichester, UK: Wiley.
Parsons, Talcott (1959): Some problems confronting sociology as a profession. American Sociological Review 24 (4): 547-569.
PCSBI - Presidential Commission for the Study of Bioethical Issues (2010): The ethics of synthetic biology and emerging technologies. Available on http://bioethics.gov/synthetic-biology-report, 3. 4. 2014.
Pellizzoni, Luigi (2003): Knowledge, Uncertainty and the Transformation of the Public Sphere. European Journal of Social Theory 6 (3): 327-355.
Pellizzoni, Luigi (2004): Responsibility and environmental governance. Environmental Politics 13 (3): 541-565.
Pellizzoni, Luigi (2005): Discutere l'incerto. In Luigi Pellizzoni (ed.), La deliberazione pubblica, 91-114. Roma: Meltemi.
Pellizzoni, Luigi (2010): Risk and Responsibility in a Manufactured World. Science and Engineering Ethics 16 (3): 463-478.
Pellizzoni, Luigi and Marja Ylönen (2008): Responsibility in Uncertain Times: An Institutional Perspective on Precaution. Global Environmental Politics 8 (3): 51-73.
Rerimassie, Virgil and Harald Koenig (n.d.): What is synthetic biology? Available on http://synenergene.eu/information/what-synthetic-biology, 3. 4.
2014.
Rip, Arie (1986): Controversies as informal technology assessment. Knowledge: Creation, Diffusion, Utilization 8 (2): 349-371.
Rowe, Gene and Lynn J. Frewer (2005): A Typology of Public Engagement Mechanisms. Science, Technology & Human Values 30 (2): 251-290.
Schlenker, Barry R., Thomas W. Britt, John Pennington, Rodolfo Murphy and Kevin Doherty (1994): The triangle model of responsibility. Psychological Review 101 (4): 632-652.
Strydom, Piet (1999): The Challenge of Responsibility for Sociology. Current Sociology 47 (3): 65-88.
Tran, Thien A. and Tugrul Daim (2008): A taxonomic review of methods and tools applied in technology assessment. Technological Forecasting & Social Change 75: 1396-1405.
van de Poel, Ibo (2009): The Introduction of Nanotechnology as a Societal Experiment. In Simone Arnaldi, Andrea Lorenzet and Federica Russo (eds.), Technoscience in progress. Managing the uncertainty of nanotechnology, 129-142. Amsterdam: IOS Press.
van den Ende, Jan, Karel Mulder, Marjolijn Knot, Ellen Moors and Philip Vergragt (1998): Traditional and Modern Technology Assessment: Toward a Toolkit. Technological Forecasting and Social Change 58: 5-21.
van Lente, Harro (1993): Promising technology: the dynamics of expectations in technological development. Delft: Eburon Academic Publishers.
van Lente, Harro (2000): Forceful Futures: from Promise to Requirement. In Nick Brown, Brian Rappert and Andrew Webster (eds.), Contested Futures. A sociology of prospective techno-science, 43-64. Aldershot: Ashgate.
Vincent, Nicole A. (2011): A Structured Taxonomy of Responsibility Concepts. In Nicole A. Vincent, Ibo van de Poel and Jeroen van den Hoven (eds.), Moral responsibility: beyond free will and determinism, 15-25. Dordrecht: Springer.
von Schomberg, René (2007): From the ethics of technology to the ethics of knowledge assessment. In Pierre Goujon et al. (eds.), The information society: innovation, legitimacy, ethics and democracy, 39-55. Boston: Springer.
von Schomberg, René (2013): A Vision of Responsible Research and Innovation. In Richard Owen, Maggie Heintz and John Bessant (eds.), Responsible Innovation, 51-73. Chichester, UK: Wiley.
Weber, Max (2004): The Vocation Lectures. Indianapolis, IN: Hackett Publishing.
Wynne, Brian (1992): Uncertainty and environmental learning. Global Environmental Change 2 (2): 111-127.
Wynne, Brian (1995): Technology assessment and reflexive social learning: observations from the risk field. In Arie Rip, Johan Schot and Thomas J. Misa (eds.), Managing Technology in Society. The Approach of Constructive Technology Assessment, 19-35. London: Pinter Publishers.
Wynne, Brian and Ulrike Felt (2007): Taking European Knowledge Society Seriously. Report of the Expert Group on Science and Governance to the Science, Economy and Society Directorate. Brussels: European Commission.