For decades, the authors of the following papers have pointed out the need to systematically examine knowledge-based programs, such as that of nuclear waste management, for errors, and to define processes for detecting and correcting them. As early as 1981, in their book «Ways Out of the Disposal Trap», they wrote of the need for «a systematic assessment and monitoring of research», including ensuring «a program of quality assurance of one’s work …». Over the years, this blog has also featured recurring posts and reflections on error culture and learning processes.[1] The authors’ most recent contribution on error culture appeared in the spring of 2021 in an article for the Foundation «Life and Environment» entitled «Learning from errors: the thorny path of site search», which is attached to this post.
On dependencies in knowledge processes
Anyone who grapples with errors and error culture will at some point encounter Ludwik Fleck. This physician and microbiologist, who worked in the first decades of the last century in Lviv, now part of Ukraine, took a fundamental interest in the question of how scientific knowledge comes about. Fleck, who was well-read in the history of science, recognized that knowledge is not created by a supposedly objective cognitive process independent of social values, but that social conditions, values and preferences have a decisive influence on this process. His main work, «Genesis and Development of a Scientific Fact», published in 1935 and almost unnoticed for many decades, shows the relationship between cognition and social value preferences very impressively.[2] From this perspective, science and its development can only be understood by taking social and historical contexts into account. From this insight, Fleck developed his theory of thinking styles, which form when knowledge bearers join together in thought collectives and subscribe to closed scientific explanations of the world.
Fleck thus recognized very well the relation of knowledge to social norms, as well as the difficulty of disentangling these relations in a highly differentiated society. His epistemology, however, rested on the assumption of a «democratically» organized science following universally valid norms of knowledge generation.[3] His interest was thus directed less at the social relations among members of thought collectives. This is understandable from the perspective of his time, but it would have to be supplemented in essential points to account for today’s organizational models of society and the programs pursued today, such as the site search for deep geological repositories for radioactive waste.
In such processes of cognition, it is therefore not only a matter of understanding how a scientific insight is formed in a particular social context, but also of clarifying which social forces influence its formation. For in reality, the free formation of opinion, even in democratically organized societies, is co-determined by dependencies between knowledge bearers. As soon as members of a thought collective are confronted with economic dependencies and political and ideological consent is expected of them, the process of cognition is no longer free. Such a process subordinates itself to dependencies and power imbalances, and the thinking styles that develop under such conditions are increasingly shaped and decided by political power positions. We then find ourselves outside a science-based process of cognition as it is represented and emphasized in a free knowledge society – at universities, for example. Opinions converge, and this convergence is not only shaped by the specific social context but is to a large extent driven by a convergence of interests – especially economic and political ones. Economic, political or ideological dependencies affect the process of cognition and can significantly influence and distort it.
Such a development has consequences. The scientific processing of projects or programs is then no longer primarily about the search for scientific «truths», but about channeling the interests of thought collectives – of whatever kind. The «search for truth» is subordinated to political, economic or ideological interests. Questioning and doubt, as well as contradiction and the debate of controversial ideas – some of the most important features of any free scientific process – practically thin out in an environment with a pronounced convergence of interests. Under such conditions, not only does the probability of scientific missteps increase, but so does the risk of deliberate influence on and manipulation of processes.
The consequence of dependencies on economic policy programs
The account of modern knowledge programs still assumes idealized conditions in the search for scientific knowledge. This notion contrasts with the reality that modern technologized societies have to deal with: creating and securing the prosperity of the population and coping with looming problems. Governments and public authorities are therefore obliged to meet a variety of needs of the population, such as full employment, nutrition, health care or access to infrastructure. Complex mega-programs and projects, whose size is almost unmanageable, are supposed to meet these enormous demands.[4] This development towards overarching mega-programs is leading to fundamental structural adjustments in modern societies and to growing influence over economic policy by governments and large power conglomerates. Science and technology play essential roles in assisting with and addressing specific technical or scientific issues, but they are no longer at the forefront of the scientific management of the consequences of such projects and the associated economic and political decision-making processes. This has fundamental consequences for decision-making in megaprojects: it is no longer necessarily knowledge-based a priori, but increasingly follows political and economic constraints.
This naturally raises fundamental questions about how scientific knowledge processes can still be sufficiently integrated into these large-scale political landscapes. Large-scale programs demand the management of such processes and assume that science and technology place their participation in the service of these large-scale projects. Findings that could endanger these programs are therefore marginalized and sometimes openly opposed by the management. In essence, the decision-makers are primarily concerned with the short- and medium-term implementation of such large-scale projects, not with the possible long-term effects of wrong decisions.
A crux …
This division of labor presents modern societies not only with formidable challenges in the concrete management of their large-scale projects, but also with a fundamental dilemma in the allocation of responsibilities and leadership claims. The primary issue is to reconcile the political and administrative leadership of large-scale projects with the scientific viability of decisions. Today, this relationship between project management and scientific planning and quality assurance of programs is often out of kilter.
Particularly in these large-scale programs and megaprojects led by government agencies, the scientific character of the solutions often gets lost. While science is used to provide concrete solutions to problems, it has lost its primacy as the fulcrum for knowledge-based decision-making. The claim to leadership in science-related decisions has increasingly shifted toward politically active institutions. Today, it is essentially governments and administrations that decide on the extent and quality of scientific expertise. This development is being followed with growing concern by scientific institutions, but also by governmental institutions, as expressed in various recent publications.[5] «Protecting scientific integrity in government is vital for the country. The convergence of economic, health, social, and climate crises facing the nation underscores the need for fact-based decisions guided by the best available science», concludes, for example, a study published in 2022 by the U.S. «National Science and Technology Council».[6]
A commitment to knowledge- and fact-based decisions – as stated, for example, in paragraph 1 of the German Site Selection Act – implies the acceptance of scientific methods, and thus also the process of questioning scientific ideas, programs and projects, and the evaluation and analysis of program developments. In a word: error culture. This realization calls for a new debate on the management of government mega-programs and the integration of scientific expertise according to the proven principles of scientific research. The relationship between political and scientific institutions, as well as the rights of scientific actors within large-scale projects, therefore needs to be clarified and regulated.
And this brings us back to the actual topic of error culture, which should be a mandatory part of all such large-scale projects. In the following articles we will present a number of concrete examples of how institutions deal with scientific facts they do not agree with, and conclude these articles with possibilities for a different way of dealing with dissenting scientific opinions. A good introduction to this is the attached text which appeared in the «Foundation Life and Environment» in 2021.
© Foundation «Life and Environment» (Stiftung Leben und Umwelt), Heinrich Böll Foundation, Lower Saxony
Learning from mistakes: the thorny path of site search …
Learning is based on mistakes. Learning therefore means not only being allowed to make mistakes, but also being able to talk about them explicitly and to stand by them. But what if institutions or individuals do not want to have made mistakes? When mistakes are deliberately hidden or may not be talked about? Can a system built on such a foundation function at all in the long run? Especially in a high-risk area like nuclear energy, with its long-lived radioactive waste?
Standing by mistakes is highly culture-dependent. There are cultures in the world that are better or worse at dealing with them. Our Western civilization clearly belongs to those cultures that have difficulty dealing with mistakes – despite the Enlightenment and an extremely successful scientific-technical track record. Our 2,000-year-old Christian roots, with their pronounced culture of guilt, still stand in the way of an open discussion of mistakes and shortcomings. This is true today even in science, and especially in dealing with risk technologies. Waste management exemplifies the cultural weaknesses in dealing with a product of our culture that no one actually wants and that society likes to push away and suppress: waste. Above all, it becomes clear how hard it still is today for a learning culture based on mistakes – error recognition and correction, or «error culture» for short – to gain a foothold.
«Our 2,000-year-old Christian roots with a pronounced blame culture still stand in the way of openly dealing with mistakes and shortcomings.»
Experiences in the disposal of toxic waste
Looking back at the history of waste disposal over the last century, this societal disinterest in all the transformation products of our technology and consumption can be traced very clearly. The productivity and efficiency peaks of modern industrial societies correlate impressively with the increase in waste volumes and the hazardousness of our legacies. Whereas at the beginning of industrialization these often toxic legacies were simply stored on site or dumped into rivers, groundwater or disused gravel pits and quarries, the undeniable damage caused by such disposal practices has led our industrial societies slowly to rethink and improve their approach to the disposal of hazardous waste. Nevertheless, the fundamental lessons underlying a proactive error culture have never really been incorporated into the planning and execution of national or international waste management programs. Officials do like to point to error culture, to «lessons learned» or to quality assurance programs. But reality speaks a different language, even in the so-called highly developed nations: to this day, there is not a single storage project for highly toxic waste in the world that is capable of meeting the long-term quality requirements for such a project – neither for radioactive waste nor for chemo-toxic hazardous waste. A major reason for this misery is the absence of an error recognition and correction culture («error culture») worthy of the name. We can trace the reasons for this exemplarily in the developments in Switzerland, for example in the search for sites for repositories for radioactive waste.
Experiences in Switzerland
In Switzerland, the search for sites for repositories for radioactive waste was initiated by the core-melt accident at the Lucens experimental nuclear power plant in 1969. The waste resulting from this accident – including melted fuel elements from the natural-uranium heavy-water experimental power plant and an irradiated reactor core – called for interim storage or repository options. Thus, in 1972, the Swiss nuclear industry and the Swiss Confederation founded the National Cooperative for the Disposal of Radioactive Waste (Nagra), which has been searching for sites for repositories for high-level or low- and intermediate-level waste for almost 50 years. This program had to be discontinued several times. Host rocks and regions alternated: from anhydrite and salt rocks, the search moved to crystalline rocks, marl and then to various clay formations. Site search programs had to be restarted several times – from the Swiss Jura through the Alps to the northeast of the country. What is striking about this soon-to-be 50-year slalom of a search process is the lack of an official historical analysis of these developments that would identify and address the reasons for the failure of these programs. After all, when programs fail multiple times, there must be reasons. There were many of them. The most important: our Swiss programs lack what constitutes an error culture, namely a fear- and repression-free approach to criticism.
«Our Swiss programs lack what constitutes an error culture, namely a fear- and repression-free approach to criticism.»
The lack of a reckoning with the past history of the site search goes hand in hand with the responsible institutions’ efforts to present the current site search program in an extremely positive light. From the very beginning, the Swiss Federal Office of Energy (SFOE), which is responsible for managing the process, pointed to the model character of the Deep Geological Repository sectoral plan – the Swiss site selection procedure – and also emphasized its internationally pioneering character. In Germany in particular, this image of an exemplary procedure received much recognition, although a large number of basic problems were apparent from the very beginning, both in the development of the concept and in the execution of the program. These manifest themselves mainly in four areas:
Concept: the concept of the Deep Geological Repository sectoral plan was substantively developed by Nagra, i.e., by the organization of the nuclear power producers responsible for final disposal. This fact was never made visible, and the disclosure of these relations was neither confirmed nor denied by the responsible institutions. But the congruence with the nuclear industry’s previous concept proposals and the identical approach to the execution of the program cannot be overlooked. That a search concept was developed by the supervised party itself rather than by an institution or commission with independent interests, and that this fact has been concealed and so far ignored in the public debate, is detrimental to the credibility of this process. In addition, the time required for implementation was hopelessly underestimated from the very beginning, which the authorities now admit with arbitrary excuses.
«That a concept was developed by the supervised party itself and not by an institution or commission independent of interests, but this fact is concealed …, is detrimental to the credibility of the process.»
Handling: the handling of the site search led to countless debates about the methods used. In particular, the approach to the field work – the seismic and drilling programs – has been criticized, as has the lack of transparency in publishing the results. Equally controversial is the development of the repository concepts, which were to a certain extent designed from the roof down – i.e., the facilities on the surface – without knowing the foundation – i.e., the conditions underground. Finally, it was and is criticized that decisions are changed without justification, i.e., that there is no justified and comprehensible change management for resolutions and decisions. This is evident, for example, in the previously preferred connection of the surface facilities to the subsoil, which – contrary to the original intentions – is now to be made with shafts and not with a ramp.
Structural weaknesses and dependencies: one of the major issues in Switzerland is the structural dependency of the authorities on the waste producers. For example, it became known in the course of the proceedings that the supervising authority, the Swiss Federal Nuclear Safety Inspectorate (ENSI), has no decision-making power. It can comment on reports from the waste producers, but it cannot issue orders – as any building authority can in any building permit procedure. That is a matter for the political authorities. In the end, such organizational constructs weaken not only the supervisory institution, but also the procedure and thus its credibility as a whole. It is therefore only logical that the Swiss safety authority ENSI is regarded as a weak and dependent supervisory authority.
Participation: finally, participation is also criticized. In particular, it is criticized that there is no procedure and no discussion at eye level, and that objections that do not suit the procedure are not taken up, are ignored or are omitted. In addition, there has been increasing criticism of the management of the participation process and of the influence exerted by the institutions that endorse the procedure. Citizen participation and the creation of acceptance are the basis for the implementation of such projects. If these cannot be ensured, there is a risk that a program will be abandoned prematurely, as has repeatedly happened in Switzerland in the past. Genuine participation takes time. It determines whether a culture of error and discussion can be successfully established.
Many of the problems raised above are well known in Switzerland – as, incidentally, in neighboring countries. What is particularly striking is that the responsible authorities and bodies simply do not respond to the questions and criticisms raised. The law of silence applies: the unpleasant is silenced and the impermissible excluded. Omertà and the exclusion of facts are extremely good indicators of predetermined, closed processes that primarily serve the goal of pushing through a certain program without regard to undesirable developments.
Why defensive attitudes towards an open discussion? So, what’s next?
One can certainly have different opinions on the conceptual, organizational, content-related and participatory issues outlined above. However, the question in this context is a completely different one: do other points of view, and does criticism of specific points, have any chance at all of being taken up and dealt with openly and self-critically? Error culture begins not only with the recognition and correction of shortcomings, but primarily with the acceptance of unpleasant questions and findings, and with their treatment and acknowledgement – in short, with the process of questioning. Yet it is precisely this process that our institutions find extremely difficult. This is not just a matter of the aforementioned culturally determined hurdles in dealing with errors. Defensive attitudes towards major changes are often also institutionally fixed or inherent in the program: an open discussion of one’s own conceptions and possible conceptual inadequacies or errors can endanger an entire program. The institutions entrusted with such programs naturally resist allowing such a development to take place at all. After all, their raison d’être is precisely to carry out a program once it has been determined and legally established. If disruptive factors enter this process and could endanger this goal, the emergence of defense mechanisms is obvious and understandable. First, because the legitimacy of continuing such a program is fundamentally disturbed. Second, because it calls into question the role of the supporting institutions. And finally, because fears – including fears of being blamed, of reprimands, or existential fears – also play a role in such situations. Fear is a great inhibitor. This insight also led William Edwards Deming, the great mastermind of quality management, to include freedom from fear in his 14-point program for «Total Quality Management» (TQM):[7] «Drive out fear.» It should be added that «drive out fear» should apply not only within the organization but across the entire execution of a program. Slightly modified, this could thus also mean:
«Drive out fear so that everyone can work effectively for a program or participate in it in a questioning way.»
It is interesting how cultural identities are expressed in such contexts: while the U.S. Nuclear Regulatory Commission (NRC) speaks openly about the need for nuclear industry employees to be able to do their jobs «without fear of reprisal»,[8] the word «fear» is avoided in the corresponding International Atomic Energy Agency (IAEA) safety culture documents.[9] In our European – and especially our German-speaking – world, we are all too happy to avoid the terms «fear» and «fearless» in such contexts.
There is no ready-made recipe for how to establish an open discussion culture and thus integrate a culture of error recognition and elimination. What there is, however, is experience of what should certainly not be done, and experience of approaches that have proven effective in other complex large-scale projects. The decisive factor is probably to approach the management of this multi-generational program in a completely different way: namely, in the knowledge that every project, however good it may seem at the moment, also has or can have weaknesses that lead to failure. This realization should lead to the development of rules for recognizing, discussing and correcting errors or potential errors. It should also include institutional safeguards, in the sense that a kind of «judiciary» within the process ensures that this open, questioning and self-questioning discussion can actually take place. The success of this challenging site selection process and the realization of socially acceptable repository projects will therefore depend to a large extent on whether the course can actually be set in the search and implementation process in such a way that warnings from «outside the box» are received, dealt with and integrated by the responsible institutions. For without acceptance on the part of the affected site communities, a site search project and, above all, a repository project will hardly be feasible in the long run.
About the author: Marcos Buser, a geologist and social scientist, has accompanied nuclear waste management programs for four and a half decades as a scientist and in various official capacities, including as a member of the Swiss Federal Commission for Nuclear Safety.
This article originally appeared at: https://www.slu-boell.de/de/2021/05/12/von-fehlern-lernen-der-dornige-weg-der-standortsuche
[1] E.g. Buser, M. Schacht oder Rampe: Fehlerkultur bei der Erschliessung eines geologischen Tiefenlagers, 18. März 2021, https://www.nuclearwaste.info/schacht-oder-rampe-fehlerkultur-bei-der-erschliessung-eines-geologischen-tiefenlagers/; Buser, M., Jaccard, J.-P. Fehlerkultur beim ENSI: ein Buch mit sieben Siegeln, 9. März 2021, https://www.nuclearwaste.info/fehlerkultur-im-ensi-ein-buch-mit-sieben-siegeln/; Jaccard, J.P. Das ENSI ist nicht unfehlbar, es hat sich geirrt. 9. März 2021. https://www.nuclearwaste.info/das-ensi-ist-nicht-unfehlbar-es-hat-sich-geirrt/.
[2] Ludwik Fleck, 1935/2017. Entstehung und Entwicklung einer wissenschaftlichen Tatsache. Einführung in die Lehre von Denkstil und Denkkollektiv. Suhrkamp taschenbuch wissenschaft.
[3] Ludwik Fleck, 1936/2019. Das Problem einer Theorie des Erkennens. In: Erfahrung und Tatsache. Gesammelte Aufsätze. Suhrkamp taschenbuch wissenschaft.
[4] Priemus, H. / van Wee, B. (eds.), 2013. International Handbook on Mega-Projects, Edward Elgar; Irlbeck, Benjamin, 2017. Megaprojekte. Herausforderungen, Lösungsansätze und aktuelle Beispiele. AV Akademiker Verlag; Frahm, M. / Rahebi, H., 2021. Management von Groß- und Megaprojekten im Bauwesen, Springer Verlag; etc.
[5] King, Anthony, 2016. Science, policy and policymaking, EMBO Reports, 2016/Vol. 7, No. 11.
[6] National Science and Technology Council, 2022. Protecting the Integrity of Government Science. The White House. January 2022. https://www.whitehouse.gov/wp-content/uploads/2022/01/01-22-Protecting_the_Integrity_of_Government_Science.pdf
[7] W. E. Deming’s 14 points.
[8] U.S. Nuclear Regulatory Commission. Policy Statement on the Freedom of Employees in the Nuclear Industry to Raise Safety Concerns without Fear of Retaliation, Federal Register, 61 FR 24336, May 14, 1996.
[9] In the IAEA reports on Safety Culture (INSAG 1991 ff.) there are no references to the motive of fear in safety culture.