Google-translated text from Russian
Original: za.ru/texts/2007/08/10-217.html
The Structure of the Global Catastrophe
Version 0.943
November 14, 2007
(c) 2007 Alexei Turchin
Attention: this is not the final text, and it may contain typos, factual inaccuracies, and missing semantic elements. I would be grateful for constructive comments.
Alexei Turchin
Abstract
Summary
Terms
Introduction
The interval under consideration: the 21st century
Methodology
Problems of the numerical calculation of the probabilities of different scenarios
Numerical estimates of the probability of global catastrophe given by various authors
Global catastrophe and the forecasting horizon
Historiography
Levels of possible degradation
Single-factor scenarios of global catastrophe
Global risks created by man
Technological risks
Risks for which the technology is already available
Nuclear weapons
Nuclear winter
Complete radioactive contamination
Superbomb
Accumulation of antimatter
A cheap bomb
A uniform attack on radiation facilities
The explosion of powerful bombs in space (an artificial gamma-ray burst)
Integration of the damaging factors of nuclear weapons
The cost of the issue
The probability of the event
Change over time in the probability of a global catastrophe caused by nuclear weapons
The strategy of deterrence in question
Conclusions on nuclear war
Global chemical contamination
Conclusions on the risks for which the technology is already available
Risks whose emergence seems inevitable given the current character of technological development
Biological weapons
The probability of the use of biological weapons and its distribution in time
Superdrug
Risks associated with self-copying ideas (memes)
Artificial intelligence
What is needed for AI in terms of technology
Why AI is a universal absolute weapon
The system of goals
The struggle of AI projects among themselves
Smarter than man
AI and its applications
AI revolt
The speed of the start
Scenarios of a "rapid start"
A slow start and the struggle of AIs among themselves
A smooth transition
The "uprising" of robots
Control and the possibility of extermination
AI and its goals
AI and the state
The probability of AI
Other risks associated with computers
The time of the emergence of AI
Risks associated with robots and nanotechnologies
The robot-sprayer
The self-reproducing robot
A swarm of robots
A huge army of combat robots out of control
The unlimited spread of nanorobots
The probability of the emergence of nanorobots and the possible time of this event
Technological ways of provoking natural disasters
Deflection of asteroids
Creation of an artificial supervolcano
Technological risks associated with fundamentally new discoveries
An unsuccessful physics experiment
Risks associated with the exploration of space
Xenobiological risks
An encounter with intelligent forces superior to us
Risks associated with the SETI program
A new kind of absolute weapon based on new physical principles
Risks associated with the blurring of the boundary between the human and the non-human
Risks associated with the problem of the "philosophical zombie"
Natural risks
Geological catastrophes
Eruptions of supervolcanoes
The fall of asteroids
The zone of destruction depending on the force of the explosion
Solar flares and increases in luminosity
Gamma-ray bursts
Supernovae
Global warming
Super-tsunami
Super-earthquake
The emergence of a new disease in nature
Marginal natural risks
A transition of the vacuum to a state with lower energy
Unknown processes in the Earth's core
Explosions of other planets of the solar system
Nemesis
The cessation of the "protection" given to us by the anthropic principle
Species risks not connected with new technologies
The exhaustion of resources
Overpopulation
The collapse of the biosphere
Socio-economic crisis. War
Genetic degradation and the weakening of fertility (the capacity for reproduction)
"Moral degradation"
Displacement by another species
Causes of catastrophe currently unknown to us
General features of any dangerous agent
Ways of emergence
Emergence from the starting point and the beginning of the spread
Spreading is more important than destruction
The method of spreading
The method of causing death
Typical kinds of destructive impact
The temporal structure of events
Pre-accident situations
Intentional and accidental global catastrophe
The Doomsday Machine
Multifactor scenarios
The integration of different technologies creating situations of risk
Paired scenarios
Historical analogies to global catastrophe
The inevitability of reaching a stable state
The risk of recurrence
Global risks and the problem of pace
The relative strength of different dangerous technologies
The sequence of the emergence of different technologies in time
A comparison of the risk factors of different technologies
The goals of creation
Social groups willing to risk the fate of the planet
The generalizing coefficient connected with the human factor
The decision on a nuclear strike
The cost of the issue
Global risks of the second kind
Events that may open a window of vulnerability
Systemic crises
The crisis of crises
Technological Singularity
Overconsumption leads to the simultaneous exhaustion of all resources
The systemic crisis and technological risks
The degree of motivation and awareness as factors of global risk
Cryptowars, the arms race, and other scenario factors that increase the probability of global catastrophe
Cryptowar
Vulnerability to micro-influences
The arms race
Moral degradation
Hostility as a scenario factor
Revenge as a scenario factor
War as a scenario factor
The degradation of the biosphere
Global disinfection
Business as a scenario factor
The point of absolute vulnerability
"Rocking" (destabilizing) control
Controllable and uncontrollable global risks. Problems of understanding global risk
Discovery
General patterns of behavior of systems on the verge of stability
The transition of a catastrophic process from one level to another
A scheme of scenarios
Shortage of resources - war - new technologies - unexpected results - spread of technologies
Global risks of the third kind
Moore's Law
Ways of preventing global risks
The general concept of the preventability of global risks
Active shields
Existing and future shields
The sequence of action of shields of various levels
Preserving the world balance
A system of control
A conscious halt to technological progress
Means of a preventive strike
Removing sources of risk to a great distance from the Earth
Survival in nature, the creation of remote settlements
Creating a dossier on global risks and raising public awareness
Shelters and bunkers
Advance settlement of space
"Everything will somehow work out"
Degradation to a stable state
Prevention of one catastrophe by means of another
Organizations responsible for the prevention of catastrophe
Assessment of the risk of global catastrophe in the present-day world
The problem of modeling scenarios of global catastrophe
Global catastrophe and the problem of faith
Global catastrophe and the organization of society
Russia and global catastrophe
Global catastrophe and the current situation in the world
The world after global catastrophe
A world without global catastrophe: the best realistic variant of preventing global catastrophe
Indirect ways of estimating the probability of global catastrophe
The Pareto law
The "Red Queen" hypothesis
The Fermi paradox
The "Doomsday argument". Gott's formula
The Carter-Leslie reasoning about the end of the world
The analogy with a human being
The observation selection effect. The Bostrom-Tegmark reasoning about global catastrophes
The simulation argument
Integration of the various indirect estimates
Conclusion
Recommended literature
Literature
General considerations on the mathematical modeling of global risks
What can we save?
A correct model of global catastrophe
Abstract
Several independent lists or dossiers of the risks of terminal global catastrophe, that is, of events which could lead to the complete extermination of humanity, have already been published (Bostrom, Leslie). In this paper I have tried to carry out the most complete and comprehensive analysis of such risks. But the main goal of the paper is to study the relationships between different global risks and to learn how much these relationships increase the probability of final human extinction. To do this I have made a structural analysis of global risks and investigated their temporal, causal, and spatial mutual influence, as well as their connection with technological development and with the rate at which new risks are created. I show that separate non-terminal dangers can, in some special circumstances, interfere nonlinearly and become terminal for humanity, and I discuss the types of situations in which such interference is possible. I distinguish a separate category of "systemic risk", which is typically not discussed in connection with existential risks but plays the main role in all types of catastrophes of complex systems. I critically discuss all known methods of prevention. Finally, I conclude that the complex interrelations between different types of risks severely reduce our chances of surviving the XXI century, and thus demand not only huge material efforts to stop them, but a much higher and deeper level of understanding.
Summary.
There have already been several independent and thorough attempts to draw up lists or dossiers of the different risks that could lead to the complete and irreversible extinction of mankind (Bostrom, Leslie). In this paper an attempt is made to carry out the most complete, thorough, and accurate analysis of such risks. However, the main objective of the work is to explore the interaction of different global risks with one another and to find out how this interaction increases the probability of irreversible global catastrophe. To do this, a structural analysis of global risks is carried out, and the parallel and sequential interaction of different risks is explored, as well as their influence on the development of technologies and on the emergence of new risks in connection with it. It is shown that individual risks which do not lead to human extinction by themselves may, in sequential or joint realization, lead to this outcome. The situations in which such a pathological self-organization of risks becomes possible are considered. A separate category of "systemic risk" is identified, which is usually not considered in the context of global catastrophe, although it plays a leading role in most catastrophes of complex systems. All the classic means of confronting global risks (world government, bunkers, the settlement of space, shields, and systems of control) are reviewed and critically analyzed. As a result, it is concluded that the complex interaction of risks significantly worsens the chances of survival of mankind in the XXI century, and therefore requires not just investment, but far more careful consideration and understanding, for its prevention.
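A simple numerical illustration of this central thesis (my own example, with deliberately made-up figures): suppose catastrophe A kills 90% of humanity, and catastrophe B, independently, also kills 90%. Neither is terminal by itself, but together they leave only 0.1 × 0.1 = 1% of the population, which may already be below the level needed to sustain civilization and recover; moreover, a world weakened by A may raise the very probability of B. It is this mutual reinforcement that makes the interaction of risks nonlinear rather than merely additive.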
Terms
In this work, a number of general terms are used in the following senses (each term will be explained in detail in the text):
AI - universal artificial intelligence, capable of mastering any kind of intellectual activity accessible to humans.
Global catastrophe - an event in which all people on Earth irreversibly perish.
Post-apocalyptic world - the Earth after a very large catastrophe has taken place on it, but some number of people have survived.
Doomsday machine, doomsday weapon - any device, substance, or method created specifically for the final and irreversible destruction of mankind.
Agent - a substance, virus, bacterium, or any other spreading factor whose action causes death.
Singularity - a point in time, in the region of 2030, at which some predictive curves go to infinity. It is associated with the extremely rapid growth of technological progress, especially of computers, bio- and nanotechnologies, brain research, and AI systems, and implies a qualitative change in mankind. The term was put into use by Vernor Vinge in 1993.
Moore's Law - originally, the observation that the number of transistors on microprocessors doubles every two years. Speaking of Moore's Law in the broad sense, I mean any form of technological progress in which some essential parameter doubles over a period on the order of several years.
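Two of these definitions can be restated as simple formulas (the notation here is mine, added for clarity, and not part of the original text). A predictive curve that "goes to infinity" in finite time is, for example, a hyperbolic law

N(t) = C / (t* - t),

which diverges as t approaches t*; a well-known calculation of this kind, von Foerster's 1960 extrapolation of world population growth, placed such a singularity around 2026. Moore's Law in the broad sense is an exponential doubling law

N(t) = N0 · 2^(t / T),

where N0 is the initial value of the parameter and T is the doubling period; with T = 2 years, the parameter grows by a factor of 2^5 = 32 over a decade.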
Introduction.
This text is addressed to any future and existing organizations that will work to prevent global catastrophes or that by the nature of their activity will encounter them, including various governments, research institutes, security services, military bodies, and non-governmental foundations, to their heads and employees, as well as to futurologists, young scientists, and everyone who is interested in the future of humanity. The purpose of this text is to describe the space of possibilities of definitive global catastrophe. By definitive global catastrophe I mean an event which, in Nick Bostrom's definition, "exterminates intelligent life on Earth or irreversibly harms its potential." The complete extinction of all human beings is the most likely form of such an event, and further in the text the words "global catastrophe" will refer to this event.
"Space-term opportunities", ascending to the book "Fantastic" futurology and Stanislaw Lem. He contrast to the submissions of individual scenarios and possibilities. Lem following results compare to clarify the term: although the number of possible chess parties indefinitely, the very description of the rules of the game and basic principles of the strategy is the final volume and undestandable. As an example, he cited the Cold War space capabilities, which were put to the emergence of a certain technology, and within which unfolded any confrontation scenarios: Caribbean crisis, the arms race, etc. Description scenarios virtually useless because, although each can be very intriguing, the likelihood of its implementation is very small. The more specific details in the scenario, the less likely it - although visibility likelihood of this increase. However, the analysis of individual scenarios cut space gives us opportunities, and therefore useful.
One of the most important ways of achieving security is to take into account all possible scenarios according to their probability, that is, to construct a "fault tree". For example, aviation safety is achieved in particular by the fact that every conceivable scenario of catastrophe, down to a certain precisely calculated level of risk, is taken into account. The description of the space of possibilities of global catastrophe has prevention as its aim. Therefore it should concentrate on those key points whose management will allow one to control the risk of the greatest number of possible catastrophic scenarios. In addition, it should provide information in a form that is easy to grasp and suitable for practical use, and it is desirable that this information be adapted to those consumers who will carry out the direct prevention of global risks. However, the task of identifying these consumers is itself not simple.
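As a minimal sketch of such accounting (the event names and probabilities below are hypothetical illustrations, not estimates from this text), a fault tree combines the probabilities of independent basic events through AND and OR gates:

from math import prod

def p_or(probs):
    """Probability that at least one of several independent events occurs."""
    return 1 - prod(1 - p for p in probs)

def p_and(probs):
    """Probability that all of several independent events occur."""
    return prod(probs)

# Hypothetical basic events of one aviation-style disaster scenario.
p_engine_failure = 1e-4   # assumed probability, for illustration only
p_detection_miss = 1e-2   # assumed probability
p_crew_error     = 1e-3   # assumed probability

# AND-gate: this scenario requires an engine failure AND a missed detection.
p_scenario = p_and([p_engine_failure, p_detection_miss])   # = 1e-6

# OR-gate: the top event occurs via this scenario OR via an independent crew error.
p_top = p_or([p_scenario, p_crew_error])

print(f"P(catastrophe) = {p_top:.3e}")   # about 1.001e-03

A real fault-tree analysis enumerates thousands of such branches and prunes those whose probability falls below a chosen threshold; this threshold is the "precisely calculated level of risk" mentioned above.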