The Russian Academy of Sciences
Institute of Africa
A.V.Turchin
THE STRUCTURE OF THE GLOBAL CATASTROPHE
Risks for civilization in the XXI century
Introductory word: Nick Bostrom
Preface: G. G. Malinetsky
A joint project of the Centre for Civilizational and Regional Studies,
Institute for African Studies of the Russian Academy of Sciences, and
the Russian Transhumanist Movement.
Moscow 2008
A SERIES «DIALOGUES ABOUT THE FUTURE»
Vol. 2
Editors-in-chief:
I. V. Sledzevsky
Valery Prajd
Mail to:
Contents
Introductory word by Nick Bostrom
G. G. Malinetsky. Reflections on the inconceivable
Preface
Terms
Introduction
Part 1. The analysis of risks
Chapter 1. General remarks
The space of possibilities
Problems of calculating the probabilities of various scenarios
Principles of classification of global risks
Chapter 2. Nuclear weapons
2.1 "Nuclear winter"
2.2 Full radioactive contamination
2.3 Other dangers of nuclear weapons
2.4 Integration of the damage factors of nuclear weapons
2.5 The cost of creating a nuclear potential able to threaten the survival of human civilisation
2.6 Probability of global catastrophe caused by nuclear weapons
2.7 Change over time of the probability of global catastrophe caused by nuclear weapons
2.8 The strategy of nuclear deterrence is in doubt
2.9 Nuclear terrorism as a factor of global catastrophe
2.10 Conclusions on the risks of use of nuclear weapons
Chapter 3. Global chemical contamination
Conclusions about technologically ready risks
Risks whose emergence seems inevitable given the current character of technological development
Chapter 4. Biological weapons
General reasons and basic scenarios
The structure of a biological catastrophe
A "self-replicating" DNA synthesizer
Multiple biological strike
Biological delivery systems
Probability of use of biological weapons and its distribution in time
Chapter 5. Superdrug
Chapter 6. The risks connected with self-copying ideas (memes)
Chapter 7. Artificial intelligence
General description of the problem
AI as a universal absolute weapon
The system of goals
Struggle between AI projects
«The advanced human»
An AI and its separate copies
AI "revolt"
Speed of takeoff
Scenarios of "fast takeoff"
Slow takeoff and struggle between different AIs
Smooth transition. Transformation of a total-control state into an AI
"Revolt" of robots
The possibility of control and destruction
AI and states
Probability of an AI catastrophe
Other risks connected with computers
The time of creation of AI
Chapter 8. The risks connected with robots and nanotechnology
The sprayer robot
The self-reproducing robot
A cloud of microrobots
Armies of large combat robots escaping from control
Nanotechnological weapons
Unlimited spreading of self-replicating nanorobots
Probability of the emergence of nanorobots and the possible time of this event
Chapter 9. Technological ways of intentionally triggering natural catastrophes
Deviation of asteroids
Creation of an artificial supervolcano
Intentional destruction of the ozone layer
Chapter 10. The technological risks connected with essentially new discoveries
Unsuccessful physical experiment
New types of weapons, new energy sources, new media of distribution and means of long-range action
Chapter 11. The risks created by space technologies
Chapter 12. The risks connected with the SETI program
Chapter 13. The risks connected with the washing out of the borders between the human and the inhuman
Chapter 14. The risks connected with natural catastrophes
Chapter 15. Global warming
Chapter 16. Anthropogenic risks not connected with new technologies
Chapter 17. Causes of catastrophes unknown to us now
Chapter 18. Ways of detecting one-factor scenarios of global catastrophe
Chapter 19. Multifactor scenarios
Chapter 20. Events changing the probability of global catastrophe
Chapter 21. Cryptowars, the arms race and other scenario factors raising the probability of global catastrophe
Chapter 22. Factors influencing the speed of progress
Chapter 23. Protection against global risks
The general concept of the preventability of global risks
Active shields
Existing and future shields
Preservation of the world balance of power
A possible system of control over global risks
Conscious halting of technological progress
Means of preventive strike
Chapter 24. Indirect ways of estimating the probability of global catastrophe
Chapter 25. The most probable scenario of global catastrophe
Part 2. Methodology of the analysis of global risks
Chapter 1. General remarks. An error as an intellectual catastrophe
Chapter 2. Errors possible only concerning threats to the existence of mankind
45. Uncertainty of the meanings of new terms
Chapter 3. How cognitive biases that can concern any risks influence the estimation of global risks
5. Skill in conducting disputes is harmful
16. Underestimation of the value of remote events (discount rate)
17. Conscious unwillingness to know unpleasant facts
23. Difficulty in delimiting one's own knowledge
24. Humour as a factor of possible errors
25. Panic
26. Drowsiness and other factors of natural instability of human consciousness influencing the occurrence of errors
86. The upper border of possible catastrophe is formed on the basis of past experience
97. The minimum perceived risk
Chapter 4. General logical errors that can appear in reasoning about global risks
24. The St. Petersburg paradox
Chapter 5. The specific errors arising in discussions about the danger of uncontrollable development of artificial intelligence
Chapter 6. The specific errors connected with reasoning about the risks of the use of nanotechnology
12. E. Drexler on possible objections to the realizability of nanotechnology
Chapter 7. Conclusions from the analysis of cognitive biases in the estimation of global risks
Chapter 8. Possible rules for a rather effective estimation of global risks
The conclusion. Prospects of the prevention of global catastrophes
G. G. Malinetsky. Reflections on the inconceivable
Global instability
Psychological discourse
The problem of the tool
On the watch for harbingers
The foreword
Introductory word by Nick Bostrom
Terms
Introduction
The literature
Appendix 1. The table of catastrophes
Appendix 2. Articles
E. Yudkowsky. Artificial intelligence as a positive and negative factor of global risk
N. Bostrom. Introduction to the Doomsday argument
A. A. Kononov. The ideological beginnings of the general theory of the indestructibility of mankind
Notes
Introductory word by Nick Bostrom
Lots of academics spend a lot of time thinking about a lot of things. Unfortunately, threats to the human species are not yet one of them. We may hope that this will change, and perhaps this volume will help stimulate more research on this topic.
I have tried to investigate various aspects of the subject matter, but the study of existential risk is still very much in its infancy. I see it as part of a larger endeavor. As humanity's technological and economic powers grow, and as our scientific understanding deepens, we need to become better at thinking carefully and critically about the really big picture questions for humanity. We need to apply to these big questions at least the same level of attention to detail and analytic rigor that we would expect of a scientific study of the breeding habits of the dung fly or the composition of the rings of Saturn. We know that insight into these little things does not come by clapping our hands, and we should not expect wisdom about big things to be any easier. But if we make the effort, and if we try to be intellectually honest, and if we build on the vast amount of relevant science that already exists, we are likely to make some progress over time. And that would be an important philanthropic contribution.
Nick Bostrom
Oxford, 7 December 2007
Preface
G. G. Malinetsky. Reflections on the inconceivable
G. G. Malinetsky is Deputy Director of the Keldysh Institute of Applied Mathematics of the Russian Academy of Sciences.
I envy Proust. Revelling in the past, he leant on a rather firm basis: a quite reliable present and an indisputable future. But for us the past has become the past doubly; time is twice lost, because together with time we have also lost the world in which that time flowed. There was a break. The progress of the centuries has been interrupted. And we no longer know in what century we live and whether there will be any future for us.
R. Merle. «Malevil»
The picture I have drawn need not be a picture of full despondency: after all, inevitable catastrophes are, probably, not inevitable. And, of course, the chances of avoiding catastrophe begin to grow if we look catastrophe boldly in the face and estimate its danger.
A. Asimov. «A Choice of Catastrophes»
Such a book had to appear. Its time has come. It would have been good had it been written some 20 years earlier. But the past cannot be changed, and one has to think about the future, design it, and comprehend its dangers, risks and threats.
This book stands on the boundary between a review of the works on scenarios of global catastrophe carried out around the world, a futurological investigation, and a methodological analysis of existing approaches. The author of the book, Alexey Turchin, strives for an objective analysis, for scientific rigour, for an integral understanding of global risks. An unconditional merit of the book is its intellectual honesty, its striving for a clear separation of facts, results, hypotheses, doubts, and conjectures.
Many readers will probably have the natural question of how the undertaken research relates to the concrete work on risk management and the design of the future that is actively conducted in Russia and in the world. About this "bridge", connecting the analysis of hypothetical catastrophes with the work on forecasting and preventing real failures, disasters and emergencies, it is probably also necessary to speak in the foreword to this book.
Global instability
…a macroshift is a transformation of a civilisation in which the driving force is technology, and the shift is triggered by the presence of a critical mass of people who have realised the necessity of renewing the system of values.
E. Laszlo. "Macroshift"
Possibly, right now mankind is making the most important and significant choice in its history. In the theory of self-organisation, synergetics (literally, the theory of joint action), the concept of bifurcation is essentially important. The word came from the French, where it means forking, branching. A bifurcation is a situation in which the number or the stability of solutions of a certain type changes when a parameter changes.
In our case the parameter is time (more precisely, historical «slow time», as the outstanding French historian Fernand Braudel called it). The "solutions" are the major quantitative parameters characterising the way of life of our civilisation. And now, within the lifetime of one generation, the previous trajectory of development is losing stability.
The obvious evidence of this is the technological limit which civilisation has approached. By ecologists' estimates, if the whole world started to live today by the standards of California, the explored stocks of minerals would suffice for 2.5 years for some kinds of minerals and for 4 years for others. Mankind lives beyond its means: in a year it consumes a quantity of hydrocarbons whose creation took nature more than 2 million years. Several years ago an important boundary was passed: more than a third of oil began to be extracted on the shelf and from oceanic depths. Brazilian and American firms have begun drilling in the sea at depths of 2.5 kilometers. What was easy to reach has already been developed or exhausted.
The science of the XX century has not solved the problem of producing the necessary quantity of cheap clean energy and of accumulating it effectively. Clear evidence of the present world oil crisis is the rise in the price of oil from 7 (several decades ago) to 140 dollars per barrel. The same concerns the production of foodstuffs and the scenarios of economic development, aggravating the problems of globalisation. It becomes obvious that the former trajectory of development of mankind has lost stability. And it is necessary to choose a new trajectory consciously and reasonably, or circumstances will choose it for us.
In synergetics it is shown that near a bifurcation point instability takes place, and small causes can have big consequences. We see a set of signs of instability in modern reality. Instabilities have always been companions of the development of mankind.
Instabilities, as synergetics shows, have different characters. For example, in linear systems they develop by an exponential law or, which is the same, by a geometrical progression: growth by the same factor in the same time. The simplest example of such growth is given by the Malthusian equation
dN/dt = N.    (1)
According to the assumption of the English priest and professor of the East India Company college Thomas Malthus (1766-1834), the numbers of all species, including man, grow under this law. From school mathematics the solution of this equation, N(t) = N0·exp(t), is known. If the initial data are doubled, the solution will also double: a response proportional to the influence is the general trait of all linear systems.
It is a very fast law. The computer industry, for example, has been developing according to it since the 1960s. There it is called Moore's law: every 18 months the degree of integration of the elements of a microcircuit (and with it the speed of computers) doubles.
However, there are also faster laws, characteristic of nonlinear systems, for example, of systems with positive feedback. In them a deviation causes a reaction of the system that increases the deviation, and increases it more strongly than in equation (1).
Such instability is described, for example, by equation (2)
dN/dt = N².    (2)
But the growth law here is absolutely different:
N(t) = N0/(1 - N0·t).    (3)
Here we have a blow-up regime [1], in which the investigated parameter increases beyond all bounds in the finite time tf, which itself depends on the initial data: tf = 1/N0.
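Solution (3) can be obtained from equation (2) by separating variables; a brief sketch of the derivation (with a unit growth coefficient, as in the text):

\[
\frac{dN}{N^{2}} = dt
\;\Longrightarrow\;
\int_{N_0}^{N(t)} \frac{dN}{N^{2}} = \int_{0}^{t} dt
\;\Longrightarrow\;
\frac{1}{N_0} - \frac{1}{N(t)} = t
\;\Longrightarrow\;
N(t) = \frac{N_0}{1 - N_0\,t},
\]

so N(t) goes to infinity as t approaches tf = 1/N0, in contrast to the exponential law (1), which diverges only as t goes to infinity.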
All this is not a mathematical exercise but has a direct relation to our present and future. Research of the last decades of the XX century and the findings of paleodemographers have shown that the number of mankind throughout two million years grew exactly according to law (2), not law (1), and the peaking moment tf is near 2025.
Law (3) describes a singular point (a singularity). Experts in forecasting call the Singularity a hypothetical point in time, near 2030, at which a number of prognostic curves go to infinity. Many experts connect it with the explosive development of technical progress, in particular of information-telecommunication, nano-, bio- and cogno-technologies (the English abbreviation is NBIC: Nano-Bio-Info-Cogno), and with a qualitative change of mankind.
Let us argue as realists standing firmly on the Earth. The number of people cannot be infinitely large. Therefore law (3), the hyperbolic growth of the number of mankind, the main spring of history throughout many centuries, must change. And it is changing, and has been changing for the last 20 years: within the life of one generation. It is a question of a change of the algorithms of development of civilisation. The closest event of such a scale was the Neolithic revolution, as a result of which mankind managed to pass from hunting and gathering to agriculture and cattle breeding. By the estimates of some experts, during that revolution the number of mankind decreased almost 10 times.
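The contrast between laws (1) and (2) can also be seen numerically; a minimal sketch (the initial value N0 = 0.5 is purely illustrative and not taken from the demographic data):

```python
import math

# Illustrative comparison of the exponential law (1), dN/dt = N, with the
# hyperbolic law (2), dN/dt = N^2, whose closed-form solution (3),
# N(t) = N0 / (1 - N0*t), blows up at the finite time tf = 1/N0.

N0 = 0.5          # hypothetical initial value
tf = 1.0 / N0     # blow-up time of the hyperbolic law: tf = 2.0 here

def exponential(t, n0=N0):
    """Solution of dN/dt = N: unbounded, but only as t -> infinity."""
    return n0 * math.exp(t)

def hyperbolic(t, n0=N0):
    """Solution of dN/dt = N^2: diverges already at the finite time 1/n0."""
    assert t < 1.0 / n0, "past the blow-up time"
    return n0 / (1.0 - n0 * t)

# Linearity of (1): doubling the initial data doubles the solution.
assert abs(exponential(1.0, 2 * N0) - 2 * exponential(1.0, N0)) < 1e-12

# Near tf the hyperbolic solution explodes while the exponential stays modest.
for t in (0.0, 1.0, 1.9, 1.99):
    print(f"t={t:5.2f}  exponential: {exponential(t):10.2f}  "
          f"hyperbolic: {hyperbolic(t):10.2f}")
```

At t = 1.99, just before tf = 2, the hyperbolic solution has already reached 100 while the exponential one is still below 4: this finite-time divergence is the blow-up regime discussed above.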
It is a challenge to mankind and to science comparable with nothing else. The condition of resources, societies and the biosphere forces us, in the very short term of 15-20 years, to renew or considerably change the whole set of life-supporting technologies (power engineering, food production, transport, management of society and many other things).
As a matter of fact, it is a question of the type of bifurcation. Synergetics distinguishes soft and hard bifurcations. In a soft bifurcation the newly arisen trajectories lie in the vicinity of the former one, which has lost stability, and then gradually, evolutionarily, depart from it as the parameter changes. It is the result of an important and responsible choice whose essence and value will be found out later, while development goes on evolutionarily. It is the variant of the future expected by Professor S. P. Kapitsa.
But sometimes hard bifurcations also happen, when there is no close branch of the trajectory and, say, a transition occurs to another branch far enough from the previous one. This is a revolutionary event. One would not like to think that this awaits mankind in the coming decades, but it is also impossible to exclude such a variant. And common sense prompts that, hoping for the best, one should reckon on the worst and, of course, reflect on it seriously.
This is also the leitmotif of A. V. Turchin's book. As a matter of fact, it is the first scientific work (insofar as one can be scientific in the analysis of tragic, unprecedented events that have never occurred) devoted to the given circle of problems. The discussed direction of thought has been developing for some decades in the West. Corresponding works are published in many authoritative scientific journals, in particular, in Nature. It is natural to acquaint the domestic reader with this direction of research, which can prove very important (forewarned is forearmed).