The result is a highly readable account of the field of global catastrophic risk as it is currently constituted. The book opens with the commonly articulated moral argument for focusing on extreme catastrophes, rooted in a concern for future generations. The core of the book features chapters covering specific risks: asteroid collisions, volcanic eruptions, nuclear war, global warming, disease outbreaks, biotechnology, and various threats from artificial and extraterrestrial intelligence.
Walsh speaks most authoritatively on global warming and naturally occurring pathogens, having covered both in great detail over his career with Time magazine. Nonetheless, each of the risks gets a sound treatment, informed by interviews with some of the top experts in the field.
The book is rich with compelling insights and stories. The reader learns of asteroid collisions that could be mistaken for nuclear attacks, archaeological digs that offer insight into the human death tolls of ancient volcanic eruptions, and the emotional and psychological difficulty of evaluating mass fatalities. One such collision garnered wide attention and motivated policy attention to the threats posed by asteroids and comets. As the book explains, efforts to address the threat were already under way, but the collision made the threat more tangible.
This shows how science and politics can combine to make progress, even for catastrophic risks that might seem far-fetched. End Times is framed as a wake-up call for humanity to recognize the urgency of the risks and act accordingly.
Will it succeed? Many previous books have presented similar moral and scientific arguments and have left little impact relative to the scope of the challenge posed by the risks. This raises two questions that get too little attention in studies of catastrophic risk.

Jun 24, Rhodes Hileman rated it it was amazing. Best collection of global risk analysis yet.
Generally cogent, detailed arguments on 21 separate risk types. For anyone concerned about the future, this volume is a must.

Wish I could put 2 and a half stars.
This is a strange one as it is a series of articles about global catastrophe written in a very scholarly manner. There is no apocalypse porn to look at here folks, please move on. Various articles all come to the same conclusion that we are not really in danger of being wiped out.
We (humanity, that is) could go backwards or even have a catastrophe or two, but we will not become extinct as long as enough of us remain in bonkable condition and not too distant from one another.
We will survive.

Jul 02, Patrick rated it it was ok. The exact opposite of light reading. Light reading generally requires three things: being short, easy to read, and on a light-hearted subject. This book is none of those things: it consists of hundreds of pages of scientific essays on the end of the world. Setting the tone is the first chapter by astrophysicist Fred Adams about the inevitable death and decay of the universe.
The basic message is that in billions of years, we will be dead, so get used to it. The rest of the book is downhill from there; topics include supervolcanoes, gamma-ray bursts, climate change, pandemics, evil AI, physics experiments that destroy the Earth, nuclear war, nuclear terrorism, bioterrorism, self-replicating nanobots, and totalitarianism. Cheery stuff, basically. Climate change is actually considered a relatively minor problem; after all, it's "only" estimated to kill about 30 million people.
The best essays are actually in Part I, about general cognitive biases and approaches toward risk. James J. Hughes wrote an excellent essay on apocalyptic and millennial ideologies; Eliezer Yudkowsky's essay on cognitive biases affecting risk judgment is brilliant.
Milan M. Cirkovic's essay on anthropic biases is particularly chilling: we may think that certain events are unlikely simply because, had they happened, we wouldn't be here. The only essay in Part I that isn't very interesting is Yacov Y. Haimes' essay on "systems-based risk analysis," which is mostly obvious common sense restated in technical jargon. The essays on specific threats honestly aren't that compelling; they don't present a unified narrative or give a good sense of which risks we should be most worried about or most focused on preventing.
The net effect is sort of a list of ways we could die, without a clear sense of how we should be trying to protect ourselves.
The book presents itself as trying to save humanity, but ends up feeling more like a pessimist's anxiety dreams.

Jul 10, Alexandru Tudorica rated it it was amazing. Shelves: general-science.
Check this out - nickbostrom. Even an infinitesimal action can fundamentally alter large swathes of the future, so being aware of what counts most feels imperative now. I'm inclined to consider that superintelligent AI and molecular nanotechnology are among the most pressing and likely developments that could endanger the survival of mankind - and we should definitely start planning to get it right the first time. We'll get only one chance. One of the most effective solutions for many of these existential risks is space colonization, since the resilience of a species increases greatly and qualitatively with the amount of space occupied and the number of distinct environments conquered (other planets, the Moon, asteroids, empty space, etc.).
Undoubtedly, less critical threats related to global governance systems shouldn't be swept under the rug, since addressing them will lower overall existential risk anyway, but preparing for the existential threats must definitely be prioritized.

Jul 25, Jeffrey Shrader added it. I was hoping for more discussion of correlated risks and the trade-offs faced by public policy when dealing with multi-dimensional, existential risks. The editors lay out exactly why this attention to multi-dimensionality is important: [T]here are also pragmatic reasons for addressing global catastrophic risks as a single field.
Attention is scarce. Mitigation is costly. To decide how to allocate effort and resources, we must make comparative judgements. If we treat risks singly, and never as part of an overall threat profile, we may become unduly fixated on the one or two dangers that happen to have captured the public or expert imagination of the day, while neglecting other risks that are more severe or more amenable to mitigation. Much of the text is taken up with examples of different risks, many of them non-catastrophic in the existential sense. The chapter by Posner came closest to the discussion I wanted, so I am reading his book on catastrophes next.
Shelves: really-deep-thinking, environment, futurism, nonfiction, nonfiction-apocalyptic, to-study, at-sfsu. This seems unlikely to be the book to answer my questions, but perhaps I'll give it a look-see just in case.

Apr 06, Vivienne DiFiore rated it liked it. I read this book chopped and screwed. First through PDFs of chapters out of order.
Then the rest of the book that I missed. It is a collection of works dealing with collapse. I will here irresponsibly speak of the book as a whole, maybe coming back to touch it up and deal with the pieces individually, but maybe not. What stands out to me is the attention paid to thinking through AI and viral collapse.
And thinking through these ideas generally. I, at first, was overly annoyed by the clear "scifi" worldview of the writers. But a wise person pointed out to me that these nerds are the only people that are going to think through these things and do the work to get to the risk factors and so on. I am less interested in risks on the astronomic level, such as being wiped out by a pulsar or a black hole randomly appearing. Let alone going to the lengths to mitigate such risks, as those measures seem to be impotent and thus worthless.
The means of mitigation also seem undesirable. I am skeptical of the idea that the scientific community or government will get their shit together enough to do something for the common good. They can't even address the reproducibility crisis that affects most scientific studies today.
The risk of totalitarianism is also presented: a totalitarianism that might indeed arise from trying to have a world government that knows what's best for its citizens and operates with a catastrophe-prevention level of authority. Transhumanists and adjacent nerds who talk about existential risks and downplay climate change don't pay enough heed to the fact that climate change's undesirable effects are on the very near horizon. I would however note that people seem to get stuck on the climate issue, and it has been greatly politicized and recuperated.
This book helped me to think of neat scifi stories, but not only that: it helped me think about how I might approach risks they didn't even mention.
What comes of thinking through risk in an "on demand" economy where economic collapse can happen in days? What comes of thinking through living in a food system where water sources for the majority of the food supply are drying up or soil depletion is widespread? What comes of thinking through the storage and creation of pandemic-capable viruses on a planet with pandemic-friendly infrastructure, under the care of scientists who consistently fail their safety inspections?
What comes of thinking through the fact that self-learning AI are already on the internet and what they are doing has already gone beyond the comprehension of their makers? Even writing these prompts, the work put into the writings in this book is painfully evident. Maybe one day I'll sit down and write down my thinking through all this, but for now I'll settle for this. Sometimes I think about human extinction like I do my own death.
A sad inevitability. But why is death sad? An end to a desirable experience? I would say perhaps sometimes death is preferable to some existences. At what point does this come true of a civilization or group of beings? I certainly would cling to life. I know many do, even in horrid conditions. I would also like to trouble the conflation of "humanity" with civilization, or even identifying the march of progress as "our project," even if it becomes a "post-human" one. I would say that I am on the side of the earth generally.
I would say that I am on the side of life generally. I would even say I am on the side of humans generally. And thinking it through I would like to trouble the centering of "humanity" as a genetic species. I am more interested in "what bodies can do" than their species. But this seems to border on the tangential so I'll not go on. The meta-element of this book was also interesting to me. The discussion of how we do our threat and risk assessment as well as our biases was indeed critically thought out and has definitely been incorporated into my thinking.
Jul 13, Aldwin Susantio added it. The book discusses many past global catastrophes (diseases, super-eruptions, nuclear war) and predicts potential future ones (the end of the solar system, artificial superintelligence). This book is written by many experts and covers technical and non-technical issues regarding global catastrophes.