Human extinction

Nuclear war is an often-predicted cause of human extinction

Human extinction is the hypothetical complete end of the human species. It may result either from natural causes or from anthropogenic (human) causes, but the risks of extinction through natural disaster, such as a meteorite impact or large-scale volcanism, are generally considered comparatively low.[1] Anthropogenic human extinction is sometimes called omnicide.

Many possible scenarios of anthropogenic extinction have been proposed, such as climate change, global nuclear annihilation, biological warfare and ecological collapse. Some scenarios center on emerging technologies, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots. The probability of anthropogenic human extinction within the next hundred years is the topic of an active debate.

History[edit]

The study of human extinction arose relatively recently in human history. Ancient Western philosophers such as Plato, Aristotle, and Lucretius wrote of the end of humankind only as part of a cycle of renewal. Later philosophers such as Al-Ghazali, William of Ockham, and Gerolamo Cardano expanded the study of logic and probability and began discussing abstract possible worlds, including a world without humans. The notion that species can go extinct gained scientific acceptance during the Age of Enlightenment in the 17th and 18th centuries, and by 1800, Georges Cuvier had identified 23 extinct prehistoric species.[2]

In the 19th century, human extinction became a popular topic in science (e.g., Thomas Robert Malthus's An Essay on the Principle of Population) and fiction (e.g., Mary Shelley's The Last Man). In 1863, a few years after Charles Darwin published On the Origin of Species, William King proposed that Neanderthals were an extinct species of the genus Homo. At the turn of the 20th century, Russian cosmism, a precursor to modern transhumanism, advocated avoiding humanity's extinction by colonizing space.[2]

The large-scale destruction of World War I and the development of nuclear weapons at the end of World War II demonstrated that omnicide (human extinction caused by human actions) was not only possible, but plausible. In 1950, Leo Szilard suggested it was technologically feasible to build a cobalt bomb that could render the planet unlivable. Rachel Carson's 1962 Silent Spring raised awareness of environmental catastrophe. In 1983, Brandon Carter proposed the Doomsday argument, which used Bayesian probability to predict the total number of humans that will ever exist. By the beginning of the 21st century, the study of "existential risks" that threaten humankind became "a growing field of rigorous scientific inquiry".[2]

Causes[edit]

Ecological[edit]

  • A common belief is that climate change could result in human extinction.[3][4] In November 2017, a statement by 15,364 scientists from 184 countries indicated that increasing levels of greenhouse gases from fossil fuel use, human population growth, deforestation, and overuse of land for agricultural production, particularly the farming of ruminants for meat, are trending in ways that forecast an increase in human misery over the coming decades.[5] An October 2017 report published in The Lancet stated that toxic air, water, soils, and workplaces were collectively responsible for nine million deaths worldwide in 2015, particularly air pollution, which was linked to deaths through increased susceptibility to non-infectious diseases such as heart disease, stroke, and lung cancer.[6] The report warned that the pollution crisis was exceeding "the envelope on the amount of pollution the Earth can carry" and "threatens the continuing survival of human societies".[6] Carl Sagan and others have raised the prospect of extreme runaway global warming turning Earth into an uninhabitable Venus-like planet. According to Kelsey Piper of Vox, some scholars argue that much of the world would become uninhabitable under severe global warming, but even these scholars do not tend to argue that it would lead to complete human extinction. All IPCC scenarios, including the most pessimistic, predict temperatures compatible with human survival, and the question of human extinction under "unlikely" outlier models is not generally addressed by the scientific literature.[7] FactCheck.org judges that climate change does not pose an established "existential risk", stating: "Scientists agree climate change does pose a threat to humans and ecosystems, but they do not envision that climate change will obliterate all people from the planet."[8][9] On a much longer time scale, natural shifts such as Milankovitch cycles (Quaternary climatic oscillations) could create unknown climate variability and change.[10]
  • A pandemic[11] involving one or more viruses, prions, or antibiotic-resistant bacteria. Past pandemics include the 1918 Spanish flu outbreak, estimated to have killed 3–5% of the global population, the 14th-century Eurasian Black Death pandemic, and the various European viruses that decimated indigenous American populations. A deadly pandemic restricted to humans alone would be self-limiting, as its mortality would reduce the density of its target population; a pathogen with a broad host range in multiple species, however, could eventually reach even isolated human populations.[12] U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity", if left unchecked, is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction.[13]
  • Human activity has triggered an extinction event often referred to as the sixth "mass extinction".[14][15][16] The 2019 Global Assessment Report on Biodiversity and Ecosystem Services, published by the United Nations' Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, asserts that roughly one million species of plants and animals face extinction from human impacts such as expanding land use for industrial agriculture and livestock rearing, along with overfishing.[17][18][19] A 1997 assessment states that over a third of Earth's land has been modified by humans, that atmospheric carbon dioxide has increased around 30 percent, that humans are the dominant source of nitrogen fixation, that humans control most of the Earth's accessible surface fresh water, and that species extinction rates may be over a hundred times faster than normal.[20]
  • Overpopulation: The Global Footprint Network estimates that current activity uses resources twice as fast as they can be naturally replenished, and that growing human population and increased consumption pose the risk of resource depletion and a concomitant population crash.[21] Evidence suggests birth rates may be rising in the 21st century in the developed world.[22] Projections vary; researcher Hans Rosling has projected population growth to start to plateau around 11 billion, and then to slowly grow or possibly even shrink thereafter.[23] A 2014 study published in Science asserts that the human population will grow to around 11 billion by 2100 and that growth will continue into the next century.[24]
  • Population decline through a preference for fewer children.[25] If developing-world demographics are assumed to become developed-world demographics, and if the latter are extrapolated, some projections suggest an extinction before the year 3000. John A. Leslie estimates that if the reproduction rate drops to the German or Japanese level, the extinction date will be 2400.[a] However, some models suggest the demographic transition may reverse itself due to evolutionary biology.[22][26] (An illustrative decay calculation follows this list.)
  • A supervolcanic eruption[27]
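
The order of magnitude of the population-decline extrapolations above can be sketched with a toy exponential-decay model; the 30-year generation length, the fertility figures, and the minimum-viable-population threshold used here are illustrative assumptions, not Leslie's actual calculation. If each generation multiplies the population by r = TFR/2.1 (the ratio of the total fertility rate to the replacement rate), a fertility rate near the German level of about 1.4 gives r ≈ 0.67, and a population of 8 × 10^9 falls below a minimum viable population of order 10^3 after

    \[ n = \frac{\ln\left(8 \times 10^{9} / 10^{3}\right)}{\ln(1/0.67)} \approx 40 \ \text{generations} \approx 1200 \ \text{years}, \]

which lands in the same broad era as the extinction dates quoted above.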

Technological[edit]

Some scenarios involve extinction as a result of the effects or use of totally new technologies. Scenarios include:

  • Nuclear[28] and biological[29] weapons, whether used in war or terrorism, could result in human extinction.[30] Some fear that a hypothetical World War III could cause the annihilation of humankind, perhaps through a resulting nuclear winter, as some experts have hypothesized.[31] Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and development of facilities to mitigate disease outbreaks (e.g., better and/or more widely distributed vaccines).[32]
  • The creators of a superintelligent entity could inadvertently give it goals that lead it to annihilate the human race.[33][34] A survey of AI experts estimated that the chance of human-level machine intelligence having an "extremely bad (e.g., human extinction)" long-term effect on humanity is 5%.[35]
  • Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).[36] Chris Phoenix and Mike Treder classify catastrophic risks posed by nanotechnology into three categories: (1) from augmenting the development of other technologies such as AI and biotechnology; (2) by enabling the mass production of potentially dangerous products that create risk dynamics (such as arms races) depending on how they are used; and (3) from uncontrolled self-perpetuating processes with destructive effects. Several researchers say the bulk of the risk from nanotechnology comes from its potential to lead to war, arms races, and destructive global government.[37]
  • Creation of a micro black hole on Earth during the course of a scientific experiment, or other unlikely scientific accidents in high-energy physics research, such as a vacuum phase transition or strangelet incident.[38] There were worries concerning the Large Hadron Collider at CERN, over fears that colliding protons at near the speed of light would create a black hole, but it has been pointed out that much more energetic collisions occur all the time in Earth's atmosphere.[39][40][41]
  • Some scenarios envision that humans could use genetic engineering or technological modifications to split into normal humans and a new species – posthumans.[42][43][44][45][46][47][48][49] Such a species could be fundamentally different from any previous life form on Earth, e.g. by merging humans with technological systems.[50] Such scenarios assess the risk that the "old" human species will be outcompeted and driven to extinction by the new, posthuman entity.[51]

Extraterrestrial[edit]

  • A geological or cosmological disaster such as an impact event involving a near-Earth object (NEO),[52] which can pose an absolute threat to the survival of living species.[53] A single extraterrestrial event (asteroid or comet impact)[54] can lead to widespread species extinctions. However, none of the large "dinosaur-killer" asteroids known to Spaceguard pose a near-term threat of collision with Earth.[55]
  • Supernovae, gamma-ray bursts, solar flares, and cosmic rays, if strong enough, could be lethal to humans on Earth.[56][57][58]
  • The Earth will naturally become uninhabitable due to the Sun's stellar evolution within about a billion years.[59] The Sun's brightness is expected to increase as a result of a shortage of hydrogen, and the heating of its outer layers may cause the Earth's oceans to evaporate, leaving only minor forms of life.[60] Well before this time, the level of carbon dioxide in the atmosphere will be too low to support plant life, destroying the foundation of the food chains.[61] See Future of the Earth.
    About 7–8 billion years from now, if and after the Sun has become a red giant, the Earth will probably be engulfed by the expanding Sun and destroyed.[62][63]
    According to standard physics, over much larger timescales the entire universe will gradually become uninhabitable, resulting eventually in unavoidable human extinction associated with the heat death of the universe.[64][65]
  • Invasion by militarily superior extraterrestrials (see alien invasion)[66] – though often considered to be a scenario purely from the realm of science fiction, this possibility has been given serious consideration by professional SETI researchers, who conclude that it is unlikely.[b]

Probability[edit]

Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25% and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure.[1][69] A little more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. Leslie's argument is somewhat frequentist, based on the observation that human extinction has never been observed, but it requires subjective anthropic arguments.[70] Leslie also discusses the anthropic survivorship bias (which he calls an "observational selection" effect on page 139) and states that the a priori certainty of observing an "undisastrous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."[71]
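
The Bayesian form of the doomsday argument that Leslie champions can be illustrated with a toy calculation; the hypothesis sizes and the equal priors below are illustrative assumptions rather than figures from Leslie's book. Suppose "doom soon" means that 200 billion humans will ever exist and "doom late" means 200 trillion, each assigned a prior probability of 1/2. Under self-sampling, a birth rank R of roughly 100 billion has probability 1/(2 × 10^11) under "doom soon" but only 1/(2 × 10^14) under "doom late", so Bayes' theorem yields

    \[ P(\text{soon} \mid R) = \frac{\tfrac{1}{2} \cdot \tfrac{1}{2 \times 10^{11}}}{\tfrac{1}{2} \cdot \tfrac{1}{2 \times 10^{11}} + \tfrac{1}{2} \cdot \tfrac{1}{2 \times 10^{14}}} = \frac{1000}{1001} \approx 0.999, \]

a dramatic posterior shift toward early extinction from an initially even prior.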

Some scholars believe that certain scenarios such as global thermonuclear war would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation.[72] In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war.[69] Any number of events could lead to a massive loss of human life, but if the most resilient last few humans (see minimum viable population) are unlikely also to die off, then that particular human extinction scenario may not seem credible.[73]

Prevention[edit]

Stephen Hawking advocated colonizing other planets within the solar system once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war.[74][75]

More economically, some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving global disaster.[69][72] Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.[69][76]

Psychology[edit]

Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:[1][77]

Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".

All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.[78]

Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."[79]

Ethics[edit]

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress.[1] Multiple scholars have argued based on the size of the "cosmic endowment" that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value. Some of the arguments run as follows:

  • Carl Sagan wrote in 1983: "If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill 'only' hundreds of millions of people. There are many other possible measures of the potential loss—including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise."[80]
  • Philosopher Derek Parfit in 1984 makes an anthropocentric utilitarian argument that, because all human lives have roughly equal intrinsic value no matter where in time or space they are born, the large number of lives potentially saved in the future should be multiplied by the percentage chance that an action will save them, yielding a large net benefit for even tiny reductions in existential risk.[81]
  • Humanity has a 95% probability of being extinct within 7,800,000 years, according to J. Richard Gott's formulation of the controversial Doomsday argument, which argues that we have probably already lived through half the duration of human history (a worked sketch of this estimate follows this list).
  • Philosopher Robert Adams in 1989 rejects Parfit's "impersonal" views, but speaks instead of a moral imperative for loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society – more just, more rewarding, and more peaceful... our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."[82]
  • Philosopher Nick Bostrom argues in 2013 that preference-satisfactionist, democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.[83]
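
Gott's estimate quoted in the list above can be reconstructed from his "delta t" reasoning; the figure of roughly 200,000 years for the past duration of Homo sapiens is the standard assumption in such presentations rather than a number drawn from this article's sources. If our moment of observation is randomly located within humanity's total lifespan, then with 95% confidence it falls within the middle 95% of that lifespan, so the future duration lies between t_past/39 and 39 × t_past. Taking the upper end:

    \[ t_{\text{future}} \leq 39 \, t_{\text{past}} \approx 39 \times 200{,}000 \ \text{years} = 7{,}800{,}000 \ \text{years}. \]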

Parfit argues that the size of the "cosmic endowment" can be calculated from the following argument: If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10^16 (or 10,000,000,000,000,000) human lives of normal duration.[81]:453–4 Bostrom goes further, stating that if the universe is empty, then the accessible universe can support at least 10^34 biological human life-years and, if some humans were uploaded onto computers, could even support the equivalent of 10^54 cybernetic human life-years.[1]
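
Parfit's 10^16 figure follows from simple arithmetic, sketched here as a rough check; the normal lifespan on the order of a century is an illustrative assumption:

    \[ \frac{10^{9} \ \text{years} \times 10^{9} \ \text{concurrent lives}}{10^{2} \ \text{years per life}} = 10^{16} \ \text{lives}. \]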

Placard against omnicide, at Extinction Rebellion (2018).

Some philosophers, among them the antinatalist David Benatar, animal rights activist Steven Best and anarchist Todd May, posit that human extinction would be a positive thing for the other organisms on the planet, and the planet itself, citing for example the omnicidal nature of human civilization.[84][85][86]

Research[edit]

Psychologist Steven Pinker calls existential risk a "useless category" that can distract from real threats such as climate change and nuclear war. In contrast, other researchers argue that both research on existential risk and other initiatives relating to it are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks. Bostrom's comparisons have been criticized as "high-handed".[87][88] As of 2020, the Biological Weapons Convention organization has an annual budget of US$1.4 million.[89]

Although existential risks are less manageable by individuals than, e.g., health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin, the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."[90]

Multiple organizations with the goal of helping prevent human extinction exist. Examples are the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute (est. 2011).

In fiction[edit]

Jean-Baptiste Cousin de Grainville's 1805 Le dernier homme (The Last Man), which depicts human extinction due to infertility, is considered the first modern apocalyptic novel and credited with launching the genre.[91] Other notable early works include Mary Shelley's 1826 The Last Man, depicting human extinction caused by a pandemic, and Olaf Stapledon's 1937 Star Maker, "a comparative study of omnicide".[2]

Some 21st century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[92][93] A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[94] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[95]

See also[edit]

Notes[edit]

  1. ^ For the "West Germany" extrapolation see: Leslie, 1996 (The End of the World) in the "War, Pollution, and disease" chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the sterilization-for-rupees programs in India, and surveys other infertility or falling-birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world. The world population is expected to start falling around 2150.
  2. ^ Former NASA consultant David Brin criticizes SETI optimism about alien intentions, stating "This is an area in which discussion is called for"[67] and arguing: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal."[68]

References[edit]

  1. ^ a b c d e Bostrom 2013.
  2. ^ a b c d Moynihan, Thomas (23 September 2020). "How Humanity Came To Contemplate Its Possible Extinction: A Timeline". The MIT Press Reader. Retrieved 11 October 2020.
  3. ^ Bostrom & Cirkovic 2011, pp. 15-16; Frame & Allen 2011; Leslie 1996, pp. 4-5.
  4. ^ "Majority of Britons believe climate-change could end human race: poll". Reuters. 1 May 2019. Retrieved 24 March 2020.
  5. ^ Ripple WJ, Wolf C, Newsome TM, Galetti M, Alamgir M, Crist E, Mahmoud MI, Laurance WF (13 November 2017). "World Scientists' Warning to Humanity: A Second Notice". BioScience. 67 (12): 1026–1028. doi:10.1093/biosci/bix125.
  6. ^ a b Carrington, Damian (20 October 2017). "Global pollution kills 9m a year and threatens 'survival of human societies'". The Guardian. London, UK. Retrieved 20 October 2017.
  7. ^ Piper, Kelsey (13 June 2019). "Is climate change an "existential threat" — or just a catastrophic one?". Vox. Retrieved 24 March 2020.
  8. ^ Shannon Osaka; Kate Yoder (3 March 2020). "Climate change is a catastrophe. But is it an 'existential threat'?". Grist. Retrieved 24 March 2020.
  9. ^ "FactChecking the October Democratic Debate". FactCheck.org. 16 October 2019. Retrieved 24 March 2020.
  10. ^ Barker, P. A. (2014). "Quaternary climatic instability in south-east Australia from a multi-proxy speleothem record". Journal of Quaternary Science. 29 (6): 589–596. Bibcode:2014JQS....29..589W. doi:10.1002/jqs.2734.
  11. ^ Bostrom & Cirkovic 2011, pp. 16-17; Kilbourne 2011; Bostrom 2002; Leslie 1996, p. 5.
  12. ^ Anders Sandberg; Milan M. Ćirković (9 September 2008). "How can we reduce the risk of human extinction?". Bulletin of the Atomic Scientists. Retrieved 28 January 2014.
  13. ^ Fiorill, Joe (29 July 2005). "Top U.S. Disease Fighters Warn of New Engineered Pathogens but Call Bioweapons Doomsday Unlikely". Global Security Newswire. Retrieved 10 September 2013.
  14. ^ Woodward, Aylin (2020). "18 signs we're in the middle of a 6th mass extinction". Business Insider. Retrieved 19 April 2020.
  15. ^ Ripple WJ, Wolf C, Newsome TM, Galetti M, Alamgir M, Crist E, Mahmoud MI, Laurance WF (13 November 2017). "World Scientists' Warning to Humanity: A Second Notice". BioScience. 67 (12): 1026–1028. doi:10.1093/biosci/bix125. Moreover, we have unleashed a mass extinction event, the sixth in roughly 540 million years, wherein many current life forms could be annihilated or at least committed to extinction by the end of this century.
  16. ^ Ceballos, Gerardo; Ehrlich, Paul R.; Raven, Peter H. (1 June 2020). "Vertebrates on the brink as indicators of biological annihilation and the sixth mass extinction". PNAS. 117 (24): 13596–13602. Bibcode:2020PNAS..11713596C. doi:10.1073/pnas.1922686117. PMC 7306750. PMID 32482862.
  17. ^ Vidal, John (15 March 2019). "The Rapid Decline Of The Natural World Is A Crisis Even Bigger Than Climate Change". The Huffington Post. Retrieved 30 May 2020.
  18. ^ Stokstad, Erik (5 May 2019). "Landmark analysis documents the alarming global decline of nature". Science. AAAS. Retrieved 30 May 2020.
  19. ^ Van Roekel, Annemieke (11 June 2019). "Earth's biota entering a sixth mass extinction, UN report claims". EuroScience. Retrieved 30 May 2020.
  20. ^ Vitousek, P. M., H. A. Mooney, J. Lubchenco, and J. M. Melillo. 1997. Human Domination of Earth's Ecosystems. Science 277 (5325): 494–499
  21. ^ Kilvert, Nick (25 July 2019). "How many humans can Earth sustain?". ABC News (Australian). Retrieved 19 April 2020.
  22. ^ a b Can we be sure the world's population will stop rising?, BBC News, 13 October 2012
  23. ^ Biello, David (2014). "World Should Prepare for 11 Billion or More People". Scientific American. Retrieved 19 April 2020.
  24. ^ Gerland, P.; Raftery, A. E.; Ev Ikova, H.; Li, N.; Gu, D.; Spoorenberg, T.; Alkema, L.; Fosdick, B. K.; Chunn, J.; Lalic, N.; Bay, G.; Buettner, T.; Heilig, G. K.; Wilmoth, J. (18 September 2014). "World population stabilization unlikely this century". Science. AAAS. 346 (6206): 234–7. Bibcode:2014Sci...346..234G. doi:10.1126/science.1257469. ISSN 1095-9203. PMC 4230924. PMID 25301627.
  25. ^ Leslie 1996, p. 6.
  26. ^ Burger, Oskar; DeLong, John P. (19 April 2016). "What if fertility decline is not permanent? The need for an evolutionarily informed approach to understanding low fertility". Philosophical Transactions of the Royal Society B: Biological Sciences. 371 (1692): 20150157. doi:10.1098/rstb.2015.0157. PMC 4822437. PMID 27022084.
  27. ^ Bostrom & Cirkovic 2011, pp. 13-14; Rampino 2011; Leslie 1996, p. 5.
  28. ^ Bostrom & Cirkovic 2011, pp. 20-22; Cirincione 2011; Ackerman & Potter 2011.
  29. ^ Bostrom & Cirkovic 2011, pp. 22-24; Nouri & Chyba 2011.
  30. ^ Bostrom 2002; Leslie 1996, p. 4.
  31. ^ Meyer, Robinson (29 April 2016). "You're More Likely to Die in a Human Extinction Event Than a Car Crash". The Atlantic. Retrieved 19 April 2020.
  32. ^ Ali Nouri; Christopher F. Chyba (2008). "Chapter 20: Biotechnology and biosecurity". In Bostrom, Nick; Cirkovic, Milan M. (eds.). Global Catastrophic Risks. Oxford University Press.
  33. ^ Bostrom & Cirkovic 2011, pp. 17-18; Yudkowsky 2011; Bostrom 2002; Leslie 1996, pp. 7-8.
  34. ^ Chalmers, David (2010). "The singularity: A philosophical analysis" (PDF). Journal of Consciousness Studies. 17: 9–10. Retrieved 17 August 2013.
  35. ^ Grace, Katja (2017). "When Will AI Exceed Human Performance? Evidence from AI Experts". Journal of Artificial Intelligence Research. arXiv:1705.08807. Bibcode:2017arXiv170508807G.
  36. ^ Bostrom & Cirkovic 2011, pp. 24-25; Phoenix & Treder 2011; Rees 2003; Bostrom 2002, s. 4.8; Leslie 1996, p. 7.
  37. ^ Phoenix & Treder 2011.
  38. ^ Bostrom & Cirkovic 2011, pp. 18-19; Wilczek 2011; Leslie 1996, pp. 8-9.
  39. ^ Rees 2003.
  40. ^ Matthews, Robert (28 August 1999). "A black hole ate my planet". New Scientist.
  41. ^ "Statement by the Executive Committee of the DPF on the Safety of Collisions at the Large Hadron Collider." Archived 24 October 2009 at the Wayback Machine
  42. ^ Leslie 1996, pp. 6-7.
  43. ^ "EmTech: Get Ready for a New Human Species". Retrieved 1 July 2016.
  44. ^ Hittinger, John (5 October 2015). Thomas Aquinas : teacher of humanity : proceedings from the first conference of the Pontifical Academy of St. Thomas Aquinas held in the United States of America. ISBN 978-1443875547. Retrieved 1 July 2016.
  45. ^ Gruskin, Sofia; Annas, George J.; Grodin, Michael A. (2005). Perspectives on Health and Human Rights. ISBN 9780415948067. Retrieved 1 July 2016.
  46. ^ Miccoli, Anthony (2010). Posthuman Suffering and the Technological Embrace. ISBN 9780739126332. Retrieved 1 July 2016.
  47. ^ "The Transhuman Future: Be More Than You Can Be". Retrieved 1 July 2016.
  48. ^ "WILL YOU JOIN THE TRANSHUMAN EVOLUTION?". Retrieved 1 July 2016.
  49. ^ "How humans are turning into a 'totally different species'". Retrieved 1 July 2016.
  50. ^ Warwick, Kevin (2004). I, Cyborg. University of Illinois Press. ISBN 978-0-252-07215-4.
  51. ^ Bostrom, Nick. "The future of human evolution." Death and anti-death: Two hundred years after Kant, fifty years after Turing (2004): 339-371.
  52. ^ Bostrom & Cirkovic 2011, pp. 14-15; Napier 2011; Bostrom 2002, s. 4.10; Leslie 1996, p. 5.
  53. ^ Perna . D; Barucci M.A; Fulchignoni .M (2013). "The Near-Earth Objects and Their Potential Threat To Our Planet". Astron Astrophys Rev. 21: 65. Bibcode:2013A&ARv..21...65P. doi:10.1007/s00159-013-0065-4. S2CID 122057584.
  54. ^ Alvarez, Luis W. (January 1983). "Experimental evidence that an asteroid impact led to the extinction of many species 65 million years ago". Proc. Natl. Acad. Sci. U.S.A. 80 (2): 627–42. Bibcode:1983PNAS...80..627A. doi:10.1073/pnas.80.2.627. PMC 393431. PMID 16593274.
  55. ^ "2012 Apocalypse FAQ: Why the World Won't End". Space.com. 2012. Retrieved 19 April 2020.
  56. ^ Bostrom & Cirkovic 2011, p. 15; Dar 2011; Leslie 1996, pp. 5-6.
  57. ^ Kluger, Jeffrey (21 December 2012). "The Super-Duper, Planet-Frying, Exploding Star That's Not Going to Hurt Us, So Please Stop Worrying About It". Time Magazine. Retrieved 20 December 2015.
  58. ^ Tuthill, Peter. "WR 104: Technical Questions". Retrieved 20 December 2015.
  59. ^ Wolf, E. T.; Toon, O. B. (27 June 2015). "The evolution of habitable climates under the brightening Sun". Journal of Geophysical Research: Atmospheres. 120 (12): 5775–5794. Bibcode:2015JGRD..120.5775W. doi:10.1002/2015JD023302.
  60. ^ Balzani, Vincenzo; Armaroli, Nicola (2010). Energy for a Sustainable World: From the Oil Age to a Sun-Powered Future. John Wiley & Sons. p. 181. ISBN 978-3-527-63361-6.
  61. ^ Damian Carrington (21 February 2000). "Date set for desert Earth". BBC News. Retrieved 28 January 2014.
  62. ^ Clara Moskowitz (26 February 2008). "Earth's Final Sunset Predicted". space.com. Retrieved 28 January 2014.
  63. ^ Schröder, K. -P.; Connon Smith, R. (2008). "Distant future of the Sun and Earth revisited". Monthly Notices of the Royal Astronomical Society. 386 (1): 155–163. arXiv:0801.4031. Bibcode:2008MNRAS.386..155S. doi:10.1111/j.1365-2966.2008.13022.x. S2CID 10073988.
  64. ^ Leslie 1996, pp. 5-6.
  65. ^ "How humans might outlive Earth, the sun...and even the universe". NBC News. 2017. Retrieved 24 March 2020.
  66. ^ Leslie 1996, p. 9.
  67. ^ Wall, Mike (2015). "Should Humanity Try to Contact Alien Civilizations?". Space.com. Retrieved 20 April 2020.
  68. ^ See full text at SETIleague.org
  69. ^ a b c d Matheny, Jason G. "Reducing the risk of human extinction". Risk Analysis 27.5 (2007): 1335-1344.
  70. ^ Whitmire, Daniel P. (3 August 2017). "Implication of our technological species being first and early". International Journal of Astrobiology. 18 (2): 183–188. doi:10.1017/S1473550417000271.
  71. ^ Leslie 1996, p. 139.
  72. ^ a b Wells, Willard. Apocalypse when?. Praxis, 2009. ISBN 978-0387098364
  73. ^ Tonn, Bruce, and Donald MacGregor. "A singular chain of events". Futures 41.10 (2009): 706-714.
  74. ^ Malik, Tariq. "Stephen Hawking: Humanity Must Colonize Space to Survive". Retrieved 1 July 2016.
  75. ^ Shukman, David (19 January 2016). "Hawking: Humans at risk of lethal 'own goal'". BBC News. Retrieved 1 July 2016.
  76. ^ Hanson, Robin. "Catastrophe, social collapse, and human extinction". Global catastrophic risks 1 (2008): 357.
  77. ^ Yudkowsky, Eliezer. "Cognitive biases potentially affecting judgment of global risks". Global catastrophic risks 1 (2008): 86. p.114
  78. ^ "We're Underestimating the Risk of Human Extinction". The Atlantic. 6 March 2012. Retrieved 1 July 2016.
  79. ^ Wilson, Edward O. "Is Humanity Suicidal?". The New York Times Magazine, 30 May 1993.
  80. ^ Sagan, Carl (1983). "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs. 62 (2): 257–292. doi:10.2307/20041818. JSTOR 20041818.
  81. ^ a b Parfit, D. (1984) Reasons and Persons. Oxford: Clarendon Press.
  82. ^ Adams, Robert Merrihew (October 1989). "Should Ethics be More Impersonal? a Critical Notice of Derek Parfit, Reasons and Persons". The Philosophical Review. 98 (4): 439–484. doi:10.2307/2185115. JSTOR 2185115.
  83. ^ Bostrom 2013, pp. 23–24.
  84. ^ Benatar, David (2008). Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press. p. 224. ISBN 978-0199549269. Although there are many non-human species - especially carnivores - that also cause a lot of suffering, humans have the unfortunate distinction of being the most destructive and harmful species on earth. The amount of suffering in the world could be radically reduced if there were no more humans.
  85. ^ Best, Steven (2014). The Politics of Total Liberation: Revolution for the 21st Century. Palgrave Macmillan. p. 165. ISBN 978-1137471116. But considered from the standpoint of animals and the earth, the demise of humanity would be the best imaginable event possible, and the sooner the better. The extinction of Homo sapiens would remove the malignancy ravaging the planet, destroy a parasite consuming its host, shut down the killing machines, and allow the earth to regenerate while permitting new species to evolve.
  86. ^ May, Todd (17 December 2018). "Would Human Extinction Be a Tragedy?". The New York Times.
  87. ^ Kupferschmidt, Kai (11 January 2018). "Could science destroy the world? These scholars want to save us from a modern-day Frankenstein". Science | AAAS. Retrieved 20 April 2020.
  88. ^ "Oxford Institute Forecasts The Possible Doom Of Humanity". Popular Science. 2013. Retrieved 20 April 2020.
  89. ^ Toby Ord (2020). The precipice: Existential risk and the future of humanity. ISBN 9780316484893. The international body responsible for the continued prohibition of bioweapons (the Biological Weapons Convention) has an annual budget of $1.4 million - less than the average McDonald's restaurant
  90. ^ "Practical application" page 39 of the Princeton University paper: Philosophical Implications of Inflationary Cosmology Archived 12 May 2005 at the Wayback Machine
  91. ^ Wagar, W. Warren (2003). "Review of The Last Man, Jean-Baptiste François Xavier Cousin de Grainville". Utopian Studies. 14 (1): 178–180. ISSN 1045-991X. JSTOR 20718566.
  92. ^ "He imagines a world without people. But why?". The Boston Globe. 18 August 2007. Retrieved 20 July 2016.
  93. ^ Tucker, Neely (8 March 2008). "Depopulation Boom". The Washington Post. Retrieved 20 July 2016.
  94. ^ Barcella, Laura (2012). The end: 50 apocalyptic visions from pop culture that you should know about -- before it's too late. San Francisco, CA: Zest Books. ISBN 978-0982732250.
  95. ^ Dinello, Daniel (2005). Technophobia!: science fiction visions of posthuman technology (1st ed.). Austin: University of Texas press. ISBN 978-0-292-70986-7.

Sources[edit]

Further reading[edit]