Post by creature386 on Jul 15, 2017 20:17:27 GMT 5
How (if at all) do you think humanity is going to go extinct?
I will give a few options (you can pick multiple options, since some may reinforce others). Many of the options are rather exotic, so I'll explain them. I'm sure many of us have wondered how humanity could go extinct. If you are hearing most of the ideas here for the first time, simply post the ones you had before viewing the thread. I don't think all (or even most) are plausible, but I try to cover a broad scope, even if some suggestions come from crackpots.
-Environmental destruction:
The planet gets warmer, resources run out, the air gets polluted and we cannot adapt to it.
-A global thermo-nuclear war
-A volcanic super-eruption
-An asteroid impact
-A nearby supernova explosion
-Decadence:
We could become too spoiled to survive. We could spend all day in virtual realities and not notice when one of the other entries of the list happens and kills us.
-Boredom:
Even if we become otherwise invincible, we could one day grow too bored to live and simply all commit suicide.
-Evil Jews:
Under some conspiracy theories, there is a group of sinister conspirators who wish to create a New World Order and destroy the entire world population with chemtrails for some nefarious end. If they die, too (after all, they breathe chemtrail-polluted air), it'd be the end of humanity.
-A pandemic disease:
Some disease which we can neither cure nor develop an immunity to kills us. The entry below could be one source.
-Biotechnology:
Perhaps in combination with war or terrorism, someone engineers a bioweapon that kills humanity.
Errant GMOs also fall in this category.
-Grey goo:
Consider nanobots that could manipulate matter at an atomic scale (we have them in our bodies, they're called ribosomes). They could re-assemble the atoms in sand to form computer chips, for example. And they could re-assemble atoms to form another nanobot. If these self-replicating robots get out of control and are not detected early enough, they could wipe out humanity. They could be created by accident, by an unfriendly AI or by aliens.
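The danger here is exponential growth, and the arithmetic is quick to sketch. In the toy calculation below, the per-bot mass, the biomass total and the 100-second replication time are all illustrative assumptions, not real figures:

```python
import math

# All numbers below are illustrative assumptions for a back-of-envelope estimate.
BOT_MASS_KG = 1e-18       # assumed mass of a single nanobot
BIOMASS_KG = 5.5e14       # rough total mass of Earth's living biomass
DOUBLING_TIME_S = 100     # assumed time for one bot to build a copy of itself

# Starting from a single bot, how many doublings until the swarm
# outweighs the entire biosphere?
doublings = math.ceil(math.log2(BIOMASS_KG / BOT_MASS_KG))
hours = doublings * DOUBLING_TIME_S / 3600

print(doublings, round(hours, 1))  # 109 doublings, ~3.0 hours
```

Under these (made-up) numbers, one runaway replicator outweighs the biosphere in about three hours, which is why the scenario hinges on detecting it early.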
-A failed physics experiment:
Physicists could accidentally create a black hole or anything else that could destroy the world.
-An unfriendly AI:
Artificial Intelligences can do many things. Let's assume one day an AI appears which can do the following: Improve itself.
In case that happens, it could use its new intelligence to improve itself even further and so on, until its intelligence becomes godlike and its technology as incomprehensible to us as ours is to chimps. Now, that might be great. If it's friendly, it could develop a heaven on Earth and even protect us from all the other existential risks.
But let's assume it isn't. Let's assume it has been programmed to build as many paperclips as possible. At first, it would search for paperclips on the ground. As it grows more intelligent and learns more about the world, it would find a job and earn money to buy paperclips and eventually build some by itself. Once it becomes sufficiently advanced, it could convert all raw matter in the solar system (including us) into paperclips.
-Alien invasion:
Any aliens that have mastered interstellar travel would have technology indistinguishable from magic. They could be even more powerful than a godlike AI. Maybe they even have several such AIs in their society? If they are hostile, resistance is futile and we will be assimilated.
-The death of the Sun:
In 5 billion years, the Sun will grow into a red giant. Earth will become too hot to support life. The oceans would even evaporate well before that. If humanity doesn't master interplanetary travel, we are doomed. If we never master interstellar travel, we die when the Sun becomes too dim to give us energy.
-The heat death of the universe:
Eons from now, all the stars in the universe will run out of negentropy and fade away. Their remnants will fall into black holes, like the one at the center of the Milky Way. Similarly, the elements will decay, leaving us with few elements that could compose our bodies or whatever media our minds will be stored in. 10^100 years in the future, the last black hole (and with it, the last possible energy source) will have evaporated, leaving us with nothing but a sea of quarks and leptons.
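The 10^100-year figure can be sanity-checked with the standard Hawking evaporation time for a Schwarzschild black hole, t ≈ 5120·π·G²·M³/(ħc⁴). A quick sketch (the 10^11-solar-mass example, roughly the largest black holes known, is my choice for illustration, not a figure from the post):

```python
import math

# SI constants
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34        # reduced Planck constant, J s
C = 2.998e8             # speed of light, m/s
SOLAR_MASS = 1.989e30   # kg
YEAR_S = 3.156e7        # seconds per year

def hawking_evaporation_years(mass_kg):
    """Evaporation time of a Schwarzschild black hole: 5120*pi*G^2*M^3/(hbar*c^4)."""
    t_s = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_s / YEAR_S

# The most massive known black holes, around 1e11 solar masses,
# take on the order of 10^100 years to evaporate.
t = hawking_evaporation_years(1e11 * SOLAR_MASS)
print(f"~10^{round(math.log10(t))} years")  # ~10^100 years
```

Since the evaporation time scales with the cube of the mass, the biggest black holes dominate, and they land right at the oft-quoted 10^100 years.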
-Vacuum collapse:
Another scenario that could terminate the universe. Consider the possibility of living in a false vacuum. If our cosmos is a false vacuum, it has the possibility of collapsing into a true vacuum. As this vacuum bubble spreads out, it'd destroy everything in its path. It would do so at lightspeed, so there is no way we could detect it and do anything against it. It could happen any second, even as you're reading this text. However, whether we are living in one is very controversial. It is possible that the collapse already happened shortly after the Big Bang, during the inflationary era.
From the paper where this was proposed (adsabs.harvard.edu/abs/1980PhRvD..21.3305C):
The possibility that we are living in a false vacuum has never been a cheering one to contemplate. Vacuum decay is the ultimate ecological catastrophe; in the new vacuum there are new constants of nature; after vacuum decay, not only is life as we know it impossible, so is chemistry as we know it. However, one could always draw stoic comfort from the possibility that perhaps in the course of time the new vacuum would sustain, if not life as we know it, at least some structures capable of knowing joy. This possibility has now been eliminated.
-Simulation termination:
Assuming our whole universe is a computer simulation, its creators could simply shut it down.
-AI failure:
If we develop strong AI and upload our minds into computers, we would strongly depend on computers. Some virus or other software failure could spell the end of human society.
-Supernatural apocalypse:
Assuming evil gods/supernatural entities exist, they could terminate humanity. Perhaps they cooperate with the NWO, which is trying to hide them.
-Humanity won't go extinct:
Michio Kaku thinks the heat death of the universe could be avoided by creating a new universe and escaping into it. From his book The Physics of the Impossible:
[An] advanced civilization facing the ultimate death of the universe could contemplate taking the ultimate journey to another universe. For these beings the choice would be to freeze to death or leave. The laws of physics are a death warrant for all intelligent life, but there is an escape clause in those laws. Such a civilization would have to harness the power of huge atom smashers and laser beams as large as a solar system or star cluster to concentrate enormous power at a single point in order to attain the fabled Planck energy. It is possible that doing so would be sufficient to open up a wormhole or gateway to another universe. A Type III civilization may use the colossal energy at their disposal to open a wormhole as it makes a journey to another universe, leaving our dying universe and starting over again.
If the heat death of the universe is avoidable, humanity could become truly immortal. As with the vacuum collapse, it is disputed whether such a feat is even physically possible.
-Something else entirely could happen.