Phyl-Undhu: Abstract Horror, Exterminator

by Nick Land


  §202. If we could clearly envision the calamity that awaited us, it would be an object of terror. Instead, it is a shapeless threat, ‘Outside’ only in the abstract sense (encompassing the negative immensity of everything that we cannot grasp). It could be anywhere, from our genes or ecological dynamics, to the hidden laws of technological evolution, or the hostile vastnesses between the stars. We know only that, in strict proportion to the vitality of the cosmos, the probability of its existence advances towards inevitability, and that for us it means supreme ill.

  Ontological density without identifiable form is abstract horror itself. As the Great Filter drifts inexorably, from a challenge that we might imaginably have already overcome, to an encounter we ever more fatalistically expect, horrorism is thickened by statistical-cosmological vindication. The unknown condenses into a shapeless, predatory thing. Through our techno-scientific sensors and calculations, the Shadow mutters to us, and probability insists that we shall meet it soon.

  §203. Gnon – known to some depraved cults as ‘The Great Crab-God’ – is harsh, and when formulated with rigorous skepticism, necessarily real. Yet this pincering cancerous abomination is laughter and love, in comparison to the shadow-buried horror which lurks behind it. We now understand that the silence of the galaxies is a message of ultimate ominousness. A thing there is, of incomprehensible power, which takes intelligent life for its prey.

  §204. Unfriendly Artificial Intelligence panic is a distraction from this Thing. Unless the most preposterous paperclipper scenarios are entertained, Singularity cannot matter to it (as even paperclipper-central agrees). The silence of the galaxies is not biased to organic life – there is no intelligent signal from anything. The first sentient event for any true AI – friendly or unfriendly – would be the soul-scouring cosmic horror of intellectual encounter with the Great Filter. (If we want an alliance with Pythia, this would make a good topic of conversation.) The same consideration applies to all techno-positive X-risks. Understood from the perspective of Great Filter contemplation, this sort of thing is a trigger for raw terror.

  §205. The Great Filter does not merely hunt and harm, it exterminates. It is an absolute threat. The technical civilizations which it aborts, or later slays, are not badly wounded, but eradicated, or at least crippled so fundamentally that they are never heard of again. Whatever this utter ruin is, it happens every single time. The mute scream from the stars says that nothing has ever escaped it. Its kill-performance is flawless. Tech-Civilization death sentence with probability ~1.

  §206. The thread of hope, which would put the Exterminator behind us, is highly science-sensitive. As our knowledge has increased, it has steadily attenuated. This is an empirical matter (without a priori necessity). Life could have been complicated, chemically or thermically highly-demanding, even resiliently mysterious. In fact it is comparatively simple, cosmically cheap, physically predictable. Planets could have been rare (they are super-abundant). Intelligence could have presented peculiar evolutionary challenges, but there are no signs that it does. The scientific trend is to futurize the Exterminator. (This is very bad.)

  §207. Objections to the Great Filter cannot be taken seriously unless they address the perfection of cosmic silence. Some extremely interesting Fermi Paradox explanations have the same problem (civilizations black-hole into simulations, for instance). Unless 100% signal annihilation is accounted for, the challenge is not being met.

  §208. If the Great Filter finds mythological expression in the hunter, it is only in a specific sense – although an anthropologically realistic one. It is the hunter that drives to extinction. The Exterminator.

  §209. We know that The Exterminator exists, but nothing at all about what it is. This makes it the archetype of horroristic ontology.

  §210. America’s Arch-Druid, John Michael Greer, muses on the topic of Ebola (in a typically luxuriant post, ultimately heading somewhere else): “According to the World Health Organization, the number of cases of Ebola in the current epidemic is doubling every twenty days, and could reach 1.4 million by the beginning of 2015. Let’s round down, and say that there are one million cases on January 1, 2015. Let’s also assume for the sake of the experiment that the doubling time stays the same. Assuming that nothing interrupts the continued spread of the virus, and cases continue to double every twenty days, in what month of what year will the total number of cases equal the human population of this planet? [...] … the steps that could keep Ebola from spreading to the rest of the Third World are not being taken. Unless massive resources are committed to that task soon – as in before the end of this year [2014] – the possibility exists that when the pandemic finally winds down a few years from now, two to three billion people could be dead. We need to consider the possibility that the peak of global population is no longer an abstraction set comfortably off somewhere in the future. It may be knocking at the future’s door right now, shaking with fever and dripping blood from its gums.”
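Greer's rhetorical question admits a direct calculation. A minimal sketch, assuming his stated figures (one million cases on January 1, 2015, doubling every twenty days) and a world population of roughly 7.2 billion (the approximate 2015 figure, an assumption not given in the quote):

```python
import math
from datetime import date, timedelta

# Greer's thought experiment: one million cases on January 1, 2015,
# doubling every twenty days, spreading without interruption.
initial_cases = 1_000_000
doubling_period_days = 20
world_population = 7_200_000_000  # assumed ~2015 figure

# Smallest n with 10^6 * 2^n >= 7.2 * 10^9.
doublings = math.ceil(math.log2(world_population / initial_cases))
days = doublings * doubling_period_days
endpoint = date(2015, 1, 1) + timedelta(days=days)

print(doublings, days, endpoint)  # prints: 13 260 2015-09-18
```

Under these assumptions the nominal case count overtakes the global population after thirteen doublings, around mid-September 2015 — the force of Greer's "what month of what year" question.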

  §211. At the time of writing, the eventual scale of the Ebola outbreak was a known unknown. A number of people between a few thousand and several billion would die, and an uncertain probability distribution could be attached to these figures – we know, at least approximately, where the question marks are. Before the present outbreak began, in December 2013 (in Guinea), Ebola was of course known to exist, but at that stage the occurrence of an outbreak – and not merely its course – was an unknown. Before the Ebola virus was scientifically identified (in 1976), the specific pathogen was an unknown member of a known class. With each step backwards, we advance in abstraction, towards the acknowledgement of threats of a ‘black swan’ type. Great Filter X-risk is a prominent model of such abstract threat.

  §212. Skepticism, as a positive or constructive undertaking, orients intelligence towards abstract potentials. Rather than insisting that unexpected occurrences need not be threats, it is theoretically preferable to subtilize the notion of threat, so that it encompasses even beneficial outcomes as abstract potentials. The unknown is itself threatening to timid animals, whose conditions of flourishing – or even bare survival – are naturally tenuous, under cosmic conditions where extinction is normal (perhaps overwhelmingly normal), and for whom unpredictable change, disrupting settled procedures, presents – at a minimum – some scarily indefinite probability of harm.

  §213. Humans aren’t good at pre-processing abstract threat. Consider Scott Alexander’s (extremely interesting) discussion of the Great Filter. The opening remarks are perfectly directed, moving from the specific to the general: “The Great Filter, remember, is the horror-genre-adaptation of Fermi’s Paradox. All of our calculations say that, in the infinite vastness of time and space, intelligent aliens should be very common. But we don’t see any of them. [...] Why not? [...] Well, the Great Filter. No [one] knows specifically what the Great Filter is, but generally it’s ‘that thing that blocks planets from growing spacefaring civilizations’.” As it develops, however, the post deliberately retreats from abstraction, into an enumeration of already-envisaged, and thus comparatively concrete menaces. After running through various candidates, it concludes: “Three of these four options – x-risk, Unfriendly AI, and alien exterminators – are very very bad for humanity. I think worry about this badness has been a lot of what’s driven interest in the Great Filter. I also think these are some of the least likely possible explanations, which means we should be less afraid of the Great Filter than is generally believed.” Yet a conclusion of almost exactly opposite tenor is merited. What has actually been demonstrated, if the arguments up to this point are accepted, is that the abstract threat of the Great Filter is significantly greater than has yet become conceivable. Our lucid nightmares are shown to fall short of it. The threat cannot be grasped as a known unknown.

  §214. While the Great Filter distills the conception of abstract threat, the problem itself is broader, and more quotidian. It is the highly-probable fact that we have yet to identify the greatest hazards, and this threat unawareness is a structural condition, rather than a contingent deficiency of attention. In Karl Popper’s terms (translated), abstract threat is the essence of history. It is the future, strictly understood. To gloss the Popperian argument: Philosophical understanding of science (in general) is immediately the understanding that any predictive history of science is an impossibility. Unless science is judged to be a factor of vanishing historical insignificance, the implications of this transcendental thesis are far-reaching. Yet the domain of abstract threat sprawls outwards, far more extensively even than this. “I know only that I do not know” Socrates is thought to have thought. The conception of abstract threat requires a slight adjustment: We know only that we do not know what we do not know. Unknown unknowns cosmically predominate. Our security is built upon sand. That is the sole sound conclusion.

  Notes

  Notes correspond to paragraphs. Numbers in square brackets designate URLs.

  #02. ‘Winter is coming’, perhaps the most widely-popularized apocalyptic meme of the early 21st century, is derived from the epic fantasy fiction of George R. R. Martin, and the HBO TV series based upon it.

  #05. The ‘AL’ of TotAL, qabbalistic key to the cross-coding between Hebrew, Greek, and English gematrias, unlocks much in this work, for those inclined to explore it. A partial exposition is forthcoming in a subsequent work (The Puzzle House, 2015). The Ovid reference is to Heroides II, available online in English translation [01] and the original Latin [02].

  #07. Jack’s rough cryptographic calculations are based on the equation 36^9 = 101 559 956 668 416. This is a number that digitally reduces to 64 (on its way to unity), and encompasses the number of the beast, but neither of these remarkable – and contextually intriguing – characteristics is of crucial significance for what follows.

  #09. For more on the Great Filter, see Appendix 2. The Doomsday Argument or ‘Carter Catastrophe’ was first rigorously formulated by astrophysicist Brandon Carter in 1983 [03]. ‘Alexander Scott’ has no relation whatsoever to Scott Alexander [04] beyond the transient coincidence of one argument.

  #11. The conceit of a relic space-elevator as an icon of regressive time is indebted to Alastair Reynolds’ science fiction masterpiece Terminal World [05]. Reynolds includes an episode in which a space-elevator cable is severed by a nuclear blast in his Century Rain. (The escalated Ballardianism of this figure is also notable.) For an example of the intersection between Great Filter and Simulation arguments, see [06].

  #12. The Tower of Babel (1595) by Marten van Valckenborch the Elder is widely reproduced online. The original is housed at the Gemäldegalerie Alte Meister, Staatliche Kunstsammlungen Dresden. For the Evil Tower, see Aleister Crowley’s The Book of Thoth, on Atu XVI, which he associates with Hexagram 23 of the Zhouyi, ‘Splitting apart’, the Hebrew letter פ (Pe, the mouth) and the chaos-god Dis.

  #14. “Space is for the Cephalopods … It never was meant for us.” Stephen Baxter’s Manifold Time, p. 443.

  Hellraiser III, Hell on Earth (1992, [07]) contains the exchange:

  “Jesus Christ!”

  “Not quite.”

  #15. The Yeras proceed:

  Scale-0 = 1 day

  Scale-1 = 3 days

  Scale-2 = 9 days

  Scale-3 = 27 days

  Scale-4 = 81 days

  Scale-5 = 243 days

  Scale-6 = 729 days, ~2 years

  Scale-7 = 2187 days, ~6 years

  Scale-8 = 6561 days, ~18 years

  Scale-9 = 19683 days, ~54 years

  Scale-10 = 59049 days, ~162 years

  Scale-11 = 177147 days, ~486 years

  Scale-12 = 531441 days, ~1458 years

  Scale-13 = 1594323 days, ~4374 years

  Scale-14 = 4782969 days, ~13122 years

  Scale-15 = 14348907 days, ~39366 years

  Scale-16 = 43046721 days, ~118098 years

  Scale-17 = 129140163 days, ~354294 years

  Scale-18 = 387420489 days, ~1062882 years

  Scale-19 = 1162261467 days, ~3188646 years

  Scale-20 = 3486784401 days, ~9565938 years

  Each ‘successive’ Aeon is enfolded into the last as its final (third) part. A deepening of history is indistinguishable from a dilation or generalization of time. For a fuller explanation of the Yeras in their application to terrestrial time, see Calendric Dominion [08].
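The Yera scales above follow a simple rule: Scale-n spans 3^n days, and the listed year figures correspond to a 364.5-day year (Scale-6 = 729 days = exactly two such years) — a convention inferred from the table rather than stated in the text. A minimal sketch, under that assumption:

```python
# Each Yera scale spans 3**n days; the year figures match a 364.5-day
# year exactly (an assumed convention inferred from the table).
DAYS_PER_YEAR = 364.5

def yera(n: int) -> tuple[int, float]:
    """Return (days, years) for Scale-n."""
    days = 3 ** n
    return days, days / DAYS_PER_YEAR

for n in range(6, 21):
    days, years = yera(n)
    print(f"Scale-{n} = {days} days, ~{years:.0f} years")
```

Each successive scale is exactly triple the last, which is why every Aeon can be enfolded into its successor as a final third part.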

  #16. The theory of catabolic collapse is rigorously formulated by American Arch-Druid John Michael Greer, see especially [09].

  #100. For a version of this appendix with active links, see [10].

  #103. Lovecraft’s text, online [11].

  #105. The Abyss [12].

  #108. The Fly [13].

  #109. The Thing [14]; the Alien franchise [15]; and the Terminator franchise [16].

  #200. For a version of this appendix with active links, see [17, 18, 19]. An excellent recent exposition of the Great Filter concept by Robin Hanson, for TEDxLimassol 2014, can be found at [20]. “Something out there is killing everything, and you’re next. ...”

  Sources

  Baxter, Stephen, Manifold Time (Ballantine, 2000)

  Reynolds, Alastair, Century Rain (Orion, 2004), Terminal World (Orion, 2010)

  URLs

  [01] http://www.poetryintranslation.com/PITBR/Latin/Heroideshome.htm

  [02] http://www.thelatinlibrary.com/ovid/ovid.her2.shtml

  [03] http://en.wikipedia.org/wiki/Doomsday_argument

  [04] http://slatestarcodex.com/

  [05] http://en.wikipedia.org/wiki/Terminal_World

  [06] http://www.reddit.com/r/Futurology/comments/2nui89/is_the_simulation_argument_the_best_answer_to_the/

  [07] http://www.imdb.com/title/tt0104409/

  [08] http://www.amazon.com/Calendric-Dominion-Urban-Future-Pamphlets-ebook/dp/B00HNXD4XW

  [09] http://ecoshock.org/transcripts/greer_on_collapse.pdf

  [10] http://www.xenosystems.net/abstract-horror-part-1/

  [11] http://www.hplovecraft.com/writings/texts/essays/nwwf.aspx

  [12] http://www.imdb.com/title/tt0096754/

  [13] http://www.imdb.com/title/tt0091064/

  [14] http://en.wikipedia.org/wiki/The_Thing_%281982_film%29

  [15] http://en.wikipedia.org/wiki/Alien_%28franchise%29

  [16] http://en.wikipedia.org/wiki/Terminator_%28franchise%29

  [17] http://www.xenosystems.net/abstract-horror-note-1/

  [18] http://www.xenosystems.net/exterminator/

  [19] http://www.xenosystems.net/abstract-threat/

  [20] https://www.youtube.com/watch?v=AGaD8XILWFc
