The AI Apocalypse Plan, v1.0 — Vigilance Is Merited


Many of you have asked me, in the event of a sudden (or even a gradual) AI Singularity, “What should I do?” I’ve honestly struggled mightily with that very question. As a first attempt, I present to you: The AI Apocalypse Plan, version 1.0.

The seeming “inevitability” of the tsunami can make us feel, by turns, excited, terrified, apprehensive, and sometimes even resigned. But we always have a choice.

We’ll start with the outline, broken into 5 key components:


2023.03.20 Mon — 16:28 [TS]


    1. Value trust & love relationships
      1. This is what will save you
      2. A community of trust & mutual support

    2. Get right with your God
      1. Have a bedrock foundation of spiritual confidence
      2. Believe in grace and salvation
      3. Pray

    3. Be in top physical condition
      1. Think about how you would move through the world without technological & transportation infrastructure (walking or bicycling wherever you need to go; no planes or cars)

    4. Maintain vigilance
      1. Be aware of what’s going on
        1. News sources
        2. Stock market
        3. Local environment
      2. Be aware of the value exchanges you make with the Digital Gods
        1. You get “free” services (Gmail, social media, Siri, Alexa)
        2. You “pay” for those services with your personal information

    5. Prepare
      1. Prepper kits (a separate list):
        1. Personal kit
        2. Car kit
        3. Home kit
      2. Finances
        1. Know where you would stand if your bank accounts were liquidated
        2. Cash equivalents for the apocalypse:
          1. Gold / precious metals / jewelry
          2. Weapons
          3. Tools
          4. Cigarettes / pharmaceuticals
          5. Cryptocurrency
          6. Cash (an assortment of denominations)
      3. Basic survival skills: how to thrive even without electricity
      4. Basic medical skills: how to heal when there is no ER or hospital
      5. Know how to drive

I know this AI Apocalypse Plan sounds a little bizarre, and yes, there is a bit of a prepper element to this, but:

When 10% of the top 1,000 AI scientists on Earth think that AI development will most likely lead to the total extinction of the human race within the next 50 years, well… it pays to have a little insurance.

That — human extinction at the hands of AI — is obviously the “worst case scenario.” And there are plenty of possible utopian scenarios. But in between those two unlikely polarities are a bunch of options, half of which are dystopian.

So, again, given the enormity of the changes we are about to face, and the very non-zero chance of catastrophe, it pays to have some hedges in place. Thus, the “prepper” edge to the AI Apocalypse Plan. In that vein…

…may I recommend?

by Neil Strauss.

Read it.

That’s a start.

We’ll flesh this out in the days and months ahead.


