[lbo-talk] Existential Risks

Charles A. Grimes cgrimes at rawbw.com
Wed Jan 31 14:58:47 PST 2007


``Existential risks...'' (see link) .d.

------------

I have to say this article made my morning. It was outrageously funny and at the same time utterly devastating. Thanks, Dwayne.

The two threats to the species I found wildest of all were, first, the idea that we exist in a long-term simulation program, which somebody could always turn off. The second was the idea that a human intelligence could be uploaded to a computer, which could in turn improve itself at an exponential rate and thereby become a threatening power of sufficient scope to wipe us out and leave a post-human world.

About the only thing left out of the article was an explanation of exactly why current human societies pose serious threats to the species.

The writer assumes we might be able to control most developments and modify them sufficiently to avoid some of the extinction scenarios (global warming is just one). But what he failed to do was analyze more exactly why, at the moment, there are absolutely no efforts, except talk, to do so.

The reason is that most well-developed technological societies are also undergoing completely uncontrolled development due to unregulated and unplanned economic infrastructures. Furthermore, the international super-structures that might provide the coordination, control, and regulation needed to avoid these doomsday scenarios are thoroughly committed to neoliberal and decentralized policies.

In other words, we will never be able to develop the global policies needed to contain and reverse global climate change or any other theoretically manageable global risk. The reason is our national and international policies that favor capital and privilege its needs over all others, no matter what those other needs might be.

So, I am pretty firmly convinced we are dead meat sooner or later, unless we overthrow capital once and for all.

CG


