Threats to consciousness

In my first post I mentioned threats to consciousness; here I set out what I believe those threats are. I don't consider myself a philosopher, psychologist or spiritual person. What I'm describing is based on a mish-mash of ideas I've picked up over the years.

By consciousness, I mean it as referred to in the Hard Problem of Consciousness.

Clathrate Gun by Lamerie

The first, and possibly most obvious, threat is to physical survival, in particular from climate change. Many here see climate change as a story developed for control purposes, and I keep an open mind as this is a significant possibility, but on balance I believe the clathrate gun has already been fired, and a number of feedback mechanisms are now under way with significant consequences. In particular, the availability of breathable air may become an issue. As we have already passed the point of no return, the efforts of the world's power structures are now focused on blinding the population at large to what is occurring.

However, I don't think this will wipe out society. The worst-case scenario I envisage has the world's population drop from billions to millions within the next generation, the survivors being those with the technology to cope with a more hostile natural climate. I wouldn't expect to survive, but those who do may eventually experience an improved quality of life. My best-case scenario is based on the greening of desert regions, due to increased atmospheric water vapor, tipping the balance of power between the first and third world, but this is almost pure fantasy with very little supporting evidence. There are other physical threats, including war - what they have in common is that there is very little we can do personally about them, but they do bring a sense of urgency.

The second threat is from without - the development of artificial intelligence threatens to render human life obsolete or irrelevant, and may itself be perceived as a threat. These threats are often framed in terms of the singularity and super-human intelligence, but I am more fearful of sub-human intelligence - the most obvious risk being autonomous weapons.

The third threat is from within - as we hand off more personal agency to machines, and soon develop enhancements to our own intelligence without a sound understanding of consciousness, we risk short-circuiting whatever mechanism has evolved to make the human brain a suitable environment for consciousness to arise. This is the most important threat to focus on, since it is through understanding consciousness and agency that we can better consider how to deal with the other threats facing us.
