The Transhumanist Agenda

Transhumanism is an international intellectual and cultural movement supporting the use of science and technology to improve human mental and physical characteristics and capacities.

So sayeth Wikipedia circa November 2010. It's not wrong, though it's not entirely accurate either. The philosophy of Transhumanism more precisely suggests the application of science and technology to eradicate human suffering and solve societal and physical issues that afflict us as a species and as individuals.

This is a branch of philosophy that fascinates me, and to my mind it feels a whole lot like a self-fulfilling prophecy. The whole purpose of technology is to overcome the limitations of our own minds and bodies and to make our lives better, greater and easier. This is clear enough with technology such as clothing and wheels, but it gets a little more abstract when you consider stock markets, economies, currency or smartphones. The driving principles are the same; merely the environment we are coping with has become less tangible.

So medicine is probably the poster child of the Transhumanist ideal: technology directly applied to fixing us. But there's more it could do, and more applications of other technologies we could use to radically change the human experience. This is where morality comes along with a sack full of concerns:

We could eliminate unhappiness. We could alter brain chemistry through whatever medium you prefer - chemical adjustment, genetic manipulation, an implant to stimulate some happy part of the brain - the particular science is not important, but the ramifications are huge. It sounds great: no more sad people. We all spend a significant portion of our lives in the pursuit of happiness or the evasion of unhappiness. But there's a reason we feel unhappy; it informs our behaviour, it warns us when conditions are adverse to our health (mental or physical). Unhappiness has a function.

If you need a better example: imagine eradicating the ability to feel pain. No more owees, but where is the cognitive feedback that informs and educates you that fire is a bad place to put your hand? The negatives of human experience are as important as, if not (in a survival-of-the-species sense) more important than, the positives.

So can we really say that solving all of our problems, eliminating all negative experience, is beneficial on an individual or societal level? We can all agree that needless suffering is a bad thing, but is some suffering necessary?

Nemesis: Motivation

Morality in Godlessness