Technology as a Bear Trap
“But lo! Men have become the tools of their tools.”
– Thoreau
TLDR: Once technology gifted an unready civilization the ability to destroy itself at the push of a button, it bankrupted the wisdom of unfettered technological development for its own sake – yet we still barrel ahead.
Why Question Technology? #
This post aims to call into question whether technology as a whole has served humanity well so far. Of course, advancing technology has brought numerous practical benefits to our lives, which might seem to place the good of technology beyond question. But because technology so fundamentally shapes and reshapes our world, we should still regularly ask whether its benefits really do outweigh its costs.
Even if, after such questioning, we conclude that, yes, on the whole, technology has made our lives better, it is still important to pause and ask the question. And we should likewise question all of the other powerful drivers of our culture – our politics, our economic system, our institutions – and make sure that each of them is still serving our best interests. For although it might be difficult, we as a society can reshape any of these powerful drivers if we so desire. We created them, after all. But for now, let us stick more narrowly to questioning only technology.
The Many Benefits #
First of all, we can consider the many ways that technology has improved the human condition. Fatal diseases have been rendered curable, trans-continental travel has become safe and efficient, near-instantaneous global communication is now ubiquitous, and many laborious tasks are now entirely automated. Clearly, technology has made our lives less brutish: It has facilitated a more organized civilization where violent deaths have on the whole tended to decrease, and quality of life has generally increased.
If so far technology has mostly helped to increase the quality of our lives, a case for the perils of advancing technologies might seem implausible. But this point of view neglects the possibility that technology could yield benefits while it develops, but still be ruinous in the end. It’s like a bread-crumb trail that leads to a bear trap.
Progress Traps #
But how could technological development be similar to a trap? A straightforward argument is that because nuclear weapons are a product of technology, and because they may well cause the annihilation of the human species, the sum total of the many boons of technology so far pales in comparison to this destructive potential. That is, if unfettered technological development ultimately leads to humanity’s destruction, then the niceties it brought along the way were just a sweet frosting covering the underlying poison.
An all-out nuclear war would devastate the planet and potentially drive the human race extinct: Our massive future potential to resolve global conflict, to go beyond the Earth, to explore the universe, to discover the fundamental laws governing our world – all of it would be wiped out. In this way, although technology has improved our lives, its ultimate effect so far has been to put an incredible power in our hands that we are not yet ready to wield. In fact, we’ve already come close to the brink of nuclear war several times.
The lesson isn’t that technology is evil, or even that technological development is evil. The more subtle insight is that we’re basically slightly-evolved apes with an inborn morality unsuited to modern times; and although we have had great moral teachers, on the whole our morality evolves slowly and requires us to overpower our outdated brains: Brains that are much better adapted to the cave-man environment in which they evolved than to the bold new modern world we’ve invented at a pace far exceeding natural evolution.
The overriding problem is that our technological development has outpaced the development of our morality. The result is an accumulation of technological power that we’re unfit to wield.
Still, We Barrel Ahead #
It’s not news that, as a species, we’re not responsible enough to be entrusted with the terrible power of nuclear weapons. So why, after making such a mistake in extending our power far beyond our moral capabilities, do we continue to rush headlong into developing new technologies with even greater destructive power? After inventing the nuclear fission bomb that the US dropped on Hiroshima, was it really necessary to create a weapon 1,500 times more powerful? Shouldn’t the lesson be that we should be much more careful and deliberate in how we encourage technological progress?
The answer, if we invoke the precautionary principle, is a definitive yes. We as a species are in a particularly precarious spot in our development: we are Earthbound, and we have developed the capacity to destroy ourselves without the wisdom to avoid doing so. There is a non-trivial chance that we could drive our species extinct in the near future. But the danger of self-extinction is lessened if we survive long enough to either: 1) enhance our morality enough to handle the responsibility of the power we’ve accumulated through technology; or 2) colonize our moon or other planets, so that the eggs of humanity are not all in one planetary basket.
So if our goal as a species is to realize our great potential – imagine a world of plenty, without starvation or violence – then perhaps we should focus technological development (as much as that is even possible) on two fronts: Developing our morality, and developing our capabilities for exploring space.
Can Technology Save Us From Itself? #
Perhaps if we had possessed more wisdom earlier, we would have adopted new technologies only slowly and deliberately. That way, we might have better understood their effects before deploying them on a massive scale. Unfortunately, in our brash rush to discover, we’ve unleashed the power to destroy ourselves, and that power is now vested in many self-interested nations that may not be above exercising it.
So because we cannot put what has escaped back into Pandora’s box, ironically, it may be that technology is now our best hope of saving ourselves – from our own technology. But what I’m suggesting is not the same unfettered technological development that got us into this mess in the first place – development driven by capitalism or government self-interest.
Technological Development Driven By Raw Capitalism #
Technological development driven by capitalism tends toward doing anything that can be done – whether or not it should be done. That is, capitalism rewards the commercial applications of a technology without asking the tough questions, like whether society can safely wield that technology at all. For example, biotech companies may develop tools that allow easy genetic engineering, which on one hand could facilitate curing genetic disorders; on the other hand, the same tools might also enable a misguided person to craft (and possibly unleash) a civilization-ending biological weapon.
Yet market pressure will drive a company to develop those tools anyway – because there is money to be made. The benefits are reaped by the company (it creates a new profitable product), while the risks are borne by civilization as a whole (an externality, a well-known problem with capitalism).
Technological Development Driven By Self-Serving Government #
The other main driver of technological development is government. While governments support basic research, they are also largely and explicitly driven toward the weaponization of technology. And clearly, developing better weapons hardly seems like a solution to the problem of already having more powerful weapons than we can handle.
So in the end, the current inertia of technological development is worrisome: It is driven by capitalism to do anything and everything without asking if it will ultimately benefit humanity, and it is often driven by governments to develop increasingly powerful weapons.
What To Do? #
I don’t have all the answers to these troubling problems, although future posts will explore possible solutions. However, it seems like the first step is awareness. If we as a society become aware that technological development carries heavy responsibility, and is not a beneficent panacea, then perhaps we can begin an important conversation about being more thoughtful, deliberate, and realistic about how technological development can be focused for our species’ benefit (instead of for the short-sighted benefit of profits or governmental defense).
For further reading, I recommend: