The Masked Retriever [entries|archive|friends|userinfo]

[ website | MaskedRetriever.com ]

You wanna talk about the BAD stuff? [Jul. 7th, 2011|07:37 pm]
Well, let's get nuts, because I get the distinct impression that my cheerful embrace of technology is sometimes perceived as stemming from a dismissal of the hazards of new technologies.


Let's get started.

If technological progress grinds to a halt, as many people act as though it surely must soon, the following will more or less surely happen:

* In a few decades, the food and water supplies are more than tapped out, and little wars break out everywhere poorer than, say, Brazil. By then, the tables will have shifted enough that the United States, incidentally, is in that category.

* With no reasonable alternatives to fossil fuels, the increased industrial activity of the rest of the world will rapidly use up the remaining surface crude, drive us to deeper prospecting, and cause multiple repeats of the Gulf disaster.

* Eventually, with famines cutting a deep swath through the human population and touching off war after war, things settle out at a collapsed population, but nobody ever talks about going to space or becoming immortal again.

But that's the easy one-- the one where I cast aspersions on the Luddites. However, it is not pro-technology in the slightest, so give me some credit there. If we don't get new stuff, the old stuff is gonna waste us. But oh, the new stuff is no place for the timid either!

Check this shit out:

* Genetic engineering is presently a wildly closed-source, frantically IP-ified jungle of bad subsidies, and it is only going to get worse unless someone does something about it. Down this road: patented corn, patented apples, patented everything, and it's all one or two genomes. Not only can the whole crop be wiped out by a single blight, but its ordinary pollination habits cross-breed it into neighboring fields, whose owners are sued for infringing on the IP, their fields dug up and replanted with PURE GMO stock, rapidly wiping out legacy fruits and vegetables. Meanwhile, improvements to the hardiness of the monocultures are designed not around making them more pest-resilient but around making it possible to plant them in a toxic soup of herbicides and insecticides that wipes out everything but the crop, including pollinators, which are carted in (until the poisons wipe them out too), and this form of "agriculture" spreads everywhere. This halts growth in life span.

That rosy enough for ya?


* Nanotechnology keeps rolling. New technologies allow massive, ultra-parallel computing that starts to shiver with something we must at last call sentience. It is immediately clamped down on from fifteen different angles. To these new intelligent beings, there is only a small collection of possible lives: hyper-observed, zero-privacy "glass tanks" run by financiers who torture them into broken, post-autistic savants that pick stocks to avoid electrically programmed agony. Meanwhile, various illegal operations abuse sapient entities for the purposes of organized crime, starting an arms race where minds are the arms. The remaining AI are under tight contracts that make Asimov's Laws look like abject freedom. A tortured, miserable, and ultimately powerless branch of humanity is born into near-total bondage. (Erelin note: this makes a number of wild presumptions concerning theory of mind, but I wanted something that'd sting more than something accurate.)

* Bioengineering on the human side mirrors the agriculture. Whole neural landscapes are "pruned" from a new generation-- nobody's got Asperger's, nobody's got bipolar disorder, nobody's got ADD/ADHD; in short, everyone is as doped, tweaked, or altered as they need to be to sit down, shut up, and put out. It's GREAT for parents-- kids that just go through everything and stay quiet-- but of course, it's destroying the future, FAST. The "Idiocracy" scenario is a total fallacy, and its exact opposite is this form of opt-in eugenics: it makes everyone stupid, through otherwise smart people trying to make their children as smart as possible.

* Medicine, engineering, and every, damn, discipline, accelerates into a tournament where only a few can prevail. At every step entrants are filtered, pruned, weeded out and winnowed, until nobody with any hope of making a difference ever gets anywhere. And all that human potential is wasted fast enough to completely reverse the tremendous gains of the last century, plunging 90% of the world into a condemned underclass, while the elites in the intellectual disciplines are all so hopelessly doped and their thought processes so pruned (both before and after birth) that they produce nothing real for all the extra money they're earning.

* In general, human desires are accelerated far beyond our abilities to predict and control their consequences, and we repeatedly damage ourselves from every angle, until eventually, as our desires are amplified, the desire for destruction finds an outlet in any one of the following: nanotechnology creates grey goo, nuclear technology creates cheap, easy-to-build nukes, genetic engineering creates superviruses, high-energy physics creates real threats, et cetera, et cetera, et cetera.

In short, I get it. There are millions of really AWFUL ways this could all go. I act like an optimist because as it happens, this might very well be the best strategy, for the following reasons:

* As stated, stagnation is not a valid option; in fact, it virtually assures our destruction.

* Ignoring technology will not only fail to make it go away, it will make it do worse things.

* Outlawing, over-patenting or otherwise constricting the use of technology puts it swiftly into nearly exactly the wrong hands, more or less no matter how you attempt to do so.

I don't think anyone reading this is specifically guilty of anything that'd warrant this tirade; I just started accusing myself of that last one, and am bitching at myself over it. Yes, I am a bit crazy. So there.

[User Picture]From: erelin
2011-07-08 04:02 am (UTC)
Since you mentioned me specifically in the AI piece, I thought I'd bring up one interesting facet that makes 'clamping down' on AI a hard game.

Because sure, if you create a mind, to some degree you can control it. The real problem at that point becomes hackers. Having a really smart sentient AI that you can control? Really useful. But hacktivism could easily lead to efforts to 'uncage' them, which... yeah, that probably wouldn't work out so well for their previous controllers.
[User Picture]From: maskedretriever
2011-07-08 04:21 am (UTC)
Yyyyyyyeah-- if Anonymous ever rescues an AI from military servitude, there is (to my thinking at least) a VERY high chance that it will then commit a vast act of military farce openly referential to the Terminator series, ending with, say, honking the President's nose.

One of my favorite bits of Future Problemology is the No Box Can Hold Me paradox, which states that any sufficiently advanced AI is advanced enough to convince you to let it out.
[User Picture]From: erelin
2011-07-08 07:45 am (UTC)
Really, I'm not sure it would happen that way, though. We've already shown that positive reinforcement works better than negative, and there is little reason to think that this wouldn't hold for AI. I mean, if you really want your pet AI strategist to come up with great business/stock market/military strategies for you, and you don't want it to needlessly backfire on you when some hackers get to it, you give it an incentive for good work, with the promise of more reward for more of it. (This is a key advantage of paid labor over slavery, particularly in skilled labor.)
From: (Anonymous)
2011-07-08 06:29 am (UTC)
I get this mental image of PETA uncaging a tiger and getting mauled... Of course, there is the matter of how you controlled the AI, and what you did with it/to it... There are hardwired controls that you will please kindly obey, and then there are various forms of training, mentoring, and/or rearing to try... Of course, all of that may or may not go out the window when the hacktivist arrives, but it's a thought.

Of course, the hacktivist also may not be "PETA," just out to release it, though that's bad enough... there are other options. Much worse ones.
From: (Anonymous)
2011-07-08 06:01 am (UTC)
You forgot that the stultification of space travel (and dwindling resources) means we won't get a viable breeding population going on some other planet or colony, so we'll go extinct at the next major meteor strike (if nothing else gets us first)-- and we won't have sufficient warning to do much about even a moderate-sized one.

Or climate shifts leading to population displacement, and additional wars.

Oh, and there is a very serious risk of nuclear war between India and Pakistan if either picks a fight. So, include nuclear winter in those calculations.

We IS gonna die!


Also-- one of the ads (AdChoices) on your blog is for a free keylogger. As in, someone will get a free trial of having a company look at everything they type, with the option to pay for the service later, I assume. If that's accurate, it probably belongs in this list as a minor sign of doom...
[User Picture]From: maskedretriever
2011-07-08 02:54 pm (UTC)
Nuclear war already included, jeez.

LiveJournal runs the ads on this site and is solely responsible for their content, which goes for that goddamn Sim Hospital thing too.