Is The ‘Golden Age’ of GM Tarnishing Already?

This entry was sparked by an article I read over at NPR’s website about the failure of GM corn, and the seed companies’ desperate search for something to replace it. You can read it yourself by clicking here. As the article points out, one of the original commercially available genetically modified plants, BT corn, is failing. But a lot of my readers aren’t farmers and don’t know what BT corn is or the problem it was developed to cure, so let me tell you about that first.

The problem is this guy here:

This is the European corn borer and its rather nondescript parent moth. The European corn borer is not native to North America. It first appeared in the US around 1917 and quickly spread throughout the country and up into Canada. The adult moth itself is harmless. Its offspring, however, love to eat corn plants, tunneling through the stalks, weakening them, reducing yields, and even killing the plant. Over the years it became a serious problem, infesting much of the corn belt. There were pesticides that could kill it, but they were expensive, and many of the most effective ones were so toxic that they were eventually banned outright.

BT corn was one of the first genetically engineered plants to be approved for commercial use. It incorporated genes from Bacillus thuringiensis, a naturally occurring soil bacterium. The modification causes the corn plant to produce a protein called Bt delta endotoxin, which kills the larvae of the corn borer. The BT toxin itself is not new. It has been available for use as a pesticide since the 1960s, and it has a pretty good safety record. It is generally recognized as safe for humans, other mammals, fish, birds, and most insects.

BT corn worked very well indeed. So well that it quickly came to dominate the market, and a large number of different BT corn varieties are now available. But there’s a problem: it’s not working very well any longer.

Within about a dozen years the corn borer began developing a tolerance for the BT toxin, and that resistance has been spreading. We are rapidly approaching the point where BT corn will no longer be effective against the pest, and we’ll be right back where we started.

The same thing is happening with the other big GM cash cow: engineered plants, like soybeans, that tolerate glyphosate, the active ingredient in Monsanto’s RoundUp. As was the case with BT corn, it wasn’t long before so-called ‘superweeds’ resistant to glyphosate began to emerge, and they are spreading throughout the country.

In both cases, these products were supposed to be quick and easy fixes for problems that have no quick and easy solution, and in both cases these ‘solutions’ were doomed to fail from the beginning. Everyone knew they were not going to be long term solutions. Even as they were developing these products, the researchers involved were warning that sooner or later the pests they were intended to fight would develop resistance and we’d be back where we started. They recommended various techniques to slow the spread of resistance, such as planting refuge acres of non-BT corn alongside the engineered crop. Those techniques were largely ignored, because most companies these days operate on the basis of generating as much profit as possible right now and the hell with the future.
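
To see why the recommended refuge planting mattered, here is a deliberately over-simplified sketch, a textbook one-gene selection model with invented survival numbers rather than anything the researchers actually published, of how leaving some acres unprotected slows the spread of a resistance gene through the pest population:

```python
# Hypothetical, illustrative numbers only: a one-locus selection model for a
# resistance allele R in a pest population. On BT acres, susceptible (SS) and
# partially resistant (RS) insects mostly die; on refuge acres nothing is
# selected against, which dilutes the advantage of carrying R.

def resistance_frequency(generations, refuge_fraction, p0=0.001,
                         surv_ss=0.02, surv_rs=0.30, surv_rr=1.00):
    """Frequency of the R allele after the given number of generations of selection."""
    p = p0
    for _ in range(generations):
        # Average each genotype's survival over BT acres and refuge acres.
        w_rr = refuge_fraction + (1 - refuge_fraction) * surv_rr
        w_rs = refuge_fraction + (1 - refuge_fraction) * surv_rs
        w_ss = refuge_fraction + (1 - refuge_fraction) * surv_ss
        q = 1 - p
        mean_w = p * p * w_rr + 2 * p * q * w_rs + q * q * w_ss
        p = (p * p * w_rr + p * q * w_rs) / mean_w
    return p

for refuge in (0.0, 0.25):
    print(f"refuge {refuge:.0%}: resistance allele at "
          f"{resistance_frequency(8, refuge):.1%} after 8 generations")
```

With no refuge, these made-up numbers drive the resistance gene to near fixation within about eight generations; with a quarter of the acres left as refuge, it is still only around ten percent after the same eight generations. The real-world strategy is far more involved than this, but the underlying logic is the same.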

And that seems to be the problem with a lot of the GM products I’ve seen. They’re little more than quick fixes. Very little thought seems to go into determining whether the products are going to be viable and useful over the long term. The focus is profit, as much profit as possible, and profit right now. And hopefully the CEOs can cash out their stock options and bank their bonuses before the whole thing falls apart and leaves the company, and the farmers, worse off than before.

I’m not anti-GMO. Genetically modifying organisms has the potential to be incredibly beneficial. But as long as the current business climate remains in effect, where the only thing that matters is profit, right now, and the hell with the future, it is never going to fulfill that potential.

 

Disconnection from Reality in Agriculture

I often find myself irritated by what appears to be a serious problem with how some ag news outlets and their pundits cover the dairy industry. Ever since milk prices plummeted a couple of years ago, I’ve been reading an endless string of opinion pieces by so-called experts, and even actual news reports, claiming that milk production is dropping or is about to drop, that the number of milking cows is going to shrink, and that farmgate(1) prices are headed for a significant improvement.

Even as I was reading some of those items I was scratching my head, because the actual data I was seeing told me exactly the opposite of what the pundits at the ag websites were claiming. While there was some shrinkage in a few parts of the world, like New Zealand, almost everywhere else I was seeing a significant increase in production.

The experts were claiming that production in the US was shrinking as well, that it was flat or even declining as farmers culled herds and halted expansion plans.

The problem was that, at the same time, I was seeing applications for new mega-farm permits, news stories about expansion plans, and other indications that exactly the opposite was happening.

The new USDA report that came out yesterday supported what I’d been seeing in the news, and indicated that the pundits don’t read the news reports in their own magazines or websites.

August milk production was up almost 2% in the US. Texas’ production was up 11%. The report said that 16,000 milking cows were added in July alone, and 45,000 were added over the past year. And just ten minutes ago I was reading about yet another application here in Wisconsin for a dairy CAFO(2) to expand to 5,000 head.

The problem with a lot of these experts seems to be that they look at a specific local condition and extrapolate from it, applying it world wide while ignoring what’s actually going on.

Some of the claims that US production was in decline were based on California, where production has been declining significantly for the last ten years for a variety of reasons. But they’ve been ignoring the fact that almost everywhere else in the US production has been going up. Wisconsin, North Dakota, Arizona, Minnesota… almost every state with a significant dairy presence has been increasing production, often dramatically, as with Texas.

It’s been the same thing with the EU. They focus on a single country that’s seen a decline in production and conclude from that that production is going down across the entire EU, when it isn’t.

It’s been a similar story when it comes to demand for milk products. They seem to focus on a small part of the world that is experiencing an increase in demand for milk products, and apply that world wide.

Even worse, they’ve gotten into the habit of treating GlobalDairyTrade, a milk auction platform based in New Zealand, as an indicator of world wide demand. But they tend to ignore the fact that GDT is not an independent market. It is owned by Fonterra, the New Zealand milk processing giant, and it has a history of deliberately adjusting the volumes offered through its auctions in order to influence prices. Neither the amount of product moving through GDT nor the prices realized there is an accurate picture of supply and demand.

 

 

  1. The farmgate price is not the commodity futures price, but the actual price the farmer gets for her/his product, and there is often a significant difference between the two. For example, a couple of months ago, when the corn price on the Chicago market was running about $3.49 a bushel, the actual price farmers in this area were getting for their corn was $2.78, a difference of about 70 cents a bushel.
  2. CAFO stands for Concentrated Animal Feeding Operation, the term the government uses for a mega farm. It applies not just to dairy farms but to any animal operation with more than a certain number of cattle, pigs, etc., generally around 500 – 700 animals.

Computers that should have been great but weren’t

The item I wrote about the Epson HX-20 the other day reminded me about one of the other items I was supposed to try to sell for that business supply company, the Epson QX-10. This beastie:

[Image: the Epson QX-10]

This particular example is, judging from the color of the case, an elderly one; the plastics used for computer cases back then rather rapidly turned an unappealing shade of dirty yellow. In its prime, though, it was a rather handsome machine, and it was both one of the most advanced and one of the most useless computers I’d ever worked with.

At the time the QX came out, the computer market was going through a shakeup and, even more importantly, a shakeout. There were dozens of different computer makers back then, offering an astonishing variety of systems that ranged from the silly to the sublime. But IBM, with its PC and the MS-DOS operating system, was already well on its way to becoming the standard for small business and, eventually, home computers. By the time Epson brought the QX-10 to market, its underlying hardware was already pretty much obsolete, and its sophisticated software and graphics weren’t enough to make up for its lack of horsepower.

Before IBM jumped into the market with the PC, the ‘standard’ for small business computers was the 8080- or Z80-based microcomputer running the CP/M operating system. These machines were built around an 8-bit CPU and limited to 64K of RAM. Then IBM came along with its PC, built around the 16-bit Intel 8088, whose much larger address space let the PC offer up to 640K of RAM at around the same price as the 8-bit CP/M machines, and the rest is, as they say, history.
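
For the curious, here is the arithmetic behind those memory ceilings as a minimal sketch; the address-bus widths are the standard published figures, and the 640K limit comes from the PC’s memory layout rather than from the 8088 itself:

```python
# Illustrative arithmetic only: why the 8-bit CP/M machines topped out at 64K
# while the IBM PC could offer 640K. The bus widths are the standard figures;
# the 640K cap comes from the PC's memory map, not from the 8088 itself.

def addressable_bytes(address_bus_bits: int) -> int:
    """Total bytes a CPU can address through a flat address bus of this width."""
    return 2 ** address_bus_bits

z80_space = addressable_bytes(16)    # 8080/Z80: 16 address lines -> 64K
i8088_space = addressable_bytes(20)  # 8088: 20 address lines -> 1,024K
pc_conventional = 640 * 1024         # the PC reserved the top 384K for ROM and video

print(f"Z80/8080 address space : {z80_space // 1024}K")
print(f"8088 address space     : {i8088_space // 1024}K")
print(f"PC conventional memory : {pc_conventional // 1024}K")
```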

How did Epson hope to compete in a market that was already crowded with other 8-bit, Z80 based computers, or to compete against IBM and MS-DOS?

By coming out with an operating system of their own, combined with a hardware package that made the QX-10 the most sophisticated system ever produced. Or so they claimed.

The QX-10 was admittedly pretty sophisticated. It had a high-resolution monochrome graphics system, with up to 128K of dedicated video memory, that blew away anything except dedicated CAD systems. Its Valdocs operating environment was incredibly advanced for its day, with a built-in help system and 128-character file names at a time when everyone else struggled along with 8. And it had 256K of RAM.

And it had what was possibly the first WYSIWYG, ‘what you see is what you get’, word processor to become widely available at a (somewhat) reasonable cost. Boldface a word? It showed up in bold on your screen. Same with italics, underlining, etc. Virtually every other word processor on the market at the time showed not boldface but codes embedded in the text to turn formatting on and off, if it supported things like boldface or italics at all.
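
To make the difference concrete, here is a toy sketch; the ^B toggle code is purely hypothetical and not taken from any real editor of the era:

```python
# A toy illustration, not any real word processor's file format: pre-WYSIWYG
# editors showed you the raw string below, embedded toggle codes and all,
# while a WYSIWYG editor rendered the formatting on screen as you typed.

RAW = "The ^Bquarterly report^B is due Friday."   # ^B = hypothetical bold toggle

def render(text: str, code: str = "^B") -> str:
    """Show 'bold' spans (upper-cased here as a stand-in for bold glyphs)."""
    pieces = text.split(code)
    # Odd-numbered pieces sit between a pair of toggles, so they are bold.
    return "".join(p.upper() if i % 2 else p for i, p in enumerate(pieces))

print("Code view    :", RAW)
print("WYSIWYG view :", render(RAW))
```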

They gave me one of these things and I had it at home for a few weeks while I learned it inside and out because I was supposed to support the thing. It was definitely sophisticated. The graphics capabilities were outstanding. It was undeniably an amazing computer when combined with the Valdocs system.

The problem was that it just didn’t work very well. Valdocs and TPM, the underlying operating system, were full of bugs. It seemed like every other day I was getting updates and bug fixes, and since this was before the internet, that meant either dialing the company’s BBS with a 300 baud modem and paying the long distance phone bills, or waiting for them to ship me a floppy disk with the updates.

The biggest problem, though, was that it was slow. Oh dear lord it was slow! Any kind of competent typist could easily outrun the Valdocs word processor, getting forty or sixty characters ahead of the display update, so far ahead that you could overload the keyboard buffer and lose characters and words. And since we were supposed to push this as a word processing system because of the WYSIWYG display, well, it’s pretty hard to sell a word processor that makes you work slower.
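
For what it’s worth, here is a minimal, purely hypothetical sketch of that failure mode, a typist filling a small keystroke buffer faster than a slow editor can drain it; the buffer size and rates are invented, not Valdocs internals:

```python
# A purely illustrative sketch, with made-up buffer size and typing/processing
# rates, of the failure mode: keystrokes arrive faster than the editor drains
# its small fixed-size buffer, and once the buffer is full they are lost.

from collections import deque

BUFFER_SIZE = 16   # hypothetical keystroke buffer capacity
TYPE_RATE = 8      # keystrokes arriving per tick (a fast typist)
DRAIN_RATE = 3     # keystrokes the editor manages to process per tick

typed = "it is pretty hard to sell a word processor that makes you work slower"
buffer, accepted, dropped, i = deque(), [], 0, 0

while i < len(typed) or buffer:
    # Keystrokes arrive in a burst...
    for _ in range(TYPE_RATE):
        if i < len(typed):
            if len(buffer) < BUFFER_SIZE:
                buffer.append(typed[i])
            else:
                dropped += 1   # buffer full: this keystroke is silently lost
            i += 1
    # ...but the slow display update only drains a few per tick.
    for _ in range(DRAIN_RATE):
        if buffer:
            accepted.append(buffer.popleft())

print("what made it to the document:", "".join(accepted))
print("keystrokes lost:", dropped)
```

Run it and a good chunk of the sentence never makes it into the document, which is roughly what typing ahead of Valdocs felt like.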

The other problem was that there was no software for the Valdocs system except what Epson supplied: the word processor, a calculator, a drawing program and, I think, a rather brain-dead database. There was a spreadsheet, but it was so abysmally slow you could go get a cup of coffee while it recalculated.

If you wanted to use it for actual work, that meant you had to reboot the system with the old CP/M operating system to actually do anything useful. And, of course, once you booted into CP/M, all of the fancy features Epson was pushing were lost and all you had was a generic and overpriced CP/M computer.

Then there was the competition. At the same time Epson was pushing the QX-10, the IBM PC was becoming the standard for small business computers, and there was lots of genuinely useful business software available for it. So there was basically no reason to buy the QX-10, with its outdated hardware, its useless Valdocs system, or its increasingly obsolete CP/M system.

Epson’s solution to the competition from IBM was to find someone to supply them with a plug-in card that was basically an IBM PC clone on a card, with an 8088 CPU, its own memory, everything, that plugged into the computer’s internal bus, while they scrambled to get the QX-16 to market. It worked, sort of. But it didn’t run genuine MS-DOS, it ran a DOS workalike, and it would run some MS-DOS based software. Sometimes. Maybe.

It also cost in the neighborhood of $1,500 if I remember right.

So you have a computer with a base price of around $2,500, already far more than comparable CP/M machines. And now you have to drop another $1,500 for a card to make it run MS-DOS software, and there’s no guarantee it will actually run the software you need…

Oh, brother…

Could it have been a great computer? I don’t think there’s any doubt that it could have been. The QX was, on the surface at least, one of the most sophisticated systems on the market at the time, and it had a lot of features that eventually became standard on later generations of computers: long file names, a WYSIWYG word processor, high resolution graphics, etc.

Unfortunately, design decisions crippled it. The decision to go with the Z80 processor meant it would never have enough raw horsepower to live up to the hype. The graphics system’s hardware was woefully slow. The Valdocs system, while very nice, was bogged down by the obsolete hardware and inefficient programming techniques. Even worse, Epson never brought out any software that ran under Valdocs except that which was included with the computer. That meant that in order to run the popular business software of the day, the computer had to be rebooted into CP/M, and that turned it into nothing but a vastly overpriced, generic business computer.

Valdocs itself acquired a reputation for being buggy. I never really ran into serious problems with it beyond its woefully slow speed, but then I wasn’t using the computer under actual business conditions.

There were rumors flying around that over at Rising Star, the company that made Valdocs and its underlying OS, TPM, programmers were routinely fired as soon as they finished work on their assigned modules, leaving people who were unfamiliar with the code to try to support and debug problems.

I was told that large parts of Valdocs, and even TPM, had been written in Forth, of all things. Forth is not exactly what I’d call user friendly, and it was never designed for large projects; it was originally created as a hardware control language for running telescopes. I’m not saying it can’t be done, but oh brother… I’ve programmed in Forth and I wouldn’t want to use it for any kind of complex system.

Epson went on to bring out the QX-16, an interesting machine that was intended to compete head to head with the IBM PC. It had both a Z80 and an 8088, and would run Valdocs, CP/M or PC-DOS. Alas, it wasn’t very good either.

The upgraded hardware didn’t cure the system’s speed issues. The word processor was faster, but screen updates were still unacceptably slow. The spreadsheet was terrible: reviews at the time claimed that a sheet that would recalculate in five or six seconds under MS-DOS or CP/M spreadsheets took minutes to recalculate under Valdocs. And while it could run some MS-DOS software, a lot of it wouldn’t run at all.

 

Computer Memories

I ran into this little item in a nostalgia piece in a UK magazine called Gadget, and it brought back a lot of memories. I’ve had a lot of jobs over the years, some better than others, and one of them involved trying to sell these things:

[Image: magazine clipping about the Epson HX-20]

I’ve been involved with the personal computer industry in one way or another since 1979, and in 1983/1984 while I was back in college studying business, computer science and electronics, I was also working a part time job for a business supply company that sold, among other things, this — this thing.

Epson’s claim to fame was making relatively inexpensive, relatively well made dot matrix printers. Not computers. And when the company decided to move into the lucrative personal computer market, things didn’t go all that well for them, largely due to things like this and the famous (or infamous, if you had to try to sell the damned things) QX-10 computer.

The HX-20 was, to put it bluntly, utterly useless. The 4-line, 20-character display was too small for any kind of serious work. And while the built-in thermal printer was a nice feature, well, it doesn’t do you much good if you don’t have any software that actually does something useful, and the HX-20 had pretty much no software support at all. As the blurb above points out, the rechargeable battery usually didn’t. Recharge, I mean. And it certainly didn’t last 50 hours, especially if you used the printer or the tape drive.

The Epson factory rep took me out to dinner and dumped one of these things on me in the hopes I’d help him get my boss to buy them. I fiddled with it for an hour, the battery went dead, the printer only worked when it felt like it, and the tape deck immediately ate the one cassette tape I tried using. With the wonky battery, the dodgy tape deck, the ridiculously tiny display, and total lack of any kind of useful software, I refused to have anything to do with it.

Somehow he managed to talk our boss into ordering a dozen of the damned things, and now it was my job to try to sell them.

Meanwhile, Radio Shack was coming out with the TRS-80 Model 100, which was the same size, had a 40-character by 8-line display that was actually useful, and offered all kinds of goodies like a built-in modem, a port for a bar code reader, real standardized I/O including RS-232, and a ROM socket for specialty software. Better still, you could buy actual, real, useful software for it. And it cost less.

The things are probably still sitting in a box in storage somewhere. We certainly never sold any. They’re probably with the dozen or so QX-10 computers he was talked into stocking that we never sold, either.

Conversation Becomes Shouting in a Society Without Authority – The Daily Beast

We are now at a point in politics, a new book warns, where reality has lost its authority: Facts are considered a matter of opinion.

Source: Conversation Becomes Shouting in a Society Without Authority – The Daily Beast

I put up that post about the ‘Age of Stupid’ too quickly, or I could have brought up this item over at the Daily Beast, which also touches on the matter of belief vs. facts.

The writer of the article believes this is due almost entirely to the lack of some kind of authority figure.

I don’t believe that’s true, however. We are in a situation now where a significant number of people base their beliefs not on actual facts or evidence, but on what someone tells them to believe: exactly the kind of ‘authority figure’ the writer claims we need.

But we already have ‘authority figures’, and they are part of the problem: the anti-vaccination crowd that puts its mindless faith in B-list celebrities who know nothing of science or biology; the climate change deniers who blatantly ignore facts and evidence and spout opinions based on ridiculous conspiracy theories, quasi-supernatural explanations, or outright lies that offer them some kind of financial or personal gain. The list goes on and on.

Charles Sykes, one of the right wing radio ranters here in the state, did an interview in which he explained how this is largely the fault of himself and people like him. For years now, he and the other far right pundits, Limbaugh, Hannity, Jones and the rest, have been deliberately doing everything they can to undermine any kind of ‘authority figure’ the public might rely on for accurate information. They’ve worked hard to undermine the mainstream media, government agencies, even science itself, in order to further their own agenda.

The mainstream media itself has to take some of the blame for the current climate. In its effort to generate a never-ending string of clickbait headlines, manufacture controversy where there is none, and stoke fear and panic in order to pump up its ratings and profits, it has given voice to loony conspiracy theories, blatantly inaccurate statements by politicians and others, ridiculous health claims, and I don’t know what all else. It has failed to call out politicians over outright lies. It has just — just failed. In everything except generating profits, of course.

The Age of Stupid

Some people like to classify different periods of human development in terms of ‘ages’. We’ve had the Stone Age, the Bronze Age, the Iron Age, the Steam Age, the Space Age, the Nuclear Age.

According to a friend of mine, we have entered what will be the last age of humanity, the Age of Stupid. And one of the problems is, well, this kind of attitude:

[Image: quote attributed to William James]

This quote from William James, a philosopher and psychologist from around the end of the 19th century, pretty much sums up what’s going on these days. It was bogus at the time, and it is bogus now.

Belief does not create fact. Belief does not alter the facts.

But that doesn’t stop millions of people from thinking that it does.

We seem to have come to a point in human evolution where a lot of people think that belief does indeed equal fact, or is even superior to fact. All you have to do is turn on the television, listen to the radio, or read the internet, and you can see it.

Sometimes when I see the list of people who, for whatever reason, accept belief over fact, I despair about the future of the human race. I see it every single day. The anti-vaxxers, the creationists, the climate change deniers, the scammers selling phony cures, the conspiracy theorists… The list goes on and on. And the apparently endless string of politicians willing to exploit the ‘true believers’.

How did it happen? How did we evolve a culture where the claims of a former Playboy model are given more credibility than those of actual doctors? When did the beliefs of someone like Ken ‘Jesus rode a dinosaur’ Ham become more credible than those of actual geologists, paleontologists and biologists? How did we end up in a world where people share the ‘outrage’ of the “Food Babe” when she expressed horror that there was nitrogen in the air of the aircraft she was flying on?

Do we really live in the “Age of Stupid”?