October 26, 2014

Story of the Cape Island boat

It happens that my grandfather not long before he passed helped settle a minor dispute as to the origins of the Cape Island boat. And he knew whereof he spoke. William E. "Bill" Smith was himself a builder of Cape Island boats in the 1930s and '40s and '50s, and was born in 1902 at Centreville on Cape Sable Island, near enough to the time of the advent of our Cape Island boat, in a house built by one of the boat's inventors, who like my grandfather and father and a good many other Cape Island boatbuilders was a carpenter when he wasn't a boatbuilder. Before he left us, Grampie explained that two men in Clark's Harbour on the Island, Ephraim Atkinson and William Kenny, known on the Island in their times as Eefy Atkins and Willy Kenny, together invented the Cape Island boat. His pronouncement as I recollect it now went, "Eefy Atkins gets the credit for the boat, and he deserves it, but Willy Kenny was building about the same thing at about the same time and about the same place." My grandfather was quite adamant that both men should share in the credit.

As to the name of the thing, Cape Islanders themselves use "Cape Island boat" because obviously a "Cape Islander" to them is a person, but outsiders use "Cape Islander", and outsiders further afield use "Novi boat" or "Downeaster". I take it that "Novi" is short for "Nova Scotia", but Nova Scotians know the boat as the "Cape Islander".

The principal idea of the Cape Island boat was to accommodate a forward engine, which would drive a shaft run through the length of the keel and fixed to an aft propeller, with a rudder mounted on the skeg directly behind, the propeller and rudder being fully submerged. The new boat would succeed the sloop, which had been the workboat of the Island and Municipality since their settlement in the founding migration of New England Planters in the 1760s.

The older Cape Island boats were not so long as a lot of their more contemporary descendants; they were very much narrower at their beams, lower at their sides, and shallower in their drafts, and they narrowed more sharply toward their sternboards. The old-time boats were built low enough at the sides to where a couple strong men could pick a boat up by grabbing it around the gunwales at the stern, where anymore a man can just about stand upright in the draft of a larger Cape Island boat, "grounded out". And the tubbiest of today's Cape Island boats can run half as wide as they are long, where on the older boats the ratio of length overall to beam would be something like 3:1, and until not so very long ago 2:1 was unthinkable. But the fundamental design and idea of the Cape Island boat is unchanged from the earliest times.

The very old Cape Island boats dispensed sometimes with a wheelhouse altogether, making do with what was called a "spray hood", which is to say an oiled canvas stretched over a wooden frame to afford the pilot some shelter. The older boats were very often "straight sheer", or absent a "break", that diagonal step-up at the main bulkhead to allow for more headroom in the forecastle and more hull forward for higher seas. A pilothouse, a sort of windowed bump on the forward deck, was found on a lot of the older boats through to more recent times, before the hulls went tall enough to where there was no call for the extra forward headroom of the pilothouse.

The construction of those old, wooden Cape Island boats didn't diverge appreciably from the construction of wooden boats and ships more generally, but a point or two on wooden-age construction may be useful here. A Cape Island boat in those times began as what was called a "half-model", which is to say a hand-carved scale-model of half a hull, to be chopped into sections, measured, and scaled up for the construction of the wooden hull. A Cape Island hull is a lot of curves and not so many angles, and any hull must be perfectly symmetrical; the second side can't be duplicated exactly by hand and eye, but the one carved side may be measured and mirrored. The hulls as well as the superstructures on the old boats were built to their owners' specifications and so every inch of the half-model for a hull was amendable, 'til the boat was just so. Then among the finishing stages, "knees" or L-shaped reinforcements connecting the deck and bulwark were cut out of tree roots, where the tree met the ground.
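For the technically curious, here is a minimal sketch, in Python, of that measure-and-mirror-and-scale-up arithmetic. The scale factor, stations, and half-breadths below are invented purely for illustration; they stand in for whatever a builder would take off his own half-model.

```python
# Illustrative only: the measure-and-mirror-and-scale-up arithmetic behind a
# half-model. The scale factor, stations, and half-breadths are made up for
# the example; a real builder would take them off his own carved model.

SCALE = 12.0  # suppose the half-model is carved at one inch to the foot

# Half-breadths (distance from the centerline out to the hull side, in model
# inches) measured at a few stations from bow to stern.
half_breadths = {
    "station 1 (bow)":   0.6,
    "station 2":         2.1,
    "station 3":         2.5,
    "station 4 (stern)": 1.9,
}

for station, model_offset in half_breadths.items():
    full_offset = model_offset * SCALE            # scale the model measurement up
    starboard, port = full_offset, -full_offset   # mirror the one carved side
    breadth = starboard - port                    # full breadth of the hull here
    print(f"{station}: {breadth / 12:.1f} ft across, "
          f"{starboard:.0f} in. each side of the centerline")
```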

The Cape Island boat was of course a wooden boat until the 1970s, when Reginald "Reggie" Ross of Stony Island -- which notwithstanding the insular name is another of the communities of Cape Sable Island -- added his chapter to the story. Reggie Ross had studied chemistry in England, was familiar with fiberglass technology, and appreciated the value in applying it to the Cape Island boat, and sometime in the '70s he ordered the requisite supplies and built the first fiberglass Cape Island boat. Since that time, and after an interim phase when the later wooden hulls were very often sealed in fiberglass, the hulls of Cape Island boats have been formed of solid fiberglass in fiberglass molds, a mold being a sort of inside-out boat, derived from a "plug" which is a wooden hull built more or less in the way Cape Island hulls were built from the earliest times. Cape Island boats still for the most part are "finished" in wood so as to be amendable to the specifications of their owners, with any woodwork that's to be exposed to the elements being sealed in fiberglass and gelcoat, a heavy paint based on fiberglass resin. But the larger part of the history of the Cape Island boat even now is the history of a wooden boat.

The Cape Island boat never was built for speed, but for seaworthiness and workability. One very fine fisherman from Maine observed that it was "like a tank." The Cape Island boat was taken up near and far -- my family boat business alone in its time built boats for the Island and province, for New Brunswick and Newfoundland, for Quebec and Ontario, for Maine and New Hampshire and Massachusetts and Connecticut, and indeed for Oregon -- and has been in service from its advent early in the 20th Century to this second decade of the 21st Century. That's testament enough, but they do say that the Cape Island boat was known even to Lloyd's of London, as a good risk.

At one point within my lifetime and by our count, or my memory of our count, there were something over twenty working boat shops on the Island; at the time of this writing the grand total would be countable on one hand, with fingers to spare.

(My little and fairly antique website for the family boat business may be found at McGrayBoatbuilders.com. Gone but not forgotten.)

September 9, 2014

Mid-Century Modern revived, Game of Thrones and Spoils of Babylon reviewed

I. Mid-Century Modern revived

This latter-day vogue for Mid-Century Modern may in some quarters be as insincere as the handlebar-moustache pendant on a teeny-bopper's necklace, and it may even be inspired by a basic-cable TV show, namely Mad Men, but in any event the revival of that rarefied art or "design" movement of the American 1950s and '60s antedating the annus horribilis of 1968 has by now attained something of a critical mass.

The principal term there is "Modern", and the irony is of course that from our vantage Mid-Century Modern comes across as rather traditional and quaint -- nothing could be more dated than Modern -- not to mention all-American and confident, and most especially ante-1968. As of 1968, Western Civilization was "deconstructed", and some of us are picking up and patching together the pieces still, with uneven result.

Andy Warhol makes a serviceable illustration for art post-'68: Warhol had a very fine eye, indisputably, but a fashion stylist may have an excellent eye and yet no-one would confuse her with the fashion designer, and in that same way, Andy Warhol was not so much an artist as a cutter-and-paster of art, and his art amounts too much to isolating fragments from the creations of others for presentation as ironies. 

That sort of thing is all well and good for purposes of satire and so on, but it can't possibly answer a creative, constructive, cohesive movement like Mid-Century Modern. Art since the Fall of Western Civilization has been too often a sifting of the rubble, or a caricature or grotesquery of what came before, where the movements before that time had been constructions of gleaming new edifices. Art Nouveau, Art Deco, Mid-Century Modern, etc.: all brave new worlds in their times, and none of them "deconstructions". The Victorian and Baroque movements may have been derivative, the one of the Gothic Middle Ages and the other of Antiquity, but they took the past for a foundation and built on it, vigorously.

So we look on a movement like Mid-Century Modern with admiration or even envy, and look around at the desolation and destitution that the Hippies and their witless younger apers have made of our civilization, and we can do no better than to invoke the Last Good Age, the last time there was confidence and creation and civilization in our civilization.

II. Three points on Game of Thrones

The sort of schtick of George R. R. Martin, author of the Song of Ice and Fire novels from which Game of Thrones is adapted, is that he makes things awfully, appallingly hard for his characters. Martin writes from Thomas Hobbes' formulation of "the life of man" in a dark-age "Naturall Condition of Mankind" as "solitary, poore, nasty, brutish, and short". He places his characters in bad situations, then makes things worse for them, and then worse still.

Martin is expert at invoking actual Medieval history and mythology, like Tolkien with whom he shares two middle initials. Beowulf comes readily to mind as a Martin blueprint, in its combining of real and fantastical, in its horrific, limb-severing gore, and in the implication found in Beowulf by its interpreters of the old pagan pantheon of gods giving way to the monotheist Christian Trinity. Martin is worthy of the obvious comparison to Tolkien, who was a devotee of the Dark Ages Anglo-Saxon universe that produced Beowulf.

People do make a fuss about Martin's early-going execution of Ned Stark, who is one of the few fairly unalloyed white hats of the story, inasmuch as killing off a hero to the readership or audience is supposed to be unconventional and a sacrifice for an author. But it strikes me that Martin milks that execution like a fat cow, to the point where I reckon it was no sacrifice at all: Lord Stark of the North is elevated in death to a saint, and his life and execution are invoked to where yes-siree-good-ol'-Ned-Stark becomes a sort-of shorthand for all that was good and might have been, and all that's been lost and might be again. Stark serves the story in death more than he could've done as a going concern.

III. Not quite three points on The Spoils of Babylon

IFC's spoof epic miniseries. Highest commendation possible for style, but the show is sparse for jokes, and what jokes there are, are weak, like as not. Will Ferrell accounts for the better part of the jokes, in his prologue and epilogue commentaries as author and director Eric Jonrosh, and Michael Sheen takes his small role to the nth degree and delivers the funniest line of the show outside of Ferrell's parts ("Louisa May is my guide and my compass"), but otherwise the laughs are too hard to come by. And the show's a cliche of leftism and grinds its axes indulgently.

What Spoils of Babylon gets altogether right is style. It's art, in its way, and manages to make jokes of its art; it elevates style to an end in itself, and a comic end. I'd almost credit it as a novel genre of comedy which might for want of a better appellation be termed here Aesthetics Comedy.

But it may be that the inside-out-turning comedy of this early 21st Century will come from conservatives or libertarians or anyway non-leftists, because those sorts haven't been captured by the conventions of the age.

July 6, 2014

Observations on moving to Houston

On moving to Houston from parts more northerly, a fellow is liable to observe first that palm trees here are a commonplace, and it does seem that most anything photosynthetic will take and thrive under this sun and in this soil and with this moisture. I can't imagine that there's a more treed, flowered, and shrubbed jurisdiction on God's earth. And Houston's roaches are so very big, I almost think I ought to apply for a hunting license before squashing one.

Houstonians don't tan so much as roast. The sun can be a violent, angry thing at this latitude. I'm given to understand that Houston is hot or warm for something like ten months of the twelve, and I almost wonder if it only turns winterish at all in respectful observance of Christmas: 90-some-odd degrees on December the 25th would after all be an affront to Christmasness.

They claim Houston is the fourth most populous city in the Union anymore, and having been fairly terrorized by the big-city traffic and rents here, I'm disinclined to dispute them. It's no laggard for square mileage, come to that: I've been on the road for 25 minutes and imagined that we must've crossed city and county lines, then appealed to the map and discovered that we hadn't got out of our corner of Houston. There are entire states in New England and the Mid-Atlantic not so awfully bigger than Houston.

Anyone accustomed to the tic-tac-toe board that is the map of Tulsa, OK will observe that the streets of Houston were not plotted on a grid at right angles by some civil engineer, but run at all angles, and curve and swerve, and follow their own courses and logics and histories. Houston was after all founded in the 1830s, when a street still was something that just sort of happened, as people and goods moved from one point to another through features natural and manmade.

A Tulsan will observe also the unidirectionality of Houston's streets, and the great, mounded, gardened islands segregating their two sides, with their requisite U-turning. The grander of Houston's overpasses writhe and rise into the clouds like Jack's Beanstalk, and are formidable structures unto themselves, constituting miniature metropolises of columnar towers.

Work in Houston is done when it needs doing, even if that be on weekends or in the black of night, and road crews or paint contractors may be found on the job at all hours.

Church in Houston is a going concern, not a vanishing ancient rite practiced by scatterings of semi-fossilized stragglers.

I'm no foreigner to the American institution of the Walmart Supercenter, but until I came to Houston I never conceived of one with a wine aisle and a McDonald's in the parking lot, another at the entrance, and still another at the alternate entrance, for a grand total of three McDonald'ses within a matter of yards. Not to say I'm complaining.

Houston is so very rich, I'm reliably informed that a Third Worlder resident here declined a job offer of collecting litter and posting notices for $11.50 an hour, on account of it paid too poorly. Shiny late-model vehicles jam the streets and cram the parking lots, a towering metal-and-glass cupola which may or may not be a stylized representation of a pineapple embellishes a skyscraping new hospital, refrigerator ice-makers come standard-issue, and even the busted-up, dumpstered detritus is nice. I've never felt so poor.

I'm a fellow who's written lines like "Texas accounts for half the 'net new jobs' in all America for the year," and I've been preaching about economic systems and the fruit they bear since I was too young to be preaching about anything, and yet I have no capacity to process prosperity on this sort of scale. Houston and Texas are final proof that American decline is a policy, or the consequence of policy: if certain parts of this country were more Texan, there'd be no notion of American decline, and Churchill's "broad, sunlit uplands" would stretch before America today as in the three-and-a-half centuries 'til sometime in the late 1960s.


I may yet wind up singing "Take Me Back to Tulsa", but I'm privileged to be Gone to Texas.

March 18, 2014

Vlad the Throwback

Vladimir Putin is a throwback. For Putin, it's as though the 20th Century never happened. Which stands to reason, because for Russia generally the 20th Century was something that happened someplace else.

The 20th Century was very largely a continuation of the later 19th Century 'til sometime around the end of the First World War in 1918, and Russia was knocked out of the war in '17 by the Bolshevik Revolution and descended first into chaos and before long into that Hadean nightmare called communism, 'til the 20th Century was near enough to over. And because communism can't survive the exposure of its people to the alternative, the Soviet Union had of necessity to be a hermetically-sealed hermit-state which would shoot and kill its citizens summarily rather than let them walk across its borders and out of its system. So Russia skipped from the Edwardian world of 1917 to the 1990s, and passed the intervening decades in a bad dream.

The idea that annexing territories is the worst kind of gauche in international relations is novel, coming as it does in the 20th Century among the earlier phases of Western Civilization's project of dismantling itself and reducing the towering, gleaming skyscraper to a useless, miserable little heap of broken and twisted bits. Putin is a quite conventional nationalist and imperialist of the sort found from the dawn of time 'til sometime in the 20th Century, as alien to the West in 2014 as it was commonplace to every civilization 'til living memory. As though Putin had skipped the 20th Century.

Plus which, it has to be said, Russia is nothing if not self-consciously un-Western, and repudiating the vogues of our Western bien-pensants is a point of pride over there.

So Vladimir Putin is unbound by our decadent 21st Century Western notions of "international law" or even "morality", which in practice amounts to voluntarily binding one or both arms behind our backs in war against barbarians who want us all dead and burning in Hell. Putin's only constraints in the end are practical ones. Putin may be expected to push, in other words, 'til someone pushes back and draws the line that constitutes the geopolitical limits of his Russia. Putin may prefer to reconquer the old Soviet or Czarist empires by maneuver, but then, even Hitler won his early conquests of the Rhineland and Austria and Czechoslovakia and Klaipeda without a shot fired, and it can only be assumed that if push comes to shove, Putin will shove.

Barack Obama's Russia policy from the very outset has been capitulation to Russian demands and psychoses, as gestures of comity to bring about some mythical global harmony or anyway as an uncomprehending rote-Left reaction against George W. Bush. Obama's defense and foreign policies generally are to retreat and contract, which only invites trouble and raises the price of answering it, because threats and half-measures won't suffice without something bigger and badder behind them. If one may be permitted to mix metaphors, a serious president with a demonstrable capacity for pulling triggers and upsetting applecarts may need only rattle a sabre to get the attention of trouble-makers or would-be trouble-makers, but an Obama can wag his finger day and night without effect, because those trouble-makers have the measure of him and understand him to have not the constitution or even the interest to give force to his finger-wagging.

And the West in 2014 isn't much in the mood for anything costly or causing of discomfort, any more than we were in the mood in the 1930s to confront Hitler on his assorted acquisitions, least of all to uphold the sovereignty of Ukraine or Georgia or whatever other old Soviet satellites and Czarist colonies Putin has in mind to reconquer.

So Vladimir Putin is liable to have things his way, and the Lord only knows how far he'll go along that way, unless and until someone pushes back and wins the shoving-match and a very different president of the United States sits in the Oval Office.

January 24, 2014

Monty Python apostasy

Python is the standard for comedy in the English-speaking world and thus the world more generally, Monty Python's Flying Circus is sick-makingly funny, including post-John Cleese, Monty Python and the Holy Grail is the standard for feature-film comedy in the same way that Flying Circus is the standard for TV sketch comedy, and indeed even the Python LPs are by times miracles of comedy, but...

John Cleese had it more or less right when he walked away from Flying Circus after the first three seasons, or "series" as they call seasons over there. Python was of a particular time and place, it could only exhaust itself and achieve a point of diminishing returns before long, and anyway less is more.

Python and the Holy Grail is cheap and cheerful; it's episodic to the point of being practically glorified sketch comedy and doesn't succumb to service of a plot; it bookends the TV series, so that the Flying Circus spirit hasn't yet gone out of the Pythons; it's All-England, England being the font of comedy in the world and the sort-of infrastructure for a very great deal of Python; and most of all it's comedy for comedy's sake -- comedy first, second, and last and none of your dreadful politics or pretensions, thanks very much.

The two later Python features Life of Brian and The Meaning of Life depart from that Holy Grail formula and are disappointments for it. Life of Brian commits the comedy sin of taking itself seriously. It's politics, it's insufficiently episodic to liberate itself from its point-making plot, its setting and subject could scarcely be any less comic, and it comes too late, after the spirit had passed.

Meaning of Life is at least a compilation of sketches, but it too much presumes to be a proper movie, a big-deal feature film, it has a weightiness if not an air of menace about it which acts as a wet blanket on the fun of the TV series and Grail movie, and by the time of Meaning of Life there wasn't material or spirit enough left in the Pythons to fill a feature film. The Pythons were too old and too changed for a reprise of Flying Circus, and anyway the time for Flying Circus had come and gone.

Which is not to say there's no life after Python, only that the Pythons stayed too long at the fair. Of all the post-Python projects, John Cleese's Fawlty Towers is far and away the greatest and incidentally also the proof positive for the Cleese Doctrine of less is more, that series lasting all of two seasons and 12 episodes all told. Cleese was good in A Fish Called Wanda and he near enough to hijacked Cheers in his guest turn there, so outshining the rest of the show as to have diminished it in his wake. Cleese doesn't miss too many opportunities to trade on his brand, whether in Schweppes TV ads or Harry Potter cameos, etc., but he's an institution and a sort-of ambassador for Britain, and anyway he was the one Python who got it right, that the thing to do was walk away and quit while Python was ahead.

I dearly love Michael Palin's movies and travelogues post-Python, although those don't presume to be comedy proper; they're more humor than comedy per se. Graham Chapman was a hard-case drunk and died too young, of cancer. Eric Idle has inclined more than I can abide to showtunes. Terry Jones is a politician with jokes. And Terry Gilliam who really never was so much a comic as a visual artist has made of himself a leading exponent of fantastical filmmaking.

And another thing: the Pythons didn't invent comedy or TV comedy or TV sketch comedy or even British TV sketch comedy, and would never pretend to that claim themselves. Flying Circus was derivative of Spike Milligan's Q, and Milligan's Goon Show before that, and the Pythons generally were products of England and of Austerity Britain and of the good schools and of the English way of humor, which goes back further than I'm able to put a finger on.

All that said, I suppose I'm relieved in a way that the Pythons did return to the trough one time too many, or several times too many, because otherwise we might be compelled to venerate them as demigods and despair of attempting humor ourselves.

December 28, 2013

At the risk of rendering this a baseball blog...

I'm compelled to hold forth on the vote of the Major League Baseball Rules Committee to outlaw the upper-body collision at homeplate: Beyond the question of the emasculation even of baseball, there is the question of how the homeplate collision can be outlawed as a practical matter, unless, as A. J. Pierzynski joked, a fifth base is tacked on someplace behind homeplate.

A play at the plate will be very much more often than not a tag-play; that's to say, the baserunner will need tagging out as opposed to forcing out, which is in turn to say, it won't suffice for the catcher to get a foot on the plate while in possession of the ball, and in order to record the out the catcher must tag the runner, with the ball or the glove holding it, before that runner makes contact with the plate.

That in itself wouldn't make for collisions at homeplate; the collisions come into it because homeplate is of course the fourth and final base, and a baserunner is thus free to overrun it. Safe or out, once he's crossed homeplate, a baserunner is through running the bases. And because sliding means slowing, not only in the act itself but in the preparation for it, a baserunner will often find that unless sliding will get him around or under the catcher somehow, he'll be further ahead to run through homeplate: he'll get there faster by running through it than by sliding to it.
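To put rough numbers on that, here is a little sketch of the arithmetic; the running speed and the slowing assumed for the slide are figures I've invented for illustration, not measurements of any real play.

```python
# Strictly illustrative arithmetic: the last 30 feet into homeplate at a full
# run versus decelerating into a slide. The speed and the braking assumption
# are invented for the example, not measured from any real play.

SPRINT_FPS = 27.0   # assumed full running speed, in feet per second
DISTANCE = 30.0     # assumed distance left to the plate, in feet

# Running through the plate: no slowing at all.
run_through_time = DISTANCE / SPRINT_FPS

# Sliding: assume the runner brakes steadily over those 30 feet and arrives
# at the plate at roughly half speed, so his average speed is three-quarters
# of his sprint speed.
avg_slide_speed = (SPRINT_FPS + SPRINT_FPS / 2) / 2
slide_time = DISTANCE / avg_slide_speed

print(f"run through: {run_through_time:.2f} s")
print(f"slide:       {slide_time:.2f} s")
print(f"difference:  {slide_time - run_through_time:.2f} s")  # roughly 0.4 s
```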

It is the natural right of a catcher to block his plate, following from the necessity of his making contact with the runner before that runner makes contact with the plate, and in that same way, it is the natural right of a baserunner to run his course through homeplate, and may the best man win. Those are among the more fundamental of the natural rights and laws of the Great American Game and most Victorian of sports.

And all those fixed and moving parts work together to produce the collision at the plate. It's not as though Abner Doubleday sat down one fine day in Cooperstown, New York* and said, what this game needs is brutalizing, bone-crunching, bodily collision. Collisions are what come out the other end of the natural rights and laws, and formal rules, and physics, and plain sense, in baseball.

Now it may be stipulated that all of this is right and true, but at the same time, the homeplate collision is an injurious institution and consequently MLB is left with no alternative but to "do something". But I fail to see how a collision at the plate should be any more injurious today when it's outlawed than in the 20th and presumably also 19th Centuries when it was lawful and a commonplace. My idea is that what has changed is the society, with this decadent nature-is-what-we-say-it-is 21st-Century erosion and subversion of the manful virtues, and my suspicion is that what has moved MLB to action is the 21st-Century peril of the disabled list, if not also the monetary valuations of the catchers and runners.

A collision at homeplate cannot rationally be more injurious in the 21st Century than it had been in the centuries prior, but in this 21st Century an oopsie can knock a multi-million-dollar-salaried asset out of the lineup and make a treatment-and-rehabilitation case of him, and a season can easily be decided by the names appearing on the disabled list as opposed to the starting lineup. And fair enough: I'd never say that money on that order of magnitude oughtn't be a consideration, that a ballclub oughtn't have a right to expect some playing time out of a man they're paying maybe multiple millions of dollars in a single season, and the disabled list has gotten to be a scourge of big-league baseball to where it's a cliche for a contending ballclub to pray "so long as we stay healthy" as a sort of "Lord willing" appended to their more hopeful pronouncements. But it does strike me that that's what's moved MLB to action just now in rewriting a rule which I have to assume reaches back more or less to the dawn of the game as we know it.

In another century I played catcher, albeit as the rankest amateur, so I like to think I know whereof I speak on this score, not that I ever let my ignorance stand in the way of my opinion. I respect too much some of the men who've championed the MLB ruling to call it pussified, but I will say I'm against it and what's more that I regard it as an artificial imposition counter to the natural law of baseball, and time will tell if it can be enforced without aggrieving and outraging the catchers and runners, their ballclubs, and their fans. A baseball type can have his day spoiled by an umpire's calling a ball for a strike, even, and a play at the plate is as big a call as they come, ending as it does in an out or a run -- one or the other and nothing else.

* - Yes, yes, I know: the notion that baseball was invented by Abner Doubleday at Cooperstown, NY is known now to be a ludicrousness.

August 6, 2013

The trouble with the World Series

When somewhere along the way the All-Star Game went from life-and-death struggle to exhibition and spectacle, with no great import attached to the winning or losing of it, Major League Baseball and the Players Association agreed to "make it count", investing it with the determination of home-field advantage for the World Series.

We haven't got so het up about the All-Star Game in these past years and decades because we don't have any great investment in one league over another. I'm an American League man of longstanding, but I can't think of a good reason for it, and I couldn't think of a good reason to be very upset that the National League had won three All-Star Games in a string before 2013.

The two leagues aren't Republican and Democrat, or even GM and Ford; it's the luck of the draw whether your ballclub happens to belong to one or another, and it wouldn't make any rational sense at this point to attach to one over the other with any great emotion. It's something like being born with a surname starting in "S" as opposed to, say, "A": those of us blessed with surnames starting in "S" might think we've got the finest last initial going, but we'd be worse than silly to invest any great emotion in the S names, or to set ourselves against the A ones, because we've got to know on some level that A or S, it's the luck of the draw.

And there's a practical cause for divesting the All-Star Game of world-ending passion. Big-league ballplayers are something like Winston Churchill's description of modern warships: egg-shells with hammers. One rolled ankle from turning second base in a hurry, and a ballplayer can wind up on the disabled list, watching from the bench and worse than useless to his team. Injuries anymore have gotten to where "stay healthy" has become a sort of "Lord willing" of winning ballclubs, as in, "We ought to make it to the post-season, if we stay healthy."

If an injury is a tragedy to a ballclub, then an injury in a game that counts for nothing is a catastrophe. And if the fans and players and management weren't bothered about injuries from an All-Star Game before 1970, Pete Rose and Ray Fosse had a run-in at homeplate for the ages in the All-Star Game of that year which separated Fosse's shoulder and had some part in truncating his career.

But they say they had to "do something" to "make it count", so here we are, with the outcome of the All-Star Game determining home-field advantage for the World Series.

Maybe in some other sports a home-field advantage isn't so concrete, but in baseball the home field can and not infrequently does make the difference. Home-field advantage when it comes to the World Series means the first two games and the last two are played in the home ballpark, assuming the series goes to seven games. And since 2003 when home-field advantage for the World Series was first decided by the All-Star Game, the league that's won the All-Star Game has carried the World Series as well, seven times out of ten. Apart from the more psychological element of many tens of thousands of human beings supporting as opposed to spitting on the men on the field, there are at least two very practical advantages to playing at home in the game of baseball.

No two big-league ballparks are alike, for a start: the contours and heights of the walls, the dimensions from homeplate to the outfield walls and from the baselines to the sideline walls, the liveliness or otherwise of a ground ball on the grass and dirt, the way the wind carries along or knocks down a fly ball, the very atmosphere of the place -- all of these vary from one major-league park to another, and sometimes appreciably. An outfielder for instance will be familiar with the way a ball caroms off the outfield walls in his home ballpark, and where those walls are in the first place, and familiar with the prevailing winds and native atmosphere in that park and their effects on an airborne baseball. And so on.

And home-field advantage means hitting last. Hitting in the bottom of the inning makes very little odds except when the game happens to be tied or close in the bottom of the 9th and into extra innings, when a go-ahead run for the home team ends the ballgame and the visiting team has spent its chance to answer that run. That may be said to be psychological like the support or otherwise from the stands: the visiting team gets no fewer at-bats for hitting first, after all, so hitting last is no advantage except in that psychological sense of knowing where you stand, that a run here wins the ballgame, say, or that you've got to register one run before the other fellow registers three outs just to stay alive, etc. But it's a structural psychological advantage, and very real, for the home team and against the visitor. And because the World Series is played between the two best teams in the game, by definition or anyway on paper, you're apt to get some evenly-matched, close-run ballgames between them, and tie-games in the bottom of the 9th and later.
 
So this is not scrapping over scraps. Baseball is a game of fractions of an inch and of a second, and whole ballgames and even World Series can turn on infinitesimally small things. And home-field advantage for the World Series is no small thing.
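As a footnote to that seven-out-of-ten figure, here's a quick tally in Python, going on my own recollection of the league results from 2003 through 2012 -- so check the record before quoting it.

```python
# A quick tally of the claim above: in the ten World Series from 2003 through
# 2012, how often did the league that won the All-Star Game also win the
# Series? The results below are from memory; verify before relying on them.

asg_winner = {2003: "AL", 2004: "AL", 2005: "AL", 2006: "AL", 2007: "AL",
              2008: "AL", 2009: "AL", 2010: "NL", 2011: "NL", 2012: "NL"}

ws_winner = {2003: "NL",  # Marlins
             2004: "AL",  # Red Sox
             2005: "AL",  # White Sox
             2006: "NL",  # Cardinals
             2007: "AL",  # Red Sox
             2008: "NL",  # Phillies
             2009: "AL",  # Yankees
             2010: "NL",  # Giants
             2011: "NL",  # Cardinals
             2012: "NL"}  # Giants

matches = sum(asg_winner[year] == ws_winner[year] for year in asg_winner)
print(f"{matches} of {len(asg_winner)} World Series went to the league "
      f"that won the All-Star Game")  # prints 7 of 10
```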
 
The one fair way of deciding home-field advantage would be to compare regular-season records, so that the superior ballclub of the two left standing in late October was rewarded for having been the better team over the course of the season. A coin-toss would be capricious, and the present arrangement of determining home-field advantage by All-Star Game outcome puts the decision into the hands of men who are uninvolved in the World Series, inasmuch as the All Stars are drawn from all 30 major-league clubs.
 
This scheme for "making it count" that Major League Baseball and the Players Association hit on in '03 is probably the best that could be devised for investing the All-Star Game with anything approaching the import it had in the time of Ted Williams. But I fail to see why the All-Star Game must be so life-and-death, and why it's not perfectly reasonable to regard it as an exhibition and a spectacle, as opposed to desperately, earth-shakingly serious.
 
And anyhow, Major League Baseball ought to make it its policy to let the people decide. If the fans do invest the All-Star Game with some world-ending passion then more power to them, and if they regard it as an exhibition and a spectacle, well, so be it. Who can blame them if they do conceive of the All-Star Game more as a spectacle, and what on earth is wrong with it? It ought to be good enough for the All-Star Game to be what the name implies, the one moment in a season when all the very best are assembled on one field and two dream teams. Let the people decide if it counts for anything, and don't let it skew the World Series.

May 8, 2013

So, about this Syria business

Syria in 2013 looks in places like Berlin circa 1945; it's been carved up already by jihadist and Islamist rebels, instituting sharia wherever they command a preponderance of men and arms; the Alawite, Ba'athist "national government" is sustained by Russian and Iranian and Hezbollah intervention; and the body count has hit 70,000 and counting, per the United Nations, which is useful for such jobs as counting corpses in the more godforsaken corners of the earth, and not a lot else.

So what's to be done about this Syria business. The short answer is, as of now and beyond the usual humanitarian assistance, nothing. There was a moment when an intervention on a small margin, supporting the rebels without a very direct involvement on our part, would have been advisable, at the outset of this Syrian civil war in '11 when there was a true national rebellion, led by elements of the Syrian armed forces, against the Alawite Ba'athist dictatorship and enemy to the United States and Israel.

But that moment came and went because Obama and his administration prefer to "lead from behind", which is to say, go golfing and hope things somehow work out in the end, or that no-one notices if things don't work out, or that somehow the decision is made for us, or that Britain or France or anyone else at all rides over the crest of the hill and spares us from getting our hands dirty. Plus which, Obama and his administration had committed themselves to the line that Bashar al-Assad was a "reformer", some sort of misunderstood moderate and great man worthy of the praise of U.S. Secretary of State Hillary Clinton, and his Ba'athist dictatorship worthy of the resumption of normal diplomatic relations with the United States. Bush and his administration had recalled the American ambassador to Syria in '05 after Syria's assassination of Rafik Hariri, twice the Prime Minister of Lebanon, so of course Obama and his administration had to go altogether in the contrary direction.

Obama's timing was impeccably awful to boot: he claimed a recess appointment to circumvent Senate confirmation of his nominee for ambassador to Damascus, rewarding Bashar al-Assad with normal diplomatic relations, not three months before the onset of this Syrian civil war, just in time for al-Assad to demonstrate beyond all doubting that he was a dictator and butcher of the first magnitude, with whom the United States ought to be in a state of cold war at the very best. And it was worse than even that: Obama was so loath to come down against the Butcher of Damascus, exposing himself as a fool and vindicating Bush's judgment on the al-Assad regime, that he watched five months of al-Assad's butchery before calling for his resignation, at long last, and even then, Obama couldn't bring himself to speak the words, preferring the medium of the presidential "written statement". So our moment passed, and with it our only good alternative.

Because before long, the national rebellion was hijacked by the international jihad, as in Libya not long before. So that by this point, per a memorable and frank New York Times report, "Nowhere in rebel-controlled Syria is there a secular fighting force to speak of."

Obviously the United States isn't great guns for getting in the middle of this Alawite-jihadi-Russian-Iranian-Hezbollah melee, although in the abstract, that would be the one unalloyed good alternative, to sort of descend on Syria like some Heavenly host bringing ruin to the wicked on all sides, then institute a democracy for the ordinary Syrians still drawing breath. But that's as comprehensively mooted a point as can be, inasmuch as it'll never happen, which leaves us with the binary choice of keeping clear of Syria, or intervening indirectly as in supporting one side over the other.

But to support one side over another would be to support our principal enemies and threats in the world, and without reason to believe the end would be anything other than awful for us and for the people of Syria both. If I may be forgiven for invoking that most overdone of wars a second time in one piece, this Syrian civil war has gotten to be something like a science-fiction parallel-universe theater of the Second World War, where for various reasons the Germans, the Italians, and the Japanese wound up going at one another hammer-and-tongs, somehow at war against one another and not together against us. We'd have been fools to touch that with a pole, instead of letting it play out and letting the works of them bleed one another dry, while we followed the developments from afar and counted the blood and treasure expended on their side and husbanded on ours.

The peculiarities of this Syrian civil war have conspired to assemble our principal enemies and threats in the world, on opposing sides. Al-Qaeda, or the international jihad more generally, and Iran are more or less equivalent as menaces to the United States, and this Syrian business has set them one against the other, to the point where the reports are that Iran has expelled some of the al-Qaeda-ists it had made welcome before Syria made things awkward between them. Then Hezbollah which menaces Israel has come into it, and the Russians who are our geopolitical rivals and general ne'er-do-wells have been on the ground in support of the al-Assad regime as well, for reasons best known to themselves. And the Syrian regime, or rather whatever's left of the Alawite Ba'athist dictatorship of Bashar al-Assad, is an enemy of longstanding to the United States and to Israel, and has been at the very least an enabler and ally to those people trying to blow us up. So given that those are the parties to this Syrian civil war, as of now, what justice may be claimed in helping any faction over any other, and what good may be expected, for us or for the people of Syria?

Of course the trouble with this is that in the meantime those people of Syria are dying by the tens of thousands, and indeed the last man standing in Syria, holding or inheriting whatever assets and armaments are left by that time, will be no friend to us. But that's the price of not intervening at the outset, when there were still white hats against the black hats, and when indirect intervention might have helped those white hats to victory inasmuch as the outside forces propping up the al-Assad regime had not yet come into it with both feet.

So there is no good alternative left to us, but there is one least bad alternative, namely, to keep our powder dry, do nothing for the time being apart from putting up some innocuous humanitarian assistance. But of course Barack Obama has just now commenced thumping his chest on Syria, days after being reduced to impotent observer while Israel acted in Syria boldly and deftly, Obama's chest-thumping coming complete with a gratuitous and unjust slight against President Bush and a declaration of "moral obligation" which Obama discovered only after two years and 70,000 dead bodies. And Obama's interventionism is of course to take the form of some unquantified new support for the rebels who are by now al-Qaeda-ists and Islamists and assorted jihadists. How we or the people of Syria would be better off for al-Qaeda's being propped up by the United States, so as to carry on the civil war with the aim ultimately of rendering Syria a terrorist squat and a colony of the new caliphate, is unknown to me, but that is the course set for us by our occasional Commander-in-Chief.

He does nothing at the moment when supporting the rebels might conceivably have tipped the balance, and when it'd have meant supporting worthy men against a common enemy and conceivably helping to replace a tyrant and a menace with friends and allies in a decent and democratic successor government; then when it's far too late for that and supporting the rebels amounts to aiding al-Qaeda and Islamists and the international jihad, Obama gives the order to support the rebels. The worst of all worlds, as ever.

February 6, 2013

On the official divorce of the stock markets and the real economy

One wonders if January 30 of 2013 may be one of those days like they write into movies set circa 1929: "Oh, Father, don't be such a bore. 'Gross domestic product' is for the university men. All I know is, my Radio Corp shares are up and I'm taking my best gal Millie out for a malted." Or something like that.

On that day the news came down from the Bureau of Economic Analysis that its initial estimate for gross domestic product in the fourth quarter of 2012 was very slightly negative, that's to say, the United States economy actually shrank a bit in the final few months of 2012. Barack Obama ought to praise Almighty God that the BEA doesn't issue those initial estimates as projections and before November 6, because the exit polling found a clear plurality of voters on Election Day holding to the quaint notion that the economy was affirmatively improving, where we now know it was in fact contracting, or at best standing still, at just about that time.

Time was, the stock markets were dependent on what we fusty traditionalists insist on calling "the real economy". A BEA report like the January one showing Q4 2012 GDP at -0.1%, marking the first decline since the official, statistical end of the recession in '09, would've been received by the markets as bad news and sent them lower. And indeed the markets did go lower, only just, but they dusted themselves off and carried on toward their sunlit uplands, such that all of two days later the Dow Jones Industrials and S&P 500 were registering 52-week highs, the NASDAQ was not far off a 52-week high of its own, and the Dow crossed 14,000 for the first time since October of '07, when its record of 14,165 was set, putting it one good day away from a new record high. The stocks-and-economy headline for those few days might read something like "U.S. economy shrinks, markets rejoice."

The markets and the real economy were seen in public together hand-in-hand until sometime after the economy found its bottom in '09; as the economy bounced along that bottom in 2010 and '11, neighbors overheard the real economy and the markets squabbling acrimoniously, with the markets becoming by times accusatory; by 2012 as the markets went from strength to strength, the real economy was known to be sleeping on the couch while the markets took the master bedroom upstairs, with the real economy stopping on the way from work for a hamburger while the markets had salmon and risotto at home on the good china; and finally when the Dow crossed 14,000 points two days after GDP came in negative on January 30 of 2013, the divorce papers came through. It's now official: the real economy and the stock markets are well and truly divorced, and they don't much feel like speaking to one another for the time being, either.

I'm not a writer of upper-middle-class American vernacular dialogue circa 1929, and I'm certainly no market analyst, so I offer herewith the considered assessment of Bob Janjuah who despite the funny name was Chief Markets Strategist at the Royal Bank of Scotland, via the pseudonymous Tyler Durden at ZeroHedge.com:

"Real wealth can only be created by innovation and hard work in the private sector, with policymakers, the financial sector and financial markets there to aid and encourage/incentivise. Real wealth is not created by the printing press and by excessive government spending. We simply cannot turn wine into water – after all, if it were that easy, why have we not done this before...

"Sure, central bankers through [quantitative easing] can create a chemical/synthetic concoction that may well get us even more intoxicated than real wine, but like most chemical processes that are focused on by-passing the rules and focused on immediate quick fixes, the "wine" they are synthetically creating will I fear ultimately lead to either a large market hangover (at best) or – at worst – to the "market equivalent" of serious liver poisoning or something even worse.

"The scale of the fallout will I feel be determined largely by how far markets and policymakers are willing and/or able to stretch the elastic band between real world reality and liquidity fed asset markets. Past experience shows us that this band can be stretched a long way, and we know that central bankers have a bad track record at both spotting and managing asset bubbles."

Thus spake Janjuah. And that looks about right. Every ridiculously overinflated boom must bust; the trouble with this bubble is, it's the product of the wholesale printing of dollars and profligate deficit spending, and built on an economy that's arguably recessionary and inarguably enervated. The United States could absorb a crash in 2000 and again in '08, because by those times it was near enough to full employment and coming off good long stretches of healthy expansion in GDP, plus which the American dollar hadn't been debased wantonly in the inflation of the bubbles that were popped in those crashes.

There is just no reconciling the Dow Jones skipping giddily toward its record high, with -0.1 percent GDP and 14.4 percent effective unemployment. A crash in these circumstances, and affecting the dollar that all Americans deal in, could be a catastrophe. 

November 8, 2012

The Complete Guide to the 2012 Presidential Election, According to Me

(Updated, Dec 5)

I'm afraid I'm unrepentant in my prophecies that Barack Obama was to be a one-term president: I'm not sure that was so very far off considering that the Gallup and Rasmussen polls both had Obama at 48 percent on election eve, with Romney favored albeit by just a point, considering that the difference in the four deciding swing states came to not much more than 400,000 votes combined, considering that Obama shed something like four million votes off his '08 numbers, considering that Obama's margin in the popular vote was hacked from over 7 percent in '08 to something over 3 percent in '12, considering that whites who are after all the great majority voted against Obama by all of 20 points, 59-39 percent, and considering that independents came down against Obama by five points. But Democrats outnumbered Republicans by six points, and that decided it.

Romney's surplus independents might well have lifted him over that Democrat advantage except that they were shorn away in the last days of the campaign by Obama's Superstorm Sandy photo-op, that storm becoming a humanitarian and economic crisis only after Obama had taken his victory lap and jetted off again to his campaign rallies, with such statesmanlike displays as urging his followers to vote for "revenge". Had he belonged to the unapproved party, Sandy would've been treated by the press and popular culture as Obama's Katrina, but because he's a Democrat, a leftist, and an Obama, he showed up for the first inning of a nine-inning ballgame and was acclaimed World Series champion, just in time for the vote.

Obama has become one of two presidents in all American history to win a second term with a narrowed margin in the popular vote and a shrunken share of the electoral vote. The other one to do it also was a Democrat, Woodrow Wilson, who won a second term in 1916 on keeping America out of the war, then proceeded promptly to take America all the way into it, conscription and Sedition Act and all. (In case you're curious, things went badly for Wilson's Democrat Party after 1916: they were reduced in the 1918 midterms from 53 seats to 47 in the Senate, and from 214 to 192 in the House, and they were shut out of the presidency 'til 1932.)

Barack Obama won his re-election with very many fewer votes than four years prior and very much shrunken margins, flopping over the finish line about 400,000 votes ahead of Romney in the four kingmaker states put together, with independents affirmatively voting to terminate his presidency, and on the strength of an "anti" campaign, "killing Romney" as per the explicit Obama strategy from the start, as opposed to presenting a program for the next four years. But now the course for those next four years is set, to wit:

Foreign affairs and war.

Afghanistan is a lost war and Obama has lost it, after throwing three times more dead American bodies at it in four years than Bush did in seven. The trouble with Obama's surge was that it wasn't a surge; it was something closer to the pre-surge policy in Iraq. And now Obama will withdraw from Afghanistan in what he prefers to conceive of as "ending the war", only, there is no such neutral alternative in war. Wars are won or lost, and very occasionally stalemated, but to withdraw from the field without achieving your object, and abandon it to the enemy, is what is called "losing a war". And Obama will be the president who lost the Afghan War; indeed, he's that already, but now that fact will be made plain.

Al Qaeda is running amok across the greater Middle East including especially Libya, and when Obama thumps his chest about "decimating" al Qaeda, I believe he is deluding himself or lying, because as commander-in-chief he has to see the reports of al Qaeda ascendancy, including in precincts that were until lately free of Islamist militancy. Al Qaeda affiliates have now killed an American ambassador and three other Americans, and sacked an American consulate, in what is arguably al Qaeda's greatest coup against the United States since the 9/11 attacks of '01. Obama failed utterly to act before that attack to defend against it, despite that the consulate in question had been attacked twice in the months before and that every man and his dog on the ground were pleading for security. Come to that, a good part of what little security they did have was withdrawn not long before the final assault. Obama then failed also to intervene in the seven-hour assault with the ready forces he had at his command. And finally Obama tried to make out that this al Qaeda-affiliated terror attack was some sort of movie review that got carried away, to borrow from Mark Steyn. There is real trouble, and real incapacity on the part of Obama to attend to it or even to recognize it.

Iran is four years closer to going nuclear than when Obama ascended the presidency. A nuclear Iran would be the Armageddon nightmare that's had people awake nights since the advent of the bomb in 1945, and Obama is very much more against action to forestall Iran's going nuclear than he is against an Iranian bomb.

Economics and finances.

The markets are in freefall as I write this. The Dow Jones gave up 313 points or 2.4 percent in the wake of the vote, for its worst crash of the year. And that's the second-worst selloff yet registered on the Dow following a presidential election, second only to the bloodbath of 486 points and over 5 percent, on the day after Obama's first election in '08. So there's progress.

The national debt under Obama's own optimistic FY2013 budget proposal would go past $20 trillion in four scant years, i.e. 2016, as Obama retires from the presidency, meaning that Obama would have doubled the debt single-handed, adding as much debt in one presidency as was added in the other 43 combined.

And about how much longer before America runs up against the credit wall? There's not capital enough in the world to finance this kind of debt, and the Federal Reserve on Obama's watch is presently into its third round of quantitative easing, i.e., printing American dollars to soak up some part of this uncoverable debt, which has the effect of debasing the dollar and making everything that much more expensive. Obama's Plans A, B, and C for resolving the debt crisis are to raise taxes on the rich, despite that the revenue from his proposed tax hikes would come to a drop in the bucket, and the top 10 percent of federal income tax filers have been carrying 71 percent of the federal income tax burden for some time already, at the Bush rates.

When the economists and business analysts have observed in this Age of Obama that "capital is sitting on the sidelines", what they've been getting at is that real-economy investment has been waiting and watching for a change in direction. The election has determined there'll be no such change in direction for four years more, and so that capital can only be expected to stay put or to flee for jurisdictions where the leadership doesn't treat businessmen as sort-of enemies of the state. There is no good reason to imagine that the economy in the next four years will be appreciably different from the last four years. Come to that, full implementation of Obama's greatest onslaughts against business, Obamacare and Dodd-Frank, the financial regulatory leviathan, was deferred 'til after the election, as if to prove beyond all doubt that they'd be economically crushing and politically toxic, so there's good cause to suspect the Obama economy has not found its bottom even yet.

Recessional remarks.

Dick Morris had by my lights the best line of the campaign. (Yes, I know Morris has come out of this badly inasmuch as he was projecting a world-beating Romney landslide, though to be fair he was going on historic averages of turnout among blacks, Hispanics, and young people, which was not an indefensible presupposition. In any event, Morris had a good line.) He said, Obama likes to tell about all the troubles he inherited as president; just imagine how he'll complain if he wins a second term and inherits this mess. 

I talked a long while a couple months ago with an old Marine, who said something I mostly set aside 'til election night, namely that if Obama were to be re-elected, the patience of the people would run out. And along those lines, Bill O'Reilly is no hack like me, and unlike me he never passes up an opportunity to extend Barack Obama the benefit of the doubt, so when he comes down against Obama with great force of conviction, I take notice, and I was frankly shocked to hear O'Reilly's pronouncement on Obama and his Democrat Party on the night after the vote. He said, if Obama hasn't got the economy rolling again in two years, it's the end for the Democrats, not for two years but for good. Even I wouldn't be quite so categorical as that -- I'd prefer "for a generation" -- but I thought on it, and O'Reilly is hitting on something there. This unending sort-of depression that we've got mucked down in has carried on for about half a decade now; Obama was elected to fix it, and instead he turned a recession into the next thing to a depression. If the desolate moonscape of this economy does not bloom with new growth, if our lives are kept on hold for not half a decade but nearly a decade as in the Great Depression, then the people and history will never forgive Obama.

Now to the usual refrain that this is "the demise of the Republican Party!" which we get every year the Democrats can claim a victory, from the press about as much as from Democrats: if ever there was a case to be made for that proposition it was in '08, when the Rs lost the presidency by half a dozen points and were reduced from minorities to smaller minorities in both houses of Congress, but a year later they were winning again even in statewide races in New Jersey and Massachusetts, and a year after that they had won arguably the greatest turnover in the century and a half of the Grand Old Party, taking into account the red wave in the statehouses and governorships. There really is no reason in 2012 to see some smouldering hole where the Republican Party used to be: Republicans held their big majority in the House, filled out their governorships to a nice round 30 of 50, and came within 400,000 votes in the four deciding swing states of knocking off an incumbent president.

And another thing. Since I was a boy, I've seen three two-term presidents, and I've observed in these presidential second terms a few common traits: nothing much gets done, with the prospect for accomplishment declining as the term progresses; scandals, sometimes from the first term, fester and pop; and that second term flies by. In a year and a half, we'll be into the midterms campaign, and it won't be long after the vote in November 2014 that things will turn inexorably to the presidential primaries, on both sides, with the president becoming in his final year a kind of afterthought, pushed aside first by the primaries and then by the general election. By maybe the spring of 2015, candidacies will be declared and so on, and the sitting, lame-duck president will begin to fade from our thinking. A second-term president gets closer to three years than four, effectively, and indeed sometimes he doesn't get even that much. It'll go faster than you know.

July 26, 2012

Betting on form for November 6

Never mind the polls and unemployment rates and even Harold Macmillan's "events, dear boy, events". If Barack Obama were to win re-election come November 6, he'd be only the second Democrat president to be elected to more terms than one since Franklin Roosevelt, back when Bing Crosby and the Andrews Sisters were tearing up the Billboard charts with "Is You Is or Is You Ain't (Ma' Baby)".

Bill Clinton was of course elected to two terms, though it has to be said that he was the beneficiary in his first election especially of an unusually strong third-party candidacy in Ross Perot. Perot split the anti-Clinton vote in 1992 and '96 such that Clinton could pass through to the White House with 43 percent and 49 percent of the popular vote. Obama has no third-party spoiler on Perot's order of magnitude to save him, and in any event Barack Obama is no Bill Clinton, having no truck with Clinton's Third Way, that more pro-business, incremental leftism which as an ideology has turned out to be nothing much more than a curiosity of the 1990s.

Jimmy Carter's offer for re-election in 1980 went sufficiently badly that he had conceded to Ronald Reagan before the polling stations on the West Coast were closed.

Lyndon Johnson served out the last year of John Kennedy's term and proceeded handily to win a term of his own in '64, but he was eligible per the 22nd Amendment for re-election and was the presumptive Democratic nominee until Eugene McCarthy finished seven points behind the sitting president in the New Hampshire primary of March 1968. By the end of the month, Johnson had uttered maybe his most famous remark, that "I shall not seek, and I will not accept, the nomination of my party for another term as your president." The Democratic National Convention that summer was a madhouse, the party was radicalized, and Democrats were banished from the presidency for seven of the next ten elections.

John Kennedy was of course assassinated about three years into his only term, so his case can only be left out of consideration here. Unfair though that may be, it just can't be said with certainty that he'd have won re-election, nor that he'd have lost, and so Kennedy is counted out for these purposes.

Which leaves the case of Harry Truman. Truman filled out all but a few months of Franklin Roosevelt's last term and won a term of his own in 1948, but he'd been exempted from the 22nd Amendment and was thus eligible for another kick at the can in '52. He wrote that he'd no intention of offering for re-election, but his name was on the ballot in the New Hampshire primary that March when Estes Kefauver won 55 percent to Truman's 44, and it was only after the Kefauver upset that Truman announced he'd be standing down. The Democrats chose Adlai Stevenson later that year and again four years after that, as their nominee to lose to Dwight Eisenhower.

On the Republican side over this same period were George W. Bush, George H. W. Bush, Ronald Reagan, Gerald Ford, Richard Nixon, and Dwight Eisenhower. Four of those six were elected to two terms, though Nixon didn't finish his second. Ford assumed the presidency to fill out that second Nixon term and a couple years later gave way to the Carter interregnum, but Ford doesn't exactly fit in this scheme on account of he wasn't elected in the first place. And so one is left with Bush the Elder as the only Republican president since Herbert Hoover in 1932 to be elected and not re-elected, and obviously he wasn't helped by the same 19-percent Perot phenomenon that smoothed the way for Clinton.

And the period from Hoover back to the advent of the Republican Party in the mid-19th Century is bleaker still for Democrats: James Buchanan, Grover Cleveland, and Woodrow Wilson constitute the totality of elected Democrat presidents in the three-quarters of a century between 1856 and 1932. There's a reason they call Republicans the Grand Old Party.

Come to that, the grand total of Democrat presidents to be elected to more terms than one, in the century and a half since the founding of the Republican Party, is four. And that counts Cleveland whose two terms were non-consecutive. Republicans have re-elected presidents as many times in just the last sixty years.

It may justly be said that none of this history or statistics is dispositive, but there is such a thing as betting on form.

April 15, 2012

The mystical awfulness of Troll 2

The story goes wrong very early on, when it’s explained that the Waits family is moving from the suburbs to the country community of Nilbog for a month as part of some rural-urban exchange program. Now, anyone who’s tried his hand at fiction will surely forgive a little shoehorning. Certain things must happen to advance the plot, certain characters must be in certain places at certain times and so on, and a little coincidence or implausibility is eminently forgivable when it’s in aid of a plot point. The trouble with this rural-urban exchange explanation is that it’s so needlessly implausible. We all know there is no such thing in America as a rural-urban exchange program, in which whole families uproot and displace themselves and switch homes with some other family for the sake, presumably, of mixing the rural and urban populations, like something out of the imagination of some mad mid-20th Century Communist central planner.

What’s worse is that there’s a compelling explanation for the move to the country that’s screaming itself hoarse and which could have tied up two other dangling ends at the same time. Apart from this business of the move, we are left with the still bigger questions of why the ghost of Grandpa visits the boy to warn him about this place called Nilbog and the goblins there, and why these goblins consume humans at all. Could the goblins not eat animals like the rest of us, or, that is to say, could they not convert animals to the vegetarian goop that they consume, as they do with humans? It’s explained at the outset that goblins “need no reason” for what they do, that they eat people out of sheerest evil, but that’s not terribly interesting. Come to that, why are the goblins so hell-bent on eating the Waitses, and what odds does it make to them which humans they eat?

So why, oh, why did it not occur to the filmmakers that they had three elements which made no sense at all, but which could be explained in one swoop, all of them together, and in a way that would make the story very much more compelling? The family goes to the country because their recently-deceased patriarch owned property there; the ghost of Grandpa visits the boy from beyond the grave because he has unfinished business with the goblins of his old Nilbog homestead; and the goblins don’t eat people for its own sake but to vanquish their mortal enemies. Maybe the goblins had been driven out of Nilbog long ago but regrouped and reclaimed the place after a long struggle, and maybe the Grandpa character was the last of the goblin-fighters and finally had to abandon the property at Nilbog to save himself, and now the Nilboggian goblins want revenge against their old nemesis, in consuming his surviving family. There, in one motion, the story would have made sense of the senseless and given some purpose to the thing to boot. Not to say it wouldn’t be fairly silly still, but it would at least not be both silly and nonsensical.

To the extent Troll 2 crosses from horror into something resembling science fiction, Jules Verne it ain’t. In order that these vegetarian goblins can at the same time be man-eaters, they must first feed the people they mean to eat some elixir which converts them into a vegetarian green goop. Now I know they’re doing great things these days in trying to make soybeans taste like hamburgers, but that’s the point: because soybeans are vegetables, anything that is made from them is necessarily vegetable; if a story is asking us to suspend our disbelief far enough to accept that humans can be converted to vegetable matter, it’ll have to give us some compelling explanation -- science or sorcery, but something big and something particular -- or else the whole business will come off as a joke. And it does. And to this day the Italian fellow who directed Troll 2 makes it painfully plain that it was not for laughs, and indeed he’s affronted that folks are laughing where he didn’t intend for them to.

The special effects are dreadful, but that’s not necessarily a strike against Troll 2, or anyway it’s not so much a strike against it in this 21st Century. A lot of us are sick to death by now of CGI and have a newfound appreciation for “real” faking. CGI FX are too often too perfect and too canned and too busy to move a lot of us in the viewing public, and so a puppet arm rigged with fake goblin blood getting the chop with an actual axe is really rather quaint and honest to us today. There’s a lot to be said for cheap and cheerful.

The dialogue comes off by times like it was translated from the German, and that’s not far off: the screenwriter is in fact Italian and not the most expert at the American vernacular.

The acting is said to be bad, but I’m not so sure there was anything in the way of acting in Troll 2 that couldn’t have been salvaged by the right material and direction and editing. Hold a shot for too long and even The Remains of the Day could be made to look unnatural and silly. And I defy any great actor to pull off lines and directions like were in that script. Let’s see Anthony Hopkins try and pull off the father role in the most representative scene: Father dumps boy onto bed and bellows, “You can’t pi** on hospitality! I won’t allow it!” whereupon the father goes for his belt and the frightened boy asks what he means to do, with the answer being, “Tighten my belt by one loop so I don’t feel hunger pangs!” The only way to come at material like that and make it work is to play it as a gag, but the director wouldn't hear of it.

I may be prejudiced on this point on account of I incline to what may be called the Roberto Rossellini school, that most anyone can act for the movies if they’re cast in the right parts and if the director handles them and their footage in the right way. They got the chap who plays the proprietor of the general store out of a mental institution, and he did a fine job, despite or maybe because of his being off his nut during shooting. Connie McFarland’s performance as the daughter is held out for special scorn, but I don’t see how her acting was anything worse than what qualified for Saved by the Bell at about that same time -- and Saved by the Bell is an American institution. Robert Ormsby as Grandpa was perfectly competent and absolutely grandfatherly. And I certainly don’t see why the father himself, good old George Hardy, shouldn’t get work as an actor today, provided of course that the roles call for a Southern accent.

Which brings us to the near-mystical question of what it is about this ridiculous Troll 2 that is so fascinating to so many. EPIX, the poor man’s HBO, has run Troll 2 along with its companion documentary Best Worst Movie by Michael Stephenson, who played the boy in the film. As this documentary goes on it becomes plain, slowly but surely and without ever spelling it out, how it was that Troll 2 turned out as it did.

At one point a Troll 2 appreciator in line outside one of the theatrical showings offers that Troll 2 is a movie that aliens might make if they'd been receiving our TV signals and tried to ape what they had seen, without properly understanding any of it. Well, that fellow may have been righter than he knew, for the director is a strange man and an Italian, who only just gets by in English, and the screenwriter happened to be his wife, also Italian and not the most proficient in English. This was an all-American horror picture circa 1990 as seen through the funhouse mirrors of strange Italian filmmakers. That explains it, really, and once you've worked that out then the fascination falls away a bit. Troll 2 is an American movie, and American popular culture for a couple decades up to 1990, as perceived by outsiders who don't have a very firm grasp on anything American.

Which wouldn’t be the first time Italian filmmakers aped an American film genre. I refer of course to the Spaghetti Westerns, which also turned out a bit off and not in a bad way.