Sunday, October 30, 2011

When We Eat, We’re Eating Energy

Herewith some startling numbers. A 2004 report by the Congressional Research Service (link) provides direct and indirect energy consumed in agriculture, including crop and livestock production. The numbers are in quadrillion BTUs, and the answer is that we consumed 1.7 quadrillion BTUs of energy: 1.1 quadrillion in direct uses and 0.6 quads in indirect uses, thus for fertilizers and pesticides. A USDA Factbook (link) tells us that “The aggregate food supply in 2000 provided 3,800 calories per person per day.” The (late) 2012 Statistical Abstract (link, Table 2) tells me that the 2002 population of the United States was 288.1 million people. So let us combine these numbers. Below, the CRS data shown graphically:


Now one food calorie, also known as a kilocalorie, is worth 3.968320721 BTUs. Put another way, it takes 0.251995761 food calories to make a BTU. So we can convert calories to BTUs or BTUs to calories. A quadrillion is 10^15. Here is what it looks like: 1,000,000,000,000,000. So let us proceed.

If we apply year 2000 calories per capita to the 2002 population, we discover that in that year agriculture provided 1.1 trillion calories per day, or 399.6 trillion calories a year, to the total population. We can render that number into BTU-equivalents either by multiplying it by 3.968… or by dividing it by 0.25199…. Either way we get 1.584 quadrillion BTUs. That was the energy output of our agriculture in 2002, measured in quads of BTUs. So what did we expend in energy to get that output? What was the energy input? It was 1.7 quadrillion BTUs.



To get a ratio here, we take the input and divide it by the output. The result is 1.072. In other words, we expended 1.072 calories of industrial energy for every calorie that we could eat. But the energy ratio calculation is not yet done.

We learn from this story in the Atlanta Journal Constitution, by William G. Mosely, co-author of a 2010 National Academy of Sciences study, “Understanding the Changing Planet: Strategic Directions for the Geographical Sciences,” that in addition to direct and indirect energy uses, an equivalent amount is used in processing and packaging of food. The two, agricultural production on the one hand and processing and packaging on the other, each account for 7 percent of national energy use, according to Mosely. And then another 3 percent of national energy consumption is used in food distribution.

Using these additional inputs, it turns out that in 2002 we used total energy of 4.13 quads as input for that 1.58 quads of caloric output. The cost now turns out to be 2.6 units of energy in for each unit of edible energy out.
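For readers who want to check the arithmetic, here is a minimal sketch in Python that reproduces the numbers above. The scaling of the distribution share (3 percent of national use against 7 percent for production) is my reading of the Mosely figures, not something stated in the CRS report.

```python
# Reproduces the food-energy arithmetic above. Population, calorie, and
# BTU figures come from the sources cited in the post; the 3/7 scaling
# of the distribution share is my reading of the Mosely percentages.

BTU_PER_KCAL = 3.968320721           # one food calorie (kcal) in BTUs
POPULATION_2002 = 288.1e6            # persons
KCAL_PER_PERSON_PER_DAY = 3800       # USDA Factbook, year 2000

# Edible output of agriculture, converted to quads (10^15 BTU)
kcal_per_year = KCAL_PER_PERSON_PER_DAY * POPULATION_2002 * 365
output_quads = kcal_per_year * BTU_PER_KCAL / 1e15

# Energy inputs, in quads
production = 1.7                     # CRS: direct (1.1) + indirect (0.6)
processing_packaging = production    # "an equivalent amount" (7% vs 7%)
distribution = production * 3 / 7    # 3% of national use vs 7% for production
total_input = production + processing_packaging + distribution

print(f"caloric output:        {output_quads:.2f} quads")          # ~1.59; the post rounds to 1.58
print(f"production-only ratio: {production / output_quads:.2f}")   # ~1.07
print(f"total input:           {total_input:.2f} quads")           # ~4.13
print(f"full ratio, in/out:    {total_input / output_quads:.2f}")  # ~2.60
```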

Therefore it is absolutely true that we are eating energy in this modern day and age. This should make us very alert when we think about World Population. It has now reached 7 billion and is projected to reach 8 billion in 2050, a time by which we shall have pretty much consumed currently known hard reserves; only the shale stuff will be left to take us to the next billion-person addition to the human family.

Added later. Thanks to russel’s comment on an earlier version of this post, I realized that I was using the wrong kind of calories, not the kilocalories used in food. This post has therefore been updated to correct that very big error. I appreciate the correction.

Wednesday, October 26, 2011

Shale Oil and Gas: The Years They Add

The real hot topic in energy these days is not wind or solar. It is shale. By way of an example, today’s New York Times carries a story titled “The Energy Picture, Redrawn.” The focus is on shale oil and shale gas. As is usual in such coverage, the broader context provided by current reserves and trends in consumption is not highlighted at all. The graphics suggest, instead, plenty of both for a long time to come.

By way of a useful footnote to news coverage, I’ve undertaken to calculate for you just how long current proven reserves will last and how much longer the fossil age will last assuming that projected shale reserves actually pan out.

Here is a graphic that tells the tale. After that I’ll tell you how I did it.


What we see here is that without exploitation of global shale oil reserves, oil will run out by 2037. If we succeed in exploiting all reserves of shale oil, we get another 44 years, and the world runs out of oil in 2081. The same goes for natural gas. Current conventional world gas reserves are seen to last longer (until 2048), but exploiting all shale gas will only extend gas use by 25 years, to 2073.

What I did was to assemble three categories of data: (1) Current reserve estimates; I chose the World Oil estimates (link) because they are the highest. On the oil side these already include some portion, but not all, of the Canadian tar sands—which are not part of the shale projection. (2) Shale oil reserves, from this Wikipedia compilation, and shale gas estimates from the Energy Information Administration (link). (3) Oil and gas consumption estimates from the EIA (link, link). For oil I calculated the consumption growth trend from 1982 through 2008 and extended that trend into the future; the growth is at a rate of 1.4 percent a year in barrels. For gas, the EIA provides forward estimates out to 2035; the growth here is at a rate of 1.6 percent a year, measured in trillions of cubic feet. These I extended at the same rate into future years as well.

Having projected consumption out through 2100, I calculated, first, how rapidly known reserves would be consumed. Next, I added shale reserves to current reserves and did the calculation once again. The results are charted above.
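The mechanics are simple enough to sketch in a few lines of Python. The reserve and consumption figures below are illustrative placeholders, not the actual World Oil and EIA values behind the charts; the method, however, is the one just described.

```python
# Sketch of the depletion calculation described above: grow annual
# consumption at a fixed rate and find the year in which cumulative
# consumption exhausts the reserve base. The reserve and consumption
# numbers are illustrative placeholders, not the World Oil / EIA values.

def depletion_year(reserves, annual_use, growth_rate, start_year=2011):
    """Return the first year in which cumulative use exceeds reserves."""
    year = start_year
    while reserves > 0 and year < 2200:      # 2200 is just a safety stop
        reserves -= annual_use
        annual_use *= 1 + growth_rate
        year += 1
    return year

# Illustrative oil case: reserves in billions of barrels, use in
# billions of barrels per year, consumption growing 1.4 percent a year.
conventional_reserves = 1_350
shale_reserves = 1_000
current_use = 32

print("conventional only:", depletion_year(conventional_reserves, current_use, 0.014))
print("with shale added: ", depletion_year(conventional_reserves + shale_reserves, current_use, 0.014))
```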

Part of the downside of loudly cheering shale reserves is that it lets the public fall back into an easy slumber—of ignorance. Shale is not a long-term solution. First of all, both resources will cost a lot more to exploit than conventional crude oil and natural gas. Exploitation will have huge environmental consequences. And current estimates may well be optimistic.

The next two graphics show the same relationships in quantitative forms, the first for oil, the second for gas.



Desalting the Sea

Desalination has long been an interest of mine—entirely due to chance and personal history. My first assignment at Midwest Research Institute, which actually marked the start of my career as an analyst, was to write a report on that subject for a client. Back in those days half the people in the field still called it desalinization. The change in name had humorous aspects. It was the big joke at the first industry conference I ever attended. Yes, on this subject. Those were the days of Nikita Khrushchev—and a thaw in Russia. The name of the conference was changed at the last minute at the request of a Russian delegation; the Russians had requested it because the old word was too close to destalinization—which was too hot a topic in Russia. Or at least that is what the talk and snickers at the conference were all about…

Now today comes an interesting story in the New York Times. It’s about a $4 billion desalting plant built in Tianjin in China (quite near Beijing). The thrust of the story is that it costs the Chinese more to desalt the water than they receive for it—thus they are desalting at a loss. “In some places,” says the Times, “this would be economic lunacy. In China it is economic strategy.” This neatly summarizes the differences between worshippers of the Hidden Hand and users of the Human Hand. China subsidizes technologies with an eye on the future—while we bow heads praying to Adam Smith.

Some interesting factoids here, some of which the Times includes. One is that the technology used by the Chinese is from Israel, entirely imported; it was just assembled in Tianjin. Another is that the project is owned and operated by a state-owned conglomerate called SDIC (State Development & Investment Corporation). What the Times omits is that SDIC is a rather sizeable venture with 2010 revenues of 64.6 billion renminbi ($10.3 billion). SDIC also achieved profits of RMB 6.8 billion ($1.07 billion). That’s a profit margin of 10.4 percent. Not bad, actually—and that includes whatever red ink the desalting operation spilled on SDIC’s books.

Someday “third world” will come to mean countries that still hew to the superstitious worship of old secular gods. If we don’t convert to the new secularism, we might become denizens of that third world ourselves.

Tuesday, October 25, 2011

The StatAb Is Dead, Long Live the StatAb

The Statistical Abstract of the United States (1878-2011) is now officially dead. We’ve managed 131 editions of this book. We’ve managed to produce it right through World War I, the Depression, World War II, the Korean War, Vietnam, and on—until now budget cuts have killed it. Here is the official announcement from the Census Bureau (link):

The U.S. Census Bureau is terminating the collection of data for the Statistical Compendia program effective October 1, 2011. The Statistical Compendium program is comprised of the Statistical Abstract of the United States and its supplemental products — the State and Metropolitan Area Data Book and the County and City Data Book. In preparation for the Fiscal Year 2012 (FY 2012) budget, the Census Bureau did a comprehensive review of a number of programs and had to make difficult proposals to terminate and reduce a number of existing programs in order to acquire funds for higher priority programs. The decision to propose the elimination of this program was not made lightly. To access the most current data, please refer to the organizations cited in the source notes for each table of the Statistical Abstract.
Never mind. Long live the StatAb. The spirit it represented is still alive and well. It was an effort to empower a well-informed citizenry with straight facts—straight from their own Census Bureau—with lots of help from other stellar statistical agencies of the United States government. Those people are still there, although diminished in numbers. We salute you and join you in bowing our heads. The StatAb Shall Rise Again!

Monday, October 24, 2011

Real and Unreal “Work”

“Work” has a positive connotation when it means creating something or getting some necessary task accomplished—be that using hands-and-bodies or using only hands-and-mind. The “creation” here, needless to say, is actually a kind of transformation, be it of physical substances or of ideas. For many people in corporate life today “work” has come to mean something quite different. The actual doing itself happens outside the corporation. The employees’ job inside is to get other people to do the work—and the pressure is on, Big Time, to get the work done at ever lower prices. That’s what words like out-sourcing, farming-out, and contracting signify.

This sort of thing began with the industrial revolution—which itself began with textiles. So-called producers obtained cotton, flax, and wool. They farmed these out to individuals who spun the yarns in a highly fragmented cottage industry. Then the yarns were farmed out again, to a similar network of individuals working on their own, to be woven into fabric. The fabric, in turn, was farmed out to individuals who cut and sewed to the producer’s specifications, again paid only for the work accomplished, never for their time. Those who worked had virtually no power—many though they were in number. When we arrived in America as immigrants, the first job my mother held was in a textile operation where she was paid by the piece. Yes, she was an employee, but this practice, piecework, harked all the way back to the dark beginnings of industrialization. In this structure producers did no work and workers had but a minimal economic share in the manufacturing process.

In the evolution of industry there was a period—we’re still largely in that period—when workers were hired, paid for their time, had a stake in the enterprise, and also had a role in product design and creation (as defined above). But now we are gradually going in the other direction again. In many corporations, only tiny minorities of employees actually do work; they are either at the lowest administrative levels or they are engineers-designers who specify the work to be done. Its actual accomplishment is sourced out. Most employees do unreal work: they must find, qualify, negotiate with, and pressure outside agents who will then do the work specified.

Is it real work to lean on suppliers to produce to spec at ever lower prices? Or is that something else? What pride arises from being able to say: “I got the bastards to move up the deadline and to swallow the new price. They hated it, but I got the signature.” Now those who have to do this sort of work are themselves but peons, are themselves under pressure, and the more they succeed, the more the pressure mounts. They, but indeed also the people who live in more square feet above them, are all slaves of the Devil himself who wants infinite expansion of meaningless numbers to enrich the very few who only talk to lawyers and no longer actually really need the wealth.

History has cycles. Times of equilibrium, when justice more or less reigns, are transitory. Then the cycles either turn up—justice increases—or they turn down—the reverse effect. Our times now trend downward. May the process accelerate. May Modernity soon die its soul-less death. Here and there the change upward is sometimes glimpsed, but the instances are still too few. May they multiply.

Sunday, October 23, 2011

Naked but Eating Peanut Butter

In the category of food, next to my daily bread I most value peanut butter. I waxed eloquent on that subject two years ago on the old LaMarotte (link); there I called America the Zone of Heavenly Oils and concluded the post by saying: “One cannot praise peanuts, and peanut butter, frequently and ardently enough.” So I am prejudiced.

The other day I had occasion to note here that Global Warming was threatening coffee and cocoa—two more natural products I’d add to the list of the Indispensables. Elsewhere (link) I have suggested that in a forced choice between coffee and chocolate on the one hand and gasoline on the other, I would choose coffee and chocolate. Now we learn that (groan! wail! sob!) peanuts are under attack—and possibly from the same cause. Drought has devastated the peanut crop—and plantings were, in any case, already down. Why? Well, farmers prefer to grow corn and cotton because these crops yield higher profits. So is it to be cotton or peanut butter? My choice is already made. I’d gladly go naked if I could still have my jar of Smucker’s Natural Peanut Butter. And I’ll gladly walk to the store to get it, lacking gasoline, so that I might buy Taster’s Choice and Lindt’s dark chocolate while I am out—but with an old newspaper draped around my privates, tied on with my last cotton string.

Saturday, October 22, 2011

Employment’s Best and Worst: September

The Bureau of Labor Statistics (link) has published the latest changes in employment and unemployment between August and September. One table in the release shows states with statistically significant employment changes. I’ve mapped these courtesy of Nations Online Project’s free outline map:


Light brown indicates states losing employment (New Mexico, Wisconsin, Michigan, Ohio, Pennsylvania, New Jersey, New Hampshire, and North Carolina). Purple shows those gaining employment at statistically significant rates (Hawaii (not shown), Arizona, Louisiana, Mississippi, Florida, District of Columbia (not shown), and Maine).

The vaguely discernible pattern (if there is any pattern) is that the heavy industrial states are not exactly recovering and that what little significant recovery is taking place is mostly southerly.

Friday, October 21, 2011

Suddenly NGDP?

Economists who want to kick-start the economy want the Federal Reserve to do it. Why the Fed? Because our government, which controls the fiscal purse-strings, is hopelessly deadlocked. That only leaves monetary policy, the Fed’s area of action.

Pots are all a-boil, a-bubble on economics blogs. People want a relatively new idea implemented. It’s called NGDP targeting. The N here stands for nominal. Nominal Gross Domestic Product. That really means GDP as measured in ordinary, current dollars, dollars valued as we use them right here, right now. The conventional way to measure GDP is in real, meaning constant, dollars—dollars with inflation removed. GDP data are collected in nominal dollars (of course) but expressed in constant dollars—so that any two periods may be compared. We want to know what really changed: how much the production of goods and services increased or decreased. We have to get rid of pure price increases, and the Consumer Price Index lets us do that.

Now why do the economists want NGDP targeting? Why not RGDP targeting? The answer is very simple. The Fed has absolutely no way of influencing real GDP. But it could influence NGDP. Let’s start with the reasons for that.

When we measure nominal GDP in two successive periods, the change we see is caused by two components hidden, as it were, inside that nominal dollar. Part of the change is actual growth in products and services delivered. The other part is due to increase in prices, inflation. Thus—

NGDP growth = Real growth + Inflation.
RGDP growth = NGDP growth - Inflation.

Suppose that nominal GDP grew by 5.5 percent and that inflation was 1.12 percent. Then real growth was responsible for 4.38 percentage points and inflation for 1.12. Real GDP growth would therefore be 4.38 percent in the period.

Let’s look next at how the Fed comes into this. Its role emerges when we note that inflation itself has two parts—although the CPI only measures the two together, indistinguishably mixed. One part is due to increases in prices; the other is due to increases in the money supply. And the Federal Reserve controls this second part. Let’s see that equation again:

NGDP growth = Real growth + (price inflation + increase in money supply)

When we see it this way, we start to understand the advocacy of NGDP targeting. The Fed can cause the money supply to grow or to shrink, and when it does this, it can influence NGDP. The Fed’s long-standing policy has been to control the aggregate, inflation. It does so by causing the part that it controls to change. The Fed targets inflation as a whole. It tries to keep that measure within bounds—without causing a recession.

The Fed has three ways of influencing the money supply. The first, and the most important, is (1) the Federal Funds Rate, thus the interest charged on banks’ borrowing of federal funds; low rates cause more borrowing, and therefore the money supply grows. The others are (2) lowering or raising bank reserve requirements; lowering them increases the money supply, raising them decreases it, because with low reserve requirements banks have more cash to lend; and (3) buying or selling Treasury securities; buying these from banks or securities dealers puts cash into the economy, selling them draws cash away.

How would NGDP targeting work? Instead of targeting the inflation rate, the Fed would announce an NGDP growth rate as its target. To hit that target, it would take (or forgo) monetary actions. With the growth-rate target widely known, bankers, investors, indeed the whole economy would know in advance what the Fed would do. This is known as “expectations,” and the more accurate these are, the more rational economic behavior becomes (thus our economists).

To use the example above, suppose the Fed had set a goal of 6 percent NGDP growth but actual NGDP grew 5.5 percent (as above). With targeting in place, and publicly known, the Fed would now attempt to increase inflation by 0.5 percentage points, from 1.12 to 1.62 percent, by one or a combination of the monetary actions described above. The real growth (4.38 percent) plus the new inflation rate (1.62 percent) would then deliver the targeted 6 percent NGDP growth.
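For the numerically inclined, here is a minimal sketch of that arithmetic in Python. It is a toy illustration of the targeting rule, not a model of how the Fed’s tools actually deliver the extra inflation.

```python
# Toy version of the NGDP-targeting arithmetic in the example above.
# Real growth is taken as given; under the rule, the Fed's job is to
# supply whatever inflation closes the gap to the announced target.

ngdp_target = 6.0      # announced NGDP growth target, percent
real_growth = 4.38     # real GDP growth, percent
inflation = 1.12       # actual inflation, percent

actual_ngdp = real_growth + inflation        # 5.50 percent
shortfall = ngdp_target - actual_ngdp        # 0.50 percent
new_inflation = inflation + shortfall        # 1.62 percent

print(f"actual NGDP growth: {actual_ngdp:.2f}%")
print(f"inflation needed:   {new_inflation:.2f}%")
print(f"check: {real_growth + new_inflation:.2f}% vs target {ngdp_target:.1f}%")
```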

Similarly, if inflation suddenly increased, say because oil prices spiked, the Fed would not immediately take steps to counter the increase—the usual tendency when inflation is the target. The Fed would know that those high prices will also dampen real growth. It might therefore let the inflation take its course—or only mildly counter it.

Advocates of NGDP targeting foresee major benefits from this more sophisticated approach. My feeling is that they simply want inflation increased now, somehow, and NGDP targeting is one way to do it. And that is because they don’t see the Administration succeeding in passing fiscal stimulus bills.

Wednesday, October 19, 2011

Maternal Mortality Rate

This post was prompted by an article we read on Common Dreams by Johnny Barber, “Whatever Happened to Women and Children First?” It is reachable here and worth reading if you’re interested in Afghanistan.

I noted in that article that the United States ranks poorly in this category. I thought I would look into that. Using 2008 data from a report jointly sponsored by WHO, UNICEF, UNFPA and The World Bank (link), I’ve constructed a table that shows where the United States ranks. The report is titled Trends in Maternal Mortality, 1990 to 2008. Forty-nine countries have a better Maternal Mortality Rate (MMR), 120 worse. The MMR is expressed as maternal mortalities per 100,000 live births. The U.S. has an MMR of 24, as does Saudi Arabia. Here is the tabulation:


Striking here is that the countries nearest to us but still better are not in the industrialized category and are predominantly Muslim countries. Those below us (Uruguay at 27, Lebanon at 26, Fiji at 26, and Chile at 26) are also industrially modest, except perhaps Chile.

Countries with large populations, and thus great diversity, tend to do worse than small and developed places. Russia comes in at 39, China at 38, and India at a high 230. Greece has the lowest MMR and Afghanistan the worst: a staggering 1,400 mothers die for every 100,000 babies born there.

My source also shows 1990 data. Of the 51 countries shown in the listing, 42 had improved their MMR in the 1990-2008 period, one was unchanged, and 8 had a worse MMR. The United States fell into this last category. In 1990 its rate had been 12! We have slid back.

This has implications for the health insurance systems that countries have in place. No doubt about it, we have the best health care in the world, but its reach is obviously short; else we wouldn’t rank at the bottom of the developed world. In countries where the systems are well organized, wealth is present, and the health care is modern, the results outperform ours, in many cases by a three-fold measure. Take for instance those with an MMR of 8 in the table. Yes, the list includes France, Finland, and Australia. But it also has tiny Malta and Qatar, Serbia, and the Czech Republic. Outperforming the U.S. of A. Who’d have thought it?

Tuesday, October 18, 2011

The Mind Behind Windows

I’ve been aware of this for decades now, but two coinciding events reminded me of it again. One was public, the other private. Steve Jobs passed away. And by chance/circumstance, Frederick Brooks’ The Mythical Man-Month fell into my hand once more and I was reading in it again. There, on page 260, in an addendum to the book that had come with the 1995 edition, came the reminder.

The man who actually invented the graphical user interface (GUI—Windows) was Douglas C. Engelbart (1925-). He also invented the tool indispensable for using Windows, the computer mouse. Engelbart unveiled the first demonstration in 1968 at the Fall Joint Computer Conference. He and a team at the Stanford Research Institute had produced the first GUI. His ideas were later developed by Bob Taylor and his team at the Xerox Palo Alto Research Center. That is where Steve Jobs saw the system demonstrated for the first time during a visit. He in turn succeeded in producing his own version, initially for the Apple Lisa which, per Brooks, was too slow to make the most of it. Next Jobs introduced it on the Macintosh (1984), and, as people say, the rest is history. Microsoft simply had to have its own version of that. Enter Windows.

This, in a way, is a morality tale, a tale of “it takes many people,” a tale of creativity, of enterprise all rolled into one. The genuine creators are sometimes the least known. The noblest of them take that in stride and are content to have received the gifts that rain down from on high.

Monday, October 17, 2011

Charitable Giving

I woke up this morning—who can say what prompts one’s dreaming—wondering whatever happened to that, you know, the big, the prominent charitable organization, the one that I used to be proud to give to, at work—because, of course, every fall, there would be this big drum-beat and promotion and everybody renewed his or her subscription. I was halfway through my first cup of coffee before I had the name. The United Way. The United Way of America. My impression was that this organization—and if not the organization then its annual campaign—had virtually disappeared. In turn that then prompted me to have a look at charitable giving in the United States—and what it might look like in these sticky times.

The first pair of graphics shows the source of private philanthropic dollars. I show two pies, the first for 1990 and the second for 2009. These represent $101.4 and $308.8 billion dollars respectively. (Click to enlarge, Esc to return again.)


The notable change here is the shrinkage of individual contributions in this period from 80 percent to 75 percent of the total. Corporate giving has also shrunk. The slack has been taken up by foundations. Next, let’s look at how these moneys were allocated.


The big kid on the block is Religion. It tops all other recipients of voluntary giving—but notice its sudden downward slide in the Great Recession. I’ve singled out four other categories by type and prominently marked Human Services, the category under which United Way falls. The bars labeled All Other include (in order of importance) Public and Societal Benefit (where, for instance, contributions to PBS might fall), International Giving, and Environment. I’ve omitted foundation grants to individuals and the catch-all “Unallocated” category. The graphic thus accounts for 79 percent of the $309 billion we managed to produce in 2009.

Let me next show you an interesting chart graphing how the top ten organizations that provide Human Services fared in 2009. They are arranged in this graph in order of their size. United Way is at the top. All of these organizations live on cash contributions by the public, augmented by some amount of grant income. 2009 was a bad year, as we note. Six of the ten showed decreased contributions. The dramatic contrarian, with a 66 percent increase in contributions that year, was Catholic Charities USA.


Finally, here is a graphic that shows United Way’s own performance from 1999 through 2010. The Great Recession has had a painful impact on the leading charity that helps many, many local charities to stay in business. The down-turn in 2005 may well have been the result of a scandal that unfolded in 2004. United Way’s CEO for the Washington Branch was convicted of misuse of funds that year—and the news of that may have hurt the agency the following year. I’ve stitched this graphic together from multiple sources as shown last.


It’s instructive to look at charitable giving at times like these. What is the general situation out there? Well, incomes are declining. Not surprisingly, individual contributions are dropping as a share of total contributions. Income inequality is increasing. Public welfare funding is under pressure. Thus it is naïve to imagine that voluntary giving will take up the slack as public funding declines. Yet the absence of jobs, strained incomes, and the “jobless recovery” are all increasing the need for privately provided charitable services.

Data sources:
  • First three graphs: The 2012 Statistical Abstract (link); select Table 580.
  • Top 10 in 2009: The Chronicle of Higher Education (link); it cites the “go to” source for the data in this field, The Chronicle of Philanthropy.
  • United Way Private Income: Data for 2005-2010 are from AggData (link), which enables a download of a free Chronicle of Philanthropy database. The 1999 data come from Funding Universe (link), the 2000 and 2001 data from Chronicle of Philanthropy news stories, the 2004 data from Wikipedia (citing the Chronicle of Philanthropy), and the missing years I extrapolated.

Sunday, October 16, 2011

Coffee Growing Regions of the Globe


News surfaced yesterday, offered by Starbucks, that the long-term viability of coffee is threatened by Global Warming. Herewith an image of the world’s coffee-growing regions courtesy of Coffee Tea Warehouse (link). The regions marked in yellow are the top ten coffee growing regions, their leader being Brazil. Coffee-growing is an equatorial sort of pursuit, the activity falling between the Tropics of Cancer and of Capricorn. From the National Coffee Association of America, Inc., I have it that 125.2 million bags of coffee (each weighing 132 pounds) were produced in the 2009/10 season. Coffee production has been trending down. According to the International Coffee Organization (ICO) the Finns are most addicted to coffee, consuming nearly 10 kg per person (data for 2000). Sweden, Switzerland, Germany, France, Italy, and Brazil all outrank us here. Our own consumption is put at 4 kg per capita by the ICO.

Saturday, October 15, 2011

Dennis Ritchie, R.I.P

Dennis Ritchie (1941-2011) died on October 12 at age 70. He is an honored member of the tribe of computer programmers, indeed a very visible member of it, but, such is that tribe, he is largely only known within the fold. He is the designer of the C programming language and co-developer (with Ken Thompson) of the UNIX operating system.

Other than assembly languages, I’ve written programs in Basic, Cobol, C, C++, and Pascal. Of these my great favorite was and is Pascal, designed by the Swiss computer scientist Niklaus Emil Wirth, who is still alive; at 77 he is closer to me in age, two years my senior. Ritchie, in our view, was a mere youngster. Pascal, of which the DOD’s Ada is an extension, is a very strict, orderly language—while C and C++ are both freer and more flexible. I always liked the wordy formalities of Pascal—perhaps because I am a traditionalist. But writing in C or its extensions became obligatory in order to write fundamental code for Windows. I made the transition easily enough—in large part because C permits you, if you want to, to maintain a Pascal-like structure in your coding. C takes a lot less keying, but it is more difficult, later on, to read. And I appreciated that flexibility.

Just recently, on Ghulf Genes, I’ve had occasion to mention Frederick Brooks’ book, The Mythical Man-Month, in the context of praising programming as a preeminent practice of creativity (link). Frederick Brooks was the developer-in-chief of IBM’s OS/360 operating system, thus another prominent figure in the clan. And Dennis Ritchie, who wrote both an operating system that underlies the Internet and probably the leading computer language, was, you can be sure, one of the great creative figures of our time—albeit at a level of near-invisible humility which, no doubt, made his entry through the Pearly Gate as swift and clean and sure as the code we write today in C and its descendants, not least among them Java, the programming language of the Internet.

Friday, October 14, 2011

Compensation Lags Productivity - or Worse

I’ve reported on this phenomenon once before, on the old LaMarotte as well. Here is an update using a 2005 base point set to 100. What I am showing below is an index for both Productivity for the Nonfarm Business Sector as a whole, measured in output per hour of work, and Real Compensation (thus inflation adjusted) for the same sector.


What this shows is that compensation only ever weakly follows advances in productivity—and in some years when productivity advances, compensation declines (2008 and 2011 2Q). In the current period, the gap between the two has been widening, as shown in the inset graphic. I’ve argued elsewhere that productivity is a double-edged sword. It can be increased, in effect, by farming out the production of unfinished or semi-finished components to others—or by letting some of the labor be done by contractors in a service business. An example of that in the medical field is to have Indian contractors evaluate test results sent to them by electronic means. Making the case for this is not a simple process; therefore I refer the reader to a close look at how this comes about that I presented on the old LaMarotte (link) some time ago.

This and some other recent presentations, alas, simply point to the fact that when you “go global” economically, you might well be “going postal” on the people who must live locally. Change must be massive, systemic, and—given the scale of economies—must be led by government.

I got the data shown here using this Bureau of Labor Statistics facility (link). Note, to see graphs better, click on them. To get back smoothly, press Esc.

Thursday, October 13, 2011

Why GDP No Longer Measures Welfare

The other day the New York Times wondered why income is falling with the recession officially over. I suggested then—and I’m far from alone in this—that the Gross Domestic Product is no longer an accurate way of measuring the general welfare of the American population. In the earlier post (link), I showed growing inequality in income as one of the interesting indicators. It suggests that while GDP is a good measure of the welfare of the top quintile of the population, it does not measure welfare overall.

Today I’ll contrast the GDP-performance with Jobs-Performance in our economy over an extended period of time. I will show the two in raw numbers and in index formats. First the raw numbers:


[Note: Clicking through enlarges image. But changes introduced—by Google? others?—now do not bring you back to the post if you click on the Back Arrow. Instead, when wishing to return, press Esc.]

Total employment is shown here in thousands, GDP in constant 2005 dollars. Both were growing in the 1939-2010 period, but as the annotations show, the growth of employment advanced at a rate of 2.1 percent a year while the growth of GDP was 3.6 percent annually. The two curves, therefore, gradually converge.

Now this pattern of growth suggests that productivity may be responsible for the different rates of growth. True. In 1939 each employed person generated $34,978 in GDP (GDP divided by employment). In 2010, each employee generated $100,818 in GDP. These being constant dollars, productivity, measured in this “gross” manner, increased just a shade under three-fold. And here the negatives associated with productivity appear. GDP, which we casually associate with the general welfare, requires fewer and fewer people. If productivity had not advanced, thus had stood at 1939 levels in 2010, we would have had to employ three times as many people, 374 million versus 130 million. We don’t have that many workers, to be sure, so productivity has a positive aspect too. But why then is income falling? Because productivity gains are not shared with the laboring masses. Income growth is always lower than growth of productivity.
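Here is the arithmetic behind those statements, in a short Python sketch that uses the figures just given (130 million employed in 2010 is a rounded value):

```python
# GDP-per-worker arithmetic from the paragraph above. GDP per worker is
# in constant 2005 dollars; 130 million employed in 2010 is a rounded value.

employment_2010 = 130e6
gdp_per_worker_1939 = 34_978
gdp_per_worker_2010 = 100_818

gdp_2010 = employment_2010 * gdp_per_worker_2010          # ~13.1 trillion

productivity_multiple = gdp_per_worker_2010 / gdp_per_worker_1939
workers_needed_at_1939_rate = gdp_2010 / gdp_per_worker_1939

print(f"productivity multiple, 1939 to 2010: {productivity_multiple:.2f}")   # ~2.88
print(f"workers needed at 1939 productivity: "
      f"{workers_needed_at_1939_rate / 1e6:.1f} million")                    # ~374.7; the post rounds to 374
```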

The growth rate in employment has also slowed. In the 1990-2010 period, it was 0.9 percent a year—over against GDP growth in that same period of 2.5 percent. The difference between these two rates (1939-2010=1.5%, 1990-2010=1.6%) has been increasing.

The bottom line is that general welfare means employment for all who seek it; GDP growth means fewer and fewer jobs. You draw your own conclusions.

The next chart sharpens the view of GDP and employment growth. It sets both at 100 in 1939—and then shows changes from that base into the future. What you see is the marginalization of employment over time. GDP rests more and more on goods produced either by machines or by labor overseas. China should watch our GDP and rejoice. We, here at LaMarotte, will keep staring at the jobs numbers instead.

Wednesday, October 12, 2011

The Rich Get Richer

Yesterday I mentioned growing income inequality—in the aggregate. Today another look at that subject. Herewith a graphic on the changing distribution of income by each fifth of the U.S. population of households. These data show shares of income and how they’ve changed from a base of 1990 at decade intervals. The data come from this Census facility (link); select Table H-2.


Notable here is that the share of income of the four lower quintiles has dropped—and that the top quintile has gained share consistently. The rich are getting richer, the poor poorer. In 2010, the four lower quintiles, thus 80 percent of households, together earned less than the top quintile.

Evidently this erosion of economic power has not yet translated into any kind of political reaction. The two top quintiles, representing 40 percent of the voters and having average incomes of $79,000 (fourth) and $170,000 (fifth), are numerous and influential enough to produce an electorate roughly equally divided between the parties. How long will this last? Or is it that people do not vote based on their economic interests? If they did, an administration much more concerned with equitable income distribution would be a shoo-in.

Tuesday, October 11, 2011

An Active Sun


The graphic shows the sun today, courtesy of NASA’s Solar Dynamics Observatory/Atmospheric Imaging Assembly (link). At least eleven sunspots are active here. Sun images are always awesome, and especially so when the sun is on its way to a so-called solar maximum. We’re on our way to such a situation now. An earlier post here (link) gives you some more insight into the sun’s cycling life.

I like to check on the sun from time to time. Behind us now is one of the quietest periods in recent solar history. In 2009, for instance, the sun was spot-free on 260 days (71 percent of the time). The new activity made itself manifest last year: 51 spotless days (14 percent). This year so far, and we’re quite a good ways into the year, indeed we see its end approaching, we’ve only managed 2 spotless days. That’s two, not a typo, thus less than 1 percent of the time.

Monday, October 10, 2011

Household Income

Newspapers and other media yesterday reported that median household income has dropped again as of June 2011. The New York Times headline was, “Recession Officially Over, U.S. Incomes Kept Falling.” The number actually cited, $49,909 in real dollars, was produced in a special study. That number is somewhat higher than the value I have from the Bureau of Labor Statistics’ Current Population Survey for 2010, but never mind small details. The BLS values are based on 2010 constant dollars. The inflation adjustments used by the Sentier Research study (link) cited by the NYT are not explicitly stated. I thought I’d put up some history here. The data are from this BLS facility (Table H-6, All Races).


Along with the median household income (half earn more, half less), I am also providing average household income (mean income). Beginning and end values are shown in numbers along with the highest points and the lowest—if the lowest are not at the beginning. I’ve also provided a bar graph at the bottom showing the difference between average and median.

After at least two previous jobless recoveries, it should not surprise us that incomes lag the ends of recessions. Indeed the divergence between GDP- and Jobs-performance has become, for me, an economic indicator. One way to read it is that economic well-being as measured by the GDP is no longer a reliable measure of human well-being as measured by full employment and consequent quality of life. Another way to read this is that wealth no longer trickles down as it once did—thus that the distribution of gross national wealth is no longer quite as even as once it was.

This last point is illustrated by the difference between average and median income. It has been growing. And since these data show constant dollars, that difference is absolute. The movement of the Gini Index, which charts income inequality, is shown in the next graphic for the same period of time.


Some notes. If you want to understand how the Gini Coefficient is calculated, here are two posts on the Old LaMarotte (one, two). The data for this graphic are from this BLS facility; select Income Inequality and then Table H-4. The larger the Gini, the greater the inequality.
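For the curious, here is a minimal sketch of the standard calculation from grouped (quintile) data. It is not the Census Bureau’s exact procedure (grouped data ignore inequality within each quintile and therefore understate the Gini somewhat), and the shares used are illustrative stand-ins for the Table H-2 values.

```python
# Gini coefficient from quintile income shares, via the trapezoid rule
# applied to the Lorenz curve. Grouped data ignore inequality within
# each quintile, so this understates the published household Gini.
# The shares below are illustrative, not the exact Table H-2 values.

def gini_from_quintile_shares(shares):
    """shares: income shares of the five quintiles, lowest first, in percent."""
    assert abs(sum(shares) - 100) < 1e-6, "shares should sum to 100"
    cumulative = 0.0
    area_under_lorenz = 0.0
    for share in shares:
        previous = cumulative
        cumulative += share / 100
        area_under_lorenz += 0.2 * (previous + cumulative) / 2   # trapezoid, width 0.2
    return 1 - 2 * area_under_lorenz

example_shares = [3.3, 8.5, 14.6, 23.4, 50.2]   # lowest to highest quintile
print(f"Gini (grouped data): {gini_from_quintile_shares(example_shares):.3f}")
```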

Notice that on this chart the Gini advanced during the Great Recession while both jobs and income dropped. We are slowly entering a very different era, but we think that things should follow patterns as in the old days. They no longer do. That is what should concern us.

Sunday, October 9, 2011

Those Who Finish, Those Who Don’t

I don’t know what the proportions are, but quite a lot of people start jobs but never really finish them. The worst of these are people who deliberately stop short because they can’t be bothered—and they don’t mind leaving their mess for others to handle. A classic cluster around here is formed by people who take their dogs on walks, carry plastic sacks, collect the poo, tie the sack at the top, and then just surreptitiously drop it near some bush along their path. Those who walk their dogs in the dark and don’t carry a sack at all—why they are beneath contempt. They belong to another category.

I feel for those who start things but don’t know they haven’t finished. One observes this sort of thing constantly in places of work. People write reports, but the damn thing’s incomplete. It misses the point of the assignment. You stare at pages and realize: you’ve just been handed a list. It lacks analysis, comes to no conclusion. People make sales calls but fail on the necessary follow-up. People make trips and fail to observe the very thing you sent them to observe.

A variant of this, at the higher levels, is excruciatingly hard analytical work—say in science—which is admirable in its own narrow field but obviously ignores the greater whole. Thus, lacking a feel for comprehensiveness, such people are blind to their own most glaring errors.

I go on walks and therefore daily see gardening projects abandoned half-way through, mowed lawns but unswept walks, abandoned tools that, sometimes, have begun to rust away.

Then there are those who finish things. This spring I passed a yard where an elderly gent had started to restore patches of bad grass in a strip next to a lovely line of flowers. The job looked hopeless to me, his method not very promising. We’re dog-sitting again, and taking Katie on her walk, I passed this place once more; I usually only follow that path when I am with a dog. Big surprise. The unsightly patches of grass were, today, virtually indistinguishable from the rest of the lawn. I could still detect the areas, however, faintly: they were a deeper, richer green.

When I set out on that someday journey to discover the mythical bird, Simurgh, it is people like that old man I want to be in my company.

Saturday, October 8, 2011

Employment Change by Sector, August-September 2011

Let’s look at employment change in more detail. I charted the big picture yesterday. Here I present change between August and September by sectors of the economy.


The pattern here is much the same as for the last several months. The two categories that show steady increase are professional and business services and education and health services. The second of those is mostly health services—the one sector of this economy that’s marching right along—thanks to the aging Baby Boom generation.

The positive change in the Information sector is due to the end of the Verizon strike. Last month we had a loss there of 48,000 jobs, 45,000 of which the strike caused. The strike ended September 2. That added back the lost 45,000—but the net change in that sector is only 34,000, meaning that even Information lost jobs if we ignore the Verizon strike.

Once more, the big loser is Government, down another 34,000 jobs. Now it is well worth noting that employment in government is the only category of employment our political leadership actually directly controls. And this is the sector that consistently loses the most jobs month to month. Our politicians, therefore, don’t walk the talk. No, sir. To show you how this sector has performed in 2011, I present here another graphic.


This one shows, using an index (December 2010=100), how far employment has slipped in the nine months of 2011 we’ve clocked thus far. Every level of government has shed at least 1 percent of its employment in this period, the Federal and Local levels more than 1 percent. Local government is the largest of these categories. Now in economically troubled times demand for government services grows. We are responding by removing the people who provide those services at just such a time. We are not, collectively, behaving rationally.

Friday, October 7, 2011

Employment: Update September 2011

The big news last month was Zero Job Growth—neither gains nor losses. The economy had gained 83,000 jobs; but in part due to a strike at Verizon, the economy had also lost 83,000. Striking workers are not counted as “employed.” Well…
  • First of all, the Bureau of Labor Statistics revised the August figures in its September release (link). Instead of that zero, we actually had a gain of 57,000. Thus the bad news had been overstated. On the graphic below, I show this change in purple.
  • Second, September data show a gain of 103,000 workers. The number somewhat overstates real gains because some of that increase (about 48,000 workers) is accounted for by Verizon strikers going back to work.
Here is our updated graphic for September:


To stress the positive, we have now had 12 months of net employment gains. To stress the negative, we lost 8.7 million jobs in the 2008-2009 period. Of these, we have so far recovered only 2.0 million jobs in the 21 months since. Here is a graphic showing that:


Tomorrow I will show detail by sector.

Renminbi Rant Reminder

Last March, during another time of China-bashing, I put up a post on the Old LaMarotte (link) on the subject of the renminbi or the Chinese yuan. Here is the flavor of it:

Ultimately currency control is price setting. By holding the value of the yuan low in exchange for the dollar, the Chinese government is also lowering the price of its goods. It is foregoing revenue. This means that we can obtain Chinese goods cheaper than we could if the renminbi were to float. This also means that the Chinese are willingly accepting a lower price for what they produce. Self-evident? Yes. Why do they do that? They say: “It will benefit our people if we sell our goods at a cheaper price. More people will buy them. We will sell more things and employ more of our people. There is no law which says that we cannot set our prices high or low. The only difference between us and you, America, is that we do this at the national level and you don’t do it at all. You don’t seem concerned for the good of your own people.”

It’s not as if China were lending us billions and selling us cheap goods at the point of a gun. We’re doing it willingly. There are straightforward ways of defending our workers. One of those is to put up stiff tariffs.

Tuesday, October 4, 2011

Buying Some Cars, Not Buying Homes

Yesterday came news of jumping auto sales, with all domestic producers scoring meaningful gains. To be sure, the same story also predicted that total auto sales this year will be significantly lower than last year’s. The percentage increases were large because we had sunk so low. This suggested that it might be good to look at two categories of private fixed investment over a longer period of time, thus from 1990 to the end of the second quarter of 2011, available from the Bureau of Economic Analysis (link); the table number is 5.3.3. The data I am showing are quantity indices calculated by the BEA, with 2005 representing 100.


Expenditures began to plummet already in the first quarter of 2006. Halfway through 2011, transportation equipment sales (largely autos, vans, and pickups) were still a long distance from their 1999 level. Expenditures on single-family residences have stopped dropping but are not rising. The early onset of a slump in these indicators tells us that the public already, somehow, knew that something was amiss early in 2006—underlining for me the suspicion that we are possibly seeing a genuine sea change. That vans and pickups led the sales gains recently reported, however, runs counter to that feeling. That certainly sounds like the same old, same old thoughtlessness.

Monday, October 3, 2011

Longitude

Orientation—whether in the physical, social, or metaphysical dimension—is the absolute beginning of knowledge. You’ve got to know where you are. That very word, orientation, is derived from the physical. It comes from the Latin oriri, to rise, and the rising something indicated by the word was the sun, therefore the east. East-west orientation, therefore, was relatively easy for humanity. You simply had to observe the sun. It rose in the east and set in the west.

Paradoxically, however, traveling by sea, humanity’s first effective orientations using the sun told people where they were in the north-south dimension. After we were certain that the earth was a ball, that it travelled around the sun—and that its axis is tilted relative to the plane of its orbit—we learned to use an astrolabe, thus an instrument able to measure the angle of the sun above the horizon at noon, thus at its highest point. Knowing this angle and the time of year, the astrolabe (and later the sextant) could tell us how far north or south we were of the equator at any time of year. That technique dates to 150 BC. I’ve summarized the process on this blog earlier; the link is below.

To know where we were in the east-west dimension took much, much longer. It required the development of very accurate clocks—able to operate at sea. That achievement finally came in the eighteenth century, thanks to an English clockmaker called John Harrison (1693-1776). That story is told most eloquently by Dava Sobel in her 1995 book, Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time. It’s short, suspenseful, entertaining, has pictures, and is must reading for anyone who’d like to know this story in detail. Harrison labored to win a £20,000 prize set by the British Parliament to solve the intractable problem of longitude. That prize in today’s dollars would be $4.47 million.

Why such a huge prize? Ships, cargo, entire fleets—and all the people on them—were routinely lost at sea in those days because navigators had miscalculated, by so-called dead reckoning, just where they were in the east-west direction from their intended landing place. Dead reckoning used estimated measurements of speed and time, over a set course (checkable by latitude), to calculate distance. But especially in stormy weather, measuring speed, indeed holding the course, was extremely chancy. Therefore dead reckoning, while it worked reasonably well in fair conditions, could go badly wrong. After such a stretch, getting a hard fix on longitude was impossible out on the open sea.

So how do clocks come into this? In brief, as observed from the earth, the sun moves 15 degrees of longitude in the space of an hour. If you know your own time accurately, and also know what time it is at another fixed point on the earth, you can use the difference in time to calculate with great accuracy how far you are from that fixed point.

The illustration shows the geometrical basis of lines of longitude, calculated as angles from the Prime Meridian. The Prime Meridian here is the “fixed point”—or rather the fixed line—from which the navigator calculates his or her distance east or west. The Prime Meridian these days runs right through the Royal Observatory in Greenwich, England; that line begins at the north pole and ends at the south pole. If the navigator sailed west, measured time locally, and found that it was noon while the other clock, running on universal (call it Greenwich) time, said 2:00 pm, the navigator knew that he or she was at 30°W longitude, Greenwich being at 0° longitude. Conversely, if your local time is noon but Greenwich time is 10:00 am, where are you then? At 30°E longitude. One degree is 60 nautical miles, 69 statute miles, or 111 kilometers—that’s at the equator. More on this later.
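The clock arithmetic is simple enough to put into a few lines. A minimal sketch, assuming nothing beyond the 15-degrees-per-hour rule just described:

```python
# Longitude from the difference between local (sun) time and Greenwich
# time, using the 15-degrees-per-hour rule described above. Positive
# results mean degrees east of Greenwich, negative mean degrees west.

DEGREES_PER_HOUR = 360 / 24   # the sun covers 15 degrees of longitude per hour

def longitude_from_clocks(local_hour, greenwich_hour):
    """Both times in decimal hours, e.g. 14.5 for 2:30 pm."""
    return (local_hour - greenwich_hour) * DEGREES_PER_HOUR

print(longitude_from_clocks(12.0, 14.0))   # local noon, Greenwich 2:00 pm -> -30.0 (30 deg W)
print(longitude_from_clocks(12.0, 10.0))   # local noon, Greenwich 10:00 am -> 30.0 (30 deg E)
```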

Now more illustrations:



This one shows the longitudes over the United States. Our longitudes are all west. Degrees of longitude are further subdivided into 60 minutes, each minute into 60 seconds. My own location in Detroit is 83° 05’ W—although I note that not all of my sources agree about the minutes. When you see longitude or latitude figures, the fractions may also be rendered in decimal form, so that Detroit’s longitude may be shown as 83.08 and mean the same thing as above. The image above is courtesy of Tutapoint.com (link).


Herewith longitudes overlaying a map of the world. This graphic, of course, does not do justice to a crucial fact. The distance between lines of longitude is not uniform all around the world. It is greatest at the equator, 69 miles, and zero at the poles. At 40° latitude, north or south, the distance between lines of longitude shrinks to 53 miles. Therefore accurate calculations of distance from longitude require additional lookups to adjust for the latitude where the navigator takes his or her readings. The illustration is from Jacksonville State University (link).


Herewith the big picture, showing the whole world again, as presented by Wikipedia (link).

This post is the third, and last, on the subject of the astrolabe. The others are here (first, second). The astrolabe is meaningfully connected to this subject for two reasons. A good clock keeping Greenwich time and an astrolabe would still suffice today to navigate accurately on the oceans. And the astrolabe was useful for determining exact local time, thus noon—which has always been the time for seafarers to find out where they were.

Sunday, October 2, 2011

A Handful of Colliders

The shutdown of the Tevatron (image), America’s only large hadron collider, on September 30, 2011, brings the number of large hadron colliders worldwide down from two to one. Western culture has entirely dominated elementary particle physics at the experimental level—the level where proof of the particles’ existence and behavior can be physically determined. The only such colliders are in the United States and at CERN’s facility in Switzerland. CERN is operated by 20 European member states.

This research involves causing particles to accelerate to very high speeds, read energies. When they have reached top speed, they are caused to collide, and the moment of collision is then recorded. Careful observation and measurement of collision points and the tracks left by particles gives us insights into the strange “existents” that make up all matter.

The size or power of a collider is measured in electron volts. The Tevatron was named the Tevatron because it is capable of accelerating particles to roughly 1 TeV, thus one teraelectronvolt, one trillion electron volts. The electron volt is a measure of energy: the energy an electron gains when moved across an electric potential of one volt. The numeric succession is from the simple electronvolt (eV) to deca (tens), hecto (hundreds), kilo (thousands), mega (millions, MeV), giga (billions, GeV), tera (trillions, TeV), and peta (quadrillions, PeV). We’ve not reached the peta stage. Our largest, the CERN LHC, has a 14 TeV rating, each colliding particle carrying 7 TeV. By way of grasping the monstrous energies involved, ponder that the energy of a photon of visible light ranges between 1.6 and 3.4 eV. At room temperature, a single molecule in the air has about 0.04 eV of kinetic energy.
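To see what these prefixes amount to in everyday units, here is a short sketch converting the energies just mentioned into joules, using the standard value of 1 eV ≈ 1.602 × 10^-19 joules:

```python
# Convert the electron-volt figures mentioned above into joules.
# 1 eV = 1.602e-19 J (standard physical constant).

EV_TO_JOULES = 1.602e-19

examples = {
    "visible-light photon (3.4 eV)":       3.4,
    "air molecule at room temp (0.04 eV)": 0.04,
    "one Tevatron proton (~1 TeV)":        1e12,
    "one LHC proton (7 TeV)":              7e12,
}

for label, electron_volts in examples.items():
    joules = electron_volts * EV_TO_JOULES
    print(f"{label:40s} {joules:.3e} J")

# Even 7 TeV is only about a microjoule per proton; a collider's punch
# comes from packing enormous numbers of protons into each beam.
```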

Here in the United States we’ve struggled to excel, and the Tevatron was our winner. Isabelle, at Brookhaven National Laboratory, proposed at 400 GeV, never got off the ground; it was begun but then cancelled in 1983. The Tevatron was built in 1987. We built the Relativistic Heavy Ion Collider at Brookhaven in 2000 (17.7 GeV when used on protons); it is still operational. CERN, which had built two earlier colliders (1971, 1981) before we built our first, introduced its Large Hadron Collider (14 TeV) in 2009; it is currently the most powerful. Our own Superconducting Super Collider, which was to have been built in Texas, would have produced 40 TeV of energy, 20 TeV per colliding hadron. The project was cancelled in 1993. CERN is now working on a proposed Super Large Hadron Collider, envisioned to begin operations in 2019. Herewith a list of our handful of colliders, obtained, like the image of the Tevatron, from Wikipedia (link).


The scientific interest is certainly present in the United States. The collective political will is not. I’m powerfully reminded of a book about Hellenistic science I’ve recently read, The Forgotten Revolution by Lucio Russo. Russo argues that Rome, unlike Hellenistic Greece, had no gut-level interest in science as science. It was interested only in power and its administration. These colliders represent pure science. If they promised a handy way to cause little Big Bangs deliverable by drone, then perhaps Congress might be found at the plate; not until then.

Saturday, October 1, 2011

Fermilab Closes Tevatron

Fermilab is one of twenty-one national laboratories funded by the U.S. Department of Energy. Yesterday Fermilab announced that it was closing its Tevatron accelerator. The Tevatron was the second largest hadron collider; the largest is CERN’s LHC. The Tevatron has been around for 18 years, Fermilab for 44. The word hadron designates protons and neutrons, thus composite particles made of quarks; they are the building blocks of atomic nuclei.

I got to wondering about the reasons for this closure. In one word: budget. Fermilab anticipated that operating the Tevatron would require $100 million over the next three years, thus roughly $33 million a year. In light of looming budget cuts, DOE shook its head at funding this amount. Therefore the Tevatron will close its doors.

In FY 2009 Fermilab had a budget of $330 million. In FY 2010 the lab operated under a continuing resolution. I could only find a graphic that shows its FY 2011 budget; it was above the 2009 level, but not precisely determinable. In the high-energy physics department, as it were, the Tevatron represents DOE’s contribution to basic research. That’s what usually gets sacrificed.