Showing posts with label Cloud Computing.

Monday, April 15, 2013

Crushed Between Clouds and Insects

An article in the New York Times today suggests that Intel, which got about 84 percent of its revenues in 2012 from PCs and the more muscular servers, has now joined the ranks of major computer companies upset by the maturing of the PC industry (“Intel Looks Past PCs for Foothold”). PC sales have been declining of late; the economy is sluggish, hence corporations and institutions, the major users of that tool, are not investing. But cloud computing is on the rise. Cloud computing? The phrase refers to the use of distant servers, owned by others, to store an institution’s or enterprise’s total data “in a cloud.” Cloud servers are simpler and have fewer features than those used by corporations to perform broader computing services, along with file serving, in-house. Therefore they are less profitable. Intel, apparently, finds itself crushed between clouds above and insects (read smartphones, tablets, and pads) below; these latter, according to the Times story, rarely incorporate Intel processors.

Why this need to “look past PCs”? The only explanation I can find is pressure on Intel’s stock. The company is magnificently profitable. In the 2007-2012 period, the company’s net income in its worst year, 2009, was 12.4 percent of its net revenues. In 2007 it was 18.2 percent; in 2012 it was 20.6 percent. This is a company with $53.3 billion in sales (2012) which has grown at a compounded annual rate of 6.8 percent from 2007 through 2012. There is nothing wrong here except for the bloody market, and I mean the stock market. The market cannot stand a maturing industry. The lady may be but 43 years of age, but as soon as she has shown the first few grey hairs, she is immediately labeled a crone—and we must now “look past” the lady. Here is a chart of Intel’s revenues, net income, and an index of its stock, with 2007 set at 100:
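That 6.8 percent compounded rate can be checked with a quick calculation. The 2007 revenue used below, about $38.3 billion, is my assumption, back-solved to match the stated rate; it is not a figure from the post.

```python
# Compound annual growth rate (CAGR) check for Intel's revenues.
# The 2007 figure (~$38.3 billion) is an assumption, not from the post.
rev_2007 = 38.3  # billions of dollars (assumed)
rev_2012 = 53.3  # billions of dollars (stated in the post)
years = 5        # 2007 through 2012

cagr = (rev_2012 / rev_2007) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # prints "CAGR: 6.8%"
```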

What, if anything, is wrong here? Revenue and net income show the consequences of the Great Recession—and the sluggish nature of the recovery. But the stock has responded much more dramatically. And, to be sure, the people who run Intel are not in the computer chip business. They are in the stock market business. Never mind the fundamental value of a corporation that absolutely dominates a market, computing, by owning most of the market share of its most fundamental component, the central processing unit. The same madness also governs the pea-sized brains of almost all those who run publicly traded companies—even those in an industry like food, which can only really grow as population does. That industry has ruined a basic, necessary, and “always will be present” business by trying to squeeze growth from it, making more and more high-margin prepared foods with too much fat, salt, or sugar—or artificially formulated foods with less of those three than natural foods ought to have.

Ears that hear, eyes that see. But we are led by the deaf and the blind.

Tuesday, May 29, 2012

The Electrical Brain

Power failed again in our neighborhood, evidently due to some defective transformers. This had happened before, and over an extended period, the week of August 8-14, 2010, thus less than two years ago. Oddly enough, power to half of our circuits remained on. It was denied to about a third of the house, from basement to the second storey. Whole banks of plugs were also gone, and these affected parts of the still “lit” portions of the house. The failure lasted from 8:43 pm to 1:10 am (this morning)—but we were gone and discovered the situation at 11:30, about midway into it.

Such events rudely remind us. We never think about electric power—until it fails. And when it does, it is truly difficult to think about anything else. It is a trauma—although secondary in rank to failures of the body itself, which are even more powerful reminders of our contingent state in this dimension.

This morning—the coincidence seems meaningful—comes an op-ed piece in the Wall Street Journal informing us that data centers now consume 1.3 percent of all electricity generated across the globe. Data centers are used by the likes of Google, Facebook, Bing, Yahoo, and many others—not least any organization or individual engaged in “cloud” computing, thus storing files miles away from the screen-keyboard-and-box. The author of the piece, Robert Bryce (“Renewable Energy Can’t Run the Cloud”) says that “The power needed by data centers has been a hot topic for more than a decade as local electricity grids have been forced to adapt to huge new loads.”

When you think about it, that 1.3 percent, while small, is high when we think of all the lights that burn—so much so that you can see the outlines of America from a satellite by night if it is cloudless across our sub-continent.
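To give that percentage some scale, here is a rough conversion into absolute terms. The global generation figure of about 21,000 terawatt-hours a year is my assumption for the period; the op-ed supplies only the percentage.

```python
# Rough scale of the 1.3 percent data-center share of world electricity.
# Annual global generation (~21,000 TWh) is an assumed figure,
# not a number from the op-ed.
global_twh = 21_000        # TWh per year (assumed)
data_center_share = 0.013  # 1.3 percent, from the op-ed

data_center_twh = global_twh * data_center_share
print(f"data centers: ~{data_center_twh:.0f} TWh per year")
```

On that assumption, data centers draw on the order of 270 terawatt-hours a year—more than many mid-sized countries generate in total.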

Got us thinking. How much of our food intake is used by our brains? The answer is 20 percent (link). This answer is based on the brain’s average share of our oxygen intake, used for burning fuel; the brain’s glucose consumption varies from 11 percent in the morning to nearly 20 percent in the evening (link). A fifth is pretty high considering the size of the head in comparison to all the rest. Here is a highly privileged consumer of our daily bread.
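That 20 percent share can be put in electrical terms with a back-of-the-envelope conversion. The 2,000 kilocalories a day used here is an assumed typical intake, not a figure from the post.

```python
# Translate the brain's ~20 percent share of bodily energy into watts.
# The 2,000 kcal/day intake is an assumed typical figure.
kcal_per_day = 2000
joules_per_day = kcal_per_day * 4184  # 1 kcal = 4184 joules
total_watts = joules_per_day / 86_400  # seconds in a day

brain_watts = total_watts * 0.20
print(f"brain: ~{brain_watts:.0f} W")  # prints "brain: ~19 W"
```

About twenty watts—roughly the draw of a compact fluorescent bulb—which makes the brain an astonishingly efficient computer compared with any data center.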

Now we know what the global cybernetic brain’s memory storage consumes. It would be nice to know how much electrical energy we use in our communications—including the devices that are hot-and-ready to serve me here, in this basement work space. The printer’s on, the computer hums, and light reaches me from my efficient flat-screen display. Are we, in the broader category, already at 20 percent? I think not—but have no data. But the time will come. As will the end of fossil fuels. What are we going to do then? And what will we call that approaching time? The Dark Ages?

Monday, June 6, 2011

Clouds of Glory

The following post first appeared on July 20, 2009 on the “Old” Lamarotte. I thought I’d republish it here now that Apple has just announced its iCloud. The facts haven’t changed.

*     *     *

“The Cloud” or “cloud computing” is now starting to show up in newspaper features, suggesting that a ramp-up to commercial initiatives is under way. The assumption is that this idea is something new or, better yet, “a new paradigm”; that phrase has what we once called panache. In a sentence: all of our data and software reside on the Internet rather than on our hard drive. The supposed benefit is that if the computer fails or the laptop is stolen, nothing is lost. Another much-touted feature of cloud computing is that many individuals can use the same data simultaneously, which might be useful in a corporate setting—or so it’s claimed. How many people does it take to write a single letter? Or a report? It’s best if the answer is one. The quality drops geometrically as authors are multiplied. But never mind. We are now deep, deep into group grope, group talk, group think, group fun, group everything…

The Cloud reminds me of the Mainframe Days. Let’s pinpoint it and say 1960 or thereabouts. The hardware platform was the massive mainframe; it held all the data; it also provided all of the CPU cycles. We reached the computer by means of terminals. Unlike PCs, terminals had no chip; they were dumb input and output devices, hence the phrase “dumb terminal.” They sent keystrokes and displayed symbols on a screen (if you were so lucky); many had to reach over and look at scrolling paper on which the mainframe gave its oracular responses. The big drawback of the mainframe was slow speed—when the number of users shot up. At least dozens, and usually many more, people working at the same time shared a single CPU or a cluster of them, and at times the mainframe was slow to distraction. Distant mainframes also predate The Cloud. To use a computer far away by means of terminals and dedicated phone lines was known as time-shared computing. Also very slow. The speed of data transfer has since greatly increased. If the terminal is replaced by a good computer, the server no longer needs to supply all of the CPU cycles. They are provided by the user’s machine.

What’s so hot about this idea? Group work is one answer—but networks already provide all the group-stuff we really need. Safety is the other, but that issue is really rather overrated. Those who really need to save their data know that. For that reason they use redundant means of backup. They back up computers and prepare multiple additional archival CDs on which the information is held—the copies often transported to different locations so that a fire or flood at the place of work cannot destroy the symbolic wealth. Are we justified in trusting managers of The Cloud more than our own systems? Are those distant managers beyond the reach of misfortune, fires, or terrorist attacks? I apologize for being cynical, but it seems to me that safety is just a red herring.

What’s really hot about The Cloud is that it offers someone else—not us, the users—a new way of making money out of data storage and software by levying a continuous recurring charge on the hapless user. The service is by subscription. This means that we, as users, pay and keep on paying. Use The Cloud and turn into a Cash Cow. Back in the good old Mainframe Days, companies like IBM charged even for the use of their Operating Systems the same way—and levied fees for using other software on top of that. It was a monthly cost to the customer. Operating systems and software that you could buy once-and-for-all crimped that business model in major ways, but good ideas, like milking the consumer continuously, have a way of returning. And here they are, coming to us now in a cloud of glory.
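A toy calculation shows why the subscription model pleases the seller. Both prices here are hypothetical, chosen only to illustrate the arithmetic; the post cites no figures.

```python
# Toy break-even comparison: one-time purchase vs. subscription.
# Both prices are hypothetical, purely for illustration.
purchase_price = 300.0  # buy the software once (hypothetical)
monthly_fee = 10.0      # cloud subscription (hypothetical)

breakeven_months = purchase_price / monthly_fee
print(f"subscription costs more after {breakeven_months:.0f} months")
# prints "subscription costs more after 30 months"
```

Past the break-even point, every additional month is pure gain for the vendor—the Cash Cow in action.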