Is it real, or is it hype?
That's the question manufacturers may be asking as they kick the virtual tires of a relatively new concept called utility computing. The idea is that, much as it obtains water, natural gas or electricity, a company would simply flip a switch, draw whatever computing power and software it needed, and pay only for what it used.
The promise is clear enough. Manufacturers that have invested tens of millions of dollars in rafts of computer servers, thousands of personal computers, miles of networks, and software packages galore -- not to mention their IT staffs -- could get rid of all that and simply plug into a wall socket.
"Why should manufacturers pay millions of dollars to run their computer infrastructure, when they can pay on demand for the amount of IT resources they actually consume?" asks John Meyer, vice president of brand strategy at Computer Associates, a leading software firm in Islandia, N.Y.
So far, though, few manufacturers are embracing the entire utility computing concept. Some are dipping a toe in the water, trying a hosted application here and testing a pay-by-use or on-demand system there. In fact, what most IT vendors are offering are pieces of the far-reaching utility computing concept: hosted software applications, outsourcing, Web hosting and the like.
A big reason for the lack of enthusiasm for utility computing on the part of most large manufacturers is that they've already invested huge sums of money and plenty of staff time installing and learning their enterprise resource planning, customer relationship management and supply-chain systems. They aren't about to suddenly toss all that over the fence behind corporate headquarters to save a couple of bucks. "We do not expect manufacturers to drop what they have and embrace utility computing," Meyer says.
Another roadblock is utility computing's basic premise -- that all of information technology, including software, computers and networks, is a commodity. Few manufacturers are ready to accept this notion, especially after investing tens of millions of dollars in setting up computers and software to run their businesses more intelligently and productively.
Some of utility computing's detractors point to its past failures. One of the dot-com era's high flyers, Storage Networks Inc., which once had a market capitalization of $14 billion, pushed the idea that companies could avoid buying expensive data storage devices and instead purchase "storage on demand." The Waltham, Mass., company liquidated in 2003, and other storage service providers have since gone the way of the dodo.
Utility computing differs from outsourcing because it's based on a per-use pricing model instead of a fixed-price contract. The utility concept also is broader than using hosted software, because it includes computers, network connections, and services too. "No companies are taking advantage of all that wide spectrum of services," says Lance Travis, vice president of research at AMR Research in Boston.
The shift to the utility model, if there is one, will be gradual, most observers say. "We see manufacturers starting to lay the groundwork for on-demand, utility computing," says CA's Meyer. His colleague, Lokesh Jindal, vice president of product marketing at CA, adds, "The change will be slower for larger companies -- more of an evolution."
Certainly IT vendors have their own takes on exactly what utility computing is, and what services they provide to make it work. IBM's "on-demand" offering includes everything from soup to nuts. "IBM's on-demand vision is a pretty expansive definition, combining outsourcing and services and new technologies, including open-source middleware and clusters of computers running shared applications, all coupled with a pay-by-use pricing model," observes AMR's Travis.
Some utility computing vendors are focusing on manufacturers' need to deploy some applications over the Internet. "We deploy companies' Web-based applications on demand over a global grid computing network of 15,000 servers in 69 countries," says Kieran Taylor, director of product management for Akamai, which has teamed with IBM and offers an on-demand version of IBM's Websphere services. "We see manufacturers migrating an increasing number of business applications to the Internet."
One of Akamai's selling points is that it promises to handle whatever volume of traffic a company gets on a Web-based application. "It's difficult for companies to predict how much capacity they will need," Taylor points out, "so we give them a global platform to use that's capable of handling their peak demand."
Cellphone manufacturer Sony Ericsson Mobile Communications shifted its dealer locator application and other Internet-based applications to Akamai's network, realizing a gain in performance (speed) of 1,200%. The cellphone maker also was able to cut the number of servers it needed from 66 to two. "Our customers use our resources when and where they want to, and we bill them on a pay-per-use basis," Taylor adds. Companies can arrange to pay based on the number of page views on a Web site, the number of requests to use an application or some other measure of activity.
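The metering model described above reduces to simple arithmetic: each activity metric is multiplied by a unit rate and summed into a monthly charge. A minimal sketch, in Python, assuming hypothetical metric names and rates (real providers such as Akamai set their own metrics and pricing):

```python
# Sketch of pay-per-use billing: charge = sum of metered usage times unit rate.
# Metric names and rates below are hypothetical, purely for illustration.

def usage_bill(usage, rates):
    """Compute a monthly charge from metered activity and per-unit rates."""
    return sum(usage[metric] * rates[metric] for metric in usage)

# Hypothetical per-unit rates: dollars per page view and per application request.
rates = {"page_views": 0.0002, "app_requests": 0.001}

# One month of metered activity for a Web-based application.
usage = {"page_views": 5_000_000, "app_requests": 200_000}

print(f"Monthly charge: ${usage_bill(usage, rates):,.2f}")
```

The appeal to the buyer is that the charge scales down in a slow month as readily as it scales up in a busy one, unlike a fixed-price outsourcing contract.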
Another reason some manufacturers remain wary of utility computing is the worry that letting others -- in some cases companies whose servers sit in other countries or on other continents -- run and maintain customer files and product information puts that data at risk. Some information security experts warn that, in general, the farther away a company's mission-critical data is, the greater the loss of control, largely because securing data in transit remains a major issue.