It’s time to forget the hype and start dealing with reality. That was the vibe surrounding the topic of cloud computing at the recent SHARE in Atlanta conference. Having heard for years about the computing model that reduces costs while increasing flexibility and scalability, attendees came looking for results.
And that’s what was delivered in some of the cloud-focused sessions. Attendees were not only treated to examples of successful cloud computing implementations but also heard lessons learned during those implementations. For good measure, they received a healthy dose of perspective on the mainframe’s relevance to the cloud.
SHARE, an organization that traditionally draws the bulk of its membership from the mainframe community, highlighted the cloud as one of the conference’s hot topics. The organization wants members to understand the cloud, and its relevance to mainframe computing, while showcasing positive results and warning against pitfalls.
Diana Donnellan, a former IBM business development executive, spoke during one of the sessions about how cloud components improved the bottom line for a customer involved in a lengthy SOA deployment. Replacing fat desktop clients with a SaaS solution produced a 50 percent revenue increase in the year after the deployment. A mobile sales force CRM deployment, which replaced an inventory system the client had been using improperly, also paid dividends with a 10 percent sales increase, she said.
The cloud components were bright spots in a project that she said lasted three times longer than the customer had hoped, owing to challenges in working with the SOA. Along the way, the entire IT staff quit after realizing how different the new technology would be from what they knew. They have since been replaced by staff more attuned to the new technologies, which has helped drive the long-term ROI from the project.
One of the lessons drawn from the whole project, Donnellan said, was that for some customers taking small steps makes more sense than big transformational change.
At a session on cloud basics, Mike Buzzetti, an IT architect at IBM’s Design Center, sought to win over cloud skeptics. An oft-heard argument from doubters is that the cloud is merely an iteration of the work the mainframe has been doing for decades. Buzzetti conceded the point, but noted that the cloud brings to everyone the computing power that once was accessible only in mainframe shops.
Frank J. De Gilio, an IBM distinguished engineer, addressing the relevance of the mainframe to the cloud, cited a customer project in which System z was used to deploy 16 images into a virtual environment. The process took one hour, far less than the three days it took when the images were implemented using a distributed environment, he said.
“We can provide these same (types of results). We do these things already in our environment, and that’s the message we need to bring back to the lines of business we support,” he said.
Ray Jones, IBM Vice President of System z Software, described during a keynote speech how the City of Honolulu uses System z for a cloud computing/social networking solution that has improved interaction between the city government and its citizens. The city now publishes its complete budget online so everyone can see how tax money is spent. In addition, through social media, citizens can request services such as pothole repairs, prompting quick, cost-effective action by the city.
More such projects are surely in the cards, as CIOs increasingly look to cloud computing as a way to achieve business and IT objectives. In the 2011 IBM Global CIO Study, which included 3,200 technology executives, researchers found that cloud computing had risen to fourth place on the list of key visionary initiatives, up from 16th place two years earlier. In 2011, 45 percent more CIOs cited cloud computing as a priority than in the previous study.
Buzzetti cautioned that adopting the cloud requires having a handle on costs and making decisions on whether to build solutions in-house or buy the technology. It’s important to know the organization’s pain points and requirements.
When considering the cloud, among the decisions organizations must make is whether to take a public or private cloud approach, said Michael Wojton, an IBM senior systems analyst. If the cloud service supports a core function, it makes sense to keep it in-house; otherwise, “let somebody else do it,” he said. “Both are valid (approaches), but you’ve got to figure out what makes sense and which is the best place to do it.”
Donnellan said cloud hype has just passed its peak. Referring to what research firm Gartner has dubbed the “Hype Cycle,” she said cloud computing has passed the “peak of inflated expectations.”
The next phase in the Hype Cycle is the “trough of disillusionment,” when do-it-yourselfers and early adopters take a shot, often falling short of desired success, she said. Next comes the “slope of enlightenment,” when companies hire consultants to do the work right, at which point a knowledge transfer takes place, ultimately leading to the “plateau of productivity.”
And that, of course, shows that cloud adoption is no different from the evolution of technologies that have come before: a wave of hype and uncertainty followed by the sobering dose of reality that implementations bring.
Note: Some sessions at SHARE in Atlanta — including sessions by Jones and De Gilio — were broadcast over the web as part of SHARE Live! Recordings are available for purchase at: http://www.share.org/p/cm/ld/fid=142
Veteran tech journalist Pedro Pereira was on special assignment at SHARE in Atlanta. Follow him on Twitter @EditPedro.