Big data projects are meant to give companies the tools and insights they need to increase business value. All of that added value should make the initial project investment worth it. But it doesn’t always work out that way. What makes for successful, ROI-driving, big data projects, and what role could mainframes play in producing better results and, by extension, profitability from these projects?
Technology consultancy Capgemini and software developer Informatica recently released a joint report evaluating the payoff of big data projects. The report surveyed U.S. and European executives, finding that 27 percent consider their organization’s big data initiatives profitable. The largest group – 45 percent – said they’ve at least been able to break even on their investments, while the rest either couldn’t tell or said they’ve lost money.
Profitability is, of course, only one factor against which you can judge the value of big data initiatives. Respondents were also asked about their ability to “operationalize” big data and turn it into real business progress. To that end, the largest share of respondents (36 percent) said they’ve achieved only half or fewer of their big data goals. Just one-quarter said these projects allowed them to achieve 75 percent of their goals.
By evaluating factors around project profitability and outcomes, the survey provides a sense of business attitudes toward big data. Respondents nonetheless expressed continued enthusiasm for big data initiatives: 30 percent said they will accelerate current projects or expand them to new departments or locations. So the question becomes: How can these businesses improve their chances of a successful big data project, from both a financial and a goal-oriented perspective?
Evolving mainframe technology provides an answer. The sheer processing power of mainframes offers the horsepower big data platforms need to enable richer real-time data analysis, insights and actions. Mainframes can deliver faster, more powerful results, making it easier for businesses to achieve more of their big data goals and improve the ROI on those investments.
In fact, IT professionals increasingly view the mainframe as a critical piece of the big data analysis puzzle. A survey from Syncsort and Enterprise Systems Media last year found that 69 percent of IT managers and technical professionals ranked the use of the mainframe for large-scale processing as “very important” to the success of big data strategies. Enterprise Systems Media’s Denny Yost commented that the mainframe is effectively being used to solve the “same data cost and management challenges” IT professionals tackled before, now applied to the complex use cases introduced by big data.
Meanwhile, the introduction of the IBM z13 systems underscores the ongoing development of newer, more powerful mainframes built for the high-processing needs of today’s mobile, cloud and data-driven environments. It seems clear that the more companies look to big data initiatives to improve business outcomes, the more mainframes will help those businesses draw value from their investments.
To learn more about the role mainframes can play in big data initiatives, as well as other hot topics in the world of mainframe, register for SHARE Atlanta, July 31-August 5.