IBM and Informatica Take Next Steps Toward Cloud Delivery
As the cloud model matures, new offerings pave the way for shared analytic insight and services-based data integration.
Economies of scale in storage, processing power and bandwidth will inevitably pull increasing numbers of enterprises into cloud computing. That's a key premise Bitcurrent Analyst Alistair Croll expounded at this week's Interop event in New York. After sharing impressive stats on steady cloud-provider improvements in capacity, reliability and performance, Croll asked: "How many of you have done two network upgrades delivering a four-fold improvement in performance in one month [as Amazon has done with EC2]? This is all cloud providers do for a living, and that's one of the reasons enterprises are on a slippery slope and will slide very easily toward cloud computing."
Recognizing the appeal of an approach known for its flexibility and low cost, IBM and Informatica this week unveiled their latest steps toward cloud computing. IBM's is a sweeping offering, combining best practices, BI technology and infrastructure aimed at private-cloud deployment. Informatica's developments are more mature, if a bit more targeted and tactical than IBM's, and they involve both public- and private-cloud scenarios.
Building 'Smart' Clouds
As one Twitterer quipped, there was "no buzzword left behind" in IBM's November 16 announcement of the "IBM Smart Analytics Cloud." Based on IBM's own internal development of what it described as "the world's largest private cloud for business analytics," the Smart Analytics Cloud offering is a combination of best-practice advice, IBM Cognos 8 business intelligence (BI) technology and -- for enterprises that are large enough -- System Z-based infrastructure aimed at delivering private-cloud services with scale, power efficiency and resiliency. The emphasis is on helping large organizations to develop private clouds through which they can efficiently share data and analytic services. IBM itself has taken this approach by combining more than 100 separate data systems from across its enterprise and exposing the resultant insight to more than 200,000 internal employees.
"This is aimed at any company trying to strategically deploy across the organization," says Christopher Dziekan, an IBM Cognos product strategy executive. "We've taken all of the knowledge we gained and the techniques we used to provision internally, and we've packaged that as a service offering."
Included is standardization and best-practice advice on project scheduling, funding models, charge-back schemes and cloud-style provisioning. For those companies using an IBM platform such as System Z, there are also hardware optimizations and tenancy approaches for large-scale deployments. But most people think of cloud computing as an opportunity to outsource data center capacity. So where's the flexibility in a private-cloud offering?
"Elasticity is very much a part of it, because there are some departments and divisions that are going to be super-active at some points and other departments and divisions that will demand capacity at other times," Dziekan explains. "You don't want each division developing its own infrastructure to handle peak loads only to have it sit idle during those valley periods. The costs go up if you take a per-department approach, so this is about achieving economies of scale through cloud-based delivery."
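The economics Dziekan describes can be illustrated with a toy calculation (the departments and load figures below are hypothetical, chosen only to show why pooled capacity beats per-department provisioning when peaks are staggered):

```python
# Hypothetical hourly load (in server units) for three departments
# whose busy periods fall at different times.
loads = {
    "finance":   [10, 80, 10, 10],   # peaks in hour 1 (quarter close)
    "marketing": [10, 10, 80, 10],   # peaks in hour 2 (campaign launch)
    "sales":     [10, 10, 10, 80],   # peaks in hour 3 (end of period)
}

# Siloed approach: each department buys enough to cover its own peak.
siloed_capacity = sum(max(hours) for hours in loads.values())

# Shared cloud: capacity need only cover the peak of the combined load.
combined = [sum(dept[h] for dept in loads.values()) for h in range(4)]
pooled_capacity = max(combined)

print(siloed_capacity)  # 240 server units, mostly idle off-peak
print(pooled_capacity)  # 100 server units cover the same demand
```

Because the peaks don't coincide, the shared pool needs well under half the hardware of the siloed approach, which is the "valley period" waste Dziekan is pointing at.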
As for the BI angle, Cognos software is displacing Brio technology that was previously used by IBM internally, and Big Blue is only too glad to dump legacy technology that is now supported by Oracle (by way of Hyperion). IBM discovered that it could save "tens of millions of dollars" by leveraging BI across the entire organization with cloud-style, services-based deployment, according to Dziekan, and he says the company is now in a position to help customers do the same.
"It's about taking a unified, strategic approach to deploying, managing and provisioning," Dziekan says. "We've done a number of projects in which we replaced a dozen reports for different regions, countries and languages with just one report. That one report can be secured to know what data you can access. And it knows what language you're using, so IT doesn't have to maintain a dozen different versions of what is otherwise an identical report."
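The "one report instead of a dozen" idea can be sketched in miniature. The code below is a hypothetical illustration, not Cognos functionality: a single report definition parameterized by the caller's data-access scope and locale, so IT maintains one artifact rather than a dozen regional copies.

```python
# Hypothetical data and per-locale strings; names are illustrative only.
REVENUE = [
    {"region": "EMEA", "amount": 120},
    {"region": "AMER", "amount": 200},
    {"region": "APAC", "amount": 90},
]
TITLES = {"en": "Quarterly Revenue", "de": "Quartalsumsatz"}

def run_report(user_regions, locale="en"):
    """One report definition for every user: the security scope filters
    the rows, the locale selects the labels."""
    rows = [r for r in REVENUE if r["region"] in user_regions]
    title = TITLES.get(locale, TITLES["en"])
    return {"title": title, "rows": rows}

# The German EMEA manager and the global analyst run the same report.
print(run_report({"EMEA"}, locale="de"))
print(run_report({"EMEA", "AMER", "APAC"}))
```

The point is that security and localization become parameters of one definition instead of reasons to fork it.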
The approach is said to take cost considerations out of the deployment -- with "tens of millions of dollars" saved at IBM -- while also encouraging "single-version-of-the-truth" consistency across the enterprise. Of course, IBM's Business Consulting Services division has multiple initiatives and projects aimed at providing cloud-based computing capacity as a service. But think of "Smart Analytics Cloud" as a route to building efficient, shared data and analysis resources with an internal, services-based approach.
Public, Private and Hybrid Options
Informatica addressed both private- and public-cloud scenarios this week in announcing a new Cloud 9 Platform, in addition to new services and upgrades to existing software-as-a-service (SaaS) offerings. Cloud 9 is billed as a multi-tenant, enterprise-class "data integration platform as a service," and it's aimed at helping developers and systems integrators build, share, reuse and run data integration services and data quality mappings in the cloud.
"Data integration will be the defining capability for cloud computing -- not outsourced data centers or sexy new application solutions," writes Chris Boorman, Informatica's chief marketing officer, in an introductory blog post on Cloud 9. "To really embrace cloud computing, one needs a whole new way of delivering enterprise data integration that brings together the ease of use that business users require with the sophistication that must be delivered for IT architects. Otherwise, cloud computing will simply remain the domain of non-critical fancy-looking applications on the periphery of true enterprise business requirements."
Support for business-IT collaboration is a core capability of the November 10 Informatica 9 release, and Cloud 9 uses the platform's collaborative underpinnings to help developers and business users build, share and deploy data integration or data quality components as services on private or public clouds.
Informatica also announced a Winter '09 release of its existing Informatica Cloud Services this week. These SaaS-based data integration applications let non-technical Salesforce CRM and Force.com users tap into on-premise or cloud-based data. The upgrades are said to deliver broader connectivity, more flexible scheduling capabilities and better support for assignment rules associated with marketing leads. A new sandbox feature lets users copy data synchronization, replication and quality tasks for non-production work such as development and testing. The vendor also introduced Informatica Address Quality Cloud Services, which offer a SaaS-based version of the company's Address Doctor capabilities to verify and correct postal address details in 240 countries.
Advancing its offerings for Amazon EC2, the vendor introduced Informatica Data Quality Cloud Edition, which is said to deliver a broad range of data quality services, including profiling, cleansing, matching and monitoring. These services can be flexibly turned on and off as needed through Amazon's pay-as-you-use model, and they are compatible with the recently announced Amazon Relational Database Service (Amazon RDS) as both source and target endpoints, so you can access and store custom application data in the cloud.
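The profile-cleanse-match sequence described above can be sketched generically. The code below is a minimal, hypothetical illustration of that kind of data quality pass against a relational source and target, using Python's built-in sqlite3 as a stand-in for Amazon RDS; it does not use Informatica's actual APIs.

```python
import sqlite3

# In-memory sqlite3 database standing in for a cloud relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (name TEXT, email TEXT)")
conn.executemany("INSERT INTO contacts VALUES (?, ?)", [
    ("  Alice ", "ALICE@EXAMPLE.COM"),
    ("Bob", "bob@example.com"),
    ("bob", "BOB@example.com"),   # duplicate under case-insensitive matching
])

# Profiling: inspect the data before touching it.
total = conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]

# Cleansing: trim whitespace and normalize email case in place.
conn.execute("UPDATE contacts SET name = TRIM(name), email = LOWER(email)")

# Matching: collapse records that now share an email (a crude exact-match rule).
deduped = conn.execute(
    "SELECT email, MIN(name) FROM contacts GROUP BY email"
).fetchall()

print(total, len(deduped))  # 3 source rows, 2 surviving records
```

Real data quality services use far richer matching rules (fuzzy matching, survivorship logic), but the source-to-target flow is the same shape.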
With this week's announcements Informatica emphasized that it has been working on cloud-based approaches for three years, and it peppered its press releases with quotes from customers who have been working with established services that have been commercially available for many months.
"We're using Informatica Cloud Services to replicate millions of rows of data from Salesforce CRM to a centralized database running on Amazon EC2," stated Stephen Brown, technical architect at Telegraph Media Group. "As we think about moving more of our IT infrastructure to the cloud, the ability to develop more complex mappings and workflows and run them as custom services for line-of-business managers will allow us to continue to provide self-service while IT remains in control."
There was much talk of a "Hybrid Cloud" model at this week's Interop event, and it centered on the same combination: self-service flexibility with IT still in control. The private approach appeals to those with privacy and security concerns, while the public options free the enterprise from the rigors of infrastructure investment and management. Even if enterprises throw in the towel and farm out compute capacity and data storage to low-cost cloud providers, there will still be plenty of work for IT, says Croll of Bitcurrent. "Their role simply moves up the stack to managing policy and provisioning," he says, and within large organizations, that alone is more than enough to keep armies of IT workers busy.