More On '4 Technologies Reshaping BI'

Last week's in-depth feature, "4 Technologies That Are Reshaping Business Intelligence," generated a number of comments and questions. Here are three of the deeper musings and inquiries along with my thoughts on parallel processing vs. in-memory technology, uses of stream processing/complex event processing, and growing demand for predictive analytics...

Doug Henschen, Executive Editor, Enterprise Apps

September 10, 2009

5 Min Read

Comment on massively parallel processing (MPP) vs. in-memory technology: I enjoyed reading your August 31 story "4 Technologies That Are Reshaping Business Intelligence." I regularly hear the MPP database vendors bat down in-memory solutions as a transitory bump-up in scalability and performance, effective for now but not on a long-term roadmap. They contend that when data volumes and analytics workloads inevitably grow, in-memory hardware becomes less practical and economical, and it ultimately hits scalability limits. Of course, as SSDs become more economical and replace conventional disk drives, MPP databases will deliver at large scale something closer to what in-memory solutions provide today at a smaller scale. --Mike

Mike, that's good insight on the compatibility of in-memory and MPP. Teradata and IBM are both working on SSD-based MPP appliances that promise the best of both worlds, but the cost will no doubt dictate that they take on high-value analyses with comparatively modest data sets.

Yes, MPP systems based on conventional disk technologies can quickly process queries across huge data sets. But not every problem involves 10-terabyte, 50-terabyte or 100-terabyte-plus data stores. The vast majority of warehouses are still in the sub-five-terabyte range, and sometimes it's only the latest data, rather than vast histories, that is relevant to a query. Long story short, both technologies have their place. Depending on the scale of your data and the value of speedy analysis, you could choose one or the other, or you might take advantage of both.
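To make the parallel-processing idea a bit more concrete, here is a minimal sketch (in Python, purely for illustration) of the scatter-gather pattern an MPP database applies to queries across many nodes: each worker scans only its own slice of the data and returns a partial aggregate, and the partials are combined at the end. The data and worker count are invented; a real appliance shards tables across nodes and parallelizes far more than a simple sum.

```python
# Toy scatter-gather aggregation: each "node" works on its own shard.
from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker aggregates only its own slice of the data."""
    return sum(chunk)

if __name__ == "__main__":
    rows = list(range(1_000_000))                 # stand-in for a large fact table
    n_workers = 4
    chunks = [rows[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(partial_sum, chunks)  # scatter: each worker scans its shard
    print("total =", sum(partials))               # gather: combine partial aggregates
```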

While we're on the topic, Alex Wolfe of information recently offered this video report on SAP BusinessObjects Explorer, one of the in-memory technologies discussed in my report. The video focuses primarily on the tool's Internet-search-style querying (which came out of the vendor's Polestar technology), but toward the end you'll hear a few points on the speed of in-memory analysis.

Question on the use of stream processing/complex event processing (CEP) technology: I was very interested in your recent BI report. Could this [stream processing] system be adapted to the financial services industry to create a "surveillance system" that could detect abnormal patterns and trends in financial transactions and operations at brokerage firms and banks? If so, I could see such a system being configured as an early warning system for fraud detection. --Gunther

Yes, complex event processing/stream processing technologies are already being applied quite aggressively in security, surveillance and risk-detection roles. I mentioned Wall Street brokerages in the article, but they weren't the only pioneers of the technology; leading government intelligence agencies were also early adopters. Today, many banks are using CEP to detect money laundering and other suspicious financial activity. CEP specialists include Aleri, Progress Apama and Streambase, and more recently companies including IBM and Oracle have acquired their way into the market. Just last week, Informatica joined the CEP market by acquiring Agent Logic.

Questions on stream processing/CEP and predictive analytics: I just finished reading your story... and I have a couple of questions. First, when you talk about stream processing technology, is that similar to OLAP, or is it what facilitates OLAP? Second, it seemed kind of contradictory that the most important thing in BI software [cited by survey respondents] is still [query] speed while predictive analytics ranks so low on [respondents' lists]. I know you said in the story that may be because users aren't familiar with predictive capabilities. But then why are IBM and SAS trying to market something the business user isn't ready for yet? --Nea

Regarding stream processing/CEP, that technology is not really an alternative to OLAP. The in-memory technologies discussed in the article do present a way to do OLAP-style multidimensional analysis without the pre-built cubes and data aggregations often required by conventional OLAP technologies. By taking the in-memory route, you can avoid IT labor and delays while gaining the flexibility to quickly query large volumes of data without aggregation (within memory limits, of course).
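As a rough illustration of what "OLAP-style analysis without pre-built cubes" means, here is a minimal sketch using pandas (my choice purely for illustration, not the engine any of these vendors use): detail-level rows sit in memory and the analyst pivots and drills along whatever dimensions they like, with aggregates computed on the fly. The sales data and column names are hypothetical.

```python
# Ad hoc slicing and dicing over in-memory detail records; no cube is pre-built.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2"],
    "revenue": [1200.0, 800.0, 950.0, 1100.0, 600.0],
})

# A pivot across dimensions the analyst picks at query time.
print(sales.pivot_table(index="region", columns="quarter",
                        values="revenue", aggfunc="sum"))

# Drill down along a different dimension without rebuilding anything.
print(sales.groupby(["region", "product"])["revenue"].sum())
```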

In the case of stream processing/CEP, you're looking at data as it is being created in transactional systems rather than historical information stored in a database, mart or warehouse. You're looking for patterns that signal conditions in the moment (shifts in stock value, system outages, financial exposure, network hacking, etc.). Complex event processing vendors have developed capabilities to store data stream histories (for comparison and analysis), but the emphasis is on spotting current conditions.
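Here's a minimal sketch of that event-at-a-time pattern matching, with a hypothetical fraud-style rule: flag an account that makes more than a set number of high-value transfers within a short time window. The event fields, thresholds and rule are invented for illustration; real CEP engines express such rules declaratively and handle far richer correlations.

```python
# Toy CEP rule: evaluate each event as it arrives, match a pattern in a window.
from collections import defaultdict, deque

WINDOW_SECONDS = 300        # look-back window per account (hypothetical)
MAX_HIGH_VALUE_EVENTS = 3   # alert if exceeded within the window (hypothetical)
HIGH_VALUE = 10_000.0

recent = defaultdict(deque)  # account -> timestamps of recent high-value events

def on_transaction(account: str, amount: float, ts: float) -> None:
    """Called for each transaction as it streams in from the source system."""
    if amount < HIGH_VALUE:
        return
    window = recent[account]
    window.append(ts)
    # Drop events that have fallen out of the time window.
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) > MAX_HIGH_VALUE_EVENTS:
        print(f"ALERT: {len(window)} high-value transfers from {account} "
              f"within {WINDOW_SECONDS}s")

# Events arrive one at a time, in the moment, not from a warehouse query.
on_transaction("acct-42", 12_500.0, 1000.0)
on_transaction("acct-42", 15_000.0, 1060.0)
on_transaction("acct-42", 11_000.0, 1120.0)
on_transaction("acct-42", 13_750.0, 1180.0)  # fourth in three minutes -> alert
```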

Regarding analytics and prediction, the techniques and technologies have been around for a long time, but they have not been as broadly deployed as conventional BI. This is due in part to the cost and complexity of the software and also to the cost of qualified analytics practitioners. IDC separates the larger BI market into "query, reporting and analysis tools" (conventional BI) and "advanced analytics" (predictive and statistical analytics). The former was estimated at $6.3 billion in 2008 while the latter was just $1.5 billion.

What SAS, IBM and others see is the opportunity to bring advanced analytics capabilities to that broader market of conventional BI practitioners. The trick will be generalizing the capability in a way that delivers value without requiring PhDs to make it work. Cookie-cutter analytics didn't work for Dealer Services (the case example in my story), so it developed (and will have to maintain) a customized model. If a canned model or application doesn't work for your business, you may need high-priced talent to make sophisticated analytics such as prediction work for you. Not every business can justify or support that, so it hasn't been as broadly adopted as conventional BI.
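To illustrate the reporting-versus-prediction distinction in the simplest possible terms, here is a hedged sketch: conventional BI reports what already happened (say, how many customers churned last quarter), while a predictive model scores what is likely to happen next. The churn data, features and use of scikit-learn below are purely hypothetical and not drawn from the Dealer Services case.

```python
# Toy predictive model: learn from historical outcomes, score current customers.
from sklearn.linear_model import LogisticRegression

# Historical customers: [months_as_customer, support_tickets] and churned (1) or not (0).
X_history = [[24, 1], [3, 5], [36, 0], [6, 4], [18, 2], [2, 6]]
y_history = [0, 1, 0, 1, 0, 1]

model = LogisticRegression()
model.fit(X_history, y_history)

# Score current customers: a probability of a future outcome, not a report of the past.
current = [[30, 1], [4, 5]]
for features, p in zip(current, model.predict_proba(current)[:, 1]):
    print(f"customer {features}: churn risk {p:.0%}")
```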

Hope this helps!

Doug Henschen

About the Author

Doug Henschen

Executive Editor, Enterprise Apps

Doug Henschen is Executive Editor of information, where he covers the intersection of enterprise applications with information management, business intelligence, big data and analytics. He previously served as editor in chief of Intelligent Enterprise, editor in chief of Transform Magazine, and Executive Editor at DM News. He has covered IT and data-driven marketing for more than 15 years.
