Microsoft SQL Server 2008 R2: A Closer Look

PowerPivot for Excel and PowerPivot for SharePoint speed multidimensional analysis. Parallel Data Warehouse edition now set for 2010 with IBM as a hardware partner.

Doug Henschen, Executive Editor, Enterprise Apps

November 9, 2009

6 Min Read

In-memory analysis is clearly the big headline in the "R2" release of Microsoft SQL Server 2008, announced last week and expected in the first half of 2010. But the upgrade also promises master data management functionality, stream processing capabilities and improved datacenter administration. Current licensees will be able to sample it all in a Community Technology Preview (CTP) to be released this month. The three big themes behind the upgrades are pervasive insight, IT efficiency and dynamic development. To dive into the details, here's a synopsis of what to expect in the next version of Microsoft SQL Server.

In-Memory Analysis

Microsoft SQL Server 2008 R2 will support in-memory analysis with two "PowerPivot" capabilities: PowerPivot for Excel and PowerPivot for SharePoint. The first is an in-memory business intelligence (BI) add-in for Excel that will enable desktop users to bring together millions of rows of data for rapid, multidimensional analysis.

"This is a column-organized BI engine with high compression and high scan rates that will enable Excel users to work with very large amounts of information right on their desktop," says Tom Casey, General Manager, SQL Server Business Intelligence at Microsoft. "We often demonstrate the potential by using 100 million rows of data, but more typically users will be looking at many thousands of rows of information."

Speed and data capacity are limited only by the available processing power and memory of the hardware, according to Casey. The add-in will also let users mash up data from third-party providers, Web sources, external databases and so on, as if they were working with tables within Excel. "PowerPivot can infer many of the relationships and guide you in defining the relationships that need to be defined," Casey says. "Slicers" built into Excel are said to support rapid, in-memory filtering, aggregation and navigation across large volumes of data.
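
PowerPivot itself is driven from the Excel ribbon rather than from code, but the column-organized, in-memory style of analysis Casey describes can be approximated in a few lines of Python with pandas. The table, column names and row count below are invented purely for illustration; the point is that low-cardinality columns compress well (here via categorical encoding) and that a "slicer" amounts to a filter applied before an in-memory aggregation.

```python
# Rough sketch, not Microsoft's engine: column-style, in-memory
# filtering and aggregation over millions of rows using pandas.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 5_000_000  # "millions of rows," held entirely in memory

# Low-cardinality columns are stored as categoricals, a rough stand-in
# for the dictionary compression a column store applies.
sales = pd.DataFrame({
    "region":  pd.Categorical(rng.choice(["NA", "EMEA", "APAC"], n)),
    "product": pd.Categorical(rng.choice([f"SKU-{i}" for i in range(200)], n)),
    "units":   rng.integers(1, 10, n),
    "price":   rng.uniform(5, 500, n).round(2),
})

# A "slicer" is, in effect, a predicate applied before aggregation.
emea = sales[sales["region"] == "EMEA"]
print(len(emea), "rows after slicing to EMEA")

# Multidimensional rollup: revenue by region and product, all in memory.
summary = (
    sales.assign(revenue=sales["units"] * sales["price"])
         .groupby(["region", "product"], observed=True)["revenue"]
         .sum()
)
print(summary.head())
```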

PowerPivot for SharePoint facilitates collaboration by letting users upload PowerPivot-driven analyses to SharePoint. Microsoft has long had integrations between SQL Server Reporting Services and SharePoint; PowerPivot for SharePoint brings the same approach to Workbooks built in PowerPivot for Excel. Instead of sharing these analyses via e-mail or USB keys, however, they are shared securely on SharePoint. IT administrators will use the SharePoint security model to secure the workbooks, and schedule and manage the refresh rates of the data from the original sources.

The drawback of these PowerPivot offerings is that they will work only with Excel 2010 and SharePoint 2010, respectively. The one crack in this restriction is that PowerPivot analyses developed in Excel 2010 and posted to SharePoint 2010 will be shareable with Excel users who don't have the latest edition of Office/Excel. In that case, "the functionality of PowerPivot analyses will also work through Excel Services in SharePoint," Casey explains. So Excel 2007 users will be able to slice, dice and navigate PowerPivot analyses.

Master Data Management

R2 will support master data management (MDM) with what Microsoft calls "Master Data Services." The focus is on analytical MDM, Microsoft says, which means it's about managing master data as used in warehouses, marts and other analytic resources -- not the harder job of managing master data as it's used in mission-critical transactional systems.

In classic Microsoft style, the emphasis here is on exposing functionality that ISVs and developers can use to create end-user applications. In other words, you don't get product MDM, customer MDM or other domain-specific MDM solutions; you get generic capabilities, including hierarchy management, metadata resolution, data viewing and data stewardship, that can be used to build domain-specific apps. On the upside, you'll be able to keep multiple systems of record "synchronized, normalized and visible" across business units and functions, but the domains themselves will have to be built by either partners or direct customers.

"A bunch of our go-to-market effort will occur with partners and service providers that will build out domain-specific solutions," Casey says. "We're focused on the platform elements."

One platform element that will be differentiating, according to Casey, is end-user-ready data stewardship capabilities that will enable domain experts to fill in data hierarchies, mappings and other elements of MDM solutions using SharePoint for collaboration.
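
Master Data Services will expose this functionality through its own services and front ends rather than code like the following, but the underlying problem of keeping overlapping systems of record "synchronized, normalized and visible" can be sketched in plain Python. The source systems, fields and survivorship rule below are invented for illustration; a real MDM deployment would layer matching, hierarchies and stewardship workflows on top.

```python
# Illustrative only, not the Master Data Services API: a toy "golden
# record" merge. Two source systems hold overlapping customer data keyed
# by e-mail, and a survivorship rule picks which value to keep per field.
crm = {
    "a@example.com": {"name": "Ann Smith", "phone": None,       "updated": "2009-10-01"},
    "b@example.com": {"name": "Bob Jones", "phone": "555-0102", "updated": "2009-09-12"},
}
billing = {
    "a@example.com": {"name": "A. Smith",  "phone": "555-0199", "updated": "2009-10-20"},
    "c@example.com": {"name": "Cara Lee",  "phone": "555-0155", "updated": "2009-08-30"},
}

def merge(records):
    """Survivorship: prefer the most recently updated non-empty value."""
    newest_first = sorted(records, key=lambda r: r["updated"], reverse=True)
    return {
        field: next((r[field] for r in newest_first if r[field]), None)
        for field in ("name", "phone", "updated")
    }

# Build the master list: one normalized record per customer key.
master = {
    key: merge([system[key] for system in (crm, billing) if key in system])
    for key in sorted(crm.keys() | billing.keys())
}

for email, record in master.items():
    print(email, record)
```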

Complex Event Processing

Known variously as complex event processing (CEP), stream processing and event processing, this technology is about analyzing high volumes of fast-moving data in real time so you can detect opportunities or threats and quickly act on that information. Wall Street trading firms and intelligence agencies were among the first to exploit CEP, but it's gradually going mainstream. Uses include RFID analysis, dynamic logistics planning and adaptive manufacturing demand-and-response systems.
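
The sketch below is not Microsoft's engine; it's a minimal, language-neutral illustration in Python of the CEP pattern this paragraph describes: evaluate a standing query over a sliding time window as events arrive, and act the moment a condition is met rather than after the data lands in a database. The RFID scenario, window size and threshold are invented for illustration.

```python
# Minimal CEP sketch: keep a sliding time window over an arriving event
# stream and fire an alert the moment a condition holds.
from collections import deque

WINDOW_SECONDS = 60      # look back over the last minute of events
THRESHOLD_READS = 5      # alert when one tag is read this many times

def detect_repeat_reads(events):
    """events: iterable of (timestamp_in_seconds, tag_id), in arrival order."""
    window = deque()     # events still inside the time window
    counts = {}          # tag_id -> number of reads within the window
    for ts, tag in events:
        window.append((ts, tag))
        counts[tag] = counts.get(tag, 0) + 1
        # Expire events that have slid out of the window.
        while window and ts - window[0][0] > WINDOW_SECONDS:
            _, old_tag = window.popleft()
            counts[old_tag] -= 1
        if counts[tag] >= THRESHOLD_READS:
            yield ts, tag  # act immediately; storage is a separate concern

# Simulated stream: one tag read every 3 seconds for 5 minutes.
stream = [(t, "TAG-42" if t % 7 else "TAG-9") for t in range(0, 300, 3)]
for ts, tag in detect_repeat_reads(stream):
    print(f"t={ts}s: {tag} read {THRESHOLD_READS}+ times within {WINDOW_SECONDS}s")
    break  # the first alert is enough for the demo
```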

As with the MDM capabilities in R2, Microsoft emphasizes that the new Stream Insight capabilities are "ingredient technologies" for domain-specific applications. Casey says Microsoft will specifically exploit Stream Insight for real-time updates of data warehouses and adds that the potential for other high-throughput applications shouldn't be taken lightly.

"This is an all-new engine that is designed to operate on streams, and it will not be I/O bound," he explained, adding that it will "support processing at very high scale and with great flexibility. Where you write to or serialize to is up to the application builder." In other words, the service won't be enslaved to relational database speeds and storage paradigms. High-Scale Deployments

Microsoft has two developments in the works that will support high-scale database deployments: SQL Server Datacenter and SQL Server 2008 Parallel Data Warehouse. The Datacenter offering is aimed at modernizing the SQL Server architecture to take advantage of up to 256 processors (four times the old limit of 64), thereby supporting mission-critical deployments.

"This is consistent with Windows Server Datacenter Edition, and it will support unlimited virtualization," Casey says.

The Datacenter edition will work hand-in-hand with upgraded R2 application- and multiserver-management capabilities that let administrators declare objects to be part of the data tier of an application and then manage, deploy, monitor, secure and otherwise administer those assets as a single unit. That should be "a big gain for IT efficiency," Casey says, particularly when looking for consistent deployment across test and production environments.

The Parallel Data Warehouse edition is based on the scale-out, massively parallel processing technology Microsoft acquired last year when it purchased DATAllegro. This is a separate product that is not included in the November CTP. This edition is in "private CTP," according to Casey, and scheduled for release in the first half of 2010.

The surprise news last week was that Microsoft has added IBM to its list of hardware partners, which previously included only HP, Dell and Bull. IBM and storage vendor EMC have been added for both the Fast Track Data Warehouse reference configurations and the coming Parallel Data Warehouse edition. Casey says the move was made in the interest of customer choice, but both IBM and Microsoft undoubtedly have their eyes on Oracle. Oracle's pending acquisition of Sun, which will mark Oracle's entry into the hardware market and also brings with it the open-source MySQL database that competes with SQL Server, gives both rivals good reason to work together. Microsoft will be using IBM's X series servers and Intel processors.

Microsoft announced last week that HP's reference configurations for SQL Server now scale up to 48 terabytes, up from 32 terabytes previously.


About the Author

Doug Henschen

Executive Editor, Enterprise Apps

Doug Henschen is Executive Editor of InformationWeek, where he covers the intersection of enterprise applications with information management, business intelligence, big data and analytics. He previously served as editor in chief of Intelligent Enterprise, editor in chief of Transform Magazine, and Executive Editor at DM News. He has covered IT and data-driven marketing for more than 15 years.
