Analytics technologies lend enterprise content management a hand
While we hear about big data all the time -- like a Fleetwood Mac song -- the truth is that SharePoint has largely been about small data.
Big data offers the opportunity to exploit a trove of company data and act on it, which has become a strategic imperative for business. But historically SharePoint has not enabled complex analytics.
As a refresher, big data refers to data sets that are large, complex and rich. Most organizations have historical data that may trace back many years and that is strategically valuable. The problem is that few organizations have the tools or expertise to exploit big data in the form of analytics.
SharePoint and small data
As the SharePoint platform has evolved, it has become better integrated with the Microsoft suite of business applications, from SQL Server to Office. That integration is truly off and running with SharePoint 2013.
SharePoint is built on SQL Server, whose tool set can store and analyze large data sets. In addition, Microsoft introduced out-of-the-box, user-friendly business intelligence with SQL Server 2005: a tool set for creating data warehouses and data marts; convenient extract, transform and load (ETL) features for staging data for analysis; and a powerful variation of Transact-SQL that enables queries over n-dimensional data structures. As a result, you don't need business intelligence experts in-house for a successful BI project -- just SQL Server 2005 or higher and savvy IT personnel.
SharePoint 2010 contained PerformancePoint, Microsoft's dashboarding/key performance indicator technology, which is tied to SQL Server Reporting Services and Excel and makes it easy to deliver business analytics.
So, where's the disconnect? Why is SharePoint still a bite-size data management environment, and why don't we see it as a player in big data?
Power to the people
Hands-on analytics in SharePoint historically used Excel Services.
But that created a serious barrier: While SharePoint empowered users on the business side by letting them, rather than IT, crunch data, it also had a hard, 2-GB size limitation on file objects it hosted. In big data terms, 2 GB is nothing.
Today's SharePoint makes two key feature enhancements to overcome the big data barrier while diminishing reliance on IT. First, it introduces Open Data Protocol, or OData, which builds on the REST application programming interface to get information in and out of Excel workbooks that reside in SharePoint. Business users can now link an analytics workbook to a range of data sources simultaneously: The data no longer has to reside in SQL Server or in a data warehouse to be analyzed.
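To make the OData access pattern concrete, here is a minimal Python sketch of how a client might address a table inside a SharePoint-hosted workbook through the Excel Services REST endpoint. The site URL, document library, workbook and table names are all hypothetical, and the URL shape assumes the SharePoint 2013 Excel Services endpoint (`_vti_bin/ExcelRest.aspx`); verify the exact path against your farm's configuration before relying on it.

```python
from urllib.parse import quote

def excel_odata_url(site_url, library, workbook, table, top=None):
    """Build an OData URL for a table in an Excel workbook hosted in
    SharePoint, following the Excel Services REST endpoint shape.
    All names passed in are caller-supplied assumptions, not fixed values."""
    url = (
        f"{site_url.rstrip('/')}/_vti_bin/ExcelRest.aspx/"
        f"{quote(library)}/{quote(workbook)}/OData/{quote(table)}"
    )
    if top is not None:
        # OData $top limits how many rows the service returns.
        url += f"?$top={top}"
    return url

# Hypothetical site, library, workbook and table names:
url = excel_odata_url("https://intranet.example.com/sales",
                      "Shared Documents", "Pipeline.xlsx", "Deals", top=50)
print(url)
```

A business user's tooling (or a script authenticated against the site) could then issue an ordinary HTTP GET against that URL and receive the table rows as an OData feed, with no need to move the data into SQL Server first.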
Second, and even more important, while the 2-GB file object limit still applies in SharePoint, there is no hard limit on the amount of data that can be manipulated in memory during analytical processing. So, a user can create an analytical workbook, dump as much data into it as the physical limits of the server permit, and process the data. Only the result -- the file object stored back in SharePoint -- is subject to the 2-GB limit.
After the crunch
Processing capability isn't all SharePoint 2013 offers. Beyond performing data analytics in worksheets and handling huge data sets in the process, it brings a truckload of new tools to the task: enhancements to PowerPivot; new data display functionality; Power View, an interactive data exploration and visualization tool; and even an iPad interface for PerformancePoint dashboards.
But that's another story. What matters is that SharePoint 2013 finally makes big data in SharePoint a reality.