The customer-facing component of the Enterprise DW/BI service is the BI architecture itself. As BI tools vastly improve their design and deployment models, in addition to providing a wide range of analytic and visualization services, the business community's desire for a self-service deployment model presents some interesting challenges to the traditional IT service model. How can IT provide an effective self-service model to its various end-user communities within the enterprise, yet still ensure adequate stability and reliability across the enterprise? If we let end users develop and deploy user-defined tables, imaginative semantic layers, reports, dashboards, scorecards, etc., surely we will end up with a mishmash of objects that undermines the integrity and performance of the overall enterprise data warehouse. But they are the customers, and our job is to ensure happy customers, right?
Consider a company that has grown through merger and acquisition. To retain the innovation and creativity of the individual business entities, these business units are encouraged by executive management to do their own thing. Is there a way to embrace this autonomy when providing a BI environment that promotes innovation and discovery of new opportunity, WITHOUT compromising the enterprise data warehouse's performance and stability?
The short answer is, we need to find a way to embrace innovative and nimble business communities. But how? I'm interested in your thoughts.
When I was first exposed to the new wave of real-time Operational BI solutions coming online, my initial reaction was, "Oh man, here we go again! Fighting the BI vendors' 'sizzle sell' and the 'EDW heresy' that comes along with them selling direct connects to the operational support systems and the propagation of independent data marts."
But I owed it to myself to get a better appreciation for the solution. The Operational BI I'm referring to is the real-time connection to federated sources (operational support systems) through web-service agents, and the direct updates of "twinkling" Operational Dashboard measures. End users can dynamically set performance thresholds and alerting schemes as needed to react quickly to blips in the operational metrics. This is a nice fit for specialized applications such as a Command and Control Center.
What about the metrics that accumulate throughout the day? Push and persist them to the Enterprise Data Warehouse. This provides a history of these real-time metrics, enabling more strategic trending and data mining functions.
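As a minimal sketch of this "push & persist" idea, the snippet below buffers real-time dashboard measures in memory and periodically flushes them to a warehouse history table. The table name, metric names, and the use of an in-memory SQLite database as a stand-in for the EDW are all hypothetical, purely for illustration.

```python
import sqlite3
import time

# Stand-in for the Enterprise Data Warehouse (hypothetical schema).
warehouse = sqlite3.connect(":memory:")
warehouse.execute(
    "CREATE TABLE rt_metric_history (metric TEXT, value REAL, captured_at REAL)"
)

buffer = []  # "twinkling" dashboard measures accumulated during the day

def push(metric, value):
    """Capture a real-time measure as it updates on the dashboard."""
    buffer.append((metric, value, time.time()))

def persist():
    """Flush the buffered real-time metrics to the warehouse history table."""
    warehouse.executemany(
        "INSERT INTO rt_metric_history VALUES (?, ?, ?)", buffer
    )
    warehouse.commit()
    buffer.clear()

push("orders_per_min", 42.0)
push("orders_per_min", 47.5)
persist()

rows = warehouse.execute("SELECT COUNT(*) FROM rt_metric_history").fetchone()[0]
print(rows)  # 2
```

Once persisted, the accumulated history is available for the trending and data mining work that a purely in-memory dashboard cannot support.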
Operational BI has a home in the overall Enterprise Information Management architecture. But is your enterprise ready for real-time Operational BI? Is the business prepared to react and evolve in response to real-time changes? The technologies are ready to enable it, and the vendors are ready to sell it.
It's a paradigm shift. But hey, every paradigm needs a nudge once in a while!
I was a judge for three of this year's TDWI 2007 Best Practices award categories, and recently took some time to review some of the other winners. This year TDWI added the “Radical BI” category. I didn’t judge this category but was nevertheless intrigued by some of the entries.
The Radical BI winner was Lawrence Livermore National Laboratory. In a nutshell, LLNL is the organization responsible for ensuring that nuclear weapons in the United States stay safe and effective through advances in science and technology. That’s a big job. I happened to have one of their managers, Dave Biggers, in my “BI From Both Sides” class in Boston, and he added some wise context to some of the best practices we cover in that class.
LLNL’s data warehouse goes back to 1985, the same year I started working in data warehousing. (No, don’t start doing any math in your head. Stop it!) The philosophy behind their data warehouse, called ASSIST, was that it would be built on a modular architecture supporting both batch and on-demand queries; that it would allow users to create their own “virtual dimensions” to better support intuitive analytics; that desktop data integration would close the loop between user-defined Excel spreadsheets and corporate data; and that an open and loosely coupled architecture could support myriad business needs.
The evolution of the original ASSIST system has been continuous, arguably even one step ahead of the prevailing technology trends. But when you ask Dave Biggers why LLNL won the award, he keeps it simple. “I’m sure our philosophy of enabling users and embracing direct access and tools such as Excel had a bearing on TDWI’s selection,” he says modestly. “The essence of our approach has been modularity, data stewardship, virtual dimensions, and openness.”
What’s Radical BI, anyway? Well, judging from the Livermore application, it’s about meeting end-users where they are, and providing them with the tools and data they need in order to do their jobs—and doing it flexibly without a lot of dogma. And amid a lot of vendor spin, futures predictions, and market-ecture, that’s radical indeed!
Technorati Tags: BI best practices, Lawrence Livermore National Laboratory, Radical BI
It seems that the best practice for selecting a solution now starts at a new point of origin. It used to be that when you looked at vendors, you started with a needs analysis and proceeded from there. With all of the mergers and acquisitions in the marketplace these past couple of months, choosing what type of vendor to go with is now part of the challenge. Do you want best of breed, or do you want an end-to-end solution?
Earlier today I spoke with a BPM vendor that offers performance management on an on-demand platform. They do it well and focus on the financial planning, forecasting, and consolidation space. They don't, however, offer a database, specialized ETL tools, or even data quality or integration tools. They have marquee clients and have been in the space for years.
I understand that existing architecture drives some of the decision, but I am curious as to what else comes into play. Is it always price and feature driven? Can you still get an economically solid deal from a company with an end-to-end solution? Does your personal like or dislike of SAP or Oracle stop you from including OutlookSoft or Hyperion in your RFP? And what about office politics?
I'd like to hear your thoughts. Please comment below and share with our readers your feelings on what drives the decisions and where your budget fits in with needs and solutions.
Also, thank you for visiting the site and the blog. As a Co-founder of the Business Intelligence Network I am pleased and proud of this site and especially happy to be working with our partner TDWI to provide this resource to the community. Please visit often and we look forward to you participating and sharing your thoughts on best practices.
Co-founder and Executive Vice President
Business Intelligence Network
Tags: Business Intelligence, Best Practice, Business Intelligence Network, TDWI
I find it interesting that at this week's inaugural launch of TDWI's BI Best Practices web site, there are numerous articles on Return on Investment (ROI) for business intelligence applications.
If you are feeling a little left out because you have not calculated your project’s ROI, you are not alone. Survey after survey, including the latest in my upcoming book, shows that few calculate the ROI for their BI application. It might be used as a way of securing project funding but rarely as a way of measuring success or determining if the project achieved what was intended.
In judging the TDWI applications, we don’t look at ROI … much. We do, however, look at specific business benefits – savings of N million dollars, X increase in campaign effectiveness, Y lift in revenues, and so on. We try hard not to let the big numbers sway us. If someone claims they saved $4 million and has not provided an ROI, I look at what percentage that $4 million represents relative to the size of the company and its capital budget. In some cases that $4 million savings is still a big deal; in others, it’s small change.
The reality is that ROI is a very precise number with wildly imprecise inputs. Can anyone accurately say how much of a 10% increase in revenues is attributable to the BI project? No. Cost is pretty much the only accurate component; the numerator is a guesstimate. Even so, do the ROI, even if it’s just a quick calculation on the back of an envelope. It’s a wonderful promotional tool when convincing business users of the value of business intelligence.
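That back-of-envelope calculation can be sketched in a few lines. Every input below is a hypothetical estimate – the project cost, the revenue lift, and especially the attribution fraction, which is exactly the guesstimate described above.

```python
def roi(benefit, cost):
    """Simple ROI: net benefit over cost."""
    return (benefit - cost) / cost

# Cost is usually the only firm number; the benefit is a guesstimate.
project_cost = 1_500_000            # licenses, hardware, staff (known)
revenue_lift = 40_000_000 * 0.10    # 10% lift on $40M revenue (estimated)
attribution = 0.5                   # fraction credited to the BI project (a guess)

estimated_benefit = revenue_lift * attribution
print(f"Estimated ROI: {roi(estimated_benefit, project_cost):.0%}")
```

Varying the attribution fraction is the honest part of the exercise: halve it and this hypothetical project's ROI swings from positive to negative, which is why the number works better as a promotional tool than as a measure of success.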
Cindi Howson, founder of BIScorecard.
It is encouraging to see business intelligence implementations that are making a real impact on the bottom line. Ten years ago, data warehouses were rare, hard to rationalize, and difficult to cost-justify. Nowadays, it is a rare situation if your enterprise does not have some form of data warehousing or business intelligence implementation in place. Yet as ubiquitous as data warehousing has become, still only a select few are really making a significant impact on the operational bottom line. If the submissions for predictive analytics we reviewed this year are any indication of industry trends, we will see a renewed focus on data warehousing and business intelligence solutions. Finally, we're not just providing online "green bar" reports. We're implementing foundational components that enable an organization to achieve market superiority.
CEO - TeamDNA, Inc
It was exciting to see this year's candidates utilizing some new best practices along with some of the ones we have seen work successfully for many years. It is great to see these sound data warehouse techniques work on different projects in different industries - practices such as:
1 - Comprehensive Data Governance framework providing guidance in Data Classification, Information Security, and Industry Data Sharing policies
2 - Metrics Cascade and Cause and Effect metrics maps
3 - Collaborating with vendors and clients on the direction of the data warehouse
4 - Revenue-generating products developed from the data warehouse
5 - Preconditioned customer contact groups (selection groups)
6 - Use of a Data Derivation Engine by business users, without requiring IT involvement, to enable daily data updates that provide Contextual Customer Dialog
7 - Off-the-shelf (preconditioned) contact groups
8 - Data evaluated and standardized to the extent possible prior to project
9 - SMEs and specialists leveraged
10 - Business project sponsor involvement during project
11 - Data quality considered to be one of the major decision drivers
I look forward to seeing the innovative ideas in next year's competition!
Data Warehouse Principal - Chimney Rock Information Solutions