Business Intelligence Best Practices

Dashboards and Scorecards

by Hugh Watson
I attended the Executive Summit at the TDWI conference in Las Vegas last winter, an event for senior BI and data warehousing managers and their business sponsors. One of the sessions explored “what’s hot” in BI, and the participants voted on what was most important to them. Not surprisingly, dashboards and scorecards topped the list. This outcome is consistent with the results of other recent surveys.

Despite the attention given to scorecards and dashboards, it is my experience that the similarities and differences between these two approaches to business performance management (BPM) are little understood. I’ll give you my take on the topic in this article. I’ll begin with their precursors, and then I’ll discuss the technology currently available for implementing BPM based on the products I saw in the exhibit hall.

The Emergence of Dashboards and Scorecards

Even though dashboards and scorecards may seem relatively new, they are actually evolutionary developments. In the late 1970s, Jack Rockart (1979) introduced and popularized the critical success factors (CSFs) concept, which identifies and monitors what companies, business units, departments, and individuals must do well in order to be successful. Once identified and tracked, CSFs help organizations communicate about, and focus on, those factors: market share, number of new product introductions, the per-unit cost of production—whatever factors are most important.

As executive information systems (EISs) became popular in the 1980s, CSFs and key performance indicators (KPIs) were important components. Experts recommended that companies identify and monitor CSFs at the industry, company, work unit, and individual levels (Watson, Houdeshel, and Rainer, 1997).

In many ways, today’s dashboards and scorecards are the EISs of yesterday because of their focus on key performance metrics. Consider two examples.

Reading Rehabilitation Hospital
This hospital serves patients recovering from serious injury, illness, and surgery. In the 1980s, the hospital had an initiative to improve the quality of patient care. After considerable study, 90 metrics related to patient care were identified. Among them was the percentage of patient charts with incorrect entries by the doctors, nurses, or rehabilitation therapists. For each group, a chart showed how the hospital was doing over time and against goals.

The most interesting part of the story is that in addition to putting the charts in a computer-based system, the hospital posted the results dashboard in the cafeteria for everyone to see. The impact was dramatic. Within a week, the doctors (who initially had the worst record) improved significantly and were no longer at the bottom of the list. Over time, there was a spiral effect as each group improved its performance. This example illustrates the adage, “That which gets watched gets done.”

Fisher-Price

In the mid-1980s, Fisher-Price, a leading manufacturer, marketer, and distributor of children’s toys, experienced a dramatic drop in sales when video games appeared (Volonino and Watson, 1990-91). The company was slow to detect the change in the marketplace because of deficiencies in its information systems.

In response, Fisher-Price modified its business strategy to emphasize the need to be market driven in order to quickly respond to changing market conditions. It also decided to develop an EIS that monitored business processes and provided information to all the people involved in getting the right products quickly to the marketplace—sales, distribution, inventory, production, suppliers, product design, and senior management.

Two parts of the Fisher-Price story are especially important. The first is that the EIS was designed specifically to support the business strategy. The second is that users of the system ranged from sales reps to senior management. One of the common misconceptions about EISs is that they are created only for executives. When David Friend was CEO of Pilot Software (one of the leading EIS software vendors at the time), he argued that EIS should stand for “everybody’s information system,” and that if an EIS does not spread throughout an organization, it is unlikely to survive.

Comparing Dashboards and Scorecards

The systems developed at Reading Rehabilitation Hospital and Fisher-Price illustrate the key components and the similarities and differences between dashboards and scorecards.

The most important similarity is that both systems contain metrics that communicate what is important, monitor what is taking place, and help people be successful in their work. Metrics are displayed in dashboards in both systems. Scorecards go beyond dashboards. They explicitly link the dashboards to business strategy. This linking is the most significant difference between the two.

About Scorecards

Kaplan and Norton (1991) created awareness of the potential for scorecards with their Harvard Business Review article, “The Balanced Scorecard—Measures that Drive Performance.” A critical part of what they call the balanced scorecard is having metrics for four perspectives: financial, customer, internal business, and innovation and learning. They argue that while financial measures are important, they are backward-looking, and that forward-looking measures, such as those for innovation and learning, are needed to assess future success.

Kaplan and Norton’s (2004) thinking has evolved. They currently advocate the use of strategy maps to explicitly show how to link metrics and business processes to the business strategy. In other words, the strategy maps show the cause-and-effect relationships between key processes and metrics.

The use of (1) metrics for the financial, customer, internal business, and innovation and learning perspectives and (2) their linkage to one another and to the overall business strategy constitutes the most significant difference between dashboards and scorecards.

Because the scorecard approach is explicitly linked to business strategy, sponsorship at the highest organizational levels is usually required. It also normally requires a top-down approach, with scorecards potentially being pushed down to the individual worker level.

About Dashboards

Dashboards can exist at all levels of the organization, and they contain metrics that relate to that level, but they are not explicitly linked to the business strategy using a specific methodology. Wayne Eckerson’s book, Performance Dashboards: Measuring, Monitoring, and Managing Your Business, provides several detailed descriptions of types of dashboards. For example, Quicken Loans uses operational dashboards to monitor the performance of its more than 500 mortgage experts at its Web call center.

Because dashboards can exist at any level, they do not always require top management sponsorship and can be developed using a bottom-up approach. A department can implement dashboards on its own, and if it’s successful, other departments may choose to do the same.

Departmental or operational dashboards are receiving considerable attention because of their ability to monitor operational systems through the use of enterprise information integration (EII), enterprise application integration (EAI), and real-time data warehousing technologies. This is often referred to as business activity monitoring (BAM) and is analogous to process control in manufacturing.
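The process-control analogy can be made concrete with a minimal sketch of business activity monitoring: a stream of operational events is checked against KPI thresholds as it arrives, and breaches raise alerts. The event fields and threshold values below are invented for illustration, not drawn from any particular BAM product.

```python
# A minimal sketch of business activity monitoring (BAM): operational
# events, e.g. delivered over an EAI message bus, are compared against
# KPI thresholds as they arrive. Field names and limits are illustrative.

def monitor(events, thresholds):
    """Yield an alert for every event whose value breaches its KPI threshold."""
    for event in events:
        limit = thresholds.get(event["kpi"])
        if limit is not None and event["value"] > limit:
            yield f"ALERT: {event['kpi']} = {event['value']} exceeds {limit}"

# Hypothetical call-center events and operating limits.
events = [
    {"kpi": "avg_hold_seconds", "value": 45},
    {"kpi": "avg_hold_seconds", "value": 95},
    {"kpi": "abandoned_calls_per_hour", "value": 12},
]
thresholds = {"avg_hold_seconds": 60, "abandoned_calls_per_hour": 10}

for alert in monitor(events, thresholds):
    print(alert)
```

Like process control in manufacturing, the value comes from catching the excursion while the process is still running, rather than in a report after the fact.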

Purity Is Not Required

The current discussion of dashboards and scorecards identifies only the end points of a range of BPM options. There are midpoints as well. For example, I’m currently working on a BPM system that will use the financial, customer, internal business, and innovation and learning perspectives for the metrics, but the metrics will not be explicitly linked to a business strategy.

The Technology for Dashboards and Scorecards

Leading BI vendors (such as Hyperion and Cognos) and newer entrants (including iViz) offer products—often running on their BI platforms—that facilitate the development, use, and maintenance of dashboards and scorecards. In the case of scorecard software, there are tools for creating a strategy map and linking metrics to the map. In both scorecard and dashboard software, developers can create a library of metrics that can be used in the dashboards. Individual screens can be tailored with the particular metrics that are created. Wizards are available to facilitate the development process.

While users cannot change how the metrics are defined and calculated, they can control how they are displayed, changing, for example, the type of chart or the colors used. For every metric, there are official data sources (e.g., a data mart, SAP, Excel), and the scorecards or dashboards access the data needed for that metric. Users are able to drill down on a metric to see the underlying data on which it is based. Users can also control the refresh cycle for the metrics, including having automatic updates, to provide near-real-time metrics.
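The separation just described, centrally defined metrics with user-controlled display and refresh, can be sketched as a simple pair of data structures. All field names, sources, and the sample metric below are assumptions for illustration, not any vendor's actual API.

```python
# A minimal sketch of the split described above: a metric's definition
# and official data source are fixed by the developer, while display and
# refresh settings belong to the individual user. Field names and the
# sample metric are illustrative, not any vendor's API.

from dataclasses import dataclass

@dataclass(frozen=True)          # frozen: users cannot alter the definition
class MetricDefinition:
    name: str
    calculation: str             # e.g. a formula over warehouse tables
    source: str                  # official source: "data mart", "SAP", "Excel"

@dataclass
class UserMetricView:
    definition: MetricDefinition # shared, read-only definition
    chart_type: str = "line"     # user-controlled display
    color: str = "blue"
    refresh_seconds: int = 3600  # user-controlled refresh cycle

# The developer publishes a library of metrics...
library = {
    "on_time_delivery": MetricDefinition(
        name="on_time_delivery",
        calculation="count(shipments.on_time) / count(shipments)",
        source="data mart"),
}

# ...and a user tailors how one is displayed, without being able to
# change how it is defined, calculated, or sourced.
view = UserMetricView(library["on_time_delivery"],
                      chart_type="bar", refresh_seconds=60)
print(view.chart_type, view.definition.source)
```

The frozen definition is the design point: one vetted calculation and one official source per metric, reused across every dashboard screen that displays it.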

Dashboard software can also be used to support or replace other approaches to enterprise reporting. When used for reporting, it can serve as a single dashboard and reporting approach and eliminate the support and training costs associated with the use of multiple software products.


Conclusion

Dashboards and scorecards are similar in how they look and in their use of metrics. Scorecards, however, are more tightly linked to business strategy. Vendors offer software that facilitates the development and use of scorecards and dashboards, and the same technology can also be used for enterprise reporting.


References

Eckerson, W.W. Performance Dashboards: Measuring, Monitoring, and Managing Your Business, Wiley, 2006.

Kaplan, R.S. and D.P. Norton. “The Balanced Scorecard—Measures that Drive Performance,” Harvard Business Review, January-February 1991, pp. 71-79.

Kaplan, R.S. and D.P. Norton. Strategy Maps: Converting Intangible Assets into Tangible Outcomes, Harvard Business School Publishing Company, 2004.

Rockart, J.F., “Chief Executives Define Their Own Data Needs,” Harvard Business Review, March-April 1979, pp. 81-93.

Volonino, L. and H.J. Watson. “The Strategic Business Objects Method for Guiding Executive Information Systems Development,” Journal of Management Information Systems, Winter 1990-91, pp. 27-39.

Watson, H.J., G. Houdeshel, and R.K. Rainer. Building Executive Information Systems and Other Decision Support Applications, Wiley, 1997.

About the Author

Hugh Watson is a Professor of MIS and holds a C. Herman and Mary Virginia Terry Chair of Business Administration in the Terry College of Business at the University of Georgia. He is the author of 22 books and more than 100 scholarly journal articles. He is the Senior Editor of the Business Intelligence Journal and a Fellow of TDWI.