Business Intelligence 101: A Brief History
Updated · Jul 10, 2014
The purpose of business intelligence (BI) solutions is much the same as the purpose of “military intelligence”: to give commanders at every stage of a campaign a good view of the battlefield and of the pros and cons of their options. In the business context, this means giving decision-makers, on an ongoing basis, a view of the state of the business (production, sales, profits, losses) and of the factors that affect success (the performance of a sales campaign, customer satisfaction, manufacturing efficiency, planning and budgeting).
The best way to understand how today’s business intelligence solutions do this is to see how they have evolved. At the start, in the late 1980s, most businesses handled run-the-business order entry, manufacturing resource planning (MRP) and accounting via in-house solutions that ran 9-to-5 or in “batch mode” at the end of the day and on weekends. To these were added, in the early 1990s, enterprise resource planning (ERP) applications that combined MRP, accounting, planning/budgeting and related apps (e.g., salesforce automation, or SFA), plus a data warehouse (or equivalent data marts) that ensured stable, companywide data as the basis for companywide reporting.
As the data in existing systems and the demand for BI-type “decision support” grew, two problems became apparent:
- A “query from hell” could crowd out all other processing on existing run-the-business systems and therefore prevent customer service at crucial times
- Queries generated far more “result data” than in-house reports – what was called then “drinking from a fire hose” – and therefore the result data needed further massaging and filtering before it was presented to users
Dealing with the Data
The solution to the first problem, in the early 1990s, was to copy production data continually to a data warehouse, or to a set of data marts, that handled only queries. This, in turn, vastly increased the amount of data the data warehouse had to handle. The answer was ETL (extract, transform, load) software attached to the data warehouse, with a specialized variant called EAI (enterprise application integration) to handle communication between ERP packages and data transmission from the ERP data stores to the data warehouse.
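To make the ETL idea concrete, here is a minimal sketch in Python (the database files, table names and columns are hypothetical, and a real ETL product would add scheduling, logging and error recovery):

```python
import sqlite3

# Hypothetical operational (source) and warehouse (target) databases.
source = sqlite3.connect("orders_oltp.db")
warehouse = sqlite3.connect("warehouse.db")

# Extract: pull raw order rows from the run-the-business system.
rows = source.execute(
    "SELECT order_id, customer_id, amount, order_date FROM orders"
).fetchall()

# Transform: round amounts and drop malformed rows.
cleaned = [
    (oid, cust, round(amt, 2), date)
    for (oid, cust, amt, date) in rows
    if amt is not None and amt >= 0
]

# Load: append the cleaned rows to a warehouse fact table.
warehouse.execute(
    "CREATE TABLE IF NOT EXISTS fact_orders "
    "(order_id INTEGER, customer_id INTEGER, amount REAL, order_date TEXT)"
)
warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", cleaned)
warehouse.commit()
```

Run on a schedule (nightly, in the era described here), this is the whole extract-transform-load cycle in miniature.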
The solution to the second problem was to improve the analytical capabilities of business intelligence solutions, so that they could handle not only simple queries but also complex queries and online analytical processing (OLAP) that more effectively zeroed in on just the results needed in particular cases. This, in turn, created a special class of user called the data miner – back then, pretty much the only person who knew how to turn business information needs into queries against the data warehouse.
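As a rough illustration of what an OLAP-style query does, the sketch below (using invented sales figures and an in-memory SQLite database) “rolls up” revenue by region and then “drills down” into one region by quarter; real OLAP engines perform the same kinds of aggregation over far larger, multidimensional cubes:

```python
import sqlite3

# A toy sales "cube": each row is a fact with two dimensions and a measure.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, quarter TEXT, revenue REAL)")
db.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Q1", 120.0), ("East", "Q2", 95.0),
     ("West", "Q1", 80.0), ("West", "Q2", 140.0)],
)

# Roll up: total revenue by region, across all quarters.
for region, total in db.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region"
):
    print(region, total)

# Drill down: one region, broken out by quarter.
for quarter, total in db.execute(
    "SELECT quarter, SUM(revenue) FROM sales "
    "WHERE region = 'East' GROUP BY quarter"
):
    print(quarter, total)
```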
The first BI tools (from companies such as Cognos and Business Objects, which today form a large part of the market under their new owners) initially aimed to run queries against business data in order to dig deeper into it, or to get results faster, than end-of-week/month/quarter reports allowed. Around 2000, the reporting functions of the typical large enterprise merged with this data-mining capability, and the combination became known as a business intelligence system.
Today’s BI systems, therefore, carry out two main functions: reporting and querying. Remember, however, that other applications in the typical large organization’s portfolio use one or both of these for different purposes. For example, accounting system reporting also supports results reporting to the public and the government.
By the early 2000s, two new challenges triggered further evolution of BI solutions:
- The increasing importance of both reporting and querying meant enormous and increasing business pressure on IT to deliver reporting and querying on “fresh” daily and even hourly data, rather than via batch runs on weekends
- The advent of the Web meant new customer interfaces and Web-based data (e.g., social media data) and required that customer-facing solutions operate 24/7, 52 weeks per year
One answer to the first challenge was to begin to shrink the “delay time” between entry of new data and its use by the BI solution. Thus, for a strictly limited set of business-critical data, data mining and reporting have moved steadily from a maximum of a week later to as little as a few minutes after the data arrives.
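One common way to shrink that delay (a sketch only, reusing the hypothetical tables from the ETL example above) is to replace the big batch load with frequent incremental loads that copy only rows newer than a “watermark” recorded from the previous run:

```python
import sqlite3

def load_fresh_rows(source, warehouse):
    """Copy only rows newer than the last load; run every few minutes,
    this shrinks the delay between data entry and BI availability."""
    # Watermark: the latest order_date already in the warehouse
    # (assumes ISO-formatted dates, so text comparison sorts correctly).
    (watermark,) = warehouse.execute(
        "SELECT COALESCE(MAX(order_date), '') FROM fact_orders"
    ).fetchone()

    fresh = source.execute(
        "SELECT order_id, customer_id, amount, order_date "
        "FROM orders WHERE order_date > ?", (watermark,)
    ).fetchall()

    warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", fresh)
    warehouse.commit()
    return len(fresh)
```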
The Cloud and Analytics
As the size of some data warehouses began to reach the terabyte range, the only effective long-run way to shrink that delay was to “fudge” the BI solution: to allow the underlying data warehouse to handle some updates at the same time as queries, and to allow the BI solution to reach outside the data warehouse, very carefully, to access key “fresh” data. In other words, BI systems and data became networked – and so BI solutions that viewed sub-apps and data as a virtualized whole, existing in the Internet cloud or clouds, began to make sense: so-called cloud BI. This has been especially true since the amount of data used by BI, from internal systems and the Web, has grown so large that it has become its own topic area: Big Data.
Another answer to challenge No. 1 was to provide tools to aggregate key “fresh” decision-support data. That in turn has led to a new BI capability: performance management (PM) applications offering a “360-degree view” (that is, across divisional data warehouses and including operational data) – the ability to provide dashboards showing corporate business executives actionable results from the business as of a few hours ago, or even alerts about data that arrived minutes ago.
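The alerting half of such a dashboard can be simple in outline. Below is a minimal sketch (the KPI names, threshold and time window are all invented) of the kind of rule that turns minutes-old readings into an executive alert:

```python
from datetime import datetime, timedelta

def check_alerts(readings, threshold=0.95, window_minutes=10):
    """Flag recent KPI readings that breach a threshold -- the kind of
    rule behind a dashboard alert on minutes-old data."""
    cutoff = datetime.now() - timedelta(minutes=window_minutes)
    return [
        (name, value, ts)
        for (name, value, ts) in readings
        if ts >= cutoff and value >= threshold
    ]

# Hypothetical readings: (KPI name, utilization, timestamp).
readings = [
    ("order_queue_utilization", 0.97, datetime.now()),
    ("call_center_load", 0.60, datetime.now()),
]
for name, value, ts in check_alerts(readings):
    print(f"ALERT: {name} at {value:.0%} ({ts:%H:%M})")
```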
To meet challenge No. 2 by taking advantage of the new customer data, CRM (customer relationship management) applications were added to the BI system’s portfolio of useful sub-apps. Just as importantly, analytics, or the ongoing, semi-automated process of data mining, became a new organizational goal.
Because analytics mainly focuses on deeper dives into the data rather than on business intelligence per se, it is in fact just as important for purposes outside of BI. For example, “embedded” analytics can analyze network traffic ceaselessly and automatically for dangerous patterns, and even take action to avoid IT disaster without administrator intervention. This does not aid the business user’s strategic decisions, but it is vitally important in avoiding a business-threatening customer-interface crash.
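As a toy version of such embedded analytics (the window size, threshold and suggested action are invented for illustration), the sketch below flags traffic samples that deviate sharply from a rolling baseline – the point at which a real system might automatically throttle or reroute:

```python
from collections import deque
from statistics import mean, stdev

def traffic_monitor(samples, window=30, sigmas=3.0):
    """Flag traffic counts that deviate sharply from the recent norm."""
    recent = deque(maxlen=window)   # rolling baseline of recent samples
    anomalies = []
    for i, count in enumerate(samples):
        if len(recent) >= 2:
            mu, sd = mean(recent), stdev(recent)
            if sd > 0 and abs(count - mu) > sigmas * sd:
                anomalies.append((i, count))  # e.g., throttle traffic here
        recent.append(count)
    return anomalies
```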
Analytics, therefore, to a first approximation, can be thought of as in equal parts (a) the “querying” segment of BI solutions and (b) support for the business-process-redesign and risk-avoidance parts of today’s business. One of its main effects within BI solutions is to enlarge BI’s ability to support less sophisticated data-mining end users. Today’s “data scientists” are in fact a far larger and more business-savvy set of BI data-mining end users who interface with BI systems through new analytical tools.
A final note: BI continues to generate new approaches that are not part of the “core curriculum” but may be worth examining in greater detail – for example, open-source BI, agile BI and a function that might be called “Hadoop BI.” Among the other interesting topics these days are the evolution of OLAP, data discovery for BI, agile marketing’s use of BI, master data management and BI, virtualized BI data stores, and “BI for the masses” – extending BI usage to most organizational end users. BI continues to be an evolving, exciting area for the average business.
But we’ll leave all that for BI 102.
Wayne Kernochan is the president of Infostructure Associates, an affiliate of Valley View Ventures that aims to identify ways for businesses to leverage information for innovation and competitive advantage. Wayne has been an IT industry analyst for 22 years. During that time, he has focused on analytics, databases, development tools and middleware, and ways to measure their effectiveness, such as TCO, ROI, and agility measures. He has worked for respected firms such as Yankee Group, Aberdeen Group and Illuminata, and has helped craft marketing strategies based on competitive intelligence for vendors ranging from Progress Software to IBM.