- by Ronald Fuller
OLAP cubes are among the most powerful resources available for business intelligence and analytics. Here's what the product manager for Microsoft Analysis Services said about OLAP back in 2002:
OLAP multidimensional databases combine incredible performance with unsurpassed analytical power and, in my opinion, are the foundation of the BI platform.
The multidimensional data model is vastly superior to the relational data model when it comes to the expressiveness of analytical operations. The ability to have random access to any point in space, both detailed data and aggregates, makes it a breeze to express calculations that would otherwise take pages of SQL statements using a relational database.
This remains true about OLAP today, and it is likely to remain true for a long time. While technologies change, the underlying concepts remain the same. But OLAP seems to be falling out of favor recently. Why would that be?
The top Google search result for OLAP claims "OLAP Is Becoming Obsolete". This is actually a paid advertisement that invites users to download a paper titled Selecting the Right Database Technology for Your Business Analytics Project. Interestingly, the paper says very little about OLAP and never uses the word obsolete. The only negative thing it says about OLAP is the single sentence below:
Storing the results of these pre-calculations takes exponentially more storage resources than the actual raw data does, limiting the size of raw data that can make up a cube to gigabyte scale.
But that is false: the most popular storage format for OLAP cubes, multidimensional structures, requires far less storage than the original source data. Production OLAP cubes have exceeded 20 terabytes of raw data, and their continued growth is limited only by computing power, memory, and storage.
In the real world, there is no reason to believe that OLAP is becoming obsolete. But there are more signs that people think it is, the facts notwithstanding. The statement below appeared more recently on the Microsoft website:
For new projects, we generally recommend tabular models. [that is, rather than OLAP cubes]
Microsoft has been a leader in promoting OLAP, so why are they now downplaying it and steering people toward tabular models? The reason they give is that "tabular models are faster to design, test, and deploy..." But that is true only when the requirements and data are very simple. When the requirements or the data become even slightly more complex, the effort of developing a tabular model explodes and quickly becomes far, far greater than that of a multidimensional OLAP project with exactly the same requirements and data. And the end products are less flexible and far less able to adapt to changing requirements.
So what gives? Why would Microsoft make a claim that relies on an implausible assumption of simplicity which does not exist in most enterprise environments? And why would another company claim that OLAP is becoming obsolete in a paid advertisement with only a single dubious claim to back it up?
Here's what I think is going on: OLAP is most valuable when the underlying source data is clean and well-integrated. And clean, well-integrated data is difficult to achieve for many organizations. Claiming that OLAP is obsolete is a marketing ploy to promote products that work well with poor-quality data and data that is not well integrated.
There is a legitimate need for such products, and they have a huge market potential – but vendors should make that case and sell those products without claiming that OLAP is becoming obsolete because it is not. For organizations with clean, well-integrated data OLAP is far and away the best choice for business intelligence and analytical applications.
- by Ronald Fuller
The idea that managing information involves cost-benefit tradeoffs is not new, but unfortunately those decisions usually focus only on engineering and implementation issues such as performance and storage, while business utility is totally ignored. By business utility I mean the capacity of an information resource to meet expressed and unexpressed needs and adapt to changing requirements.
To understand why, let's look back to the early days of modern database software. The statements below were made by Don Chamberlin, one of the co-designers of SQL, the world's most widely used database query language. Here he is describing decisions made in the mid-1970s that his team intended to give users more flexibility in making cost-benefit tradeoffs:
"When the original SQL designers decided to allow users the options of handling nulls and duplicates, they viewed these features as minor conveniences, not as major departures from orthodoxy, taken at the risk of excommunication."
"SQL trusts the database designer to decide whether the costs ... are justified. To impose these costs on all applications ... seems a little heavy-handed, and seemed even more so in 1975 given the costs of storage and processing at that time."
Even today (in 2015) Dr. Chamberlin and his former colleagues continue to face harsh criticism for those decisions. This is completely unfair because they did not force anyone to do or ignore anything. Rather, they gave users the freedom to make their own cost-benefit decisions.
But unfortunately those decisions today, and others of equal importance, do not fall to the business stakeholders who pay the price for poor choices and who are in the best position to make good ones. Instead they are made behind the scenes by technical professionals, while information owners remain unaware that such tradeoff opportunities even exist. The result is that information systems are too expensive, do not adapt well to changing requirements, and do not meet the expectations of their owners.
Information owners do not need to become technical experts to make good decisions about organizing information, and it is absolutely essential that business experts, rather than architects or engineers, have final decision authority. The role of IT departments in these matters should be strictly limited to cost-benefit advisement.
Even when technical professionals carefully consider cost-benefit tradeoffs they are constrained by a technical perspective. The following statement is from a paper titled Justifying database normalization: a cost/benefit model:
"the determination of appropriate normal forms frustrates many systems analysts ... a trade-off exists among system performance, storage, and costs."
But the author omits a critical factor: the overriding consideration in any cost-benefit equation should always be business utility. Technology professionals cannot be expected to recognize the full range of implications for business utility in any given tradeoff scenario. They must instead rely on a set of requirements spelled out in advance by the business customer. That is why scope creep is such a huge problem in most IT projects, and why most information systems cannot adapt well to changing requirements.
It would probably be impossible for a business owner to make a complete list of every possible use case for a given resource. But a business owner can easily determine whether a resource can satisfy their foreseeable needs even before any requirement has been expressed. What an owner considers foreseeable can change over time, but it will always change more slowly than perceived requirements do when technology professionals have to guess or fill in the blanks. New technologies can expand the range of use cases for information, but for information to be useful it must still be organized in a way that allows the desired use – technology cannot change that.
- by Ronald Fuller
Dr. Thomas Haigh tells a fascinating story in The Business History Review that explains why information management is seen as a technical discipline rather than a management one. The "systems men" were members of the Systems and Procedures Association during the 1950s and 60s. The purpose of this association was not to promote research or continuing education, but rather to seek increased status and management authority for its members within their employing organizations. They offered an implicit bargain to corporate executives:
"You put us in charge and we’ll deliver to you more power over your firms than you’ve ever dreamed of" (p 43)
Executives for the most part were not convinced. They understood that technical expertise does not translate into management ability. By the 1970s the Systems and Procedures Association was defunct, and the various roles of the systems men had merged into corporate IT departments. But they left a stubborn cultural legacy that persists to this day, the idea that managing information is a job for architects and engineers rather than business experts:
"For better or worse, to speak of something as an information system continues to imply that it should be engineered by an information specialist and built using information technology" (p 59)
"It seems unlikely that the idea of information can ever truly be separated from these roots: it is just too historically and culturally charged" (p 59)
The full article is available here, and a free version here.
- by Ronald Fuller
Database normalization is a dubious concept for two reasons:
- The cost-benefit value of any normal form can only be determined by an information owner, not by a database expert, and
- The normal form of most tables can only be determined by a subject matter expert (SME).
As an example of reason 1, the table below contains an error, which you can see in the last row: an address in Grand Junction, Tennessee has the same ZIP code as the address above it in Grand Junction, Colorado. This is clearly a mistake, but without investigating further we cannot know whether the error is in the State or the ZIP code.
Splitting the table in two, with City and State moved into their own table (as sketched below), eliminates any potential for this kind of inconsistency. The advantage might come at a cost, however, because performance can suffer when applications and reports have to query two joined tables instead of one. So what is more important, better performance or lower risk of error? Who should decide? Only information owners have the appropriate perspective and incentive to make that kind of decision wisely (see Sophotaxis). Architects and engineers can provide valuable cost-benefit advisement, but final decisions should be made by the owning stakeholders.
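Since the original tables and figure are not reproduced here, the following is a minimal sketch of the tradeoff using SQLite in Python; the table names, column names, street addresses, and ZIP codes are illustrative assumptions, not the author's original data.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Design 1 - a single flat table: nothing prevents two rows from assigning
# the same ZIP code to different city/state combinations.
cur.execute("""
    CREATE TABLE address_flat (
        street TEXT, city TEXT, state TEXT, zip TEXT)""")
cur.executemany(
    "INSERT INTO address_flat VALUES (?, ?, ?, ?)",
    [("100 Main St", "Grand Junction", "CO", "81501"),
     ("200 Oak Ave", "Grand Junction", "TN", "81501")])  # the inconsistent row

# Design 2 - City and State broken off into their own table, keyed by ZIP,
# so the inconsistency above simply cannot be stored.
cur.execute("CREATE TABLE zip_place (zip TEXT PRIMARY KEY, city TEXT, state TEXT)")
cur.execute("CREATE TABLE address (street TEXT, zip TEXT REFERENCES zip_place(zip))")
cur.execute("INSERT INTO zip_place VALUES ('81501', 'Grand Junction', 'CO')")
cur.executemany("INSERT INTO address VALUES (?, ?)",
                [("100 Main St", "81501"), ("200 Oak Ave", "81501")])

# The cost of design 2: every report that needs city and state now pays for a join.
for row in cur.execute("""
        SELECT a.street, p.city, p.state, a.zip
        FROM address AS a JOIN zip_place AS p ON a.zip = p.zip"""):
    print(row)
```

Whether the extra join is worth the protection it buys is exactly the kind of decision an information owner, not a database expert, should make.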
The example above is pretty simple, but large business databases can have thousands of similar cost-benefit scenarios that are far more complex. The more complex the situation, the greater the need for ownership perspective and expertise in the specific business issues at hand.
As an example of reason 2, consider a simple table containing only one column with phone numbers. Many engineers would say that this table is in first normal form. In most situations it would work fine just as it is, or as a column in a larger table. But for a company such as a telephone service provider, where users might want to group or sort phone numbers by area code or exchange, this table would not be in first normal form. In other words, it would not satisfy even the minimum theoretical standard for use in a modern database system. Instead, in this particular case, the numbers should be broken out into their meaningful components, as sketched below.
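Here is a rough sketch of what breaking the numbers out might look like, assuming North American ten-digit numbers; the PhoneNumber type, its field names, and the sample numbers are hypothetical stand-ins for the original image.

```python
from typing import NamedTuple

class PhoneNumber(NamedTuple):
    # For a number such as 970-555-0142:
    area_code: str  # "970"
    exchange: str   # "555"
    line: str       # "0142"

def split_number(raw: str) -> PhoneNumber:
    """Break a ten-digit phone number into its meaningful components."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) != 10:
        raise ValueError(f"expected a ten-digit number, got {raw!r}")
    return PhoneNumber(digits[:3], digits[3:6], digits[6:])

numbers = ["(970) 555-0142", "970-555-0199", "303.555.0107"]
parsed = [split_number(n) for n in numbers]

# Grouping and sorting by area code or exchange is now trivial, which is
# the whole point of storing the components rather than one opaque string.
for p in sorted(parsed, key=lambda p: (p.area_code, p.exchange)):
    print(p)
```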
This shows that even in very simple cases the normal form of a table can only be determined by someone who understands the full range of potential uses for the information. Business databases are filled with situations like this, but unfortunately most decisions about organizing information are made by technologists rather than business experts. That is why businesses struggle with databases and create systems that cannot adapt well to changing requirements.
Information should be organized in a way that makes the most sense to its owner, not according to some predetermined normal form that cannot be reliably or purposefully achieved. The normal forms are usually seen as guidelines to improve performance and protect consistency. But any decision made for performance can potentially degrade the usefulness and value of the information, and the range of decisions needed to protect consistency can only be determined by a business expert.
A far more useful and comprehensive tool for thinking about the way information should be organized is the principle of sophotaxis, which I explain in The Ancient Secrets of Information Management.
- by Ronald Fuller
A widely held assumption in the academic field of business information systems is that technology and behavior are inseparable, but this is false. One widely-cited source that promotes this idea is Design Science in Information Systems Research, where the authors state:
"Technology and behavior are not dichotomous in an information system. They are inseparable (Lee 2000)"
The source they cite (Lee 2000) explains their rationale:
"The problem of 'technology vs. behavior' is a dilemma in the following way: If we take a technology approach to IS, then how would we be different from engineering and computer science? But if we take a behavioral approach to IS, then how would we be doing research that any behavioral field could not already do?"
"Just as a physician cannot design a remedy for a patient’s body and emotions separately, and just as an architect cannot design “form and function” independently, the IS field similarly does not have the option of designing the technology subsystem alone or the behavioral subsystem alone – we have to design both together."
My response to the first paragraph is this: Deciding how business information is organized has nothing to do with engineering and computer science and everything to do with management priorities and objectives. The role of technology professionals should be limited to implementation and cost-benefit advisement. This should be an easy distinction to make, but the conventional wisdom is clouded by historical and cultural biases. For example, technology workers in the 1950s persuaded business leaders to think of information management as an engineering discipline instead of a management prerogative. This is discussed further in The Legacy of the Systems Men. There are plenty of purely behavioral issues that have nothing to do with technology on which the academic IS community could focus, such as the problem of poorly organized information, which I describe further in Sophotaxis.
My response to the second paragraph is that it is false. Decisions about how information is organized can and should be made before any automated system is created, even when those decisions can change (and they always do). The design of the automated system might raise cost-benefit issues that will impact the information decisions, but those cost-benefit trade-offs cannot be well-understood until decision makers know what the trade-offs will entail, and that is possible only when the information resource is defined first (see Cost-Benefit Value is Ignored).
When business-oriented priorities are subordinated to technology-oriented factors without a deliberate cost-benefit analysis, operational and analytical capabilities suffer. Organizing information is an act of business administration, a role IT departments were never intended to fill. The role of IT should be limited to implementation and cost-benefit advisement.
The following statement is evidence of how deeply the issue of information management is misunderstood. It has been repeated in various forms more than ten thousand times by authors at universities, technology companies, and government organizations:
"With the proliferation of information technology starting in the 1970s, the job of information management had taken a new light, and also began to include the field of data maintenance. No longer was information management a simple job that could be performed by almost anyone. An understanding of the technology involved and the theory behind it became necessary. As information storage shifted to electronic means, this became more and more difficult."
This statement is false; information management has never been a simple job that could be performed by almost anyone – at least not since the birth of modern accounting in the late 13th century. The techniques used in manual accounting systems rely on a set of cross-referenced and interconnected books which use structures and rules that are fully consistent with the theory behind modern relational database systems, which I explain further in The Ancient Secrets of Information Management.
Accounting is the discipline of managing information about money. The same logic-based techniques can be used to manage other kinds of information as well, but manual accounting is so painstaking and time consuming that it is easy to understand why early merchants only made the effort with the kind of information they considered most important. When relational database software was introduced in the 1970s, business leaders and scholars mistakenly assumed that it had created an entirely new computer-based method of organizing information. But in reality it created a new computer-based way to automate old logic-based techniques that had been used with success for 700 years. If this had been understood, the shift to electronic automation would have made things far easier rather than more difficult.
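As a loose sketch of that claim, the cross-referenced books of a manual double-entry system map naturally onto relational tables; the schema, account names, and amounts below are invented for illustration, not drawn from any historical ledger.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# The journal is the chronological book; the ledger accounts are the same
# postings regrouped by account. The cross-referencing that clerks once did
# by hand becomes a key relationship between two tables.
cur.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""
    CREATE TABLE journal_line (
        entry_no INTEGER,
        account_id INTEGER REFERENCES account(id),
        debit REAL DEFAULT 0,
        credit REAL DEFAULT 0)""")

cur.executemany("INSERT INTO account VALUES (?, ?)", [(1, "Cash"), (2, "Inventory")])
# One transaction posted to two accounts: goods bought for cash.
cur.executemany("INSERT INTO journal_line VALUES (?, ?, ?, ?)",
                [(1, 2, 100.0, 0.0), (1, 1, 0.0, 100.0)])

# The centuries-old consistency rule: debits equal credits for every entry.
for entry_no, debits, credits in cur.execute(
        "SELECT entry_no, SUM(debit), SUM(credit) FROM journal_line GROUP BY entry_no"):
    assert debits == credits, f"entry {entry_no} is out of balance"

# A ledger account is just the journal filtered by account - one join away.
for row in cur.execute("""
        SELECT a.name, l.debit, l.credit
        FROM journal_line AS l JOIN account AS a ON l.account_id = a.id
        ORDER BY a.name"""):
    print(row)
```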
Early merchants certainly did not look to the craftsmen who made their tools to also define their accounts. But that is exactly what modern organizations do. It makes no difference that the old tools were made from paper, feathers, and dye, and the new ones from computers, software, and networks. The old tools served exactly the same purpose as the new with respect to the organization of information. The new tools serve the additional purpose of automating processes and workflows, but that is no reason to believe that the engineers who create the tools should also be responsible for organizing information. And there are important reasons why they should not.
- by Ronald Fuller
Contrary to conventional wisdom, logic is not the opposite of emotion. When people claim that it is, I think what they really mean is that practicality is the opposite of emotion. We use the same patterns of reasoning when we follow our hearts as when we follow our heads. If our starting points for reasoning are cold and mechanical, logic allows us to form conclusions that are cold and mechanical. When our starting points are warm and compassionate, logic allows us to draw warm and compassionate conclusions. Logic relates to every human motivation, from savage instinct to artistic genius.
Logic cannot tell us if our starting points are correct or good – only what must be true IF our starting points are true. Our starting points determine the whole range of conclusions available to us. Starting points can be anything that informs our thoughts and actions, such as assumptions, measurements, religious doctrines, moral principles, laws, scientific results, political opinions, facts, or mistaken facts. Anything we choose to accept or assert. Most starting points can be traced back to more fundamental starting points. When it is impossible to trace back any further, that is a first principle. Most first principles cannot be proven, rather they are chosen or accepted. First principles represent our most basic beliefs and assumptions.
Learning logic helps people make better decisions and avoid deception whatever their motivations and priorities.
Logic is a tool, just as leverage and aerodynamic lift are tools. Leverage and gravity made early mechanical clocks possible. Logic makes precise reasoning and computers possible.