Are System Requirements Obsolete? (Probably not, but that should get your attention.)
Thursday, 19 April 2007

I often joke about how easy it is to take two or three loosely related bits of information and declare them a trend. But there’s also an opposing tendency to think that nothing fundamental ever really changes. That tendency makes it harder to recognize significant developments when they do occur.
One of those fundamentals has always been the need to define requirements before building or selecting a system. But I’m beginning to suspect that has changed. In systems development, we see methodologies like “extreme programming” that advocate incremental development with a minimum of advance documentation. In marketing, we see rapid test cycles taking advantage of the low cost and quick feedback of the Internet. We also see outsourced systems making it easy to try new methods and media with little investment or implementation time. In databases, we see tools like QlikTech, SkyTide, Alterian and SmartFocus that radically reduce the effort needed to load and manipulate large volumes of data. If you want to stretch a bit, you could even argue that non-relational data sources, such as XML, text and graphics files, are part of the movement towards a less structured world.
This all takes some getting used to. Loyal readers of this blog (hi Mom!) know how much I like QlikTech (reminder: my company is a reseller), precisely because it lets me do complicated things to large amounts of data with little technical skill, hardware or cost. The other products I’ve mentioned offer similar flexibility, although they aren’t quite as easy or inexpensive.
My point is, we’re used to a world where advance planning is necessary because building a system of any significant size takes a great deal of skill and effort. These new tools radically reduce that skill and effort, making hands-on experimentation possible. The value should be obvious: we can bring in more data and look at it in different ways, making discoveries we would never reach if each analysis were more expensive to run.
We may need to revise our standard model of a business intelligence architecture to accommodate this. Specifically, we would add ad hoc analysis systems to the existing data warehouse and data marts. The ad hoc systems would hold data we may want to use, but haven’t quite figured out how. They would be a stepping stone in the development process, letting us experiment with data and applications until we understand them well enough to harden their design and add them to the regular production systems.
(Incidentally, some data warehouse gurus have proposed an "exploration warehouse" or "exploration data mart" that is a somewhat similar concept. They seem to have in mind more carefully processed data than I do, but I wouldn't want to claim to be the first to think along these lines.)
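To make the stepping-stone idea concrete, here is a minimal sketch of that working style in plain Python (the file name and column names are hypothetical, invented only for illustration). No schema design or ETL project comes first; a raw export is simply dropped into the ad hoc area and examined, and only the analyses that prove useful would ever be hardened into the production warehouse.

    # Quick, throwaway exploration of a raw export, using only the standard library.
    # "campaign_responses.csv", "channel" and "responded" are made-up names for this sketch.
    import csv
    from collections import defaultdict

    by_channel = defaultdict(lambda: [0, 0])  # channel -> [responses, contacts]

    with open("campaign_responses.csv", newline="") as f:
        for row in csv.DictReader(f):
            counts = by_channel[row["channel"]]
            counts[1] += 1                      # every row is a contact
            counts[0] += int(row["responded"])  # assumes a 0/1 response flag

    for channel, (responses, contacts) in sorted(by_channel.items()):
        print(f"{channel}: {responses / contacts:.1%} response rate ({contacts} contacts)")

If a cut like this turns out to matter, it can graduate into the warehouse with proper modeling; if not, nothing was invested beyond a few minutes of typing.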
This notion may meet some resistance from IT departments, which are naturally reluctant to support yet another tool. But the price/performance ratios of these new products are so overwhelmingly superior to conventional technologies – at least when it comes to ad hoc analysis – that IT should ultimately see the advantage. End-users, particularly the technically skilled analysts who suffer the most from the rigidity of existing systems, should be more enthusiastic from the start.