Showing posts with label dashboards.

Tuesday, 7 December 2010

Tableau Software Adds In-Memory Database Engine

Posted on 08:13 by Unknown
Summary: Tableau has added a large-scale in-memory database engine to its data analysis and visualization software. This makes it a lot more powerful.

Hard to believe, but it's more than three years since my review of Tableau Software’s data analysis system. Tableau has managed quite well without my attention: sales have doubled every year and should exceed $40 million in 2010; they have 5,500 clients, 60,000 users and 185 employees; and they plan to add 100 more employees next year. Ah, I knew them when.

What really matters from a user perspective is that the product itself has matured. Back in 2007, my main complaint was that Tableau lacked a data engine. The system either issued SQL queries against an external database or imported a small data set into memory. This meant response time depended on the speed of the external system and that users were constrained by the external files' data structure.

Tableau’s most recent release (6.0, launched on November 10) finally changes this by adding a built-in data engine. Note that I said “changes” rather than “fixes”, since Tableau has obviously been successful without this feature. Instead, the vendor has built connectors for high-speed analytical databases and appliances including Hyperion Essbase, Greenplum, Netezza, PostgreSQL, Microsoft PowerPivot, ParAccel, Sybase IQ, Teradata, and Vertica. These provide good performance on any size database, but they still leave the Tableau user tethered to an external system. An internal database allows much more independence and offers high performance when no external analytical engine is present. This is a big advantage since such engines are still relatively rare and, even if a company has one, it might not contain all the right data or be accessible to Tableau users.

Of course, this assumes that Tableau's internal database is itself a high-speed analytical engine. That’s apparently the case: the engine is home-grown but it passes the buzzword test (in-memory, columnar, compressed) and – at least in an online demo – offered near-immediate response to queries against a 7 million row file. It also supports multi-table data structures and in-memory “blending” of disparate data sources, further freeing users from the constraints of their corporate environment. The system is also designed to work with data sets that are too large to fit into memory: it will use as much memory as possible and then access the remaining data from disk storage.
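
Tableau hasn't published details of its engine, so purely as a toy sketch (not Tableau's actual design; all data here is invented), here is the general idea behind "in-memory, columnar, compressed" in Python: values within a column repeat heavily, so run-length encoding shrinks them dramatically, and an aggregate reads only the column it needs.

from itertools import groupby

# A "table" stored column-wise: each column is its own list.
columns = {
    "region": ["East", "East", "East", "West", "West", "East"],
    "sales": [100, 150, 120, 90, 110, 130],
}

def rle_encode(column):
    # Run-length encode a column: [(value, run_length), ...]
    return [(value, len(list(run))) for value, run in groupby(column)]

print(rle_encode(columns["region"]))  # [('East', 3), ('West', 2), ('East', 1)]

# An aggregate like SUM(sales) scans only the "sales" column,
# never touching the unrelated "region" values.
print(sum(columns["sales"]))  # 700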

Tableau has added some nice end-user enhancements too. These include:

- new types of combination charts;
- the ability to display the same data at different aggregation levels on the same chart (e.g., average as a line and individual observations as points);
- more powerful calculations, including multi-pass formulas that can calculate against a calculated value (a small sketch follows this list);
- user-entered parameters to allow what-if calculations.
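
To make the multi-pass idea concrete, here is a hypothetical two-pass calculation in Python (the numbers are invented): the first pass computes an aggregate, and the second calculates each observation against it, which is exactly the average-line-plus-points display mentioned above.

observations = [12.0, 15.5, 9.8, 14.2, 11.1]

average = sum(observations) / len(observations)   # pass 1: the aggregate
deviations = [x - average for x in observations]  # pass 2: calculate against it

print(f"average = {average:.2f}")
for x, d in zip(observations, deviations):
    print(f"{x:5.1f}  deviation {d:+.2f}")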

The Tableau interface hasn’t changed much since 2007. But that's okay since I liked it then and still like it now. In fact, it won a little test we conducted recently to see how far totally untrained users could get with a moderately complex task. (I'll give more details in a future post.)

Tableau can run either as traditional software installed on the user's PC or on a server accessed over the Internet. Pricing for a single user desktop system is still $999 for a version that can connect to Excel, Access or text files, and has risen slightly to $1,999 for one that can connect to other databases. These are perpetual license fees; annual maintenance is 20%.

There’s also a free reader that lets unlimited users download and read workbooks created in the desktop system. The server version allows multiple users to access workbooks on a central server. Pricing for this starts at $10,000 for ten users, and you still need at least one desktop license to create the workbooks. Large server installations can avoid per-user fees by purchasing CPU-based licenses, which are priced north of $100,000.
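
For a rough sense of entry cost, here is the arithmetic under the prices quoted above (assuming, and this is my assumption, that the 20% maintenance is charged on the full license fee in year one):

desktop_license = 1999    # desktop version with database connectivity
server_ten_users = 10000  # server pack, ten named users
maintenance_rate = 0.20   # annual maintenance on perpetual licenses

licenses = desktop_license + server_ten_users
first_year = licenses * (1 + maintenance_rate)
print(f"first-year cost: ${first_year:,.0f}")  # $14,399 under these assumptions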

Although the server configuration makes Tableau a candidate for some enterprise reporting tasks, it can't easily limit different users to different data, which is a typical reporting requirement. So Tableau is still primarily a self-service tool for business and data analysts. The new database, calculation and data blending features add considerably to their power.
Posted in business intelligence software, dashboards, reporting software, tableau software

Thursday, 27 March 2008

The Limits of On-Demand Business Intelligence

Posted on 13:31 by Unknown
I had an email yesterday from Blink Logic, which offers on-demand business intelligence. That could mean quite a few things, but the most likely definition is indeed what Blink Logic provides: remote access to business intelligence software loaded with your own data. I looked a bit further and it appears Blink Logic does this with conventional technologies, primarily Microsoft SQL Server Analysis Services and Cognos Series 7.

At that point I pretty much lost interest because (a) there’s no exotic technology, (b) quite a few vendors offer similar services*, and (c) the real issue with business intelligence is the work required to prepare the data for analysis, which doesn’t change just because the system is hosted.

Now, this might be unfair to Blink Logic, which could have some technology of its own for data loading or the user interface. It does claim that at least one collaboration feature, direct annotation of data in reports, is unique. But the major point remains: Blink Logic and other “on-demand business intelligence” vendors are simply offering a hosted version of standard business intelligence systems. Does anyone truly think the location of the data center is the chief reason that business intelligence has so few users?

As I see it, the real obstacle is that most source data must be integrated and restructured before business intelligence systems can use it. It may be literally true that hosted business intelligence systems can be deployed in days and users can build dashboards in minutes, but this only applies given the heroic assumption that the proper data is already available. Under those conditions, on-premise systems can be deployed and used just as quickly. Hosting per se has little benefit when it comes to speed of deployment. (Well, maybe some: it can take days or even a week or two to set up a new server in some corporate data centers. Still, that is a tiny fraction of the typical project schedule.)

If hosting isn't the answer, what can make true “business intelligence on demand” a reality? Since the major obstacle is data preparation, anything that reduces the need for preparation will help. This brings us back to the analytical databases and appliances I’ve been writing about recently: Alterian, Vertica, ParAccel, QlikView, Netezza and so on. At least some of them do reduce the need for preparation because they let users query raw data without restructuring it or aggregating it. This isn’t because they avoid SQL queries, but because they offer a great enough performance boost over conventional databases that aggregation and denormalization are not needed to return results quickly.
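
As a sketch of that point (with sqlite3 standing in for a far faster analytical database; the table and data are invented), the query below aggregates raw detail rows at query time, with no pre-built summary table to design and load:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("Acme", "East", 120.0), ("Acme", "East", 80.0), ("Beta", "West", 200.0)],
)

# Aggregation happens at query time, against the raw rows -- no
# denormalized rollup table had to be designed and loaded in advance.
for row in conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(row)  # ('East', 200.0) then ('West', 200.0)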

Of course, performance alone can’t solve all data preparation problems. The really knotty challenges like customer data integration and data quality still remain. Perhaps some of those will be addressed by making data accessible as a service (see last week’s post). But services themselves do not appear automatically, so a business intelligence application that requires a new service will still need advance planning. Where services will help is when business intelligence users can take advantage of services created for operational purposes.

“On demand business intelligence” also requires that end-users be able to do more for themselves. I actually feel this is one area where conventional technology is largely adequate: although systems could always be easier, end-users willing to invest a bit of time can already create useful dashboards, reports and analyses without deep technical skills. There are still substantial limits to what can be done – this is where QlikView’s scripting and macro capabilities really add value by giving still more power to non-technical users (or, more precisely, to power users outside the IT department). Still, I’d say that when the necessary data is available, existing business intelligence tools let users accomplish most of what they want.

If there is an issue in this area, it’s that SQL-based analytical databases don’t usually include an end-user access tool. (Non-SQL systems do provide such tools, since users have no alternatives.) This is a reasonable business decision on their part, both because many firms have already selected a standard access tool and because the vendors don’t want to invest in a peripheral technology. But not having an integrated access tool means clients must take time to connect the database to another product, which does slow things down. Apparently I'm not the only person to notice this: some of the analytical vendors are now developing partnerships with access tool vendors. If they can automate the relationship so that data sources become visible in the access tool as soon as they are added to the analytical system, this will move “on demand business intelligence” one step closer to reality.
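
None of the vendors has described how such automation would work, but a minimal sketch might look like the following: the access tool polls the database's catalog, so a table added to the analytical system simply appears as a new data source. (sqlite3 and all the names here are stand-ins, not any vendor's actual API.)

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_2008 (month TEXT, revenue REAL)")

def visible_sources(conn):
    # Tables the access tool should offer users as data sources.
    rows = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    return [name for (name,) in rows]

print(visible_sources(conn))  # ['sales_2008']
conn.execute("CREATE TABLE sales_2009 (month TEXT, revenue REAL)")
print(visible_sources(conn))  # ['sales_2008', 'sales_2009'] -- new source appears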

* Results of a quick Google search: OnDemandIQ, LucidEra, PivotLink (an in-memory columnar database), oco, VisualSmart, GoodData and Autometrics.
Posted in analysis systems, business intelligence, dashboards, database technology, hosted software, on-demand software, qliktech, qlikview, service oriented architecture

Thursday, 31 January 2008

QlikView Scripts Are Powerful, Not Sexy

Posted on 10:30 by Unknown
I spent some time recently delving into QlikView’s automation functions, which allow users to write macros to control various activities. These are an important and powerful part of QlikView, since they let it function as a real business application rather than a passive reporting system. But what the experience really did was clarify why QlikView is so much easier to use than traditional software.

Specifically, it highlighted the difference between QlikView’s own scripting language and the VBScript used to write QlikView macros.

I was going to label QlikView scripting as a “procedural” language and contrast it with VBScript as an “object-oriented” language, but a quick bit of Wikipedia research suggests those may not be quite the right labels. Still, whatever the nomenclature, the difference is clear when you look at the simple task of assigning a value to a variable. With QlikView scripts, I use a statement like:

Set Variable1 = '123';

With VBScript using the QlikView API, I need something like:

set v = ActiveDocument.GetVariable("Variable1")
v.SetContent "123",true

That the first option is considerably easier may not be an especially brilliant insight. But the implications are significant, because far less training is needed to write QlikView scripts than to write similar programs in a language like VBScript, let alone Visual Basic itself. This in turn means that people with far less technical skill can do useful things in QlikView than with other tools. And that gets back to the core advantage I’ve associated with QlikView previously: it lets non-IT people like business analysts do things that normally require IT assistance. The benefit isn’t simply that the business analysts are happier or that IT gets a reduced workload. It's that the entire development cycle is accelerated because analysts can develop and refine applications for themselves. Otherwise, they'd be writing specifications, handing these to IT, waiting for IT to produce something, reviewing the results, and then repeating the cycle to make changes. This is why we can realistically say that QlikView cuts development time to hours or days instead of weeks or months.

Of course, any end-user tool cuts the development cycle. Excel reduces development time in exactly the same way. The difference lies in the power of QlikView scripts. They can do very complicated things, giving users the ability to create truly powerful systems. These capabilities include all kinds of file manipulation—loading data, manipulating it, splitting or merging files, comparing individual records, and saving the results.
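
To show the kind of job I mean without reproducing proprietary QlikView syntax, here is a rough Python stand-in for a short load script (the file names, fields and tax rate are all made up): load a file, merge in a lookup, derive a field, and save the result.

import csv

with open("orders.csv", newline="") as f:
    orders = list(csv.DictReader(f))  # load the raw data (assumes at least one row)

regions = {"Acme": "East", "Beta": "West"}  # lookup table to merge in

for row in orders:
    row["region"] = regions.get(row["customer"], "Unknown")  # the join
    row["amount_with_tax"] = float(row["amount"]) * 1.08     # derived field

with open("orders_enriched.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=orders[0].keys())
    writer.writeheader()
    writer.writerows(orders)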

The reason it’s taken me so long to recognize that this is important is that database management is not built into today's standard programming languages. We’ve simply become so used to the division between SQL queries and programs that the distinction feels normal. But reflecting on QlikView scripts brought me back to the old days of the FoxPro and dBase database languages, which did combine database management with procedural coding. They were tremendously useful tools. Indeed, I still use FoxPro for certain tasks. (And not that crazy new-fangled Visual FoxPro, either. It’s especially good after a brisk ride on the motor-way in my Model T. You got a problem with that, sonny?)

Come to think of it, FoxPro and dBase played a similar role in their day to what QlikView offers now: bringing hugely expanded data management power to the desktops of lightly trained users. Their fate was essentially to be overwhelmed by Microsoft Access and SQL Server, which made reasonably priced SQL databases available to end-users and data centers. Although I don’t think QlikView is threatened from that direction, the analogy is worth considering.

Back to my main point, which is that QlikView scripts are both powerful and easy to use. I think they’re an underreported part of the QlikView story, which tends to be dominated by the sexy technology of the in-memory database and the pretty graphics of QlikView reports. Compared with those, scripting seems pedestrian and probably a bit scary to the non-IT users whom I consider QlikView’s core market. I know I myself was put off when I first realized how dependent QlikView was on scripts, because I thought it meant only serious technicians could take advantage of the system. Now that I see how much easier the scripts are than today’s standard programming languages, I consider them a major QlikView advantage.

(Standard disclaimer: although my firm is a reseller for QlikView, opinions expressed in this blog are my own.)
Posted in analysis systems, analytics tools, business intelligence, dashboards, database technology, qliktech, qlikview

Wednesday, 9 January 2008

Visualizing the Value of QlikTech (and Any Others)

Posted on 14:24 by Unknown
As anyone who knows me would have expected, I couldn't resist figuring out how to draw and post the chart I described last week to illustrate the benefits of QlikTech.

The mechanics are no big deal, but getting it to look right took some doing. I started with a simple version of the table I described in the earlier post, a matrix comparing business intelligence questions (tasks) vs. the roles of the people who can answer them.

Per Friday's post, I listed four roles: business managers, business analysts, statisticians, and IT specialists. The fundamental assumption is that each question will be answered by the person closest to business managers, who are the ultimate consumers. In other words, starting with the business manager, each person answers a question if she can, or passes it on to the next person on the list.

I defined five types of tasks, based on the technical resources needed to get an answer. These are:

- Read an Existing Report: no work is involved; business managers can answer all these questions themselves.

- Drilldown in Existing Report: this requires accessing data that has already been loaded into a business intelligence system and prepped for access. Business managers can do some of this for themselves, but most will be done by business analysts who are more fluent with the business intelligence product.

- Calculate from Existing Report: this requires copying data from an existing report and manipulating it in Excel or something more substantial. Business managers can do some of this, but more complex analyses are performed by business analysts or sometimes statisticians.

- Ad Hoc Analysis of Existing Data: this requires accessing data that has already been made available for analysis, but is not part of an existing reporting or business intelligence output. Usually this means it resides in a data warehouse or data mart. Business managers don't have the technical skills to get at this data. Some of it may be accessible to analysts, more will be available to statisticians, and the remainder will require help from IT.

- Add New Data: this requires adding data to the underlying business intelligence environment, such as putting a new source or field in a data warehouse. Statisticians can do some of this but most of the time it must be done by IT.

The table below shows the numbers I assigned to each part of this matrix. (Sorry the tables are so small. I haven't figured out how to make them bigger in Blogger. You can click on them for a full-screen view.) Per the preceding definitions, the numbers represent the percentage of questions of each type that can be answered by each sort of user. Each row adds to 100%.

[Table image: base matrix of task types vs. roles]
I'm sure you recognize that these numbers are VERY scientific.

I then did a similar matrix representing a situation where QlikTech was available. Essentially, things move to the left, because less technically skilled users gain more capabilities. Specifically,

- There is no change for reading reports, since the managers could already do that for themselves.

- Pretty much any drilldown now becomes accessible to business managers or analysts, because QlikTech makes it so easy to add new drill paths.

- Calculations on existing data don't change for business managers, since they won't learn the finer points of QlikTech or even have the licenses needed to do really complicated things. But business analysts can do a lot more than with Excel. There are still some things that need a statistician's help.

- A little ad hoc analysis becomes possible for business managers, but business analysts gain a great deal of capability. Again, some work still needs a statistician.

- Adding new data now becomes possible in many cases for business analysts, since they can connect directly to data sources that would otherwise have needed IT support or preparation. Statisticians can also pick up some of the work. The rest remains with IT.

Here is the revised matrix:

[Table image: revised matrix with QlikTech available]
So far so good. But now for the graph itself. Just creating a bar chart of the raw data didn't give the effect I wanted.

[Chart image: bar chart of the raw scores]
Yes, if you look closely, you can see that business managers and business analysts (blue and red bars) gain the most. But it definitely takes more concentration than I'd like.

What I really had in mind was a single set of four bars, one for each role, showing how much capability each gained when QlikTech was added. This took some thinking. I could add the scores for each role to get a total value. The change in that value is what I want to show, presumably as a stacked bar chart. But you can't see that when the value goes down. I ultimately decided that the height of the bar should represent the total capability of each role: after all, just because statisticians don't have to read reports for business managers, they can still do it for themselves. This meant adding the values for each row across, so each role accumulated the capabilities of the less technical roles to its left. So, the base table now looked like:

[Table image: cumulative capability matrix]
The sum of each column now shows the total capability available to each role. A similar calculation and sum for the QlikTech case shows the capability after QlikTech is added. The change between the two is each role's gain in capability.

[Table image: column sums for the base and QlikTech cases, with the change for each role]
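
Since the original tables were images and their values are lost, here is the calculation itself in Python with made-up percentages (rows are the five task types, columns the four roles in order; each row sums to 100):

base = [
    [100, 0, 0, 0],   # read an existing report
    [40, 50, 5, 5],   # drill down in existing report
    [30, 50, 15, 5],  # calculate from existing report
    [0, 30, 40, 30],  # ad hoc analysis of existing data
    [0, 0, 20, 80],   # add new data
]

def cumulative_capability(matrix):
    # Accumulate each row left to right (each role inherits what the
    # less technical roles to its left can do), then sum the columns.
    totals = [0] * len(matrix[0])
    for row in matrix:
        running = 0
        for col, value in enumerate(row):
            running += value
            totals[col] += running
    return totals

print(cumulative_capability(base))  # total capability score per role

Running the same function on the QlikTech matrix and subtracting gives each role's gain, which is the quantity plotted in the stacked bars below.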
Now we have what I wanted. Start with the base value and stack the change on top of it, and you see very clearly how the capabilities shift after adding QlikTech.

[Chart image: stacked bars of base capability plus gain]
I did find one other approach that I may even like better. This is to plot the sums of the two cases for each group in a three-way bar chart, as below. I usually avoid such charts because they're so hard to read. But in this case it does show both the shift of capability to the left, and the change in workload for the individual roles. It's a little harder to read but perhaps the extra information is worth it.

[Chart image: three-way bar chart of the two cases]
Obviously this approach to understanding software value can be applied to anything, not just QlikTech. I'd be interested to hear whether anybody else finds it useful.

Fred, this means you.
Posted in analysis systems, business intelligence, dashboards, qliktech, qlikview

Friday, 4 January 2008

Fitting QlikTech into the Business Intelligence Universe

Posted on 16:46 by Unknown
I’ve been planning for about a month to write about the position of QlikTech in the larger market for business intelligence systems. The topic has come up twice in the past week, so I guess I should do it already.

First, some context. I’m using “business intelligence” in the broad sense of “how companies get information to run their businesses”. This encompasses everything from standard operational reports to dashboards to advanced data analysis. Since these are all important, you can think of business intelligence as providing a complete solution to a large but finite list of requirements.

For each item on the list, the answer will have two components: the tool used, and the person doing the work. That is, I’m assuming a single tool will not meet all needs, and that different tasks will be performed by different people. This all seems reasonable enough. It means that a complete solution will have multiple components.

It also means that you have to look at any single business intelligence tool in the context of other tools that are also available. A tool which seems impressive by itself may turn out to add little real value if its features are already available elsewhere. For example, a visualization engine is useless without a database. If the company already owns a database that also includes an equally-powerful visualization engine, then there’s no reason to buy the stand-alone visualization product. This is why vendors strive to expand their product functionality and why it is so hard for specialized systems to survive. It’s also why nobody buys desk calculators: everyone has a computer spreadsheet that does the same and more. But I digress.

Back to the notion of a complete solution. The “best” solution is the one that meets the complete set of requirements at the lowest cost. Here, “cost” is broadly defined to include not just money, but also time and quality. That is, a quicker answer is better than a slower one, and a quality answer is better than a poor one. “Quality” raises its own issues of definition, but let’s view this from the business manager’s perspective, in which case “quality” means something along the lines of “producing the information I really need”. Since understanding what’s “really needed” often takes several cycles of questions, answers, and more questions, a solution that speeds up the question-answer cycle is better. This means that solutions offering more power to end-users are inherently better (assuming the same cost and speed), since they let users ask and answer more questions without getting other people involved. And talking to yourself is always easier than talking to someone else: you’re always available, and rarely lose an argument.

In short: the way to evaluate a business intelligence solution is to build a complete list of requirements and then, for each requirement, look at what tool will meet it, who will use that tool, what the tool will cost, and how quickly the work will get done.

We can put cost aside for the moment, because the out-of-pocket expense of most business intelligence solutions is insignificant compared with the value of getting the information they provide. So even though cheaper is better and prices do vary widely, price shouldn’t be the determining factor unless all else is truly equal.

The remaining criteria are who will use the tool and how quickly the work will get done. These come down to pretty much the same thing, for the reasons already described: a tool that can be used by a business manager will give the quickest results. More grandly, think of a hierarchy of users: business managers; business analysts (staff members who report to the business managers); statisticians (specialized analysts who are typically part of a central service organization); and IT staff. Essentially, questions are asked by business managers, and work their way through the hierarchy until they get to somebody who can answer them. Who that person is depends on what tools each person can use. So, if the business manager can answer her own question with her own tools, it goes no further; if the business analyst can answer the question, he does and sends it back to his boss; if not, he asks for help from a statistician; and if the statistician can’t get the answer, she goes to the IT department for more data or processing.

Bear in mind that different users can do different things with the same tool. A business manager may be able to use a spreadsheet only for basic calculations, while a business analyst may also know how to do complex formulas, graphics, pivot tables, macros, data imports and more. Similarly, the business analyst may be limited to simple SQL queries in a relational database, while the IT department has experts who can use that same relational database to create complex queries, do advanced reporting, load data, add new tables, set up recurring processes, and more.

Since a given tool does different things for different users, one way to assess a business intelligence product is to build a matrix showing which requirements each user type can meet with it. Whether a tool “meets” a requirement could be indicated by a binary measure (yes/no), or, preferably, by a utility score that shows how well the requirement is met. Results could be displayed in a bar chart with four columns, one for each user group, where the height of each bar represents the percentage of all requirements those users can meet with that tool. Tools that are easy but limited (e.g. Excel) would have short bars that get slightly taller as they move across the chart. Tools that are hard but powerful (e.g. SQL databases) would have low bars for business users and tall bars for technical ones. (This discussion cries out for pictures, but I haven’t figured out how to add them to this blog. Sorry.)

Things get even more interesting if you plot the charts for two tools on top of each other. Just sticking with Excel vs. SQL, the Excel bars would be higher than the SQL bars for business managers and analysts, and lower than the SQL bars for statisticians and IT staff. The over-all height of the bars would be higher for the statisticians and IT, since they can do more things in total. Generally this suggests that Excel would be of primary use to business managers and analysts, but pretty much redundant for the statisticians and IT staff.

Of course, in practice, statisticians and IT people still do use Excel, because there are some things it does better than SQL. This comes back to the matrices: if each cell has utility scores, comparing the scores for different tools would show which tool is better for each situation. The number of cells won by each tool could create a stacked bar chart showing the incremental value of each tool to each user group. (Yes, I did spend today creating graphs. Why do you ask?)

Now that we’ve come this far, it’s easy to see that assessing different combinations of tools is just a matter of combining their matrices. That is, you compare the matrices for all the tools in a given combination and identify the “winning” product in each cell. The number of cells won by each tool shows its incremental value. If you want to get really fancy, you can also consider how much each tool is better than the next-best alternative, and incorporate the incremental cost of deploying an additional tool.
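
As a sketch of that combination step (the tool names, cells and scores are all invented), the snippet below takes the best existing score per cell and counts the cells where a candidate tool beats that best-of-breed baseline:

existing_tools = {
    "excel": {("drilldown", "analyst"): 5, ("ad hoc", "analyst"): 2},
    "sql_db": {("drilldown", "analyst"): 4, ("ad hoc", "analyst"): 3},
}
candidate = {("drilldown", "analyst"): 8, ("ad hoc", "analyst"): 7}

# Best existing score in each requirement/user cell.
cells = set().union(*(m.keys() for m in existing_tools.values()))
baseline = {c: max(m.get(c, 0) for m in existing_tools.values()) for c in cells}

# Cells where the candidate tool "wins" measure its incremental value.
wins = [c for c in cells if candidate.get(c, 0) > baseline[c]]
print(f"candidate wins {len(wins)} of {len(cells)} cells")  # 2 of 2 here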

Which, at long last, brings us back to QlikTech. I see four general classes of business intelligence tools: legacy systems (e.g. standard reports out of operational systems); relational databases (e.g. in a data warehouse); traditional business intelligence tools (e.g. Cognos or Business Objects; we’ll also add statistical tools like SAS); and Excel (where so much of the actual work gets done). Most companies already own at least one product in each category. This means you could build a single utility matrix, taking the highest score in each cell from all the existing systems. Then you would compare this to a matrix for QlikTech and find cells where the QlikView is higher. Count the number of those cells, highlight them in a stacked bar chart, and you have a nice visual of where QlikTech adds value.

If you actually did this, you’d probably find that QlikTech is most useful to business analysts. Business managers might benefit some from QlikView dashboards, but those aren’t all that different from other kinds of dashboards (although building them in QlikView is much easier). Statisticians and IT people already have powerful tools that do much of what QlikTech does, so they won’t see much benefit. (Again, it may be easier to do some things in QlikView, but the cost of learning a new tool will weigh against it.) The situation for business analysts is quite different: QlikTech lets them do many things that other tools do not. (To be clear: some other tools can do those things, but it takes more skill than the business analysts possess.)

This is very important because it means those functions can now be performed by the business analysts instead of passed on to statisticians or IT. Remember that the definition of a “best” solution boils down to whatever solution meets business requirements closest to the business manager. By allowing business analysts to perform many functions that would otherwise be passed through to statisticians or IT, QlikTech generates a huge improvement in total solution quality.
Posted in analysis systems, analytics tools, dashboards, qliktech, qlikview, software selection, vendor evaluation

Tuesday, 12 June 2007

Looking for Balanced Scorecard Software

Posted on 08:12 by Unknown
I haven’t been able to come up with an authoritative list of major balanced scorecard software vendors. UK-based consultancy 2GC lists more than 100 in a helpful database with little blurbs on each, but they include performance management systems that are not necessarily for balanced scorecards. The Balanced Scorecard Collaborative, home of balanced scorecard co-inventor David P. Norton, lists two dozen products they have certified as meeting true balanced scorecard criteria. Of these, more than half belong to non-specialist companies, including enterprise software vendors (Oracle, Peoplesoft [now Oracle], SAP, Infor, Rocket Software) and broad business intelligence vendors (Business Objects, Cognos, Hyperion [now Oracle], Information Builders, Pilot Software [now SAP], SAS). Most of these firms have purchased specialist products. The remaining vendors (Active Strategy, Bitam, Consist FlexSI, Corporater, CorVu, InPhase, Intalev, PerformanceSoft [now Actuate], Procos, Prodacapo, QPR and Vision Grupos Consultores) are a combination of performance management specialists and regional consultancies.

That certified products are available from all the major enterprise and business intelligence vendors shows that the basic functions needed for balanced scorecards are well understood and widely available. I’m sure there are differences among the products, but I suspect the choice of system will rarely be critical to project success or failure. The core functions are creation of strategy maps and cascading scorecards. I suspect systems vary more widely in their ability to import and transform scorecard data. A number of products also include project management functions such as task lists and milestone reporting. This is probably outside the core requirements for balanced scorecard but does make sense in the larger context of providing tools to help meet business goals.

If your idea of a good time is playing with this sort of system (and whose isn’t?), Strategy Map offers a fully functional personal version for free.
Posted in balanced scorecard, dashboards, marketing performance measurement, score cards

Monday, 11 June 2007

Why Balanced Scorecards Haven't Succeeded at Marketing Measurement

Posted on 10:07 by Unknown
All this thinking about the overwhelming number of business metrics has naturally led me to consider balanced scorecards as a way to organize metrics effectively. I think it’s fair to say that balanced scorecards have had only modest success in the business world: the concept is widely understood, but far from universally employed.

Balanced scorecards make an immense amount of sense. A disciplined scorecard process begins with strategy definition, followed by a strategy map that identifies the measures most important to a business and how they relate to each other and to final results. Once the top-level scorecard is built, subsidiary scorecards report on components that contribute to the top-level measures, providing more focused information and targets for lower-level managers.

That’s all great. But my problem with scorecards, and I suspect the reason they haven’t been used more widely, is they don’t make a quantifiable link between scorecard measures and business results. Yes, something like on-time arrivals may be a critical success factor for an airline, and thus appear on its scorecard. That scorecard will even give a target value to compare with actual performance. But it won’t show the financial impact of missing the target—for example, every 1% shortfall vs. the target on-time arrival rate translates into $10 million in lost future value. Proponents would argue (a) this value is impossible to calculate because there are so many intervening factors and (b) so long as managers are rewarded for meeting targets (or punished for not meeting them), that’s incentive enough. But I believe senior managers are rightfully uncomfortable setting those sorts of targets and reward systems unless the relationships between the targets and financial results are known. Otherwise, they risk disproportionately rewarding the selected behaviors, thereby distorting management priorities and ultimately harming business results.

Loyal readers of this blog might expect me to propose lifetime value as a better alternative. It probably is, but the lukewarm response it elicits from most managers has left me cautious. Whether managers don’t trust LTV calculations because they’re too speculative, or (more likely) are simply focused on short-term results, it’s pretty clear that LTV will not be the primary measurement tool in most organizations. I haven’t quite given up hope that LTV will ultimately receive its due, but for now feel it makes more sense to work with other measures that managers find more compelling.
Posted in balanced scorecard, customer metrics, dashboards, lifetime value, marketing performance, marketing performance measurement, score cards

Sunday, 3 June 2007

Data Visualization Is Just One Part of a Dashboard System

Posted on 19:43 by Unknown
Following Friday’s post on dashboard software, I want to emphasize that data visualization techniques are really just one element of those systems, and not necessarily the most important. Dashboard systems must gather data from source systems; transform and consolidate it; place it in structures suited for high-speed display and analysis; identify patterns, correlations and exceptions; and make it accessible to different users within the constraints of user interests, skills and authorizations. Although I haven’t researched the dashboard products in depth, even a cursory glance at their Web sites suggests they vary widely in these areas.
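
To underline how small the display step is relative to the rest, here is a minimal end-to-end sketch (all sources and numbers invented) of the pipeline just described:

def gather():
    # In reality: extracts from multiple operational source systems.
    return [("East", 120.0), ("East", 80.0), ("West", 200.0)]

def transform(rows):
    # Consolidate into a structure suited to high-speed display.
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount
    return totals

def display(totals):
    # The "visualization" itself: the last and smallest step.
    for region, total in sorted(totals.items()):
        print(f"{region:5s} {total:8.1f}")

display(transform(gather()))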

As with any kind of analytical system, most of the work and most of the value in dashboards will be in the data gathering. Poor visualization of good data can be overcome; good visualization of poor data is basically useless. So users should focus their attention on the underlying capabilities and not be distracted by display alone.
Posted in analysis systems, business intelligence, dashboards, score cards, vendor evaluation

Friday, 1 June 2007

Dashboard Software: Finding More than Flash

Posted on 09:37 by Unknown
I’ve been reading a lot about marketing performance metrics recently, which turns out to be a drier topic than I can easily tolerate—and I have a pretty high tolerance for dry. To give myself a bit of a break without moving too far afield, I decided to research marketing dashboard software. At least that let me look at some pretty pictures.

Sadly, the same problem that afflicts discussions of marketing metrics affects most dashboard systems: what they give you is a flood of disconnected information without any way to make sense of it. Most of the dashboard vendors stress their physical display capabilities—how many different types of displays they provide, how much data they can squeeze onto a page, how easily you can build things—and leave the rest to you. What this comes down to is: they let you make bigger, prettier mistakes faster.

Two exceptions did crop up that seem worth mentioning.

- ActiveStrategy builds scorecards that are specifically designed to link top-level business strategy with lower-level activities and results. They refer to this as “cascading” scorecards and that seems a good term to illustrate the relationship. I suppose this isn’t truly unique; I recollect the people at SAS showing me a similar hierarchy of key performance indicators, and there are probably other products with a cascading approach. Part of this may be the difference between dashboards and scorecards. Still, if nothing else, ActiveStrategy is doing a particularly good job of showing how to connect data with results.

- VisualAcuity doesn’t have the same strategic focus, but it does seek more effective alternatives to the normal dashboard display techniques. As their Web site puts it, “The ability to assimilate and make judgments about information quickly and efficiently is key to the definition of a dashboard. Dashboards aren’t intended for detailed analysis, or even great precision, but rather summary information, abbreviated in form and content, enough to highlight exceptions and initiate action.” VisualAcuity dashboards rely on many small displays and time-series graphs to do this.

Incidentally, if you’re just looking for something different, FYIVisual uses graphics rather than text or charts in a way that is probably very efficient at uncovering patterns and exceptions. It definitely doesn’t address the strategy issue and may or may not be more effective than more common display techniques. But at least it’s something new to look at.
Posted in analytics tools, customer metrics, dashboards, score cards, software selection