
Stephen Few: Now You See It


Readers:

I was in Portland, Oregon last week attending three data visualization workshops by industry expert Stephen Few. I was very excited to sit at the foot of the master for three days and soak in all of this great dataviz information.

Last Thursday was the third workshop, Now You See It, which is based on Steve’s best-selling book (see photo below).

So as not to give away too much of what Steve teaches in the workshops, I have decided to discuss just one of our workshop topics: human perceptual and cognitive strengths.

You can find future workshops by Steve on his website, Perceptual Edge.

Best Regards,

Michael

Now You See It

 

Designed for Humans

Good visualizations and good visualization tools are carefully designed to take advantage of human perceptual and cognitive strengths and to augment human abilities that are weak. If the goal is to count the number of circles, the visualization below isn’t well designed: it is difficult to remember which circles you have and have not counted.

Quickly, tell me how many blue circles you see below.

Design for Humans 1

The visualization below shows the same number of circles but is well designed for the counting task. Because the circles are grouped into small sets of five, it is easy to remember which groups have and have not been counted, easy to quickly count the number of circles in each group, and easy to discover with little effort that each of the five groups contains the same number of circles (i.e., five), resulting in a total count of 25 circles.

Design for Humans 2

The arrangement below is even better.

Design for Humans 3
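If you want to experiment with this grouping principle yourself, here is a quick matplotlib sketch (my own illustration, not from Steve’s materials) that draws 25 circles twice: scattered at random on the left, and arranged into five groups of five on the right.

```python
import random
import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Left: 25 circles at random positions. Hard to count, because it is
# difficult to remember which circles you have already counted.
random.seed(42)
for _ in range(25):
    ax1.plot(random.uniform(0, 10), random.uniform(0, 10),
             'o', color='steelblue', markersize=12)
ax1.set_title('Hard to count')

# Right: the same 25 circles in five clusters of five. Each cluster is
# countable at a glance, and counted clusters are easy to keep track of.
for group in range(5):
    for i in range(5):
        ax2.plot(group * 2 + (i % 2) * 0.5, (i // 2) * 0.5,
                 'o', color='steelblue', markersize=12)
ax2.set_title('Easy to count: five groups of five')

for ax in (ax1, ax2):
    ax.set_xticks([])
    ax.set_yticks([])
plt.tight_layout()
plt.show()
```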

Information visualization makes possible an ideal balance between unconscious perceptual and conscious cognitive processes. With the proper tools, we can shift much of the analytical process from conscious processes in the brain to pre-attentive processes of visual perception, letting our eyes do what they do extremely well.

Stephen Few: Information Dashboard Design

Readers:

I am in Portland, Oregon this week attending three data visualization workshops by industry expert Stephen Few. I am very excited to sit at the foot of the master for three days and soak in all of this great dataviz information.

Today was the second workshop, Information Dashboard Design, which is based on Steve’s best-selling book (see photo below).

So as not to give away too much of what Steve teaches in the workshops, I have decided to discuss one of the dashboard exercises we did in class. The goal was to identify what we felt was wrong with the dashboard.

I will show you the dashboard first. Then, you can see our critique below.

You can find future workshops by Steve on his website, Perceptual Edge.

Best Regards,

Michael

Information Dashboard Design

 

Dashboard To Critique

CORDA Airlines Dashboard

Critique Key Points

  • Top left chart – It is the only chart that has anything to do with flight loading
  • Top left chart – Are the flight numbers useful?
  • Two Expand/Print buttons – Need more clarity (a right-click on the chart would be a better choice)
  • Top right chart – Poor use of pie charts; the sizes of the pies are meant to show the largest sales channel. Small multiple bar charts would work better, with total sales as a fourth bar chart (see the sketch after this list)
  • Redundant use of “February” – It appears in both the title and the charts
  • Bottom left chart – Why does it have a pie chart in it?
  • Bottom right chart – The map may be better as a bar chart (a geographical display could be useful if we had more information). The way the bubbles are currently expressed is not useful (use % of cancellations instead), and the symbols may have a different meaning every day
  • Bottom right chart – Is the CORDAir logo necessary?
  • Location of the drop-down – It is not clear whether it applies to the top left chart or to all charts
  • Backgrounds – Heavy colors and gradients
  • Instructions – These should be in a separate help document; users only need to learn them once
  • Top left chart – Faint image in the background, supposedly an aircraft seating map. Do you really want to see this every day? It is a visual distraction
  • IMPORTANT – Is there visual context offered with any of the graphs? No. This is critical
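To make the pie-chart point concrete, here is a rough matplotlib sketch of the suggested redesign: small multiple bar charts on a shared scale, with total sales as a fourth chart. The channel names and figures are hypothetical placeholders, not values from the CORDA dashboard.

```python
import matplotlib.pyplot as plt

# Hypothetical sales-channel figures, invented for illustration.
channels = ['Web', 'Phone', 'Agent']
regions = {'East': [120, 80, 45],
           'West': [95, 110, 60],
           'South': [70, 65, 90]}
totals = [sum(vals) for vals in zip(*regions.values())]

# Small multiples: one bar chart per region plus a totals chart, all
# sharing one quantitative scale so values compare directly across
# charts, something pie slices cannot offer.
panels = list(regions.items()) + [('Total', totals)]
fig, axes = plt.subplots(1, 4, figsize=(12, 3), sharey=True)
for ax, (name, vals) in zip(axes, panels):
    ax.bar(channels, vals, color='steelblue')
    ax.set_title(name)
plt.tight_layout()
plt.show()
```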

————————————————————————————————-

Dashboard Example Source: Website of Corda Technologies Incorporated, which has since been acquired by Domo.

Stephen Few: Show Me The Numbers

Readers:

I am in Portland, Oregon this week attending three data visualization workshops by industry expert Stephen Few. I am very excited to sit at the foot of the master for three days and soak in all of this great dataviz information.

Yesterday was the first workshop, Show Me the Numbers, which is based on Steve’s best-selling book (see photo below).

So as not to give away too much of what Steve teaches in the workshops, I have decided to share one “before and after” example each day, with Steve’s explanation of why he made the changes he did.

You can find future workshops by Steve on his website, Perceptual Edge.

Best Regards,

Michael

Show Me the Numbers

 

“Before” Example

In the example below, the message contained in the titles is not clearly displayed in the graphs. The message deals with the ratio of indirect to total sales: how it is declining domestically while holding steady internationally. You’d have to work hard to get this message from the display as it is currently designed.

Before - Show Me the Numbers

 

“After” Example

The revised example below, however, is designed very specifically to display the intended message. Because this graph is skillfully designed to communicate, its message is crystal clear. A key feature that makes this so is the choice of percentage for the quantitative scale, rather than dollars.

After - Show Me the Numbers
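The key design choice, re-expressing indirect sales as a percentage of total sales rather than as dollars, is easy to reproduce. Here is a minimal sketch using hypothetical monthly figures (placeholders, not the numbers from Steve’s example):

```python
import matplotlib.pyplot as plt

months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun']

# Hypothetical sales in thousands of dollars, invented for illustration.
dom_total = [500, 520, 510, 530, 540, 550]
dom_indirect = [250, 240, 220, 210, 190, 175]
intl_total = [300, 310, 305, 315, 320, 330]
intl_indirect = [150, 154, 153, 158, 160, 164]

# The transformation that makes the message visible: indirect sales
# as a percentage of total sales, rather than raw dollars.
dom_pct = [100 * i / t for i, t in zip(dom_indirect, dom_total)]
intl_pct = [100 * i / t for i, t in zip(intl_indirect, intl_total)]

plt.plot(months, dom_pct, marker='o', label='Domestic')
plt.plot(months, intl_pct, marker='o', label='International')
plt.ylabel('Indirect sales as % of total')
plt.legend()
plt.show()
```

On the percentage scale, the domestic decline and the flat international ratio are immediately visible, which is exactly the intended message.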

Additional Thoughts From Steve

The type of graph that is selected and the way it is designed also have a great impact on the message that is communicated. By simply switching from a line graph to a bar graph, the decrease in job satisfaction among those without college degrees in their later years is no longer as obvious.

More Thoughts - Show Me the Numbers
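Steve’s line-versus-bar observation is easy to test. This sketch renders a single hypothetical satisfaction series (invented values, not the data from Steve’s example) both ways; the decline that jumps out of the line graph has to be read value by value from the bars.

```python
import matplotlib.pyplot as plt

ages = ['25-34', '35-44', '45-54', '55-64']
satisfaction = [72, 68, 61, 52]  # hypothetical scores, for illustration

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3), sharey=True)

# A line graph emphasizes the overall shape of change...
ax1.plot(ages, satisfaction, marker='o')
ax1.set_title('Line graph: the trend is immediate')

# ...while bars emphasize individual value comparisons, so the same
# downward trend is less obvious at a glance.
ax2.bar(ages, satisfaction)
ax2.set_title('Bar graph: the trend recedes')

plt.tight_layout()
plt.show()
```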

Stephen Few: Why Do We Visualize Quantitative Data?

Readers:

It has been a while since I have discussed some of the latest creative thoughts on data visualization from Stephen Few. I have read all of Steve’s books, attended several classes from him, and religiously follow his blog and newsletter on his website, Perceptual Edge.

For those of you who don’t know, Stephen Few is the Founder & Principal of Perceptual Edge. Perceptual Edge, founded in 2003, is a consultancy that was established to help organizations learn to design simple information displays for effective analysis and communication.

Steve has stated that his company will probably always be a company of one or two people, which is the perfect size for him. With 25 years of experience as an innovator, consultant, and educator in the fields of business intelligence and information design, he is now considered the leading expert in data visualization for data sense-making and communication.

Steve writes a quarterly Visual Business Intelligence Newsletter, speaks and teaches internationally, and provides design consulting. In 2004, he wrote the first comprehensive and practical guide to business graphics entitled Show Me the Numbers, now in its second edition. In 2006, he wrote the first and only guide to the visual design of dashboards, entitled Information Dashboard Design, also now in its second edition. In 2009, he wrote the first introduction for non-statisticians to visual data analysis, entitled Now You See It.

Here are his latest thoughts from his newsletter.

Best regards,

Michael

 

Why Do We Visualize Quantitative Data?

Per Stephen Few, we visualize quantitative data to perform three fundamental tasks in an effort to achieve three essential goals:

[Figure: the three fundamental tasks of data visualization – exploring, making sense of, and communicating data – and the goals they serve]

These three tasks are so fundamental to data visualization that Steve used them to define the term, as follows:

Data visualization is the use of visual representations to explore, make sense of, and communicate data.

Steve poses the question: why must we sometimes use graphical displays to perform these tasks rather than other forms of representation? Why not always express values as numbers in tables? Why express them visually rather than audibly?

Essentially, there is only one good reason to express quantitative data visually: some features of quantitative data can be best perceived and understood, and some quantitative tasks can be best performed, when values are displayed graphically. This is so because of the ways our brains work. Vision is by far our dominant sense. We have evolved to perform many data sensing and processing tasks visually. This has been so since the days of our earliest ancestors who survived and learned to thrive on the African savannah. What visual perception evolved to do especially well, it can do faster and better than the conscious thinking parts of our brains. Data exploration, sensemaking, and communication should always involve an intimate collaboration between seeing and thinking (i.e., visual thinking).

Despite this essential reason for visualizing data, people often do it for reasons that are misguided. Steve dispels a few common myths about data visualization.

Myth #1: We visualize data because some people are visual learners.

While it is true that some people have greater visual thinking abilities than others and that some people have a greater interest in images than others, all people with normal perceptual abilities are predominantly visual. Everyone benefits from data visualization, whether they consider themselves visual learners or not, including those who prefer numbers.

Myth #2: We visualize data for people who have difficulty understanding numbers.

While it is true that some people are more comfortable with quantitative concepts and mathematics than others, even the brightest mathematicians benefit from seeing quantitative information displayed visually. Data visualization is not a dumbed-down expression of quantitative concepts.

Myth #3: We visualize data to grab people’s attention with eye-catching but inevitably less informative displays.

Visualizations don’t need to be dumbed down to be engaging. It isn’t necessary to sacrifice content for the sake of appearance. Data can always be displayed in ways that are optimally informative, pleasing to the eye, and engaging. To be engaged by a data display without being well informed of something useful is a waste.

Myth #4: The best data visualizers are those who have been trained in graphic arts.

While training in graphic arts can be useful, it is much more important to understand the data and be trained in visual thinking and communication. Graphic arts training that focuses on marketing (i.e., persuading people to buy or do something through manipulation) and artistry rather than communication can actually get in the way of effective data visualization.

Myth #5: Graphics provide the best means of telling stories contained in data.

While it is true that graphics are often useful and sometimes even essential for data-based storytelling, it isn’t storytelling itself that demands graphics. Much of storytelling is best expressed in words and numbers rather than images. Graphics are useful for storytelling because some features of data are best understood by our brains when they’re presented visually.

We visualize data because the human brain can perceive particular quantitative features and perform particular quantitative tasks most effectively when the data is expressed graphically. Visual data processing provides optimal support for the following:

1. Seeing the big picture

Graphs reveal the big picture: an overview of a data set. An overview summarizes the data’s essential characteristics, from which we can discern what’s routine vs. exceptional.

The series of three bar graphs below provides an overview of the opinions that 15 countries had about America in 2004, not long after the events of 9/11 and the military campaigns that followed.

graph-of-country-opinions

Steve first discovered this information in the following form on the website of PBS:

table-of-country-opinions

Based on this table of numbers, he had to read each value one at a time and, because working memory is limited to three or four chunks of information at a time, he couldn’t use this display to construct and hold an overview of these countries’ opinions in his head. To solve this problem, he redisplayed this information as the three bar graphs shown above, which provided the overview that he wanted. Steve was able to use it to quickly get a sense of these countries’ opinions overall and in comparison to one another.

Bonus: Here is a link to where Steve discusses the example above on his website.

2. Easily and rapidly comparing values

Try to quickly compare the magnitudes of values using a table of numbers, such as the one shown above. You can’t, because numbers must be read one at a time and only two numbers can be compared at a time. Graphs, however, such as the bar graphs above, make it possible to see all of the values at once and to easily and rapidly compare them.

3. Seeing patterns among values

Many quantitative messages are revealed in patterns formed by sets of values. These patterns describe, among other things, the nature of change through time, how values are distributed, and how sets of values correlate.

Try to construct the pattern of monthly change in either domestic or international sales for the entire year using the table below.

table-of-sales-data

Difficult, isn’t it? The line graph below, however, presents the patterns of change in a way that can be perceived immediately, without conscious effort.

graph-of-sales-data

You can thank processes that take place in your visual cortex for this. The visual cortex perceives patterns and then the conscious thinking parts of our brains make sense of them.

4. Comparing patterns

Visual representations of patterns are easy to compare. Not only can the independent patterns of domestic and international sales be easily perceived by viewing the graph above, but they can also be compared to one another to determine how they are similar and different.
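A short pandas sketch makes the table-versus-graph contrast of the last two sections concrete: the same DataFrame that prints as a hard-to-scan table of monthly sales renders as a line graph whose patterns, and their similarities and differences, appear at once. The figures are hypothetical placeholders, not Steve’s data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical monthly sales in thousands of dollars.
sales = pd.DataFrame(
    {'Domestic': [983, 1098, 1213, 1030, 1120, 1260,
                  1310, 1405, 1289, 1340, 1480, 1560],
     'International': [420, 435, 390, 410, 450, 470,
                       455, 480, 510, 495, 530, 555]},
    index=['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
           'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'])

print(sales)   # as a table: each value must be read one at a time
sales.plot()   # as lines: both patterns of change are visible at once
plt.ylabel('Sales ($K)')
plt.show()
```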

In Summary

These four quantitative features and tasks require visual displays. This is why we visualize quantitative data.

Has MicroStrategy Toppled Tableau as the Analytics King?

MicroStrategy Analytics

In a recent TDWI article titled Analysis: MicroStrategy’s Would-Be Analytics King, Stephen Swoyer, who is a technology writer based in Nashville, TN, stated that business intelligence (BI) stalwart MicroStrategy Inc. pulled off arguably the biggest coup at Teradata Corp.’s recent Partners User Group (Partners) conference, announcing a rebranded, reorganized, and — to some extent — revamped product line-up.

One particular announcement drew great interest: MicroStrategy’s free version of its discovery tool — Visual Insight — which it packages as part of a new standalone BI offering: MicroStrategy Analytics Desktop.

With Analytics Desktop, MicroStrategy takes dead aim at insurgent BI offerings from QlikTech Inc., Tibco Spotfire, and — most particularly — Tableau Software Inc.

MicroStrategy rebranded its products into three distinct groups: the MicroStrategy Analytics Platform (consisting of MicroStrategy Analytics Enterprise version 9.4, an updated version of its v9.3.1 BI suite); MicroStrategy Express (its cloud platform, available in both software-as-a-service and platform-as-a-service subscription options); and MicroStrategy Analytics Desktop (a single-user BI discovery solution). MicroStrategy Analytics Enterprise takes a page from Tableau’s book via support for data blending, a technique that Tableau helped to popularize.
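Data blending itself is simple to picture: an ad hoc, in-memory join of records from two sources that were never modeled together, performed at analysis time rather than in an upstream ETL process. Here is a minimal pandas sketch of the general idea (tables and columns invented for illustration; this is not MicroStrategy’s or Tableau’s actual implementation):

```python
import pandas as pd

# Two sources that share a key but were never modeled together,
# e.g., CRM revenue and a spreadsheet of regional targets.
crm = pd.DataFrame({'region': ['East', 'West', 'South'],
                    'revenue': [1.2, 0.9, 1.4]})
targets = pd.DataFrame({'region': ['East', 'West', 'South'],
                        'target': [1.0, 1.1, 1.3]})

# The "blend": join on the shared key, entirely in memory.
blended = crm.merge(targets, on='region')
blended['pct_of_target'] = 100 * blended['revenue'] / blended['target']
print(blended)
```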

“We’re giving the business user the tools to join data in an ad hoc sort of environment, on the fly. That’s a big enhancement for us. The architectural work that we did to make that enhancement work resulted in some big performance improvements [in MicroStrategy Analytics Enterprise]: we improved our query performance for self-service analytics by 40 to 50 percent,” said Kevin Spurway, senior vice president of marketing with MicroStrategy.

Spurway — who, as an interesting aside, has a JD from Harvard Law School — said MicroStrategy implements data blending in much the same way that Tableau does: i.e., by doing it in-memory. Previous versions of MicroStrategy BI employed an interstitial in-memory layer, Spurway said; the performance improvements in MicroStrategy Analytics Enterprise result from shifting to an integrated in-memory design, he explained.

“It’s a function of just our in-memory [implementation]. Primarily it has to do with the way the architecture on our end works: we used to have kind of a middle in-memory layer that we’ve removed.”

Spurway described MicroStrategy Analytics Desktop as a kind of trump card: a standalone, desktop-oriented version of the MicroStrategy BI suite, anchored by its Visual Insight tool and designed to address the BI discovery use case. Analytics Desktop can extract data from any ODBC-compliant data source. Like Analytics Enterprise, it’s powered by an integrated in-memory engine.

In other words: a Tableau-killer.
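In rough Python terms, the extract-from-any-ODBC-source pattern Spurway describes looks like the sketch below. The DSN, credentials, and query are placeholders; this illustrates the general pattern, not MicroStrategy’s engine.

```python
import pyodbc
import pandas as pd

# Placeholder DSN and query; any ODBC-compliant source (a warehouse,
# an Excel driver, etc.) is reached the same way.
conn = pyodbc.connect('DSN=SalesWarehouse;UID=analyst;PWD=secret')
df = pd.read_sql('SELECT region, month, revenue FROM sales', conn)
conn.close()

# Once extracted into memory, analysis needs no further round trips
# to the source system.
print(df.groupby('region')['revenue'].sum())
```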

“That [Visual Insight] product has been out there but has always been kind of locked up in our Enterprise product,” he said, acknowledging that MicroStrategy offered Visual Insight as part of its cloud stack, too. “You had to be a MicroStrategy customer who obviously has implemented the enterprise solution, or you could get it through Express, [which is] great for some people, but not everybody wants a cloud-based solution. With [MicroStrategy Analytics Desktop], you go to our website, download and install it, and you’re off and running — and we’ve made it completely free.”

The company’s strategy is that many users will, as Spurway put it, “need more.” He breaks the broader BI market into two distinct segments — with a distinct, Venn-diagram-like area of overlap.

“There’s a visual analytics market. It’s a hot market, which is primarily being driven by business-user demand. Then there’s the traditional business intelligence market, and that market has been there for 20 years. It’s not growing as quickly, and there’s some overlap between the two,” he explained.

“The BI market is IT-driven. For business users, they need speed, they need better ways to analyze their data than Excel provides; they don’t want impediments, they need quick time to value. The IT organization cares about … things … [such as] traditional reporting [and] information-driven applications. Those are apps that are traditionally delivered at large scale and they have to rely on data that’s trusted, that’s modeled.”

If or when users “need more,” they can “step up” to MicroStrategy’s on-premises (Analytics Enterprise) or cloud (Express) offerings, Spurway pointed out. “The IT organization has to support the business users, but they also need to support the operationalization of analytics,” he argued, citing the goal of embedding analytics into the business process. “That can mean a variety of things. It can mean a very simple report or dashboard that’s being delivered every day to a store manager in a Starbucks. They’re not going to need Visual Insight for something like that — they’re not going to need Tableau. They need something that’s simplified for everyday usage.”

MicroStrategy Analytics Powerful

Something More, Something Else

Many in the industry view self-service visual discovery as the culmination of traditional BI.

One popular narrative holds that QlikTech, Tableau, and Spotfire helped establish and popularize visual discovery as an (insurgent) alternative to traditional BI. Spurway sought to turn this view on its head, however: Visual discovery, he claimed, “is a starting point. It draws you in. The key thing that we bring to the table is the capability to bridge the gap between traditional model, single-version-of-the-truth business intelligence and fast, easy, self-service business analytics.”

In Spurway’s view, the usefulness or efficacy of BI technologies shouldn’t be plotted on a linear time-line, e.g., anchored by greenbar reports on the extreme left and culminating in visual discovery on the far right. Visual discovery doesn’t complete or supplant traditional BI, he argued, and it isn’t inconceivable that QlikTech, Tableau, and Spotfire — much like MicroStrategy and all of the other traditional BI powers that now offer visual discovery tools as part of their BI suite — might augment their products with BI-like accoutrements.

Instead of a culmination, Spurway sees a circle — or, better still, a Möbius strip: regardless of where you begin with BI, at some point — in a large enough organization — you’re going to traverse the circle or (as with a Möbius strip) come out the other side.

There might be something to this. From the perspective of the typical Tableau enthusiast, for example, the expo floor at last year’s Tableau Customer Conference (TCC), held just outside of Washington, D.C. in early September, probably offered a mix of the familiar, the new, and the plumb off-putting. For example, Tableau users tend to take a dim view of traditional BI, to say nothing of the data integration (DI) or middleware plumbing that’s associated with it: “Just let me work already!” is the familiar cry of the Tableau devotee. However, TCC 2013 played host to several old-guard exhibitors — including IBM Corp., Informatica Corp., SyncSort Inc., and Teradata Corp. — as well as upstart players such as WhereScape Inc. and REST connectivity specialist SnapLogic Inc.

These vendors weren’t just exhibiting, either. As a case in point, Informatica and Tableau teamed up at TCC 2013 to trumpet a new “strategic collaboration.” As part of this accord, Informatica promised to certify its PowerCenter Data Virtualization Edition and Informatica Data Services products for use with Tableau. In an on-site interview, Ash Parikh, senior director of emerging technologies with Informatica, anticipated MicroStrategy’s Spurway by arguing that organizations “need something more.” MicroStrategy’s “something more” is traditional BI reporting and analysis; Informatica’s and Tableau’s is visual analytic discovery.

“Traditional business intelligence alone does not cut it. You need something more. The business user is demanding faster access to information that he wants, but [this] information needs to be trustworthy,” Parikh argued. “This doesn’t mean people who have been doing traditional business intelligence have been doing something wrong; it’s just that they have to complement their existing approaches to business intelligence,” he continued, stressing that Tableau needs to complement — and, to some extent, accommodate — enterprise BI, too.

“From a Tableau customer perspective, Tableau is a leader in self-service business intelligence, but Tableau [the company] is very aware of the fact that if they want to become the standard within an enterprise, the reporting standard, they need to be a trusted source of information,” he said.

Among vendor exhibitors at TCC 2013, this term — “trusted information” or some variation — was a surprisingly common refrain. If Tableau wants to be taken seriously as an enterprisewide player, said Rich Dill, a solutions engineer with SnapLogic, it must be able to accommodate the diversity of enterprise applications, services, and information resources. More to the point, Dill maintained, it must do so in a way that comports with corporate governance and regulatory strictures.

“[Tableau is] starting to get into industries where audit trails are an issue. I’ve seen a lot of financial services and healthcare and insurance businesses here [i.e., at TCC] that have to comply with audit trails, auditability, and logging,” he said. In this context, Dill argued, “If you can’t justify in your document where that number came from, why should I believe it? The data you’re making these decisions on came from these sources, but are these sources trusted?”

Mark Budzinski, vice president and general manager with WhereScape, offered a similar — and, to be sure, similarly self-serving — assessment. Tableau, he argued, has “grown their business by appealing to the frustrated business user who’s hungry for data and analytics anyway they can get it,” he said, citing Tableau’s pioneering use of data blending, which he said “isn’t workable [as a basis for decision-making] across the enterprise. You’re blending data from all of these sources, and before you know it, the problem that the data’s not managed in the proper place starts to rear its ugly head.”

Budzinski’s and WhereScape’s pitch — like those of IBM and Teradata — had a traditional DM angle. “There’s no notion of historical data in these blends and there’s no consistency: you’re embedding business rules at the desktop, [but] who’s to say that this rule is the same as the [rule used by the] guy in the next unit. How do you ensure integrity of the data and [ensure that] the right decisions were made? The only way to do that is in some data warehouse-, data mart-[like] thing.”

Stephen Swoyer can be reached at stephen.swoyer@spinkle.net.