
Tips & Tricks #2: In MicroStrategy Web, Report Execution Fails with Error ‘Results for this message cannot be retrieved from MicroStrategy Intelligence Server. This might be because the execution has failed. Please contact your administrator.’

I had this error the other day. Fortunately, I had just been reading about working set governor settings the other night to prepare for the CPA and MCE exams.

First, let’s discuss the issue.

Issue/Problem

In MicroStrategy Web, when I executed a report, I received the following error message (also, see screenshot below).

(Results for this message cannot be retrieved from MicroStrategy Intelligence Server. This might be because the execution has failed. Please contact your administrator.)

Results cannot be retrieved error message

In short, my report request could not be processed. The MicroStrategy KnowledgeBase essentially tells you to run the report again and, if it still throws this error, to contact your administrator.

If you look at the Web Log on the Web Server, you will see an error similar to the following:

<record reset="true">
 <package>com.microstrategy.webapi</package>
 <level>SEVERE</level>
 <miliseconds>1157577081154</miliseconds>
 <timestamp>09/06/2006 14:11:21:154</timestamp>
 <thread>154</thread>
 <class>CDSSXMLReportServer</class>
 <method>GetExecutionResultsEx</method>
 <message>(Results for this message cannot be retrieved from MicroStrategy Intelligence Server. This might be because the execution has failed. Please contact your administrator.) (com.microstrategy.webapi.MSTRWebAPIException)</message>
 <exception>com.microstrategy.webapi.MSTRWebAPIException: (Results for this message cannot be retrieved from MicroStrategy Intelligence Server. This might be because the execution has failed. Please contact your administrator.)
 at com.microstrategy.webapi.CDSSXMLReportServer.GetExecutionResultsCommon(Unknown Source)
 at com.microstrategy.webapi.CDSSXMLReportServer.GetExecutionResultsEx(Unknown Source)
 at com.microstrategy.web.objects.WebReportInstanceImpl.getExecutionResults(Unknown Source)
 at com.microstrategy.web.objects.WebReportInstanceImpl.pollForResults(Unknown Source)
 at com.microstrategy.web.objects.WebReportInstanceImpl.populateUserReportCache(Unknown Source)
 at com.microstrategy.web.objects.WebReportInstanceImpl.pollStatus(Unknown Source)
 at com.microstrategy.web.beans.ReportInstanceProxy.pollStatus(Unknown Source)
 at com.microstrategy.web.beans.ReportBeanImpl.localCollectData(Unknown Source)
 at com.microstrategy.web.beans.ReportBeanImpl.doCollectData(Unknown Source)
 at com.microstrategy.web.beans.AbstractWebBean.collectData(Unknown Source)
 at com.microstrategy.web.app.beans.AbstractAppComponent.collectData(Unknown Source)
 at com.microstrategy.web.app.beans.ReportFrameBeanImpl.collectData(Unknown Source)
 at com.microstrategy.web.app.beans.AbstractAppComponent.collectData(Unknown Source)
 at com.microstrategy.web.app.beans.PageComponentImpl.collectData(Unknown Source)
 at com.microstrategy.web.app.MSTRWebController.processRequest(Unknown Source)
 </exception>
 <userName>Administrator</userName>
 <clientID>172.19.19.2</clientID>
 </record>

The DSSErrors.log file on the MicroStrategy Intelligence Server contains the following errors:

[Kernel][Error] MsiWorkingSet::PersistMsg(): failed to attach RI to msg, error 0x80003F79
[Kernel][Error] CDSSServerMessage::put_OriginalRI: WSResultPool->AddRI for msg xxxx return error 0x80003F79
[Kernel][Error] CDSSServerMessage::put_OriginalRI: WSResultPool->AddRI for msg xxxx return error 0x80003F79
[Kernel][Error] CDSSServerMessage::GetReportInstance(): get_OriginalRI() return error 0x1

Now what?

Solution

The report being executed had a report cache size of roughly 40 MB, while the ‘Maximum RAM for Working Set cache’ setting was 102,400 KB (100 MB), as shown in the image below:

Governing Rules - Default - Working Set

The MicroStrategy Intelligence Server was unable to swap the 40 MB report instance into the Working Set. To resolve the issue, I increased ‘Maximum RAM for Working Set cache’ to a higher value, for example 512,000 KB (500 MB).
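
To see why the original setting was too tight, here is a minimal sketch (plain Python, not MicroStrategy code; the resident-instance sizes below are hypothetical) of the kind of headroom arithmetic involved:

# Minimal sketch, not MicroStrategy code: headroom arithmetic for the
# Working Set RAM pool. The resident-instance sizes are hypothetical.

def working_set_headroom_kb(max_ram_kb, resident_instances_kb):
    """RAM left in the pool after the report instances already held in memory."""
    return max_ram_kb - sum(resident_instances_kb)

max_ram_kb = 102_400                             # 'Maximum RAM for Working Set cache' = 100 MB
new_instance_kb = 40 * 1024                      # the ~40 MB report instance that failed
resident_instances_kb = [35 * 1024, 30 * 1024]   # assumed instances already in the pool

headroom_kb = working_set_headroom_kb(max_ram_kb, resident_instances_kb)
if new_instance_kb > headroom_kb:
    print(f"New instance ({new_instance_kb} KB) exceeds headroom ({headroom_kb} KB); "
          f"the server must swap instances out, and the request can fail if it cannot.")
    # Raising the governor (e.g. to 512_000 KB) restores headroom for the session.
else:
    print("Instance fits in the Working Set RAM pool.")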

What is the ‘Working Set’ Governor Setting?

When a user runs a report from MicroStrategy Web or Web Universal, the results from the report are added to what is called the working set for that user’s session and stored in memory on the Intelligence Server. The working set is a collection of messages that reference in-memory report instances. A message is added to the working set when a user executes a report or retrieves a message from his or her Inbox.

The purpose of the working set is to:

  1. Allow the efficient use of the web browser’s Back button.
  2. Improve web performance for manipulations.
  3. Allow users to manually add messages to the History List.
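
Conceptually, the working set behaves like a small per-session cache keyed by message ID. The sketch below (plain Python; a simplified model of the idea, not MicroStrategy internals) shows how holding messages that reference report results lets the Back button and manipulations reuse results instead of re-executing the report:

# Conceptual model only, not MicroStrategy internals: a per-session working
# set as an ordered map of message IDs to in-memory report results.
from collections import OrderedDict

class WorkingSet:
    def __init__(self, max_messages=10):          # cap is illustrative
        self.messages = OrderedDict()             # message ID -> report results
        self.max_messages = max_messages

    def add(self, message_id, report_results):
        """Called when a report is executed or a message is pulled from the Inbox."""
        self.messages[message_id] = report_results
        self.messages.move_to_end(message_id)
        if len(self.messages) > self.max_messages:
            self.messages.popitem(last=False)     # oldest message drops out (or is persisted)

    def get(self, message_id):
        """Back-button or manipulation path: reuse results if the message is still held."""
        return self.messages.get(message_id)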

This setting is accessible via the MicroStrategy Intelligence Server Configuration as shown below:

Governing Rules - Default - Working Set

Working Set Governors

The ‘Working Set file directory’ is the location in the file system where report instances may be persisted to disk. A report instance is persisted to disk in binary format if its size exceeds the limit set by the ‘Maximum RAM for Working Set cache’ governor, or if none of the report instances in memory can be swapped out to make room for the new report instance. The persisted report instance is written as <filename(GUID)>.po and may be reused if the report is invoked again.
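
If you want a quick sense of how much is being swapped to disk, you can tally the .po files in that directory. A minimal sketch (the path is a placeholder for whatever ‘Working Set file directory’ is configured on your Intelligence Server):

# Minimal sketch: tally persisted Working Set report instances (<GUID>.po files).
# The directory below is a placeholder; use the 'Working Set file directory'
# configured on your Intelligence Server.
from pathlib import Path

ws_dir = Path("/path/to/working_set_file_directory")  # placeholder

po_files = sorted(ws_dir.glob("*.po"), key=lambda p: p.stat().st_size, reverse=True)
total_kb = sum(p.stat().st_size for p in po_files) / 1024

print(f"{len(po_files)} persisted report instances, {total_kb:,.0f} KB on disk")
for p in po_files[:10]:
    print(f"  {p.name}: {p.stat().st_size / 1024:,.0f} KB")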

The ‘Maximum RAM for Working Set cache’ governor controls the size of the Working Set result pool. The maximum value is 2,147,483,647 MB; the minimum value depends on the version (200 MB in 9.3.1); and the default value is 200 MB. Note that if the value specified is greater than the machine’s memory, the server uses the default of 200 MB. The default value is usually sufficient, but if memory issues arise, as noted above, the setting can be increased. Increasing it means the MicroStrategy Intelligence Server may run with a higher average memory footprint over its lifecycle, so further tuning may be needed if memory usage becomes an issue.

Important Notes

  • There is no Working Set (WS) for a session created by the MicroStrategy Desktop client.
  • This is a MicroStrategy Intelligence Server configuration-level setting, so it applies to all projects and all users; it is not specific to a project. If the setting is changed, the MicroStrategy Intelligence Server may need to be restarted.
  • The MicroStrategy Working Set is not the same as the Microsoft Windows Operating System Working Set.

Has MicroStrategy Toppled Tableau as the Analytics King?


In a recent TDWI article titled Analysis: MicroStrategy’s Would-Be Analytics King, Stephen Swoyer, who is a technology writer based in Nashville, TN, stated that business intelligence (BI) stalwart MicroStrategy Inc. pulled off arguably the biggest coup at Teradata Corp.’s recent Partners User Group (Partners) conference, announcing a rebranded, reorganized, and — to some extent — revamped product line-up.

One particular announcement drew great interest: MicroStrategy’s free version of its discovery tool — Visual Insight — which it packages as part of a new standalone BI offering: MicroStrategy Analytics Desktop.

With Analytics Desktop, MicroStrategy takes dead aim at insurgent BI offerings from QlikTech Inc., Tibco Spotfire, and — most particularly — Tableau Software Inc.

MicroStrategy rebranded its products into three distinct groups: the MicroStrategy Analytics Platform (consisting of MicroStrategy Analytics Enterprise version 9.4, an updated version of its v9.3.1 BI suite); MicroStrategy Express (its cloud platform, available in both software- and platform-as-a-service subscription options); and MicroStrategy Analytics Desktop (a single-user BI discovery solution). MicroStrategy Analytics Enterprise takes a page from Tableau’s book via support for data blending, a technique that Tableau helped to popularize.

“We’re giving the business user the tools to join data in an ad hoc sort of environment, on the fly. That’s a big enhancement for us. The architectural work that we did to make that enhancement work resulted in some big performance improvements [in MicroStrategy Analytics Enterprise]: we improved our query performance for self-service analytics by 40 to 50 percent,” said Kevin Spurway, senior vice president of marketing with MicroStrategy.

Spurway — who, as an interesting aside, has a JD from Harvard Law School — said MicroStrategy implements data blending in much the same way that Tableau does: i.e., by doing it in-memory. Previous versions of MicroStrategy BI employed an interstitial in-memory layer, Spurway said; the performance improvements in MicroStrategy Analytics Enterprise result from shifting to an integrated in-memory design, he explained.

“It’s a function of just our in-memory [implementation]. Primarily it has to do with the way the architecture on our end works: we used to have kind of a middle in-memory layer that we’ve removed.”

Spurway described MicroStrategy Desktop Analytics as a kind of trump card: a standalone, desktop-oriented version of the MicroStrategy BI suite — anchored by its Visual Insight tool and designed to address the BI discovery use case. Desktop Analytics can extract data from any ODBC-compliant data source. Like Enterprise Analytics, it’s powered by an integrated in-memory engine.

In other words: a Tableau-killer.

“That [Visual Insight] product has been out there but has always been kind of locked up in our Enterprise product,” he said, acknowledging that MicroStrategy offered Visual Insight as part of its cloud stack, too. “You had to be a MicroStrategy customer who obviously has implemented the enterprise solution, or you could get it through Express, [which is] great for some people, but not everybody wants a cloud-based solution. With [MicroStrategy Desktop Analytics], you go to our website, download and install it, and you’re off and running — and we’ve made it completely free.”

The company’s strategy is that many users will, as Spurway put it, “need more.” He breaks the broader BI market into two distinct segments — with a distinct, Venn-diagram-like area of overlap.

“There’s a visual analytics market. It’s a hot market, which is primarily being driven by business-user demand. Then there’s the traditional business intelligence market, and that market has been there for 20 years. It’s not growing as quickly, and there’s some overlap between the two,” he explained.

“The BI market is IT-driven. For business users, they need speed, they need better ways to analyze their data than Excel provides; they don’t want impediments, they need quick time to value. The IT organization cares about … things … [such as] traditional reporting [and] information-driven applications. Those are apps that are traditionally delivered at large scale and they have to rely on data that’s trusted, that’s modeled.”

If or when users “need more,” they can “step up” to MicroStrategy’s on-premises (Enterprise Analytics) or cloud (Express) offerings, Spurway pointed out. “The IT organization has to support the business users, but they also need to support the operationalization of analytics,” he argued, citing the goal of embedding analytics into the business process. “That can mean a variety of things. It can mean a very simple report or dashboard that’s being delivered every day to a store manager in a Starbucks. They’re not going to need Visual Insight for something like that — they’re not going to need Tableau. They need something that’s simplified for everyday usage.”


Something More, Something Else

Many in the industry view self-service visual discovery as the culmination of traditional BI.

One popular narrative holds that QlikTech, Tableau, and Spotfire helped establish and popularize visual discovery as an (insurgent) alternative to traditional BI. Spurway sought to turn this view on its head, however: Visual discovery, he claimed, “is a starting point. It draws you in. The key thing that we bring to the table is the capability to bridge the gap between traditional model, single-version-of-the-truth business intelligence and fast, easy, self-service business analytics.”

In Spurway’s view, the usefulness or efficacy of BI technologies shouldn’t be plotted on a linear time-line, e.g., anchored by greenbar reports on the extreme left and culminating in visual discovery on the far right. Visual discovery doesn’t complete or supplant traditional BI, he argued, and it isn’t inconceivable that QlikTech, Tableau, and Spotfire — much like MicroStrategy and all of the other traditional BI powers that now offer visual discovery tools as part of their BI suite — might augment their products with BI-like accoutrements.

Instead of a culmination, Spurway sees a circle — or, better still, a Möbius strip: regardless of where you begin with BI, at some point — in a large enough organization — you’re going to traverse the circle or (as with a Möbius strip) come out the other side.

There might be something to this. From the perspective of the typical Tableau enthusiast, for example, the expo floor at last year’s Tableau Customer Conference (TCC), held just outside of Washington, D.C. in early September, probably offered a mix of the familiar, the new, and the plumb off-putting. For example, Tableau users tend to take a dim view of traditional BI, to say nothing of the data integration (DI) or middleware plumbing that’s associated with it: “Just let me work already!” is the familiar cry of the Tableau devotee. However, TCC 2013 played host to several old-guard exhibitors — including IBM Corp., Informatica Corp., SyncSort Inc., and Teradata Corp. — as well as upstart players such as WhereScape Inc. and REST connectivity specialist SnapLogic Inc.

These vendors weren’t just exhibiting, either. As a case in point, Informatica and Tableau teamed up at TCC 2013 to trumpet a new “strategic collaboration.” As part of this accord, Informatica promised to certify its PowerCenter Data Virtualization Edition and Informatica Data Services products for use with Tableau. In an on-site interview, Ash Parikh, senior director of emerging technologies with Informatica, anticipated MicroStrategy’s Spurway by arguing that organizations “need something more.” MicroStrategy’s “something more” is traditional BI reporting and analysis; Informatica’s and Tableau’s is visual analytic discovery.

“Traditional business intelligence alone does not cut it. You need something more. The business user is demanding faster access to information that he wants, but [this] information needs to be trustworthy,” Parikh argued. “This doesn’t mean people who have been doing traditional business intelligence have been doing something wrong; it’s just that they have to complement their existing approaches to business intelligence,” he continued, stressing that Tableau needs to complement — and, to some extent, accommodate — enterprise BI, too.

“From a Tableau customer perspective, Tableau is a leader in self-service business intelligence, but Tableau [the company] is very aware of the fact that if they want to become the standard within an enterprise, the reporting standard, they need to be a trusted source of information,” he said.

Among vendor exhibitors at TCC 2013, this term — “trusted information” or some variation — was a surprisingly common refrain. If Tableau wants to be taken seriously as an enterprisewide player, said Rich Dill, a solutions engineer with SnapLogic, it must be able to accommodate the diversity of enterprise applications, services, and information resources. More to the point, Dill maintained, it must do so in a way that comports with corporate governance and regulatory strictures.

“[Tableau is] starting to get into industries where audit trails are an issue. I’ve seen a lot of financial services and healthcare and insurance businesses here [i.e., at TCC] that have to comply with audit trails, auditability, and logging,” he said. In this context, Dill argued, “If you can’t justify in your document where that number came from, why should I believe it? The data you’re making these decisions on came from these sources, but are these sources trusted?”

Mark Budzinski, vice president and general manager with WhereScape, offered a similar — and, to be sure, similarly self-serving — assessment. Tableau, he argued, has “grown their business by appealing to the frustrated business user who’s hungry for data and analytics anyway they can get it,” he said, citing Tableau’s pioneering use of data blending, which he said “isn’t workable [as a basis for decision-making] across the enterprise. You’re blending data from all of these sources, and before you know it, the problem that the data’s not managed in the proper place starts to rear its ugly head.”

Budzinski’s and WhereScape’s pitch — like those of IBM and Teradata — had a traditional DM angle. “There’s no notion of historical data in these blends and there’s no consistency: you’re embedding business rules at the desktop, [but] who’s to say that this rule is the same as the [rule used by the] guy in the next unit. How do you ensure integrity of the data and [ensure that] the right decisions were made? The only way to do that is in some data warehouse-, data mart-[like] thing.”

Stephen Swoyer can be reached at stephen.swoyer@spinkle.net.