Troubleshooting Performance Issues


Overview

Performance is a critical part of dashboard acceptance. If a dashboard performs poorly, users will avoid it no matter how useful it is. Performance problems fall into a small set of categories:

  • Data querying
  • Network conditions
  • Rendering

Data querying speed is affected by the complexity of the queries and the underlying data source. Network conditions are affected by geographic distance and utilization. The rendering speed depends on the server when using the HTML viewer or image creation service, and on the client machine for the Silverlight viewer. Although all these topics are covered in this article, we focus primarily on bottlenecks caused by data querying since this is by far the most common issue.

Query performance goes hand in hand with data caching. Caching improves the response time tremendously, but it can also hide performance bottlenecks. To correctly analyze a query, it is important to know how caching works and how to turn it off. See the Caching Data article for more information.

Tip: Before beginning, be sure to read the performance best practices article. Many performance problems can be avoided by following these guidelines.

Performance Analysis

Dundas Dashboard has a built-in set of performance measurements which are visible when previewing or viewing a dashboard. The measurements provide the total time required by the dashboard to fully render on the client machine, as well as individual timings and counts for:

  • Data visualization controls rendering
  • Dundas KPI/Dataset definition retrieval
  • Data retrieval on the server
  • Number of controls
  • Number of KPIs and Datasets
  • Individual timings for each KPI/Dataset, as well as the number of data points returned

This allows you to isolate the slow-performing KPIs. Once these are known, analyze and optimize each one individually.

Tip: By default, the performance information dialog is hidden. See this article to enable it.

Interpreting the Performance Information Dialog

The figures in the performance information dialog help you determine where to look for performance gains. Look for the activities that consume the most time when your dashboard loads. The table below lists activities that may take a long time to run, their possible causes, and the next steps for resolving each problem.

Symptom: High value in "Running database query" for a KPI/dataset.
  • Problem: The underlying query is slow.
    Action: Analyze the KPI/dataset and all the virtual tables it and its dimensions use. Possible causes include poor use of indexing or inefficient joins (described in the next section).

Symptom: High value in "Transport time".
  • Problem: Large distance between machines.
    Action: Place the servers and client as close together as possible. If data is in a remote location, both sync database caching and in-memory caching will help.
  • Problem: Network bottleneck between machines.
    Action: Work with your IT team to find the slow link.
  • Problem: Unnecessary data is being returned for the virtual table (e.g. unaggregated data or unused columns).
    Action: Modify the virtual table so data is aggregated to the lowest level needed and only necessary data is returned (a sketch follows this table).

Symptom: High value in "Creating temporary tables" for a KPI/dataset.
  • Problem: Stored procedures that return a lot of data are slow to run.
    Action: Replace stored procedures with table-valued functions if possible (described in the next section).
  • Problem: Large date dimensions are constantly being recreated.
    Action: Configure the date dimension to use a permanent table for storage if possible.

Symptom: High values in the "Render data controls" task.
  • Problem: Dashboard scripts may be slow.
    Action: Examine scripts for inefficient algorithms and optimize.
  • Problem: Excessive data is being rendered.
    Action: Reduce the total amount of data being displayed by narrowing the range of values to select from or by aggregating data. Also see the performance best practices article for control-specific optimizations.

Symptom: High value in "Building data query" for a KPI/dataset.
  • Problem: Heavy use of virtual tables based on other virtual tables.
    Action: Create views in the data source instead of basing a virtual table on others, or use a manual virtual table.
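For the rows above about unnecessary or excessive data, pre-aggregating in the virtual table is often the simplest fix. A minimal T-SQL sketch, assuming a hypothetical dbo.Sales table with OrderDate and Amount columns, that returns one row per day instead of every transaction row:

    -- Hypothetical table and columns, for illustration only.
    -- Return one pre-aggregated row per day (the lowest level the
    -- dashboard needs) instead of every underlying transaction row.
    SELECT
        CAST(OrderDate AS date) AS OrderDay,
        SUM(Amount)             AS TotalAmount
    FROM dbo.Sales
    GROUP BY CAST(OrderDate AS date);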

Tip: Detailed definitions are available under: performance information definitions

KPI / Dataset Designer Analysis

The KPI/dataset should be opened in the designer and previewed. The query can be slow because the underlying virtual table is slow, or because the regular or time dimensions it uses add a lot to the query complexity.

The goal is to determine which of these two possible causes is the culprit. To do this, start by measuring how quickly the query resolves as you eliminate each potentially problematic dimension one at a time. Check out the object, demote the regular and time dimensions one by one, and preview after each operation. This simple analysis may indicate which base structures are leading to poor performance.

If a time dimension is the cause, one possible fix is in the Time Dimension Designer: define the time dimension for a smaller time period, thereby removing unnecessary time frames. Changing it from temporary tables to permanent tables, where possible, will also improve its performance.

If a regular dimension is causing the bottleneck, check its dependencies to find out which virtual tables it relies on.

Once the problem virtual tables have been identified, it is time to move to the next step.

Virtual Table Analysis

Open the Virtual Table Designer and preview the table. Make sure the Virtual Table does not return unnecessary information, and hide any unneeded columns to reduce the amount of data transferred between the processing layers. In many cases, selecting only the necessary columns may result in major timing improvements if the database engine determines that a covering index is sufficient for the query plan.
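As an illustration of the covering index point, suppose a virtual table only needs a product key, name, and list price from a hypothetical dbo.Product table. A nonclustered index that includes exactly those columns (a sketch with assumed names, not taken from this article) can satisfy the query from the index alone:

    -- Assumed table and column names, for illustration only.
    CREATE NONCLUSTERED INDEX IX_Product_Key_Covering
        ON dbo.Product (ProductKey)
        INCLUDE (ProductName, ListPrice);

    -- A query that selects only these columns can be answered entirely
    -- from the index, without touching the base table.
    SELECT ProductKey, ProductName, ListPrice
    FROM dbo.Product;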

In many cases it is better to have small, specialized Virtual Tables that each serve a single KPI/dataset object, rather than one large Virtual Table that serves many different objects.

Avoid unnecessary joins, especially if you later rely on a regular dimension to provide product names and descriptions, for example.
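For example, if a regular dimension already supplies product names, the fact virtual table only needs to return the product key. A sketch with assumed table and column names:

    -- Assumed names, for illustration only.
    -- Avoid: joining the product table just to return names the
    -- regular dimension will provide anyway.
    -- SELECT s.ProductKey, p.ProductName, s.Amount
    -- FROM dbo.Sales s
    -- JOIN dbo.Product p ON p.ProductKey = s.ProductKey;

    -- Prefer: return only the key and the measure.
    SELECT ProductKey, Amount
    FROM dbo.Sales;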

If the preview time cannot be further improved, there are two solutions to consider:

  1. Manual Virtual Table
    • The generated query can be manually overridden so that your DBA can tune it and include query hints.
  2. Data Connector Sync caching
    • Set up sync data caching with a proper schedule.

Tip: Using Manual Override on a Virtual Table gives you the opportunity to check the query that is being sent to the data store without the need for a profiling tool. This query can be copied, executed and analyzed directly on your database.
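For example, the copied query can be run in SQL Server Management Studio with standard statistics options enabled to see where its time is spent (T-SQL session settings; the query itself is whatever the manual override produced):

    -- Report CPU/elapsed time and logical reads for each statement in this session.
    SET STATISTICS TIME ON;
    SET STATISTICS IO ON;

    -- Paste and execute the query copied from the manual virtual table here.

    SET STATISTICS TIME OFF;
    SET STATISTICS IO OFF;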

Tip: If the Virtual Table is based on the Dundas Sync database, make sure proper indexes are defined. The lack of indexes in the Sync database is a major cause of poor performance.
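For example, if a virtual table built on the Sync database filters or joins its cached data on a date column, an index on that column can help considerably. A sketch with assumed table and column names (check the actual table names generated in your own Sync database):

    -- Assumed sync table and column names, for illustration only.
    CREATE NONCLUSTERED INDEX IX_SyncSalesCache_OrderDate
        ON dbo.SyncSalesCache (OrderDate)
        INCLUDE (ProductKey, Amount);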

Stored Procedures vs. Table-Valued Functions

In SQL it is not possible to join a stored procedure's result set with other tables. Instead, a temporary table must be created from the stored procedure's results; the temporary table can then be joined and finally dropped. Dashboard does this automatically for any Stored Procedure Virtual Table, and any joins are applied afterwards. This means a Stored Procedure Virtual Table has additional overhead compared to using tables, views, or table-valued functions.

To avoid this overhead, you may be able to rewrite your stored procedure as a table-valued function. Table-valued functions can be joined to other tables, so performance improves because temporary tables are no longer needed. This MSDN article describes how to determine whether a stored procedure is a good candidate for conversion into a table-valued function.
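As a minimal sketch, assuming a hypothetical stored procedure that simply selects filtered rows, the same logic can often be expressed as an inline table-valued function that the generated query can join directly:

    -- Hypothetical stored procedure: returns sales for a given year.
    CREATE PROCEDURE dbo.GetSalesByYear @Year int
    AS
    BEGIN
        SELECT ProductKey, OrderDate, Amount
        FROM dbo.Sales
        WHERE YEAR(OrderDate) = @Year;
    END;
    GO

    -- Equivalent inline table-valued function: joinable directly, so no
    -- temporary table is needed when Dashboard builds its query.
    CREATE FUNCTION dbo.GetSalesByYearTvf (@Year int)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT ProductKey, OrderDate, Amount
        FROM dbo.Sales
        WHERE YEAR(OrderDate) = @Year
    );
    GO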

Improve Data Retrieval Time

If your dashboard has multiple Data Visualization controls and Dashboard Data Retrieval (as seen in the Performance Information Dialog) takes a lot of time, follow the steps below to allow multiple queries to run at the same time:
  1. For each Data Visualization control whose underlying queries you want to run in parallel, set the Disable Loading Data property to True.
  2. On the dashboard's Load Interaction, enable data loading by script. For example:
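    // Re-enable data loading for each control; their underlying queries can now run at the same time.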
    DataGrid1.IsLoadingDataDisabled = false;
    BarChart1.IsLoadingDataDisabled = false;
    LineChart1.IsLoadingDataDisabled = false;


Tip: You can find more information on using the Disable Loading Data property here.
