8 Tips to Increase Dashboard Performance

When building dashboards we tend to talk about data, metrics, and design, which I will say is key! But I am not afraid to admit that I have a couple of times started building a dashboard only to realize, when testing it with real data, that it needed a few adjustments to not only look great but also perform great. No one wants to wait a long time for a dashboard to load. So what do you do? Well, there are a number of things you can do to optimize performance, which I, of course, will cover in this blog post.

The basics first

Before we look at what you can do to improve performance, it’s important to understand how it all works. Every step in the dashboard is ultimately a query that is fired off to the query engine. That goes for your pipeline as well as your Account Executive filter. Yes, it may act as a filter, but it’s still a query that takes time to execute, and that selection is passed on to every other step that uses the same dataset. Let’s say your dashboard contains 30 steps, or queries; all of them have to be executed before the dashboard loads and you can view the results. That doesn’t mean your browser will execute them all at once, though; instead they are executed in batches.
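To make that concrete, here is roughly what a simple Account Executive list selector translates to behind the scenes; this is only a sketch in SAQL, and the dataset name "Opportunities" and the 'Owner.Name' dimension are my own assumptions:

```
-- Even a "filter" widget is a full query: load, group, project, order.
q = load "Opportunities";
q = group q by 'Owner.Name';
q = foreach q generate 'Owner.Name' as 'Owner.Name', count() as 'count';
q = order q by 'Owner.Name' asc;
```

Every widget on the dashboard adds a query like this to the batch, which is why the tips below focus on reducing how many queries run and how much data each one touches.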

Tools to use

The first thing I recommend using is, of course, the Inspector! The inspector is a relatively new tool that checks the performance of your dashboard as well as the individual steps. You find the Inspector by clicking the three dots when viewing your dashboard.

You can see how costly each step is and you can drill into each step to understand what is happening.

I particularly like the “View More Details” option, which I find a lot of people are not aware of. The cool thing is that you can see what was actually fired off to the engine, which is a brilliant feature when you are working with bindings.

And the final thing to highlight about the Inspector is that it can give you suggestions on how to improve your dashboards. Simply switch from “Steps” to “Performance” and click the “Run Performance Check”.


Finally, it’s worth mentioning that you can of course also use Chrome’s Developer Tools to see how fast the queries sent to the engine execute, with a further breakdown than the Inspector provides.

Tips to increase performance

With the basics covered let’s have a look at 8 simple things you can do to improve your dashboard performance.

Tip 1 – remove unused steps

It sounds obvious; however, it is a step in creating dashboards that is often forgotten: remove the unused steps. While working, you create steps that you later figure weren’t needed, or you may be testing something out. Before going live with the dashboard, make sure to remove those unused steps so the query engine doesn’t have to worry about them.

Tip 2 – reduce steps

It’s easy to create filters and selections as well as add charts to your dashboard, and sometimes you get carried away and add a lot. You may, in fact, end up with more than 50 steps that have to run before the dashboard loads. Well, first of all, that probably means your dashboard cannot be viewed on one screen without some scrolling. Second of all, you probably have a lot of filters that are rarely used. My question to you is: why not create a second dashboard or page to limit scrolling and increase performance? Also, if filters are rarely used, don’t include them; users still have the explore option for those rare occasions.

Tip 3 – use global filters

If your dataset is huge, then global filters can be the way to go. When dashboard steps run, they run on the full dataset; however, if there is a global filter, the dataset is filtered first and then each query is run. This means your queries run against a smaller dataset, which ultimately takes less time than running against the full dataset.

Tip 4 – avoid measure filters

All dimensions (anything you can group by) are indexed in Einstein Analytics, which means selections based on dimensions run fast. Measures are not indexed, so if you have range filters based on a measure, that can slow down the performance of your dashboard. If a measure filter is absolutely crucial to your dashboard, see if you can apply a flag or bucket in the dataset instead. Let’s say you want to be able to filter on big deals: in your dataflow, create a computeExpression node to mark your big deals – you can see an example of a computeExpression here. Once you have the flag or bucket, use that new dimension as a filter in your dashboard.
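As a rough sketch, a dataflow node that buckets deals by size could look something like this; the node name, source node, field names, and the 500,000 threshold are all assumptions you would adapt to your own dataflow:

```json
"Flag_Big_Deals": {
  "action": "computeExpression",
  "parameters": {
    "source": "Augment_Opportunities",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "DealSize",
        "type": "Text",
        "label": "Deal Size",
        "saqlExpression": "case when 'Amount' >= 500000 then \"Big Deal\" else \"Standard\" end"
      }
    ]
  }
}
```

Once the new dimension is in the dataset, a simple list selector on it replaces the slow measure range filter.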

Tip 5 – unnecessary calculations

SAQL and result bindings are powerful, but they can be costly as well. So while it may be tempting to use both, if your dashboard is slow and you can do your calculations in the data layer, then do it. As you saw in tip 4, the transformation options in the dataflow are powerful, and if you can make your calculations up front, your dashboard and the query engine won’t have to do so much heavy lifting at the point where the dashboard is viewed.
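For instance, a step that calculates deal age at query time might look like the sketch below (the dataset and field names are assumptions on my part). The same calculation could run once in the dataflow via a computeExpression, leaving the dashboard to simply group on the precomputed field:

```
q = load "Opportunities";
-- This date_diff runs on every dashboard load – a candidate for the data layer.
q = foreach q generate 'Name' as 'Name',
    date_diff("day", toDate('CreatedDate_sec_epoch'), now()) as 'DealAgeDays';
```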

Tip 6 – use pages

When pages were first introduced, I struggled to see the use cases. However, I have changed my mind. Not only can you create some pretty awesome dashboards with the use of pages (see the gif below), pages also have a positive effect on your dashboard performance. When you use pages, only the steps used on the viewed page are executed. So your dashboard may have 50 steps, but if only 15 are used on a page, only those 15 steps are executed; needless to say, the dashboard will load much faster.

Tip 7 – trim your dataset

Einstein Analytics allows you to have millions of rows of data. I think it’s obvious that the larger the dataset is, the more the dashboard performance is affected. So if you can reduce the size of your dataset, do it. Maybe you have data for each minute of the day, or maybe you have opportunity data from the last 10 years. The question is whether you really need that level of information. I would argue “no” and say that you are better off trimming that dataset by using a filter node or aggregating some of your data prior to registering your datasets.
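A dataflow filter node that trims away old opportunities could be sketched like this; the node name, source node, and the date fields in the saqlFilter are assumptions to adapt to your own data:

```json
"Filter_Recent_Opportunities": {
  "action": "filter",
  "parameters": {
    "source": "Extract_Opportunities",
    "saqlFilter": "date('CloseDate_Year', 'CloseDate_Month', 'CloseDate_Day') in [\"2 years ago\"..\"current day\"]"
  }
}
```

Everything downstream of this node, including the registered dataset and every dashboard query against it, then works with the smaller row count.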

Tip 8 – don’t be too detailed

While the devil often is in the detail, very detailed charts cannot be recommended, especially with large datasets, as this will often result in too many rows trying to be displayed. Let’s say you have a dataset that contains details for each minute of the day; if you have a chart that tries to display what happens minute by minute, your dashboard would most likely break because of the volume of rows. But let’s take a step back: why would you even display this level of detail? I would argue that the chart is not able to give you a quick overview if you show all your data by the minute. However, if you group by the hour, it already gives a better overview – unless it’s per hour for the last year.

Learn from your mistakes

I’ve certainly learned from my mistakes, especially when it comes to large volumes of data. As you build more and more, you will learn what to focus on. But what is key is to focus on performance during design and build, not after UAT or, worse, when it’s in production. What most of us face is that we don’t get data until the last minute, and if the data is vast, it is absolutely crucial that you have been deliberate in your design and your build, or else you will have performance issues.

If you want even more tips and insight on performance, I can really recommend the mini campfire video on Dashboard Performance with the brilliant Terry Wilson. Unfortunately, I can’t link to the specific video, but scroll to find “[WA] Campfire Mini-Series E19 Dashboard Performance [SP]”.
