Query data from Data Cloud in CRM Analytics

It seems Data Cloud is everywhere lately. Or maybe that’s just because I’ve transitioned into working with Data Cloud myself. So I thought it would be interesting to check out the live connector for Data Cloud in this blog.

Some of you may know that Data Cloud was previously called Genie and, before that, CDP (Customer Data Platform). Why is this relevant? Well, we will see some of this legacy naming when we work with the Data Cloud data.

How do you get access?

Setting up the connector is simple and doesn’t actually demand that you do any work within CRM Analytics, unlike the Snowflake Live connector. In fact, it’s similar to how the Salesforce Direct feature works: it’s just available. The difference is that you need the correct permissions, meaning access to Data Cloud and CRM Analytics with a license and a permission set for each.

Note: See which permissions to assign on the Salesforce Help pages.

Query data from Data Cloud

To query your Data Cloud data, simply go to a dashboard and click “Create Query” like you would when querying a dataset. Notice that the pop-up window has a tab called “Salesforce Genie”.

You can query data in three different ways:

  1. Data Lake Object (DLO) – data pushed to a DLO from a Data Stream (external data) or via a Data Transform.
  2. Data Model Object (DMO) – a mapped schema of your DLOs. Let’s say you have multiple Salesforce orgs connected to your Data Cloud: your standard objects would exist in several similar DLOs but be mapped and treated as one in your DMO.
  3. Calculated Insight – summarized data from your DMOs that you can set up and use in your segmentation.

I’ll continue my example with a DLO called transaction_mock. Once an input is selected, the familiar explorer pops up. Here you can select your measures and dimensions, apply filters, set a limit, and of course change the visualization. Below you can see a simple example.

If you want to go further than the UI can take you, you can always switch to the query editor and write your own SQL.

Once you have added a few queries, it could look something like below. Note that if you are using the same source data, faceting will apply.

However, if you introduce multiple sources, we need to use bindings. In the example above, you’ll see how I have a query from a second DLO in a list that allows the user to select an email. We want to make sure the selection is passed as a filter to the charts/queries below.

Before we look at the binding, know that the list showing emails also has a second grouping on loyalty number. This loyalty number is also present in the first data source, so we can pass it in our binding.

Note: For more detail on what a binding is and how to use them check out this overview blog.

As you may know, we need to create a binding for each of the two charts, but in this blog I’ll just show how to do it for one. So let’s look at the bar chart, open the advanced editor, and choose the Query tab.

Source query: this will be your query used in the list widget, which for me is Customer_1.

Source Data – Data Selection: this will be a column, as we can select multiple values.

Source Data – Column: this is the loyalty number.

Interaction: this is a selection binding.

Data Serialization Function: this is what controls the format of the output of the binding, and we want to make this .asObject().

The end result is {{column(Customer_1.selection, [\"loyalty_number__c\"]).asObject()}}.

You would have to add this to the filter section of your query. Note that we do need to define which column to apply the filter to, so I’ve added that to my query as well as my binding.
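For reference, the filter section of a compact-form query with this binding could look roughly like the sketch below. This is a sketch based on my example: the field name loyalty_id__c and the exact JSON shape are assumptions from my setup, so check your own dashboard JSON for the precise structure.

```json
"query": {
    "measures": [["sum", "amount__c"]],
    "groups": ["department__c"],
    "filters": [
        ["loyalty_id__c", "{{column(Customer_1.selection, [\"loyalty_number__c\"]).asObject()}}", "in"]
    ]
}
```

Notice how the filter entry names the column to filter on (loyalty_id__c) while the binding supplies the selected values.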

What if it’s not a compact form query?

In the example above we are using the UI to generate the query. But as I mentioned, you can also write your own SQL if the UI is not able to do what you want. In that case, the data serialization we choose needs to be different, as the format of the binding’s output is different.

The query we have been working on would look like this in SQL:

SELECT "department__c" AS "department__c", SUM("amount__c") AS "A" 
FROM "transaction_mock__dll" 
GROUP BY "department__c" 
ORDER BY "A" DESC 
LIMIT 250

The binding we will be using has the exact same data selection, but you’ll notice I am using .asSQLWhere() as the data serialization. When this SQL data serialization is used, we also need to add the SQL type and an optional boolean to indicate whether we want to include WHERE in the output.

Note: Read more about data serialization for SQL in this blog.

Because I do not have a WHERE clause already, I want to include one via my binding by setting the boolean to true. And since this is a Data Cloud query, I will use the SQL type cdp.
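To make the difference between the two serializations concrete, assume a user has selected a row whose loyalty number is 12345 (a made-up value for illustration). The rendered output would look roughly like this sketch:

```sql
-- .asObject() serializes the selection as an array of values, e.g.:
--   ["12345"]
-- .asSQLWhere("cdp", true), combined with toArray(), instead renders a full clause, e.g.:
WHERE "loyalty_id__c" IN ('12345')
```

This is why the asObject() version slots into the filters section of a compact-form query, while the asSQLWhere() version can be dropped straight into the SQL text.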

Since we need to define which column to filter by, we need to leverage the data manipulation function toArray() to construct the expected input for the data serialization. If you want to understand more about why and how this binding is constructed, check out this blog where I’ve written a deep dive on SQL bindings. The final result is what you can see below.

SELECT "department__c" AS "department__c", SUM("amount__c") AS "A" 
FROM "transaction_mock__dll" 
{{toArray(\"loyalty_id__c\", \"IN\", column(Customer_1.selection, [\"loyalty_number__c\"])).asSQLWhere(\"cdp\", true)}} 
ORDER BY "A" DESC 
LIMIT 250

Considerations

You can find tips and limitations in the Salesforce Help pages, and I recommend reading them before you get started. One thing I will highlight: working with live data always has an impact on dashboard performance, as these sources are not optimized for CRM Analytics the same way CRM Analytics datasets are. So make sure you only query the data you need, meaning apply filters, don’t include unnecessary queries, and, if you can, use calculated insights.
