But that’s not my preferred currency

As you may have figured out, Einstein Analytics does not support multi-currency out of the box. If you are an organization that has enabled multi-currency in Salesforce, I am sure you will want to see it in Einstein Analytics too. As I said, it’s not supported out of the box (at least not as of Summer ’19), but that shouldn’t stop us from having it in Einstein Analytics. Some content has already been created on the matter, including a Tech Lounge webinar from the brilliant Bharadwaj Kopparthi and a blog from Varun Mehta. However, I still decided to have a go at this topic and explain the approach in detail, so you know how to set this up all the way from Salesforce settings to dataflow and dashboard bindings.

The integration user

Let’s first discuss how the integration user works, as this is crucial to understanding what happens with our Salesforce currencies and fields. The integration user is the user that is used to bring data from Salesforce into Einstein Analytics. If the integration user does not have access to an object or a field, your dataflow will fail with an error when trying to pull it in. But there is another important thing to know about this user: its locale and currency settings are what Einstein Analytics uses. Hence, if your integration user has “British Pounds” as its currency, your currency in Einstein Analytics will be British Pounds too. For this solution, make sure the integration user has the same currency as the corporate currency; it will make your dataflow a little simpler.

One more thing to know: when the integration user brings over currency fields, they get converted to the integration user’s currency before the data reaches Einstein Analytics – in the example from before that would be British Pounds. The solution suggested in this blog calculates the different currencies in the dataflow. It’s important to note that some calculations can be slightly off if you do not bring in enough decimals, especially for records created in a different currency than the corporate currency. As you may know, decimals are controlled in the “Manage Currencies” settings of Salesforce. If you increase the decimals, the currency fields on objects will show the number of decimals you defined. As an example, I have 6 decimals on my Danish Kroner, which means that when I enter an amount in Danish Kroner it will show with 6 decimals; 1000 DKK becomes 1000.000000 DKK. Not ideal. Users most likely wouldn’t like to see that, but 0 or 2 decimals is not ideal either when we need to convert from GBP back to DKK in the dataflow.

The way around this is to create a formula field of type “Currency” on the object in question and simply have it mirror the currency field but with more decimals. Note that you want to keep the length of the field to a maximum of 18. In the example below, you can see how I created a new custom field on my Opportunity object called ‘Amount__c’. This field simply takes the value of the standard Amount field.
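If you have never set one up before, the field definition boils down to something like this – a minimal sketch based on my example, so adjust the label, API name and decimal places to your own org:

Field type:   Formula (return type: Currency, 6 decimal places)
Field label:  Amount (6 decimals)
API name:     Amount__c
Formula:      Amount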

Alright, with the assumption that you know how to create this formula field, let’s head to Einstein Analytics.

Dataflow calculations

Okay, let’s head to the Data Manager in Einstein Analytics and open the dataflow you want to use. I’ve created a “Playground – currency” dataflow that I’ll be using to illustrate the approach. And as mentioned above, remember to set the currency of your integration user to the same as your corporate currency.

The first thing we want to do is bring in our object. I’ll be using the opportunity object where I created my custom formula field. Click on the sfdcDigest transformation and select the object and the fields you want to bring in – as a minimum you need to have your currency field and the field ‘CurrencyIsoCode’, but as you can see below I am bringing in a few additional fields.

Next, we need to bring in the currency object that holds the details around conversion rates, which is called “CurrencyType”. Here we need to include the fields “IsoCode” and “ConversionRate”. Again click the sfdcDigest transformation and select the object and fields as mentioned.
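In the dataflow JSON, the two digest nodes end up looking something like the sketch below. The node names and the extra opportunity fields are just my choices; only ‘CurrencyIsoCode’ and ‘Amount__c’ on Opportunity plus ‘IsoCode’ and ‘ConversionRate’ on CurrencyType are needed for this approach.

"Extract_Opportunity": {
  "action": "sfdcDigest",
  "parameters": {
    "object": "Opportunity",
    "fields": [
      { "name": "Id" },
      { "name": "Name" },
      { "name": "StageName" },
      { "name": "CurrencyIsoCode" },
      { "name": "Amount__c" }
    ]
  }
},
"Extract_CurrencyType": {
  "action": "sfdcDigest",
  "parameters": {
    "object": "CurrencyType",
    "fields": [
      { "name": "IsoCode" },
      { "name": "ConversionRate" }
    ]
  }
}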

With the two objects in, it’s time to add the currency details to the opportunity. We will do this by creating an augment transformation where Opportunity is the left side and CurrencyType is the right side. The two objects are joined by the ISO code: the left key is “CurrencyIsoCode” and the right key is “IsoCode”. From the right side we just need to bring in the field ‘ConversionRate’. I’ve defined the relationship as ‘currency’.
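In the dataflow JSON that augment looks roughly like this, using the node names from the sketch above:

"Augment_Currency": {
  "action": "augment",
  "parameters": {
    "left": "Extract_Opportunity",
    "left_key": [ "CurrencyIsoCode" ],
    "right": "Extract_CurrencyType",
    "right_key": [ "IsoCode" ],
    "right_select": [ "ConversionRate" ],
    "relationship": "currency"
  }
}

Because the relationship is called ‘currency’, the joined field surfaces downstream as ‘currency.ConversionRate’.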

The conversion rate for the currency on our opportunities has now been added. The next step is to create fields containing the ISO code for each of the currencies you are going to use besides your corporate currency. In my example above I have three currencies: British Pound (GBP), Danish Kroner (DKK) and Swedish Krona (SEK), but let’s keep the example simple and just use GBP and DKK in the dataflow and dataset. Since GBP is my corporate currency, I can ignore that calculation as the integration user brings all the currency fields in as GBP. Let’s create a field for the DKK ISO code that we can use to bring in the DKK conversion rate. So click on the ‘computeExpression’ transformation and create a field for DKK. The source of this node will be the augment node we just created. I will call the field ‘IsoDKK’, choose the field type “Text” and in the SAQL expression we just want the static value “DKK”. After this, you can go ahead and save your node. Remember that if we were bringing in SEK as well, we would need to create a field for that too.
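As a JSON sketch, that computeExpression node could look like this (again, the node names are my own; note that the static text value uses double quotes in SAQL, escaped inside the JSON string):

"Compute_IsoDKK": {
  "action": "computeExpression",
  "parameters": {
    "source": "Augment_Currency",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "IsoDKK",
        "type": "Text",
        "saqlExpression": "\"DKK\""
      }
    ]
  }
}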

Using our new ‘IsoDKK’ field, we can bring in the currency rate from the CurrencyType object by applying another augment transformation. We, of course, need this to calculate the ‘Amount__c’ in different currencies – or in our case the DKK. Again we can ignore GBP because of the integration user. So let’s have a look at how to bring in the DKK conversion rate.

Click on the augment transformation and make the previous computeExpression node, where we created the ‘IsoDKK’ field, the left source. The left key will be the field ‘IsoDKK’. The right source will be the ‘CurrencyType’ object with the key ‘IsoCode’. I’ve called the relationship “DKK”. From the right side we only need to bring in “ConversionRate”. Once done, click “Save”.
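In JSON terms this second augment is almost a copy of the first one – a sketch, still using my made-up node names:

"Augment_DKK": {
  "action": "augment",
  "parameters": {
    "left": "Compute_IsoDKK",
    "left_key": [ "IsoDKK" ],
    "right": "Extract_CurrencyType",
    "right_key": [ "IsoCode" ],
    "right_select": [ "ConversionRate" ],
    "relationship": "DKK"
  }
}

With the relationship named ‘DKK’, the conversion rate comes through as ‘DKK.ConversionRate’, which is exactly the API name we will use in the calculation below.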

If you want to bring in more currencies you would need to repeat this step, but use this augment as the left source.

We now have all the fields we need in order to do our calculation, so we can create another computeExpression to calculate our DKK amount. Before creating the new transformation, I like to click on the augment node and select the “Output Fields” tab to find and copy the exact API name of our DKK conversion rate; in my case it’s called ‘DKK.ConversionRate’. Now go ahead and click on the computeExpression transformation, give it a name and make the previous augment the source node. We need to add two fields here. The first field we will call ‘DKK_Amount’, which will be of type Numeric. In the SAQL expression we need to add our calculation, which is the amount field multiplied by the DKK conversion rate; please see the calculation below.

'Amount__c' * 'DKK.ConversionRate'

Remember to add precision (16) and scale (2), as they are mandatory fields and your dataflow will fail if you try to run it without those defined.

We need to create a field for the GBP amount as well, even though ‘Amount__c’ already contains the desired value. Hence we do not need to make any calculation; we can simply put ‘Amount__c’ in the SAQL expression. I’ve called my field ‘GBP_Amount’ and have kept the other settings the same as in the ‘DKK_Amount’ field. It’s important to keep the naming pattern the same and only switch out the ISO code for the matching currency; we need this later when we create bindings in the dashboard.
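Putting the two fields together, the computeExpression node could look something like this sketch (node names are my own; precision and scale match the values mentioned above):

"Compute_Amounts": {
  "action": "computeExpression",
  "parameters": {
    "source": "Augment_DKK",
    "mergeWithSource": true,
    "computedFields": [
      {
        "name": "DKK_Amount",
        "type": "Numeric",
        "precision": 16,
        "scale": 2,
        "saqlExpression": "'Amount__c' * 'DKK.ConversionRate'"
      },
      {
        "name": "GBP_Amount",
        "type": "Numeric",
        "precision": 16,
        "scale": 2,
        "saqlExpression": "'Amount__c'"
      }
    ]
  }
}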

Once you have saved your new currency fields, you can go ahead and add the register node. I can recommend using a slice node just before the register node to drop the fields we are only using for our calculations. However, in this example I want to keep everything to validate what has happened, so I am going directly to the register node to register my dataset. All you really need to do here is choose the computeExpression from before as the source node and then give your node and dataset a name.
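For completeness, a sketch of the register node – I’ve named the dataset ‘CurrencyDS’, which is the alias the dashboard query loads later on:

"Register_CurrencyDS": {
  "action": "sfdcRegister",
  "parameters": {
    "source": "Compute_Amounts",
    "alias": "CurrencyDS",
    "name": "CurrencyDS"
  }
}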

Make sure to update and run your dataflow by clicking on the blue button in the top right corner. After this check that your dataflow ran correctly and go to Analytics Studio.

The dashboard work

My assumption is that you want the currency to be shown depending on the logged-in user’s preferred currency. So if my currency is set to DKK, I will see the amount in DKK, just like I would see it in GBP if that were my preferred currency. In order to achieve this, I will be leveraging a SOQL step. I’ve covered this use case in detail in another blog, so here I will only post the JSON for the SOQL step used in my dashboard.

Before I get to the SOQL step, let’s create a simple dashboard with a single chart showing opportunity amount by stage. When you create your step please pick either the ‘GBP_Amount’ or the ‘DKK_Amount’ field. I’ve also switched to a SAQL step to allow for my binding. You can see my query and dashboard below.

"query": "q = load "CurrencyDS";nq = group q by 'StageName';nq = foreach q generate 'StageName' as 'StageName', sum('DKK_Amount') as 'sum_DKK_Amount';nq = order q by 'StageName' asc;nq = limit q 2000;",

Alright, next I need to add my SOQL step to look at the logged-in user and grab their default currency, which on the User object is stored in the field ‘DefaultCurrencyIsoCode’. Switch to JSON (command or control + E) and add the following SOQL step. I like to add it just after my existing step; just remember the comma after the ending curly bracket of the SOQL step. Also note that you can, of course, add more fields to the query should you need them.

"UserData": {
"groups": [],
"numbers": [],
"query": "SELECT DefaultCurrencyIsoCode FROM User where ID = '!{user.Id}'",
"selectMode": "single",
"strings": [
"DefaultCurrencyIsoCode"
],
"type": "soql"
}

Next, we need to add a binding to the first query. Essentially, we want to swap the ‘DKK’ string in the API name of my measure for the default currency ISO code from the User object. If you want an in-depth explanation of how bindings work, please have a look at my binding series, specifically part 3 on group and measure bindings. The binding I will be using is a result binding with cell() as the data selection function and .asString() as the data serialization function, which you can see below:

{{cell(UserData.result, 0, "DefaultCurrencyIsoCode").asString()}}

In my query above I have replaced ‘DKK’ in my ‘foreach’ statement with my binding. If you are sorting by the currency field you would have to add your binding to that as well. My measure binding looks like this:

"query": "q = load "CurrencyDS";nq = group q by 'StageName';nq = foreach q generate 'StageName' as 'StageName', sum('{{cell(UserData.result, 0, "DefaultCurrencyIsoCode").asString()}}_Amount') as 'sum_{{cell(UserData.result, 0, "DefaultCurrencyIsoCode").asString()}}_Amount';nq = order q by 'StageName' asc;nq = limit q 2000;",

And that’s it. Well, if you have more charts in your dashboard, you would of course need to add a binding for each query that should use the user’s preferred currency. Note that if you want the user to select which currency to use instead, you can use a static step that contains the different ISO codes and do a selection binding for each chart that should leverage the different currencies; see the sketch below.
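A minimal sketch of that alternative, assuming the ‘staticflex’ step type and a made-up step name of ‘CurrencyToggle’:

"CurrencyToggle": {
  "type": "staticflex",
  "label": "Currency",
  "selectMode": "singlerequired",
  "values": [
    { "display": "British Pound", "value": "GBP" },
    { "display": "Danish Kroner", "value": "DKK" }
  ]
}

The measure binding would then read from the selection instead of the SOQL result, for example {{cell(CurrencyToggle.selection, 0, "value").asString()}}.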

As a final touch, you can modify the XMD to have a custom format and show the currency symbol before or after the number, making it explicit what currency is being used. I would highly encourage you to do this to make sure your users don’t make any wrong assumptions.
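As a rough sketch, and assuming you are comfortable editing the dataset XMD directly, the measures section could carry a label and a custom format per currency field. The exact format masks below are placeholders, so check the XMD documentation for the mask syntax before using them:

"measures": [
  {
    "field": "GBP_Amount",
    "label": "Amount (GBP)",
    "format": { "customFormat": "[\"£#,##0.00\",1]" }
  },
  {
    "field": "DKK_Amount",
    "label": "Amount (DKK)",
    "format": { "customFormat": "[\"kr#,##0.00\",1]" }
  }
]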

9 thoughts on “But that’s not my preferred currency”

  • Giuseppe Cardace on August 26, 2019

    Wow, you read my mind :-O
    I need to do this today as a PoC in my company 😉

    • Rikke on August 26, 2019

      Happy to help!

  • Madeline on November 11, 2019

    What about when the conversion rate is constantly changing? How do you keep the value at what it was in the past?

  • Surabhi Panwar on May 15, 2020

    Hi! I have a doubt: for some of our Swiss users the CurrencyIsoCode is CHF while for others it is EUR, and the DefaultCurrencyIsoCode is CHF for all. I am not sure about the reason behind this behavior. We have code that refers to the CurrencyIsoCode of users, and it creates a problem for the users who have EUR as their CurrencyIsoCode instead of CHF. Why is it so? We expect it to be CHF for all, but currently it is CHF only for some.

  • Stella on June 28, 2020

    Hi, when you have more than one currency to convert, you need to do an augment of the ISO code and the CurrencyType for EACH currency. An augment lets me combine two tables into one, but what about when I have 10 columns (of different currency rates) to add? How can you add all those columns to the original dataset? When I have 10 currencies to convert, is there an easier way to add the fields?

  • Bruno on April 30, 2021

    Hi Rikke!! Hope you’re doing really well.

    We’ll apply this solution to cover some business requirements. I came here to say thanks for the detailed solution, showing these complex steps in a way that is easy to understand!

    Thank you so much 🙂

  • Harshal on August 25, 2022

    Can we add a compute expression in a recipe?

    • Rikke on August 25, 2022

      Yes, you just add a transformation and use a formula. The syntax is a bit different.

  • Peter Martensen on November 11, 2022

    I don’t see anything in the formula that stops the Amount from converting when the Opportunity is Closed.
