Blog

What is Salesforce Experience Cloud and what can you do with it?

You might hear people talk about “Community Cloud”…
You might be wondering how a Digital Experience is different to Salesforce Experience Cloud…
Or you might be asking yourself where a partner portal fits into all of this…

Don’t worry, we’ve got you covered in this article!

What is Salesforce Experience Cloud?

Salesforce Experience Cloud is a set of functionality built on the Salesforce Customer Relationship Management (CRM) platform. Experience Cloud enables you to build beautifully-branded “digital experiences” that are connected to your CRM with a key focus on engaging customers and empowering partners. You can use Experience Cloud to create multiple sites and experiences within your Salesforce org to address different purposes.

We’ve already thrown in a few tricky words so why don’t we take a quick pause to define some key terms.

Quick glossary:

Salesforce CRM platform: More than a database, this powerful software is your single source of truth for managing your customer relationships. It’s for you and your business.

Salesforce Experience Cloud:  Formerly known as “Community Cloud”, it’s a set of functionality to deliver external-facing digital experiences (see below) that sit on top of your CRM. It’s for you, your business and your external stakeholders.

Salesforce Community Cloud: The old name for Salesforce Experience Cloud.

Digital Experience: The external-facing ‘site’ you can build with Salesforce Experience Cloud.

Site: Another name for a Digital Experience instance.

Partner portal: One example of a Digital Experience.

Help forum: Another example of a Digital Experience.

DXP: You might see Experience Cloud referred to by this acronym; it is Salesforce’s digital experience platform (DXP).

With those definitions in mind, here’s how Salesforce’s online learning platform, Trailhead, describes digital experiences and Experience Cloud:

“A digital experience built using Experience Cloud provides a window into your Salesforce world.”

 From Get Started with Experience Cloud

So hopefully we’ve cleared up some terms and their definitions. Now let’s investigate what this means in reality!

What can you build using Experience Cloud?

Salesforce Experience Cloud is a set of functionality built on the powerful Salesforce CRM platform. This means that when you’re using Experience Cloud to build a digital experience, the world is your oyster!

You can create a digital experience for any situation where you want to present a beautiful external-facing interaction with your Salesforce CRM.

Popular examples include:

  • Partner portals
  • Account portals
  • Storefronts
  • Microsites
  • Industry solutions
  • Customer service hubs
  • Help centres
  • Support sites
  • Mobile apps

💡 Top tip: Salesforce has created Lightning Bolts. These are pre-built industry-specific Digital Experience templates (amongst other things!) created to help you go to market quicker.

Why should you use Experience Cloud?

With Experience Cloud, you’re giving stakeholders what they need. Not only that, but you’re doing it in a way that’s completely integrated with your CRM. This means you’re maintaining and cultivating a single source of truth with an even bigger picture of your business, your customers and your partners.

We keep repeating that it’s built on the Salesforce platform… but that’s because it’s important! By building on the Salesforce platform, you’re building your partner portal or support forum on world-leading software that is scalable, secure, customizable and centralized.

A few more perks to consider at a glance:

  • Go to market fast with industry-specific customizable themes
  • Create multiple experiences for specific needs
  • Design for every device since all digital experiences are 100% mobile optimized and fully responsive

Experience Cloud and ProvenWorks in action

In case you can’t tell, we love Salesforce Experience Cloud. As a set of functionality, it adds tremendous value to your business by creating endless new opportunities for you to interact with your stakeholders.

However we’re not just fans of Experience Cloud; as a team of Salesforce experts who create data management solutions to save you time, we see the powerful potential of Digital Experiences.

We’ve created solutions that work seamlessly on Digital Experiences to:

  • Empower external users to provide verified addresses at the point of entry
  • Open up import jobs to partners and business users simply and securely

Reduce cart abandonment and improve user registration with fast address verification

Think user registration portals and ecommerce checkouts. Create powerful user experiences with our Address Verification Flow Component that verifies address data at the point of entry in 5 keystrokes or fewer.

Empower external users to import safely and simply via a digital experience

Think partner deal registrations or subsidiary sales information. Safely empower users to import data into your Salesforce through preconfigured mappings using our two-step drag and drop wizard – introducing the ManagedImport component.

Prepare your address data for the Custom Address Field Type in Salesforce

If you, like many of us, have been eagerly awaiting the custom address field type in Salesforce (it’s only been like 10 years or so?), then you’ll be pleased to hear that Salesforce has announced it is in Beta from the Summer ‘22 release! 

At ProvenWorks we’ve been fortunate enough to participate in the closed pilot since day one so we have been able to follow its progress. We’re now excited to be able to share with you some information so that you can be prepared for its release.

Isn’t the custom address field just like the standard address field type?

Sort of – however, be prepared that State & Country picklists are enforced for all new custom address fields in Salesforce, regardless of your existing org settings. This is largely why we’re writing this article.

State & Country picklists provide a neat solution for ensuring clean data at the point of entry, but admittedly, we’ll be the first people to warn you about the integration issues, customization headaches, and maintenance anxiety you may face when using Salesforce State & Country picklists.

…Nevertheless, we are address experts in Salesforce, so we’re going to embrace State & Country picklists head on and let you know how to prepare if you wish to migrate from custom text fields to the new address field type.

Important: If you’re steering clear of State & Country picklists and you wish to remain using text fields then keep doing what you’re doing! There is no need to change if it works for you. The rest of this article may still be helpful to understand how to standardize data stored in State and Country text fields.

Standardization is critical

Keeping org data clean is the driving force behind Salesforce’s decision to enforce State & Country picklists for the new field type. If you, like many others, have been excited to migrate away from five custom text fields to a single address compound field, then we’re going to have to standardize that data before the migration process.

The rest of this article will walk you through a fast and efficient way to standardize your existing data from within Salesforce using AddressTools Premium. We’ll cover two approaches:

  1. Create a standardization trigger leveraging AddressTools and run a “mass update” to execute the logic.
  2. Export a standardized list of address data ready for reimporting back into Salesforce.

Both approaches require the same initial steps for configuring AddressTools’ standardization functionality so we’ll start there and break out into the two options later.

The use case

The scenario we’ll follow looks at a custom object called “Warehouse”. The Warehouse object contains five custom text fields that, when put together, create an address. We will refer to these fields collectively as an “address block”.

The five custom fields are:

  • Street
  • City
  • State
  • Postal Code
  • Country 

The fields are populated from a number of different sources – web forms, integrations and user entries – so we cannot guarantee that the data is standardized. 

To prepare this data we’re going to expedite the process by using AddressTools Premium available on the AppExchange. As users (and developers) of the package we have heaps of experience and even some hidden tricks that’ll save days. It shouldn’t take more than a couple of hours from start to finish.

Installing the AddressTools Premium trial

If you’re not already using AddressTools Premium in your organization you’ll need to first install it from the AppExchange. You can do this in a sandbox if you want to test the functionality before pushing it to production.

Note: AddressTools Premium is a paid-for product that comes with a 14-day free trial. This could save you days of work so the cost may be something worth considering, especially if you have wider address requirements.

  • Go to the AddressTools Premium AppExchange listing.
  • Select Get It Now.
  • You may be prompted to log in if not already.
  • Select Install in Production or Install in Sandbox depending on your requirements. It is best practice to test in a sandbox before moving to production.
  • Agree to the terms and conditions.
  • Select Confirm and Install.
  • You may be prompted to log in again. If so, log into the org you want to install the package to.
  • Select Install for Admins Only.
  • Press Install.
  • Check Yes, grant access to these third-party web sites.
  • Select Continue.

Let the process install the package. AddressTools Premium has a lot of features, so it may take some time to install (you may receive a warning saying it’s taking a long time; this is normal). When the package has completed its installation you’ll receive a success email.

Once the package has been installed, navigate to the AddressTools App (via the App Launcher) and open the AddressTools Administration tab. You’ll immediately be on the Installation sub-tab.

  • Under the Installation tab, select Create Token.
  • A green tick will appear next to the first step.

Next we’ll want to install the AddressTools Premium dataset. This is a list of countries, states, alternative names, ISO codes, and heaps of other address-related data.

Warning: This dataset is large. Ensure you have enough storage available if you’re testing this in a sandbox. If your allocated storage is low, or you’re unsure, you can select Only install sample data. Beware that this won’t populate the alternative country and state values used to expand the acceptable standardization data for countries and states; these can be added manually later if you wish.

  • Under Data Installation, select Get Started.
  • Optionally choose Only install sample data.
  • Select Install.
  • A final warning will appear in relation to storage size. When you’re ready press Yes.

This may take some time and will preconfigure some functionality for your org. Feel free to continue reading this guide so that when you’re done you’ll be ready to rock.

  • Once the installation has finished, refresh the page to see that green tick.

Disabling the out-of-the-box functionality

A trigger is provided out of the box for the Account, Contact, Contract and Lead objects. If this is a fresh installation of AddressTools Premium in your org we’ll want to disable these triggers so that we don’t impact current business processes when we begin enabling functionality further down the line.

After the data installation:

  • Select Settings from the left navigation.
  • Scroll to Trigger Settings.
  • Disable each of the trigger settings in this section.
  • Select Save.

If the address fields you want to standardize exist on one of the four objects, the triggers can be re-enabled at a later point.

Configuring the address block

As mentioned earlier we’ll be referring to the five custom text fields as an “address block”. We need to configure AddressTools Premium with each of the text fields. This will allow the tool to execute standardization on the custom State & Country fields. 

  • On the AddressTools Administration page, select Address Blocks from the left navigation.
  • Use the Add button in the top right.
  • Select the object where your address block exists. We’re choosing Warehouse__c.
  • If you have record types enabled on the object, leave None chosen.
  • Select Next.
  • Under Postal Address Fields, select the relevant fields for each picklist:
    • Country
    • State
    • City
    • ZIP/Postal Code
    • Street

With the object and address fields now specified, it’s time to choose the settings we want to enable for the block.

Whilst still on the new address block modal:

  • Scroll down to Global Settings.
  • Check Standardize Country.
  • Check Standardize State.

Note: there are plenty of other settings here that may take your fancy. A tooltip is provided next to each giving you some insight into what’s available. You can come back to this page at any time should you wish to explore the other capabilities of AddressTools Premium.

  • We’ll complete this step by selecting Save.

Configure standardization values

Standardization is the process of converting multiple acceptable values to a single value. For example, let’s take a look at the country Egypt:

  • Full name – Egypt
  • ISO-2 – EG
  • ISO-3 – EGY
  • Local name (Latin characters) – Miṣr
  • Local name (Native characters) – مِصر

Each of the above values is a technically correct entry for Egypt, but a picklist won’t allow all of them to be entered. Using a text field to accept all the variations of a country name eases the burden on end users and integrations, and streamlines future expansions of your org. It’s then best practice to standardize the values to a single preferred format for analytical purposes after the data is inserted.
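To make that idea concrete, here’s a minimal sketch in plain Python (not Apex, and not the package’s actual implementation) of what standardization does: many acceptable inputs collapse to one canonical value. The alias table below is purely illustrative; AddressTools builds its equivalent from the Countries object and Alternative Country Name records.

```python
from typing import Optional

# Illustrative alias table: many acceptable spellings map to one
# canonical country name. AddressTools derives this from its dataset.
COUNTRY_ALIASES = {
    "egypt": "Egypt",
    "eg": "Egypt",            # ISO-2
    "egy": "Egypt",           # ISO-3
    "misr": "Egypt",          # local name (Latin characters)
    "united states": "United States",
    "us": "United States",
    "usa": "United States",
}

def standardize_country(raw: str) -> Optional[str]:
    """Return the canonical country name, or None if unrecognized."""
    return COUNTRY_ALIASES.get(raw.strip().lower())
```

In other words, whether a user types “EG”, “EGY” or “Egypt”, the stored value ends up in one preferred format.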

To identify the acceptable values for each country we’ll take a look at the Countries object installed with AddressTools. This is one of the objects that the data installation will have populated records for and is fundamental to the standardization functionality.

  • Select App Launcher.
  • Search and select Countries.
  • Select All from the available list views.
  • To help understand the data, select the United States country record from the list view.

Looking at the Country record, you can find dedicated fields for:

  • Full name
  • ISO-2
  • ISO-3
  • Local name (Latin characters)
  • Local name (Native characters)

The good news is that each of these fields’ values is automatically configured to be accepted in text fields configured with AddressTools Premium. When the AddressTools trigger functionality is enabled, the values will be standardized to a defined format on insert and update.

Let’s take a look at another example for acceptable data by talking about United Kingdom, or do I mean Great Britain, or England?… You get where I’m going…

State & Country picklists don’t support these variations as inputs, and they also don’t fit into the five dedicated fields on the Countries object. This is where we introduce Alternative Country Names.

  • Whilst still looking at your existing country record, select Related.
  • Select Alternative Country Names.

This list may be empty depending on the country you’re looking at or because you only installed the sample dataset. Don’t worry, you can add as many records here as you find necessary. 

To add a new Alternative Country Name:

  • Select New.
  • Write the value into the Alternative Country Name field.
  • Ensure the Original Country field is populated with the Country.
  • Is Obsolete: Unchecked.
  • Select Save.

And it’s that simple: you’ve now added an Alternative Country Name that AddressTools will be able to identify during the standardization process.

The State object is configured similarly. To access States navigate to the related list on the Country record. For example, navigate to the United States Country record, select Related, and here you’ll find a list of states belonging to the United States. 

Each State record has a:

  • Full name
  • ISO code

An Alternative State Name object is available where you can add a list of acceptable values. After all, we can’t seriously expect all our users to spell Mississippi correctly every time… So practically speaking if there are common misspellings or abbreviations you find in your org you can add them here to be standardized.

That covers configuring all of the acceptable values. Now we need to define the formats for the data to be standardized to.

Defining the standardized formats

Whilst we’re still looking at Country and State record data, we’ll configure the State format first. 

This is managed on the Country record and can be controlled on a per-country basis. 

  • Navigate back to a Country record (e.g. United States).
  • Enable or disable Use Subcountry Code in State field.

This option can be enabled or disabled to standardize the state value to either its full name or its ISO value (e.g. Texas vs TX).
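Conceptually, the toggle works like this hypothetical sketch in plain Python (the data and function here are illustrative only, not the package’s API):

```python
from typing import Optional

# Illustrative state table: acceptable inputs map to a
# (full name, ISO code) pair. AddressTools uses its State records.
STATES = {
    "texas": ("Texas", "TX"),
    "tx": ("Texas", "TX"),
}

def standardize_state(raw: str, use_subcountry_code: bool) -> Optional[str]:
    """Standardize to the ISO code when the 'Use Subcountry Code in
    State field' option is enabled, otherwise to the full name."""
    match = STATES.get(raw.strip().lower())
    if match is None:
        return None  # unrecognized value: left for manual review
    full_name, iso = match
    return iso if use_subcountry_code else full_name
```

So with the option enabled, “texas” standardizes to “TX”; with it disabled, “TX” standardizes to “Texas”.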

Lastly, we need to define the Country format. This is an org-wide setting and applies to all country values.

  • Go to the AddressTools Administration tab.
  • Navigate to Settings in the left navigation.
  • Use the pencil icon next to Standardization Enabled.
  • Check Standardization Enabled.
  • Edit the Country Standardization Format to match the desired format.
  • Select Save.

Note: These settings can be changed at a later date if you need to change your format. You’ll then need to run one of the following jobs to standardize the data to the new format.

It’s configured, now what?

We have two options to mass standardize the data:

  1. Enable a trigger on the object, run a mass update and have the trigger standardize all the data during the update.
  2. Invoke a job via the Developer Console to export a standardized list of data that can be manually reimported into Salesforce.

Choose the approach that best suits you. If you’re unsure what route to take we have instructions below walking you through both.

Option 1: Create a trigger on the object and run a mass update.

We’ve configured all the standardization settings so now we need to tell the object to follow them. As we’re working with a custom object in this example we’ll need to create a new trigger in the org to invoke the AddressTools functionality.

A trigger is provided out of the box for the Account, Contact, Contract and Lead objects. Follow the relevant steps to enable or create a trigger for the object where your address block exists.

If you’re working with either of the Account, Contact, Contract or Lead objects:

  • Navigate to the AddressTools Administration tab.
  • Select Settings from the left navigation.
  • Scroll to Trigger Settings.
  • Enable the trigger on the object you’re standardizing.
  • Select Save.

If you’re working with an object that isn’t Account, Contact, Contract or Lead:

  • Go to Setup.
  • Navigate to Object Manager.
  • Locate the Object you want to create the trigger for.
  • Select Triggers and New.
  • In the box, replace the existing code snippet with the following:
trigger ValidateOBJECTLABELCountryFields on OBJECTAPI (before insert, before update) {
    pw_ccpro.CountryValidator2.Validate(Trigger.new, Trigger.oldMap);  
}
  • Replace OBJECTLABEL with the label name of the object you’re creating the trigger for.
  • Replace OBJECTAPI with the API name of the object you’re creating the trigger for.
  • Select Save.

With the trigger enabled for the object, we need to turn on the standardization setting in the AddressTools Administration tab:

  • Navigate to the AddressTools Administration tab.
  • Select Settings from the left navigation.
  • Under Feature Enablement, check the box for Standardization Enabled.
  • Confirm that the Country Standardization Format is set as you desire.
  • Select Save.

Before we do a mass update we can test the standardization functionality on our address block. 

  • Navigate to a record where your address block exists.
  • Edit the record.
  • Change the country text value to a variation of the value currently present (e.g. if the country is United States, change it to USA or US).
  • Save the record.

The record will standardize to the format specified in the settings. (If you entered the desired format, the value won’t change on save as it’s already in the expected format. Try changing to another format to confirm the test).


Now that the test has been confirmed, we need to invoke the trigger on all existing records. This will involve running an update on every record in the object. There are many different ways that you can achieve this, so if you can already think of one then do what you know best.

If you need some guidance, we have a separate article on how to run a “mass touch” using Salesforce Flows. Check it out here.

Once the mass touch operation has successfully run, all State and Country values that matched the AddressTools dataset will now adhere to your defined standardization format.

There may be some leftover values and this will require some manual intervention. If you find a repeat offender you can add the value to the Alternative Country or State Name objects and re-run the process to catch them.

Option 2: Export the standardized data for importing later

Before we start, it makes sense to see the result of these instructions so let’s take a look at what our exported file will contain.

For every record on the configured object that can be standardized, the export file will contain:

  • Record ID
  • Current text field values (Old)
  • Standardized versions of the text field values (New). 

Note: The export will ignore records that are already in the desired format or that contain data that cannot be standardized (e.g. an unrecognized value).
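If it helps to picture the result, here’s a hedged sketch in plain Python of the shape of that export: one row per record that can be standardized, skipping records that are unrecognized or already in the target format. The column names and alias data are assumptions for illustration, not the exact file the package produces.

```python
import csv
import io

# Illustrative alias table; AddressTools uses its installed dataset.
ALIASES = {"usa": "United States", "united states": "United States"}

def build_export(records):
    """Build a CSV with Record ID, the current (Old) value, and the
    standardized (New) value, skipping rows with nothing to change."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Record ID", "Country (Old)", "Country (New)"])
    for rec_id, country in records:
        standardized = ALIASES.get(country.strip().lower())
        if standardized is None or standardized == country:
            continue  # unrecognized, or already in the desired format
        writer.writerow([rec_id, country, standardized])
    return buf.getvalue()
```

A record holding “USA” would appear in the file with “United States” in the New column, while records already standardized (or unrecognizable) are left out.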

To prepare AddressTools Premium to execute this export:

  • Navigate to the AddressTools Administration tab.
  • Select Settings from the left navigation.
  • Under Feature Enablement, check the box for Standardization Enabled.
  • Confirm that the Country Standardization Format is set as you desire.
  • Add your email address to the Batch Verification Alerts Email Address field.
  • Select Save.

This process will need permission to send an email to the email address configured in the previous section. You may need to change your org’s Email Deliverability settings to support this.

To check/change your Deliverability settings:

  • Go to Salesforce Setup.
  • Search for Deliverability in the left search.
  • Select Deliverability from the left navigation.
  • Make note of your existing Access level, you can revert the setting back to this once you’re done.
  • Change Access Level to All email.
  • Select Save.

Some of you reading this guide may not have worked with the Developer Console before, so follow closely and let’s execute some Apex!

Note: If this is your first time we recommend doing this in a sandbox so you don’t affect any production data.

  • Go to the cog in the top right of your Salesforce page.
  • Select Developer Console.

The Developer Console will open in a new window.

  • Select Debug | Open Execute Anonymous Window.
  • Under Enter Apex Code, enter the code below:
pw_ccpro.BatchValidateAndGenerateCSV M = new pw_ccpro.BatchValidateAndGenerateCSV('OBJECTAPI');
Database.executeBatch(M);
  • Change OBJECTAPI to the API Name of your Object. We’ll be typing ‘Warehouse__c’.
  • Select Execute.

This will now begin the standardization process. The length of time it will take to execute will vary depending on how much data you have in your org.

Once the job is complete you will receive an email with a .csv attachment containing all of the standardized data from the address block ready for importing either into the existing fields or ready to migrate into your State & Country picklists. 

And there you have it – your standardized file is waiting for you! When you’re ready to import this data back into Salesforce, use an importing tool* of your choice and be sure to update the records matching the Record ID found in column A.

Warning: Be vigilant when running mass update operations in a production environment. Where possible, back up your data first.

*Pssst if you’re looking for a new favorite importing solution, why not try out SimpleImport for this import job!

Summary

So there you have it, we’ve walked through how to standardize your existing data ready for the new custom address field type in Salesforce using AddressTools Premium.

If you have found this guide to be helpful, please ensure you share it with others so that they can learn how to standardize their address data stored in text fields. If it has saved you time then it may save them time too!

If you have any questions about AddressTools and any of its capabilities we’d love to hear from you. Get in contact with us at info@provenworks.com.

How to handle sanctioned states within Salesforce

As of February 2022, US companies must adhere to imposed restrictions on economic relations with the non-government controlled areas of the Donetsk and Luhansk oblasts in eastern Ukraine. AddressTools Premium customers can quickly and automatically identify Leads, Accounts and Opportunities in Salesforce that should be flagged in relation to the executive order.

The AddressTools ‘State’ object is pre-populated with Ukrainian province (oblast) values. The package ensures that all addresses in these regions use a standard state value for reporting purposes.

A custom checkbox field can be added to the State object to flag “Sanctioned Regions”. Using the package’s built-in trigger functionality, a formula field on records in these regions can then surface that flag, so any new record created in these regions is automatically flagged.
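Conceptually, the flag amounts to a set membership check, sketched here in plain Python (the region names below are illustrative examples only, not a complete or authoritative sanctions list):

```python
# Illustrative set of flagged (country, state) pairs. In AddressTools
# this lives as a checkbox on the relevant State records.
SANCTIONED_REGIONS = {
    ("Ukraine", "Donetska oblast"),
    ("Ukraine", "Luhanska oblast"),
}

def sanctioned_region(country: str, state: str) -> bool:
    """Mirror of the formula field: True when the record's
    country/state pair matches a flagged State record."""
    return (country, state) in SANCTIONED_REGIONS
```

Because the check keys off standardized state values, the standardization described earlier in this post is what makes this reporting reliable.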

Before getting started

This guide assumes that you are leveraging the Lookup Field Population functionality available in AddressTools Premium. To find out about the Lookup Field Population functionality please refer to the relevant section in the Installation Walkthrough.

These instructions will demonstrate displaying whether the Account Billing Address exists within a sanctioned region. The steps can be replicated across any address block configured with AddressTools.

When you’re ready to proceed, continue through the guide using the steps below.

Step by step guide

Create a Sanctioned Region checkbox on the State object

First create a checkbox field on the AddressTools State object and name it Sanctioned Region. This field will be used to flag regions under sanction.

Add this field to the State object’s page layout in order to maintain the value with ease.

With the field created, relevant regions can be set to TRUE by navigating to the State record via the parent Country in Salesforce. For example, navigate to Countries then select Ukraine. Finally click Related and find the impacted state in the list.

Create a formula field to display the Sanctioned Region value on an Account

To display the Sanctioned Region value on a record where the address exists, we’ll use a formula field to reference the associated State record.

Create a Formula field on the object where the address exists. We’ll choose Account for this example. When choosing the Formula Return Type, choose Checkbox and name the field Sanctioned Region.

Use the Advanced Formula tool to insert the custom Sanctioned Region field located on the related State object (this is where the Lookup Field Population functionality comes in).

Save the field and optionally add it to your page layout for quick referencing.

Save and test

When we save a new Account, the Billing State Lookup field populates on save using AddressTools’ Lookup Field Population functionality. When viewing the saved record the formula field will traverse the lookup and present the Sanctioned Region value on the Account.


Get in touch

If you have any questions or concerns about handling sanctioned states in your Salesforce org, please don’t hesitate to get in touch with us.

How to fire a trigger for existing records in Salesforce using Flows

So you’ve just deployed a new trigger or flow and you want to push your org’s existing records through it. This can often be referred to as a “Mass Touch” or “Mass Update”. There are many different ways that this can be achieved, some involving exporting data using third-party tools, but here’s one approach that can be done completely within Salesforce, and with no code!

Create a new “Mass Touch” field

We’ll start the process by adding a new field on the object we want to fire the trigger on. This field will be used to update the record without touching its existing data.

  • Go to Salesforce Setup.
  • Select Object Manager.
  • Locate the object you want to execute a mass touch for and select it.
  • Go to Fields & Relationships.
  • Select New.
  • Data Type: Number.
  • Press Next.
  • Field Label: Mass Touch
  • Length: 1
  • Decimal Places: 0
  • Field Name: MassTouch
  • Help Text: This field should only be used to invoke an update on the record and should not be populated by user entry.
  • Press Next.
  • Only provide visibility to your profile (or the profile of the user completing this process).
  • Press Next.
  • Uncheck Add Field to prevent the field from being added to the page layout.
  • Select Save.

Your object now has a new field that can be updated without impacting existing business data.

Creating a scheduled flow

We now need to create a flow that’s going to look at all the records on the object and update the Mass Touch field just created. We’re going to do this using a Scheduled Flow as this will allow us to set a time for the process to fire. If your object has a lot of existing records it may be preferable to run this process out of hours.

  • Go to Salesforce Setup.
  • Search for Flows in the left navigation and select Flows.
  • Select New Flow.
  • Choose Schedule-Trigger Flow and select Create.
  • Select Set Schedule.
  • Choose a Start Date and Start Time that work for your org.
  • Set Frequency as Once.
  • Select Done.
  • There is no requirement to specify an object in the start element so leave this empty.

The start element in the flow is now configured. Let’s follow that up and add an Update Records element.

  • Press the + icon to open the Add Element view.
  • Scroll down to Data and select Update Records.
  • Label: Update Records
  • API Name: UpdateRecords
  • Description: Set the records' Mass Touch field to a new value.
  • How to Find Records to Update and Set Their Values: Specify conditions to identify records, and set fields individually.
  • Under Update Record of This Object Type, search and select the Object you want to perform the mass touch on.
  • Under Filter Object Records, you can add conditions to only mass touch a selection of records, or have this set to “None” which will update all. We’ll use None for this example.
  • Under Set Field Values for the Object Records, choose the Mass Touch (MassTouch__c) field.
  • Value: 1. (If you’re running the mass touch on this object again in the future you can change this value to 2, or 3, and keep cycling the number for each mass touch you want to complete; it just needs to change from its previous value).
  • Select Done.

With the flow configured, we now need to save it and name it.

  • Select Save from the top right.
  • Provide a Name and Description for the flow so that you will be able to identify it in the future should you wish to run it again.

Validation rule management

Before running a mass touch job on the object we need to consider validation rules. Flows do not have an elegant way to bypass validation rules, so keep an eye out for any flow errors when the job runs. An email will be sent to your own user account’s email address containing the errors encountered during the flow, allowing you to manually correct the conflicting records.

If there are common offending validation rules, consider disabling them temporarily to complete this task, but remember to re-enable them when you’re done!

Run the flow

The flow is now fully configured and ready to be activated. Use the Activate button in the top right to enable the flow. The mass touch operation will begin when the scheduled time configured in the Start element is met.

If you don’t want to wait for the scheduled time, you can run the flow immediately by selecting Debug. Make sure that you uncheck “Run flow in rollback mode” to ensure that the mass touch’s changes persist after the flow has been executed.

Remember to check your email after the flow has run and review any errors, as the update can roll back multiple records even if only a single record fails.

Summary

We hope that you have found this article helpful for your organization. A final reminder: be vigilant when running mass update operations in a production environment, and where possible back up your data first.

If you’d like to find out more about what ProvenWorks do, check out our homepage.

When is a Salesforce Field no longer useful?

One of the many advantages of CRM systems is that it is very easy for business users to add additional fields. One of the main disadvantages of CRM systems is that it is very easy for business users to add fields without giving thought to the ‘bigger picture’.

In this blog post we will discuss our approach for determining when to remove fields from the system.

TL;DR: The basic principle is this: when a field is no longer trustworthy, is there any point in keeping it in your production system?

Can we trust this field?

A field is untrustworthy if you (and anyone else) cannot find a consistent process for populating or maintaining that field despite reasonable efforts to do so.

Let’s take an example of an Account field that says “Number of employees at this location” and some of the questions we’d ask to determine how much we trust it:

1. Who collects/populates it?

Is it the Sales Rep? Is it the marketing department? Is it from a third party data provider?
– Don’t know and can’t find out? Back it up and delete it!

2. When is it collected?

At time of record creation? When the Account becomes a customer? When there’s an opportunity?
– Nobody knows? Remove it.

3. When was it first ‘put into production’?

This is vital if we are doing any kind of historical analysis.
– If the field is only there for trend analysis and the data appears patchy then it’s useless.

4. When was it last updated?

What’s the worst case?
– No historical value? Lose it.

5. Do we actually understand what the field means?

Is it unambiguous?
– If no one can understand what the field means, it’s useless.
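Taken together, these five questions form a simple audit checklist. As a hypothetical sketch (the question keys below are our own invention, not a ProvenWorks tool), you could score a field like this:

```python
AUDIT_QUESTIONS = (
    "who_populates_it",
    "when_is_it_collected",
    "when_put_into_production",
    "when_last_updated",
    "what_it_means",
)

def field_is_trustworthy(answers):
    """A field passes the audit only if every question has a known answer."""
    return all(answers.get(q) not in (None, "", "unknown")
               for q in AUDIT_QUESTIONS)

field_is_trustworthy({"who_populates_it": "sales reps"})  # False: gaps remain
```

Any “don’t know” answer means the field fails the audit, which is exactly the back-it-up-and-delete-it test described above.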

Do we understand what the field means?

If you’re interested in a deep dive on field naming, see our previous post on field naming conventions.

In short, what we must do is look at the field name/label and determine if the question used to populate the field is at all ambiguous.

Even if the IT/IS/Sales Ops department understands the meaning of the field, do the people who are actually populating it? What about users who primarily speak different languages?

We work with a number of organizations and see lots of different field naming conventions. An example of an ambiguous field we saw recently was on an account and simply said “MSP Customer” (MSP meaning Managed Service Provider). Due to the client’s channel sales model, it was not clear whether this field meant the account was a Managed Service Provider itself, or whether the account was effectively an indirect customer because it purchased through a Managed Service Provider.

If we cannot understand what the field means, we cannot fulfil its purpose. Ultimately this not only wastes time but can also lead to inaccurate reporting, misinterpretation and misinformed business decisions.

What is the purpose of having a field anyway?

Let’s finish up by reminding ourselves of the basics. We believe there are two key purposes for having a field in a CRM:

  1. To mark a record as being at a certain stage in a process and/or
  2. To record data for later analysis (reporting).

If a given field is not adding any value to a current business process and it has no historical value for the reasons listed above, back it up and delete it before someone makes a business decision based on its content!

About ProvenWorks

We mean it when we say we’re Salesforce experts. We work exclusively in the Salesforce ecosystem and our products are built 100% for Salesforce.

Field naming convention for Salesforce and database tables

Joel Mansford is the Founder and Managing Director of ProvenWorks. With many years’ experience working within Sales & Marketing building databases, reports and customising CRM systems, he considers himself a techie at heart.

In this blog I’ll take you through a variation of a field naming convention for Salesforce that we use at ProvenWorks and recommend for use within Salesforce systems. To be clear we’re talking about the API name (or underlying table names) and not the labels that are exposed to users.

I want to stress that the most important thing is to have a field naming convention; what that convention actually is, is of secondary importance. You will know you have a good convention when you get the field/API name correct more than 90% of the time without knowing the table well, simply because you know how it would have been named from its purpose.

Name your fields

If you’re reading this then you’ve probably already determined that you need to think about the naming convention of your fields. On a new field creation, Salesforce first asks for the label. It then creates an API name based on this label. This means you can get:

Label=”Last 2 letters of Mother’s maiden Name”,

API name = “Last_2_Letters_of_Mother_s_maiden_Name__c”

Note that the spaces and apostrophe all become underscores, making the ‘s’ stand on its own and the API name overly long (those underscores aren’t helping us at all). Also, if the label later changes (e.g. “2” -> “two”), confusion ensues because developers using this ‘convention’ will assume that the API name matches the label.
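The auto-generation behaviour can be approximated in a few lines of Python. This is our approximation of what Salesforce does, not its exact algorithm:

```python
import re

def auto_api_name(label):
    """Approximate how Salesforce derives a custom field API name from a
    label: runs of non-alphanumeric characters become single underscores,
    and the custom-field suffix __c is appended."""
    return re.sub(r"[^A-Za-z0-9]+", "_", label).strip("_") + "__c"

auto_api_name("Last 2 letters of Mother's maiden Name")
# -> "Last_2_letters_of_Mother_s_maiden_Name__c"
```

Running it on the label above shows exactly why the generated name is such a mouthful.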

Naming conventions

Field names should use Upper Camel Case or Pascal casing. Here’s an excerpt from http://en.wikipedia.org/wiki/CamelCase:

“CamelCase (also spelled “camel case” and sometimes known as medial capitals[1]) is the practice of writing compound words or phrases in which the words are joined without spaces and are capitalized within the compound — as in LaBelle, BackColor, or iMac. The name comes from the uppercase “bumps” in the middle of the compound word, suggestive of the humps.”

Try to avoid using abbreviations unless they are very widely used, e.g. Id is fine instead of Identifier. Pro_Serve is not a good abbreviation of Professional Services: it isn’t consistent in the number of characters kept from each word, and “Serve” is not a recognized abbreviation of “Services”. We’ll get onto verbosity in naming later.

Acronyms are treated as if they were words, so SFDC Account Id would become SfdcAccountId. Whilst this isn’t intuitive, it is necessary to separate the words out correctly: if SFDC were treated as all capitals, many tools (like SSRS) would convert it to “S F D C Account Id”, which is ‘more’ wrong. In this example, “SalesforceAccountId” would be best.
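A small illustrative helper (the function name is ours) shows the convention in action:

```python
def to_pascal_case(phrase):
    """Join the words of a phrase in Upper Camel (Pascal) case, treating
    acronyms as ordinary words so SFDC becomes Sfdc, not SFDC."""
    return "".join(word.capitalize() for word in phrase.split())

to_pascal_case("SFDC Account Id")  # -> "SfdcAccountId"
to_pascal_case("created date")     # -> "CreatedDate"
```

Because every token is capitalized the same way, the result is predictable even when the input mixes acronyms and ordinary words.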

Grouping

The underscore (_) character can be used after a field prefix to logically group fields together, for example:

  • Software_Amount becomes Amount_Software
  • Maintenance_Amount becomes Amount_Maintenance
  • Training_Amount becomes Amount_Training

This grouping means that when fields are sorted alphabetically, logically-related fields appear together – very useful in implementations with more than 150 fields! Grouping is easily achieved by slight word re-arrangement: for example, although we would usually say “Date Created” out loud, here we name the field CreatedDate.

Grouping (underscore) isn’t absolutely necessary for field ‘pairs’ for example CreatedBy and CreatedDate or Contact.AccountId and Contact.AccountType (as the two fields will always be populated together). However, if you anticipate other fields being added to this theme then grouping can be beneficial and makes reviewing the column/field list much easier.
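The effect on alphabetical ordering is easy to demonstrate (field names below are illustrative):

```python
fields = ["Amount_Software", "CloseDate", "Amount_Maintenance",
          "Description", "Amount_Training"]

# Sorting alphabetically pulls the related Amount_* fields together
sorted(fields)
# -> ['Amount_Maintenance', 'Amount_Software', 'Amount_Training',
#     'CloseDate', 'Description']
```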

Choice of words & verbosity

This blog only outlines the conventions for naming. The actual choice of words is best proposed by one individual and then reviewed by others, to ensure that the words reflect the purpose of the field without extra clarification. It is very rare that an individual gets the field naming right alone. Bounce the ideas around, and finally make sure the actual text is reviewed for typos(!).

However great the temptation to “quickly create a field”, you should always resist. Spending a few minutes carefully considering the naming will, at a minimum, save you time later and, more importantly, prevent a field being misinterpreted and wrong business decisions being made as a result.

Remember it is better to be unambiguous and have a long field name than a short field name that is open to interpretation. If a developer complains that it takes too long for them to type a long field name then I’d suggest they’re in the wrong profession if they struggle typing a dozen extra characters.

About ProvenWorks

We mean it when we say we’re Salesforce experts. We work exclusively in the Salesforce ecosystem and our products are built 100% for Salesforce.

Data Cleaning 101: 4 common data problems solved

This is the third instalment of our Data Cleaning 101 mini-series. So far we’ve explored why clean data is essential in Salesforce, and introduced 5 steps to start you on your journey to cleaner data. Now it’s time to dive into the details! We’re going to examine the different problems you probably have with your data and take a closer look at some of our favorite ways to tackle them!

The problems

  1. users entering bad data
  2. unstandardized data and messy reports
  3. duplicates and more duplicates
  4. inaccurate old data

1. Users entering bad data 

It’s great if you prioritize clean data, but if your users can still input bad data with no barriers, we’re back to square one. 

For this problem, we’re looking for solutions that will clean data at the point of entry.

  • Use validation rules

Validate fields to meet set criteria before the data gets saved to your database. Design them carefully and with the end-user experience in mind.

  • Use alternative data types

Do you need to have everything as a text field? Could picklists, radio buttons, or checkboxes be used instead for the data you’re storing?
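As a sketch of the point-of-entry idea, here is what such a check might look like in Python, with made-up field names and criteria (in Salesforce itself this would be a validation rule or a picklist, not code):

```python
ALLOWED_RATINGS = {"Hot", "Warm", "Cold"}  # a picklist instead of free text

def validate_entry(record):
    """Return a list of error messages; an empty list means the entry is clean."""
    errors = []
    if "@" not in record.get("Email", ""):
        errors.append("Email must contain an @")
    if record.get("Rating") not in ALLOWED_RATINGS:
        errors.append("Rating must be one of the picklist values")
    return errors

validate_entry({"Email": "pat@example.com", "Rating": "Hot"})  # -> []
```

The point is that bad data is rejected before it is saved, so nobody has to clean it up later.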

If you’re looking at AppExchange solutions, use our five step guide from Part 2 to assess whether the solution resolves your specific problem.

💡 It’s important to consider everybody across your Salesforce organization and make changes that also consider the end-user experience.

2. Unstandardized data and messy reports

As basic as it sounds, ensuring that your data is uniform and standardized will save you time on all kinds of tasks.

For example, if you create a report to find accounts based in Mississippi, imagine if all your records show Mississippi instead of a mixture of MS, Mississippi and its various misspellings. No more bloated field filtering, just clean data and uniform reports!
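A toy standardization pass might look like this in Python (the synonym table is illustrative; real tools use full reference data):

```python
STATE_SYNONYMS = {
    "ms": "Mississippi",
    "mississippi": "Mississippi",
    "missisippi": "Mississippi",  # a common misspelling
}

def standardize_state(value):
    """Map known variants and misspellings onto one canonical spelling."""
    return STATE_SYNONYMS.get(value.strip().lower(), value)

[standardize_state(v) for v in ["MS", "missisippi", "Texas"]]
# -> ['Mississippi', 'Mississippi', 'Texas']
```

Once every record carries the canonical value, a single report filter finds them all.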

For this problem, we’re focusing on ways we can better organize our data.

  • Use international standards

Use internationally-recognized standards to segment your data in the best way possible. They’re designed to be consistent and globally understood!

  • Run standardization tasks

Consider exporting data to standardize it with your favorite tools then reimport it back into your organization, or look into ways you can automate it within Salesforce.

  • Apps like IndustryComplete* will streamline industry categorization, SimpleImport* can speed up those importing tasks, and AddressTools* will start standardizing your address data immediately! (There are free versions available too for some of our tools* to get you up and running for no cost!)

3. Duplicates and more duplicates

If your role involves importing and entering data, duplicates are a constant battle. Dealing with duplicate information or “dupes” is a huge black hole of time for admins and, fundamentally, for your business too.

Find a solution that stops or at least clearly alerts you to duplicated data. Have you implemented Duplicate Rules in Salesforce? Or is there a budget for a dedicated duplicate check tool? You have some great options out of the box and also on the AppExchange.

  • If you’re importing data, Excel and Google Sheets have some great duplicate handling tools available. Try them out before importing your data into Salesforce.

  • Duplicate rules in Salesforce are not to be sniffed at! They are a great free feature if you’re on Professional, Enterprise, Performance, Unlimited or Developer Edition.

  • And of course, we love SimpleImport Premium’s* multi-field matching to help identify existing records more easily, preventing duplicates from being inserted and saved.
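The multi-field matching idea can be sketched in a few lines of Python: build a normalized key from several fields and flag any record whose key has been seen before (field names here are illustrative, and real tools use far smarter matching):

```python
def find_duplicates(records, keys=("Email", "LastName")):
    """Return records whose normalized match key duplicates an earlier record."""
    seen = set()
    dupes = []
    for record in records:
        match_key = tuple(str(record.get(k, "")).strip().lower() for k in keys)
        if match_key in seen:
            dupes.append(record)
        else:
            seen.add(match_key)
    return dupes

contacts = [
    {"Email": "pat@example.com", "LastName": "Smith"},
    {"Email": "PAT@example.com ", "LastName": "smith"},  # same person, messy
    {"Email": "lee@example.com", "LastName": "Jones"},
]
find_duplicates(contacts)  # flags the second record
```

Matching on several fields at once is what keeps false positives down: two people can share a surname, but rarely a surname and an email.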

4. Inaccurate old data

Up to 40% of user productivity is lost to incomplete, inconsistent data. Sorting bad data from good wastes 50% of users’ time. These stats alone show how existing bad data is bad for business.

Removing or validating existing data is essential for your users to remain focused and productive. If some data is now redundant, archive or delete it from your system altogether, but if it’s still in use, correct it by updating and validating it. There are plenty of applications that not only draw your attention to the incorrect data, but give you the option to update, change or delete it.

  • Schedule regular spring cleans of your org – regularly diving into your data is the best way to ensure your data is up to date, and to spot any black holes of outdated information! Checking through your data every two weeks works best, as our Operations Director, Beth, recommends in our 5 step guide (Part 2).
  • Tools that have access to international databases can be an invaluable addition to your org, i.e. looking up address data against postal authorities.
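A regular spring clean often starts by finding records nobody has touched in a while. Here’s a minimal Python sketch of that check, assuming each record carries a last-modified date (the cutoff of one year is an arbitrary example):

```python
from datetime import date, timedelta

def stale_records(records, max_age_days=365, today=None):
    """Return records whose LastModifiedDate is older than the cutoff."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [r for r in records if r["LastModifiedDate"] < cutoff]

records = [
    {"Id": "001A", "LastModifiedDate": date(2020, 1, 15)},
    {"Id": "001B", "LastModifiedDate": date(2023, 6, 1)},
]
stale_records(records, max_age_days=365, today=date(2023, 9, 1))
# -> only the 2020 record
```

A list like this is a good starting point for deciding what to archive, correct, or delete.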

Summary

Now you have the tools and wisdom to win the war against bad data. Let’s recap the 4 main data problems we’ve tackled:

  • users entering bad data
  • unstandardized and messy data
  • the dreaded duplicates
  • inaccurate old data 

Hopefully, with the facts and resources presented in this final instalment of our Data Cleaning 101 mini-series, you can take full control of your org and clean up your bad data!


Data Cleaning 101: a round up

So now we’re at the end of our little crusade against unclean data. Before you go, let’s run through what we’ve added to your data cleaning arsenal.

Data Cleaning 101 has provided you with:

  • a reality check about maintaining clean data in your CRM (blog one),
  • 5 tips to remember before purchasing a solution to figure out what you want to get out of your data (blog two),
  • 4 specific data problems you might have and how to solve them.

We’d love to hear your stories about bad data and the applications that saved it. Are there any tips you think we should know about? Tell us!

Or, if you’ve simply enjoyed learning about data and data cleaning, let us know; we’d love to hear from you.

I want to start cleaning up my data!

Got issues with your data in your Salesforce org that you’re ready to tackle? Get in touch and see if one (or more) of our solutions can help clean the dirty data stinking up your CRM. 

Data Cleaning 101: 5 steps to start cleaning your data

If you’ve been following our series so far, you will have learned all about why you should care about data cleansing. Did you know that bad data can cost companies up to a quarter of their revenue? Cleaning bad data is paramount to good data practice. For more reality checks, catch up on the first instalment: why should you care about data cleansing?

Cleaning bad data can be tricky

This article presents 5 essential steps for you to better understand your org, your data, and how to start the data cleansing process. By following these steps, you can go forward and make well-informed decisions for your company. You may even discover some quick wins along the way.

Follow through these 5 steps to begin your journey to cleaner data.

1. Ask yourself: why do I need this data?

It may sound broad, but knowing what you want to get from your data is vital. So take a step back and look at your end goal. Are you collecting all of the information you need? Or maybe you’re collecting too much? After all, there’s no point collecting data if you’re never going to use it. One less field you’re capturing is one less field you need to maintain!

Start by creating and establishing a plan outlining what clean data means for you and your org. You can’t do anything until you understand your org and users. 

For some prompts, start here:

  • What data do I need to achieve my business goals?
  • Am I capturing everything I need to succeed?
  • Am I collecting some data for the sake of collecting data?

2. Examine where your data is coming from

It’s time to identify all the different ways that data is entering your org. It’s likely that when you dig deeper, there will be more entry points than you think!

A few entry points to consider:

  • service agents
  • sales reps
  • partner users
  • web-to-lead forms
  • integrations

Once you’ve made your list and found the holes, you can do something about them!

3. Plug the holes to clean the data

I saw a great analogy on Reddit:

“I think of improving data quality as a sinking boat. If you are sinking, you need to plug the holes first (sources of bad data) and then start bailing out the water (getting rid of the bad data) second.”

(lziemke)

Simply put, if data enters your org clean, this saves you from battling with your records later on. This should be the initial focus. Thankfully there are lots of useful ways to make this a reality with minimal effort that don’t impede the end user experience.

Where possible, think of picklists, validation rules, or even consider managed packages available from the AppExchange.

4. Try the AppExchange!

Solutions on the AppExchange can be a great cost-effective, and time-efficient way to help you combat unclean data. Managed package providers are often experts in their field, so finding a ready-to-go data management solution will help you clean your data in no time!

So how do you choose the right app for you? This isn’t an exhaustive list, but here is a data-cleansing-focused checklist of questions to ask managed package providers:

  1. Does the solution automatically resolve data quality issues?
  2. Does the solution require changes to the user experience?
  3. Does the solution clean all my data entry points?
  4. Does the solution clean up pre-existing data?
  5. How long does it take to implement the solution?

After all, don’t reinvent the wheel when it’s already turning – if you have an issue, it’s likely that hundreds of others do too!

5. Start building good data habits for cleaner data

So you’ve taken a good look at your business, identified where your data is coming from, and even potentially put some actions in place to begin cleaning up your org. Now it’s time to maintain this good practice as your org grows and share your wisdom.

Here are a few ways you can build good data habits in your organization:

  • Get users involved and trained in what to look out for
  • Identify entry points that are inputting bad data
  • Schedule regular manual checks for your data 

Operations Director, Beth, is our resident data cleaner and a strong believer in the power of good data practice:

“There is nothing more satisfying than having a good old spring clean in Salesforce! It’s great to identify trends and get to the root cause of how bad data is able to get in. Popping in a scheduled time every two weeks is the way I’ve found works best. Don’t let it build up and be a task you keep on putting off!”

Beth Clements, ProvenWorks

We live in a ‘need it done yesterday’ society so safeguarding your org against the perils of bad data is a sure way to give you back your time and let you focus on the things that matter.

Remember: the best day to start cleaning your data was yesterday. The second best day is today.

Life’s better after cleaning bad data from your org.

Stay tuned for the final instalment to our data blog series to learn about some of our favourite cleaning apps.

See you back here soon for the final instalment of the series!

Data Cleaning 101: Why should you care about data cleansing?

Unclean data stinks up your system. It can travel through your business, slowing you down, causing confusion and costing you money. Don’t believe us? Between 2019 and 2021, over 40% of sales reps did not have enough information about leads and accounts to make effective sales, and there are plenty more stats around data cleansing coming up.

This is the first in our three-part series taking you through the importance of data cleansing and some top tips for practising clean data habits… so make sure to stick around!

An introduction to data cleansing

Bad data… Does it really matter? How do I clean it? And what can I do about it in the future?

Don’t worry, we have you covered with the ‘whats’, ‘whys’ and ‘hows’ of dirty data throughout this series. First off, a few definitions…

Clean data: Data that is without error and in its entirety so it can be used effectively by everyone.

Data cleansing: The process of identifying incomplete, incorrect, inaccurate or irrelevant data and modifying, validating, deleting or replacing it.

So how does unclean data actually affect me?

Over the past 6 years Salesforce and Salesforce bloggers have noted the persistent and holistic impact that bad data has on companies of all sizes. Let’s take a look at some numbers:

As we’ve already discussed, between 2019 and 2021, over 40% of sales reps did not have enough information about leads and accounts to make effective sales, and therefore struggled to achieve monthly and annual sales targets. This inevitably affects generated revenue.

Why is data cleansing important?

  1. Ensure consistency, validity and confidence in your data by cleansing it regularly. This foundation can yield better results and help you reach goals quicker.
  2. Reduce the time employees spend sorting through bad data and let users be more productive on the tasks that matter – like growing your business or organization!
  3. Eradicate data privacy worries for customers and clients. It’s outright good practice; you wouldn’t want your private mail sent to an incorrect address!
  4. Protect your company’s reputation. Saving bad or incomplete information affects all areas of your company, from sales and marketing to senior management, stopping everyone from carrying out their job well and making beneficial decisions, which directly affects the reputation of your business and its revenue.

The good news

Not all is lost! The University of Texas has estimated that if your data were even 10% more accurate, your revenue would be considerably boosted, not only for bigger enterprises but also for B2B and B2C firms.

So now you know what data cleansing is, how do you practice it?

Watch out for the next post in Data Cleaning 101 for our five best tips for cleansing your Salesforce org. You don’t want to miss that one!

Want to learn more about the impact of poor data?

Check out our blog: What is the impact of poor quality data? 

What is the impact of poor quality data?

Data is an incredibly important asset to any business. Bad quality data is not just a time sink to remedy, it can also cause active damage to an organization. Starting with good quality data is key, especially in Salesforce where qualifying leads, assigning key accounts, and ensuring cases are routed to the correct queues are paramount.

What are the consequences of bad data?

  • Damage to relationships with clients and partners
  • Lowered credibility
  • Poor productivity
  • Poor decision making
  • Lost revenue

How can I get accurate data in my Salesforce Organization?

Salesforce provides various ways to keep data accurate such as validation rules, required fields, workflow rules, and duplicate management rules. We believe that good data is the basis of success. This means correct data at the point of entry, and where that’s not possible, it means having the ability to highlight and intelligently cleanse poor data.

We offer a range of solutions for the Salesforce CRM platform to promote and assist accurate data entry, to provide methods to cleanse your data, integrate it, and help keep your business compliant with GDPR.

AddressTools

AddressTools can verify address data both at the point of entry and as a clean-up task. With data verified against local postal authorities in more than 240 countries, you can rely on AddressTools to ensure that Salesforce is your master source of knowledge. Ensure that users in your Salesforce organization spend their time efficiently, rather than fixing the issues caused by poor-quality address data.

IndustryComplete

IndustryComplete uses the NAICS, SIC and ISIC classification systems to verify industries in Salesforce. With an easy-to-use component, users can search for an industry using keywords, which will populate the standard Salesforce industry field. An interactive, easy-to-use component for entering industry data not only saves users time but also ensures accurate data is entered.


SimpleImport

The Managed Import component from SimpleImport allows users to import data into Salesforce directly, but safely into predefined fields, mapped by an administrator. Standard users are empowered with streamlined import functionality, but with strict, customizable controls to ensure security and data integrity.


PhoneTools

PhoneTools allows you to screen against UK TPS and CTPS lists to assist with PECR compliance, all within Salesforce with no coding required. Manually check numbers when you need them or automate screening to keep you up-to-date.

For more information on the products discussed above, please check out our AppExchange listings for AddressTools, IndustryComplete, SimpleImport and PhoneTools.