Salesforce Data Management Best Practices
- Salesforce data management best practices are needed to keep customer information accurate, business processes effective, staff productive and user trust intact.
- As more companies become data-driven, the importance of relevant, reliable and useful data is magnified.
- Poor data quality is like a disease in two ways. First, it often goes undetected until it's a big problem. Second, left untreated it only gets worse. Salesforce data management best practices are the cure.
Customer data is the lifeblood of any Salesforce (SFDC) system. But as the data deteriorates, so does the value of your application.
Data quality refers to the ability of data to serve its intended purpose. Data management is the process that ensures data quality.
CRM data management is a prerequisite to marketing, sales and customer service business processes and information reporting. In fact, the effectiveness of these core business functions is directly aligned to the quality of data.
With accurate and complete customer data, sales reps don't waste time searching for and manually entering account and contact information. Marketers don't accidentally market to the same people multiple times. Customer service agents have enough information to deliver a personalized customer experience. And management can trust the information reporting.
SFDC and many AppExchange partners offer apps and services to analyze, validate, clean, dedupe, append and archive data. This post shares several Salesforce data management best practices, configuration options and tools to maintain data quality.
Data Quality Best Practices
Data quality is impacted by every data import. Whether using the data import wizard, Data Loader or Dataloader.io, refrain from the popular but ill-advised more-data-is-better thinking. Unused data is a distraction that clogs the user interface, reduces staff productivity and incurs numerous costs to maintain. It's not about how much data you have but how much gets used.
It's also good practice to normalize, format and verify data for import outside your SFDC system. Sanitizing after the upload can leave orphaned attributes even if bogus data is deleted. Also, for each of your imports, append the imported data with a data source identifier, so you can score data vendors or sources, or revert the import later if necessary.
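The pre-import cleanup described above can be sketched in a few lines. This is a minimal illustration, not a real import pipeline: the field names (including the `Data_Source__c` custom field) and the normalization table are assumptions for the example.

```python
# Hypothetical normalization table; a real org would maintain this
# as part of its data policy.
US_STATE_FIXES = {"FLA": "FL", "CALIF": "CA", "TEX": "TX"}

def normalize_row(row, source_id):
    """Normalize one contact row and tag it before import."""
    row = {k: v.strip() for k, v in row.items()}  # drop stray whitespace
    state = row.get("State", "").upper()
    row["State"] = US_STATE_FIXES.get(state, state)
    row["Email"] = row.get("Email", "").lower()
    # Append a data source identifier so the import can be scored
    # by vendor, or reverted later if necessary.
    row["Data_Source__c"] = source_id
    return row

rows = [{"Email": " Pat@Example.COM ", "State": "fla"}]
clean = [normalize_row(r, "vendor-2024-06") for r in rows]
```

Running the normalization outside Salesforce, as the text suggests, means a bad batch can simply be re-generated and re-imported rather than patched in place.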
Most dirty data comes from human error and most human error comes from missing data standards. To keep your CRM system clean and efficient you need to define data standards and show what good data looks like. Create a CRM data policy to define how data should be entered and standardized. Then create SFDC validation rules that enforce your data policy rules.
Sometimes data validation is not limited to data entry. Rules may be created so that additional fields become mandatory as a record accumulates data or advances past a certain stage.
The combination of validation and workflow rules is another option. They can detect and automatically update improperly entered data. For example, if a user entered an account state as "FLA" a rule could automatically change the value to "FL." I've also used these rules to detect and resolve common misspellings such as "srteet." Another use case is to apply rules to alert users of invalid conditions, such as if a phone number includes an alpha character or a zip code has too few numbers.
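In Salesforce these checks would be declarative validation and workflow rules; the logic behind the examples above can be sketched in code. The correction tables and field names here are illustrative assumptions.

```python
import re

# Hypothetical correction tables, mirroring a data policy.
STATE_FIXES = {"FLA": "FL", "FLORIDA": "FL"}
SPELLING_FIXES = {"srteet": "street", "aveune": "avenue"}

def auto_correct(record):
    """Mimic a workflow rule: silently fix known bad values."""
    state = record.get("State", "").upper()
    record["State"] = STATE_FIXES.get(state, state)
    words = record.get("Street", "").split()
    record["Street"] = " ".join(SPELLING_FIXES.get(w.lower(), w) for w in words)
    return record

def validation_errors(record):
    """Mimic validation rules: flag conditions the user must correct."""
    errors = []
    if re.search(r"[A-Za-z]", record.get("Phone", "")):
        errors.append("Phone number includes an alpha character")
    if not re.fullmatch(r"\d{5}(-\d{4})?", record.get("Zip", "")):
        errors.append("Zip code has the wrong number of digits")
    return errors

rec = auto_correct({"State": "FLA", "Street": "123 Main srteet",
                    "Phone": "555-CALL", "Zip": "3301"})
errs = validation_errors(rec)
```

The split between the two functions matches the article's distinction: fix what can be fixed automatically, and alert the user about what can't.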
Creating data entry parameters increases both data entry speed and data quality. Converting text fields to drop-down menus or multi-select fields reduces the amount of data users must type manually. This technique is especially effective on mobile devices.
Custom field types are another method to ensure data standardization. For example, assign all custom date fields to Type = Date and custom currency fields to Type = Currency. For fields with a standard list of values, use Type = Picklist.
Insert data quality scores on the most important records, such as Leads, Accounts, Contacts, Opportunities, Cases and Campaigns. These scores are calculated against your data policy standards. Sometimes record scores establish a minimum threshold value. Other times, quality scores go up as more data is entered to the record.
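A simple way to implement the second pattern, where the score rises as fields are filled in, is a weighted checklist. The fields, weights and threshold below are assumptions for illustration; a real policy would define them per object.

```python
# Hypothetical per-field weights from a data policy (sums to 100).
SCORED_FIELDS = {"Email": 30, "Phone": 20, "Title": 15,
                 "Account": 20, "MailingAddress": 15}

MINIMUM_THRESHOLD = 50  # records below this need attention

def quality_score(contact):
    """Score a contact: points accrue as scored fields are populated."""
    return sum(weight for field, weight in SCORED_FIELDS.items()
               if contact.get(field))

c = {"Email": "pat@example.com", "Phone": "555-0100", "Account": "Acme"}
score = quality_score(c)  # 30 + 20 + 20 = 70
</ ```

In Salesforce itself this could be a formula field, which keeps the score visible on the record and usable in the dashboards discussed below.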
When it comes to the pervasive problem of duplicate data, an ounce of prevention is better than a pound of cure. Start by stopping duplicate records at the source of entry. Use SFDC duplicate record detection and response handling to prevent duplicate records from getting into the application.
The SFDC Potential Duplicates function assigns controls to designate whether users can create duplicate leads, accounts and contacts and assigns permissions to merge these redundant records.
You will further reduce duplicate records if you go beyond simple contact matching rules and apply fuzzy logic and other more advanced detection queries. Even better, proactive data entry aids will increase Salesforce adoption.
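Fuzzy matching, as opposed to exact matching, catches near-duplicates such as "Jon" versus "John." A minimal sketch of the idea using the standard library's `difflib` (the contact data and the 0.85 threshold are assumptions for the example):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_potential_duplicates(new_contact, existing, threshold=0.85):
    """Flag existing contacts whose name or email closely matches."""
    dupes = []
    for contact in existing:
        name_sim = similarity(new_contact["Name"], contact["Name"])
        email_sim = similarity(new_contact["Email"], contact["Email"])
        if max(name_sim, email_sim) >= threshold:
            dupes.append(contact)
    return dupes

existing = [{"Name": "Jon Smith", "Email": "jon.smith@example.com"},
            {"Name": "Ann Lee", "Email": "ann.lee@example.com"}]
new = {"Name": "John Smith", "Email": "jon.smith@example.com"}
matches = find_potential_duplicates(new, existing)
```

An exact-match rule would miss "John Smith" against "Jon Smith"; the fuzzy check flags it, which is the gain the article describes over simple matching rules.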
Use data quality dashboards to display quality compliance or variances front and center.
Dashboard leaderboards are effective in acknowledging the top contributors or identifying the top offenders who may need additional instruction.
Lastly, as part of your Salesforce implementation roadmap, plan to prevent the most catastrophic problem of unintended data loss with the SFDC weekly data export service or a similar AppExchange third party service. This provides a revert option if data is inadvertently deleted.
The SFDC data export service is free with Enterprise and Unlimited versions or can be purchased by Professional Edition customers. As an alternative, there are free tools such as Dataloader.io.
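Whichever export tool you use, the value comes from keeping dated snapshots you can re-import if data is inadvertently deleted. A minimal sketch of that revert point, assuming the records have already been fetched (e.g. from the weekly export or an API client):

```python
import csv
import io
from datetime import date

def snapshot_to_csv(records, fieldnames):
    """Write records to a dated CSV snapshot that can be re-imported later."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)
    # Dating the filename keeps a history of restore points.
    return f"accounts-{date.today().isoformat()}.csv", buf.getvalue()

records = [{"Id": "001xx0001", "Name": "Acme", "State": "FL"}]
filename, payload = snapshot_to_csv(records, ["Id", "Name", "State"])
```

Including the record `Id` column, as here, is what makes the revert possible: a re-import keyed on `Id` restores the original records rather than creating duplicates.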