Wednesday, April 5, 2017

Real Business Data Returns

Auntie Pat Tern believes you can have your cake and eat it too: "Just use your resources wisely." So I wanted to see how to use the static resources in my Salesforce org more wisely. I have previously shown how to use static resources for test data. Now I want to use that same data in developer sandboxes created from my org. Since static resources are copied over to developer sandboxes along with code, I can reference them in code that runs when I generate a sandbox. Using the Map datatype helps me keep record relationships intact as well:
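Here is a minimal sketch of the idea, using the SandboxPostCopy interface so the class can be specified at sandbox creation time. The resource names (SampleAccounts, SampleContacts) and CSV column positions are assumptions for illustration; adjust them to match your own files.

```apex
// Sketch: seed a new sandbox with sample data from Static Resources.
// Assumes CSV columns: placeholder ID, Name (and for Contacts, the
// parent Account's placeholder ID in the third column).
global class SampleDataLoader implements SandboxPostCopy {
    global void runApexClass(System.SandboxContext context) {
        // Map the placeholder IDs from the CSV to the real IDs assigned
        // on insert, so child records stay related to the right parents.
        Map<String, Id> accountIdsByCsvId = new Map<String, Id>();

        List<String> accountRows = loadRows('SampleAccounts');
        List<Account> accounts = new List<Account>();
        for (String row : accountRows) {
            List<String> cols = row.split(',');
            accounts.add(new Account(Name = cols[1]));
        }
        insert accounts; // insert preserves list order and populates IDs
        for (Integer i = 0; i < accounts.size(); i++) {
            accountIdsByCsvId.put(accountRows[i].split(',')[0], accounts[i].Id);
        }

        List<Contact> contacts = new List<Contact>();
        for (String row : loadRows('SampleContacts')) {
            List<String> cols = row.split(',');
            contacts.add(new Contact(
                LastName = cols[1],
                AccountId = accountIdsByCsvId.get(cols[2])));
        }
        insert contacts;
    }

    // Reads a CSV Static Resource and returns its data rows (header skipped).
    private List<String> loadRows(String resourceName) {
        StaticResource sr = [SELECT Body FROM StaticResource
                             WHERE Name = :resourceName LIMIT 1];
        List<String> lines = sr.Body.toString().split('\n');
        lines.remove(0); // drop the header row
        return lines;
    }
}
```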

With this simple trick, I can preload my dev orgs with sample business data that I am using in my unit tests. Like Auntie Pat Tern promised, I can have my cake and eat it too when it comes to using realistic sample data in my tests and development orgs.

Whole Heaps of Fun

Auntie Pat Tern has noticed that questions about heap limits come up periodically. This could be a concern for developers needing to work with files and attachments in Salesforce. Luckily, those native objects aren't subject to typical heap limits. If you stick to blobs you can work with these large data objects. Don't use this knowledge to do bad things, now! Limits help us write code that performs well for our users as well as our fellow tenants in the multi-tenant environment we all know and love.

12:52:35:797 USER_DEBUG [43]|DEBUG|Heap: 57,789,821 / 6,000,000
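A log line like the one above can be produced with an anonymous Apex snippet along these lines (the query filter is a placeholder; the key point is keeping the Body as a Blob rather than converting it to a String):

```apex
// Query a large Attachment and report actual heap use against the stated limit.
Attachment att = [SELECT Body FROM Attachment WHERE Name = 'BigFile.zip' LIMIT 1];
Blob fileBody = att.Body; // leaving this as a Blob avoids String conversion overhead
System.debug('Heap: ' + Limits.getHeapSize() + ' / ' + Limits.getLimitHeapSize());
```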

Of course, if you test your code at its limits, you can be sure of just how awesome your code is, which is another great reason to write thorough tests for all your code. You will find that if you are careful about which datatype you use and what you try to do with it, the platform rewards you with heaps and heaps of fun.

Wednesday, September 28, 2016

Eliminate Cargo Cult Programming and Build a Stronger Team

My Auntie Pat Tern is one of those people who refuses to let family members wear cargo pants when we go out to dinner.  She says she doesn't want to participate in the 'cargo cult'.  She inspired me to look at how Cargo Cult Programming has affected our Salesforce org.

One of the best things about programming on the Salesforce platform is the wealth of online resources to help you solve problems. It can also be one of the worst things if you aren't careful about which resources you choose to follow. Some code samples use outdated practices; others may be snippets taken out of context, which may not behave as expected in a different context.

In one case, the Apex documentation team included System.assert statements in a code snippet, and our new developer later copied the snippet into a utility class. The developer found a great resource but didn't fully understand the difference between System.assert and System.debug: the former triggers a fatal error when its condition fails, which is appropriate for test code but not for our essential platform automations. Sniffing out this cargo cult programming helped us see what additional training our developers could use.
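The difference is easy to see side by side. A quick illustration (the query here is just hypothetical context):

```apex
List<Opportunity> opps = [SELECT Id FROM Opportunity LIMIT 10];

// System.assert throws a fatal exception when its condition is false:
// exactly what you want in a test, and exactly what you don't want in
// production automation.
System.assert(!opps.isEmpty(), 'Expected at least one Opportunity');

// System.debug only writes to the debug log and never halts execution.
System.debug('Opportunity count: ' + opps.size());
```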

A more frequently found cargo cult is the factory class used to build basic test objects. A factory class is still far better than the badly outdated SeeAllData annotation in test code, but test data factory classes add unnecessary work for coders. Salesforce offers a newer and better technique with Test.loadData, which I wrote about in July. It lets us shift the work of creating and maintaining sample data from the developer to the administrator.

Other cargo cult programming that pops up in our Salesforce org relates to data sharing and security rules. We teach our developers early on to understand the difference between code that runs only in triggers and code that end users may run via UI customizations like pages, flows and actions. And we encourage our administrators to keep an eye on org security during code reviews and test executions.

Hunting out and eliminating cargo cult programming helps us build cohesion among the team that maintains our Salesforce platform and better understanding between administrators and developers who now work more closely together. This helps us spread the effort of org maintenance across a team and make use of clicks and code for our projects. End users benefit and our org is more reliable and easier to maintain as well.

Friday, September 9, 2016

So Many Resources, So Little Time

I found Auntie Pat Tern burying a hammer in the flower bed. When I asked why, she said it was Tim Toady's favorite tool and he needed to learn something new. When all you have is a hammer, everything looks like a nail, so she wanted to make sure he found new ways of doing things. I like to make sure my developers are constantly learning new skills as well, so they don't rely on "golden hammer" practices that may have become outdated.
Become a specialist and then learn even more.

Learning to develop on the Salesforce platform is like finding a magpie's nest full of shiny objects. Where do you start and how do you figure out which gems are most valuable? Here's the path I take:
  1. Attend Dreamforce. The keynote and product demos will help you know what you want to learn. The sessions will help you get started on that learning. It offers the chance to talk to people who are on the same learning journey as you are and share tips and interests. I am presenting two sessions on developing with Apex this year: 8 Essential Apex Tips for Admins and Apex Trigger Essentials for Admins.
  2. Watch more Dreamforce sessions online. After the event, Salesforce makes sessions available by video online. Sessions you wish you had attended, topics you didn't know you were interested in until after Dreamforce, all are free to watch online.
  3. Follow up your learning in the Success Community.  Find your local user group and developer group and follow the online conversations to learn how other people are using Salesforce and the challenges they are overcoming. Even if you don't have a local group, you can join groups online to be part of the discussions.
  4. Keep up the learning with Trailhead. Salesforce offers online learning opportunities through Trailhead. Don't be intimidated by the Superbadges; they offer a well-defined path to learning a particular skill in Salesforce. And if you get stuck, the community is there to help.
  5. Two places to turn for online help from the community when you get stuck on any of your coding projects are the Developer Community and Stack Exchange. Search the boards to see if someone has already asked about the question you have, and if you can't find a discussion, go ahead and post your question. Once you gain more skills, you should find that you are answering more questions than you are asking.
The resources are there to help you along. Learning to develop, or learning to develop even better, can be fun with all of the resources Salesforce offers. You don't have to rely on golden hammers when there are opportunities to learn new and better ways to work with Salesforce.

Sunday, August 7, 2016

CSV Data: Commonly Surfacing Vexations in Data

According to Auntie Pat Tern, "Trying the same thing over and over without getting the results you want is enough to make anyone crazy." Unfortunately, that's the excuse Tim Toady uses when he doesn't do his homework because he's tired of not getting the results he wants from the effort.

When it comes to creating CSV files for use as unit test data, what should be an easy process can make you a bit crazy if you wind up getting unfamiliar errors. The following steps and potential errors may help:

Step 1: Export some production data.
Step 2: Delete all but a reasonable selection of records.
Step 3: Remove system fields.
Step 4: Make sure date fields are in the form YYYY-MM-DD.
Step 5: Make sure date/time fields have a "T" between the date and time, with no space, such as 2016-08-07T01:40:39.
Step 6: Make sure commas that appear in text fields haven't thrown off the number of columns.
Step 7: Make sure there are no blank rows containing only commas and no values.
Step 8: Renumber IDs starting with 1.
Step 9: Remove any invalid IDs for related records (for example, an Account ID of 000000000000000AAA appears for all top-level accounts and should be removed).
Step 10: Upload the CSV file as a Static Resource for your code to reference.
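Once the file is uploaded, referencing it from a test is a one-liner. Here is a minimal sketch, assuming the Static Resource is named SampleAccounts (the class and resource names are illustrative):

```apex
@isTest
private class SampleAccountLoadTest {
    @isTest
    static void loadsSampleAccounts() {
        // Test.loadData finds the Static Resource by name and inserts
        // one Account per data row in the CSV.
        List<sObject> accounts = Test.loadData(Account.sObjectType, 'SampleAccounts');
        System.assert(!accounts.isEmpty(), 'Expected sample Accounts to load');
    }
}
```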

If you are creating a set of child records for an object related to another object, sort both the parent and child data by the parent IDs to make sure the records match across the two data sets. Then use search and replace to renumber the parent IDs in the child file to match the new IDs in the parent CSV. For example, if your Account CSV uses IDs numbered 1-200, the related Contact records must use 1-200 in their Account ID column as well.

Bad CSV files might result in the following errors:

Potential Error | Likely Solution
Invalid ID value on line 2: 000000000000000 | Remove invalid IDs
Too many DML rows: 10001 | Load fewer records
CSV Parse error: '8/20/1959' is not a valid value for the type xsd:date | Format dates as YYYY-MM-DD
Duplicate ID value on line 81: null | Remove empty rows from the CSV file
CSV Parse error: '2011-09-07 01:00:31' is not a valid value for the type xsd:dateTime | Format date/time fields with "T" rather than a space between date and time
Validation Errors While Saving Record(s) | Erroneous data, or IDs not starting with 1
System.UnexpectedException: Salesforce System Error | Remove stray commas throwing off columns
Static Resource not found | Make sure code refers to the Static Resource by name

Start with a small number of records using the fewest fields for testing the code or configuration changes you need to test. That way, you won't wind up like Tim Toady, who falls back on bad habits when errors occur with his first attempts to follow best practices.

Sunday, July 31, 2016

What A Load Of Business Data

Auntie Pat Tern thinks Superman's x-ray vision is stupid because "how would he ever know where he's supposed to look and where he should not bother focusing?" That got me wondering about the data and tests in my Salesforce org.  Even clicks-not-code developers should have automated tests that validate configuration changes with existing and expected data. So what's the best way to know which data those automated tests should use?

Some Salesforce developers like to write automated tests with "SeeAllData=True", an outdated and bad practice. A test that can see all data is not the same as a test that is smart enough to see the data that needs testing. For example, a good test will test positive scenarios, negative scenarios and extremes. In other words:
  • What if the data is exactly what we planned for -- records with all the right data in all the right places?
  • What if the data falls outside of expected norms -- records with missing fields or invalid data, for example?
  • What if the data is coming in from a data import operation or an integration and many records need to be processed?
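As a sketch of the negative case, a test can deliberately feed in a record that falls outside expected norms and assert that the platform (or your own validation logic) rejects it. The class name here is illustrative:

```apex
@isTest
private class NegativeScenarioTest {
    @isTest
    static void rejectsContactMissingRequiredField() {
        Contact c = new Contact(); // no LastName: outside expected norms
        try {
            insert c;
            System.assert(false, 'Expected the insert to fail');
        } catch (DmlException e) {
            // Standard required-field enforcement surfaces as this status code.
            System.assert(e.getMessage().contains('REQUIRED_FIELD_MISSING'));
        }
    }
}
```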
Admins can follow a few simple steps to create automated tests to run against their configuration changes, and their efforts can be used by developers for better unit tests as well. Simply create sample data representative of the expected inputs. Then use code like the following to load that data in a unit test:
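A sketch of such a test, assuming Static Resources named SampleAccounts, SampleContacts and SampleOpportunities (adjust the names and assertions to your own files):

```apex
@isTest
private class BusinessDataSetupTest {
    @isTest
    static void loadsRelatedBusinessData() {
        // Load parents before children so the lookup IDs in the child
        // CSVs resolve to the newly created parent records.
        List<sObject> accounts = Test.loadData(Account.sObjectType, 'SampleAccounts');
        List<sObject> contacts = Test.loadData(Contact.sObjectType, 'SampleContacts');
        List<sObject> opps = Test.loadData(Opportunity.sObjectType, 'SampleOpportunities');

        // Confirm the data arrived; any automation that fires on insert
        // (triggers, workflows, processes) has now run against it.
        System.assert(!accounts.isEmpty(), 'Expected sample Accounts to load');
        System.assert(!contacts.isEmpty(), 'Expected sample Contacts to load');
        System.assert(!opps.isEmpty(), 'Expected sample Opportunities to load');
    }
}
```

This code can be run as a unit test on its own, or called from other test classes that need realistic data already in place.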


Note that you will want to use data that corresponds to the configuration changes that you are testing. In this example, configuration changes and business process automation around Account, Contact and Opportunity record creation will be tested. You can create a similar test class for other related objects, including custom objects.

This code can be run as a unit test by itself to validate configuration changes, or it can be called by other test code to set up data for more complex unit testing related to other code in the org.

Next week, I will look into some of the errors that might occur when you create CSV files of sample data and how to avoid those errors.

But for now, Auntie Pat Tern has a point: who wants Superman looking at just any old thing with his x-ray vision when he ought to be focusing on the information that's most helpful?

Monday, July 11, 2016

Enough Is As Good As A Feast

Auntie Pat Tern loves Sir Thomas Malory and Mary Poppins, and quotes both when she reminds us that "enough is as good as a feast," especially when Tim Toady's eyes are bigger than his stomach at the buffet. So I thought I would look at Salesforce storage limits to see whether we have a feast in our Enterprise Edition org (the entry-level edition for many businesses and the basis for the nonprofit license grant, even for orgs with the Nonprofit Starter Pack pre-installed).

Salesforce currently allocates 20MB of data storage per user in every EE org. But wait, there's more. Every org starts with a minimum of 1GB of data storage. So whether you are a nonprofit with a grant of 10 free licenses or a small business with 50 licenses, you have a minimum of 1GB for data.

More than enough is too much.
But how many records is that? Salesforce allocates 2KB of storage for most records. Articles require 4KB, and Campaigns require a whopping 8KB (four times the size of most other records). That means a basic Salesforce org (Enterprise Edition or better) has data storage capacity for about 500,000 records (excluding Campaigns and Articles). If you have 100,000 Accounts and 400,000 Contacts for 1-50 users, you are going to need more storage.

If you are using Person Accounts, or if you create a unique Account for each Contact, 250,000 individuals would result in 500,000 records since each individual would require both an Account and Contact record.

Custom objects behave the same as typical standard objects, taking 2KB per record regardless of the number or type of fields on those records. Even if you use a lot of rich text fields with large image files, a custom object record still requires only 2KB of data storage. The trick with rich text fields is that their contents are actually stored as files, so they count against file storage rather than data storage.

Consider your storage needs carefully when you create sandboxes. Partial Copy sandboxes currently provide 5GB of data storage; that's more than you have in your production org if you have 50 or fewer users! But they don't offer much in terms of file storage. You may need a Full Copy sandbox for the convenience of copying all of your data at once and for accommodating larger amounts of file storage, including files from rich text fields.

But for standard data needs, use these basic formulas to calculate your data needs in GB:

( # of records X 2KB ) X 1/1024 X 1/1024
or
( # of Campaigns X 8KB ) X 1/1024 X 1/1024
or
( # of Articles X 4KB ) X 1/1024 X 1/1024

In the first formula, we multiply the number of records by 2KB, the data storage typically needed per record, to get the required kilobytes of storage. Then we multiply by 1/1024 to convert kilobytes to megabytes, and by 1/1024 again to convert megabytes to gigabytes.

In the second formula, we multiply the number of Campaign records by 8KB because Campaigns consume far more storage than typical standard and custom objects. Then we do the same calculations to convert from KB to GB as described above.

The third formula is for calculating the storage needs of Articles since their records require 4KB rather than the standard 2KB. Otherwise it works like the first two, with the result expressed in gigabytes.
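As a worked example of the three formulas, the record counts below are made up for illustration; plug in your own numbers:

```apex
// Estimate data storage in GB from record counts and per-record sizes in KB.
Decimal recordGB   = (500000 * 2.0) / 1024 / 1024; // standard/custom records: ~0.95 GB
Decimal campaignGB = (10000 * 8.0) / 1024 / 1024;  // Campaigns at 8KB each:   ~0.08 GB
Decimal articleGB  = (25000 * 4.0) / 1024 / 1024;  // Articles at 4KB each:    ~0.10 GB
System.debug('Estimated total: ' + (recordGB + campaignGB + articleGB) + ' GB');
```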

Keep in mind that Person Accounts and one-to-one Contact-to-Account data models create both a Contact and an Account record, so every individual is represented by two records rather than one.

Salesforce offers options for purchasing additional storage space. You can also upgrade to Performance, Unlimited or similar Editions that offer six times as much data storage per user compared to Enterprise Edition (orgs with fewer than 10 licenses may not see the benefits of the additional storage).

But, as Auntie Pat Tern would say, it is good to be grateful for what you have and know when you've got enough for your share. Salesforce makes it easy to calculate your needs.