Friday, September 9, 2016

So Many Resources, So Little Time

I found Auntie Pat Tern burying a hammer in the flower bed. When I asked why, she said it was Tim Toady's favorite tool and he needed to learn something new. When all you have is a hammer, everything looks like a nail, so she wanted to make sure he found new ways of doing things. I like to make sure my developers are constantly learning new skills as well, so they don't rely on "golden hammer" practices that may have become outdated.
Become a specialist and then learn even more.

Learning to develop on the Salesforce platform is like finding a magpie's nest full of shiny objects. Where do you start and how do you figure out which gems are most valuable? Here's the path I take:
  1. Attend Dreamforce. The keynote and product demos will help you know what you want to learn. The sessions will help you get started on that learning. It offers the chance to talk to people who are on the same learning journey as you are and share tips and interests. I am presenting two sessions on developing with Apex this year: 8 Essential Apex Tips for Admins and Apex Trigger Essentials for Admins.
  2. Watch more Dreamforce sessions online. After the event, Salesforce makes session videos available online. Sessions you wish you had attended and topics you didn't know interested you until after Dreamforce are all free to watch.
  3. Follow up your learning in the Success Community.  Find your local user group and developer group and follow the online conversations to learn how other people are using Salesforce and the challenges they are overcoming. Even if you don't have a local group, you can join groups online to be part of the discussions.
  4. Keep up the learning with Trailhead. Salesforce offers online learning opportunities through Trailhead. Don't be intimidated by the Superbadges; they offer a well-defined path to learning a particular skill in Salesforce. And if you get stuck, the community is there to help.
  5. Two places to turn for online help from the community when you get stuck on any of your coding projects are the Developer Community and Stack Exchange. Search the boards to see if someone has already asked your question; if you can't find a discussion, go ahead and post it yourself. Once you gain more skills, you should find that you are answering more questions than you are asking.
The resources are there to help you along. Learning to develop, or learning to develop even better, can be fun with all of the resources Salesforce offers. You don't have to rely on golden hammers when there are opportunities to learn new and better ways to work with Salesforce.

Sunday, August 7, 2016

CSV Data: Commonly Surfacing Vexations in Data

According to Auntie Pat Tern, "Trying the same thing over and over without getting the results you want is enough to make anyone crazy." Unfortunately, that's the excuse Tim Toady uses when he doesn't do his homework because he's tired of not getting the results he wants from the effort.

When it comes to creating CSV files for use as unit test data, what should be an easy process can make you a bit crazy if you wind up getting unfamiliar errors. The following steps and potential errors may help:

Step 1: Export some production data.
Step 2: Delete all but a reasonable selection of data.
Step 3: Remove system fields.
Step 4: Make sure date fields are in the form YYYY-MM-DD.
Step 5: Make sure date/time fields have a "T" between the date and time, with no space, such as 2016-08-07T01:40:39.
Step 6: Make sure the commas that appear in text fields haven't thrown off the number of columns.
Step 7: Make sure there are no blank rows with only commas and no values.
Step 8: Renumber IDs starting with 1.
Step 9: Remove any invalid IDs for related records (for example, an Account ID of 000000000000000AAA appears for all top-level accounts and should be removed).
Step 10: Upload the CSV file as a Static Resource for your code to reference.

If you are creating a set of child records for an object that is related to another object, you can sort the parent and child data by the IDs of the parent to make sure you get records that match in both data sets. Use search and replace to redo the parent IDs to match the new IDs in the CSV file of parent records. For example, if you have a CSV for Accounts with IDs numbered 1-200, the related contact records must use 1-200 for the Account ID as well.
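With matched CSVs in hand, a unit test can load them in parent-then-child order so the renumbered Account IDs resolve. A minimal sketch, assuming Static Resources named testAccounts and testContacts (hypothetical names) with 200 Account rows as in the example above:

```apex
@isTest
private class CsvDataLoadTest {
    @isTest
    static void loadsParentAndChildRecords() {
        // Load parents first so the child CSV's Account IDs (1-200) resolve.
        List<SObject> accts = Test.loadData(Account.sObjectType, 'testAccounts');
        List<SObject> conts = Test.loadData(Contact.sObjectType, 'testContacts');
        System.assertEquals(200, accts.size(), 'Expected all sample Accounts to load');
        System.assert(!conts.isEmpty(), 'Expected sample Contacts to load');
    }
}
```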

Bad CSV files might result in the following errors:

Potential Error | Likely Solution
Invalid ID value on line 2: 000000000000000 | Remove invalid IDs
Too many DML rows: 10001 | Load fewer records
CSV Parse error: '8/20/1959' is not a valid value for the type xsd:date | Format dates as YYYY-MM-DD
Duplicate ID value on line 81: null | Remove empty rows from the CSV file
CSV Parse error: '2011-09-07 01:00:31' is not a valid value for the type xsd:dateTime | Format date/time fields with "T" rather than a space between date and time
Validation Errors While Saving Record(s) | Erroneous data, or IDs not starting with 1
System.UnexpectedException: Salesforce System Error | Remove stray commas throwing off columns
Static Resource not found | Make sure code refers to the Static Resource by name

Start with a small number of records using the fewest fields for testing the code or configuration changes you need to test. That way, you won't wind up like Tim Toady, who falls back on bad habits when errors occur with his first attempts to follow best practices.

Sunday, July 31, 2016

What A Load Of Business Data

Auntie Pat Tern thinks Superman's x-ray vision is stupid because "how would he ever know where he's supposed to look and where he should not bother focusing?" That got me wondering about the data and tests in my Salesforce org.  Even clicks-not-code developers should have automated tests that validate configuration changes with existing and expected data. So what's the best way to know which data those automated tests should use?

Some Salesforce developers like to write automated tests with "SeeAllData=True", an outdated and bad practice. A test that can see all data is not the same as a test that is smart enough to see the data that needs testing. For example, a good test will test positive scenarios, negative scenarios and extremes. In other words:
  • What if the data is exactly what we planned for -- records with all the right data in all the right places?
  • What if the data falls outside of expected norms -- records with missing fields or invalid data, for example?
  • What if the data is coming in from a data import operation or an integration and many records need to be processed?
Admins can follow a few simple steps to create automated tests to run against their configuration changes, and their efforts can be used by developers for better unit tests as well. Simply create sample data representative of the expected inputs. Then use code like the following to load that data in a unit test:
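A minimal sketch of such a test, assuming Static Resources named testAccounts, testContacts, and testOpportunities (hypothetical names):

```apex
@isTest
private class ConfigurationDataTest {
    @isTest
    static void loadsRepresentativeBusinessData() {
        // Load sample data previously uploaded as CSV Static Resources.
        List<SObject> accts = Test.loadData(Account.sObjectType, 'testAccounts');
        List<SObject> conts = Test.loadData(Contact.sObjectType, 'testContacts');
        List<SObject> opps  = Test.loadData(Opportunity.sObjectType, 'testOpportunities');

        // Loading exercises validation rules, triggers, and other automation
        // on insert; assert the records made it into the test context.
        System.assert(!accts.isEmpty(), 'Expected sample Accounts to load');
        System.assert(!conts.isEmpty(), 'Expected sample Contacts to load');
        System.assert(!opps.isEmpty(), 'Expected sample Opportunities to load');
    }
}
```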


Note that you will want to use data that corresponds to the configuration changes that you are testing. In this example, configuration changes and business process automation around Account, Contact and Opportunity record creation will be tested. You can create a similar test class for other related objects, including custom objects.

This code can be run as a unit test by itself to validate configuration changes, or it can be called by other test code to set up data for more complex unit testing related to other code in the org.

Next week, I will look into some of the errors that might occur when you create CSV files of sample data and how to avoid those errors.

But for now, Auntie Pat Tern has a point: who wants Superman looking at just any old thing with his x-ray vision when he ought to be focusing on the information that's most helpful?

Monday, July 11, 2016

Enough Is As Good As A Feast

Auntie Pat Tern loves Sir Thomas Malory and Mary Poppins, and quotes both when she reminds us "enough is as good as a feast", especially when Tim Toady's eyes are bigger than his stomach at the buffet. So I thought I would look at Salesforce storage limits to see if we have a feast in our Enterprise Edition org (the entry-level org for many businesses and the basis for the nonprofit license grant, even for orgs with the Nonprofit Starter Pack pre-installed).

Salesforce currently allocates 20MB of data storage per user in every EE org. But wait, there's more. Every org starts with a minimum of 1GB of data storage. So whether you are a nonprofit with a grant of 10 free licenses or a small business with 50 licenses, you have a minimum of 1GB for data.

More than enough is too much.
But how many records is that? Salesforce allocates 2KB for most records. Articles require 4KB, Campaigns require a whopping 8KB (4 times the size of most other records). That means a basic Salesforce org (Enterprise Edition or better) will have data storage capacity for about 500,000 records (excluding Campaigns and Articles). If you have 100,000 Accounts and 400,000 Contacts for 1-50 users, you are going to need more storage.

If you are using Person Accounts, or if you create a unique Account for each Contact, 250,000 individuals would result in 500,000 records since each individual would require both an Account and Contact record.

Custom objects behave the same as typical standard objects, taking 2KB per record regardless of the number or type of fields associated with those records. Even if you use a lot of rich text fields with large image files, a custom object record still requires only 2KB of data storage. The trick with rich text fields is that their contents are actually stored as files, so they impact file storage rather than data storage.

Consider your storage needs carefully when you create sandboxes. Partial Copy sandboxes currently provide 5GB of data storage; that's more than you have in your production org if you have 50 or fewer users! But they don't offer much in terms of file storage. You may need a Full Copy sandbox for the convenience of copying all of your data at once and accommodating larger amounts of file storage, including files from rich text fields.

But for standard data needs, use these basic formulas to calculate your data needs in GB:

( # of records X 2KB ) X 1/1024 X 1/1024
or
( # of Campaigns X 8KB ) X 1/1024 X 1/1024
or
( # of Articles X 4KB ) X 1/1024 X 1/1024

In the first formula, we multiply the number of records by 2KB, the data storage typically needed per record, to get the required kilobytes of storage. Then we multiply by 1/1024 to convert from kilobytes to megabytes, and by 1/1024 again to convert from megabytes to gigabytes.

In the second formula, we multiply the number of Campaign records by 8KB because they consume far more storage than typical standard and custom objects. Then we do the same calculations to convert from KB to GB as described above.

The third formula is for calculating the storage needs of Articles since their records require 4KB rather than the standard 2KB. Otherwise it works like the first two, with the result expressed in gigabytes.
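To check the arithmetic, here is the first formula worked in an anonymous Apex block for 500,000 typical records:

```apex
// Rough storage estimate for 500,000 typical records at 2KB each.
Decimal records = 500000;
Decimal kb = records * 2;        // kilobytes of data storage
Decimal gb = kb / 1024 / 1024;   // KB -> MB -> GB
System.debug(gb.setScale(2));    // about 0.95 GB, just under the 1GB minimum
```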

Keep in mind that Person Accounts and 1x1 Contact to Account data models will create a Contact and an Account record, so every individual is represented by two records rather than just one.

Salesforce offers options for purchasing additional storage space. You can also upgrade to Performance, Unlimited or similar Editions that offer six times as much data storage per user compared to Enterprise Edition (orgs with fewer than 10 licenses may not see the benefits of the additional storage).

But, as Auntie Pat Tern would say, it is good to be grateful for what you have and know when you've got enough for your share. Salesforce makes it easy to calculate your needs.

Saturday, June 11, 2016

More Better Bots

You know Auntie Pat Tern loves Jeopardy!, the TV show with all the answers, but we have an old friend named Eliza who believes we can find all the answers we need by asking ourselves the right questions.  My developers have been asking some of the right questions since their recent trip to the TrailheaDX conference (a conference for Salesforce developers, similar to past conferences in 2007 and 2010).

First of all, the developers asked what we should do about all of our existing Visualforce pages now that users are craving Lightning Experience.  "What can we do to give them that look and feel without a wholesale rewrite of our UI?" they asked.  Their questions naturally led them to the answer that we should start simply with applying the Lightning Design System to what we have in Visualforce, as a first step.

Next, the developers asked what was going on with Salesforce and AI.  "What's the deal with bots?" they wondered.  While they mulled this over, I reached out to Eliza to see what she had to say:
> Hello. My name is Eliza.  How may I help you?
* Eliza, what's the difference between AI and machine learning?
> Ahh.  Please continue...
* Really, I have three issues: Artificial Intelligence, Machine Learning and Natural Language Processing.
> What does that suggest to you?
* Like Steve Wozniak said at TrailheaDX, Artificial Intelligence is nothing without Machine Learning.  Natural Language Processing benefits from the ability to learn better responses.
> I'm not sure I understand you fully.
* Well, MIT has done some cool stuff with bots.  One learned to answer frequently asked questions for students.
> Can you elaborate on that?
* For a class that prompted over 10,000 online questions per semester, the professor used Machine Learning and trained a bot using questions and answers from previous semesters.  The bot was set to respond to those familiar, frequently asked questions and to only respond to new questions if it calculated a 97 percent or higher confidence level for its answer.
> What does that suggest to you?
* It suggests that I can blend Natural Language Processing and Machine Learning to provide automated, interactive responses to FAQs.
> Say, do you have any psychological problems?
* Um, I do sometimes chat with bots.
Having a bot that can help customer service agents provide the best responses to Cases would help our service team reply quickly and consistently.  We also might want a bot to help provide strategies for the sales team, following the department's best practices. Bots, with the ability to learn from our org's data and best practices, can also help us guide new employees and reduce their frustration.

Auntie Pat Tern doesn't really like the term "Artificial Intelligence" because she believes that using code to process our human questions and requests is genuine intelligence.  And our old friend Eliza is curious to know how you feel about that.

Tuesday, May 17, 2016

SeeAllData = Fail

According to Auntie Pat Tern, it takes 21 days to break a bad habit, but Salesforce developers have had about that many API versions to break the bad habit of writing unit tests with "SeeAllData=true".

In most cases, this clause alone can render tests completely useless. Unit tests should prove that code functions according to specifications by asserting proper database manipulations--create, read, update, delete operations--as well as Visualforce navigation.

I looked at unit tests in my org and found this example of a test that failed to achieve its purpose:
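A representative sketch of that test, with hypothetical object and field names (Sample_Order__c, Sample_Type__c):

```apex
// Anti-pattern sketch: SeeAllData=true ties the test to live org data.
@isTest(SeeAllData=true)
private class SampleOrderTest {
    @isTest
    static void createsSampleOrder() {
        // Relies on a real record existing in the org -- brittle!
        Contact c = [SELECT Id FROM Contact WHERE LastName = 'Tern' LIMIT 1];
        Sample_Order__c order = new Sample_Order__c(
            Contact__c = c.Id,
            Sample_Type__c = 'Standard');
        insert order;
        System.assertNotEquals(null, order.Id);
    }
}
```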

This test should be asserting that the data for ordering samples can be created with just a contact and sample type defined.  Unfortunately, the test relies on existing, live org data rather than test 'silo' data because of "SeeAllData=true" in the first line. There are easy methods for unit tests to create their own test data without relying on live org data.

We encountered the following problems because of using "SeeAllData=true":
  • It required us to maintain test data among our real data.
  • When the test data was changed during data cleanup (in one case, Pat Tern's Account was deleted), the tests failed even though functionality was unchanged.
  • Tests were not reliable between Sandboxes and Production orgs due to data differences rather than actual functionality.
  • Our org may have missed out on Salesforce's automatic Apex Hammer testing for each release, since Hammer tests are blind to live org data.
In the rare case where a specific piece of data is required for your code to behave correctly, consider using custom metadata types instead of Salesforce objects. Trailhead can help you learn how they let you move metadata and records between orgs and test functionality without needing to see all data in the org.
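A sketch of the idea, using a hypothetical custom metadata type named Threshold_Setting__mdt: custom metadata records deploy with your other metadata and are visible to tests without "SeeAllData=true".

```apex
// Threshold_Setting__mdt and its field are hypothetical names.
// Tests can query custom metadata records without SeeAllData=true.
Threshold_Setting__mdt setting = [
    SELECT Minimum_Amount__c
    FROM Threshold_Setting__mdt
    WHERE DeveloperName = 'Default'
    LIMIT 1];
System.debug(setting.Minimum_Amount__c);
```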

Saturday, May 7, 2016

Pay Down That Technical Debt

Auntie Pat Tern says we should pay with cash, or at least pay our credit cards off every month because "debt," she says, "is like fresh fish, it seems great at first but gets old fast and then it really stinks."

It's Hammer Time!
So I looked at my Salesforce org to assess the level of technical debt we had accrued and to plan how we would start paying down that debt.

Sometimes technical debt comprises the shortcuts or mistakes that no one has time to correct when a project needs to be completed. Technical debt also develops over time.  As businesses mature and change, their technical solutions can fall behind and create technical debt.

A Salesforce org can be especially prone to this since Salesforce offers a wealth of new features for all orgs three times each year and not all companies make an effort to rewrite their technical solutions based on these new features. Luckily, Salesforce offers some tools to help us assess some of the technical debt associated with our code.

First, I checked our Hammer Test Status.  The data silo gauge revealed that we still had old (and sadly a few new) unit tests that were using "seealldata=true", which we needed to update.  It also revealed a couple of tests that were failing and so needed some attention.

Next I checked the API version on our Apex Classes.  Any class that is 10 or more versions behind the current API version needed a review, both on the code side and on the process side.
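One way to spot lagging classes is a quick SOQL query against the ApexClass setup object; the version threshold here is illustrative and should be set to ten below whatever API version is current when you run it:

```apex
// List Apex classes that are 10 or more versions behind (threshold is illustrative).
for (ApexClass cls : [
        SELECT Name, ApiVersion
        FROM ApexClass
        WHERE ApiVersion < 26.0
        ORDER BY ApiVersion]) {
    System.debug(cls.ApiVersion + ' - ' + cls.Name);
}
```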

The third step for me was reviewing our org's documentation for outdated information on our code.

My org review also included configuration, managed packages, and basic processes.

We may not be able to pay down all of our technical debt right away, but a monthly review and reminder of how it accumulates will help us develop better in the future.  And as Auntie Pat Tern says, "It's not just pay me now or pay me later, you know.  It's pay me now or pay me later with additional compounded interest!"