As someone who manages Salesforce, you may find yourself in a position where you need to loop over many records and take some kind of action. With the improvements to Flow Builder, you can now accomplish this fairly easily. However, there are still limits to Flow, and if you are a developer like myself, you will probably want to take advantage of a Salesforce batch class. To find out more about those limits, Salesforce Ben wrote an excellent article found here.
If you are new to the concept of batch handling in Salesforce, you can find details here. To summarize, Batch Apex is an asynchronous process that works through records in multiple chunks. You would use it when you have a large number of records, or lengthy processing, that would otherwise exceed system limits.
In your Apex class you have three methods: start, execute, and finish. Start is where you load your records. Execute is where your core logic runs over a subset of those records, in chunks of a size you specify (200 by default, up to 2,000 with a query locator). Finish is for post-processing and can be useful for identifying batch errors, sending emails, and writing exception logs. A minimal skeleton is sketched below.
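The object, the Is_Flagged__c filter, and the cleanup logic here are placeholders; the structure is what matters:

```apex
// A minimal batch skeleton: start defines the record set, execute handles one
// chunk at a time, finish runs once at the end.
public class AccountCleanupBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Load the records the batch will work through
        return Database.getQueryLocator(
            'SELECT Id, Name FROM Account WHERE Is_Flagged__c = true'
        );
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Core logic runs here against each chunk of records
        for (Account acct : scope) {
            acct.Name = acct.Name.trim();
        }
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        // Post-processing: error checks, emails, exception logs
        System.debug('Batch complete: ' + bc.getJobId());
    }
}
```

You kick it off with Database.executeBatch(new AccountCleanupBatch(), 100); where the second argument is the chunk size for each execute call.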
There are two add-ons to the batch class that I find useful. The first is Database.AllowsCallouts, which allows you to make HTTP requests from the execute method. The second is Database.Stateful, which holds on to instance variables populated in the execute method across chunks, so the accumulated information is still available in the finish method. The downside to Stateful is that it impacts the performance and processing time of the batch, so use it only when necessary.
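As a sketch of how the two fit together (again, the query and the running counter are just placeholders):

```apex
// Illustrative only: a batch that keeps a running count across chunks
// (Database.Stateful) and is allowed to make callouts (Database.AllowsCallouts).
public class StatefulCalloutBatch implements
        Database.Batchable<SObject>, Database.Stateful, Database.AllowsCallouts {

    // Retained between execute invocations because of Database.Stateful
    public Integer recordsProcessed = 0;

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // HTTP requests are permitted here because of Database.AllowsCallouts
        recordsProcessed += scope.size();
    }

    public void finish(Database.BatchableContext bc) {
        System.debug('Processed ' + recordsProcessed + ' records across all chunks.');
    }
}
```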
Below I’ve outlined two use-cases where I employed a Batch class to process large chunks of data.
Use Case A
The requirement was to update Account children with data from a parent when the parent was a particular record type. A quick query of the system showed that some parents had over 200 children, and some parents had grandchildren. Knowing this, I didn't even try Flow; I jumped right to Batch Apex. In the start I queried the children of the parent. Then I processed the field updates in chunks of 100 in the execute. I used 100 for my batch size because in the execute I was also making a secondary SOQL query: an aggregate query to capture any children who have children of their own. I put their IDs in an instance-level set, kept alive across chunks with Database.Stateful. In the finish I then looped over that set and called the same batch class for each child that holds grandchildren. I also sent an email to the SF Admin Group with specifics about the job. A simplified sketch of the pattern follows.
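The field mappings are reduced to a single hypothetical Some_Field__c here, and the record type filter is omitted:

```apex
// Illustrative sketch of Use Case A: copy parent data to children in chunks of 100,
// collect any children that are themselves parents, and chain the batch in finish.
public class ParentToChildSyncBatch implements Database.Batchable<SObject>, Database.Stateful {

    private final Id parentId;
    // Survives across chunks thanks to Database.Stateful
    private Set<Id> childrenWithChildren = new Set<Id>();

    public ParentToChildSyncBatch(Id parentId) {
        this.parentId = parentId;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Load the children of the parent
        return Database.getQueryLocator([
            SELECT Id, Some_Field__c, Parent.Some_Field__c
            FROM Account
            WHERE ParentId = :parentId
        ]);
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account child : scope) {
            child.Some_Field__c = child.Parent.Some_Field__c; // copy data down
        }
        update scope;

        // Secondary (aggregate) query: which of these children have children of their own?
        Set<Id> scopeIds = new Map<Id, Account>(scope).keySet();
        for (AggregateResult ar : [
            SELECT ParentId FROM Account WHERE ParentId IN :scopeIds GROUP BY ParentId
        ]) {
            childrenWithChildren.add((Id) ar.get('ParentId'));
        }
    }

    public void finish(Database.BatchableContext bc) {
        // Re-run the same batch for each child that holds grandchildren
        for (Id childId : childrenWithChildren) {
            Database.executeBatch(new ParentToChildSyncBatch(childId), 100);
        }
        // ...and email the SF Admin Group with the job specifics (omitted here)
    }
}
```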
The risk with this solution was that I might exceed another limit: the total number of batch jobs you can have queued at one time. To solve that, I used a solution from Jitendra Zaa, a leader in the Salesforce space. He developed a way to submit batch jobs through a custom class that checks the limits and then releases jobs as slots become available. This has saved me many times when working with large data volumes.
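I won't reproduce his solution here, but the core idea in a bare-bones form looks something like this (the class name and the missing park-and-release mechanics are placeholders; his version does considerably more):

```apex
// Bare-bones illustration only: check how many batch jobs are already queued or
// running, and only submit when one of the five concurrent slots is free.
public class BatchSubmitHelper {

    private static final Integer MAX_CONCURRENT_BATCHES = 5;

    public static Id submitIfRoom(Database.Batchable<SObject> job, Integer scopeSize) {
        Integer active = [
            SELECT COUNT()
            FROM AsyncApexJob
            WHERE JobType = 'BatchApex'
            AND Status IN ('Queued', 'Preparing', 'Processing')
        ];
        if (active < MAX_CONCURRENT_BATCHES) {
            return Database.executeBatch(job, scopeSize);
        }
        return null; // no slot free; a real solution would park the job and release it later
    }
}
```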
Use Case B
The requirement was to process pending orders for a monthly billing cycle. There are three payment methods: ACH, Invoice, and Credit Card. We needed to loop over the orders, process the records, and then email our clients that their billing statement was ready to view within the client experience, attaching a PDF where required. This is a monitored process, kicked off manually by someone on the Operations / Finance team.
To start, I built a Lightning app to deliver the LWC that triggers the processing. This LWC presents options to the user, specifically which orders to process based on payment type. Once the validations pass, a batch job is submitted. One neat feature of this LWC is that it allows the user to perform a "dry run" of the batch, so the data can be audited before it is processed. The dry run option generates Excel spreadsheets of the orders, broken down by payment type. This lets the Finance team audit expected revenues and gives the Operations team a final pre-check for any data that seems out of whack.
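On the Apex side, the controller behind the LWC can be as small as a single @AuraEnabled method. The names below (BillingRunController, OrderBillingBatch, the dry-run flag) are illustrative, not the production code:

```apex
// Illustrative controller called by the LWC: take the user's choices and submit the batch.
public with sharing class BillingRunController {

    @AuraEnabled
    public static Id startBillingRun(String paymentType, Boolean dryRun) {
        // dryRun = true builds the audit spreadsheets only; false stages the orders
        return Database.executeBatch(new OrderBillingBatch(paymentType, dryRun), 100);
    }
}
```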
The batch start gathers the orders based on what it knows about the associated Account. In the execute we grab the accounts for the batch of orders to get the data needed to process the payment. We also need to know whom we are sending the notice to, so we pull Account Contact Relationships with a specific role; the order holds the billing information, but in some instances there is more than one recipient of the notice. Once we have that data, we process each order accordingly. This is mostly staging data for the final processing step; for instance, for Credit Card orders we insert a record into the credit card handler object provided by the vendor. We also generate the PDF attachments and build a list of emails to send at the end of the execute.
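A trimmed-down sketch of that batch is below. Every custom name here (Payment_Type__c, the 'Billing Contact' role, the omitted vendor handler object) stands in for the real ones:

```apex
// Illustrative sketch of the order-processing batch from Use Case B.
public class OrderBillingBatch implements Database.Batchable<SObject> {

    private final String paymentType;
    private final Boolean dryRun;

    public OrderBillingBatch(String paymentType, Boolean dryRun) {
        this.paymentType = paymentType;
        this.dryRun = dryRun;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Gather the pending orders for the requested payment type
        return Database.getQueryLocator([
            SELECT Id, AccountId, TotalAmount, Payment_Type__c
            FROM Order
            WHERE Status = 'Pending' AND Payment_Type__c = :paymentType
        ]);
    }

    public void execute(Database.BatchableContext bc, List<Order> scope) {
        // Pull the billing contacts for the accounts in this chunk
        Set<Id> accountIds = new Set<Id>();
        for (Order ord : scope) {
            accountIds.add(ord.AccountId);
        }
        Map<Id, String> billingEmailByAccount = new Map<Id, String>();
        for (AccountContactRelation acr : [
            SELECT AccountId, Contact.Email
            FROM AccountContactRelation
            WHERE AccountId IN :accountIds AND Roles INCLUDES ('Billing Contact')
        ]) {
            billingEmailByAccount.put(acr.AccountId, acr.Contact.Email);
        }

        if (dryRun) {
            return; // dry run: audit output only, nothing is staged or emailed
        }

        List<Messaging.SingleEmailMessage> notices = new List<Messaging.SingleEmailMessage>();
        for (Order ord : scope) {
            // Stage the order for final processing -- for Credit Card orders this is
            // where a record would go into the vendor's handler object -- and
            // generate the PDF statement (both omitted in this sketch).
            if (billingEmailByAccount.containsKey(ord.AccountId)) {
                Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
                mail.setToAddresses(new List<String>{ billingEmailByAccount.get(ord.AccountId) });
                mail.setSubject('Your billing statement is ready to view');
                mail.setPlainTextBody('Your statement for order ' + ord.Id + ' is now available.');
                notices.add(mail);
            }
        }
        if (!notices.isEmpty()) {
            Messaging.sendEmail(notices); // sent at the end of each execute chunk
        }
    }

    public void finish(Database.BatchableContext bc) {
        // Exception logs and originator notification go here (see the next sketch)
    }
}
```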
In the finish we write exception logs and notify the originator with the specifics of the batch.
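Continuing the sketch above, a finish along these lines would cover both; the commented-out exception log insert stands in for whatever logging object you use:

```apex
// Illustrative finish: write any collected exception logs and email the job's
// originator with the results pulled from AsyncApexJob.
public void finish(Database.BatchableContext bc) {
    AsyncApexJob job = [
        SELECT Status, NumberOfErrors, TotalJobItems, CreatedBy.Email
        FROM AsyncApexJob
        WHERE Id = :bc.getJobId()
    ];

    // insert exceptionLogs; // records collected during execute (placeholder)

    Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
    mail.setToAddresses(new List<String>{ job.CreatedBy.Email });
    mail.setSubject('Billing batch finished: ' + job.Status);
    mail.setPlainTextBody(job.TotalJobItems + ' chunks processed, '
        + job.NumberOfErrors + ' with errors.');
    Messaging.sendEmail(new List<Messaging.SingleEmailMessage>{ mail });
}
```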
Once the second batch is completed, the user can then submit the records to their final destination, based on payment type. Each payment type has an API that is used to send the data to the host for payment processing and tracking. The reason we don't automatically send the data for processing is that the Finance team wanted to monitor and control which payment types are processed at a particular time of day. The job also allows them to add records that may have been worked on since the last processing run.
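That submission step boils down to a callout per payment type. A generic sketch, with the endpoint, payload shape, and class name all placeholders:

```apex
// Generic illustration of sending a staged order to a payment host's API.
public class PaymentSubmissionService {

    public static void submitOrder(Id orderId, String endpoint) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(endpoint); // e.g. a Named Credential-based URL
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, Object>{ 'orderId' => orderId }));

        HttpResponse res = new Http().send(req);
        if (res.getStatusCode() != 200) {
            // Surface the failure so the Finance / Operations team can follow up
            System.debug(LoggingLevel.ERROR, 'Payment submission failed: ' + res.getBody());
        }
    }
}
```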
If you have a use-case where you think Batch Apex might be a solution, leave a comment and let’s discuss the project!