Platform Cache

Introduction: – In this blog post I am going to explain "Platform Cache" and how you can use it. There are many ways to make your pages run faster, such as custom settings, view-state reduction techniques, and so on. But with Platform Cache, you can store Salesforce session and org data for later access, and applications can run faster because they store reusable data in memory.

How does Platform Cache work?
Platform Cache uses a local cache and a least recently used (LRU) algorithm to improve performance. The local cache is the application server's in-memory container that the client interacts with during a request. Cache operations don't interact with the caching layer directly, but instead interact with the local cache.

For session cache, all cached items are loaded into the local cache upon the first request, and all subsequent interactions use the local cache. Similarly, an org cache get operation retrieves a value from the caching layer and stores it in the local cache; subsequent requests for this value are served from the local cache.

Platform Cache uses an LRU algorithm to evict keys from the cache. When cache limits are reached, keys are evicted until the cache is reduced to 100% of capacity. If session cache is used, the system removes cache evenly from all existing session cache instances. The local cache also uses an LRU algorithm: when the maximum local cache size for a partition is reached, the least recently used items are evicted from the local cache.

Types of Platform Cache:-

Platform Cache supports two types of caches:

  • Session cache—Stores data for individual user sessions. For example, in an app that finds customers within specified territories, the calculations that run while users browse different locations on a map are reused.

    Session cache lives alongside a user session. The maximum life of a session is eight hours. Session cache expires when its specified time-to-live (ttlsecs value) is reached or when the session expires after eight hours, whichever comes first.

  • Org cache—Stores data that any user in an org reuses. For example, the contents of navigation bars that dynamically display menu items based on user profile are reused.

    Unlike session cache, the org cache is accessible across sessions, requests, and org users and profiles. Org cache expires when its specified time-to-live (ttlsecs value) is reached.
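The time-to-live described above can be set explicitly through the three-argument put overload; a minimal sketch, assuming a partition named "partition" and illustrative key names:

```apex
// Cache a session value with a 300-second time-to-live
Cache.Session.put('local.partition.key', 'cachedValue', 300);

// Cache an org value with a one-hour time-to-live
Cache.Org.put('local.partition.key', 'cachedValue', 3600);
```

If the ttlSecs argument is omitted, the value lives until it is evicted or, for session cache, until the session expires.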

 

Distribute the cache with Partitions:-

Partitions allow you to improve performance by distributing cache space in the way that works best for your applications. After setting up partitions, you can add, access, and remove data from them using the Platform Cache Apex API. To use Platform Cache, create at least one partition. Each partition has one session cache and one org cache segment, and you can allocate separate capacity to each segment. Session cache can be used to store data for individual user sessions, while org cache is for data that any user in the org can access. You can distribute your org's cache space across any number of partitions. Session and org cache allocations can be zero, or five MB or greater, and they must be whole numbers. The sum of all partition allocations, including the default partition, equals the Platform Cache total allocation. The total allocated capacity of all cache segments must be less than or equal to the org's overall capacity.

After you set up partitions, you can use Apex code to perform cache operations on a partition. For example, use the Cache.SessionPartition and Cache.OrgPartition classes to put, retrieve, or remove values from a specific partition’s cache. Use Cache.Session and Cache.Org to get a partition or perform cache operations by using a fully qualified key.

To access the Partition tool in Setup, enter Platform Cache in the Quick Find box, then select Platform Cache. Click New Platform Cache Partition. Each partition has a session cache and an org cache segment. Enter the partition name and label, for example "partition".

[Screenshot: Platform Cache partition setup]

Handling Session cache
Use the Cache.Session and Cache.SessionPartition classes to manage values in the session cache. To manage values in any partition, use the methods in the Cache.Session class. If you’re managing cache values in one partition, use the Cache.SessionPartition methods instead.


// Add a value to the cache. 'local' is the default namespace; 'partition' is the partition name
Cache.Session.put('local.partition.key', '1234567');
if (Cache.Session.contains('local.partition.key')) {
    String key = (String)Cache.Session.get('local.partition.key');
}
// Default cache partition: when no namespace and partition are given, the default partition is used
Cache.Session.put('key', '123456');
if (Cache.Session.contains('key')) {
    String key = (String)Cache.Session.get('key');
}

// Get a cached value
String val = (String)Cache.Session.get('local.partition.key');

If you’re managing cache values in one partition, use the Cache.SessionPartition methods instead. After the partition object is obtained, the process of adding and retrieving cache values is similar to using the Cache.Session methods. The Cache.SessionPartition methods are easier to use because you specify only the key name without the namespace and partition prefix.


// Get the partition
Cache.SessionPartition sessionPart = Cache.Session.getPartition('local.partition');
// Retrieve a cached value from the partition
if (sessionPart.contains('key')) {
    String cachedTitle = (String)sessionPart.get('key');
}
// Add a cache value to the partition
sessionPart.put('key', 'welcome');

Handling Org Cache

Use the Cache.Org and Cache.OrgPartition classes to manage values in the org cache. To manage values in any partition, use the methods in the Cache.Org class. If you're managing cache values in one partition, use the Cache.OrgPartition methods instead.


// Add a value to the cache

Cache.Org.put('local.partition.key', 'Hello ');
if (Cache.Org.contains('local.partition.key')) {
    String key = (String)Cache.Org.get('local.partition.key');
}

If you’re managing cache values in one partition, use the Cache.OrgPartition methods instead. After the partition object is obtained, the process of adding and retrieving cache values is similar to using the Cache.Org methods. The Cache.OrgPartition methods are easier to use because you specify only the key name without the namespace and partition prefix.


// Get partition
Cache.OrgPartition orgPart = Cache.Org.getPartition('local.partition');
// Retrieve cache value from the partition
if (orgPart.contains('key')) {
    String key = (String)orgPart.get('key');
}
// Add a cache value to the partition
orgPart.put('key', 'welcome');

Diagnose Platform Cache:-
Cache diagnostics provide information about how much cache is used. The Diagnostics page provides valuable information, including the capacity usage, keys, and serialized and compressed sizes of the cached items. The session cache and org cache have separate diagnostics pages. The session cache diagnostics are per session, and they don't provide insight across all active sessions.

Here is simple code that stores and retrieves currency conversion exchange rates from a web service. It checks whether the key is present in the cache; if so, it retrieves the value from Platform Cache, otherwise it makes the callout and stores the result in the cache, so that you don't need to make a web service call every time to get real-time conversion rates.


public class PlatformCacheController {
    Cache.OrgPartition orgPart;

    public PlatformCacheController() {
        orgPart = Cache.Org.getPartition('local.partition');
        // Example: http://apilayer.net/api/live?access_key=<>&currencies=EUR,GBP,CAD,PLN&source=USD&format=1
    }

    public String fetchData(String fromCurrency, String toCurrency) {
        String keytoStoreorRet = fromCurrency + toCurrency;
        if (checkKeyInCache(keytoStoreorRet)) {
            return (String) orgPart.get(keytoStoreorRet);
        } else {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('http://apilayer.net/api/live?access_key=456fab5d3bee967f81169416e234387e&currencies=' + toCurrency + '&source=' + fromCurrency + '&format=1');
            req.setMethod('GET');
            Http h = new Http();
            HttpResponse resp = h.send(req);

            // Store the body as a String so it can be cast back to String on retrieval
            orgPart.put(keytoStoreorRet, resp.getBody());
            return resp.getBody();
        }
    }

    public void updateKeyinCache(String key, String values) {
        if (!checkKeyInCache(key)) {
            orgPart.put(key, values);
        }
    }

    public Boolean checkKeyInCache(String key) {
        return orgPart.contains(key);
    }
}

 

Considerations of platform cache and best practices 

  • Cache isn’t persisted. and there is no guaranty on data lost.Make sure you should handle the data loss properly. You can use CacheBuilder to handle the data losses.
  • Decided what type of data access you need like Concurrent vs serial retrieve and update. Org cache supports concurrent reads and writes across multiple simultaneous Apex transactions
  • think how to handle cache misses like Using CacheBuiler or you can use your own retrieval or update cache handling
  • Not all the data need to be stored in the cache. Including more data in the cache may impact performance. In case if you need to store the bulk data, split and store into multiple keys
  • Use the cache to store static data or data that doesn’t change often rather than changing the data very often
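The CacheBuilder approach mentioned above can be sketched as follows; the class name and SOQL query are illustrative assumptions, not from the original post. The doLoad method runs only on a cache miss, so evicted or stale values are reloaded transparently:

```apex
// A minimal CacheBuilder sketch; class name and query are our own choices
public class UserInfoCache implements Cache.CacheBuilder {
    // Called automatically on a cache miss; the return value is stored under the key
    public Object doLoad(String userId) {
        return [SELECT Id, Name, IsActive FROM User WHERE Id = :userId];
    }
}
```

Retrieval then goes through the builder instead of a plain get, for example `User u = (User) Cache.Org.get(UserInfoCache.class, someUserId);` — if the value isn't cached, doLoad runs and its result is cached automatically.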


Platform Events in Salesforce

In this blog post, I am going to explain platform events, a new feature generally available from the Summer '17 release as part of the "Enterprise Messaging Platform," which provides an event-driven architecture.

Let’s talk about Event Driven Architecture 

Salesforce's event-driven architecture consists of event producers, event consumers, and channels. Platform events simplify the process of communicating changes and responding to events. Publishers and subscribers communicate with each other through events, and one or more subscribers can listen to the same event and carry out actions. With an event-driven architecture, each service publishes an event whenever it updates or creates data, and other services can subscribe to those events. This enables an application to maintain data consistency across multiple services without using distributed transactions. Let us take the example of order management. The order management app creates an order in a pending state and publishes an OrderCreated event. The customer service receives the event and attempts to process the order; it then publishes an OrderUpdate event. The OrderUpdate service receives that event and changes the state of the order to approved, canceled, or fulfilled. The following diagram shows the event-driven architecture.

[Diagram: event-driven architecture with producers, channel, and consumers]

Terminology : –
Event
A change in state that is meaningful in a business process. For example, the placement of an order is a meaningful event, because the order fulfillment center requires notification to process the order.
Event Notifier 
A message that contains data about the event. Also known as an event notification.
Event producer
The publisher of an event message over a channel.
Channel
A conduit in which an event producer transmits a message. Event consumers subscribe to the channel to receive messages.
Event consumer
A subscriber to a channel that receives messages from the channel.

Looks like Streaming API, but really not

At first glance, platform events look similar to the Streaming API, and they share most features, including ReplayId and durability, but the points below distinguish platform events from the Streaming API.

  • Platform events are a special kind of entity, similar to a custom object.
  • You can publish and consume platform events by using Apex, the REST API, or the SOAP API. Platform events integrate with the Salesforce platform through Apex triggers. Triggers are the event consumers on the Salesforce platform that listen to event messages. Unlike custom objects, you can't update or delete event records. You also can't view event records in the Salesforce user interface, and platform events don't have page layouts. When you delete a platform event definition, it's permanently deleted.
  • Platform events may be published using declarative tools (Process Builder).
  • Platform events can also be subscribed to using Apex, or declaratively with Process Builder and flows.

Another major, and really impressive, feature is that you can publish changes from an Apex trigger and also consume them from an Apex trigger.

Publishing and subscribing Platform events 

Publishing and subscribing to platform events is quite flexible. You can publish event messages from a Force.com app or an external app using Apex or the Salesforce APIs, and you can subscribe from Salesforce or external apps, or use long polling with CometD.

Let’s take an Example:- 

Now I am going to explain step by step how to set up, publish, and consume events, using an employee onboarding process as the example. An external app publishes an event, and we create an account in response; then, when Salesforce publishes an onboarding event, another system receives it.

Step 1: – Define a Platform Event
You can define a platform event much like a custom object: go to Setup –> Develop –> Platform Events –> create a new platform event, as shown below.

[Screenshot: defining a platform event in Setup]

At first glance it looks like a custom object, but here are a few major considerations.

  • The __e suffix is appended to the API name of a platform event.
  • You can't query platform events through SOQL or SOSL.
  • You can't use platform events in reports, list views, or search.
  • Published platform events can't be rolled back.
  • Some sObject methods aren't supported with platform events.
  • All platform event fields are read-only by default.
  • Platform events don't have an associated tab.
  • Only after insert triggers are supported.
  • You can access platform events both through the API and declaratively.
  • You can control platform events through profiles and permission sets.

Step 2: – Publishing Platform Events

You can publish events using an Apex method, with declarative tools such as Process Builder or the Cloud Flow Designer, or by using the Salesforce APIs. We are going to see all of these ways to publish platform events.

 Publish Using Apex:-
You can publish platform events from an Apex trigger, execute anonymous, batch Apex, and so on, but here I am going to publish by using an Apex trigger. A trigger processes platform event notifications sequentially in the order they're received; the trigger runs in its own process asynchronously and isn't part of the transaction that published the event.

trigger PlatformEventPublish on Account (after insert, after update) {

    if (Trigger.isAfter && Trigger.isUpdate) {
        List<Employee_On_boarding__e> publishEvents = new List<Employee_On_boarding__e>();
        for (Account a : Trigger.new) {
            Employee_On_boarding__e eve = new Employee_On_boarding__e();
            eve.Name__c = a.Name;
            eve.Phone__c = a.Phone;
            eve.Salary__c = a.AnnualRevenue;
            publishEvents.add(eve);
        }
        if (publishEvents.size() > 0) {
            EventBus.publish(publishEvents);
        }
    }

}

As you can see, Salesforce provides a special class, EventBus, with publish methods for publishing platform events. Once an event is published, you can consume it from the channel.
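Since EventBus.publish returns Database.SaveResult objects, it is worth checking them after publishing. A minimal sketch, reusing the Employee_On_boarding__e event from Step 1 with illustrative field values:

```apex
// Publish a single event and inspect the result (field values are illustrative)
Employee_On_boarding__e eve = new Employee_On_boarding__e(Name__c = 'Test User', Phone__c = '555-0100');
Database.SaveResult sr = EventBus.publish(eve);
if (sr.isSuccess()) {
    System.debug('Event published successfully.');
} else {
    for (Database.Error err : sr.getErrors()) {
        System.debug('Error publishing: ' + err.getStatusCode() + ' - ' + err.getMessage());
    }
}
```

Note that a successful result only means the event was accepted by the event bus, not that every subscriber has processed it.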

Publish Using Process Builder

You can publish platform events using declarative tools like Process Builder and flows. The image below shows a platform event being inserted by using Process Builder.

[Screenshot: publishing a platform event from Process Builder]

Publish Events by Using the API
Now let's see another way of publishing events: through the API. I am going to use Workbench to publish the events.
[Screenshot: publishing a platform event from Workbench]

 

Step 3: – Subscribe to Platform Events from the Channel
You can now subscribe to platform events with a trigger on the platform event object created in Step 1. Here is a sample trigger showing how to handle the subscribed events. Here I am simply creating new accounts from the platform event, but you can implement your own business logic.

trigger OnBoaringTrigger on Employee_On_boarding__e (after insert) {
    List<Account> acc = new List<Account>();
    for (Employee_On_boarding__e oBording : Trigger.new) {
        acc.add(new Account(Name = oBording.Name__c, Phone = oBording.Phone__c, AnnualRevenue = oBording.Salary__c));
    }
    if (acc.size() > 0) {
        insert acc;
    }
}

Here is a simple Visualforce page that consumes the platform events you published. This page is built on CometD.
You can consume the platform events by using the URI /event/Employee_On_boarding__e; the complete code is below.

<apex:page standardStylesheets="false" showHeader="false" sidebar="false">
    <apex:includeScript value="{!$Resource.cometd}"/>
    <apex:includeScript value="{!$Resource.jquery}"/>
    <apex:includeScript value="{!$Resource.json2}"/>
    <apex:includeScript value="{!$Resource.jquery_cometd}"/>
    <script type="text/javascript">
        (function($) {
            $(document).ready(function() {
                $.cometd.configure({
                    url: window.location.protocol + '//' + window.location.hostname +
                         (null != window.location.port ? (':' + window.location.port) : '') + '/cometd/40.0/',
                    requestHeaders: { Authorization: 'OAuth {!$Api.Session_ID}' }
                });
                $.cometd.handshake();
                $.cometd.addListener('/meta/handshake', function(message) {
                    $.cometd.subscribe('/event/Employee_On_boarding__e', function(message) {
                        var div = document.getElementById('content');
                        div.innerHTML = div.innerHTML + '<p>Notification: Streaming Message ' +
                            JSON.stringify(message) + '</p>';
                    });
                });
            });
        })(jQuery);
    </script>
    <div id="content"></div>
</apex:page>

Key points:-
1. Platform events are executed under the "Automated Process" entity, so you have to select Automated Process in your debug logs.
2. You can control platform events through profiles and permission sets.
3. You can see all the subscribers of a platform event listed under the platform event object.
[Screenshot: subscriptions listed on the platform event definition]
4. Platform event subscriptions have lifecycle states such as Running, Idle, Suspended, Error, and Expired.
5. Platform event triggers have a retry mechanism.

if (EventBus.TriggerContext.currentContext().retries < 4) {
    // Condition isn't met, so try again later.
    throw new EventBus.RetryableException(
        'Condition is not met, so retrying the trigger again.');
} else {
    // Trigger was retried enough times, so give up and
    // resort to an alternative action.
    // For example, send an email to the user.
}


Salesforce Apex Convert Lead

In many business cases, you may want to control the lead conversion process through an Apex trigger, or you may need to build your own lead conversion process. In this post, we are going to see how to control lead conversion from an Apex trigger.

What happens when a lead is converted?

  • A contact, account, and opportunity are created and populated with the lead's data.
  • The lead field "Converted" is changed from False to True.
  • The data within standard lead fields is automatically transferred to the contact, account, and/or opportunity. For custom fields, use lead mapping.

In general, if you want to perform any custom logic on lead conversion, you will go for Apex lead conversion. Here are some of the use cases:
1. Checking account industry and SIC mapping against your company's mapping.
2. Auto-subscribing a specific user to the converted record for the Chatter feed.
3. Sending new or updated account information to an ERP system, etc.

Here is a simple job that converts leads. You can schedule this class to perform automatic lead conversion.
global class LeadConversionJob implements Schedulable {
    global void execute(SchedulableContext sc) {
        // Skip leads that are already converted; converting them again would fail
        List<Lead> leads = [SELECT Id, Name, Company FROM Lead
                            WHERE LeadSource = 'Web' AND IsConverted = false];

        List<Database.LeadConvert> lcList = new List<Database.LeadConvert>();
        for (Lead l : leads) {
            Database.LeadConvert lc = new Database.LeadConvert();
            lc.setLeadId(l.Id);
            lc.setConvertedStatus('Closed - Converted');
            lcList.add(lc);
        }
        List<Database.LeadConvertResult> lcr = Database.convertLead(lcList);
        for (Database.LeadConvertResult lcrRes : lcr) {
            if (lcrRes.isSuccess()) {
                // conversion succeeded
            } else {
                // error handling here
            }
        }
    }
}
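To schedule the job above, for example at the top of every hour, you can run something like the following from Execute Anonymous; the job name and cron expression are arbitrary choices:

```apex
// Salesforce cron format: Seconds Minutes Hours Day-of-month Month Day-of-week
// '0 0 * * * ?' fires at minute 0 of every hour
System.schedule('Hourly Lead Conversion', '0 0 * * * ?', new LeadConversionJob());
```

You can monitor or abort the scheduled job under Setup –> Scheduled Jobs.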

If you would like to perform operations after a lead is converted, or to implement other business requirements, you can write an Apex trigger as shown below.

Apex Trigger:-

 trigger LeadConversion on Lead (after update) {

    LeadConversionHandler.convertLead(Trigger.newMap , Trigger.oldMap);

}

Apex Class

public class LeadConversionHandler {

    public static void convertLead(Map<Id, Lead> newLead, Map<Id, Lead> oldLead) {
        for (Id idNew : newLead.keySet()) {
            if (newLead.get(idNew).IsConverted == true && oldLead.get(idNew).IsConverted == false) {
                if (newLead.get(idNew).ConvertedAccountId != null) {
                    // Use your logic here
                    Account a = [SELECT Id, Description FROM Account
                                 WHERE Id = :newLead.get(idNew).ConvertedAccountId];
                }
                if (newLead.get(idNew).ConvertedContactId != null) {
                    // Use your logic here
                    Contact c = [SELECT Id, Description, Name FROM Contact
                                 WHERE Id = :newLead.get(idNew).ConvertedContactId];
                }
                if (newLead.get(idNew).ConvertedOpportunityId != null) {
                    // Use your logic here
                    Opportunity opp = [SELECT Id, Description FROM Opportunity
                                       WHERE Id = :newLead.get(idNew).ConvertedOpportunityId];
                }
            }
        }
    }

}


Send SMS from Chatter Feed with Action Link Template

In this blog, I am going to explain how to send SMS from the chatter feed by using action link templates. I am going to use the Twilio REST API to send the SMS. Please refer to this link for more information on action link templates: Inside Action Link Templates.

Step 1: –   Defining Action Link Group Templates

Go to Setup, enter Action Link Templates in the Quick Find box, then select Action Link Templates and create a new one with the values below.

Name: Send Text Messages
Developer Name: Send_Text_Messages
Category: Primary action
Executions Allowed: Once
Hours until Expiration: (leave blank)

[Screenshot: action link group template]

 

Step 2: – Creating an Action Link Template

Create a new action link template under the action link group template created above, with the following details.

Action Link Group Template: Send Text Messages
Action Type: Api Async
Action URL: https://api.twilio.com/2010-04-01/Accounts/{!Bindings.accountSID}/SMS/Messages.json
User Visibility: Everyone can see
HTTP Request Body: Body={!Bindings.body}&To={!Bindings.toNumber}&From={!Bindings.fromNumber}
HTTP Headers: Authorization: Basic {!Bindings.authToken}
Host: {!Bindings.host}
Content-Length: {!Bindings.lenght}
X-Target-URI: {!Bindings.uri}
Content-Type: {!Bindings.conenttype}
Position: 0
Label Key: None
Label: Text Message
HTTP Method: POST

After saving, the action link template looks as shown below.

[Screenshot: action link template]

Go back to the action link group template and publish it.

Step 3: – Posting it to the chatter feed

Now I am going to use Apex to instantiate the action link template and post it to chatter.

Go to the Developer Console and run the code below.

 

// Query the action link group template Id
ActionLinkGroupTemplate template = [SELECT Id FROM ActionLinkGroupTemplate
                                    WHERE DeveloperName = 'Send_Text_Messages'];
// Map of binding variables
Map<String, String> bindingMap = new Map<String, String>();
bindingMap.put('accountSID', '<Your Account SID>');
bindingMap.put('body', 'testing');
bindingMap.put('toNumber', '2159155090');
bindingMap.put('fromNumber', '2674600419');
bindingMap.put('authToken', 'Auth token ==');
bindingMap.put('host', 'api.twilio.com');
bindingMap.put('lenght', '113');
bindingMap.put('uri', 'https://api.twilio.com');
bindingMap.put('conenttype', 'application/x-www-form-urlencoded');

List<ConnectApi.ActionLinkTemplateBindingInput> bindingInputs = new List<ConnectApi.ActionLinkTemplateBindingInput>();
for (String key : bindingMap.keySet()) {
    ConnectApi.ActionLinkTemplateBindingInput bindingInput = new ConnectApi.ActionLinkTemplateBindingInput();
    bindingInput.key = key;
    bindingInput.value = bindingMap.get(key);
    bindingInputs.add(bindingInput);
}

// Action link group definition
ConnectApi.ActionLinkGroupDefinitionInput actionLinkGroupDefinitionInput = new ConnectApi.ActionLinkGroupDefinitionInput();
actionLinkGroupDefinitionInput.templateId = template.Id;
actionLinkGroupDefinitionInput.templateBindings = bindingInputs;
ConnectApi.ActionLinkGroupDefinition actionLinkGroupDefinition =
    ConnectApi.ActionLinks.createActionLinkGroupDefinition(Network.getNetworkId(), actionLinkGroupDefinitionInput);

// Build a feed item that carries the action link group
ConnectApi.FeedItemInput feedItemInput = new ConnectApi.FeedItemInput();
ConnectApi.FeedElementCapabilitiesInput feedElementCapabilitiesInput = new ConnectApi.FeedElementCapabilitiesInput();
ConnectApi.AssociatedActionsCapabilityInput associatedActionsCapabilityInput = new ConnectApi.AssociatedActionsCapabilityInput();
ConnectApi.MessageBodyInput messageBodyInput = new ConnectApi.MessageBodyInput();
ConnectApi.TextSegmentInput textSegmentInput = new ConnectApi.TextSegmentInput();
feedItemInput.body = messageBodyInput;
feedItemInput.capabilities = feedElementCapabilitiesInput;
feedItemInput.subjectId = 'me';
messageBodyInput.messageSegments = new List<ConnectApi.MessageSegmentInput>();
textSegmentInput.text = 'Click to post a feed item.';
messageBodyInput.messageSegments.add(textSegmentInput);

feedElementCapabilitiesInput.associatedActions = associatedActionsCapabilityInput;
associatedActionsCapabilityInput.actionLinkGroupIds = new List<String>();
associatedActionsCapabilityInput.actionLinkGroupIds.add(actionLinkGroupDefinition.id);

ConnectApi.FeedElement feedElement =
    ConnectApi.ChatterFeeds.postFeedElement(Network.getNetworkId(), feedItemInput);

 

Now you can see the new action link on the chatter feed, as shown below.

[Screenshot: chatter feed with the Text Message action]

Once you click on Text Message, it sends an SMS and also posts the status back to the chatter feed.

 

Handling “MIXED_DML_OPERATION” with future calls

You can run into the "MIXED_DML_OPERATION" error when you try to perform DML on setup and non-setup objects in the same transaction. This restriction exists because some sObjects affect the user's access to records in the org. Non-setup objects are standard objects, like Account, or any custom object. Here are a few setup objects:
  • ObjectPermissions
  • PermissionSet
  • PermissionSetAssignment
  • QueueSObject
  • ObjectTerritory2AssignmentRule
  • ObjectTerritory2AssignmentRuleItem
  • RuleTerritory2Association
  • SetupEntityAccess
  • Territory2
  • Territory2Model
  • UserTerritory2Association
  • User
  • GroupMember
  • FieldPermissions

Use case: – Once a "Customer Satisfaction Survey" is completed, we need to assign it to the "Internal_Employee" group. Here is a simple trigger.


trigger ServeyTrigger on Customer_Satisfaction_Survey__c (after insert) {
    Group g = [SELECT Id FROM Group WHERE DeveloperName = 'Internal_Employee'];

    List<GroupMember> listGroupMember = new List<GroupMember>();
    for (Customer_Satisfaction_Survey__c cs : Trigger.new) {
        GroupMember gm = new GroupMember();
        gm.GroupId = g.Id;
        gm.UserOrGroupId = UserInfo.getUserId();
        listGroupMember.add(gm);
    }
    insert listGroupMember;

}

 

Once you try to insert new records, you will get the error message below.

Review all error messages below to correct your data.
Apex trigger ServeyTrigger caused an unexpected exception, contact your administrator: ServeyTrigger: execution of AfterInsert caused by: System.DmlException: Insert failed. First exception on row 0; first error: MIXED_DML_OPERATION, DML operation on setup object is not permitted after you have updated a non-setup object (or vice versa): GroupMember, original object: Customer_Satisfaction_Survey__c: []: Trigger.ServeyTrigger: line 11, column 1

To solve this problem, you can simply move the second DML operation into a future call:

  1. Create a method that performs a DML operation on one type of sObject.
  2. Create a second method that uses the future annotation to manipulate a second sObject type.

Now update the trigger as shown below

trigger ServeyTrigger on Customer_Satisfaction_Survey__c (after insert) {
  ServeyAsyncCall.assignToUser(trigger.newMap.keySet());
}

Apex Class:

 

public class ServeyAsyncCall {
    @future
    public static void assignToUser(Set<Id> setofIds) {
        Group g = [SELECT Id FROM Group WHERE DeveloperName = 'Internal_Employee'];
        List<Customer_Satisfaction_Survey__c> records = [SELECT Id, Name FROM Customer_Satisfaction_Survey__c
                                                         WHERE Id IN :setofIds];
        List<GroupMember> listGroupMember = new List<GroupMember>();
        for (Customer_Satisfaction_Survey__c cs : records) {
            GroupMember gm = new GroupMember();
            gm.GroupId = g.Id;
            gm.UserOrGroupId = UserInfo.getUserId();
            listGroupMember.add(gm);
        }
        if (!Test.isRunningTest()) {
            insert listGroupMember;
        }
    }
}

Apex Test Class:- 

@isTest
private class ServeyAsyncCall_test {

    private static testMethod  void bypassMixedDML(){
        // You could insert a user with @future for this test method, e.g. InsertUser.callInsertfuture(),
        // but in this test class I simply query an existing user.
        User thisUser = [SELECT Id FROM User WHERE Id = :UserInfo.getUserId()];
        System.runAs (thisUser) {

            Customer_Satisfaction_Survey__c cs = new Customer_Satisfaction_Survey__c();
            cs.Comments__c='Hello' ;
            cs.Name='Test Class';
            insert cs ; 

        }
    }
}

Asynchronous calls with @future

In this blog, I am going to show how to make asynchronous calls with the @future annotation. With the future annotation, you can make an asynchronous web service callout to an external service. A future method runs in the background, asynchronously. You can call a future method to execute long-running operations, such as callouts to external web services, or any operation you'd like to run in its own thread, on its own time.

What is Asynchronous Process in Salesforce 

An asynchronous process is a process or function that does not require interaction with a user. It can be used to execute a task "in the background," without the user having to wait for the task to finish. Force.com features such as asynchronous Apex (@future), Batch Apex, the Bulk API, reports, and other features use asynchronous processing to efficiently process requests. Each future method is queued and executes when system resources become available; that way, the execution of your code doesn't have to wait for the completion of a long-running operation. A benefit of using future methods is that some governor limits are higher, such as SOQL query limits and heap size limits.
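The queuing behavior described above can be sketched as follows; the class and method names are our own, not from the original post. One common gotcha is that a future method can't be invoked from a context that is already asynchronous, which the guard below checks for:

```apex
// A minimal @future sketch (class and method names are illustrative)
public class AsyncDemo {
    @future
    public static void doWorkAsync(Set<Id> recordIds) {
        // Runs later, in its own transaction, when system resources are available
        System.debug('Processing ' + recordIds.size() + ' records asynchronously');
    }

    public static void doWork(Set<Id> recordIds) {
        // Calling a future method from a future or batch context throws an exception
        if (!System.isFuture() && !System.isBatch()) {
            doWorkAsync(recordIds);
        }
    }
}
```

Future method parameters must be primitives or collections of primitives, which is why a Set of Ids is passed rather than the sObjects themselves.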

Use Case: – Upon creation of an Account record, you need to validate the account's billing address and update its latitude and longitude values.

Solution: – Create an Account trigger on after insert. Invoke an asynchronous Web service call with the @future annotation to validate the address against the Google Geocoding API and return the latitude and longitude.

Trigger :- 

trigger AccountTriger on Account (after insert) {

    // For simplicity this handles only the first record; a bulk insert
    // would need a bulkified design (e.g. pass a list of record Ids).
    AccountUpdateAsync.updateAccountAddress(Trigger.new[0].Id);

}

Apex Class:-

public class AccountUpdateAsync {
    @future(callout=true)
    public static void updateAccountAddress(String accId){
        Account acc = [Select Id, Name, Location__Latitude__s, Location__Longitude__s,
                       BillingStreet, BillingCity, BillingState, BillingPostalCode,
                       BillingCountry from Account where Id = :accId Limit 1];

        // URL-encode the address so spaces and special characters are safe.
        String address = acc.BillingStreet + ',' + acc.BillingCity + ','
            + acc.BillingState + ',' + acc.BillingPostalCode + ','
            + acc.BillingCountry;
        String httpReqURI = 'https://maps.googleapis.com/maps/api/geocode/json?address='
            + EncodingUtil.urlEncode(address, 'UTF-8')
            + '&key=<API_KEY>';
        HttpResponse response = sendHttpReq(httpReqURI);

        if (response.getStatusCode() == 200) {
            Map<String, Object> root = (Map<String, Object>) JSON.deserializeUntyped(response.getBody());
            List<Object> childLevel = (List<Object>) root.get('results');
            for (Object o : childLevel){
                Map<String, Object> childLevel2 = (Map<String, Object>) o;
                Map<String, Object> grandChildLevel = (Map<String, Object>) childLevel2.get('geometry');
                Object objFinal = grandChildLevel.get('location');
                Map<String, Object> locs = (Map<String, Object>) objFinal;
                acc.Location__Latitude__s = (Decimal) locs.get('lat');
                acc.Location__Longitude__s = (Decimal) locs.get('lng');
            }
            // Perform the DML once, outside the loop.
            update acc;
        }

    }

    public static HttpResponse sendHttpReq(String url){
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndpoint(url);
        request.setMethod('GET');
        HttpResponse response = http.send(request);
        return response;
    }

}
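As a side note, a quick way to unit-test a class like this: wrap the future call in Test.startTest()/Test.stopTest() so the queued job completes before your assertions run, and register a callout mock since tests cannot make real callouts. This is a sketch; the GeocodeMock class and its canned response are hypothetical and would need to match your real payload:

```apex
@isTest
private class AccountUpdateAsync_test {
    // Hypothetical mock returning a canned geocoding response.
    private class GeocodeMock implements HttpCalloutMock {
        public HttpResponse respond(HttpRequest req) {
            HttpResponse res = new HttpResponse();
            res.setStatusCode(200);
            res.setBody('{"results":[{"geometry":{"location":{"lat":17.38,"lng":78.48}}}]}');
            return res;
        }
    }

    @isTest
    static void futureUpdatesCoordinates() {
        Account acc = new Account(Name = 'Test', BillingCity = 'Hyderabad');
        insert acc;
        Test.setMock(HttpCalloutMock.class, new GeocodeMock());
        Test.startTest();
        AccountUpdateAsync.updateAccountAddress(acc.Id);
        Test.stopTest(); // forces the queued @future job to complete
        acc = [SELECT Location__Latitude__s FROM Account WHERE Id = :acc.Id];
        System.assertEquals(17.38, acc.Location__Latitude__s);
    }
}
```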

Future method Considerations:- 

Below are the future method considerations

  1. Methods with the future annotation must be static methods and can only return a void type.
  2. The specified parameters must be primitive data types, arrays of primitive data types, or collections of primitive data types. sObjects and objects cannot be passed as arguments.
  3. Future methods do not necessarily execute in the same order they are called.
  4. Methods with the future annotation cannot be used in Visualforce controllers in either getMethodName or setMethodName methods, nor in the constructor.
  5. You cannot call a method annotated with future from a method that also has the future annotation.
  6. Future calls cannot be made from a batch job.
  7. Future calls can execute concurrently.
 

Best Practices:-

Future methods have their own advantages. Consider the best practices below while implementing future calls.

  1. Make @future code as efficient as possible. Long execution times increase the chance of delays and extended delays.
  2. Minimize the number of asynchronous requests created, to minimize the chance of delays and extended delays.
  3. Make sure you are not invoking too many future calls when invoking from a trigger. I would advise one future call per trigger transaction.
  4. Avoid recursive behavior in future calls. Use a static variable to store the state of the trigger processing.
  5. Consider using Batch Apex instead of @future to process a large number of records asynchronously.
  6. The asynchronous processing framework is queue-based and runs at a lower priority than real-time interaction, so ensure @future requests execute as fast as possible, and keep the per-transaction limits on future calls and callouts in mind.
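Best practice 4 above (a static variable guarding against recursion) can be sketched like this; the class name is illustrative:

```apex
public class FutureRecursionGuard {
    // Static variables live for the duration of a single transaction,
    // so this flag resets automatically on the next transaction.
    public static Boolean futureAlreadyQueued = false;
}

// In the trigger, check and set the flag before enqueueing:
// if (!FutureRecursionGuard.futureAlreadyQueued) {
//     FutureRecursionGuard.futureAlreadyQueued = true;
//     AccountUpdateAsync.updateAccountAddress(Trigger.new[0].Id);
// }
```

Because the @future method runs in its own transaction, any trigger it fires starts with a fresh flag; the guard only prevents re-enqueueing within the same transaction.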

 

Limits :-

Future calls have the following limits:

  1. No more than 50 @future method calls per Apex invocation (transaction).
  2. There is also a limit on the number of future calls in a 24-hour period. The maximum number of future method invocations per 24-hour period is 250,000, or the number of user licenses in your organization multiplied by 200, whichever is greater. Note that not all Salesforce licenses count toward this limit; Chatter licenses, for example, are excluded. For more limits, refer to the Salesforce limits guide.
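You can check how close the current transaction is to the per-invocation limit with the Limits class before enqueueing more work (the method being enqueued here is just an example):

```apex
// Number of @future calls already made in this transaction,
// versus the maximum allowed.
System.debug(Limits.getFutureCalls() + ' of ' + Limits.getLimitFutureCalls());

if (Limits.getFutureCalls() < Limits.getLimitFutureCalls()) {
    // Safe to enqueue another future call.
    AccountUpdateAsync.updateAccountAddress(accountId);
}
```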

 

 

Salesforce Apex Managed Sharing

In this blog post, I am going to explain how to use apex managed sharing. Apex managed sharing allows developers to programmatically share custom objects. When you use Apex managed sharing to share the custom object, only users with the “Modify All Data” permission can add or change the sharing on the custom object’s record, and the sharing access is maintained across record owner changes.

Apex managed sharing can be used on objects whose organization-wide sharing setting is Private or Public Read Only.

1. Apex managed sharing cannot be used on Public Read/Write objects.
2. Each custom object has its own sharing table, named with the __Share suffix, if the object's sharing is Private or Public Read Only.
3. Apex sharing reasons and Apex managed sharing recalculation are only available for custom objects.
4. Objects on the detail side of a master-detail relationship do not have an associated sharing object. The detail record's access is determined by the master's sharing object and the relationship's sharing setting.

Understanding Sharing Reasons:-
The Reason field on a custom object's share record specifies the type of sharing used for the record. This field is called rowCause in Apex and the Force.com API. Sharing reasons are used in Force.com managed sharing, user managed sharing, and Apex managed sharing.

RowCause values depend on the type of sharing; they can be Owner, Team, Manual, and so on.

Access Levels:-
The access level determines a user's access to records. Most share objects support the following access levels: Private, Read Only, Read/Write, and Full Access.
Every share object has the following properties:-
AccessLevel – the level of access being granted: Edit, Read, or All
ParentId – the ID of the record being shared
RowCause – the reason why the user or group is being granted access
UserOrGroupId – the user or group ID to which you are granting access
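You can inspect these properties directly by querying the share table. For example, for the Delivery__c custom object used later in this post:

```apex
// Each row describes one grant of access to a Delivery__c record.
List<Delivery__Share> shares = [
    SELECT Id, ParentId, UserOrGroupId, AccessLevel, RowCause
    FROM Delivery__Share
    LIMIT 10
];
for (Delivery__Share s : shares) {
    System.debug(s.UserOrGroupId + ' has ' + s.AccessLevel
        + ' access to ' + s.ParentId + ' because of ' + s.RowCause);
}
```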

User Managed Sharing Using Apex:-
Now we are going to create an Apex class that shares records programmatically. It is possible to manually share a record with a user or a group using Apex or the SOAP API. If the owner of the record changes, this manual sharing is automatically deleted.


public with sharing class DeliverySharingCls {
    public static void shareDelivery(List<Delivery__c> delivery){
        List<Delivery__Share> totalShares = new List<Delivery__Share>();
        for (Delivery__c d : delivery){
            // Create a new sharing object for the custom object Delivery__c.
            Delivery__Share share = new Delivery__Share();
            // Set the ID of the record being shared.
            share.ParentId = d.Id;
            // Set the ID of the user or group being granted access.
            // Instead of a hard-coded user ID, fetch the ID dynamically.
            share.UserOrGroupId = '00541000000RlIa';
            // Set the access level.
            share.AccessLevel = 'Edit';
            // Set rowCause to 'Manual' for manual sharing.
            share.RowCause = Schema.Delivery__Share.RowCause.Manual;
            totalShares.add(share);
        }

        if (!totalShares.isEmpty()){
            List<Database.SaveResult> saveres = Database.insert(totalShares);
        }
    }

}

Invoke the above code from the trigger to share the data as shown below.


trigger DeliveryShare on Delivery__c (after insert) {
    DeliverySharingCls.shareDelivery(Trigger.new);
}

Creating Apex Managed Sharing:-

Apex managed sharing enables developers to programmatically manipulate sharing to support their application's behavior through Apex or the SOAP API.

Apex managed sharing must use an Apex sharing reason. Apex sharing reasons are a way for developers to track why they shared a record with a user or group of users.
Apex sharing reasons are defined on an object’s detail page.
To create an Apex managed sharing reason on the custom object,
go to Setup -> Create -> Objects -> Delivery__c -> Apex Sharing Reasons -> click New and create a sharing reason as shown below.

Apex sharing reason name: Custom Sharing Model
To use it in code, reference it as shown below:
Schema.Delivery__Share.RowCause.Custom_Sharing_Model__c
Now update the above code with this rowCause as shown below.


public with sharing class DeliverySharingCls {
    public static void shareDelivery(List<Delivery__c> delivery){
        List<Delivery__Share> totalShares = new List<Delivery__Share>();
        for (Delivery__c d : delivery){
            // Create a new sharing object for the custom object Delivery__c.
            Delivery__Share share = new Delivery__Share();
            // Set the ID of the record being shared.
            share.ParentId = d.Id;
            // Set the ID of the user or group being granted access.
            // Instead of a hard-coded user ID, fetch the ID dynamically.
            share.UserOrGroupId = '00541000000RlIa';
            // Set the access level.
            share.AccessLevel = 'Edit';
            // Set rowCause to the custom Apex sharing reason.
            share.RowCause = Schema.Delivery__Share.RowCause.Custom_Sharing_Model__c;
            totalShares.add(share);
        }

        if (!totalShares.isEmpty()){
            List<Database.SaveResult> saveres = Database.insert(totalShares);
        }
    }

}