Node.js Heroku Canvas Integration

Introduction

In this blog post, I am going to explain how to integrate a Heroku-hosted Node.js application with Salesforce using Canvas. Canvas lets you embed web applications inside Salesforce in many locations, such as a Chatter Tab, Visualforce Page, Lightning Component, Publisher, Chatter Feed, Console, Open CTI, Mobile Navigation, Layouts, and Mobile Cards. In this post, the embedded app renders a simple HTML table containing product master data.

Prerequisites

  1. Git installed on your local machine
  2. Heroku account
  3. Heroku CLI (Toolbelt) installed
  4. Node.js installed on your local machine

Creating Connected App

To integrate Salesforce with a Heroku application, you need to create and configure a Salesforce connected app as shown below.

  • Setup -> Create -> Apps -> Connected Apps-> New
  • Fill in Connected App Name, API Name, and Contact Email as shown in the image
  • Check the Enable OAuth Settings in the API (Enable OAuth Settings) section
    • Callback URL: enter https://localhost.com/_callback for now. After deploying the Node.js app to Heroku, we will change this.
    • Selected OAuth Scopes: select the scopes the Node.js app's access token should be granted. Select "Full Access" for this test app.
  • In the Canvas App Settings section, check Force.com Canvas
  • Canvas App URL: enter https://localhost.com/canvas/callback. After deploying the app to Heroku, we will change this URL as well.
  • Access Method: Signed Request (POST)
  • Locations: choose Chatter Tab and Publisher for now, but you can select one or more locations depending on where you want the canvas app to appear in Salesforce.
    The finished connected app looks as shown below.

Now you need to enable the app for specific profiles or users. To do this:

  • Click Manage on the connected app you created above, then click Edit Policies
  • Under OAuth Policies, select Admin approved users are pre-authorized for the Permitted Users field and save
  • In the Profiles related list, add the profiles (or permission sets) that should be granted access

Building Your App on Node.js

Let’s set up the basic app structure. We will use Express to build a basic Node application. The package.json file below defines the dependencies for the Node.js application, and the application folder structure is shown below.

package.json
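The original package.json is not reproduced here; below is a minimal sketch of what it might contain (the name, version, and exact dependency versions are illustrative, with Express and body-parser as the assumed dependencies):

```json
{
  "name": "canvas-heroku-app",
  "version": "1.0.0",
  "description": "Sample Salesforce Canvas app served from Heroku",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.16.0",
    "body-parser": "^1.18.0"
  }
}
```

Heroku uses the `start` script to boot the dyno, so make sure it points at your entry file.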

  • Once you have created your file structure, set up a Git repo by running $ git init from the command line within this directory.
  • Inside of the public > views > index.html file, add the following code

  • From the command line in the main app directory, initialize your Node app by running npm install, which installs all the node modules listed in package.json.

In the index.js file add the following code:

Run the following from the command line within your main app directory to push your changes to Heroku.

  1. heroku create
  2. git add .
  3. git commit -m "pushing to heroku"
  4. git push heroku master
  5. heroku open
    Open the URL, which looks like a random string, e.g. https://limitless-retreat-55750.herokuapp.com/

Now go to the connected app you created earlier and update the Callback URL and Canvas App URL with the Heroku app URL, for example https://limitless-retreat-55750.herokuapp.com/.

Now we have to add the connected app's Consumer Secret as an environment variable on Heroku, for example by running heroku config:set CANVAS_CONSUMER_SECRET=<your consumer secret> (use whatever variable name your index.js reads).

Now you can see the Canvas app on the chatter tab as shown below.

GitHub URL for the complete code: https://github.com/rajamohanvakati/Canvas-App-Quick-Action


IBM Watson Salesforce Visual Recognition

In this blog post, I am going to explain how to integrate Salesforce with IBM Watson Visual Recognition. Visual Recognition allows users to understand the contents of an image or video frame, answering questions such as “What is in this image?” and “Are there similar images?”. The IBM Watson Visual Recognition service uses deep learning algorithms to analyze images for scenes, objects, faces, and other content, and the response includes keywords that provide information about that content.

A set of built-in classes provides highly accurate results without training, and you can train custom classifiers to create specialized classes. You can also create custom collections of your own images and then upload an image to search the collection for similar images. In this blog, we are going to use the default classifiers to classify an image and then detect faces in it. You can also preview a live version of this application. Some of the major highlights include:

Object determination — Classification of things in the image
Face detection — Detect human faces, including estimation of age & gender
Text extraction — Extract text contained in the image
Celebrity identifier — Identifies the person if your image includes a public figure

1. Set Up an API Key for IBM Watson Visual Recognition

  1. Register for IBM Bluemix with US South or your preferred region.
  2. Log in to the IBM Bluemix.
  3. Select Create app to start a new app
  4. Find Watson on Services tab
  5. On Watson services, select Visual Recognition API
  6. Click on Create to get started.
  7. Under Service credentials, select credentials-1 and copy the API key
    Once you have the API key, the Bluemix console setup is complete; now we turn to Salesforce.

2. Salesforce Setup

I am considering a use case where a Case has an image attached that we will use for image recognition. Let's set up two new fields on the Case object as shown below:
Upload_Image_URL__c – URL(255)
Image__c – formula: HYPERLINK(Upload_Image_URL__c, IMAGE(Upload_Image_URL__c, "Case Image"))

3. Train a Custom Classifier

We are not too concerned with the custom classifier at this stage; this post emphasizes the out-of-the-box default classifier that ships with IBM Watson. You can consider implementing a custom classifier when your business logic calls for image classification that fits your own model, such as a product catalog.

4. Remote Site Settings

Add the Image Recognition services under Remote Site settings.

5. Recognizing the Image

Here is a simple Visualforce page that classifies images, as shown below. You will get a result for each image with a prediction score and classification. You can then use the class with the highest score in your application to label your image.

6. Code Walkthrough

Here is the code walkthrough for the Apex controller. The code below shows a simple way to build the endpoint URL of the Image Recognition service.
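As an illustration of the same URL construction (sketched in Node.js rather than Apex), the classify endpoint is the base URL plus the API key, the image URL, and a version date as query parameters. The host and version date below follow the classic Bluemix API-key scheme and are assumptions; substitute your own service's values.

```javascript
// Build the Visual Recognition v3 classify URL for a given API key and
// public image URL. URLSearchParams handles the query-string encoding.
function buildClassifyUrl(apiKey, imageUrl) {
  const base = 'https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classify';
  const params = new URLSearchParams({
    api_key: apiKey,          // from the Bluemix service credentials
    url: imageUrl,            // the Upload_Image_URL__c value on the Case
    version: '2016-05-20',    // API version date (assumed)
  });
  return `${base}?${params.toString()}`;
}
```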

The code below makes the API call to the Watson service, passing along the image URL.

The code below parses the JSON response so the data can be shown on the page.
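To illustrate what that parsing involves (sketched in Node.js rather than Apex), a v3 classify response nests class/score pairs under images → classifiers → classes; flattening them and sorting by score gives you the best label first:

```javascript
// Pull every class/score pair out of a Visual Recognition classify response
// and sort so that results[0] is the best-scoring label for the image.
function extractClasses(responseJson) {
  const body = JSON.parse(responseJson);
  const results = [];
  for (const image of body.images || []) {
    for (const classifier of image.classifiers || []) {
      for (const cls of classifier.classes || []) {
        results.push({ label: cls.class, score: cls.score });
      }
    }
  }
  return results.sort((a, b) => b.score - a.score);
}
```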

Here is the corresponding Visualforce page code.

The GitHub repo is here:

https://github.com/rajamohanvakati/Watson-Image-Recognition

Customer Tone Analysis from Live Agent Chat

In this blog post, I am going to explain how to understand a customer's tone from a Live Agent chat; this post is an extended version of my earlier post about customer tone. IBM Watson offers different Tone Analyzer services for general text, email, and customer engagement.
You can use the Live Agent transcript for tone analysis. A simple Live Agent chat looks as shown below.

The result of the analysis will look as shown below.

The code for this is shown below.
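As an illustrative companion (sketched in Node.js rather than Apex), the key preprocessing step is that a LiveChatTranscript Body field stores HTML, so the markup has to be stripped before the plain text becomes the Tone Analyzer request body. The helper name below is hypothetical:

```javascript
// Strip the HTML markup from a Live Agent transcript body and wrap the
// remaining plain text as the {"text": ...} JSON body Tone Analyzer expects.
function transcriptToTonePayload(transcriptHtml) {
  const plainText = transcriptHtml
    .replace(/<[^>]+>/g, ' ')   // drop tags (crude, but fine for transcripts)
    .replace(/&nbsp;/g, ' ')    // decode the entity transcripts commonly use
    .replace(/\s+/g, ' ')       // collapse the leftover whitespace
    .trim();
  return JSON.stringify({ text: plainText });
}
```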

Below is the Visualforce page for the same.


Understand Customer Feeling from Case With IBM Watson

In this post, we are going to see how to use the IBM Watson Tone Analyzer service to understand a customer's emotional, social, and language tones from case descriptions. You can use IBM Watson services in many ways: understanding customer tone when analyzing text marketing campaigns, gauging a lead's level of interest, and understanding customer happiness from a case are a few use cases. Let's assume the case comes in from Web-to-Case with a subject and a description. In this post we are going to predict whether the customer is happy, sad, confident, and more from the description and the subject.

What is Watson Tone Analyzer?

The IBM Watson Tone Analyzer service uses linguistic analysis to detect emotional, social, and language tones in written text. The service can analyze tone at both the document and sentence levels. You can use it to understand how your written communications are perceived and then improve their tone. Businesses can use the service to learn the tone of their customers' communications and respond to each customer appropriately, or to understand and improve their customer conversations in general.

You submit JSON, plain text, or HTML input containing your written content, and the service returns JSON results that report the tone of your input. You can use these results to improve the perception and effectiveness of your communications, ensuring that your writing conveys the tone and style you want for your intended audience. The following diagram shows the basic flow of calls to the service.

Submit content to the Tone Analyzer service and use the results to improve your communications.

1. Create the Watson Service

To get started, create an IBM login at https://console.ng.bluemix.net and create a service called “Tone Analyzer” as shown below.

Select the service and click Create to receive the credentials required to make an API call from Salesforce.


2. Create a Named Credential in Salesforce

Create a named credential in Salesforce with the Tone Analyzer service credentials you got in the first step; you can then use it in callouts to the IBM Watson service.

3. Simple Visualforce Page

That’s all. Below is a sample Visualforce page that displays the customer's emotions based on the case subject and description. The page shows the complete tone breakdown across Emotions, Language Styles, and Social Tendencies.

4. Code Walkthrough

Here is the piece of code that prepares the JSON body for the HTTP request to the Watson service.
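To illustrate that step (sketched in Node.js rather than Apex), Tone Analyzer v3 takes a JSON body of the form {"text": ...} plus a version query parameter; here the case subject and description are combined into one text. The path, version date, and function name are assumptions; in the real callout the named credential supplies the host and authentication.

```javascript
// Prepare the HTTP request for Tone Analyzer v3 from a case's subject
// and description. Only the pieces the caller must supply are built here.
function buildToneRequest(subject, description) {
  return {
    path: '/tone-analyzer/api/v3/tone?version=2016-05-19', // assumed path/version
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: `${subject}. ${description}` }),
  };
}
```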

The piece of code below shows how to make the HTTP call to the IBM Watson Tone service. We are using the named credential in the callout.


The piece of code below handles the JSON parsing.
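As an illustration of the parsing (sketched in Node.js rather than Apex), a Tone Analyzer v3 response nests scores under document_tone → tone_categories → tones; flattening those into rows is what lets the page render Emotions, Language Styles, and Social Tendencies side by side:

```javascript
// Flatten the document-level tone_categories (Emotion, Language, Social)
// from a Tone Analyzer v3 response into {category, tone, score} rows.
function parseToneResponse(responseJson) {
  const body = JSON.parse(responseJson);
  const rows = [];
  for (const category of body.document_tone.tone_categories || []) {
    for (const tone of category.tones || []) {
      rows.push({
        category: category.category_name,
        tone: tone.tone_name,
        score: tone.score,
      });
    }
  }
  return rows;
}
```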

Here is the Visualforce page that shows the result.

Here is the complete code

https://github.com/rajamohanvakati/Watson-Tone-Analyzer-

5. Conclusion

If you’d like help identifying how your customers are feeling, the IBM Watson Tone Analyzer can help you understand them.


Salesforce Contact Personality Insights with Watson

1. Introduction

In this blog post I am going to explain how to get personality insights for a contact by using the contact's Twitter handle to predict personality characteristics from the Twitter timeline. You can also leverage Personality Insights for other text sources, such as user profile information, Facebook posts, or even Salesforce Chatter posts.

2. What is Personality Insights?

Personality Insights extracts and analyzes a spectrum of personality attributes to help discover actionable insights about people and entities, and in turn guides end users toward highly personalized interactions. The IBM Watson Personality Insights service provides an API that enables applications to derive insights from social media, enterprise data, or other digital communications. The service uses linguistic analytics to infer individuals' intrinsic personality characteristics, including Big Five, Needs, and Values, from digital communications such as email, text messages, tweets, and forum posts.

The service can automatically infer, from potentially noisy social media, portraits of individuals that reflect their personality characteristics. It can also determine individuals' consumption preferences, which indicate their likelihood to prefer various products, services, and activities.

As a core service of the IBM Watson platform, Personality Insights can help businesses understand their customers at a deeper level: learn their clients' preferences, improve customer satisfaction, and strengthen client relations. Businesses can use these insights to improve client acquisition, retention, and engagement, and to guide highly personalized interactions that better tailor their products, services, campaigns, and communications to individual clients.

3. Create a custom field on Contact

Create a new text field, twitter_handler__c (Text, 255), on the Contact object; we will use it to store the contact's Twitter handle. We are going to display a Personality Insights sunburst chart that contains all the insights for the contact.

4. Making API Calls to Twitter / Connected App in Twitter

Go to https://apps.twitter.com/ and create a Twitter app. You will then get the OAuth details we need to make an API call to the Twitter timeline. Below is the code we will use to make the API call to Twitter.
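As an illustration of the call being made (sketched in Node.js rather than Apex), the timeline comes from the REST v1.1 user_timeline endpoint, keyed by the contact's twitter_handler__c value. The OAuth 1.0a Authorization header that the real callout must sign with the app's OAuth details is omitted here for brevity, and the count and trim_user parameters are illustrative choices:

```javascript
// Build the Twitter REST v1.1 user_timeline URL for a given screen name.
function buildTimelineUrl(screenName, count = 50) {
  const params = new URLSearchParams({
    screen_name: screenName,  // the contact's twitter_handler__c value
    count: String(count),     // how many recent tweets to fetch
    trim_user: 'true',        // we only need the tweet text, not user objects
  });
  return `https://api.twitter.com/1.1/statuses/user_timeline.json?${params}`;
}
```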

5. Create a Service in the IBM Watson Console

Go to the IBM Watson console, create a new Personality Insights service, and obtain the service credentials.

Click Create and get the Service Credentials as shown below.

6. Create a Named Credential in Salesforce

Create a named credential in Salesforce with the IBM Watson Personality Insights API credentials, which we will use for the callout to the Personality Insights service.

7. Finally, the Visualforce Page and Controller to Make the API Call

Here is the apex controller that calls IBM Watson personality services.
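As an illustrative companion to the Apex controller (sketched in Node.js), the tweet texts from the timeline are joined into one block of text and posted to the v3 profile resource. The path, version date, and function name are assumptions; the named credential supplies the host and authentication in the actual callout:

```javascript
// Prepare the Personality Insights v3 request from an array of tweet
// objects ({text: ...}): join the texts and post them as plain text.
function buildProfileRequest(tweets) {
  return {
    path: '/personality-insights/api/v3/profile?version=2016-10-20', // assumed
    method: 'POST',
    headers: { 'Content-Type': 'text/plain' },
    body: tweets.map(t => t.text).join('\n'),
  };
}
```

The service needs a reasonable volume of text to produce meaningful percentiles, which is why the whole timeline is sent rather than a single tweet.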

This is the Visualforce page.

You can find the complete code at this GitHub URL: https://github.com/rajamohanvakati/Personality-Insights-