Thursday, 29 October 2015

“A commit in time saves hours nine!”


“Version Control”


The ultimate boon for software engineers! Why? Suppose you have a demo scheduled with your client. You are a hardworking programmer, just like me, and you complete the functionality well in time for the demo. Everything is fine and dandy! Until you decide to play with the code, reviewing it yourself, beautifying it and fixing the bugs within. You finish the review fixes minutes before the demo as well; kudos, good going mate!!! Now you grab a cup of coffee and rehearse the demo, already picturing how smoothly it will go. Then your worst nightmare comes live into action: the code is broken! The rectification process has ended up a disaster, and that too minutes before the demo! Say goodbye to coffee and peace; you will be left scratching your head trying to find out what went wrong! Safe to say your client ain’t going to be too gung-ho about it!

This is where Version Control comes to your rescue, like a knight in shining armor, just like in those superhero movies! With version control, incremental versions of your files are stored, so that in desperate times such as this, you can quickly switch back to one of the previous versions and turn everything back to normal; almost like magic!!! And the magic does not stop there. With hosted services built around version control, such as GitHub and Bitbucket (or a server running Apache Subversion, aka SVN), you can even sync your local files with the cloud, so you can access them quickly even when you are on the go or away from your machine. Amazing, isn’t it?

One of the most popular version control systems is Git, and GitHub is the most popular service for hosting Git repositories. In a few simple steps, I’ll show you how you can set up version control using Git and GitHub and say goodbye to all last-minute glitches.

What you need

  1. A GitHub account, which you can create for free at https://github.com/join
  2. Git, which you can download and install on your machine from https://git-scm.com/downloads

Let’s get it rolling!

Step 1: Initializing a Repository

A repository, or simply a repo, is a common storage place for all the documents in the version control system. It maintains a version history for every file that you have added to it.

To initialize the repository, you need to use the git init command.


This would initialize an empty repository in the folder you are currently working on.
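Assuming Git is installed, the whole step looks like this in a terminal (the folder name is just an example):

```shell
# Create a project folder and turn it into a Git repository
mkdir demo-project
cd demo-project
git init        # creates a hidden .git folder that will store every version
```

From this point on, every Git command you run inside the folder operates on that repository.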

Step 2: Adding files to the Repository

The next step is to add files to the repository so that their versions are maintained. You need to tell Git which files to track, using the git add * command. The ‘*’ specifies that every file in the current folder is to be added to the repo. You can also name just the files you want to keep track of. Suppose you want to track the file Constants.cls only; the command would then be git add Constants.cls

You can also view the status of the files in the repo by running the git status command.
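A quick sketch of staging and inspecting files (the scratch repo and the contents of Constants.cls are illustrative):

```shell
# (setup: a scratch repo with one new file in it)
git init -q demo-add && cd demo-add
echo 'public class Constants {}' > Constants.cls

git add *                # stage every file in the current folder
git add Constants.cls    # or stage just this one file
git status               # shows Constants.cls as a new file to be committed
```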


Git has its own way of recording changes to files. Once you add files, Git places them in a staging area, where the staged changes are recorded as a snapshot that will go into your next commit.
Now that you have successfully added files to your repository, the version control system is up and running and has already started tracking the changes you save to your files.

Step 3: Committing the changes

What is the use of only tracking changes to a file and never saving them? Committing ensures that every change you made prior to the commit has been recorded and stored in the version control system.

If you are likely to make mistakes, committing your files is the way to stay safe.

The git commit -m <commit_message> command does the trick for you. The -m <commit_message> argument records the commit message. Commit messages are a way to document the changes made in the current version of the files.
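A sketch with an illustrative repo and message:

```shell
# (setup: a scratch repo with one staged file and a configured identity)
git init -q demo-commit && cd demo-commit
git config user.name "Demo" && git config user.email "demo@example.com"
echo 'VAT = 0.20' > Constants.cls
git add Constants.cls

# The actual commit; the -m message documents this version
git commit -m "Add VAT constant"
```

You can later browse these messages with git log.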


As a cardinal rule, it is always good practice to write meaningful commit messages. In case you need to revert to a previous version of a file, it’s a lot easier to identify the right version from its commit message.



What next?

Now that you have set up version control using Git and GitHub, you can have that cup of coffee even if your recent changes break the code. The only thing you need to do is revert to the state where everything was working as expected. For a single file, git checkout -- <file_name> is the command for you. It restores the specified file to its last committed version and discards all the uncommitted changes.
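A self-contained sketch of the rescue (file contents are illustrative):

```shell
# (setup: a repo with one committed file, then a bad edit on top of it)
git init -q demo-rescue && cd demo-rescue
git config user.name "Demo" && git config user.email "demo@example.com"
echo 'working code' > Constants.cls
git add Constants.cls && git commit -q -m "Working version"
echo 'broken code' > Constants.cls     # the change that ruined the demo

# Discard the uncommitted change and restore the last committed version
git checkout -- Constants.cls
cat Constants.cls                      # prints: working code
```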

But it all comes at a cost…

  • Though GitHub allows you to create a free repo, it remains public, which means anyone can view your files and their revisions on the server. But worry not: if you want a private repository, you can purchase a paid plan, or switch to another provider such as Bitbucket, which allows a single user to create private repos
  • Uncommitted changes, once lost, are lost forever
  • Storing large amounts of data in the cloud is not very efficient, as uploading and downloading big files takes a lot of bandwidth and resources
  • As the number of users of your private repo increases, so does the cost

It doesn’t stop right here though…

There is a lot more to explore in the Git world, such as branches, pulls, merges and pull requests: things that make software development a lot easier and more collaborative. You can find online tutorials at https://guides.github.com and make software development easier and more collaborative for yourself and those around you!

But make sure you commit your files before you get started with those tutorials, because you know it all too well now: “A commit in time saves hours nine!!!”



Written by Saket Joshi, Salesforce Developer at Eternus Solutions

Friday, 23 October 2015

What Dreamforce 2015 taught me about Sales Productivity: Part 2

In my previous post, I talked about how Dreamforce laid great stress on Social Selling, Sales Enablement and Accurate Forecasting. Well, that wasn’t all. Salesforce also laid emphasis on the significance of a healthy pipeline, sales coaching, setting the right goals and most importantly, prioritization, as the key focus areas for achieving greater sales productivity.

Do NOT Sideline your Pipeline

A healthy Sales pipeline is the key to your Sales performance. Not only does it signify that you have a healthy sales process that is reaping benefits for you, it also translates directly into more revenue. A large number of Dreamforce sessions reiterated the value of spending more time refining your pipeline, so that it holds more qualified prospects than suspects, and of training your Sales team to do the same. However, it’s easier said than done!

Gartner reports have listed a ‘healthy pipeline’ as the second most important thing that Sales Leaders desperately need and do not get. The truth is, a lot of sales reps start out not being good at pipeline refinement and management, and only the best ones master the art. So what’s the key to pipeline management?

Clichéd as it sounds, the key to a dream pipeline is having an effective sales process in place. Statistics show that a dedicated sales process can lead to as much as an 18% increase in revenue growth through effective sales pipelines. Trained Sales reps are another big contributor to a favorable pipeline (covered at length in subsequent sections), leading to a 9-24% increase in annual revenue growth. Bottom line: revere thy pipeline and hold it in good stead!

Sales Coaching

Dreamforce revealed that the key to sales productivity is in pipeline management. But that does not happen of its own accord; you need to train your sales reps for it. Which brings me to the next pertinent aspect of sales productivity: sales coaching.

Sales reps need to be constantly coached with the right content that helps them close deals faster. This is achieved through a three-pronged process: identifying ‘coachable’ activities, defining and detailing the coaching methodology, and establishing a coaching rhythm that brings out the best in your sales reps.

Sales Coaching involves three key aspects that lead to setting the right goals:
  • Breakdown of sales goals and how they will be measured
  • Focus on the critical activities that lead to success
  • Focus on fewer goals done well rather than too many goals done poorly

Statistics reveal that organizations have seen up to 80% increase in revenue when sales reps are coached the right way. Dreamforce sessions certainly got this one right!

You need PRIORITY and NOT more time

The final yet foremost thing that Dreamforce taught me! Any sales guy who tells you he wishes he had 48 hours in a day clearly isn’t on top of his game. The smartest sales people are the ones who prioritize their prospects and deals, who know how to spend more time with the prospects that are more likely to give them business, and Dreamforce could not have laid more emphasis on this fact. A lot of sessions at Dreamforce were targeted at increasing sales productivity through prioritization, and at how spending more time with fewer prospects actually holds the key to productivity.

Your time is precious, and as a sales rep, even more so. Identify the prospects who are ‘mobilizers’ and engage with them through commercial insights, instead of mere leadership perspective. Take them through the stages of a verified and validated pipeline and monitor their progress. You will find that they are likely to give you more business than the ones who did not qualify.

Transforming your Sales Productivity: Where do you begin?

In the pyramid of sales productivity, your sales reps form the base. Only once your sales reps are efficient and effective can they translate and provide value to their teams, organizations and customers, in that order. Therefore, any organization looking to enhance the productivity of its Sales team needs to start at the base: with its sales reps, coaching and enabling them to achieve their targets. Once your army is ready and raring to go, it’s only a matter of time before the world is at your feet!



Written by Nupur Singh, Pre-Sales & Marketing Expert at Eternus Solutions

Monday, 19 October 2015

Development Environment with Continuous Integration

Have you ever had to work on the same Apex class or Visualforce page as the guy sitting beside you and wait for him to finish his work? Does your team often work on the same development cycle across multiple instances? Would it help if you could find out bugs before you signed off the build for testing? If the answer to any of these questions is a ‘yes’, Continuous Integration aka CI is the thing for you!

CI: The Evolution…

Originally developed by the Salesforce Foundation to support the Non-profit Starter Pack 3.0, CumulusCI is the toolchain currently used for the Salesforce Foundation’s Cumulus project. It is also available as open source software to reuse in your own managed package development efforts.

Prerequisites to using CI

Before you can begin using CI, you need to ensure that you have checked all items off the following checklist:
  1. Install Eclipse and set up the environment variables for CumulusCI
  2. Download Ant, which allows for easy command-line deployment, and follow the steps given at http://www.salesforce.com/us/developer/docs/daas/Content/forcemigrationtool_install.htm to install the Force.com Migration Tool. The Migration Tool is a Java/Ant-based command-line utility that performs Ant-based migration by executing Ant scripts.
  3. Create your local development org by visiting https://developer.salesforce.com. It is recommended that you populate some test data.
  4. Download Git, which is the most powerful source code repository and widely accepted amongst the developer communities
  5. Installing SourceTree is optional. SourceTree provides a UI for operations such as Pull, Push, Merge and Commit, instead of using commands.
  6. Install Jenkins on your local machine by downloading the Java Web Archive (.war). Go to the directory where you downloaded the file and execute the java -jar jenkins.war command. If you then open http://localhost:8080/ in your browser, you can see Jenkins running on your local machine.

CI: Coding Intelligently!

CI works on the principle of a constant flow of development changes and unit testing, so that conflicts and errors are detected within the development cycle itself. The diagram below gives more clarity on this:

  1. As a developer, you need to create a feature branch and pull the master branch into your local repository to get the most recent version of the code, including the changes made by any other developer on your team
  2. Once you are ready to submit the new code version, including your changes, you need to commit those changes and push the branch into Git
  3. Once the changes are committed, Jenkins will launch the build process
  4. The CI server will get the code from the repository and run unit tests for any test class methods it finds
  5. The files will then be packaged into distributable units and deployed onto the test server. Automated functional tests will be executed to validate the package and its basic functionality.
  6. User acceptance testing will be done by CI by deploying the same package on the testing environment. The same package will be deployed on the production server once the acceptance tests are successful.
  7. After running all the tests, the results will be sent back to the developer. If there are any failures, the developer can fix them, commit those changes, and push the same branch into Git.
  8. Your QA team can take the pull of that branch and test it. If any code changes are required, you will need to follow the same process starting from pull till push.
  9. The entire development is done within the feature branch. Once the development is complete, feature branches are merged into the master branch and all other branches can then be destroyed.
  10. A Master branch is a single, persistent branch maintained in the repository to avoid any ambiguity. Once you create a pull request, any commits that you make to your branch requesting to merge, are also included in the pull request. You can use comments on pull requests to explain the issue or solution in brief.
  11. Feature branches follow a naming convention (e.g. feature/<BranchName>) and all releases are managed through Git tags (e.g. uat/1.0-beta3 or prod/1.0).
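On the command line, the branch-commit-push part of this flow looks roughly like the sketch below; the repository path, branch name and file are illustrative, and a local bare repository stands in for the shared remote so the whole thing can be tried end to end:

```shell
# A local bare repository stands in for the shared remote ("origin")
git init --bare /tmp/shared.git

git clone /tmp/shared.git work
cd work
git config user.name "Demo Developer"
git config user.email "demo@example.com"

git checkout -b feature/my-change        # create a feature branch
echo 'new code' > Feature.cls            # ...make your changes...
git add Feature.cls
git commit -m "Add my change"            # commit the new version
git push origin feature/my-change        # ...and push; Jenkins picks up the push
```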

Benefits of using CI

  1. The quality of a release can be improved significantly by identifying and resolving issues in the earlier stages of the development lifecycle
  2. CI helps developers run automated tests and ensures the required code coverage. It helps a developer find issues and bugs during unit testing that he may have overlooked during development.
  3. The CI process helps resolve shared development issues. Multiple developers can update the same metadata at the same time within the development organization, without any worries.

There you go! You are all set for development using Continuous Integration. In case you need more information, you can also refer to the following links:

  1. CumulusCI Introduction
  2. How To Use Git, GitHub and the Force.com IDE with Open Source Labs Apps 




Written by Jyoti Chouthai, Salesforce Lead Developer at Eternus Solutions

Monday, 12 October 2015

What Dreamforce 2015 taught me about Sales Productivity: Part 1

It’s been a while since Dreamforce 2015 concluded, but the delirium lives on! There’s still so much buzz around the different products and services Salesforce introduced over the 4-day event. However, a CRM, no matter how extensive and useful, comes into play only when you have a great customer base and an extremely healthy pipeline. Add to that the fact that Salesforce recently posted revenues upwards of USD 5 billion, the first time they have touched that number! I was therefore hoping that Dreamforce 2015 would cover sessions and trainings around Sales Productivity, for who better to teach you about Sales productivity than this elite member of the 5 billion club? Much to my pleasure, Dreamforce covered Sales Productivity at length and I learnt a great deal of new things!

Social is the new selling wand!


Here’s where you would expect me to write something like ‘Move over traditional sales, social selling is the way to go!’ Well, that ain’t happening! Fact is, social selling has always been there, much before you and I, much before cloud computing, Facebook, internet, anything that you can think of! Confused? Haven’t you always purchased groceries from the pleasant guy next block? Did you purchase your first bike from the guy recommended by your friend? Isn’t your dentist the one your colleague recommended you? These are all examples of social selling! You ask your peers and friends for recommendations, and don’t shy away from dishing feedback.

The same applies to selling anything using Social Media. The opportunities are simply enormous! Thanks to the networked markets, people know more about your products and services than some of your own folks. Use it to your advantage. Reach out to people over Twitter and LinkedIn, engage them in a conversation and help them learn how you can solve their problems. Gain real-time insight into what your customers are discussing online. You’ll be able to understand their needs (both individually and more broadly across the market). Build relationships and trust, and you’ll be able to engage them with useful, timely information that helps.


Image Courtesy www.wsi-digimedia-marketing.com


Investment in Sales Enablement pays big time!


Your people are your assets! By that definition, your sales people are your most valuable assets. Therefore it is logical to invest in their enablement, helping them adapt to the ever-changing world of sales. Educate your sales teams to deliver the right message at the right time to prospects. When sales people are on message, they convert leads and close opportunities faster.

Invest time and money on their skill growth, so that they are aligned to the customers’ needs rather than focused on Why Choose Us. Encourage open communication and easy integration between team members and integrate this Sales Enablement process across the organization, ensuring a team approach towards right sale to the right client. Remember, successful Sales Enablement is 10% tactical and strategic contribution and 90% alignment towards a common agenda.


Image Courtesy: Salesforce.com

Accurate Forecasting is a must


I cannot possibly stress the significance of Accurate Forecasting enough. However, a lot of sales people lay great stress on the accuracy of a forecast without ever realizing or finding out where they are wrong. The trick is to break the forecast into a series of assumptions that can be referred back to later. But if you have ever been a Sales guy, you know this is easier said than done. So how do you make accurate forecasts?

The foremost thing to do is to enable your Sales reps with technology. Let technology do the numbers for them, find insights that they can use to engage prospects in meaningful conversations, and guide them through the lifecycle. Use software or an application that comes with an audit trail, so that you can note any changes to the deal value, customer or probability of cracking the deal, and take the necessary actions. Bottom line: let your sales people do sales, and leave the data and accounting to your technology.

Accurate Forecasting is an art mastered with practice, one that needs your people trained to yield the desired results. Train them to identify the signs that help them identify a good deal from a bad one, enable them to focus on pipeline exceptions and analyze the cause behind them to take corrective action wherever required, and teach them how to constantly refine their forecasts.

With great power comes great responsibility and it is no different for your sales people. Let them know what is expected of them, give them complete autonomy but also make them accountable. Set aside time to review the forecasts, discuss action items for any exceptions and plan the way ahead. Reviewing the numbers together gives multiple views around the same potential problem, enabling the team to find a solution that is most suitable.

It is important to understand and acknowledge that Sales productivity is actually unlocked by a series of keys, each opening part of the big prize. In my next blog, I will talk about how setting the right goals, sales coaching, a healthy pipeline and prioritizing go a long way towards enhancing Sales productivity. Till then, happy selling!


What Dreamforce 2015 taught me about Sales Productivity: Part 2


Written by Nupur Singh, Pre-Sales & Marketing Expert at Eternus Solutions

Friday, 9 October 2015

Case Management: Notify Case Owner on creation of a Case comment via Apex

If you have worked with Salesforce, you will acknowledge that irrespective of the application you are building or the solution you are designing, there will always be scenarios where you will need to override the standard Salesforce functions and extend them to get your functionality right. Today, I have a simple, yet useful Case Management hack, that will simplify your lives to a great extent.

Case is a very powerful object within Salesforce and “with great power comes great responsibility”. While dealing with cases, it is our responsibility as Salesforce developers to ensure that case owners are notified every time a new comment is added to their cases. These comment notifications are critical in ensuring that the case owner is aware of all updates made against their case and can plan the next steps required for its successful closure.

Let us take a quick look at what standard features Salesforce offers for similar notifications on Cases.

Salesforce Feature


If you are the owner of a case record and you want to monitor the status of your Case, especially around case comments, Salesforce provides a notification feature for this. This feature is accessible and can be enabled from the Setup menu.

Then why does the Standard Function not Suffice?


It is a commonly requested feature to have Visualforce pages that allow users to add or update case comments. I had a requirement very similar to this, with a slight catch: my Visualforce page was exposed on a website for customers to log, track and update cases.

When I tried adding case comments on the case object using Apex, these comments were successfully added but the case record owner did not receive any notifications around the same, despite the case notifications feature being enabled.

Here is a snippet of what I used to add case comments.
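A minimal sketch of such an insert; the caseId variable and the comment text are illustrative, coming from the Visualforce controller:

```apex
// Illustrative: caseId and the comment body would come from the Visualforce page
CaseComment comment = new CaseComment();
comment.ParentId    = caseId;                       // the Case being commented on
comment.CommentBody = 'Customer update from the website';
comment.IsPublished = true;                         // visible to portal users
insert comment;   // the comment is saved, but no notification email goes out
```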


Let us look at two alternative workarounds to address this issue. The first one requires Apex, while the second one will not require any Apex code!

Trick 1: Using Apex


When we create case comments using Apex, we need to set triggerUserEmail to true in our code. Even though auto-sent emails can be triggered by actions in the Salesforce user interface, the EmailHeader settings in DMLOptions take effect only for DML operations carried out in Apex code.

I implemented this using the following code:
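A sketch of that approach, using the same illustrative names as before; the key lines are the Database.DMLOptions with EmailHeader.triggerUserEmail set to true:

```apex
CaseComment comment = new CaseComment();
comment.ParentId    = caseId;                       // illustrative variable
comment.CommentBody = 'Customer update from the website';
comment.IsPublished = true;

Database.DMLOptions dmlOpts = new Database.DMLOptions();
dmlOpts.EmailHeader.triggerUserEmail = true;        // ask Salesforce to send the notification

Database.insert(comment, dmlOpts);                  // insert honoring the email header
```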


This will now send out notifications on case comments via Apex.

However, you should note that for the notifications to work, the Case Owner and the Case Comment creator must be different users.

Trick 2: Using Workflow


We can achieve the same result using a Workflow activity. For this, we need to write a workflow rule on the Case Comment object to perform a field update on the related Case. You can create a Boolean field Send Notification on the Case object for the same.

Step 1:  Create a field with Boolean data type.

Step 2:  Create a workflow on Case Comment as shown below:


Step 3: Create a Workflow action for Field update on Case as shown below:


Step 4: Activate the Workflow rule.

Step 5: Create another workflow rule on Case object. This will send an Email to the Case Owner every time our Boolean flag is updated.


Step 6: Create a Workflow action for Email Alert as shown below:


Pro Tip: You can use a custom Email template for configuring this email alert.

Step 7: Finally, activate your workflow!


Both these options are great workarounds to ensure the case owner does not miss out on updates. Irrespective of which option you go for, you will never miss a case alert again!



Written by Yogesh Sharma, Salesforce Developer at Eternus Solutions

Tuesday, 6 October 2015

Data Import from SQL to MS Dynamics CRM


If you have worked with MS Dynamics CRM, you would certainly have needed to import data from your SQL databases at some point. Integration between different servers comes with its own challenges. In this blog, I will take you through a simple procedure for importing data from your SQL databases into your MS Dynamics CRM instance.

Before You Get Started

You will need a basic idea of how to write SQL queries and decent hands-on experience with MS Dynamics CRM.

Our Target

I needed to get all the entities and entity attributes from my MS Dynamics CRM instance and create a dynamic table in the database for the selected entity, with its attributes as table columns. I also needed to insert multiple records into the table and finally import that table into MS Dynamics CRM, creating multiple entity records in a single click.

Solution

With the help of the .NET Framework and the Microsoft Dynamics CRM SDK, I developed a mapping tool for my requirement. This would essentially take care of the following scenarios:
  1. Export & import with MS Dynamics CRM
  2. Creating a dynamic table in our local database
Let's take a look at the screenshots:

Figure 1: Accept URL, Username, Password of the respective MS Dynamics CRM portal


Figure 2: The mapping screen will contain Entities, Entity attributes, SQL server
name and databases of selected server and a button which will create a dynamic table on selected database


Figure 3: Sync with Dynamics CRM button will import table records into Dynamics CRM


How I built my Rome


Step 1
The very first thing we need, as per Figure 1, is the Dynamics CRM URL, Username and Password to connect with CRM. I have used IOrganizationService; with the help of this service, we can access the data and metadata of the particular organization.

NOTE: For interaction purpose with Dynamics CRM, you need to add the following references:


Clicking the button gives you the credentials; you need to parse them using the Parse method of the CrmConnection class before passing them to IOrganizationService. CrmConnection is a class present inside the Microsoft.Xrm.Client namespace.

Step 2
As per Figure 2, once we get the credentials, we just need to pass it to IOrganizationService to get all the entities of a particular organization. We can use the following code:
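A sketch of that retrieval, assuming service holds the IOrganizationService; it uses RetrieveAllEntitiesRequest from Microsoft.Xrm.Sdk.Messages:

```csharp
// Retrieve the metadata for every entity in the organization
RetrieveAllEntitiesRequest request = new RetrieveAllEntitiesRequest
{
    EntityFilters = EntityFilters.Entity,   // entity definitions only, no attributes yet
    RetrieveAsIfPublished = true
};
RetrieveAllEntitiesResponse response =
    (RetrieveAllEntitiesResponse)service.Execute(request);

foreach (EntityMetadata entity in response.EntityMetadata)
{
    Console.WriteLine(entity.LogicalName);  // e.g. account, contact, ...
}
```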


To get the entity attributes, we can use the AttributeMetadata class as shown below. Once we have all the attributes and their types, we can create a dynamic table inside the database with the help of this class.
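A sketch using RetrieveEntityRequest with an attributes filter ("account" is an illustrative entity name; in the tool it is whatever the user selected on the mapping screen):

```csharp
RetrieveEntityRequest request = new RetrieveEntityRequest
{
    LogicalName = "account",                  // the entity selected on the mapping screen
    EntityFilters = EntityFilters.Attributes, // pull the attribute metadata
    RetrieveAsIfPublished = true
};
RetrieveEntityResponse response = (RetrieveEntityResponse)service.Execute(request);

foreach (AttributeMetadata attribute in response.EntityMetadata.Attributes)
{
    // the attribute type drives the SQL column type we create later
    Console.WriteLine("{0} : {1}", attribute.LogicalName, attribute.AttributeType);
}
```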


Step 3
To get the SQL Server name and databases, we can use SQL Server Management Objects (SMO).

NOTE: Add following references to use the SMO:


To get instances of all available SQL Servers, use the following code:
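A sketch using SMO's SmoApplication class (from the Microsoft.SqlServer.Management.Smo namespace):

```csharp
// Enumerate SQL Server instances visible from this machine
DataTable servers = SmoApplication.EnumAvailableSqlServers(false); // false = not local-only
foreach (DataRow row in servers.Rows)
{
    Console.WriteLine(row["Name"]);   // server\instance names for the dropdown
}
```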


As per Figure 2, on selection of the server, we need to get a list of the databases present on that particular server. For this, use the following code:
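A sketch, where serverName is the server picked in the first dropdown (an illustrative variable):

```csharp
Server server = new Server(serverName);       // SMO connection to the selected server
foreach (Database database in server.Databases)
{
    Console.WriteLine(database.Name);         // databases for the second dropdown
}
```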


Step 4
We are now ready with our Entities, Entity attributes, Server name and Database, so we need to create a table for the selected entity dynamically. For that purpose, we pass the selected server and database names to the SMO Server and Database classes respectively. Then we create an object of the Table class, add columns (name and data type) to the table object, and finally call the table object’s Create method.

NOTE: Since MS Dynamics CRM attribute data types and SQL data types are different, we must map each CRM data type to a corresponding SQL data type.
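A sketch of the table creation; the table and column names are illustrative, and in the real tool each column comes from an entity attribute with its CRM type mapped to a SQL type:

```csharp
Server server = new Server(serverName);
Database database = server.Databases[databaseName];

// One SQL column per selected CRM attribute (two shown for illustration)
Table table = new Table(database, "Account_Import");
table.Columns.Add(new Column(table, "name", DataType.NVarChar(255)));
table.Columns.Add(new Column(table, "accountnumber", DataType.NVarChar(100)));
table.Create();   // issues the CREATE TABLE on the selected database
```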

Step 5
Finally, we are ready with our dynamic table, and we need to insert records into it in bulk. To import the data into CRM, we create an ExecuteMultipleRequest and send the list of records with it.

I have used the following code for the same:
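A sketch of the bulk import, where entitiesToImport is an illustrative list of Entity objects built from the SQL rows:

```csharp
// Wrap each record in a CreateRequest and send them all as one batch
OrganizationRequestCollection requests = new OrganizationRequestCollection();
foreach (Entity record in entitiesToImport)
{
    requests.Add(new CreateRequest { Target = record });
}

ExecuteMultipleRequest batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,   // one bad row should not abort the import
        ReturnResponses = true    // collect per-record results for reporting
    },
    Requests = requests
};

ExecuteMultipleResponse results = (ExecuteMultipleResponse)service.Execute(batch);
```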


And we are done! I ended up creating a reusable component, which provides enough capability to import data from SQL to Dynamics CRM. My migration team loves me to the core!




Written by Neethuanna Matthew,  Microsoft Dynamics CRM Champion at Eternus Solutions

Sunday, 4 October 2015

5 Things You Must Know About SalesforceIQ

Touted as the future of selling for every business, Salesforce introduced SalesforceIQ in Dreamforce’15 amidst much fanfare. Powered by the game-changing Relationship Intelligence technology that utilizes advanced data science to analyze the relationships between organizations and prospects, customers and partners, the new SalesforceIQ for Small Business and SalesforceIQ for Sales Cloud are expected to transform selling for all businesses. Here are 5 things you must know about SalesforceIQ.


1. Start Selling without Data Entry and within minutes!

Quite literally! SalesforceIQ automatically updates itself with relevant information from your prospect and customer communications, including but not limited to your emails, calendars, calls and marketing software like HubSpot, Pardot and MailChimp. What this essentially means is that your sales reps can now focus on selling rather than mundane data entry within your CRM, something they have always dreaded! Additionally, companies can literally start selling within minutes of setup, thanks to easy onboarding and no up-front setup costs.


2. Superman Selling

This is an era where sales reps are expected to be at the top of their game all the time, almost in a Superman-esque manner. Whether you need to connect with a new prospect or answer his queries, SalesforceIQ helps you do just that. Closest Connections helps you quickly identify the people within your network who could provide the best introduction to a prospect or a target organization, while Intelligence Fields helps Sales reps prioritize and focus on the most crucial opportunities within their pipeline!


3. Say goodbye to back and forth navigation

Not only are your CRM records updated from within your inbox, but your emails are also analyzed and information is brought into your email from your CRM. Say goodbye to navigating back and forth between your emails, CRM and marketing software!


4. Prioritize your opportunities with intelligence

What’s better for a sales rep than having a CRM tell him which opportunities need his immediate attention, without having to hunt for this information? SalesforceIQ captures key insights and information from calls, emails and other forms of communications, thus providing you with the required support to prepare for an opportunity. And if this did not blow your mind, SalesforceIQ comes with an Inactive Days field to alert you whenever an opportunity needs your attention, helping you make faster decisions! A CRM was never this intelligent before!

Image Courtesy Salesforce

5. Plethora of amazing features

Read receipt notifications as alert when emails are opened, follow-up email scheduling, ability to attach files from a range of cloud services and schedule an email to be sent at a designated time are just the tip of the iceberg that SalesforceIQ is. SalesforceIQ comes powered with numerous email shortcuts, predictive notifications and helps you stay abreast with your pipeline. Add automated logging and reminders to this list and your sales reps are in for a massive treat!


SalesforceIQ: a win-win situation!


SalesforceIQ is a smart, easy-to-use CRM built for smart sales people who are looking to increase their efficiency and spend their time on selling rather than worrying about the underlying busywork. Not only does SalesforceIQ drive success within the organization, it also enables organizations to deliver increased value to their customers. The bar for selling has been set high, and SalesforceIQ is the stairway to that heaven!

You can also read more about SalesforceIQ in this post.





Written by Nupur Singh, Pre-Sales & Marketing Expert at Eternus Solutions

Wednesday, 23 September 2015

Activity Reminders on Dashboard For Microsoft Dynamics CRM


Interactions with customers through phone calls and emails, and the recording of the same, form a vital part of any CRM application. As a Microsoft Dynamics CRM user, you can create phone calls, tasks, emails, faxes and letters through Activities, which appear on an entity record; you will also find these records in the Activity entity.

But what if I wanted the due dates of my Activities to appear on the dashboard like reminders? For a recent requirement, I needed to do just that: one of my favorite clients wanted activities to show up as reminders as per their due dates. For this, I decided to create a new dashboard, as shown below.



Customizing Activity Reminders


The dashboard needed some customization for creating activity reminders.

Step 1: Create Dependency Files

Create a web resource, say, RestSDK, and browse to your respective directory to find SDK.REST.js. I have used some Bootstrap CDNs for my page; however, you can download the Bootstrap files from http://getbootstrap.com/, create web resources for bootstrap.min.js, bootstrap.min.css and bootstrap-theme.min.css respectively, and reference them in your HTML file.

You also need to download the latest jquery.js and a JSON library and add them to your file. Alternatively, a CDN for jQuery will also suffice. I have used my own .css file for the purpose of customization. Likewise, you can add your own .css files, as per your requirement.

You will find the SDK.REST.js file in the Microsoft Dynamics CRM SDK 2015, which can be downloaded from https://www.microsoft.com/en-in/download/details.aspx?id=44567


Step 2: Create a .html Web Resource for Phone Call, Appointment and Task


The preview page of PhoneCall.html looks like:


You need to create a grid or an HTML table which will display your record.


On load of your page, you need to write a function as shown below to retrieve the phone call entity records. Here, SDK.REST.retrieveMultipleRecords() retrieves the phone call entity records.


Next, you need to add rows to the table that you have created. With the help of ClientGlobalContext.js, it is quite easy to access the logged-in user. I have used Mscrm.GlobalContext.prototype.getUserName() to filter and retrieve the records I need. You can also click on a record to view its entire information using Mscrm.GlobalContext.prototype.getClientUrl(), which gives the application URL, along with the entity type code of the entity.


Finally, the rows are added to the created table using the EntityRetrieveComplete() function.
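Put together, the retrieval and row-building logic can be sketched roughly as follows. This is a minimal illustration, not the exact code from the screenshots: the field names, the table id and the buildReminderRow() helper are my own assumptions.

```javascript
// Sketch: retrieve the logged-in user's phone call activities via SDK.REST.js
// and render them as table rows. Field/helper names are illustrative assumptions.

// Build one HTML table row for a phone call record.
function buildReminderRow(record) {
  return "<tr><td>" + record.Subject + "</td><td>" +
         record.ScheduledEnd + "</td><td>" +
         record.PriorityCode + "</td></tr>";
}

// Success callback: filter to the logged-in user's records and append rows
// to the table (id assumed to be "phoneCallTable").
function EntityRetrieveComplete(records) {
  var userName = Mscrm.GlobalContext.prototype.getUserName();
  var rows = records
    .filter(function (r) { return r.OwnerId && r.OwnerId.Name === userName; })
    .map(buildReminderRow)
    .join("");
  document.getElementById("phoneCallTable").innerHTML += rows;
}

// Called on page load: fetch the phone call records.
function loadPhoneCalls() {
  SDK.REST.retrieveMultipleRecords(
    "PhoneCall",                                   // entity schema name
    "$select=Subject,ScheduledEnd,PriorityCode",   // OData query options (assumed fields)
    EntityRetrieveComplete,                        // success callback
    function (error) { alert(error.message); },    // error callback
    function () {}                                 // onComplete callback
  );
}
```

appointment.html and task.html would follow the same shape, swapping the entity name and fields.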


You can create appointment.html and task.html in the same way.

The preview of appointment.html is shown below.


The preview of task.html is shown below.


Step 3: Creating Activity Reminder Dashboard


I have created a 3-column overview dashboard, inserting into each column one of the web resources I have created.

There! Your Activity Reminder Dashboard is created!


So what’s the big deal about Activity Reminder Dashboard?

The best thing about an Activity Reminder Dashboard is that the next time you are assigned a task, a phone call or an appointment, you just need to log in and view your reminder dashboard, which will provide you with the necessary details, like the scheduled time and priority of the activity. If you do not have Outlook integrated with your CRM, this is almost a godsend! So goodbye, missed appointments! Hello, Activity Reminder Dashboard!




Written by Neethuanna Matthew,  Microsoft Dynamics CRM Champion at Eternus Solutions

Wednesday, 16 September 2015

Analyzing Salesforce data with Google Analytics: Part 4

Welcome to the anchor leg! Are you geared up for crossing the line today? Till now, we have successfully configured our Google Analytics account (Part 1) as well as our Salesforce org (Part 2), and created the custom variables (Part 3). Let’s now put the final piece of the puzzle together and create the custom dimensions and, once done, we will go about tracking the most popular Accounts based on page views, using what we have learnt so far.


Creating Custom Dimensions


Custom dimensions allow you to combine Google Analytics data with non-Google Analytics data, e.g. CRM data. For example, if you want to store the Geo-location of signed-in users in your CRM system, you could combine this information with your Google Analytics data to see Page-views by Geo-location.


Limits of custom dimensions


There are 20 indices available for different custom dimensions in each property. Premium accounts have 200 indices available for custom dimensions. Custom dimensions cannot be deleted, but you can disable them.

Step 1

Go to the Admin section of your Google Analytics account and then click on the Custom Dimensions link under the Custom Definitions drop-down menu:


Step 2

Click on the New Custom Dimension button.


Step 3

Enter the name of your new custom dimension, select its scope as Hit and then click on the Create button.


Every custom dimension has one of four scopes: Hit, Session, User and Product. A hit is a call to the GA/UA server by a JavaScript library (like ga.js, analytics.js etc.). A hit can be a pageview, screenview, event, transaction, item etc.


  1. When a custom dimension has Hit-level scope, its value is only applied to the hit with which the value was sent
  2. When a custom dimension has Session-level scope, its value is applied to all the hits in the current web session
  3. When a custom dimension has User-level scope, its value is applied to all the hits in the current and future web sessions of a user, until the value changes or the custom dimension is made inactive
Note: You can’t delete a custom dimension or metric once you have created it. All you can do is make it inactive if you don’t need it.


Step 4

Once you click on the create button, you will be shown the sample code for your custom dimension. Just click on the Done button for now.


You have now created your first custom dimension as shown below:


Step 5

You will need the help of a developer now. Forward the example code for your custom dimension to your developer, so that they can use it to track your website.


Note: You can get these example codes by clicking on the name of your custom dimension.


My Final Google Analytics Script!!!


My final Google Analytics script, including custom variables and custom dimensions, can be seen below:
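The shape of that script, with the custom-variable calls from Part 3 set aside, is roughly the following. The dimension indices (dimension1 for User Name, dimension2 for Account Name) and the wrapper function are my assumptions, not the exact code from the original screenshot:

```javascript
// Rough sketch of the tracking call, wrapped in a function so the page can
// pass in CRM values. Assumes the standard analytics.js bootstrap
// (the ga('create', 'UA-XXXXX-Y', 'auto') snippet) has already run, and that
// dimension1 = User Name and dimension2 = Account Name are the indices
// created in the Admin section (both assumptions).
function sendPageviewWithDimensions(userName, accountName) {
  // Set the custom dimensions on the tracker...
  ga('set', 'dimension1', userName);
  ga('set', 'dimension2', accountName);
  // ...then send the pageview hit that carries them.
  ga('send', 'pageview');
}
```

On an Account detail page, you would call something like sendPageviewWithDimensions('{!$User.FirstName}', '{!Account.Name}') from the merged Visualforce script so every page view is tagged with who viewed which account.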


You can also see the custom dimensions on the Google Analytics site, by going to Reporting Tab > Behavior > Site Content > All Pages and choosing the secondary dimension as given below:


After choosing the custom dimension User Name, it looks something like:


Tracking the Most Popular Accounts based on Page Views


Go to Reporting Tab > Behavior > Site Content > All Pages and filter out as shown below:


Tadaa! You now have your most popular accounts based on page views! You can use similar logic for other use cases and in turn, get more out of your Salesforce org, just as I promised.


Analyzing Salesforce data with Google Analytics:  Part 1, Part 2, Part 3


Written by Arun Kumar Bharati,  Salesforce Developer at Eternus Solutions

Monday, 14 September 2015

Lightning and Nonprofits: Best of both worlds!


This is the age of mobility! Users need access to real-time data at their fingertips. Deals are signed quicker because sales reps have access to data on their mobiles and can take necessary actions really fast! It has become pertinent for information to be available on mobile; it is a necessity, no longer a luxury!

Same is the case with nonprofits and fundraising. Nonprofits constantly need a view of their donation pipelines, along with all donation details till date, in order to plan proactively for their fundraising and constituent activities. The user should also be able to see the donors along with donation details on a map to get a real time visualization of total donations within a region and to be able to trace and locate the donors.

For the last week and a half, the world has truly been struck by Lightning! Salesforce introduced the new Lightning Experience, combining the powers and prowess of the Lightning Design System, Lightning App Builder and Lightning Components to facilitate modern, seamless apps that focus on usability and user experience. Salesforce promised that apps could be built at lightning speed, and I decided to test that. And what better way to test it than with Lightning and map components? The result? True to Salesforce’s promise, my app was ready at an ultra-quick pace!

Donation Manager App

The Donation Manager App is a mobile app that gives a list of all donors categorized by user-defined regions, along with donation summary and detailed information in a single click.

It should be noted that before you start building any Lightning component, you must enable Lightning Components in Salesforce1 (BETA). You can do this by navigating to Setup and searching for Lightning Components.

While I was designing the architecture of my app, I decided to do a few tweaks to make it more optimized.
  1. Built as a 100% pure Lightning app, the Donation Manager App essentially consists of a single page to dispense the requisite information, making it extremely simple to use and navigate.
  2. In order to save on development effort and timelines, I reused the Salesforce1 detail page.
  3. I also leveraged the standard Salesforce objects (the Opportunity object became my Donation object and the Account object became my Donor object) in order to save time and reuse the out-of-the-box functionality as much as possible.
  4. Additionally, this enabled me to leverage the parent-child relationship between Account and Opportunity, and functionalities like roll-up summary fields.

The Map Component

First things first, I needed to develop a map component. The app opens to a map interface, showing a list of all donors, existing and prospective, in the vicinity and enabling your user to get directions to reach them, if need be! The locations are depicted in the form of pins, as shown in the first image below. The app also comes with color-coded donor details, giving you a clear picture of who your most generous donors are.



Once you click on any pin, the app displays the summary of all donations for that donor, as shown above. I have used the standard functionality of a roll-up summary field to depict the same.

However, developing this was not easy. Each time I used the Google Maps API, I was getting the following error:

Content Security Policy: The page's settings blocked the loading of a resource at http://maps.google.com/maps/api/js?sensor=false ("script-src https://ap1.lightning.force.com https://ssl.gstatic.com chrome-extension: 'unsafe-eval' 'unsafe-inline'").

I kept trying different things and failing until I found the Leaflet.js library for maps! Thank you, Christophe Coenraets, I owe you one!
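A minimal sketch of what initializing a Leaflet map with donor pins looks like; the donor fields, default center/zoom and tile URL here are my assumptions, not the post’s exact code:

```javascript
// Sketch: initialize a Leaflet map and drop one pin per donor.
// Assumes leaflet.js and leaflet.css are already loaded (e.g. via a static
// resource) and that each donor object carries name, lat, lng and total fields.
function initDonorMap(containerId, donors) {
  var map = L.map(containerId).setView([18.52, 73.85], 5); // assumed default center/zoom
  L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
    attribution: '&copy; OpenStreetMap contributors'
  }).addTo(map);
  donors.forEach(function (d) {
    // One marker per donor; the pop-up shows the roll-up donation total.
    L.marker([d.lat, d.lng])
      .addTo(map)
      .bindPopup('<b>' + d.name + '</b><br/>Total donations: ' + d.total);
  });
  return map;
}
```

Because Leaflet is loaded from your own static resource over HTTPS, it sidesteps the Content Security Policy error that blocked the Google Maps script.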

Finally, I could begin building my app!

Building my App!

  1. I created a new Lightning component within my org by navigating to Setup – Lightning App Builder – New.
  2. I navigated to the Developer Console in order to develop my custom components. As per your business requirement, you can also leverage the standard components provided by Salesforce: Filter List, Recent Items, Report Chart, Rich Text and Visualforce.
  3. I dragged and dropped the components I needed onto the app screen.
  4. As soon as my custom components were created, I got a component page (.cmp), a component controller (.js) and a helper (.js). In order to interact with Salesforce or use Apex functionality, I now needed to create a class.
  5. I also had to add the @AuraEnabled annotation to the methods I needed to access within my custom components.

Developing my Component Page

  1. I needed to include the name of the controller and the force:appHostable and flexipage:availableForAllPageTypes interfaces. For details on this, please refer to Salesforce’s documentation on Lightning.
  2. Thereafter, I had to include the CSS and JavaScript libraries that I needed within a static resource. Don’t forget to reference their URLs within your component page with the help of the ltng:require tag.
  3. I added the dependencies, handlers and attributes I would require with the help of aura tags.

    It should be noted that sometimes navigation does not work if you do not include the dependency resource markup://force:navigateToSObject

    Now I had to design my HTML as per the need.
  1. I had to make space for my map, which would be loaded through the leaflet.js file.
  2. Design your markers and pop-ups as needed. In my case, I was fetching Account details and links to Opportunity records by calling controller functions.
  3. The controllers were called using {!c.<function name>}
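Put together, the skeleton of the component page looks something like this; the Apex controller name, static resource paths and attribute names are illustrative assumptions:

```xml
<!-- Sketch of the component page; resource, class and attribute names are assumptions -->
<aura:component controller="DonationApexController"
                implements="force:appHostable,flexipage:availableForAllPageTypes">
    <!-- Load Leaflet from a static resource before initializing the map -->
    <ltng:require styles="/resource/leaflet/leaflet.css"
                  scripts="/resource/leaflet/leaflet.js"
                  afterScriptsLoaded="{!c.doInit}"/>
    <!-- Without this dependency, navigation to records may not work -->
    <aura:dependency resource="markup://force:navigateToSObject"/>
    <aura:attribute name="accounts" type="Account[]"/>

    <!-- Placeholder div that leaflet.js renders the map into -->
    <div id="map" style="height:400px"></div>
</aura:component>
```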


    Let’s tweak this a little and add Opportunity and related Account details from the pop-up below our map, to enhance the functionality of our app!
  1. The c.getOpportunity() function will bring Account and related Opportunity details from my Salesforce org. We will discuss this function a little later.
  2. In order to access Salesforce objects inside our HTML, we usually reference the object in the following manner:
    {!acc.Name}
However, when you want to use the objects and fields within HTML tags, you need to use them a little differently.



My app now looked like:


Once you click on View Details, the details of all donations made till date for that particular donor, along with the basic donor details, are listed in chronological order, as shown above. Basic donor details include the Account name, phone number, address and email, meaning that if you need to contact the donor immediately, the information is readily available for you.

Developing my Controller & Helper

But none of these functionalities will run on their own. I needed to create a controller and a helper to execute them.
  1. As you know, init() is the first function to load, even before page load. With the help of my init() handler, my doInit() function was loaded on the controller page, calling my function for retrieving the accounts.
  2. The getAccountOnMap() helper method interacted with my class and fetched the account details, which are set in the view with the help of the component.set() function.
  3. For my requirement, I needed to color-code the markers based on the value of the roll-up summary field. For this, I used simple marker tags with appropriate conditions in the logic, and attached them to the billing address of the account on the map.
  4. For View Details, I used the getOpportunity() function in my controller, which calls two helper methods: one to fetch the account details and the other to fetch the donation details. You can add your own functions as per your requirement.
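The controller side of this can be sketched as follows. It is written as a named object here for readability; in an actual Aura controller resource the object literal is simply wrapped in parentheses. The helper method names and the v.selectedAccountId attribute are my assumptions:

```javascript
// Sketch of the client-side controller: each action delegates to the helper.
var DonationManagerController = {
  // Runs on init, before the page is usable; kicks off the account retrieval.
  doInit: function (component, event, helper) {
    helper.getAccountOnMap(component);
  },
  // Wired to the View Details click; fetches account and donation details.
  getOpportunity: function (component, event, helper) {
    var accountId = component.get("v.selectedAccountId"); // assumed attribute
    helper.getAccountDetails(component, accountId);
    helper.getDonationDetails(component, accountId);
  }
};
```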


    Let’s move on to my helper class.
  5. The getAccountOnMap() function is called from the controller; it fetches data from my Salesforce class, including all the modifications, data parsing, markers and pop-up details, which are populated on the map.


  6. You can add functionality as per your requirement. I have added conditions for the markers and set the attributes for the map as shown below.

  7. Once the user clicks on View Details, they should see the Account and donation information. For this, I called two helper functions from my controller.
  8. Do not forget to add actions and pass parameters via the setParams() method. In my case, I needed to pass the Account ID and call my Apex class.
  9. The response from the Apex class is passed back in the setCallback() method.
  10. I needed to set all the results I required with the help of the component.set() method and pass them to the view as v.<attribute> expressions on my component page.
  11. Finally, I needed to enqueue the action using $A.enqueueAction().
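Steps 7 to 11 boil down to the standard Aura server-call pattern. A sketch of one helper method; the Apex method name c.getDonationDetails and the v.donations attribute are my assumptions:

```javascript
// Sketch of a helper method: call an @AuraEnabled Apex method, pass the
// Account ID, and push the response into the view.
var DonationManagerHelper = {
  getDonationDetails: function (component, accountId) {
    // "c.getDonationDetails" is an assumed @AuraEnabled Apex method name.
    var action = component.get("c.getDonationDetails");
    action.setParams({ accountId: accountId });          // step 8: pass parameters
    action.setCallback(this, function (response) {      // step 9: handle the response
      if (response.getState() === "SUCCESS") {
        // step 10: expose the result to the view via an assumed attribute
        component.set("v.donations", response.getReturnValue());
      }
    });
    $A.enqueueAction(action);                            // step 11: enqueue the action
  }
};
```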


Last, but not the least…

Don’t forget to use the @AuraEnabled annotation on any Apex method you need to fetch details from.

Tadaa! I can now deploy my app and see the power of lightning unfold into a beautiful app!

Deploying the App

  1. Go to Setup – Lightning App Builder – Create New App
  2. Add your components as per the requirement. I added two reports, discussed later in this blog.
  3. Save your settings and hit Activate.
Your app is good to go!

The App works because…

  • Lightning-quick: It is built 100% using Lightning Components, leading to faster, client-side execution.
  • All the information is within a single page! Say goodbye to navigating back and forth for information!
  • Rapid Development cycle, leading to greater productivity and enhanced ROI
  • Reusable: You can pick up this component and reuse it anywhere with minimal tweaks! The power of Lightning!
  • No dependency on any particular browser or device: thanks to Lightning Components, the app is responsive and has a seamless interface.

Lightning enables you to build amazing apps really quickly, and these apps can be purchased and sold on AppExchange. A beautiful new world of amazing apps beckons you, deep dive into it today!



Written by Ashwini Singh, Salesforce Developer at Eternus Solutions