Blog

Blogging on programming and life in general.

  • I like to keep my blob containers quite tidy and delete any files that would unnecessarily increase their size. For a project I was working on, I had a blob container that was being used to temporarily store images a user uploaded for manipulation at a later time. I saw no reason to keep these files for longer than 24 hours. An Azure WebJob seemed an ideal solution to do this.

    I could have left the blob container to stagnate and fester over time; the reasoning behind creating a cleanup task wasn't cost, since a blob container is very reasonably priced for the amount of storage and requests I would be making. I was more concerned about performance at times when I would be trawling through many thousands of files to get back the image a user had uploaded for temporary use by my web application.

    Creating an Azure WebJob is very easy and versatile. You have the flexibility to develop a WebJob by creating the following scripts or programs:

    • .cmd, .bat, .exe (using Windows cmd)
    • .ps1 (using PowerShell)
    • .sh (using Bash)
    • .php (using PHP)
    • .py (using Python)
    • .js (using Node.js)
    • .jar (using Java)

    In this post, I will be developing my WebJob using a Console Application that will generate an executable. In Visual Studio 2017, there are two ways you can go about creating a project for your WebJob:

    1. Console Application project
    2. Azure WebJob project - which you will find under the "Cloud" category.

    If you create your WebJob using a Console Application, you will still have the option later on to "Publish as an Azure WebJob..." when right-clicking on the project. In the code below I happened to be using a Console Application only because I didn't even know an Azure WebJob project existed until after I completed development on my project. Doh!

    Program.cs

    I have created a new project called "Site.AzureWebJob.Cleanup". The project references the Azure Storage NuGet package (WindowsAzure.Storage) for the blob operations you see below:

    
    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Linq;

    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    namespace Site.AzureWebJob.Cleanup
    {
        class Program
        {
            static void Main(string[] args)
            {
                try
                {
                    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<Insert Storage Connection String Here>");
    
                    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer dataContainer = blobClient.GetContainerReference("<Blob container name>");
    
                    Console.WriteLine("Hourly threshold to remove records: {0}", ConfigurationManager.AppSettings["Azure.CleanupHours"]);
    
                    #region Retrieve all data items older than 24 hours and delete them
    
                    Console.WriteLine("Retrieving old data files...");
    
                    // Get files where the "Last Modified Date" is older than the configured threshold.
                    int cleanupHours = int.Parse(ConfigurationManager.AppSettings["Azure.CleanupHours"]);

                    IEnumerable<CloudBlob> oldData = dataContainer.ListBlobs()
                                    .OfType<CloudBlob>()
                                    .Where(b => b.Properties.LastModified.HasValue &&
                                                b.Properties.LastModified.Value.UtcDateTime < DateTime.UtcNow.AddHours(-cleanupHours));
    
                    IList<CloudBlob> dataBlobs = oldData as IList<CloudBlob> ?? oldData.ToList();
    
                    Console.WriteLine("Data records retrieved: {0}.", dataBlobs.Count);
                    Console.WriteLine("Removing old data files...");
    
                    // Loop through the files and delete if they exist.
                    foreach (CloudBlob dataBlob in dataBlobs)
                    {
                        bool isDeleted = dataBlob.DeleteIfExists();
    
                        if (isDeleted)
                            Console.WriteLine("Deleted: {0}.", dataBlob.Name);
                    }
    
                    #endregion
    
                    Console.WriteLine("Removing old data complete.");
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Error cleaning container files: {0}", ex.Message);
                }
    
                Console.WriteLine("Clean Containers WebJob complete.");
            }
        }
    }
    

    There isn't really much to it. All I am doing is retrieving all files that are older than 24 hours (the value set within an App.config app setting called "Azure.CleanupHours") and then carrying out the delete process by looping through the records returned.
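
    For reference, the corresponding App.config entry would look something like this (a minimal sketch; the key matches the one read in the code above and the value is the age threshold in hours):

    <configuration>
      <appSettings>
        <!-- Blobs older than this number of hours are removed by the WebJob. -->
        <add key="Azure.CleanupHours" value="24" />
      </appSettings>
    </configuration>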

    The safest way to delete a file is to use the CloudBlob.DeleteIfExists() call. As the method name suggests, it will only delete a file if it exists. Using CloudBlob.Delete() will throw an exception if for some reason the file isn't there, requiring additional error handling.
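
    To illustrate the difference (a sketch; the catch filter assumes the WindowsAzure.Storage StorageException type):

    // Safe delete: returns false instead of throwing when the blob is missing.
    bool deleted = dataBlob.DeleteIfExists();

    // Delete() needs explicit handling for the "blob not found" case.
    try
    {
        dataBlob.Delete();
    }
    catch (StorageException ex) when (ex.RequestInformation.HttpStatusCode == 404)
    {
        // The blob was already gone - nothing to do.
    }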

    Final Steps

    Now that we have our Azure WebJob ready to go, the only thing left is to publish it to your Azure Web App by right-clicking on your project and selecting "Publish as an Azure WebJob...". Here you will connect to your Azure instance and have the option to choose how your WebJob should run:

    • Continuously
    • On Demand
    • On Schedule
  • Reading and writing files from an external application to Salesforce has always given me quite the headache... Writing to Salesforce probably exacerbates things more than reading. I will aim to detail in a separate post how you can write a file to Salesforce.

    In this post I will demonstrate how to read a file found in the "Notes & Attachments" area of Salesforce as well as getting back all information about that file.

    The first thing we need is our attachment object, to get back all information about our file. I created one called "AttachmentInfo":

    public class AttachmentInfo
    {
        public string Id { get; set; }
        public string Name { get; set; }
        public string Description { get; set; }
        public string BodyLength { get; set; }
        public string ContentType { get; set; }
        public byte[] FileBytes { get; set; }
    }
    

    I created two methods in a class named "AttachmentInfoProvider". Both methods are pretty straightforward and retrieve data from Salesforce using a custom GetRows() method that is part of another class I created: ObjectDetailInfoProvider. You can get the code for this from the following blog post - Salesforce .NET API: Select/Insert/Update Methods.

    GetAttachmentsDataByParentId() Method

    /// <summary>
    /// Gets all attachments that belong to an object. For example a contact.
    /// </summary>
    /// <param name="parentId"></param>
    /// <param name="fileNameMatch"></param>
    /// <param name="orderBy"></param>
    /// <returns></returns>
    public static async Task<List<AttachmentInfo>> GetAttachmentsDataByParentId(string parentId, string fileNameMatch, string orderBy)
    {
        string cacheKey = $"GetAttachmentsByParentId|{parentId}|{fileNameMatch}";
    
        List<AttachmentInfo> attachments = CacheEngine.Get<List<AttachmentInfo>>(cacheKey);
    
        if (attachments == null)
        {
        // Filter by the parent record so only its attachments are returned.
        string whereCondition = $"ParentId = '{parentId}'";

        if (!string.IsNullOrEmpty(fileNameMatch))
            whereCondition += $" AND Name LIKE '%{fileNameMatch}%'";

        List<dynamic> attachmentObjects = await ObjectDetailInfoProvider.GetRows("Attachment", new List<string> {"Id", "Name", "Description", "BodyLength", "ContentType"}, whereCondition, orderBy);
    
            if (attachmentObjects.Any())
            {
                attachments = attachmentObjects.Select(attObj => new AttachmentInfo
                {
                    Id = attObj.Id,
                    Name = attObj.Name,
                    Description = attObj.Description,
                    BodyLength = attObj.BodyLength,
                    ContentType = attObj.ContentType
                }).ToList();
    
            // Add the collection of attachments to cache.
                CacheEngine.Add(attachments, cacheKey, 15);
            }
        }
    
        return attachments;
    }
    

    The GetAttachmentsDataByParentId() method takes in three parameters:

    • parentId: The ID that links an attachment to another object. For example, a contact.
    • fileNameMatch: The name of the file you wish to search for. For the most flexibility, a wildcard search is performed.
    • orderBy: Order the returned dataset.

    If you're thinking this method alone will return the file itself, you'd be disappointed - this is where our next method GetFile() comes into play.
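
    A quick example of how I would expect this to be called (the parent record ID is made up):

    // Get all PDF attachments belonging to a contact, newest first.
    List<AttachmentInfo> attachments = await AttachmentInfoProvider.GetAttachmentsDataByParentId(
        "0036E000002IpKq",        // Parent record ID - e.g. a contact.
        ".pdf",                   // Wildcard match on the file name.
        "LastModifiedDate DESC"); // Most recently modified first.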

    GetFile() Method

    /// <summary>
    /// Gets attachment in its raw form ready for transformation to a physical file, in addition to its file attributes.
    /// </summary>
    /// <param name="attachmentId"></param>
    /// <returns></returns>
    public static async Task<AttachmentInfo> GetFile(string attachmentId)
    {
        List<dynamic> attachmentObjects = await ObjectDetailInfoProvider.GetRows("Attachment", new List<string> {"Id", "Name", "Description", "BodyLength", "ContentType"}, $"Id = '{attachmentId}'", string.Empty);
    
        if (attachmentObjects.Any())
        {
            AttachmentInfo attachInfo = new AttachmentInfo();
    
            #region Get Core File Information
    
            attachInfo.Id = attachmentObjects[0].Id;
            attachInfo.Name = attachmentObjects[0].Name;
            attachInfo.BodyLength = attachmentObjects[0].BodyLength;
            attachInfo.ContentType = attachmentObjects[0].ContentType;
    
            #endregion
    
            #region Get Attachment As Byte Array
    
            Authentication salesforceAuth = await AuthenticationResponse.Rest();
    
            HttpClient queryClient = new HttpClient();
    
            string apiUrl = $"{SalesforceConfig.PlatformUrl}/services/data/v37.0/sobjects/Attachment/{attachmentId}/Body";
    
            HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, apiUrl);
            request.Headers.Add("Authorization", $"OAuth {salesforceAuth.AccessToken}");
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
    
            HttpResponseMessage response = await queryClient.SendAsync(request);
    
            if (response.StatusCode == HttpStatusCode.OK)
                attachInfo.FileBytes = await response.Content.ReadAsByteArrayAsync();
    
            #endregion
    
            return attachInfo;
        }
        else
        {
            return null;
        }
    }
    

    An attachment ID is all we need to get back a file in its raw form. You will probably notice some similar functionality happening in this method where I am populating all fields of the AttachmentInfo object, just like the GetAttachmentsDataByParentId() method detailed above. The only difference is that this time round a single file is returned.

    The reason behind this approach comes from a performance standpoint. I could have modified the GetAttachmentsDataByParentId() method to also return the file in its byte form. However, this didn't seem a good approach, since we could be outputting multiple large files. Making a separate call that focuses on getting the physical file seemed like a wise approach.

    To take things one step further, you can render the attachment from Salesforce within your ASP.NET application using a Generic Handler (.ashx file):

    <%@ WebHandler Language="C#" Class="SalesforceFileHandler" %>
    
    using System;
    using System.Text;
    using System.Threading.Tasks;
    using System.Web;
    using Site.Salesforce;
    using Site.Salesforce.Models.Attachment;
    
    public class SalesforceFileHandler : HttpTaskAsyncHandler
    {
        public override async Task ProcessRequestAsync(HttpContext context)
        {
            string fileId = context.Request.QueryString["FileId"];
        
            // Check if there is a File ID in the query string.
            if (!string.IsNullOrEmpty(fileId))
            {
                AttachmentInfo attachment = await AttachmentInfoProvider.GetFile(fileId);
    
                // If attachment is returned, render to the browser window.
                if (attachment != null)
                {
                context.Response.Buffer = true;

                // Set the content type and file name before writing the bytes.
                context.Response.ContentType = attachment.ContentType;
                context.Response.AppendHeader("Content-Disposition", $"attachment; filename=\"{attachment.Name}\"");

                // Write the file bytes to the response.
                context.Response.BinaryWrite(attachment.FileBytes);
                }
                else
                {
                    context.Response.ContentType = "text/plain";
                    context.Response.Write("Invalid File");
                }
            }
            else
            {
                context.Response.ContentType = "text/plain";
                context.Response.Write("Invalid Request");
            }
    
            context.Response.Flush();
            context.Response.End();
        }
    }
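
    With the handler in place, rendering a Salesforce attachment is just a matter of linking to it. Assuming the handler is saved as "SalesforceFileHandler.ashx" (the file name here is hypothetical), a link would look something like: /SalesforceFileHandler.ashx?FileId=<attachment ID>.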
    
  • Ever since I re-developed my website in Kentico 10 using Portal Templates, I have been a bit more daring when it comes to immersing myself into the inner depths of Kentico's API and, more importantly, K# macro development. One thing that had been on my list of to-dos for a long time was to create a custom macro extension that would render all required META open graph tags in a page.

    Adding these types of META tags using ASPX templates or the MVC framework is really easy to do when you have full control over the page markup. I'll admit, I don't know if there is already an easier way to do what I am trying to accomplish (if there is, let me know), but I think this macro is quite flexible, with the capability to expand the open graph output further.

    This is how I currently render the Meta HTML within my own website at masterpage level (click for a larger image):

    Open Graph HTML In Masterpage

    I instantly had problems with this approach:

    1. The code output is a mess.
    2. Efficiency from a performance standpoint does not meet my expectations.
    3. Code maintainability is not straight-forward, especially if you have to update this code within the Page Template Header content.

    CurrentDocument.OpenGraph() Custom Macro

    I highly recommend reading Kentico's documentation on Registering Custom Macro Methods before adding my code. It will give you more of an insight into what can be done that my blog post alone will not cover. The implementation of my macro has been developed for a Kentico site that is a Web Application, and has been added to the "Old_App_Code" directory.

    using System;
    using System.Text;
    using System.Web;

    using CMS;
    using CMS.DocumentEngine;
    using CMS.Helpers;
    using CMS.MacroEngine;

    // (Usings for the site's own helper classes - KenticoConstants, Config, KenticoHelper - are omitted.)

    // Registers methods from the 'CustomMacroMethods' container as macro extensions of the TreeNode object
    [assembly: RegisterExtension(typeof(CustomMacroMethods), typeof(TreeNode))]
    namespace CMSApp.Old_App_Code.Macros
    {
        public class CustomMacroMethods : MacroMethodContainer
        {
        [MacroMethod(typeof(string), "Generates Open Graph META tags", 1)]
            [MacroMethodParam(0, "param1", typeof(string), "Default share image")]
            public static object OpenGraph(EvaluationContext context, params object[] parameters)
            {
            if (parameters.Length == 2)
                {
                    #region Parameter variables
    
                // Parameter 1: The current document (the TreeNode the extension is invoked on).
                TreeNode tnDoc = parameters[0] as TreeNode;

                // Parameter 2: Default social icon.
                string defaultSocialIcon = parameters[1].ToString();
    
                    #endregion
    
                    string metaTags = CacheHelper.Cache(
                        cs =>
                        {
                            string domainUrl = $"{HttpContext.Current.Request.Url.Scheme}{Uri.SchemeDelimiter}{HttpContext.Current.Request.Url.Host}{(!HttpContext.Current.Request.Url.IsDefaultPort ? $":{HttpContext.Current.Request.Url.Port}" : null)}";
    
                            StringBuilder metaTagBuilder = new StringBuilder();
    
                            #region General OG Tags
                            
                            metaTagBuilder.Append($"<meta property=\"og:title\" content=\"{DocumentContext.CurrentTitle}\"/>\n");
    
                            if (tnDoc.ClassName == KenticoConstants.Page.BlogPost)
                                metaTagBuilder.Append($"<meta property=\"og:description\" content=\"{tnDoc.GetValue("BlogPostSummary", string.Empty).RemoveHtml()}\" />\n");
                            else
                                metaTagBuilder.Append($"<meta property=\"og:description\" content=\"{tnDoc.DocumentPageDescription}\" />\n");
    
                            if (tnDoc.GetValue("ShareImageUrl", string.Empty) != string.Empty)
                                metaTagBuilder.Append($"<meta property=\"og:image\" content=\"{domainUrl}{tnDoc.GetStringValue("ShareImageUrl", string.Empty).Replace("~", string.Empty)}?width=600\" />\n");
                            else
                                metaTagBuilder.Append($"<meta property=\"og:image\" content=\"{domainUrl}/{defaultSocialIcon}\" />\n");
    
                            #endregion
    
                            #region Twitter OG Tags
    
                            if (tnDoc.ClassName == KenticoConstants.Page.BlogPost || tnDoc.ClassName == KenticoConstants.Page.GenericContent)
                                metaTagBuilder.Append("<meta property=\"og:type\" content=\"article\" />\n");
                            else
                                metaTagBuilder.Append("<meta property=\"og:type\" content=\"website\" />\n");
    
                            metaTagBuilder.Append($"<meta name=\"twitter:site\" content=\"@{Config.Twitter.Account}\" />\n");
                            metaTagBuilder.Append($"<meta name=\"twitter:title\" content=\"{DocumentContext.CurrentTitle}\" />\n");
                            metaTagBuilder.Append("<meta name=\"twitter:card\" content=\"summary\" />\n");
    
                            if (tnDoc.ClassName == KenticoConstants.Page.BlogPost)
                                metaTagBuilder.Append($"<meta property=\"twitter:description\" content=\"{tnDoc.GetValue("BlogPostSummary", string.Empty).RemoveHtml()}\" />\n");
                            else
                                metaTagBuilder.Append($"<meta property=\"twitter:description\" content=\"{tnDoc.DocumentPageDescription}\" />\n");
    
                            if (tnDoc.GetValue("ShareImageUrl", string.Empty) != string.Empty)
                                metaTagBuilder.Append($"<meta property=\"twitter:image\" content=\"{domainUrl}{tnDoc.GetStringValue("ShareImageUrl", string.Empty).Replace("~", string.Empty)}?width=600\" />");
                            else
                                metaTagBuilder.Append($"<meta property=\"twitter:image\" content=\"{domainUrl}/{defaultSocialIcon}\" />");
    
                            #endregion
    
                            // Setup the cache dependencies only when caching is active.
                            if (cs.Cached)
                                cs.CacheDependency = CacheHelper.GetCacheDependency($"documentid|{tnDoc.DocumentID}");
    
                            return metaTagBuilder.ToString();
                        },
                        new CacheSettings(Config.Kentico.CacheMinutes, KenticoHelper.GetCacheKey($"OpenGraph|{tnDoc.DocumentID}"))
                    );
    
                    return metaTags;
                }
                else
                {
                    throw new NotSupportedException();
                }
            }
        }
    }
    

    This macro has been tailored specifically to my site's needs with regards to how I am populating the OG META tags, but is flexible enough to be modified based on a different site's needs. I am carrying out checks to determine what pages are classed as "article" or "website". In this case, I am looking out for my Blog Post and Generic Content pages.

    I am also being quite specific on how the OG description is populated. Since my website is very blog orientated, the focus is on populating the description fields with the "BlogPostSummary" field if the current page is a Blog Post, otherwise defaulting to the "DocumentPageDescription" field.

    Finally, I ensured that all article pages contained a new Page Type field called "ShareImageUrl", so that I have the option to choose a share image. This is not compulsory; if no image has been selected, the default share image you pass as a parameter to the macro will be used.

    Using the macro is pretty simple. In the header section of your Masterpage template, just add the following:

    Open Graph Macro Declaration
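
    In text form, the declaration is along these lines (the default share icon path is purely illustrative):

    {% CurrentDocument.OpenGraph("/App_Themes/Default/Images/default-share-icon.png") %}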

    As you can see, the OpenGraph() macro can be accessed by getting the current document and passing in a default share icon as a parameter.

    Macro Benchmark Results

    This is where things get interesting! I ran both macro implementations through Kentico's Benchmark tool to ensure I was on the right track and that all efforts to develop a custom macro extension weren't in vain. The proof is in the pudding (as they say!).

    Old Implementation

    Total runs: 1000
    Total benchmark time: 1.20367s
    Total run time: 1.20267s
    
    Average time per run: 0.00120s
    Min run time: 0.00000s
    Max run time: 0.01700s
    

    New Implementation - OpenGraph() Custom Macro

    Total runs: 1000
    Total benchmark time: 0.33222s
    Total run time: 0.33022s
    
    Average time per run: 0.00033s
    Min run time: 0.00000s
    Max run time: 0.01560s
    

    The good news is that the OpenGraph() macro approach has performed better than my previous approach across all benchmark results. I believe caching the META tag output is the main reason for this, as well as reusing the current document context when getting page values.

  • To continue my ever expanding Salesforce journey in the .NET world, I am adding some more features to my "ObjectDetailInfoProvider" class that I started writing in my previous post. This time making some nice easy, re-usable CRU(D) methods... just without the delete.

    All the methods query Salesforce using Force.com Toolkit for .NET, which I have slightly adapted to allow me to easily interchange to a traditional REST approach when required.

    Get Data

    /// <summary>
    /// Gets data from an object based on specified fields and conditions.
    /// </summary>
    /// <param name="objectName"></param>
    /// <param name="fields"></param>
    /// <param name="whereCondition"></param>
    /// <param name="orderBy"></param>
    /// <param name="max"></param>
    /// <returns></returns>
    public static async Task<List<dynamic>> GetRows(string objectName, List<string> fields, string whereCondition, string orderBy = null, int max = -1)
    {
        ForceClient client = await AuthenticationResponse.ForceCom();
    
        #region Construct SOQL Query
    
        StringBuilder query = new StringBuilder();
    
        query.Append("SELECT ");
    
        if (fields != null && fields.Any())
        {
            for (int c = 0; c <= fields.Count - 1; c++)
            {
                query.Append(fields[c]);
    
                query.Append(c != fields.Count - 1 ? ", " : " ");
            }
        }
        else
        {
            // SOQL has no "SELECT *", so fall back to the Id field when no fields are supplied.
            query.Append("Id ");
        }
    
        query.Append($"FROM {objectName} ");
    
        if (!string.IsNullOrEmpty(whereCondition))
            query.Append($"WHERE {whereCondition} ");
    
        if (!string.IsNullOrEmpty(orderBy))
            query.Append($"ORDER BY {orderBy}");
    
        if (max > 0)
            query.Append($" LIMIT {max}");
    
        #endregion
    
        // Pass the SOQL query to Salesforce.
        QueryResult<dynamic> results = await client.QueryAsync<dynamic>(query.ToString());
    
        return results.Records;
    }
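
    To give an idea of the SOQL this builds, here is a hedged usage example (standard Salesforce object and field names; the filter is illustrative):

    // Fetch the 10 most recently created contacts that have an email address.
    // Builds: SELECT Id, FirstName, LastName, Email FROM Contact
    //         WHERE Email != null ORDER BY CreatedDate DESC LIMIT 10
    List<dynamic> contacts = await ObjectDetailInfoProvider.GetRows(
        "Contact",
        new List<string> { "Id", "FirstName", "LastName", "Email" },
        "Email != null",
        "CreatedDate DESC",
        10);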
    

    Insert Row

    /// <summary>
    /// Creates a new row within a specific object.
    /// </summary>
    /// <param name="objectName"></param>
    /// <param name="fields"></param>
    /// <returns>Record ID</returns>
    public static async Task<string> InsertRow(string objectName, Dictionary<string, object> fields)
    {
        try
        {
            ForceClient client = await AuthenticationResponse.ForceCom();
    
            IDictionary<string, object> objectFields = new ExpandoObject();
    
            // Iterate through fields and populate dynamic object.
            foreach (KeyValuePair<string, object> f in fields)
                objectFields.Add(f.Key, f.Value);
    
            SuccessResponse response = await client.CreateAsync(objectName, objectFields);
    
            if (response.Success)
                return response.Id;
            else
                return string.Empty;
        }
        catch (Exception ex)
        {
            // Log error here.
    
            return string.Empty;
        }
    }
    

    Update Row

    /// <summary>
    /// Updates an existing row within a specific object.
    /// </summary>
    /// <param name="recordId"></param>
    /// <param name="objectName"></param>
    /// <param name="fields"></param>
    /// <returns>Record ID</returns>
    public static async Task<string> UpdateRow(string recordId, string objectName, Dictionary<string, object> fields)
    {
        try
        {
            ForceClient client = await AuthenticationResponse.ForceCom();
    
            IDictionary<string, object> objectFields = new ExpandoObject();
    
            // Iterate through fields and populate dynamic object.
            foreach (KeyValuePair<string, object> f in fields)
                objectFields.Add(f.Key, f.Value);
    
            SuccessResponse response = await client.UpdateAsync(objectName, recordId, objectFields);
    
            if (response.Success)
                return response.Id;
            else
                return string.Empty;
        }
        catch (Exception ex)
        {
            // Log error here.
    
            return string.Empty;
        }
    }
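
    Both methods are used in the same way, passing the field values as a dictionary (the values below are obviously illustrative):

    // Create a new contact...
    string recordId = await ObjectDetailInfoProvider.InsertRow("Contact",
        new Dictionary<string, object>
        {
            { "FirstName", "Surinder" },
            { "LastName", "Bhomra" },
            { "Email", "surinder@example.com" }
        });

    // ...then update its phone number using the record ID handed back.
    if (!string.IsNullOrEmpty(recordId))
        await ObjectDetailInfoProvider.UpdateRow(recordId, "Contact",
            new Dictionary<string, object> { { "Phone", "01234 567890" } });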
    

    The neat thing about the Insert and Update methods is that I am using an ExpandoObject, a dynamic data type introduced in .NET 4.0 that can represent dynamically changing data. It is ideal for the ultimate flexibility when it comes to passing field names and their values, as it allows you to add properties and methods on the fly and then access them again.

    If there is any other useful functionality to add to these methods, please leave a comment.

  • I have been doing a lot of Salesforce integration lately, which has been both interesting and fun. Throughout my time working on Salesforce, I noticed that I am making very similar calls when pulling information out for consumption into my website. So I decided to make an extra effort to develop methods that would allow me to re-use commonly used functionality in a class library to make overall coding quicker.

    I am adding all my Salesforce object query related functionality to a class object called "ObjectDetailInfoProvider". This will give me enough scope to expand with additional methods as I see fit.

    To start with, I decided to deal with returning all information from both picklist and multi-select picklist fields, since I constantly require their values for the vast number of forms I am developing. To be extra efficient on every request, I have taken the extra step to cache all returned data for a set period of time. I hate the idea of constantly hammering away at an API unless absolutely necessary.

    Before we get into it, it's worth noting that I am referencing a custom "AuthenticationResponse" class I created. You can grab the code here.

    Objects

    There are around seven class objects used purely for deserialization when receiving data from Salesforce. I'll admit I won't use all the fields the API has to offer, but I normally like to have a complete fieldset to hand in the event I require further data manipulation.

    The one to highlight out of all the class objects is "ObjectFieldPicklistValue", which stores key information about the picklist values, such as Label, Value and Active state. All methods will return this object.

    public class ObjectFieldPicklistValue
    {
        [JsonProperty("active")]
        public bool Active { get; set; }
    
        [JsonProperty("defaultValue")]
        public bool DefaultValue { get; set; }
    
        [JsonProperty("label")]
        public string Label { get; set; }
    
        [JsonProperty("validFor")]
        public string ValidFor { get; set; }
    
        [JsonProperty("value")]
        public string Value { get; set; }
    }
    

    I have added all other Object Field class objects to a snippets section on my Bitbucket account.

    GetPicklistFieldItems() & GetMultiSelectPicklistFieldItems() Methods

    Both methods perform similar functions; the only differences are the cache keys and the lambda expression used to pull out either a picklist or multi-select picklist by its field name.

    /// <summary>
    /// Gets values from a specific picklist within a Salesforce object. Items returned are cached for 15 minutes.
    /// </summary>
    /// <param name="objectApiName"></param>
    /// <param name="pickListFieldName"></param>
    /// <returns>Pick list values</returns>
    public static async Task<List<ObjectFieldPicklistValue>> GetPicklistFieldItems(string objectApiName, string pickListFieldName)
    {
        string cacheKey = $"GetPicklistFieldItems|{objectApiName}|{pickListFieldName}";
    
        List<ObjectFieldPicklistValue> pickListValues = CacheEngine.Get<List<ObjectFieldPicklistValue>>(cacheKey);
    
        if (pickListValues == null)
        {
            Authentication salesforceAuth = await AuthenticationResponse.Rest();
    
            HttpClient queryClient = new HttpClient();
    
            string apiUrl = $"{SalesforceConfig.PlatformUrl}services/data/v37.0/sobjects/{objectApiName}/describe";
    
            HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, apiUrl);
            request.Headers.Add("Authorization", $"Bearer {salesforceAuth.AccessToken}");
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
                    
            HttpResponseMessage response = await queryClient.SendAsync(request);
    
            string outputJson = await response.Content.ReadAsStringAsync();
    
            if (!string.IsNullOrEmpty(outputJson))
            {
                // Get all the fields information from the object.
                ObjectFieldInfo objectField = JsonConvert.DeserializeObject<ObjectFieldInfo>(outputJson);
    
                // Filter the fields to get the required picklist.
                ObjectField pickListField = objectField.Fields.FirstOrDefault(of => of.Name == pickListFieldName && of.Type == "picklist");
                        
                List<ObjectFieldPicklistValue> picklistItems = pickListField?.PicklistValues.ToList();
    
                #region Set cache
    
                pickListValues = picklistItems;
    
                // Add collection of pick list items to cache.
                CacheEngine.Add(picklistItems, cacheKey, 15);
    
                #endregion
            }
        }
    
        return pickListValues;
    }
    
    /// <summary>
    /// Gets values from a specific multi-select picklist within a Salesforce object. Items returned are cached for 15 minutes.
    /// </summary>
    /// <param name="objectApiName"></param>
    /// <param name="pickListFieldName"></param>
    /// <returns>Pick list values</returns>
    public static async Task<List<ObjectFieldPicklistValue>> GetMultiSelectPicklistFieldItems(string objectApiName, string pickListFieldName)
    {
        string cacheKey = $"GetMultiSelectPicklistFieldItems|{objectApiName}|{pickListFieldName}";
    
        List<ObjectFieldPicklistValue> pickListValues = CacheEngine.Get<List<ObjectFieldPicklistValue>>(cacheKey);
    
        if (pickListValues == null)
        {
            Authentication salesforceAuth = await AuthenticationResponse.Rest();
    
            HttpClient queryClient = new HttpClient();
    
            string apiUrl = $"{SalesforceConfig.PlatformUrl}services/data/v37.0/sobjects/{objectApiName}/describe";
    
            HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, apiUrl);
            request.Headers.Add("Authorization", $"Bearer {salesforceAuth.AccessToken}");
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
    
            HttpResponseMessage response = await queryClient.SendAsync(request);
    
            string outputJson = await response.Content.ReadAsStringAsync();
    
            if (!string.IsNullOrEmpty(outputJson))
            {
                // Get all the fields information from the object.
                ObjectFieldInfo objectField = JsonConvert.DeserializeObject<ObjectFieldInfo>(outputJson);
    
                // Filter the fields to get the required picklist.
                ObjectField pickListField = objectField.Fields.FirstOrDefault(of => of.Name == pickListFieldName && of.Type == "multipicklist");
    
                List<ObjectFieldPicklistValue> picklistItems = pickListField?.PicklistValues.ToList();
    
                #region Set cache
    
                pickListValues = picklistItems;
    
                // Add collection of pick list items to cache.
                CacheEngine.Add(picklistItems, cacheKey, 15);
    
                #endregion
            }
        }
    
        return pickListValues;
    }
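
    Usage is then a one-liner; for example, binding the standard "LeadSource" picklist from the Lead object (a sketch; any object/field pairing works):

    List<ObjectFieldPicklistValue> leadSources =
        await ObjectDetailInfoProvider.GetPicklistFieldItems("Lead", "LeadSource");

    // The method returns null if the field can't be found, so check first.
    if (leadSources != null)
    {
        foreach (ObjectFieldPicklistValue item in leadSources.Where(pv => pv.Active))
            Console.WriteLine($"{item.Label}: {item.Value}");
    }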
    
  • This site has been longing for an overhaul, both visually and especially behind the scenes. As you most likely have noticed, nothing has changed visually at this point in time - still using the home-cooked "Surinder theme". This should suffice in the meantime as it currently meets my basic requirements:

    • Bootstrapped to look good on various devices
    • Simple
    • Function over form - prioritises content first over "snazzy" design

    However, behind the scenes is a different story altogether, and this is where I believe it matters most. After all, half of web users expect a site to load in 2 seconds or less and they tend to abandon a site that isn't loaded within 3 seconds. Damning statistics!

    The last time I overhauled the site was back in 2014, where I took a more substantial step forward to current standards. What has changed since then? I have upgraded to Kentico 10, but this time using ASP.NET Web Forms over MVC.

    Using the ASP.NET Web Forms approach over MVC was a very difficult decision for me. It felt like I was taking a backwards step in making my site better. I'm the kind of developer who gets a kick out of nice clean code output, and MVC fulfils this requirement. Unfortunately, the new development approach for building MVC sites from Kentico 9 onwards will not work under a free license.

    The need to use Kentico as a platform was too great, even after toying with the idea of moving to a different platform altogether. I love having the flexibility to customise my website to my heart's content. So I had the option to either refit my site in Kentico 10 or Kentico Cloud. In the end, I chose Kentico 10. I will be writing in another post why I didn't opt for the latter. I'm still a major advocate of Kentico Cloud and have started using it on other projects.

    The developers at Kentico weren't lying when they said that Kentico 10 is "better, stronger, faster". It really is! I no longer get the spinning loader for an obscene duration of time whilst opening popups in the administration interface, or lengthy startup times when the application has to restart.

    Upgrading from Kentico 8.0 to 10 alone was a great start. I have taken some additional steps to keep my site as clean as possible:

    1. Disable view state on all pages, components and user controls (see the snippet after this list).
    2. Caching static files, such as CSS, JS and images. You can see how I do this at web.config level from this post.
    3. Maximising Kentico's cache dependencies to cache all data.
    4. Took the extra step to export all site contents into a fresh installation of Kentico 10, resulting in a slightly smaller web project and database size.
    5. Restructured pages in the content tree to be more efficient when storing a large number of pages under one section.
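
    For point 1, view state is switched off declaratively per template, page or control (a sketch of the attribute in question; the file names are hypothetical):

    <%@ Page Language="C#" EnableViewState="false" CodeBehind="BlogPost.aspx.cs" Inherits="BlogPost" %>
    <%@ Control Language="C#" EnableViewState="false" CodeBehind="LatestPosts.ascx.cs" Inherits="LatestPosts" %>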

    I basically carried out the recommendations on optimising website performance and then some! My cache statistics have never been so high!

    My Kentico 10 Cache Statistics

    One slight improvement (been a long time coming) is better open graph support when sharing pages on Facebook and Twitter. Now my links look pretty within a tweet.

  • I can only speak about my experiences from working in the technical industry, but there isn't a week that goes by when I am not being spammed by recruitment agencies who don't seem to get the message that I'm not interested. I can "almost" deal with the random emails I get from various agencies, but when you get targeted by a single person on a daily basis it gets infuriating!

    I remember back in the day when I was fresh out of university the sense of excitement I had whenever a recruiter phoned or emailed. The awesome feeling that I was in demand! I can still remember my first job interview, where to my horror I found out that the recruiter had "tweaked" my CV by adding skills that I didn't have any experience of, making me look like an absolute idiot in front of my interviewer. I thought they were my friend, only looking out for my best interests. In reality, this was not the case.

    As an outsider looking in, the recruitment industry seems to be a really cut-throat business where only one thing seems to matter: the numbers! Not whether a candidate is particularly right for the role. I am not tarring all recruiters with the same brush - there are some good guys out there, just not enough.

    A little while ago, I was "tag teamed" by two recruitment agents working at the same agency within such a short space of time. What I'd like to highlight here is that I did not respond to any of their correspondence. But the messages still kept rolling in.

    From Emma...

    Emma sent me standard emails and was quite persistent. One brief LinkedIn message accompanied by three direct emails.

    From: Emma
    Sent: 06 July 2016 14:37
    To: 'surinder@doesntwantajob.com
    Subject: New Development Opportunities based in the Oxfordshire area!


    Dear Surinder,


    How are you?


    I just wanted to catch up following on from my prior contact last week over LinkedIn, as mentioned I did come across your profile on LinkedIn and I would be keen to firstly introduce myself, as well as to catch up to find out if you could be open to hearing about anything new.


    As mentioned in my prior email, I am also recruiting for a .Net Developer for a leading organisation based in South Oxfordshire! I would be extremely keen to discuss this position with yourself in further detail.


    Please can you give me a call or drop me an email to let me know your thoughts either way. Hope to hear from you soon!


    Kind Regards,
    Emma

    From: Emma
    Sent: 12 July 2016 14:45
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested in working for a leading Software House, Surinder?


    Hi Surinder,


    How are you?


    Could you be interested in new roles at present?


    If so as mentioned below, I am working on a new .Net Developer role for a leading company based in South Oxfordshire. Please can you give me a call or drop me an email through to let me know your thoughts.


    I will look forwards to speaking with you soon!


    Kind Regards,  
    Emma

    From: Emma
    Sent: 15 July 2016 14:14
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested in working for a leading Software House, Surinder?


    Hi Surinder,


    How are you?


    Are you on the look out for new roles?


    If so as mentioned below, I am working on a new .Net Developer role for a leading company based in South Oxfordshire. Please can you give me a call or drop me an email through to let me know your thoughts.


    I will look forwards to speaking with you soon!


    Kind Regards,
    Emma

    From Becky...

    Now Becky cranked things up a notch or two. She was determined to get my attention and very persistent, I'll give her that. Her strategy consisted of LinkedIn messages, following me on Twitter and (like Emma) sending me a few emails.

    From: Becky
    Sent: 07 August 2015 11:37
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I’ve come across your profile on LinkedIn and it made me think you could possibly be interested in a Web Developer opportunity that I’m currently recruiting for, based at a leading company in West Oxfordshire. 


    I appreciate that you may not be actively looking at the moment, but I can see that you have been at for over 5 years now, so I wanted to approach you about this as I thought you could possibly be interested in a change?

    I’ve included some more information about the role attached.

    <Omitted Job Description>

    I can see you’re working in an agency environment at the moment at <Omitted Company>, which is obviously great for the variety of sites you get to work on. This role will give you that opportunity as well, but with the chance to engage more with the projects your working on, having a deeper involvement in the entire process.

    You will get to work with the latest versions of ASP.Net and C#. My client also gives you full access to Pluralsight. On top of all this, the working environment is the most beautiful in Oxfordshire and there are excellent environmental benefits. The salary is up to <Omitted Salary>.

    That was the first email. A standard recruiter's email, but she went the extra mile to personalise things based on my current experience. The second email gets a little further to the point, and I start to smell a sense of desperation.

    From: Becky
    Sent: 11 August 2015 12:34
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I just wanted to send you another email following up on my previous one below, regarding a Web Developer opportunity I am currently recruiting for based in West Oxfordshire. I have attached some more information for you to the email.


    I’d really like the opportunity to chat to you about this opportunity, as I think it could be a really great fit for you. The company are a great one to work for – they offer you fantastic environmental benefits, a beautiful working space, plus are keen to create an interesting and productive environment for developers with full access to Pluarlsight and the latest versions of ASP.Net and C#.
       
    If you are interested in discussing this with me further it would be great if you could get back to me! However, as I said in my previous email, if you’re not interested n pursuing new opportunities then please do just let me know and I shall remove you from our mailing list right away.


    Kind Regards
    Becky

    I think by the third and final email, Becky finally got the message and knew I wasn't going to take the bait. But she admirably tries to get something out of it by asking if I know of anyone else interested in the position she is offering.

    From: Becky
    Sent: 13 August 2015 14:10
    To: 'surinder@doesntwantajob.com
    Subject: Could you be interested?


    Hi Surinder,


    I just wanted to send you a final email about the role below and attached to this email. I think with your experience at <Omitted Company> you would be a great fit for it, so I would love to have a chat with you about it if you think you could be interested. However, if you’re not looking for a new role – seeing as you are an expert in this field – would you know anyone else with your skill set who may be interested in this position? If so, please do not hesitate to pass on my details!


    Kind Regards
    Becky


    Is it acceptable to go to these great lengths to get someone's attention? A single email alone should suffice. I understand they have a job to do, but do they really think this approach works? They're clearly not engaging candidates in the right way. I truly question the mentality here. Recruiters remind me of Terminators... just without the killing part.

    Recruitment Terminator Reference - It Can't Be Reasoned With...

    Doesn't sound hopeful, does it? But there are two things you can do to lessen the headache and make yourself less of a target, while at the same time still keeping in the loop (if you feel ever so inclined) with the good opportunities that may arise:

    • First and foremost, do NOT ever respond! Even if it is to tell them you're not interested. As soon as they know the email address is active and see signs of life, you'll never get them to leave you alone.
    • If you want to enquire about a position via a recruitment agent, use a different contact email address. At least you can ditch it in times of need.

    I loved reading the article titled: Stop The Recruiting Spam. Seriously.. An insightful read covering some really good points on the state of the recruitment industry.

    Note to my current employment and any recruiters: I'm happy where I am.

  • My custom Salesforce library that I readily use for any Salesforce integrations within my native .NET applications consists of a combination of handwritten code as well as utilising the functionality present within the Force.com Toolkit. The Force.com Toolkit does pretty much everything you need for day-to-day activities like basic read and write interactions, but when it comes to anything more, a custom approach is required.

    I have created an AuthenticationResponse class that contains two methods so I can easily interchange between different authentication processes depending on my needs:

    • Rest - Retrieves access token to Salesforce environment in a traditional REST approach.
    • ForceCom - Retrieves authentication details when API calls using Force.com toolkit is used.
    using System.Collections.Generic;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    using Newtonsoft.Json.Linq;
    using Salesforce.Common;
    using Salesforce.Force;

    // (Authentication and SalesforceConfig are the site's own classes.)
    public class AuthenticationResponse
    {
        /// <summary>
        /// Retrieves access token to Salesforce environment in a traditional REST approach.
        /// </summary>
        /// <returns></returns>
        public static async Task<Authentication> Rest()
        {
            HttpClient authClient = new HttpClient();
                
            // Set required values to be posted.
            HttpContent content = new FormUrlEncodedContent(new Dictionary<string, string>
                    {
                        {"grant_type","password"},
                        {"client_id", SalesforceConfig.ConsumerKey},
                        {"client_secret", SalesforceConfig.ConsumerSecret},
                        {"username", SalesforceConfig.Username},
                        {"password", SalesforceConfig.LoginPassword}
                    }
            );
                
            HttpResponseMessage message = await authClient.PostAsync($"{SalesforceConfig.PlatformUrl}services/oauth2/token", content);
    
            string responseString = await message.Content.ReadAsStringAsync();
    
            JObject obj = JObject.Parse(responseString);
    
            return new Authentication
            {
                AccessToken = obj["access_token"].ToString(),
                InstanceUrl = obj["instance_url"].ToString()
            };
        }
    
        /// <summary>
        /// Retrieves authentication details when API calls using Force.com toolkit is used.
        /// </summary>
        /// <returns></returns>
        public static async Task<ForceClient> ForceCom()
        {
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    
            AuthenticationClient auth = new AuthenticationClient();
    
            await auth.UsernamePasswordAsync(SalesforceConfig.ConsumerKey, SalesforceConfig.ConsumerSecret, SalesforceConfig.Username, SalesforceConfig.LoginPassword, $"{SalesforceConfig.PlatformUrl}services/oauth2/token");
    
            ForceClient client = new ForceClient(auth.InstanceUrl, auth.AccessToken, auth.ApiVersion);
    
            return client;
        }
    }
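
    Switching between the two is then trivial (a sketch; Authentication is the custom model returned by Rest(), and QueryAsync() comes from the Force.com Toolkit):

    // REST flavour: raw token and instance URL for hand-rolled API calls.
    Authentication auth = await AuthenticationResponse.Rest();
    Console.WriteLine($"Token {auth.AccessToken} issued for {auth.InstanceUrl}");

    // Toolkit flavour: a ready-to-use ForceClient.
    ForceClient client = await AuthenticationResponse.ForceCom();
    QueryResult<dynamic> accounts = await client.QueryAsync<dynamic>("SELECT Id, Name FROM Account LIMIT 5");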
    

    All configuration settings such as the consumer key, consumer secret, username and password are being read from the web.config via a "SalesforceConfig" class, but these can be replaced by calling directly from your own app settings. The Rest() method returns the access token and instance URL required for hand-rolled REST queries, while ForceCom() returns a ForceClient ready for querying the Salesforce platform.

  • As I have been writing the last few blog posts, I've been getting the case of "twitchy feet" during the writing process. I normally get "twitchy feet" when frustrated or annoyed by things in my life that I feel could be done easier. In this case, my site has started to frustrate me and felt that adding new posts became a chore.

    Over the 10 years (has it really been this long!?) of owning and maintaining this site, it's grown into a bit of a beast from its initial outset. I've jumped from platform to platform based on my needs at the time:

    • Wordpress (2006)
    • BlogEngine (2007 to 2012)
    • Kentico (2012 to present)

    I feel at the grand old age of 31, I need a platform that nurtures my writing creativity without having to worry about general maintenance and somewhat restrictive editorial functionality. Ever since I tasted the pure nectar that is Markdown, my writing speed has gone through the roof and I love having full control through the simplistic editing interface - Markdown is the future!

    I am a certified Kentico Developer (you may have got that impression from my vast number of posts on the platform) and specifically chose Kentico CMS because it gave me the full flexibility to build the site how I wanted. As great as the platform is, I've come to the conclusion that this site will never grow to be anything more than one thing: a blog. So I want to downsize, like a person getting on in his years moving to a smaller house.

    Enter Ghost...

    Ghost

    The Ghost platform has garnered a lot of traction ever since its inception in 2012. I've been keeping an eye on it over the years and never really gave the platform much thought until I noticed quite a few popular bloggers making the move and experiencing the lightning-fast performance. This is possibly down to the bloggers hosting their instances on Ghost Pro. Could be wrong. I am planning on going down the Ghost Pro hosting route and getting everything set up by the very nice people behind the scenes at Ghost HQ, who will lovingly host and look after my site.

    I opened up a dialogue on Twitter with Ghost, who were very kind in alleviating my initial migration worries:

    @SurinderBhomra We can upload images for you, if you send the upload directory in the format Ghost uses, i.e. /content/images/yyyy/mm/image-name
    — Ghost (@TryGhost) October 7, 2016

    @SurinderBhomra We can help with the redirects if you're coming over to Ghost(Pro). :)
    — Ghost (@TryGhost) October 6, 2016

    The only thing I will have to get over, which Ghost will not be able to help me with, is the mindset that I will not be able to tinker around with my site to the full extent I do now. But this isn't necessarily a bad thing and will give me the opportunity to concentrate more on writing quality content. I just hate the thought of restricting myself.

    Ghost has put a framework in place that no other platform has done so well - giving you the power to write content anywhere:

    • Desktop browser
    • Mobile browser
    • Desktop application

    Looks like Ghost lives up to its main selling point:

    An open source blogging platform which makes writing pleasurable and publishing simple.

    What I also love is the SEO optimisation out of the box. God knows how many hours I've spent trying to get my site SEO friendly, not only from a search indexing standpoint, but a social sharing standpoint too, with all the open graph tags built in. No need for extra plugins or development from a code perspective.

    What's Next?

    As it currently stands, I am evaluating Ghost through their 14-day trial and need to send an email to their support team before I make a confirmed decision to move. I like what I am seeing so far. Just need to find the time to put a migration process in place to move the 200 posts on this site. Eek!

    Ghost is definitely not as scary as I once thought. Cue Ray Parker Jr...

  • The Pursuit Of Happiness

    So it's finally come to this... A point in my life where I'm questioning what have I done to get to this place I currently find myself standing, wanting to make sense of an emotion that was so naturally built into my being from day one. But now, I am not too sure if it exists or ever did exist.

    The Sad Clown

    Before you read any further, I thought I'd just clarify that you won't find me talking about the performance of Will and Jaden Smith in the film The Pursuit of Happyness. The similarity between the title of the film and this post is purely coincidental.

    This year has been to what I can only describe as: turbulent. The complete opposite to what it should have been. It was going to be a year of pastures new. A seed of great things to come was planted, watered on a daily basis and nurtured to flourish into the start of something quite beautiful. Alas, like the state of my lawn it’s very much the case where no matter how much hard graft is invested to transforming something withered to greener pastures, it morphs back to its original state as nature intended. Some things cannot be changed.

    Why do I write this? That I do not know. Maybe writing my inner thoughts into words to stare back at me in its raw unforgiving form is the only way to come to terms with what I am facing. Let's call it: therapy.

    I look at my life and think I am a lucky person. I have nothing to complain about, yet I feel something is missing. As one day ends and another begins, I find myself wondering what I am trying to accomplish and questioning if I am doing everything in my power to remedy the wounds still open from earlier this year. Honest answer: probably not. Yesterday, I thought about what Friedrich Nietzsche said:

    If you stare into the abyss, the abyss stares back at you.

    By not confronting the wounds of yesterday, I'm consumed by being reminded of the painful events that have wedged themselves deep into my hippocampus, slowly eroding away my old self. But there is just enough of the small part of me that still exists to warn me that I am slowly edging mentally to the point of no return. So I am here writing this very post.

    If I don't start the healing process now, what I fear the most may come to fruition - others around me will notice the gaping hole where my left ventricle used to be. I have come to the conclusion that I'm not so good at being the great pretender over a considerable duration of time.

    With every letter I type, I slowly regain consciousness and become self-aware once again, coming to the realisation that this year has changed me. No doubt about that. But I'm stronger for it.

    If a human being's thoughts and emotions are truly boundless, then it's in our nature to have the capacity to forgive, forget and learn. By doing this, I can only hope the resulting outcome will be... happiness. In time this will happen. As they say, "time is a great healer". I take great comfort in that.