Blog

Blogging on programming and life in general.

  • Memories are what give life purpose. They allow us to go back to the past, into a time that shall forever be stateless. Most importantly, memories are experiences that mould us into the people we are today.

    For some reason, when I think about the word memories, the first things that come to mind are pictures... photos to be exact. I only started thinking about how important photos are to me whilst having a conversation with my cousin Tajesh. Tajesh popped over this weekend gone by and, like always, had many fascinating stories to tell. One story in particular got my attention. He told me about an amazing trip he had in Australia many, many years ago and how he lost all the photos he had taken after recently damaging his hard drive. On hearing his predicament, I was profoundly moved and imagined how I'd feel if I were in his position.

    Even though our brains are wired to remember events and experiences, memories seem to somehow fade away over time and we start forgetting the little details of images until they form into a hazy recall. We remember enough to transport us back to a time or a place, but the brain has a strange way of patching together what we once saw, as if they are pieces of a larger puzzle. If your brain is anything like mine, where you can only selectively retrieve the one piece of the puzzle that is most meaningful, you're missing a vast array of information.

    I decided I'd make an attempt to recover my cousin's lost photos. He handed over his Western Digital Caviar edition hard drive carefully enclosed in an old VHS box, trusting I'd have its best interests at heart and keep whatever memories may be locked away inside safe... A damaged hard drive is in some ways like our brain's selective recall: the data is stored somewhere, but we sometimes have problems accessing it.

    I'm no hard disk recovery expert and I am hoping some off-the-shelf software will help me get at least some photos back from his holiday. So what's the game plan?

    I'll start by using a piece of software I blogged about back in 2011 - EaseUs. EaseUs provides a line of software ranging from backup to recovery. It helped me then and (fingers crossed) it'll help me now. I'll also need a 3.5 inch disk caddy to allow the hard drive to be connected via USB so I can start the recovery process.

    As it stands, my cousin's Western Digital Caviar disk doesn't seem to have any visible damage and there are no strange noises when it runs. It just doesn't boot.

    Stay tuned for future posts on how I get on.

    To be continued...

  • My Time At Melia Bali Hotel

    Family. Family is what comes to mind when I think of my time at Melia Bali. I only happened to come to this conclusion as I checked out on the calm and (strangely) cool evening before having to depart back to the UK.

    I don't generally write about my travels (or lack of!), but my time at Melia has energised me to write something. As a result, I found myself scribbling away madly on my long flight back, trying to make sense of my erratic thoughts and feelings, just so I could write this post.

    Melia Bali Coconut On The Beach

    I find it ironic that I started writing this post about "family" when I happened to visit Bali with the people I deem most dear: mum, dad and sister. I feel that family forms the centrepiece of the service they provide.

    If you happen to have the privilege of staying at the Melia Bali Hotel for long enough, you get a sense of familiarity with the people working there. To some extent, I hope I am perhaps familiar to them - the Indian guy with the ridiculously frizzy hair (a result of the climate ;-) ).

    These people truly are the backbone of the hotel and make it what it is. Yes, Melia is a pretty place on the surface, but it's the people who make it truly shine when compared to the other hotels staggered along the beach shoreline. They are, without a doubt, amazing at what they do - from the lady who cooks me the most delicious fluffy omelette in the morning down to the lobby personnel who are willing to help with any query or concern.

    From the moment I wake up and make my way to the breakfast hall to the moment I enter the lobby at the end of a long day gallivanting, I am greeted with many smiles. You can't help but be infected with a sense of positivity and happiness, something I don't think I've ever come across when holidaying elsewhere.

    The lobby statues and wondrous ceiling mural make for a welcome sight at any time, setting the ambience and standard for the hotel. If you're lucky enough to be in the lobby during the evening, you'll be greeted by two classically trained Balinese dancers dressed as if from an era gone by. As they dance with delicate intricacy to the tune of the rindik, I am reminded of the similarities with classical Indian dances - a lost art and cultural heritage slowly eroding with time - making for a visceral experience and something you can't help but appreciate.

    Melia Bali Dancers

    Melia offers around five restaurants to cater for guests' varying palates - each with its own theme and cuisine. We found ourselves venturing out to nearby restaurants after a few nights, as we found the prices a little dear based on the portion sizes of the main meals. Even though the food was very tasty, my western belly expected something more sizeable. When taking into consideration the 21% combined tax and service charge on top of the prices on the menu - not so cheap. There are many fine eateries at the Bali Collection for consideration, just a 5-10 minute walk away.

    When there are issues, it is family who are there to support you in times of need. We happened to experience some very loud noises from the room above us at an unsociable hour, spoiling the serenity we had become accustomed to. This went on for a couple of nights. We just happened to mention our problem in passing to one of the workers whilst feasting on the morning breakfast buffet, and within a short period of time it was communicated to the customer service representative, who apologised and swiftly organised a room change.

    The rooms themselves are all very well maintained, clean and provide nice views from the balcony. Based on our room change, you can expect subtle differences in terms of what the rooms offer. For example, our first room just had a shower, but the second room had a shower/bath. My only quibble is the shower head position - not a deal breaker. There are generous bathroom amenities, consisting of toothbrush, toothpaste, vanity kit, shampoo, conditioner, shower gel, shaving kit and body lotion, all fully restocked daily. The crazy thing is that you get a new toothbrush every day! As much as I like a new toothbrush, I sometimes have to question the environmental impact.

    Melia Bali - View from Lobby

    Even though our noisy neighbour was no fault of theirs, Melia took us under their wing to ensure our holiday was perfect. We were even given a fruit platter for our troubles. What struck a chord with me before we departed is that the management seem to know everything about a guest's stay down to a granular level. The hotel manager hoped we'd come back to visit again, even with the minor inconvenience we experienced.

    If I do get another opportunity to visit, I will consider paying a little extra for "The Level" experience. I have to admit, I was quite envious of all the things I heard about this upgrade when talking to other guests on the beach. I felt like a pauper. I like the idea of having a little more privacy in terms of accommodation and your own space on the beach. More importantly, it's adults only. No noisy kids! :-)

    Melia Bali - Entrance At Night

    I look back at my time at Melia Bali with fond memories that will be permanently etched into my mind. For me, Melia complements Bali as the wondrous land of Bali complements Melia.

  • For one of my side projects, I was asked to use Butter CMS to allow for basic blog integration using JavaScript. I had never heard of or used Butter CMS before and was intrigued to know more about the platform.

    Butter CMS is another headless CMS variant that allows a developer to utilise API endpoints to push content to an application via a range of approaches. So nothing new here. Just like any headless CMS, the proof is in the pudding when it comes to the following factors:

    • Quality of features
    • Ease of integration
    • Price points
    • Quality of documentation

    I haven't had a chance to properly look into everything Butter CMS has to offer, but from what I have seen whilst working on the requirements for this side project, I was pleasantly surprised. I found it really easy to get set up with a minimal amount of fuss! For this project I used Butter CMS's Blog Engine package, which does exactly what it says on the tin. All the fields you need for writing blog posts are already provided.

    JavaScript Code

    My JavaScript implementation is pretty basic and provides the following functionality:

    • Outputs a list of posts consisting of title, date and summary text
    • Pagination
    • Outputs a single blog post

    All key functionality is derived from the "ButterCMS" JavaScript file:

    /*****************************************************/
    /*                     Butter CMS                    */
    /*****************************************************/
    var BEButterCMS =
    {
        ButterCmsObj: null,

        "Init": function () {
            // Initiate Butter CMS.
            this.ButterCmsObj = new ButterCmsBlogData();
            this.ButterCmsObj.Init();
        },
        "GetBlogPosts": function () {
            BEButterCMS.ButterCmsObj.GetBlogPosts(1);
        },
        "GetSinglePost": function (slug) {
            BEButterCMS.ButterCmsObj.GetSinglePost(slug);
        }
    };
    
    /*****************************************************/
    /*                  Butter CMS Data                  */
    /*****************************************************/
    function ButterCmsBlogData() {
        var apiKey = "<Enter API Key>",
            baseUrl = "/",
            butterInstance = null,
            $blogListingContainer = $("#posts"),
            $blogPostContainer = $("#post-individual"),
            pageSize = 10;
    
        // Initialise of the ButterCMSData object get the data.
        this.Init = function () {
            getCMSInstance();
        };
    
        // Returns a list of blog posts.
        this.GetBlogPosts = function (pageNo) {
            // The blog listing container needs to be cleared before any new markup is pushed.
            // For example when the next page of data is requested.
            $blogListingContainer.empty();
    
            // Request blog posts.
            butterInstance.post.list({ page: pageNo, page_size: pageSize }).then(function (resp) {
                var body = resp.data,
                    blogPostData = {
                        posts: body.data,
                        next_page: body.meta.next_page,
                        previous_page: body.meta.previous_page
                    };
    
                for (var i = 0; i < blogPostData.posts.length; i++) {
                    $blogListingContainer.append(blogPostListItem(blogPostData.posts[i]));
                }
    
                //----------BEGIN: Pagination--------------//

                var paginationHtml = "<div>";

                if (blogPostData.previous_page) {
                    paginationHtml += "<a class=\"page-nav\" href=\"#\" data-pageno=\"" + blogPostData.previous_page + "\">Previous Page</a>";
                }

                if (blogPostData.next_page) {
                    paginationHtml += "<a class=\"page-nav\" href=\"#\" data-pageno=\"" + blogPostData.next_page + "\">Next Page</a>";
                }

                paginationHtml += "</div>";

                // Append the pagination markup in one go - appending "<div>" and "</div>"
                // separately would cause jQuery to create two empty elements.
                $blogListingContainer.append(paginationHtml);

                paginationOnClick();

                //----------END: Pagination--------------//
            });
        };
    
        // Retrieves a single blog post based on the current URL of the page if a slug has not been provided.
        this.GetSinglePost = function (slug) {
            var currentPath = location.pathname,
                blogSlug = slug === null ? currentPath.match(/([^\/]*)\/*$/)[1] : slug;
    
            butterInstance.post.retrieve(blogSlug).then(function (resp) {
                var post = resp.data.data;
    
                $blogPostContainer.append(blogPost(post));
            });
        };
    
        // Renders the HTML markup and fields for a single post.
        function blogPost(post) {
            var html = "";
    
            html = "<article>";
    
            html += "<h1>" + post.title + "</h1>";
            html += "<div>" + blogPostDateFormat(post.created) + "</div>";
            html += "<div>" + post.body + "</div>";
            
            html += "</article>";
    
            return html;
        }
    
        // Renders the HTML markup and fields when listing out blog posts.
        function blogPostListItem(post) {
            var html = "";
    
            html = "<h2><a href=" + baseUrl + post.url + ">" + post.title + "</a></h2>";
            html += "<div>" + blogPostDateFormat(post.created) + "</div>";
            html += "<p>" + post.summary + "</p>";
    
            if (post.featured_image) {
                html += "<img src=" + post.featured_image + " />";
            }
    
            return html;
        }
    
        // Set click event for previous/next pagination buttons and reload the current data.
        function paginationOnClick() {
            $(".page-nav").on("click", function (e) {
                e.preventDefault();
                var pageNo = $(this).data("pageno"),
                    butterCmsObj = new ButterCmsBlogData();
    
                butterCmsObj.Init();
                butterCmsObj.GetBlogPosts(pageNo);
            });
        }
    
        // Format the blog post date to dd/MM/yyyy HH:mm
        function blogPostDateFormat(date) {
            var dateObj = new Date(date);
    
            return [dateObj.getDate().padLeft(), (dateObj.getMonth() + 1).padLeft(), dateObj.getFullYear()].join('/') + ' ' + [dateObj.getHours().padLeft(), dateObj.getMinutes().padLeft()].join(':');
        }
    
        // Get instance of Butter CMS on initialise to make one call.
        function getCMSInstance() {
            butterInstance = new Butter(apiKey);
        }
    }
    
    // Set a prototype for padding numerical values.
    Number.prototype.padLeft = function (base, chr) {
        var len = (String(base || 10).length - String(this).length) + 1;
    
        return len > 0 ? new Array(len).join(chr || '0') + this : this;
    };
    

    To get a list of blog posts:

    // Initiate Butter CMS.
    BEButterCMS.Init();
    
    // Get all blog posts.
    BEButterCMS.GetBlogPosts();
    

    To get a single blog post, you will need to pass in the slug of the blog post via your own approach:

    // Initiate Butter CMS.
    BEButterCMS.Init();
    
    // Get single blog post.
    BEButterCMS.GetSinglePost(postSlug);
    
  • If you have many sites running on your installation of Windows Server, you will soon find that there will be an accumulation of logs generated by IIS. Through my naivety, I presumed that there was a default setting in IIS that would only retain logs for a specific period of time. It was only over the last few weeks that I started noticing the hard disk space slowly getting smaller and smaller.

    Due to my sheer embarrassment, I won't divulge how much space the logs had taken up. All I can say is that it was quite a substantial amount. :-)

    After some Googling online, I came across a PowerShell script (which can be found here) that solved all my problems. The script targets your IIS logs folder and recursively looks for any file that contains ".log" for deletion. Unfortunately, the script did not run without making some minor modifications to the original source, due to changes in PowerShell versions since the post was written three years ago.

    $logPath = "C:\inetpub\logs\LogFiles" 
    $maxDaystoKeep = -5
    $cleanupRecordPath = "C:\Log_Cleanup.log" 
    
    $itemsToDelete = dir $logPath -Recurse -File *.log | Where LastWriteTime -lt ((get-date).AddDays($maxDaystoKeep)) 
    
    If ($itemsToDelete.Count -gt 0)
    { 
        ForEach ($item in $itemsToDelete)
        { 
            "$($item.FullName) is older than $((get-date).AddDays($maxDaystoKeep)) and will be deleted." | Add-Content $cleanupRecordPath 
            Remove-Item $item.FullName -Verbose 
        } 
    } 
    Else
    { 
        "No items to be deleted today $($(Get-Date).DateTime)." | Add-Content $cleanupRecordPath 
    }    
    
    Write-Output "Cleanup of log files older than $((get-date).AddDays($maxDaystoKeep)) completed!" 
    
    Start-Sleep -Seconds 10
    

    If you're ever so inclined, hook this script up to a Scheduled Task to run on a daily basis to keep your log files in order.

  • The Biggest E-commerce of My Kentico Career!

    I recently blogged about a very large Kentico E-commerce build I was involved with at Syndicut that contained around 2 million products. Trust me, this is a major feat in itself! A lot of customisation and performance improvements were made to the Kentico build to accommodate the sheer volume of products.

    A follow-up post will be published soon detailing the issues we experienced in developing a Kentico site that has to manage more products than you could ever imagine, and our solutions to those issues.

    You can have a read about the project and some of my highlights here: https://medium.com/syndicutstudio/welcome-to-our-biggest-kentico-e-commerce-build-yet-9bd2109955e0.

  • A website can tell the public a lot about you, from the things you want people to see to the things you probably would not. HTTP headers can divulge things about your website that you wouldn't necessarily want to make public, and it's up to the individual to decide which headers they're willing to expose. What I would recommend is to at least analyse any site prior to moving it to a production environment.

    Why all of a sudden am I talking about questioning your website's HTTP headers?

    It was only by chance, whilst perusing StackOverflow, that I came across a question about securing HTTP headers and was directed to a site called securityheaders.io. I immediately submitted this very site for scanning, thinking it would fare quite well. But boy oh boy, was I wrong!

    Security Headers (Before)

    Based on this result, does this make my website vulnerable? To a certain extent, yes. By default, you're exposing some key information to potential hackers about how your website is built. For example, here is a simple list of what the HTTP headers returned from the server can reveal:

    • Web server
    • Framework version
    • Cache handling
    • Cross-site scripting access
    • Referrer policies

    Now, based on that list alone, what HTTP headers would you hide? Having had my eyes opened by the report generated by securityheaders.io, as a minimum I would hide anything that shows what technology, framework and server platform I am using. If there happens to be an exploit for the very server or technology you are using, you don't want the whole world to know about it, especially if you happen to be hosting a high-traffic website.
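    For an ASP.NET site, a few of these identifying headers can be stripped out in code. The snippet below is only a minimal sketch of the idea (not the exact changes I made): it assumes an ASP.NET MVC application running under the IIS integrated pipeline, where headers can be removed in Global.asax just before they are sent. Other setups may need the equivalent changes made in web.config instead.

    using System;
    using System.Web;
    using System.Web.Mvc;

    public class MvcApplication : HttpApplication
    {
        protected void Application_Start()
        {
            // Stop MVC from emitting the "X-AspNetMvc-Version" header.
            MvcHandler.DisableMvcResponseHeader = true;
        }

        protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
        {
            // Strip headers that reveal the server platform and framework.
            // Requires the IIS integrated pipeline.
            HttpContext.Current.Response.Headers.Remove("Server");
            HttpContext.Current.Response.Headers.Remove("X-AspNet-Version");
            HttpContext.Current.Response.Headers.Remove("X-Powered-By");
        }
    }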

    I decided to correct all the issues highlighted by securityheaders.io and spent additional time obfuscating a few other headers. Now I can proudly say I've passed. There is just one blemish against the report, to do with the "Content-Security-Policy" header, which defines approved sources of content that the browser may load.

    Security Headers (After)

    I've been tweaking around with the rules for this header and I'll be honest when I say it shafted the administration dashboard of the content management system I use for my site - Kentico CMS. So before I reinstate the header, I need a little more time tweaking.
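    For anyone curious, the policy itself is just a header value listing allowed content sources. The snippet below is purely an illustrative starting point (not the policy I will end up using), added via the same kind of Global.asax handler shown above - a restrictive default-src like this is exactly the sort of rule that can break a CMS administration area that pulls in scripts and styles from elsewhere.

    protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
    {
        // Illustrative only: restrict content to the site's own origin and
        // permit inline styles. Anything not listed here will be blocked.
        HttpContext.Current.Response.Headers.Add(
            "Content-Security-Policy",
            "default-src 'self'; style-src 'self' 'unsafe-inline'");
    }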

    Another great site to use to analyse the security of your site (.NET sites only) is ASafaWeb, which scans for common configuration vulnerabilities.


  • Whilst making a request to one of my API endpoints for an iOS application, I came across a very unhelpful error: "Invalid response for blob". I couldn't really understand why React Native was complaining about this single API endpoint, since all my other endpoints did not encounter this error.

    React Native: Invalid Response For Blob

    The API endpoint in question is a pretty simple email address validator. If the user's email address is unique and passes all verification checks, the endpoint will return a 200 (OK); otherwise, a 400 (Bad Request) along with a response containing the error. For those who understand ASP.NET Web API development, my endpoint is as follows:

    /// <summary>
    /// Check if a user's email address is not already in use, along with other string validation checks.
    /// </summary>
    /// <param name="email"></param>
    /// <returns></returns>
    [HttpPost]
    [AllowAnonymous]
    [Route("EmailAddressValidator")]
    public HttpResponseMessage EmailAddressValidator(string email)
    {
        if (UserLogic.IsEmailValid(email, out string error)) // UserLogic.IsEmailValid() method carries out email checks...
            return Request.CreateResponse(HttpStatusCode.OK);
    
        return Request.CreateResponse(HttpStatusCode.BadRequest, new ErrorModel { Error = error });
    }
    

    So pretty simple stuff!

    Weirdly enough, the "Invalid response for blob" issue did not occur when the user's email address failed the required checks, thus returning a 400 error and a response detailing the error. It only occurred when a 200 response was returned without a value.

    There seems to be a bug in the React Native environment when it comes to dealing with empty API responses. The only way I could get around this is to always ensure all my API endpoints return some form of response. Very strange! I suggest all you fellow React Native developers do the same until a fix is put in place.
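    On the API side, the workaround is a tiny change. The sketch below mirrors the endpoint above, with the only difference being that the success branch now returns a small body instead of an empty 200 (the shape of the returned object is arbitrary):

    [HttpPost]
    [AllowAnonymous]
    [Route("EmailAddressValidator")]
    public HttpResponseMessage EmailAddressValidator(string email)
    {
        if (UserLogic.IsEmailValid(email, out string error))
        {
            // Return a minimal body so React Native never receives an empty response.
            return Request.CreateResponse(HttpStatusCode.OK, new { IsValid = true });
        }

        return Request.CreateResponse(HttpStatusCode.BadRequest, new ErrorModel { Error = error });
    }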

    The issue has already been logged so I will be keeping an eye on a release for a fix.

    Update (16/07/2018)

    I wasn't too sure whether anything had been done in regards to the fix since writing this post, as there was no update to the GitHub issue that was first logged on the 5th March. So I decided to share this very post with a React Native group on Reddit to get some form of answer. Within a short period of time, I was told this issue has been fixed in React Native version 0.56.

  • Earlier this week, a post I wrote for C# Corner was published. It is about an alternative to the very well-known SQL Server "IN" condition when working with many values. I discuss storing the list of values you would normally pass directly into your "IN" condition in a User Defined Data Type instead.

    There will probably be a very small number of cases where the additional steps I write about in the post will need to be carried out. After all, SQL Server has a very large limit on the number of values the "IN" condition can handle, based on the length of the instruction (max 65k).
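    The article covers the SQL side of things, but for completeness, here is a rough sketch of how the values could be passed from .NET as a table-valued parameter instead of being concatenated into an "IN" list. The table type name (dbo.IntegerIdList), its Id column and the dbo.Product table are all hypothetical and would need to match whatever you create in SQL Server:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class ProductQueries
    {
        public static void GetProductsByIds(string connectionString, IEnumerable<int> ids)
        {
            // Shape the values into a DataTable matching the user-defined table type.
            DataTable idTable = new DataTable();
            idTable.Columns.Add("Id", typeof(int));

            foreach (int id in ids)
                idTable.Rows.Add(id);

            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(
                "SELECT p.* FROM dbo.Product p INNER JOIN @Ids i ON p.Id = i.Id", connection))
            {
                // Pass the values as a single table-valued parameter rather than an "IN" list.
                SqlParameter parameter = command.Parameters.AddWithValue("@Ids", idTable);
                parameter.SqlDbType = SqlDbType.Structured;
                parameter.TypeName = "dbo.IntegerIdList";

                connection.Open();

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Process each product row as required...
                    }
                }
            }
        }
    }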

    Check it out here: http://www.c-sharpcorner.com/blogs/alternative-to-sql-in-condition-when-working-with-many-values

  • I decided to write this blog post after one of my fellow Kentico Cloud developers, Matt Nield, tweeted the following last week:

    So happy to see this coming to Kentico Cloud! The amount to times I yearned for something I could use to clear cache!
    — Surinder Bhomra (@SurinderBhomra) July 13, 2017

    Webhook capability is something I have been yearning for since I built my first Kentico Cloud project and this feature cannot come soon enough! It will really take Kentico Cloud headless CMS integration within our applications to the next level. One of the main things I am looking forward to using webhooks for is developing a form of dependency caching, so that when content is updated in Kentico Cloud, the application can reflect those changes.

    In fact, I am so excited to have this feature in my hands for my caching needs that I have already started developing something I could potentially use in time for the Q3 2017 release - which should be any time now.


    As we all know, to not only improve the overall performance of your application but also reduce requests to the Kentico Cloud API, we are encouraged to set a default cache duration. There is documentation on the different routes to accomplish this:

    1. Controller-level - using OutputCache attribute
    2. CachedDeliveryClient class - provided by the Kentico Cloud Boilerplate that acts as a wrapper around the original DeliveryClient object to easily cache data returned from the API for a fixed interval.

    I personally prefer caching at controller level, unless the application is doing something very complex at runtime when manipulating incoming data. So in the meantime, whilst I wait for webhook functionality to be released, I decided to create a custom controller attribute called "KenticoCacheAttribute" that starts the caching process only if the application is not in debug mode.

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Linq;
    using System.Web;
    using System.Web.Mvc;
    
    namespace Site.Web.Attributes
    {
        public class KenticoCacheAttribute : OutputCacheAttribute
        {
            public KenticoCacheAttribute()
            {
                Duration = HttpContext.Current.IsDebuggingEnabled ? 0 : int.Parse(ConfigurationManager.AppSettings["KenticoCloud.CacheDuration"]);
            }
        }
    }
    

    The "KenticoCacheAttribute" inherits the OutputCacheAttribute class, which gives me additional control to when I'd like the caching process to happen. In this case, the cache duration is set within the web.config.

    The one main benefit I found with my custom controller attribute is that I will never forget to enable page caching on my website when it comes to deployment to production, since we never want our website to have debugging enabled unless we're in a development environment. This also works the other way: we're not too concerned about caching in a development environment, as we always want to see changes in incoming data straight away.

    The new cache attribute is used in exactly the same way as OutputCacheAttribute:

    [Route("{urlSlug}")]
    [KenticoCacheAttribute(VaryByParam = "urlSlug")]
    public async Task<ActionResult> Detail(string urlSlug)
    {
         // Do something...
    
        return View();
    }
    

    This is a very simple customisation I found useful through my Kentico Cloud development.

    The custom attribute I created is just the start of how I plan on integrating cache management for Kentico Cloud applications. When webhook capability is released, I can see further improvements being made, but they may require a slightly different approach, such as developing a custom MVC Action Filter instead.

  • I like to keep my blob containers quite tidy and delete any files that would unnecessarily increase their size. For a project I was working on, I had a blob container that was being used to temporarily store images a user uploaded for manipulation at a later time. I saw no reason to keep these files for longer than 24 hours. An Azure WebJob seemed an ideal solution for this.

    I could've left the blob container to stagnate and fester over time, and the reasoning behind creating a cleanup task wasn't from a cost point of view - a blob container is very reasonably priced for the amount of storage and requests I would be making. I was more concerned about performance for the times when I would be trawling through many thousands of files to get back the image a user had uploaded for temporary use by my web application.

    Creating an Azure WebJob is very easy and versatile. You have the flexibility to develop a WebJob by creating the following scripts or programs:

    • .cmd, .bat, .exe (using Windows cmd)
    • .ps1 (using PowerShell)
    • .sh (using Bash)
    • .php (using PHP)
    • .py (using Python)
    • .js (using Node.js)
    • .jar (using Java)

    In this post, I will be developing my WebJob using a Console Application that will generate an executable. In Visual Studio 2017, there are two ways you can go about creating a project for your WebJob:

    1. Console Application project
    2. Selecting Azure WebJob project - which you will find under the "Cloud" category.

    If you create your WebJob using a Console Application, you will still have the option later on to "Publish as an Azure WebJob..." when right-clicking on the project. In the code below I happened to use a Console Application only because I didn't even know an Azure WebJob project existed until after I had completed development on my project. Doh!

    Program.cs

    I have created a new project called "Site.AzureWebJob.Cleanup". The project uses the following two Azure NuGet packages:

    
    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Linq;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    namespace Site.AzureWebJob.Cleanup
    {
        class Program
        {
            static void Main(string[] args)
            {
                try
                {
                    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<Insert Storage Connection String Here>");
    
                    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer dataContainer = blobClient.GetContainerReference("<Blob container name>");
    
                    Console.WriteLine("Hourly threshold to remove records: {0}", ConfigurationManager.AppSettings["Azure.CleanupHours"]);
    
                    #region Retrieve all data items greater than 24 hours and delete them
    
                    Console.WriteLine("Retrieving old data files...");
    
                    // Get files where the "Last Modified Date" is older than 24 hours.
                    IEnumerable<CloudBlob> oldData = dataContainer.ListBlobs()
                                    .OfType<CloudBlob>()
                                    .Where(b => b.Properties.LastModified.Value.Date < DateTime.Now.AddHours(int.Parse(ConfigurationManager.AppSettings["Azure.CleanupHours"].ToString()) * -1));
    
                    IList<CloudBlob> dataBlobs = oldData as IList<CloudBlob> ?? oldData.ToList();
    
                    Console.WriteLine("Data records retrieved: {0}.", dataBlobs.Count);
                    Console.WriteLine("Removing old data files...");
    
                    // Loop through the files and delete if they exist.
                    foreach (CloudBlob dataBlob in dataBlobs)
                    {
                        bool isDeleted = dataBlob.DeleteIfExists();
    
                        if (isDeleted)
                            Console.WriteLine("Deleted: {0}.", dataBlob.Name);
                    }
    
                    #endregion
    
                    Console.WriteLine("Removing old data complete.");
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Error cleaning container files: {0}", ex.Message);
                }
    
                Console.WriteLine("Clean Containers WebJob complete.");
            }
        }
    }
    

    There isn't really much to it. All I am doing is retrieving all files that are older than 24 hours (a value set within the App.config app setting called "Azure.CleanupHours") and then carrying out the delete process by looping through any records returned.

    The safest way to delete a file is to use the CloudBlob.DeleteIfExists() call. As the method name suggests, it will only delete a file if it exists. Using CloudBlob.Delete() will throw an exception if for some reason the file isn't there and will require additional error handling.

    Final Steps

    Now that we have our Azure WebJob ready to go, the only thing left is to publish it to your Azure Web App by simply right-clicking on your project and selecting "Publish as an Azure WebJob...". Here you will connect to your Azure instance and have the option to choose how your WebJob should run:

    • Continuously
    • On Demand
    • On Schedule