Blog

Blogging on programming and life in general.

  • A picture tag allows us to serve different-sized images based on viewport breakpoints or pixel ratios, resulting in better page load performance. Google's PageSpeed Insights negatively scores your site if responsive images aren't used. Pretty much all modern browsers support this markup, and on the off chance one doesn't, an image fallback can be set.

    Using the picture markup inside page templates is pretty straightforward, but when it comes to CMS-related content, where HTML editors only accommodate image tags, it's really difficult to get someone like a client to add this form of markup. So the only workaround is to transform any image tag into a picture tag at code level.

    Code: ConvertImageToPictureTag Extension Method

    The ConvertImageToPictureTag method will perform the following tasks:

    1. Loop through all image tags.
    2. Get the URL of the image from the "src" attribute.
    3. Get other attributes such as "alt" and "style".
    4. Generate the picture markup, adding as many source elements as there are required viewport breakpoints, and apply the image URL, style and alt text.
    5. Replace the original image tag with the new picture tag.

    The ConvertImageToPictureTag code uses HtmlAgilityPack, making it very easy to loop through all HTML nodes and manipulate the markup. In addition, this implementation relies on a lightweight client-side JavaScript plugin - lazysizes. The lazysizes plugin will delay the loading of the higher resolution image based on the viewport rules in the picture tag until the image is scrolled into view.
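
    To illustrate the end result, an image tag like the one below would be transformed into picture markup along these lines. This is a simplified sketch with a hypothetical path and two viewports; the real output depends on the parameters passed in:

    <!-- Before -->
    <img src="/getmedia/1234/photo.jpg?width=800" alt="Photo" />

    <!-- After (sketch) -->
    <picture>
        <source srcset="/getmedia/1234/photo.jpg?width=800" data-srcset="/getmedia/1234/photo.jpg?width=800" media="(min-width: 1000px)">
        <source srcset="/getmedia/1234/photo.jpg?width=720" data-srcset="/getmedia/1234/photo.jpg?width=720" media="(min-width: 768px)">
        <img src="/getmedia/1234/photo.jpg?width=50" style="width: 800px" class="lazyload" alt="Photo" />
    </picture>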

    using HtmlAgilityPack;
    using Site.Common.Kentico;
    using System;
    using System.Collections.Generic;
    using System.Collections.Specialized;
    using System.Linq;
    using System.Text;
    using System.Web;
    
    namespace SurinderBhomra.Common.Extensions
    {
        public static class ContentManipulatorExtensions
        {
            /// <summary>
            /// Transforms all image tags to a picture tag inside parsed HTML.
            /// All source image URLs need to contain a "width" query parameter in order to have a resize starting point.
            /// </summary>
            /// <param name="content"></param>
            /// <param name="percentageReduction"></param>
            /// <param name="minimumWidth">The minimum width an image has to be to warrant resizing.</param>
            /// <param name="viewPorts"></param>
            /// <returns></returns>
            public static string ConvertImageToPictureTag(this string content, int percentageReduction = 10, int minimumWidth = 200, params int[] viewPorts)
            {
                if (viewPorts == null || viewPorts.Length == 0)
                    throw new ArgumentException("Viewport parameter is required.", nameof(viewPorts));
    
                if (!string.IsNullOrEmpty(content))
                {
                    //Create a new document parser object.
                    HtmlDocument document = new HtmlDocument();
    
                    //Load the content.
                    document.LoadHtml(content);
    
                    //Get all image tags.
                    List<HtmlNode> imageNodes = document.DocumentNode.Descendants("img").ToList();
                    
                    if (imageNodes.Any())
                    {
                        // Loop through all image tags.
                        foreach (HtmlNode imgNode in imageNodes)
                        {
                            // Make sure there is an image source and it is not externally linked.
                            if (imgNode.Attributes.Contains("src") && !imgNode.Attributes["src"].Value.StartsWith("http", StringComparison.Ordinal))
                            {
                                #region Image Attributes - src, class, alt, style
                                
                                string imageSrc = imgNode.Attributes["src"].Value.Replace("~", string.Empty);
                                string imageClass = imgNode.Attributes.Contains("class") ? imgNode.Attributes["class"].Value : string.Empty;
                                string imageAlt = imgNode.Attributes.Contains("alt") ? imgNode.Attributes["alt"].Value : string.Empty;
                                string imageStyle = imgNode.Attributes.Contains("style") ? imgNode.Attributes["style"].Value : string.Empty;
    
                                #endregion
    
                                #region If Image Source has a width query parameter, this will be used as the starting size to reduce images
    
                                int imageWidth = 0;
    
                                // Note: the domain is only used as a base so the query string can be parsed; any valid host would do.
                                UriBuilder imageSrcUri = new UriBuilder($"http://www.surinderbhomra.com{imageSrc}");
                                NameValueCollection imageSrcQueryParams = HttpUtility.ParseQueryString(imageSrcUri.Query);
    
                                if (imageSrcQueryParams?.Count > 0 && !string.IsNullOrEmpty(imageSrcQueryParams["width"]))
                                    imageWidth = int.Parse(imageSrcQueryParams["width"]);
    
                                // If there is no width parameter, or the width is below the minimum, we cannot resize this image.
                                // It might be an older non-responsive image link.
                                if (imageWidth <= minimumWidth)
                                    continue;
    
                                // Clear the query string from image source.
                                imageSrc = imageSrc.ClearQueryStrings();
    
                                #endregion
    
                                // Create picture tag.
                                HtmlNode pictureNode = document.CreateElement("picture");
    
                                if (!string.IsNullOrEmpty(imageStyle))
                                    pictureNode.Attributes.Add("style", imageStyle);
    
                                #region Add multiple source tags
    
                                StringBuilder sourceHtml = new StringBuilder();
    
                                int newImageWidth = imageWidth;
                                for (int vp = 0; vp < viewPorts.Length; vp++)
                                {
                                    int viewPort = viewPorts[vp];
    
                                    // We do not want to apply the percentage reduction to the first viewport size.
                                    // The first image should always be the original size.
                                    if (vp != 0)
                                        newImageWidth = newImageWidth - (newImageWidth * percentageReduction / 100);
    
                                    sourceHtml.Append($"<source srcset=\"{imageSrc}?width={newImageWidth}\" data-srcset=\"{imageSrc}?width={newImageWidth}\" media=\"(min-width: {viewPort}px)\">");
                                }
    
                                // Add fallback image.
                                sourceHtml.Append($"<img src=\"{imageSrc}?width=50\" style=\"width: {imageWidth}px\" class=\"{imageClass} lazyload\" alt=\"{imageAlt}\" />");
    
                                pictureNode.InnerHtml = sourceHtml.ToString();
    
                                #endregion
    
                                // Replace the image node with the new picture node.
                                imgNode.ParentNode.ReplaceChild(pictureNode, imgNode);
                            }
                        }
    
                        return document.DocumentNode.OuterHtml;
                    }
                }
    
                return content;
            }
        }
    }
    

    To use this extension, add it to any string containing HTML markup, like so:

    // The HTML markup will generate responsive images based on the following parameters:
    // - Images to be resized in 10% increments.
    // - Images have to be more than 200px wide.
    // - Viewport sizes to take into consideration: 1000, 768, 300.
    string contentWithResponsiveImages = myHtmlContent.ConvertImageToPictureTag(10, 200, 1000, 768, 300);
    

    Sidenote

    The code I've shown doesn't carry out any image resizing; you will need to integrate that yourself. Generally, any good content management platform will have the capability to serve responsive images. In my case, I use Kentico and can resize images by adding a "width" and/or "height" query parameter to the image URL.

    In addition, all image URLs used inside an image tag's "src" attribute require a width query string parameter. The value of the width parameter will be the size of the image in its largest form. Depending on the type of platform used, the URL structure to render image sizes might be different. This will be the only place where the code needs to be retrofitted to adapt to your own use case.
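
    For example, an image reference inside content would need to look something like this before conversion (a hypothetical Kentico media path):

    <img src="/getmedia/27b71b95-d026-4612-8d86-0a10a7d5c811/my-image.jpg?width=800" alt="My image" />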

  • The title of this post might seem a tad extreme, but I just feel so strongly about it! Ever since I started learning ASP.NET all those years ago, I've never been a fan of using "Eval" in the data-bound controls I primarily use, such as GridViews, Repeaters and DataLists. When I see it still being used regularly in web applications, I cringe a little and feel the need to express some reasons why it should stop being used.

    I think working on an application handed down to me from an external development agency pushed me to write this post... Let's call it a form of therapy! I won't make this post a rant and will "try" to be constructive and concise. My views might come across a little one-sided, but I promise I will start with at least one good thing to say about our evil friend Eval.

    Positive: Quick To Render Simple Data

    If the end goal is to list out some string values as is from the database with some minor manipulation from a relatively small dataset, I almost have no problem with that, even though I still believe it can be used and abused by inexperienced developers.
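
    For context, this is the sort of markup I mean; a typical (and here hypothetical) Repeater item template:

    <asp:Repeater ID="ProductRepeater" runat="server">
        <ItemTemplate>
            <li><%# Eval("Name") %> - <%# Eval("Price", "{0:C}") %></li>
        </ItemTemplate>
    </asp:Repeater>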

    Negative: Debugging

    The main disadvantage of embedding code inside your design file (.aspx or .ascx) is that it's not very easy to view the output while debugging. This causes a further headache when your Eval contains conditional statements that alter the output on a row-by-row basis.

    Negative: Difficult To Carry Out Complex HTML Changes

    I wouldn't recommend using Eval in scenarios where databound rows require some form of HTML change. I've seen some ugly implementations where complex conditional statements were used to list out data in a creative way. If the HTML ever had to be changed through design updates, it would be far more time-consuming to carry out compared to moving around some form controls that are databound through a RowDataBound event.

    Negative: Ugly To Look At

    This point will come across as very superficial. Nevertheless, what I find painful to look at is when Eval is used to carry out further functionality by calling additional methods, potentially repeating the same functionality numerous times.

    Performance/Efficiency

    From my research, it's not clear whether Eval alone has a specific performance impact, especially with the advances in the .NET Framework over the years. A post from 2012 on StackExchange brought up a very good point:

    Eval uses reflection to get the value of the relevant property/field, and using Reflection to get values from object members is very slow.

    If the type of an object is known, you're better off casting to it explicitly; after all, it's good coding practice. In the real world, the performance impact is nominal and depends on the number of records you are dealing with, so it's not recommended for building scalable applications. I generally notice a slowdown (in milliseconds) when outputting 500 rows of data.

    I have read that reflection is not as much of an issue in more recent versions of the .NET Framework compared to, say, .NET 1.1, but I am unable to find any concrete evidence of this. Regardless, I'd always prefer the faster approach, even if I am only shaving off a few milliseconds in the process.

    Conclusion

    Just don't use Eval. Regardless of the size of the dataset I am dealing with, there are only two approaches I'd ever use:

    1. RowDataBound Event: A control's RowDataBound event is triggered every time a row is bound to data. This approach enables us to modify the row's appearance and structure in a specific way depending on the type of rules we have in place (see the sketch after this list).
    2. Start From Scratch: Construct the HTML markup by hand based on the datasource and render to the page.
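
    Here is a minimal sketch of the first approach, assuming a GridView bound to a hypothetical Product type with a NameLiteral control in the row template:

    protected void ProductsGrid_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        if (e.Row.RowType == DataControlRowType.DataRow)
        {
            // Strongly-typed access to the bound item - no reflection, unlike Eval.
            Product product = (Product)e.Row.DataItem;

            Literal nameLiteral = (Literal)e.Row.FindControl("NameLiteral");
            nameLiteral.Text = HttpUtility.HtmlEncode(product.Name);

            // Row-level presentation rules are easy to apply here.
            if (!product.InStock)
                e.Row.CssClass = "out-of-stock";
        }
    }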

    If I were building a scalable application dealing with thousands of rows of data, I'd generally be inclined to go for option 2. As you're not relying on a .NET control, you won't be contributing to the page viewstate.

    Even though I have been working on a lot more applications using MVC, where I have more control over streamlining the page output, I still have to dabble with Web Forms. With Web Forms, it's very easy to make a page that performs really badly, which makes it even more important to take all the necessary steps to ensure efficiency.

  • Being a web developer, I am trying to become savvier when it comes to factoring in additional SEO practices, which (in my view) should generally be considered compulsory.

    Ever since Google updated its Search Console (formerly known as Webmaster Tools), it has opened my eyes to how my site is performing in greater detail, especially the pages Google deems not worthy of indexing. I started becoming more aware of this last August, when I wrote a post about attempting to reduce the number of "Crawled - Currently not indexed" pages on my site. Through trial and error, I managed to find a way to reduce the number of excluded page links.

    The area I have now become fixated on is the sheer number of pages being classed as "Duplicate without user-selected canonical". Google describes these pages as:

    This page has duplicates, none of which is marked canonical. We think this page is not the canonical one. You should explicitly mark the canonical for this page. Inspecting this URL should show the Google-selected canonical URL.

    In simplistic terms, Google has detected pages that can be accessed via different URLs with either the same or similar content. In my case, this is the result of many years of unintentional neglect whilst migrating my site through different platforms and URL structures during the infancy of my online presence.

    Google Search Console has marked around 240 links as duplicates due to the following two reasons:

    1. Pages can be accessed with or without a ".aspx" extension.
    2. Paginated content.

    I was surprised to see paginated content classed as duplicate content, as I was always under the impression that this would never be the case. After all, the listed content is different and I have ensured that the page titles differ when content is filtered by either category or tag. However, if a site consists of duplicate or similar content, it is considered a negative in the eyes of a search engine.

    Two weeks ago I added canonical tagging across my site, as I was intrigued to see if there would be any considerable change towards how Google crawls my site. Would it make my site easier to crawl and aid Google in understanding the page structure?
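
    For anyone unfamiliar, the canonical tag itself is a single line of markup in the page head pointing at the preferred version of a URL (hypothetical example):

    <link rel="canonical" href="https://www.surinderbhomra.com/blog/my-post" />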

    Surprising Outcome

    I think I was quite naive about how my Search Console coverage statistics would shift post-canonicalisation. I was just expecting the number of pages classed as "Duplicate without user-selected canonical" to decrease, which was the case; I wasn't expecting anything more. On further investigation, it was interesting to see an overall positive change across all the other coverage areas.

    Here's the full breakdown:

    • Duplicate without user-selected canonical: Reduced by 10 pages
    • Crawled - Currently not indexed: Reduced by 65 pages
    • Crawl anomaly: Reduced by 20 pages
    • Valid: Increased by 60 pages

    The change in figures may not look that impressive, but we have to remember this report is based only on two weeks after implementing canonical tags. All positives so far and I'm expecting to see further improvements over the coming weeks.

    Conclusion

    Canonical markup can often be overlooked, both in its implementation and in its importance to SEO. After all, I still see sites that don't use it, as the emphasis is placed on other areas that require more effort to meet Google's search criteria, such as building for mobile, structured data and performance. So it's understandable why canonical tags can be missed.

    If you are in a similar position to me, where you are adding canonical markup to an existing site, it's really important to spend the time setting the original source page URL correctly the first time, as an incorrect implementation can lead to issues.

    Even though my Search Console stats have improved, the jury's still out on whether this translates to better site visibility across search engines. But anything that helps search engines and visitors understand your content source can only be beneficial.

  • My day to day version control system is Bitbucket. I never got on with their own Git GUI offering - Sourcetree. I always found using TortoiseGit much more intuitive to use and a flexible way to interact with my git repository. If anyone can change my opinion on this, I am all ears!

    I work with large projects that are around a couple of hundred megabytes in size, and if I were to clone the same project over different branches, it would use up quite a bit of hard disk space. I like to quickly switch to my master branch after carrying out a merge, to test before making a release.

    Luckily TortoiseGit makes switching branches a cinch in just a few clicks:

    • Right-click in your repository
    • Go to TortoiseGit context menu
    • Click Switch/Checkout
    • Select the branch you wish to switch to and select "Overwrite working tree changes (force)"

    TortoiseGit Switch Branches

    Selecting the "Overwrite working tree changes (force)" tick box is important to ensure all files in your working directory is overwritten with the files directly from your branch. We do not want remnants of files left from the branch we had previously switched from kicking around.

  • Regardless of the site you have worked on, there is always the potential problem of a page rendering broken images. This is more likely to happen when images are served from external sources or after accidental deletion within content management platforms.

    The only way I found to deal with this issue is to provide a fallback alternative if the image to be served cannot be found. I've created a FallbackImage() extension method that can be applied to any string variable containing a path to an image.

    using System;
    using System.Net;
    using System.Web;

    public static class ImageExtensions
    {
        /// <summary>
        /// Creates a fallback image if the image requested does not exist.
        /// </summary>
        /// <param name="imageUrl"></param>
        /// <returns></returns>
        public static string FallbackImage(this string imageUrl)
        {
            string cachedImagePath = CacheEngine.Get<string>(imageUrl);
    
            if (string.IsNullOrEmpty(cachedImagePath))
            {
                // External links can be requested as-is; internal paths need the current domain prefixed.
                string sanitiseImageUrl = imageUrl.IsExternalLink()
                    ? imageUrl
                    : $"{HttpContext.Current.GetCurrentDomain()}{imageUrl.Replace("~", string.Empty)}";

                // Attempt to request the image.
                WebRequest request = WebRequest.Create(sanitiseImageUrl);
    
                try
                {
                    // A successful response means the image exists.
                    using (WebResponse response = request.GetResponse())
                    {
                        cachedImagePath = imageUrl;
                    }
                }
                catch (Exception)
                {
                    // The request failed - fall back to the placeholder image.
                    cachedImagePath = "/resources/images/placeholder.jpg";
                }
    
                // Add image path to cache.
                CacheEngine.Add(cachedImagePath, imageUrl, 5);
            }
    
            return cachedImagePath;
        }
    }
    

    To ensure optimum performance and minimise unnecessary checks for the same image, the result of the request is stored in cache for 5 minutes.

    The method uses some functionality that I have developed within my own website, so it will only work when equivalent implementations are referenced in your own codebase:

    • GetCurrentDomain - get the full URL of the current domain including any protocols and ports.
    • CacheEngine - provides a bunch of helper methods to interact with .NET cache provider easily.
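
    If you need a stand-in, here is a minimal sketch of a CacheEngine using HttpRuntime.Cache that satisfies the calls made above:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class CacheEngine
    {
        public static T Get<T>(string key)
        {
            return (T)HttpRuntime.Cache[key];
        }

        // Stores data against a key for the given number of minutes.
        public static void Add(object data, string key, int minutes)
        {
            HttpRuntime.Cache.Insert(key, data, null, DateTime.UtcNow.AddMinutes(minutes), Cache.NoSlidingExpiration);
        }
    }

    Usage is then a one-liner on any image path string (hypothetical path shown):

    string imagePath = "/resources/images/team-photo.jpg".FallbackImage();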
  • Year In Review - 2018

    At the end of 2017, I made a New Year's resolution: make more of an active effort to blog, not only on my own website but also to do a little writing elsewhere to try something a little different. Generally I fail to stick to my resolutions, but this year was different and I have to give myself a pat on the back for the number of posts I managed to crank out over the year.

    Even though I have blogged for over 11 years, I found that setting myself a New Year's resolution to write more has increased my overall confidence in writing, as well as my enjoyment of it. I now find myself using writing as a release to organise my thought process, especially when trying to grasp new learning concepts. If what I write helps others, that's a bonus!

    I highly recommend everyone to write!

    Popular Posts of The Year

    This year I have written 24 posts (including this one), ranging from technical to personal entries. I've delved into my Google Analytics and picked a handful of gems where I believe I have, in my own small way, made an impact:

    One of the most common crawl messages everyone experiences when viewing their Search Console. There isn't an exact science to resolving this issue, so I decided to investigate a way to reduce the number of crawl errors by looking into the type of links Google was ignoring and submitting these links in a new sitemap.

    A PaginationHelper class that renders numbered pagination, which can easily be reused across any list of data when called inside a controller.

    A nice helper method that allows partial views to be rendered as a string outside the controller context in .NET Core.

    A PowerShell script to clear out old IIS logs, which can be run manually or on a schedule.

    Originally written for Syndicut's Medium channel, I write about my experiences using a headless CMS from past to present - Kentico Cloud. When posted to Medium, I was very happy with the responses from the Kentico community as well as the stats. Since posting, it has been one of the most actively viewed posts on the Syndicut Medium channel.

    This year I started learning and developing my first app using React Native. I have much more to write about the subject next year. This post detailed a bug (now rectified in the new version of the React Native framework) that occurs when an API endpoint returns an empty response.

    I've been using Cloudflare's CDN infrastructure to improve load times and security for my website. I developed a C# method that uses Cloudflare's API to clear cache from the CDN at code level.

    A completely out of the ordinary post - a post about travel! Writing about my time at Melia Bali is the highlight of all the posts written this year. I was testing the waters for a different writing style and hoping to produce more in a similar format. It definitely got the creative juices flowing!

    Site Refresh

    Throughout the 11 years of owning and running this site, I have only ever refreshed the look around three times. It's always been a low priority in my eyes, but in October I released a new look for my site. It's not going to win any prizes, but it should make my posts easier on the eye with some added flair and professionalism. I've also worked very hard on implementing additional optimisations for SEO purposes.

    Guest Writing

    I made a conscious choice this year to do some writing outside the comfort of my own website. When writing outside your own personal blog, the stakes are higher, which makes things a little more challenging, as you need to tailor your content to a potentially different audience.

    Out of the 24 posts I've written this year, 4 of them are what I categorised as "Guest Writing"... I need to come up with a better name.

    Currently, I've written a post for C# Corner and the rest on Syndicut's Medium channel. This is an area I wish to grow in. I am actually in the middle of writing a piece (nothing to do with coding and more to do with fiction) for film/entertainment website Den of Geek. What attracted me to write for Den of Geek is not just their content, but their award-winning mental health campaign.

    Statistics

    Comparing my statistics for the year to date against last year, I have a 25% increase in page views and users. Bounce rate I still need to work on; it has currently decreased by 4%. Google Search Console statistics are looking promising - my average page position has improved by 5 pages and total clicks have increased by 110%.

    Syndicut

    2018 will mark eight and a half years working at Syndicut, and this year, like always, has been filled with many challenging and exciting projects. If it wasn't for the diverse range of projects, which required me to do some interesting research, I don't think this blog would be filled with the content it has today. It's been a year filled with headless CMSs, consulting, blog writing, e-commerce, Alexa, ASP.NET Core, Kentico 12 and more!

    Bring On 2019!

    Bring on 2019 and some key areas I wish to focus and grow in...

    Public Speaking

    If I had to choose one thing I need to do next year, it would be to present a worthy technical subject at one of .NET Oxford's Lightning Talks. For those who do not know .NET Oxford: it's a great meetup for .NET developers discussing anything and everything in the Microsoft development industry.

    Every so often, they have lightning talks where developers can present their own subject matter. Even though I am familiar with presenting in front of clients through work, a technical conference is a completely different kettle of fish! After all, what you present needs to be useful to the people in the same industry.

    There was something Jon Skeet said at one of the .NET Oxford meetups in October that resonated with me. He said something along the lines of: everyone should do a public talk and take part in sharing knowledge. Now if Jon Skeet says that, who am I to argue!? I just need to have a ponder on the subject I wish to talk about.

    Consulting

    I would like to do some more consulting work. Syndicut has given me the opportunity to do that, providing our ongoing expertise to help one of our clients adopt a headless CMS (Kentico Cloud) into their current infrastructure. It was a surprise to me how much I've enjoyed consulting, and this is an area I wish to continue to grow in, in addition to what I do best - code!

    Continue To Be A Social Network Hermit

    I found it refreshing not to invest too much time (if any) in getting distracted by other people's lives on social networking sites like Facebook and Instagram. I will continue investing time in my own life and, as a result, feel much more productive in doing so.

    I do not claim to be a martyr in this area; after all, I am an active Twitter user and this will not change. For me, Twitter houses a collection of ideas from users in an industry I am very passionate about.

    Other Bits

    I really need a coffee table and desk! It's been a long time coming. Maybe 2019 will be the year I actually get one! :-)


    Surinder signing off!

  • Today I was working with a colleague on a form that is nested inside an UpdatePanel. I'm not really a fan of UpdatePanels, but I have to admit they do save time, and for this form I had to make an exception as there was a lot of dynamic loading of web controls going on.

    One part of the form required the use of a FileUpload control. Unfortunately, the FileUpload control is not supported when working alongside AJAX and requires a full page postback. If you can move the FileUpload control and corresponding button that carries out the action of uploading the file outside the UpdatePanel without breaking your form structure, that would be the most suitable approach. If not, then keep reading!

    The most well-known approach is to use a PostBackTrigger - part of the UpdatePanel trigger options. When the trigger is set on the button that actions the file upload, the page is allowed to carry out a full postback.

    <asp:UpdatePanel ID="FormUpdatePanel" runat="server">
        <ContentTemplate>
            <div>
                <asp:FileUpload ID="EntryFile1Upload" runat="server" />
                <br />
                <asp:Button ID="UploadButton" OnClick="UploadButton_Click" runat="server" Text="Upload File" />
            </div>
        </ContentTemplate>
        <Triggers>
            <asp:PostBackTrigger ControlID="UploadButton" />
        </Triggers>
    </asp:UpdatePanel>
    

    If your upload functionality still fails to work after adding the trigger, you might be missing the enctype attribute on the form tag. This is something I've overlooked in the past, as some of the CMSs I work with add this attribute automatically. You can set the attribute at page or user control level on Page Load by adding the following line of code:

    this.Page.Form.Enctype = "multipart/form-data";
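
    For example, in a page's code-behind this could sit inside Page_Load (a minimal sketch):

    protected void Page_Load(object sender, EventArgs e)
    {
        // Ensure the form posts multipart data so the uploaded file reaches the server.
        this.Page.Form.Enctype = "multipart/form-data";
    }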
    
  • This month I've been writing some blog posts on why I decided to start using Cloudflare's services for my website, and on utilising its API to purge cached files from the Cloudflare CDN on demand. Before reading further, I highly suggest perusing those posts just to put everything into context for my reasoning behind using Cloudflare, as well as the C# code that interacts with the API, which I will be referencing later within this very post.

    My initial Cloudflare integration revolves around serving media files more efficiently through a CDN and having the ability to refresh these files automatically as updates are made within the Kentico CMS. Cloudflare's CDN services can help cache your content across their large global network, moving static files closer to your visitors.

    Based on the Page Rules I configured within the Cloudflare dashboard, I am caching all media library files served through the /getmedia/ URL path into the Cloudflare CDN. The same file will be served through the CDN until the set cache limit has expired. We need to implement functionality that will add some automation to the Kentico platform to purge the cache of a specific media library file when updated.

    Add A Global Event

    I created an event handler for the updating of media library files, registered through a custom module class, leveraging the MediaFileInfo class to access the Update.After event so I can get details of the file being updated.

    using CMS;
    using CMS.DataEngine;
    using CMS.MediaLibrary;

    // A custom module (class name is my own choosing) registers the global event on application start.
    [assembly: RegisterModule(typeof(CloudflareEventsModule))]

    public class CloudflareEventsModule : Module
    {
        public CloudflareEventsModule() : base("CloudflareEvents") { }

        protected override void OnInit()
        {
            base.OnInit();

            MediaFileInfo.TYPEINFO.Events.Update.After += Update_After;
        }

        private void Update_After(object sender, ObjectEventArgs e)
        {
            MediaFileInfo fileInfo = e.Object as MediaFileInfo;

            GlobalEventFunctions.PurgeMediaCache(fileInfo);
        }
    }
    

    PurgeMediaCache() Method

    The event above calls a GlobalEventFunctions.PurgeMediaCache() method, passing information about the changed file ready for purging. The file URL passed to the CloudflareCacheHelper.PurgeSelectedFiles() method needs to be exact and take into consideration how your instance of Kentico is serving media files. If Permanent URLs are being used, the /getmedia/ URL needs to be constructed from:

    • Current domain
    • File GUID
    • File Name
    • File Extension

    Otherwise, we can just get the file path as normal, pointing to where the media file resides.

    using System;
    using System.Collections.Generic;
    using System.Web;
    using CMS.DataEngine;
    using CMS.EventLog;
    using CMS.MediaLibrary;
    using CMS.SiteProvider;

    public class GlobalEventFunctions
    {
        /// <summary>
        /// Purges a file from the Cloudflare cache.
        /// </summary>
        /// <param name="fileInfo"></param>
        public static void PurgeMediaCache(MediaFileInfo fileInfo)
        {
            bool permanentURLEnabled = SettingsKeyInfoProvider.GetBoolValue($"{SiteContext.CurrentSiteName}.CMSMediaUsePermanentURLs");
            string filePath = string.Empty;
                
            if (permanentURLEnabled)
                filePath = $"{GetCurrentDomain()}/getmedia/{fileInfo.FileGUID.ToString()}/{fileInfo.FileName}{fileInfo.FileExtension}";
            else
                filePath = $"{GetCurrentDomain()}/{fileInfo.FilePath}";
    
            try
            {
                // Get code from: https://www.surinderbhomra.com/Blog/Post/2018/11/11/Cloudflare-API-Purge-Files-By-URL-In-C
                CloudflareCacheHelper cloudflareHelper = new CloudflareCacheHelper();
    
                cloudflareHelper.PurgeSelectedFiles(new List<string> { filePath });
            }
            catch (Exception ex)
            {
                EventLogProvider.LogException("Cloudflare Purge File Cache", "CLOUDFLARE_PURGE", ex, SiteContext.CurrentSiteID, $"Purge File: {filePath}");
            }
        }
    
        /// <summary>
        /// Get domain from current http context.
        /// </summary>
        /// <returns></returns>
        private static string GetCurrentDomain()
        {
            return $"{HttpContext.Current.Request.Url.Scheme}{Uri.SchemeDelimiter}{HttpContext.Current.Request.Url.Host}{(!HttpContext.Current.Request.Url.IsDefaultPort ? $":{HttpContext.Current.Request.Url.Port}" : null)}";
        }
    }
    

    We need not consider any other scenarios, such as insertion or deletion. If a file is inserted, there is nothing to purge, as a new file will be cached in the CDN on first request; when it comes to deletion, we can just wait for the cache to expire.

    What's Next?

    The integration I have detailed so far is just scratching the surface of what Cloudflare has to offer, and I will investigate pushing more content over to the CDN. One area in particular I am looking into is carrying out full page caching. You might be thinking: why even bother when Kentico already has pretty good caching mechanisms in place?

    Well, Cloudflare has a really neat feature called "Always Online", where a cached version of a page is served if, on the off chance, the site happens to go down or requires a reboot to install key security updates. But implementing this feature requires strict Page Rules to be set up within the Cloudflare dashboard to ensure the general workings of Kentico are not affected.

  • Earlier this week I wrote about the reasons why I decided to use Cloudflare for my website. Since then, I've been working on utilising Cloudflare's API to purge the cache on demand when files need to be updated within the CDN. To do this, I decided to write a method that primarily uses one API endpoint - /purge_cache. This endpoint allows a maximum of 30 URLs to be purged at one time, which is flexible enough to fit the majority of day-to-day use cases.

    To communicate with the API, we need to provide three pieces of information:

    1. Account Email Address
    2. Zone ID
    3. API Key

    The last two pieces of information can be found within the dashboard of your Cloudflare account.
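
    The default constructor shown later reads these values from appSettings, so your Web.config would contain something along these lines (placeholder values):

    <appSettings>
        <add key="Cloudflare.UserEmail" value="you@example.com" />
        <add key="Cloudflare.ApiKey" value="your-api-key" />
        <add key="Cloudflare.ZoneId" value="your-zone-id" />
    </appSettings>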

    Code - CloudflareCacheHelper Class

    The CloudflareCacheHelper class consists of a single method, PurgeSelectedFiles(), and the following class objects used for serializing and deserializing our API requests and responses:

    • CloudflareFileInfo
    • CloudflareZone
    • CloudflareResultInfo
    • CloudflareResponse

    Not all the properties within each of the class objects are being used at the moment based on the requests I am making. But the CloudflareCacheHelper class will be updated with more methods as I delve further into Cloudflare's functionality.

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.IO;
    using System.Net;
    using System.Text;
    using Newtonsoft.Json;

    public class CloudflareCacheHelper
    {
        private string _userEmail;
        private string _apiKey;
        private string _zoneId;
    
        private readonly string ApiEndpoint = "https://api.cloudflare.com/client/v4";
    
        /// <summary>
        /// By default the Cloudflare API values will be taken from the Web.Config.
        /// </summary>
        public CloudflareCacheHelper()
        {
            _apiKey = ConfigurationManager.AppSettings["Cloudflare.ApiKey"];
            _userEmail = ConfigurationManager.AppSettings["Cloudflare.UserEmail"];
            _zoneId = ConfigurationManager.AppSettings["Cloudflare.ZoneId"];
        }
    
        /// <summary>
        /// Set the Cloudflare API values explicitly.
        /// </summary>
        /// <param name="userEmail"></param>
        /// <param name="apiKey"></param>
        /// <param name="zoneId"></param>
        public CloudflareCacheHelper(string userEmail, string apiKey, string zoneId)
        {
            _userEmail = userEmail;
            _apiKey = apiKey;
            _zoneId = zoneId;
        }
            
        /// <summary>
        /// A collection of file paths (max of 30) will be accepted for purging cache.
        /// </summary>
        /// <param name="filePaths"></param>
        /// <returns>Boolean value on success or failure.</returns>
        public bool PurgeSelectedFiles(List<string> filePaths)
        {
            CloudflareResponse purgeResponse = null;
    
            if (filePaths?.Count > 0)
            {
                try
                {
                    HttpWebRequest purgeRequest = WebRequest.CreateHttp($"{ApiEndpoint}/zones/{_zoneId}/purge_cache");
                    purgeRequest.Method = "POST";
                    purgeRequest.ContentType = "application/json";
                    purgeRequest.Headers.Add("X-Auth-Email", _userEmail);
                    purgeRequest.Headers.Add("X-Auth-Key", _apiKey);
    
                    #region Create list of Files for Submission In The Structure The Response Requires
    
                    CloudflareFileInfo fileInfo = new CloudflareFileInfo
                    {
                        Files = filePaths
                    };
    
                    // Encode as UTF-8 to safely handle any non-ASCII characters in URLs.
                    byte[] data = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(fileInfo));
    
                    purgeRequest.ContentLength = data.Length;
    
                    using (Stream fileStream = purgeRequest.GetRequestStream())
                    {
                        fileStream.Write(data, 0, data.Length);
                        fileStream.Flush();
                    }
    
                    #endregion
    
                    using (WebResponse response = purgeRequest.GetResponse())
                    {
                        using (StreamReader purgeStream = new StreamReader(response.GetResponseStream()))
                        {
                            string responseJson = purgeStream.ReadToEnd();
    
                            if (!string.IsNullOrEmpty(responseJson))
                                purgeResponse = JsonConvert.DeserializeObject<CloudflareResponse>(responseJson);
                        }
                    }
                }
                catch (Exception)
                {
                    // Rethrow without losing the original stack trace.
                    throw;
                }
    
                return purgeResponse?.Success ?? false;
            }
    
            return false;
        }
    
        #region Cloudflare Class Objects
    
        public class CloudflareFileInfo
        {
            [JsonProperty("files")]
            public List<string> Files { get; set; }
        }
    
        public class CloudflareZone
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("type")]
            public string Type { get; set; }
    
            [JsonProperty("name")]
            public string Name { get; set; }
    
            [JsonProperty("content")]
            public string Content { get; set; }
    
            [JsonProperty("proxiable")]
            public bool Proxiable { get; set; }
    
            [JsonProperty("proxied")]
            public bool Proxied { get; set; }
    
            [JsonProperty("ttl")]
            public int Ttl { get; set; }
    
            [JsonProperty("priority")]
            public int Priority { get; set; }
    
            [JsonProperty("locked")]
            public bool Locked { get; set; }
    
            [JsonProperty("zone_id")]
            public string ZoneId { get; set; }
    
            [JsonProperty("zone_name")]
            public string ZoneName { get; set; }
    
            [JsonProperty("modified_on")]
            public DateTime ModifiedOn { get; set; }
    
            [JsonProperty("created_on")]
            public DateTime CreatedOn { get; set; }
        }
    
        public class CloudflareResultInfo
        {
            [JsonProperty("page")]
            public int Page { get; set; }
    
            [JsonProperty("per_page")]
            public int PerPage { get; set; }
    
            [JsonProperty("count")]
            public int Count { get; set; }
    
            [JsonProperty("total_count")]
            public int TotalCount { get; set; }
        }
    
        public class CloudflareResponse
        {
            [JsonProperty("result")]
            public CloudflareZone Result { get; set; }
    
            [JsonProperty("success")]
            public bool Success { get; set; }
    
            [JsonProperty("errors")]
            public IList<object> Errors { get; set; }
    
            [JsonProperty("messages")]
            public IList<object> Messages { get; set; }
    
            [JsonProperty("result_info")]
            public CloudflareResultInfo ResultInfo { get; set; }
        }
    
        #endregion
    }
    

    Example - Purging Cache of Two Files

    A string collection of URLs can be passed into the method, allowing the cache of a batch of files to be purged in a single request. If all goes well, the success response should be true.

    CloudflareCacheHelper cloudflareCache = new CloudflareCacheHelper();
    
    bool isSuccess = cloudflareCache.PurgeSelectedFiles(new List<string> {
                                        "https://www.surinderbhomra.com/getmedia/7907d934-805f-4bd3-86e7-a6b2027b4ba6/CloudflareResponseMISS.png",
                                        "https://www.surinderbhomra.com/getmedia/89679ffc-ca2f-4c47-8d41-34a6efdf7bb8/CloudflareResponseHIT.png"
                                    });
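
    For reference, the JSON body that PurgeSelectedFiles() serializes and POSTs to the /purge_cache endpoint for the call above looks like this:

    {
        "files": [
            "https://www.surinderbhomra.com/getmedia/7907d934-805f-4bd3-86e7-a6b2027b4ba6/CloudflareResponseMISS.png",
            "https://www.surinderbhomra.com/getmedia/89679ffc-ca2f-4c47-8d41-34a6efdf7bb8/CloudflareResponseHIT.png"
        ]
    }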
    

    Rate Limits

    The Cloudflare API sets a maximum of 1,200 requests in a five-minute period. Cache-tag purging has a lower rate limit of up to 2,000 purge API calls in every 24-hour period, and you may purge up to 30 tags in one API call.

  • A couple of days ago my website got absolutely hammered by a wave of constant SQL injection attacks from the same IP over a period of a couple of hours.

    I only noticed this whilst perusing the Event Log within the Kentico administration interface. I don't normally check my own error logs as regularly as I should, but since my site has recently gone through a bit of a revamp (which I'm still yet to post about), I wanted to ensure I hadn't broken anything.

    To be honest, I am flattered that someone would think this site is worth the time and energy to try to hack. Trust me, it ain't worth it.

    Even though Kentico has handled these attacks well, as a precaution I wanted to implement an additional layer of security before any further untoward activity reaches my site. Being on a shared hosting platform, my options are limited and my hands are tied when it comes to putting in infrastructure that doesn't cost the world.

    Enter Cloudflare

    I have always had some awareness of the Cloudflare content delivery network, but I'd never put the service into practice. At one point I was looking into utilising Cloudflare to manage all my site media files over their CDN. But Cloudflare isn't just a CDN; it's able to offer much more:

    • Analytics - monitor traffic as well as caching ratio and more!
    • Firewall - manage access by IP, country, or query rules.
    • Rate limiting - protect your site or API from malicious traffic by blocking client IP addresses that hit a URL pattern and exceed a threshold you define.
    • Page rules - allow caching to be triggered by a number of rules targeting specific areas of a site.

    The great thing is that these options are part of the free plan, even though there are restrictions on the number of settings you are able to put in place. The security options alone were reason enough to try out Cloudflare, and for my needs the free plan seemed to tick all the boxes. I am just scratching the surface of what Cloudflare has to offer and have already got a few of the features working alongside Kentico.

    How I'm Using Cloudflare With Kentico

    As security was my main concern, one of the first things I did was add some rules through the firewall to block suspicious traffic. Naturally, the next step was to take advantage of the CDN capabilities. I wouldn't recommend full site caching unless you have some pretty strict page rules in place, as this has a chance of causing issues with the Kentico admin interface. By default, Cloudflare doesn't cache HTML pages, which is a good thing as it gives us a plain canvas to target the areas we want to cache.

    Being on the free plan, I only had three page rules at my disposal and made the decision to cache the following parts of my site:

    • Media Library Files (using Permanent URLs) - Rule: *surinderbhomra.com/getmedia/* - Cache Level: Cache Everything, Edge Cache TTL: A month, Browser Cache TTL: 16 days
    • Site CSS, JavaScript and Images - Rule: *surinderbhomra.com/resources/* - Cache Level: Cache Everything, Edge Cache TTL: A month, Browser Cache TTL: 16 days
    • ScriptResource.axd/WebResource.axd - Rule: *surinderbhomra.com/*.axd - Cache Level: Cache Everything, Edge Cache TTL: 7 days, Browser Cache TTL: 7 days

    Two cache types are used:

    1. Edge Cache TTL - the setting that controls how long Cloudflare's edge servers will cache a resource before requesting a fresh copy from your server.
    2. Browser Cache TTL - the time for which Cloudflare instructs a visitor's browser to cache a resource. Until this time expires, the browser will load the resource from its local cache, speeding up the request.

    I am caching my assets for a considerable amount of time, which begs the question: how will changes to files purge the cache in Cloudflare? Luckily, Cloudflare has quite a nice API, giving me the ability to purge everything or individual files (a maximum of 30 in one request).

    Purge Media Library File Cache

    I am in the middle of testing a GlobalEvent handler that will carry out the following steps when files are inserted or updated:

    • Get path of the file.
    • Check if the site is using Permanent URLs, as the URL to the file will be constructed differently if enabled.
    • Convert the relative path to an absolute path.
    • Pass the absolute file path to Cloudflare using the Purge Files by URL API endpoint.

    Once I have carried out some further tests, I will be posting this code in a follow-up post.

    Purge Site File Cache

    Attempting to purge the cache for site files such as JavaScript, CSS and images is a little more tricky, as I need to keep an eye on which files have changed. The easiest thing to do would be to write some code that iterates through all the files in the /resources folder and purges everything from the CDN. Not the most elegant solution, so I still need to ponder the correct implementation. If anyone has any better suggestions, let me know.

    How Do I Know Cloudflare Is Caching My Site Files?

    It does take a little time for Cloudflare to cache all the files on a site; it all depends on pages being viewed, as a page needs to be loaded in order to submit its contents to the cache. On first load, the CF-Cache-Status response header will return a "MISS", meaning the content has not been served from Cloudflare.

    Cloudflare Response - MISS

    However, when you go back to the page and re-check the headers, CF-Cache-Status should return a "HIT". If this is not the case, check your Page Rules within the Cloudflare dashboard.

    Cloudflare Response - HIT
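
    For reference, the relevant response headers on a cached request look something like this; the exact Cache-Control value depends on your Browser Cache TTL page rule:

    CF-Cache-Status: HIT
    Cache-Control: public, max-age=1382400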

    Is Cloudflare Worth It?

    Quick answer - Yes!

    Setup is very straightforward. All that is required is to update your domain to point to the nameservers Cloudflare assigns to you. There is no downtime in doing this. As a result, overall site performance has improved and page speed tests fared much better.

    To give you a better insight for transparency, here are some statistics straight from my Cloudflare portal over a 24-hour period:

    Requests Through Cloudflare
    (Understandably, the number of cached items served through Cloudflare is low due to only caching specific areas)

    Cloudflare Performance
    (My basic shared hosting should now be performing better as fewer requests are being served via the origin server)

    Cloudflare Detected Threats
    (Blocked threats - the reason why I decided to give Cloudflare a try in the first place)

    I still have to carry out a lot more research into using Cloudflare to its full potential and will use my website as a test bed to see what I can achieve. My end goal is to make my website quicker and more secure!