Blog

Tagged by 'cache'

  • ASP.NET Core contains a variety of useful Tag Helpers that enable server-side code to participate in creating and rendering HTML elements in our Views. One Tag Helper, in particular, has the ability to cache bust links to static resources such as images, CSS and JavaScript files by appending an asp-append-version="true" attribute.

    The asp-append-version attribute automatically appends a version query string to the file reference, generated by hashing the file's contents with the SHA256 algorithm, so whenever the file is updated the server generates a new unique version. For a deeper understanding of how ASP.NET Core performs this piece of functionality, give the following StackOverflow post a read: How does javascript version (asp-append-version) work in ASP.NET Core MVC?.

    This approach works perfectly if you're linking to your static resources using the relevant HTML tags, for example img, script or link. In my scenario, I'm using a JavaScript library called LabJS - a dynamic script loader that gives you control over the loading and execution of different plugins. For example:

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js").wait(function(){
          initMyPage();
      });
    </script>
    

    I need to be able to append a query string parameter to one of the JavaScript file references. One thing that came to mind was to use the application's last build time as the cache-busting value. Whenever the application is updated, this value will automatically be updated, so no manual intervention is required.

    I found code examples from meziantou.net that demonstrated various approaches to acquiring an application's build date. I modified the "Linker timestamp" example to return a Unix timestamp in a newly created class called AssemblyUtils.

    public class AssemblyUtils
    {
        #region Properties
    
        public int UnixTimestamp { get; set; }
    
        #endregion
    
        /// <summary>
        /// Get timestamp in Unix seconds for the last build.
        /// </summary>
        /// <returns></returns>
        public static int GetBuildTimestamp()
        {
            const int peHeaderOffset = 60;
            const int timestampOffset = 8;
    
            byte[] bytes = new byte[2048];
    
            using (FileStream file = new FileStream(Assembly.GetExecutingAssembly().Location, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
                file.Read(bytes, 0, bytes.Length);
    
            int headerPos = BitConverter.ToInt32(bytes, peHeaderOffset);
            int unixTime = BitConverter.ToInt32(bytes, headerPos + timestampOffset);
    
            return unixTime;
        }
    }
    

    The code will only return a valid build timestamp if your Visual Studio .csproj file (from version 15.4 onwards) includes the following setting within its <PropertyGroup>:

    <Deterministic>False</Deterministic>
    

    It would be wasteful to call the GetBuildTimestamp() method every time the value is needed within a View, when the ideal approach is to make this call once at application startup.

    public void ConfigureServices(IServiceCollection services)
    {
        #region Assembly Utils - Build Time
    
        Action<AssemblyUtils> assemblyBuildOptions = (opt =>
        {
            opt.UnixTimestamp = AssemblyUtils.GetBuildTimestamp();
        });
    
        services.Configure(assemblyBuildOptions);
        services.AddSingleton(resolver => resolver.GetRequiredService<IOptions<AssemblyUtils>>().Value);
    
        #endregion
    }
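
    As an aside, if the options pattern isn't required, the same result could be achieved by registering a pre-populated instance as a singleton. This is a minimal sketch of that alternative, not the approach used above:

    public void ConfigureServices(IServiceCollection services)
    {
        // Compute the build timestamp once at startup and register the populated
        // AssemblyUtils instance so it can be injected anywhere in the application.
        services.AddSingleton(new AssemblyUtils
        {
            UnixTimestamp = AssemblyUtils.GetBuildTimestamp()
        });
    }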
    

    We can access the build timestamp value using Dependency Injection within a base controller that gets inherited by all controllers.

    public class BaseController : Controller
    {
        private readonly int _buildTimestamp;
    
        public BaseController(AssemblyUtils assemblyUtils)
        {
            _buildTimestamp = assemblyUtils.UnixTimestamp;
        }
    
        public override void OnActionExecuting(ActionExecutingContext context)
        {
            base.OnActionExecuting(context);
    
            // Make the build timestamp available to all Views via the ViewBag.
            ViewBag.CacheBustingValue = _buildTimestamp;
        }
    }
    

    The timestamp is assigned to a ViewBag property that can then be accessed at View level.

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js?v=@ViewBag.CacheBustingValue").wait(function(){
          initMyPage();
      });
    </script>
    

    This will result in the following output:

    <script>
      $LAB
      .script("http://remote.tld/jquery.js").wait()
      .script("/local/plugin1.jquery.js")
      .script("/local/plugin2.jquery.js").wait()
      .script("/local/init.js?v=1609610821").wait(function(){
          initMyPage();
      });
    </script>
    
  • This month I've been writing some blog posts on why I decided to start using Cloudflare's services for my website and on utilising its API to allow me to purge cached files from the Cloudflare CDN on demand. Before reading further, I highly suggest perusing those posts to put everything into context: my reasoning behind using Cloudflare, as well as the C# code that interacts with the API, which I will be referencing later in this post.

    My initial Cloudflare integration revolves around serving media files more efficiently through a CDN and having the ability to refresh these files automatically as updates are made within the Kentico CMS. Cloudflare's CDN services can help cache your content across their large global network, moving static files closer to your visitors.

    Based on the Page Rules I configured within the Cloudflare dashboard, I am caching all media library files served through the /getmedia/ URL path in the Cloudflare CDN. The same file will be served through the CDN until the set cache limit has expired. We need to implement functionality that adds some automation to the Kentico platform to purge the cache of a specific media library file whenever it is updated.

    Add A Global Event

    I created an event handler for the updating of media library files, leveraging the MediaFileInfo class to hook into the Update.After event and get details of the file being updated.

    protected override void OnInit()
    {
        base.OnInit();
    
        MediaFileInfo.TYPEINFO.Events.Update.After += Update_After;
    }
    
    private void Update_After(object sender, ObjectEventArgs e)
    {
        MediaFileInfo fileInfo = e.Object as MediaFileInfo;
    
        GlobalEventFunctions.PurgeMediaCache(fileInfo);
    }
    

    PurgeMediaCache() Method

    The event above calls a GlobalEventFunctions.PurgeMediaCache() method that passes the information about the changed file ready for purging. The file URL passed to the CloudflareCacheHelper.PurgeSelectedFiles() method needs to be exact and take into consideration how your instance of Kentico is serving media files. If Permanent URLs are being used, the /getmedia/ URL needs to be constructed from:

    • Current domain
    • File GUID
    • File Name
    • File Extension

    Otherwise, we can just use the normal file path to where the media file resides.

    public class GlobalEventFunctions
    {
        /// <summary>
        /// Purges a file from the Cloudflare cache.
        /// </summary>
        /// <param name="fileInfo"></param>
        public static void PurgeMediaCache(MediaFileInfo fileInfo)
        {
            bool permanentURLEnabled = SettingsKeyInfoProvider.GetBoolValue($"{SiteContext.CurrentSiteName}.CMSMediaUsePermanentURLs");
            string filePath = string.Empty;
                
            if (permanentURLEnabled)
                filePath = $"{GetCurrentDomain()}/getmedia/{fileInfo.FileGUID.ToString()}/{fileInfo.FileName}{fileInfo.FileExtension}";
            else
                filePath = $"{GetCurrentDomain()}/{fileInfo.FilePath}";
    
            try
            {
                // Get code from: https://www.surinderbhomra.com/Blog/Post/2018/11/11/Cloudflare-API-Purge-Files-By-URL-In-C
                CloudflareCacheHelper cloudflareHelper = new CloudflareCacheHelper();
    
                cloudflareHelper.PurgeSelectedFiles(new List<string> { filePath });
            }
            catch (Exception ex)
            {
                EventLogProvider.LogException("Cloudflare Purge File Cache", "CLOUDFLARE_PURGE", ex, SiteContext.CurrentSiteID, $"Purge File: {filePath}");
            }
        }
    
        /// <summary>
        /// Get domain from current http context.
        /// </summary>
        /// <returns></returns>
        private static string GetCurrentDomain()
        {
            return $"{HttpContext.Current.Request.Url.Scheme}{Uri.SchemeDelimiter}{HttpContext.Current.Request.Url.Host}{(!HttpContext.Current.Request.Url.IsDefaultPort ? $":{HttpContext.Current.Request.Url.Port}" : null)}";
        }
    }
    

    We need not consider any other scenarios, such as insert or deletion. If a file is inserted, there is nothing to purge as it's a new file that will be cached by the CDN on first request; when it comes to deletion, we can just wait for the cache to expire.

    What's Next?

    The integration I have detailed so far is just scratching the surface of what Cloudflare has to offer, and I will be investigating pushing more content over to the CDN. One area in particular I am looking into is full page caching. You might be thinking: why even bother, when Kentico already has pretty good caching mechanisms in place?

    Well, Cloudflare has a really neat feature called "Always Online", where a cached version of a page is served if, on the off chance, the site happens to go down or requires a reboot to install key security updates. But implementing this feature requires strict Page Rules to be set up within the Cloudflare dashboard to ensure the general workings of Kentico are not affected.

  • Earlier this week I wrote about the reasons why I decided to use Cloudflare for my website. I've been working on utilising Cloudflare's API to purge the cache on demand whenever files need to be updated within the CDN. To do this, I decided to write a method that will primarily use one API endpoint - /purge_cache. This endpoint allows a maximum of 30 URLs to be purged at one time, which is flexible enough to fit the majority of day-to-day use cases.

    To communicate with the API, we need to provide three pieces of information:

    1. Account Email Address
    2. Zone ID
    3. API Key

    The last two pieces of information can be found within the dashboard of your Cloudflare account.

    Code - CloudflareCacheHelper Class

    The CloudflareCacheHelper class consists of a single method, PurgeSelectedFiles(), and the following class objects used for serializing requests to and deserializing responses from the API:

    • CloudflareFileInfo
    • CloudflareZone
    • CloudflareResultInfo
    • CloudflareResponse

    Not all the properties within each of the class objects are being used at the moment based on the requests I am making. But the CloudflareCacheHelper class will be updated with more methods as I delve further into Cloudflare's functionality.

    public class CloudflareCacheHelper
    {
        public string _userEmail;
        public string _apiKey;
        public string _zoneId;
    
        private readonly string ApiEndpoint = "https://api.cloudflare.com/client/v4";
    
        /// <summary>
        /// By default the Cloudflare API values will be taken from the Web.Config.
        /// </summary>
        public CloudflareCacheHelper()
        {
            _apiKey = ConfigurationManager.AppSettings["Cloudflare.ApiKey"];
            _userEmail = ConfigurationManager.AppSettings["Cloudflare.UserEmail"];
            _zoneId = ConfigurationManager.AppSettings["Cloudflare.ZoneId"];
        }
    
        /// <summary>
        /// Set the Cloudflare API values explicitly.
        /// </summary>
        /// <param name="userEmail"></param>
        /// <param name="apiKey"></param>
        /// <param name="zoneId"></param>
        public CloudflareCacheHelper(string userEmail, string apiKey, string zoneId)
        {
            _userEmail = userEmail;
            _apiKey = apiKey;
            _zoneId = zoneId;
        }
            
        /// <summary>
        /// A collection of file paths (max of 30) will be accepted for purging cache.
        /// </summary>
        /// <param name="filePaths"></param>
        /// <returns>Boolean value on success or failure.</returns>
        public bool PurgeSelectedFiles(List<string> filePaths)
        {
            CloudflareResponse purgeResponse = null;
    
            if (filePaths?.Count > 0)
            {
                try
                {
                    HttpWebRequest purgeRequest = WebRequest.CreateHttp($"{ApiEndpoint}/zones/{_zoneId}/purge_cache");
                    purgeRequest.Method = "POST";
                    purgeRequest.ContentType = "application/json";
                    purgeRequest.Headers.Add("X-Auth-Email", _userEmail);
                    purgeRequest.Headers.Add("X-Auth-Key", _apiKey);
    
                    #region Create list of Files for Submission In The Structure The Request Requires
    
                    CloudflareFileInfo fileInfo = new CloudflareFileInfo
                    {
                        Files = filePaths
                    };
    
                    // Serialise the file list as UTF-8 encoded JSON for the request body.
                    byte[] data = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(fileInfo));
    
                    purgeRequest.ContentLength = data.Length;
    
                    using (Stream fileStream = purgeRequest.GetRequestStream())
                    {
                        fileStream.Write(data, 0, data.Length);
                        fileStream.Flush();
                    }
    
                    #endregion
    
                    using (WebResponse response = purgeRequest.GetResponse())
                    {
                        using (StreamReader purgeStream = new StreamReader(response.GetResponseStream()))
                        {
                            string responseJson = purgeStream.ReadToEnd();
    
                            if (!string.IsNullOrEmpty(responseJson))
                                purgeResponse = JsonConvert.DeserializeObject<CloudflareResponse>(responseJson);
                        }
                    }
                }
                catch (Exception)
                {
                    // Rethrow without resetting the stack trace ("throw ex;" would lose it).
                    throw;
                }
    
                return purgeResponse != null && purgeResponse.Success;
            }
    
            return false;
        }
    
        #region Cloudflare Class Objects
    
        public class CloudflareFileInfo
        {
            [JsonProperty("files")]
            public List<string> Files { get; set; }
        }
    
        public class CloudflareZone
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("type")]
            public string Type { get; set; }
    
            [JsonProperty("name")]
            public string Name { get; set; }
    
            [JsonProperty("content")]
            public string Content { get; set; }
    
            [JsonProperty("proxiable")]
            public bool Proxiable { get; set; }
    
            [JsonProperty("proxied")]
            public bool Proxied { get; set; }
    
            [JsonProperty("ttl")]
            public int Ttl { get; set; }
    
            [JsonProperty("priority")]
            public int Priority { get; set; }
    
            [JsonProperty("locked")]
            public bool Locked { get; set; }
    
            [JsonProperty("zone_id")]
            public string ZoneId { get; set; }
    
            [JsonProperty("zone_name")]
            public string ZoneName { get; set; }
    
            [JsonProperty("modified_on")]
            public DateTime ModifiedOn { get; set; }
    
            [JsonProperty("created_on")]
            public DateTime CreatedOn { get; set; }
        }
    
        public class CloudflareResultInfo
        {
            [JsonProperty("page")]
            public int Page { get; set; }
    
            [JsonProperty("per_page")]
            public int PerPage { get; set; }
    
            [JsonProperty("count")]
            public int Count { get; set; }
    
            [JsonProperty("total_count")]
            public int TotalCount { get; set; }
        }
    
        public class CloudflareResponse
        {
            [JsonProperty("result")]
            public CloudflareZone Result { get; set; }
    
            [JsonProperty("success")]
            public bool Success { get; set; }
    
            [JsonProperty("errors")]
            public IList<object> Errors { get; set; }
    
            [JsonProperty("messages")]
            public IList<object> Messages { get; set; }
    
            [JsonProperty("result_info")]
            public CloudflareResultInfo ResultInfo { get; set; }
        }
    
        #endregion
    }
    

    Example - Purging Cache of Two Files

    A string collection of URLs can be passed into the method to allow the cache of a batch of files to be purged in a single request. If all goes well, the success response should be true.

    CloudflareCacheHelper cloudflareCache = new CloudflareCacheHelper();
    
    bool isSuccess = cloudflareCache.PurgeSelectedFiles(new List<string> {
                                        "https://www.surinderbhomra.com/getmedia/7907d934-805f-4bd3-86e7-a6b2027b4ba6/CloudflareResponseMISS.png",
                                        "https://www.surinderbhomra.com/getmedia/89679ffc-ca2f-4c47-8d41-34a6efdf7bb8/CloudflareResponseHIT.png"
                                    });
    

    Rate Limits

    The Cloudflare API sets a maximum of 1,200 requests in a five minute period. Cache-Tag purging has a lower rate limit of up to 2,000 purge API calls in every 24 hour period. You may purge up to 30 tags in one API call.
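
    Since the endpoint accepts a maximum of 30 URLs per call, a larger list of files would need to be split into batches before being passed to PurgeSelectedFiles(). Here is a minimal sketch of how that could look - the GetUrlsToPurge() helper is hypothetical and the batching logic requires System.Linq:

    CloudflareCacheHelper cloudflareCache = new CloudflareCacheHelper();
    
    // Hypothetical helper returning the full list of file URLs that need purging.
    List<string> urlsToPurge = GetUrlsToPurge();
    
    const int batchSize = 30;
    
    for (int i = 0; i < urlsToPurge.Count; i += batchSize)
    {
        List<string> batch = urlsToPurge.Skip(i).Take(batchSize).ToList();
    
        if (!cloudflareCache.PurgeSelectedFiles(batch))
        {
            // Log or retry the failed batch as required.
        }
    }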

  • Cache Busting Kentico

    When developing a website that is quick to load on all devices, caching from both a data and asset perspective is very important. Luckily for us, Kentico provides a comprehensive approach to caching data in order to minimise round-trips to the database. But what about asset caching, such as images, CSS and JavaScript files?

    A couple of days ago, I wrote an article on the Syndicut Medium publication on how I have added cache busting functionality to our Kentico CMS builds. I am definitely interested to hear what approaches other developers from the Kentico network take to cache bust their own website assets.

    Take a read here: https://medium.com/syndicutstudio/cache-busting-kentico-cf89496ffda0.

  • I decided to write this blog post after one of my fellow Kentico Cloud developers, Matt Nield, tweeted the following last week:

    So happy to see this coming to Kentico Cloud! The amount to times I yearned for something I could use to clear cache!
    — Surinder Bhomra (@SurinderBhomra) July 13, 2017

    Webhook capability is something I have been yearning for since I built my first Kentico Cloud project and this feature cannot come soon enough! It will really take the Kentico Cloud headless CMS integration within our applications to the next level. One of the main things I am looking forward to is using webhooks to develop a form of dependency caching, so that when content is updated in Kentico Cloud, the application can reflect these changes.

    In fact, I am so excited to have this feature in my hands for my caching needs that I have already started developing something I could potentially use in time for the Q3 2017 release - which should be any time now.


    As we all know, to improve the overall performance of your application as well as reduce requests to the Kentico Cloud API, we are encouraged to set a default cache duration. There is documentation on the different routes to accomplish this:

    1. Controller-level - using the OutputCache attribute (a brief example follows this list)
    2. CachedDeliveryClient class - provided by the Kentico Cloud Boilerplate, which acts as a wrapper around the original DeliveryClient object to easily cache data returned from the API for a fixed interval
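
    For reference, caching at controller level using the standard OutputCache attribute might look something like the following - the duration and action shown are purely illustrative:

    [OutputCache(Duration = 600, VaryByParam = "urlSlug")]
    public async Task<ActionResult> Detail(string urlSlug)
    {
        // Retrieve content from Kentico Cloud and return the view...
        return View();
    }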

    I personally prefer caching at controller level, unless the application is doing something very complex at runtime when manipulating incoming data. So in the meantime, whilst I wait for webhook functionality to be released, I decided to create a custom controller attribute called "KenticoCacheAttribute" that will only start the caching process if the application is not in debug mode.

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Linq;
    using System.Web;
    using System.Web.Mvc;
    
    namespace Site.Web.Attributes
    {
        public class KenticoCacheAttribute : OutputCacheAttribute
        {
            public KenticoCacheAttribute()
            {
                Duration = HttpContext.Current.IsDebuggingEnabled ? 0 : int.Parse(ConfigurationManager.AppSettings["KenticoCloud.CacheDuration"]);
            }
        }
    }
    

    The "KenticoCacheAttribute" inherits the OutputCacheAttribute class, which gives me additional control to when I'd like the caching process to happen. In this case, the cache duration is set within the web.config.

    The main benefit I found of my custom controller attribute is that I will never forget to start caching pages on my website when it comes to deploying to production, since we never want our website to have debugging enabled outside a development environment. This also works the other way: we're not too concerned about caching in a development environment, as we always want to see changes in incoming data straight away.

    The new cache attribute is used in exactly the same way as OutputCacheAttribute, as follows:

    [Route("{urlSlug}")]
    [KenticoCacheAttribute(VaryByParam = "urlSlug")]
    public async Task<ActionResult> Detail(string urlSlug)
    {
         // Do something...
    
        return View();
    }
    

    This is a very simple customisation I found useful through my Kentico Cloud development.

    The custom attribute I created is just the start of how I plan to integrate cache management for Kentico Cloud applications. When webhook capability is released, I can see further improvements being made, but this may require a slightly different approach, such as developing a custom MVC Action Filter instead.

  • This site has been longing for an overhaul, both visually and especially behind the scenes. As you most likely have noticed, nothing has changed visually at this point in time - still using the home-cooked "Surinder theme". This should suffice in the meantime as it currently meets my basic requirements:

    • Bootstrapped to look good on various devices
    • Simple
    • Function over form - prioritises content first over "snazzy" design

    However, behind the scenes is a different story altogether and this is where I believe it matters most. After all, half of web users expect a site to load in 2 seconds or less and they tend to abandon a site that isn't loaded within 3 seconds. Damning statistics!

    The last time I overhauled the site was back in 2014, when I took a more substantial step towards current standards. What has changed since then? I have upgraded to Kentico 10, but this time using ASP.NET Web Forms over MVC.

    Using the ASP.NET Web Forms approach over MVC was a very difficult decision for me. It felt like I was taking a backwards step in making my site better. I'm the kind of developer who gets a kick out of nice clean code output, and MVC fulfils this requirement. Unfortunately, the new development approach for building MVC sites from Kentico 9 onwards will not work under a free license.

    The need to use Kentico as a platform was too great, even after toying with the idea of moving to a different platform altogether. I love having the flexibility to customise my website to my heart's content. So I had the option to either refit my site in Kentico 10 or move to Kentico Cloud. In the end, I chose Kentico 10. I will be writing another post on why I didn't opt for the latter. I'm still a major advocate of Kentico Cloud and have started using it on other projects.

    The developers at Kentico weren't lying when they said that Kentico 10 is "better, stronger, faster". It really is! I no longer get the spinning loader for an obscene duration of time whilst opening popups in the administration interface, or lengthy startup times when the application has to restart.

    Upgrading from Kentico 8.0 to 10 alone was a great start. I have taken some additional steps to keep my site as clean as possible:

    1. Disable view state on all pages, components and user controls.
    2. Caching static files, such as CSS, JS and images. You can see how I do this at web.config level from this post.
    3. Maximising Kentico's cache dependencies to cache all data (see the sketch after this list).
    4. Took the extra step to export all site contents into a fresh installation of Kentico 10, resulting in a slightly smaller web project and database size.
    5. Restructured pages in the content tree to be more efficient when storing a large number of pages under one section.
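
    To expand on point 3, Kentico's data caching API lets a cache dependency be attached to cached data so it is flushed automatically when the underlying pages change. The following is a rough sketch of the pattern only - the page type, alias path and cache key are purely illustrative:

    // Cache the query result for 60 minutes and flush it whenever any page
    // under the illustrative /Articles section is modified.
    var articles = CacheHelper.Cache(cs =>
    {
        cs.CacheDependency = CacheHelper.GetCacheDependency("node|mysite|/articles|childnodes");
    
        return DocumentHelper.GetDocuments("MySite.Article").Path("/Articles/%").ToList();
    }, new CacheSettings(60, "articles_listing"));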

    I basically carried out the recommendations on optimising website performance and then some! My cache statistics have never been so high!

    My Kentico 10 Cache Statistics

    One slight improvement (which has been a long time coming) is better Open Graph support when sharing pages on Facebook and Twitter. Now my links look pretty within a tweet.

  • When running my website through Google Page Insights, one of the things I didn't do was cache static content, such as CSS, JavaScript and site images. Since I am on a shared hosting plan, I didn't think I had the option to cache a specific directory without direct IIS access.

    Normally, when working on client sites hosted on a dedicated server, I set the cache header within "HTTP Response Headers" area in IIS. But all this actually does is generate a web.config file within the directory you wish to cache:

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
        <system.webServer>
            <httpProtocol>
                <customHeaders>
                    <add name="Cache-Control" value="public, max-age=604800" />
                </customHeaders>
            </httpProtocol>
        </system.webServer>
    </configuration>
    

    So if you too are on shared hosting, add a web.config file with similar settings. In this case, I have cached my files for a week.

    You can also set the cache settings in your main web.config file by wrapping a location path around the <system.webServer> node:

    <location path="resources">
        <system.webServer>
            <httpProtocol>
                <customHeaders>
                    <add name="Cache-Control" value="public, max-age=604800" />
                </customHeaders>
            </httpProtocol>
        </system.webServer>
    </location>
    
  • I noticed something very strange whilst working on one of my recent Kentico projects, where I required a query string value to be case-sensitive. You might be asking: why? Well, the plan was to pass a case-sensitive Base64 random value in a bit.ly-style ID format. For example: www.mysite.com/Home/iAfcTy.

    So I added a Wildcard URL to one of my pages to keep the URL looking nice and tidy. In this case: “/Home/{ID}”.

    Kentico Document Url Path

    Something with the simplest of intentions ended up being a bit of a nightmare. To demonstrate what I experienced, see the following test cases using Kentico’s Wildcard parameter.

    Test 1

    Passing “Hello” to the query string parameter resulted in the following:

    Kentico Wildcard Case 1

    This is the correct outcome.

    Test 2 – Things get interesting!

    Passing “HELLO” to the same query string parameter resulted in the following:

    Kentico Wildcard Case 2

    As you can see, the query string has been cached and resulted in the same value being used. It seems Kentico completely disregards case sensitivity, and it’s only by adding or removing characters that Kentico detects that the value passed has changed.

    ***

    My understanding is that by default Kentico accepts URLs as entered by the website user. I thought that going to CMS Site Manager and changing the URL settings to “Use exactly the URL of the document” would accept case-sensitive lettering.

    Kentico Redirect Valid Urls

    As it turns out through my testing, this setting under the “URLs and SEO” section doesn’t fix the issue and may only work for document page names and not the query string values themselves.

    For one moment, I thought I had managed to find a bug in the Kentico platform and was hoping that I'd get a tree planted bearing my name through Kentico’s brilliant "tree for a bug" campaign. Alas, this was not the case. After discussing the problem in great detail over emails sent back and forth, I couldn't seem to get the support personnel to replicate the issue.

    But if I'm experiencing this issue across different networks, workstations and installations, there must be an underlying problem within the Kentico platform.

    If one of my fellow Kentico experts can try what I have stated in my post and report their findings in the comments section, it would be much appreciated.

    Who knows, there might be a really simple thing I’ve overlooked.

    Workaround

    Using the standard way of passing a query string value works perfectly; it seems only Kentico Wildcard URLs experience this issue. So instead of using the Wildcard method, you will have to pass values in the following format:

    www.mysite.com/Home/?ID=Hello
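
    For completeness, the value can then be read server-side with its casing preserved - a minimal sketch using the standard request API, with the parameter name matching the example above:

    // Query string *values* keep their original casing, e.g. "Hello" vs "HELLO".
    string id = HttpContext.Current.Request.QueryString["ID"];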
    
  • It wasn’t until today that I found the Safari browser used on the iPad and iPhone caches page functionality to such an extent that it stops the intended functionality. So much so, it affects the user experience. I think Apple has gone a step too far in making their browser uber-efficient to minimise page loading times.

    We can accept that browsers will cache style-sheets and client-side scripts. But I never expected Safari to go as far as caching responses from web services. This is a big issue. Something as simple as the following will have issues in Safari:

    // JavaScript function calling web service
    function GetCustomerName(id)
    {
        var name = "";
    
        $.ajax({
            type: "POST",
            url: "/Internal/ShopService.asmx/GetCustomerName",
            data: "{ 'id' : '" + id + "' }",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
        cache: false,
        async: false, // wait for the response so 'name' is populated before returning
            success: function (result) {
                var data = result.d;
                name = data;
            },
            error: function () {
            },
            complete: function () {
            }
        });
        
        return name;
    }
    
    //ASP.NET Web Service method
    [WebMethod]
    public string GetCustomerName(int id)
    {
       return CustomerHelper.GetFullName(id);
    }
    

    In the past, to ensure my jQuery AJAX requests were not cached, the “cache: false” option within the AJAX call normally sufficed - but not if you’re making POST web service requests. It was only recently that I found the “cache: false” option will not have an effect on POST requests, as stated in the jQuery API documentation:

    Pages fetched with POST are never cached, so the cache and ifModified options in jQuery.ajaxSetup() have no effect on these requests.

    In addition to trying to fix the problem by using the jQuery AJAX cache option, I implemented practical techniques covered by the tutorial: How to stop caching with jQuery and JavaScript.

    Luckily, I found an informative StackOverflow post by someone who experienced the exact same issue a few days ago. It looks like the exact same caching bug is still prevalent in Apple’s newest operating system, iOS6*. Well, you didn’t expect Apple to fix important problems like these now, would you (referring to the Maps fiasco!)? The StackOverflow poster found a suitable workaround: passing a timestamp to the web service method being called, like so (modifying the code above):

    // JavaScript function calling web service with time stamp addition
    function GetCustomerName(id)
    {
        var timestamp = new Date();
    
        var name = "";
    
        $.ajax({
            type: "POST",
            url: "/Internal/ShopService.asmx/GetCustomerName",
            data: "{ 'id' : '" + id + "', 'timestamp' : '" + timestamp.getTime() + "' }", //Timestamp parameter added.
            contentType: "application/json; charset=utf-8",
            dataType: "json",
        cache: false,
        async: false, // wait for the response so 'name' is populated before returning
            success: function (result) {
                var data = result.d;
                name = data;
            },
            error: function () {
            },
            complete: function () {
            }
        });
        
        return name;
    }
    
    //ASP.NET Web Service method with time stamp parameter
    [WebMethod]
    public string GetCustomerName(int id, string timestamp)
    {
        string iOSTime = timestamp;
        return CustomerHelper.GetFullName(id);
    }
    

    The timestamp parameter doesn’t need to do anything once passed to the web service. This will ensure every call to the web service is never cached.

    *UPDATE: After further testing it looks like only iOS6 contains the AJAX caching bug.

  • HTTP Request Script

    In one of my website builds, I needed to load around a couple of thousand records from a database permanently into the .NET cache. Even though I set the cache to never expire, it will get cleared whenever the application pool recycles (currently set to every 24 hours). As you can expect, if a user happens to visit the site soon after the cache is cleared, they will experience excessive page loading times.

    The only way I could prevent this from happening was to set up a Scheduled Task that runs a script to carry out a web request straight after the application pool recycles.

    Luckily, I managed to find a PowerShell script on StackOverflow that will do exactly that:

    # Make a simple HTTP request to the site (insert the URL of the page to warm up).
    $request = [System.Net.WebRequest]::Create("")
    $response = $request.GetResponse()
    $response.Close()