Blog

Posts written in June 2016.

  • Made The Move To HTTPS!

    Early last month, I decided to make the move and finally run my site under a secure certificate. This is something I've been meaning to do over the last year, as it became apparent that Google will soon penalise your search rankings if an SSL certificate is not installed. Quite a few of the developer blogs I follow have already made the transition, so I thought I should do the same. I was surprised how cheap it was to move to HTTPS. For myself, I pay around £25 a year, which covers a basic Comodo SSL certificate and a dedicated IP. This is purely because my website is hosted with a shared hosting provider. It'll probably be even cheaper for those who manage their own hosting.

    I highly recommend anyone who still has qualms about making the move to HTTPS to read the following post by Scott Helme: Still think you don't need HTTPS?. He brings up some very interesting points and benefits that motivated me to make the move.

    The transition to HTTPS was painless and required no major downtime, but I did have to spend time ensuring all external requests from my site were secure, for example Disqus, Google Ads and some external JS references. However, something a little more pressing caught my eye and gave me quite a fright when I logged into Google Webmaster Tools yesterday. Unbeknownst to me, ever since my site changed to HTTPS, both my clicks and CTR statistics had declined drastically over the month. Take a look at the blue and yellow lines:

    Google Webmaster Tools Clicks/CTR Decline

    At least this decline has not been reflected in my Google Analytics report. The number of visitors to my site has remained stable, and I've even noticed a slight increase - I don't think the increase has anything to do with the SSL certificate. So what caused the rapid decline in Webmaster Tools? It seems I missed something in my haste. I in fact needed to create a new website entry inside Webmaster Tools containing my website URL prefixed with "https://". This is because "http://www.surinderbhomra.com" is considered a different URL to "https://www.surinderbhomra.com". Makes sense when I think about it. I wrongly presumed that as long as I had the correct 301 redirects in place, so that all pages on my site are served over HTTPS, there wouldn't be an issue.
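
    For anyone doing the same, the 301 redirect side is straightforward. On IIS shared hosting, a site-wide HTTP to HTTPS redirect can be handled with a single URL Rewrite rule in the web.config, along these lines (a minimal sketch, assuming the URL Rewrite module is available on your host):

    <system.webServer>
      <rewrite>
        <rules>
          <!-- Permanently (301) redirect all HTTP traffic to its HTTPS equivalent. -->
          <rule name="HTTP to HTTPS" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
              <add input="{HTTPS}" pattern="off" ignoreCase="true" />
            </conditions>
            <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>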

    HTTP and HTTPS Sites In Google Webmaster Tools

    John Mueller wrote a FAQ post on Google+ that covers most of the important things you need to know and how to set up Webmaster Tools correctly when changing to HTTPS: https://plus.google.com/+JohnMueller/posts/PY1xCWbeDVC.

    I won't lie to you, seeing that green padlock in the web address bar whenever I visit my site gives me a newfound sense of euphoria!

  • For a site I'm working on, Facebook's Comments plugin is being utilised on all our article pages. There was a requirement to pull the latest comments into a listing page for each of these article pages, as well as the number of comments. Facebook's JavaScript library provides the ability to display a comments counter, but not the ability to pull out x number of comments. So we'll have to go server-side and use the Graph API to get the data we want.

    In this post, I will show you how you can get back all the comments for a page by its full URL.

    Prerequisites

    Before we get into the main C# logic methods, you need to make sure you have a few things in place:

    • ApiWebRequestHelper Class
    • Newtonsoft Json
    • Facebook App Settings
    • Class Objects

    ApiWebRequestHelper Class

    Whenever I am making a call to Facebook's Graph API endpoints, I will be making references to an "ApiWebRequestHelper" helper class. This is something I developed last month to make it easier for me to deserialize XML or JSON responses to a strongly-typed class object. You can take a look at the full code here.

    Newtonsoft Json

    The Newtonsoft Json library is a key ingredient of any JSON web requests. I'd be surprised if you've never heard of or used it. :-) Nevertheless, you can get it here: http://www.newtonsoft.com/json.

    Facebook App Settings

    I haven't created a Facebook App for quite some time and things have changed very slightly in terms of the interface and options presented. The key things you need to get out of your newly created App are:

    • Application ID
    • Application Secret
    • Client Token

    I set the security settings with the following modes, which can be found in Settings > Advanced > Security.

    Facebook App Advanced API Settings

    Class Objects

    The following classes will be used to deserialize Graph API responses into strongly-typed objects.

    The FacebookPageInfo, FacebookPage and FacebookPageShare objects will hold the core information about the queried page, such as the Title and Description, as well as the comment and share counts.

    using System;
    
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageInfo
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("og_object")]
            public FacebookPage Page { get; set; }
    
            [JsonProperty("share")]
            public FacebookPageShare Share { get; set; }
        }
    
        public class FacebookPage
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("description")]
            public string Description { get; set; }
    
            [JsonProperty("title")]
            public string Title { get; set; }
    
            [JsonProperty("type")]
            public string Type { get; set; }
    
            [JsonProperty("updated_time")]
            public DateTime UpdatedTime { get; set; }
    
            [JsonProperty("url")]
            public string Url { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageShare
        {
            [JsonProperty("comment_count")]
            public int CommentCount { get; set; }
    
            [JsonProperty("share_count")]
            public int ShareCount { get; set; }
        }
    }
    

    All comments for a page will be stored in the following objects:

    using System;
    using System.Collections.Generic;
    
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageCommentInfo
        {
            public int TotalComments { get; set; }
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookCommentItem
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("created_time")]
            public DateTime CreatedTime { get; set; }
    
            [JsonProperty("from")]
            public FacebookCommentFrom From { get; set; }
    
            [JsonProperty("message")]
            public string Message { get; set; }
        }
    
        public class FacebookCommentFrom
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("name")]
            public string Name { get; set; }
        }
    }
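
    One class referenced later but not defined in this post is FacebookCommentInfo, which GetCommentsByPageId() deserializes into. A minimal sketch of what it could look like, assuming the standard Graph API response where comments are returned inside a "data" array:

    using System.Collections.Generic;
    
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookCommentInfo
        {
            // The Graph API wraps the returned comments in a "data" array.
            [JsonProperty("data")]
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }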
    

    Facebook Logic Class

    Now that we have the prerequisites in place, let's get to the code that will perform the required functions:

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Web;
    
    using Site.BusinessObjects.Facebook;
    
    namespace Site.BusinessLogic
    {
        public class FacebookLogic
        {
            private string _accessToken;
    
            /// <summary>
            /// Uses default Client ID and Secret as set in the web.config.
            /// </summary>
            public FacebookLogic()
            {
                GetAccessToken(Config.Facebook.ClientId, Config.Facebook.ClientSecret);
            }
    
            /// <summary>
        /// Requires Client ID and Secret.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            public FacebookLogic(string clientId, string clientSecret)
            {
                GetAccessToken(clientId, clientSecret);
            }
    
            /// <summary>
            /// Gets page info that has been shared to Facebook.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <returns></returns>
            public FacebookPageInfo GetPage(string pageUrl)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookPageInfo>($"https://graph.facebook.com/{pageUrl}?access_token={_accessToken}");
            }
    
            /// <summary>
            /// Gets comments for a page based on its absolute URL.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <param name="maxComments"></param>
            public FacebookPageCommentInfo GetPageComments(string pageUrl, int maxComments)
            {
                try
                {
                    // Get page information in order to retrieve page ID to pass to commenting.
                    FacebookPageInfo facebookPage = GetPage(pageUrl);
    
                    if (facebookPage.Page != null)
                    {
                        return new FacebookPageCommentInfo
                        {
                            TotalComments = facebookPage.Share.CommentCount,
                            Comments = GetCommentsByPageId(facebookPage.Page.Id, maxComments).Comments
                        };
                    }
                    else
                    {
                        return null;
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
    
                    return null;
                }
            }
    
            /// <summary>
            /// Gets comments by Facebook's Page ID.
            /// </summary>
            /// <param name="fbPageId"></param>
            /// <param name="max"></param>
            /// <returns></returns>
            public FacebookCommentInfo GetCommentsByPageId(string fbPageId, int max = 10)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookCommentInfo>($"https://graph.facebook.com/comments?id={fbPageId}&access_token={_accessToken}&limit={max}");
            }
    
            /// <summary>
            /// Retrieves Access Token from Facebook App.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            private void GetAccessToken(string clientId, string clientSecret)
            {
            UriBuilder builder = new UriBuilder($"https://graph.facebook.com/oauth/access_token?client_id={clientId}&client_secret={clientSecret}&grant_type=client_credentials");
    
                try
                {
                    using (WebClient client = new WebClient())
                    {
                        // Get Access Token from incoming response.
                        string data = client.DownloadString(builder.Uri);
    
                        NameValueCollection parsedQueryString = HttpUtility.ParseQueryString(data);
    
                        _accessToken = parsedQueryString["access_token"];
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
                }
            }
        }
    }
    

    By default, on instantiation of the FacebookLogic class, the Application ID and Secret values are taken from the web.config, or you can pass these values in directly via the constructor overload.

    Out of all the methods used here, we're only interested in using one: GetPageComments(). What you will notice from this method is that we cannot get the comments from one API call alone; we first have to make an extra API call to get the ID of the page. This ID is then passed to the GetCommentsByPageId() method to return all the comments.

    Usage

    Comments for a page can be returned by adding the following to your code; you will then be able to access properties to iterate through the comments:

    FacebookLogic fbl = new FacebookLogic();
    
    // Pass in the page URL and number of comments to be returned.
    var pageComments = fbl.GetPageComments("https://www.surinderbhomra.com/", 2);
    

    Wherever you call this piece of code, make sure you cache the results for 5-10 minutes, so that you do not use up your API request limits.
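
    As a rough idea of what that caching could look like (a sketch using .NET's built-in MemoryCache; the wrapper class name is my own invention):

    using System;
    using System.Runtime.Caching;
    
    using Site.BusinessObjects.Facebook;
    
    namespace Site.BusinessLogic
    {
        public static class FacebookCommentCache
        {
            private static readonly MemoryCache Cache = MemoryCache.Default;
    
            public static FacebookPageCommentInfo GetPageComments(string pageUrl, int maxComments)
            {
                string cacheKey = $"fb-comments-{pageUrl}-{maxComments}";
    
                FacebookPageCommentInfo comments = Cache.Get(cacheKey) as FacebookPageCommentInfo;
    
                if (comments == null)
                {
                    comments = new FacebookLogic().GetPageComments(pageUrl, maxComments);
    
                    if (comments != null)
                    {
                        // Hold onto the result for 10 minutes to keep within API request limits.
                        Cache.Set(cacheKey, comments, DateTimeOffset.UtcNow.AddMinutes(10));
                    }
                }
    
                return comments;
            }
        }
    }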

  • 5 Hours Of GoDaddy Pro Is More Than Enough For Me

    There's a saying that goes along the lines of: if it ain't broke, don't fix it. But as human beings, we're a very inquisitive bunch and like to delve into the dark abyss of the unknown just to see what's out there. This is exactly what I did when I decided to move hosting providers.

    I've hosted my site for many years very happily on SoftSys Hosting ever since UltimaHosts decided to delete my site, with no backup to restore. Now when a hosting company does that, you know it's time to move on - no matter how good they say they are. If I have no issues with my current hosting provider, why would I ever decide to move? Afterall, my site uptime has been better than it has ever been and the support is top notch!

    Why Make The Move To Go Daddy Pro?

    Well, for starters, the cost of hosting was not to be sniffed at: based on my current site usage, the deluxe package (priced at £3.99 per month) combined with a promo code was an absolute bargain! Secondly, there's the infrastructure, which consists of real-time performance and uptime monitoring (powered by NodePing) and high-spec servers that Go Daddy describes as:

    Hosting isn’t about having the best hardware, but it sure doesn’t hurt. Our servers are powered by Intel Xeon processors for heavy lifting, blazing-fast DDR 3 memory for low latency, and high speed, redundant storage drives for blistering page load times and reliability.

    Currently, my server is based in the US, and I wanted to house my website on servers nearer to home in the UK, which unfortunately my existing shared hosting package does not offer. Even though it doesn't look it, my website is quite large, and there are benefits to having hosting in the same locale as where you live, such as faster upload and download speeds when updating a website via FTP.

    Thirdly, I wanted my domain and website all under a single dashboard login (something Go Daddy does well with its clean, manageable interface), along with the ability to self-administer an SSL certificate without paying the hosting company to install it on my behalf.

    Lastly, there's the 24/7 support, accessible through either email or online chat, where GoDaddy Pro boasts low waiting times and highly specialised support that can be escalated to resolve problems quickly. But from what I experienced, it was at this very point that they failed.

    Go Daddy Pro Advanced Technical Support

    The Extremely Painful Support Experience

    I kid you not, it made me want to smack myself over the head repeatedly with my MacBook Pro. My experience with GoDaddy Pro's "specialised support" personnel was a painful one. For explanation purposes, I will call the person I spoke to Gary.

    After uploading my files to my GoDaddy Pro account, I carried out all the initial setup with ease and was near enough ready to go. However, I was missing one critical piece of the puzzle, one that would demonstrate the low technical acumen of GoDaddy Pro support and lead to my departure from their services. The critical piece being: the ability to upload and restore a backup of an MS SQL Server database.

    Now, the SQL Server database backup in question is around 150MB, and I could not simply access the online MS SQL portal to upload my backup, since the URL to the portal uses a sub-domain (if I remember correctly) based on your own live website domain. This was a problem for me purely because I had not pointed the domain over to Go Daddy's servers. If I did, the consequence would be my current site going down. Not an option! I required a temporary generated domain that would allow me to not only access the MS SQL portal but also test my site before re-pointing my domain.

    My query was quite a simple one, so I opened up GoDaddy's support chat window and Gary came online to help. After much discussion, he did not seem to understand why I'd want to do this and never managed to resolve the access issue with the MS SQL portal. All Gary did was send me links to documentation, even though I told him I had read the support literature beforehand and there was nothing relating to my original query.

    He then suggested that I log into the database through Microsoft Management Studio and go through the process of restoring my backup directly from my computer to their database. As we all know (unfortunately, Gary didn't), you cannot upload and restore a database backup directly from your own computer; the backup needs to be on the database server itself. Even after I told him I am a Microsoft .NET developer and that this was not possible, Gary kept telling me I was wrong. This went on for quite some time and got to the point where I just couldn't be bothered anymore, and any motivation I had for moving hosting providers dissolved in that very instant.

    On one hand, I cannot fault Go Daddy's support response time. It is pretty much instant. On the other hand, the quality of support I received is questionable to say the least.

    Outcome

    I decided to stay with SoftSys Hosting based on the fact that I haven't had any issues, and any queries I've ever had were dealt with professionally and promptly. I think it's quite difficult to see how good you have it until you try something less adequate. Their prompt support exceeded my expectations when I requested an SSL to be installed on a Sunday evening and, to my surprise, it was all done when I got up the next morning. Now that's what I call service!

    If I can say anything positive about my GoDaddy experience, it is the money-back promise within 30 days of purchase. I had no problems getting a full refund swiftly. I spoke to a friendly customer service fellow over the phone and explained why I wanted to leave, and my refund was processed the next working day.

    I just wish I could get those 5 wasted hours of my life back...

  • My Development Overview of Kentico 9 MVC

    When Kentico offered the option to build websites using MVC, I was one of the many developers who jumped at the chance to utilise the new programming model. I've been building websites using the MVC programming model ever since it was first made available in Kentico 7, and with each version the MVC implementation has just got better and better. So much so, I even built my very own website (currently in Kentico 8) in MVC.

    MVC in Kentico has always been a bit of a hybrid, in the sense that it wasn't true MVC to the core, which is to be expected when you have to accommodate the vast array of features the Kentico platform offers. Luckily for us, Kentico 9 has embraced MVC with open arms, and things can only get better with subsequent versions.

    I have listed a few observations I thought would be good to write about from my initial experience of using MVC in Kentico 9 whilst working on a client project. I will be talking (very high level) about the changes from previous versions of Kentico MVC as well as the new development approaches for Kentico 9.

    1) Goodbye "Pages", Hello "Content-only Pages" Page Types

    "Content-only pages" is a new addition to Kentico 9 and is another form of Page Type, with its primary job (as the name suggests) to store content. The main difference between "Pages" and "Content-only Pages", is:

    • Aren't based on Page Templates.
    • Provides a simplified interface when managing content.
    • Does not have a presentation URL. URL patterns now need to be specified which handles the presentation to the MVC site via routing.
    • Lacks the ability to create many Page Aliases.

    "Content-only pages" is a requirement to developing MVC sites in Kentico 9. Overall, I actually found "Content-only pages" quite restrictive and useful key page properties are no longer available, such as the URLs and Navigation tabs. I really do wish that these features were left in.

    Kentico 9 MVC Missing Page Properties

    I will be talking more about the removal of URLs in my next point; the missing Navigation properties are easier to get around. I created a base content page type called "Kentico Core Content" that contains all the fields you would normally find under the Navigation properties, and inherited this page type in all my content-only pages, such as Articles. You'll then just have to make the customisations to inherit these fields at code level (see the sketch below the screenshot). Easy!

    Kentico 9 Core Content Page Inheritance
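
    To give an idea of what inheriting these fields at code level could look like, a partial class extension of a generated page type along these lines does the trick (a rough sketch; the Article class and field names are hypothetical stand-ins for your own page types):

    using CMS.Helpers;
    
    namespace CMS.DocumentEngine.Types
    {
        // Partial extension of the auto-generated Article page type, surfacing
        // fields inherited from the "Kentico Core Content" base page type.
        public partial class Article
        {
            public string NavigationTitle
            {
                get
                {
                    return ValidationHelper.GetString(GetValue("NavigationTitle"), string.Empty);
                }
            }
    
            public bool ShowInNavigation
            {
                get
                {
                    return ValidationHelper.GetBoolean(GetValue("ShowInNavigation"), false);
                }
            }
        }
    }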

    2) No Document Aliases

    There's no option to allow the site administrator to add multiple document aliases for a page. This alone was nearly a deal breaker for me, and I was tempted to go down either the Portal or ASPX templates route. The ability to create multiple document aliases in the URLs section is a very powerful feature, especially if you plan on adding 301 redirects.

    To get around this excluded feature, you will either have to use URL Rewriting at web.config level or add additional routes at controller level to carry out any specific page redirects.
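
    As an illustration of the controller-level option, something along these lines would do the job in a plain ASP.NET MVC project (a sketch; the route and URLs are hypothetical):

    using System.Web.Mvc;
    using System.Web.Routing;
    
    namespace Site.Web
    {
        public static class LegacyRouteConfig
        {
            public static void RegisterRoutes(RouteCollection routes)
            {
                // Map a legacy URL (e.g. an old document alias) to a redirect action.
                routes.MapRoute(
                    name: "LegacyArticle",
                    url: "old-section/my-old-article",
                    defaults: new { controller = "LegacyRedirect", action = "OldArticle" });
            }
        }
    
        public class LegacyRedirectController : Controller
        {
            // Permanently (301) redirect the retired URL to its new home.
            public ActionResult OldArticle()
            {
                return RedirectPermanent("/articles/my-new-article");
            }
        }
    }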

    So before deciding whether to choose the MVC approach, ask yourself if this is a pressing feature for you and your client.

    3) Separation of the CMS and Presentation Layer

    Kentico 8 stepped up the MVC integration by allowing the developer to build their sites using MVC through the CMSApp_MVC project. This created a basic form of separation at project level, which was a big improvement over the infancy of MVC support in Kentico 7, where key MVC components (Controllers/Models/Views) were mixed into what was a Web Forms powered site.

    Now there is complete separation between the CMS Admin and Presentation layer (or MVC site). Since the CMSApp_MVC approach has been made obsolete in Kentico 9, you now have the full ability to create an MVC site as you normally would in a non-Kentico web application. The only way Kentico and your MVC website can talk to one another is through a Web Farm configuration.

    Kentico 9 MVC Architecture

    I personally love this setup. My website can be kept as light as possible and still harness the power of what Kentico has to offer by using conventional API calls from the Kentico library. I can tell you this for sure: the website itself performs better than ever, and no one can tell what CMS is powering the site. Good for security.

    4) Licensing and Environment Setup

    Due to the need for a Web Farm setup to allow synchronisation of content between the CMS and the MVC site, the licensing requirements have changed. Based upon how you want to set up your separate sites, Kentico provides different licensing approaches, which pretty much cover all scenarios.

    My preferred setup is to run Kentico and the MVC application as two different sites, on separate domains. Again, my reasoning comes down to catering for that additional level of security, where the management of your site is on a sub-domain and it is not so obvious where the administration area resides. In this case, two licenses will be required. For example:

    You will get a free license for the Kentico site as long as the subdomain is "admin".

    The only sad part (for me personally) is that the Kentico CMS Free Edition license does not allow for MVC websites. I really do hope this changes at some point. I'd love to utilise full Kentico 9 MVC on future personal projects that are currently on the free edition; otherwise, they will forever be stuck on version 8.2.

    5) Page Templates

    The ability to use Page Templates alongside standard page types is still available within the Kentico interface, but you can only develop an MVC site this way by creating a (now obsolete) "CMSApp_MVC" project. Kentico 9 MVC is still fully backwards compatible with this approach.

    6) Retrieving Content In MVC Controllers

    In Kentico 8, a controller acted as the code-behind to your Page Template, where you could get all the information about the current page by calling DocumentContext.CurrentDocument. In Kentico 9, this is no longer the case, and it is recommended that content be retrieved using its auto-generated code. I generally don't go down the route of using the auto-generated code; I instead like to create my own custom methods, so I have the freedom to pull out the exact data my views need by passing the Node Alias Path into my controller from the URL route pattern. Personal preference.
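
    For illustration, one of those custom methods might look something like this (a sketch using Kentico's DocumentQuery API; the page type code name is a placeholder):

    using System.Linq;
    
    using CMS.DocumentEngine;
    
    namespace Site.BusinessLogic
    {
        public class ArticleLogic
        {
            /// <summary>
            /// Gets a single article by the Node Alias Path passed in from the route pattern.
            /// </summary>
            public TreeNode GetByNodeAliasPath(string nodeAliasPath)
            {
                return DocumentHelper.GetDocuments("MySite.Article")
                                     .Path(nodeAliasPath)
                                     .OnCurrentSite()
                                     .Published()
                                     .TopN(1)
                                     .FirstOrDefault();
            }
        }
    }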

    7) Friendly URL's Should Include A Node ID

    Kentico recommends all page URLs should consist of the NodeID and Page alias, to ensure optimum search engine optimisation in the event that the page alias of a page changes on the live site. Kentico's documentation states:

    Typically, a page alias will be part of a macro expression that makes up a URL pattern in content only page types. For example, a URL pattern can be specified like this /Articles/{%NodeID%}/{%NodeAlias%}. Where {%NodeAlias%} accesses the page alias.

    I've gone down the route of creating a custom route contraint in my MVC project, to allow me to retrieve TreeNode for the current document via HttpContext just from passing the Node Alias only. I could go into more detail, but this is probably best suited for another blog post.

    Conclusion

    The MVC integration in Kentico is like a fine wine: it gets better and better every year (in this case, every release), and Kentico should be congratulated for undertaking such a humongous task. Would I choose it over Portal or ASPX pages for every project? Probably not, because I can see clients expecting functionality that is not quite present in an MVC installation as of yet.

    I do like the freedom MVC gives me whilst harnessing the power of Kentico; it works great on those projects that require maximum flexibility, and the separation of code levels allows me to achieve that. In addition, if my site requires scaling, I can easily move it to Azure. I am very much looking forward to what Kentico has in store for future releases.

    If there is anything I have listed in my initial observations that is incorrect, please leave a comment and I will update this post.