Blog

Blogging on programming and life in general.

  • As I have been writing the last few blog posts, I've been getting a case of "twitchy feet" during the writing process. I normally get "twitchy feet" when frustrated or annoyed by things in my life that I feel could be done more easily. In this case, my site has started to frustrate me, and adding new posts has become a chore.

    Over the 10 years (has it really been this long!?) of owning and maintaining this site, it's grown into a bit of a beast compared to the initial outset. I've jumped from platform to platform based on my needs at the time:

    • Wordpress (2006)
    • BlogEngine (2007 to 2012)
    • Kentico (2012 to present)

    I feel that at the grand old age of 31, I need a platform that nurtures my writing creativity without having to worry about general maintenance and somewhat restrictive editorial functionality. Ever since I tasted the pure nectar that is Markdown, my writing speed has gone through the roof and I love having full control through its simplistic editing interface - Markdown is the future!

    I am a certified Kentico Developer (you may have got that impression from my many posts on the platform) and specifically chose Kentico CMS because it gave me the full flexibility to build the site how I wanted. As great as the platform is, I've come to the conclusion that this site will never grow to be anything more than one thing: a blog. So I want to downsize, like a person getting on in his years, and move to a smaller house.

    Enter Ghost...

    Ghost

    The Ghost platform has garnered a lot of traction ever since its inception in 2012. I've been keeping an eye on it over the years and never really gave the platform much thought until I noticed quite a few popular bloggers making the move and experiencing lightning-fast performance. This is possibly down to those bloggers hosting their instances on Ghost Pro, but I could be wrong. I am planning on going down the Ghost Pro hosting route and getting everything set up by the very nice people behind the scenes at Ghost HQ, who will lovingly host and look after my site.

    I opened up a dialogue on Twitter with Ghost, who were very kind in alleviating my initial migration worries:

    @SurinderBhomra We can upload images for you, if you send the upload directory in the format Ghost uses, i.e. /content/images/yyyy/mm/image-name
    — Ghost (@TryGhost) October 7, 2016

    @SurinderBhomra We can help with the redirects if you're coming over to Ghost(Pro). :)
    — Ghost (@TryGhost) October 6, 2016

    The only thing I will have to get over, which Ghost will not be able to help me with, is the mindset that I will no longer be able to tinker with my site to the full extent I do now. But this isn't necessarily a bad thing, and it will give me the opportunity to concentrate more on writing quality content. I just hate the thought of restricting myself.

    Ghost has put a framework in place that no other platform has done so well - giving you the power to write content anywhere:

    • Desktop browser
    • Mobile browser
    • Desktop application

    Looks like Ghost lives up to its main selling point:

    An open source blogging platform which makes writing pleasurable and publishing simple.

    What I also love is the SEO optimisation out-of-the-box. God knows how many hours I've spent trying to get my site SEO friendly, not only from a search indexing standpoint, but from a social sharing standpoint too, with all the Open Graph tags built in. No need for extra plugins or development from a code perspective.

    What's Next?

    As it currently stands, I am evaluating Ghost through their 14-day trial and need to send an email to their support team before I make a confirmed decision to move. I like what I am seeing so far. I just need to find the time to put a migration process in place to move the 200 posts on this site. Eek!

    Ghost is definitely not as scary as I once thought. Cue Ray Parker Jr...

  • The Pursuit Of Happiness

    So it's finally come to this... A point in my life where I'm questioning what I have done to get to this place where I currently find myself standing, wanting to make sense of an emotion that was so naturally built into my being from day one. But now, I am not too sure if it exists or ever did exist.

    The Sad Clown

    Before you read any further, I thought I'd just clarify that you won't be finding me talking about the performance of Will and Jaden Smith in the film The Pursuit of Happyness. The title of the film and of this post is purely coincidental.

    This year has been what I can only describe as turbulent. The complete opposite of what it should have been. It was going to be a year of pastures new. A seed of great things to come was planted, watered on a daily basis and nurtured to flourish into the start of something quite beautiful. Alas, like the state of my lawn, it's very much the case that no matter how much hard graft is invested in transforming something withered into greener pastures, it morphs back to its original state as nature intended. Some things cannot be changed.

    Why do I write this? That I do not know. Maybe writing my inner thoughts into words to stare back at me in its raw unforgiving form is the only way to come to terms with what I am facing. Let's call it: therapy.

    I look at my life and think I am a lucky person. I have nothing to complain about, yet I feel something is missing. As one day ends and another begins, I find myself wondering what I am trying to accomplish and questioning if I am doing everything in my power to remedy the wounds still open from earlier this year. Honest answer: probably not. Yesterday, I thought about what Friedrich Nietzsche said:

    If you stare into the abyss, the abyss stares back at you.

    By not confronting the wounds of yesterday, I'm consumed by reminders of the painful events that have wedged themselves deep into my hippocampus, slowly eroding away my old self. But there is just enough of the small part of me that still exists to warn me that I am slowly edging mentally to the point of no return. So I am here writing this very post.

    If I don't start the healing process now, what I fear the most may come to fruition - others around me will notice the gaping hole where my left ventricle used to be. I have come to the conclusion that I'm not so good at being the great pretender over a considerable duration of time.

    With every letter I type, I slowly regain consciousness and become self-aware once again, coming to the realisation that this year has changed me. No doubt about that. But I'm stronger for it.

    If a human being's thoughts and emotions are truly boundless, then it's in our nature to have the capacity to forgive, forget and learn. By doing this, I can only hope the resulting outcome will be... happiness. In time this will happen. As they say, "time is a great healer". I take great comfort in that.

  • Force.com Explorer is a really useful tool that gives you the ability to explore database tables within your Salesforce environment and run queries against them. Even though this tool has been retired since 2011, I still actively use it, purely because I prefer to have an application installed on my computer rather than using the web-based tool, Workbench.

    I am writing this post for two reasons: firstly, for Salesforce newcomers, and secondly, because one of my fellow developers working on the same project as me was having issues logging into Force.com Explorer. Judging by the title of this post, this may sound a little self-explanatory or dim-witted. Nevertheless, it's a worthy post!

    Before I get to it, I am assuming you know the following three things:

    • How to generate a Security Token.
    • How to create a Connected App.
    • How to generate a Client ID and Client Secret from your Connected App.

    Salesforce Force.com Explorer Login

    The easiest part of the login form is entering your login credentials and selecting the type of environment you plan to explore. Just ensure the user credentials you log in with have sufficient access rights to explore Salesforce database objects.

    The Client ID field is a little misleading, because this field doesn't just accept the Client ID key generated from your Connected App alone. It can also accept the following combination: "<Client-ID><Security-Token>". So don't be misled into thinking that only the Client ID is accepted.

    As you probably know (if you have built apps using the Salesforce API), combining the Client ID and Security Token allows you to access Salesforce data from any IP. If you have whitelisted a specific IP in the Trusted IP Range at Connected App level, you might get away with using the Client ID alone.

  • Kentico 9 Certified Developer

    I hadn't done a Kentico certification exam for over two years - but this doesn't make me any less of an awesome and competent Kentico Developer. Over the last two years, a lot has changed within the Kentico realm, making the subject matter a little more of a challenge to keep up to speed with. After all, two years ago we saw the dawn of a new age - the end of Kentico 7 and the start of Kentico 8. I am seeing the same thing happening again, except this time we're just about seeing Kentico 10 making its appearance over the horizon.

    What I noticed this time round was the increased number of questions revolving around macros. It felt like I was bombarded with them whilst carrying out the exam. I think the only thing that got me through was remembering the approach I took with the macros I wrote for a recent EMS project.

    The Kentico Certification Preparation Guide has greatly improved compared to previous versions, where the questions were pretty simple and a vast contrast to the real thing. This allowed me to gauge much better the type of questions that would potentially be presented, and I did notice quite a few questions from the preparation guide cropping up in the real exam - although slightly re-worded.

    I highly recommend anyone who is interested in becoming a Kentico Certified Developer to read the following post by Jeroen Furst prior to taking the exam: Tips for becoming a Kentico Certified Developer. Jeroen brings up some really good points and guidance to prepare yourself. If only I had come across this post two years ago when I wasn't too sure what to expect (it being my first Kentico exam), I would have felt more comfortable in understanding what was required.

    I was expecting there to be some questions relating to MVC due to all the effort made by the Kentico development team to make the MVC integration seamless within Kentico 9. Alas, this was not the case. Jeroen also states the following:

    If you are planning to take any older v6, v7 or v8 exams (if still possible) you might run into questions regarding Azure and the Intranet solution. These topics are not part of the v9 exam anymore.

    The Kentico 9 exam purely focuses on the core Kentico features as well as the platform architecture every Kentico developer should know in order to build high quality sites. You will also find yourself learning some new things in the process of preparing for the exam as well as brushing up on your existing knowledge.

    If you succeed, you can proudly display this badge in all its glory! ;-)

    Kentico 9 Certified Developer

  • My bookshelf was really in need of a good clear out. Out of all the books I own, I noticed that I seem to have more technical/programming books than any other kind of book. I guess this makes me your typical nerd with a high interest in anything programming related. Then again, my blog posts may already show that.

    Bookshelf of Programming Books

    As I peruse my vast collection, I can't help but get in the mood to reminisce about a time when I was still trying to find my feet in the coding world. I am reminded of the confusing and somewhat challenging journey as a student at Oxford Brookes University, where I was trying to get a grip on the fundamentals of programming by sinking my teeth into books about Pascal, Delphi and C++.

    It was only when carrying out my year-long dissertation that I developed a profound interest in web development, as well as Microsoft development frameworks in general. This is probably the point in my life where my programming book purchases soared drastically. As you can see from the collection of my books in this post, two things stand out:

    1. How outdated the subject matter is. Yes, there is a Classic ASP book in there.
    2. The thickness of each book. I think JavaScript Bible is probably the thickest!

    Collection of Programming Books

    The last programming book I purchased was around three years ago - C# In Depth by Jon Skeet. This was the first book purchase I had made in a very long time since studying, because I needed to up my game as well as demonstrate my C# prowess. I generally use developer blogs and forums to find answers to all my never-ending questions.

    So this leads me to a question that I will just throw out there: what is a better method of learning? Books or online resources?

    I think our way of learning has changed over the past few years and I don't think our old friend "the book" is as prominent as it once was as a learning aid, especially when there are far more accessible and interactive ways of learning.

    Pluralsight + Microsoft Virtual Academy + StackOverflow = My Learning Heaven

    Let's take training via Pluralsight as a fine example. Since registering, I find myself having the ability to learn on demand at a time of my choosing. I am not restricted to lugging a thick programming book around, as (believe it or not!) I once did. The flexibility of multiple learning paths guides me to all the courses I need to be proficient in a subject, all from the comfort of a laptop, phone or tablet. In addition, unlike book purchases that will inevitably go out of date, you will have access to all the latest content at no extra cost. Big bonus!

    Pluralsight alongside Microsoft Virtual Academy (if you're a .NET Developer) is the most powerful learning resource a developer could have. As much as my argument is swaying more towards the paperless approach, there is nothing like having the satisfaction of flicking through pages of a book. I don't think I could completely empty my book shelf of all programming books. I have just too many timeless classics that I could never give away and will always go back to reach for, one of them being Code Complete.

    I came across an insightful article by Caroline Myrberg called Screen vs. paper: what is the difference for reading and learning?, where she writes an interesting piece on what recent research has to say about the learning processes involved in reading on screen compared to on paper. Surprisingly, there isn't much of a substantial difference in how we absorb information, regardless of medium. It's all about how information is presented to us. The article highlights a study where participants completed a knowledge test of 24 questions after one group was given learning material in paper format and another on an interactive web page. The outcome:

    ...the web page group scored better on 18 of those questions, and significantly better (90% or higher) on six. So enhancing the electronic text instead of just turning it into a copy of the printed version seems to have helped the students to score higher on the test.

    I think this is why online learning like Pluralsight works so well! At the same time, there will always be a need for books, no matter how deeply technology continues to immerse itself in our daily lives. We as human beings relate to things that are tangible - physical objects we can hold and touch. It's our default behaviour and the way we're wired. But you can't help but embrace the massive leaps in technology, making access to learning resources more convenient than it has ever been.

  • Early last month, I decided to make the move and finally run my site under a secure certificate. This is something I've been meaning to do over the last year, as it became apparent that Google will soon penalise your search rankings if an SSL certificate is not installed. Quite a few of the developer blogs I follow have already made the transition, so I thought I too should do the same. I was surprised how cheap it was to move to HTTPS. For myself, I pay around £25 a year, which consists of a basic Comodo SSL certificate and a dedicated IP. This is purely because my website is hosted on a shared hosting provider. It'll probably be even cheaper for those who manage their own hosting.

    I highly recommend anyone who still has qualms about making the move to HTTPS to read the following post by Scott Helme: Still think you don't need HTTPS?. He brings up some very interesting points and benefits that motivated me to make the move.

    The transition to HTTPS was painless and required no major downtime. But I did have to spend time ensuring all external requests from my site were secure, for example Disqus, Google Ads and some external JS references. However, something a little more pressing caught my eye, and I got quite a fright when I logged into Google Webmaster Tools yesterday. Unbeknown to me, ever since my site changed to HTTPS, both my clicks and CTR statistics had declined drastically over the month. Take a look at the blue and yellow lines:

    Google Webmaster Tools Clicks/CTR Decline

    At least this decline has not been reflected in my Google Analytics report. The number of visitors to my site has remained stable, and I have even noticed a slight increase - I don't think the increase has anything to do with the SSL certificate. So what caused the rapid decline in Webmaster Tools? It seems I missed something in my haste. In fact, I needed to create a new site entry inside Webmaster Tools containing my website URL prefixed with "https://". This is because "http://www.surinderbhomra.com" is considered a different URL to "https://www.surinderbhomra.com". Makes sense when I think about it. I wrongly presumed that as long as I had the correct 301 redirects in place so all pages on my site are served over HTTPS, there wouldn't be an issue.

    HTTP and HTTPS Sites In Google Webmaster Tools
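
    The 301 redirect itself is simple enough to put in place. As a reference, here is a minimal sketch of a site-wide HTTP to HTTPS redirect for an ASP.NET site's Global.asax - this assumes HTTPS is served on the standard port 443:

    using System;
    using System.Web;
    
    // Global.asax.cs - permanently (301) redirect any HTTP request to its HTTPS equivalent.
    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            if (!Request.IsSecureConnection)
            {
                // Rebuild the requested URL under the https scheme.
                UriBuilder secureUrl = new UriBuilder(Request.Url)
                {
                    Scheme = Uri.UriSchemeHttps,
                    Port = 443
                };
    
                Response.RedirectPermanent(secureUrl.Uri.AbsoluteUri);
            }
        }
    }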

    John Mueller wrote a FAQ post on Google+ that covers most of the important things you need to know and how to setup Webmaster Tools correctly on change to HTTPS: https://plus.google.com/+JohnMueller/posts/PY1xCWbeDVC.

    I won't lie to you, seeing that green padlock in the web address bar whenever I visit my site gives me a new found sense of euphoria!

  • For a site I'm working on, Facebook's Comments plugin is being utilised on all our article pages. There was a requirement to pull the latest comments into a listing page for each of these article pages, as well as the number of comments. Facebook's JavaScript library provides the ability to display a comments counter, but not the ability to pull out x number of comments. So we'll have to go server-side and use the Graph API to get the data we want.

    In this post, I will show you how you can get back all comments for a page by its full URL.

    Prerequisites

    Before we get into the main C# logic methods, we need to make sure a few things are in place:

    • ApiWebRequestHelper Class
    • Newtonsoft Json
    • Facebook App Settings
    • Class Objects

    ApiWebRequestHelper Class

    Whenever I am making a call to Facebook's Graph API endpoints, I will be making references to an "ApiWebRequestHelper" helper class. This is something I developed last month to make it easier for me to deserialize XML or JSON requests to a strongly-typed class object. You can take a look at the full code here.

    Newtonsoft Json

    The Newtonsoft Json library is a key ingredient of any JSON web request. I'd be surprised if you've never heard of or used it. :-) Nevertheless, you can get it here: http://www.newtonsoft.com/json.

    Facebook App Settings

    I haven't created a Facebook App for quite some time and things have changed very slightly in terms of the interface and options presented. The key things you need to get out of your created App are:

    • Application ID
    • Application Secret
    • Client Token

    I set the security settings with the following modes, which can be found in Settings > Advanced > Security.

    Facebook App Advanced API Settings

    Class Objects

    The following classes will be used to deserialize Graph API responses.

    The FacebookPageInfo, FacebookPage and FacebookPageShare objects will get the core information about the queried page, such as the Title and Description, as well as the comments and share counts.

    using System;
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageInfo
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("og_object")]
            public FacebookPage Page { get; set; }
    
            [JsonProperty("share")]
            public FacebookPageShare Share { get; set; }
        }
    
        public class FacebookPage
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("description")]
            public string Description { get; set; }
    
            [JsonProperty("title")]
            public string Title { get; set; }
    
            [JsonProperty("type")]
            public string Type { get; set; }
    
            [JsonProperty("updated_time")]
            public DateTime UpdatedTime { get; set; }
    
            [JsonProperty("url")]
            public string Url { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageShare
        {
            [JsonProperty("comment_count")]
            public int CommentCount { get; set; }
    
            [JsonProperty("share_count")]
            public int ShareCount { get; set; }
        }
    }
    

    All comments for a page will be stored in the following objects:

    using System;
    using System.Collections.Generic;
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookPageCommentInfo
        {
            public int TotalComments { get; set; }
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookCommentItem
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("created_time")]
            public DateTime CreatedTime { get; set; }
    
            [JsonProperty("from")]
            public FacebookCommentFrom From { get; set; }
    
            [JsonProperty("message")]
            public string Message { get; set; }
        }
    
        public class FacebookCommentFrom
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("name")]
            public string Name { get; set; }
        }
    }
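
    One thing to note: the logic class in the next section returns raw comment results via a FacebookCommentInfo type, which I haven't shown above. Here is a minimal sketch of what that wrapper could look like, assuming the Graph API comments endpoint returns its results in a "data" array:

    using System.Collections.Generic;
    using Newtonsoft.Json;
    
    namespace Site.BusinessObjects.Facebook
    {
        public class FacebookCommentInfo
        {
            // The Graph API wraps comment results in a "data" array.
            [JsonProperty("data")]
            public List<FacebookCommentItem> Comments { get; set; }
        }
    }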
    

    Facebook Logic Class

    Now that we have the prerequisites in place, let's get to the code that will perform the required functions:

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Web;
    
    using Site.BusinessObjects.Facebook;
    
    namespace Site.BusinessLogic
    {
        public class FacebookLogic
        {
            private string _accessToken;
    
            /// <summary>
            /// Uses default Client ID and Secret as set in the web.config.
            /// </summary>
            public FacebookLogic()
            {
                GetAccessToken(Config.Facebook.ClientId, Config.Facebook.ClientSecret);
            }
    
            /// <summary>
            /// Requires Client ID and Secret.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            public FacebookLogic(string clientId, string clientSecret)
            {
                GetAccessToken(clientId, clientSecret);
            }
    
            /// <summary>
            /// Gets page info that has been shared to Facebook.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <returns></returns>
            public FacebookPageInfo GetPage(string pageUrl)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookPageInfo>($"https://graph.facebook.com/{pageUrl}?access_token={_accessToken}");
            }
    
            /// <summary>
            /// Gets comments for a page based on its absolute URL.
            /// </summary>
            /// <param name="pageUrl"></param>
            /// <param name="maxComments"></param>
            public FacebookPageCommentInfo GetPageComments(string pageUrl, int maxComments)
            {
                try
                {
                    // Get page information in order to retrieve page ID to pass to commenting.
                    FacebookPageInfo facebookPage = GetPage(pageUrl);
    
                    if (facebookPage.Page != null)
                    {
                        return new FacebookPageCommentInfo
                        {
                            TotalComments = facebookPage.Share.CommentCount,
                            Comments = GetCommentsByPageId(facebookPage.Page.Id, maxComments).Comments
                        };
                    }
                    else
                    {
                        return null;
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
    
                    return null;
                }
            }
    
            /// <summary>
            /// Gets comments by Facebook's Page ID.
            /// </summary>
            /// <param name="fbPageId"></param>
            /// <param name="max"></param>
            /// <returns></returns>
            public FacebookCommentInfo GetCommentsByPageId(string fbPageId, int max = 10)
            {
                return ApiWebRequestHelper.GetJsonRequest<FacebookCommentInfo>($"https://graph.facebook.com/comments?id={fbPageId}&access_token={_accessToken}&limit={max}");
            }
    
            /// <summary>
            /// Retrieves Access Token from Facebook App.
            /// </summary>
            /// <param name="clientId"></param>
            /// <param name="clientSecret"></param>
            private void GetAccessToken(string clientId, string clientSecret)
            {
                UriBuilder builder = new UriBuilder($"https://graph.facebook.com/oauth/access_token?client_id={clientId}&client_secret={clientSecret}&grant_type=client_credentials");
    
                try
                {
                    using (WebClient client = new WebClient())
                    {
                        // Get Access Token from incoming response.
                        string data = client.DownloadString(builder.Uri);
    
                        NameValueCollection parsedQueryString = HttpUtility.ParseQueryString(data);
    
                        _accessToken = parsedQueryString["access_token"];
                    }
                }
                catch (Exception ex)
                {
                    // NOTE: Log exception here...
                }
            }
        }
    }
    

    By default, on instantiation of the FacebookLogic class, the Application ID and Secret values will be inherited from the web.config, or you can pass these values in directly using the class overload parameters.

    Out of all the methods here, we're interested in using only one: GetPageComments(). What you will notice from this method is that we cannot get the comments from one API call alone. We first have to make an extra API call to get the ID of the page. This ID is then passed to the GetCommentsByPageId() method to return all comments.

    Usage

    Comments for a page can be returned by adding the following in your code, where you will then be able to access properties to iterate through the comments:

    FacebookLogic fbl = new FacebookLogic();
    
    // Pass in the page URL and number of comments to be returned.
    var pageComments = fbl.GetPageComments("https://www.surinderbhomra.com/", 2);
    

    Whenever you call this piece of code, I would make sure you cache the results for 5-10 minutes, so you do not use up your API request limits.
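
    As a rough sketch of the kind of caching I mean, here is a minimal example using System.Runtime.Caching - the cache key and the 10-minute duration are arbitrary choices of mine:

    using System;
    using System.Runtime.Caching;
    
    string pageUrl = "https://www.surinderbhomra.com/";
    string cacheKey = $"fb-comments-{pageUrl}";
    
    // Try the cache first to avoid hitting the Graph API on every call.
    var pageComments = MemoryCache.Default.Get(cacheKey) as FacebookPageCommentInfo;
    
    if (pageComments == null)
    {
        FacebookLogic fbl = new FacebookLogic();
        pageComments = fbl.GetPageComments(pageUrl, 2);
    
        // Cache the result for 10 minutes.
        if (pageComments != null)
            MemoryCache.Default.Set(cacheKey, pageComments, DateTimeOffset.Now.AddMinutes(10));
    }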

  • Go Daddy Logo

    There's a saying that goes along the lines of: if it ain't broke, don't fix it. But as human beings, we're a very inquisitive bunch and like to delve into the dark abyss of the unknown just to see what's out there. This is exactly what I did when I decided to move hosting providers.

    I've hosted my site very happily on SoftSys Hosting for many years, ever since UltimaHosts decided to delete my site with no backup to restore. When a hosting company does that, you know it's time to move on - no matter how good they say they are. So if I have no issues with my current hosting provider, why would I ever decide to move? After all, my site uptime has been better than it has ever been and the support is top notch!

    Why Make The Move To Go Daddy Pro?

    Well, for starters, the cost of hosting was not something to be sniffed at: based on my current site usage, the deluxe package (priced at £3.99 per month), on top of a promo code, was an absolute bargain! Secondly, there's the infrastructure, which consists of real-time performance and uptime monitoring (powered by NodePing) and high-spec servers that Go Daddy describe as:

    Hosting isn’t about having the best hardware, but it sure doesn’t hurt. Our servers are powered by Intel Xeon processors for heavy lifting, blazing-fast DDR 3 memory for low latency, and high speed, redundant storage drives for blistering page load times and reliability.

    Currently, my server is based in the US, and I wanted to house my website on servers nearer to home in the UK, which unfortunately my existing shared hosting package does not offer. Even though it doesn't look it, my website is quite large, and there are benefits to having hosting in the same locale as where you live, such as faster upload and download speeds when having to update a website via FTP.

    Thirdly, I wanted to have my domain and website all under a single dashboard login, something Go Daddy seems to do well with its clean, manageable interface, along with the ability to self-administer an SSL certificate without paying the hosting company to install it on my behalf.

    Lastly, there's the 24/7 support, accessible through either email or online chat, where GoDaddy Pro boasts low waiting times and highly specialised support that can escalate problems to resolve them quickly. But from what I experienced, it was at this very point that they failed.

    Go Daddy Pro Advanced Technical Support

    The Extremely Painful Support Experience

    I kid you not, it made me want to smack myself over the head repeatedly with my MacBook Pro. My experience with GoDaddy Pro's "specialised support" personnel was a painful one. For explanation purposes, I will call the person I spoke to Gary.

    After uploading files to my GoDaddy Pro account, I carried out all the initial setup with ease and was near enough ready to go. However, I was missing one critical piece of the puzzle, which would demonstrate the low technical acumen of GoDaddy Pro support and lead to my departure from their services. The critical piece being: the ability to upload and restore a backup of an MS SQL Server database.

    Now, the SQL Server database backup in question is around 150MB, and I could not simply access the online MS SQL portal to upload it, since the URL to the portal uses a subdomain (if I remember correctly) based on your own live website domain. This was a problem for me purely because I had not pointed the domain over to Go Daddy's servers. If I had, the consequence would have been my current site going down. Not an option! I required a temporarily generated domain that would allow me to not only access the MS SQL portal but also test my site before re-pointing my domain.

    My query was quite a simple one, so I opened up GoDaddy's support chat window and Gary came online to help. After much discussion, he did not seem to understand why I'd want to do this and never managed to resolve the access issue to the MS SQL portal. All Gary did was send me links to documentation, even though I told him I had read the support literature beforehand and there was nothing relating to my original query.

    He then suggested that I log into the database through Microsoft Management Studio and go through the process of restoring my backup directly from my computer to their database. As we all know (unfortunately, Gary didn't), you cannot upload and restore a database backup directly from your own computer. The backup needs to be on the database server itself, and even after I told him I am a Microsoft .NET Developer and that this was not possible, Gary kept telling me I was wrong. This went on for quite some time and got to the point where I just couldn't be bothered anymore, and any motivation I had for moving hosting providers dissolved in that very instant.

    On one hand, I cannot fault Go Daddy's support response time. It is pretty much instant. On the other hand, the quality of support I received is questionable to say the least.

    Outcome

    I decided to stay with SoftSys Hosting based on the fact that I haven't had any issues, and any queries I've ever had were dealt with professionally and promptly. I think it's quite difficult to see how good you have it until you try something less adequate. Their prompt support exceeded my expectations when I requested an SSL certificate to be installed on a Sunday evening, and to my surprise it was all done when I got up the next morning. Now that's what I call service!

    If I can say anything positive about my GoDaddy experience, it is the money-back promise within 30 days of purchase. I had no problems getting a full refund swiftly. I spoke to a friendly customer service fellow over the phone and explained why I wanted to leave, and my refund was processed the next working day.

    I just wish I could get those 5 wasted hours of my life back...

  • My Development Overview of Kentico 9 MVC

    When Kentico offered the option to build websites using MVC, I was one of the many developers who jumped at the chance to utilise the new programming model. I've been building websites using the MVC programming model ever since it was first made available in Kentico 7, and with each version the MVC implementation just got better and better. So much so that I even built my very own website (currently in Kentico 8) in MVC.

    MVC in Kentico has always been a bit of a hybrid, in the sense that it wasn't true MVC to the core, which is to be expected when you have to accommodate the vast array of features the Kentico platform offers. Luckily for us, Kentico 9 has wholeheartedly embraced MVC, and things can only get better with subsequent versions.

    I have listed a few observations I thought would be good to write about from my initial experience of using MVC in Kentico 9 whilst working on a client project. I will be talking (very high level) about the changes from previous versions of Kentico MVC as well as the new development approaches for Kentico 9.

    1) Goodbye "Pages", Hello "Content-only Pages" Page Types

    "Content-only pages" are a new addition to Kentico 9 and are another form of Page Type whose primary job (as the name suggests) is to store content. The main differences between "Pages" and "Content-only Pages" are that content-only pages:

    • Aren't based on Page Templates.
    • Provide a simplified interface for managing content.
    • Don't have a presentation URL. URL patterns now need to be specified, which hand presentation over to the MVC site via routing.
    • Lack the ability to create many Page Aliases.

    "Content-only pages" are a requirement for developing MVC sites in Kentico 9. Overall, I actually found them quite restrictive, as useful key page properties are no longer available, such as the URLs and Navigation tabs. I really do wish these features had been left in.

    Kentico 9 MVC Missing Page Properties

    I will talk more about the removal of the URLs tab in my next point; the missing Navigation properties are easier to get around. I created a base content page called "Kentico Core Content" that contained all the fields you would normally find under the Navigation properties, and inherited this page type on all my content-only pages, such as Articles. You'll then just have to make the customisations to inherit these fields at code level. Easy!

    Kentico 9 Core Content Page Inheritance

    2) No Document Aliases

    There's no option to allow the site administrator to add multiple document aliases for a page. This alone was nearly a deal breaker for me, and I was tempted to go down either the Portal or ASPX templates route. The ability to create multiple document aliases in the URLs section is a very powerful feature, especially if you plan on adding 301 redirects.

    To get around this missing feature, you will either have to use URL rewriting at web.config level or add additional routes at controller level to carry out any specific page redirects, as sketched below.
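
    For the controller-level approach, here is a minimal sketch of what I mean - the route, controller and URL paths are all hypothetical:

    using System.Web.Mvc;
    using System.Web.Routing;
    
    // RouteConfig.cs - map a legacy URL onto an action that issues the redirect.
    public class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            routes.MapRoute(
                name: "LegacyArticleRedirect",
                url: "old-articles/{alias}",
                defaults: new { controller = "Redirects", action = "Article" });
        }
    }
    
    // RedirectsController.cs - issue a permanent (301) redirect to the new URL.
    public class RedirectsController : Controller
    {
        public ActionResult Article(string alias)
        {
            return RedirectPermanent($"/articles/{alias}");
        }
    }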

    So before deciding whether to choose the MVC approach, ask yourself if this is a pressing feature for you and your client.

    3) Separation of the CMS and Presentation Layer

    Kentico 8 stepped up the MVC integration by allowing the developer to build their sites using MVC through the CMSApp_MVC project. This created a basic form of separation at project level, which was much better suited compared to mixing key MVC components (Controllers/Models/Views) into what was a Web Forms-powered site in MVC's infancy in Kentico 7.

    Now there is complete separation between the CMS admin and presentation layer (or MVC site). Since the CMSApp_MVC approach has been made obsolete in Kentico 9, you now have the full ability to create an MVC site as you would normally in a non-Kentico web application. The only way Kentico and your MVC website can talk to one another is through a Web Farm configuration.

    Kentico 9 MVC Architecture

    I personally love this setup. My website can be kept as light as possible and still harness the power of what Kentico has to offer through conventional API calls to the Kentico library. I can tell you this for sure: the website itself performs better than ever, and no one can tell what CMS is powering the site. Good for security.

    4) Licensing and Environment Setup

    Due to the need for a Web Farm setup to allow synchronisation of content between the CMS and the MVC site, the licensing requirements have changed. Based upon how you want to set up your separate sites, Kentico provides different licensing approaches, which pretty much cover all scenarios.

    My preferred setup is to run Kentico and the MVC application under two different sites, on separate domains. Again, my reasoning comes down to catering for that additional level of security, where the management of your site is on a sub-domain and it's not so obvious where the administration area resides. In this case, two licenses will be required. For example:

    You will get a free license for the Kentico site as long as the subdomain is "admin".

    The only sad part (for me personally) is that the Kentico CMS Free Edition license does not allow for MVC websites. I really do hope that this changes at some point. I'd love to utilise full Kentico 9 MVC on future personal projects that are currently on the free edition. Otherwise they will forever be stuck on version 8.2.

    5) Page Templates

    The ability to use Page Templates alongside standard page types is still available within the Kentico interface, but you can only develop an MVC site this way by creating a (now obsolete) "CMSApp_MVC" project. Kentico 9 MVC is still fully backwards compatible with this approach.

    6) Retrieving Content In MVC Controllers

    In Kentico 8, a controller acted as the code-behind to your Page Template, where you could get all the information about the current page by calling DocumentContext.CurrentDocument. In Kentico 9, this is no longer the case, and it is recommended that content be retrieved using the auto-generated code. I generally don't go down the route of using the auto-generated code. I instead like to create my own custom methods, so I have the freedom to pull out the exact data my pages need by passing the Node Alias Path into my controller from the URL route pattern. Personal preference.
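
    As a rough illustration of what I mean, a custom retrieval method could look something like the sketch below, built on Kentico's DocumentHelper API - the "cms.article" page type and the class itself are hypothetical:

    using System.Linq;
    
    using CMS.DocumentEngine;
    
    public class ArticleLogic
    {
        // Retrieves a single article by its node alias path, e.g. "/Articles/My-Article".
        public static TreeNode GetArticle(string nodeAliasPath)
        {
            return DocumentHelper.GetDocuments("cms.article")
                                 .Path(nodeAliasPath)
                                 .OnCurrentSite()
                                 .TopN(1)
                                 .FirstOrDefault();
        }
    }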

    7) Friendly URLs Should Include A Node ID

    Kentico recommends that all page URLs consist of the NodeID and page alias, to ensure optimum search engine optimisation in the event that the page alias changes on the live site. Kentico's documentation states:

    Typically, a page alias will be part of a macro expression that makes up a URL pattern in content only page types. For example, a URL pattern can be specified like this /Articles/{%NodeID%}/{%NodeAlias%}. Where {%NodeAlias%} accesses the page alias.
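
    On the MVC side, a URL pattern like that maps onto a conventional route, where the NodeID is the stable identifier and the alias is purely cosmetic. A minimal sketch, assuming a hypothetical ArticlesController:

    using System.Web.Mvc;
    
    // Assumes a route registered as:
    //   routes.MapRoute("Articles", "Articles/{nodeId}/{nodeAlias}",
    //       new { controller = "Articles", action = "Detail" });
    public class ArticlesController : Controller
    {
        public ActionResult Detail(int nodeId, string nodeAlias)
        {
            // The NodeID drives the lookup; if the alias in the URL is stale,
            // a 301 redirect to the current alias keeps URLs canonical.
            return View();
        }
    }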

    I've gone down the route of creating a custom route constraint in my MVC project, to allow me to retrieve the TreeNode for the current document via HttpContext just from passing the Node Alias alone. I could go into more detail, but that is probably best suited to another blog post.

    Conclusion

    The MVC integration in Kentico is like a fine wine. It gets better and better every year (or in this case, every release), and the team should be congratulated for undertaking such a humongous task. Would I choose it over Portal or ASPX pages for every project? Probably not, because I can see clients expecting functionality that is not quite present in an MVC installation as of yet.

    I do like the freedom MVC gives me whilst harnessing the power of Kentico; it works great on those projects that require maximum flexibility, and the separation of code levels allows me to do that. In addition, if my site requires scaling, I can easily move it to Azure. I am very much looking forward to what Kentico has in store for future releases.

    If there is anything I have listed in my initial observations that is incorrect, please leave a comment and I will update this post.

  • I have created a helper class that allows me to consume any XML or JSON API and deserialize the response into a class object. As you can see from the code below, the GetJsonRequest() and GetXmlRequest() methods allow you to pass an unknown type as well as the URL you are getting your request from. This makes things very straightforward when you want to strongly type the data.

    using System;
    using System.IO;
    using System.Net;
    using System.Xml.Serialization;
    using Newtonsoft.Json;
    
    public class ApiWebRequestHelper
    {
        /// <summary>
        /// Gets a request from an external JSON formatted API and returns a deserialized object of data.
        /// </summary>
        /// <typeparam name="T"></typeparam>
        /// <param name="requestUrl"></param>
        /// <returns></returns>
        public static T GetJsonRequest<T>(string requestUrl)
        {
            try
            {
                WebRequest apiRequest = WebRequest.Create(requestUrl);
                HttpWebResponse apiResponse = (HttpWebResponse)apiRequest.GetResponse();
    
                if (apiResponse.StatusCode == HttpStatusCode.OK)
                {
                    string jsonOutput;
                    using (StreamReader sr = new StreamReader(apiResponse.GetResponseStream()))
                        jsonOutput = sr.ReadToEnd();
                        
                    var jsResult = JsonConvert.DeserializeObject<T>(jsonOutput);
    
                    if (jsResult != null)
                        return jsResult;
                    else
                        return default(T);
                }
                else
                {
                    return default(T);
                }
            }
            catch (Exception ex)
            {
                // Log error here.
    
                return default(T);
            }
        }
    
        /// <summary>
        /// Gets a request from an external XML formatted API and returns a deserialized object of data.
        /// </summary>
        /// <typeparam name="T"></typeparam>
        /// <param name="requestUrl"></param>
        /// <returns></returns>
        public static T GetXmlRequest<T>(string requestUrl)
        {
            try
            {
                WebRequest apiRequest = WebRequest.Create(requestUrl);
                HttpWebResponse apiResponse = (HttpWebResponse)apiRequest.GetResponse();
    
                if (apiResponse.StatusCode == HttpStatusCode.OK)
                {
                    string xmlOutput;
                    using (StreamReader sr = new StreamReader(apiResponse.GetResponseStream()))
                        xmlOutput = sr.ReadToEnd();
    
                    XmlSerializer xmlSerialize = new XmlSerializer(typeof(T));
    
                    var xmlResult = (T)xmlSerialize.Deserialize(new StringReader(xmlOutput));
    
                    if (xmlResult != null)
                        return xmlResult;
                    else
                        return default(T);
                }
                else
                {
                    return default(T);
                }
            }
            catch (Exception ex)
            {
                // Log error here.
                return default(T);
            }
        }
    }
    

    The ApiWebRequestHelper class relies on the following namespaces:

    • Newtonsoft.Json
    • System.Net
    • System.IO
    • System.Xml.Serialization

    The ApiWebRequestHelper can be used in the following way:

    // Get Json Request
    ApiWebRequestHelper.GetJsonRequest<MyCustomJsonClass>("http://www.surinderbhomra.com/api/result.json");
    
    // Get XML Request
    ApiWebRequestHelper.GetXmlRequest<MyCustomXMLClass>("http://www.surinderbhomra.com/api/result.xml");
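
    For completeness, the type parameter is just a class whose properties map onto the response fields. A hypothetical MyCustomJsonClass could be as simple as:

    using Newtonsoft.Json;
    
    // Hypothetical target type for the JSON request above.
    public class MyCustomJsonClass
    {
        [JsonProperty("id")]
        public int Id { get; set; }
    
        [JsonProperty("title")]
        public string Title { get; set; }
    }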