Blog

Blogging on programming and life in general.

  • I created a simple GatsbyJS pagination component that works in a similar way to my earlier ASP.NET Core version, allowing the user to paginate through a list using the standard "Previous" and "Next" links as well as by selecting individual page numbers.

    Like the ASP.NET Core version, I have tried to make this pagination component very portable, so there shouldn't be any issues in adding this straight into your project. Plug and play!

    import * as React from 'react'
    import { Link } from 'gatsby'
    import PropTypes from 'prop-types'
    
    // Create URL path for numeric pagination
    const getPageNumberPath = (currentIndex, basePath) => {
      if (currentIndex === 1) {
        return basePath
      }
      
      return `${basePath}/page-${(currentIndex)}`
    }
    
    // Create an object array of pagination numbers. 
    // The number of page numbers to render is set to 5.
    const getPaginationGroup = (basePath, currentPage, pageCount, noOfPagesNos = 5) => {
        let startPage = currentPage;
    
        if (startPage === 1 || startPage === 2 || pageCount < noOfPagesNos)
            startPage = 1;
        else
            startPage -= 2;
    
        let maxPage = startPage + noOfPagesNos;
    
        if (pageCount < maxPage) {
            maxPage = pageCount + 1
        }
    
        if (maxPage - startPage !== noOfPagesNos && maxPage > noOfPagesNos) {
            startPage = maxPage - noOfPagesNos;
        }
    
        let paginationInfo = [];
    
        for (let i = startPage; i < maxPage; i++) {        
            paginationInfo.push({
                number: i,
                url: getPageNumberPath(i, basePath),
                isCurrent: currentPage === i
            });
        }
    
        return paginationInfo;
    };
    
    export const Pagination = ({ pageInfo, basePath }) => {
        if (!pageInfo) 
            return null
    
        const { currentPage, pageCount } = pageInfo
    
        // Create URL path for previous and next buttons
        const prevPagePath = currentPage === 2 ? basePath : `${basePath}/page-${(currentPage - 1)}`
        const nextPagePath = `${basePath}/page-${(currentPage + 1)}`
        
        if (pageCount > 1) { 
            return (
                    <ol>
                        {currentPage > 1 ? 
                            <li>
                                <Link to={prevPagePath}>
                                    Go to previous page
                                </Link>
                            </li> : null}       
                        {getPaginationGroup(basePath, currentPage, pageCount).map((item, i) => {
                            return (
                                <li key={i}>
                                    <Link to={item.url} className={`${item.isCurrent ?  "is-current" : ""}`}>
                                        Go to page {item.number}
                                    </Link>
                                </li>
                            )
                        })}
                        {currentPage !== pageCount ?
                            <li>
                                <Link to={nextPagePath}>
                                    Go to next page
                                </Link>
                            </li> : null}
                    </ol>
            )
        }
        else {
            return null
        }
      }
    
    Pagination.propTypes = {
        pageInfo: PropTypes.object,
        basePath: PropTypes.string
    }
    
    export default Pagination;
    

    This component requires just two parameters:

    1. pageInfo: A page context object created when Gatsby generates the site pages. The object should contain two properties: the current page being viewed (currentPage) and the total number of pages (pageCount).
    2. basePath: The parent URL where the pagination component will reside. For example, if your listing page is "/customers", this will be the base path. The pagination component will then use this as a prefix to construct URLs in the format "/customers/page-2". A usage sketch of both parameters follows below.
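
    For illustration, here's how the component might be wired into a Gatsby list template. This is a minimal sketch: the template name, import path and "/blog" base path are hypothetical, and it assumes your gatsby-node.js passes currentPage and pageCount into the page context when calling createPage.

    import * as React from 'react'
    import Pagination from '../components/Pagination'
    
    // Hypothetical list template. Gatsby injects the pageContext object
    // defined in gatsby-node.js into every page it generates.
    const BlogListTemplate = ({ pageContext }) => {
      const { currentPage, pageCount } = pageContext
    
      return (
        <main>
          {/* Render the list of posts here. */}
          <Pagination pageInfo={{ currentPage, pageCount }} basePath="/blog" />
        </main>
      )
    }
    
    export default BlogListTemplate
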
  • A Mother Tongue Lost

    I've always been unashamedly honest about not having the capability of speaking my mother tongue - Punjabi. I was brought up in an Indian household where those older than I would freely converse with one another in the home-grown language handed down to them. So Punjabi isn't a completely foreign language to me and I'm capable of getting the gist of what is being said... especially Indian family gossip!

    Throughout my life, I have always been spoken to in English by the family around me and speaking fluently in Punjabi wasn't something that was ever forced upon me. In some ways, I'm grateful for this. If I look back at the mindset of my younger self, I more than likely would have rejected the notion of speaking a different language as I felt inadequate when it came to anything to do with learning. I knew from a very young age I was someone who wasn't naturally gifted in the art of learning - not for the lack of trying.

    Sometimes it's easier to reject our weaknesses instead of confronting them.

    I can't help thinking how different things could have been if my primary language of birth was Punjabi and English as secondary...

    Would I feel less ostracised from my own culture?
    Would I have taken more of a liking to Indian films and music?
    Would I be more aware of key festivals and take an active part in them?

    These thoughts have always been a part of me and it is only now, after many years of keeping them locked tight within, that they've come to the surface and stare me in the face each day. The driving force behind these thoughts is being married to someone born in India, where my cultural inadequacies that were once hidden in the dark now shine brightly for her to see.

    The time has come where change needs to happen, where I no longer take refuge in humour (a sign of weakness) over not knowing my cultural heritage or language. Why hasn't this cultural yearning come sooner? Well, I guess you never yearn for the thing you've never felt was lost.

    Through my wife's influence, I think my "Indian meter" is increasing day by day and I have been introduced to things that are new to me, which can be both interesting and scary at the same time. I've even been watching a lot of Indian films and, to my surprise, rather enjoyed them.

    I think it's through her that I worry less about not knowing certain aspects of my culture, whereas in the past I only ever relied on my Mum to keep me in the loop. I've previously said the following in my Preserving Digital Memories post:

    I came to the sudden realisation as we all sat down to my mum belting out her prayers that this will not last forever and it dawned on me that these are truly special moments. Being an Indian who is culturally inept in all the senses and cannot speak his native tongue, I would be comforted knowing that I'll have photos and video to look back on many years to come.

    I am comforted knowing that the little Indian culture I have left will (as a minimum) be retained, if not grow through both their influences.

    The path that lies ahead won’t be easy, especially the part where I have to grasp the concepts of learning Punjabi as well as taking more of an invested effort in special days of the year and the reasons behind them.

    I can take comfort in knowing my journey has already begun and have learnt a lot from watching Indian TV shows alone. Even though the TV shows are in Hindi, it’s a start. For some reason, I seem to have the ability to remember bad words and phrases with such ease - to my wife's displeasure.

    At the core, I will always be British but somewhere along the way, I will always be an Indian at heart too.

  • When building React applications, my components would consist of a mixture of React Function and Class components. As much as I love to use React Function components due to their simple, lightweight and concise code construction, I had a misconception they were limited in features.

    React Function components are generally classed as "stateless", where I would go as far as sending props (if required) and returning the rendered output in HTML. So whenever there was a need for some form of interaction where state was required, a Class component would be used.

    It was only recently that I was made aware of React Hooks, which allow React Function components to have state. Let's take a look at a simple example where we are increasing and decreasing a number.

    import React, { useState } from 'react';
    
    const CounterApp = () => {
      const [state, setState] = useState({
        count: 0
      });  
    
      // Note: unlike this.setState in a Class component, the setter returned
      // by useState replaces the state object rather than merging it. That's
      // fine here, as "count" is the only property.
      const incrementCount = () => {
        setState({
          count: state.count + 1,
        });
      }
    
      const decrementCount = () => {
        setState({
          count: state.count - 1,
        });
      }
    
      return (
        <div>
          <h1>{state.count}</h1>
    
          <button onClick={incrementCount}>Increment</button>
          <button onClick={decrementCount}>Decrement</button>
        </div>
      );
    };
    
    export default CounterApp;
    

    In the code above we have accomplished two things:

    1. Storing our counter in state.
    2. Adding click handlers to increment and decrement the count.

    The useState() hook accepts an initial state in the same form we are accustomed to using in a Class component. The similarity doesn't stop there: you'll also notice the same approach when setting and getting a state property.

    Multiple states can also be declared, and the array destructuring syntax lets us give different names to the state variables we declare by calling useState. Really cool!

    const [age, setAge] = useState(42);
    const [fruit, setFruit] = useState('banana');
    const [todos, setTodos] = useState([{ text: 'Learn Hooks' }]);
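
    In fact, with separate state variables, the CounterApp example above could drop the state object entirely. Here's a minimal sketch of that refactor, assuming a plain number is all we need to track:

    const CounterApp = () => {
      // A single numeric state value - no wrapping object required.
      const [count, setCount] = useState(0);
    
      // Functional updates derive the next value from the previous one.
      const incrementCount = () => setCount(prevCount => prevCount + 1);
      const decrementCount = () => setCount(prevCount => prevCount - 1);
    
      return (
        <div>
          <h1>{count}</h1>
    
          <button onClick={incrementCount}>Increment</button>
          <button onClick={decrementCount}>Decrement</button>
        </div>
      );
    };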
    

    Even though this post has concentrated on state, I've found React Function components offer features similar to Class components. This is how I will be building my components going forward.

  • I've owned a NAS in various forms for many years - something that started as re-using an old PC attached to a home network and progressed to purchasing a mini NAS box with RAID support, all based on the Windows Server operating system. The main focus of the early iterations was purely to serve files and media.

    In 2015, I decided to take a gamble and invest in a Synology system after hearing rave reviews from users. Opting for the DS416play seemed like a safe option from a features, pricing and (most importantly) expandability point of view.

    After the move to Synology, everything I had beforehand felt so archaic and over-engineered. To this very day, my DS416play is chugging along housing around 4.5TB of data consisting of a combination of documents, videos, pictures and laptop image backups. All four hard drive slots have been filled providing a total of 8TB of storage space.

    Being a piece of hardware that is on 24/7 and acts as an extension of my laptop storage (via Synology Drive and MountainDuck), I'm pleased to see that after 7 years of constant use it's still ticking along. But I feel this year I'm due an upgrade as only recently the hardware has been playing up and starting to feel a little sluggish, which is only resolved via a restart. This is to be expected from a server running on an Intel Celeron processor and 1GB of RAM.

    So is having your own dedicated NAS a worthwhile investment? - Yes.

    Some of the positive and negative points listed below revolve around the ownership of a Synology NAS, as this is the only type of NAS I've had the most experience with.

    Positives

    You're not a slave to a storage provider's terms, where they can change what they offer or their pricing structure. Just take a look back at what Google did with their photo service.

    Easy to set up, where you require little knowledge to get up and running (based on using a Synology). I'm no network wizard by any means, and Synology allows me to get set up and use all features with basic settings initially, then tinker with the more advanced areas if necessary.

    Access for many users without any additional costs. I've created accounts for my parents, sister and wife so they can access all that the server has to offer, such as photos, movies and documents. This comes at no additional cost and is very easy to give access via Synology apps.

    Cost of ownership will decrease over time after your initial setup costs (detailed in the Negatives section). Providing you don't run into any hardware issues - which is unlikely, as my own NAS has been issue-free since purchase while running 24/7 - no reinvestment is required. The only area where you may need to invest is in additional drives for more storage allocation, but the cost of these is nominal.

    Always accessible 24/7, locally and remotely, to myself and all the users I have set up. There isn't a day that goes by where my NAS isn't used. Most of the time I use the additional NAS storage as an extension to my laptop, which has very little hard-drive space, through the MountainDuck application.

    Solid Eco-system consisting of applications that you need and which are accessible on mobile and tablet devices. This is where Synology is steps ahead of its competitors as they invest in software as well as hardware. The core applications are Synology Drive, DS Video and Synology Photos.

    Backups can be easily configured both on-site and off-site. This can be done through RAID configuration and using the CloudSync application to back up to one of the many well-known cloud providers.

    Negatives

    Initial setup costs can be quite high at the outset, as you not only need to purchase an adequate NAS server but also multiple hard-disk drives that will fit into your long-term expansion plan. You need to ask yourself:

    • How much data do you currently need to store?
    • What's the estimated rate of storage consumption over the next few years?
    • Does the NAS have enough hard-disk drive slots for your needs?

    If I could go back and start my Synology NAS journey again, I'd invest more in purchasing larger hard disks.

    Synology Drive On-demand Sync is a non-existent feature on Mac OS, which makes it difficult to store large files without taking up your own workstation disk space. I and many other Mac OS users have been waiting very patiently for this key feature that is already fully functioning on Windows OS. MountainDuck is a workaround but annoyingly takes you out of the otherwise solid Synology eco-system.

    Repairability can be somewhat restrictive depending on the model purchased. The majority of the components, such as the CPU, RAM and PSU, are soldered directly onto the motherboard, and if one piece were to fail you are left with an oversized paperweight. It is only the more expensive models that allow you to replace/upgrade the RAM.

    Conclusion

    In my view, a NAS is a very worthy investment even by today's standards. You are spoilt for choice - there is a NAS for everyone based on your needs, whether you're looking for something basic or advanced. The amount of choice now available proves their popularity and shows that users are not ignoring them.

    If you want true freedom and ownership over your data and don't mind a little bit of setup to get there, a NAS would be right up your street. You'll find even more uses if the NAS you've purchased has well-developed applications that might prevent you from having to purchase another subscription to an online service, helping you achieve a quicker return on investment from the original cost of the hardware. For example, through Synology I've found the following replacements for common services:

    • Google Photos > Synology Photos
    • Google Drive/OneDrive > Synology Drive
    • Evernote > Note Station
    • Nest Security Camera > Surveillance Station

    I for one am fully invested and looking out for my next upgrade, depending on what happens first: the hardware dies, or all storage capacity is used up and more drive slots are required. The Synology DS1621+ seems to be right up my street.

  • Decision To Cross-post To Medium

    As much as I'd like to retain content on a single web presence, I find there are some posts that don't get much traction. The majority of my posts contain technical content that Google seems to pick up very easily due to relatable key phrases and words that developers in my industry search for.

    However, posts that are less technical in nature and (in my opinion) more thought-provoking lack page views due to the organic nature of how they're written. I believe these posts are more suited to being shared on Medium.

    I get more satisfaction in the posts that speak from my experiences and thought processes, most of which you will find in the Random Thoughts and Surinder's Log categories.

    I've already shared a handful of posts on Medium in the past - my last post was published in October 2018. I still plan on making this site the first place where content is published and then cross-post to Medium as I see fit. Check out my "Technical Blogging: Where Should I Be Writing?" post that details thoughts on the very subject of cross-posting.

    Feel free to check out my Medium profile here.

  • Year In Review - 2021

    I haven't met any of the goals I set myself in my last year in review. But as life springs up random surprises, you see yourself shifting to a moment in time you never thought was conceivable.

    If someone were to tell me last year that 2021 would be the year I'd find someone and finally settle down, I'd say you've been drinking too much of the finest Rioja.

    When such a shift in one's life happens, this takes utmost priority and as a result, my blogging has taken a backseat. After April things have been a little sporadic - a time when the new stage in my life kicked up a notch.

    Even though blogging this year hasn't been a priority, it's not a result of a lack of learning. I've just been focusing on learning some new life skills during this new stage in my life as well as keeping on top of new technological advances within a work environment on a daily basis.

    2021 In Words/Phrases

    Coronavirus, Covid-19, Omicron, Hubspot, Wedding, No Time To Die, Money Heist, Tailwind CSS, Prismic, Gatsby Prismic, Beard, Azure, Back The Gym, Blenheim Light Show, Camping, Abingdon Fireworks, New Family/Friends

    My Site

    Believe it or not, I have been working on an updated version by completely starting from scratch. This involved updating to the latest Gatsby framework and redoing the front-end. I came across a very good tried and tested CSS framework called Tailwind CSS.

    Tailwind is a utility CSS framework that allows for a quick turnaround in building blocks of markup to create bespoke designs, based on a library of flexible predefined CSS classes. The main benefits I've found so far are that it has a surprisingly minimal footprint when building for production, and many sites offer pre-developed HTML components you can customise and implement on your site. Only time will tell whether this is the correct approach.
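
    To give a flavour of what Tailwind markup looks like, here's a trivial illustrative snippet (not taken from my build) using a handful of its utility classes:

    <button class="px-4 py-2 rounded-lg bg-blue-600 text-white hover:bg-blue-700">
      Read more
    </button>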

    Beard Gains

    Growing some facial hair wasn't an outcome of living like a hermit during these Covid times, but a requirement from my wife. My profile picture is due for an update to reflect such a change in appearance. Even I don't recognise myself sometimes.

    Statistics

    When it comes to site statistics, I tend to lower my expectations so I'm not setting myself up for failure when checking Google Analytics. I wasn't expecting much from this year's stats due to my lack of contribution, but suffice to say I haven't fared too badly.

    2020/2021 Comparison:

    • Users: +41.09%
    • Page Views: +45.56%
    • New Users: +42.03%
    • Bounce Rate: -3.06%
    • Search Console Total Clicks: +254%
    • Search Console Impressions: +295%
    • Search Console Page Position: -8.3%

    I'm both surprised and relieved that existing content is still getting traction, resulting in more page views and users. The bounce rate has decreased a further 3.06% over last year. Out of all the statistics listed above, I believe the Google Page Position is the most important, and I'm quite disheartened that I've slipped in this area.

    To my surprise, the site search implemented earlier this year using Algolia was getting used by visitors. This was very unexpected, as the primary reason I even added a site search was for my own use.

    One can only imagine how things could have been if I managed to be more consistent in the number of posts published over the year.

    Things To Look Into In 2022

    NFT and Crypto

    The main thing I want to look into further is the realm of Cryptocurrency and NFTs. I've been following the likes of Dan Petty and Paul Stamatiou on Twitter, which has opened my eyes to how things have moved on since I last took a brief look at this space.

    Holiday

    I haven’t been on a holiday since my trip to the Maldives in 2019 and I’m well overdue on another one - preferably abroad if I feel safe enough to do so and COVID allowing.

    Lego Ford Mustang

    I purchased a Lego Creator Series Ford Mustang near the end of last year as an early Christmas present to myself and I’m still yet to complete it. I’ve gone as far as building the underlying chassis and suspension. It doesn’t even resemble a car yet. How embarrassing. :-)

    On completion, it’ll make a fine centre-piece in my office.

    Azure

    Ever since I worked on a project at the start of the year dealing with Azure Functions, deployment slots and automation, I've been more interested in the services Azure has to offer. I've always stuck to the hosting-related service setup and search indexing, and have ventured very little elsewhere. I'd like to keep researching in this area, especially in cognitive services.

    Git On The Command-Line

    Even though I've been using Git for as long as I've been working as a developer, it has always been via a GUI such as TortoiseGit or SourceTree. When it comes to interacting with a Git repo using the command-line, I'm not as experienced as I'd like to be with the more complex commands. In the past, when I have used complex commands without a GUI, it's actually been far more straightforward than working from the comfort of a GUI, where I naturally find myself when interacting with a repository.

    Twitter Bot

    For some reason, I have a huge interest in creating a Twitter bot that will carry out some form of functionality based on the contents of a tweet. At this moment in time, I have no idea what the Twitter bot will do. Once I have thought of an endearing task it can perform, the development will start.

    Final Thoughts

    If you thought 2021 was bad enough with the continuation of the sequel no one wanted (Covid part 2), we are just days away from entering the year 2022 - the year in which the grim events of the fictitious film Soylent Green took place.

    Soylent Green Poster (1973)

    Luckily for me, 2021 has been a productive year filled with personal and career-based accomplishments and hoping for this to continue into the new year. But I do feel it's time I pushed myself further.

    I’d like to venture more into technologies that don’t form part of my existing day-to-day coding language or framework. This may make for more interesting blog posts. But to do this, I need to focus more over the next year and allocate time for research and writing.

  • I decided to write this post to primarily act as a reminder to myself when dealing with programmatically creating content pages in Umbraco and expanding upon my previous post on setting a dropdownlist in code. I have been working on a piece of functionality where I needed to develop an import task to pull in content from a different CMS platform to Umbraco that encompassed the use of different field-types, such as:

    • Textbox
    • Dropdownlist
    • Media Picker
    • Content Picker

    It might just be me, but I find it difficult to find solutions to the Umbraco-related problems I sometimes face. This could be due to search engine results referencing forum posts for older versions of Umbraco that are no longer compatible with the version I'm working in (version 8).

    When storing data in the field types listed above, I encountered issues when trying to store values in all field types except “Textbox”. The other fields required either some form of JSON structure or a Udi to be passed.

    Code

    My code contains three methods:

    1. SetPost - to create a new blog post, or update an existing blog post if one already exists.
    2. GetAuthorIdByName - uses Umbraco Examine Search Index to get back an Author document and return the Udi.
    3. GetUmbracoMedia - uses the internal Examine Search Index to return details of a file in a form that will be acceptable to store within a Media Picker content field.

    The SetPost method consists of a combination of fields required by my Blog Post document, the primary ones being:

    • Blog Post Type (blogPostType) - Dropdownlist
    • Blog Post Author (blogPostAuthor) - Content Picker
    • Image (image) - Media Picker
    • Categories (blogPostCategories) - Tags

    /// <summary>
    /// Creates or updates an existing blog post.
    /// </summary>
    /// <param name="title"></param>
    /// <param name="summary"></param>
    /// <param name="postDate"></param>
    /// <param name="type"></param>
    /// <param name="imageUrl"></param>
    /// <param name="body"></param>
    /// <param name="categories"></param>
    /// <param name="authorId"></param>
    /// <returns></returns>
    private static PublishResult SetPost(string title, 
                                        string summary, 
                                        DateTime postDate, 
                                        string type, 
                                        string imageUrl, 
                                        string body, 
                                        List<string> categories = null, 
                                        string authorId = "")
    {
        PublishResult publishResult = null;
        IContentService contentService = Current.Services.ContentService;
        ISearcher searchIndex = ExamineUtility.GetIndex().GetSearcher();
    
        // Get blog post by its page title.
        ISearchResult blogPostSearchItem = searchIndex.CreateQuery()
                                        .Field("pageTitle", title.TrimEnd())
                                        .And()
                                        .NodeTypeAlias("blogPost")
                                        .Execute(1)
                                        .FirstOrDefault();
    
        bool existingBlogPost = blogPostSearchItem != null;
    
        // Get the parent section where the new blog post will reside, in this case Blog Index.
        IContent blogIndex = contentService.GetPagedChildren(1099, 0, 1, out _).FirstOrDefault();
    
        if (blogIndex != null)
        {
            IContent blogPostContent;
    
            // If blog post doesn't already exist, then create a new node, otherwise retrieve existing node by ID to update.
            if (!existingBlogPost)
                blogPostContent = contentService.CreateAndSave(title.TrimEnd(), blogIndex.Id, "blogPost");
            else
                blogPostContent = contentService.GetById(int.Parse(blogPostSearchItem.Id));
    
            if (!string.IsNullOrEmpty(title))
                blogPostContent.SetValue("pageTitle", title.TrimEnd());
    
            if (!string.IsNullOrEmpty(summary))
                blogPostContent.SetValue("pageSummary", summary);
    
            if (!string.IsNullOrEmpty(body))
                blogPostContent.SetValue("body", body);
                    
            if (postDate != DateTime.MinValue)
                blogPostContent.SetValue("blogPostDate", postDate);
    
            // Set Dropdownlist field.
            if (!string.IsNullOrEmpty(type))
                blogPostContent.SetValue("blogPostType", JsonConvert.SerializeObject(new[] { type }));
    
        // Set Content-picker field by passing a "Udi". Reference to an Author page. 
            if (authorId != string.Empty)
                blogPostContent.SetValue("blogPostAuthor", authorId);
    
            // Set Media-picker field.
            if (imageUrl != string.Empty)
            {
                string umbracoMedia = GetUmbracoMedia(imageUrl);
    
                // A stringified JSON object is required to set a Media-picker field.
                if (umbracoMedia != string.Empty)
                    blogPostContent.SetValue("image",  umbracoMedia);
            }    
    
            // Set tags.
            if (categories?.Count > 0)
                blogPostContent.AssignTags("blogPostCategories", categories);
    
            publishResult = contentService.SaveAndPublish(blogPostContent);
        }
    
        return publishResult;
    }
    
    /// <summary>
    /// Gets UDI of an author by fullname.
    /// </summary>
    /// <param name="fullName"></param>
    /// <returns></returns>
    private static string GetAuthorIdByName(string fullName)
    {
        if (!string.IsNullOrEmpty(fullName))
        {
            ISearcher searchIndex = ExamineUtility.GetIndex().GetSearcher();
    
            ISearchResult authorSearchItem = searchIndex.CreateQuery()
                                            .Field("nodeName", fullName)
                                            .And()
                                            .NodeTypeAlias("author")
                                            .Execute(1)
                                            .FirstOrDefault();
    
            if (authorSearchItem != null)
            {
                UmbracoHelper umbracoHelper = Umbraco.Web.Composing.Current.UmbracoHelper;
                return Udi.Create(Constants.UdiEntityType.Document, umbracoHelper.Content(authorSearchItem.Id).Key).ToString();
            }
        }
    
        return string.Empty;
    }
    
    /// <summary>
    /// Gets the umbracoFile of a media item by filename.
    /// </summary>
    /// <param name="fileName"></param>
    /// <returns></returns>
    private static string GetUmbracoMedia(string fileName)
    {
        if (!string.IsNullOrEmpty(fileName))
        {
            ISearcher searchIndex = ExamineUtility.GetIndex("InternalIndex").GetSearcher();
    
            ISearchResult imageSearchItem = searchIndex.CreateQuery()
                                            .Field("umbracoFileSrc", fileName)
                                            .Execute(1)
                                            .FirstOrDefault();
    
            if (imageSearchItem != null)
            {
                List<Dictionary<string, string>> imageData = new List<Dictionary<string, string>> {
                        new Dictionary<string, string>() {
                            { "key", Guid.NewGuid().ToString() },
                            { "mediaKey", imageSearchItem.AllValues["__Key"].FirstOrDefault().ToString() },
                            { "crops", null },
                            { "focalPoint", null }
                    }
                };
    
                return JsonConvert.SerializeObject(imageData);
            }
        }
    
        return string.Empty;
    }
    

    Usage Example - Iterating Through A Dataset

    In this example, I'm iterating through a dataset of posts and passing the field values to each parameter of the SetPost method.

    ...
    ...
    ...
    SqlDataReader reader = sqlCmd.ExecuteReader();
    
    if (reader.HasRows)
    {
        while (reader.Read())
        {
            SetPost(reader["BlogPostTitle"].ToString(),
                    reader["BlogPostSummary"].ToString(),
                    DateTime.Parse(reader["BlogPostDate"].ToString()),
                    reader["BlogPostType"].ToString(),
                    reader["BlogPostImage"].ToString(),
                    reader["BlogPostBody"].ToString(),
                    new List<string>
                    {
                            "Category 1",
                            "Category 2",
                            "Category 3"
                    },
                    GetAuthorIdByName(reader["BlogAuthorName"].ToString()));
        }
    }
    ...
    ...
    ...
    

    Use of Umbraco Examine Search

    One thing to notice is that when I'm retrieving the parent page where the new page will reside, or checking for a page or media file, the Umbraco Examine Search Index is used. I find querying the search index is the most efficient way to return data without consistently hitting the database - ideal when carrying out a repetitive task like an import.

    In my code samples, I'm using a custom ExamineUtility class to retrieve the search index in a more condensed and tidy manner:

    public class ExamineUtility
    {
        /// <summary>
        /// Get Examine search index.
        /// </summary>
        /// <param name="defaultIndexName"></param>
        /// <returns></returns>
        public static IIndex GetIndex(string defaultIndexName = "ExternalIndex")
        {
            if (!ExamineManager.Instance.TryGetIndex(defaultIndexName, out IIndex index) || !(index is IUmbracoIndex))
                throw new Exception("Examine Search Index could not be found.");
    
            return index;
        }
    }
    

    Conclusion

    Hopefully, the code I have demonstrated in this post will give a clearer idea of how to programmatically work with content pages using a combination of different field types. For further code samples on working with different field types, take a look at the "Built-in Umbraco Property Editors" documentation.

  • After working on all things Hubspot over the last year, from building and configuring site instances to developing API integrations, I seem to have missed out on CMS development-related projects. Most recently, I have been involved in an Umbraco site build where pages needed to be dynamically created via an external API.

    Programmatically creating CMS pages is quite a straightforward job, as all one needs to do is the following (a minimal sketch follows the list):

    • Select the parent node your dynamically created page needs to reside under
    • Check the parent exists
    • Create page and set field values
    • Publish
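
    Pieced together, those steps look something like the sketch below, using Umbraco's ContentService. The parent node ID, document type alias and field alias are all hypothetical:

    IContentService contentService = Current.Services.ContentService;
    
    // 1. Select the parent node the new page will reside under.
    IContent parent = contentService.GetById(1050);
    
    // 2. Check the parent exists before creating the child page.
    if (parent != null)
    {
        // 3. Create the page and set its field values.
        IContent myNewPage = contentService.Create("My New Page", parent.Id, "contentPage");
        myNewPage.SetValue("pageTitle", "Hello World");
    
        // 4. Publish.
        contentService.SaveAndPublish(myNewPage);
    }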

    From past experience, when passing values to page fields it's been as simple as passing a single value based on the field type. For example:

    myNewPage.SetValue("pageTitle", "Hello World");
    myNewPage.SetValue("bodyContent", "<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>");
    myNewPage.SetValue("hasExpired", true);
    myNewPage.SetValue("price", 9.99M);
    

    Umbraco has something completely different in mind if you plan on setting the value of a "Dropdown" field type. Simply sending a single value will not work, even though it is accepted at runtime. You will need to send the value as a JSON array:

    string type = "Permanent";
    myNewPage.SetValue("jobType", JsonConvert.SerializeObject(new[] { type }));
    

    This approach is required regardless of whether you've set the "Dropdown" field type in Umbraco as single or multiple choice.

  • Ever since a Ucommerce site built in Umbraco went live, the uptime monitoring system would send a notification every day or so reporting that the site had gone down.

    There wasn't any reason for this to happen, as the site in question wasn't getting enough visitors a day to warrant such a service disruption - it was a new online presence. In addition, the hosting architecture could handle such an influx of visitors with ease should that scenario arise.

    Something random was happening at the application level causing the site to time out. Naturally, the first place to check when there are any application problems is the event log. No errors were being flagged - just pages and pages of "Information" level entries, which are of little use. This didn't seem right, so I decided to check where the log files are stored within the '/App_Data/Logs' directory, and just as well I did. I was greeted by many log files totalling over 3GB in size, with the current day's log measuring 587MB and increasing.

    Appending new lines of text to an already large text file is bound to have an impact on site performance, especially when it's happening at such regular intervals. I needed to streamline what gets logged, as I have no interest in "Information" log entries. To do this, the following setting in the serilog.config file found in the "/config" directory needed to be updated:

    <add key="serilog:write-to:File.restrictedToMinimumLevel" value="Warning" />
    

    Previously, this setting was set to "Debug", which logs all site activity. Once it was updated, the socket timeout issue was resolved.

  • Sometimes the simplest piece of development can be the most rewarding, and I think my Azure Function that checks for broken links on a nightly basis is one of those things. The Azure Function reads a list of links from a database table and checks each one to determine whether a 200 response is returned. If not, the link is logged and sent to a user by email using the SendGrid API.

    Scenario

    I was working on a project that takes a list of products from an API and stores them in a Hubspot HubDB table. This table contained all product information and the expected URL to a page. All the CMS pages had to be created manually and assigned the URL as stored in the table, which in turn would allow the page to be populated with product data.

    As you can expect, the disadvantage of manually created pages is that a URL change in the HubDB table will result in a broken page. Not ideal! In this case, the likelihood of a URL being changed is rare. All I needed was a checker to ensure I was made aware on the odd occasion where a link to the product page could be broken.

    I won't go into any further detail but rest assured, there was an entirely legitimate reason for this approach in the grand scheme of the project.

    Azure Function

    I have modified my original code purely for simplification.

    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;
    using SendGrid;
    using SendGrid.Helpers.Mail;
    
    namespace ProductsSyncApp
    {
      public static class ProductLinkChecker
      {
        [FunctionName("ProductLinkChecker")]
        public static void Run([TimerTrigger("%ProductLinkCheckerCronTime%"
          #if DEBUG
          , RunOnStartup=true
          #endif
          )]TimerInfo myTimer, ILogger log)
        {
          log.LogInformation($"Product Link Checker started at: {DateTime.Now:G}");
    
          #region Iterate through all product links and output the ones that return 404.
    
          List<string> brokenProductLinks = new List<string>();
    
          foreach (string link in GetProductLinks())
          {
            if (!IsEndpointAvailable(link))
              brokenProductLinks.Add(link);
          }
    
          #endregion
    
          #region Send Email
    
          if (brokenProductLinks.Count > 0)
            SendEmail(Environment.GetEnvironmentVariable("Sendgrid.FromEmailAddress"), Environment.GetEnvironmentVariable("Sendgrid.ToAddress"), "www.contoso.com - Broken Link Report", EmailBody(brokenProductLinks));
    
          #endregion
    
          log.LogInformation($"Product Link Checker ended at: {DateTime.Now:G}");
        }
    
        /// <summary>
        /// Get a list of product links.
        /// This would come from a datasource somewhere containing a list of correctly expected URLs.
        /// </summary>
        /// <returns></returns>
        private static List<string> GetProductLinks()
        {
          return new List<string>
          {
            "https://www.contoso.com/product/brokenlink1",
            "https://www.contoso.com/product/brokenlink2",
            "https://www.contoso.com/product/brokenlink3",
          };
        }
    
        /// <summary>
        /// Checks if a URL endpoint is available.
        /// </summary>
        /// <param name="url"></param>
        /// <returns></returns>
        private static bool IsEndpointAvailable(string url)
        {
          try
          {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    
            using HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    
            if (response.StatusCode == HttpStatusCode.OK)
              return true;
    
            return false;
          }
          catch
          {
            return false;
          }
        }
    
        /// <summary>
        /// Create the email body.
        /// </summary>
        /// <param name="brokenLinks"></param>
        /// <returns></returns>
        private static string EmailBody(List<string> brokenLinks)
        {
          StringBuilder body = new StringBuilder();
    
          body.Append("<p>To whom it may concern,</p>");
          body.Append("<p>The following product URL's are broken:");
    
          body.Append("<ul>");
    
          foreach (string link in brokenLinks)
            body.Append($"<li>{link}</li>");
    
          body.Append("</ul>");
    
          body.Append("<p>Many thanks.</p>");
    
          return body.ToString();
        }
    
        /// <summary>
        /// Send email through SendGrid.
        /// </summary>
        /// <param name="fromAddress"></param>
        /// <param name="toAddress"></param>
        /// <param name="subject"></param>
        /// <param name="body"></param>
        /// <returns></returns>
        private static Response SendEmail(string fromAddress, string toAddress, string subject, string body)
        {
          SendGridClient client = new SendGridClient(Environment.GetEnvironmentVariable("SendGrid.ApiKey"));
    
          SendGridMessage sendGridMessage = new SendGridMessage
          {
            From = new EmailAddress(fromAddress, "Product Link Report"),
          };
    
          sendGridMessage.AddTo(toAddress);
          sendGridMessage.SetSubject(subject);
          sendGridMessage.AddContent("text/html", body);
    
          return Task.Run(() => client.SendEmailAsync(sendGridMessage)).Result;
        }
      }
    }
    

    Here's a rundown on what is happening:

    1. A list of links is returned from the GetProductLinks() method. This will contain a list of correct links that should be accessible on the website.
    2. Loop through all the links and carry out a check against the IsEndpointAvailable() method. This method carries out a simple check to see if the link returns a 200 response. If not, it'll be marked as broken.
    3. Add any link marked as broken to the brokenProductLinks collection.
    4. If there are broken links, send an email handled by SendGrid.

    As you can see, the code itself is very simple and the only thing that needs to be customised for your use is the GetProductLinks method, which will need to output a list of expected links that a site should contain for cross-referencing - see the sketch below.
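
    Since the links are described as coming from a database table, GetProductLinks could be fleshed out along these lines. This is a minimal sketch: the ProductLinks table, Url column and Database.ConnectionString setting are all hypothetical, and it requires a reference to System.Data.SqlClient.

    // Hypothetical implementation reading the expected links from a database table.
    private static List<string> GetProductLinks()
    {
      List<string> links = new List<string>();
    
      using (SqlConnection connection = new SqlConnection(Environment.GetEnvironmentVariable("Database.ConnectionString")))
      {
        connection.Open();
    
        using (SqlCommand command = new SqlCommand("SELECT Url FROM ProductLinks", connection))
        using (SqlDataReader reader = command.ExecuteReader())
        {
          // Collect each stored URL for cross-referencing.
          while (reader.Read())
            links.Add(reader["Url"].ToString());
        }
      }
    
      return links;
    }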

    Email Send Out

    When using Azure Functions, you can't use the standard .NET approach to send emails; Microsoft recommends an authenticated SMTP relay service to reduce the likelihood of email providers rejecting the message. More insight into this can be found in the following StackOverflow post - Not able to connect to smtp from Azure Cloud Service.

    When it comes to SMTP relay services, SendGrid comes up favourably, and being someone who uses it in their current workplace, it was my natural inclination to make use of it in my Azure Function. Plus, they've made things easy by providing a NuGet package that allows direct access to their Web API v3 endpoints.