Blog

Blogging on programming and life in general.

  • I've owned a NAS in various forms for many years - starting with an old PC attached to the home network and re-used as storage, and later a mini NAS box with RAID support, all running a Windows Server operating system. The main focus of these early iterations was purely to serve files and media.

    In 2015, I decided to take a gamble and invest in a Synology system after hearing rave reviews from users. Opting for the DS416play seemed like a safe option from a features, pricing and (most importantly) expandability point of view.

    After the move to Synology, everything I had beforehand felt so archaic and over-engineered. To this very day, my DS416play is chugging along, housing around 4.5TB of data consisting of documents, videos, pictures and laptop image backups. All four hard-drive slots have been filled, providing a total of 8TB of storage space.

    Being a piece of hardware that is on 24/7 and acts as an extension of my laptop storage (via Synology Drive and MountainDuck), I'm pleased to see that after 7 years of constant use it's still ticking along. But I feel I'm due an upgrade this year, as the hardware has recently been playing up and feeling a little sluggish, which only a restart resolves. This is to be expected from a server running on an Intel Celeron processor and 1GB of RAM.

    So is having your own dedicated NAS a worthwhile investment? - Yes.

    Some of the positive and negative points listed below revolve around the ownership of a Synology NAS, as this is the type of NAS I've had the most experience with.

    Positives

    You're not a slave to a storage provider's terms, where they can change what they offer or their pricing structure. Just look back at what Google did with their photo service.

    Easy to set up, requiring little knowledge to get up and running (based on using a Synology). I'm no network wizard by any means, and Synology allowed me to get started using basic settings and tinker with the more advanced areas if necessary.

    Access for many users without any additional costs. I've created accounts for my parents, sister and wife so they can access all that the server has to offer, such as photos, movies and documents. This comes at no additional cost, and access is very easy to grant via the Synology apps.

    Cost of ownership will decrease over time after your initial setup costs (detailed in the negatives section). Providing you don't have any hardware issues - which is unlikely, as my own NAS has been issue-free since purchase while running 24/7 - no reinvestment is required. The only area where you may need to invest is in additional drives for more storage allocation, but the cost of these is nominal.

    Always accessible 24/7, locally and remotely, to myself and all the users I have set up. There isn't a day that goes by where my NAS isn't used. Most of the time I use the additional NAS storage as an extension of my laptop, which has very little hard-drive space, through the MountainDuck application.

    Solid ecosystem consisting of the applications you need, accessible on mobile and tablet devices. This is where Synology is steps ahead of its competitors, as they invest in software as well as hardware. The core applications are Synology Drive, DS Video and Synology Photos.

    Backups can be easily configured both on-site and off-site, through RAID configuration and by using the CloudSync application to back up to one of the many well-known cloud providers.

    Negatives

    Initial setup costs can be quite high at the outset, as you not only need to purchase an adequate NAS server but also multiple hard-disk drives that fit into your long-term expansion plan. You need to ask yourself:

    • How much data do you currently need to store?
    • What's the estimated rate of storage consumption over the next few years?
    • Does the NAS have enough hard-disk drive slots for your needs?

    If I could go back and start my Synology NAS journey again, I'd invest more in purchasing larger hard disks.

    Synology Drive On-demand Sync is a non-existent feature on Mac OS, which makes it difficult to store large files without taking up your own workstation's disk space. I, like many other Mac OS users, have been waiting very patiently for this key feature that is already fully functional on Windows. MountainDuck is a workaround but annoyingly takes you out of the otherwise solid Synology ecosystem.

    Repairability can be somewhat restrictive depending on the model purchased. The majority of the components, such as the CPU, RAM and PSU, are soldered directly onto the motherboard, and if one piece were to fail, you are left with an oversized paperweight. Only the more expensive models allow you to replace/upgrade the RAM.

    Conclusion

    In my view, a NAS is a very worthy investment even by today's standards. You are spoilt for choice - there is a NAS for everyone, whether you're looking for something basic or advanced. The amount of choice now available proves just how popular these devices have become among users.

    If you want true freedom and ownership over your data and don't mind a little bit of setup to get there, a NAS would be right up your street. You'll find even more uses if the NAS you've purchased offers applications that save you from having to purchase another subscription to an online service, helping achieve a quicker return on investment from the original cost of the hardware. For example, through Synology I've found the following replacements for common services:

    • Google Photos > Synology Photos
    • Google Drive/OneDrive > Synology Drive
    • Evernote > Note Station
    • Nest Security Camera > Surveillance Station

    I for one am fully invested and looking towards my next upgrade, depending on what happens first: the hardware dies, or I use up all storage capacity and require more drive slots. The Synology DS1621+ seems to be right up my street.

  • Decision To Cross-post To Medium

    As much as I'd like to retain content on a single web presence, I find there are some posts that don't get much traction. The majority of my posts contain technical content that Google seems to pick up very easily due to relatable key phrases and words that developers in my industry search for.

    However, posts that are less technical in nature and (in my opinion) more thought-provoking lack page views due to the organic nature of how they're written. I believe these posts are more suited to being shared on Medium.

    I get more satisfaction in the posts that speak from my experiences and thought processes, most of which you will find in the Random Thoughts and Surinder's Log categories.

    I've already shared a handful of posts on Medium in the past - my last post was published in October 2018. I still plan on making this site the first place where content is published and then cross-post to Medium as I see fit. Check out my "Technical Blogging: Where Should I Be Writing?" post that details thoughts on the very subject of cross-posting.

    Feel free to check out my Medium profile here.

  • Year In Review - 2021

    I haven't achieved any of the tasks I set myself in my last year in review. But as life springs random surprises on you, you find yourself shifting to a moment in time that you never thought conceivable.

    If someone were to tell me last year that 2021 would be the year I'd find someone and finally settle down, I'd say you've been drinking too much of the finest Rioja.

    When such a shift in one's life happens, it takes utmost priority, and as a result my blogging has taken a backseat. After April, things became a little sporadic - the time when this new stage in my life kicked up a notch.

    Even though blogging this year hasn't been a priority, it's not a result of a lack of learning. I've just been focusing on learning some new life skills during this new stage in my life as well as keeping on top of new technological advances within a work environment on a daily basis.

    2021 In Words/Phrases

    Coronavirus, Covid-19, Omicron, Hubspot, Wedding, No Time To Die, Money Heist, Tailwind CSS, Prismic, Gatsby Prismic, Beard, Azure, Back The Gym, Blenheim Light Show, Camping, Abingdon Fireworks, New Family/Friends

    My Site

    Believe it or not, I have been working on an updated version of this site, starting completely from scratch. This involved updating to the latest Gatsby framework and redoing the front-end. In the process, I came across a very good, tried and tested CSS framework called Tailwind CSS.

    Tailwind is a utility CSS framework that allows for a quick turnaround in building blocks of markup to create bespoke designs, based on a library of flexible, predefined CSS classes. The main benefits I've found so far are its surprisingly minimal footprint when building for production, and the many sites offering pre-developed HTML components you can customise and implement on your own site. Only time will tell whether this is the correct approach.

    Beard Gains

    Growing some facial hair wasn't an outcome of living like a hermit during these Covid times, but a requirement from my wife. My profile picture is due for an update to reflect such a change in appearance. Even I don't recognise myself sometimes.

    Statistics

    When it comes to site statistics, I tend to lower my expectations so I'm not setting myself up for failure when checking Google Analytics. I wasn't expecting much from this year's stats due to my lack of contribution but, suffice to say, I haven't fared too badly.

    2020/2021 Comparison:

    • Users: +41.09%
    • Page Views: +45.56%
    • New Users: +42.03%
    • Bounce Rate: -3.06%
    • Search Console Total Clicks: +254%
    • Search Console Impressions: +295%
    • Search Console Page Position: -8.3%

    I'm both surprised and relieved that existing content is still getting traction, resulting in more page views and users. The bounce rate has decreased a further 3.06% compared to last year. Out of all the statistics listed above, I believe the Google page position is the most important, and I'm quite disheartened that I've slipped in this area.

    To my surprise, the site search implemented earlier this year using Algolia was getting used by visitors. This was very unexpected, as the primary reason I even added a site search was for my own use.

    One can only imagine how things could have been if I managed to be more consistent in the number of posts published over the year.

    Things To Look Into In 2022

    NFT and Crypto

    The main thing I want to look into further is the realms of cryptocurrency and NFTs. I've been following the likes of Dan Petty and Paul Stamatiou on Twitter, which has opened my eyes to how things have moved on since I last took a brief look at this space.

    Holiday

    I haven’t been on a holiday since my trip to the Maldives in 2019 and I’m well overdue on another one - preferably abroad if I feel safe enough to do so and COVID allowing.

    Lego Ford Mustang

    I purchased a Lego Creator Series Ford Mustang near the end of last year as an early Christmas present to myself and I’m still yet to complete it. I’ve gone as far as building the underlying chassis and suspension. It doesn’t even resemble a car yet. How embarrassing. :-)

    On completion, it’ll make a fine centre-piece in my office.

    Azure

    Ever since I worked on a project at the start of the year dealing with Azure Functions, deployment slots and automation, I've been more interested in the services Azure has to offer. I've always stuck to hosting-related service setup and search indexing, venturing very little elsewhere. I'd like to keep researching this area, especially cognitive services.

    Git On The Command-Line

    Even though I've been using Git for as long as I've been working as a developer, it has always been via a GUI such as TortoiseGit or SourceTree. When it comes to interacting with a Git repo using the command-line, I'm not as experienced as I'd like to be with the more complex commands. In the past, when I have used complex commands without a GUI, it's been far from straightforward compared to the comfort of a GUI, where I naturally find myself when interacting with a repository.

    Twitter Bot

    For some reason, I have a huge interest in creating a Twitter bot that will carry out some form of functionality based on the contents of a tweet. At this moment in time, I have no idea what the Twitter bot will do. Once I have thought of an endearing task it can perform, the development will start.

    Final Thoughts

    If you thought 2021 was bad enough with the continuation of the sequel no one wanted (Covid part 2), we are just days away from entering 2022 - the year in which the grim events of the fictitious film Soylent Green take place.

    Soylent Green Poster (1973)

    Luckily for me, 2021 has been a productive year filled with personal and career-based accomplishments, and I hope this continues into the new year. But I do feel it's time I pushed myself further.

    I’d like to venture more into technologies that don’t form part of my existing day-to-day coding language or framework. This may make for more interesting blog posts. But to do this, I need to focus more over the next year and allocate time for research and writing.

  • I decided to write this post primarily to act as a reminder to myself when programmatically creating content pages in Umbraco, expanding upon my previous post on setting a dropdownlist in code. I have been working on a piece of functionality where I needed to develop an import task to pull content into Umbraco from a different CMS platform, encompassing the use of different field-types, such as:

    • Textbox
    • Dropdownlist
    • Media Picker
    • Content Picker

    It might just be me, but I find it difficult to find solutions to the Umbraco-related problems I sometimes face. This could be because search engine results reference forum posts for older versions of Umbraco that are no longer compatible with the version I'm working in (version 8).

    When storing data in the field types listed above, I encountered issues trying to store values in all field types except "Textbox". The other fields required either some form of JSON structure or a Udi to be passed.

    Code

    My code contains three methods:

    1. SetPost - to create a new blog post, or update an existing blog post if one already exists.
    2. GetAuthorIdByName - uses Umbraco Examine Search Index to get back an Author document and return the Udi.
    3. GetUmbracoMedia - uses the internal Examine Search Index to return details of a file in a form that will be acceptable to store within a Media Picker content field.

    The SetPost method consists of a combination of fields required by my Blog Post document, the primary ones being:

    • Blog Post Type (blogPostType) - Dropdownlist
    • Blog Post Author (blogPostAuthor) - Content Picker
    • Image (image) - Media Picker
    • Categories (blogPostCategories) - Tags

    /// <summary>
    /// Creates or updates an existing blog post.
    /// </summary>
    /// <param name="title"></param>
    /// <param name="summary"></param>
    /// <param name="postDate"></param>
    /// <param name="type"></param>
    /// <param name="imageUrl"></param>
    /// <param name="body"></param>
    /// <param name="categories"></param>
    /// <param name="authorId"></param>
    /// <returns></returns>
    private static PublishResult SetPost(string title, 
                                        string summary, 
                                        DateTime postDate, 
                                        string type, 
                                        string imageUrl, 
                                        string body, 
                                        List<string> categories = null, 
                                        string authorId = "")
    {
        PublishResult publishResult = null;
        IContentService contentService = Current.Services.ContentService;
        ISearcher searchIndex = ExamineUtility.GetIndex().GetSearcher();
    
        // Get blog post by its page title.
        ISearchResult blogPostSearchItem = searchIndex.CreateQuery()
                                        .Field("pageTitle", title.TrimEnd())
                                        .And()
                                        .NodeTypeAlias("blogPost")
                                        .Execute(1)
                                        .FirstOrDefault();
    
        bool existingBlogPost = blogPostSearchItem != null;
    
        // Get the parent section where the new blog post will reside, in this case Blog Index.
        IContent blogIndex = contentService.GetPagedChildren(1099, 0, 1, out _).FirstOrDefault();
    
        if (blogIndex != null)
        {
            IContent blogPostContent;
    
            // If blog post doesn't already exist, then create a new node, otherwise retrieve existing node by ID to update.
            if (!existingBlogPost)
                blogPostContent = contentService.CreateAndSave(title.TrimEnd(), blogIndex.Id, "blogPost");
            else
                blogPostContent = contentService.GetById(int.Parse(blogPostSearchItem.Id));
    
            if (!string.IsNullOrEmpty(title))
                blogPostContent.SetValue("pageTitle", title.TrimEnd());
    
            if (!string.IsNullOrEmpty(summary))
                blogPostContent.SetValue("pageSummary", summary);
    
            if (!string.IsNullOrEmpty(body))
                blogPostContent.SetValue("body", body);
                    
            if (postDate != DateTime.MinValue)
                blogPostContent.SetValue("blogPostDate", postDate);
    
            // Set Dropdownlist field.
            if (!string.IsNullOrEmpty(type))
                blogPostContent.SetValue("blogPostType", JsonConvert.SerializeObject(new[] { type }));
    
        // Set Content-picker field by passing a "Udi". Reference to an Author page. 
            if (authorId != string.Empty)
                blogPostContent.SetValue("blogPostAuthor", authorId);
    
            // Set Media-picker field.
            if (imageUrl != string.Empty)
            {
                string umbracoMedia = GetUmbracoMedia(imageUrl);
    
                // A stringified JSON object is required to set a Media-picker field.
                if (umbracoMedia != string.Empty)
                    blogPostContent.SetValue("image",  umbracoMedia);
            }    
    
            // Set tags.
            if (categories?.Count > 0)
                blogPostContent.AssignTags("blogPostCategories", categories);
    
            publishResult = contentService.SaveAndPublish(blogPostContent);
        }
    
        return publishResult;
    }
    
    /// <summary>
    /// Gets UDI of an author by fullname.
    /// </summary>
    /// <param name="fullName"></param>
    /// <returns></returns>
    private static string GetAuthorIdByName(string fullName)
    {
        if (!string.IsNullOrEmpty(fullName))
        {
            ISearcher searchIndex = ExamineUtility.GetIndex().GetSearcher();
    
            ISearchResult authorSearchItem = searchIndex.CreateQuery()
                                            .Field("nodeName", fullName)
                                            .And()
                                            .NodeTypeAlias("author")
                                            .Execute(1)
                                            .FirstOrDefault();
    
            if (authorSearchItem != null)
            {
                UmbracoHelper umbracoHelper = Umbraco.Web.Composing.Current.UmbracoHelper;
                return Udi.Create(Constants.UdiEntityType.Document, umbracoHelper.Content(authorSearchItem.Id).Key).ToString();
            }
        }
    
        return string.Empty;
    }
    
    /// <summary>
    /// Gets the umbracoFile of a media item by filename.
    /// </summary>
    /// <param name="fileName"></param>
    /// <returns></returns>
    private static string GetUmbracoMedia(string fileName)
    {
        if (!string.IsNullOrEmpty(fileName))
        {
            ISearcher searchIndex = ExamineUtility.GetIndex("InternalIndex").GetSearcher();
    
            ISearchResult imageSearchItem = searchIndex.CreateQuery()
                                            .Field("umbracoFileSrc", fileName)
                                            .Execute(1)
                                            .FirstOrDefault();
    
            if (imageSearchItem != null)
            {
                List<Dictionary<string, string>> imageData = new List<Dictionary<string, string>> {
                        new Dictionary<string, string>() {
                            { "key", Guid.NewGuid().ToString() },
                            { "mediaKey", imageSearchItem.AllValues["__Key"].FirstOrDefault().ToString() },
                            { "crops", null },
                            { "focalPoint", null }
                    }
                };
    
                return JsonConvert.SerializeObject(imageData);
            }
        }
    
        return string.Empty;
    }
    

    Usage Example - Iterating Through A Dataset

    In this example, I'm iterating through a dataset of posts and passing each field value to the corresponding parameter of the SetPost method.

    ...
    ...
    ...
    SqlDataReader reader = sqlCmd.ExecuteReader();
    
    if (reader.HasRows)
    {
        while (reader.Read())
        {
            SetPost(reader["BlogPostTitle"].ToString(),
                    reader["BlogPostSummary"].ToString(),
                    DateTime.Parse(reader["BlogPostDate"].ToString()),
                    reader["BlogPostType"].ToString(),
                    reader["BlogPostImage"].ToString(),
                    reader["BlogPostBody"].ToString(),
                    new List<string>
                    {
                            "Category 1",
                            "Category 2",
                            "Category 3"
                    },
                    GetAuthorIdByName(reader["BlogAuthorName"].ToString()));
        }
    }
    ...
    ...
    ...
    

    Use of Umbraco Examine Search

    One thing to notice is that when checking for an existing page, author or media file, the Umbraco Examine search index is used. I find querying the search index to be the most efficient way to return data without constantly hitting the database - ideal when carrying out a repetitive task like an import.

    In my code samples, I'm using a custom ExamineUtility class to retrieve the search index in a more condensed and tidy manner:

    public class ExamineUtility
    {
        /// <summary>
        /// Get Examine search index.
        /// </summary>
        /// <param name="defaultIndexName"></param>
        /// <returns></returns>
        public static IIndex GetIndex(string defaultIndexName = "ExternalIndex")
        {
            if (!ExamineManager.Instance.TryGetIndex(defaultIndexName, out IIndex index) || !(index is IUmbracoIndex))
                throw new Exception("Examine Search Index could not be found.");
    
            return index;
        }
    }
    

    Conclusion

    Hopefully, the code demonstrated in this post will give a clearer idea of how to programmatically work with content pages using a combination of different field types. For further code samples on working with different field types, take a look at the "Built-in Umbraco Property Editors" documentation.

  • After working on all things Hubspot over the last year, whether that involved building and configuring site instances or developing API integrations, I seem to have missed out on CMS development-related projects. Most recently, I have been involved in an Umbraco site build where pages needed to be dynamically created via an external API.

    Programmatically creating CMS pages is quite a straightforward job, as all one needs to do is the following (see the sketch after the list):

    • Select the parent node your dynamically created page needs to reside under
    • Check the parent exists
    • Create page and set field values
    • Publish
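
    A minimal sketch of these steps might look like the following, assuming Umbraco 8's IContentService (the same service used in my import post); the parent node ID (1099) and the "blogPost" document type alias are purely illustrative:

    IContentService contentService = Current.Services.ContentService;
    
    // Select the parent node the dynamically created page will reside under.
    IContent parent = contentService.GetById(1099);
    
    // Check the parent exists.
    if (parent != null)
    {
        // Create the page and set field values.
        IContent newPage = contentService.CreateAndSave("My New Page", parent.Id, "blogPost");
        newPage.SetValue("pageTitle", "My New Page");
    
        // Publish.
        contentService.SaveAndPublish(newPage);
    }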

    From past experience, passing values to page fields has been as simple as passing a single value based on the field type. For example:

    myNewPage.SetValue("pageTitle", "Hello World");
    myNewPage.SetValue("bodyContent", "<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>");
    myNewPage.SetValue("hasExpired", true);
    myNewPage.SetValue("price", 9.99M);
    

    Umbraco has something completely different in mind if you plan on setting a value of type "Dropdown". Simply sending a single value will not work, even though it is accepted at runtime. You will need to send the value as a JSON array:

    string type = "Permanent";
    myNewPage.SetValue("jobType", JsonConvert.SerializeObject(new[] { type }));
    

    This approach is required regardless of whether you've set the "Dropdown" field type in Umbraco as single or multiple choice.

  • Ever since a Ucommerce site built in Umbraco went live, the uptime monitoring system would send a notification every day or so reporting that the site had gone down.

    There wasn't any reason for this to happen, as the site in question wasn't getting enough daily visitors to warrant such a service disruption - it was a new online presence. In addition, the hosting architecture could handle such an influx of visitors with ease should that scenario arise.

    Something random was happening at the application level, causing the site to time out. Naturally, the first place to check for application problems is the event log. No errors were being flagged - just pages and pages of "Information" level entries, which are of little use. This didn't seem right, so I decided to check the log files stored within the '/App_Data/Logs' directory, and just as well I did. I was greeted by log files totalling over 3GB in size, with the current day's log measuring 587MB and increasing.

    Appending new lines of text to an already large text file is bound to have an impact on site performance, especially when it's happening at such regular intervals. I needed to streamline what gets logged, as I have no interest in "Information" log entries. To do this, the following setting in the serilog.config file found in the "/config" directory needed to be updated:

    <add key="serilog:write-to:File.restrictedToMinimumLevel" value="Warning" />
    

    Previously, this setting was set to "Debug", which logs all site activity. Once this was updated, the socket timeout issue was resolved.

  • Sometimes the simplest piece of development can be the most rewarding, and I think my Azure Function that checks for broken links on a nightly basis is one of those things. The Azure Function reads a list of links from a database table and checks each one to determine whether a 200 response is returned. If not, the link is logged and sent to a user by email using the SendGrid API.

    Scenario

    I was working on a project that takes a list of products from an API and stores them in a Hubspot HubDB table. This table contained all product information and the expected URL to a page. All the CMS pages had to be created manually and assigned the URL as stored in the table, which in turn would allow the page to be populated with product data.

    As you can expect, the disadvantage of manually created pages is that a URL change in the HubDB table will result in a broken page. Not ideal! In this case, the likelihood of a URL being changed is rare. All I needed was a checker to ensure I was made aware on the odd occasion where a link to the product page could be broken.

    I won't go into any further detail but rest assured, there was an entirely legitimate reason for this approach in the grand scheme of the project.

    Azure Function

    I have modified my original code purely for simplification.

    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;
    using SendGrid;
    using SendGrid.Helpers.Mail;
    
    namespace ProductsSyncApp
    {
      public static class ProductLinkChecker
      {
        [FunctionName("ProductLinkChecker")]
        public static void Run([TimerTrigger("%ProductLinkCheckerCronTime%"
          #if DEBUG
          , RunOnStartup=true
          #endif
          )]TimerInfo myTimer, ILogger log)
        {
          log.LogInformation($"Product Link Checker started at: {DateTime.Now:G}");
    
          #region Iterate through all product links and output the ones that return 404.
    
          List<string> brokenProductLinks = new List<string>();
    
          foreach (string link in GetProductLinks())
          {
            if (!IsEndpointAvailable(link))
              brokenProductLinks.Add(link);
          }
    
          #endregion
    
          #region Send Email
    
          if (brokenProductLinks.Count > 0)
            SendEmail(Environment.GetEnvironmentVariable("SendGrid.FromEmailAddress"), Environment.GetEnvironmentVariable("SendGrid.ToAddress"), "www.contoso.com - Broken Link Report", EmailBody(brokenProductLinks));
    
          #endregion
    
          log.LogInformation($"Product Link Checker ended at: {DateTime.Now:G}");
        }
    
        /// <summary>
        /// Get a list of product links.
        /// This would come from a datasource somewhere containing a list of correct, expected URLs.
        /// </summary>
        /// <returns></returns>
        private static List<string> GetProductLinks()
        {
          return new List<string>
          {
            "https://www.contoso.com/product/brokenlink1",
            "https://www.contoso.com/product/brokenlink2",
            "https://www.contoso.com/product/brokenlink3",
          };
        }
    
        /// <summary>
        /// Checks if a URL endpoint is available.
        /// </summary>
        /// <param name="url"></param>
        /// <returns></returns>
        private static bool IsEndpointAvailable(string url)
        {
          try
          {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    
            using HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    
            if (response.StatusCode == HttpStatusCode.OK)
              return true;
    
            return false;
          }
          catch
          {
            return false;
          }
        }
    
        /// <summary>
        /// Create the email body.
        /// </summary>
        /// <param name="brokenLinks"></param>
        /// <returns></returns>
        private static string EmailBody(List<string> brokenLinks)
        {
          StringBuilder body = new StringBuilder();
    
          body.Append("<p>To whom it may concern,</p>");
          body.Append("<p>The following product URL's are broken:");
    
          body.Append("<ul>");
    
          foreach (string link in brokenLinks)
            body.Append($"<li>{link}</li>");
    
          body.Append("</ul>");
    
          body.Append("<p>Many thanks.</p>");
    
          return body.ToString();
        }
    
        /// <summary>
        /// Send email through SendGrid.
        /// </summary>
        /// <param name="fromAddress"></param>
        /// <param name="toAddress"></param>
        /// <param name="subject"></param>
        /// <param name="body"></param>
        /// <returns></returns>
        private static Response SendEmail(string fromAddress, string toAddress, string subject, string body)
        {
          SendGridClient client = new SendGridClient(Environment.GetEnvironmentVariable("SendGrid.ApiKey"));
    
          SendGridMessage sendGridMessage = new SendGridMessage
          {
            From = new EmailAddress(fromAddress, "Product Link Report"),
          };
    
          sendGridMessage.AddTo(toAddress);
          sendGridMessage.SetSubject(subject);
          sendGridMessage.AddContent("text/html", body);
    
          return Task.Run(() => client.SendEmailAsync(sendGridMessage)).Result;
        }
      }
    }
    

    Here's a rundown of what is happening:

    1. A list of links is returned from the GetProductLinks() method. This will contain a list of correct links that should be accessible on the website.
    2. Loop through all the links and carry out a check against the IsEndpointAvailable() method. This method carries out a simple check to see if the link returns a 200 response. If not, it'll be marked as broken.
    3. Add any link marked as broken to the brokenProductLinks collection.
    4. If there are broken links, send an email handled by SendGrid.

    As you can see, the code itself is very simple and the only thing that needs to be customised for your use is the GetProductLinks method, which will need to output a list of expected links that a site should contain for cross-referencing.
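
    For instance, a database-backed GetProductLinks() might look something like the sketch below - the "ProductLinks" table, "Url" column and "Sql.ConnectionString" setting are hypothetical stand-ins:

    // Requires: using System.Data.SqlClient;
    private static List<string> GetProductLinks()
    {
      List<string> links = new List<string>();
    
      // Hypothetical table holding the expected product page URLs.
      using (SqlConnection connection = new SqlConnection(Environment.GetEnvironmentVariable("Sql.ConnectionString")))
      using (SqlCommand command = new SqlCommand("SELECT Url FROM ProductLinks", connection))
      {
        connection.Open();
    
        using (SqlDataReader reader = command.ExecuteReader())
        {
          while (reader.Read())
            links.Add(reader["Url"].ToString());
        }
      }
    
      return links;
    }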

    Email Send Out

    When using Azure Functions, you can't use the standard .NET approach to send emails; Microsoft recommends an authenticated SMTP relay service, which reduces the likelihood of email providers rejecting the message. More insight into this can be found in the following StackOverflow post - Not able to connect to smtp from Azure Cloud Service.

    When it comes to SMTP relay services, SendGrid comes up favourably, and being someone who uses it at my current workplace, it was my natural inclination to make use of it in my Azure Function. Plus, they've made things easy by providing a NuGet package that allows direct access to their Web API v3 endpoints.

  • Since around September last year, I've been involved in a lot of Hubspot projects at my place of work - Syndicut. It's the latest addition to the numerous other platforms offered to clients.

    The approach to developing websites in Hubspot is not something I'm used to, coming from a programming background where you build everything custom using some form of server-side language. But I was surprised by what you can achieve within the platform.

    Having spent months building sites using the Hubspot Markup Language (HUBL), utilising a lot of the powerful marketing features and using the API to build a custom .NET Hubspot Connector, I thought it was time to attempt a certification focusing on the CMS aspect of Hubspot.

    There are two CMS certifications:

    1. Hubspot CMS for Marketers
    2. Hubspot CMS for Developers

    I decided to tackle the "CMS for Marketers" certification first, as this mostly covers the theory of how you use Hubspot to create a user-friendly, high-performing website and leverage that with Hubspot CRM. These are areas you can be quite shielded from if you're purely developing pages and modules. I thought it would be beneficial to expose myself to the marketing standpoint to gain insight into how my development forms part of the bigger picture.

    I'm happy to report I am now Hubspot CMS for Marketers certified.

    Hubspot CMS for Marketers Certification

  • On my UniFi Dream Machine, I have set up a guest wireless network for those who come to my house and need to use the Internet. I've done this on every router I've ever owned, as I prefer to keep the main non-guest wireless access point (WAP) just for myself, secured with a very strong password that I'd rather not share with anyone.

    It only occurred to me a few days ago that my reason for having a guest WAP is flawed. After all, the only difference between the personal and guest WAPs is a throw-away password I change regularly. There is no beneficial security in that. It was time to make good use of UniFi's Guest Control settings and prevent access to internal network devices. I have a very simple network setup, and the only two network devices I want to block access to are my Synology NAS and IP security camera.

    UniFi's Guest Control settings do a lot of the grunt work out of the box and are pretty effortless to set up. Within the UniFi controller (based on my own UniFi Dream Machine), the following options are available to you:

    1. Guest Network: Create a new wireless network with its own SSID and password.
    2. Guest User Group: Set download/upload bandwidth limitations that can be attached to the Guest Network.
    3. Guest Portal: A custom interface can be created where a guest will be served a webpage to enter a password to access the wireless network - much like what you'd experience when using the internet at an airport or hotel. UniFi gives you enough creative control to make the portal interface look very professional. You can also set the connection to expire after a set number of hours.
    4. Guest Control: Limit access to devices within the local network via IP address.

    I don't see the need to enable all the guest features the UniFi controller offers; the only two of interest to me are setting up a guest network and restricting access (options 1 and 4). This is a straightforward process that will only take a few minutes.

    Guest Network

    A new wireless network will need to be created and marked as a guest network. To do this, we need to set the following:

    • Name/SSID: MyGuestNetwork
    • Enable this wireless network: Yes
    • Security: WPA Personal. Add a password
    • Guest Policy: Yes

    All other Advanced Options can be left as they are.

    UniFi Controller - Guest Network Access Point

    Guest Control

    To make devices unavailable over your newly created guest network, simply add an IPv4 address, hostname or subnet within the "Post Authorisation Restrictions" section. I've added the IP of my Synology NAS - 172.16.1.101.

    UniFi Controller - Guest Control

    If all has gone to plan, when connecting to the guest WAP you will not be able to access any network-connected devices.

  • Investing in a UniFi Dream Machine was one of the wisest relatively expensive purchases I made last year. It truly has been worth every penny for its reliability, security and rock-solid connection - something that is very much needed when working from home full-time.

    The Dream Machine has been very low maintenance and I just leave it to do its thing, apart from carrying out some minor configuration tweaks to aid my network. The only area where I did encounter problems was accessing the Synology DiskStation Manager (DSM) web interface. I could access the Synology if I used the local IP address instead of the "myusername.synology.me" domain. Generally, this would be an OK solution, but not the right one for two reasons:

    1. Using a local IP address would restrict connection to my Synology when working from another location. This was quite the deal-breaker, as I have a bunch of Synology apps installed on my Mac, such as Synology Drive, which carries out backups and folder synchronisation.
    2. I kept on getting a security warning in my browser when accessing DSM regarding the validity of my SSL certificate, which is to be expected as I force all connections to be carried out over SSL.

    To my befuddlement, I had no issue accessing the data on my Synology by mapping its shared folders as network drives from my computer.

    The issue had to be within my local network, as I was able to access the Synology DSM web interface externally. From perusing the UniFi community forum, I found quite a few cases where users reported the same thing, and the common phrase that kept popping up in all the posts was: broken Hairpin NAT. What is a Hairpin NAT?

    A Hairpin NAT allows you to run a server (in this case a NAS) inside your network but connect to it as if you were outside your network - for example, via a web address such as "myusername.synology.me" that resolves to the internal IP of the server.

    What I needed to do was run an internal DNS server with a local entry for "myusername.synology.me" pointing to the internal IP address of the NAS. What was probably happening was that my computer/device was trying to make a connection out past the firewall and then back in again to access the NAS - not the most efficient way to make a connection for obvious reasons, and in some cases it may not work at all. A local DNS entry (loopback) resolves this.
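
    For illustration, if the internal DNS server happens to be dnsmasq (an assumption on my part, though UniFi kit does use dnsmasq under the hood), the local override is a one-line entry:

    # Hypothetical dnsmasq entry: resolve the DDNS hostname to the
    # NAS's internal IP address (re-using 172.16.1.101 from my guest network post).
    address=/myusername.synology.me/172.16.1.101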

    A clever user posted a solution to the issue on the UniFi forum that is very easy to follow and worked like a charm - Loopback/DNS Synology DiskStation.

    I have also saved a screenshot of the solution for posterity.