Blog

Blogging on programming and life in general.

  • At times there is a need to get a list of files that have been updated. This could be for the following reasons:

    • Audit compliance to maintain records of application changes.
    • Backup verification to confirm the right files were backed up.
    • Verification of changed files to confirm which files were added, modified, or deleted during an update.
    • Security checks to ensure that no unauthorised or suspicious files have been changed or installed through hacking.
    • Troubleshooting issues after a new application release - a list of changed files can help identify the source of a problem.

    Based on the information I found online, I put together a PowerShell script that was flexible enough to meet the needs of the above scenarios, as I encountered one of them this week. I'll let you guess the scenario I faced.

    At its core, the following PowerShell script uses the Get-ChildItem command to list all files recursively across all sub-folders, sorted by creation date descending, with the addition of a handful of optional parameters.

    Get-ChildItem -Path C:\My-Path -Recurse -Include *.png |
        Sort-Object -Property CreationTime -Descending |
        Select-Object -First 5 CreationTime, LastWriteTime, FullName |
        Export-Csv "file-list.csv"
    

    Breakdown of the parameters used:

    Parameter/Object Detail Is Optional
    -Path The folder path where files should be listed. No
    -Recurse Get files from the path and its subdirectories. Yes
    -Include Filter the file output by a path element or pattern. This only works when the "-Recurse" parameter is present. Yes
    Sort-Object Specify the field and sort order. Yes
    Select-Object Set the maximum output (-First) and the list of fields to include. Yes
    Export-Csv Export the list of files to a CSV. Yes

    If the files need to be sorted by last modified date, the Sort-Object property needs to be set to "LastWriteTime".
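    For example, the same pipeline sorted by last modified date would look like this:

    ```powershell
    # List the five most recently modified .png files, newest first, and export to CSV.
    Get-ChildItem -Path C:\My-Path -Recurse -Include *.png |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 5 CreationTime, LastWriteTime, FullName |
        Export-Csv "file-list.csv"
    ```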

    When the script is run, you'll see the results rendered in the following way:

    CreationTime        LastWriteTime       FullName
    ------------        -------------       --------
    25/05/2023 20:33:44 25/05/2023 20:33:44 X:\Downloads\synology\Screenshot 2023-05-25 at 20.33.38.png
    16/05/2023 14:18:21 16/05/2023 14:18:21 X:\Downloads\synology\Screenshot 2023-05-16 at 14.18.15.png
    

    Further Information

  • I've been working with custom functionality for registering and authenticating external site users in Umbraco 13 using its Members feature.

    A custom Member Type was created so I could create field properties to specifically store all member registration data. This consisted of Textbox, Textarea and Dropdown fields.

    Getting values for fields in code is very straightforward, but I encountered issues when dealing with fields that consist of preset values, such as a Dropdown list of titles (Mr/Mrs/Ms/etc.).

    Based on the Umbraco documentation for working with a Dropdown field, I should be able to get the selected value through this one line of code:

    @if (Model.HasValue("title"))
    {
        <p>@(Model.Value<string>("title"))</p>
    }
    

    When working with custom properties from a Member Type, the approach is different. GetValue() is the only accessor available to us to output a value - something we are already accustomed to when working in Umbraco.

    IMember? member = memberService.GetByEmail("johndoe@gmail.com");
    string title = member.Properties["title"].GetValue()?.ToString(); // Output: "[\"Mr\"]"
    

    However, the value is returned as a serialized array. This is also the case when using the typed GetValue() accessor on the property:

    IMember? member = memberService.GetByEmail("johndoe@gmail.com");
    string title = member.GetValue<string>("title"); // Output: "[\"Mr\"]"
    

    Umbraco 13 - Dropdown Value From Custom Member Type Property

    The only way to get around this was to create a custom extension method to deserialize the string array so the value alone could be output:

    public static class MemberPropertyExtensions
    {
        /// <summary>
        /// Gets the selected value of a Dropdown property.
        /// </summary>
    /// <param name="property">The member property containing the Dropdown value.</param>
    /// <returns>The selected value, or an empty string if none is selected.</returns>
        public static string? GetSelectedDropdownValue(this IProperty property)
        {
            if (property == null)
                return string.Empty;
    
        string? value = property.GetValue()?.ToString();
    
            if (string.IsNullOrEmpty(value))
                return string.Empty;
    
            string[]? propertyArray = JsonConvert.DeserializeObject<string[]>(value);
    
            return propertyArray?.FirstOrDefault();
        }
    }
    

    It's a simple but effective solution. Now our original code can be updated by adding our newly created GetSelectedDropdownValue() method to the property:

    IMember? member = memberService.GetByEmail("johndoe@gmail.com");
    string title = member.Properties["title"].GetSelectedDropdownValue();
    

    Useful Information

  • I've had a Spotify music membership for as long as I can remember. Many other subscriptions held throughout my life have come and gone, but Spotify has stood the test of time.

    A seed of doubt was planted when Spotify began raising the prices of their plans multiple times over a short period of time, beginning in April 2021. Even then, I was relatively unconcerned; it was annoying, but I felt content knowing there were no better music providers that could compete with what Spotify provided. Spotify made music very accessible to me in every way.

    During the first price hike, I trialled Apple Music during a brief period of insanity only to quickly come running back to the safety of Spotify.

    The penny dropped in May 2024, during the third price hike, when I began to question whether my Spotify usage was worth £11.99 per month. Even though I listen to music, I occasionally go through periods where I only listen to podcasts, which are freely available online and on podcasting platforms.

    First Steps To Considering YouTube Music As A Viable Replacement

    Before making any hasty decisions, I audited all the subscriptions both my wife and I use to see if there was any possibility of making cost savings... Just like a Conservative party government imposing austerity measures, except my actions wouldn't lead to a Liz Truss-level economic crisis.

    Then I discovered my wife's YouTube Premium subscription, which she had purchased through the Apple App Store for an absurdly high price. A word to the wise: never buy subscriptions through Apple's App Store, because Apple charges a commission on top. My wife was paying around £18 per month compared to £12.99 if purchased directly from the YouTube website.

    I digress...

    This was enough to get me thinking about upgrading to the Family tier that included:

    • Ad-free videos
    • Offline downloads
    • YouTube Music
    • Add up to 5 members to the subscription

    All this costing £19.99 per month. At this price, we would be making savings if we moved away from our individual YouTube and Spotify plans. I was already sold on ad-free videos (those advertisements are so annoying!) and if I could be persuaded to subscribe to YouTube Music, this would end up being a very cost-effective option.

    The writing was on the wall. My Spotify days were numbered. I looked into what was involved (if possible) in migrating all my playlists over to YouTube Music.

    Requirements and Initial Thoughts of YouTube Music

    Prior to carrying out any form of migration, I opted for a 30-day free trial of YouTube Music, as I wanted to see if it met as many of my key requirements as possible.

    Requirement Requirement Met?
    Availability of all songs from artists I listen to including the obscure ones Yes
    Podcasts Big yes
    Native MacOS app Room for improvement
    Ability to cast music to my speakers on my network Yes
    Quality new music suggestions Yes

    Overall, YouTube Music met the majority of my requirements. As expected, it takes a little while to familiarise oneself with the interface, but there are similarities when compared with Spotify.

    YouTube Music - The Extension of YouTube

    YouTube Music is really an extension of YouTube in how it is able to pull in specific YouTube content, whether that is music videos, podcasts or shows. All the audio-related content in video form you would normally view on YouTube is encompassed here. In most cases, this is an advantage; the only aspect where the lines between music and video get blurred is in the auto-generated "Liked music" playlist.

    You may find the "Liked music" playlist is already prefilled with videos you have liked on YouTube. If YouTube Music deems a liked video to be music, it will also be shown here, which isn't necessarily accurate. For example, it automatically listed a Breaking Bad parody video I liked 9 years ago. If you prefer your randomly liked videos to stay solely in YouTube, you have to manually disable the "Show your liked music from YouTube" feature in the settings.

    The Music Catalog and New Music Recommendations

    The music catalog size is on par with Spotify, and there hasn't been a time when a track wasn't available. In fact, there were 3-4 tracks in my Spotify playlists that were no longer accessible, but this was not the case on YouTube Music, which was a surprise.

    When searching for new music, I found the recommendation algorithm far better than Spotify's, and after a couple of weeks of using YouTube Music it had compiled some really good personalised mixes - something that will only get better with time. Due to its link with YouTube, I was also recommended more live performances, remixes and cover tracks.

    What surprised me the most is a feature I didn't even think I needed: the Offline Mixtape. There are times on the road when I don't actually know what I want to listen to, and the Offline Mixtape compiles a list of tracks combining my liked songs with similar tracks for added variation - all automatically synchronised to my devices.

    Podcasts

    I didn't have any issues finding the podcasts I listen to on Spotify. There is the added benefit of playing a podcast as audio or video (if the podcast offers this format), which is a nice touch. I was also recommended new types of podcasts I would never have been exposed to based on what I listen to. I am sure (and correct me if I am wrong) Spotify didn't make recommendations as visible as YouTube Music, where podcasts are categorised. For example, the categories offered to me are: Wealth, Finances, Health, Mysteries, etc.

    Lack of Native Desktop App

    The lack of a native desktop app detracts from my otherwise glowing review of YouTube Music. I was surprised to find that there isn't one, given that this is the norm among other music providers.

    Chrome does allow you to install it as a Progressive Web App, which is better than nothing, but it just doesn't seem integrated enough. I keep accidentally closing the YouTube Music app on macOS by clicking the "close" button when all I want to do is hide the window.

    It can also be laggy at times, especially when Chromecasting to a smart speaker. When I change tracks, my speaker takes a few seconds to catch up.

    Overall, it's good but not great. It doesn't have the same polish as the Spotify app, but it's definitely manageable. The lack of a native desktop app has not dissuaded me from using it; if needed, I can always use the YouTube Music app on my Pixel or iPad.

    The Migration

    After a satisfactory trial period using YouTube Music, I looked for ways to move all my Spotify playlists. There are many options through online services and software that can aid the migration process, which can be used for free (sometimes with limitations) or at a cost.

    After carrying out some research on the various options available to me, I opted for a free CLI tool built in Python: spotify_to_ytmusic. It received a lot of positive feedback on a Reddit post, where users were able to migrate thousands of songs spanning multiple playlists with ease. The only disadvantage of free options that provide unlimited migration is that they aren't necessarily straightforward for the average user, and some technical acumen is required.

    The installation, setup and familiarising yourself with the CLI commands of the spotify_to_ytmusic application is the only part that takes some time. But once you have generated API keys in both Spotify and Google and followed the instructions detailed in the GitHub repo, the migration process itself doesn't take long at all.

    Conclusion

    When I told one of my coworkers that I had switched to YouTube Music, I received a sceptical look and a response to confirm I am of sane mind. This exemplifies how we have simply accepted Spotify as the only acceptable music platform, blinded to alternatives.

    YouTube Premium, which includes YouTube Music in one package, is an extremely good deal. Not only can you watch YouTube videos ad-free, but you also get a music library comparable to Spotify at a similar price.

    If you have been questioning whether YouTube Music is worth a try, question no more and make the move.

  • The Google Maps Distance Matrix API gives us the capability to calculate travel distance and time between multiple locations across different modes of transportation, such as driving, walking or cycling. This is just one of the many APIs Google provides to allow us to get the most out of location and route-related data.

    I needed to use the Google Distance Matrix API (GDMA) to calculate the distance of multiple points of interest (destinations) from one single origin. The dataset of destinations consisted of sixty to one hundred rows of data containing the following:

    • Title
    • Latitude
    • Longitude

    This dataset would need to be passed to the GDMA as destinations in order to get information on how far each item was from the origin. One thing that came to light during integration was that the API is limited to outputting 25 items of distance data per request.

    The limit posed by the GDMA would be fine for the majority of use cases, but in my case it posed a small problem, as I needed to pass the whole dataset of destinations to ensure all points of interest were ordered by the shortest distance.

    The only way I could get around the limit posed by the GDMA was to batch my requests, 25 destinations at a time. The dataset I would be passing would never exceed 100 items, so I was fairly confident this was an adequate approach. However, I cannot be 100% certain what the implications of such an approach would be if you were dealing with thousands of destinations.
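    In isolation, the batching boils down to splitting the destinations array into fixed-size chunks and issuing one request per chunk - the same idea the chunkArray helper implements in the full listing further down:

    ```javascript
    // Split an array into batches of a given size. A dataset of 100 destinations
    // split into batches of 25 results in 4 Distance Matrix requests.
    const chunkArray = (array, chunkSize) => {
      const chunks = [];

      for (let i = 0; i < array.length; i += chunkSize) {
        chunks.push(array.slice(i, i + chunkSize));
      }

      return chunks;
    };

    // Dummy dataset of 100 destinations.
    const destinations = Array.from({ length: 100 }, (_, i) => ({ title: `Place ${i + 1}` }));
    const batches = chunkArray(destinations, 25);

    console.log(batches.length);    // → 4
    console.log(batches[0].length); // → 25
    ```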

    The code below demonstrates a small sample-set of destination data that will be used to calculate distance from a single origin.

    /*
    	Initialise the application functionality.
    */
    const initialise = () => {
    	const destinationData = [
                        {
                          title: "Wimbledon",
                          lat: 51.4273717,
                          long: -0.2444923,
                        },
                        {
                          title: "Westfields Shopping Centre",
                          lat: 51.5067724,
                          long: -0.2289425,
                        },
                        {
                          title: "Sky Garden",
                          lat: 51.3586154,
                          long: -0.9027887,
                        }
                      ];
                      
    	getDistanceFromOrigin("51.7504091", "-1.2888729", destinationData);
    }
    
    /*
    	Processes a list of destinations and outputs distances closest to the origin.
    */
    const getDistanceFromOrigin = (originLat, originLong, destinationData) => {
      const usersMarker = new google.maps.LatLng(originLat, originLong);
      let distanceInfo = [];
      
      if (destinationData.length > 0) {
      	// Segregate destinations into batches.
        const destinationBatches = chunkArray(destinationData, 25);
    
        // Make a call to Google Maps in batches.
        const googleMapsRequestPromises = destinationBatches.map(batch => googleMapsDistanceMatrixRequest(usersMarker, batch));
    
    // Iterate through all the asynchronous promises returned by Google Maps batch requests.
        Promise.all(googleMapsRequestPromises).then(responses => {
          const elements = responses.flatMap(item => item.rows).flatMap(item => item.elements);
    
      // Set the distance for each destination in the destination data.
      elements.forEach(({ distance, status }, index) => {
            if (status === "OK") {
              destinationData[index].distance = distance.text;
              destinationData[index].distance_value = distance.value;
            }
          });
          
      renderTabularData(destinationData.sort((a, b) => a.distance_value - b.distance_value));
        })
        .catch(error => {
          console.error("Error calculating distances:", error);
        });
      }
    }
    
    /*
    	Outputs tabular data of distances.
    */
    const renderTabularData = (destinationData) => {
    	let tableHtml = "";
      
        tableHtml = `<table>
                        <tr>
                            <th>No.</th>
                            <th>Destination Name</th>
                            <th>Distance</th>
                        </tr>`;
    
	if (destinationData.length === 0) {
        tableHtml += `<tr>
                        <td colspan="3">No data</td>
                    </tr>`;
  }
      else {
        destinationData.forEach((item, index) => {
      		        tableHtml += `<tr>
                                    <td>${index+1}</td>
                                    <td>${item.title}</td>
                                    <td>${item.distance}</td>
                                </tr>`;
                });
      }
      
      tableHtml += `</table>`;
      
      document.getElementById("js-destinations").innerHTML = tableHtml;
    }
    
    /*
    	Queries Google API Distance Matrix to get distance information.
    */
    const googleMapsDistanceMatrixRequest = (usersMarker, destinationBatch) => {
      const distanceService = new google.maps.DistanceMatrixService();
      let destinationsLatLong = [];
      
      if (destinationBatch.length === 0) {
      	return;
      }
      
  destinationBatch.forEach((item) => {
        destinationsLatLong.push({
          lat: parseFloat(item.lat),
          lng: parseFloat(item.long),
        });
      });
      
      const request = 
            {
              origins: [usersMarker],
              destinations: destinationsLatLong,
              travelMode: "DRIVING",
            };
    
      return new Promise((resolve, reject) => {
        distanceService.getDistanceMatrix(request, (response, status) => {
          if (status === "OK") {
            resolve(response);
          } 
          else {
            reject(new Error(`Unable to retrieve distances: ${status}`));
          }
        });
      });
    };
    
    /*
    	Takes an array and resizes to specified size.
    */
    const chunkArray = (array, chunkSize) => {
      const chunks = [];
    
      for (let i = 0; i < array.length; i += chunkSize) {
        chunks.push(array.slice(i, i + chunkSize));
      }
    
      return chunks;
    }
    
    /*
    	Load Google Map Distance Data.
    */
    initialise();
    

    The getDistanceFromOrigin() and googleMapsDistanceMatrixRequest() functions are the key pieces that take the list of destinations, batch them into chunks of 25 and return a tabular list of data. This code can be expanded further to render each destination as a pin on an embedded Google Map, since we have the latitude and longitude points.
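    As a sketch of that extension (hypothetical, and assuming the Maps JavaScript API is loaded and the page contains a <div id="map"></div> element), each destination could be dropped onto an embedded map like so:

    ```javascript
    // Hypothetical sketch: render the origin and each destination as pins on an
    // embedded Google Map. Assumes the Maps JavaScript API has been loaded and
    // the page contains a <div id="map"></div> element.
    const renderDestinationPins = (originLat, originLong, destinationData) => {
      const origin = { lat: parseFloat(originLat), lng: parseFloat(originLong) };

      const map = new google.maps.Map(document.getElementById("map"), {
        center: origin,
        zoom: 9,
      });

      // Mark the origin.
      new google.maps.Marker({
        position: origin,
        map,
        label: "O",
      });

      // Add one pin per destination, reusing the lat/long values already held
      // in the destination dataset.
      destinationData.forEach((item) => {
        new google.maps.Marker({
          position: { lat: parseFloat(item.lat), lng: parseFloat(item.long) },
          map,
          title: item.title,
        });
      });
    };
    ```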

    The full working demo can be found via the following link: https://jsfiddle.net/sbhomra/ns2yhfju/. To run this demo, a Google Maps API key needs to be provided, which you will be prompted to enter on load.

  • The Silent Blogger

    Everyone has different reasons for blogging. It could be for professional development, knowledge exchange, documenting a personal journey, or just as a form of self-expression. My motive for blogging includes a small portion of each of these reasons, with one major difference: you have to find me.

    I don't go out of my way to promote this small portion of the internet web-sphere that I own. In the past, I experimented with syndicating articles to more prominent blogging media platforms and communities, but it didn't fulfil my expectations or bring any further benefits.

    I've observed that my demeanour mirrors my approach to blogging in that I don't feel the need to go to excessive lengths to disclose my accomplishments or the problems I've solved. This could be due to my age, as I am more comfortable just being myself. I have nothing to prove to anyone.

    The 13th-century poet Rumi once said:

    In silence, there is eloquence. Stop weaving and see how the pattern improves.

    This quote implies that silence is the source of clarity that allows thoughts to develop naturally and emerge.

    Ever since I stopped pursuing recognition and the somewhat futile attempt to force my written words onto others, the natural order has allowed this blog to grow organically. Readers who find me through keyword searches interact more and even monetise better (through my Buy Me A Coffee page). Fortunately, since I've made an effort to make this blog as SEO-friendly as possible, my posts appear to perform fairly well across search engines.

    No longer do I stress over feeling the need to write blog posts using the "carrot and stick" approach just to garner more readership. I found I benefit from blogging about the things that interest me. It's quality over quantity.

    If you have got this far in this very random admission of silent blogging, you're probably thinking: So what's your point?

    I suppose what I'm trying to say is that it's okay to blog without the expectation of having to promote every single post to the world in the hope of some recognition. Previously, this was my way of thinking, and I've since realised that I was blogging (for the most part) for the wrong reasons. In one of my posts written in 2019, I was in pursuit of being in the same league as the great bloggers I idolised:

    I look at my blogging heroes like Scott Hanselman, Troy Hunt, Mosh Hamedani and Iris Classon (to name a few) and at times ponder if I will have the ability to churn out great posts on a regular basis with such ease and critical acclaim as they do.

    I've learnt not to be so hard on myself and to lessen my expectations. When you trade your expectations for appreciation, your whole world changes; even though a sense of achievement feels great, it's far more important to enjoy what you're doing (roughly paraphrasing Tony Robbins here).

    This new perspective has reaffirmed my belief that I have always enjoyed blogging, but being a silent blogger provides a sense of freedom.

  • Cookiebot was added to a Kentico 13 site a few weeks ago, resulting in unexpected issues on pages that contained Kentico forms, which led me to believe there was a potential conflict with Kentico Page Builder's client-side files.

    As all Kentico developers are aware, the Page Builder CSS and JavaScript files are required for managing the layout of pages built with widgets, as well as the creation and use of Kentico forms. They consist of:

    • PageBuilderStyles - CSS files declared in the <head> section of the page.
    • PageBuilderScripts - JavaScript files declared before the closing </body> tag.

    In this case, the issue resided with Cookiebot blocking scripts that are generated in code as an extension method or as a Razor Tag Helper.

    <html>
    <body>
        ...
        <!-- Extension Method -->
        @Html.Kentico().PageBuilderScripts()    
        ...
        <!-- Razor Tag Helper -->
        <page-builder-scripts />
        ...
    </body>
    </html>
    

    Depending on the cookie consent given, Kentico forms either failed on submission or did not fulfil a specific action, such as conditional form element visibility or validation.

    The first thing that came to mind was that I needed to configure the Page Builder scripts to be ignored by Cookiebot. Cookiebot shouldn't hinder any key site functionality as long as you have configured the consent options correctly to disable cookie blocking for specific client-side scripts via the data-cookieconsent attribute:

    <script data-cookieconsent="ignore">
        // This JavaScript code will run regardless of cookie consent given.
    </script>
    
    <script data-cookieconsent="preferences, statistics, marketing">
        // This JavaScript code will run if consent is given to one or more of the options set in the "data-cookieconsent" attribute.
    </script>
    

    Of course, it goes without saying that data-cookieconsent should be used sparingly - only in situations where you need the script to execute regardless of consent and have employed alternative ways of ensuring that cookies are only set after consent has been obtained.

    But how can the Page Builder scripts generated by Kentico be modified to include the cookie consent attribute?

    If I am being honest, the approach I have taken to resolve this issue does not sit quite right with me, as I feel there is a better solution out there I just haven't been able to find...

    Inside the _Layout.cshtml file, I added a conditional statement that checks whether the page is in edit mode. If true, the Page Builder scripts render normally using the generated output from the Tag Helper. Otherwise, all the scripts the Tag Helper would generate are output manually with the data-cookieconsent attribute assigned.

    <html>
    <body>
        ... 
        ...
        @if (Context.Kentico().PageBuilder().EditMode)
        {
            <page-builder-scripts />
        }
        else
        {
            <script src="/_content/Kentico.Content.Web.Rcl/Scripts/jquery-3.5.1.js" data-cookieconsent="ignore"></script>
            <script src="/_content/Kentico.Content.Web.Rcl/Scripts/jquery.unobtrusive-ajax.js" data-cookieconsent="ignore"></script>
            <script type="text/javascript" data-cookieconsent="ignore">
                window.kentico = window.kentico || {};
                window.kentico.builder = {};
                window.kentico.builder.useJQuery = true;
            </script>
            <script src="/Content/Bundles/Public/pageComponents.min.js" data-cookieconsent="ignore"></script>
            <script src="/_content/Kentico.Content.Web.Rcl/Content/Bundles/Public/systemFormComponents.min.js" data-cookieconsent="ignore"></script>
        }
    </body>
    </html>
    

    After the modifications were made, all Kentico forms were once again fully functional. However, the main disadvantage of this approach is that issues may arise when new hotfixes or major versions are released, as the hard-coded script references will require checking.

    If anyone can suggest a better approach to integrating a cookie compliance solution or making modifications to the page builder script output, please leave a comment.

    Useful Information

  • Banner Image by: pch.vector on Freepik

    I've been looking out for a side hustle to supplement my monthly stocks and shares investment contribution - trying to make up for lost time in the years I did not invest. As this was my first foray into the world of side hustling, I wanted to ease myself into things, so it was important for it to be flexible enough to work around office/personal hours and not require too much time.

    During the COVID era, I kept note of some side hustles I was planning to try out but never got around to doing so. Forgetfulness also played a part, and I was only reminded when I came across one of my notes from July 2021 stored in Evernote.

    Now was as good a time as any to try out one of them: UserTesting.com.

    What Is UserTesting?

    Usertesting.com provides a platform for businesses to get feedback on their products and services. Anyone can apply to be a contributor and provide feedback that consists of:

    • Accessibility
    • Usability
    • Live conversations with businesses
    • Pre-release platform feature review
    • Competitor benchmarking tests
    • A/B testing to compare different versions of a product or feature

    Before becoming an active contributor, UserTesting requires some basic information as part of the registration process and a practice test to be completed.

    Acing The Practice Test

    UserTesting will provide a test scenario to prove you're a legitimate person and can demonstrate good communication and analytical thinking. It sets a good standard for what is expected when carrying out real tests.

    The test itself is not complicated but you should be prepared to clearly think out loud so there is an understanding of your thought process as you're undertaking various tasks. It's always a good idea before performing a task to read the question out loud so your interpretation of what is being asked is clear. Most importantly, be honest in what you're reviewing.

    At the end of the test, provide a conclusion and thank them for their time and the opportunity.

    The fact that UserTesting.com forces users to take an assessment beforehand demonstrates the credibility of the service and sets the standard for the type of businesses they work with.

    UserTesting will respond to your practice test within 2-3 days, provide feedback and let you know if you will be accepted as a contributor.

    What To Expect From The Real Test?

    After completing the practice test, I didn't get real tests immediately. It took a good couple of weeks for them to start trickling in. Even then, I didn't qualify to take part in some tests as I didn't have experience in the area of expertise.

    Tests are performed on Windows, Mac, Android or iOS devices. There might be a requirement to provide feedback using a specific device. Access to a microphone and sharing your screen is a strict prerequisite. Some do ask for a face recording as well, but I decided to refuse tests that requested this.

    Tests vary in length and payout:

    1. Short tests - $4
    2. 10-20 minute tests - $10
    3. 30-minute test - $30
    4. 60-minute test - $60

    The 60-minute tests will always be a live conversation directly with the business and scheduled in advance.

    The Type of Tests I've Contributed To

    I have been quite lucky with the tests offered to me, as they seem to relate to the tech industry. Providing feedback for businesses such as Microsoft, Salesforce, GitHub, GitLab and Amazon has been insightful.

    Other tests have revolved around the sectors of AI, website accessibility, pre-release platform updates and cloud hosting.

    Payout

    This is the part you have all been waiting for. How much money have I made since starting at the beginning of June?

    Jerry Maguire - Show Me The Money

    I completed twenty tests, consisting mostly of $10 tests, one $60 test and a handful of $4 tests, totalling $232. Each test is paid out within two weeks to your linked PayPal account. Not bad for an ad-hoc side hustle.

    UserTesting.com Payout - August 2024

    Twenty tests over the span of three months is not a lot - my contribution could have been higher. But when you take into consideration that this side hustle is only pursued outside of working hours and that some tests do not apply to my expertise, it's not so bad.

    The majority of tests offered will be worth $10. Some may question whether they're even worth doing, to which I say: Yes! A $10 test can take anywhere between 5-15 minutes to complete on average, and $10 converted to GBP equates to around £7.60. When you compare that to the hourly UK National Minimum Wage of £11.44, it's easy money!
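    To put that in perspective, here is a quick back-of-the-envelope calculation of the effective hourly rate. The exchange rate is an assumption based on the rough $10 ≈ £7.60 conversion quoted above; actual rates fluctuate.

```python
# Back-of-the-envelope check: effective hourly rate of a $10 test vs minimum wage.
# USD_TO_GBP is an assumed rate matching the post's rough $10 -> £7.60 conversion.
USD_TO_GBP = 0.76
UK_MINIMUM_WAGE_GBP = 11.44  # hourly UK National Minimum Wage cited above

def effective_hourly_rate_gbp(payout_usd: float, minutes_taken: float) -> float:
    """Convert a single test payout into an equivalent hourly rate in GBP."""
    payout_gbp = payout_usd * USD_TO_GBP
    return payout_gbp * (60 / minutes_taken)

# A $10 test taking 15 minutes (the slow end of the 5-15 minute range):
print(f"£{effective_hourly_rate_gbp(10, 15):.2f}/hour")  # £30.40/hour
```

    Even at the slowest end of the range, a $10 test works out to well over double the minimum wage.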

    The more you contribute, the higher the chance of being offered more tests, provided your feedback rating is good. There are some damn interesting ones as well.

    Conclusion

    Don't apply to UserTesting with the expectation of mass riches, as you will be sorely disappointed. Think of it as petty cash to count towards a little "fun money".

    Apart from the monetisation aspect of using UserTesting, I feel I am getting an early insight into where certain industry sectors are going, including my own, which is almost as valuable as the payout itself.

    There will be some days or even weeks when there will be no applicable tests. Just stick with it as all it takes is a handful of 30 or 60-minute tests (which can be hard to come by) to get a nice chunk of change for the month.

  • Published on
    -
    4 min read

    Addressing The Lack of Kentico Content

    I spoke to one of my developer friends a while back and as conversations go with someone tech-minded, it's a mixture of talking about code, frameworks, and platforms entwined with the more life-centric catch-up.

    Both having been in the tech industry for over 15 years, we discussed the "old ways" and what we did back then that we don't do now, which led to Kentico - a platform that we used to talk about all the time, where we'd try and push the boundaries to create awesome websites in the hopes of winning the coveted site of the month or year award. It occurred to us that it's not something we talk much about anymore. Almost as if overnight it vanished from our consciousness.

    Looking through the archive of postings, it's evident I haven't published anything Kentico-related in a long time, with my most recent being in September 2020. Despite the lack of Kentico content on my site, it remains a key player in the list of CMS platforms that I work with. The only difference is the share of Kentico projects is smaller when compared to the pre-2020 era.

    In this post, I discuss my thoughts as to the reason behind my lack of Kentico-related output.

    NOTE: This post consists of my viewpoints alone.

    Licensing Cause and Effect

    A contributing factor was the substantial shift in their licensing model sometime in 2020. Moving to an annual subscription at an increased cost and ditching the base license created somewhat of a barrier to entry for small to mid-sized clients who just needed a reliable, customisable CMS platform. Someone like myself, who could provide Kentico solutions in a freelance capacity, was instantly priced out.

    I understand why Kentico needed to reassess its price structure. They offer one of the best .NET CMSs and to stay at the top, an increase in revenue is required to drive the business forward. In all honesty, I believe we had a good run on the old licensing model for over ten years, and it was only a matter of time until a pricing review was required.

    It's just hard to sell a CMS with a £10,000 price tag before any development has even started.

    In light of this, it's only natural to look for alternatives that align with your own business strategy and development needs. The time originally spent developing in Kentico has now been reallocated to alternative CMS platforms.

    A Stable Well-Rounded Platform

    Kentico is a mature product with many out-of-the-box capabilities (that get better with every release), which indirectly contributed to my lack of blogging on the subject. I usually only blog about a platform when I find useful workarounds or discover an issue that I was able to resolve.

    This is truly a compliment and a testament to Kentico's build quality. There is no need to write about something that is already well documented by active members of the community.

    Reassessing The Kentico Offering

    Kentico is still offered whenever possible. Both clients and developers alike have confidence in the platform. Clients enjoy the interface and security. Developers appreciate the customisability, clear architecture, quick hot fixing, and consistency between editions.

    The only question we now have to ask ourselves is whether Kentico is the right platform for the client's requirements. Prior to the change in licensing, you would be scoffed at for asking such a question. Kentico would be the front-runner before considering anything else.

    Nowadays, Kentico would only be put forward to a client with large-scale requirements that cheaper CMS offerings cannot meet, so that the licensing costs can be justified.

    I was recently involved in an e-commerce project that ticked all the boxes in line with the client's priorities, making it an ideal use case for a Kentico build:

    • Enterprise-level security
    • Industry-standard compliance
    • All in one solution consisting of content management, e-commerce, and marketing automation
    • Scalability
    • Ability to handle large sets of data
    • Advanced customisability

    In my view, if a client is not too concerned about the above, then alternatives will be used and additional development will be carried out to fill in any gaps.

    The Alternatives

    The CMS sphere is rife with offerings and we are spoilt for choice. I have whittled these down to:

    1. Umbraco
    2. Kentico
    3. Prismic
    4. Dato
    5. HubSpot

    In my view, this variety of CMSs covers all price points, technologies and levels of customisability.

    Conclusion

    I would always jump at the chance to develop in Kentico, as I know a large, complex website can be built with almost infinite customisation. But we can't help but notice there is a lot of competition out there, each providing a range of features across different architectures and price ranges.

    Based on my own experience, demand for fully featured CMS platforms with a large hosting footprint is declining with the advent of more API-driven (also known as headless) content delivery that works alongside other microservices.

    Investing in the Kentico eco-system (including its headless variant, Kontent) is always worth considering. It may just not be something I will be writing about consistently here as it requires a more corporate-level type of clientele.

  • Published on
    -
    2 min read

    CSS Zen Garden - The Road To Enlightenment

    Who remembers CSS Zen Garden? I do. As if it was yesterday... I remember first setting my sights on a simplistic but visually stunning webpage demonstrating what all websites of the future could look like.

    CSS Zen Garden broke the norm of the websites we were used to seeing at the time - crowded blocky tabular-based layouts that lacked personality. It was a revelation and a turning point in web design standards!

    As described within the content of every CSS Zen design, its ethos was clear:

    The Road To Enlightenment

    Littering a dark and dreary road lay the past relics of browser-specific tags, incompatible DOMs, broken CSS support, and abandoned browsers.

    We must clear the mind of the past. Web enlightenment has been achieved thanks to the tireless efforts of folk like the W3C, WaSP, and the major browser creators.

    The CSS Zen Garden invites you to relax and meditate on the important lessons of the masters. Begin to see with clarity. Learn to use the time-honored techniques in new and invigorating fashion. Become one with the web.

    CSS Zen Garden paved The Road To Enlightenment for me in an entirely different manner - deciding my career.

    Having completed my degree in Information Systems at university in 2006, I was at a crossroads as to which IT field I should specialise in. University seems to prepare you for anything apart from how to apply yourself when you exit the final doors of education into the real world.

    CSS Zen Garden changed the trajectory of my career. Originally, I considered entering the field of consulting, but I then changed my mindset and garnered an interest in becoming a Web Developer instead. This has had a lasting effect. Even after 18 years, I am still involved in Web Development, though I tend to focus more on backend functionality than look and feel.

    The CSS Zen Garden community spawned a variety of other designs from talented Web Developers with real design flair. But the design that started it all, 001, will always hold a special place in my heart, with its unforgettable Japanese elements - the Itsukushima Shrine, water lilies, and a blossom tree.

    All the designs have stood the test of time and even through the age of modern web browsers and high-resolution screens, they still present a timeless look that fills me with nostalgia.

  • Published on
    -
    6 min read

    Year In Review - 2023

    Where Have I Been?

    Out of all the years I have been blogging, 2023 seems to be the year that has just flown by like a speeding train leaving us at the platform of memories before we have had a chance to truly comprehend what we have left behind.

    During the year, I am normally able to stop and take note of what I have accomplished (some end up as blog posts) but then also look towards the horizon for what is next. This year has been different - I was unable to transform my learnings and experiences into words. A combination of limited time and (if I am being honest) the lack of passion to write... I got tired.

    As a result, my blogging output has dwindled - totalling a mere seven posts for the year. This blog is important to me and is not something I will ever plan on abandoning as it is very important to own your words. I just have to be realistic in the sense that I won't be blogging as regularly as I have done so in the past.

    The lack of blog posts is not a reflection of having a lack of things to do.

    2023 In Words/Phrases

    Stocks and Shares, Investments, Upskilling, Hardwired Dashcam, Cornwall, Spirituality, Home Networking, Synology 1821+, Writer's block, Serverless functions, Azure CI, Shopify, Google Analytics 4, First Indian Wedding Anniversary, YouTube Content Curation, Green fingers, Lawn Enthusiast, Garmin Venu 2 Plus, The Flash IMAX, Patio/Garage Door renovation, Sky Garden London

    Stocks and Investments

    Since I began investing in the stock market at the beginning of 2022, it has quickly become a source of enormous interest for me, eventually becoming a passion of mine and a skill I wanted to invest (no pun intended) more time in. - This could be the main reason for the lack of blogging output.

    I decided to take things a step further by going back to basics and taking a six-week course provided by a very clever and patient coach, Vittorio, at StoicMoney. The course consisted of online learning material as well as weekly one-to-one private sessions.

    The course filled in the gaps where I was lacking in understanding crucial areas of the stock market as well as pointing me in the right direction to where my existing investment portfolio could be improved.

    Synology NAS Upgrade

    I am finally the owner of an 8-bay NAS powerhouse, the Synology DS1821+, which is by far the best network-related purchase I have made since the UniFi Dream Machine router.

    Investing in my home network is part of a transition I am currently undertaking to be less reliant on third-party services. By having a NAS, I have been able to reduce my cloud-based subscriptions to just Google Photos.

    My NAS is not just being used by myself, it is also regularly being used by family members - all the more reason to invest in making my home network more reliable and efficient. A NAS will only run as efficiently as the network it's on.

    If the Synology DS1821+ is anything like the DS415Play (still fully functional), I expect it to last a very long time especially since it encompasses more than enough upgradeable options, such as:

    • Capability for up to 8 Hard Disk Drives
    • Upgradeable RAM
    • SSD Cache
    • Upgradeable Ethernet Card

    A Lawn Enthusiast Is Born

    Who could have seen it coming that 2023 would be the year I took an interest in something gardening-related, such as lawn care?

    I can't exactly explain where this obsession for obtaining thick, green, luscious blades of grass came from. Throughout the summer months, I was on my hands and knees creating the perfect foundation for grass to grow by creating a mixture of topsoil and grass seed to sprinkle on bare patches and areas that required thickening.

    I can confirm this will not be a one-off obsession based on how disheartened I felt when I was unable to tend to the lawn as I would have liked during the colder months.

    Where my lawn is now compared to how it was previously is night and day. I was both proud and somewhat surprised such a change could be made in what was a barren wasteland of dirt, turning it into something pleasing to the eye.

    I look forward to springtime next year to make my lawn even better!

    YouTube Video Curation

    My wife is quite the cook. She has the natural art of turning any group of ingredients into something damn tasty! So much so, her vegetarian cooking converted me from a regular meat eater to someone who enjoys eating meat-free alternatives.

    Meat-eater to vegetarian is quite the accomplishment!

    We had been toying with the idea of creating a YouTube channel to showcase her culinary skills for a while and back in June decided to train myself in video recording and editing.

    Together we made a very short test video on a simple subject matter (making Indian Chai) just to test the waters as to whether it's something we could do. The video had to accomplish the following to determine the feasibility:

    1. Record content in good lighting conditions.
    2. Edit the video to a certain length (3 minutes).
    3. Apply narration throughout the video for cooking instructions.
    4. Add background music.
    5. Export the video as a YouTube Short and in its long-form variant.
    6. Overlay a logo.

    I am in no way a savvy video editor as this isn't something I've ever done before. Luckily, I managed to tick off all requirements and output a relatively slick (totally biased here!) video.

    Statistics

    I was hesitant as to whether it would be worth including this section for a number of reasons:

    1. I know for a fact my visitors would have dipped due to lack of content output.
    2. The lack of ability to sufficiently compare 2022 and 2023 due to upgrading to Google Analytics 4 in May.
    3. I haven't been keeping track of any analytics for the majority of the year and have no clue how I will fare.

    However, the resulting stats didn't look too bad...

    2022/2023 Comparison:

    • Users: -7.3%
    • Page Views: -2.7%
    • Search Console Total Clicks: +87.1%
    • Search Console Impressions: +84%
    • Search Console Page Position: +3.3%

    There is an evident drop in users and page views - this is to be expected as I haven't been cranking out relatable content for some time. I wasn't expecting the positive outcome reported by Search Console.

    As stated earlier, the transition to Google Analytics 4 could have skewed the results to some extent, as further filter refinement is required. The true comparison will come in next year's end-of-year review.

    Goals for 2024

    I made a choice in 2022 to make next year's goals somewhat more achievable and less career/programming orientated. This has worked for me and those goals are still relevant to me - even a year on.

    I managed to get back into reading, made more of an effort to pick up the phone to loved ones and to be more present. One area I definitely need to improve on is fitness and exercise!

    I would also like to create a YouTube Channel for my wife's cooking and will set a goal to publish at least one video. I already have a name for the channel, and my wife has a few dishes she would like to showcase. This is out of the norm for me, and I'm quite looking forward to curating some video content.

    Final Thoughts

    My final thought is simply: Be happy. Be healthy. Be productive. Be kind.

Support

If you've found anything on this blog useful, you can buy me a coffee. It's certainly not necessary but much appreciated!

Buy Me A Coffee