Posts written in 2023.

  • Migrating from Synology DS415Play to DS1821+ (8 min read)

    I've owned my first Synology device, the DS415Play, for over eight years. It has been my true day-to-day workhorse, never faltering whilst churning through multiple downloads and uploads 24 hours a day, 7 days a week, since 2015. It's been one of the most reliable pieces of computer-related hardware I've ever owned.

    Unfortunately, I started to outgrow the device from a CPU, RAM and (most importantly!) hard-drive capacity standpoint, and regular restarts became the norm to resolve performance-related issues.

    The performance issues are also caused indirectly by the fact that my NAS isn't solely used by me, but also by my parents and wife, who use it to view and back up photos, store documents and stream videos.

    The most natural route was to stick with Synology and move to one of their larger and expandable NAS devices. The DS1821+ ticked all my requirements:

    • Quad-core 2.2GHz processor
    • 8-bay hard drive capacity
    • Upgradeable RAM
    • NVMe Cache slots
    • Improved scalability

    Focusing on the potential for expansion should mean I won't hit the "hardware glass ceiling" for many years, which was, unfortunately, the case with my DS415Play. I truly believe that if the DS415Play had the potential for some form of expansion, such as increasing the amount of RAM, it would have solved the majority of my usage issues.

    Migrating From One Synology to Another

    DS415Play To DS1821+

    I was under the misconception that migrating from one Synology device to another would be as simple as moving the existing drives over. However, this was not the case due to a variety of factors:

    1. Age: Lack of straightforward compatible migration approaches from older NAS models.
    2. "Value" model discrimination: DS415Play is considered a "value" model and no straightforward migration route is available when upgrading to a "plus" model.
    3. Difference in Package Architecture: If the source and destination NAS use different package architectures, DSM configurations and package settings may be lost after migration. You can only migrate drives to models with the same package architecture as your source NAS.
    4. Direct Ethernet Connection: Data cannot be copied over via a direct connection between both devices.

    The How To Migration tutorial provided by Synology raised more questions about how I should move my data and configuration settings. Out of the three methods proposed (Migration Assistant/HDD Migration/Hyper Backup), there was only one approach that applied to me - Hyper Backup.

    Manual Copy and Paste To USB Drive

    Before settling on Hyper Backup, I decided to carry out a direct copy-and-paste of each user's home directory from the Synology to an external USB drive. I thought this might be the least process-intensive and quickest way to move the data - no Synology app-related overhead that could cause my DS415Play to grind to a halt.

    However, I quickly realised this could come to the detriment of the integrity and overall completeness of the backup. My NAS was still getting used daily and there was a high chance of missing new files and updates.

    With Hyper Backup, I could carry out a full backup initially and then schedule incremental backups nightly until I was ready to make the switch to DS1821+.

    Hyper Backup

    At the time, unbeknownst to me, this would prove to be a right pain. I knew from the start that moving around 5TB of data would be time-consuming but I didn't factor in the additional trial and error investigation time just to complete this task.

    To ensure smooth uninterrupted running, I disabled all photo and file indexing.

    Avoiding Slow Backup Speeds

    The backup procedure wasn't as straightforward as I'd hoped. Early on I experienced very slow backup speeds. This is down to the type of "Local Folder & USB" backup option selected in Hyper Backup. There is a vast difference in transfer speeds:

    • Local Folder & USB (single-version): 10MB/s - 60MB/s
    • Local Folder & USB: 0MB/s - 1.2MB/s, with long gaps of no transfer at all

    To reduce any further overhead, compression and encryption were also disabled.

    Additional steps were also taken, such as reformatting the external hard drive to ext4 format and enabling the "Enable delayed allocation for EXT4" setting from the Control Panel.

    What is delayed allocation?

    All byte writes are first cached in RAM; only when all the writes have finished and the file is closed is the data copied out of the cache and written to the drive.

    The potential disadvantage of enabling this setting is the drive is more vulnerable to data loss in the event of a power outage.

    Make Use of The High-speed USB Port

    Older Synology models have front and rear USB ports. To further aid faster data transfer, be sure to connect the external hard drive to the rear USB port, as this will be USB 3.0 - a faster option than the USB 2.0 port provided at the front.

    Backup Strategy

    Once I had Hyper Backup running in the most efficient way, I created three backup tasks so the restore process could be staggered:

    1. User Home Directories: Everything within the /homes path.
    2. Photos: DS Photo-related files that have yet to be properly migrated over to Synology Photos.
    3. Application Settings*: Settings and configuration for the key apps that I use. This doesn't include any physical files the app manages.

    * Only the "Local Folder & USB" backup type has the option to allow application settings to be solely backed up. Transfer speeds were not a concern as the settings have a very minimal file size.

    Once a full backup was completed, a nightly schedule was set to ensure all backups were up-to-date whilst I waited for some new hard drives for the DS1821+.


    Restoring the backed-up data was a lot more straightforward than the backup process itself. The only delay was waiting for the new hard drives to arrive.

    New Hard Drives

    Due to the limitations posed by the only migration approach applicable to me, new drives had to be purchased. This was an unexpected additional cost as I hoped to re-use the 8TB worth of drives I already had in my DS415Play.

    I decided to invest in larger capacity drives to make the most of the 8 bays now at my disposal. Two 8TB Western Digital Reds were just what was required.

    Setup and Restore Process

    Utilising new hard drives was actually a refreshing way to start getting things going with the DS1821+, as any missteps I made as a new Synology owner when originally setting up the DS415Play could be corrected.

    Once the drives were installed, the following restore process was carried out:

    1. Install DSM 7.1.
    2. Create Drive Storage Pools.
    3. Install applications.
    4. Re-create all user profiles using the same details and usernames.
    5. Using Hyper Backup, copy all files into each home directory.
    6. Ensure each user's home folder and child directories are assigned with the correct permissions and are only accessible by the user account.
    7. Restore the /photo directory.
    8. Log in to the Synology Account in Control Panel and restore all DSM configuration settings from the online backup - minus user profiles.
    9. Restore application settings (backup task number 3) using Hyper Backup.

    It was only after restoring the DSM configuration settings (point 8), that I realised user profiles including permissions could be restored.

    DSM Configuration Backup Items

    • File Sharing: Shared Folder, File Services, User & Group, Domain/LDAP
    • Connectivity: External Access, Network, Security, Terminal & SNMP
    • System: Login Portal, Regional Options, Notification, Update & Restore
    • Services: Application Privileges, Index Service, Task Scheduler

    Over Network File Restoration

    I decided to limit the use of over-network file copying to just the final leg of the restoration journey to handle some of the less important/replaceable files.

    I would only recommend over-network file copying if you have a fast and stable home network. My UniFi Dream Machine was more than capable of handling the transfer of data to the DS1821+.

    What Will Become of The DS415Play?

    There is still life in my old trusty DS415Play, as it can still handle low-intensity tasks where background processes are kept to a minimum. Any form of large-scale file indexing would not be suitable.

    I see the DS415Play being used purely as a network storage device, avoiding the use of Synology apps. For example, a suitable use case could be an off-site backup at my parents' house.

    Final Thoughts

    Even though the migration process wasn't as smooth as I hoped it would be, there was a silver lining:

    • A Considered Setup Approach: As a long-term Synology user, I am more experienced and understand more about the configuration aspects, allowing me to set up my new NAS in a better way.
    • Data Cleanse: When faced with limited migration routes, it makes you question what data is worth moving. I am somewhat of a data hoarder and being able to let go of files I rarely use was refreshing.
    • Storage Pools: I was able to set up Storage Pools and Volumes in a way that would benefit the type of data I was storing. For example, Surveillance Station recordings will write to a single hard disk, rather than constantly writing to multiple disks based on a RAID setup.

    After completing the full migration, the following thoughts crossed my mind: How long will this Synology serve me? When will I have to perform another migration?

    It has taken me eight years to outgrow the DS415Play. The DS1821+ has double the capacity and even more headroom from a specification perspective (thanks to its upgradeability). Maybe 10 to 14 years?

    As someone who has just turned 38, I can't help but feel a sense of melancholy thinking about where I will be after that length of time, and whether the investment in preserving memories on my Synology will truly be the success I hope it will be.

  • While manually importing data into a Google Sheet to complete the boring chore of data restructuring, I wondered if there was any way that the initial import might be automated. After all, it would be much more efficient to link directly to an external platform to populate a spreadsheet.

    Google Apps Script provides a UrlFetchApp service, giving us the ability to make HTTP POST and GET requests against an API endpoint. The following code demonstrates a simple API request to a HubSpot endpoint that will return values from a Country field by performing a GET request with an authorization header.

    function apiFetch() {
      // API endpoint options, including header options.
      var apiOptions = {
        "method": "GET",
        "headers": {
          "Authorization": "Bearer xxx-xxx-xxxxxxx-xxxxxx-xxxxx",
          "cache-control": "no-cache"
        }
      };

      // Fetch contents from the API endpoint (URL omitted here).
      const apiResponse = UrlFetchApp.fetch("", apiOptions);

      // Parse the response as a JSON object.
      const data = JSON.parse(apiResponse.getContentText());

      // Populate "Sheet1" with data from the API.
      if (data !== null && data.options.length > 0) {
        // Select the sheet.
        const activeSheet = SpreadsheetApp.getActiveSpreadsheet();
        const sheet = activeSheet.getSheetByName("Sheet1");

        for (let i = 0; i < data.options.length; i++) {
          const row = data.options[i];
          // Add value cells in the Google Sheet.
          sheet.getRange(i + 1, 1).setValue(row.label);
        }
      }
    }
    When this script is run, a request is made to the API endpoint to return a JSON response containing a list of countries that will populate the active spreadsheet.
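    One tweak worth considering: calling setValue once per row issues a separate write for every cell, which gets slower as the response grows. Below is a minimal sketch of batching the writes instead, assuming the same data.options shape as the example above (toRows is a hypothetical helper name, not part of the Apps Script API):

```javascript
// Convert the API's options array into a 2D array of rows,
// ready for a single batch write via Range.setValues().
function toRows(options) {
  return options.map((option) => [option.label]);
}

// Inside apiFetch(), the per-cell loop would then become:
// const rows = toRows(data.options);
// sheet.getRange(1, 1, rows.length, 1).setValues(rows);
```

    One batch write keeps the round trips to the spreadsheet to a single call, which Google's own best-practice guidance recommends over cell-by-cell writes.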

    The official Google Apps Script documentation provides even more advanced options for configuring the UrlFetchApp service to ensure you are not limited in how you make your API requests.

    With such little code, we have managed to populate a Google Sheet from an external platform. I can see this being useful in a wide variety of use cases to make a Google Sheet more intelligent and reduce manual data entry.

    In the future, I'd be very interested in trying out some AI-related integrations using the ChatGPT API. If I manage to think of an interesting use case, I'd definitely write a follow-up blog post.

  • Working From Home - A Three-Year Update (3 min read)

    It's 8:30pm on 23rd March 2020 and Boris has announced a stay-at-home order effective immediately. Absolutely befuddled at what is about to happen, I receive a call soon after the broadcast from my manager saying we should go to the office and collect any equipment that would allow us to start working from home.

    It was only after returning from the office later that evening that it dawned on me that things had changed. Thoughts about health, family and concerns about my livelihood all of a sudden came into question. Uncertainty of life as we knew it.

    As strange as it may sound, my basic home office setup gave me a sense of focus, purpose and security. I consider myself one of the lucky ones during the pandemic, as I had a room that could act as a dedicated home office.

    The Very First Work From Office Setup

    Three years on, I'm still happily working remotely. The only things that have changed with each passing year since that fateful night are that my office is better equipped and, as of last year, I started making more of an effort at hybrid working by appearing at my place of work a couple of times a week. Being able to choose when to work from the office on my own terms gives me the freedom to break up the week, and I find it refreshing. Best of both worlds!

    The pandemic was an uncontrollable force for change in many people's lives and, in my case, both personally and professionally. When I reflect on the time before the pandemic, I often wonder how I was able to work around a 47-hour week and yet still find time to cook, clean and relax. I can only surmise that this is all we have ever known and what was expected of us. A way of life ingrained in our DNA from the very moment we begin our careers.


    The pace of life has slowed to the point where, for the most part, I can schedule my working hours around my day. I now start my day earlier and undisturbed, allowing me to rip through emails and complete some tasks from the day before, all before my first meeting.

    I still work in a similar fashion to how I would within an office environment, sat at my desk over many hours, but with one main difference - I have more time!

    I’m able to get personal mundane tasks done, such as putting the washing on, prepping healthy food (now completely handled by my wife), or even cleaning the bathroom during my lunch break. These types of tasks help get me away from the desk for small moments of time and be productive in doing so.

    Even though I've moved to a hybrid working pattern, there are stark differences between being at home and in the office. Suffice it to say, I am more productive in a home environment as I'm able to focus on the job at hand without distractions (apart from a few scheduled meetings), which is a blessing as a programmer.

    There just doesn’t seem to be time to work in an office anymore on a full-time basis. Working from home has proved to be a positive change from both a work and personal perspective.

    Minor Downsides

    Working from home gives me a lot of flexibility in how I work. Sometimes too much flexibility can make it hard to know when you can take a break. Some are able to walk away from their desk during lunchtime and stick to the 9 to 5. Unfortunately, I'm not the type of person who can do that.

    I find it quite difficult to set boundaries at home even though I have a dedicated working space. At least when working from the office, the day ends from the moment you leave the building.

    The Future and Sustainability

    Regardless of what employers think of the work-from-home phenomenon, it isn't going anywhere soon. It's what future employees expect. If your job can be done at a desk, does it matter where your desk is?

    Since we've returned to post-pandemic normality, I've become more conscious of how professional I and my surroundings may come across to new clients I talk to on Zoom calls, and I make an active effort to ensure everything is up to par - some clients may interpret disturbances from family members and informal-looking office surroundings as unprofessional.


    Working from home is just a small part in a bigger picture on how the pandemic has changed my life. It was a catalyst of positive change that forced me to reassess my priorities.

    Home really is where the heart is and there is no longer any doubt whether work and family life are able to mix under one roof. I wouldn't have it any other way.

  • There are times when you need to call multiple API endpoints to return different data based on the same data structure. Normally, I'd go down the approach of manually creating multiple Axios GET requests and then inserting the endpoint responses into a single object array. But there is a more concise and readable way to handle such tasks.

    We will be using a product search endpoint to search for different products and consolidate the results into a single array of product objects: /products/search?q=.

    As you can see from the code below, we start off by populating an array with a list of API endpoints, from which a GET request can be carried out for each endpoint in our array. The requests variable contains an array of promises based on each of these GET requests.

    Finally, axios.all() allows us to make multiple HTTP requests to our endpoints altogether. This function can only iterate through a collection of promises. For more information regarding this Axios function, I found the following article very insightful: Using axios.all to make concurrent requests.

    // List all endpoints. (Placeholder URLs - substitute the product
    // search endpoint you are querying.)
    let endpoints = [
      "https://example.com/products/search?q=phone",
      "https://example.com/products/search?q=laptop",
    ];

    // Perform a GET request on all endpoints.
    const requests = endpoints.map((url) => axios.get(url));

    // Loop through the requests and output the data.
    axios.all(requests).then((responses) => {
      let data = [];

      responses.forEach((resp) => {
        // Push each response's products into the single consolidated array.
        data.push(...resp.data.products);
      });

      // Output consolidated array to the page (container id is a placeholder).
      const template = $.templates("#js-product-template");
      const htmlOutput = template.render(data);
      $("#js-output").html(htmlOutput);
    });
    As we're looping through each request, we push the response data to our data array. It is here where we merge all requests together into a single array of objects. To make it a little easier to display the results on the page, I use the jsrender.js templating plugin.
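    As an aside, axios.all is essentially a thin wrapper around the standard Promise.all, so the same consolidation pattern can be sketched with plain promises and no network calls at all. The fake responses below mirror a products-style payload and are purely illustrative:

```javascript
// Simulate two endpoint responses as already-resolved promises.
const fakeRequests = [
  Promise.resolve({ data: { products: [{ title: "Phone A" }] } }),
  Promise.resolve({ data: { products: [{ title: "Laptop B" }] } }),
];

// Promise.all resolves once every request has completed,
// handing back the responses in their original order.
Promise.all(fakeRequests).then((responses) => {
  // Merge every response's products into one flat array.
  const merged = responses.flatMap((resp) => resp.data.products);
  console.log(merged.map((p) => p.title).join(", ")); // Phone A, Laptop B
});
```

    This also means the same code continues to work if you ever drop axios.all in favour of Promise.all, which the Axios project itself recommends for new code.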

    A working demo can be seen on JsFiddle.

  • Hardwiring A Dash Cam Into An Audi A1 (6 min read)

    Purchasing a Dash Cam has been a priority for me since my pride and joy was damaged when I was in an area of London I hate to drive in. The area in question shall not be named, but I deem it as "a world without rules".

    There are many varieties of dash cams on the market that differ in size and features. From the outset, I had the following list of requirements in mind:

    • Small form factor
    • LCD Display
    • Option to add a rear camera
    • GPS
    • High-quality night vision
    • Impact sensor - to automatically start recording if a collision occurs when away from the car

    The one that met all these requirements was the RedTiger 4K Dash Cam Front Rear Camera.

    Hardwire Installation

    All dash cams include a power connection for the 12V cigarette socket, which is absolutely fine if you want a quick setup or are not comfortable delving into a vehicle's fuses. The only disadvantage of this approach is that the 12V socket will always remain occupied and can look slightly untidy. Dangling wires - no thank you.

    I always had in mind that if I were to get a dash cam, I would go down the hardwire route for a more integrated and neater look. To achieve this, a hardwire kit needs to be purchased separately. There are many different varieties out there; it's just a matter of finding the right one with a suitable connector for the dash cam. In my case, a USB-C connector was needed.

    Understanding The Fuse Box

    Going down the hardwiring route can be a little daunting, and it's recommended you do your due diligence by reading your car manual and researching online how to access the fuse box, as well as getting an understanding of what each fuse does. This took longer than the installation process itself.

    For my 2018 Audi A1, I found the following resources useful:

    1. Nextbase - How To Fit A Dash cam
    2. Fusebox Info - Audi A1
    3. Audi A5 / S5 2007 -2018 how to fit dash cam to fuse box
    4. Physical Car Owner Manual - Lookup the fuse section
    5. Car Fuse Guide

    Some of these resources were not specifically tailored to my Audi A1, but they gave me a point of reference on what to look out for during the installation process.

    Accessing The Fuses

    The fuses that need to be accessed are located inside the dashboard. This could be on either the passenger or driver side, depending on the most suitable fuse you wish to "piggy-back" onto.

    When you pop open the side-panel of the dashboard, you'll see something that looks like the following:

    In Dash Fuse Boxes

    IMPORTANT: When referring to the Owner Manual, do not make the same mistake I did and read the fuse box order wrong. My manual illustrated diagrams based on a left-hand drive car, when mine is right-hand drive. This can be confirmed by the fuse box colour order. As you can see from the image of my fuse box (above), the driver's side fuse boxes are ordered (left to right) black, brown and red.

    "Piggy-back" A Fuse and Earth Connection

    The hardwire kit will contain various sized fuse wires that can be used based on the size of fuse you plan on "piggy-backing" off of.

    Fuse Wire Options

    For my install, I'm concentrating on the red fuse box, where regular-sized fuses are present. I opted to connect the fuse wire to slot 2, where the 5A tan fuse is present. The 5A fuse is then slotted onto the fuse wire to "piggy-back" the connection.

    Piggy-backed and Earthed Fuse Wire

    One other thing to point out here is that the connection needs to be earthed to the vehicle chassis. This is done by connecting the earth cable (shaped like a hook) to a metal screw/bolt.

    Fuse Slot Update

    Whilst finishing the write-up of this post, I made a slight amendment to the fuse slot used. I decided to use slot 11 (7.5A Red fuse) instead. Even though the first iteration worked absolutely fine in the weeks post install, I preferred to piggy-back off the "Control unit for information electronics" rather than "ABS Control Unit" for peace of mind.

    Rear Camera

    I wasn't planning on installing the rear camera, as the initial hardwiring required quite a bit of effort. It just happened that I had a few free hours one weekend and thought I'd give it a shot. Connecting the rear camera to the main dash cam unit was the easy part, but finding an inventive way to neatly tuck the wires from the front of the car all the way to the rear took quite some time.

    The end result is pretty cool - if I do say so myself. Now my car has an additional layer of surveillance.

    Rear Dash Cam Unit

    You may have to be inventive as to how you mount the rear camera. The most ideal place would be close to the rear window on the sill. Due to the way my boot door opens, this was not an option, so I instead opted to mount it to the roof.


    The steps I have detailed in this section are based on an approach that worked for me. The information provided does not constitute professional installation advice. I cannot guarantee that the information is always up to date and will work for all vehicles.

    I am not liable for any personal injury that you may suffer, or vehicle damage as a result of partaking in the installation of a dash cam. To do so, will be at your own risk.

    RedTiger 4K Dash Cam Quick Review

    Overall, the RedTiger dash cam is a worthy addition to maintain the security of my car. I can't complain about the camera quality and was very pleased with the resolution output. For example, vehicle number plates are crisp, even from a distance. I was very much surprised at just how well it performed in all conditions - even on poorly-lit roads.

    My only gripe is the lack of a backup battery or onboard memory to save your own settings. Unless the device is hardwired so that power is always provided, these settings will be wiped every time you turn off your car. This is a minor annoyance, and luckily the default settings suffice for my usage.

    The accompanying app is a little clunky, which I was expecting. But it does its job very well, and I was surprised to see how much information is presented when playing back a recording. This specific RedTiger model can connect via WiFi from the app, allowing me to view all recordings and download selected videos directly to my phone.


    Hardwiring a dash cam may come across as a daunting task. My advice is to invest time in the preparation and understand where the fuse boxes are located. Most importantly, during the installation process, take things slow and plan how the wiring will run along the inside of the car.

    I found the whole experience quite rewarding.

    Once you have a dash cam fitted, you'd wonder how you ever lived without it.

  • Stack Overflow Trigger-Happy Downvoting (4 min read)

    Stack Overflow is known for being one of the best resources for developers to get answers to their questions. It truly is a treasure trove of information where you're guaranteed to get an answer. But I find asking a question can be a little trying at times, and this is coming from someone who has been a Stack Overflow user for around 11 years.

    I wrote my last Stack Overflow question (now deleted) in the middle of last year, and was left with a feeling of ineptitude after, within a matter of 15 minutes, my question was hit with downvotes and no clear explanation. As I stated above, I'm not a Stack Overflow novice; I understand how to ask a question and adhere to its strict guidelines.

    I don't think this downvoting has any correlation with the platform's historically reported toxic nature. I can see Stack Overflow has made active steps to be more welcoming and inclusive to developers of all experience levels, so they won't experience what I previously called "Question Assassination", where a question is posted and some (if not all) of the following could be experienced:

    • Downvoting to oblivion
    • Comments linking to "How to ask a question"
    • Marked as low-quality
    • Comments with a hint of condescending attitude referencing the inadequacy of the question

    There have been fewer of these negative interactions over the years, and the Stack Overflow community seems to have turned a corner. But "trigger-happy" downvoting still runs rife in places.

    On a personal level, I've received downvotes from questions where I didn’t fully grasp some of the core concepts of what I was asking. - Something that can happen to anyone who is learning something new. You would think this is what a Q&A platform is for. For some well-seasoned Stack Overflow contributors, there seems to be an expectation for a set level of understanding of the question you ask, which can all be very subjective.

    At times it feels like questions are downvoted before they even have a chance to properly be nurtured.

    Downvoting without any form of context should not be allowed; it should be compulsory to add a comment explaining why a question is not up to par. This in turn would provide guidance on what improvements need to be made to make the question more adequate - as long as the comment doesn't pass judgement on why the question was asked in the first place. For example:

    I downvoted because these "How can I do task A without using obvious language features X, Y, or Z?" questions are silly puzzles which have no place in serious programming, and essentially no value for future readers.

    A comment like that would fall under my "Question Assassination" criteria. This type of criticism is not constructive to the individual asking the question.

    When downvotes are accumulated in such quick-fire fashion, the first inclination is to delete the question so lost reputation points can be regained and a clean track record kept on your profile. I have done this very thing in the past, as I fear such questions display ineptitude.

    But maybe I should be looking at downvotes from a different angle...

    I believe that developers like myself need to develop a thicker skin and not be too concerned with the number of downvotes received, instead viewing it as a learning experience in order to grow and become better. When we ask a question, we have just as much of a role to play in nurturing it to make it better from the negative responses, and given enough time, a question may make a turnaround where the number of upvotes negates the number of downvotes.

    Guide To Asking A Question

    Everyone needs to remember Stack Overflow is a very different type of forum, where serious time needs to be invested into writing a concise question. No stone should be left unturned when it comes to providing what has been attempted and context to what is expected from the successful execution of problem code.

    Hopefully, there will be less chance of receiving negative feedback if these steps are followed.

    1) Has The Question Already Been Answered?

    There is a high likelihood that your question, to some degree, has already been asked before. Stack Overflow provides a list of similar posts as you write the post title. Do not ignore these posts; give them a thorough read as they may help.

    If none of the previously written posts fits your use case, you have a basis to post your question. To ensure the question does not get marked as a duplicate, it is a good idea to refer to the questions you have already read and explain how your scenario differs.

    2) Spelling and Grammar

    Proper grammar and correct spelling should be used to ensure readability and understanding for a wider audience.

    3) Formatting

    Moderators will not look kindly on a poorly formatted post. Most importantly, a badly formatted post does a disservice in getting a prompt answer to your question. Use code blocks, lists and headings for easier reading.

    4) Examples, Examples, Examples

    Provide as many code samples as you can that demonstrate what you've attempted in order to resolve the issue. Without sufficient code samples, it's difficult for someone to assist. As stated in the "Formatting" section, ensure code blocks are used.

    Final Thoughts

    I have come to the conclusion that asking a question on Stack Overflow is unlike any other Q&A forum, where any type of question is openly accepted for what it is with minimal guidelines. I can't fault the tight grip its moderators keep in attempting to retain the quality of both questions and answers. However, I do think there is a persistent issue with how a question is nurtured, regardless of the quality it is perceived to have.

    Stack Overflow is run by a passionate programming community of experts who set the bar high in what is expected when a question is posted and there is a lot to learn from them if done correctly - free of rudeness and demeaning responses.

    As painful as it is to be downvoted, I'll try to be a little optimistic and not take it personally as it's the type of constructive criticism you won't get anywhere else and just might make you a better developer.