Automatically Backing Up Plesk Data To Synology

In light of my hosting issues over the last week, I decided it was time to take measures to ensure all websites under my hosting provider are always backed up automatically. I generally take hosting backups offsite on an ad-hoc basis and entrust the hosting provider to keep up their end of the bargain by doing the same on my behalf.

If you are with a hosting provider (like I was previously - A2 Hosting) who talks the talk but can't actually walk the walk when it comes to the service they offer, you will more than likely end up having backup woes. It's always best practice to take control of your own backups, and if this can be automated, it makes life so much easier!

All Plesk panels have a "Backup Manager" area where you can run manual or scheduled backup processes. Depending on your hosting provider, the features shown in this area may vary. Some have the option to back up straight to your Dropbox account. What we will be focusing on is remotely backing up our website data to a Synology NAS using FTP.

Before we log into Plesk to select our Remote Backup option, we need to carry out some setup on our Synology.

Port Forwarding for FTP and FTPS Protocols

Most likely, your router will have only a limited number of ports open to allow outside internet traffic into the local network. To make the most of your Synology, there is a recommended list of ports you need to open to use all its services.

We are interested in opening the following ports:

  • FTP: 21
  • FTPS: 990

I prefer to send any data over FTPS for better security.

You will have to log in to your router settings to open these ports. I would provide some instructions on how to do this, but every router is different - I only just managed to find these settings hidden away in my own Billion router.
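
Once the ports are open, you can quickly confirm they are reachable from outside your network before involving Plesk. Below is a minimal C# sketch for illustration purposes only - the DDNS hostname is a placeholder and should be swapped for your own Synology address:

using System;
using System.Net.Sockets;

class PortCheck
{
    static void Main()
    {
        // Placeholder hostname - replace with your own Synology DDNS address.
        string host = "mysynology.synology.me";

        foreach (int port in new[] { 21, 990 })
        {
            using (var client = new TcpClient())
            {
                try
                {
                    // A successful TCP connection means the port is forwarded correctly.
                    client.Connect(host, port);
                    Console.WriteLine($"Port {port}: open");
                }
                catch (SocketException)
                {
                    Console.WriteLine($"Port {port}: closed or filtered");
                }
            }
        }
    }
}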

Synology Setup

Setting up FTP is pretty straightforward. Just make sure you have administrative privileges to access the Control Panel.

Enable FTP

In Control Panel, go to: File Services > FTP tab.

All we need to do here is to enable two FTP settings:

  • Enable FTP service (no encryption)
  • Enable FTP SSL/TLS encryption (FTPS)

Synology Control Panel - Enable FTP

The reason I selected the "Enable FTP service (no encryption)" option is purely for initial testing purposes. If there are any issues when making a connection from a new service for the first time, I like to check whether a successful connection can be made via standard FTP. Once my testing is done, I disable this option.
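
If you prefer to script this connection test, something along the following lines would do the job. This is just a hedged sketch rather than code from my actual setup - the hostname and credentials are placeholders, and bear in mind that FtpWebRequest only supports explicit FTPS (AUTH TLS over port 21), not implicit FTPS on port 990:

using System;
using System.Net;

class FtpConnectionTest
{
    static void Main()
    {
        // Placeholder hostname and credentials.
        var request = (FtpWebRequest)WebRequest.Create("ftp://mysynology.synology.me/");
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.Credentials = new NetworkCredential("plesk-backup", "your-password");

        // true = explicit FTPS (AUTH TLS); set to false to test plain FTP first.
        request.EnableSsl = true;

        using (var response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine($"Connected: {response.StatusDescription}");
        }
    }
}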

Create a Synology User

I prefer to create a new user specifically for FTP connections rather than use my own main account, as I can lock access down to read and write permissions within its home directory only. No Synology services or applications will be accessible.

Synology FTP User Permissions

The only application I allow my user to access is "FTP".

Synology FTP User Application Permissions

Plesk Backup Manager Setup

FTP Configuration

In Backup Manager, go to "Remote Storage Settings" and select "FTP". Enter the following settings along with your user credentials:

  • FTP server hostname or IP: mysynology.synology.me
  • Directory for backup files storage: /home
  • Use passive mode: Yes
  • Use FTPS: Yes

Clicking the "OK" or "Apply" button should return no errors. If there are errors, check the logs and ensure you haven't missed any permissions for your Synology user.

Set Backup Schedule

Now that we have set our remote storage settings, we need to put a schedule in place to generate backups as often as we require. How regularly you back up is entirely up to you. I've set mine to run daily at 11pm and to retain these backups for a month.

Plesk Scheduled Backup

Make sure you set the Backup settings to store the backup in your newly created FTP storage.

A Word To The Wise

Just because we now have automatic backups running, protecting us from any unforeseen hosting issues, doesn't mean we're in the clear. Backups are useless to us if they don't work. I check my backups at least once a week to ensure the most recently backed up file is free from corruption and can be opened.

A2 Hosting - Can Any Hosting Provider Be Trusted?

It's been a turbulent few days at the house of A2 Hosting, where not only all their Windows hosting but also a number of their WordPress hosting services (as of 23rd April) have come to a standstill. After much pressing by its customers, it has come to light that a malware-related security breach caused an outage, not just in one service, but across much of A2 Hosting's infrastructure.

It's now been 3 days and counting, and the outage still persists. Luckily, I managed to move back to my old hosting provider after patiently waiting 2 days for some form of recovery, and I'm glad I did! I truly feel sorry for the many others who are still waiting on some form of resolution. I think I managed to get out from under A2 Hosting relatively unscathed.

This whole outage has caused me to reflect not only on my time with A2 Hosting but also on hosting providers in general.

The Lies

If I'm honest, the days were counting down after getting infuriated by their support (or lack of it!) and the lies told by their marketing and sales teams about meeting my relatively simple hosting needs. I like to think I'm very scrupulous when it comes to hosting and do my due diligence... In this case, A2 managed to get one over on me in that department!

I run a couple of sites on Kentico CMS, so it was important to find a hosting company that caters for this platform, due to the hardware resources required to run it.

Lo and behold...

A2 Hosting - Best Kentico Hosting

That page alone filled me with confidence: a reasonable price with a lot of extras thrown in. I confirmed this was the case by talking at length to the A2 sales team beforehand and was assured any tier would meet my needs. So I opted for the mid-tier plan - Swift - costing around £125 for 2 years after some nice promotional offers.

Knowing what I know now, I can report that the Swift plan, and potentially all the other shared plans, does not fit the requirements of even a reasonably small Kentico site. Hosting Kentico on A2 Hosting was the bane of my life: every so often my site would randomly time out, with only one explanation from their support team:

We suggest you optimize your website with help from your web developer to fix the issue.

After politely requesting more information on the issue, and even entertaining the idea that I might need to upgrade my hosting, I never did get an adequate reason. The efficiency of my website was always to blame.

Don't Believe Them, Don't Trust Them

Lack of Transparency

In light of recent events, transparency isn't one of A2 Hosting's strengths (unless pressed by its many customers). When problems arise, I'd prefer to know exactly what the root cause is. Knowing this actually puts more confidence in a hosting provider. I think we all know the feeling when we're not given the full picture.

Our minds have a habit of thinking of a worst-case scenario when we do not have the full picture.

Honesty is the best policy!

A2 Hosting Tweet - Transparency
(Example of A2 Hosting Lack of Transparency)

99.9% Uptime Promise

In reality, I don't expect 99.9% uptime from hosting providers, as things do happen due to unforeseen circumstances. But I still expect the 98-99% range.

A2 Hosting - 99.9% Uptime

Judging by my uptime monitoring, I was never blessed with 99.9% uptime during my tenure at A2 Hosting (1 year into a 2-year plan). My site always encountered timeouts and downtime. The last major outage was around 2 months ago, amounting to 24 hours of downtime!

Trusting Your Hosting Provider

Whether your website is big or small, handing over your online presence to a third party is a big deal. You are wholeheartedly trusting a company to house your website with tender loving care. Any downtime or slow loading times can negatively impact your client base and SEO.

I've learnt that a hosting provider can have many 5-star reviews and still lack the infrastructure and support to back them up. In fact, this is what perplexed me about A2 Hosting's many positive reviews.

Quality, appropriately priced hosting is very difficult to find. There are so many options, but the hosting industry has the classic problem of quantity over quality.

Backups

Regardless of how good any hosting company is, I would always recommend taking suitable measures to regularly carry out offsite backups of all your sites. Yes, this can be a laborious task if you are managing many sites, but it's the only way to be 100% sure you are in control.

This was the only way I was able to move swiftly back to SoftSys Hosting rather than wait on A2 Hosting to restore their services. At one point, there was even a question mark over the state A2 Hosting's own backups were in.

Tweet - A2 Hosting Backup
(A2 Hosting Questionable Backups)

Moving Back To Previous Hosting

Believe it or not, I can't remember the exact reason why I left Softsys Hosting. After all, I never had any issues with them throughout the 9 years I was with them. A very accommodating bunch of guys! I think what attracted me to A2 Hosting was their shiny website, the promise of faster load times and the option to have my site hosted on UK servers.

It's always an absolute pain having to move and set everything up again. But thanks to Ruchir at Softsys Hosting, who was very attentive in helping me through my predicament and answering all my queries, I managed to achieve a quick turnaround. In total, my site was only down for just under 2 days.

It seems quite apt that I've come back to the hosting provider I call home for the same reasons I started using them in the first place back in 2009, when I was failed by my first ever hosting provider (Ultima Hosts). Oh, the irony!

Conclusion

Unfortunately, there isn't an exact science to finding the ideal hosting provider for your budget and requirements. If you ever have any qualms about your current hosting provider, you might have good reason to. Hosting should be worry and hassle free, knowing that your data is in the hands of capable people. If you have the finances to move, just do it. Hardware can be replaced; data cannot. Data is a commodity!

Take online reviews with a pinch of salt. Instead, take a look at existing users' responses through a provider's main Twitter account. Some might even have status pages. This will hopefully give you a more unbiased view of their operation and their approach to resolving past issues.


Update - 26/04/2019

I have asked A2 Hosting for some form of compensation, especially since I purchased 2 years up front, and I am awaiting their response on the exact amount. I am hoping they will add some additional compensation as a goodwill gesture for misleading me over their Kentico hosting offering.

Update - 27/04/2019

As of 27/04/2019 8pm (GMT), I managed to log back into the A2 Hosting Plesk administration to take a more recent backup of my hosting. I noticed there were some database errors in the process.

Update - 01/05/2019

Not looking good. I think there is a very slim chance of getting any form of reimbursement from A2 Hosting, as they have decided to delete my support ticket. Not "close", but actually delete. I thought this was probably a mistake, but after delving into the mass of responses from many other unhappy users, it seems I am not the only one.

Tweet - A2 Hosting Deleting Tickets

One can only assume that A2 Hosting is washing its hands of any form of user correspondence. There haven't been any further substantial updates or timescales for when services will resume. I am still awaiting the ability to carry out a proper backup.

Using Cloudflare With Kentico - Purging Cached Media Files

This month I've been writing some blog posts on why I decided to start using the Cloudflare service for my website and on utilising its API to purge cached files from the Cloudflare CDN on demand. Before reading further, I highly suggest perusing those posts to put everything into context: my reasoning for using Cloudflare, as well as the C# code that interacts with the API, which I will be referencing later within this very post.

My initial Cloudflare integration revolves around serving media files more efficiently through a CDN and having the ability to refresh these files automatically as updates are made within the Kentico CMS. Cloudflare's CDN can cache your content across their large global network, moving static files closer to your visitors.

Based on the Page Rules I configured within the Cloudflare dashboard, I am caching all media library files served through the /getmedia/ URL path into the Cloudflare CDN. The same file will be served through the CDN until the set cache limit has expired. We need to implement functionality that will add some automation to the Kentico platform to purge the cache of a specific media library file when updated.

Add A Global Event

I created an event handler for the updating of media library files, leveraging the MediaFileInfo class to access the Update.After event and get details of the file being updated.

// Global event registration lives in a custom module class (one that
// inherits CMS.DataEngine.Module), which is where Kentico expects
// handlers like this to be wired up.
protected override void OnInit()
{
    base.OnInit();

    // Fire after a media library file has been updated.
    MediaFileInfo.TYPEINFO.Events.Update.After += Update_After;
}

private void Update_After(object sender, ObjectEventArgs e)
{
    MediaFileInfo fileInfo = e.Object as MediaFileInfo;

    // Guard against the event being raised with an unexpected object type.
    if (fileInfo != null)
        GlobalEventFunctions.PurgeMediaCache(fileInfo);
}

PurgeMediaCache() Method

The event above calls a GlobalEventFunctions.PurgeMediaCache() method, passing details of the changed file ready for purging. The file URL passed to the Cloudflare.PurgeSelectedFiles() method needs to be exact and take into consideration how your instance of Kentico is serving media files. If Permanent URLs are being used, the /getmedia/ URL needs to be constructed from:

  • Current domain
  • File GUID
  • File Name
  • File Extension

Otherwise, we can just use the normal file path to where the media file resides.

public class GlobalEventFunctions
{
    /// <summary>
    /// Purges a file from the Cloudflare cache.
    /// </summary>
    /// <param name="fileInfo">Details of the media file that was updated.</param>
    public static void PurgeMediaCache(MediaFileInfo fileInfo)
    {
        bool permanentURLEnabled = SettingsKeyInfoProvider.GetBoolValue($"{SiteContext.CurrentSiteName}.CMSMediaUsePermanentURLs");
        string filePath = string.Empty;
            
        if (permanentURLEnabled)
            filePath = $"{GetCurrentDomain()}/getmedia/{fileInfo.FileGUID.ToString()}/{fileInfo.FileName}{fileInfo.FileExtension}";
        else
            filePath = $"{GetCurrentDomain()}/{fileInfo.FilePath}";

        try
        {
            // Get code from: https://www.surinderbhomra.com/Blog/Post/2018/11/11/Cloudflare-API-Purge-Files-By-URL-In-C
            CloudflareCacheHelper cloudflareHelper = new CloudflareCacheHelper();

            cloudflareHelper.PurgeSelectedFiles(new List<string> { filePath });
        }
        catch (Exception ex)
        {
            EventLogProvider.LogException("Cloudflare Purge File Cache", "CLOUDFLARE_PURGE", ex, SiteContext.CurrentSiteID, $"Purge File: {filePath}");
        }
    }

    /// <summary>
    /// Get domain from current http context.
    /// </summary>
    /// <returns></returns>
    private static string GetCurrentDomain()
    {
        return $"{HttpContext.Current.Request.Url.Scheme}{Uri.SchemeDelimiter}{HttpContext.Current.Request.Url.Host}{(!HttpContext.Current.Request.Url.IsDefaultPort ? $":{HttpContext.Current.Request.Url.Port}" : null)}";
    }
}

We need not consider any other scenarios, such as insertion or deletion. If a file is inserted, there is nothing to purge: it's a new file that will be cached in the CDN on first request. When it comes to deletion, we can simply wait for the cache to expire.

What's Next?

The integration I have detailed so far only scratches the surface of what Cloudflare has to offer, and I will investigate pushing more content over to the CDN. One area in particular I am looking into is full page caching. You might be wondering why I'd even bother, as Kentico already has pretty good caching mechanisms in place.

Well, Cloudflare has a really neat feature called "Always Online", where a cached version of a page is served if, on the off chance, your site happens to go down or requires a reboot to install key security updates. But implementing this feature requires strict Page Rules to be set up within the Cloudflare dashboard to ensure the general workings of Kentico are not affected.

My Reasons for Using Cloudflare With Kentico

A couple of days ago my website got absolutely hammered by a wave of constant SQL injection attacks from the same IP over a period of a couple of hours.

I only managed to notice this whilst perusing the Event Log within the Kentico administration interface. I don't normally check my own error logs as regularly as I should, but since my site has recently gone through a bit of a revamp (which I'm still yet to post about), I wanted to ensure I hadn't broken anything.

To be honest, I am flattered that someone would think this site is worth the time and energy to hack. Trust me, it ain't worth it.

Even though Kentico has handled these attacks well, as a precaution I wanted to implement an additional layer of security before any further untoward activity reaches my site. Being on a shared hosting platform, my options are limited and my hands are tied when it comes to putting in infrastructure that doesn't cost the world.

Enter Cloudflare

I have always had some awareness of the Cloudflare content delivery network, but had never put the service into practice. At one point I was looking into utilising Cloudflare to serve all my site's media files over their CDN. But Cloudflare isn't just a CDN; it offers much more:

  • Analytics - monitor traffic as well as caching ratio and more!
  • Firewall - manage access by IP, country, or query rules.
  • Rate limiting - protect your site or API from malicious traffic by blocking client IP addresses that hit a URL pattern and exceed a threshold you define.
  • Page rules - to allow caching to be triggered by a number of rules targeting specific areas of a site.

The great thing is that these options are part of the free plan... even though there are restrictions on the number of settings you are able to put in place. The security options alone were reason enough to try out Cloudflare, and for my needs the free plan seemed to tick all the boxes. I am just scratching the surface of what Cloudflare has to offer and have already got a few of the features working alongside Kentico.

How I'm Using Cloudflare With Kentico

As security was my main concern, one of the first things I did was add some rules through the firewall to block suspicious traffic. Naturally, the next step was to take advantage of the CDN capabilities. I wouldn't recommend full site caching unless you have some pretty strict page rules in place, as this has a chance of causing issues with the Kentico admin interface. By default, Cloudflare doesn't cache HTML pages, which is a good thing as it gives us a blank canvas to target the areas we want to cache.

Being on the free plan, I only had three page rules at my disposal and made the decision to cache the following parts of my site.

Media Library Files (using Permanent URLs)
  • Regex rule: *surinderbhomra.com/getmedia/*
  • Cache Level: Cache Everything
  • Edge Cache TTL: A month
  • Browser Cache TTL: 16 days

Site CSS, JavaScript and Images
  • Regex rule: *surinderbhomra.com/resources/*
  • Cache Level: Cache Everything
  • Edge Cache TTL: A month
  • Browser Cache TTL: 16 days

ScriptResource.axd/WebResource.axd
  • Regex rule: *surinderbhomra.com/*.axd
  • Cache Level: Cache Everything
  • Edge Cache TTL: 7 days
  • Browser Cache TTL: 7 days

Two cache types are used:

  1. Edge Cache TTL - controls how long Cloudflare's edge servers will cache a resource before requesting a fresh copy from your server.
  2. Browser Cache TTL - the length of time Cloudflare instructs a visitor's browser to cache a resource. Until this expires, the browser will load the resource from its local cache, speeding up the request.

I am caching my assets for a considerable amount of time, which begs the question: how will changes to files be purged from the cache in Cloudflare? Luckily, Cloudflare has quite a nice API, giving me the ability to purge everything or individual files (a maximum of 30 per request).
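
To give a flavour of what that API call looks like, here is a bare-bones sketch of the Purge Files by URL request using HttpClient. The zone ID, email address and API key are placeholders, and this deliberately bypasses my CloudflareCacheHelper class purely to show the raw request:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CloudflarePurgeSketch
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Placeholder credentials, taken from the Cloudflare dashboard.
            client.DefaultRequestHeaders.Add("X-Auth-Email", "user@example.com");
            client.DefaultRequestHeaders.Add("X-Auth-Key", "your-api-key");

            // Up to 30 file URLs can be purged in a single request.
            string json = "{\"files\":[\"https://www.surinderbhomra.com/resources/site.css\"]}";

            HttpResponseMessage response = await client.PostAsync(
                "https://api.cloudflare.com/client/v4/zones/your-zone-id/purge_cache",
                new StringContent(json, Encoding.UTF8, "application/json"));

            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}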

Purge Media Library File Cache

I am in the middle of testing a GlobalEvent handler that will carry out the following steps when files are inserted or updated:

  • Get path of the file.
  • Check if the site is using Permanent URLs, as the URL to the file will be constructed differently if this is enabled.
  • Convert the relative path to an absolute path.
  • Pass the absolute file path to Cloudflare using the Purge Files by URL API endpoint.

Once I have carried out some further tests, I will be posting this code in a follow-up post.

Purge Site File Cache

Attempting to purge the cache for site files such as JavaScript, CSS and images is a little more tricky, as I need to keep an eye on which files have changed. The easiest thing I could do is write some code that iterates through all the files in the /resources folder and purges everything from the CDN. Not the most elegant solution, and I still need to ponder the correct implementation. If anyone has any better suggestions, let me know.
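
In the meantime, here is a rough sketch of that brute-force approach. It assumes the site's static assets live under a /resources folder on disk and reuses the PurgeSelectedFiles() method from my CloudflareCacheHelper class, batching URLs to respect the 30-file limit per request:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class ResourcePurgeSketch
{
    public static void PurgeResourceFiles(string webRootPath, string domain)
    {
        string resourcesPath = Path.Combine(webRootPath, "resources");

        // Build the public URL for every file under /resources.
        List<string> fileUrls = Directory
            .EnumerateFiles(resourcesPath, "*.*", SearchOption.AllDirectories)
            .Select(f => $"{domain}/resources/{f.Substring(resourcesPath.Length + 1).Replace('\\', '/')}")
            .ToList();

        CloudflareCacheHelper cloudflareHelper = new CloudflareCacheHelper();

        // The Purge Files by URL endpoint accepts a maximum of 30 URLs per request.
        for (int i = 0; i < fileUrls.Count; i += 30)
        {
            cloudflareHelper.PurgeSelectedFiles(fileUrls.Skip(i).Take(30).ToList());
        }
    }
}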

How Do I Know Cloudflare Is Caching My Site Files?

It does take a little time for Cloudflare to cache all the files on a site. It all depends on pages being viewed: a page needs to be loaded in order to submit its contents to the cache. On first load, the CF-Cache-Status response header will return "MISS", which means the content has not been served from Cloudflare.

Cloudflare Response - MISS

However, when you go back to the page and re-check the response headers, CF-Cache-Status should return "HIT". If this is not the case, check your Page Rules within the Cloudflare dashboard.

Cloudflare Response - HIT
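
If you'd rather not keep opening browser developer tools, the header can also be checked programmatically. Another small hedged sketch - the URL is a placeholder for any asset covered by your Page Rules:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class CacheStatusCheck
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // Placeholder URL - use any file matched by your Page Rules.
            HttpResponseMessage response =
                await client.GetAsync("https://www.surinderbhomra.com/resources/site.css");

            // Expect "MISS" on the first request and "HIT" once cached.
            Console.WriteLine(response.Headers.TryGetValues("CF-Cache-Status", out var values)
                ? string.Join(",", values)
                : "Header not present - check your Page Rules.");
        }
    }
}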

Is Cloudflare Worth It?

Quick answer - Yes!

Setup is very straightforward. All that is required is to update your domain's nameservers to point to the ones Cloudflare assigns to you. There is no downtime in doing this. As a result, overall site performance has improved and page speed tests have fared much better.

For transparency, here are some statistics straight from my Cloudflare portal over a 24-hour period:

Requests Through Cloudflare
(Understandably, the number of cached items served through Cloudflare is low due to only caching specific areas)

Cloudflare Performance
(My basic shared hosting should now be performing better, as fewer requests are being served via the origin server)

Cloudflare Detected Threats
(Blocked threats - the reason why I decided to give Cloudflare a try in the first place)

I still have to carry out a lot more research in using Cloudflare to its full potential and will use my website as a test bed to see what I can achieve. My end goal is to make my website quicker and more secure!

5 Hours Of GoDaddy Pro Is More Than Enough For Me

Go Daddy Logo

There's a saying that goes along the lines of: if it ain't broke, don't fix it. But as human beings, we're a very inquisitive bunch and like to delve into the dark abyss of the unknown just to see what's out there. This is exactly what I did when I decided to move hosting providers.

I've hosted my site very happily on SoftSys Hosting for many years, ever since UltimaHosts decided to delete my site with no backup to restore. When a hosting company does that, you know it's time to move on - no matter how good they say they are. So if I have no issues with my current hosting provider, why would I ever decide to move? After all, my site uptime has been better than it has ever been and the support is top notch!

Why Make The Move To Go Daddy Pro?

Well, for starters, the cost of hosting was not something to be sniffed at: based on my current site usage, the deluxe package (priced at £3.99 per month) on top of a promo code was an absolute bargain! Secondly, there's the infrastructure, which consists of real-time performance and uptime monitoring (powered by NodePing) and high-spec servers Go Daddy describes as:

Hosting isn’t about having the best hardware, but it sure doesn’t hurt. Our servers are powered by Intel Xeon processors for heavy lifting, blazing-fast DDR 3 memory for low latency, and high speed, redundant storage drives for blistering page load times and reliability.

Currently, my server is based in the US, and I wanted to house my website on servers nearer to home in the UK, which unfortunately my existing shared hosting package does not offer. Even though it doesn't look it, my website is quite large, and there are benefits to having hosting in the same locale as where you live, such as faster upload and download speeds when updating a website via FTP.

Thirdly, Go Daddy puts my domain and website under a single dashboard login with a clean, manageable interface, and gives me the ability to self-administer an SSL certificate without paying the hosting company to install it on my behalf.

Lastly, there's the 24/7 support, accessible through either email or online chat, where GoDaddy Pro boasts low waiting times and highly specialised support that can be escalated to resolve problems quickly. But from what I experienced, it was at this very point that they failed.

Go Daddy Pro Advanced Technical Support

The Extremely Painful Support Experience

I kid you not, it made me want to smack myself over the head repeatedly with my MacBook Pro. My experience with GoDaddy Pro's "specialised support" personnel was a painful one. For explanation purposes, I will call the person I spoke to Gary.

After uploading files to my GoDaddy Pro account, I carried out all the initial setup with ease and was near enough ready to go. However, I was missing one critical piece of the puzzle, one that would demonstrate the low technical acumen of GoDaddy Pro support and lead to my departure from their services. The critical piece being: the ability to upload and restore a backup of an MS SQL Server database.

Now, the SQL Server database backup in question is around 150MB, and I could not simply access the online MS SQL portal to upload it, since the URL to the portal uses a subdomain (if I remember correctly) based on your own live website domain. This was a problem for me purely because I had not pointed the domain over to Go Daddy's servers. If I had, the consequence would have been my current site going down. Not an option! I required a temporary generated domain that would allow me not only to access the MS SQL portal but also to test my site before re-pointing my domain.

My query was quite a simple one, so I opened up GoDaddy's support chat window and Gary came online to help. After much discussion, he did not seem to understand why I'd want to do this and never managed to resolve the access issue to the MS SQL portal. All Gary did was send me links to documentation, even though I told him I had read the support literature beforehand and there was nothing relating to my original query.

He then suggested that I log into the database through Microsoft Management Studio and go through the process of restoring my backup directly from my computer to their database. As we all know (unfortunately, Gary didn't), you cannot upload and restore a database backup directly from your own computer: the backup file needs to be on the database server itself. Even after I told him I am a Microsoft .NET developer and that this was not possible, Gary kept telling me I was wrong. This went on for quite some time and got to the point where I just couldn't be bothered anymore; any motivation I had for moving hosting providers dissolved in that very instant.

On one hand, I cannot fault Go Daddy's support response time - it is pretty much instant. On the other hand, the quality of support I received is questionable to say the least.

Outcome

I decided to stay with SoftSys Hosting based on the fact that I haven't had any issues with them, and any queries I've ever had were dealt with professionally and promptly. I think it's quite difficult to see how good you have it until you try something less adequate. Their prompt support exceeded my expectations when I requested an SSL certificate to be installed on a Sunday evening, and to my surprise, it was all done when I got up the next morning. Now that's what I call service!

If I can say anything positive about my GoDaddy experience, it is the money-back promise within 30 days of purchase. I had no problems getting a full refund swiftly. I spoke to a friendly customer service fellow over the phone, explained why I wanted to leave, and my refund was processed the next working day.

I just wish I could get those 5 wasted hours of my life back...

Back Up and Running!!!

Crying Man

Last week my blog was offline due to an unfortunate mishap. I won’t go into the details on what happened. I’d rather just forget.

After a lot of hard work, sweat and tears, my blog is almost back to its former glory. I was lucky enough to find a backup that was made a few months ago using BlogEngine’s BlogML export tool. Hooray!

Even though all my posts are back on display, I have unfortunately lost some user comments and ratings. It saddens me to know that I have lost this valuable information, since hearing your thoughts makes this blog a more exciting read.

There is still some work to do, but I am glad to say the difficult part is over.