Blog

Posts written in 2020.

  • Published on - 2 min read

    Adding Security Headers In Netlify

    I normally like my last blog post of the year to end with a year in review. In light of being in Tier 4 local restrictions, there isn't much to do during the festive period unlike previous years. So I have decided to use this time to tinker around with various tech stacks and work on my own site to keep me busy.

    Whilst making some efficiency improvements under the hood to optimise my site's build and loading times, I randomly decided to check the security headers on securityheaders.com and to my surprise received a grade 'D'. When my site previously ran on the .NET Framework, I managed to lock things down enough to be graded an 'A'. I guess one of my misconceptions on moving to a statically-generated site was that there isn't a need. How wrong I was.

    A dev.to post by Matt Nield explains why static sites need basic security headers in place:

    As you add external services for customer reviews, contact forms, and eCommerce integration etc., we increase the number of possible vulnerabilities of the application. It may be true that your core data is only accessed when you rebuild your application, but all of those other features added can leave you, your customers, and your organisation exposed. Being frank, even if you don't add external services there is a risk. This risk is easily reduced using some basic security headers.

    Setting security headers on a Netlify-hosted site couldn't be simpler. If, like me, your site is built using GatsbyJS, you simply need to add a _headers file in the /static directory containing the following header rules:

    /*
    X-Frame-Options: DENY
    X-XSS-Protection: 1; mode=block
    Referrer-Policy: no-referrer
    X-Content-Type-Options: nosniff
    Content-Security-Policy: base-uri 'self'; default-src 'self' https: ; script-src 'self' 'unsafe-inline' https: ; style-src 'self' 'unsafe-inline' https: blob: ; object-src 'none'; form-action 'self' https://*.twitter.com; font-src 'self' data: https: ; connect-src 'self' https: ; img-src 'self' data: https: ;
    Feature-Policy: geolocation 'self'; midi 'self'; sync-xhr 'self'; microphone 'self'; camera 'self'; magnetometer 'self'; gyroscope 'self'; fullscreen 'self'; payment 'self'
    

    When adding a "Content-Security-Policy" header be sure to thoroughly re-check your site as you may need to whitelist resources that are loaded from a different origin. For example, I had to make some tweaks specifically to the "Content-Security-Policy" to allow embedded Tweets to render correctly.
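
    As an aside, Netlify also supports defining the same rules in a netlify.toml file at the root of the repository if you'd rather keep configuration in one place. A minimal sketch, using a couple of the headers above:

    [[headers]]
      for = "/*"
      [headers.values]
        X-Frame-Options = "DENY"
        X-Content-Type-Options = "nosniff"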

    My site is now back to its 'A' grade glory!

  • Published on - 7 min read

    Year In Review - 2020

    Well, hasn’t this been an interesting year? I couldn't have described it as eloquently as Helen Rosner, who managed to sum up the thoughts that ran through my mind at the start of the pandemic.

    2020 In Words/Phrases

    Coronavirus, Covid-19, holiday less, lockdown, DIY, UniFi router, armchair, daily HIIT sessions, home office setup, GatsbyJS, 10th work anniversary, Netlify hosting, WordPress (begrudgingly!), Batman Begins, Failing MacBook Pro, Social Share Image Improvements, Hubspot, work from home, social hermit, ultra-wide curved monitor, smart-home automation, family, new forest, outside Christmas lights, Pixel 5, Azure Web Apps, Azure DevOps, Google Photos disappointment

    Holiday - A Paradise Lost

    Over the last couple of years, I have started a tradition of writing about the one big holiday I like to take each year. This was supposed to be the year where I expanded my Maldives horizons (after visiting Vilamendhoo last year) by holidaying on another island - Cocoon Island!

    I wanted to go to Cocoon Island to celebrate my 35th year of being on this earth with family. But alas, it was not to be. Covid-19 cast uncertainty over travel throughout the year and I'm hoping (like many others) I'll have the opportunity to travel once again in the coming year.

    My Site

    This has been the year where I have fully transitioned my site into the Gatsby framework and had an absolute ball in doing so! There is something liberating about having a website that doesn’t rely on the conventional CMS platform and as a bonus, I’m saving around £100 in yearly hosting costs after moving to Netlify for hosting.

    I need to pluck up the courage to update the front-end build as it not only looks dated but doesn’t perform very well on the Google Lighthouse score - something that should easily be achievable using GatsbyJS. Redeveloping this aspect of my website has always taken a back seat as writing content takes precedence. Strangely enough, looking back over the year I should have had time to write more, especially during the lockdown period, but I found this year to be mentally exhausting.

    Statistics

    When I look at the stats for this year, my older posts still seem to get the most traction. Maybe the numbers are trying to tell me that my more recent posts aren’t that interesting. In all seriousness, I have had another positive bump but not on the same level as in previous years. I am ok with that.

    As I stated in my last year in review post, I accepted that the figures will plateau. I’m surprised I managed to get any increase in stats as I lacked focus when it came to blogging and, most importantly, talking about unique technical subjects in depth.

    2019/2020 Comparison:

    • Users: +11.45%
    • Page Views: +10.54%
    • New Users: +10.72%
    • Bounce Rate: -0.01%
    • Search Console Total Clicks:  +99%
    • Search Console Impressions: +91%
    • Search Console Page Position: +1.7%

    Experiencing The Missed Cinematic Experience of 2005

    On the 15th June 2005, a film was released that would forever redefine super-hero cinema - Batman Begins! There are certain films that must be seen on the big screen and for me, Batman Begins was one of them. It was unfortunate I gave it a miss on release as I fell out of love with the film interpretations of Batman after “Batman and Robin” scarred me for life.

    I instantly regretted this misstep when finally watching the film on DVD over a year later. I yearned for the day when I’d get an opportunity to see Christopher Nolan’s Batman on the big screen. Fast-forward 15 years from its original release, and Covid-19 presented a small silver lining where a handful of films were re-released to fill the gaping hole in the cinemas' schedule, caused by film studios withholding their new releases.

    The screening itself couldn’t have been more perfect. Sitting in the VIP seating area and having the whole auditorium to myself gave a somewhat immersive and intimate viewing experience.

    The MacBook Pro Engineer

    My 2015 MacBook Pro's battery has been failing for some time now. So much so, it's become a glorified desktop rather than a laptop, as any attempt to disconnect would result in the full loss of power. With my laptop out of warranty, and having even considered buying a replacement, I plucked up the courage to replace the battery myself. Some may call this madness, but I thought this would be the quickest way to get a new battery in when compared to the estimated time Apple quoted me - 2 weeks. Two weeks is a very long time to be without a laptop.

    There is such a wealth of online resources demonstrating how the battery can be replaced via DIY videos on YouTube and iFixit tutorials. I'll admit, it takes guts to rip out an existing battery mainly due to the heavy-duty adhesive. It's a slow and arduous process. After this is done, the rest is plain sailing.

    I wish I could say my laptop is fully operational but it’s still a glorified desktop as I am still getting battery health warnings, leading me to think some other component is playing up.

    Syndicut

    1st July marked my 10th anniversary at Syndicut. I always knew I wanted my 10th anniversary to be marked with something memorable... Covid made it memorable indeed, for all the wrong reasons. I would have preferred to celebrate with my workmates on a social outing of some sort; instead, it was a more low-key affair involving a raised glass of the finest Rioja to another successful 10 years!

    At this point, I really have to thank Steve and Nick (the directors of Syndicut) who managed to steer us through the choppy waters of the Covid-19 ripple effect. It’s thanks to them our jobs remained secure, and I’m sure my colleagues would express the same gratitude that we came through the other side! For the first time in my life, I felt the possibility of facing financial insecurity.

    If this year has taught me anything, it's not to take one's job and career for granted, especially when words such as “furlough” and “unemployment” are so prevalent.

    Journey for Self Improvement

    Depending on how one looks at it, living on your lonesome during a lockdown can be a recipe for borderline insanity! You could while away the time watching excessive amounts of TV or playing Scrabble GO (my lockdown game of choice!) with friends and randoms around the world, or utilise the time improving one's self. As they say - idle hands make for the devil's work.

    With so much time on my hands, I became very conscious of ensuring I was being as productive as I could, whether that was doing DIY, learning a new programming framework/language or forcing myself to exercise more often using resistance bands with gyms being closed. Seriously, those resistance bands are worth every penny. I don’t think I’ll ever be going back to the gym.

    Home Improvement

    When in lockdown, I no longer had an excuse to put off all the DIY and general house jobs I'd previously been telling myself I was too busy to complete. The outcome has been very satisfying and I can finally say things are more homely.

    My most precious purchase is the new leather armchair which I've placed in the corner of the room along with some plants. It's since become a place where I can read, write and think... I've called it my "thinking space"! :-)

    Working from home gave me the extra push to properly kit out a small office space. Thankfully, this was already in motion before the lockdown and I had a nice industrial desk (made out of re-purposed Indian mango wood and steel) and a leather chair. Over the months, I kept adding more items to make my work life more comfortable. Currently, I am awaiting some Displates to cover up the bare walls.

    I’ve also been delving into some smart-home automation, starting with the purchase of some smart plugs, leaving me wanting more! At some point in the future, I could hook up my smart devices to a Raspberry Pi for additional control through a mini touchscreen. Now that would be very cool!

    Google Pixel 5

    I didn’t end up getting an iPhone to complement my iPad purchase from last year. Couldn’t bring myself to do it. Even though I’ve been looking for a replacement for my Pixel 2 for some time, there weren’t any Android phones I deemed a worthy purchase. Last year's Pixel 4 didn’t tick the boxes I'd hoped it would, so I opted for this year's Pixel 5.

    The Pixel 5 isn’t what I’d class as the typical flagship. Google has redefined what they class as a “flagship” by not using the most up-to-date components when it comes to the processor and the camera. Strangely enough, the camera hardware hasn’t been updated since the Pixel 2. Nevertheless, I have found the Pixel 5 to be a fine phone. The battery lasts me two days on a single charge and (most importantly!) the camera picture quality cannot be faulted.

    Home Network Upgrade

    In light of having to work from home, I thought now might be a good time to give the network a little more stability, speed and security. My trusty old Billion 7800DXL router had started to wane and I found myself having to restart it manually on a daily basis. After failing to find up-to-date firmware to help remedy the issue, I thought it best to opt for an upgrade to a prosumer-grade router - the UniFi Dream Machine.

    At some point, I would like to beef up my network setup by getting a network switch cabinet filled with hardware from the UniFi range of products. Even though this would be overkill for my needs, it would be very interesting to set up.

    Final Thoughts

    I leave 2020 with an immense sense of gratitude that all those I consider close to me are safe and healthy. It’s strange to think the last year has been something we have all borne witness to and experienced together. Covid-19 has changed things - the very fabric of our existence. It squashes a person's ego.

  • Published on - 5 min read

    Preserving Digital Memories

    The older I get, the more obsessed I have become with preserving life’s memories through photos and video. With so many companies offering their storage solutions, we’re living in an age where storage no longer comes at a premium. There is a wide variety of pricing and feature tiers for all, benefiting us as consumers. If you have full trust in the service provider, these services suit the majority of consumer needs particularly well. But as a consumer, you need to be prepared to shift with potential service changes that may or may not work in your favour.

    For many years, I have always been conscious that I’m a photo hoarder and believe that there isn’t a bad photo one can take with the help of advancements in phone camera technology. If you ask any of my work colleagues, they’d probably tell you I have a problem. When we go on any of our socials, I’m the first person to whip out my phone and take pictures as they make nice mementoes to look back on and share.

    On a more personal note, during last year's Diwali, as we all sat down to my mum belting out her prayers, I came to the sudden realisation that this will not last forever and that these are truly special moments. Being an Indian who is culturally inept in all senses and cannot speak his native tongue, I would be comforted knowing that I'll have photos and video to look back on for many years to come. From that moment, I decided to make an active effort to capture smaller moments like these. Maybe the pandemic has shown me not to take things for granted and to appreciate time with family more.

    I got a little serious in my crusade and took things a step further, acquiring as many family photos as possible and purchasing a photo scanner to digitise all prints for safekeeping. Prints fade in time, not in the digital world.

    Photo Backup Strategy

    Whether I take photos on my phone or my FujiFilm X100F camera, the end destination will always be my Synology NAS where I have the following redundancies in place:

    • RAID backup on all hard disks.
    • Nightly backups to the cloud using BackBlaze.
    • Regular backup to an external disk drive and stored off-site.

    As expected, my phone gets the most use and everything by default is stored within my Google Photos account using free unlimited storage. I then use Synology Moments, which acts as my local Google Photos, where my photos are automatically stored on my Synology in original quality.

    My camera mostly gets used when I go on holiday or to events. I store the RAW and processed photos on my Synology. I still upload the processed photos to Google Photos as I love its AI search capability and it makes sharing easy.

    At the end of the day, the layers of redundancy you put in place depend on how important specific photos are to you. I like the idea of controlling my own backups. I take comfort knowing my data is stored in different places:

    • Synology
    • Backblaze
    • Google Photos
    • Offsite Hard Drive

    Cloud Storage and Shifting Goalposts

    The fear I had pushed to the back of my head finally came to the forefront when Google changed its storage policy.

    The recent news regarding the changes in the Google Photos service gives me a sense of resolve, knowing I already have a local storage solution working in parallel with Google Photos. I can’t help but feel disappointed by the turn of events, though. Even though I can to some extent understand Google's change to their service, I feel slightly cheated. After all, they offered us all free unlimited storage in exchange for allowing them to apply data mining and analysis algorithms to improve their services. That's the price you pay for using a free service. You are the product (this I have no grievances with)!

    Now they have enough of our data, they can feel free to cut the cord. We all know Google has a history of just killing products. Google Photos may not be killed, but life has certainly been sucked out of it.

    It may come across as if I’m solely bashing Google Photos, when in fact this is a clear example of how companies can change their service conditions for their benefit and face no repercussions. We as users have no say on the matter and just have to roll with the punches. It just seems wrong that a company would entice so many users with a free service to then strip it away. This is a classic monopolistic strategy to grab market share by pricing out its competitors to now demand money from its users.

    For me, Google Photos provided a fundamental part of the photo storage experience by making things easily accessible to family and friends. No longer will I be able to invite friends/family to contribute to shared albums unless they opt for the paid plan. When you’re surrounded by iPhone users, this creates another barrier to entry.

    This has further cemented my stance on ensuring I have control of my assets and services, which is something I have been doing.

    Final Thoughts

    If I have carried out my photo archival process correctly, they should be accessible to future generations for many years to come and continue to live on even after I’ve expired. This should be achievable as I’ll continue to maintain this time-capsule as technology continues to evolve.

    The most important take-away: If you strip down my approach to the bare bones, I’m not giving in to the monopolistic behaviour of the tech giants - Google, Apple or Microsoft. I'm just using them as a secondary thought to complement my process. It’s my NAS doing the heavy lifting, where I set the rules.

    These priceless heirlooms are a legacy and my gift for future generations to come.

  • I love my Google Nest Hub and it truly is a revolutionary piece of kit. Not only does it display my photos, but it also forms a key part of some basic smart-home automation. I really have no gripes. But there is one small area where I feel it's lacking: the radio alarm. I'm the type of person who detests alarm sounds and prefers my sleep cycle to be shattered by something a little softer, like a radio station.

    How difficult is it for Google to add a feature that will allow one to wake up to their favourite radio station? I would have thought this key feature would be very easy to put in place; after all, the Nest Hub can carry out much more complex operations. There are varying reports that this feature is available only within the US, which I find very odd. Does Google not know that here in the UK we would also find this feature useful?

    In the meantime, whilst I await an official release (that may not come anytime soon!) I managed to concoct a somewhat preposterous way to get some form of radio alarm automation. You will require the following:

    • An Android phone with Google Assistant capability
    • Google Nest Hub (standard or max)
    • Phone stand to sit next to your Nest Hub (optional)

    The premise of the approach I detail is to get an Android phone to fire off the alarm at the desired time and when the alarm is dismissed manually, the phone will utter a phrase that will be picked up by your Google Nest Hub and play your radio station.

    If you’re still here and intrigued by this approach, let's get to it!

    The first thing we need to do is set up a “Good Morning” routine on the Google Nest Hub, which can be done through the Google Home app on your phone. It is here where we will carry out the following:

    1. In the “Assistant will” section: adjust media volume to 40%.
    2. In the “And then play” section: select “Play radio” and enter the name of a radio station.
    3. Save the routine.

    Now when you utter the magic phrase “Good morning”, the Google Nest Hub will do exactly what we set up in our routine. Next, we need some automation to do this for us, and this is where the alarm feature on your Android phone comes into play.

    I cannot be sure if the alarm feature on all newish Android phones gives the ability to define a Google Assistant routine. If it does, you should see this as an option when setting the alarm. We need to carry out a similar process to the one above for setting the “Good Morning” routine on the Google Nest Hub:

    1. When I dismiss my alarm: Adjust media volume to 50%.
    2. Select the “Add action” button and under the “Enter command” tab, enter the following text: Hey Google. Good Morning.
    3. Leave the “And then play” section to do nothing.
    4. Save the routine.

    Your phone will ideally be placed in close proximity to your Google Nest Hub for the “Hey Google. Good Morning” utterance to be heard. In my case, I have my phone right next to the Nest Hub on my bedside cabinet to make it easy to dismiss the alarm.

    I have to concede the approach I have to take comes across as quite lame. It just seems ridiculous that you have to rely on a phone to fire off a process for the radio to play automatically. Why can’t routines be more flexible at Nest Hub level?

    I’m unable to determine whether my approach comes across as naive or clever. Maybe it's somewhere in between.

  • Another day, another ASP.NET Core error... This time relating to JSON not being parsable. Like the error I posted yesterday, this was another strange one as it only occurred within an Azure environment.

    Let me start by showing the error:

    Application '/LM/W3SVC/144182150/ROOT' with physical root 'D:\home\site\wwwroot\' hit unexpected managed exception, exception code = '0xe0434352'. First 30KB characters of captured stdout and stderr logs:
    Unhandled exception. System.FormatException: Could not parse the JSON file.
     ---> System.Text.Json.JsonReaderException: '0x00' is an invalid start of a value. LineNumber: 0 | BytePositionInLine: 0.
       at System.Text.Json.ThrowHelper.ThrowJsonReaderException(Utf8JsonReader& json, ExceptionResource resource, Byte nextByte, ReadOnlySpan`1 bytes)
       at System.Text.Json.Utf8JsonReader.ConsumeValue(Byte marker)
       at System.Text.Json.Utf8JsonReader.ReadFirstToken(Byte first)
       at System.Text.Json.Utf8JsonReader.ReadSingleSegment()
       at System.Text.Json.Utf8JsonReader.Read()
       at System.Text.Json.JsonDocument.Parse(ReadOnlySpan`1 utf8JsonSpan, Utf8JsonReader reader, MetadataDb& database, StackRowStack& stack)
       at System.Text.Json.JsonDocument.Parse(ReadOnlyMemory`1 utf8Json, JsonReaderOptions readerOptions, Byte[] extraRentedBytes)
       at System.Text.Json.JsonDocument.Parse(ReadOnlyMemory`1 json, JsonDocumentOptions options)
       at System.Text.Json.JsonDocument.Parse(String json, JsonDocumentOptions options)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationFileParser.ParseStream(Stream input)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationFileParser.Parse(Stream input)
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationProvider.Load(Stream stream)
       --- End of inner exception stack trace ---
       at Microsoft.Extensions.Configuration.Json.JsonConfigurationProvider.Load(Stream stream)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load(Boolean reload)
    --- End of stack trace from previous location where exception was thrown ---
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.HandleException(ExceptionDispatchInfo info)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load(Boolean reload)
       at Microsoft.Extensions.Configuration.FileConfigurationProvider.Load()
       at Microsoft.Extensions.Configuration.ConfigurationRoot..ctor(IList`1 providers)
       at Microsoft.Extensions.Configuration.ConfigurationBuilder.Build()
       at Microsoft.Extensions.Logging.AzureAppServices.SiteConfigurationProvider.GetAzureLoggingConfiguration(IWebAppContext context)
       at Microsoft.Extensions.Logging.AzureAppServicesLoggerFactoryExtensions.AddAzureWebAppDiagnostics(ILoggingBuilder builder, IWebAppContext context)
       at Microsoft.Extensions.Logging.AzureAppServicesLoggerFactoryExtensions.AddAzureWebAppDiagnostics(ILoggingBuilder builder)
       at Microsoft.AspNetCore.Hosting.AppServicesWebHostBuilderExtensions.<>c.<UseAzureAppServices>b__0_0(ILoggingBuilder builder)
       at Microsoft.Extensions.DependencyInjection.LoggingServiceCollectionExtensions.AddLogging(IServiceCollection services, Action`1 configure)
       at Microsoft.AspNetCore.Hosting.WebHostBuilderExtensions.<>c__DisplayClass8_0.<ConfigureLogging>b__0(IServiceCollection collection)
       at Microsoft.AspNetCore.Hosting.HostingStartupWebHostBuilder.<>c__DisplayClass6_0.<ConfigureServices>b__0(WebHostBuilderContext context, IServiceCollection services)
       at Microsoft.AspNetCore.Hosting.HostingStartupWebHostBuilder.ConfigureServices(WebHostBuilderContext context, IServiceCollection services)
       at Microsoft.AspNetCore.Hosting.GenericWebHostBuilder.<.ctor>b__5_2(HostBuilderContext context, IServiceCollection services)
       at Microsoft.Extensions.Hosting.HostBuilder.CreateServiceProvider()
       at Microsoft.Extensions.Hosting.HostBuilder.Build()
       at Site.Web.Program.Main(String[] args) in C:\Development\surinder-main-website\Site.Web\Program.cs:line 11
    
    Process Id: 2588.
    File Version: 13.1.20169.6. Description: IIS ASP.NET Core Module V2 Request Handler. Commit: 62c098bc170f50feca15916e81cb7f321ffc52ff
    

    The application was not consuming any form of JSON as part of its main functionality. The only JSON being used was three variations of appsettings.json - one each for development, staging and production. So this had to be the source of the issue. The error message also confirmed this, as Program.cs was referenced and this is where the application startup code runs.

    My first thought was I must have forgotten a comma or missed a closing quote for one of my values. After running the JSON through a validator, it passed with flying colours.

    Solution

    After some investigation, the issue was caused by incorrect encoding of the files. All the appsettings.json files were encoded as "UTF-8", possibly causing some metadata to be added that stopped the application from reading them. Once this was changed to "UTF-8-BOM" through Notepad++, everything worked fine.
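
    If you'd rather verify the encoding programmatically than through an editor, one option is to inspect the first bytes of each file for a byte-order mark. A minimal C# sketch (the file path is illustrative):

    using System;
    using System.IO;

    class EncodingCheck
    {
        static void Main()
        {
            // A UTF-8 BOM is the three-byte sequence 0xEF 0xBB 0xBF.
            byte[] bytes = File.ReadAllBytes("appsettings.json");

            bool hasUtf8Bom = bytes.Length >= 3
                && bytes[0] == 0xEF && bytes[1] == 0xBB && bytes[2] == 0xBF;

            Console.WriteLine(hasUtf8Bom
                ? "File starts with a UTF-8 BOM."
                : "No UTF-8 BOM found - check the file encoding.");
        }
    }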

  • You gotta love .NET Core compilation errors! They provide the most ambiguous error messages known to man. I have noticed the error message and accompanying error code can be caused by a multitude of factors. This error is no different, so I’ll make my contribution, hoping it may help someone else.

    The error in question occurred randomly whilst deploying a minor HTML update to a .NET Core site I was hosting within an Azure Web App. It couldn’t have been a simpler release - a change to some markup in a View. When the site loaded, I was greeted with the following error:

    Failed to start application '/LM/W3SVC/####/ROOT', ErrorCode '0x8007023e’.
    

    I was able to get some further information about the error from the Event Log:

    Application 'D:\home\site\wwwroot\' failed to start. Exception message:
    Executable was not found at 'D:\home\site\wwwroot\%LAUNCHER_PATH%.exe'
    Process Id: 10848.
    File Version: 13.1.19331.0. Description: IIS ASP.NET Core Module V2.
    

    The error could only be reproduced on Azure and not within my local development and staging environments. I created a new deployment slot to check if somehow my existing slot had become corrupted. Unfortunately, this made no difference. The strange thing is, the application was working completely fine up until this release. It's still unknown to me what could have happened for this error to occur all of a sudden.

    Solution

    It would seem that no one else on the planet had experienced this issue when I Googled the error message and error code. After a lot of fumbling around, the fix ended up being relatively straightforward. The detail provided by the Event Log pointed me in the right direction, and the clue was in the %LAUNCHER_PATH% placeholder. The %LAUNCHER_PATH% placeholder is set in the web.config and is normally replaced when the application is run in Visual Studio or IIS.

    In Azure, both %LAUNCHER_PATH% and %LAUNCHER_ARGS% variables need to be explicitly set. The following line in the web.config needs to be changed from:

    <aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" startupTimeLimit="3600" requestTimeout="23:00:00" hostingModel="InProcess">
    

    To:

    <aspNetCore processPath=".\Site.Web.exe" arguments="" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" startupTimeLimit="3600" requestTimeout="23:00:00" hostingModel="InProcess">
    

    The processPath is now pointing to the executable generated by the project - in this case, "Site.Web.exe". Also, since no arguments are being passed in my build, the arguments attribute is left empty. When you push up your next release, the error should be rectified.

    As a side note, there was one thing recommended to me by Azure support regarding my publish settings in Visual Studio. It was recommended that I should set the deployment mode from "Framework-Dependent" to "Self-Contained". This will ensure the application will always run in its current framework version on the off-chance framework changes happen at an Azure level.
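
    For reference, the equivalent settings in a publish profile (.pubxml) are sketched below - the runtime identifier is an assumption and should match your App Service platform:

    <PropertyGroup>
      <SelfContained>true</SelfContained>
      <RuntimeIdentifier>win-x86</RuntimeIdentifier>
    </PropertyGroup>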

  • The Kentico Kontent ASP.NET Core boilerplate contains a CustomContentLinkUrlResolver class that allows all links within your content to be transformed into a custom URL path based on the content type a link references. The out-of-the-box boilerplate solution works for most scenarios. But there will be times when links cannot be resolved in such a simplistic fashion, especially if your project is using dynamic page routing.

    What we need to do is make a small tweak to the CustomContentLinkUrlResolver class so we can use Kontent’s DeliveryClient object, which in turn allows us to query the API and carry out a complex ruleset for resolving URLs.

    To give a frame of reference, the out-of-the-box CustomContentLinkUrlResolver class contains the following code:

    public class CustomContentLinkUrlResolver : IContentLinkUrlResolver
    {
        /// <summary>
        /// Resolves the link URL.
        /// </summary>
        /// <param name="link">The link.</param>
        /// <returns>A relative URL to the page where the content is displayed</returns>
        public string ResolveLinkUrl(ContentLink link)
        {
            return $"/{link.UrlSlug}";
        }
    
        /// <summary>
        /// Resolves the broken link URL.
        /// </summary>
        /// <returns>A relative URL to the site's 404 page</returns>
        public string ResolveBrokenLinkUrl()
        {
            // Resolves URLs to unavailable content items
            return "/404";
        }
    }
    

    This will be changed to:

    public class CustomContentLinkUrlResolver : IContentLinkUrlResolver
    {
        IDeliveryClient deliveryClient;
        public CustomContentLinkUrlResolver(DeliveryOptions deliveryOptions)
        {
            deliveryClient = DeliveryClientBuilder.WithProjectId(deliveryOptions.ProjectId).Build();
        }
    
        /// <summary>
        /// Resolves the link URL.
        /// </summary>
        /// <param name="link">The link.</param>
        /// <returns>A relative URL to the page where the content is displayed</returns>
        public string ResolveLinkUrl(ContentLink link)
        {                
            switch (link.ContentTypeCodename)
            {
                case Home.Codename:
                    return "/";
                case BlogListing.Codename:
                    return "/Blog";
                case BlogPost.Codename:
                    return $"/Blog/{link.UrlSlug}";
                case NewsArticle.Codename:
                    // A simplistic example of the Delivery Client in use to resolve a link...
                    NewsArticle newsArticle = Task.Run(async () => await deliveryClient.GetItemsAsync<NewsArticle>(
                                                                                new EqualsFilter("system.id", link.Id),
                                                                                new ElementsParameter("url"),
                                                                                new LimitParameter(1)
                                                                            )).Result?.Items.FirstOrDefault();
    
                    if (!string.IsNullOrEmpty(newsArticle?.Url))
                        return newsArticle.Url;
                    else
                        return ResolveBrokenLinkUrl();
                default:
                    return $"/{link.UrlSlug}"; 
            }
        }
    
        /// <summary>
        /// Resolves the broken link URL.
        /// </summary>
        /// <returns>A relative URL to the site's 404 page</returns>
        public string ResolveBrokenLinkUrl()
        {
            // Resolves URLs to unavailable content items
            return "/404";
        }
    }
    

    In the updated code, we are using the DeliveryClientBuilder.WithProjectId() method to create a new instance of the DeliveryClient object, which can then be used if a link needs to resolve a News Article content type. You may have also noticed the class now accepts a DeliveryOptions object as its parameter. This object is populated on startup with Kontent’s core settings from the appsettings.json file. All we’re interested in is retrieving the Project ID.

    A small update to the Startup.cs file will also need to be carried out where the CustomContentLinkUrlResolver class is referenced.

    public void ConfigureServices(IServiceCollection services)
    {
        ...
    
        var deliveryOptions = new DeliveryOptions();
        Configuration.GetSection(nameof(DeliveryOptions)).Bind(deliveryOptions);
    
        IDeliveryClient BuildBaseClient(IServiceProvider sp) => DeliveryClientBuilder
            .WithOptions(_ => deliveryOptions)
            .WithTypeProvider(new CustomTypeProvider())
            .WithContentLinkUrlResolver(new CustomContentLinkUrlResolver(deliveryOptions)) // Line to update.
            .Build();
    
        ...
    }
    

    I should highlight at this point that the changes illustrated above were made on an older version of the Kentico Kontent boilerplate, but the same approach applies. The only thing I’ve noticed that normally changes between boilerplate revisions is the Startup.cs file. The DeliveryOptions class is still in use, but you may have to make a small tweak to ascertain its values.

  • I’ll be the first to admit that I very rarely (if at all!) assign a nice pretty share image to any post that gets shared on social networks. Maybe it’s because I hardly post what I write to social media in the first place! :-) Nevertheless, this isn’t the right attitude. If I am really going to do this, then the whole process needs to be quick and render a share image that sets the tone and will hopefully entice a potential reader to click on my post.

    I started delving into how my favourite developer site, dev.to, manages to create these really simple text-based share images dynamically. They have a pretty good setup, as they’ve somehow managed to generate a share image that perfectly contains relevant post-related information, such as:

    • Post title
    • Date
    • Author
    • Related Tech Stack Icons

    For those who are as nosey as I am and want to know how dev.to undertakes such functionality, they have kindly written the following post - How dev.to dynamically generates social images.

    Since my website is built using the Gatsby framework, I prefer to use a local process to dynamically generate a social image without the need to rely on a third-party service. What's the point in using a third-party service to do everything for you when it’s more fun to build something yourself?

    I had envisaged implementing a process that will allow me to pass in the URL of my blog posts to a script, which in turn will render a social image containing basic information about a blog post.

    Intro Into Puppeteer

    Whilst doing some Googling, one tool kept cropping up in different forms and uses - Puppeteer. Puppeteer is a Node.js library maintained by Google Chrome’s development team and enables us to control any Chrome Dev-Tools based browser through scripts. These scripts can programmatically execute a variety of actions that you would generally do in a browser.

    To give you a bit of an insight into the actions Puppeteer can carry out, check out this Github repo. Here you can see Puppeteer is a very useful tool for testing, scraping and automating tasks on web pages. The part I spent most of my time understanding was its webpage screenshot feature.

    To use Puppeteer, you will first need to install the library package in which two options are available:

    • Puppeteer Core
    • Puppeteer

    Puppeteer Core is the lighter-weight package that can interact with any Dev-Tools based browser you already have installed.

    npm install puppeteer-core
    

    You then have the full package that also installs the most recent version of Chromium within the node_modules directory of your project.

    npm install puppeteer
    

    I opted for the full package just to ensure I have the most compatible version of Chromium for running Puppeteer.

    Puppeteer Webpage Screenshot Script

    Now that we have Puppeteer installed, I wrote a script and added it to the root of my Gatsby site. The script carries out the following:

    • Accepts a single argument containing the URL of a webpage. This will be the page containing information about my blog post in a share format - all will become clear in the next section.
    • Take a cropped screenshot of the webpage. In this case 840px x 420px - the exact size of my share image.
    • Use the page name in the URL as the image file name.
    • Store the screenshot in my "Social Share" media directory.

    const puppeteer = require('puppeteer');
    
    // If an argument is not provided containing a website URL, end the task.
    if (process.argv.length !== 3) {
      console.log("Please provide a single argument containing a website URL.");
      return;
    }
    
    const pageUrl = process.argv[2];
    
    const options = {
        path: `./static/media/Blog/Social Share/${pageUrl.substring(pageUrl.lastIndexOf('/') + 1)}.jpg`,
        fullPage: false,
        clip: {
          x: 0,
          y: 0,
          width: 840,
          height: 420
        }
      };
      
      (async () => {
        const browser = await puppeteer.launch({headless: false});
        const page = await browser.newPage()
        await page.setViewport({ width: 1280, height: 800, deviceScaleFactor: 1.5 })
        await page.goto(pageUrl)
        await page.screenshot(options)
        await browser.close()
      })(); 
    

    The script can be run as so:

    node puppeteer-screenshot.js http://localhost:8000/socialcard/Blog/2020/07/25/Using-Instagram-API-To-Output-Profile-Photos-In-ASPNET-2020-Edition
    

    I made an addition to my Gatsby project that generated a social share page for every blog post where the URL path was prefixed with /socialcard. These share pages will only be generated when in development mode.

    Social Share Page

    Now that we have our Puppeteer script, all that remains is to create a nice-looking visual for Puppeteer to convert into an image. I wanted some form of automation where blog post information was automatically populated.

    I’m starting off with a very simple layout taking some inspiration from dev.to and outputting the following information:

    • Title
    • Date
    • Tags
    • Read time

    Working with HTML and CSS isn’t exactly my forte. Luckily for me, I just needed to do enough to make the share image look presentable.

    Social Card Page

    You can view the HTML and CSS on JSFiddle. Feel free to update and make it better! If you do make any improvements, update the JSFiddle and let me know!

    Next Steps

    I plan on adding some additional functionality allowing a blog post teaser image (if one is added) to be used as a background and make things look a little more interesting. At the moment the share image is very plain. As you can tell, I keep things really simple as design isn’t my strongest area. :-)

    If all goes to plan, when I share this post to Twitter you should see my newly generated share image.

  • Published on - 2 min read

    Sony Android TV - Resolving Internet Access Issue

    I thought I should write a quick post for those who may also experience a lost internet connection on a Sony TV (running Android OS), as my parents did a couple of weeks ago. My parents have had their TV for a few years now and never experienced connection issues... unless the Internet was truly down.

    The TV is connected to the internet modem via an ethernet cable. Even though the network status was marked as "connected", there was no Internet connection. Other household devices were connected to the Internet successfully, which confirmed the issue was solely with the Sony TV. Luckily, after a lot of fumbling around in the settings and a lot of Googling, there was in fact a simple solution requiring no technical knowledge.

    The issue occurs when the date-time is incorrect. This needs to be corrected by carrying out the following steps:

    1. Press the Home button on the remote control.
    2. Select Settings (cog icon) found on the top right.
    3. Go to System Settings and select Date and Time.
    4. In Date and Time, set Automatic date & time to Use network time.
    5. Carry out a hard reboot by pressing and holding down the Power button on the remote.

    I’d also recommend checking if there are any OS updates at the same time, just to see if Sony has released any fixes for the issue. At the time of writing, it doesn’t look like this issue has been resolved. I can confirm I checked for outstanding updates, of which there were none.

    Even now I don’t understand how the date-time on the TV shifted out of sync. This shouldn’t happen again as we have now set the date-time to be set automatically via the network.

    So why are there Internet issues if the date-time isn't correct on a device? Ensuring the correct time on a device is more important than you might think. If the clock on a device diverges too far from the correct time (more than a few hours), requests made by the operating system and applications that depend on Internet-based services and authorisation will be rejected. As a result, some applications may stop functioning or, on a wider scale, the connection to the Internet may be dropped in its entirety.

  • Don’t you love the good ol’ days, back when querying APIs provided by social platforms was a straightforward process and it felt like there were fewer barriers to entry? This thought came to mind when I had to use Instagram’s API again recently to output a list of photos for one of my personal projects.

    The last time I carried out any interaction with Instagram's API was back in 2013, when it was a much simpler affair. All that was required was to generate an access token that never expired, which could then be used against any API endpoint to get data back from Instagram.

    Reading Jamie Maguire's really useful three-part blog post on tapping into Instagram's API gave me a good foundation on how the API has changed since I last used it. But the example he used required the profile you want to interact with to be set up as a business user. Interacting with Instagram’s API requires a lot more effort if you (like most users) do not have a business profile.

    As far as I understand it, if you plan on making any interaction with Instagram's API as a non-business user, the process is:

    1. Create Facebook Developer App
    2. Authenticate application by logging into the Instagram account. This will then generate a short-lived token valid for 1 hour.
    3. Exchange the short-lived token for a long-lived token that is valid for 60 days. This token will need to be stored somewhere at the application level.
    4. Ensure the long-lived token is refreshed before it expires within the 60-day window.

    It is very important that we use the long-lived token and continually renew it via some form of background process, whether at application or Azure level. We can then use this token to make queries to Instagram API endpoints.

    In this post, I am going to perform a very simple integration to output a list of images from an Instagram profile. By demonstrating this, we should get a fair idea of how to interact with the other API endpoints. Personally, setting up the process to acquire an access token is the part that requires the most effort. So let's get to it!

    Create a Facebook Developer Application

    Since Facebook has taken over Instagram, naturally the application process starts within Facebook's developer site, which can be found at: https://developers.facebook.com/apps/. Once you have logged in using your Facebook credentials, the following steps will need to be carried out:

    • Select "Add New App". In the popup, enter the application name and contact email.
    • You will be presented with a list of products. Select "Instagram" by pressing the "Setup" button.
    • From the left-hand navigation, go to: Settings > Basic. Scroll to the bottom and click on the "Add platform" button. From the popup, select "Website".
    • Enter the website URL. For testing, this will be the URL we set up in the previous section. Ensure this URL is prefixed with https://. Click the "Save" button.
    • Again, from the left-hand navigation (under Products > Instagram), select: "Basic Display". At the bottom of the page, click the "Create New App" button.
    • Enter the following fields (based on our test domain):
    • Valid OAuth Redirect URIs: https://myinstagramapp.surinderbhomra.com/Instagram/Auth
    • Deauthorize Callback URL: https://myinstagramapp.surinderbhomra.com
    • Data Deletion Requests: https://myinstagramapp.surinderbhomra.com
    • Add an Instagram Test user. This can be the client's Instagram profile name.
    • Add instagram_graph_user_media permission, so we can read the profile images.
    • Click "Save Changes" button.

    You will have noticed I have added website URLs for the OAuth Redirect, Deauthorize Callback and Data Deletion fields. As these fields are required, you can enter a dummy website URL for local development purposes. Just remember to change this when you move the application to the live domain. In the meantime, to utilise those dummy URLs, your local hosts file will need to be updated. Please refer to a post I wrote in 2012 for further information. It might be over 8 years old, but the process is the same even if the Facebook Developer interface has changed.
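
    For context, pointing a dummy domain at your local machine is just a single hosts file entry, using the example domain from above:

    127.0.0.1    myinstagramapp.surinderbhomra.com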

    Once the Developer Application has been set up, grab the App ID and App Secret to be used in our demo application.

    Photo Feed Application

    The Photo Feed application will provide a very simple demonstration of the authentication process and interacting with the API to get back our media objects. From here, you will have the tools to improve and expand on this to delve deeper into other API endpoints Instagram has to offer.

    To start, we have two helper classes that the application will rely on:

    1. InstagramAuthProvider
    2. InstagramMediaProvider

    InstagramAuthProvider

    The InstagramAuthProvider carries out all authentication processes for acquiring both short and long-lived tokens.

    public class InstagramAuthProvider
    {
        #region Json Response Objects
    
        public class AuthenticateRequest
        {
            [JsonProperty("error_type")]
            public string ErrorType { get; set; }
    
            [JsonProperty("code")]
            public int StatusCode { get; set; }
    
            [JsonProperty("error_message")]
            public string ErrorMessage { get; set; }
    
            [JsonProperty("access_token")]
            public string AccessToken { get; set; }
    
            [JsonProperty("user_id")]
            public long UserId { get; set; }
        }
    
        public class LongLivedTokenRequest
        {
            [JsonProperty("access_token")]
            public string AccessToken { get; set; }
    
            [JsonProperty("token_type")]
            public string TokenType { get; set; }
    
            [JsonProperty("expires_in")]
            public long ExpiresInSeconds { get; set; }
        }
    
        #endregion
    
        /// <summary>
    /// Carries out the initial authentication step after the user has approved the app's link to their Instagram account.
        /// Returns a short-lived token valid for 1 hour.
        /// </summary>
        /// <param name="code"></param>
        /// <returns></returns>
        public static async Task<AuthenticateRequest> GetAccessTokenAsync(string code)
        {
            string authResponse = string.Empty;
    
            if (!string.IsNullOrEmpty(code))
            {
                Dictionary<string, string> parameters = new Dictionary<string, string>
                    {
                        { "client_id", ConfigurationManager.AppSettings["Instagram.ClientID"].ToString() },
                        { "client_secret", ConfigurationManager.AppSettings["Instagram.AppSecret"].ToString() },
                        { "grant_type", "authorization_code" },
                        { "redirect_uri", $"{ConfigurationManager.AppSettings[“Site.Domain"]}{ConfigurationManager.AppSettings["Instagram.AuthRedirectPath"]}" },
                        { "code", code }
                    };
    
                FormUrlEncodedContent encodedParameters = new FormUrlEncodedContent(parameters);
    
                HttpClient client = new HttpClient();
    
                HttpResponseMessage response = await client.PostAsync("https://api.instagram.com/oauth/access_token", encodedParameters);
                authResponse = await response.Content.ReadAsStringAsync();
            }
    
            return JsonConvert.DeserializeObject<AuthenticateRequest>(authResponse);
        }
    
        /// <summary>
    /// Exchanges a short-lived token for a long-lived token that is valid for 60 days.
        /// </summary>
        /// <param name="shortliveAccessToken"></param>
        /// <returns></returns>
        public static async Task<LongLivedTokenRequest> GetLongLifeTokenAsync(string shortliveAccessToken)
        {
            string authResponse = string.Empty;
    
            if (!string.IsNullOrEmpty(shortliveAccessToken))
            {
                HttpClient client = new HttpClient();
    
                HttpResponseMessage response = await client.GetAsync($"https://graph.instagram.com/access_token?client_secret={ConfigurationManager.AppSettings["Instagram.AppSecret"].ToString()}&grant_type=ig_exchange_token&access_token={shortliveAccessToken}");
                authResponse = await response.Content.ReadAsStringAsync();
            }
    
            return JsonConvert.DeserializeObject<LongLivedTokenRequest>(authResponse);
        }
    
        /// <summary>
    /// Refreshes a long-lived Instagram User Access Token that is at least 24 hours old but has not expired.
        /// </summary>
        /// <param name="longLivedAccessToken"></param>
        /// <returns></returns>
        public static async Task<LongLivedTokenRequest> RefreshTokenAsync(string longLivedAccessToken)
        {
            string authResponse = string.Empty;
    
            if (!string.IsNullOrEmpty(longLivedAccessToken))
            {
                HttpClient client = new HttpClient();
    
                HttpResponseMessage response = await client.GetAsync($"https://graph.instagram.com/refresh_access_token?grant_type=ig_refresh_token&access_token={longLivedAccessToken}");
                authResponse = await response.Content.ReadAsStringAsync();
            }
    
            return JsonConvert.DeserializeObject<LongLivedTokenRequest>(authResponse);
        }
    }
    

    InstagramMediaProvider

    The InstagramMediaProvider returns media information based on the authentication token.

    public class InstagramMediaProvider
    {
        #region Json Response Objects
    
        public class MediaCollection
        {
            [JsonProperty("data")]
            public List<MediaInfo> Data { get; set; }
        }
    
        public class MediaInfo
        {
            [JsonProperty("id")]
            public string Id { get; set; }
    
            [JsonProperty("caption")]
            public string Caption { get; set; }
    
            [JsonProperty("permalink")]
            public string InstagramUrl { get; set; }
    
            [JsonProperty("media_type")]
            public string Type { get; set; }
    
            [JsonProperty("thumbnail_url")]
            public string VideoThumbnailUrl { get; set; }
    
            [JsonProperty("media_url")]
            public string Url { get; set; }
        }
    
        #endregion
    
        private string _accessToken;
    
        public InstagramMediaProvider(string accessToken)
        {
            _accessToken = accessToken;
        }
    
        /// <summary>
        /// Gets list of all user images.
        /// </summary>
        /// <returns></returns>
        public async Task<List<MediaInfo>> GetUserMedia()
        {
            var mediaInfo = await GetAllMediaAsync();
    
            if (mediaInfo?.Data.Count > 0)
                return mediaInfo.Data;
            else
                return new List<MediaInfo>();
        }
    
        /// <summary>
    /// Gets all media items for the user profile.
        /// </summary>
        /// <returns></returns>
        private async Task<MediaCollection> GetAllMediaAsync()
        {
            string mediaResponse = string.Empty;
    
            HttpClient client = new HttpClient();
    
            HttpResponseMessage response = await client.GetAsync($"https://graph.instagram.com/me/media?fields=id,media_type,media_url,thumbnail_url,permalink,caption,timestamp&access_token={_accessToken}");
    
            mediaResponse = await response.Content.ReadAsStringAsync();
    
            if (response.StatusCode != HttpStatusCode.OK)
                return null;
    
            return JsonConvert.DeserializeObject<MediaCollection>(mediaResponse);
        }
    }
    

    Authorisation and Authentication

    The authorisation and authentication functionality will be performed in the InstagramController.

    public class InstagramController : Controller
    {
         /// <summary>
         /// Authorises application with Instagram.
         /// </summary>
         /// <returns></returns>
        public ActionResult Authorise()
        {
        return Redirect($"https://www.instagram.com/oauth/authorize?client_id={ConfigurationManager.AppSettings["Instagram.AppID"].ToString()}&redirect_uri={ConfigurationManager.AppSettings["Site.Domain"]}{ConfigurationManager.AppSettings["Instagram.AuthRedirectPath"]}&scope=user_profile,user_media&response_type=code");
        }
    
        /// <summary>
        /// Makes authentication request to create access token.
        /// </summary>
        /// <param name="code"></param>
        /// <returns></returns>
        public async Task<ActionResult> Auth(string code)
        {
            InstagramAuthProvider.AuthenticateRequest instaAuth = await InstagramAuthProvider.GetAccessTokenAsync(code);
    
            if (!string.IsNullOrEmpty(instaAuth?.AccessToken))
            {
                InstagramAuthProvider.LongLivedTokenRequest longTokenRequest = await InstagramAuthProvider.GetLongLifeTokenAsync(instaAuth.AccessToken);
    
                if (!string.IsNullOrEmpty(longTokenRequest?.AccessToken))
                {                
               // Storing the long-lived token in a session for demo purposes.
                   // Store the token in a more permanent place, such as a database.
                    Session["InstagramAccessToken"] = longTokenRequest.AccessToken;
    
                    return Redirect("/");
                }
            }
    
            return Content("Error authenticating.");
        }
    }
    

    Authorise

    Before we can get any of our tokens, the first step is to get authorisation from Instagram for our web application. So somewhere in the application (preferably not publicly visible), we will need an area that kicks this off. In this case, navigating to /Instagram/Authorise will cause the Authorise action in the controller to be fired.

    All the Authorise action does is take you to Instagram's login page, sending over the App ID and redirect path. Remember, the redirect path needs to be exactly as you've set it in your Facebook Developer Application. Once you have successfully logged in, you’ll be redirected back to the application.

    NOTE: I’ll be honest here and say I am not sure if there is a better way to acquire the access token as it seems very odd to me that you have to log in to Instagram first. If any of you know of a better way, please leave a comment.

    Auth

    If the authorisation process was successful, you will be redirected back to the application and given an authorisation code. The authorisation code is passed to the InstagramAuthProvider.GetAccessTokenAsync() method to be exchanged for our first access token - a short-lived token valid for 1 hour.

    The last step of the process is to send the short-lived access token to the InstagramAuthProvider.GetLongLifeTokenAsync() method, which carries out a final exchange to retrieve our long-lived token, valid for 60 days. It is this token we need to store somewhere so we can use it against any Instagram API endpoint.

    In my example, the long-lived token is stored in a session for demonstration purposes. In a real-world application, we would want to store this in a database and have a scheduled task in place that calls the InstagramAuthProvider.RefreshTokenAsync() method every 59 days so the long-lived token is renewed for another 60 days.
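
    As a rough sketch of what that scheduled check could look like (TokenStore here is a hypothetical persistence layer - swap in your own database access):

    // Hypothetical scheduled job - run daily via your task scheduler of choice.
    public static async Task RenewInstagramTokenAsync()
    {
        // TokenStore is a placeholder for wherever the token and its expiry date are persisted.
        string currentToken = TokenStore.GetAccessToken();
        DateTime expiresOn = TokenStore.GetExpiryDate();

        // Renew once the 60-day token is within a week of expiring.
        if (expiresOn <= DateTime.UtcNow.AddDays(7))
        {
            InstagramAuthProvider.LongLivedTokenRequest refreshed =
                await InstagramAuthProvider.RefreshTokenAsync(currentToken);

            if (!string.IsNullOrEmpty(refreshed?.AccessToken))
                TokenStore.Save(refreshed.AccessToken, DateTime.UtcNow.AddSeconds(refreshed.ExpiresInSeconds));
        }
    }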

    Photo Feed

    Now we come onto the easy part - using the long-lived access token to return a list of photos.

    public class InstagramMediaController : Controller
    {
        /// <summary>
        /// Outputs all user images from Instagram profile.
        /// </summary>
        /// <returns></returns>
        [OutputCache(Duration = 60)]
        public PartialViewResult PhotoGallery()
        {
            if (Session["InstagramAccessToken"] != null)
            {
                InstagramMediaProvider instaMedia = new InstagramMediaProvider(Session["InstagramAccessToken"].ToString());
    
                return PartialView("_PhotoGallery", Task.Run(() => instaMedia.GetUserMedia()).Result);
            }
    
            return PartialView("_PhotoGallery", new List<InstagramMediaProvider.MediaInfo>());
        }
    }
    

    This controller contains a single piece of functionality - a PhotoGallery partial view. The PhotoGallery partial view uses the InstagramMediaProvider.GetUserMedia() method to return a collection of profile photos.
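
    The _PhotoGallery partial itself isn't shown in this post, but a minimal sketch of what it could look like is below (the markup and CSS class are illustrative):

    @model List<InstagramMediaProvider.MediaInfo>

    <div class="photo-gallery">
        @foreach (InstagramMediaProvider.MediaInfo media in Model)
        {
            <a href="@media.InstagramUrl">
                @* Videos expose a separate thumbnail via the thumbnail_url field. *@
                <img src="@(media.Type == "VIDEO" ? media.VideoThumbnailUrl : media.Url)" alt="@media.Caption" />
            </a>
        }
    </div>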

    Final Thoughts

    Before carrying out any Instagram integration, think about what you are trying to achieve. Look through the documentation to ensure the APIs you require are available, as some permissions and features may require business or individual verification to access live data.

    Also, you may (or may not) have noticed whilst going through the Facebook Developer Application setup that I never submitted the application for review. I would advise you to always submit your application to Facebook, as you might otherwise find your access is revoked. For personal purposes where you're only outputting your own Instagram profile information, you could just leave it in "In development" mode. But again, I do not advise this, especially when working with Business accounts.