My Essential iPad Accessories and Applications

Following up on my previous post about the joy of using my new iPad Air, I thought I’d write about what I deem to be essential accessories and applications. It’s only been a couple of weeks since making my purchase, and I’ve surprisingly found the transition from Android to iOS not too much of a pain. It’s fast becoming part of my daily workflow for creative writing and note-taking.

Here are some applications and accessories I use…

Accessories

Keyboard Case

Apple’s own Smart Keyboard Cover felt very unnatural to use and didn’t provide enough protection for my nice new tablet. The Inateck Keyboard Case is an absolute pleasure to use and the keys have a very nice, responsive rebound. I can use this literally anywhere, and it feels just as stable on my lap as it does on a desk.

The only downside is that connectivity relies on Bluetooth rather than Apple’s own Smart Connector, which would normally power the keyboard. Nevertheless, the pairing has no noticeable latency and the battery lasts weeks even with daily usage.

Apple Pencil

The iPad Air is only compatible with the first-generation Pencil, which charges in a really ridiculous way via the Lightning connector. Apple could quite easily have made the iPad Air work with the second-generation Pencil. If the iPad Pro were the cracker, then the second-generation Pencil would be the caviar.

Regardless of the design, it’s refreshing to scribble away notes and store them electronically. Previously, to keep track of my written notes, I would write on paper (how old-fashioned!) and then scan them digitally using Evernote on my phone.

Draw Screen Protector

Writing on glass with the Apple Pencil is a little slippery, and I needed something with enough texture to simulate the friction you would get when writing on paper. There are a handful of screen protectors that provide this with varying degrees of success. The most popular is Paperlike, which I plan on ordering when I’ve worn out my current screen protector.

My current screen protector is from Nillkin and isn’t too bad. It provides adequate protection as well as enough texture, with anti-reflective qualities that don’t hinder screen visibility. Added bonus: a nice, light scratchy sound, just as you’d expect when writing with an old-fashioned pencil!

Applications

I'm deliberately leaving out the most obvious and well-known apps, such as YouTube, Netflix, Gmail, Kindle, Twitter and Spotify.

Jump Desktop

I wrote about this very briefly in my previous post. If you want to connect to your laptop/workstation from your iPad, Jump Desktop is your best option. Once you have the application installed on your iPad and host machine, you’re up and running in minutes. Judging by past updates, it’s getting better with every release.

Evernote

I don’t think I can speak highly enough about Evernote. I am a premium member, and it is one of my most used applications across all my devices. Worth every penny! It organises my notes, scribbles and agendas with little effort.

Evernote is effectively my brain dump of ideas.

Notes have never looked so good thanks to a recent feature - Templates. On creating a new note, you have the option to select a predefined template from the many Evernote provides in their Template Gallery.

Grammarly

Grammarly is a must for any writer wanting to improve the readability of their content. I only started using Grammarly last year and now can't imagine writing a post without it. On the iPad, Grammarly takes the form of a keyboard that carries out checks as you type. This works quite well with my writing workflow when using Evernote.

Autodesk Sketchbook

If the Apple Pencil has done anything for me, it’s allowed me to experiment more with what it can do and, in the process, try things I don’t generally do. In this case, sketch! I would be lying if I said Autodesk Sketchbook is the best drawing app out there, as I haven’t used any others. For a free app, it has a wide variety of features that will accommodate novices and experts alike.

1.1.1.1

Developed by the team who brought you the Cloudflare CDN infrastructure, 1.1.1.1 is an app for a faster and more private internet. This is something I always have running in the background as a form of protection when using public hotspots and to stop my ISP from snooping on where I go on the internet.

When compared to other DNS directory services, Cloudflare touts 1.1.1.1 as the fastest. As everything you do on the internet starts with a DNS request, choosing the fastest DNS directory will accelerate the online experience.

Searching For A New Tablet and Going Back To iPad

I’ve been looking for a tablet for quite some time and doing some in-depth research on the best one to get. I am always a stickler for detail, wanting the best available for my budget and specification.

I’ve only ever owned two tablets in the past: an iPad 2 and a Nexus 7. Being someone who has cemented himself in the Android/Google ecosystem, I automatically got along with the Nexus, and it quickly became my daily driver for web browsing and reading the vast variety of books from Amazon and Google Books. That was 5-6 years ago. The tablet game has changed... No longer is it just about viewing information, watching videos with some minor swipe gestures and basic gaming. It’s more!

Ever since Microsoft released the first version of their Surface tablet computer, it has shifted industry standards for what we should now expect from a tablet, which then led to more innovation, such as:

  • Keyboard support
  • Writing with palm rejection (not that old school stylus from yesteryear!)
  • Multitasking with the ability to view multiple apps in one screen, which is only getting better by the day!
  • Near laptop replacement - We’ll go into this a little later

I wasn’t so quick to jump on the new iterations of tablets entering the market, as I was waiting to see the proof in the pudding and for prices to go down. I just don’t think it’s worth spending over £600 on a tablet - looking at you, iPad Pro! Nevertheless, after initially piquing my interest, they now have my full attention. For the first time in a long time, I could see how having a tablet could be useful in my day-to-day activities again.

Do I Really Need A Tablet?

Short answer: Yes.

If you had asked me this question last year, I would more than likely have said no. My Pixel 2 smartphone fit the bill for my portable needs. Tablet life had been relegated to just holidays and long weekends away.

The only thing that has changed is the increased amount of blogging and writing I now do. Typing on a smartphone for long periods really made my thumbs tired when I didn’t have a computer to hand. On the other hand, I found lugging around my 15" MacBook Pro just for writing a little excessive, and it lacks all-day battery life.

I could see myself buying a tablet along with a Bluetooth keyboard for quick note-taking at conferences and for writing something a little more in-depth. Anyone who writes will probably tell you that when you have a sudden spark of inspiration, you need to just write it down.

Conundrum: To Android or Not To Android

There seems to be a real lack of good Android tablets around with good build quality, a vanilla OS and accessories to match. It’s almost guaranteed that if you go for an Android tablet, you’ll be subjected to inferior, cheap cases and hardware. This was indeed the case when I was looking for a nice flip case for my old Nexus 7.

One would be forgiven for getting the impression that accessory manufacturers don’t give Android tablets the time of day - very annoying. I still love the nice leather Kavaj case I purchased soon after getting my iPad 2. My iPad 2 may not be getting used, but it still looks the part resting on the bookshelf! I guess it’s understandable why accessory manufacturers aren’t providing the goods where there is limited demand. It all comes down to a lack of flagship Android devices, and I was hoping the Pixel Slate would change this. Not a chance! I really wanted to go for the Pixel Slate, but the main unknown for me is the longevity of a device that starts at £749.

The only choice was to consider an iPad.

What About The Microsoft Surface?

The Microsoft Surface is a computing powerhouse, and if I needed another laptop, this would have been a great purchase. I look forward to owning one in the future. Again, it all comes down to price. You have to take into consideration the cost of the computer itself as well as the added Type Cover. Plus, I feared I would be greeted by a long Windows Update whenever I had a sudden spark of writing inspiration.

Choosing An iPad

Apple has positioned its iPad lineup to meet all demands.

I opted for the iPad Air for the 10.5" screen, A12 Bionic processor and Smart Connector. The Smart Connector was previously available only on the Pro series, and it’s a welcome addition to their mid-range tablet, as it gives me the ability to connect a keyboard cover and any further peripherals that may be on the horizon. Future proof!

The performance and multitasking support are pretty good as well. I am writing this very post with Evernote in one window and Chrome in another, whilst listening to Spotify.

Near Laptop Replacement and iPad OS

I have no expectation of making the iPad a laptop replacement, but it’s the nearest experience to one. In all honesty, I don’t understand how some people even find the iPad Pro a complete replacement. Would someone enlighten me?

I found using Jump Desktop to remote onto my laptop a really good way to get a laptop experience on my iPad. Very useful when I need to use applications I’d never be able to run on a tablet like VMWare. Jump Desktop is one of the best remote desktop applications you can use on an iPad.

Jump Desktop features one of the fastest RDP rendering engines on the planet. Built in-house and hand tuned for high performance on mobile devices. Jump’s RDP engine also supports audio streaming, printer and folder sharing, multi-monitors, touch redirection, RD Gateway and international keyboards.

I am really looking forward to the release of iPadOS, as it might lead to a more immersive experience that closes the gap to the basic features we expect from a laptop. I’ve always felt iOS still lacks some of the features currently present in Android, such as widgets to see app activity at a glance and more control over your files.

Conclusion

If you’re looking for a tablet that has the capability to do a lot of wonderful things with a lot of nice supported accessories, you can’t go wrong with an iPad Air.

Update (21/06/2019): Android Abandoning The Tablet Market

Well, this totally caught me off guard. The Verge reported on 20th June that Google is exiting the tablet market and concentrating its efforts on building laptops. This further validates my purchase - choosing an iPad was indeed the right decision.

Export Changes Between Two Git Commits In SourceTree

I should start off by saying how much I love TortoiseGit. It has always been my reliable source control tool, even though it's a bit of a nightmare to set up initially to work alongside Bitbucket. But due to a new development environment for an external project, I am somewhat forced to use the preinstalled Git programs:

  • SourceTree
  • Git Bash

I am more inclined to use a GUI when interacting with my repositories and use the command line when necessary.

One thing that has been missing from SourceTree ever since it was released is the ability to export changes over multiple commits. I was hoping that after many years this feature would be incorporated. Alas, no. After Googling around, I came across a StackOverflow post that showed the only way to export changes in SourceTree based on multiple commits is by using a combination of the git archive and git diff commands:

git archive --output=archived_changes.zip HEAD $(git diff --diff-filter=ACMRTUXB --name-only hash1 hash2)

This can be run directly from the Terminal window for a repository in SourceTree. The "hash1" and "hash2" values are the full 40-character commit IDs.
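
If you don't have the full commit IDs to hand, git can list them for you. Running the following in the same terminal window prints the last five commits with their full 40-character hashes:

git log -n 5 --format="%H %s"

Alternatively, git rev-parse will expand an abbreviated hash into its full form (the short hash below is just a placeholder):

git rev-parse 1a2b3c4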

The StackOverflow post helped me achieve what I needed, but as a learning exercise I want to take things a step further in this post and understand what the command is actually doing. So let's dissect the command into manageable chunks.

Part 1

git archive --output=archived_changes.zip HEAD

This creates an archive of the whole repository as a zip file. The next parts narrow this down to just the files we need.

Part 2

git diff --diff-filter=ACMRTUXB

The git diff command shows the changes between commits. The --diff-filter option gives us more flexibility to select only the files that are:

  • A Added
  • C Copied
  • D Deleted
  • M Modified
  • R Renamed
  • T have their type (mode) changed
  • U Unmerged
  • X Unknown
  • B have had their pairing Broken

Part 3

--name-only hash1 hash2

The second part of the git diff command uses the "--name-only" option, which outputs just the names of the files that have changed between the two commit hashes entered.

Part 4

The git diff command is wrapped in "$( )" command substitution so that its output - the list of changed file paths - is passed as arguments to the git archive command.
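
Before running the full command, it can be worth previewing exactly which files will end up in the archive by running the inner command on its own (again substituting your own commit IDs for hash1 and hash2):

git diff --diff-filter=ACMRTUXB --name-only hash1 hash2

Whatever file paths this prints are what git archive will package into the zip.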

Kentico Cloud Certified

Over the Bank Holiday weekend, I had some time to kill one evening and decided to have a go at the Kentico Cloud exam to become a certified developer. Taking the exam is a natural progression to prove oneself as an expert on the platform, especially as I have been using Kentico Cloud since it was first released. Time to put my experience to the test!

Unlike the traditional Kentico CMS developer exams, the Kentico Cloud exam consists of 40 questions to complete within 40 minutes. The pass mark is still the same at 70%.

Even though I have been using Kentico Cloud for many years, I highly recommend developers get themselves certified, providing they are familiar with the interface, have built a few applications already and have had exposure to the API endpoints. The exam itself is platform-agnostic and you won't be tested on any language-specific knowledge.

The surprising thing I found after completing the exam is that it gave me a greater awareness of what Kentico Cloud does as a platform and touched upon areas I wouldn't necessarily have been familiar with. There's certainly more to Kentico Cloud than meets the eye!

Temporary ASP.NET Files Directory Compilation Error After Project Rename

After renaming my MVC project from "SurinderBhomra" to "Site.Web" (to come across as less self-serving), I got the following error:

Compiler Error Message: CS0246: The type or namespace name 'xxx' could not be found (are you missing a using directive or an assembly reference?)

CS0246 Compiler Error - Temporary ASP.NET Files

But the misleading part of this compiler error is the source file reference to the Temporary ASP.NET Files directory, which led me to believe my build was being cached, when it was actually down to the fact that I had missed some places where the old project name still remained.

I carried out a careful rename throughout my project by updating all the places that mattered, such as:

  • Namespaces
  • Using statements
  • "AssemblyTitle" and "AssemblyProduct" attributes in AssemblyInfo.cs
  • Assembly name and Default Namespace in Project properties (followed by a rebuild)

The key area I missed that caused the above compiler error was the namespaces section in the /Views/web.config file:

<system.web.webPages.razor>
  <host factoryType="System.Web.Mvc.MvcWebRazorHostFactory, System.Web.Mvc, Version=5.2.3.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
  <pages pageBaseType="System.Web.Mvc.WebViewPage">
    <namespaces>
      ...
      <add namespace="System.Web" />
      ...
    </namespaces>
  </pages>
</system.web.webPages.razor>

When you first create your project in Visual Studio, it automatically adds its original namespace to this file. The same goes for any other web.config you happen to have nested in other areas of your MVC project.

Automatically Backing Up Plesk Data To Synology

In light of my hosting issues over the last week, I decided it was time to take measures to ensure all websites under my hosting provider are always backed up automatically. I generally take hosting backups offsite on an ad-hoc basis and entrust the hosting provider to keep up their end of the bargain by doing this on my behalf.

If you are with a hosting provider (like I was previously - A2 Hosting) who talks the talk but can't actually walk the walk with regards to the service they offer, you will more than likely end up having backup woes. It's always best practice to take control of your own backups, and if this can be automated, it makes life so much easier!

All Plesk panels have a "Backup Manager" area where you can run manual or scheduled backup processes. Depending on your hosting provider, the features shown in this area might vary. Some have the option to back up straight to your Dropbox account. What we will be focusing on is remotely backing up our website data to a Synology NAS using FTP.

Before we log into Plesk to select our Remote Backup option, we need to carry out some setup on our Synology.

Port Forwarding for FTP and FTPS Protocols

Most likely, your router will have a limited number of ports open to allow outside internet traffic to enter the local network. To make the most of your Synology, there is a recommended number of ports you need to open to make use of all the services.

We are interested in opening the following ports:

  • FTP: 21
  • FTPS: 990

I prefer to send over any data using FTPS just for better security.

You will have to log in to your router settings to open the ports. I would provide some instructions on how to do this, but every router is different. I just about managed to find these settings hidden away in my own Billion router.

Synology Setup

Setting up FTP is pretty straight-forward. Just make sure you have administrative privileges to access the Control Panel.

Enable FTP

In Control Panel, go to: File services > FTP Tab.

All we need to do here is to enable two FTP settings:

  • Enable FTP service (no encryption)
  • Enable FTP SSL/TLS encryption (FTPS)

Synology Control Panel - Enable FTP

The reason why I selected the "Enable FTP service (no encryption)" option is purely for initial testing purposes. If there are any issues when making a connection via FTP from a new service for the first time, I just like to check whether a successful connection can be made via standard FTP. After my testing is done, I disable this option.

Create a Synology User

I prefer to create a new user specifically for FTP connections rather than use my own main account, as I can lock access down to read and write permissions in its home directory only. No Synology services or applications will be accessible.

Synology FTP User Permissions

The only application I allow my user to access is "FTP".

Synology FTP User Application Permissions

Plesk Backup Manager Setup

FTP Configuration

In Backup Manager, go to "Remote Storage Settings" and select "FTP". Enter the following settings along with your user credentials:

  • FTP server hostname or IP: http://mysynology.synology.me
  • Directory for backup files storage: /home
  • Use passive mode:  Yes
  • Use FTPS: Yes

Clicking the "OK" or "Apply" button should return no errors. If there are errors, check the logs and ensure you haven't missed any permissions for your Synology user.
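
If Plesk does complain, it can be worth confirming the credentials and FTPS setup independently before digging into permissions. A quick sanity check from any machine with curl installed might look like this (the hostname and username are just examples, and curl will prompt for the password):

curl -v --ssl-reqd --user backupuser ftp://mysynology.synology.me/

A successful directory listing here means the Synology side is fine, and any remaining errors are down to the Plesk configuration.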

Set Backup Schedule

Now we have set our remote storage settings, we need to put a schedule in place to generate backups as often as we require. It's up to you how regularly you run them. I've set mine to run daily at 11pm and to retain the backups for a month.

Plesk Scheduled Backup

Make sure you set the Backup settings to store the backup in your newly created FTP storage.

A Word To The Wise

Just because we now have automatic backups running to protect us from any unforeseen hosting issues, this doesn't mean we're all in the clear. Backups are useless to us if they don't work. I check my backups at least once a week and ensure the most recently backed up file is free from corruption and can be opened.

Structuring Navigation and Other Page Properties In Kentico 12 MVC

As great as it is building sites using the MVC framework in Kentico, I do miss some of the page features we’re spoilt with in previous editions of Kentico.

Like anything, there are always workarounds to implement the features we have become accustomed to but ensuring the correct approach is key, especially when moving existing Kentico clients onto Kentico 12 MVC edition. Familiarity is ever so important for the longer tenure clients who already have experience using previous versions of Kentico.

In my Syndicut Medium post, I show how to use Kentico's existing fields from the CMS_Document table to work alongside the new MVC approach.

Take a read here: https://medium.com/syndicut/adding-navigation-properties-to-pages-in-kentico-12-mvc-8f054f804af2

A2 Hosting - Can Any Hosting Provider Be Trusted?

It's been a turbulent few days at the house of A2 Hosting, where not only all of their Windows hosting but also a number of their WordPress hosting servers (as of 23rd April) have come to a standstill. After much pressing by its customers, it has come to light that a malware-related security breach caused an outage, not just in one service but across many parts of A2 Hosting's infrastructure.

It's now been 3 days and counting, and the outage still persists. Luckily, I managed to move back to my old hosting provider after waiting patiently for 2 days for some form of recovery, and I'm glad I did! I truly feel sorry for the many others who are still waiting on some form of resolution. I think I managed to get out from under A2 Hosting relatively unscathed.

This whole outage has caused me to not only reflect on my time with A2 Hosting but also hosting providers in general.

The Lies

If I'm honest, the days were already numbered after getting infuriated by their support (or lack of it!) and the lies from their marketing and sales teams about meeting my relatively simple hosting needs. I like to think I'm very scrupulous when it comes to hosting and do my due diligence... In this case, A2 managed to get one over on me in that department!

I run a couple of sites on Kentico CMS, so it was important to find a hosting company that caters for this platform due to the hardware resources required to run it.

Lo and behold...

A2 Hosting - Best Kentico Hosting

That page alone filled me with confidence: a reasonable price with a lot of extras thrown in. I confirmed this by talking at length to the A2 sales team beforehand and was assured that any tier would meet my needs. So I opted for the mid-tier plan - Swift - costing around £125 for 2 years after some nice promotional offers.

Knowing what I know now, I can report that the Swift plan - and potentially all the other shared plans - does not fit the requirements of even a reasonably small Kentico site. Hosting Kentico on A2 Hosting was the bane of my life, as every so often my site would randomly time out, with only one explanation from their support team:

We suggest you optimize your website with help from your web developer to fix the issue.

After politely requesting more information on the issue, and even entertaining the idea that I might need to upgrade my hosting, I never did get an adequate reason. My website's efficiency was always to blame.

Don't Believe Them, Don't Trust Them

Lack of Transparency

In light of recent events, transparency isn't one of A2 Hosting's strengths (unless pressed by its many customers). When problems arise, I'd prefer to know exactly what the root cause is. Knowing this actually gives me more confidence in a hosting provider. I think we all know the feeling when we're not given the full picture.

Our minds have a habit of thinking of a worst-case scenario when we do not have the full picture.

Honesty is the best policy!

A2 Hosting Tweet - Transparency
(Example of A2 Hosting Lack of Transparency)

99.9% Uptime Promise

In reality, I don't expect 99.9% uptime from hosting providers as things do happen due to unforeseen circumstances. But I still expect the 98-99% range.

A2 Hosting - 99.9% Uptime

Judging by my uptime monitoring, I have never been blessed with 99.9% uptime during my tenure (1 year out of a 2-year plan) at A2 Hosting. My site has always encountered timeouts and downtime. The last major outage was around 2 months ago - amounting to up to 24 hours of downtime!

Trusting Your Hosting Provider

Whether your website is big or small, handing over your online presence to a third party is a big deal. You are wholeheartedly trusting a company to house your website with tender loving care. Any downtime or slow loading times can negatively impact your client base and SEO.

I've learnt that a hosting provider can have many 5-star reviews and still lack the infrastructure and support to back them up. In fact, this is what perplexed me about A2 Hosting's many positive reviews.

Quality, appropriately priced hosting is very difficult to find. There are so many options, but the hosting industry has the classic problem of quantity over quality.

Backups

Regardless of how good any hosting company is, I would always recommend you take suitable measures to regularly carry out offsite backups of all your sites. Yes, this can be a laborious task if you are managing many sites, but it's the only way to be 100% sure you are in control.

This was the only way I was able to move swiftly back to Softsys Hosting and not wait on A2 Hosting to restore their services. At one point, there was even a question mark over the state A2 Hosting's own backups were in.

Tweet - A2 Hosting Backup
(A2 Hosting Questionable Backups)

Moving Back To Previous Hosting

Believe it or not, I can't remember the exact reason why I left Softsys Hosting. After all, I never had any issues with them throughout the 9 years I was with them. A very accommodating bunch of guys! I think what attracted me to A2 Hosting was their shiny website, the promise of faster load times and the option to have my site hosted on UK servers.

It's always an absolute pain having to move and set everything back up again. But thanks to Ruchir at Softsys Hosting, who was very attentive in helping me during my predicament and answering all my queries, I managed to achieve a quick turnaround. In total, my site was only down for just under 2 days.

It seems quite apt that I come back to the hosting provider I call home for the same reasons I started using them in the first place back in 2009, when I was failed by my first ever hosting provider (Ultima Hosts). Oh, the irony!

Conclusion

Unfortunately, there isn't an exact science to finding the ideal hosting provider for your budget and requirements. If you ever have any qualms about your current hosting provider, you might well have good reason to. Hosting should be worry- and hassle-free, knowing that your data is in the hands of capable people. If you have the finances to move, just do it. Hardware can be replaced; data cannot. Data is a commodity!

Take online reviews with a pinch of salt. Instead, take a look at the existing users responses through their main Twitter and status accounts. Some might even have status pages. This will hopefully give you a more unbiased view on their operation and approach to resolving past issues.


Update - 26/04/2019

I have asked A2 Hosting for some form of compensation, especially since I purchased 2 years up front, and I am awaiting their response on the exact amount. I am hoping they will add some additional compensation as a goodwill gesture for misleading me on their Kentico hosting offering.

Update - 27/04/2019

As of 27/04/2019 8pm (GMT), I managed to log back into A2 Hosting Plesk Administration to get a more recent backup of my hosting. Noticed there were some database errors in the process.

Update - 01/05/2019

Not looking good. I think there is a very slim chance of getting any form of reimbursement from A2 Hosting, as they have decided to delete my support ticket. Not "close" it, but actually delete it. I thought this was probably a mistake, but after delving into the mass of responses from many other unhappy users, it seems I am not the only one.

Tweet - A2 Hosting Deleting Tickets

One can only assume that A2 Hosting are washing their hands of any form of user correspondence. There haven't been any further substantial updates or timescales for when services will resume. I am still waiting for the ability to carry out a proper backup.

Responsive Images In ASP.NET: Converting Image Tag To Picture Tag

A picture tag allows us to serve different sized images based on different viewport breakpoints or pixel-ratios, resulting in better page load performance. Google's Pagespeed Insights negatively scores your site if responsive images aren't used. Pretty much all modern browsers support this markup and on the off chance it doesn't, an image fallback can be set.

Using the picture markup inside page templates is pretty straight-forward, but when it comes to CMS related content where HTML editors only accommodate image tags, it's really difficult to get someone like a client to add this form of markup. So the only workaround is to transform any image tag into a picture tag at code-level.

Code: ConvertImageToPictureTag Extension Method

The ConvertImageToPictureTag method will perform the following tasks:

  1. Loop through all image tags.
  2. Get the URL of the image from the "src" attribute.
  3. Get other attributes such as "alt" and "style".
  4. Generate picture markup and add as many source elements based on the viewport breakpoints required, apply the URL of the image, style and alt text.
  5. Replace the original image tag with the new picture tag.

The ConvertImageToPictureTag code uses HtmlAgilityPack, making it very easy to loop through all HTML nodes and manipulate the markup. In addition, this implementation relies on a lightweight client-side JavaScript plugin - lazysizes. The lazysizes plugin will delay the loading of the higher resolution image based on the viewport rules in the picture tag until the image is scrolled into view.
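
The original extension method itself isn't reproduced here, but a rough sketch of the approach using HtmlAgilityPack might look something like the following. The parameter names, the "width" query string convention and the lazysizes "data-src"/"lazyload" attributes are assumptions based on the usage example and sidenote further down, so treat this as a starting point rather than the actual implementation:

using System;
using System.Linq;
using System.Text;
using System.Web;
using HtmlAgilityPack;

public static class HtmlContentExtensions
{
    // Sketch only - not the original implementation.
    public static string ConvertImageToPictureTag(this string html,
                                                  int resizeIncrementPercent,
                                                  int minimumImageWidth,
                                                  params int[] viewportSizes)
    {
        HtmlDocument document = new HtmlDocument();
        document.LoadHtml(html);

        HtmlNodeCollection imageNodes = document.DocumentNode.SelectNodes("//img");

        if (imageNodes == null)
            return html;

        foreach (HtmlNode image in imageNodes.ToList())
        {
            string src = image.GetAttributeValue("src", string.Empty);
            string alt = image.GetAttributeValue("alt", string.Empty);
            string style = image.GetAttributeValue("style", string.Empty);

            // The image URL is expected to carry a "width" query parameter
            // describing the image at its largest size (see the sidenote below).
            string[] srcParts = src.Split('?');
            string baseUrl = srcParts[0];
            int maxWidth = 0;

            if (srcParts.Length > 1)
                int.TryParse(HttpUtility.ParseQueryString(srcParts[1])["width"], out maxWidth);

            // Leave small images untouched.
            if (maxWidth <= minimumImageWidth)
                continue;

            StringBuilder picture = new StringBuilder("<picture>");

            // One <source> per viewport breakpoint, largest first. A fuller
            // implementation would also generate extra srcset candidates in
            // resizeIncrementPercent steps.
            foreach (int viewport in viewportSizes.OrderByDescending(v => v))
            {
                int width = Math.Min(maxWidth, viewport);

                picture.AppendFormat("<source media=\"(min-width: {0}px)\" data-srcset=\"{1}?width={2}\" />",
                                     viewport, baseUrl, width);
            }

            // Fallback image - the "lazyload" class hands loading over to lazysizes.
            picture.AppendFormat("<img data-src=\"{0}?width={1}\" alt=\"{2}\" style=\"{3}\" class=\"lazyload\" />",
                                 baseUrl, maxWidth, alt, style);
            picture.Append("</picture>");

            HtmlNode pictureNode = HtmlNode.CreateNode(picture.ToString());
            image.ParentNode.ReplaceChild(pictureNode, image);
        }

        return document.DocumentNode.OuterHtml;
    }
}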

To use this extension, call it on any string containing HTML markup, like so:

// The HTML markup will generate responsive images based on the following parameters:
// - Images to be resized in 10% increments.
// - Images have to be more than 200px wide.
// - Viewport sizes to take into consideration: 1000, 768, 300.
string contentWithResponsiveImages = myHtmlContent.ConvertImageToPictureTag(10, 200, 1000, 768, 300);

Sidenote

The code I've shown doesn't carry out any image resizing; you will need to integrate that yourself. Generally, any good content management platform will have the capability to serve responsive images. In my case, I use Kentico and can resize images by adding a "width" and/or "height" query parameter to the image URL.

In addition, every image URL used inside an image tag's "src" attribute requires a width query string parameter. The value of the width parameter is the size of the image in its largest form. Depending on the platform used, the URL structure for rendering image sizes might differ. This is the only place where the code will need to be retrofitted to your own use case.

ASP.NET Developers Who Use Eval()... Evaluate Yourself!

The title of this post might seem a tad extreme, but I just feel so strongly about it! Ever since I started learning ASP.NET all those years ago, I've never been a fan of using "Eval" in the data-bound controls I primarily use, such as GridViews, Repeaters and DataLists. When I see it still being used regularly in web applications, I cringe a little, and I feel I need to express some reasons why it should stop being used.

I think working on an application handed down to me from an external development agency pushed me to write this post... Let's call it a form of therapy! I won't make this post a rant and will "try" to be constructive and concise. My views might come across a little one-sided, but I promise I will start with at least one good thing to say about our evil friend Eval.

Positive: Quick To Render Simple Data

If the end goal is to list out some string values as is from the database with some minor manipulation from a relatively small dataset, I almost have no problem with that, even though I still believe it can be used and abused by inexperienced developers.

Negative: Debugging

The main disadvantage of embedding code inside your design file (.aspx or .ascx) is that it's not very easy to view the output during debugging. This causes a further headache when your Eval contains some conditional statements to alter the output on a row-by-row basis.

Negative: Difficult To Carry Out Complex HTML Changes

I wouldn't recommend using Eval in scenarios where data-bound rows require some form of HTML change. I've seen some ugly implementations where complex conditional statements were used to list out data in a creative way. If the HTML ever had to be changed through design updates, it would be a lot more time-consuming to carry out compared to moving around some form controls that are data-bound through a RowDataBound event.

Negative: Ugly To Look At

This point will come across very superficial. Nevertheless, what I find painful to look at is when Eval is still used to carry out more functionality by calling additional methods and potentially repeating the same functionality numerous times.

Performance/Efficiency

From my research, it's not clear if there specifically is a performance impact in using Eval alone, especially with the advances in the .NET framework over the years. A post from 2012 on StackExchange brought up a very good point:

Eval uses reflection to get the value of the relevant property/field, and using Reflection to get values from object members is very slow.

If the type of an object is known ahead of time, you're better off declaring it explicitly. After all, it's good coding practice. In the real world, the performance impact is nominal depending on the number of records you are dealing with, but it's not an approach I'd recommend for building scalable applications. I generally notice a slowdown (in milliseconds) when outputting 500 rows of data.

I have read that reflection is not as much of an issue in the most recent versions of the .NET framework when compared to say, .NET 1.1. But I am unable to find any concrete evidence of this. Regardless, I'd always prefer to use the faster approach, even if I am happening to shave off a few milliseconds in the process.

Conclusion

Just don't use Eval. Regardless of the size of the dataset I am dealing with, there would only be two approaches I'd ever use:

  1. RowDataBound event: A control's RowDataBound event is triggered every time a row is data-bound. This approach enables us to modify a row's appearance and structure depending on the rules we have in place (see the sketch after this list).
  2. Start From Scratch: Construct the HTML markup by hand based on the data source and render it to the page.
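
To illustrate option 1, below is a minimal sketch of a strongly typed RowDataBound handler. The "Article" class, grid name and control ID are hypothetical; the point is the single explicit cast in place of reflection-based Eval calls:

using System.Web.UI.WebControls;

public class Article
{
    // Hypothetical data item bound to the grid.
    public string Title { get; set; }
    public bool IsFeatured { get; set; }
}

public partial class ArticlesPage : System.Web.UI.Page
{
    protected void ArticlesGrid_RowDataBound(object sender, GridViewRowEventArgs e)
    {
        if (e.Row.RowType != DataControlRowType.DataRow)
            return;

        // One explicit cast per row - no reflection, and property names are
        // checked at compile time rather than failing at runtime.
        Article article = (Article)e.Row.DataItem;

        Literal titleLiteral = (Literal)e.Row.FindControl("TitleLiteral");
        titleLiteral.Text = article.Title;

        // Row-level presentation tweaks are straightforward here too.
        if (article.IsFeatured)
        {
            e.Row.CssClass = "featured";
        }
    }
}

The corresponding TemplateField markup then only needs an asp:Literal with the matching ID and no inline Eval at all.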

If I were building a scalable application dealing with thousands of rows of data, I'd generally be inclined to go for option 2. As you're not relying on a .NET control, you won't be contributing to the page's view state.

Even though I have been working on a lot more applications using MVC, where I have more control over streamlining the page output, I still have to dabble with Web Forms. I feel that with Web Forms it's very easy to make a page that performs really badly, which makes it even more important to ensure you are taking all the necessary steps to ensure efficiency.