Blog

Posts written in July 2012.

  • Today I came across this really interesting tweet on my Twitter timeline:

    Read about why we’re deleting our Facebook page: facebook.com/limitedpressin… — Limited Run (@limitedrun) July 30, 2012

    Limited Run posted on their Facebook profile stating that they would be deleting their account because of the amount Facebook charges for clicks on their advertising. Here’s the interesting part: for about 80% of the clicks Facebook charged Limited Run, JavaScript wasn’t enabled. And if the person clicking the ad doesn’t have JavaScript, it’s very difficult for an analytics service to verify the click. Only 1-2% of people going to their site have JavaScript disabled, nowhere near the 80% seen in the clicks coming from Facebook.

    Interesting stuff.
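
    As a side note, here’s a minimal sketch of how a site could measure this for itself (hypothetical ASP.NET code, not Limited Run’s actual implementation): log every raw landing-page hit server-side, and log a second hit from a JavaScript beacon on the page. Clicks that never fire the beacon came from visitors without JavaScript.

    using System.Threading;
    using System.Web;

    // Hypothetical tracking handler: compares raw landing-page hits against
    // hits confirmed by a JavaScript beacon. The gap between the two counts
    // approximates the share of clicks arriving without JavaScript enabled.
    public class ClickTracker : IHttpHandler
    {
        private static int rawClicks;      // every request to the landing page
        private static int jsConfirmed;    // requests fired by the JS beacon

        public void ProcessRequest(HttpContext context)
        {
            if (context.Request.QueryString["beacon"] == "1")
            {
                // Fired by a snippet on the page, e.g.
                // <script>new Image().src = "/track.ashx?beacon=1";</script>
                Interlocked.Increment(ref jsConfirmed);
            }
            else
            {
                Interlocked.Increment(ref rawClicks);
            }
        }

        public bool IsReusable { get { return true; } }
    }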

    Before Limited Run takes down their Facebook profile, I’ve attached a screenshot of their post below:

    Limited Pressing Facebook Post

    Reading this post today reminded me of a news article I read on “virtual likes” and how advertising through Facebook doesn’t necessarily mean you’ll be any better off. It all comes down to the level of engagement users have with a profile page. If users are just liking the page and not interacting with your posts or general content, those likes are worth nothing. Some companies are wising up to how ineffective Facebook’s advertising strategy can be.

    Limited Run isn’t the first to ditch Facebook ads. General Motors pulled away from Facebook ads earlier this year because the ads Facebook produces don’t have the visual impact needed to justify the cost.

    I think certain aspects of Facebook are a joke, filled mostly with people looking for attention; it’s not an effective marketing tool.

  • If I need to log in and authenticate a Facebook user in my ASP.NET website, I either use Facebook Connect’s JavaScript library or SocialAuth.NET. Even though these two methods are sufficient for the purpose, I don’t think they’re the most ideal or efficient way.

    The Facebook Connect JavaScript library is quite basic and doesn’t have the flexibility required for full .NET integration through FormsAuthentication, whereas SocialAuth.NET provides full .NET integration and performs all authentication server-side with minimal development.

    I'd say if you are looking for a straightforward way to integrate social site authentication, SocialAuth.NET is the way to go. Its API can communicate with other social sites such as Twitter, LinkedIn and Gmail.
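
    Once any server-side method has verified the Facebook user, hooking the result into FormsAuthentication is straightforward. Here’s a minimal sketch (the SignIn method and facebookUserId parameter are my own illustration, not part of either library):

    using System.Web.Security;

    public static class SocialSignIn
    {
        // Issue the standard ASP.NET forms-authentication cookie once the
        // social provider has verified the user, so the rest of the site
        // treats them like any other logged-in user.
        public static void SignIn(string facebookUserId)
        {
            // false = session cookie only, not persistent across browser restarts
            FormsAuthentication.SetAuthCookie(facebookUserId, false);
        }
    }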

    Recently, I found a better and more efficient way to authenticate Facebook users on my site using the Graph API and Hammock.

    Hammock is a C# REST library for .NET that greatly simplifies consuming and wrapping RESTful services. This allows us to embrace the social site’s core technology instead of using varied SDKs or APIs. There are many community-driven frameworks and APIs readily available on the Internet, but they can really cause problems if they evolve too quickly or haven’t been thoroughly tested.

    Suddenelfilio has written a useful blog post on connecting to Facebook using Hammock. You will see from his example that you can interact with Facebook any way you want.

    The same principle could also be applied to other website APIs that use REST-based services, such as Twitter.
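
    To give a flavour of the approach, here’s a minimal sketch of fetching the current user’s profile from the Graph API with Hammock. This is my illustration rather than Suddenelfilio’s exact code, and it assumes you already hold a valid OAuth access token:

    using System;
    using Hammock;

    public class GraphApiExample
    {
        public static void Main()
        {
            // Point Hammock at the Graph API root
            var client = new RestClient { Authority = "https://graph.facebook.com" };

            // Request the "me" resource, authenticated with the user's token
            var request = new RestRequest { Path = "me" };
            request.AddParameter("access_token", "ACCESS_TOKEN_HERE"); // placeholder

            // Hammock performs the HTTP call; Content holds the raw JSON response
            RestResponse response = client.Request(request);
            Console.WriteLine(response.Content);
        }
    }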

  • I always found writing code to read an RSS feed within my .NET application very time-consuming and long-winded. My RSS code was always a combination of using WebRequest, WebResponse, Stream, XmlDocument, XmlNodeList and XmlNode. That’s a lot of classes just to read an RSS feed.

    Yesterday, I stumbled on an interesting piece of code on my favourite programming site StackOverflow.com, where someone asked how to parse an RSS feed in ASP.NET. The answer was surprisingly simple. RSS feeds can now be consumed using the System.ServiceModel.Syndication namespace in .NET 3.5 SP1. All you need is two lines of code:

    // Requires the System.Xml and System.ServiceModel.Syndication namespaces
    var reader = XmlReader.Create("http://mysite.com/feeds/serializedFeed.xml");
    var feed = SyndicationFeed.Load(reader);
    

    Here’s a full example of how we can iterate through the SyndicationFeed class:

    public static List<BlogPost> Get(string rssFeedUrl)
    {
        var postList = new List<BlogPost>();

        // Dispose of the XmlReader once the feed has been loaded
        using (var reader = XmlReader.Create(rssFeedUrl))
        {
            var feed = SyndicationFeed.Load(reader);

            // Loop through all items in the SyndicationFeed
            foreach (var i in feed.Items)
            {
                BlogPost bp = new BlogPost();
                bp.Title = i.Title.Text;
                bp.Body = i.Summary.Text;
                bp.Url = i.Links[0].Uri.OriginalString;
                postList.Add(bp);
            }
        }

        return postList;
    }
    

    That’s too simple, especially when compared to the 70 lines of code I normally use to do the exact same thing.
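
    For completeness, the example above assumes a simple BlogPost container class. The original snippet doesn’t show it, so treat this shape as illustrative:

    // Illustrative DTO holding the fields we pull from each feed item
    public class BlogPost
    {
        public string Title { get; set; }
        public string Body { get; set; }
        public string Url { get; set; }
    }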

  • When I decided to expand my online presence, I thought the best first step would be to get a better domain name. My current domain name is around twenty-nine characters in length. Ouch! So I was determined to find another name that was shorter and easier to remember.

    As soon as the “.me” top-level domain (TLD) came out, I snapped up “surinder.me”, partly because all other domains with my first name were gone (you know who you are!) and partly because the “.me” extension seemed to fulfil what I wanted my website to focus on: ME! Having said that, I would have loved to get a “.com” domain, but I guess that’s what happens when you enter the online world so late.

    I was ready to move all my content over to “surinder.me” until one of my techy friends told me that things are still undecided when it comes to “.me” TLDs in general. Originally, the “.me” extension was assigned to Montenegro’s locale only, but it has fast gained traction over the years due to its simplicity and the wide range of possible domain names. Even companies such as Microsoft, Facebook, WordPress and Samsung rushed to register their “.me” domains. Hence why I decided to get one.

    Companies seem to be using “.me” extensions either for URL-shortening services or for redirects to partner sites with “.com” extensions. It doesn’t fill me with much confidence when “.me” extensions are used this way. Google’s software engineer Matt Cutts wrote a reassuring post on his Google+ profile earlier this year, stating:

    “…regardless of the top-level domain (TLD). Google will attempt to rank new TLDs appropriately, but I don't expect a new TLD to get any kind of initial preference over .com…If you want to register an entirely new TLD for other reasons, that's your choice, but you shouldn't register a TLD in the mistaken belief that you'll get some sort of boost in search engine rankings.”

    This should put all my “.me” fears to rest… right? Well, it’s nice to know Google won’t penalise a site based on its extension. In the world of the web, a search-optimised site is king (as it should be). It’s also nice that Google has given “.me” (technically a country extension) global status, given how it’s been used of late. But if you check Google’s Geotargetable Domains article, the text in brackets worries me.

    Google’s Webmaster Tools Geotargetable Domains

    I get the feeling you can’t go wrong with a “.com” domain, providing you can find something meaningful to your cause. Steps are being made in the right direction for gccTLDs. For example, Webmaster Tools gives you the option to geographically target your “.me” site. However, I can’t find anything concrete to alleviate my concerns in the long run.

    So where does this leave me? Well, we’ll just have to find out if my future domain contains a .me extension. :)