Beginner’s Guide To Using Google Plus .NET API Part 2: User Posts

Ok, I’ll admit Part 2 of my “Beginner’s Guide To Using Google Plus .NET API” has been on the back-burner for some time (or maybe I completely forgot about it). After receiving quite a few emails on the subject recently, I thought now would be the best time to continue with Part 2.

It’s recommended you take a gander at Part 1 before proceeding to this post.

As the title suggests, I will be showing how to output a user’s publicly viewable posts. The final output of what my code produces can be seen on my homepage under the “Google+ Posts” section.

Create Class Object

We will create a class called “GooglePlusPost” to allow us to easily store each item of post data within a generic list.

public class GooglePlusPost
{
    public string Title { get; set; }
    public string Text { get; set; }
    public string PostType { get; set; }
    public string Url { get; set; }
}

 

Let’s Get The Posts!

I have created a method called “GetPosts” that accepts a parameter specifying the number of posts to return.

public class GooglePlus
{
    private static string ProfileID = ConfigurationManager.AppSettings["googleplus.profileid"];
    
    public static List<GooglePlusPost> GetPosts(int max)
    {
        try
        {
            var service = new PlusService();
            service.Key = GoogleKey; //Your API key - set up in Part 1
            var profile = service.People.Get(ProfileID).Fetch(); //Not used below, but confirms the profile ID is valid

            var posts = service.Activities.List(ProfileID, ActivitiesResource.Collection.Public);
            posts.MaxResults = max;

            List<GooglePlusPost> postList = new List<GooglePlusPost>();

            foreach (Activity a in posts.Fetch().Items)
            {
                GooglePlusPost gp = new GooglePlusPost();

                //If the post contains your own text, use it; otherwise look for
                //text contained in the post attachment.
                if (!String.IsNullOrEmpty(a.Title))
                {
                    gp.Title = a.Title;
                }
                else
                {
                    //Check if post contains an attachment
                    if (a.Object.Attachments != null)
                    {
                        gp.Title = a.Object.Attachments[0].DisplayName;
                    }
                }

                gp.PostType = a.Object.ObjectType; //Type of post
                gp.Text = a.Verb; //Activity verb (e.g. "post" or "share")
                gp.Url = a.Url; //Post URL

                postList.Add(gp);
            }

            return postList;
        }
        catch
        {
            return new List<GooglePlusPost>();
        }
    }
}

By default, I have ensured that my own post comment takes precedence over the contents of the attachment (see the title check within the loop above). If I decide to just share an attachment without a comment, the display text from the attachment is used instead.

There are quite a few facets of information an attachment contains, and this only becomes apparent when you add a breakpoint and inspect the attachment object in the debugger. For example, if the attachment has an object of type “video”, you get a wealth of information to embed a YouTube video along with its thumbnails and description.

Attachment Debug View

So there is room to make your Google+ feed much more intelligent. You just have to make sure you cater for every eventuality to ensure your feed displays something useful without breaking. I’m in the process of redoing my own Google+ feed to allow full access to content directly from my site.

Recommendation

It is recommended that you cache your collection of posts so you are not constantly making requests to Google+. You don’t want to exceed your daily request limit now, do you?

I’ve set my cache duration to refresh every three hours.
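To sketch what that could look like (this is one possible approach using `System.Runtime.Caching`, not the exact code running on my site - the `"googleplus.posts"` cache key is made up, and I’m caching a plain `List<string>` here just to keep the sketch self-contained; in practice it would be the `List<GooglePlusPost>` returned by `GetPosts`):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class PostCache
{
    //Illustrative cache key - not from the original post
    private const string CacheKey = "googleplus.posts";

    //Returns the cached list if present; otherwise runs the supplied
    //fetch delegate (e.g. () => GooglePlus.GetPosts(5)) and caches the
    //result for three hours.
    public static List<string> GetCachedPosts(Func<List<string>> fetch)
    {
        var cache = MemoryCache.Default;

        var posts = cache.Get(CacheKey) as List<string>;
        if (posts == null)
        {
            posts = fetch();
            cache.Set(CacheKey, posts, new CacheItemPolicy
            {
                //Absolute three-hour expiry, matching my refresh window
                AbsoluteExpiration = DateTimeOffset.Now.AddHours(3)
            });
        }

        return posts;
    }
}
```

Inside a web application the ASP.NET `HttpRuntime.Cache` would work just as well; the important part is the three-hour absolute expiration, so at most eight requests a day hit Google+ per server.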

The Floppy Disk Reinvented – Into a Coffee Table

You have to see it to believe it. The inner geek in me wants to purchase this.

Floppy Disk Coffee Table

The guys who made this have managed to put in an impressive amount of detail (as much detail as you can get from a floppy disk!).

Floppy disks were well known for their lack of storage space; thankfully, there’s an adequately sized secret compartment that is revealed by simply moving the metal shutter.

More images of this beauty can be seen over at Design Boom: http://www.designboom.com/design/floppy-disk-table-by-axel-van-exel-marian-neulant/

And whoever said the floppy disk is dead!?

Where touchscreen and keyboard meet in perfect harmony…

Touchscreen Laptop

I had the opportunity to try out one of my friend's new gadget purchases (someone's been a good boy this year!) - the Asus Transformer Infinity. I'd read a lot about the Asus Transformer range ever since its first release in 2011, but got the impression that this was just another worthless mishmash of tech with a mistaken identity. I never understood why anyone would buy a touchscreen tablet that had a keyboard. That just defeats the whole point of having a tablet device, does it not? How wrong I was...

The combination of a keyboard and touchscreen just works and feels perfectly natural. I always had the misconception that a mouse is needed to accurately interact with an OS user interface. The more I used the Asus Transformer, the more I wished I had one, and oddly, when returning to my Alienware m11x, I felt something was missing. In my eyes, the good ol' touchpad just seemed inadequate. In all honesty, the Asus Transformer touchpad is just as inadequate and a little flaky (possibly due to lack of support in Android). They should have left that out.

With the advent of the next generation of laptop/touchscreen hybrids such as the Microsoft Surface and Lenovo Yoga (which has an awesome ad), maybe there is method in this concept after all, and I look forward to using future variations.

For the moment, all I know for sure is I want an Asus Transformer Infinity!

It’s all about Website Hotkeys!

During the latter end of 2010, Twitter overhauled their somewhat simplistic website to compete with client-side offerings (e.g. TweetDeck, Seesmic). What I found really impressive was a hidden bit of functionality that allowed the user to navigate around the site using keyboard shortcuts (or hotkeys). If you haven't tried them, take a look at the list of shortcuts below and try them out.

Twitter Keyboard Shortcuts

Some people I know in the industry think it's a pointless feature, but I believe something so simple automatically enhances the user's experience when accessing a site. In fact, you could think of hotkeys as an additional web accessibility requirement for those who don’t have a mouse or just prefer a more direct approach to navigating a site. Many sites have been utilising hotkeys to make their sites act like locally installed software programmes, for example Google Docs.

I was very keen on replicating hotkey functionality in my next project. Not surprisingly, there are a lot of JavaScript libraries that allow you to implement basic keyboard shortcut functionality. The best one I found through trial and error is Mousetrap. I found Mousetrap to be the most flexible library for firing your own custom JavaScript events by binding a single key press, a sequence, or a key combination.

Using Mousetrap, I could replicate a simple Twitter-style shortcut to take a user back to the homepage by pressing the following keys in sequence: “G H”:

Mousetrap.bind("g h",
    function () { 
        window.location = "/Home.aspx"; 
    }
);

It’ll Be A Sad Day When iGoogle Is No More

Amongst the number of services Google provides, the iGoogle portal has to be at the top of my list. It’s my one-stop shop for daily news, weather forecasts and playing the odd game. I was surprised when Google announced they would be discontinuing the service from November 2013. I was reminded of the deadline on my iGoogle page today, reinforcing that this really is going to happen. I was hoping Google would reconsider, but it doesn’t look like they will.

iGoogle Discontinued

Google’s decision to discontinue iGoogle is, in my opinion, a little rash. They claim: “With modern apps that run on platforms like Chrome and Android, the need for something like iGoogle has eroded over time”. And this is where the problem lies. Why does everything nowadays have to revolve around an app? Some things are best left accessible through a browser.

I like getting to work in the mornings and gazing over the day’s topics. It’s bloody informative! I’ve yet to find an app that matches what iGoogle offers. iGoogle is one page where everything is displayed without having to click through to another page. Google Chrome's substitutes require me to do exactly that. Big waste of time.

I’m not the type of person to be concerned about change, and in most cases I welcome it with open arms. But this will take a little time to get used to.

Goodbye old friend, you’ll be sorely missed!

iOS Safari Browser Has A Massive Caching Issue!

Safari iOS6

It wasn’t until today that I found the Safari browser used on the iPad and iPhone caches page output to such an extent that it breaks the intended functionality. So much so, it affects the user experience. I think Apple has gone a step too far in making their browser uber-efficient to minimise page loading times.

We can accept that browsers will cache style-sheets and client-side scripts. But I never expected Safari to go as far as caching responses from web services. This is a big issue. Something as simple as the following will have issues in Safari:

// JavaScript function calling web service
function GetCustomerName(id)
{
    var name = "";

    $.ajax({
        type: "POST",
        url: "/Internal/ShopService.asmx/GetCustomerName",
        data: "{ 'id' : '" + id + "' }",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        cache: false,
        async: false, //Synchronous call, so 'name' is set before the function returns
        success: function (result) {
            var data = result.d;
            name = data;
        },
        error: function () {
        },
        complete: function () {
        }
    });
    
    return name;
}
//ASP.NET Web Service method
[WebMethod]
public string GetCustomerName(int id)
{
   return CustomerHelper.GetFullName(id);
}

In the past, to ensure my jQuery AJAX requests were not cached, the “cache: false” option within the AJAX call normally sufficed. Not if you’re making POST web service requests. It’s only recently I found that the “cache: false” option has no effect on POST requests, as stated in the jQuery API:

Pages fetched with POST are never cached, so the cache and ifModified options in jQuery.ajaxSetup() have no effect on these requests.

In addition to trying to fix the problem by using the jQuery AJAX cache option, I implemented practical techniques covered by the tutorial: How to stop caching with jQuery and JavaScript.

Luckily, I found an informative StackOverflow post by someone who experienced the exact same issue a few days ago. It looks like the same caching bug is still prevalent in Apple’s newest operating system, iOS6*. Well, you didn’t expect Apple to fix important problems like these now, would you (referring to the Maps fiasco!)? The StackOverflow poster found a suitable workaround: passing a timestamp to the web service method being called, like so (modifying the code above):

// JavaScript function calling web service with time stamp addition
function GetCustomerName(id)
{
    var timestamp = new Date();

    var name = "";

    $.ajax({
        type: "POST",
        url: "/Internal/ShopService.asmx/GetCustomerName",
        data: "{ 'id' : '" + id + "', 'timestamp' : '" + timestamp.getTime() + "' }", //Timestamp parameter added.
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        cache: false,
        async: false, //Synchronous call, so 'name' is set before the function returns
        success: function (result) {
            var data = result.d;
            name = data;
        },
        error: function () {
        },
        complete: function () {
        }
    });
    
    return name;
}
//ASP.NET Web Service method with time stamp parameter
[WebMethod]
public string GetCustomerName(int id, string timestamp)
{
    string iOSTime = timestamp;
    return CustomerHelper.GetFullName(id);
}

The timestamp parameter doesn’t need to do anything once passed to the web service. Since every request now carries a unique parameter value, no call to the web service will ever be served from cache.

*UPDATE: After further testing it looks like only iOS6 contains the AJAX caching bug.
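As an aside, another angle I’ve seen suggested (an alternative to the timestamp trick, and not something I’ve verified against iOS6 myself) is to have the server explicitly mark the response as non-cacheable, so Safari never stores it in the first place. In an ASP.NET web service that might look like the sketch below - `CustomerHelper` is the same helper used in the examples above:

```csharp
using System.Web;
using System.Web.Services;

public class ShopService : WebService
{
    [WebMethod]
    public string GetCustomerName(int id)
    {
        //Instruct the browser (and any intermediate proxies) not to cache this response
        HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        HttpContext.Current.Response.Cache.SetNoStore();

        return CustomerHelper.GetFullName(id);
    }
}
```

Treat it as a belt-and-braces measure alongside the timestamp parameter rather than a replacement.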

FireShot – A Really Good Webpage Screen Capture Tool

Working in the web industry and having the opportunity to develop a wide variety of websites, I like to take a snapshot of a few pages for my portfolio (working on that!). But I generally run into issues when taking a screenshot of a very long webpage. In fact, I always experience issues when screen-grabbing a scrolling page.

Luckily, I found a really useful add-on to Firefox called Fireshot. Fireshot makes it really easy to screenshot an entire page. Once you have made a screenshot, you can carry out the following tasks within the comfort of your browser:

  • Upload to Facebook, Picasa or Flickr
  • Save to disk as PDF/PNG/GIF/JPEG/BMP
  • Send to clipboard
  • Print
  • E-mail
  • Export

I was expecting this tool to generate screen grabs really slowly, but even on long pages with a lot of content, images are generated quickly. Take a look at the screenshot I made of "http://www.theverge.com" here.

Definitely try it out.

.NET Library To Retrieve Twitpic Images

I’ve been working on a .NET library to retrieve all images from a user’s Twitpic account. I thought it would be quite a useful .NET library to have, since some users (including me) have been requesting one on various websites and forums.

I will note that this is NOT a fully functioning Twitpic library covering all the API requests listed on Twitpic’s developer site. Currently, the library only contains core integration for returning information on a specified user (users/show) - enough to create a nice picture gallery.

My Twitpic .NET library will return the following information:

  • ID
  • Twitter ID
  • Location
  • Website
  • Biography
  • Avatar URL
  • Image Timestamp
  • Photo Count
  • Images

Code Example:

private void PopulateGallery()
{
    var hasMoreRecords = false; //Set below when further pages are available (not used further in this snippet)

    //Twitpic.Get(<username>, <page-number>)
    TwitpicUser tu = Twitpic.Get("sbhomra", 1);

    if (tu != null)
    {
        if (tu.PhotoCount > 20)
            hasMoreRecords = true;

        if (tu.Images != null && tu.Images.Count > 0)
        {
            //Bind Images to Repeater
            TwitPicImages.DataSource = tu.Images;
            TwitPicImages.DataBind();
        }
        else
        {
            TwitPicImages.Visible = false;
        }
    }
    else
    {
        TwitPicImages.Visible = false;
    }
}

Using the code above as a basis, I managed to create a simple photo gallery of my own: /Photos.aspx

If you experience any errors or issues, please leave a comment.

Download: iSurinder.TwitPic.zip (5.15 kb)

HTTP Request Script

In one of my website builds, I needed to load a couple of thousand records from a database permanently into the .NET cache. Even though I set the cache to never expire, it gets cleared whenever the application pool recycles (currently set to every 24 hours). As you’d expect, if a user happens to visit the site soon after the cache is cleared, they experience excessive page loading times.

The only way I could prevent this was to set up a Scheduled Task that runs a script carrying out a web request straight after the application pool recycles.

Luckily, I managed to find a PowerShell script on StackOverflow that will do exactly that:

# Put the URL to request inside Create("")
$request = [System.Net.WebRequest]::Create("")
$response = $request.GetResponse()
$response.Close()

Null Columns When Importing Excel Data into SQL Server

I don’t generally have a problem importing an Excel spread sheet into one of my SQL Server tables. But today would end my run of Excel importing perfection.

I experienced a problem where all cells that contained only numbers were ending up as NULL in my table after the import, which I thought was strange since the Excel spread sheet did not contain any empty cells. It contained a mixture of data formats: text and numbers.

I decided to format all cells in my spread sheet as text and attempt another import. No change.

After much experimentation, the solution was to copy all columns and paste them into Notepad in order to remove all formatting inherited from Excel. I then re-copied all my data from Notepad back into my spread sheet and carried out another import. Lo and behold it worked!

I don’t understand why I had this problem. It could be due to the fact that the spread sheet contained cells of different data formats, causing confusion during the import process.
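In hindsight, a likely explanation (an assumption on my part, not something I verified at the time): the Jet/ACE OLEDB provider behind the Excel import samples the first handful of rows of each column (the TypeGuessRows setting, 8 by default) to guess a data type, and any cell that doesn’t match the guessed type comes back as NULL. Adding IMEX=1 to the connection string’s extended properties forces mixed-type columns to be read as text - the file path below is a placeholder:

```
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\MySpreadsheet.xlsx;
Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1"
```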