Blog

Tagged by 'sql'

  • When using the “Deploy to Azure Database” option in Microsoft SQL Server Management Studio to move a database to Azure, you may sometimes come across the following error:

    Error SQL71562: Error validating element [dbo].[f]: Synonym: [dbo].[f] has an unresolved reference to object [server].[database_name].[table_name]. External references are not supported when creating a package from this platform.

    These types of errors are generated because you cannot set up a linked server in Azure, and queries using four-part [server].[database].[schema].[table] references are not supported. I’ve come across a SQL71562 error in the past, but this one was different. Generally, the error details are a lot more helpful and relate to stored procedures or views where a table path contains the database name:

    Error SQL71562: Procedure: [dbo].[store_procedure_name] has an unresolved reference to object [database_name].[dbo].[table_name]

    Easy enough to resolve. The error I was getting this time threw me, as it didn’t point me to any object in the database where the conflict resides and would require me to look through all possible database objects. That would be easy enough to do manually on a small database, but not on a large database consisting of over 50 stored procedures and 30 views. Thankfully, SQL to the rescue...

    To search across all stored procedures and views, you can use the LIKE operator against each object’s definition, based on the details gathered from the error message:

    -- Square brackets are wildcards in a LIKE pattern, so the opening
    -- bracket is escaped to match the literal [database_name] reference

    -- Stored Procedures
    SELECT OBJECT_NAME(object_id),
           OBJECT_DEFINITION(object_id)
    FROM sys.procedures
    WHERE OBJECT_DEFINITION(object_id) LIKE '%[[]database_name]%'

    -- Views
    SELECT OBJECT_NAME(object_id),
           OBJECT_DEFINITION(object_id)
    FROM sys.views
    WHERE OBJECT_DEFINITION(object_id) LIKE '%[[]database_name]%'
    
  • Earlier this week, a post I wrote for C# Corner was published. It covers an alternative to the very well-known SQL Server "IN" condition when working with many values. I discuss storing the list of values you would normally pass directly into your "IN" condition in a User Defined Data Type and querying against that instead.

    There will probably be a very small number of cases where the additional steps I describe in the post will need to be carried out. After all, SQL Server has a very large limit on the number of values the “IN” condition can handle, based on the length of the instruction (max 65k).

    Check it out here: http://www.c-sharpcorner.com/blogs/alternative-to-sql-in-condition-when-working-with-many-values
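
    As a rough illustration of the approach (the type, procedure, table and column names below are hypothetical), a user-defined table type lets you pass the whole list in as a single parameter:

    -- Define a table type to hold the values once
    CREATE TYPE IdList AS TABLE (ID int PRIMARY KEY)
    GO

    -- A procedure can then join against the list instead of expanding
    -- hundreds of values into an "IN" clause
    CREATE PROCEDURE spGetCustomersByID
    (
        @IDs IdList READONLY
    )
    AS
    BEGIN
        SELECT c.*
        FROM Customer c
        JOIN @IDs i ON i.ID = c.CustomerID
    END
    GO

    -- Usage:
    -- DECLARE @IDs IdList
    -- INSERT INTO @IDs (ID) VALUES (1), (2), (3)
    -- EXEC spGetCustomersByID @IDs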

  • Whilst trying to import a .bacpac database exported from an SQL Azure platform onto Microsoft SQL Server 2012, I received the following error:

    Internal Error. The internal target platform type SqlAzureDatabaseSchemaProvider does not support schema file version '2.5'. (File: C:\Users\Surinder\Downloads\SurinderAzureDB.bacpac) (Microsoft.Data.Tools.Schema.Sql)

    Now it seems many people online have been encountering this horrible error, and the majority of my fellow bloggers have suggested that installing the SQL Server Data Tools – December 2012 Update for VS 2010 or 2012 will resolve this issue. Unfortunately, installing this alone did not help me. I found out that I had missed one key part of my SQL Server installation: Service Pack 2!

    Can't believe this one small oversight caused me such a headache.

    You can download Microsoft SQL Server 2012 SP2 from the following link: https://www.microsoft.com/en-gb/download/details.aspx?id=43340.

  • Having developed quite a few websites in Azure, I've found some key tools that have made my life easier when accessing all areas of my Azure cloud instance. The great thing about the selection of tools listed below is that they give me access to all the features I need, wrapped in a nice interface.

    So let's get to it!

    Azure Storage Explorer

    Azure Storage Explorer is a useful tool for inspecting and altering the data in your Azure storage accounts, including the logs of your cloud-hosted applications. This includes:

    • Blobs
    • Queues
    • Tables

    Unlike some of the storage explorer software I've used in the past, Azure Storage Explorer allows you to preview a blob directly through its interface, whether it's an image, video or text file. So you don't have to waste time downloading a blob just to check if it's been generated correctly. An amazing time saver!

    Once you have your storage set up within your Azure account, you can use this application to manage everything: create, view, copy, rename and delete all three storage types (listed above).


    An application as full-featured as this shouldn't be free. But luckily for us, it is.

    Download: https://azurestorageexplorer.codeplex.com/

    Azure User Management Console

    Azure User Management Console manages the users and logins of an Azure SQL database. The tool simply converts your actions into T-SQL commands and executes them against an Azure database of your choice.


    Some beginner Azure developers use the same master credentials assigned to the database on creation within their web application too. Of course, this master user has full "db_owner" privileges against the database. Not a good idea! This application allows you to create a new user with restricted access levels really easily.
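
    To give an idea of what happens behind the scenes, here is a minimal sketch of the kind of T-SQL a tool like this would run to create a restricted, read-only user (the user name and password are placeholders):

    -- Create a contained database user with its own password
    CREATE USER [app_reader] WITH PASSWORD = 'Str0ng-Passw0rd!'

    -- Grant read-only access rather than full db_owner rights
    ALTER ROLE db_datareader ADD MEMBER [app_reader]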

    Download: https://aumc.codeplex.com/

    Redgate SQL Azure Backup

    One thing I found lacking in Azure SQL databases is the ease of creating a regular backup. There doesn't seem to be an automated way to do this directly through the Azure account.

    I've been toying around with Redgate's Azure backup service and it seems to do the job quite nicely. But it does come at a price: a daily backup of one database will cost around £7 per month.

    Full range of backup plans: http://cloudservices.red-gate.com/

    CloudXplorer

    Whenever I needed to take a quick look at any of my blob containers, Azure Storage Explorer would suffice in the majority of cases. However, the one thing I've started noticing with Azure Storage Explorer is that it lacks an efficient way to export a batch of files from a blob to local storage.

    CloudXplorer by ClumsyLeaf Software made browsing files within my blob containers a breeze. All files are organised and displayed in a folder structure, allowing me to download specific directories. The slick UI alone makes CloudXplorer a pleasure to use, especially if you have a blob with a large volume of files.

    I have downloaded around 200MB worth of files from one of my blobs to a local drive without any issue.

  • It's a shame that I have come quite late in the game when it comes to fully utilising the benefits of stored procedures. I am more than familiar with the general uses, such as CRUD operations and passing and returning parameters. Standard stuff!

    When I decided to work on a new project, I made a conscious decision to place all my database queries within stored procedures for the following reasons:

    • To broaden my current knowledge.
    • Remove all queries from web application level for maintainability.
    • Attempt to use logic in my stored procedures to do the brunt of the processing in one database call.
    • Increase speed of execution and performance for when a procedure is run multiple times.

    Out of all the points listed above, my main aim was to try and minimise my database calls and get the data in a format that can be used directly, without any further manipulation at application level.

    As you can see from the following stored procedure, I am carrying out checks to ensure specific data elements exist prior to carrying out further database operations, thus (hopefully) improving performance and efficiency.

    CREATE PROCEDURE spLogTagActivity
    (
        @ItemPrimaryHashtag AS nvarchar(100),
        @Hashtag AS nvarchar(100),
        @TagCategoryID AS int
    )
    AS
    BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;
        
        DECLARE @TagID AS int
        DECLARE @TagIsIgnored AS bit
        DECLARE @ItemID AS int
        DECLARE @ItemTagID AS int
            
        --Get TagID and Tag Ignore value based on the text value
        SELECT @TagID = ID, @TagIsIgnored = IsIgnored FROM Tag WHERE TextLowercase = LOWER(@Hashtag) COLLATE SQL_Latin1_General_CP1_CS_AS
        
        SET @ItemID = (SELECT ID FROM Item WHERE PrimaryHashtag = @ItemPrimaryHashtag)
    
        --If an Item can be found based on the Primary Hashtag, we will use the data.
        IF (@ItemID IS NOT NULL)
        BEGIN
            IF (@TagID IS NULL AND @TagIsIgnored IS NULL)    
            BEGIN
                --Only insert tags that are not the same as the Item Primary Hashtag
                IF (LOWER(@ItemPrimaryHashtag) <> LOWER(@Hashtag))
                BEGIN
                    EXEC spInsertTag @TagCategoryID, @Hashtag, 0, 0, NULL, NULL, NULL, 0, @TagID OUTPUT
                END
                                            
                --Associate new tag to the Item.
                IF (@TagID IS NOT NULL)
                    BEGIN
                        EXEC spInsertSingleItemTag @TagID, @ItemID, 1, @ItemTagID OUTPUT
                                                        
                        --Create a new Activity line for the Item.
                        EXEC spInsertTagTracking @ItemTagID
                    END
            END    
            ELSE
            BEGIN
                --If a Tag has already been logged, we will create a new Activity line for the Item
                --only if it does not have an ignored status
                IF (@TagID IS NOT NULL AND @TagIsIgnored <> 1)    
                BEGIN
                    --Associate the Tag to an Item, if it doesn't already exist.
                    SET @ItemTagID = (SELECT ID FROM ItemTag WHERE TagID = @TagID AND ItemID = @ItemID)
                    
                    IF (@ItemTagID IS NULL)
                    BEGIN
                        EXEC spInsertSingleItemTag @TagID, @ItemID, 1, @ItemTagID OUTPUT                                        
                    END        
                                        
                    EXEC spInsertTagTracking @ItemTagID    
                END
            END
        END
        ELSE
        BEGIN
            IF (@TagID IS NULL AND @TagIsIgnored IS NULL)    
            BEGIN    
                EXEC spInsertTag @TagCategoryID, @Hashtag, 0, 0, NULL, NULL, NULL, 0, @TagID OUTPUT    
            END
        END
    END
    GO
    

    I won't go into detail on what this sample stored procedure is doing, but I am delivering different outcomes based on the conditional statements. This is something that could have been done at application level, but it's much nicer in one database call. In other procedures I have utilised CASE expressions and WHILE loops.

    The benefits of using stored procedures far outweigh the negatives... Negatives!? Yes, negatives... Well, I just have one small gripe. If you create a stored procedure that contains quite a bit of meaty logic and find that something is not working (even though it's syntactically correct), it's quite difficult to pinpoint exactly where the procedure is failing.

    To end the post, here's something surprising I found: the execution time of a simple SELECT query can be slow when run within a stored procedure, even though the query itself is fast. This is something I encountered myself when wanting to retrieve a single value from one of my tables. The only way to get around this is by copying your query parameter into a local variable inside the stored procedure, as shown in the sketch below. Take a look at this answer on StackOverflow. Strange!
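
    Here is a minimal sketch of that local variable workaround (the procedure, table and column names are hypothetical):

    CREATE PROCEDURE spGetDownloadCount
    (
        @UserID AS int
    )
    AS
    BEGIN
        -- Copying the parameter into a local variable stops the optimiser
        -- from reusing a plan built around a previously sniffed value
        DECLARE @LocalUserID AS int = @UserID

        SELECT COUNT(DownloadID)
        FROM Download
        WHERE UserID = @LocalUserID
    END
    GO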

  • Grouping records by their respective month and year based on a date column can be done really easily by simply running the following SQL query:

    SELECT
           DATEPART(MONTH, DownloadDate) AS Month,
           DATEPART(YEAR, DownloadDate) AS Year,
           COUNT(DownloadID) AS NumberOfItems
    FROM
           Download
    GROUP BY
           DATEPART(YEAR, DownloadDate), DATEPART(MONTH, DownloadDate)
    ORDER BY
           Year DESC, NumberOfItems DESC
    

    As you can see from the query (above), I am using the "DownloadDate" column to group my records and to also get a count of how many records belong to that month and year.

  • Don't you just hate it when you've created a Document Type in Kentico that is full of data and for some reason you need to rename it? I know I do.

    Even though Kentico does give you the ability to rename your Document Types via its interface, the old table name still exists within the SQL database. This may not bother some developers, but it makes me feel downright dirty.

    I came across an SQL script Matt Lee wrote on his blog that takes the hassle out of renaming the tables manually within a Kentico database.

    Definitely check it out. It has saved me a lot of time.

    Source: SQL Script To Rename A Document Type Table

  • I’ve been using Message Boards for some page templates within my Kentico site. I needed to count the number of messages for each document to be displayed on the homepage. I couldn’t find a way to do this using Kentico’s API.

    The only way I could retrieve the message count for a document was to query the database directly. The SQL query is as follows:

    SELECT COUNT(*)
    FROM Board_Board
    JOIN Board_Message ON Board_Board.BoardID = Board_Message.MessageBoardID
    WHERE Board_Message.MessageApproved = 1
      AND Board_Board.BoardDocumentID = @DocumentID

    If anyone knows how to achieve the exact same thing through the API, please leave a comment.

  • UPDATE queries are great for modifying all the information stored within a column. But what if we need to update only a specific phrase or word within that column?

    Well, we can use an UPDATE query along with the REPLACE function. It can be used like this:

    UPDATE <table_name>
    SET <column_name> = REPLACE(<column_name>, 'old phrase', 'new phrase')
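
    For example, to correct a domain name in a hypothetical Page table's Content column:

    UPDATE Page
    SET Content = REPLACE(Content, 'http://oldsite.com', 'http://newsite.com')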
    
  • I always used NULL and DBNull interchangeably in my code when working with database results, without ever considering what the differences were. Fortunately, I stumbled upon a great blog post written by Bilal Haidar: "Difference between NULL and DBNull".

    If I understand correctly, you use DBNull to check if a certain field in the database is null, and you would use NULL to check if a whole record is not found in the database.