Category Archives: Database Administration Tasks

Documentation, the Necessary Evil

Everyone hates to write documentation, myself included.  No one likes to read your documentation either, if we’re being totally honest.  In my day, I have created some amazing documentation as well as some crappy documentation, and both sit on a SharePoint site or file share somewhere, dying a cold, lonely death, never to be updated again.  Granted, someone will eventually search for something, that one particular nugget that would help their understanding, only to see that you glossed over that area, and now Jimmy is besmirching your good name at the water cooler because of your poor documentation skills.  He said something about your mother as well, but I am not going to repeat that.  Not cool, Jimmy, not cool.

As a consultant, I understand wholeheartedly the importance of documentation as it correlates to the success of the project at hand.  However, that doesn’t make it any easier or more exciting to write, and it certainly doesn’t make it any easier to read.  Have insomnia?  I have just the solution for you!  Step right up and read this seventy-page project discovery document outlining goals, tasks, and source-to-target mappings.  Can you imagine being a technical writer producing these documents every day?  I dated such a person for a short time last year, and she was just as boring as the documents she was paid to create, but that’s a topic for another day.

The flip side is that a project without initial documentation from a discovery process can lead to impending doom, heartache, mild stomach discomfort, depression, and diarrhea.  Well, I am not sure about the diarrhea, but it will be bad, really bad.  Feelings will be hurt, your good name will be run through the mud, and your parents’ having a child out of wedlock will be questioned (that would be calling you a bastard, for you millennials out there, as there was a time when that was an insult).  Herein lies the rub: you are going to have to advocate for documentation at the beginning of the process, even though you despise doing it, simply because your Mom’s honor is at stake.  Do it for your Mom; she still loves you even if Jimmy doesn’t.  Not cool, Jimmy.  Enjoy!!

First Pre-Con is in the Books! What a Blast!

On Thursday, January 15, 2015, my good friend Mike Lawell (b|t) and I embarked upon our first pre-conference training for SQL Saturday Nashville.  The event was a smashing success with a fabulous turnout.  In addition, the crowd was engaged and responsive; as a teacher, it does not get any better than that.  For those that attended our session, here are the files that we promised you.  We’ve included all of the links, the code contains a wealth of information, and you can also see the rest of the slides that we were unable to get to.  Please use them and feel free to ask us any questions that you may have.  In addition, we are more than happy to help you at the beginning of your DBA journey.  Welcome to the #SQLFamily!

For those of you who did not attend, the session was entitled “Getting the New DBA Up to Speed with Performance Tuning.”  It was aimed at new and ‘accidental’ DBAs as they embark upon the journey of learning the art of performance tuning.  If you are a SQL Saturday organizer and would like us to offer this session at your event, please feel free to contact us.  Enjoy!

Grody to the VARCHAR(MAX)

This morning I was looking through a new vendor database to see exactly what I would be supporting, and I stumbled upon a curious table.  This table seemed quite normal to the naked eye until I glanced at the data types.  There were four, count them, four fields set to VARCHAR(MAX) and two set to NVARCHAR(MAX).  First of all, why the inconsistency in the data types?  Maybe there is a legitimate reason, but I am not buying it.  If you need the Unicode support provided by the NVARCHAR data type, wouldn’t you need it across all fields?  I have only worked with a few databases that needed Unicode support, and that was the case for them.  Maybe there is a case for mixed usage, but I do not understand it, and obviously I was not part of their design team.
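If you want to see how widespread this is in a database you have just inherited, a quick look at the catalog views will list every column declared as VARCHAR(MAX) or NVARCHAR(MAX).  This is just a minimal sketch run against the current database; nothing in it is specific to the vendor table above.

-- List every (N)VARCHAR(MAX) column in the current database.
-- In sys.columns, max_length = -1 means the column was declared as MAX.
SELECT s.name  AS schema_name,
       t.name  AS table_name,
       c.name  AS column_name,
       ty.name AS data_type
FROM sys.columns AS c
JOIN sys.tables  AS t  ON t.object_id = c.object_id
JOIN sys.schemas AS s  ON s.schema_id = t.schema_id
JOIN sys.types   AS ty ON ty.user_type_id = c.user_type_id
WHERE ty.name IN ('varchar', 'nvarchar')
  AND c.max_length = -1
ORDER BY s.name, t.name, c.name;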

Now on to the bigger elephant in the room: why on earth would you set all of these fields to the maximum storage amount of two gigabytes (NVARCHAR(MAX) holds roughly one billion characters in that same two gigabytes because each character takes two bytes)?  Are you really storing that much data in six fields of EACH record?  Keep in mind that SQL Server stores records in 8K pages.  When you exceed that, the data goes to an overflow page with a pointer so that SQL Server knows where to find the rest of the data.  I understand that it will not take up that much space if you are not actually storing that much data, but there is a potential here for these tables to be problematic.  Granted, this might be a good design for one field if it stores a large amount of text or notes, but six fields in the same record?  I looked at the length of the data currently entered in these six fields and found nothing larger than 100 characters.  Overkill?  What do you think?
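For the record, checking how much of that declared capacity is actually used is a one-liner per table.  The table and column names below are hypothetical stand-ins for the vendor table in question:

-- Hypothetical table and column names; substitute the vendor's actual objects.
-- LEN counts characters; DATALENGTH counts bytes (two per character for NVARCHAR).
SELECT MAX(LEN(Notes))        AS longest_notes_chars,
       MAX(LEN(Description))  AS longest_description_chars,
       MAX(DATALENGTH(Notes)) AS largest_notes_bytes
FROM dbo.VendorTable;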

Enjoy!

SQL Server Migration Assistant Problem Solved

Having used SQL Server Migration Assistant for a major Oracle conversion project over the last year, I quickly grew aggravated with its limitation of a single target database per project.  In other words, if you set up the project file in SSMA for staging, you are then unable to point it at an acceptance environment and migrate to that environment.  That can be very frustrating, especially if you have customized many of the project settings.  I have discovered two workarounds for this problem.

  1. Back up (or copy) the project folder (c:\Users\<<Username>>\Documents\SSMAProjects\<<Project Name>>).  Close SSMA and then delete the target-metabase.mb file in the project folder.  Now open SSMA and you can connect to a new SQL Server and/or database.  You would want to copy the folder if you want to retain the settings for a different environment such as staging, acceptance, production, etc.  If you rename the folder for a different environment, make sure to rename the *.o2ssproj file to match the folder name.  This is the file that opens inside of SSMA.
  2. The second option is to create a new project using SSMA and then copy in object-containers.mappings, project-container.mappings, and finally preferences.prefs.  These are the project settings that you have painstakingly set up in your original project.

I hope this helps.  Enjoy!

Are Those Transaction Logs Empty?

Ever wonder what your usage statistics are for your transaction log files without opening each one individually in the SQL Server Management Studio shrink file dialog?  DBCC SQLPERF is a neat little utility that can tell you the size of the log file and the amount of space occupied by transactions inside the file.

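Here is the usage I am referring to.  The LOGSPACE option returns one row per database with the log file size in megabytes and the percentage of the log currently in use; the temp-table capture below is just my own sketch for filtering the output across many databases (the table name is mine):

-- One row per database: log size (MB) and percentage of the log in use.
DBCC SQLPERF(LOGSPACE);

-- Optional: capture the output so it can be filtered or sorted.
CREATE TABLE #LogSpace
(
    DatabaseName    sysname,
    LogSizeMB       float,
    LogSpaceUsedPct float,
    StatusFlag      int
);

INSERT INTO #LogSpace
EXEC ('DBCC SQLPERF(LOGSPACE);');

-- Only show logs that are more than half full.
SELECT DatabaseName, LogSizeMB, LogSpaceUsedPct
FROM #LogSpace
WHERE LogSpaceUsedPct > 50
ORDER BY LogSpaceUsedPct DESC;

DROP TABLE #LogSpace;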

You can also use this utility to clear the sys.dm_os_wait_stats and sys.dm_os_latch_stats statistics, like so:

DBCC SQLPERF("sys.dm_os_wait_stats", CLEAR);

SQL Server Domination Plan: Two Weeks After

A few weeks ago, I let loose a rebel yell and issued a command directive: Death to Oracle!  Then I delivered the death knell as we migrated all of the data to SQL Server over Memorial Day weekend.  Today, I am proud to announce that after two weeks of the Finance SQL Servers being in production, the users are happy and productive.

Most would say that is how it should be, but the Oracle DBAs told me that the users would complain daily.  They were happy to rid themselves of those users and now I know why.

Hopefully I did not jinx myself but it looks like the domination plan is working!

Initiatives

We had a meeting this morning where our Director talked about initiatives and people bringing forth new ideas to help streamline our department.  Therefore, I will soon be bringing forth a new initiative to eliminate Oracle altogether and replace it with SQL Server.  It will ultimately be denied because of the cost involved in the conversion, but they will have to discuss it and think about it.  Take that, Oracle; the SQL Server Global Domination plan is gaining traction.

Enjoy!

SQL Server Domination Plan Phase One Has Completed

Last week I let loose a rebel yell and issued a command directive: Death to Oracle! Today I am proud to announce that I sounded the death knell for one of our Oracle servers by migrating our finance applications from Oracle to SQL Server. This was a year-long project that culminated in one really, really long weekend of SQL Fun. I survived and Oracle bit the dust. My plan for SQL Server World Domination has completed phase one.

My part of the project was a 27-hour marathon: migrating the data with SQL Server Migration Assistant, then running numerous scripts and comparisons against day-zero databases from the vendor while upgrading the application across five different product upgrades using RedGate SQL Compare (which worked beautifully). It was a day full of Starbucks triple espressos and five-hour energy shots before I crashed and the testers took over. We then rounded out the weekend with a 33-hour ETL process, all of which completed successfully before the application went live at 7am this morning. Mission accomplished.

This project has great visibility in my organization, and its success will open the floodgates to future migrations by showing our world that SQL Server is an enterprise-level database. I am not stopping here; I want their 100 terabytes of production data! It will be mine, oh yes, it will be mine.

SQL Server Domination Plan Has Begun

Today signals the end and the beginning of a year-long project: my project for SQL Server Domination.  Our shop is primarily an Oracle shop with .NET developers.  Although I do have almost 300 databases across all levels, the bulk of our mission-critical applications are written for Oracle.

A little over a year ago we began a plan to migrate our financial applications to SQL Server in order to retire the oldest Oracle servers on site and reduce our massive licensing footprint.  This was all part of my plan for SQL Server World Domination (said in a maniacal voice, imagine that).  We knew that if this mission-critical migration was a success, we could leverage our product against the Oracle servers in-house and begin the destruction of their infrastructure in favor of my glorious SQL Servers!  (Hear the trumpets heralding the new king?)

This project, if successful, will be the Oracle death knell.  Over the last year I have prepared diligently with one goal in sight: this migration weekend.  The weekend of my birthday seems apropos for giving me the present that I so desire, SQL Server Domination.  You may have noticed that the number of blog posts recently has been limited, especially after blogging every day last year.  For the last few months, I have had one purpose and one purpose only: ensuring the success of this migration!

Let the trumpets herald the coming of a new dawn.  Long live SQL Server.  I will blog again next week on the other side of this project to declare that the Queen is dead and a new King has ascended the throne!  Until then, cue the Imperial March; hello Starbucks and a massive four-day working weekend with little sleep.  Migration project, here I come!

Auditing Revisited, Makes No Sense To Me

Back in January, I did a series on simple auditing which walked through setting up an audit and creating a simple SSRS report.  I actually used the report and the query that I demonstrated on a couple of servers in my stack that require auditing.  It all revolves around the following line of code:

SELECT * FROM sys.fn_get_audit_file('g:\auditing\*', DEFAULT, DEFAULT)

This line of code runs great for my login, and I never gave it much thought because I am running it from a sysadmin account.  When I was asked to let other users run this report, that is when things got sticky.  I cannot fathom why a simple query for log data cannot be granted to a read-only account.  In order to run this select statement you need either a sysadmin-level account or the CONTROL SERVER permission, which pretty much gives you the keys to the kingdom.  Why would they do that?
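To make that concrete, this is what you end up granting just so a reporting login can read the audit files.  The login name here is hypothetical; the permission is the documented requirement, and it hands over far more than read access:

-- Hypothetical reporting login that should only ever read audit data.
-- CONTROL SERVER is the documented requirement for sys.fn_get_audit_file,
-- but it effectively gives this login control of the whole instance.
GRANT CONTROL SERVER TO [AuditReportReader];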

I am sure there is a great technical reason, but why not allow read-only access simply to view the data?  With those permissions, whoever is in charge of running the audit can manipulate the data however they see fit.  That makes no sense to me, but then again, who am I in the grand scheme of things?  Shortly I will post a few different scenarios for how to solve this problem.  Enjoy!
