Monthly Archives: June 2012
Today’s installment of the Saturday SQL Schoolhouse is brought to you by the fine folks over at Microsoft. When searching for a topic to “teach” today in the schoolhouse, I came across a site called Microsoft Case Studies and could not stop reading. Obviously, it makes perfect sense for Microsoft to create such a page to help businesses see the connection between their products and solving the problems that they have.
Being a SQL geek, I was enthralled, as if reading a good spy novel: was the migration successful? Did the BI solution solve their reporting quandary? Did the cloud really save them millions in unnecessary overhead? Forget the Kindle; pick up a good case study and it may help you solve a problem you are having in your office. Enjoy!
If you follow me or the Midnight DBAs on Twitter, then you may have seen a challenge lately to get healthy, not only in time for the PASS Summit but also for the rest of your life. This contest appeals to me not only because I enjoy getting healthier but because I enjoy talking smack, and I would challenge my own mother to a contest if there were a prize. And there is!
Don’t just take it from me: read about it on Jen’s blog and take a look for yourself at the #GetHawt GameOn Challenge. Come on, you wanna do it too and talk smack with me. Game on!
The Oracle Conversion, day 14, continues, and it is not as exciting as I envisioned. Oh, don’t get me wrong: the end result, Global Domination for SQL Server, will be the end that justifies the means. But until then it is migrate data, check data, clean data, remigrate data, check data, clean data, time the migration, repair the migration process, check data, clean data, remigrate, ad nauseam. Rinse, lather, repeat.
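The "check data" step in that loop does not have to be elaborate. A minimal sketch, assuming a simple row-count plus aggregate-checksum comparison per table (the table name below is made up for illustration, not one from the actual project):

```sql
-- Hedged sketch: a quick per-table sanity check after each migration pass.
-- dbo.GL_JOURNAL is a placeholder table name.
SELECT COUNT_BIG(*)                      AS row_count,
       CHECKSUM_AGG(BINARY_CHECKSUM(*)) AS data_checksum
FROM dbo.GL_JOURNAL;
-- Run the equivalent COUNT(*) on the Oracle side and compare; matching row
-- counts plus a checksum that stays stable across passes is a cheap
-- first-line validation before the detailed column-by-column checks.
```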
Oh, this is not my first rodeo; I have converted many projects before, but never Oracle. Granted, the SQL Server Migration Assistant for Oracle has made life much easier, but at the same time it has made it much more difficult.
It is a quirky, temperamental tool. We had to restart it before each successive migration or it would fail; our guess is that it caches information from the previous migration run, and the MSDN blog wasn’t much help. We also had to hit Apply after every setting change or it would revert some of them to their original values.
In addition, we sometimes had to convert a schema four or five times before it would synchronize properly to SQL Server; until then it simply reported that the SQL execution failed. That message leaves a huge range of troubleshooting possibilities. When it worked, it was a great time saver, but had we known about the quirkiness we probably would have scripted it all out and saved a lot of headaches.
Doesn’t sound like a big deal?
Well, when you have some big tables, converting the schema takes eight hours; when synchronizing then fails, you have to convert the schema all over again and watch the progress bar for many more hours. Oh yeah, I left out the fact that the vendor is contractually responsible for the migration, so I guess ultimately they are the ones who should have scripted it out. Enjoy!
Have you registered for PASS Summit? This week is the last week to register at the second stage early bird rate.
Do I have a financial interest in the Summit? No.
Do I want to see you there? Heck yeah!
Am I sure that I am going? Not yet; I am still waiting on approval from my employer. It is a weird tightrope I am walking at this stage: do I pay my own way and save the $700, or do I wait in hopes that they will fund it?
Have I been before? No, not yet.
Have I wanted to? Heck yeah!
What will I do if my employer says no? Stand on the street corner with a sign that says I will administer your databases for a trip to the Summit. Am I serious? You bet. Ok, maybe this winter when it is not 100 degrees outside. Seriously, if they say no, I will pay the $999 early bird rate out of my own pocket and drive to Charlotte. Let’s hope I can meet you there, but only if you register for PASS Summit 2012. Enjoy!
While looking up something in the racecar book, I thought the topic would make a great blog post. What books stay on your desk? I have a bookshelf full of wonderful books written by people smarter than me. However, there are four books that never return to the bookshelf, mainly because they are my “go to” books of choice. They rarely fail me in my time of need. The first is my training manual for the SQLskills Immersion Event for Internals and Performance (aka IE1). If you have taken this class, then you understand why this never leaves my desk. The second manual is affectionately called the racecar book. It is actually called Professional SQL Server 2008 Internals and Troubleshooting, and it is the first book that I open when I have a question. The third book is a new one called Troubleshooting SQL Server: A Guide for the Accidental DBA. I am currently reading this book, but I think it will stay on my desk for some time to come. The final book is Pro SQL Server 2008 Failover Clustering, because I have yet to see another book on clustering that covers everything under the sun. Now don’t get me wrong, eBooks are great, but they still do not beat a real book; maybe I am just old school. What books remain on your desk?
Last week we talked about my Oracle migration to SQL Server and the vendor’s use of datetime over datetime2. Today, I thought I would discuss the differences between the two datatypes.
First of all, Microsoft recommends datetime2 over datetime because it provides a larger date range, a larger default fractional precision, and an optional user-defined precision. Datetime2 is also ANSI and ISO 8601 compliant, whereas datetime is not. Which would you use?
Datetime2: 2012-06-25 12:45:10.1234567
Datetime: 2012-06-25 12:45:10.123
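The difference goes beyond the number of digits you see above: datetime stores time in 1/300-second ticks, so fractional seconds are rounded to increments of .000, .003, or .007, while datetime2(7) keeps 100-nanosecond precision. A small illustration:

```sql
DECLARE @dt  datetime     = '2012-06-25 12:45:10.128';
DECLARE @dt2 datetime2(7) = '2012-06-25 12:45:10.1234567';

SELECT @dt  AS dt_value,   -- 2012-06-25 12:45:10.127 (rounded to a 1/300-second tick)
       @dt2 AS dt2_value;  -- 2012-06-25 12:45:10.1234567 (stored exactly)
```

Depending on the precision you pick, datetime2 also needs 6 to 8 bytes of storage versus datetime’s fixed 8, so you can get more precision for the same or less space.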
Sunday Funday has returned! We took a hiatus from this series of posts because, quite frankly, it is hard to find people having fun with SQL Server these days. Let’s have some fun!
Today I bring you a maze game written in SQL. Enjoy Lionel Clarke: A Sql Game.
Today’s installment of the Saturday SQL Schoolhouse is brought to you by the fine folks over at Microsoft TechEd, which was held last week in Orlando. Didn’t get to go to TechEd this year? Me either. But that’s fine, because we have the next best thing: all of the sessions online! Top Rated Sessions at TechEd North America 2012.
For the last two weeks, I have been working with a vendor to migrate our financials from Oracle to SQL Server using SQL Server Migration Assistant. This has been a time-consuming process that requires extreme attention to detail, as precious data hangs in the balance.
We have found numerous pieces of ‘bad data’ where either Oracle or the application allowed dates such as 03/02/0007. These dates migrate to SQL Server just fine when using datetime2 with precision 0; however, the vendor originally requested datetime, which only ranges from 1/1/1753 to 12/31/9999, thus causing migration errors.
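You can reproduce the failure with a single cast. datetime2 accepts dates back to the year 0001, while datetime starts at 1753-01-01, so a year-0007 value only converts to the former:

```sql
SELECT CAST('0007-03-02' AS datetime2(0));  -- succeeds: datetime2 range begins at 0001-01-01
SELECT CAST('0007-03-02' AS datetime);      -- fails: Msg 242, out-of-range datetime value
```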
Here is the dilemma: do you modify data for financials that have already closed in order to have clean data, or leave the data as is? The DBA in me says to clean the data and make everything pristine before moving it into my SQL Servers; however, there are implications for financials, and the department made the decision to import as is to ensure the data was identical and unmodified. That much I can certify. Well, actually, the vendor will need to certify it, as they chose the column type conversions that work with their application’s SQL Server version.
This conversion is a critical piece in my plan for global domination, which is to say my plan to end our dependence on Oracle and move solely to SQL Server, especially since we are a .NET development shop. When this project is successful and users see better performance, we will be able to migrate other databases as well. Enjoy.
Found a weirdness this morning that I thought I would share with you. I recently set up two new SQL Server 2008 clusters and configured Database Mail yesterday so that I can receive a nightly backup report, which is much easier than 50-100 individual emails per server. Database Mail worked for the send test email but was not sending for jobs.
I talked to our Exchange guy to make sure there was nothing on that end preventing the flow. I reviewed the settings and reviewed the SQL Agent job. Nothing jumped out at me. The job history showed SQLSTATE 42000, Error 14607.
Upon bingling the error code, I found that it means an incorrect profile name. I named the profile, so now SQL Server is judging my naming convention?
Wait, I scripted out the job from another server, but I changed the names in the script. I rechecked the profile on the source server, and it was named “<server name> Mail”. However, the profile on the new server was named simply after the server. Once I changed the name by deleting the profile and creating a new one, the job ran successfully. Attention to detail! I hope this helps someone. Enjoy!
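If you hit error 14607 yourself, the quickest check is to compare the profile name your job passes to sp_send_dbmail against what msdb actually has. A minimal sketch, where the profile name and recipient address are placeholders, not values from my environment:

```sql
-- List the Database Mail profiles that actually exist on this server.
SELECT name FROM msdb.dbo.sysmail_profile;

-- The @profile_name passed here must match one of those rows exactly.
-- 'MYSERVER Mail' and the recipient address below are placeholders.
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'MYSERVER Mail',
     @recipients   = 'dba@example.com',
     @subject      = 'Nightly backup report';
```

If the SELECT does not return the exact string the job uses, you get error 14607 every time, which is precisely the mismatch the scripted-out job introduced here.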