I’ve been tasked with the SQL portion of our data migration and conversion testing for the big release we’ve got coming up. What does that mean? Well, we’ve got an upgrade coming to the application we support. We’ve got 15 databases on 7 servers. All of them have database schema changes coming: data moving from one table to another, new tables getting created, old ones getting dropped, new columns populated from other databases, etc. I’ve got to make sure all the data that moves makes it there and is correct. Here’s how I attacked it:
The High Level Overview:
- Script production databases from our log shipping servers
- Create two copies of each database on a test server(s)
- Upgrade one database of each set
- Compare schema on Pre and Post upgraded databases
- List out all the differences
- Categorize how each new column or table is populated
- Ignore the new tables that are populated on use or populated with defaults – no data migration
- Create Checksum and row count scripts for all the migrated data
- Run the scripts and record the results on our Prod-Stage environment before the upgrade
- Upgrade Prod-Stage databases
- Run the scripts and compare the results to the Pre-upgrade.
- Address any issues
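To make the “checksum and row count” step concrete, here is a minimal sketch of the kind of script I mean. The table name is hypothetical; the idea is to capture the same two numbers before and after the upgrade and compare them:

```sql
-- Capture a row count and an order-independent checksum for a migrated table.
-- Run this before the upgrade, save the results, then run it again after
-- the upgrade and compare. (dbo.Customer is a made-up example table.)
SELECT
    'dbo.Customer'            AS TableName,
    COUNT(*)                  AS [RowCount],
    CHECKSUM_AGG(CHECKSUM(*)) AS TableChecksum
FROM dbo.Customer;
```

CHECKSUM can collide, which is why it’s paired with the row count; HASHBYTES per row is an option when you need higher confidence at a higher cost.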
This approach seemed to work well, so my plan is to write a post on a bunch of these individual steps, so stay tuned.
- A network outage caused a BCP job to fail ungracefully. There are a lot of opportunities to improve this process, but to get the system functioning we ended up restoring a backup and comparing data to find the differences.
- Data Migration Testing – more posts to come on this…
- Set up two databases and wrote checksum and row count Scripts to compare values before and after our upgrade.
- Found an issue that was overpopulating a table – it only occurs in production due to incremental builds in all other environments.
- The explanation and the coordination of the fix took lots of time – the technical part was easy; dealing with people takes time!
- Found an issue with a procedure that calls another procedure that checks if a service broker queue exists. To check for service broker’s existence, the user needs DBO rights, so the fix we came up with was to use execute as owner – note this is a special case procedure.
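A hedged sketch of what that special-case procedure might look like (the names are made up; the point is the EXECUTE AS OWNER clause wrapping the sys.service_queues check):

```sql
-- Hypothetical sketch: the caller lacks rights to see service broker
-- metadata, so the procedure runs in the database owner's context.
CREATE PROCEDURE dbo.CheckQueueExists
    @QueueName sysname,
    @Exists    bit OUTPUT
WITH EXECUTE AS OWNER   -- special case only: grants owner-level visibility
AS
BEGIN
    SET NOCOUNT ON;
    IF EXISTS (SELECT 1 FROM sys.service_queues WHERE name = @QueueName)
        SET @Exists = 1;
    ELSE
        SET @Exists = 0;
END;
```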
As mentioned in a previous post, I’m going to be coaching a Dale Carnegie Effective Communications class. The other night I attended a training session for current and future coaches. The topic that we covered and went in-depth on was how to give feedback. An effective way that we learned is based on a three part formula. Being a computer and math guy, I thought it’s pretty easy to fit things into a formula, so I made a game of it.
The formula for a response/compliment is as follows:
- Tell the person a strength they possess
- Show evidence of that strength
- Tell them how that strength will help them in the future
In the last two weeks, I’ve attempted to give at least one compliment a day to a co-worker. It hasn’t worked every day, but it has helped me and the team I’m working with quite a bit. People seem to be more confident around me and proud of what they’ve done. I’ve made a stronger connection with a few of the people as well.
Today, for example, I was troubleshooting an issue with a less experienced DBA. He was driving the workstation we were working on and jumping around like he knew the next move. I simply stopped and said, “Wow, you’ve really picked up on the environment around here. You’re two moves ahead of where I’d be and jumping to shortcuts I didn’t know existed. This will really help you in the future to be more efficient.” This was a simple example, but he was all smiles the rest of the day and even showed me a couple of quicker ways to do things.
As computer geeks, we sometimes sacrifice our social skills. Try giving out a few compliments and you’ll be amazed at the response and help you’ll get from others. You may even pick up a few new short cuts like I did.
- On Call
- False alerts were about the only real challenge – long running jobs, high wait times, data file space (all known issues)
- A couple of backup failures that needed to be re-run
- Inspired to teach by a new developer:
- She got “volunteered” to work on some stored procedures with very little experience in SQL development.
- I showed her a couple of tricks and gave her a couple of tips, and a couple of days – then asked for status.
- She responded with “I NEED HELP!” – which made me think I’d end up working on them this weekend.
- She had them re-written, she just didn’t understand how to check the code in! Done and done!
- Documented and created some test cases for data migration. This is the biggest piece of my big project. So I began writing down all the tests and plans I have to verify data is in the correct spot after we deploy our new schema. Further details to come on this as I’ll be spending a lot of time this week on it.
- Naming standards – put my foot down to rename a trigger for consistency’s sake as its function was changing. Interesting battle.
- Indexes or stats not getting rebuilt or updated on our Development servers. Shocking…they are having performance issues!
- Deadlocks occurring in our Development environment – I used this script to identify what’s causing the issue – quick and easy (one of the main reasons for this blog)!
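I won’t reproduce the exact script here, but one common way to pull recent deadlock graphs is from the built-in system_health Extended Events session (SQL Server 2008 and up):

```sql
-- Pull xml_deadlock_report events from the system_health ring buffer.
SELECT
    xed.value('@timestamp', 'datetime') AS DeadlockTime,
    xed.query('.')                      AS DeadlockGraph
FROM (
    SELECT CAST(st.target_data AS xml) AS target_data
    FROM sys.dm_xe_session_targets st
    JOIN sys.dm_xe_sessions s
      ON s.address = st.event_session_address
    WHERE s.name = 'system_health'
      AND st.target_name = 'ring_buffer'
) AS rb
CROSS APPLY rb.target_data.nodes(
    '//RingBufferTarget/event[@name="xml_deadlock_report"]') AS x(xed);
```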
- Data Migration testing continues – still documenting the mapping of the moving data
- DB Compares – comparing test databases to Dev and QA schema
- Incremental builds caused a couple of missing defaults and one missing foreign key
- Used an article from Jason Strate (blog|twitter) to identify and name the missing foreign key
- Indexed views in Dev database quit working due to schema binding and with index clause – recreated the index and things worked
- Rewriting a procedure to identify data to archive – need to keep 30 days worth of data or two incidents for some customers and 30 days or three incidents for others
- I got my laptop back – with no data loss – it took until Wednesday afternoon, but no data loss!
- Installed SQL 2008 Developer Edition
- Set up a batch file for starting and stopping SQL service
- Wrote a restore script which includes running the latest script from TFS
- Investigated the ability to change the Auto Save location in SSMS – couldn’t find anything
- Wrote a TSQL Tuesday post
- Attended Dale Carnegie Training for Coaches – there will be more posts to come on that.
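On the indexed views item above: the two requirements that bite people are that the view must be created WITH SCHEMABINDING and that its first index must be unique and clustered. A made-up sketch of the recreate:

```sql
-- Hypothetical indexed view; assumes dbo.Orders exists and OrderAmount
-- is NOT NULL (indexed views can't SUM a nullable expression).
CREATE VIEW dbo.vOrderTotals
WITH SCHEMABINDING
AS
SELECT CustomerID,
       COUNT_BIG(*)     AS OrderCnt,   -- COUNT_BIG is required with GROUP BY
       SUM(OrderAmount) AS TotalAmount
FROM dbo.Orders
GROUP BY CustomerID;
GO
-- The view is only materialized once this index exists:
CREATE UNIQUE CLUSTERED INDEX IX_vOrderTotals
    ON dbo.vOrderTotals (CustomerID);
```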
This month’s T-SQL Tuesday, “T-SQL Tuesday #008: Gettin’ Schooled” is hosted by Robert Davis (Blog/Twitter) aka the SQL Soldier. The topic is: How do you learn? How do you teach? What are you learning or teaching?
I usually try to teach others the same way I learn – a bit of theory and lots of hands on examples. Mainly though I always try to “Keep It Simple” or “Keep It Simple, Stupid” – aka K.I.S.S. I’ve always loved the idea that a genius can take a complicated situation and make it simple. How many times do we take a simple situation and make it complicated? That’s kind of the opposite if you ask me.
When I’m teaching someone I always remember the words of a football coach I had, Dennis Raarup. “Keep It Simple – Stupid!” When he’d ask what coverage the defense was in, a hot shot young kid would say something complicated like “Smash – Man Free with a Monster”. Coach Raarup would yell back, “No – Are they chasing the receivers or checking off? If they are chasing them, it’s man coverage and we need to keep moving. If they are checking off, it’s a zone and our receivers need to settle in the holes.” That was his views on the passing game – it was that simple. He could take a complicated situation and break it down to running or settling – the rest was easy.
Back to SQL, when I’m teaching someone something, I always try to keep it as simple as I can. I always take the situation to a test server, create a database named Test, create a table named Table1 with column1, column2, etc. I then create the other objects to mimic the situation’s trouble spot. It’s amazing what you can recreate quickly and easily on your local machine, and how obvious the problem becomes when it’s Table1 outer joined to Table2 on column1 from both tables instead of the Incident table outer joined to Contact on SecondaryContactLastName and ContactLastNameID. Not only can you simplify complicated situations, but you can also identify and investigate the actual problem instead of getting wrapped up in the details of the real situation. Instead of “it doesn’t work”, you can explain that the process is missing a step or the code needs to include a where clause, etc.
Another benefit to working on a test server in a separate database is, as I like to say, “Russian roulette is not the same without a gun”. That means it’s a much safer environment, and if you mess up – which is a great way to learn – you won’t stop 30 application developers from meeting their deadline. I always try to build up the scripts and save them so if something goes awry, you can drop your test database and easily recreate the situation. Here is an example of the start of one:
-- Test database create scripts --
-- Note this DROPS the database and recreates it --
-- Created by Andy Lohn of www.sqlfeatherandquill.com --
IF EXISTS (SELECT name FROM sys.databases WHERE name = N'Test')
    DROP DATABASE [Test];
GO
CREATE DATABASE [Test];
GO
USE [Test];
GO
CREATE TABLE Table1 (column1 int IDENTITY(1,1), column2 datetime);
INSERT Table1 (column2) SELECT GETDATE();
I recently used this technique with a soon-to-be junior DBA here. He was running some tests and trying to get test data ready. I showed him how to randomly create data and populate a column (this situation inspired this random data post). I didn’t give him too many specifics, but showed him the idea. About seven emails later that day – he was quite excited – he had figured out he could set up all his test data in multiple tables this way. What normally took him a day of manual data entry took less than a day to script the first time, and will take him less than an hour for any test coming up. Situations like this are why I started this blog – to help people do things I know are possible and I know will help. This is also why I need to teach and have an impact on people – like Coach Raarup had on me.
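The random-data trick itself is simple. One version I’m fond of (the table and column names here are invented) relies on NEWID() being evaluated per row:

```sql
-- CHECKSUM(NEWID()) yields a different pseudo-random int on every row.
UPDATE dbo.TestTable
SET    Amount = ABS(CHECKSUM(NEWID())) % 1000;   -- random value 0-999

-- The same idea generates whole rows of test data:
INSERT dbo.TestTable (Amount, CreatedDate)
SELECT ABS(CHECKSUM(NEWID())) % 1000,
       DATEADD(DAY, -(ABS(CHECKSUM(NEWID())) % 30), GETDATE())
FROM   sys.all_objects;   -- any convenient table to drive the row count
```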
When someone states that adding a column to an existing table is easy and you need a list of risks, feel free to copy and paste this, add your own, etc. In fact, contact me via email, comments, or twitter – @SQLQuill – with any additions.
Adding a column to an existing table – in the theory and best-practices world – is easy, but in reality it is not always as simple as ALTER TABLE ADD COLUMN. The risk is that any procedure, view or application code that uses a “Select *” or an “Insert <TableName> Values <…>” (without column names explicitly stated) may not function, or may not function correctly, with an additional column added to the table. Also note there could be temp tables, table variables and table value parameters built off the table that an additional column affects as well. Best practices and my database development guidelines say not to use “Select *” or “Insert <TableName> Values <…>”.
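A quick demonstration of the risk – the table and column names are throwaways, but the failure is exactly the one described:

```sql
CREATE TABLE dbo.Demo (col1 int, col2 int);
INSERT dbo.Demo VALUES (1, 2);       -- no column list: works today

ALTER TABLE dbo.Demo ADD col3 int;   -- the "easy" change

INSERT dbo.Demo VALUES (1, 2);       -- now fails with Msg 213:
-- "Column name or number of supplied values does not match table definition."
```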
I’m bringing the Weekly Challenges post back. It’s basically a status report of what I’m doing, but I figure there are a few benefits to it: it keeps a record of what I’m doing, gives credit to the people who have helped solve some of my challenges, and shows others what I’m doing so they can ask me questions (I’m not the best at twitter and answering #SQLHelp questions). So without further ado, here are my weekly challenges for 7/9/10:
I’m currently assigned to one project and it’s crunch time on this project.
- Data migration testing – all the schema changes in the project that have any data movement need to be tested. I’m working with our VT group to coordinate those activities.
- Reviewed some new schema changes for the project – I’m the database change reviewer.
- Rewrote two deletion stored procedures
- One had a forgotten join, so it was updating a summary table to the same thing for every situation
- One was missing a new table – an FK violation occurred when running it. Used this to find all the missing tables in the procedure – thanks Pinal Dave!
- My laptop died today. I had a noon deadline for a document on the data migration testing and at 9:02 this morning, my screen went black and white. I got in once in Safe Mode and emailed myself the nearly completed document. I then sent it from my phone to the person who needed it.
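For the missing-table hunt above, I’m not reproducing the exact script, but sys.sql_expression_dependencies (SQL 2008+) will flag references a procedure makes to objects that don’t exist – the procedure name below is made up:

```sql
-- Unresolved references show up with a NULL referenced_id.
SELECT referenced_entity_name,
       CASE WHEN referenced_id IS NULL THEN 'MISSING' ELSE 'exists' END AS Status
FROM   sys.sql_expression_dependencies
WHERE  referencing_id = OBJECT_ID('dbo.DeleteOldIncidents');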