Continuous Integration
May 9, 2017 by Kenneth Fisher
It’s T-SQL Tuesday again! This month we are being hosted by James Anderson (b/t). Thanks James! He has asked us to talk about Shipping Database Changes. I actually have some experience with an aspect of this at a previous company and I thought this would be a good time to share that story.
Once upon a time (about 10-15 years ago; I'm old, I don't exactly remember) in a land far, far away (~60 miles), I worked for a company that handled its software builds in a strange and mysterious way (i.e. well thought out and practical).
I wasn't aware of it at the time, but we were using a form of continuous integration. At least, it matches up with the majority of the description.
First of all, we used source control for all of our scripts, including the scripts to build the database. If your change (new column, new SP, modified function, whatever) was not in source control, it wasn't part of the software. Period. Now here is where it got interesting. Every night there was an automated build. The software would be compiled and the database built from the latest versions of the scripts in source control. But what if something was put into source control that broke the build, you might ask? Simple. An email would go out to the entire company with the name of the file and who had checked it in last. There would then be some good-natured ribbing (because we all knew it might be us next time), and the file would be rolled back or fixed and the build re-started.
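The nightly build step could be sketched roughly like this. The script names, the `run_script` stub, and the notification text are all hypothetical; the real process used in-house tooling, and this is just the shape of the idea: run every versioned build script in order, and on the first failure, report the file and its last committer.

```python
# Sketch of the nightly database build described above (illustrative only).

def build_database(scripts, run_script):
    """Run each versioned build script in order; stop at the first failure.

    Returns (True, None) on success, or (False, script_name) identifying
    the script that broke the build.
    """
    for name in scripts:
        if not run_script(name):
            return (False, name)
    return (True, None)


def broken_build_notice(script, author):
    # In the real process, this message went to the entire company.
    return f"Nightly build FAILED: {script} (last checked in by {author})"


if __name__ == "__main__":
    # Hypothetical script list; run_script here just simulates one bad check-in.
    scripts = ["create_tables.sql", "create_procs.sql", "create_functions.sql"]
    ok, bad = build_database(scripts, lambda s: s != "create_procs.sql")
    if not ok:
        print(broken_build_notice(bad, "some_developer"))
```

The key design point is that the build stops at the first failing script, since everything after it may depend on the object that failed to build.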
Once the build itself was completed, the default data was loaded. We had a lot of default data that came with the software, and it was all kept in text files (also in source control) that would be loaded into the database. Again, if anything had been checked in that failed to load, we were notified by an automated email.
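A minimal sketch of that data-load step, assuming tab-delimited text files (the actual file format and loader were in-house and aren't described in the post). Note that it collects every failing file rather than stopping at the first one, so a single bad check-in doesn't hide other problems:

```python
# Illustrative default-data loader; file format and row handling are assumptions.
import csv
import io


def load_default_data(files, insert_row):
    """Load each default-data text file into the database via insert_row.

    files is a mapping of file name -> file contents (tab-delimited rows).
    Returns a list of (file_name, error_message) for files that failed,
    so an automated email can report all of them at once.
    """
    failures = []
    for name, text in files.items():
        try:
            for row in csv.reader(io.StringIO(text), delimiter="\t"):
                insert_row(name, row)
        except Exception as exc:
            failures.append((name, str(exc)))
    return failures
```

In practice `insert_row` would wrap an actual database insert; here it's left as a parameter so the control flow can be shown without a live database.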
And last but not least came testing. Yes, we had automated testing. We actually had a testing team: two amazing guys who would do a fair amount of manual testing but who also wrote automated tests for each new process, to be run as part of the nightly build. The tests were designed directly from the requirements, with questions directed to the business people, not the developers. The developers were involved, of course, but we tried not to build tests that just matched the code. That way we made sure the end results were closer to what was expected. And again, automated emails went out for any of the tests that failed.
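The nightly test run could be sketched like this (the harness and test names are made up; the post doesn't describe the actual framework). Each test is named for the requirement it checks, not the code it exercises, and the runner reports every failure rather than stopping at the first:

```python
# Illustrative nightly test runner; structure and names are assumptions.

def run_nightly_tests(tests):
    """Run each requirement-driven test and collect the names of failures.

    tests is a list of (requirement_name, test_callable) pairs. A test
    fails by raising (e.g. a failed assertion). Returns the failed names
    so an automated email can list exactly which requirements broke.
    """
    failed = []
    for name, test in tests:
        try:
            test()
        except Exception:
            failed.append(name)
    return failed
```

Keying each test to a requirement name means the failure email reads as "which business rules broke overnight," which is far more useful than a list of function names.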
I want to point out something here. This was all automated. We would come into work in the morning with news that the build succeeded (most of the time) or failed (occasionally). If it failed, it was fixed and the process re-run. It was awesome.
So why was this so awesome? Well, first of all, we always had a brand new copy of the database with all of the latest changes. Next, since all of the tests were run, not just the latest set, we got a fair amount of integration testing done automatically. And lastly, we were far more careful about what we checked in than we might have been otherwise. No one wanted to be embarrassed :).
This was a while ago, so a fair amount of our process was developed in-house. Now, however, there are plenty of tools out there to help you out. Fair warning, though: it's not like this type of process will work for everyone. That was a software company, so the processes around building our software had to be a bit sharper than otherwise. My current company might be able to do something like this, but honestly, it would be a huge amount of work and it's not that good a fit. (We are working on getting closer, though.) I really enjoyed it while I had it.
Category: Documentation, SQLServerPedia Syndication, T-SQL Tuesday | Tags: Continuous Integration, microsoft sql server, T-SQL Tuesday