February 17, 2016 by Kenneth Fisher
When developing in SQL Server you are eventually going to have to modify a column or two (at least). And if you use T-SQL to make those modifications (which I'd recommend most of the time), there is something you need to watch out for.
First, remember that a column definition has three parts: the datatype, the precision, and the nullability. (Sometimes identity too, but we're going to ignore that for these purposes.)
CREATE TABLE AlterColumns (
    Column1 datetime2 NULL,
    Column2 varchar(50) NOT NULL,
    Column3 int
);
You'll notice that in only one case did I specify all three things; in the other cases I accepted various defaults.
- Column1 – datetime2 is the datatype, I accepted the default precision (7 in this case) and specified that this column is nullable.
- Column2 – varchar is the datatype, 50 is the precision (there is a default precision for string datatypes; don't rely on it) and the column is not nullable.
- Column3 – int is the datatype, int has a fixed precision (you can’t change or even specify it) and I accepted the default of a nullable column.
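You can confirm how those defaults resolved by querying the metadata. This is a quick check using the standard sys.columns and sys.types catalog views (I'm assuming the table landed in the dbo schema):

```sql
-- Inspect the resolved definition of each column.
-- For datetime2, the precision shows up in the scale column.
SELECT c.name,
       t.name AS datatype,
       c.max_length,
       c.scale,
       c.is_nullable
FROM sys.columns AS c
JOIN sys.types AS t
    ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.AlterColumns');
```

You should see Column1 come back as datetime2 with a scale of 7 and is_nullable = 1, Column2 as varchar(50) with is_nullable = 0, and Column3 as a nullable int.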
So here is the thing: when you alter a column you re-specify the whole definition, not just the part you care about. Anything you leave out reverts to its default, not to the column's current setting. That means if you don't specify a precision when you could, you get the default precision. That's not exactly a common problem though. Usually the precision (or possibly the datatype) is exactly what you are changing. What is a common mistake is not specifying the nullability.
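To see the precision half of that, here's a statement you should never run in earnest. It's a deliberate illustration: in a column definition, varchar with no length means varchar(1), not the varchar(30) you get from CAST:

```sql
-- Deliberately omitting the length: this tries to make Column2 a varchar(1),
-- and it will fail with a truncation error if any existing value is longer
-- than one character.
ALTER TABLE dbo.AlterColumns ALTER COLUMN Column2 varchar NOT NULL;
```

If the table happens to be empty (or every value is a single character), the statement succeeds quietly, which is arguably worse.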
ALTER TABLE AlterColumns ALTER COLUMN Column2 varchar(255);
You'll see statements like this all the time. The problem is the unexpected consequence: you just changed the column to the default of nullable, but it was non-nullable. Hopefully your code is careful about inserts and updates and won't ever try to put a NULL into the column, so it isn't likely to break anything immediately. It's still something you want to watch out for though. You made that column non-nullable for a reason. It should stay that way.
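The habit to build is simple: restate the full definition every time, nullability included, even when you're only changing the length:

```sql
-- Restate everything: datatype, precision, AND nullability.
ALTER TABLE dbo.AlterColumns ALTER COLUMN Column2 varchar(255) NOT NULL;
```

That way the parts you aren't changing stay exactly as they were, instead of silently falling back to their defaults.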