1. Create a table with an integer IDENTITY column and a single CHAR column, NULLable.
2. Insert 10,000 rows with NULL in the CHAR column.
3. Use `[sys].[dm_db_index_physical_stats]` to find the minimum and maximum row sizes.
4. Create another table with an integer IDENTITY column and two CHAR columns, both NULLable.
5. Insert 10,000 rows with NULL in both CHAR columns.
6. Use `[sys].[dm_db_index_physical_stats]` again to compare the minimum and maximum row sizes.
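The steps above could be sketched in T-SQL roughly as follows. The table names and the CHAR length are arbitrary placeholders (the exercise doesn't specify them), and the cross join of `sys.all_objects` is just one convenient way to generate 10,000 rows:

```sql
-- Step 1 and step 4: two heaps, one and two NULLable fixed-length columns.
CREATE TABLE dbo.OneCharCol
(
    id   int IDENTITY(1, 1) NOT NULL,
    col1 char(100) NULL
);

CREATE TABLE dbo.TwoCharCols
(
    id   int IDENTITY(1, 1) NOT NULL,
    col1 char(100) NULL,
    col2 char(100) NULL
);

-- Steps 2 and 5: 10,000 rows of NULL each, generated from a system view.
INSERT INTO dbo.OneCharCol (col1)
SELECT TOP (10000) NULL
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

INSERT INTO dbo.TwoCharCols (col1, col2)
SELECT TOP (10000) NULL, NULL
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- Steps 3 and 6: DETAILED (or SAMPLED) mode is required, because the
-- record-size columns come back NULL in the default LIMITED mode.
SELECT OBJECT_NAME(ps.object_id) AS table_name,
       ps.min_record_size_in_bytes,
       ps.max_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED') AS ps
WHERE ps.object_id IN (OBJECT_ID('dbo.OneCharCol'), OBJECT_ID('dbo.TwoCharCols'));
```

Because CHAR is a fixed-length type, the minimum and maximum record sizes should come back equal for each table, with the all-NULL rows still occupying the full declared column width plus a few bytes of row overhead.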
Hakan is completely right, but this question made me think about how to show it experimentally. So I created a new database with one table:

```sql
CREATE TABLE CharTestTbl
(
    col1 char(8000) NULL
);
```

After that, the database had a size of 3.0 MB with 0.88 MB free on my system. I then populated that table with 10,000 rows of NULL:

```sql
INSERT INTO CharTestTbl (col1)
SELECT TOP (10000) NULL AS col1
FROM sys.all_objects
JOIN sys.all_objects s2 ON 1 = 1
JOIN sys.all_objects s3 ON 1 = 1;
```

After that insert, the size of the database was 179.75 MB with 0.69 MB of free space. This is precisely what you would expect from the reference Hakan provided, but it gives you an experimental demonstration for people who like to see it happen.
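Since the original question specifically asked for `sys.dm_db_index_physical_stats`, here is a sketch of how it could be pointed at the demo table above. Note that the record-size columns are only populated in SAMPLED or DETAILED mode, not the default LIMITED mode:

```sql
-- Inspect actual stored row sizes for CharTestTbl (DETAILED mode needed
-- for min/max_record_size_in_bytes to be non-NULL).
SELECT ps.alloc_unit_type_desc,
       ps.page_count,
       ps.min_record_size_in_bytes,
       ps.max_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('CharTestTbl'), NULL, NULL, 'DETAILED') AS ps;
```

You would expect min and max to be identical here: char(8000) is fixed-length, so every row stores the full 8,000 bytes plus a few bytes of row overhead even though every value is NULL, which is exactly the point the database-size experiment illustrates.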