A Programmer's Dream

Exporting Large Dataset into Excel

Posted by Stephen Wrighton on 16 Jan 2007

I like CodeProject. I like it a lot, and it is one of the sites I hit while searching for answers to questions.

Of course, on occasion, they get articles like this one. Basically, it's the author's idea of the ONLY way to take a large data table and get it into an Excel file quickly.

His algorithm is basically a method that accepts a data table and a filename. He walks the data table and builds a tab-delimited string using the StringBuilder class, then passes that StringBuilder to a second method that writes the data out to a file.
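Something like the following is how I read that description. This is my own sketch of the approach, not the article's code, and the method names are mine:

Imports System.Data
Imports System.IO
Imports System.Text

Module TabDelimitedExport
    ' Walk the DataTable and build one big tab-delimited string.
    Public Sub ExportToFile(ByVal table As DataTable, ByVal fileName As String)
        Dim sb As New StringBuilder()

        ' Header row from the column names.
        For col As Integer = 0 To table.Columns.Count - 1
            If col > 0 Then sb.Append(ControlChars.Tab)
            sb.Append(table.Columns(col).ColumnName)
        Next
        sb.AppendLine()

        ' One line per DataRow, columns separated by tabs.
        For Each row As DataRow In table.Rows
            For col As Integer = 0 To table.Columns.Count - 1
                If col > 0 Then sb.Append(ControlChars.Tab)
                sb.Append(row(col).ToString())
            Next
            sb.AppendLine()
        Next

        ' Hand the StringBuilder off to a second method that writes the file.
        WriteToFile(sb, fileName)
    End Sub

    Private Sub WriteToFile(ByVal sb As StringBuilder, ByVal fileName As String)
        File.WriteAllText(fileName, sb.ToString())
    End Sub
End Module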

My first question is: what version of Excel is he writing for? To me, it seems it would be easier to put the data table into a DataSet and export that as XML, which newer versions of Excel can open natively.
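As a rough sketch of that idea (assuming the caller still owns the DataTable, so I copy it before adding it to a DataSet):

Imports System.Data

Module XmlExport
    Public Sub ExportAsXml(ByVal table As DataTable, ByVal fileName As String)
        ' A DataTable can only belong to one DataSet, so work on a copy.
        Dim ds As New DataSet("Export")
        ds.Tables.Add(table.Copy())

        ' Writes plain XML that recent versions of Excel can open directly.
        ds.WriteXml(fileName, XmlWriteMode.IgnoreSchema)
    End Sub
End Module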

If he's writing for an older version, why not create an Excel file, then use ADO.NET to open that file for data access and merge the data table into it?
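That would look roughly like the sketch below, using the Jet OLE DB provider against the older .xls format. The worksheet name, column names and column types here are placeholders, not anything from the article:

Imports System.Data
Imports System.Data.OleDb

Module JetExcelExport
    Public Sub ExportViaJet(ByVal table As DataTable, ByVal fileName As String)
        Dim connString As String = _
            "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=" & fileName & ";" & _
            "Extended Properties=""Excel 8.0;HDR=Yes"""

        Using conn As New OleDbConnection(connString)
            conn.Open()

            ' CREATE TABLE makes Jet create the worksheet (and the file, if needed).
            Dim create As New OleDbCommand( _
                "CREATE TABLE [Export] ([Name] TEXT, [Amount] NUMBER)", conn)
            create.ExecuteNonQuery()

            ' Merge the rows of the DataTable into the worksheet.
            For Each row As DataRow In table.Rows
                Dim insert As New OleDbCommand( _
                    "INSERT INTO [Export] ([Name], [Amount]) VALUES (?, ?)", conn)
                insert.Parameters.AddWithValue("@Name", row("Name"))
                insert.Parameters.AddWithValue("@Amount", row("Amount"))
                insert.ExecuteNonQuery()
            Next
        End Using
    End Sub
End Module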

And then there's the overhead of running this as two separate methods, with what amounts to nested Try/Catch blocks. The author is using the Try/Catch blocks to do error handling rather than checking up front for the conditions that would generate an error.
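For example, a handful of up-front checks can cover most of what those Try/Catch blocks are guarding against. The specific checks below are my own, not the article's:

Imports System.Data
Imports System.IO

Module GuardClauses
    ' Check the obvious failure cases before doing any work, instead of
    ' letting them surface as exceptions.
    Public Function CanExport(ByVal table As DataTable, ByVal fileName As String) As Boolean
        If table Is Nothing OrElse table.Rows.Count = 0 Then Return False
        If String.IsNullOrEmpty(fileName) Then Return False

        Dim folder As String = Path.GetDirectoryName(Path.GetFullPath(fileName))
        If Not Directory.Exists(folder) Then Return False

        Return True
    End Function
End Module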

So while there is nothing wrong with this implementation, it cannot be the 'fastest' way of doing things. Not with the overhead of the two methods and two Try/Catch blocks. Likewise, I would drop constructs such as IsDBNull and IIf, and instead do the null check directly with something like "stringParameter Is DBNull.Value".
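To illustrate the difference (the helper function below is mine, purely for illustration):

Imports System.Data

Module NullChecks
    Public Function FieldOrEmpty(ByVal row As DataRow, ByVal column As String) As String
        ' The style I would drop: IIf evaluates both branches, and IsDBNull
        ' adds another function call.
        ' Return CStr(IIf(IsDBNull(row(column)), "", row(column)))

        ' The direct check: compare the value against DBNull.Value with Is.
        If row(column) Is DBNull.Value Then
            Return ""
        End If
        Return CStr(row(column))
    End Function
End Module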

Tweet me @kidananubix if you like this post.
