SQL - importing a text file


tigger1

Original Poster:

8,402 posts

222 months

Wednesday 8th August 2007
OK, sounds simple so far?

How can I get SQL Query Mangle-iser to only read in certain rows from the text file? Or is it better to read in all the rows and then just drop the ones I don't need with a DELETE statement?

Also, the file is fixed width... and I'm going to do this regularly - is there a way I can just code this, rather than using the GUI?

(Please don't suggest pressing F1... I know that if you got this far you know how to do it ;) )


Don

28,377 posts

285 months

Wednesday 8th August 2007
tigger1 said:
OK, sounds simple so far?

How can I get SQL Query Mangle-iser to only read in certain rows from the text file? Or is it better to read in all the rows and then just drop the ones I don't need with a DELETE statement?

Also, the file is fixed width... and I'm going to do this regularly - is there a way I can just code this, rather than using the GUI?

(Please don't suggest pressing F1... I know that if you got this far you know how to do it ;) )
Either write a program in VB to do it (easy for a VB programmer) or buy an "Importer". Loads available. T'other easy thing to do is to read your file into Excel, and then cut and paste the data into the back end of an Access "linked table".

I've never used SQL Query Analyser to import data...sure it will open a text file but what's the point in that? Can you update a table with it after???

pdV6

16,442 posts

262 months

Wednesday 8th August 2007
If you want to code it then whatever code you write can encapsulate whatever logic you want in terms of selectively picking which bits of data to import. As Don says, anyone with a bit of VB experience should be able to sort it out for you.

tigger1

Original Poster:

8,402 posts

222 months

Wednesday 8th August 2007
Cheers guys, looks like a bit of buggering about in Access with VBA and then importing from there then.

I actually meant Server Management Studio rather than Query Analyser.

FunkyGibbon

3,786 posts

265 months

Wednesday 8th August 2007
Have you tried the DTS tool? Quite a powerful way of getting data in/out of SQL Server, and you can schedule it to run as well.

You can define what bits of the input file to drag across and you can map and transform (using VBscript) input fields to output fields quite easily.

Loads of examples of what you can do here:

http://www.sqldts.com/

HTH

FG
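
The field mapping FunkyGibbon mentions is done in a DTS ActiveX Script transformation. A minimal sketch of what that VBScript looks like - the column names here (RecordType, AccountNo) are invented for illustration, not from the thread:

```vb
' DTS ActiveX Script transformation (column names are examples)
Function Main()
    ' Only let detail records through; skip everything else
    If DTSSource("RecordType") <> "10" Then
        Main = DTSTransformStat_SkipRow
        Exit Function
    End If

    ' Map and tidy an input field into the output field
    DTSDestination("AccountNo") = Trim(DTSSource("AccountNo"))

    Main = DTSTransformStat_OK
End Function
```

Returning DTSTransformStat_SkipRow is what gives you the "only read in certain rows" behaviour the OP asked about.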

wiggy001

6,545 posts

272 months

Wednesday 8th August 2007
Can you not use BULK INSERT to get the data into a table, then either delete the rows you don't want or insert those you do into another table?
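
As a sketch of that approach - table, file and column names below are made up, and for a fixed-width file you'd describe the column widths in a bcp-style format file:

```sql
-- Hypothetical staging table matching the fixed-width layout
CREATE TABLE ImportStaging (
    RecordType char(2),
    AccountNo  char(10),
    Amount     char(12)
);

-- Load every row; the format file describes the fixed-width columns
BULK INSERT ImportStaging
FROM 'C:\data\input.txt'
WITH (FORMATFILE = 'C:\data\input.fmt');

-- Then throw away the rows you don't want, e.g. header/trailer records
DELETE FROM ImportStaging
WHERE RecordType <> '10';
```

Because it's plain T-SQL, this is easy to script and re-run regularly, which answers the "code it rather than use the GUI" part of the question.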

NWMark

517 posts

217 months

Wednesday 8th August 2007
DTS is the way to do it, and there's also a pretty good wizard to get the basics in place as a starting point; it can be easily scheduled too.

Mark
P.S. if you're using SQL 2005 it's called Integration Services.

cyberface

12,214 posts

258 months

Wednesday 8th August 2007
DTS will do the job and you can embed logic in the import step... but if you want any logging of failure conditions / numbers of rows loaded / ignored / etc. then it's short of functionality.

Integration Services in 2005 is meant to be a LOT better in this regard.

Depending on where you want to keep your logic, it may be more consistent to take the approach suggested above: bulk insert all rows into a staging table and apply logic in a stored proc to decide which rows to use. If all your other DB data-handling logic is in stored procs, then having critical jobs hanging off DTS packages is poor design IMO.
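
A sketch of that staging-table stored proc (all names are invented; BULK INSERT won't take a variable file path directly, hence the dynamic SQL):

```sql
CREATE PROCEDURE usp_LoadDailyFile
    @FilePath varchar(255)
AS
BEGIN
    -- Start from a clean slate each run
    TRUNCATE TABLE ImportStaging;

    -- Build the BULK INSERT dynamically so the file path can vary
    DECLARE @sql varchar(500);
    SET @sql = 'BULK INSERT ImportStaging FROM ''' + @FilePath
             + ''' WITH (FORMATFILE = ''C:\data\input.fmt'')';
    EXEC (@sql);

    -- Selection logic lives here, alongside the rest of your SPs
    INSERT INTO Transactions (AccountNo, Amount)
    SELECT AccountNo, CAST(Amount AS money)
    FROM ImportStaging
    WHERE RecordType = '10';
END
```

Keeping the row-selection logic in the proc (rather than in a DTS package) is the consistency point being made here.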

tigger1

Original Poster:

8,402 posts

222 months

Thursday 9th August 2007
cyberface said:
DTS will do the job and you can embed logic in the import step... but if you want any logging of failure conditions / numbers of rows loaded / ignored / etc. then it's short of functionality.

Integration Services in 2005 is meant to be a LOT better in this regard.

Depending on where you want to keep your logic, it may be more consistent to take the approach suggested above: bulk insert all rows into a staging table and apply logic in a stored proc to decide which rows to use. If all your other DB data-handling logic is in stored procs, then having critical jobs hanging off DTS packages is poor design IMO.
Gone for a stored procedure that loads the file into a table, which is then chopped / changed / edited as required until it resembles the required data format. Bit of a bastardisation... but it appears to work, and it has saved me the joy of learning how to use Integration Services 2005!