Big text file taking a very long time to import (2 days so far)

[expired user #2764] posted 17 years ago in Import/Export
I'm importing a large text file (8 million lines) into a MySQL table via the HeidiSQL front end. Each line has 42 tab-separated fields, so there's a lot of data to bring in.

I tested the import with a truncated version of the text file (10,000 lines) and it worked fine, taking about a minute (I think). The full version, which is 800x as long, has now been importing for nearly 48 hours. HeidiSQL still seems to be working - it hasn't frozen, it's busy.

Is it normal for a file of this size to take so long to import?
[expired user #2759] posted 17 years ago
Well, I can't see your PC, but it's possible. Take into account that since HeidiSQL is a frontend which constantly displays data (such as each executed query), it needs a lot of time for each query to complete...

An alternative would be to import the .sql file via the MySQL commandline.
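For anyone unfamiliar with the command-line route, an import looks roughly like this (the user, database, and file names here are placeholders):

```shell
# Feed the dump file straight into the mysql client; no GUI display overhead.
mysql -u youruser -p yourdatabase < bigfile.sql
```

The client reads and executes the statements in batch mode, without rendering each query's result the way a GUI does.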
ansgar posted 17 years ago
Did you use the 3.0 final release or the latest build + the new "run sql file" feature?

If 3.0 final: your file will be loaded into RAM (maybe even twice the size of the file). So, this is not a good idea.

If build 755 with "run sql file":
The feature is not yet as polished as it should be: a big memory leak makes it get slower and slower. I'm currently working on that feature.

In the meantime you really should use the mysql command line.
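Since the original file is tab-separated data rather than SQL statements, the usual fast path on the command line is MySQL's LOAD DATA statement. A sketch, with placeholder file, database, and table names:

```shell
# Bulk-load a tab-separated file; much faster than row-by-row INSERTs.
# --local-infile=1 lets the server read a file residing on the client machine.
mysql -u youruser -p --local-infile=1 yourdatabase -e \
  "LOAD DATA LOCAL INFILE 'bigfile.txt'
   INTO TABLE yourtable
   FIELDS TERMINATED BY '\t'
   LINES TERMINATED BY '\n';"
```

Note that LOCAL loading must also be enabled on the server side, depending on the MySQL version and configuration.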
[expired user #2764] posted 17 years ago
It's the 3.0 final - I guess that's the problem. I'll use the command line instead. Thanks for the advice!
gogia7 posted 2 years ago

I am trying to import a file with tab-separated text. First of all, I do not know how to set Tab as the field separator character. As a result, each entire row's data gets crammed into the first column (and obviously trimmed, since it exceeds the column length), and all other columns display as NULL. I first thought this was due to the field type (BIGINT) not being recognised, but I discovered the real issue when I changed all columns to VARCHAR.

ansgar posted 2 years ago

You can specify "\t" for tabs as the field separator, as well as some other escape sequences.

[attached screenshot of the import dialog]
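If the GUI import keeps misbehaving, the same tab separator can be expressed in plain SQL via the command-line client. A sketch with placeholder names; the IGNORE clause skips a header row, which exported text files often have:

```shell
# Tab as field separator fixes the "everything in the first column" symptom.
mysql -u youruser -p --local-infile=1 yourdatabase -e \
  "LOAD DATA LOCAL INFILE 'data.txt'
   INTO TABLE yourtable
   FIELDS TERMINATED BY '\t'
   LINES TERMINATED BY '\n'
   IGNORE 1 LINES;"
```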
