First of all, congrats to the HeidiSQL community, it's great.
Now the problem
I have a huge database that I want to export to SQL (millions of rows).
Is there a way to produce multiple files, say a new file every 10,000/100,000 or however many rows?
Another option would be to export every row with its own INSERT/REPLACE INTO ... that way I could split the file externally and it would work.
My last SQL export was an 800 MB text file, and it's growing every day!
Yet another option would be a selective export, in the sense that I could export only rows with id > last_backup.
Any ideas would be welcome.
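The "only rows with id > last_backup" idea can be done outside HeidiSQL with mysqldump's --where option, which restricts the dump to rows matching a predicate. A minimal sketch, assuming a hypothetical database "mydb" and table "mytable" with a numeric primary key "id" (the actual dump command is commented out because it needs a running server):

```shell
#!/bin/sh
# Sketch of an incremental export via mysqldump's --where flag.
# "mydb", "mytable", and the "id" column are placeholders for your schema.

LAST=123456   # hypothetical: highest id captured by the previous backup

# Dump only rows added since the last backup (requires a running server):
# mysqldump --where="id > $LAST" mydb mytable > incremental_$LAST.sql

# The predicate that would be passed to --where:
echo "id > $LAST"
```

After a successful run you would record the new highest id somewhere so the next backup can pick up from there.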
No, there is no such splitting option. Also, the extended INSERT syntax is always used; there is no option to deactivate it. Single INSERTs perform veeeeeery slowly when you import them again. HeidiSQL does not support slowness :)
You should see if your underlying tables can be trimmed at some point. It's good discipline to have a cron job deleting old rows from tables which are pure logging tables.
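As a concrete (entirely hypothetical) example of such a cron job — the database "mydb", table "log_table", and timestamp column "created_at" are placeholder names, and this is a config fragment, not a definitive setup:

```shell
# crontab entry: every night at 03:00, purge log rows older than 90 days.
# "mydb", "log_table", and "created_at" are assumed names -- adjust to your schema.
# 0 3 * * * mysql mydb -e "DELETE FROM log_table WHERE created_at < NOW() - INTERVAL 90 DAY"
```

Trimming like this keeps the tables, and therefore the dumps, from growing without bound.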
The root problem is that the more options you add, the more difficult it becomes to use. And if you browse this forum you'll find that some users can't figure out how to use the feature even with the current design!
Some time ago I was faced with a very specific need (creating an SQL dump for version control purposes) and I suggested creating export profiles, so you're able to fine-tune the export but you aren't forced to do it every time. But the idea wasn't very successful :)
But simplicity is not always the best solution. Anyway, I am very happy with Heidi; it's a wonderful tool, and a nice name too.
I will take a look at phpdump or any other dump script, even some bash script; the server is running on a CentOS 6 box.
You are being very childish.
I pointed out that numerous times users have brought up the need for an option to not have extended inserts.
Each time you have responded with a childish comment, as you have again.
Tell us all: what happens when you're using a 350 MB dump file and there is a connection problem? Yep, start again and pray.
Thus, you should consider not just an option to avoid extended INSERTs, but also the ability to resume.
I have never heard good arguments for single INSERTs. There is not even an issue report about that in the tracker.
The problem is very simple: I have a database with millions of rows, and the dump file is currently 2.4 GB, yes, GB.
If I had an INSERT for every row the file would be larger, but I could split it with a decent editor and use several small files.
If it were possible to produce several files with, say, 100,000 rows each, the restore would be easy with a simple .bat to handle all the files. BUT.
BUT with a single 2.4 GB file, I've tried to restore the database on another computer and Heidi failed to restore it. It simply can't handle that file; I had to kill it.
It's not a childish question, nor a childish answer. HeidiSQL is a very nice tool, but for me it has a very limited backup system.
Databases are getting bigger and bigger, and a plain-text backup is a very good idea for me.
Kalvaro, between extreme simplicity and extreme complexity there are several options.
Thank you, and please, be constructive.
HeidiSQL is not an ideal backup client; I suggest that you use the mysqldump and mysql command-line clients for that (especially mysqldump --skip-extended-insert, which should do what you want).
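With --skip-extended-insert each row becomes its own single-line INSERT, so a plain `split -l` produces valid, independently loadable SQL chunks — which covers the "new file every 100,000 rows" request. A minimal sketch, with placeholder names ("mydb", table "t"); the mysqldump/mysql calls are commented out because they need a running server, and a dummy dump is generated instead so the splitting step can be shown standalone:

```shell
#!/bin/sh
# Real dump, one INSERT per row (requires a running MySQL server):
# mysqldump --skip-extended-insert mydb > dump.sql

# Simulated dump for illustration: 250 single-row INSERTs, one per line.
seq 1 250 | sed "s/.*/INSERT INTO t (id) VALUES (&);/" > dump.sql

# Split into 100-line chunks: chunk_aa, chunk_ab, chunk_ac.
split -l 100 dump.sql chunk_

# Restore each chunk in order (commented out; needs a server):
# for f in chunk_*; do mysql mydb < "$f"; done

ls chunk_* | wc -l   # 3 chunks for 250 rows at 100 per file
```

If a restore dies mid-way, you only re-run the remaining chunk files instead of starting the whole 2.4 GB dump over, which also addresses the resume complaint above.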