
  • Superior Administrator
Posted

All Databases

 

You need SSH access and possibly superuser privileges to run these commands.

 

[iCODE]mysqldump -u root -p --all-databases | gzip > /path/backups/backup_$(date "+%b-%d-%Y-%H-%M-%S").sql.gz[/iCODE]

 

[iCODE]mysqldump[/iCODE]: utility to dump MySQL databases.

 

[iCODE]-u[/iCODE] specifies the MySQL user with privileges on the databases being dumped. In this tutorial, I am using the root user.

 

[iCODE]-p[/iCODE] tells mysqldump to prompt for that user's password.

 

[iCODE]-h[/iCODE] specifies the host you are connecting to. You can omit it for localhost, but if the server is remote you need to specify its IP address or hostname.

 

[iCODE]gzip[/iCODE]: utility that compresses the dump as it is piped, producing the .sql.gz file.

 

[iCODE]date[/iCODE]: the $(date "+%b-%d-%Y-%H-%M-%S") part appends the current date and time to the backup filename, so every backup gets a unique name.
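
To restore from one of these compressed dumps, you can pipe it back into the mysql client. A minimal sketch, assuming the file produced above is called backup_Jan-01-2025-02-00-00.sql.gz (substitute the actual filename of your backup) and you connect as root:

[CODE]
# Decompress the dump and feed it straight into mysql; with an --all-databases
# dump this recreates every database contained in the file
gunzip < /path/backups/backup_Jan-01-2025-02-00-00.sql.gz | mysql -u root -p
[/CODE]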

 

Single Database

 

If you just want to back up a single database, you can run this command:

 

[iCODE]mysqldump -h localhost -u root -ppassword dbname | gzip > /path/backups/backup_$(date "+%b-%d-%Y-%H-%M-%S").sql.gz[/iCODE]

Note: there is no space between -p and the password when it is given inline.
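
To automate this, you could run the same command from cron. A minimal sketch, assuming a nightly run at 02:00, a database called dbname, and that the MySQL credentials live in /root/.my.cnf so no password appears in the crontab (the schedule, paths and filenames here are only examples):

[CODE]
# /etc/cron.d/mysql-backup -- nightly dump of dbname at 02:00
# Note: % is special in crontab entries and must be escaped as \%
0 2 * * * root mysqldump dbname | gzip > /path/backups/backup_$(date "+\%b-\%d-\%Y-\%H-\%M-\%S").sql.gz
[/CODE]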


  • Superior Administrator
Posted

To transfer the backup to a remote host, I advise using rsync.

Example command to send the backup to a backup server:

 

[iCODE]rsync -av --progress /your/local/backup/directory/backup_$(date "+%b-%d-%Y-%H-%M-%S").sql.gz root@remote.ip.address.here:/home/backups[/iCODE]

 

[iCODE]rsync -av --progress /your/local/backup/directory/backup_$(date "+%b-%d-%Y-%H-%M-%S").sql.gz[/iCODE]

 

rsync -- tool to transfer files

-a archive mode flag; preserves permissions, timestamps and other file attributes during the transfer

-v verbose flag, needed to view the output of rsync

 

--progress flag to see the progress of the transfer

 

/your/local/backup/directory path to your backup directory

 

backup_$(date "+%b-%d-%Y-%H-%M-%S").sql.gz this will be your backup file if you used the backup example in my first post. Keep in mind that $(date ...) expands again when you run rsync, so substitute the actual filename of the backup you created.

 

[iCODE]root@remote.ip.address.here:/home/backups[/iCODE]

 

root the user that has access to the remote server receiving the backup; in my case it's root, but you can change this accordingly.

 

@remote.ip.address.here the remote location or IP address, e.g. @203.0.113.23

 

: the colon separates the remote host from the destination path and is always required. If the remote server uses a non-default SSH port, add -e "ssh -p <port>" to the rsync command (see the example after this breakdown).

 

/home/backups this is the directory on the remote server where you want the backup to be stored.
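
If the backup server listens for SSH on a non-default port, here is a sketch of the same transfer, assuming port 2222 as an example and an already-created backup file:

[CODE]
# -e tells rsync which remote shell to use; here ssh on port 2222 instead of the default 22
rsync -av --progress -e "ssh -p 2222" /your/local/backup/directory/backup_Jan-01-2025-02-00-00.sql.gz root@remote.ip.address.here:/home/backups
[/CODE]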

  • Superior Administrator
Posted

Fast data import trick

 

[iCODE]mysqldump -h localhost -u root -p --extended-insert --quick --no-create-info mydb mytable | gzip > mytable.sql.gz[/iCODE]

 

A bit more about this line:

 

  • [iCODE]--extended-insert[/iCODE]: makes sure the dump is not one INSERT per row; a single INSERT statement can contain dozens of rows, which makes the import much faster.
  • [iCODE]--quick[/iCODE]: useful when dumping large tables. Without it, mysqldump buffers the whole table in memory before writing it out; with --quick the rows are streamed to the output as they are read, so memory usage stays low.
  • [iCODE]--no-create-info[/iCODE]: only the data is exported; no CREATE TABLE statements are added to the dump. (See the sketch after this list for a quick way to verify that.)
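
To sanity-check what the dump contains, a quick sketch assuming the mytable.sql.gz file produced by the command above:

[CODE]
# Peek at the start of the compressed dump without extracting it;
# you should see header comments and multi-row INSERT statements, but no CREATE TABLE
zcat mytable.sql.gz | head -n 20
[/CODE]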

 

To import the dump, do this:

 

Disable foreign key checks when importing batch data in MySQL

 

[iCODE]SET foreign_key_checks = 0;[/iCODE]

/* do your stuff REALLY CAREFULLY */

[iCODE]SET foreign_key_checks = 1;[/iCODE]

 

 

Add [iCODE]SET FOREIGN_KEY_CHECKS=0;[/iCODE] to the beginning of your SQL file

 

or

  • [iCODE]cat <(echo "SET FOREIGN_KEY_CHECKS=0;") data.sql | mysql[/iCODE]

or

  • [iCODE]mysql --init-command="SET SESSION FOREIGN_KEY_CHECKS=0;"[/iCODE]

You don't need to run SET FOREIGN_KEY_CHECKS=1 afterwards, as the session variable is reset automatically when the session ends.
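
Putting it together, a sketch of importing the mytable.sql.gz dump from the command above with foreign key checks disabled for the session (mydb is the database name used in that example; adjust user and credentials to your setup):

[CODE]
# Decompress the dump and pipe it into mysql; FOREIGN_KEY_CHECKS is only disabled for this session
gunzip < mytable.sql.gz | mysql --init-command="SET SESSION FOREIGN_KEY_CHECKS=0;" -u root -p mydb
[/CODE]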

