Hi
I have a table whose dump is around 5 GB. Is it possible to automatically split it into files of 500 MB each via the Devart dump?
If not, do you have any suggestions on how to achieve this?
Thank you
Dump single table data to multiple files.
Re: Dump single table data to multiple files.
MySqlDump itself does not have this function, but you can use system utilities to split the dump file into parts.
(Note for the code sample further below: when UseExtSyntax=true, remember to set a reasonable CommitBatchSize as well, so that one batch stays well below maxPartSize.)
- Windows: a file manager such as Total Commander, etc.
- Linux: (example)
Code: Select all
# split -b 500M -d dump.sql dump.
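One caveat with a plain byte split: the cut points fall mid-statement, so the parts are not individually executable SQL; they just need to be concatenated back in order before restoring. A quick sketch (assuming GNU coreutils; the 3 MB sample file merely stands in for the real dump):

```shell
# a 3 MB sample file standing in for the real 5 GB dump
head -c 3145728 /dev/urandom > dump.sql

# cut into 1 MB pieces with numeric suffixes: dump.part.00, dump.part.01, dump.part.02
split -b 1M -d dump.sql dump.part.

# the pieces concatenate back to the byte-identical original
cat dump.part.* > restored.sql
cmp dump.sql restored.sql && echo "identical"
```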
It is more interesting if you want to split the dump into correct SQL chunks that can be executed separately. Perhaps code like this will help you in that case:
Code: Select all
using System;
using System.IO;
using System.Text.RegularExpressions;
using Devart.Data.MySql;

namespace ConsoleApp_dump
{
    class Program
    {
        static void Main(string[] args)
        {
            string cs = "User Id=user;Password=1111;Host=mysql.local;Database=world;";
            string dumpFile = "dump.sql";
            string partFiles = "dump.sql.";   // part files: dump.sql.0, dump.sql.1, ...
            int maxPartSize = 50000;          // approximate size limit per part, in characters

            // Dump the database to a single file first.
            using (MySqlConnection conn = new MySqlConnection(cs))
            {
                conn.Open();
                MySqlDump dump = new MySqlDump(conn);
                dump.UseExtSyntax = true;
                dump.CommitBatchSize = 2000;  // keep one batch well below maxPartSize
                dump.Backup(dumpFile);
            }

            // Then split the file on statement boundaries (a line ending in ";"),
            // so that every part is a valid SQL script on its own.
            int charsWritten = 0;
            int partsCount = 0;
            StreamWriter sw = new StreamWriter(partFiles + partsCount, false);
            using (StreamReader reader = new StreamReader(dumpFile))
            {
                string sql = "";
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    sql += line + "\n";
                    if (Regex.IsMatch(sql, @";\n$"))  // end of a statement
                    {
                        if (charsWritten > 0 && charsWritten + sql.Length > maxPartSize)
                        {
                            // The current part is full: close it and start the next one.
                            sw.Close();
                            partsCount++;
                            sw = new StreamWriter(partFiles + partsCount, false);
                            charsWritten = 0;
                        }
                        sw.Write(sql);  // sql already ends with "\n"
                        charsWritten += sql.Length;
                        sql = "";
                    }
                }
            }
            sw.Close();
        }
    }
}
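Outside .NET, the same statement-boundary splitting can be sketched with a few lines of awk (a rough equivalent of the loop above, not a Devart feature). The tiny `max` and sample file here only exist to force several parts; for 500 MB parts you would use something like `max=500000000` on the real dump:

```shell
# a tiny sample dump standing in for the real dump.sql
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\nINSERT INTO t VALUES (3);\n' > dump.sql

# split on statement boundaries into dump.sql.0, dump.sql.1, ...
awk -v max=30 'BEGIN { part = 0; size = 0 }
  { buf = buf $0 "\n" }                 # buffer lines of the current statement
  /;[[:space:]]*$/ {                    # a line ending in ";" closes a statement
      if (size > 0 && size + length(buf) > max) {
          close("dump.sql." part)       # current part is full: roll to the next one
          part++; size = 0
      }
      printf "%s", buf > ("dump.sql." part)
      size += length(buf); buf = ""
  }' dump.sql

ls dump.sql.?   # list the resulting part files
```

Like the C# version, this assumes statements end with ";" at the end of a line, which holds for MySqlDump output but not for arbitrary SQL containing ";" inside string literals.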