Solved good old backup problem

February 17, 2014 at 06:47:46
Specs: RedHat 6 in 64-bit, x86-64
In short: I need to make a backup of a filesystem that is 250 GB in size.

I can FTP all the files individually, but I am certain that the FTP transfer mode (binary/text) will be wrong for at least one file in that structure. So that option really is a no-go.

I was thinking of TAR and GZIP, but it seems that it TARs the FULL file structure first, then starts to GZIP. Of course, I ran out of disk space.

There are lots of files in that structure: big binary files, big text files, small binaries, small text files, and so on.

It's no problem if I end up with multiple files instead of one.
There must be some sort of compression.

I do need to use STANDARD Linux features, nothing extra.
The file will be copied to Windows, but it does NOT need to be opened on Windows, only stored. If necessary, the file splitting may happen on the Windows side (for that reason, one big file is acceptable).
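For reference, the constraints above (standard tools only, compression required, multiple output files allowed) can be met by streaming tar through gzip and split in a single pipeline, so nothing is staged uncompressed on disk. A minimal sketch; the paths and the 1 KB chunk size are demo placeholders (in practice you would size the chunks in gigabytes):

```shell
# Build a tiny demo tree; SRC stands in for the real 250 GB filesystem root.
SRC=$(mktemp -d)
echo "hello" > "$SRC/file.txt"

OUT=$(mktemp -d)   # stands in for the backup destination

# tar streams the archive to stdout, gzip compresses on the fly,
# and split cuts the compressed stream into fixed-size pieces.
tar -cf - -C "$SRC" . | gzip | split -b 1k - "$OUT/backup.tar.gz.part-"

ls "$OUT"   # lists the generated part file(s)
```

To restore, concatenate the parts in order and reverse the pipeline: `cat "$OUT"/backup.tar.gz.part-* | gunzip | tar -xf - -C /some/target`.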

Hi there.




#1
February 17, 2014 at 07:19:29
✔ Best Answer
Use Samba to transfer the files via SMB networking.

But why would binary mode in FTP be unsuitable for any files? It is perfectly safe to transfer text files using binary mode as long as you don't want any conversion to take place (which you don't). FTP is also considerably quicker than SMB.



#2
February 17, 2014 at 07:48:08
That's a good remark indeed; binary mode would actually be suitable, since I would never use the files on Windows anyway (maybe open a copy, but never edit or save it). The problem then is not the mode, but the sheer number of files. And I mean lots: we're talking at least 100,000, and probably a lot more.

Is Samba natively supported on Red Hat?

Hi there.




#3
February 17, 2014 at 10:56:30
You should be able to run Samba on Red Hat; I'm not sure if it's installed by default. Another possibility is to use rsync (Windows versions exist). Personally, for speed and convenience, I would use an external hard drive rather than messing about with Windows. You could then just tar the files directly to it, forgetting about compression.
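The tar-straight-to-drive suggestion can be sketched like this (temp directories stand in for the source tree and the external drive's mount point, which are placeholders here):

```shell
SRC=$(mktemp -d); echo "demo" > "$SRC/f"   # stands in for the source filesystem
DEST=$(mktemp -d)                          # stands in for e.g. /mnt/usbdrive

# One plain tar, no compression step, so no extra staging space is
# needed on the source side: the archive is written straight to the drive.
tar -cf "$DEST/backup.tar" -C "$SRC" .

tar -tf "$DEST/backup.tar"   # list the archive contents to verify
```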



#4
February 18, 2014 at 02:46:10
Thanks for the info, I'll have a look at that, and retry ZIP as well; maybe there actually is a 64-bit version of ZIP that handles huge files.

Hi there.



#5
February 20, 2014 at 07:32:54
Writing a TAR and copying it to a USB drive connected to my client computer would technically work, but it would be sloooooooooow, besides the fact that I'd need an additional 250 GB because tar on its own doesn't compress.

Hi there.



#6
February 24, 2014 at 02:39:00
The GZIP option combined with TAR seems to work ... For creation, it's both fast and small (meaning the compression is done on the fly). I need to do a test with extract now.

The ZIP and UNZIP tools seem to be 32-bit in most if not all cases, while TAR and GZIP are both 64-bit executables.
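For the extract test mentioned above, one possible round-trip check with tar's built-in gzip mode (`-z`), using throwaway temp paths as placeholders:

```shell
# Round-trip sketch: create a gzipped tar on the fly, then verify by
# extracting and comparing the restored file to the original.
SRC=$(mktemp -d); echo "payload" > "$SRC/data.txt"   # tiny demo tree
ARCHIVE="$(mktemp -u).tgz"

tar -czf "$ARCHIVE" -C "$SRC" .        # -z gzips while tar streams

RESTORE=$(mktemp -d)
tar -xzf "$ARCHIVE" -C "$RESTORE"      # the extract test

cmp "$SRC/data.txt" "$RESTORE/data.txt" && echo "round trip OK"
```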

Hi there.


