[EE] HELP NEEDED: Tar, XP, and dir-by-dir backup
Well, my trying to do this with crossover tools slowed me down... I got a copy
of Info-zip command line zip and used it as:
FOR /D %%G IN (*.*) DO D:\bin-test\zip\zip -r -S "k:\gz3%%G.zip" "d:%%G\*.*"
and it seems to have worked. The registries didn't copy while in use, as
expected, but are there any diff tools that will read the inside of a zip and
compare with the real file (and can be run from the command line)? It could be
done as an unzip-compare operation but I'd prefer to not require so much disk
space... I know there is a 'test' option, but I want to make sure,
independently, that it got every file, not just that whatever it decided to grab works.
BTW, 7zip portable and winzip open them fine, but peazip doesn't.
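On the verify question: a zip's central directory already stores a CRC-32 for every entry, so a small script can check the archive against the real tree without extracting anything to disk. A sketch of the idea (the helper name verify_zip is made up; this assumes Python is available on the machine):

```python
import os
import zipfile
import zlib

def verify_zip(zip_path, root):
    """Check a zip both ways against the directory it was made from:
    every archived entry must match the file on disk (by size and CRC-32,
    streamed -- nothing is extracted), and every file on disk must
    actually be in the archive. Returns a list of (name, problem)."""
    problems = []
    with zipfile.ZipFile(zip_path) as zf:
        archived = set()
        for info in zf.infolist():
            if info.is_dir():
                continue
            archived.add(info.filename.replace("/", os.sep))
            real = os.path.join(root, info.filename)
            if not os.path.isfile(real):
                problems.append((info.filename, "in zip, missing on disk"))
                continue
            # Stream the real file through crc32 in 1 MB chunks
            crc = 0
            with open(real, "rb") as fh:
                for chunk in iter(lambda: fh.read(1 << 20), b""):
                    crc = zlib.crc32(chunk, crc)
            if crc != info.CRC or os.path.getsize(real) != info.file_size:
                problems.append((info.filename, "contents differ"))
        # Reverse pass: anything on disk the archiver skipped?
        for dirpath, _, files in os.walk(root):
            for name in files:
                rel = os.path.relpath(os.path.join(dirpath, name), root)
                if rel not in archived:
                    problems.append((rel, "on disk, not in zip"))
    return problems
```

The reverse pass is what answers the "did it get every file" worry: anything on disk that never made it into the archive is reported too, independently of zip's own 'test' option.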
I wrote too soon. The Info-ZIP zip.exe seems to crash on a dir that
WinZip handles fine and compresses to 3.5GB. It stops with a nondescript
error midway through a certain file, yet using it to zip a lower folder
including that file works. I suspect it hits some internal size or number limit
(the dir has 48k files and >4GB). It also passes over hidden dirs (even when
told to include them) and any (at least top-level) dir that has '.' in its name, such as...
Dr Skip wrote:
> but are there any diff tools that will read the inside of a zip and
> compare with the real file (and can be run from the command line)? It
> could be done as an unzip-compare operation but I'd prefer to not
> require so much disk space...
I can't recall having ever seen a program that works on files in a zip file
without extracting them to a temp location. Extracting one file at a time
shouldn't cost you that much disk space, though.
> I can't recall having ever seen a program that works on files in a zip
> file without extracting them to a temp location. Extracting one file at
> a time shouldn't cost you that much disk space, though.
How critical is the need for ZIP files?
With modern disk costs plummeting, the savings are not liable to be
vast in many cases unless very substantial quantities of data are
involved.
SATA seems to be approaching $US0.15/GB here at best size
and that will probably be under $US0.10/GB before long.
That will be $10/100 GB and $100/TB ! :-). Down from the
$20,000/GB or so when I bought my first HDD - several times
that in real terms. Few other things I've bought have come down in
price by a factor of 200,000+ in my lifetime.
ZIP itself should do this. There are some options I can't remember exactly,
but something like Update or Freshen: it updates old files and adds new
ones. Internally it basically reconstructs the archive, but it avoids
re-compressing unchanged files. I think the options were -u and -f.
You can also use the archive bit (a great invention of MS-DOS? :-)). ZIP, XCOPY
and other tools can handle this bit: it is cleared when the file is copied/archived,
and set again when the file is modified, so the next time you run the
archiver it won't pick that file up again. Just checked: I could not find a
command line option in WinZip to clear archive attributes, though you can
do it from the GUI. WinRAR does have command line options for it. Or use
XCOPY /M to an external HD or temp folder and zip up that folder instead.
The reason it might be better to use the archive attribute is that it lets
you make incremental backups while keeping each ZIP independent of the
previous one... For source files, though, I have not found a better solution
than a versioning system; I use CVS and then back up the entire CVSROOT to
an external HD.
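The archive-bit trick is Windows-specific; a portable stand-in for the same incremental selection is to record when the last backup ran and pick only files modified since then. A rough sketch of that idea (the function and state-file names are made up):

```python
import os
import time

def changed_since(root, state_file):
    """Return files under root modified since the timestamp recorded in
    state_file, then record 'now' for the next run -- a portable analogue
    of the MS-DOS archive bit (which XCOPY /M clears on copy and any
    later write sets again)."""
    try:
        with open(state_file) as fh:
            last_run = float(fh.read())
    except (OSError, ValueError):
        last_run = 0.0  # first run: everything counts as changed
    picked = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_run:
                picked.append(path)
    # Record this run's time so the next run only sees newer changes
    with open(state_file, "w") as fh:
        fh.write(repr(time.time()))
    return picked
```

Each run's picked list can then be fed to the archiver of choice, and like the archive-bit scheme, every resulting archive stays independent of the previous ones.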
On Fri, Jun 13, 2008 at 2:24 PM, Gerhard Fiedler wrote:
It really does help, especially in getting the efficiency of large files rather
than millions of little ones. The original test 60GB compresses down to 21GB so
far, without special compression values. 1 TB here costs $200 (US) and change,
and I have several TB to back up, with a mind towards doing it right some day
and having at least 2 full backups somewhere at any given time. ;)
At those rates, it would also speed up the network transfer by up to 3x if
transferring to a drive on the net, as I would assume the processing is much
faster than the I/O. Moving terabytes has made for loooong backup times.
7zip has turned out to be the most reliable, and in fact has a tar mode, but
zip seems to work fine so far. It had some oddities though: In the shell, *.*
will include everything. In 7z, *.* includes only those things with a '.',
which is understandable. '*' is the proper form. However, if I had a folder
such as c:\temp and backed it up as C:\temp\ without any wildcards, it would
zip up every occurrence where a directory was named temp into that zipfile, no
matter how deep it was somewhere else. Interesting redundancy...
I don't understand; you need:
- one time backup
- a scheduled backup
Dr Skip wrote:
Weekly full backup, taken offline and stored, with daily incrementals,
would be a good goal. I'll probably never be able to get to keeping full
backups on a yearly or monthly schedule in a mountain somewhere... ;)
Nicola Perotto wrote:
> I don't understand; you need:
> - one time backup
> - a scheduled backup
In my opinion you CAN'T do a weekly backup with a packer (of any sort)!
Have a look at the rsync documentation: it allows an incremental backup,
sending only the changed portion of a file!
There are many solutions for both Linux and Windows.
Also take a look at FreeNAS (http://www.freenas.org): it's a powerful but
simple storage server.
I use FreeNAS and rsync to back up my data archive (over 100GB) daily,
while working and without losing performance!
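The "only the changed portion" claim is the heart of the rsync algorithm: the receiver sends per-block checksums of its old copy, the sender scans its new copy for blocks it can reference, and only the bytes in between travel over the wire. A toy sketch of that idea (tiny block size and simplified weak checksum; real rsync rolls the weak checksum forward in constant time per byte instead of recomputing it):

```python
import hashlib

BLOCK = 64  # tiny block size just for the demo; real rsync uses ~700+ bytes

def weak_sum(block):
    """Simplified Adler-32-style weak checksum (not rolled here)."""
    a = sum(block) % 65521
    b = sum((len(block) - j) * c for j, c in enumerate(block)) % 65521
    return (b << 16) | a

def signature(old):
    """Receiver side: map weak checksum -> {strong checksum: block offset}."""
    sig = {}
    for i in range(0, len(old), BLOCK):
        block = old[i:i + BLOCK]
        sig.setdefault(weak_sum(block), {})[hashlib.md5(block).hexdigest()] = i
    return sig

def delta(new, sig):
    """Sender side: emit block offsets where the receiver's old data can
    be reused, and literal byte runs for everything else."""
    out, lit, i = [], bytearray(), 0
    while i < len(new):
        block = new[i:i + BLOCK]
        off = None
        if len(block) == BLOCK:
            strong_map = sig.get(weak_sum(block))
            if strong_map is not None:
                off = strong_map.get(hashlib.md5(block).hexdigest())
        if off is not None:
            if lit:
                out.append(bytes(lit))
                lit = bytearray()
            out.append(off)      # reference into the receiver's old copy
            i += BLOCK
        else:
            lit.append(new[i])   # unmatched byte travels literally
            i += 1
    if lit:
        out.append(bytes(lit))
    return out

def patch(old, ops):
    """Receiver rebuilds the new file from old blocks plus literal runs."""
    return b"".join(old[op:op + BLOCK] if isinstance(op, int) else op
                    for op in ops)
```

So inserting a few bytes into the middle of a huge file costs roughly those few bytes plus the checksum exchange, not a full retransfer.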
Dr Skip wrote:
John La Rooy
On Sun, Jun 15, 2008 at 5:04 AM, Nicola Perotto wrote:
> In my opinion you CAN'T do a weekly backup with a packer (of any sort)!
You can use gzip --rsyncable. Whether it is worthwhile with disk being so
cheap is another question.
Here is a paper about rsync
I looked at rsync a while back and couldn't find much for win. It looks like
Deltacopy and Nasbackup are both based on Cygwin. Maybe good for the image
copies, although Nasbackup isn't much on the server end for windows. Freenas
looks good too, although I'll have to change things around a bit. It doesn't
like any disk formatting but its own, so that will take some thought to avoid
just throwing $$ at it and recycling some other hardware.
A couple of questions though.... The nice part about zipping to a usb drive is
that I can take it offline, save wear and tear when off, etc, although the
times are getting too long (I can also make better use of the A bit ;).
What fear should I have regarding all the issues of having these backup devices
online all the time? It doesn't protect against physical damage (fire, etc),
and there's the wear and susceptibility to damage from the host OS or power supply, etc.
The other is version backup. One use is to restore everything, or a lost file,
the other is to be able to go back and get the copy of xx from a month ago
because someone's changed it beyond repair (or Microsoft botched an update).
Separate incremental backups do that. How is that done using rsync?
Lastly, what would be the best way (and economical) to attach a bunch of PATA
drives to a single machine? Preferably windows (probably 2k).
I've had Linux copy a filesystem and say all was fine and find thousands of
files missing on a compare. I've had Linux Samba stop working under moderate
loads. I've had XP (and 2k) just freeze, and leave disks damaged. Even locally
attached disks can start acting 'funny'. So, I'm very paranoid and haven't yet
found a backup I can 'set and forget'. At least with whole image copies, I can
do a bit level compare to verify. The zip solution MAY work for putting away
older stuff, but the compare cycle is a killer. I've used tape, in the pre-dvd
days, and they would go bad, and I've even had dvd's, made less than a year
earlier AND verified, go bad on parts of them. In one case, a 6 dvd backup lost
75% of the files as unreadable - and on the same device that burned them. Less
luck on an alternate drive. I'm having a hard time trusting anything... :(
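The bit-level compare mentioned above is cheap to script, which helps with the "can't trust anything" problem: run it after every backup instead of only when restoring. A sketch of a byte-for-byte tree comparison (helper name made up), using Python's filecmp with shallow=False so it actually reads both files rather than trusting size and mtime:

```python
import filecmp
import os

def verify_tree(src, dst):
    """Byte-for-byte comparison of two directory trees. Returns the
    relative paths of files that are missing from dst or whose contents
    differ; an empty list means the copy is faithful."""
    bad = []
    for dirpath, _, files in os.walk(src):
        for name in files:
            rel = os.path.relpath(os.path.join(dirpath, name), src)
            copy = os.path.join(dst, rel)
            # shallow=False forces an actual byte comparison, not just
            # an os.stat() check on size and mtime
            if not os.path.isfile(copy) or not filecmp.cmp(
                    os.path.join(src, rel), copy, shallow=False):
                bad.append(rel)
    return bad
```

This catches exactly the "copy said fine, thousands of files missing" failure mode: a missing or silently corrupted file shows up in the returned list.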
Wow! Nice link! I really must dig into this rsync more...
I just worry that, having windows when the good stuff is in UNIX, it'll be like
getting all excited about spending a weekend at some palace, only to find the
only accommodation available for your situation is the servant's room... ;)
ie, the windows ports will be limited, buggy, incomplete, etc...
John La Rooy wrote:
> On Sun, Jun 15, 2008 at 5:04 AM, Nicola Perotto wrote:
> You can use gzip --rsyncable. Whether it is worthwhile with disk being so
> cheap is another question.
> Here is a paper about rsync
Dr Skip wrote:
> Wow! Nice link! I really must dig into this rsync more...
> I just worry that, having windows when the good stuff is in UNIX, it'll be like
> getting all excited about spending a weekend at some palace, only to find the
> only accommodation available for your situation is the servant's room... ;)
> ie, the windows ports will be limited, buggy, incomplete, etc...
The Windows port works well and is complete.
The problems are:
- differences in user rights across file systems (but that's because you
have both Linux and Windows machines)
- possible problems with foreign languages (which need Unicode)
Deltacopy also comes with a very good rsync server.
Deltacopy is now working, and certainly helpful considering the mind-numbing
array of options in rsync (it should be called kitchen-sink ;)
I particularly like the showing of the command line at run time and in the log.
However, I need some help. In testing, I backed up a small dir, then added a
file, then updated the file, all with rsync in between. All is fine for a
mirrored setup. I then added the '-b' parameter to do incremental backups,
hoping it would do them indefinitely. The first time, file.txt became file.txt~
(the original wasn't touched) which was expected, but thereafter, the original
would get overwritten, rather than a new file getting written again with some
other name modifier. Is there a way to get it to make file.txt~1 file.txt~2
etc, or even the expected file.txt~~ or such?
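rsync's own answer to this is --backup together with --backup-dir=DIR: point DIR at a fresh dated directory on each run and every run keeps its own set of displaced files, instead of a single file.txt~ being overwritten. A portable sketch of that dated-versions idea (helper name made up; it compares whole files in memory, which is only sensible for small trees):

```python
import os
import shutil
import time

def backup_with_versions(src, dst, versions_root):
    """Mirror src into dst, but before overwriting a changed file, move
    the old copy into a per-run dated folder under versions_root -- the
    idea behind rsync's --backup-dir pointed at a dated directory, so
    each run keeps its own versions rather than clobbering file.txt~."""
    stamp = time.strftime("%Y-%m-%d_%H%M%S")
    for dirpath, _, files in os.walk(src):
        for name in files:
            rel = os.path.relpath(os.path.join(dirpath, name), src)
            target = os.path.join(dst, rel)
            if os.path.isfile(target):
                with open(os.path.join(src, rel), "rb") as a, \
                        open(target, "rb") as b:
                    if a.read() == b.read():
                        continue  # unchanged: nothing to do
                # Preserve the displaced version under this run's stamp
                versioned = os.path.join(versions_root, stamp, rel)
                os.makedirs(os.path.dirname(versioned), exist_ok=True)
                shutil.move(target, versioned)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            shutil.copy2(os.path.join(src, rel), target)
    return stamp
```

After a run, versions_root/<stamp>/ holds last month's copy of anything that changed, which is exactly the "get the copy of xx from a month ago" use case.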
Nicola Perotto wrote:
> Deltacopy also comes with a very good rsync server.
Dr Skip wrote:
> What fear should I have regarding all the issues of having these backup
> devices online all the time? It doesn't protect against physical damage
> (fire, etc), and there's the wear and susceptibility to damage from the
> host OS or power supply, etc.
I have always at least one reasonably recent copy of my backups offline
(external USB harddisk).
> Lastly, what would be the best way (and economical) to attach a bunch of PATA
> drives to a single machine? Preferably windows (probably 2k).
I think individual USB2 interfaces are the way to go (I think they start at $25 or so).
> So, I'm very paranoid and haven't yet found a backup I can 'set and
> forget'. At least with whole image copies, I can do a bit level compare
> to verify.
Acronis TrueImage (to harddisk) seems to work well; it's not free, though.
The '-b' parameter does not do an incremental backup, only a 'backup'.
Reread the manual.
First, if you want an incremental backup you MUST define some parameters
and think about them:
- time schedule
- number of copies
- expected/maximum size
If you have more computers this is a big problem... think well!
Dr Skip wrote:
Thanks. I also found a nice explanation with a lot of examples here:
A lot of good stuff in rsync. I'll have to rethink how things are done here...
So far the only pain has been in permissions. We are very open here (yes, that
can be bad), so I'm having to make liberal use of the cacls command in xp. ;)
Nicola Perotto wrote:
> Some rsync examples:
> Dr Skip wrote: