PICList Thread
'[EE] HELP NEEDED Tar, XP, and dir by dir backup - '
2008\06\12@160453 by Dr Skip

Well, trying to do this with crossover tools slowed me down... I got a copy
of the Info-ZIP command-line zip and used it like this:

FOR /D %%G IN (\*.*) DO D:\bin-test\zip\zip -r -S "k:\gz3%%G.zip" "d:%%G\*.*"

and it seems to have worked. The registry files didn't copy while in use, as
expected, but are there any diff tools that will read the inside of a zip and
compare it with the real files (and can be run from the command line)? It could
be done as an unzip-then-compare operation, but I'd prefer not to need that much
disk space... I know there is a 'test' option, but I want to verify
independently that it got every file, not just that the files it decided to
include came through intact.

BTW, 7zip portable and winzip open them fine, but peazip doesn't.

Thanks,
-Skip

2008\06\12@172447 by Dr Skip

I wrote too soon. The Info-ZIP zip.exe seems to crash on a dir that WinZip
handles fine and compresses to 3.5GB. It stops with an uninformative error
midway through a certain file, yet using it to zip a lower folder that includes
that file works. I suspect it hits some internal size or file-count limit (the
dir has 48k files and >4GB). It also passes over hidden dirs (even when told to
include them) and any (at least top-level) dir that has a '.' in its name, such
as \temp.now.

-Skip

2008\06\13@092500 by Gerhard Fiedler

Dr Skip wrote:

> but are there any diff tools that will read the inside of a zip and
> compare with the real file (and can be run from the command line)? It
> could be done as an unzip-compare operation but I'd prefer to not
> require so much disk space...

I can't recall having ever seen a program that works on files in a zip file
without extracting them to a temp location. Extracting one file at a time
shouldn't cost you that much disk space, though.
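For what it's worth, a minimal sketch of that one-file-at-a-time compare as a
batch file (the archive name k:\gz3data.zip and the d:\ source root are made
up for illustration; it assumes Info-ZIP's unzip.exe is on the PATH):

@echo off
setlocal enabledelayedexpansion
rem List every entry in the archive, extract each one into a temp folder,
rem byte-compare it against the original under d:\ with FC, then delete it.
unzip -Z -1 k:\gz3data.zip > "%TEMP%\ziplist.txt"
for /F "usebackq delims=" %%F in ("%TEMP%\ziplist.txt") do (
    set "entry=%%F"
    set "entry=!entry:/=\!"
    unzip -o -qq k:\gz3data.zip "%%F" -d "%TEMP%\zipchk"
    if not exist "%TEMP%\zipchk\!entry!\" (
        fc /B "%TEMP%\zipchk\!entry!" "d:\!entry!" >nul || echo MISMATCH: %%F
        del /q "%TEMP%\zipchk\!entry!"
    )
)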

Gerhard

2008\06\13@100317 by Apptech

> I can't recall having ever seen a program that works on files in a zip
> file without extracting them to a temp location. Extracting one file at
> a time shouldn't cost you that much disk space, though.

How critical is the need for ZIP files?
With modern disk costs plummeting, the savings are not liable
to be vast in many cases unless very substantial quantities of
data are involved.
SATA seems to be approaching US$0.15/GB here at the best sizes,
and that will probably be under US$0.10/GB before long.

That will be $10 per 100 GB and $100/TB! :-) Down from the
$20,000/GB or so when I bought my first HDD - several times
that in real terms. Few other things I've bought have come
down in price by a factor of 200,000+ in my lifetime :-).


       Russell

2008\06\13@110253 by Tamas Rudnai

ZIP itself should do this. There are some options I can't remember exactly,
but it was something like Update or Freshen, so that it updates old files and
adds new ones - basically it rebuilds the archive internally, but avoids
recompressing unchanged files. I think it was -u and -f.

You can also use the Archive bit (a great invention of MS-DOS? :-)). ZIP, XCOPY
and other tools can handle this bit: it is cleared when the file is
copied/archived, and set again when the file is modified. That way, the next
time you start archiving, files that haven't changed won't be picked up again.
I just checked and could not find a command-line option for WinZip to clear
archive attributes, though you can do it from the GUI. WinRAR, on the other
hand, does have command-line options for that. Or use XCOPY /M to an external
HD or a temp folder and zip up that folder instead.
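A minimal sketch of that XCOPY /M staging idea (paths, staging folder and
archive name are made up for illustration):

rem /M copies only files whose archive attribute is set, then clears the bit;
rem /E includes subdirectories (even empty ones), /H includes hidden/system
rem files, /I tells xcopy the destination is a directory.
xcopy d:\data k:\staging /M /E /H /I
cd /d k:\staging
zip -r -S k:\incremental-2008-06-13.zip *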

The reason it might be better to use this archive attribute is that this way
you can make incremental backups, and each ZIP will still be independent of
the previous one... For source files, though, I could not find a better
solution than a version control system; I use CVS and then back up the entire
CVSROOT to an external HD.

Tamas


On Fri, Jun 13, 2008 at 2:24 PM, Gerhard Fiedler <spam_OUTlistsTakeThisOuTspamconnectionbrazil.com>
wrote:

{Quote hidden}


2008\06\13@111653 by Dr Skip

It does really help, especially in getting the efficiency of large files rather
than millions of little ones. The original test 60GB compresses down to 21GB so
far, without special compression values. 1 TB here costs $200 (US) and change,
and I have several TB to back up, with a mind towards doing it right some day
and having at least 2 full backups somewhere at any given time. ;)

At those rates, it would also speed up the network transfer by up to 3x if
transferring to a drive on the net, as I would assume the processing is much
faster than the I/O. Moving terabytes has made for loooong backup times.

7zip has turned out to be the most reliable, and in fact has a tar mode, but
zip seems to work fine so far. It had some oddities though: In the shell, *.*
will include everything. In 7z, *.* includes only those things with a '.',
which is understandable. '*' is the proper form. However, if I had a folder
such as c:\temp and backed it up as C:\temp\ without any wildcards, it would
zip up every occurrence where a directory was named temp into that zipfile, no
matter how deep it was somewhere else. Interesting redundancy...
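For reference, a couple of 7-Zip command lines along those lines (archive
names and paths are made up): -ttar selects the tar mode mentioned above, and
'*' rather than '*.*' is used so that names without a dot are matched too.

rem Plain (uncompressed) tar of the tree, handy for later bit-level compares:
7z a -ttar k:\backup\data.tar d:\data\*
rem The same tree as a zip with default compression:
7z a -tzip k:\backup\data.zip d:\data\*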

-Skip


Apptech wrote:
{Quote hidden}

2008\06\13@120114 by Nicola Perotto

I don't understand, do you need:
- a one-time backup
- a scheduled backup
???

Dr Skip wrote:
{Quote hidden}

2008\06\14@124722 by Dr Skip

Weekly full backup, taken offline and stored, with daily incrementals,
would be a good goal. I'll probably never be able to get to keeping full
backups on a yearly or monthly schedule in a mountain somewhere... ;)

Nicola Perotto wrote:
> I dont understand, you need:
> - one time backup
> - a scheduled backup
> ???
>  
>

2008\06\14@150642 by Nicola Perotto

In my opinion you CAN'T do a weekly backup with a packer (of any sort)!
Have a look at the rsync documentation: it allows incremental backups,
sending only the changed portions of a file!
There are many solutions for both Linux and Windows.
Also take a look at FreeNAS, http://www.freenas.org - it's a powerful but
simple storage server.
I use FreeNAS and rsync to back up my data archive (over 100GB) daily,
while working and without losing performance!
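As a minimal sketch of the kind of rsync run meant here (host name, module
and paths are made up; the Cygwin-based Windows ports use /cygdrive/... style
source paths and can be started from the command prompt or a scheduled task):

rem -a preserves attributes and recurses, -v is verbose, --delete makes the
rem destination an exact mirror; only changed parts of files cross the wire.
rsync -av --delete /cygdrive/d/data/ rsync://freenas/backup/data/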
       Nic


Dr Skip wrote:
{Quote hidden}

2008\06\16@013753 by John La Rooy

On Sun, Jun 15, 2008 at 5:04 AM, Nicola Perotto <.....nicolaKILLspamspam@spam@nicolaperotto.it>
wrote:

> In my opinion you CAN'T do a weekly backup with a packer (of any sort)!

You can use gzip --rsyncable. Whether it is worthwhile with disk being so
cheap is another question.
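A rough sketch of what that looks like (paths are made up; note that
--rsyncable is a patch that not every gzip build includes):

rem Pack the tree into a tar stream and gzip it in rsync-friendly blocks, so
rem a later rsync of the .tar.gz only resends the parts that actually changed.
tar -cf - /cygdrive/d/data | gzip --rsyncable > k:\data.tar.gz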

Here is a paper about rsync
olstrans.sourceforge.net/release/OLS2000-rsync/OLS2000-rsync.html

2008\06\16@135224 by Dr Skip

I looked at rsync a while back and couldn't find much for Windows. It looks
like DeltaCopy and NasBackup are both based on Cygwin. Maybe good for the image
copies, although NasBackup isn't much on the server end for Windows. FreeNAS
looks good too, although I'll have to change things around a bit. It doesn't
like any disk format but its own, so it will take some thought to recycle some
other hardware rather than just throwing $$ at it.

A couple of questions though.... The nice part about zipping to a usb drive is
that I can take it offline, save wear and tear when off, etc, although the
times are getting too long (I can also make better use of the A bit ;) .
What fear should I have regarding all the issues of having these backup devices
online all the time? It doesn't protect against physical damage (fire, etc),
and there's the wear and susceptibility to damage from the host OS or power
supply, etc.

The other is version backup. One use is to restore everything, or a lost file,
the other is to be able to go back and get the copy of xx from a month ago
because someone's changed it beyond repair (or Microsoft botched an update).
Separate incremental backups do that. How is that done using rsync?

Lastly, what would be the best way (and economical) to attach a bunch of PATA
drives to a single machine? Preferably windows (probably 2k).

I've had Linux copy a filesystem and say all was fine and find thousands of
files missing on a compare. I've had Linux Samba stop working under moderate
loads. I've had XP (and 2k) just freeze, and leave disks damaged. Even locally
attached disks can start acting 'funny'. So, I'm very paranoid and haven't yet
found a backup I can 'set and forget'. At least with whole image copies, I can
do a bit level compare to verify. The zip solution MAY work for putting away
older stuff, but the compare cycle is a killer. I've used tape, in the pre-DVD
days, and the tapes would go bad, and I've even had DVDs, made less than a year
earlier AND verified, go bad on parts of them. In one case, a 6-DVD backup lost
75% of the files as unreadable - and on the same device that burned them. Less
luck on an alternate drive. I'm having a hard time trusting anything... :(

2008\06\16@163327 by Dr Skip

Wow! Nice link! I really must dig into this rsync more...

I just worry that, having Windows when the good stuff is in UNIX, it'll be like
getting all excited about spending a weekend at some palace, only to find the
only accommodations available for your situation are the servants' room... ;)

ie, the windows ports will be limited, buggy, incomplete, etc...


John La Rooy wrote:
> On Sun, Jun 15, 2008 at 5:04 AM, Nicola Perotto <nicolaspamKILLspamnicolaperotto.it>
> wrote:
>
>
> You can use gzip --rsyncable. Whether it is worthwhile with disk being so
> cheap is another question.
>
> Here is a paper about rsync
> olstrans.sourceforge.net/release/OLS2000-rsync/OLS2000-rsync.html

2008\06\16@165813 by Nicola Perotto



Dr Skip wrote:
> Wow! Nice link! I really must dig into this rsync more...
>
> I just worry that, having windows when the good stuff is in UNIX, it'll be like
> getting all excited about spending a weekend at some palace, only to find the
> only accomodations available for your situation are the servant's room... ;)
>
> ie, the windows ports will be limited, buggy, incomplete, etc...
>  
The Windows port works well and is complete.
The problems are:
- differences across file systems in user rights (but that's because you
have both Linux and Windows machines)
- there can be some problems with foreign languages (anything that needs Unicode)

Also, DeltaCopy comes with a very good rsync server.

{Quote hidden}

2008\06\16@181903 by Dr Skip

DeltaCopy is now working, and is certainly helpful considering the mind-numbing
array of options in rsync (it should be called kitchen-sink ;)

I particularly like that it shows the command line at run time and in the log.
However, I need some help. In testing, I backed up a small dir, then added a
file, then updated the file, all with rsync in between. All is fine for a
mirrored setup. I then added the '-b' parameter to do incremental backups,
hoping it would do them indefinitely. The first time, file.txt became file.txt~
(the original wasn't touched) which was expected, but thereafter, the original
would get overwritten, rather than a new file getting written again with some
other name modifier. Is there a way to get it to make file.txt~1 file.txt~2
etc, or even the expected file.txt~~ or such?

TIA,
Skip


Nicola Perotto wrote:
> >
> Also with Deltacopy there is an RSync server very good.
>
>

2008\06\16@184333 by Gerhard Fiedler

Dr Skip wrote:

> What fear should I have regarding all the issues of having these backup
> devices online all the time? It doesn't protect against physical damage
> (fire, etc), and there's the wear and susceptibility to damage from the
> host OS or power supply, etc.

I always have at least one reasonably recent copy of my backups offline
(on an external USB hard disk).

> Lastly, what would be the best way (and economical) to attach a bunch of PATA
> drives to a single machine? Preferably windows (probably 2k).

I think individual USB2 interfaces are the way to go (I think they start at $25 or so).

> So, I'm very paranoid and haven't yet found a backup I can 'set and
> forget'. At least with whole image copies, I can do a bit level compare
> to verify.

Acronis TrueImage (to harddisk) seems to work well; it's not free, though.

Gerhard

2008\06\17@034034 by Nicola Perotto

The '-b' parameter doesn't do an incremental backup, only a 'backup'.
Reread the manual:
http://rsync.samba.org/ftp/rsync/rsync.html

First, if you want an incremental backup you MUST define some parameters
and think them through:
- time schedule
- number of copies
- expected/maximum size
- etc.
If you have more computers this becomes a big problem... think it over well!
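For the per-run copies asked about above (rather than the single ~ file that
-b alone gives), one common approach is --backup-dir; a minimal sketch with
made-up paths, where the dated directory name would come from whatever
scheduler runs the job:

rem The mirror lives in 'current'; files this run would overwrite or delete
rem are moved into a dated side directory instead of being lost. A relative
rem --backup-dir is interpreted relative to the destination directory.
rsync -av --delete --backup --backup-dir=../changed-2008-06-17 ^
    /cygdrive/d/data/ /cygdrive/k/mirror/current/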


Dr Skip wrote:
{Quote hidden}

2008\06\17@051004 by Nicola Perotto

Some rsync examples:
http://samba.anu.edu.au/rsync/examples.html


Dr Skip wrote:
{Quote hidden}

2008\06\17@153617 by Dr Skip

Thanks. I also found a nice explanation with a lot of examples here:
http://www.mikerubel.org/computers/rsync_snapshots/

A lot of good stuff in rsync. I'll have to rethink how things are done here...
So far the only pain has been in permissions. We are very open here (yes, that
can be bad), so I'm having to make liberal use of the cacls command in xp. ;)
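A related trick for the versioned-copies question: rsync can build rotating
snapshots directly with --link-dest, which hard-links files that haven't
changed against the previous snapshot, so each dated directory looks like a
full backup while only changed files take new space. A minimal sketch with
made-up paths (the destination filesystem must support hard links):

rem A relative --link-dest is resolved against the destination directory.
rsync -av --delete --link-dest=../2008-06-16 ^
    /cygdrive/d/data/ /cygdrive/k/snapshots/2008-06-17/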


Nicola Perotto wrote:
> Some rsync examples:
> http://samba.anu.edu.au/rsync/examples.html
>
>
> Dr Skip wrote:
>
