[gpfsug-discuss] GPFS copying huge batches of files from one stgpool to another - how do you do it?
Jez.Tucker at rushes.co.uk
Mon Mar 19 23:36:17 GMT 2012
Just wondering how other people go about copying huge batches of files in very deep directory trees from one file system to another. Assume filenames are full Unicode and can contain almost any character.
It has me wishing GPFS had COPY FROM support as well as a MIGRATE FROM function for policies.
Surely that would be possible...?
Ways I can think of are:
- Multiple 'scripted intelligent' rsync threads
- Creating a policy to generate a file list, then passing N batched files to N nodes to exec (again rsync?)
- Barry Evans suggested AFM, though our file system needs to be upgraded before we could try this.
Rsync handles Unicode names well. tar, though faster for the first pass, does not.
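For what it's worth, here is a minimal sketch of the "multiple scripted threads" idea in Python, using the standard library's shutil in place of rsync so it is self-contained (the function name, worker count, and use of threads rather than spawned rsync processes are all illustrative assumptions, not anything GPFS-specific):

```python
import os
import shutil
from concurrent.futures import ThreadPoolExecutor

def copy_tree_parallel(src_root, dst_root, workers=8):
    """Walk src_root, recreate the directory layout under dst_root,
    and copy files with a pool of worker threads. os.walk yields
    str paths, so full-Unicode filenames need no special handling."""
    tasks = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for dirpath, dirnames, filenames in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            dst_dir = os.path.join(dst_root, rel)
            os.makedirs(dst_dir, exist_ok=True)
            for name in filenames:
                src = os.path.join(dirpath, name)
                dst = os.path.join(dst_dir, name)
                # copy2 preserves mtime/permissions, like rsync -a (roughly)
                tasks.append(pool.submit(shutil.copy2, src, dst))
    # re-raise the first copy error, if any
    for t in tasks:
        t.result()
```

In practice you would replace the shutil call with a batched rsync invocation per node, but the walk/batch/dispatch shape would be the same.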
GPFSUG Chairman (chair at gpfsug.org)
Rushes Postproduction Limited, 66 Old Compton Street, London W1D 4UH
tel: +44 (0)20 7437 8676