Is it possible to make scp ignore symbolic links during the copy?

I need to reinstall one of our servers, and as a precaution I want to move /home, /etc, /opt and /Services to the backup server.

However, I have one problem: because of the large number of symbolic links, a lot of files get copied multiple times.

Is it possible to make scp ignore the symbolic links (or, better, copy the links as links rather than as directories or files)? If not, is there another way to do it?

I knew it was possible; I had just picked the wrong tool. I did it with rsync:

rsync --progress -avhe ssh /usr/local/  XXX.XXX.XXX.XXX:/BackUp/usr/local/
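Note that -a implies -l, so rsync recreates symbolic links as links on the destination instead of following them. If you would rather skip symbolic links entirely, a minimal variant of the same command (same source and destination as above) should work:

# --no-links turns off the -l implied by -a, so symlinks are skipped
# rather than recreated on the destination
rsync --progress -avh --no-links -e ssh /usr/local/ XXX.XXX.XXX.XXX:/BackUp/usr/local/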

I found that the rsync method did not work for me; however, I found an alternative that did work, on this website (www.docstore.mik.ua/orelly).

Specifically, section 7.5.3 of O'Reilly's "SSH, The Secure Shell: The Definitive Guide":

7.5.3. Recursive Copy of Directories

...

Although scp can copy directories, it isn't necessarily the best method. If your directory contains hard links or soft links, they won't be duplicated. Links are copied as plain files (the link targets), and worse, circular directory links cause scp1 to loop indefinitely. (scp2 detects symbolic links and copies their targets instead.) Other types of special files, such as named pipes, also aren't copied correctly.

A better solution is to use tar, which handles special files correctly, and send it to the remote machine to be untarred, via SSH:

$ tar cf - /usr/local/bin | ssh server.example.com tar xf -
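If the link between the two machines is slow, a slightly tweaked version of the same pipeline (a sketch, assuming GNU tar or bsdtar on both ends) compresses the stream in transit and preserves permissions on extraction:

# -z gzips the stream while it crosses the network; -p on the extracting
# side keeps the original file permissions
tar czf - /usr/local/bin | ssh server.example.com 'tar xzpf -'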

Using tar over ssh as both sender and receiver does the trick as well:

cd "$DEST_DIR"
ssh user@remote-host "cd $REMOTE_SRC_DIR && tar cf - ./" | tar xvf -
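As a concrete usage sketch (the host and paths below are hypothetical placeholders, not taken from the answer):

# hypothetical values, for illustration only
DEST_DIR=/backup/services
REMOTE_SRC_DIR=/opt/services

mkdir -p "$DEST_DIR"
cd "$DEST_DIR"
# double quotes let the local shell expand $REMOTE_SRC_DIR before ssh runs;
# && makes tar run only if the cd on the remote side succeeded
ssh backup@old-server "cd $REMOTE_SRC_DIR && tar cf - ./" | tar xvf -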

One solution is to drive scp with find. I have a situation where some software generates *.gz files plus symbolic links that point to the same *.gz files under slightly shorter names. If I simply use scp, the symbolic links are copied as regular files, resulting in duplicates. I know rsync can ignore symbolic links, but my .gz files are not compressed with rsync-friendly options, and rsync is very slow at copying them. So I simply use the following command to copy the files over:

find . -type f -exec scp {} target_host:/directory/name/data \;

The -type f test matches only regular files and ignores symbolic links. You need to run this command on the source host. Hope this helps someone in my situation; let me know if I missed anything.
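Note that this starts one scp (and one ssh connection) per file, which can be slow when there are many small files. A hedged alternative, assuming GNU tar on both sides, is to feed the same find output into a single tar-over-ssh stream:

# -print0 and --null -T - pass the file list NUL-separated so unusual file
# names survive; -type f still limits the copy to regular files
find . -type f -print0 | tar cf - --null -T - | ssh target_host 'tar xf - -C /directory/name/data'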

A one-liner that can be executed on the client to copy a folder from the server, using tar + ssh:

ssh user@<server IP/hostname> 'mkdir -p <remote source directory>; cd <remote source directory>; tar cf - ./' | tar xf - -C <local destination directory>

Note: the mkdir is required. If the remote source directory does not exist, the cd fails and the command will simply archive the entire home directory of the remote user and extract it on the client.
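For example, with hypothetical values filled in (the host and paths are placeholders, not from the original answer):

# hypothetical host and paths, for illustration only
mkdir -p /backup/opt-services
ssh root@192.0.2.10 'mkdir -p /opt/services; cd /opt/services; tar cf - ./' | tar xf - -C /backup/opt-services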