3.7.1p2 sftp recurse patch

Ben Lindstrom mouring at etoh.eviladmin.org
Wed Oct 1 15:44:25 EST 2003


A few comments before wandering off to bed for the night.

[..]
>
>  static int
> +do_recursive_download(struct sftp_conn *conn, char *remote_path,
> +		      char *local_path, int pflag)
> +{
> +	char *remote_tmp, *local_tmp;
> +	int err, n;
> +	SFTP_DIRENT **d;
> +	extern int errno;
> +
> +	if (recursion && remote_is_dir(conn, remote_path)) {
> +		if (!is_dir(local_path)) {
> +			/* Create local directory */
> +			err = mkdir(local_path, 0777);
> +			if (err == -1) {
> +				error("Couldn't create local directory \"%s\": "
> +					"%s", local_path, strerror(errno));
> +				goto END;
> +			}
> +		}
> +
> +		err = do_readdir(conn, remote_path, &d);
> +		if (err == -1) {
> +			error("Couldn't gather list of remote files");
> +			goto END;
> +		}
> +
> +		for (n = 0; d[n] != NULL; n++) {
> +			/* Skip '.' and '..' */
> +			if ((strcmp(d[n]->filename, ".")  == 0) ||
> +			    (strcmp(d[n]->filename, "..") == 0))
> +				continue;
> +			remote_tmp = path_append(remote_path, d[n]->filename);
> +			local_tmp = path_append(local_path, d[n]->filename);
> +			err = do_recursive_download(conn, remote_tmp,
> +				local_tmp, pflag);
				^^^^^^^^^^^^
Ditch the recursive concept.  On deep directory trees you will kill your
memory usage, if not crash the client outright (or, even worse, loop
forever chasing symlinks).
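
Roughly the shape I'd suggest instead: an untested sketch that keeps a
small worklist of path pairs on the heap rather than recursing on the C
stack.  It reuses the do_readdir()/path_append() helpers the patch
already calls, plus do_download(), free_sftp_dirents() and the xmalloc()
family; names and signatures may not match your tree exactly.

static int
download_dir_worklist(struct sftp_conn *conn, char *remote_path,
    char *local_path, int pflag)
{
	struct work { char *remote, *local; struct work *next; } *head, *w;
	SFTP_DIRENT **d;
	int i, err = 0;

	head = xmalloc(sizeof(*head));
	head->remote = xstrdup(remote_path);
	head->local = xstrdup(local_path);
	head->next = NULL;

	while (head != NULL) {
		w = head;
		head = head->next;

		if (mkdir(w->local, 0777) == -1 && errno != EEXIST) {
			error("Couldn't create local directory \"%s\": %s",
			    w->local, strerror(errno));
			err = -1;
			goto next;
		}
		if (do_readdir(conn, w->remote, &d) == -1) {
			error("Couldn't read remote directory \"%s\"",
			    w->remote);
			err = -1;
			goto next;
		}
		for (i = 0; d[i] != NULL; i++) {
			char *rtmp, *ltmp;

			if (strcmp(d[i]->filename, ".") == 0 ||
			    strcmp(d[i]->filename, "..") == 0)
				continue;
			/* Skip symlinks so a cycle can't make us loop. */
			if ((d[i]->a.flags & SSH2_FILEXFER_ATTR_PERMISSIONS) &&
			    S_ISLNK(d[i]->a.perm))
				continue;
			rtmp = path_append(w->remote, d[i]->filename);
			ltmp = path_append(w->local, d[i]->filename);
			if ((d[i]->a.flags & SSH2_FILEXFER_ATTR_PERMISSIONS) &&
			    S_ISDIR(d[i]->a.perm)) {
				/* Queue the subdirectory instead of recursing. */
				struct work *n = xmalloc(sizeof(*n));
				n->remote = rtmp;
				n->local = ltmp;
				n->next = head;
				head = n;
			} else {
				if (do_download(conn, rtmp, ltmp, pflag) != 0)
					err = -1;
				xfree(rtmp);
				xfree(ltmp);
			}
		}
		free_sftp_dirents(d);
 next:
		xfree(w->remote);
		xfree(w->local);
		xfree(w);
	}
	return (err);
}

Each do_readdir() listing gets freed before the next directory is
touched, so memory stays roughly proportional to the number of
directories still queued rather than to the depth of the tree.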

[..]
> +static int
> +do_recursive_upload(struct sftp_conn *conn, char *local_path,
> +		    char *remote_path, int pflag)
> +{
> +	int err;
> +	DIR *d;
> +	struct dirent *f;
> +	char *local_tmp, *remote_tmp;
> +	Attrib a;
> +	extern int errno;
> +

Use fts().  That is why it was created.
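
For the upload side a walk could look roughly like this (untested
sketch: it assumes do_mkdir()/do_upload() from sftp-client.c plus the
same path_append() the patch uses, and it glosses over trailing-slash
handling when mapping local names onto remote ones):

/* requires <fts.h>; the usual errno/string headers as elsewhere */
static int
upload_dir_fts(struct sftp_conn *conn, char *local_path,
    char *remote_path, int pflag)
{
	FTS *ftsp;
	FTSENT *p;
	char *paths[2], *rpath;
	size_t skip = strlen(local_path);
	int err = 0;
	Attrib a;

	paths[0] = local_path;
	paths[1] = NULL;

	/* FTS_PHYSICAL: never follow symlinks, so cycles are impossible. */
	if ((ftsp = fts_open(paths, FTS_PHYSICAL|FTS_NOCHDIR, NULL)) == NULL) {
		error("fts_open \"%s\": %s", local_path, strerror(errno));
		return (-1);
	}
	while ((p = fts_read(ftsp)) != NULL) {
		/* Remote path = remote_path + whatever follows the local prefix. */
		rpath = path_append(remote_path, p->fts_path + skip);

		switch (p->fts_info) {
		case FTS_D:	/* directory, pre-order: create it remotely */
			attrib_clear(&a);
			a.flags = SSH2_FILEXFER_ATTR_PERMISSIONS;
			a.perm = p->fts_statp->st_mode & 0777;
			if (do_mkdir(conn, rpath, &a) != 0)
				err = -1;
			break;
		case FTS_F:	/* regular file: send it */
			if (do_upload(conn, p->fts_path, rpath, pflag) != 0)
				err = -1;
			break;
		default:	/* post-order dirs, symlinks, errors: skip */
			break;
		}
		xfree(rpath);
	}
	fts_close(ftsp);
	return (err);
}

fts() does all of the directory bookkeeping itself, keeps its state on
the heap, and with FTS_PHYSICAL it will not wander through symlinked
directories.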

Too bad fts() does not support callbacks the way glob() does, or it
would be useful for download as well.

I've toyed with importing OpenBSD's fts() and modifying it to work for
downloads, but didn't get too far since I was working on local
caching.

Yes, it should be 'get -r' and 'put -r'.

- Ben




