
	NON-STANDARD FTP FEATURES:


	This FTP server has some special features which may help when
	grabbing files from it:

	<filename>.<NN>	See about file-splitting below.
	<filename>.Z	Compress the designated file while fetching it
			from this system (using BSD compress).
	<filename>.gz	Compress the designated file while fetching it
			from this system (using GNU gzip).
	<dirname>.tar	Make a TAR (using GNU tar 1.11.2) of the directory.
	<filename>.tar	Make a TAR ( - " - ) of the file.
	<dirname>.tar.Z	Make a compressed tar of the directory.
	<dirname>.tar.gz  Gzipped variant of the above.
	<filename>.tar.Z  Make a compressed tar of the file.
			(Why tar a single file?  To get its date & time
			 information too...)
	<filename>.tar.gz  Gzipped variant of the above.
	<filename>+".Z"	That is: a file  <filename>.Z  exists, and you ask
			for it to be UNCOMPRESSED for the FTP transfer.
			(For all of you who can't uncompress a'la BSD UNIX.)
	<filename>+".gz"  Same as above, but with GNU gunzip.
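	The suffix conventions above can be sketched as a small helper
	that builds the name to GET.  This is only an illustration: the
	server does the actual conversion, and the function names and
	"conversion" keywords below are made up for this sketch.

```python
# Sketch: construct the pseudo-filename to GET so the server applies
# an on-the-fly conversion.  The suffix table mirrors the list above;
# the function names and keyword strings are illustrative only.

def conversion_name(path, conversion):
    """Name to GET so the server converts `path` while sending it."""
    suffixes = {
        "compress": ".Z",      # BSD compress while fetching
        "gzip": ".gz",         # GNU gzip while fetching
        "tar": ".tar",         # tar of a file or directory
        "tar.Z": ".tar.Z",     # compressed tar
        "tar.gz": ".tar.gz",   # gzipped tar
    }
    return path + suffixes[conversion]

def uncompressed_name(path):
    """Ask for a stored .Z/.gz file to be decompressed in transit:
    request the path with the compression suffix stripped."""
    for suffix in (".Z", ".gz"):
        if path.endswith(suffix):
            return path[:-len(suffix)]
    return path
```

	For example, conversion_name("pub/gnu", "tar.gz") yields
	"pub/gnu.tar.gz", and uncompressed_name("README.Z") yields
	"README".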

	Note:	It isn't very useful to compress .arc, .zoo, .gif, .Z or
		.gz files.  Usually they just expand when compressed :-(
		(Why?  They all use similar compression schemes.)

	Note2:  This ftp server won't perform .tar or .tar.Z transfer
                (mentioned above) for top level directories.
		This is to avoid accidental loading of whole archives over
		the network to your disk.   Several gigabytes of .tar isn't
		very easy to handle.
		So .tar and/or .tar.Z will only work in some subdirectories.
		Of course, this restriction doesn't apply to regular files.

	Note3:	Please retrieve files in their UNcompressed form (leaving
		the ".Z"/".z"/".gz" out of the true file path) only if you
		REALLY can't uncompress them yourself!
		Workable 16-bit uncompress programs are available for all
		machines with at least 640k of memory.
		Consider using GNU gzip, which can decompress BSD compress
		(.Z), SysV pack (.z), and gzip (.gz) files.  Versions are
		available for multiple platforms, including UNIX, MSDOS,
		VMS, ...


	Files can be split while retrieving them by setting the desired
	part size:
		ftp> site partsize 720000
	and then GETting the file with a two-digit decimal part number
	appended to its name:
		ftp> get BIGFILE.01
		ftp> get BIGFILE.02
		ftp> get BIGFILE.03
	which for a file of  2 000 000 bytes would produce the parts:
		BIGFILE.01	720 000 bytes
		BIGFILE.02	720 000 bytes
		BIGFILE.03	560 000 bytes
	An attempt to get  BIGFILE.04  would result in an error message
	about retrieving beyond the end of the file...
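	The splitting arithmetic above can be reproduced locally, e.g. to
	predict how many parts to GET.  A minimal sketch (the server does
	the real splitting; the helper names here are made up):

```python
# Sketch: predict the part sizes and part names produced by the
# server's "site partsize" file splitting.

def part_sizes(total_size, part_size):
    """Sizes of the parts a file of `total_size` bytes splits into."""
    sizes = []
    remaining = total_size
    while remaining > 0:
        sizes.append(min(part_size, remaining))
        remaining -= part_size
    return sizes

def part_names(filename, n_parts):
    """Names to GET: a two-digit decimal part number is appended."""
    return ["%s.%02d" % (filename, i) for i in range(1, n_parts + 1)]
```

	With partsize 720000 and a 2 000 000 byte file, part_sizes gives
	[720000, 720000, 560000], matching the BIGFILE example above.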


	The special command  `SITE FIND globexpr'  (your average UNIX ftp
	client accepts it like this:  `quote site find gcc960')  runs
	sh-glob-expression matching against the filenames in the archive.
	It scans a special database file with the ordinary GNU fast-find
	program.

	An alternate way is to GET a pseudo-file with the prefix:
		/search:
	Those exact 8 chars, not "/SEARCH:" or anything else.
	This returns the query report in the data stream, and usually
	the user must rename the result explicitly; UNIX-style:
		ftp> get "/search:gcc960" gcc960.search.log

	If you have a special FTP client which sends a PORT command before
	sending the SITE FIND command, you can get the search result
	directly into a file.
	(ftp.funet.fi:/pub/local/src/net2+mea-ftp.tar.gz)
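	The /search: pseudo-file convention can be wrapped in a couple of
	trivial helpers, sketched below.  Only the "/search:" prefix comes
	from the server; the helper names and the ".search.log" local-name
	convention are assumptions taken from the example above.

```python
# Sketch: build the pseudo-file name for a SITE FIND style search,
# plus a local name for the report (matching the UNIX-style example).

def search_request(globexpr):
    """Pseudo-file to GET; the prefix must be exactly "/search:"."""
    return "/search:" + globexpr

def search_logname(globexpr):
    """A local file name for the report; the name is the user's choice."""
    return globexpr + ".search.log"
```

	With a stock client this corresponds to:
		ftp> get "/search:gcc960" gcc960.search.log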

