author     Daniel Stenberg <daniel@haxx.se>  2000-03-16 11:32:53 +0000
committer  Daniel Stenberg <daniel@haxx.se>  2000-03-16 11:32:53 +0000
commit     5992252b3d5e0b37587bf3f234d289ffeae42ba4
tree       cbf9314cc6b23ac8a7abccd46d6c10c23f19e8be
parent     90030a49c7facfefeca8157255f213197343c340
download   curl-5992252b3d5e0b37587bf3f234d289ffeae42ba4.tar.gz
updates and fixes
-rw-r--r--  CONTRIBUTE    8
-rw-r--r--  FAQ          11
-rw-r--r--  FEATURES     10
-rw-r--r--  FILES         1
-rw-r--r--  INSTALL      26
-rw-r--r--  README       16
-rw-r--r--  README.curl  67
-rw-r--r--  TODO         33
8 files changed, 138 insertions, 34 deletions
diff --git a/CONTRIBUTE b/CONTRIBUTE
index e268d4e21..5550d7f5a 100644
--- a/CONTRIBUTE
+++ b/CONTRIBUTE
@@ -1,4 +1,10 @@
- Date: 1999-08-04
+ _ _ ____ _
+ ___| | | | _ \| |
+ / __| | | | |_) | |
+ | (__| |_| | _ <| |___
+ \___|\___/|_| \_\_____|
+
+CONTRIBUTE
To Think About When Contributing Source Code
diff --git a/FAQ b/FAQ
index dea807e96..5496057d3 100644
--- a/FAQ
+++ b/FAQ
@@ -1,4 +1,4 @@
-Date: 19 November 1999
+Date: 15 March 2000
Frequently Asked Questions about Curl
@@ -29,3 +29,12 @@ Date: 19 November 1999
I am very interested in once and for all getting some kind of report or
README file from those who have used libcurl in a threaded environment,
since I haven't and I get this question more and more frequently!
+
+4. Why doesn't my posting using -F work?
+
+ You can't simply use -F or -d at your choice. The web server that will
+ receive your post assumes one of the formats. If the form you're trying to
+ "fake" sets the type to 'multipart/form-data', than and only then you must
+ use the -F type. In all the most common cases, you should use -d which then
+ causes a posting with the type 'application/x-www-form-urlencoded'.
+
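+ For example, posting the same two fields both ways could look like this
+ (the URL and the field names here are only examples):
+
+ curl -d "name=daniel&tool=curl" http://www.example.com/form.cgi
+
+ curl -F "name=daniel" -F "tool=curl" http://www.example.com/form.cgi
+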
diff --git a/FEATURES b/FEATURES
index 303e884ba..11d75f832 100644
--- a/FEATURES
+++ b/FEATURES
@@ -1,7 +1,16 @@
+ _ _ ____ _
+ ___| | | | _ \| |
+ / __| | | | |_) | |
+ | (__| |_| | _ <| |___
+ \___|\___/|_| \_\_____|
+
+FEATURES
+
Misc
- full URL syntax
- custom maximum download time
- custom least download speed acceptable
+ - custom output result after completion
- multiple URLs
- guesses protocol from host name unless specified
- uses .netrc
@@ -21,6 +30,7 @@ HTTP
- follow redirects
- custom HTTP request
- cookie get/send
+ - understands the netscape cookie file
- custom headers (that can replace internally generated headers)
- custom user-agent string
- custom referer string
diff --git a/FILES b/FILES
index a46138233..abf1d2426 100644
--- a/FILES
+++ b/FILES
@@ -1,3 +1,4 @@
+BUGS
CHANGES
CONTRIBUTE
FEATURES
diff --git a/INSTALL b/INSTALL
index a2e389433..d96bb8b57 100644
--- a/INSTALL
+++ b/INSTALL
@@ -6,6 +6,32 @@
How To Compile
+Curl has been compiled and built on numerous different operating systems. The
+procedure is mainly divided into two different ways: the unix way or the
+windows way.
+
+If you're using Windows (95, 98, NT) or OS/2, you should continue reading from
+the Win32 header below. All other systems should be capable of being installed
+as described under the UNIX header.
+
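+In short, the unix way usually boils down to the common three-step sequence
+below (assuming the configure script is present in your curl source tree);
+the details are found under the UNIX header:
+
+  ./configure
+  make
+  make install
+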
+PORTS
+=====
+ Just to show off, this is a probably incomplete list of known hardware and
+ operating systems that curl has been compiled for:
+
+ Sparc Solaris 2.4, 2.5, 2.5.1, 2.6, 7
+ Sparc SunOS 4.1.*
+ i386 Linux 1.3, 2.0, 2.2
+ MIPS IRIX
+ HP-PA HP-UX
+ PowerPC Mac OS X
+ - Ultrix
+ i386 OpenBSD
+ m68k OpenBSD
+ i386 Windows 95, 98, NT
+ i386 OS/2
+ m68k AmigaOS 3
+
UNIX
====
diff --git a/README b/README
index ccd538074..5abbf3205 100644
--- a/README
+++ b/README
@@ -26,3 +26,19 @@ README
Sweden -- ftp://ftp.sunet.se/pub/www/utilities/curl/
Germany -- ftp://ftp.fu-berlin.de/pub/unix/network/curl/
China -- http://www.pshowing.com/curl/
+
+ To download the very latest source off the CVS server do this:
+
+ cvs -d :pserver:cvs@curl.sourceforge.net/curl login
+
+ (just press enter when asked for password)
+
+ cvs -d :pserver:cvs@curl.sourceforge.net/curl co .
+
+ (now, you'll get all the latest sources downloaded into your current
+ directory. Note that this does not create a directory named curl or
+ anything)
+
+ cvs -d :pserver:cvs@curl.sourceforge.net/curl logout
+
+ (you're off the hook!)
diff --git a/README.curl b/README.curl
index 3daca8caa..7cddbca6c 100644
--- a/README.curl
+++ b/README.curl
@@ -122,33 +122,37 @@ UPLOADING
FTP
- Upload all data on stdin to a specified ftp site:
+ Upload all data on stdin to a specified ftp site:
curl -t ftp://ftp.upload.com/myfile
- Upload data from a specified file, login with user and password:
+ Upload data from a specified file, login with user and password:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile
- Upload a local file to the remote site, and use the local file name remote
- too:
+ Upload a local file to the remote site, and use the local file name remote
+ too:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/
- NOTE: Curl is not currently supporing ftp upload through a proxy! The reason
- for this is simply that proxies are seldomly configured to allow this and
- that no author has supplied code that makes it possible!
+ Upload a local file to get appended to the remote file using ftp:
+
+ curl -T localfile -a ftp://ftp.upload.com/remotefile
+
+ NOTE: Curl does not support ftp upload through a proxy! The reason for this
+ is simply that proxies are seldom configured to allow this and that no
+ author has supplied code that makes it possible!
HTTP
- Upload all data on stdin to a specified http site:
+ Upload all data on stdin to a specified http site:
curl -t http://www.upload.com/myfile
- Note that the http server must've been configured to accept PUT before this
- can be done successfully.
+ Note that the http server must've been configured to accept PUT before this
+ can be done successfully.
- For other ways to do http data upload, see the POST section below.
+ For other ways to do http data upload, see the POST section below.
VERBOSE / DEBUG
@@ -457,9 +461,9 @@ FTP and firewalls
HTTPS
- Secure HTTP requires SSLeay to be installed and used when curl is built. If
- that is done, curl is capable of retrieving and posting documents using the
- HTTPS procotol.
+ Secure HTTP requires SSL libraries to be installed and used when curl is
+ built. If that is done, curl is capable of retrieving and posting documents
+ using the HTTPS protocol.
Example:
@@ -472,9 +476,10 @@ HTTPS
browsers (Netscape and MSEI both use the so called PKCS#12 format). If you
want curl to use the certificates you use with your (favourite) browser, you
may need to download/compile a converter that can convert your browser's
- formatted certificates to PEM formatted ones. Dr Stephen N. Henson has
- written a patch for SSLeay that adds this functionality. You can get his
- patch (that requires an SSLeay installation) from his site at:
+ formatted certificates to PEM formatted ones. This kind of converter is
+ included in recent versions of OpenSSL, and for older versions Dr Stephen
+ N. Henson has written a patch for SSLeay that adds this functionality. You
+ can get his patch (that requires an SSLeay installation) from his site at:
http://www.drh-consultancy.demon.co.uk/
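+ With a recent OpenSSL installed, converting an exported browser certificate
+ could look something like the line below (the file names are made up for
+ this example):
+
+ openssl pkcs12 -in browsercert.p12 -out browsercert.pem -clcerts
+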
Example on how to automatically retrieve a document using a certificate with
@@ -601,6 +606,34 @@ ENVIRONMENT VARIABLES
The usage of the -x/--proxy flag overrides the environment variables.
+NETRC
+
+ Unix introduced the .netrc concept a long time ago. It is a way for a user
+ to specify name and password for commonly visited ftp sites in a file so
+ that you don't have to type them in each time you visit those sites. You
+ realize this is a big security risk if someone else gets hold of your
+ passwords, so therefore most unix programs won't read this file unless it is
+ only readable by yourself (curl doesn't care though).
+
+ Curl supports .netrc files if told to (using the -n/--netrc option). This is
+ not restricted to ftp only; curl can use it for all protocols where
+ authentication is used.
+
+ A very simple .netrc file could look something like:
+
+ machine curl.haxx.nu login iamdaniel password mysecret
+
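+ With such a file in place, a transfer to that machine no longer needs the
+ name and password on the command line; for instance (reusing the example
+ machine above, assuming it accepts logins this way):
+
+ curl -n ftp://curl.haxx.nu/
+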
+CUSTOM OUTPUT
+
+ To better allow script programmers to find out how a curl transfer went, the
+ -w/--write-out option was introduced. Using it, you can specify what
+ information from the previous transfer you want to extract.
+
+ To display the number of bytes downloaded together with some text and an
+ ending newline:
+
+ curl -w 'We downloaded %{size_download} bytes\n' www.download.com
+
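+ In a shell script, that output can be captured in a variable, for example
+ like this sketch (using -s and -o to keep the regular output out of the
+ way):
+
+ size=`curl -s -w '%{size_download}' -o saved.html www.download.com`
+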
MAILING LIST
We have an open mailing list to discuss curl, its development and things
diff --git a/TODO b/TODO
index 63a3ab71c..2520cda57 100644
--- a/TODO
+++ b/TODO
@@ -24,18 +24,17 @@ TODO
* HTTP Pipelining/persistant connections
- - I'm gonna introduce HTTP "pipelining". Curl should be able
- to request for several HTTP documents in one connect. It is the beginning
- for supporing more advanced functions in the future, like web site
+ - We should introduce HTTP "pipelining". Curl could be able to request
+ several HTTP documents over one connection. It would be the beginning of
+ supporting more advanced functions in the future, like web site
mirroring. This will require that the urlget() function supports several
documents from a single HTTP server, which it doesn't today.
- - When curl supports fetching several documents from the same
- server using pipelining, I'd like to offer that function to the command
- line. Anyone has a good idea how? The current way of specifying one URL
- with the output sent to the stdout or a file gets in the way. Imagine a
- syntax that supports "additional documents from the same server" in a way
- similar to:
+ - When curl supports fetching several documents from the same server using
+ pipelining, I'd like to offer that function to the command line. Does anyone
+ have a good idea how? The current way of specifying one URL with the output
+ sent to stdout or a file gets in the way. Imagine a syntax that supports
+ "additional documents from the same server" in a way similar to:
curl <main URL> --more-doc <path> --more-doc <path>
@@ -52,12 +51,11 @@ TODO
And some friendly person's server source code is available at
http://hopf.math.nwu.edu/digestauth/index.html
- Then there's the Apache mod_digest source code too of course.
- It seems as if Netscape doesn't support this, and not many servers
- do. Although this is a lot better authentication method than the more
- common "Basic". Basic sends the password in cleartext over the network,
- this "Digest" method uses a challange-response protocol which increases
- security quite a lot.
+ Then there's the Apache mod_digest source code too of course. It seems as
+ if Netscape doesn't support this, and not many servers do, although this is
+ a much better authentication method than the more common "Basic". Basic
+ sends the password in cleartext over the network, while this "Digest" method
+ uses a challenge-response protocol which increases security quite a lot.
* Different FTP Upload Through Web Proxy
I don't know any web proxies that allow CONNECT through on port 21, but
@@ -88,3 +86,8 @@ TODO
(http://search.ietf.org/internet-drafts/draft-murray-auth-ftp-ssl-05.txt)
* HTTP POST resume using Range:
+
+ * Make curl capable of verifying the server's certificate when connecting
+ with HTTPS://.
+
+ * Make the timeout work as expected!