author    Daniel Gustafsson <daniel@yesql.se>  2020-09-30 21:05:14 +0200
committer Daniel Gustafsson <daniel@yesql.se>  2020-09-30 21:05:14 +0200
commit    021f2c25fd7fbe2887ed156354f8919c0d06a412 (patch)
tree      ba79d666e1e7f70dde873e3fc2ab435078725523
parent    025b20971c0cc5c8df4e773c79af9746e024d2df (diff)
download  curl-021f2c25fd7fbe2887ed156354f8919c0d06a412.tar.gz
MANUAL: update examples to resolve without redirects
www.netscape.com is redirecting to a cookie consent form on Aol, and
cool.haxx.se isn't responding to FTP anymore. Replace them with examples
that resolve, in case users try out the commands while reading the manual.

Closes #6024
Reviewed-by: Daniel Stenberg <daniel@haxx.se>
Reviewed-by: Emil Engler <me@emilengler.com>
-rw-r--r--  docs/MANUAL.md  16
1 file changed, 8 insertions(+), 8 deletions(-)
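The rationale above — preferring hosts that answer directly over ones that bounce through a redirect — can be spot-checked from the shell before swapping a URL into the manual. A minimal sketch, assuming only that `curl` is on `PATH`; the `classify` and `check_url` helper names and their verdict labels are hypothetical, not part of curl:

```shell
#!/bin/sh
# classify: map an HTTP status code to a short verdict (hypothetical helper)
classify() {
  case "$1" in
    30[0-9]) echo "redirect" ;;   # 3xx: the example would send readers somewhere else
    2??)     echo "ok" ;;         # 2xx: resolves directly, safe to use as an example
    *)       echo "error" ;;      # anything else: host down, path wrong, etc.
  esac
}

# check_url: fetch only the status code (-s silent, body discarded to /dev/null,
# -w prints curl's %{http_code} variable) and classify it
check_url() {
  classify "$(curl -s -o /dev/null -w '%{http_code}' "$1")"
}
```

Running `check_url` over each replacement URL in the patch would confirm it responds without a 3xx hop, the way www.netscape.com no longer does.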
diff --git a/docs/MANUAL.md b/docs/MANUAL.md
index 7063dbc8b..e4c7d7971 100644
--- a/docs/MANUAL.md
+++ b/docs/MANUAL.md
@@ -2,9 +2,9 @@
## Simple Usage
-Get the main page from Netscape's web-server:
+Get the main page from a web-server:
- curl http://www.netscape.com/
+ curl https://www.example.com/
Get the README file from the user's home directory at funet's ftp-server:
@@ -16,7 +16,7 @@ Get a web page from a server using port 8000:
Get a directory listing of an FTP site:
- curl ftp://cool.haxx.se/
+ curl ftp://ftp.funet.fi
Get the definition of curl from a dictionary:
@@ -24,7 +24,7 @@ Get the definition of curl from a dictionary:
Fetch two documents at once:
- curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/
+ curl ftp://ftp.funet.fi/ http://www.weirdserver.com:8000/
Get a file off an FTPS server:
@@ -61,13 +61,13 @@ Get a file from an SMB server:
Get a web page and store in a local file with a specific name:
- curl -o thatpage.html http://www.netscape.com/
+ curl -o thatpage.html http://www.example.com/
Get a web page and store in a local file, make the local file get the name of
the remote document (if no file name part is specified in the URL, this will
fail):
- curl -O http://www.netscape.com/index.html
+ curl -O http://www.example.com/index.html
Fetch two files and store them with their remote names:
@@ -657,11 +657,11 @@ Download with `PORT` but use 192.168.0.10 as our IP address to use:
Get a web page from a server using a specified port for the interface:
- curl --interface eth0:1 http://www.netscape.com/
+ curl --interface eth0:1 http://www.example.com/
or
- curl --interface 192.168.1.10 http://www.netscape.com/
+ curl --interface 192.168.1.10 http://www.example.com/
## HTTPS
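To audit the rest of MANUAL.md for stale hosts like these in one pass, the example URLs could be pulled out of the text first. A sketch assuming a POSIX shell with `grep -E` support; `extract_urls` is a hypothetical name, not a tool shipped with curl:

```shell
#!/bin/sh
# extract_urls: list each unique URL appearing in a manual-style text file
extract_urls() {
  grep -oE '(https?|ftps?|dict|smb)://[^ )]+' "$1" | sort -u
}

# Example: two sample lines in the style of the manual, then the extracted URLs
cat > /tmp/manual-sample.txt <<'EOF'
    curl https://www.example.com/
    curl ftp://ftp.funet.fi/ http://www.weirdserver.com:8000/
EOF
extract_urls /tmp/manual-sample.txt
# -> ftp://ftp.funet.fi/
#    http://www.weirdserver.com:8000/
#    https://www.example.com/
```

Piping that list through a status check (one curl per URL) is how a future sweep could catch the next www.netscape.com before a user does.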