-------------------------------------------------------------------------------------
     / _ \
   \_\(_)/_/     JOHLEM.net
    _//"\\_      https://johlem.net/V1/topics/cheatsheet.php
     /   \
-------------------------------------------------------------------------------------

--- cURL

cURL is powered by a library: libcurl. This means you can write entire programs based on cURL.

cURL vs wget
======================================================================================================
wget's major strength compared to curl is its ability to download recursively.
wget is command line only; there is no library behind it, whereas curl's features are powered by libcurl.
curl supports FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, LDAP, LDAPS, FILE, POP3, IMAP, SMTP, RTMP and RTSP.
wget supports HTTP, HTTPS and FTP.
curl builds and runs on more platforms than wget.
wget is released under a free software copyleft license (the GNU GPL). curl is released under a free software permissive license (an MIT derivative).
curl offers upload and sending capabilities. wget only offers plain HTTP POST support.

== They were made for different purposes
wget is a tool to download files from servers.
curl is a tool that lets you exchange requests/responses with a server.

wget
Wget solely lets you download files from an HTTP/HTTPS or FTP server. You give it a link and it automatically downloads the file the link points to. It builds the request automatically.

curl
Curl, in contrast to wget, lets you build the request as you wish. Combine that with the plethora of protocols supported - FTP, FTPS, Gopher, HTTP, HTTPS, SCP, SFTP, TFTP, Telnet, DICT, LDAP, LDAPS, IMAP, POP3, SMTP, RTSP and FILE - and you get an amazing debugging tool (for testing protocols, testing server configurations, etc.).
As many already mentioned, you can download a file with curl. True, but that is just an "extra". In practice, use curl when you want to download a file via a protocol that wget doesn't support.
======================================================================================================

COMMANDS:

#Download a page to a file
curl https://example.com -o example.txt

#Download and upload using FTP
wget --user=abhi --password='myPassword' ftp://abc.com/hello.pdf
curl -u abhi:myPassword 'ftp://abc.com/hello.pdf' -o hello.pdf

#Upload files to an FTP server with curl. For this, we can use the -T parameter
curl -T "img.png" ftp://ftp.example.com/upload/

#Recursive download: use wget (recursive download is not supported by cURL)
wget can follow links in HTML, XHTML, and CSS pages, to create local versions of remote web sites, fully recreating the directory structure of the original site.
wget --recursive http://example.com
In the case of HTTP or HTTPS URLs, wget scans and parses the HTML or CSS. Then, it retrieves the files the document refers to, through markup like href or src.
By default, wget honours robots.txt (the Robot Exclusion Standard) and skips the paths it disallows. To switch this off, we can use the -e parameter:
wget -e robots=off http://example.com

#Upload several files at once with -T
curl -T "{file1,file2}" http://www.uploadtothissite.com
curl -T "img[1-1000].png" ftp://ftp.picturemania.com/upload/

#To send your password file to the server, where 'password' is the name of the form-field to which /etc/passwd will be the input:
$ curl -F password=@/etc/passwd www.mypasswords.com
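
#The -F option above sends multipart form data. For raw request bodies (e.g. JSON APIs) you can combine -d with -H to set the Content-Type.
#A minimal sketch, assuming a hypothetical https://api.example.com endpoint (-d implies a POST request):
$ curl -d '{"name":"value"}' -H "Content-Type: application/json" https://api.example.com/items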

#To retrieve a web page and display it in the terminal
$ curl http://www.tutorialspoint.com

#To retrieve a web page and display header information
$ curl http://www.example.com -i

#To retrieve a web page and save it to a file
$ curl http://www.tutorialspoint.com -o tutorialspoint.html

#To retrieve a web page, or its redirected target
$ curl www.tutorialspoint.com/unix/
$ curl www.tutorialspoint.com/unix/ --location

#To limit the rate of data transfer to 1 kilobyte per second
$ curl http://www.tutorialspoint.com/unix/ --limit-rate 1k -o unix.html

#To download via a proxy server
$ curl -x proxy.example.com:3128 http://www.tutorialspoint.com/unix/

#Recursive download limited to one level, without ascending to the parent directory
wget --level=1 --recursive --no-parent http://example.com
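
#Both tools can also resume an interrupted transfer, which pairs well with the rate-limited and proxied downloads above.
#A minimal sketch, assuming a placeholder URL (curl: -C - detects the offset automatically, -O keeps the remote file name; wget: -c / --continue):
$ curl -C - -O http://www.example.com/big-file.iso
$ wget -c http://www.example.com/big-file.iso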