
Is wget or similar programs always available on POSIX systems?

Is there an HTTP client like wget/lynx/GET that is distributed by default in POSIX or *nix operating systems that could be used for maximum portability?

I know most systems have wget or lynx installed, but I seem to remember installing some Ubuntu server systems using default settings, and they had neither wget nor lynx in the base package.

I am writing a shell script for Linux (and probably Mac) to install a piece of software onto the computer. To avoid having to distribute a couple of large files, I would like to fetch them from the internet instead of packaging them with the installer. Currently, the install script is to be distributed as a single file created with Makeself.

I’d like to avoid having the install script be over 100 MB, which it would be if the files were included; the files also may not be needed if the person is upgrading or re-installing the software. Then again, maybe the most portable thing to do is include the files in the package.

Right now I am thinking of having the script check for wget, lynx, and GET, in that order, and use whichever one it finds for downloading, but I could avoid this altogether if there were a single way to download the files that worked on all systems.
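The probing approach described above can be sketched roughly like this. The function name `download_url` and the exact flags are illustrative, not canonical:

```shell
#!/bin/sh
# Sketch: try wget, then lynx, then GET (lwp-request), using whichever
# tool is on PATH.  Flags and the fallback order mirror the question.
download_url() {
    url=$1
    out=$2
    if command -v wget >/dev/null 2>&1; then
        wget -q -O "$out" "$url"
    elif command -v lynx >/dev/null 2>&1; then
        lynx -source "$url" > "$out"
    elif command -v GET >/dev/null 2>&1; then
        GET "$url" > "$out"
    else
        echo "error: no supported downloader found" >&2
        return 1
    fi
}
```

`command -v` is specified by POSIX, so the detection itself is portable even when the downloaders are not.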

EDIT:

Does anyone know much about lwp-request (GET) and its availability? It seems to be readily available on several of the systems I have checked so far, and I remember it always being around 10+ years ago, going back to Red Hat.


Answer

Edit on 2019-11-04: I’m rewriting my answer to reflect the importance of ensuring that a transfer isn’t tampered with in flight. I’ll leave my original answer below the rule.

I suggest using rsync over ssh to transfer your files. rsync’s interface may look overwhelming, but most users can get by with rsync -avzP, and if you need more flexibility, rsync can adapt. Running it over ssh provides integrity, authenticity, and privacy for the connection.
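A minimal sketch of that suggestion, wrapped in a hypothetical helper; the host and path in the comment are placeholders, and the source may be either a local path or an ssh remote:

```shell
#!/bin/sh
# Sketch: fetch files with rsync.  -a preserves permissions and times,
# -z compresses on the wire, -P prints progress and resumes partial
# transfers.  fetch_payload is an illustrative name, not a real tool.
fetch_payload() {
    src=$1
    dest=$2
    rsync -avzP "$src" "$dest"
}

# Typical remote use (placeholder host and path):
#   fetch_payload downloads.example.com:/srv/installer/payload.tar.gz .
```

The resumability of -P matters here: a 100 MB payload interrupted halfway does not have to start over.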

curl is the de facto standard for HTTP transfers; if plain http or https is preferred, curl or tools based on it are probably a good choice.
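If you go the curl route, the tamper-resistance mentioned above can come from verifying a published checksum after download. A hedged sketch, assuming GNU sha256sum (on a Mac you would substitute shasum -a 256); the function name is illustrative:

```shell
#!/bin/sh
# Sketch: download over https with curl, then refuse the file unless its
# SHA-256 matches a checksum published out of band.
fetch_verified() {
    url=$1
    expected=$2
    out=$3
    curl -fsSL -o "$out" "$url" || return 1
    actual=$(sha256sum "$out" | awk '{print $1}')
    [ "$actual" = "$expected" ]
}
```

-f makes curl fail on HTTP errors instead of saving an error page, and -L follows redirects, both of which matter for an unattended installer.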


In my experience, tools are available roughly in this order:

  • wget
  • curl
  • sftp
  • ftp
  • GET (from the lwp-request suite; I use HEAD all the time and often forget it is just one tool in the suite)
  • tftp
  • nc (not as common as I wish)
  • socat (even less common)

The bash /dev/tcp facility is available on most systems I’ve used (though some ship dash or pdksh as the default shell instead), but scripting HTTP by hand with bash, nc, or socat is going the long way around: you’ll have to handle the HTTP headers yourself, which reduces its elegance.
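To make the header problem concrete: a raw HTTP/1.x response arrives as status line and headers, then a blank (CRLF) line, then the body, and a raw socket hands you all of it. One portable sketch is to delete everything through the first blank line with sed:

```shell
#!/bin/sh
# Sketch: strip the header block from a raw HTTP/1.x response read from
# stdin, leaving only the body.  The CR is embedded literally because
# not every sed understands \r in a pattern.
strip_http_headers() {
    cr=$(printf '\r')
    sed "1,/^${cr}\{0,1\}\$/d"
}

# Paired with a raw socket it would look something like (bash only):
#   exec 3<>/dev/tcp/host/80
#   printf 'GET /file HTTP/1.0\r\nHost: host\r\n\r\n' >&3
#   strip_http_headers <&3 > file
```

Even this ignores redirects, chunked encoding, and error statuses, which is exactly why a real client like curl or wget is preferable when available.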

User contributions licensed under: CC BY-SA