
Tag: https

How to download URLs from a website and save them to a file (wget, curl)?

How can I use wget to extract the marked links from this page? Can this be done with curl? I want to download the URLs from this page and save them to a file. I tried this: wget -r -p -k https://polsatboxgo.pl/wideo/seriale/pierwsza-milosc/5027238/sezon-44/5027472/pierwsza-milosc-odcinek-2984/585ddf5a3dde69cb58c7f42ba52790a4 Link Gopher separated out the addresses. EDIT: How can I download the addresses to a file from the terminal? Can it …
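
A minimal sketch of one way to do this from the terminal, assuming the links of interest can be matched with a simple href pattern; the output filename links.txt is illustrative:

    # fetch the page and extract every href target, one per line, de-duplicated
    curl -s "https://polsatboxgo.pl/wideo/seriale/pierwsza-milosc/5027238/sezon-44/5027472/pierwsza-milosc-odcinek-2984/585ddf5a3dde69cb58c7f42ba52790a4" \
      | grep -oP 'href="\K[^"]+' \
      | sort -u > links.txt

The same pipeline works with wget by writing the page to stdout (wget -qO- URL). Note that relative links would still need to be prefixed with the site's base URL before they can be downloaded.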

Bind Monit to use Port 443 [closed]

Closed. This question does not meet Stack Overflow guidelines and is not currently accepting answers. It does not appear to be about programming within the scope defined in the help center. Closed 4 years ago. I'm using Monit with this config: I can't change the port to 443 or 80, but I just want to use …
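
Monit's built-in web interface is configured with the "set httpd" statement in monitrc. A minimal sketch of binding it to port 443 follows; the allow credentials are illustrative, and binding to a port below 1024 requires Monit to run as root:

    # /etc/monit/monitrc -- bind the web interface to port 443 on all interfaces
    set httpd port 443 and
        use address 0.0.0.0
        allow admin:monit    # illustrative username:password

Serving the interface over actual TLS additionally requires Monit's SSL/pemfile options, whose exact syntax depends on the Monit version in use.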

curl command-line API change on Debian 9 regarding HTTPS

Is there any command-line API change in curl on Debian 9? I recently moved to Debian 9 (9.4, from Debian 8.x) and a script involving curl stopped working. I connect to the internet through a Squid proxy on localhost that is chained to a parent proxy. My environment variables are configured like this. When I use wget, it works; when I use …
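
The script from the question is not shown, but a typical setup for a local Squid looks like the sketch below (the address and port are assumptions). One change in the curl shipped with Debian 9 (7.52.x) that can matter here is its support for HTTPS proxies: an https:// scheme in the proxy variables is now treated as a TLS connection to the proxy itself, which a plain Squid listener will refuse.

    # proxy variables for a local Squid on port 3128 (values are illustrative)
    export http_proxy=http://127.0.0.1:3128
    export https_proxy=http://127.0.0.1:3128   # keep the http:// scheme even for HTTPS targets

    # the proxy can also be passed to curl explicitly, bypassing the environment
    curl -x http://127.0.0.1:3128 -v https://www.debian.org/

Comparing curl -v output with and without the explicit -x proxy option usually shows quickly whether the environment variables are the part that changed behaviour.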

How to force HTTP to HTTPS with an exception?

So I had to force all HTTP connections to go to HTTPS for a specific folder (e.g. "public_html/folder1"). I found a tutorial somewhere and used the following in an .htaccess file: The problem is, I have a folder inside folder1 that should remain accessible via HTTP, not just HTTPS. How do I make an exception?
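
A minimal sketch of one way to write the exception with mod_rewrite, placed in public_html/folder1/.htaccess; the subfolder name folder2 is illustrative:

    RewriteEngine On
    # leave the excepted subfolder reachable over plain HTTP
    RewriteCond %{REQUEST_URI} !^/folder1/folder2/
    # everything else under folder1 gets redirected to HTTPS
    RewriteCond %{HTTPS} off
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

Because RewriteCond lines are ANDed by default, the redirect only fires for requests that are both outside the excepted subfolder and not already on HTTPS; a tutorial rule that tests %{SERVER_PORT} 80 instead of %{HTTPS} can be given the same extra condition.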
