Previously just ipfs://<cid> and ipns://<cid> were supported, which is
too strict for some use cases.
This patch allows paths and query arguments to be used as well, making
these URLs work according to normal HTTP semantics:
ipfs://<cid>/foo/bar?key=val
ipns://<cid>/foo/bar?key=val
The gateway URL support has changed. It now only supports gateways of
the form:
http://<gateway>/foo/bar
http://<gateway>
Query arguments in the gateway URL are explicitly not allowed and trigger
an intentional malformed-URL error.
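For illustration, assuming (per the curl IPFS documentation) that the
gateway is taken from the IPFS_GATEWAY environment variable, a fetch with
a path and query could look like:
IPFS_GATEWAY="http://127.0.0.1:8080" curl "ipfs://<cid>/foo/bar?key=val"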
There was also a crash when IPFS_PATH was set to a path without a trailing
forward slash. This has been fixed.
Lastly, a load of test cases have been added to verify the above.
Reported-by: Steven Allen
Fixes #12148
Closes #12152
Finding a 'Content-Range:' in the response changed the handling.
Add test case 1475 to verify -C - with a 416 response and a Content-Range:
header; it is almost exactly like test 194, which instead uses a fixed -C
offset. Test 194 was adjusted to also be considered fine.
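For reference, the resumption scenario these tests exercise is a plain
-C - transfer (hypothetical URL):
curl -C - -o file.bin https://example.com/file.bin
where a server may answer 416 with a Content-Range: header when the local
file is already complete.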
Fixes #10521
Reported-by: Smackd0wn
Fixes #12174
Reported-by: Anubhav Rai
Closes #12176
Prior to this change the state machine attempted to change the remote
resolve to a local resolve if the hostname was longer than 255
characters. Unfortunately that did not work as intended and caused a
security issue.
Bug: https://curl.se/docs/CVE-2023-38545.html
- ipfs://<cid>
- ipns://<cid>
This allows you to use IPFS in curl like:
curl ipfs://<cid>
and
curl ipns://<cid>
For more information consult the readme at:
https://curl.se/docs/ipfs.html
Closes #8805
- Use CERT_CONTEXT's pbCertEncoded to determine chain order.
CERT_CONTEXT from SECPKG_ATTR_REMOTE_CERT_CONTEXT contains the
end-entity/server certificate in pbCertEncoded. We can use this pointer
to determine the order of certificates when enumerating hCertStore using
CertEnumCertificatesInStore.
This change is to help ensure that the ordering of the certificate chain
requested by the user via CURLINFO_CERTINFO has the same ordering on all
versions of Windows.
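A rough sketch of the idea (not curl's actual code; error handling and
context freeing omitted) that locates the end-entity certificate in
hCertStore by matching its encoded bytes against pbCertEncoded:
#include <windows.h>
#include <wincrypt.h>
#include <string.h>
/* 'remote' is the CERT_CONTEXT from SECPKG_ATTR_REMOTE_CERT_CONTEXT */
static PCCERT_CONTEXT find_server_cert(PCCERT_CONTEXT remote)
{
  PCCERT_CONTEXT cur = NULL;
  while((cur = CertEnumCertificatesInStore(remote->hCertStore, cur))) {
    /* the store entry whose encoding equals pbCertEncoded is the server cert */
    if(cur->cbCertEncoded == remote->cbCertEncoded &&
       !memcmp(cur->pbCertEncoded, remote->pbCertEncoded, cur->cbCertEncoded))
      return cur;
  }
  return NULL;
}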
Prior to this change Schannel certificate order was reversed in 8986df80
but that was later reverted in f540a39b when it was discovered that
Windows 11 22H2 does the reversal on its own.
Ref: https://github.com/curl/curl/issues/9706
Closes https://github.com/curl/curl/pull/11632
- Error on missing input file for --data, --data-binary,
--data-urlencode, --header, --variable, --write-out.
Prior to this change if a user of the curl tool specified an input file
for one of the above options and that file could not be opened then it
would be treated as zero length data instead of an error. For example, a
POST using `--data @filenametypo` would cause a zero length POST which
is probably not what the user intended.
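For example, using the hypothetical file name from above, this now fails
with an error instead of silently sending a zero-length POST:
curl --data @filenametypo https://example.com/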
Closes https://github.com/curl/curl/pull/11677
Store numerical IPv6 addresses in the alt-svc file with the brackets
present.
Verify with tests 437 and 438.
Fixes #11737
Reported-by: oliverpool on github
Closes #11743
In this situation, only part of the data has been sent before aborting
so the connection is no longer usable.
Assisted-by: Jay Satiro
Fixes #11678
Closes #11679
Expansions whose output starts with a NUL byte were being expanded to the
empty string instead of being recognised as values that contain a NUL
byte, which should trigger an error.
Closes #11694
This test runs a perl script that checks all string options are properly
translated by the OS400 character code conversion wrapper. It also
verifies these options are listed in alphanumeric order in the wrapper
switch statement.
Closes #11650
This is a bad header fold, but since the popular browsers accept this
violation, curl now does too. Unless built with hyper.
Add test 1473 to verify, and adjust test 2306.
Reported-by: junsik on github
Fixes #11605
Closes #11607
Add `test1279` to verify that `libcurl.def` lists all exported API
functions found in libcurl headers.
Also:
- extend test suite XML `stdout` tag with the `loadfile` attribute.
- fix `tests/extern-scan.pl` and `test1135` to include websocket API.
- use all headers (sorted) in `test1135` instead of a manual list.
- add options `--sort`, `--heading=` to `tests/extern-scan.pl`.
- add `libcurl.def` to the auto-labeler GHA task.
Follow-up to 2ebc74c36a
Closes #11570
It can be used multiple times. Use %output{>>name} to append.
Add docs. Test 990 and 991 verify.
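A hypothetical use of the %output{} instruction in --write-out described
here, writing the response code to one file and appending the total time
to another:
curl -w '%output{code.txt}%{response_code}%output{>>times.txt}%{time_total}\n' \
  https://example.com/ -o /dev/null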
Idea: #11400
Suggested-by: ed0d2b2ce19451f2
Closes #11416
Add support for command line variables. Set variables with --variable
name=content or --variable name@file (where "file" can be stdin if set
to a single dash (-)).
Variable content is expanded in option parameters using "{{name}}"
(without the quotes) if the option name is prefixed with
"--expand-". This gets the contents of the variable "name" inserted, or
a blank if the name does not exist as a variable. Insert "{{" verbatim
in the string by prefixing it with a backslash, like "\\{{".
Import an environment variable with --variable %name. It makes curl exit
with an error if the environment variable is not set. It can instead get a
default value if the variable does not exist, using =content or @file as
shown above.
Example: get the USER environment variable into the URL:
--variable %USER
--expand-url = "https://example.com/api/{{USER}}/method"
When expanding variables, curl supports a set of functions that can make
the variable contents more convenient to use. It can trim leading and
trailing white space with "trim", output the contents as a JSON quoted
string with "json", URL encode it with "url" and base 64 encode it with
"b64". To apply functions to a variable expansion, add them colon
separated to the right side of the variable. They are then performed in
a left to right order.
Example: get the contents of a file called $HOME/.secret into a variable
called "fix". Make sure that the content is trimmed and percent-encoded
sent as POST data:
--variable %HOME=/home/default
--expand-variable fix@{{HOME}}/.secret
--expand-data "{{fix:trim:url}}"
https://example.com/
Documented. Many new test cases.
Co-brainstormed-by: Emanuele Torre
Assisted-by: Jay Satiro
Closes #11346
Make sure the user and password for the second request are taken from the
redirected-to URL.
Add test case 899 to verify.
Reported-by: James Lucas
Fixes #11410
Closes #11412
- Implement AUTH=+LOGIN for CURLOPT_LOGIN_OPTIONS to prefer plaintext
LOGIN over SASL auth.
Prior to this change there was no method to be able to fall back to
LOGIN if an IMAP server advertises SASL capabilities. However, this may
be desirable for e.g. a misconfigured server.
Per: https://www.ietf.org/rfc/rfc5092.html#section-3.2
";AUTH=<enc-auth-type>" looks to be the correct way to specify what
authentication method to use, regardless of SASL or not.
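A minimal sketch of how an application might opt in to this behavior
(server name and credentials are placeholders):
#include <curl/curl.h>
int main(void)
{
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "imaps://imap.example.com/INBOX");
    curl_easy_setopt(curl, CURLOPT_USERNAME, "user");
    curl_easy_setopt(curl, CURLOPT_PASSWORD, "secret");
    /* prefer plaintext LOGIN even when the server advertises SASL */
    curl_easy_setopt(curl, CURLOPT_LOGIN_OPTIONS, "AUTH=+LOGIN");
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }
  return 0;
}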
Closes https://github.com/curl/curl/pull/10041
... and have curl trim the end when it reaches the expected total amount
of bytes instead of over-sending.
Reported-by: JustAnotherArchivist on github
Closes #11223
libcurl assumes that a --continue-at resumption is done to continue an
upload using the read callback and neither --data nor --form use
that and thus won't do what the user wants, whatever that might be with
this strange combination.
Add test 426 to verify.
Reported-by: Smackd0wn on github
Fixes #11081
Closes #11083
- with `--proxy-http2` allow h2 ALPN negotiation to
forward proxies
- applies to http: requests against an https: proxy only,
as https: requests will auto-tunnel
- adding an HTTP/1 request parser in http1.c
- removed h2h3.c
- using new request parser in nghttp2 and all h3 backends
- adding test 2603 for request parser
- adding h2 proxy test cases to test_10_*
scorecard.py: request scoring accidentally always ran curl
with '-v'. Removed that; expect double the numbers.
labeller: added http1.* and h2-proxy sources to detection
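For illustration (hypothetical proxy host), a plain http: request can now
negotiate HTTP/2 towards an https: forward proxy like this:
curl --proxy-http2 -x https://proxy.example.com:3128 http://example.com/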
Closes #10967
Output specific components from the used URL. The following variables
are added for this purpose:
url.scheme, url.user, url.password, url.options, url.host, url.port,
url.path, url.query, url.fragment, url.zoneid
Add the following for outputting parts of the "effective URL":
urle.scheme, urle.user, urle.password, urle.options, urle.host, urle.port,
urle.path, urle.query, urle.fragment, urle.zoneid
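For example (hypothetical URL), printing a few of the new components:
curl -o /dev/null -w '%{url.scheme} %{url.host} %{url.path} %{url.query}\n' \
  'https://example.com/foo/bar?key=val'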
Added tests 423 and 424 to verify.
Closes #10853
RFC 7686 states that:
> Applications that do not implement the Tor
> protocol SHOULD generate an error upon the use of .onion and
> SHOULD NOT perform a DNS lookup.
Let's do that.
https://www.rfc-editor.org/rfc/rfc7686#section-2
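In practice a request like the following (hypothetical name) now fails
with a resolve error instead of sending the .onion name to DNS:
curl http://example.onion/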
Add tests 1471 and 1472 to verify.
Fixes #543
Closes #10705
Adding `bufq`:
- at init() time configured to hold up to `n` chunks of `m` bytes each.
- various methods for reading from and writing to it.
- `peek` support to get access to buffered data without copy
- `pass` support to allow buffer flushing on write if it becomes full
- use case: IO buffers for dynamic reads and writes that do not blow up
- distinct from `dynbuf` in that:
- it maintains a read position
- writes on a full bufq return CURLE_AGAIN instead of nuking itself
- Init options:
- SOFT_LIMIT: allow writes into a full bufq
- NO_SPARES: free empty chunks right away
- a `bufc_pool` that can keep a number of spare chunks to
be shared between different `bufq` instances
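A rough usage sketch of the intent; function names and signatures are
approximated from this description and may not match bufq.h exactly:
/* internal-style sketch (would need curl's private headers) */
static void bufq_demo(void)
{
  struct bufq q;
  CURLcode result = CURLE_OK;
  ssize_t n;
  unsigned char out[256];
  Curl_bufq_init(&q, 1024, 4);    /* chunks of 1024 bytes, 4 chunks max */
  n = Curl_bufq_write(&q, (const unsigned char *)"hello", 5, &result);
  /* on a full bufq this returns -1 with result set to CURLE_AGAIN */
  n = Curl_bufq_read(&q, out, sizeof(out), &result);
  /* reading advances the maintained read position */
  (void)n;
  Curl_bufq_free(&q);
}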
Adding `dynhds`:
- a straightforward list of name+value pairs as used for HTTP headers
- headers can be appended dynamically
- headers can be removed again
- headers can be replaced
- headers can be looked up
- http/1.1 formatting into a `dynbuf`
- configured at init() with limits on header counts and total string
sizes
- use case: pass an HTTP request or response around without being version
specific
- express an HTTP request without a curl easy handle (used in h2 proxy
tunnels)
- future extension possibilities:
- conversions of `dynhds` to nghttp2/nghttp3 name+value arrays
Closes #10720
All S3 requests now default to UNSIGNED-PAYLOAD and add the required
x-amz-content-sha256 header. This allows CURLAUTH_AWS_SIGV4 to correctly
sign S3 requests to Amazon with no additional configuration.
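For example, a command like this (hypothetical bucket, region and
credentials) now signs S3 requests without extra configuration:
curl --aws-sigv4 "aws:amz:us-east-1:s3" --user "ACCESS_KEY:SECRET_KEY" \
  https://examplebucket.s3.amazonaws.com/object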
Signed-off-by: Casey Bodley <cbodley@redhat.com>
Closes #9995
When returned from the CURLOPT_SOCKOPTFUNCTION callback, as when the
application has a custom, already connected socket passed in to libcurl.
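This appears to concern the CURL_SOCKOPT_ALREADY_CONNECTED return value
(an assumption, since the value is not named above). A minimal sketch of
such a callback:
#include <curl/curl.h>
/* tell libcurl the socket is already connected, e.g. because the app
   created and connected it itself */
static int sockopt_cb(void *clientp, curl_socket_t fd, curlsocktype purpose)
{
  (void)clientp;
  (void)fd;
  (void)purpose;
  return CURL_SOCKOPT_ALREADY_CONNECTED;
}
/* installed with curl_easy_setopt(curl, CURLOPT_SOCKOPTFUNCTION, sockopt_cb); */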
Verifies the fix in #10648
Closes #10651