When the total transferred amount (upload or download) for parallel
transfers was larger than 2^63/100 bytes (81 petabytes), the progress
percent counter would show the wrong value.
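The arithmetic behind the fix, in a minimal sketch (illustration only, not
the actual curl code): multiplying the byte counter by 100 before dividing
overflows a signed 64-bit value once the counter passes 2^63/100, so the
division has to come first when the numbers get that large.

```c
#include <stdint.h>

/* overflow-safe "part is what percent of total", for very large counters */
static int percent_of(int64_t part, int64_t total)
{
  if(total <= 0)
    return 0;
  if(part > INT64_MAX / 100)             /* part * 100 would overflow */
    return (int)(part / (total / 100));  /* so divide first instead */
  return (int)(part * 100 / total);
}
```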
Closes #16284
Sources used `lib/curlx.h` with both `ENABLE_CURLX_PRINTF` set and unset
before including it.
In a cmake "unity" batch where the first included source had it unset,
the next sources did not get the macros requested with
`ENABLE_CURLX_PRINTF` because `lib/curlx.h` had already been included
without them.
Fix it by making the macros enabled permanently and globally for
internal sources, and dropping `ENABLE_CURLX_PRINTF`.
This came up while testing unity builds with smaller batches. The full,
default unity build, where all of `src` is bundled into a single unit, was
not affected.
Fixes:
```
$ cmake -B build -DCMAKE_UNITY_BUILD=ON -DCMAKE_UNITY_BUILD_BATCH_SIZE=15
$ make -C build
...
curl/src/tool_getparam.c: In function ‘getparameter’:
curl/src/tool_getparam.c:2409:11: error: implicit declaration of function ‘msnprintf’; did you mean ‘vsnprintf’? [-Wimplicit-function-declaration]
2409 | msnprintf(buffer, sizeof(buffer), "%" CURL_FORMAT_CURL_OFF_T "-",
| ^~~~~~~~~
| vsnprintf
curl/src/tool_getparam.c:2409:11: warning: nested extern declaration of ‘msnprintf’ [-Wnested-externs]
[...]
```
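A self-contained sketch of the include-guard interaction (all file, guard
and macro names below are made up for illustration): in a unity batch the
sources are concatenated into one translation unit, so the second,
macro-requesting "inclusion" of the shared header is a no-op.

```c
#include <stdio.h>

/* first "inclusion" of the shared header: this source did not ask for the
   printf macros, so the guard gets defined without them */
#ifndef SKETCH_CURLX_H
#define SKETCH_CURLX_H
#ifdef ENABLE_SKETCH_PRINTF
#define msnprintf snprintf
#endif
#endif

/* second "inclusion": this source asks for the macros, but the include
   guard already fired above, so the request is silently ignored */
#define ENABLE_SKETCH_PRINTF
#ifndef SKETCH_CURLX_H
#define SKETCH_CURLX_H
#ifdef ENABLE_SKETCH_PRINTF
#define msnprintf snprintf
#endif
#endif

int main(void)
{
#ifdef msnprintf
  puts("msnprintf is mapped to snprintf");
#else
  puts("msnprintf never got defined in this translation unit");
#endif
  return 0;
}
```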
Reported-by: Daniel Stenberg
Bug: https://github.com/curl/curl/pull/14626#issuecomment-2301663491
Closes #14632
Based on the standards and guidelines we use for our documentation.
- expand contractions (they're => they are etc)
- host name => hostname
- file name => filename
- user name => username
- man page => manpage
- run-time => runtime
- set-up => setup
- back-end => backend
- a HTTP => an HTTP
- Two spaces after a period => one space after period
Closes #14073
Earlier this year we changed our own stderr variable to use the standard
name `stderr` (to avoid bugs where someone is using `stderr` instead of
the curl-tool specific variable). This solution needed to override the
standard `stderr` symbol via the preprocessor. This in turn did not play
well with unity builds and caused the curl tool to crash or stay silent due
to an uninitialized `stderr`. This was a hard-to-find issue, fixed by
manually breaking out one file from the unity sources.
To avoid these two tricks, this patch implements a different
solution: Restore using our own local variable for our stderr output and
leave `stderr` as-is. To avoid using `stderr` by mistake, add a
`checksrc` rule (based on logic we already used in lib for `strerror`)
that detects any `stderr` use in `src` and points to using our own
variable instead: `tool_stderr`.
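A rough sketch of the restored approach (simplified, not the actual curl
source): a dedicated global stream variable that defaults to the real
stderr, with all tool diagnostics routed through it explicitly.

```c
#include <stdio.h>

FILE *tool_stderr;              /* the tool's own error stream */

void tool_init_stderr(void)
{
  tool_stderr = stderr;         /* default to the real stderr */
}

/* tool code writes diagnostics through the variable, never to stderr
   directly, which is what the checksrc rule enforces */
void warnf_sketch(const char *msg)
{
  fputs(msg, tool_stderr);
}
```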
Follow-up to 06133d3e9b
Follow-up to 2f17a9b654
Closes #11958
- freopen stderr with the user-specified file (--stderr file) instead of
using a separate 'errors' stream.
- In tool_setup.h, override stdio.h's stderr macro as the global variable
tool_stderr.
Both freopen and overriding the stderr macro are necessary because if
the user-specified filename is "-" then stdout is assigned to
tool_stderr and no freopen takes place. See the PR for more information.
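A simplified sketch of that mechanism (assumed shape, not the exact curl
source): with `--stderr -` the variable is pointed at stdout and no freopen
happens, while any other filename makes the real stderr get reopened onto
that file.

```c
#include <stdio.h>
#include <string.h>

FILE *tool_stderr;

void tool_set_stderr_file(const char *filename)
{
  tool_stderr = stderr;               /* default: the real stderr */
  if(!filename)
    return;
  if(!strcmp(filename, "-")) {
    tool_stderr = stdout;             /* "-": send errors to stdout */
    return;
  }
  /* redirect the real stderr into the given file; tool_stderr keeps
     pointing at stderr, which now refers to the reopened file */
  if(!freopen(filename, "w", stderr))
    fprintf(tool_stderr, "warning: could not redirect stderr to %s\n",
            filename);
}

/* a shared header then adds the override described above, roughly:
 *   #define stderr tool_stderr
 * which is the preprocessor trick that later clashed with unity builds */
```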
Ref: https://github.com/curl/curl/issues/10491
Closes https://github.com/curl/curl/pull/10673
- they are mostly pointless in all major jurisdictions
- many big corporations and projects already don't use them
- saves us from pointless churn
- git keeps history for us
- the year range is kept in COPYING
checksrc is updated to allow non-year using copyright statements
Closes #10205
Add licensing and copyright information for all files in this repository. This
either happens in the file itself as a comment header or in the file
`.reuse/dep5`.
This commit also adds a Github workflow to check pull requests and adapts
copyright.pl to the changes.
Closes #8869
- Abort via progress callback to fail early during parallel transfers.
When a critical error occurs during a transfer (e.g. --fail-early
constraint) then other running transfers will be aborted via progress
callback and finish with error CURLE_ABORTED_BY_CALLBACK (42). In this
case, the callback error does not become the most recent error and a
custom error message is used for those transfers:
curl --fail --fail-early --parallel
https://httpbin.org/status/404 https://httpbin.org/delay/10
curl: (22) The requested URL returned error: 404
curl: (42) Transfer aborted due to critical error in another transfer
> echo %ERRORLEVEL%
22
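A minimal sketch of aborting a transfer from the progress callback (the
abort_all flag and the setup helper are made-up names): libcurl turns a
non-zero return from the CURLOPT_XFERINFOFUNCTION callback into
CURLE_ABORTED_BY_CALLBACK.

```c
#include <curl/curl.h>

static int abort_all;  /* set once another transfer hits a critical error */

static int xferinfo_cb(void *clientp, curl_off_t dltotal, curl_off_t dlnow,
                       curl_off_t ultotal, curl_off_t ulnow)
{
  (void)clientp; (void)dltotal; (void)dlnow; (void)ultotal; (void)ulnow;
  return abort_all ? 1 : 0;     /* non-zero aborts this transfer */
}

static void setup_abort_check(CURL *easy)
{
  curl_easy_setopt(easy, CURLOPT_XFERINFOFUNCTION, xferinfo_cb);
  curl_easy_setopt(easy, CURLOPT_NOPROGRESS, 0L);  /* enable the callback */
}

int main(void)
{
  CURL *easy = curl_easy_init();
  if(easy) {
    setup_abort_check(easy);
    curl_easy_setopt(easy, CURLOPT_URL, "https://example.com/");
    curl_easy_perform(easy);    /* returns CURLE_ABORTED_BY_CALLBACK (42)
                                   if abort_all gets set meanwhile */
    curl_easy_cleanup(easy);
  }
  return 0;
}
```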
Fixes https://github.com/curl/curl/issues/6939
Closes https://github.com/curl/curl/pull/6984
Make sure the total amount of DL/UL bytes is counted before the
transfer is finalized. Otherwise, if a transfer finishes too quickly, its
total numbers are not added, resulting in a DL%/UL% that goes above
100%.
Detail:
progress_meter() is called periodically, and it may not catch a
transfer's total bytes if the value was unknown during the last call,
and the transfer has finished and been deleted (i.e., lost) by the next
call.
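A sketch of the idea, with made-up names: fold a finished transfer's final
byte counters into the running grand totals before its state is released,
so the periodic meter cannot lose them.

```c
#include <curl/curl.h>

struct grand_totals {
  curl_off_t dl;
  curl_off_t ul;
};

static void transfer_done(CURL *easy, struct grand_totals *all)
{
  curl_off_t dl = 0, ul = 0;
  /* read the final counters before the handle and its state go away */
  curl_easy_getinfo(easy, CURLINFO_SIZE_DOWNLOAD_T, &dl);
  curl_easy_getinfo(easy, CURLINFO_SIZE_UPLOAD_T, &ul);
  all->dl += dl;
  all->ul += ul;
  curl_easy_cleanup(easy);   /* only now release the transfer */
}
```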
Closes https://github.com/curl/curl/pull/6840
As it was just unnecessary duplication of information already stored in
the 'per_transfer' struct, which is around most of the time anyway.
The duplicated pointer caused problems when the code flow was aborted
before the dupe was filled in, which could lead to a NULL pointer access.
Reported-by: Brian Carpenter
Fixes #4807
Closes #4810
Attempt to unpause a busy read in the CURLOPT_XFERINFOFUNCTION.
When uploading from stdin in non-blocking mode, a delay in reading
the stream (EAGAIN) causes curl to pause sending data
(CURL_READFUNC_PAUSE). Prior to this change, a busy read was
detected and unpaused only in the CURLOPT_WRITEFUNCTION handler.
This change performs the same busy read handling in a
CURLOPT_XFERINFOFUNCTION handler.
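A simplified sketch of that handling (struct and function names are made
up): the read callback pauses the upload when stdin has nothing to offer
yet, and the CURLOPT_XFERINFOFUNCTION callback unpauses it with
curl_easy_pause() on the next progress tick.

```c
#include <curl/curl.h>
#include <errno.h>
#include <unistd.h>

struct upload_state {
  CURL *easy;
  int paused;
};

static size_t read_cb(char *buf, size_t size, size_t nitems, void *userp)
{
  struct upload_state *up = userp;
  ssize_t n = read(STDIN_FILENO, buf, size * nitems);
  if(n == -1 && (errno == EAGAIN || errno == EWOULDBLOCK)) {
    up->paused = 1;
    return CURL_READFUNC_PAUSE;       /* nothing to send right now */
  }
  if(n == -1)
    return CURL_READFUNC_ABORT;       /* real read error */
  return (n > 0) ? (size_t)n : 0;     /* 0 signals end of upload */
}

static int xferinfo_cb(void *userp, curl_off_t dlt, curl_off_t dln,
                       curl_off_t ult, curl_off_t uln)
{
  struct upload_state *up = userp;
  (void)dlt; (void)dln; (void)ult; (void)uln;
  if(up->paused) {                    /* retry the read on this tick */
    up->paused = 0;
    curl_easy_pause(up->easy, CURLPAUSE_CONT);
  }
  return 0;
}
```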
Fixes #2051
Closes #4599
Reported-by: bdry on github
This is done by making sure each individual transfer is first added to a
linked list, since then they can be performed serially, or at will, in
parallel.
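A sketch of that structure, with made-up names: every requested transfer
gets a node in a doubly linked list up front, and the runner then walks
the list one entry at a time or adds several entries to a multi handle at
once.

```c
#include <stddef.h>

/* one node per requested transfer (illustrative fields only) */
struct per_transfer_sketch {
  struct per_transfer_sketch *prev;
  struct per_transfer_sketch *next;
  char *url;          /* what to transfer */
  void *easy;         /* the libcurl easy handle once the transfer starts */
};

/* append a new transfer to the end of the list */
static void add_transfer(struct per_transfer_sketch **head,
                         struct per_transfer_sketch **tail,
                         struct per_transfer_sketch *node)
{
  node->next = NULL;
  node->prev = *tail;
  if(*tail)
    (*tail)->next = node;
  else
    *head = node;
  *tail = node;
}
```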
Closes #3804