curl (version 2.6)

multi: Async Multi Download

Description

AJAX style concurrent requests, possibly using HTTP/2 multiplexing. Results are only available via callback functions. Advanced use only!

Usage

multi_add(handle, done = NULL, fail = NULL, pool = NULL)

multi_run(timeout = Inf, pool = NULL)

multi_set(total_con = 50, host_con = 6, multiplex = TRUE, pool = NULL)

multi_list(pool = NULL)

multi_cancel(handle)

new_pool(total_con = 100, host_con = 6, multiplex = TRUE)

Arguments

handle

a curl handle with preconfigured url option.

done

callback function for a completed request. It is called with a single argument containing the response data, in the same structure as curl_fetch_memory.

fail

callback function called on a failed request. It is called with a single argument containing the error message.

pool

a multi handle created by new_pool. Default uses a global pool.

timeout

max time in seconds to wait for results. Use 0 to poll for results without waiting at all.

total_con

max total concurrent connections.

host_con

max concurrent connections per host.

multiplex

enable HTTP/2 multiplexing if supported by host and client.

Details

Requests are created in the usual way using a curl handle and added to the scheduler with multi_add. This function returns immediately and does not perform the request yet. The user needs to call multi_run, which performs all scheduled requests concurrently. It returns when all requests have completed, or in case of a timeout or SIGINT (e.g. if the user presses ESC or CTRL+C in the console). In case of the latter, simply call multi_run again to resume pending requests.
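
As an illustrative sketch (not part of the package examples), a dedicated pool created with new_pool can be passed to both multi_add and multi_run; the httpbin URLs and the cb callback below are only placeholders:

# Schedule two requests on a dedicated pool instead of the global one
pool <- new_pool(total_con = 10, host_con = 5)
cb <- function(res) cat("Got", res$url, "with status", res$status_code, "\n")
multi_add(new_handle(url = "https://eu.httpbin.org/get"), done = cb, pool = pool)
multi_add(new_handle(url = "https://eu.httpbin.org/headers"), done = cb, pool = pool)
# multi_add() returned immediately; this call performs both requests concurrently
multi_run(pool = pool)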

When the request succeeds, the done callback gets triggered with the response data. The structure of this data is identical to curl_fetch_memory. When the request fails, the fail callback is triggered with an error message. Note that failure here means something went wrong in performing the request, such as a connection failure; it does not check the HTTP status code. Just as with curl_fetch_memory, the user has to implement the application logic.
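
For example, a done callback might check the status code and decode the body itself; the sketch below assumes an httpbin endpoint and a hypothetical check_status helper:

# A done callback that applies its own application logic
check_status <- function(res) {
  if (res$status_code >= 400) {
    cat("Server returned error status:", res$status_code, "\n")
  } else {
    cat("Body:", rawToChar(res$content), "\n")
  }
}
multi_add(new_handle(url = "https://eu.httpbin.org/status/418"), done = check_status, fail = print)
multi_run()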

Raising an error within a callback function stops execution of that function but does not affect other requests.
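
A minimal sketch of this behavior (the URLs are placeholders):

# An error raised in the first callback stops only that callback;
# the second request still completes and its callback still runs
multi_add(new_handle(url = "https://eu.httpbin.org/get"),
  done = function(res) stop("error inside this callback"))
multi_add(new_handle(url = "https://eu.httpbin.org/ip"),
  done = function(res) cat("Still completed:", res$url, "\n"))
multi_run()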

A single handle cannot be used for multiple simultaneous requests. However, it is possible to add new requests to a pool while it is running, so you can re-use a handle within the callback of a request from that same handle. It is up to the user to make sure the same handle is not used in concurrent requests.
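
For instance, a follow-up request can be scheduled from inside the done callback of the same handle; in this sketch the poll function and counter are only illustrative:

# Re-use a handle by re-adding it from its own callback
h <- new_handle(url = "https://eu.httpbin.org/get")
count <- 0
poll <- function(res) {
  count <<- count + 1
  cat("Completed request", count, "\n")
  if (count < 3)
    multi_add(h, done = poll)  # safe: the previous request on h has finished
}
multi_add(h, done = poll)
multi_run()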

The multi_cancel function can be used to cancel a pending request. It has no effect if the request was already completed or canceled.
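
A short sketch (the delay endpoint is only a placeholder):

# Schedule a slow request, inspect the pending handles, then cancel it
h <- new_handle(url = "https://eu.httpbin.org/delay/10")
multi_add(h, done = print, fail = print)
multi_list()     # shows the pending handle
multi_cancel(h)  # removes the pending request from the pool
multi_list()     # the pool is empty again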

Examples

library(curl)
# Schedule three requests: a slow response, a POST, and a host that will fail to resolve
h1 <- new_handle(url = "https://eu.httpbin.org/delay/3")
h2 <- new_handle(url = "https://eu.httpbin.org/post", postfields = "bla bla")
h3 <- new_handle(url = "https://urldoesnotexist.xyz")
multi_add(h1, done = print, fail = print)
multi_add(h2, done = print, fail = print)
multi_add(h3, done = print, fail = print)
# Returns after 2 seconds; the slow request is still pending at that point
multi_run(timeout = 2)
# Resume and wait for the remaining request to finish
multi_run()
