I need to send one million HTTP requests concurrently and read the responses, with no more than 100 requests in flight at a time.

Which approach is better, recommended, idiomatic?

  • Send 100 requests, wait for all of them to finish, send another 100, wait for them to finish… and so on.

  • Send 100 requests. As soon as one of the 100 finishes, add a new one into the pool: “done, add a new one; done, add a new one”. A sliding window, as a stream.
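The second option (a sliding window) is what a semaphore gives you almost for free. A minimal sketch using Python's asyncio, assuming the question is language-agnostic; the actual HTTP call is simulated with `asyncio.sleep(0)` (in real code you would use an async client such as aiohttp or httpx), and the names `fetch`, `CONCURRENCY`, and `TOTAL` are illustrative:

```python
import asyncio

CONCURRENCY = 100
TOTAL = 1_000  # stand-in for 1_000_000 in the real run

async def fetch(i: int, sem: asyncio.Semaphore) -> str:
    # The semaphore caps in-flight requests at CONCURRENCY;
    # the moment one request finishes, the next waiting task starts.
    async with sem:
        await asyncio.sleep(0)  # stand-in for the real HTTP call
        return f"response-{i}"

async def main() -> list:
    sem = asyncio.Semaphore(CONCURRENCY)
    tasks = [asyncio.create_task(fetch(i, sem)) for i in range(TOTAL)]
    # gather() preserves submission order in the returned list
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
```

One caveat: creating a million Task objects up front costs memory, so at the full scale you would feed tasks in from a generator in chunks rather than building the whole list at once.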

  • vmaziman@lemm.ee · 11 months ago

    Maybe producer-consumer?

    Producer spits out all the messages to send onto a message queue, FIFO or whatever suits you.

    Parallelizable consumers (think deployed containers) listen to the queue, execute the request, get the response, and save it.

    Scale the consumer count up or down as needed to deal with rate limits.
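    The producer-consumer idea above can be sketched in-process with an asyncio queue (the distributed version would swap the queue for something like RabbitMQ or SQS). This is a hedged sketch, not the poster's implementation: the HTTP call is simulated with `asyncio.sleep(0)`, and `QUEUE_SIZE`, `N_CONSUMERS`, and `TOTAL` are assumed names. The bounded queue gives backpressure, and the consumer count is the single knob for concurrency and rate limits:

    ```python
    import asyncio

    QUEUE_SIZE = 100   # bounded: producer blocks when consumers fall behind
    N_CONSUMERS = 100  # scale up or down to respect rate limits
    TOTAL = 1_000      # stand-in for 1_000_000 in the real run

    async def producer(queue: asyncio.Queue) -> None:
        for i in range(TOTAL):
            await queue.put(i)  # blocks while the queue is full

    async def consumer(queue: asyncio.Queue, results: list) -> None:
        while True:
            i = await queue.get()
            await asyncio.sleep(0)  # stand-in for the real HTTP call
            results.append(f"response-{i}")
            queue.task_done()

    async def main() -> list:
        queue = asyncio.Queue(maxsize=QUEUE_SIZE)
        results = []
        consumers = [asyncio.create_task(consumer(queue, results))
                     for _ in range(N_CONSUMERS)]
        await producer(queue)
        await queue.join()  # wait until every queued item is processed
        for c in consumers:
            c.cancel()
        await asyncio.gather(*consumers, return_exceptions=True)
        return results

    results = asyncio.run(main())
    ```

    Unlike the semaphore variant, results arrive in completion order rather than submission order, so each response here carries its request index.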