In JavaScript, concurrent requests in the browser are typically made with the XMLHttpRequest or fetch API. The browser itself caps how many requests run in parallel: most modern browsers allow roughly 6 concurrent HTTP/1.1 connections per host, while over HTTP/2 many requests can be multiplexed on a single connection, so the effective limit is much higher.
If you want to control the number of concurrent requests at the code level, you can use various techniques or third-party libraries. Below, I will explain a commonly used method along with an example.
Controlling Concurrency with Promise and async/await
We can use Promise combined with async/await to manage the concurrency of asynchronous requests. This approach does not rely on specific libraries but leverages JavaScript's native features to control concurrency.
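To see the two native primitives this technique combines, here is a small self-contained sketch (the `task` helper is a hypothetical stand-in for a request, not part of any API): `Promise.all` starts its tasks at once, while `await`ing tasks one after another runs them strictly in sequence.

```javascript
// A minimal illustration of the two native primitives the technique combines.
// `task` is a hypothetical stand-in for a request: it resolves after `ms`
// milliseconds and records its id in completion order.
async function demo() {
  const order = [];
  const task = (id, ms) =>
    new Promise(resolve => setTimeout(() => { order.push(id); resolve(); }, ms));

  // Concurrent: Promise.all starts both timers at once,
  // so the shorter one finishes first.
  await Promise.all([task('slow', 40), task('fast', 10)]);

  // Sequential: each awaited task starts only after the previous one resolves.
  await task('slow2', 40);
  await task('fast2', 10);

  return order.join(',');
}

demo().then(order => console.log(order)); // "fast,slow,slow2,fast2"
```

Limiting concurrency amounts to mixing these two behaviors: start only `limit` tasks concurrently, and have each one sequentially pull more work as it finishes.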
Here, I will provide an example demonstrating how to limit concurrent requests using this method, assuming we use the fetch API:
```javascript
async function fetchWithConcurrency(urls, limit) {
  // Array for storing results
  const results = [];

  // Worker: handle one URL, then pull the next from the queue
  async function request(url) {
    if (!url) return;
    try {
      // Send request
      const response = await fetch(url);
      const data = await response.json();
      // Store result
      results.push(data);
    } catch (error) {
      console.error(`Request failed URL: ${url} Error: ${error}`);
    } finally {
      // Request the next URL, if any remain
      if (urls.length > 0) {
        await request(urls.shift());
      }
    }
  }

  // Start the first `limit` requests
  const initialRequests = urls.splice(0, limit).map(url => request(url));

  // Wait for all requests to complete
  await Promise.all(initialRequests);
  return results;
}

// Example URL list
const urls = [
  'https://api.example.com/data1',
  'https://api.example.com/data2',
  'https://api.example.com/data3',
  // More URLs
];

// Start requests, limiting concurrency to 2
fetchWithConcurrency(urls, 2).then(results => console.log(results));
```
In the above code, the fetchWithConcurrency function accepts an array of URLs and a concurrency limit parameter limit. It splices the first limit URLs off the array and launches one request chain per URL. When a request finishes (whether it succeeded or failed), its finally block shifts the next URL off the shared urls array and processes it, so at most limit requests are in flight at any time. Promise.all resolves once every chain has drained the queue, at which point all URLs have been processed.
The advantage is that it does not depend on external libraries, using only native JavaScript, which makes it easy to understand and implement. The disadvantages are that you must manage the request queue and concurrency yourself, results are pushed in completion order rather than input order, and the input array is consumed as requests are made.
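To reduce that manual bookkeeping, the same pattern can be packaged as a small reusable helper. The sketch below is illustrative (the asyncPool name and signature are my own, not from any library); unlike the example above, it preserves input order in its results and leaves the input array intact.

```javascript
// Run worker(item) over items with at most `limit` tasks in flight.
// Results are stored at the same index as their input item.
async function asyncPool(limit, items, worker) {
  const results = new Array(items.length);
  let next = 0; // index of the next unclaimed item

  // Each "lane" repeatedly claims the next index until none remain.
  // Claiming (the check plus next++) is synchronous, so lanes never
  // grab the same item even though they run interleaved.
  async function lane() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  }

  // Start at most `limit` lanes and wait for them to drain the queue.
  const lanes = Array.from({ length: Math.min(limit, items.length) }, lane);
  await Promise.all(lanes);
  return results;
}
```

With this helper, the earlier example could be written as `asyncPool(2, urls, url => fetch(url).then(r => r.json()))`. Note that a failed request rejects the whole pool here; per-item try/catch inside the worker restores the error handling of the original version.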
Conclusion
Using this method, we can flexibly manage request concurrency within applications, optimizing resource utilization and enhancing performance. For scenarios involving large volumes of requests and complex concurrency control, third-party libraries such as async.js can be considered to simplify the code.