🚀 How to Configure a Robust, Reliable, and Clean HTTP Client in Node.js
When building microservices or an API Gateway in Node.js, your HTTP client becomes critical infrastructure.
Most engineers rely on default configurations like:
```javascript
const http = require('http');
http.request(options, callback).end();
```
It works. But in production? It can silently destroy your system under load. This guide explains:
- Why default HTTP client configuration is dangerous
- How to properly configure connection pooling
- Timeout and resilience strategies
- Handling slow downstream services
- Migrating to Undici (modern Node HTTP client)
- Performance comparison (http vs axios vs undici)
Why Default HTTP Client Configuration Is Dangerous
Since Node.js 19, http.globalAgent has keepAlive: true enabled by default. That is an improvement over older versions, but serious production risks remain.
1. No Default Timeout = Infinite Hanging Requests
If a downstream service hangs, your Node.js process will wait indefinitely.
By default:
- No connection timeout
- No response timeout
- No body timeout
If downstream hangs:
- Memory grows
- Event loop congests
- Cascading failures happen
A gateway without timeout is a ticking bomb.
Example:
```javascript
const http = require('http');

const options = {
  hostname: 'slow-service.com',
  port: 80,
  path: '/',
  method: 'GET'
};

const req = http.request(options, (res) => {
  console.log(`STATUS: ${res.statusCode}`);
  res.on('data', (chunk) => process.stdout.write(chunk));
});

req.on('error', (e) => console.error(`problem with request: ${e.message}`));
// No timeout!
req.end();
```
If slow-service.com never responds, this request will hang forever.
Your Node.js process will eventually run out of file descriptors or memory.
2. Unlimited Concurrency (maxSockets = Infinity)
The default http.globalAgent is equivalent to:
```javascript
// Equivalent to the default global agent: sockets are unlimited
new http.Agent({
  maxSockets: Infinity,
  keepAlive: true
});
```
If you receive 5,000 concurrent requests, Node will open up to 5,000 sockets.
This can:
- Overwhelm downstream
- Exhaust file descriptors
- Kill your gateway
Without proper cleanup, connections remain open indefinitely.
This exhausts system resources.
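One defense, independent of agent tuning, is to cap in-flight requests at the application level with a small semaphore. A minimal sketch (the class name and limits are illustrative, not a standard API):

```javascript
// Application-level concurrency cap: a tiny semaphore
class Semaphore {
  constructor(limit) {
    this.limit = limit;   // max concurrent tasks
    this.active = 0;      // tasks currently running
    this.queue = [];      // resolvers for waiting tasks
  }

  async acquire() {
    if (this.active < this.limit) {
      this.active++;
      return;
    }
    // Wait until a running task hands over its slot
    await new Promise((resolve) => this.queue.push(resolve));
  }

  release() {
    const next = this.queue.shift();
    if (next) {
      next(); // hand the slot directly to the next waiter
    } else {
      this.active--;
    }
  }

  async run(fn) {
    await this.acquire();
    try {
      return await fn();
    } finally {
      this.release();
    }
  }
}
```

With `const sem = new Semaphore(100)`, wrapping every upstream call in `sem.run(() => makeRequest(options))` guarantees at most 100 sockets are in use at once; the rest queue instead of piling onto the downstream service.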
3. No Circuit Breaker (no protection against cascading failures or slow downstream services)
Slow responses cause:
- Memory retention
- Increased latency
- Resource starvation
Without timeout + abort logic, your system is fragile.
If a service is down and your code naively retries every failure, you overwhelm the struggling service and the failure cascades.
Production-Ready HTTP Client Design
1. Connection Pooling
Even in Node 19+, define your own agent:
```javascript
const http = require('http');

const agent = new http.Agent({
  keepAlive: true,        // reuse connections instead of opening one per request
  keepAliveMsecs: 10000,  // initial delay for TCP keep-alive probes
  maxSockets: 100,        // cap concurrent sockets per host
  maxFreeSockets: 100,    // cap idle sockets kept in the pool
  timeout: 60000,         // destroy sockets after 60s of inactivity
});
```
Then attach:
```javascript
const options = {
  hostname: 'slow-service.com',
  port: 80,
  path: '/',
  method: 'GET',
  agent // use the pooled agent instead of http.globalAgent
};

const req = http.request(options, (res) => {
  console.log(`STATUS: ${res.statusCode}`);
  res.on('data', (chunk) => process.stdout.write(chunk));
});

req.on('error', (e) => console.error(`problem with request: ${e.message}`));
req.end();
```
Why this is better:
- maxSockets: caps concurrent sockets per host
- keepAlive: reuses connections instead of opening one per request
- keepAliveMsecs: initial delay for TCP keep-alive probes on idle sockets
- maxFreeSockets: caps idle sockets kept in the pool
- timeout: destroys the socket after N ms of inactivity

(Note: freeSocketTimeout is not a core http.Agent option; it comes from the third-party agentkeepalive package.)
2. Timeout Strategy
Always:
- Set request timeout
- Handle timeout error
- Return proper status (e.g., 504)
Example:
```javascript
const options = {
  hostname: 'slow-service.com',
  port: 80,
  path: '/',
  method: 'GET',
  timeout: 5000 // socket timeout in ms
};

const req = http.request(options, (res) => {
  console.log(`STATUS: ${res.statusCode}`);
  res.on('data', (chunk) => process.stdout.write(chunk));
});

// The timeout option only emits an event; you must destroy the request yourself
req.on('timeout', () => {
  req.destroy(new Error('Request timed out'));
});

req.on('error', (e) => {
  if (e.code === 'ETIMEDOUT' || e.message === 'Request timed out') {
    console.error('Request timed out'); // in a gateway, answer the caller with 504
  } else {
    console.error(`problem with request: ${e.message}`);
  }
});

req.end();
```
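The third bullet, returning a proper status, can be a small mapping from upstream error to response code. A sketch (`statusForUpstreamError` is a hypothetical helper; the codes shown are the common Node.js socket errors):

```javascript
// Hypothetical helper: translate an upstream failure into the status a gateway returns
function statusForUpstreamError(err) {
  if (err.code === 'ETIMEDOUT' || err.code === 'ESOCKETTIMEDOUT') {
    return 504; // Gateway Timeout
  }
  if (err.code === 'ECONNREFUSED' || err.code === 'ENOTFOUND' || err.code === 'ECONNRESET') {
    return 502; // Bad Gateway
  }
  return 500; // anything else is an internal error
}
```

A gateway's error handler can then answer the caller with this status instead of only logging the failure.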
3. Aborting upstream requests when the client disconnects
If the client cancels its request, cancel the upstream request too:
```javascript
// In a gateway handler: when the incoming request closes, drop the upstream one
this.request.on('close', () => {
  req.destroy();
});
```
Without this:
- Upstream continues processing
- Memory retained
- Wasted CPU
Example using an AbortController:
```javascript
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 5000);

const options = {
  hostname: 'slow-service.com',
  port: 80,
  path: '/',
  method: 'GET',
  signal: controller.signal
};

const req = http.request(options, (res) => {
  console.log(`STATUS: ${res.statusCode}`);
  res.on('data', (chunk) => process.stdout.write(chunk));
});

// Clean up the timer once the request finishes, whatever the outcome
req.on('close', () => {
  clearTimeout(timeout);
});

req.on('error', (e) => {
  if (e.name === 'AbortError') {
    console.error('Request aborted');
  } else {
    console.error(`problem with request: ${e.message}`);
  }
});

req.end();
```
4. Safe retry strategy
Retry only for:
- GET and other idempotent operations
- Specific status codes (500, 502, 503, 504; see https://developer.mozilla.org/en-US/docs/Web/HTTP/Status). Avoid retrying 4xx responses.
With:
- Exponential backoff
- Jitter
- A maximum retry limit
And never blindly retry POST.
Example:
```javascript
function retry(fn, maxRetries = 3, baseDelay = 1000, attempt = 0) {
  return fn().catch((error) => {
    if (attempt >= maxRetries) {
      throw error;
    }
    // Exponential backoff with full jitter: uniform delay in [0, baseDelay * 2^attempt)
    const delay = Math.random() * baseDelay * 2 ** attempt;
    return new Promise((resolve) => {
      setTimeout(() => {
        resolve(retry(fn, maxRetries, baseDelay, attempt + 1));
      }, delay);
    });
  });
}
```
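The "retry only for" rules above can be encoded as a small predicate consulted before retrying at all (a sketch; `isRetryable` is a hypothetical helper, and the sets should match your own API's semantics):

```javascript
// Decide whether a failed request is safe to retry
const IDEMPOTENT_METHODS = new Set(['GET', 'HEAD', 'PUT', 'DELETE', 'OPTIONS']);
const RETRYABLE_STATUS = new Set([500, 502, 503, 504]);

function isRetryable(method, statusCode) {
  if (!IDEMPOTENT_METHODS.has(method)) return false; // never blindly retry POST
  return RETRYABLE_STATUS.has(statusCode);           // 4xx is the caller's fault; don't retry
}
```

Only when `isRetryable` returns true should the request be handed to the retry helper.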
5. Handling slow downstream services: Applying Circuit Breaker Pattern
Slow downstream is more dangerous than failure.
It creates:
- Queue buildup
- Latency amplification
- Throughput collapse
Strategies:
- Aggressive timeout
- Circuit breaker
- Rate limiting
- Bulkhead isolation
Example:
```javascript
class CircuitBreaker {
  constructor(options) {
    this.options = options; // { failureThreshold, resetTimeout }
    this.state = 'CLOSED';
    this.failures = 0;
    this.lastFailure = null;
    this.lastSuccess = null;
  }

  async execute(fn) {
    if (this.state === 'OPEN') {
      // Fail fast until the reset timeout has elapsed
      if (Date.now() - this.lastFailure < this.options.resetTimeout) {
        throw new Error('Circuit is open');
      }
      // Let one trial request through
      this.state = 'HALF_OPEN';
    }
    try {
      const result = await fn();
      this.lastSuccess = Date.now();
      this.failures = 0;
      this.state = 'CLOSED';
      return result;
    } catch (error) {
      this.lastFailure = Date.now();
      this.failures++;
      if (this.failures >= this.options.failureThreshold) {
        this.state = 'OPEN';
      }
      throw error;
    }
  }
}
```
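A usage sketch of the breaker (condensed here so the demo is self-contained; the thresholds are illustrative): two failures trip the circuit, after which calls fail fast without touching the downstream service.

```javascript
// Condensed version of the breaker, plus a driver showing the state machine
class Breaker {
  constructor({ failureThreshold, resetTimeout }) {
    this.failureThreshold = failureThreshold;
    this.resetTimeout = resetTimeout;
    this.state = 'CLOSED';
    this.failures = 0;
    this.lastFailure = 0;
  }
  async execute(fn) {
    if (this.state === 'OPEN') {
      if (Date.now() - this.lastFailure < this.resetTimeout) {
        throw new Error('Circuit is open'); // fail fast
      }
      this.state = 'HALF_OPEN'; // allow one trial call
    }
    try {
      const result = await fn();
      this.failures = 0;
      this.state = 'CLOSED';
      return result;
    } catch (error) {
      this.lastFailure = Date.now();
      this.failures += 1;
      if (this.failures >= this.failureThreshold) this.state = 'OPEN';
      throw error;
    }
  }
}

(async () => {
  const breaker = new Breaker({ failureThreshold: 2, resetTimeout: 60000 });
  const downstream = () => Promise.reject(new Error('downstream down'));

  // Two real failures open the circuit...
  await breaker.execute(downstream).catch((e) => console.log(e.message)); // downstream down
  await breaker.execute(downstream).catch((e) => console.log(e.message)); // downstream down
  // ...after which calls are rejected immediately, sparing the failing service
  await breaker.execute(downstream).catch((e) => console.log(e.message)); // Circuit is open
})();
```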
6. Monitoring and Metrics
Always track metrics:
- Requests
- Success
- Failures
- Latency
- Timeouts
- Aborts
Example:
```javascript
const metrics = {
  requests: 0,
  success: 0,
  failures: 0,
  latency: 0,
  timeouts: 0,
  aborts: 0
};
```
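A counter object alone does nothing; wrap the request function so every call updates it. A minimal sketch (`withMetrics` is a hypothetical helper, and in production you would track latency as a histogram rather than a single number):

```javascript
// Wrap an async request function so every call updates the metrics object
function withMetrics(metrics, fn) {
  return async (...args) => {
    metrics.requests++;
    const start = Date.now();
    try {
      const result = await fn(...args);
      metrics.success++;
      return result;
    } catch (err) {
      metrics.failures++;
      if (err.code === 'ETIMEDOUT') metrics.timeouts++;
      if (err.name === 'AbortError') metrics.aborts++;
      throw err;
    } finally {
      metrics.latency = Date.now() - start; // latency of the most recent call
    }
  };
}
```

Usage: `const instrumentedRequest = withMetrics(metrics, makeRequest);` and call `instrumentedRequest` everywhere you used `makeRequest`.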
🧠 The Modern Solution: Use a Dedicated HTTP Client Library (Undici)
Node.js now recommends Undici. Why?
- Faster than built-in http
- Better pooling
- Lower memory overhead
- Native promise API
- Built-in timeout controls
- Maintained in part by Node.js core team members
Node.js’s built-in fetch is powered by a bundled version of undici:
```javascript
// Available globally in Node.js v18+
const response = await fetch('https://api.example.com/data');
const data = await response.json();

// Check the bundled undici version
console.log(process.versions.undici); // e.g., "5.28.4"
```
However, the fetch API is not a drop-in replacement for the http module; there are important differences.
Pros:
- No additional dependencies required
- Works across different JavaScript runtimes
- Automatic compression handling (gzip, deflate, br)
- Built-in caching support (in development)
Cons:
- Limited to the undici version bundled with your Node.js version
- Less control over connection pooling and advanced features
- Error handling follows Web API standards (errors wrapped in TypeError)
- Performance overhead due to Web Streams implementation
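The TypeError wrapping from the list above is easy to observe with an unreachable address (127.0.0.1:1 is assumed to refuse connections on your machine; requires Node.js 18+):

```javascript
// Node's fetch wraps low-level network failures in a TypeError ("fetch failed");
// the underlying socket error is attached as err.cause
(async () => {
  try {
    await fetch('http://127.0.0.1:1/');
  } catch (err) {
    console.log(err instanceof TypeError);    // true
    console.log(err.message);                 // "fetch failed"
    console.log(err.cause && err.cause.code); // e.g. "ECONNREFUSED"
  }
})();
```

Contrast this with the raw http module, which surfaces the same failure as an 'error' event carrying the socket error directly.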
Undici Module
See the Undici documentation for more information.
```shell
npm install undici
```
```javascript
import { request, fetch, Agent, setGlobalDispatcher } from 'undici';

// Use undici.request for maximum performance
const { statusCode, headers, body } = await request('https://api.example.com/data');
const data = await body.json();

// Or use undici.fetch with custom configuration
const agent = new Agent({ keepAliveTimeout: 10000 });
setGlobalDispatcher(agent);
const response = await fetch('https://api.example.com/data');
```
Basic Usage
```javascript
import { request } from 'undici';

const { body, statusCode } = await request(
  'http://service.internal/api',
  {
    method: 'GET',
    headersTimeout: 5000,
    bodyTimeout: 5000,
  }
);

const data = await body.json();
```
Connection Pooling
```javascript
import { Pool } from 'undici';

const pool = new Pool('http://service.internal', {
  connections: 200,
  pipelining: 1,
  headersTimeout: 5000,
  bodyTimeout: 5000,
});
```
Then set the pool as the global dispatcher:
```javascript
import { setGlobalDispatcher } from 'undici';

// Note: a Pool is bound to a single origin; if you call multiple origins,
// use an Agent as the global dispatcher and pass pools per request instead
setGlobalDispatcher(pool);
```
Benchmark result:
| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| axios | 15 | 5708.26 req/sec | ± 2.91 % | - |
| http - no keepalive | 10 | 5809.80 req/sec | ± 2.30 % | + 1.78 % |
| request | 30 | 5828.80 req/sec | ± 2.91 % | + 2.11 % |
| undici - fetch | 40 | 5903.78 req/sec | ± 2.87 % | + 3.43 % |
| node-fetch | 10 | 5945.40 req/sec | ± 2.13 % | + 4.15 % |
| got | 35 | 6511.45 req/sec | ± 2.84 % | + 14.07 % |
| http - keepalive | 65 | 9193.24 req/sec | ± 2.92 % | + 61.05 % |
| superagent | 35 | 9339.43 req/sec | ± 2.95 % | + 63.61 % |
| undici - pipeline | 50 | 13364.62 req/sec | ± 2.93 % | + 134.13 % |
| undici - stream | 95 | 18245.36 req/sec | ± 2.99 % | + 219.63 % |
| undici - request | 50 | 18340.17 req/sec | ± 2.84 % | + 221.29 % |
| undici - dispatch | 40 | 22234.42 req/sec | ± 2.94 % | + 289.51 % |
See: Benchmark Result
🎯 Final Thought
The default HTTP client in Node.js works.
But production systems require more than “works”.
They require:
- Predictability
- Stability
- Resilience
- Control
Undici is currently the best choice for modern Node.js applications.
If you’re building:
- API Gateway
- Microservices
- Aggregator services
- High concurrency backend
Invest time in hardening your HTTP client.
It will save you from catastrophic outages later.
