Fetch API Mastery: Beyond Basic GET Requests
A deep dive into the Fetch API covering POST requests, file uploads, error handling, AbortController, retry logic, streaming, interceptors, and building a complete API client.
Overview
The Fetch API is the modern standard for making HTTP requests in JavaScript, available in every major browser and in Node.js since version 18. Most tutorials stop at fetch(url).then(res => res.json()), which barely scratches the surface. The real power of Fetch is in its composability -- you can build retry logic, interceptors, streaming parsers, cancellation, and full API clients without pulling in a single dependency. This article covers every practical aspect of Fetch that you will need in production, from the basics through advanced patterns that I use daily in both browser and server-side applications.
Prerequisites
- Solid understanding of JavaScript Promises and async/await
- Basic knowledge of HTTP methods, headers, and status codes
- A browser with DevTools or Node.js 18+ for running examples
- Familiarity with JSON and REST API conventions
Why Fetch Replaced XMLHttpRequest
XMLHttpRequest served the web for over fifteen years, but its API is callback-based, verbose, and awkward. Fetch gives you Promises natively, a clean request/response model, and streaming support. Here is the same request in both APIs:
// XMLHttpRequest -- verbose, callback-based
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/users");
xhr.onload = function () {
if (xhr.status === 200) {
var data = JSON.parse(xhr.responseText);
console.log(data);
}
};
xhr.onerror = function () {
console.error("Network error");
};
xhr.send();
// Fetch -- clean, Promise-based
fetch("/api/users")
.then(function (response) {
return response.json();
})
.then(function (data) {
console.log(data);
})
.catch(function (err) {
console.error("Network error:", err);
});
Fetch is not just shorter. It is a fundamentally better abstraction. The Response object gives you fine-grained control over headers, status codes, body consumption, and streaming. XMLHttpRequest conflates all of that into a single object with dozens of properties.
The Response Object
Every call to fetch() resolves to a Response object. Understanding its properties is critical.
fetch("/api/users/42").then(function (response) {
console.log(response.ok); // true if status is 200-299
console.log(response.status); // 200, 404, 500, etc.
console.log(response.statusText); // "OK", "Not Found", etc.
console.log(response.url); // Final URL after redirects
console.log(response.redirected); // true if redirected
console.log(response.type); // "basic", "cors", "opaque"
// Headers are an iterable Headers object
response.headers.forEach(function (value, name) {
console.log(name + ": " + value);
});
// Body can be consumed ONCE using one of these methods:
// response.json() -- parse as JSON
// response.text() -- read as plain text
// response.blob() -- read as Blob (binary)
// response.arrayBuffer() -- read as ArrayBuffer
// response.formData() -- read as FormData
return response.json();
});
The body is a ReadableStream and can only be consumed once. If you need to read the body multiple times, clone the response first with response.clone().
POST, PUT, PATCH, and DELETE Requests
Fetch defaults to GET. For other methods, pass an options object:
// POST with JSON body
function createUser(userData) {
return fetch("/api/users", {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify(userData)
}).then(function (response) {
if (!response.ok) {
throw new Error("Create failed: " + response.status);
}
return response.json();
});
}
// PUT -- full replacement
function updateUser(id, userData) {
return fetch("/api/users/" + id, {
method: "PUT",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify(userData)
}).then(function (response) {
return response.json();
});
}
// PATCH -- partial update
function patchUser(id, fields) {
return fetch("/api/users/" + id, {
method: "PATCH",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify(fields)
}).then(function (response) {
return response.json();
});
}
// DELETE
function deleteUser(id) {
return fetch("/api/users/" + id, {
method: "DELETE"
}).then(function (response) {
if (response.status === 204) {
return null; // No content
}
return response.json();
});
}
A common mistake is forgetting the Content-Type header when sending JSON. Without it, many servers will reject the request or misparse the body.
Sending FormData and File Uploads
When sending FormData, do not set the Content-Type header manually. The browser needs to set it automatically to include the multipart boundary string.
// File upload with FormData
function uploadAvatar(userId, fileInput) {
var formData = new FormData();
formData.append("avatar", fileInput.files[0]);
formData.append("userId", userId);
return fetch("/api/users/" + userId + "/avatar", {
method: "POST",
body: formData
// Do NOT set Content-Type -- browser sets it with boundary
}).then(function (response) {
if (!response.ok) {
throw new Error("Upload failed: " + response.status);
}
return response.json();
});
}
// Multiple file upload
function uploadDocuments(files) {
var formData = new FormData();
for (var i = 0; i < files.length; i++) {
formData.append("documents", files[i], files[i].name);
}
return fetch("/api/documents", {
method: "POST",
body: formData
}).then(function (response) {
return response.json();
});
}
Handling Different Response Types
Not every API returns JSON. Fetch handles binary data, text, and blobs natively:
// Download an image as a Blob
function downloadImage(url) {
return fetch(url).then(function (response) {
return response.blob();
}).then(function (blob) {
// Create an object URL for display
var objectUrl = URL.createObjectURL(blob);
var img = document.createElement("img");
img.src = objectUrl;
document.body.appendChild(img);
});
}
// Download a binary file as ArrayBuffer
function downloadBinary(url) {
return fetch(url).then(function (response) {
return response.arrayBuffer();
}).then(function (buffer) {
console.log("Downloaded " + buffer.byteLength + " bytes");
// Process raw bytes
var view = new Uint8Array(buffer);
return view;
});
}
// Read plain text (CSV, XML, etc.)
function fetchCSV(url) {
return fetch(url).then(function (response) {
return response.text();
}).then(function (csvText) {
var rows = csvText.split("\n").map(function (row) {
return row.split(",");
});
return rows;
});
}
Error Handling: The Biggest Fetch Gotcha
This is where most developers get burned. Fetch does not reject the Promise on HTTP errors. A 404 or 500 response is a successful network request from Fetch's perspective. The Promise only rejects on actual network failures -- DNS errors, connection refused, CORS blocks, or aborted requests.
// WRONG -- this catch will NOT fire on 404 or 500
fetch("/api/users/999")
.then(function (response) {
return response.json(); // Will try to parse the error body as JSON
})
.catch(function (err) {
// Only fires on network errors, NOT HTTP errors
console.error(err);
});
// CORRECT -- check response.ok
fetch("/api/users/999")
.then(function (response) {
if (!response.ok) {
// Throw so it flows into .catch()
throw new Error("HTTP " + response.status + ": " + response.statusText);
}
return response.json();
})
.then(function (data) {
console.log(data);
})
.catch(function (err) {
console.error("Request failed:", err.message);
});
// BETTER -- extract error details from response body
function checkResponse(response) {
if (response.ok) {
return response;
}
return response.text().then(function (body) {
var error = new Error("HTTP " + response.status + ": " + response.statusText);
error.status = response.status;
error.statusText = response.statusText;
try {
error.body = JSON.parse(body);
} catch (e) {
error.body = body;
}
throw error;
});
}
fetch("/api/users/999")
.then(checkResponse)
.then(function (response) {
return response.json();
})
.catch(function (err) {
console.error("Status:", err.status);
console.error("Body:", err.body);
});
I have seen production bugs where a 500 error was silently swallowed because developers assumed Fetch worked like Axios. Always check response.ok.
AbortController: Cancellation and Timeouts
Fetch has no built-in timeout. You build one with AbortController:
// Cancel a request
var controller = new AbortController();
fetch("/api/slow-endpoint", {
signal: controller.signal
}).then(function (response) {
return response.json();
}).catch(function (err) {
if (err.name === "AbortError") {
console.log("Request was cancelled");
} else {
console.error("Request failed:", err);
}
});
// Cancel after 2 seconds
setTimeout(function () {
controller.abort();
}, 2000);
// Reusable timeout wrapper
function fetchWithTimeout(url, options, timeoutMs) {
var controller = new AbortController();
var timeoutId = setTimeout(function () {
controller.abort();
}, timeoutMs || 10000);
var fetchOptions = Object.assign({}, options, {
signal: controller.signal
});
return fetch(url, fetchOptions).finally(function () {
clearTimeout(timeoutId);
});
}
// Usage
fetchWithTimeout("/api/data", {}, 5000)
.then(function (response) {
return response.json();
})
.catch(function (err) {
if (err.name === "AbortError") {
console.error("Request timed out after 5 seconds");
}
});
In newer runtimes (modern browsers and current Node.js releases), you can use the built-in AbortSignal.timeout():
fetch("/api/data", {
signal: AbortSignal.timeout(5000)
}).then(function (response) {
return response.json();
});
Retry Logic with Exponential Backoff
Network requests fail. Servers return 503. Connections drop. Retry logic is not optional for production code.
function fetchWithRetry(url, options, maxRetries, baseDelay) {
maxRetries = maxRetries || 3;
baseDelay = baseDelay || 1000;
function attempt(retryCount) {
return fetch(url, options).then(function (response) {
if (response.status === 429 || response.status >= 500) {
if (retryCount < maxRetries) {
var delay = baseDelay * Math.pow(2, retryCount);
// Add jitter to prevent thundering herd
var jitter = Math.random() * delay * 0.5;
var totalDelay = delay + jitter;
console.log(
"Retry " + (retryCount + 1) + "/" + maxRetries +
" after " + Math.round(totalDelay) + "ms"
);
return new Promise(function (resolve) {
setTimeout(resolve, totalDelay);
}).then(function () {
return attempt(retryCount + 1);
});
}
}
return response;
}).catch(function (err) {
// Network error -- retry
if (retryCount < maxRetries && err.name !== "AbortError") {
var delay = baseDelay * Math.pow(2, retryCount);
return new Promise(function (resolve) {
setTimeout(resolve, delay);
}).then(function () {
return attempt(retryCount + 1);
});
}
throw err;
});
}
return attempt(0);
}
// Usage
fetchWithRetry("/api/unstable-service", { method: "GET" }, 3, 1000)
.then(function (response) {
return response.json();
})
.then(function (data) {
console.log("Got data after retries:", data);
});
Only retry on idempotent requests (GET, PUT, DELETE) or when you are certain the server did not process the request. Retrying a POST that partially succeeded will create duplicate records.
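One way to make a POST safe to retry is an idempotency key, so the server can recognize and deduplicate repeated attempts. This is a sketch of the client side, assuming your API supports the Idempotency-Key header -- a common convention (popularized by payment APIs), not a standard; the helper and endpoint are hypothetical.

```javascript
// Hypothetical helper: attach a client-generated idempotency key so the
// server can deduplicate a retried POST. The Idempotency-Key header is a
// convention your API must explicitly support.
function withIdempotencyKey(options) {
  var key = Date.now().toString(36) + "-" + Math.random().toString(36).slice(2, 10);
  var headers = Object.assign({}, options.headers, { "Idempotency-Key": key });
  return Object.assign({}, options, { headers: headers });
}

// Build the options once, then retry freely -- every attempt carries the
// same key, so the server processes the order at most once.
var opts = withIdempotencyKey({
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ sku: "ABC", qty: 2 })
});
// fetchWithRetry("/api/orders", opts, 3, 1000)
```

Generate the key once per logical operation, not once per attempt; regenerating it on each retry defeats the deduplication.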
Parallel Requests with Promise.all
When you need data from multiple endpoints, fire them in parallel:
function loadDashboard(userId) {
return Promise.all([
fetch("/api/users/" + userId).then(function (r) { return r.json(); }),
fetch("/api/users/" + userId + "/orders").then(function (r) { return r.json(); }),
fetch("/api/users/" + userId + "/notifications").then(function (r) { return r.json(); })
]).then(function (results) {
return {
user: results[0],
orders: results[1],
notifications: results[2]
};
});
}
// With Promise.allSettled -- don't fail everything if one request fails
function loadDashboardResilient(userId) {
return Promise.allSettled([
fetch("/api/users/" + userId).then(function (r) { return r.json(); }),
fetch("/api/users/" + userId + "/orders").then(function (r) { return r.json(); }),
fetch("/api/users/" + userId + "/notifications").then(function (r) { return r.json(); })
]).then(function (results) {
return {
user: results[0].status === "fulfilled" ? results[0].value : null,
orders: results[1].status === "fulfilled" ? results[1].value : null,
notifications: results[2].status === "fulfilled" ? results[2].value : null
};
});
}
Promise.all fails fast -- if any request fails, all results are lost. Use Promise.allSettled when you want partial results. I use allSettled for dashboard loads and all for transactional flows where partial data is useless.
Sequential Request Chains
Sometimes requests depend on previous responses:
function createOrderFlow(cartId) {
var orderId;
return fetch("/api/carts/" + cartId)
.then(function (response) { return response.json(); })
.then(function (cart) {
// Create order from cart
return fetch("/api/orders", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ items: cart.items, total: cart.total })
});
})
.then(function (response) { return response.json(); })
.then(function (order) {
orderId = order.id;
// Process payment
return fetch("/api/payments", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ orderId: order.id, amount: order.total })
});
})
.then(function (response) { return response.json(); })
.then(function (payment) {
// Confirm order
return fetch("/api/orders/" + orderId + "/confirm", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ paymentId: payment.id })
});
})
.then(function (response) { return response.json(); });
}
Streaming Responses with ReadableStream
Fetch gives you access to the response body as a stream. This is powerful for large downloads, server-sent events, or progress tracking:
function downloadWithProgress(url, onProgress) {
return fetch(url).then(function (response) {
var contentLength = parseInt(response.headers.get("Content-Length"), 10);
var loaded = 0;
var reader = response.body.getReader();
var chunks = [];
function read() {
return reader.read().then(function (result) {
if (result.done) {
// Combine all chunks
var blob = new Blob(chunks);
return blob;
}
chunks.push(result.value);
loaded += result.value.length;
if (onProgress && contentLength) {
onProgress({
loaded: loaded,
total: contentLength,
percent: Math.round((loaded / contentLength) * 100)
});
}
return read();
});
}
return read();
});
}
// Usage
downloadWithProgress("/api/export/large-report.csv", function (progress) {
console.log("Downloaded: " + progress.percent + "%");
}).then(function (blob) {
console.log("Complete. Size: " + blob.size + " bytes");
});
Streaming is also useful for processing newline-delimited JSON (NDJSON) from a server:
function streamNDJSON(url, onRecord) {
return fetch(url).then(function (response) {
var reader = response.body.getReader();
var decoder = new TextDecoder();
var buffer = "";
function processChunk() {
return reader.read().then(function (result) {
if (result.done) {
// Process remaining buffer
if (buffer.trim()) {
onRecord(JSON.parse(buffer.trim()));
}
return;
}
buffer += decoder.decode(result.value, { stream: true });
var lines = buffer.split("\n");
buffer = lines.pop(); // Keep incomplete line in buffer
lines.forEach(function (line) {
if (line.trim()) {
onRecord(JSON.parse(line));
}
});
return processChunk();
});
}
return processChunk();
});
}
Fetch Interceptor Pattern
Axios has interceptors built in. With Fetch, you build them by wrapping the global function:
function createFetchInterceptor(interceptors) {
var originalFetch = window.fetch;
window.fetch = function (url, options) {
var modifiedOptions = Object.assign({}, options);
// Run request interceptors
if (interceptors.request) {
interceptors.request.forEach(function (interceptor) {
var result = interceptor(url, modifiedOptions);
if (result) {
url = result.url || url;
modifiedOptions = result.options || modifiedOptions;
}
});
}
return originalFetch(url, modifiedOptions).then(function (response) {
// Run response interceptors
if (interceptors.response) {
interceptors.response.forEach(function (interceptor) {
response = interceptor(response) || response;
});
}
return response;
});
};
// Return a restore function
return function restore() {
window.fetch = originalFetch;
};
}
// Usage -- add auth header to every request
var restore = createFetchInterceptor({
request: [
function (url, options) {
var token = localStorage.getItem("authToken");
if (token) {
options.headers = options.headers || {};
options.headers["Authorization"] = "Bearer " + token;
}
return { url: url, options: options };
}
],
response: [
function (response) {
if (response.status === 401) {
window.location.href = "/login";
}
return response;
}
]
});
Credential Handling and CORS
Fetch does not send cookies by default for cross-origin requests. You must opt in:
// Same-origin -- cookies sent automatically
fetch("/api/profile");
// Cross-origin -- must set credentials
fetch("https://api.example.com/profile", {
credentials: "include" // Send cookies cross-origin
});
// credentials options:
// "omit" -- never send cookies
// "same-origin" -- default, cookies for same-origin only
// "include" -- always send cookies
// CORS request with custom headers
fetch("https://api.example.com/data", {
method: "POST",
headers: {
"Content-Type": "application/json",
"X-Custom-Header": "value"
},
credentials: "include",
mode: "cors" // default for cross-origin, explicit here for clarity
});
Remember that the server must set Access-Control-Allow-Credentials: true and cannot use Access-Control-Allow-Origin: * when credentials are included. It must echo the specific origin.
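To make the server-side requirement concrete, here is a minimal sketch of the logic a server would run per request -- a pure function from the request's Origin header to the CORS response headers. The allowlist is illustrative; the rule it encodes is the real one: with credentials, echo the exact origin, never "*".

```javascript
// Illustrative allowlist -- replace with your own origins
var ALLOWED_ORIGINS = ["https://app.example.com", "https://admin.example.com"];

function corsHeaders(requestOrigin) {
  if (ALLOWED_ORIGINS.indexOf(requestOrigin) === -1) {
    // Unknown origin: send no CORS headers and the browser blocks the read
    return {};
  }
  return {
    // Echo the exact origin -- "*" is invalid when credentials are allowed
    "Access-Control-Allow-Origin": requestOrigin,
    "Access-Control-Allow-Credentials": "true",
    // Tell caches not to reuse this response for a different origin
    "Vary": "Origin"
  };
}
```

The Vary: Origin header matters in practice: without it, a shared cache could serve one origin's Access-Control-Allow-Origin value to another.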
Caching with the Cache API
The Cache API pairs naturally with Fetch for offline-first patterns and custom caching:
function cachedFetch(url, options, cacheName, maxAge) {
cacheName = cacheName || "api-cache";
maxAge = maxAge || 300000; // 5 minutes
return caches.open(cacheName).then(function (cache) {
return cache.match(url).then(function (cachedResponse) {
if (cachedResponse) {
var cachedTime = cachedResponse.headers.get("X-Cached-At");
if (cachedTime && (Date.now() - parseInt(cachedTime, 10)) < maxAge) {
return cachedResponse;
}
}
return fetch(url, options).then(function (response) {
if (response.ok) {
// Clone because we need to consume the body twice
var cloned = response.clone();
var headers = new Headers(cloned.headers);
headers.set("X-Cached-At", String(Date.now()));
var cachedResp = new Response(cloned.body, {
status: cloned.status,
statusText: cloned.statusText,
headers: headers
});
cache.put(url, cachedResp);
}
return response;
});
});
});
}
Request and Response Cloning
Since response bodies can only be consumed once, cloning is essential when you need to read the body multiple times:
fetch("/api/data").then(function (response) {
var clone = response.clone();
// Read body as JSON for processing
response.json().then(function (data) {
console.log("Processed:", data);
});
// Read clone as text for logging
clone.text().then(function (text) {
console.log("Raw response:", text);
});
});
Cloning also works for requests, which is useful for retry logic where you need to re-send the same request.
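A sketch of that retry pattern: build the Request once, then clone it before every attempt, because a request body stream can only be sent once. The endpoint is illustrative; note that Node's fetch implementation requires the duplex option when a Request carries a body (browsers only require it for stream bodies and ignore it otherwise).

```javascript
// Build the request once; each attempt gets a fresh clone of the body
var orderRequest = new Request("https://api.example.com/orders", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ sku: "ABC", qty: 2 }),
  duplex: "half" // required by Node's undici when a Request has a body
});

function sendWithRetry(request, maxRetries) {
  function attempt(n) {
    // clone() must happen before the body is consumed by fetch
    return fetch(request.clone()).then(function (response) {
      if (response.status >= 500 && n < maxRetries) {
        return attempt(n + 1);
      }
      return response;
    });
  }
  return attempt(0);
}

// sendWithRetry(orderRequest, 2).then(function (r) { return r.json(); });
```

If you clone after a fetch has already consumed the original body, clone() throws -- always clone first and send the clone.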
Fetch vs Axios
I get asked this constantly. Here is my honest take:
| Feature | Fetch | Axios |
|---|---|---|
| Built-in | Yes (browser + Node 18+) | No (50KB+ dependency) |
| HTTP error rejection | No (manual check) | Yes (automatic) |
| Request cancellation | AbortController | CancelToken / AbortController |
| Interceptors | Manual wrapper | Built-in |
| Timeout | Manual AbortController | Built-in option |
| Progress events | ReadableStream | onUploadProgress / onDownloadProgress |
| Automatic JSON | No (manual parse) | Yes |
| Request transforms | Manual | Built-in |
| Browser support | All modern browsers (no IE) | IE11 with polyfills |
My recommendation: use Fetch for new projects. The "missing" features are trivially implemented (as shown throughout this article), and you eliminate a dependency. Axios still makes sense if you need upload progress events (Fetch cannot track upload progress in browsers) or if your team wants a batteries-included HTTP client immediately.
Node.js Fetch (Undici)
Since Node.js 18, fetch is available globally, powered by the Undici HTTP client. It is not a polyfill -- it is a high-performance, spec-compliant implementation.
// server.js -- Node.js 18+
// No imports needed -- fetch is global
function fetchFromService(endpoint) {
return fetch("http://internal-service:3000" + endpoint, {
headers: {
"X-Service-Name": "my-app",
"X-Request-Id": generateRequestId()
}
}).then(function (response) {
if (!response.ok) {
throw new Error("Service error: " + response.status);
}
return response.json();
});
}
function generateRequestId() {
return Date.now().toString(36) + Math.random().toString(36).slice(2, 11);
}
One gotcha: Node.js fetch does not support the credentials option (no cookie jar by default). For cookie management in Node, use the undici package directly or a library like tough-cookie.
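For simple cases you can pass a session cookie through by hand. This is a minimal sketch -- one cookie, no expiry, domain, or path handling, so anything beyond a single login flow should use a real cookie jar. The URLs and flow are illustrative.

```javascript
// Extract "name=value" from a Set-Cookie header value (everything before
// the first ";" -- the rest is attributes like Path and HttpOnly)
function sessionCookie(setCookieHeader) {
  return (setCookieHeader || "").split(";")[0];
}

// Log in, capture the session cookie, and send it on the next request.
// Node's fetch does none of this for you.
function loginThenFetch(loginUrl, dataUrl, creds) {
  return fetch(loginUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(creds)
  }).then(function (loginRes) {
    var cookie = sessionCookie(loginRes.headers.get("set-cookie"));
    return fetch(dataUrl, { headers: { "Cookie": cookie } });
  });
}
```

Note that headers.get("set-cookie") can join multiple cookies into one string; this sketch only handles the single-cookie case.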
Complete Working Example: Production API Client
Here is the API client module I use as a starting point in production projects. It wraps Fetch with automatic JSON handling, error normalization, retry logic, interceptors, timeout support, and token refresh.
// api-client.js -- Production-ready Fetch wrapper
// Works in browser and Node.js 18+
var ApiClient = (function () {
function ApiClient(config) {
this.baseUrl = config.baseUrl || "";
this.timeout = config.timeout || 30000;
this.maxRetries = config.maxRetries || 2;
this.retryDelay = config.retryDelay || 1000;
this.headers = config.headers || {};
this.requestInterceptors = [];
this.responseInterceptors = [];
this._getToken = config.getToken || null;
this._refreshToken = config.refreshToken || null;
this._isRefreshing = false;
this._refreshQueue = [];
}
// Register interceptors
ApiClient.prototype.onRequest = function (fn) {
this.requestInterceptors.push(fn);
return this;
};
ApiClient.prototype.onResponse = function (fn) {
this.responseInterceptors.push(fn);
return this;
};
// Core request method
ApiClient.prototype.request = function (method, path, options) {
var self = this;
options = options || {};
var url = self.baseUrl + path;
var controller = new AbortController();
var timeoutId = setTimeout(function () {
controller.abort();
}, options.timeout || self.timeout);
var fetchOptions = {
method: method,
headers: Object.assign({}, self.headers, options.headers || {}),
signal: controller.signal
};
// Auto-set JSON content type for object bodies
if (options.body && typeof options.body === "object" && !(options.body instanceof FormData)) {
fetchOptions.headers["Content-Type"] = "application/json";
fetchOptions.body = JSON.stringify(options.body);
} else if (options.body) {
fetchOptions.body = options.body;
}
// Add auth token
if (self._getToken) {
var token = self._getToken();
if (token) {
fetchOptions.headers["Authorization"] = "Bearer " + token;
}
}
// Run request interceptors
self.requestInterceptors.forEach(function (interceptor) {
var result = interceptor(url, fetchOptions);
if (result) {
url = result.url || url;
fetchOptions = result.options || fetchOptions;
}
});
function executeRequest(retryCount) {
return fetch(url, fetchOptions)
.then(function (response) {
clearTimeout(timeoutId);
// Run response interceptors
self.responseInterceptors.forEach(function (interceptor) {
response = interceptor(response) || response;
});
// Handle 401 with token refresh
if (response.status === 401 && self._refreshToken && retryCount === 0) {
return self._handleTokenRefresh().then(function (newToken) {
fetchOptions.headers["Authorization"] = "Bearer " + newToken;
// Normalize like any other response so callers get a consistent shape
return fetch(url, fetchOptions).then(function (retried) {
return self._normalizeResponse(retried);
});
});
}
// Retry on 5xx and 429
if ((response.status >= 500 || response.status === 429) && retryCount < self.maxRetries) {
var delay = self.retryDelay * Math.pow(2, retryCount);
var jitter = Math.random() * delay * 0.3;
return new Promise(function (resolve) {
setTimeout(resolve, delay + jitter);
}).then(function () {
return executeRequest(retryCount + 1);
});
}
return self._normalizeResponse(response);
})
.catch(function (err) {
clearTimeout(timeoutId);
if (err.name === "AbortError") {
var timeoutErr = new Error("Request timeout after " + (options.timeout || self.timeout) + "ms");
timeoutErr.code = "TIMEOUT";
throw timeoutErr;
}
// Retry network errors
if (retryCount < self.maxRetries) {
var delay = self.retryDelay * Math.pow(2, retryCount);
return new Promise(function (resolve) {
setTimeout(resolve, delay);
}).then(function () {
return executeRequest(retryCount + 1);
});
}
var networkErr = new Error("Network error: " + err.message);
networkErr.code = "NETWORK_ERROR";
throw networkErr;
});
}
return executeRequest(0);
};
// Normalize response into consistent format
ApiClient.prototype._normalizeResponse = function (response) {
var contentType = response.headers.get("Content-Type") || "";
var parseBody;
if (contentType.indexOf("application/json") !== -1) {
parseBody = response.json();
} else {
parseBody = response.text();
}
return parseBody.then(function (body) {
if (!response.ok) {
var error = new Error("HTTP " + response.status + ": " + (body.message || response.statusText));
error.status = response.status;
error.body = body;
error.headers = response.headers;
throw error;
}
return {
status: response.status,
headers: response.headers,
data: body
};
});
};
// Token refresh with queue to prevent concurrent refreshes
ApiClient.prototype._handleTokenRefresh = function () {
var self = this;
if (self._isRefreshing) {
return new Promise(function (resolve, reject) {
self._refreshQueue.push({ resolve: resolve, reject: reject });
});
}
self._isRefreshing = true;
return self._refreshToken().then(function (newToken) {
self._isRefreshing = false;
self._refreshQueue.forEach(function (pending) {
pending.resolve(newToken);
});
self._refreshQueue = [];
return newToken;
}).catch(function (err) {
self._isRefreshing = false;
self._refreshQueue.forEach(function (pending) {
pending.reject(err);
});
self._refreshQueue = [];
throw err;
});
};
// Convenience methods
ApiClient.prototype.get = function (path, options) {
return this.request("GET", path, options);
};
ApiClient.prototype.post = function (path, body, options) {
return this.request("POST", path, Object.assign({}, options, { body: body }));
};
ApiClient.prototype.put = function (path, body, options) {
return this.request("PUT", path, Object.assign({}, options, { body: body }));
};
ApiClient.prototype.patch = function (path, body, options) {
return this.request("PATCH", path, Object.assign({}, options, { body: body }));
};
ApiClient.prototype.del = function (path, options) {
return this.request("DELETE", path, options);
};
return ApiClient;
})();
// ============================================
// Usage Examples
// ============================================
// Browser usage
var api = new ApiClient({
baseUrl: "https://api.example.com",
timeout: 15000,
maxRetries: 2,
headers: {
"X-App-Version": "2.1.0"
},
getToken: function () {
return localStorage.getItem("accessToken");
},
refreshToken: function () {
return fetch("https://api.example.com/auth/refresh", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
refreshToken: localStorage.getItem("refreshToken")
})
})
.then(function (r) { return r.json(); })
.then(function (data) {
localStorage.setItem("accessToken", data.accessToken);
return data.accessToken;
});
}
});
// Add request logging interceptor
api.onRequest(function (url, options) {
console.log("[API]", options.method, url);
return { url: url, options: options };
});
// Add response logging interceptor
api.onResponse(function (response) {
console.log("[API] Response:", response.status, response.url);
return response;
});
// Make requests
api.get("/users/me").then(function (result) {
console.log("User:", result.data);
});
api.post("/orders", { items: [{ sku: "ABC", qty: 2 }] })
.then(function (result) {
console.log("Order created:", result.data.id);
})
.catch(function (err) {
console.error("Failed:", err.status, err.body);
});
// Node.js usage (same code, different token strategy)
// var api = new ApiClient({
// baseUrl: "http://localhost:3000",
// timeout: 10000,
// headers: { "X-Service": "order-processor" },
// getToken: function () { return process.env.SERVICE_TOKEN; }
// });
This client handles the five most common production issues: timeouts, retries, token refresh races, error normalization, and consistent JSON handling. It adds zero dependencies because it is pure Fetch.
Common Issues and Troubleshooting
1. "TypeError: Failed to fetch" with no other details
This is a CORS error, a network error, or a mixed content block. Open DevTools Network tab to see the actual cause. Fetch intentionally hides details of CORS failures for security. Check that your server sends the correct Access-Control-Allow-Origin header and that you are not making HTTP requests from an HTTPS page.
2. "Body has already been consumed" error
You tried to call .json(), .text(), or another body method twice on the same Response. The body is a stream and can only be read once. Clone the response first:
// Fix: clone before consuming
var clone = response.clone();
response.json(); // first read
clone.text(); // second read on clone
3. POST request sends empty body -- server receives nothing
You forgot to stringify the JSON body or set the Content-Type header:
// Broken -- body is [object Object]
fetch("/api", { method: "POST", body: { name: "test" } });
// Fixed
fetch("/api", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ name: "test" })
});
4. Cookies not sent on cross-origin requests
Fetch does not include cookies for cross-origin requests by default. Add credentials: "include" to the fetch options. Also verify the server response includes Access-Control-Allow-Credentials: true and specifies the exact origin (not *).
5. AbortError thrown after component unmounts (React/framework memory leak)
Cancel in-flight requests when components unmount to prevent state updates on unmounted components:
// In a component lifecycle or useEffect equivalent
var controller = new AbortController();
fetch("/api/data", { signal: controller.signal })
.then(function (r) { return r.json(); })
.then(function (data) {
// update state
})
.catch(function (err) {
if (err.name !== "AbortError") {
console.error(err);
}
});
// On unmount
controller.abort();
Best Practices
- Always check response.ok -- Fetch does not throw on HTTP errors. Wrap every fetch call with a status check, or use a client like the one above that normalizes errors automatically.
- Set timeouts on every request -- Fetch will hang forever waiting for a response if the server stalls. Use AbortController with a timeout on every production request. I default to 30 seconds for API calls and 60 seconds for file uploads.
- Do not retry non-idempotent requests blindly -- Retrying a failed POST can create duplicate records. Only retry when you know the request was not processed (network errors before the server responded) or when the server explicitly tells you to retry (429, 503 with Retry-After).
- Clone responses before caching -- Response bodies are streams. If you consume the body and then try to cache the response, the cached version will have an empty body. Always clone before reading.
- Avoid monkey-patching global fetch in libraries -- The interceptor pattern shown above is fine for application code. If you are writing a library, create a wrapper function instead. Monkey-patching fetch in a library will affect every other library on the page.
- Use AbortSignal.any() to combine cancellation sources -- When you need both a timeout and a manual cancel button, combine signals rather than managing multiple controllers.
- Stream large responses instead of buffering -- Calling .json() or .text() on a 500MB response will buffer the entire thing in memory. Use response.body.getReader() and process chunks incrementally for large payloads.
- Set appropriate Content-Type headers -- Sending JSON without application/json causes silent parsing failures on many servers. Sending FormData with a manually set Content-Type breaks the multipart boundary. Know when to set it and when to let the browser handle it.
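Combining cancellation sources looks like this in practice. AbortSignal.any() needs a recent runtime (Node 20+ or a current browser); on older runtimes, fall back to a single AbortController aborted by both triggers. The endpoint is illustrative.

```javascript
// One signal aborted by whichever fires first: user cancel or timeout
var userController = new AbortController();
var combined = AbortSignal.any([
  userController.signal,
  AbortSignal.timeout(10000)
]);

function loadReport() {
  return fetch("/api/report", { signal: combined })
    .then(function (r) { return r.json(); })
    .catch(function (err) {
      if (err.name === "AbortError") {
        console.log("Cancelled by user");
      } else if (err.name === "TimeoutError") {
        console.log("Timed out");
      } else {
        throw err;
      }
    });
}

// Wire the cancel button:
// cancelButton.onclick = function () { userController.abort(); };
```

Note the distinct error names: a manual abort() produces AbortError, while AbortSignal.timeout() produces TimeoutError, so you can report the two cases differently.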