
Rate Limiting

Imagine a busy restaurant where one customer tries to order 100 meals at once, blocking everyone else. That's what happens when a single client makes too many requests too quickly – it can overwhelm your server and deny service to others. **Rate limiting** prevents this by controlling how many requests a client can make in a given time period.

Why Rate Limiting?

  • Prevent Brute Force Attacks: Limit login attempts to stop password guessing.
  • Prevent DDoS: Mitigate denial-of-service attacks.
  • API Fairness: Ensure fair usage among all clients.
  • Cost Control: Protect against excessive API usage.

Think of rate limiting as a bouncer at a club – only letting in a certain number of people per minute to keep things manageable inside.
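To make the idea concrete, here is a minimal fixed-window limiter in plain Node.js. It is an illustrative sketch of what libraries like `express-rate-limit` do for you; the `FixedWindowLimiter` name and its API are invented for this example, not part of any library.

```javascript
// A minimal fixed-window rate limiter: each key (e.g. a client IP) gets
// at most `max` requests per `windowMs`-long window.
class FixedWindowLimiter {
  constructor({ windowMs, max }) {
    this.windowMs = windowMs;
    this.max = max;
    this.hits = new Map(); // key -> { count, windowStart }
  }

  // Returns true if the request identified by `key` is allowed.
  allow(key, now = Date.now()) {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request ever, or the previous window has expired: start fresh.
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}

// Usage: at most 3 requests per second per client.
const limiter = new FixedWindowLimiter({ windowMs: 1000, max: 3 });
console.log(limiter.allow('1.2.3.4', 0));    // true
console.log(limiter.allow('1.2.3.4', 10));   // true
console.log(limiter.allow('1.2.3.4', 20));   // true
console.log(limiter.allow('1.2.3.4', 30));   // false (4th hit in the window)
console.log(limiter.allow('1.2.3.4', 1000)); // true  (new window)
```

Real libraries add eviction of stale entries and pluggable stores, but the counting logic is essentially this.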

Express Rate Limit Middleware

The most popular rate limiting library for Express is `express-rate-limit`.
npm install express-rate-limit

Basic Rate Limiting
const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();

// Create a limiter
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // Limit each IP to 100 requests per windowMs
  message: 'Too many requests from this IP, please try again later.',
  standardHeaders: true, // Return rate limit info in the RateLimit-* headers
  legacyHeaders: false, // Disable the X-RateLimit-* headers
});

// Apply to all routes
app.use(limiter);

Different Limits for Different Routes

You might want stricter limits on login routes.
// Strict limiter for auth routes
const authLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 5, // Only 5 attempts per 15 minutes
  skipSuccessfulRequests: true, // Don't count successful logins
});

// Loose limiter for the public API
const apiLimiter = rateLimit({
  windowMs: 60 * 1000, // 1 minute
  max: 60, // 60 requests per minute
});

// Apply different limiters
app.use('/api/auth', authLimiter);
app.use('/api', apiLimiter);

Advanced Configuration Options
const limiter = rateLimit({
  windowMs: 60 * 1000,
  max: 10,

  // Custom key generator (default is IP)
  keyGenerator: (req) => {
    return req.user?.id || req.ip; // Rate limit by user ID if authenticated
  },

  // Skip certain requests
  skip: (req) => {
    return req.path === '/health'; // Don't rate limit health checks
  },

  // Custom handler when the limit is exceeded
  handler: (req, res) => {
    res.status(429).json({
      error: 'Rate limit exceeded',
      // resetTime is a Date; report the seconds remaining until the window resets
      retryAfter: Math.ceil((req.rateLimit.resetTime - Date.now()) / 1000)
    });
  }
});
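The `keyGenerator` logic is easy to check in isolation. Here it is as a standalone function, using plain objects as stand-ins for Express request objects (the mock requests below carry only the fields the function reads):

```javascript
// Prefer a stable user ID for authenticated clients so they share one
// bucket across devices; fall back to the IP for anonymous traffic.
function rateLimitKey(req) {
  return req.user?.id || req.ip;
}

console.log(rateLimitKey({ ip: '203.0.113.7' }));                      // '203.0.113.7'
console.log(rateLimitKey({ ip: '203.0.113.7', user: { id: 'u42' } })); // 'u42'
```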

Slow Down Middleware

Instead of blocking requests, you can slow them down using `express-slow-down`. This gradually increases response time for excessive requests.
npm install express-slow-down
const slowDown = require('express-slow-down');

const speedLimiter = slowDown({
  windowMs: 15 * 60 * 1000, // 15 minutes
  delayAfter: 10, // Allow 10 requests without delay
  delayMs: (hits) => hits * 500, // Add 500ms of delay per hit
  maxDelayMs: 10000, // Cap the delay at 10 seconds
});

app.use('/api', speedLimiter);
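The resulting delay curve can be sketched as a pure function. This mirrors the intent of the configuration above (500 ms per request beyond the free allowance, capped at 10 seconds); it is not `express-slow-down`'s internal code, and the `delayFor` helper is invented for this example:

```javascript
// How long the Nth request in a window would be delayed, given a free
// allowance of `delayAfter` requests, a linear `step`, and a cap.
function delayFor(requestNumber, { delayAfter = 10, step = 500, maxDelayMs = 10000 } = {}) {
  if (requestNumber <= delayAfter) return 0;    // within the free allowance
  const hitsOver = requestNumber - delayAfter;  // 1st delayed hit, 2nd, ...
  return Math.min(hitsOver * step, maxDelayMs); // linear growth, capped
}

console.log(delayFor(10)); // 0     (within the free allowance)
console.log(delayFor(11)); // 500   (first delayed request)
console.log(delayFor(15)); // 2500
console.log(delayFor(40)); // 10000 (capped at maxDelayMs)
```

The cap matters: without it, a flood of requests would produce delays so long that the client's connection times out anyway, tying up server resources in the meantime.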

Redis Store for Distributed Rate Limiting

If you have multiple server instances, use Redis to share rate limit data.
npm install rate-limit-redis redis
const redis = require('redis');
const { RedisStore } = require('rate-limit-redis'); // v4+ uses a named export

const client = redis.createClient({
  url: process.env.REDIS_URL
});
client.connect().catch(console.error);

const limiter = rateLimit({
  store: new RedisStore({
    sendCommand: (...args) => client.sendCommand(args),
  }),
  windowMs: 15 * 60 * 1000,
  max: 100,
});

Response Headers

When rate limiting is active (with `standardHeaders: true`), clients receive headers like these:
RateLimit-Limit: 100
RateLimit-Remaining: 99
RateLimit-Reset: 58
Retry-After: 50

`RateLimit-Reset` gives the number of seconds until the current window resets; `Retry-After` is only sent once the limit has been exceeded.
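On the client side, a well-behaved consumer reads these headers and backs off before retrying. A minimal sketch, assuming `Retry-After` is sent in seconds (as `express-rate-limit` does) and using a plain object of lowercased header names; the `retryDelayMs` helper is invented for this example:

```javascript
// Given a response status and headers, how many milliseconds should the
// client wait before retrying? Falls back to a default when the server
// sends a 429 without a usable Retry-After value.
function retryDelayMs(status, headers, fallbackMs = 1000) {
  if (status !== 429) return 0; // not rate limited: no wait needed
  const retryAfter = Number(headers['retry-after']);
  return Number.isFinite(retryAfter) ? retryAfter * 1000 : fallbackMs;
}

console.log(retryDelayMs(200, {}));                      // 0     (not limited)
console.log(retryDelayMs(429, { 'retry-after': '50' })); // 50000
console.log(retryDelayMs(429, {}));                      // 1000  (fallback)
```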

Two Minute Drill

  • Rate limiting prevents abuse by controlling request frequency.
  • Use `express-rate-limit` for simple rate limiting.
  • Set stricter limits on sensitive routes like login.
  • Use `express-slow-down` to gradually slow down excessive clients instead of blocking.
  • For distributed systems, use Redis to share rate limit data.

Need more clarification?

Drop us an email at career@quipoinfotech.com