Profile First to Avoid Guessing
Use Node.js's built-in Performance Hooks API (`perf_hooks`) to measure exact timings with only light instrumentation. Run the measurement several times to confirm the slowness is reproducible (e.g., a consistent `took: 1200 ms`) and pinpoint the slowest function. Profiling reveals the true bottleneck instead of a guess, preventing wasted fixes on non-issues.
Solve Database N+1 Queries and Indexing
The N+1 problem, one initial query plus one additional query per returned item inside a loop, often dominates slowdowns. Replace it with a single relational fetch or SQL join that loads all the data at once. Add indexes on frequently queried fields, which can slash query times from around 1 s to 100 ms; unindexed lookups on large tables force full table scans.
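A self-contained sketch of the pattern and its fix, using an in-memory stub in place of a real database so the round-trip counts are visible; `queryOrdersForUser` and `queryOrdersForUsers` are hypothetical stand-ins for your driver or ORM calls:

```javascript
// Stub data standing in for two database tables.
const users = [{ id: 1 }, { id: 2 }, { id: 3 }];
const orders = [
  { id: 10, userId: 1 }, { id: 11, userId: 2 }, { id: 12, userId: 3 },
];

let queryCount = 0;

// Each call stands in for one round trip to the database.
function queryOrdersForUser(userId) {
  queryCount++;
  return orders.filter((o) => o.userId === userId);
}
// Single round trip, e.g. WHERE user_id IN (...) or a JOIN.
// An index helps here: CREATE INDEX idx_orders_user_id ON orders(user_id);
function queryOrdersForUsers(userIds) {
  queryCount++;
  return orders.filter((o) => userIds.includes(o.userId));
}

// N+1: one query per user in a loop, N extra round trips.
queryCount = 0;
for (const user of users) queryOrdersForUser(user.id);
console.log('N+1 round trips:', queryCount); // 3

// Fixed: one batched query for all users.
queryCount = 0;
queryOrdersForUsers(users.map((u) => u.id));
console.log('batched round trips:', queryCount); // 1
```

With 3 users the difference is small; with 1,000 rows the loop version issues 1,000 extra round trips, which is where the 1 s to 100 ms improvements come from.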
Cache Repeated Queries and Parallelize Logic
For mostly static data queried on every request, such as product lists or dashboard aggregates, store the results in Redis so subsequent calls hit the cache instead of the database. Parallelize independent sequential operations with `Promise.all` rather than awaiting inside a for loop, but cap concurrency for large arrays (e.g., 1,000 items) with a limiter like `p-limit` to avoid overloading downstream services. Also watch for blocking synchronous code, big loops, and heavy JSON parsing that stall Node's single thread: keep the event loop free.
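A sketch of both ideas, cache-aside plus bounded parallelism. A `Map` stands in for Redis so the example runs standalone (a real setup would use a Redis client with GET/SET and a TTL), `fetchProducts` is a hypothetical stand-in for an expensive query, and the hand-rolled `limit` helper mimics what the `p-limit` package does:

```javascript
const cache = new Map(); // stand-in for Redis

let dbHits = 0;
async function fetchProducts() {
  dbHits++; // pretend this is an expensive DB query
  return [{ id: 1, name: 'widget' }];
}

// Cache-aside: check the cache first, fall back to the DB on a miss.
async function getProducts() {
  if (cache.has('products')) return cache.get('products');
  const products = await fetchProducts();
  cache.set('products', products);
  return products;
}

// Minimal concurrency limiter (same idea as p-limit): at most `max`
// tasks in flight; the rest wait in a queue.
function limit(max) {
  let active = 0;
  const queue = [];
  const next = () => {
    if (active >= max || queue.length === 0) return;
    active++;
    const { fn, resolve, reject } = queue.shift();
    fn().then(resolve, reject).finally(() => { active--; next(); });
  };
  return (fn) => new Promise((resolve, reject) => {
    queue.push({ fn, resolve, reject });
    next();
  });
}

async function main() {
  await getProducts();
  await getProducts(); // second call served from cache
  console.log('db hits:', dbHits); // 1

  // 1000 tasks, at most 10 in flight, instead of a serial await loop.
  const run = limit(10);
  const ids = Array.from({ length: 1000 }, (_, i) => i);
  const results = await Promise.all(ids.map((id) => run(async () => id * 2)));
  console.log('results:', results.length); // 1000
}

main();
```

In production, remember cache invalidation: set a TTL or clear the key when the underlying data changes.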
Trim Payloads, Timeout Externals, and Compress
Request only the fields you need (e.g., `/users?fields=id,name`) to cut data volume. Set timeouts on external API calls, with error handling, so a slow third party cannot hang a request indefinitely. Enable gzip/Brotli compression middleware: it can turn a 500 KB JSON response into 50-100 KB over the wire, speeding up perceived response time without touching core logic.