
out of memory

This error occurs when PostgreSQL cannot allocate enough memory for an operation. Learn about memory configuration and query optimization.

PostgreSQL raises this error when a backend process requests memory from the operating system and the allocation fails, typically during a sort, hash join, or aggregation that outgrows the available RAM.

Understanding the Error

ERROR: out of memory
DETAIL: Failed on request of size 1048576 in memory context "ExprContext".

The DETAIL line reports the size of the failed allocation (here 1048576 bytes, i.e. 1MB) and the memory context it came from. The query or operation needed more memory than the server could provide.
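In application code it can help to recognize this condition and respond, for example by retrying with a smaller batch. PostgreSQL reports out-of-memory with SQLSTATE 53200 (class 53, insufficient resources). A minimal sketch, assuming a node-postgres-style error object that exposes the SQLSTATE as err.code:

```javascript
// SQLSTATE 53200 = out_of_memory (class 53: insufficient resources).
// node-postgres-style clients surface it as err.code on the thrown error.
function isOutOfMemory(err) {
  return Boolean(err) && err.code === '53200';
}

// Hypothetical usage: shrink the batch and retry instead of crashing.
// try { await db.query(bigQuery); }
// catch (err) {
//   if (isOutOfMemory(err)) { /* retry with a smaller LIMIT */ }
//   else throw err;
// }
```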

Common Causes

1. Large Sort Operations

SQL
-- Sorting huge result sets
SELECT * FROM huge_table ORDER BY multiple_columns;

2. Complex Aggregations

SQL
-- Many groups in GROUP BY
SELECT user_id, COUNT(*) FROM events GROUP BY user_id;

3. Insufficient work_mem

The default work_mem is only 4MB, which is often too small for large sorts and hash operations.

4. Too Many Connections

Each connection can use up to work_mem for every sort or hash operation in its queries, so too many concurrent connections can exhaust available RAM.
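The arithmetic behind this cause can be sketched as a rough worst-case estimate. The formula below is a simplification (real usage depends on the actual query plans), and the node counts are illustrative assumptions:

```javascript
// Rough worst-case memory estimate, in MB (a sketch, not an exact model):
// each connection may allocate up to work_mem per sort/hash node,
// on top of the shared_buffers pool shared by all backends.
function estimateWorstCaseMB({ maxConnections, workMemMB, sortNodesPerQuery, sharedBuffersMB }) {
  const perConnectionMB = workMemMB * sortNodesPerQuery;
  return sharedBuffersMB + maxConnections * perConnectionMB;
}

// 100 connections, 64MB work_mem, ~2 sort/hash nodes per query, 4GB shared_buffers:
const totalMB = estimateWorstCaseMB({
  maxConnections: 100,
  workMemMB: 64,
  sortNodesPerQuery: 2,
  sharedBuffersMB: 4096,
});
console.log(totalMB); // 16896 MB, more than a 16GB machine has
```

This is why raising work_mem and max_connections together is risky: the two multiply.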

5. Large Hash Joins

SQL
SELECT * FROM table_a a
JOIN table_b b ON a.id = b.id;  -- Hash join needs memory

How to Fix It

Solution 1: Increase work_mem

SQL
-- Per session
SET work_mem = '256MB';

-- For a specific query (SET LOCAL only takes effect inside a transaction)
BEGIN;
SET LOCAL work_mem = '512MB';
SELECT ...;
COMMIT;  -- work_mem reverts automatically when the transaction ends

In postgresql.conf:

work_mem = '64MB'  # Default is 4MB

Solution 2: Optimize the Query

SQL
-- Add LIMIT
SELECT * FROM huge_table ORDER BY created_at DESC LIMIT 100;

-- Add WHERE clause
SELECT * FROM events WHERE created_at > NOW() - INTERVAL '7 days';

-- Use index for sorting
CREATE INDEX idx_table_sort ON huge_table(created_at DESC);

Solution 3: Reduce Hash Operations

SQL
-- Force a merge join instead of a hash join
SET enable_hashjoin = off;
SELECT ...;
RESET enable_hashjoin;
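When toggling planner settings from application code, scoping the change with SET LOCAL inside a transaction keeps it from leaking to other statements on the same pooled connection. A hypothetical helper that builds the statement sequence (the client and query are assumptions):

```javascript
// Build the statements that disable hash joins for one query only.
// SET LOCAL reverts automatically at COMMIT, so nothing to re-enable.
function withHashJoinDisabled(sql) {
  return [
    'BEGIN',
    'SET LOCAL enable_hashjoin = off',
    sql,
    'COMMIT',
  ];
}

// Hypothetical usage with a node-postgres-style client:
// for (const stmt of withHashJoinDisabled('SELECT * FROM table_a a JOIN table_b b ON a.id = b.id')) {
//   await client.query(stmt);
// }
```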

Solution 4: Configure shared_buffers

In postgresql.conf:

# Typically 25% of system RAM
shared_buffers = '4GB'

Solution 5: Limit Connections

In postgresql.conf:

max_connections = 100  # Reduce if too high

Use connection pooling:

# PgBouncer configuration
max_client_conn = 1000
default_pool_size = 20
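A default_pool_size around 20 reflects a common sizing heuristic: roughly twice the CPU core count, far below max_client_conn. A purely illustrative sketch of that rule of thumb (tune for your workload; this is an assumption, not a PgBouncer rule):

```javascript
// Common heuristic (an assumption, not a hard rule): size the
// server-side pool near 2x CPU cores, with a small floor.
function suggestPoolSize(cpuCores) {
  return Math.max(4, cpuCores * 2);
}

console.log(suggestPoolSize(10)); // 20, matching default_pool_size above
```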

Solution 6: Process in Batches

JAVASCRIPT
// Instead of loading all data, fetch it in batches.
// Keyset pagination (tracking the last id seen) stays fast as the
// table grows and, unlike OFFSET without an ORDER BY, never skips
// or repeats rows between batches.
const batchSize = 1000;
let lastId = 0;

while (true) {
  // node-postgres-style client: the result object exposes .rows
  const { rows } = await db.query(
    'SELECT * FROM huge_table WHERE id > $1 ORDER BY id LIMIT $2',
    [lastId, batchSize]
  );
  if (rows.length === 0) break;

  // Process rows
  lastId = rows[rows.length - 1].id;
}

Memory Configuration

SQL
-- Check current settings
SHOW work_mem;
SHOW shared_buffers;
SHOW maintenance_work_mem;

-- Recommended values for 16GB RAM:
-- shared_buffers = 4GB
-- work_mem = 64MB
-- maintenance_work_mem = 1GB
-- effective_cache_size = 12GB
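The recommended values above follow simple fractions of total RAM (about 25% for shared_buffers, 75% for effective_cache_size, 1/16 for maintenance_work_mem). A sketch of those heuristics as a sizing helper; the fractions are rules of thumb from this table, not hard limits:

```javascript
// Sizing heuristics from the table above (assumptions, not hard rules):
// shared_buffers ~25% of RAM, effective_cache_size ~75%,
// maintenance_work_mem ~1/16 of RAM, work_mem a modest fixed slice.
function recommendSettings(totalRamGB) {
  return {
    shared_buffers: `${Math.round(totalRamGB * 0.25)}GB`,
    effective_cache_size: `${Math.round(totalRamGB * 0.75)}GB`,
    maintenance_work_mem: `${Math.max(1, Math.round(totalRamGB / 16))}GB`,
    work_mem: '64MB', // per sort/hash node per connection, so keep it small
  };
}

console.log(recommendSettings(16));
// { shared_buffers: '4GB', effective_cache_size: '12GB',
//   maintenance_work_mem: '1GB', work_mem: '64MB' }
```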

Best Practices

  1. Size work_mem appropriately for your workload
  2. Use connection pooling to limit concurrent connections
  3. Monitor active queries with pg_stat_activity and per-backend memory with pg_backend_memory_contexts (PostgreSQL 14+)
  4. Optimize queries before increasing memory
  5. Use EXPLAIN ANALYZE to see whether sorts spill to disk (Sort Method: external merge)