Memoization
Problem
Components recalculate expensive operations on every render, even when inputs have not changed, wasting CPU cycles computing identical results repeatedly. The effect is visible lag while typing in search boxes, scrolling through long lists, or performing any frequent interaction, because filtering, sorting, and transformations run dozens of times per second unnecessarily. Sorting a dataset of thousands of items, for example, recalculates on every parent re-render even though the data hasn’t changed.
Complex filtering operations with multiple conditions execute repeatedly with identical inputs. Data transformations such as aggregating statistics, formatting numbers, or generating derived values run on every component update. Creating event handler functions or callback references inline on every render causes child components to re-render unnecessarily because reference-equality checks fail. Expensive regex operations, date formatting, and string manipulations run redundantly, and pure functions that always return the same output for the same inputs are recalculated wastefully.
Solution
Cache expensive computation results and reuse them when inputs have not changed, using shallow equality checks on dependencies to determine if recalculation is needed.
Memoize computed values using framework-specific hooks like useMemo in React, computed properties in Vue, or reactive statements in Svelte. Wrap components with React.memo to prevent re-renders when props haven’t changed.
Create stable callback references with useCallback to avoid triggering child re-renders. Implement custom memoization for vanilla JavaScript using Maps or WeakMaps to cache results by argument values.
This skips redundant work and improves performance for calculations that would otherwise run on every render, making expensive operations viable in frequently updating components. Memoize only genuinely expensive operations - shallow array operations, simple arithmetic, and object property access don’t benefit from caching and only add overhead.
Example
This example demonstrates memoization that caches computation results based on function arguments, avoiding redundant calculations when called with the same inputs.
React useMemo
```jsx
import { useMemo } from 'react';

function ProductList({ products, category, sortBy }) {
  // useMemo caches the filtered result; the expensive filter
  // only runs when a dependency changes
  const filteredProducts = useMemo(() => {
    console.log('Filtering products...'); // Only logs when dependencies change
    return products.filter(p => p.category === category);
  }, [products, category]); // Recompute only when these change

  // Chain memoizations for multi-step transformations
  const sortedProducts = useMemo(() => {
    console.log('Sorting products...');
    return [...filteredProducts].sort((a, b) =>
      a[sortBy] > b[sortBy] ? 1 : -1
    );
  }, [filteredProducts, sortBy]);

  return (
    <ul>
      {sortedProducts.map(p => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```
React useCallback
```jsx
import React, { useCallback } from 'react';

function TodoList({ todos, onToggle }) {
  // Without useCallback, a new function would be created on every render,
  // causing TodoItem to re-render even when its todo hasn't changed
  const handleToggle = useCallback((id) => {
    onToggle(id);
  }, [onToggle]); // Function reference stays stable unless onToggle changes

  return (
    <ul>
      {todos.map(todo => (
        // TodoItem is wrapped in React.memo, so it won't re-render
        // if its props are unchanged
        <TodoItem
          key={todo.id}
          todo={todo}
          onToggle={handleToggle}
        />
      ))}
    </ul>
  );
}

const TodoItem = React.memo(({ todo, onToggle }) => {
  console.log('TodoItem rendered:', todo.id);
  return (
    <li onClick={() => onToggle(todo.id)}>
      {todo.text}
    </li>
  );
});
```
Component Memoization
```jsx
import React from 'react';

// Wrap a component to prevent re-renders when props haven't changed
const ExpensiveComponent = React.memo(({ data, filter }) => {
  console.log('ExpensiveComponent rendered');
  const processedData = data.map(item => ({
    ...item,
    computed: expensiveCalculation(item)
  }));
  return (
    <div>
      {processedData.map(item => (
        <div key={item.id}>{item.computed}</div>
      ))}
    </div>
  );
});

// Custom comparison function for complex props
const ExpensiveComponentWithCustomCompare = React.memo(
  ({ data, config }) => {
    // Component implementation
  },
  (prevProps, nextProps) => {
    // Return true if props are equal (skip the re-render)
    return (
      prevProps.data.length === nextProps.data.length &&
      prevProps.config.id === nextProps.config.id
    );
  }
);
```
Vue Computed Properties
```vue
<template>
  <ul>
    <li v-for="product in filteredProducts" :key="product.id">
      {{ product.name }} - {{ formatPrice(product.price) }}
    </li>
  </ul>
</template>

<script>
export default {
  props: ['products', 'category'],
  computed: {
    // Computed properties are automatically memoized
    filteredProducts() {
      console.log('Computing filtered products');
      return this.products.filter(p => p.category === this.category);
    },
    // Can depend on other computed properties
    productCount() {
      return this.filteredProducts.length;
    },
    // Expensive formatting cached
    formattedTotal() {
      return this.filteredProducts
        .reduce((sum, p) => sum + p.price, 0)
        .toLocaleString('en-US', { style: 'currency', currency: 'USD' });
    }
  },
  methods: {
    // Methods are NOT memoized - called on every render
    formatPrice(price) {
      return price.toLocaleString('en-US', {
        style: 'currency',
        currency: 'USD'
      });
    }
  }
};
</script>
```
Vue Composition API
```vue
<script setup>
import { computed } from 'vue';

const props = defineProps(['products', 'category']);

// computed() creates a memoized value
const filteredProducts = computed(() => {
  console.log('Filtering...');
  return props.products.filter(p => p.category === props.category);
});

const sortedProducts = computed(() => {
  console.log('Sorting...');
  return [...filteredProducts.value].sort((a, b) => a.name.localeCompare(b.name));
});
</script>

<template>
  <div>
    <p>Count: {{ filteredProducts.length }}</p>
    <ul>
      <li v-for="product in sortedProducts" :key="product.id">
        {{ product.name }}
      </li>
    </ul>
  </div>
</template>
```
Svelte Reactive Statements
```svelte
<script>
  export let products;
  export let category;
  export let sortBy;

  // Reactive statements re-run only when their dependencies change
  $: filteredProducts = products.filter(p => p.category === category);

  // Can depend on other reactive values
  $: sortedProducts = [...filteredProducts].sort((a, b) =>
    a[sortBy] > b[sortBy] ? 1 : -1
  );

  // Complex reactive blocks
  $: {
    console.log('Filtered products changed:', filteredProducts.length);
  }
</script>

<ul>
  {#each sortedProducts as product (product.id)}
    <li>{product.name}</li>
  {/each}
</ul>
```
Vanilla JavaScript Memoization
```javascript
// Framework-agnostic memoization
function memoize(fn) {
  // Store cached results in a Map
  const cache = new Map();
  return function(...args) {
    // Create a cache key from the arguments
    const key = JSON.stringify(args);
    // Return the cached result if available
    if (cache.has(key)) {
      console.log('Cache hit');
      return cache.get(key);
    }
    console.log('Cache miss, computing...');
    // Compute the result and store it for future calls
    const result = fn(...args);
    cache.set(key, result);
    return result;
  };
}

// Usage: wrap an expensive function with memoization
const filterProducts = memoize((products, category) => {
  return products.filter(p => p.category === category);
});

// First call computes and caches the result
const filtered1 = filterProducts(products, 'electronics');
// Subsequent calls with the same args return the cached result instantly
const filtered2 = filterProducts(products, 'electronics'); // Cache hit!
```
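The solution above also mentions WeakMaps. For functions that take a single object argument, a WeakMap-keyed cache avoids JSON.stringify entirely and lets cached entries be garbage-collected along with their key objects. A minimal sketch (the `memoizeByRef` name and the `order` shape are illustrative, not part of the pattern above):

```javascript
// Memoize a single-object-argument function by object identity.
// Entries disappear automatically when the key object is garbage-collected.
function memoizeByRef(fn) {
  const cache = new WeakMap();
  return function(obj) {
    if (cache.has(obj)) {
      return cache.get(obj);
    }
    const result = fn(obj);
    cache.set(obj, result);
    return result;
  };
}

// Usage: caching is per object reference, not per value
let calls = 0;
const summarize = memoizeByRef(order => {
  calls++;
  return order.items.reduce((sum, i) => sum + i.price, 0);
});

const order = { items: [{ price: 5 }, { price: 7 }] };
summarize(order); // computes: 12
summarize(order); // cache hit, calls is still 1
```

Note the limitation called out in the tradeoffs: WeakMap keys must be objects, so this approach cannot cache results for primitive arguments.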
LRU Cache for Limited Memory
```javascript
class LRUCache {
  constructor(maxSize = 100) {
    this.cache = new Map();
    this.maxSize = maxSize;
  }

  has(key) {
    return this.cache.has(key);
  }

  get(key) {
    if (!this.cache.has(key)) return undefined;
    // Move to end (most recently used)
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    // Remove if already present so re-insertion moves it to the end
    if (this.cache.has(key)) {
      this.cache.delete(key);
    }
    // Add the new entry
    this.cache.set(key, value);
    // Evict the least recently used entry if over the limit
    if (this.cache.size > this.maxSize) {
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }
  }
}

function memoizeWithLRU(fn, maxSize = 100) {
  const cache = new LRUCache(maxSize);
  return function(...args) {
    const key = JSON.stringify(args);
    // Checking has() instead of comparing against undefined
    // avoids recomputing when the cached result is undefined
    if (cache.has(key)) {
      return cache.get(key);
    }
    const result = fn(...args);
    cache.set(key, result);
    return result;
  };
}
```
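To see the eviction behavior in isolation, the sketch below inlines a condensed Map-based LRU memoizer - the same idea as the class above, relying on the fact that a Map iterates keys in insertion order (the `memoizeLRU` and `square` names are illustrative):

```javascript
// Condensed LRU memoizer: Map preserves insertion order, so the first
// key is always the least recently used.
function memoizeLRU(fn, maxSize = 2) {
  const cache = new Map();
  return (...args) => {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      const value = cache.get(key);
      cache.delete(key); // refresh recency
      cache.set(key, value);
      return value;
    }
    const result = fn(...args);
    cache.set(key, result);
    if (cache.size > maxSize) {
      cache.delete(cache.keys().next().value); // evict oldest
    }
    return result;
  };
}

// Usage: with maxSize 2, a third distinct key evicts the least recently used
let computations = 0;
const square = memoizeLRU(n => { computations++; return n * n; }, 2);

square(1); // miss -> computed
square(2); // miss -> computed
square(1); // hit  -> cached, and refreshed as most recent
square(3); // miss -> evicts key 2 (least recently used)
square(2); // miss again: it was evicted, so it is recomputed
// computations is now 4
```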
Selector Pattern
```javascript
// Reselect library pattern for computed state
import { createSelector } from 'reselect';

const getProducts = state => state.products;
const getCategory = state => state.category;

// Memoized selector
const getFilteredProducts = createSelector(
  [getProducts, getCategory],
  (products, category) => {
    console.log('Computing filtered products');
    return products.filter(p => p.category === category);
  }
);

// Chained selectors
const getSortedProducts = createSelector(
  [getFilteredProducts, state => state.sortBy],
  (filtered, sortBy) => {
    console.log('Computing sorted products');
    return [...filtered].sort((a, b) => a[sortBy] > b[sortBy] ? 1 : -1);
  }
);

// Usage
const filtered = getFilteredProducts(state);
const sorted = getSortedProducts(state);
```
Fibonacci with Memoization
```javascript
// Classic example: exponential time -> linear time
const fibonacci = (() => {
  const cache = {};
  return function fib(n) {
    if (n in cache) return cache[n];
    if (n <= 1) return n;
    cache[n] = fib(n - 1) + fib(n - 2);
    return cache[n];
  };
})();

console.log(fibonacci(50)); // Instant with memoization
```
Benefits
- Dramatically improves performance by avoiding expensive recalculations when inputs haven’t changed, making complex operations feasible in frequently updating components.
- Eliminates visible lag during user interactions with heavy computations like filtering large lists or complex transformations.
- Reduces wasted CPU cycles for operations with identical inputs, freeing resources for other work and improving battery life on mobile devices.
- Makes expensive operations viable in render functions that would otherwise cause unacceptable performance degradation.
- Prevents unnecessary child component re-renders by maintaining stable callback and object references.
- Enables complex derived state calculations without performance concerns, simplifying state management.
Tradeoffs
- Adds memory overhead to store cached results - each memoized value requires memory for the cache, which can accumulate with many cached computations.
- Can actually hurt performance if used on cheap operations due to comparison cost - checking dependencies and managing cache overhead exceeds the cost of just recalculating.
- Makes code harder to reason about with implicit caching behavior - understanding when computations run requires knowledge of dependency arrays and equality checks.
- Requires careful dependency management to avoid stale cached values - forgetting dependencies causes bugs where cached results don’t update when they should.
- Dependency arrays can become long and complex for functions with many inputs, making code verbose and error-prone.
- Shallow equality checks can miss nested object changes - changing `user.name` won’t trigger recomputation if the dependency is the `user` object reference, which didn’t change.
- JSON.stringify for cache keys is expensive for large objects and breaks on circular references, functions, or objects with non-enumerable properties.
- WeakMap-based caching only works with object keys, not primitive values, limiting applicability.
- LRU cache implementation adds complexity and requires tuning max size - too small causes cache thrashing, too large wastes memory.
- React.memo prevents a component’s own re-render but not its parent’s - keeping a whole subtree stable often requires memoization at multiple levels.
- Custom equality functions in React.memo add complexity and can be buggy if implemented incorrectly, causing either too many or too few re-renders.
- Computed properties in Vue are synchronous - async operations require watchers instead, which aren’t memoized automatically.
- Memoization breaks if functions have side effects - only pure functions with identical output for identical input should be memoized.
- Debugging memoized code is harder because breakpoints may not hit on every call when cache hits occur.
- Profiling becomes more complex - need to account for cache hits vs misses to understand actual computation time.
- Over-memoization creates “premature optimization” where code complexity increases without measurable performance benefit.
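The stale-dependency pitfall from the tradeoffs above is easy to reproduce outside any framework. The sketch below mimics a dependency check the way useMemo does - comparing each dependency with Object.is - to show why an in-place mutation goes unnoticed (the `memoOnDeps` helper name is illustrative):

```javascript
// Mimics a useMemo-style dependency check: recompute only when some
// dependency fails an Object.is comparison with its previous value.
function memoOnDeps(compute) {
  let lastDeps = null;
  let lastResult;
  return (deps) => {
    const changed =
      lastDeps === null ||
      deps.some((dep, i) => !Object.is(dep, lastDeps[i]));
    if (changed) {
      lastResult = compute();
      lastDeps = deps;
    }
    return lastResult;
  };
}

const user = { name: 'Ada' };
let runs = 0;
const greeting = memoOnDeps(() => { runs++; return `Hello, ${user.name}`; });

greeting([user]);        // 'Hello, Ada' - computed
user.name = 'Grace';     // in-place mutation: same object reference
greeting([user]);        // still 'Hello, Ada' - stale, reference unchanged
greeting([{ ...user }]); // 'Hello, Grace' - new reference forces recompute
```

Replacing objects immutably instead of mutating them, as in the last call, is what keeps reference-based dependency checks honest.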