```bash
# Fetch countries and cache the response locally - this data rarely changes
curl -H "X-API-Key: YOUR_API_KEY" \
  "https://api.worlddataapi.com/v1/countries" \
  -o countries_cache.json

# Delete the cached file once it is older than 30 days, forcing a refetch
find countries_cache.json -mtime +30 -exec rm {} \;
```
```javascript
// Different data types need different cache durations
const CACHE_TTL = {
  countries: 30 * 24 * 60 * 60 * 1000, // 30 days - rarely change
  timezones: 7 * 24 * 60 * 60 * 1000, // 7 days - DST rules update occasionally
  holidays: 24 * 60 * 60 * 1000, // 24 hours - check daily for updates
  climate: 30 * 24 * 60 * 60 * 1000, // 30 days - historical data, stable
  emergency: 7 * 24 * 60 * 60 * 1000, // 7 days - changes rarely
};
```
```python
# Different data types need different cache durations
CACHE_TTL = {
    "countries": 30 * 24 * 60 * 60,  # 30 days - rarely change
    "timezones": 7 * 24 * 60 * 60,   # 7 days - DST rules update occasionally
    "holidays": 24 * 60 * 60,        # 24 hours - check daily for updates
    "climate": 30 * 24 * 60 * 60,    # 30 days - historical data, stable
    "emergency": 7 * 24 * 60 * 60,   # 7 days - changes rarely
}
```
Reference data APIs return information that changes infrequently—country codes, timezone definitions, holiday dates. Smart caching reduces API calls, speeds up your app, and enables offline functionality. But cache too long and you serve stale data. This guide covers strategies for getting the balance right.
## The Challenge
Caching reference data sounds straightforward until you consider the details. Countries rarely change, but holidays update annually. Timezone rules shift when governments decide to change DST policies. Cache everything for 30 days and you miss holiday announcements. Cache nothing and you waste API calls on data that has not changed in months.
The core problem: different data types have different volatility, but most caching implementations treat all API responses identically. You need a strategy that matches cache duration to data change frequency while handling network failures gracefully.
## Prerequisites
Before implementing these caching strategies, you need:
- A World Data API account with an API key (free tier works for testing)
- Basic familiarity with your platform's storage APIs (localStorage, IndexedDB, or file system)
- Understanding of async/await patterns in your language of choice
- For service worker examples: HTTPS hosting (service workers require secure contexts)
## Understanding Data Volatility
Not all reference data changes at the same rate:
| Data Type | Change Frequency | Recommended TTL | Notes |
|---|---|---|---|
| Countries | Rarely (years) | 30 days | New countries are rare events |
| Regions | Rarely (years) | 30 days | Administrative boundaries stable |
| Cities | Rarely | 7-30 days | Population data may update |
| Timezones | Occasionally | 7 days | DST rules change sometimes |
| Holidays | Annually | 24 hours | Check daily, new data in October |
| Climate | Stable | 30 days | Historical averages, static |
| Emergency | Rarely | 7 days | Numbers rarely change |
| Power/Travel | Rarely | 7-30 days | Infrastructure data stable |
## Basic Cache Implementation
```javascript
class APICache {
  constructor() {
    this.memoryCache = new Map();
    this.ttlConfig = {
      countries: 30 * 24 * 60 * 60 * 1000,
      regions: 30 * 24 * 60 * 60 * 1000,
      cities: 7 * 24 * 60 * 60 * 1000,
      timezones: 7 * 24 * 60 * 60 * 1000,
      holidays: 24 * 60 * 60 * 1000,
      travel: 7 * 24 * 60 * 60 * 1000,
      sun: 24 * 60 * 60 * 1000,
      moon: 24 * 60 * 60 * 1000,
    };
  }

  getCacheKey(endpoint, params = {}) {
    const sortedParams = Object.keys(params)
      .sort()
      .map((k) => `${k}=${params[k]}`)
      .join("&");
    return `${endpoint}:${sortedParams}`;
  }

  getTTL(endpoint) {
    // Extract data type from endpoint (e.g., "/v1/countries" -> "countries")
    const dataType = endpoint
      .split("/")
      .find((p) => Object.keys(this.ttlConfig).includes(p));
    return this.ttlConfig[dataType] || 60 * 60 * 1000; // Default 1 hour
  }

  async get(endpoint, params = {}, fetchFn) {
    const key = this.getCacheKey(endpoint, params);
    const ttl = this.getTTL(endpoint);

    // Check memory cache
    const cached = this.memoryCache.get(key);
    if (cached && Date.now() - cached.timestamp < ttl) {
      return cached.data;
    }

    // Check localStorage
    try {
      const stored = localStorage.getItem(`api:${key}`);
      if (stored) {
        const { data, timestamp } = JSON.parse(stored);
        if (Date.now() - timestamp < ttl) {
          // Restore to memory cache
          this.memoryCache.set(key, { data, timestamp });
          return data;
        }
      }
    } catch (e) {
      // localStorage might be unavailable, full, or hold corrupted JSON
    }

    // Fetch fresh data
    const data = await fetchFn();

    // Store in both caches
    const entry = { data, timestamp: Date.now() };
    this.memoryCache.set(key, entry);
    try {
      localStorage.setItem(`api:${key}`, JSON.stringify(entry));
    } catch (e) {
      // Quota exceeded - evict old entries, then retry the write once
      this.pruneStorage();
      try {
        localStorage.setItem(`api:${key}`, JSON.stringify(entry));
      } catch {}
    }
    return data;
  }

  pruneStorage() {
    // Remove oldest entries when storage is full
    const entries = [];
    for (let i = 0; i < localStorage.length; i++) {
      const key = localStorage.key(i);
      if (key?.startsWith("api:")) {
        try {
          const { timestamp } = JSON.parse(localStorage.getItem(key));
          entries.push({ key, timestamp });
        } catch {}
      }
    }
    // Remove oldest 25%
    entries
      .sort((a, b) => a.timestamp - b.timestamp)
      .slice(0, Math.ceil(entries.length / 4))
      .forEach((e) => localStorage.removeItem(e.key));
  }

  invalidate(pattern) {
    // Invalidate matching keys in memory
    for (const key of this.memoryCache.keys()) {
      if (key.includes(pattern)) {
        this.memoryCache.delete(key);
      }
    }
    // ...and in localStorage (iterate backwards while removing)
    for (let i = localStorage.length - 1; i >= 0; i--) {
      const key = localStorage.key(i);
      if (key?.startsWith("api:") && key.includes(pattern)) {
        localStorage.removeItem(key);
      }
    }
  }
}

const cache = new APICache();
```
```python
import json
import time
from pathlib import Path
from typing import Any, Callable, Optional


class APICache:
    def __init__(self, cache_dir: str = ".cache"):
        self.memory_cache: dict = {}
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(exist_ok=True)
        self.ttl_config = {
            "countries": 30 * 24 * 60 * 60,
            "regions": 30 * 24 * 60 * 60,
            "cities": 7 * 24 * 60 * 60,
            "timezones": 7 * 24 * 60 * 60,
            "holidays": 24 * 60 * 60,
            "travel": 7 * 24 * 60 * 60,
            "sun": 24 * 60 * 60,
            "moon": 24 * 60 * 60,
        }

    def get_cache_key(self, endpoint: str, params: Optional[dict] = None) -> str:
        params = params or {}
        sorted_params = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
        return f"{endpoint}:{sorted_params}".replace("/", "_")

    def get_ttl(self, endpoint: str) -> int:
        for data_type in self.ttl_config:
            if data_type in endpoint:
                return self.ttl_config[data_type]
        return 60 * 60  # Default 1 hour

    def get(self, endpoint: str, params: Optional[dict] = None, fetch_fn: Callable = None) -> Any:
        key = self.get_cache_key(endpoint, params)
        ttl = self.get_ttl(endpoint)

        # Check memory cache
        if key in self.memory_cache:
            entry = self.memory_cache[key]
            if time.time() - entry["timestamp"] < ttl:
                return entry["data"]

        # Check file cache
        cache_file = self.cache_dir / f"{key}.json"
        if cache_file.exists():
            try:
                with open(cache_file) as f:
                    entry = json.load(f)
                if time.time() - entry["timestamp"] < ttl:
                    self.memory_cache[key] = entry
                    return entry["data"]
            except (json.JSONDecodeError, KeyError):
                pass  # Corrupted cache entry - fall through and refetch

        # Fetch fresh data
        data = fetch_fn()
        entry = {"data": data, "timestamp": time.time()}

        # Store in both caches
        self.memory_cache[key] = entry
        with open(cache_file, "w") as f:
            json.dump(entry, f)
        return data

    def invalidate(self, pattern: str):
        # Invalidate memory cache
        keys_to_delete = [k for k in self.memory_cache if pattern in k]
        for key in keys_to_delete:
            del self.memory_cache[key]
        # Invalidate file cache
        for cache_file in self.cache_dir.glob(f"*{pattern}*.json"):
            cache_file.unlink()


cache = APICache()
```
## Stale-While-Revalidate Pattern
Return cached data immediately, then refresh in background:
```javascript
class SWRCache extends APICache {
  async get(endpoint, params = {}, fetchFn) {
    const key = this.getCacheKey(endpoint, params);
    const ttl = this.getTTL(endpoint);

    // Check for cached data (even if stale)
    let cached = this.memoryCache.get(key);
    if (!cached) {
      try {
        const stored = localStorage.getItem(`api:${key}`);
        if (stored) {
          cached = JSON.parse(stored);
          this.memoryCache.set(key, cached);
        }
      } catch {}
    }

    const isStale = !cached || Date.now() - cached.timestamp > ttl;

    if (cached && isStale) {
      // Return stale data immediately, refresh in background
      // (swallow refresh errors - the stale copy already satisfied the caller)
      this.refresh(key, fetchFn).catch(() => {});
      return cached.data;
    }
    if (cached) {
      return cached.data;
    }
    // No cache at all - must wait for fetch
    return this.refresh(key, fetchFn);
  }

  async refresh(key, fetchFn) {
    try {
      const data = await fetchFn();
      const entry = { data, timestamp: Date.now() };
      this.memoryCache.set(key, entry);
      try {
        localStorage.setItem(`api:${key}`, JSON.stringify(entry));
      } catch {}
      return data;
    } catch (error) {
      // If refresh fails, return stale data if available
      const cached = this.memoryCache.get(key);
      if (cached) return cached.data;
      throw error;
    }
  }
}
```
```python
import threading
from typing import Any, Callable, Optional


class SWRCache(APICache):
    def get(self, endpoint: str, params: Optional[dict] = None, fetch_fn: Callable = None) -> Any:
        key = self.get_cache_key(endpoint, params)
        ttl = self.get_ttl(endpoint)

        # Check for cached data (even if stale)
        cached = self.memory_cache.get(key)
        if not cached:
            cache_file = self.cache_dir / f"{key}.json"
            if cache_file.exists():
                try:
                    with open(cache_file) as f:
                        cached = json.load(f)
                    self.memory_cache[key] = cached
                except json.JSONDecodeError:
                    cached = None  # Corrupted entry - treat as a cache miss

        is_stale = not cached or (time.time() - cached["timestamp"]) > ttl

        if cached and is_stale:
            # Return stale data immediately, refresh in background
            threading.Thread(
                target=self._refresh, args=(key, fetch_fn), daemon=True
            ).start()
            return cached["data"]
        if cached:
            return cached["data"]
        # No cache at all - must wait for fetch
        return self._refresh(key, fetch_fn)

    def _refresh(self, key: str, fetch_fn: Callable) -> Any:
        try:
            data = fetch_fn()
            entry = {"data": data, "timestamp": time.time()}
            self.memory_cache[key] = entry
            cache_file = self.cache_dir / f"{key}.json"
            with open(cache_file, "w") as f:
                json.dump(entry, f)
            return data
        except Exception:
            # If refresh fails, return stale data if available
            cached = self.memory_cache.get(key)
            if cached:
                return cached["data"]
            raise
```
## React Hook for Cached Data
```jsx
import { useEffect, useState } from "react";

// Assumes `cache` (an APICache instance) and API_KEY are defined elsewhere
function useCachedAPI(endpoint, params = {}, dependencies = []) {
  const [data, setData] = useState(null);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState(null);
  const [isStale, setIsStale] = useState(false);

  useEffect(() => {
    let cancelled = false;

    const fetchData = async () => {
      const key = cache.getCacheKey(endpoint, params);
      const ttl = cache.getTTL(endpoint);

      // Check cache first
      const cached = cache.memoryCache.get(key);
      if (cached) {
        setData(cached.data);
        setIsStale(Date.now() - cached.timestamp > ttl);
        setLoading(false);
        if (Date.now() - cached.timestamp <= ttl) {
          return; // Fresh cache, no need to fetch
        }
      }

      // Fetch fresh data
      try {
        const fresh = await cache.get(endpoint, params, async () => {
          const response = await fetch(
            `https://api.worlddataapi.com${endpoint}?${new URLSearchParams(params)}`,
            { headers: { "X-API-Key": API_KEY } },
          );
          if (!response.ok) throw new Error("API request failed");
          return response.json();
        });
        if (!cancelled) {
          setData(fresh);
          setIsStale(false);
          setError(null);
        }
      } catch (e) {
        if (!cancelled) {
          setError(e);
          // Keep stale data visible if available
        }
      } finally {
        if (!cancelled) {
          setLoading(false);
        }
      }
    };

    fetchData();
    return () => {
      cancelled = true;
    };
  }, [endpoint, JSON.stringify(params), ...dependencies]);

  return { data, loading, error, isStale };
}

// Usage
function CountryList() {
  const { data, loading, error, isStale } = useCachedAPI("/v1/countries");

  return (
    <div>
      {isStale && <div className="stale-notice">Updating data...</div>}
      {data?.data?.map((country) => (
        <div key={country.code}>{country.name}</div>
      ))}
    </div>
  );
}
```
## Preloading and Warming Cache
Load data before users need it:
```javascript
class PreloadingCache extends SWRCache {
  constructor() {
    super();
    this.preloadQueue = [];
    this.isPreloading = false;
  }

  // Preload commonly needed data
  async warmCache(userContext = {}) {
    const preloadList = [];

    // Always preload countries (small dataset, frequently needed)
    preloadList.push({ endpoint: "/v1/countries", params: {} });

    // Preload user's country data
    if (userContext.country) {
      preloadList.push({
        endpoint: "/v1/regions",
        params: { country: userContext.country },
      });
      preloadList.push({
        endpoint: `/v1/travel/${userContext.country}`,
        params: {},
      });
    }

    // Preload common timezones
    preloadList.push({ endpoint: "/v1/timezones", params: {} });

    // Execute preloads with low priority
    await this.preloadBatch(preloadList);
  }

  async preloadBatch(items) {
    this.preloadQueue.push(...items);
    if (this.isPreloading) return;
    this.isPreloading = true;

    while (this.preloadQueue.length > 0) {
      const item = this.preloadQueue.shift();
      try {
        // Use requestIdleCallback if available for non-blocking preload
        await new Promise((resolve) => {
          if ("requestIdleCallback" in window) {
            requestIdleCallback(() => resolve(), { timeout: 1000 });
          } else {
            setTimeout(resolve, 10);
          }
        });

        await this.get(item.endpoint, item.params, async () => {
          const url = `https://api.worlddataapi.com${item.endpoint}?${new URLSearchParams(item.params)}`;
          const response = await fetch(url, {
            headers: { "X-API-Key": API_KEY },
          });
          return response.json();
        });
      } catch (e) {
        // Preload failure is non-critical
        console.warn("Preload failed:", item.endpoint);
      }
    }
    this.isPreloading = false;
  }
}

// Initialize on app load
const cache = new PreloadingCache();
cache.warmCache({ country: "US" });
```
## Service Worker Caching
For true offline support:
```javascript
// sw.js
const CACHE_NAME = "worlddata-api-v1";
const API_URL_PATTERN = /worlddataapi\.com\/v1\//;

// Data type to TTL mapping (in seconds)
const TTL_MAP = {
  countries: 30 * 24 * 60 * 60,
  regions: 30 * 24 * 60 * 60,
  cities: 7 * 24 * 60 * 60,
  timezones: 7 * 24 * 60 * 60,
  holidays: 24 * 60 * 60,
  travel: 7 * 24 * 60 * 60,
  sun: 24 * 60 * 60,
  moon: 24 * 60 * 60,
};

function getDataType(url) {
  const path = new URL(url).pathname;
  for (const type of Object.keys(TTL_MAP)) {
    if (path.includes(type)) return type;
  }
  return null;
}

function getTTL(url) {
  const dataType = getDataType(url);
  return dataType ? TTL_MAP[dataType] : 60 * 60; // Default 1 hour
}

function getCachedTime(response) {
  // Entries missing the timestamp header are treated as expired
  const cachedAt = response.headers.get("sw-cached-at");
  return cachedAt ? new Date(cachedAt).getTime() : 0;
}

self.addEventListener("fetch", (event) => {
  const { request } = event;
  if (!API_URL_PATTERN.test(request.url)) {
    return;
  }
  event.respondWith(handleAPIRequest(request));
});

async function handleAPIRequest(request) {
  const cache = await caches.open(CACHE_NAME);
  const cachedResponse = await cache.match(request);

  if (cachedResponse) {
    const cachedTime = getCachedTime(cachedResponse);
    const ttl = getTTL(request.url) * 1000;
    const isStale = Date.now() - cachedTime > ttl;

    if (!isStale) {
      return cachedResponse;
    }
    // Stale - return cached but refresh in background
    refreshCache(request, cache);
    return cachedResponse;
  }

  // No cache - fetch and cache
  try {
    return await fetchAndCache(request, cache);
  } catch (error) {
    // Network failed, no cache - return error
    return new Response(
      JSON.stringify({
        error: "Offline",
        message: "No cached data available",
      }),
      { status: 503, headers: { "Content-Type": "application/json" } },
    );
  }
}

async function fetchAndCache(request, cache) {
  const response = await fetch(request);
  if (response.ok) {
    const clonedResponse = response.clone();
    const headers = new Headers(clonedResponse.headers);
    headers.set("sw-cached-at", new Date().toISOString());
    const cachedResponse = new Response(await clonedResponse.blob(), {
      status: clonedResponse.status,
      statusText: clonedResponse.statusText,
      headers,
    });
    cache.put(request, cachedResponse);
  }
  return response;
}

async function refreshCache(request, cache) {
  try {
    await fetchAndCache(request, cache);
  } catch {
    // Refresh failed - keep stale cache
  }
}

// Clean old entries periodically
self.addEventListener("activate", (event) => {
  event.waitUntil(cleanOldCache());
});

async function cleanOldCache() {
  const cache = await caches.open(CACHE_NAME);
  const requests = await cache.keys();

  for (const request of requests) {
    const response = await cache.match(request);
    const cachedTime = getCachedTime(response);
    const maxAge = 30 * 24 * 60 * 60 * 1000; // Max 30 days
    if (Date.now() - cachedTime > maxAge) {
      await cache.delete(request);
    }
  }
}
```
## Holiday Cache Strategy
Holidays need special handling—they're date-specific and update annually:
```javascript
class HolidayCache {
  constructor() {
    this.cache = new Map();
    this.TTL = 24 * 60 * 60 * 1000; // 24 hours
  }

  getCacheKey(country, year) {
    return `holidays:${country}:${year}`;
  }

  async getHolidays(country, year) {
    const key = this.getCacheKey(country, year);

    // Check cache
    const cached = this.cache.get(key);
    if (cached && Date.now() - cached.timestamp < this.TTL) {
      return cached.data;
    }

    // Fetch fresh
    const response = await fetch(
      `https://api.worlddataapi.com/v1/holidays/${country}?year=${year}`,
      { headers: { "X-API-Key": API_KEY } },
    );
    if (!response.ok) {
      throw new Error(`Holiday request failed: ${response.status}`);
    }
    const data = await response.json();

    // Cache in memory
    this.cache.set(key, { data: data.holidays, timestamp: Date.now() });

    // Persist to localStorage for offline
    try {
      localStorage.setItem(
        key,
        JSON.stringify({
          data: data.holidays,
          timestamp: Date.now(),
        }),
      );
    } catch {}

    return data.holidays;
  }

  // Preload next year's holidays when approaching year end
  async preloadUpcomingYear(country) {
    const now = new Date();
    const currentYear = now.getFullYear();
    // If it's October or later, preload next year
    if (now.getMonth() >= 9) {
      await this.getHolidays(country, currentYear + 1);
    }
  }

  // Warm cache with commonly needed holiday data
  async warmCache(countries) {
    const currentYear = new Date().getFullYear();
    const years = [currentYear, currentYear + 1];

    const preloads = countries.flatMap((country) =>
      years.map((year) => this.getHolidays(country, year)),
    );
    await Promise.allSettled(preloads);
  }
}
```
```python
import json
import time
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime
from pathlib import Path

import requests


class HolidayCache:
    def __init__(self, api_key: str, cache_dir: str = ".cache"):
        self.api_key = api_key
        self.cache = {}
        self.ttl = 24 * 60 * 60  # 24 hours
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(exist_ok=True)

    def get_cache_key(self, country: str, year: int) -> str:
        return f"holidays:{country}:{year}"

    def get_holidays(self, country: str, year: int) -> list:
        key = self.get_cache_key(country, year)

        # Check memory cache
        if key in self.cache:
            entry = self.cache[key]
            if time.time() - entry["timestamp"] < self.ttl:
                return entry["data"]

        # Check file cache
        cache_file = self.cache_dir / f"{key.replace(':', '_')}.json"
        if cache_file.exists():
            with open(cache_file) as f:
                entry = json.load(f)
            if time.time() - entry["timestamp"] < self.ttl:
                self.cache[key] = entry
                return entry["data"]

        # Fetch fresh
        response = requests.get(
            f"https://api.worlddataapi.com/v1/holidays/{country}",
            params={"year": year},
            headers={"X-API-Key": self.api_key},
        )
        response.raise_for_status()
        data = response.json()
        holidays = data.get("holidays", [])

        # Cache in memory and file
        entry = {"data": holidays, "timestamp": time.time()}
        self.cache[key] = entry
        with open(cache_file, "w") as f:
            json.dump(entry, f)
        return holidays

    def preload_upcoming_year(self, country: str):
        now = datetime.now()
        current_year = now.year
        # If October or later, preload next year
        if now.month >= 10:
            self.get_holidays(country, current_year + 1)

    def warm_cache(self, countries: list):
        current_year = datetime.now().year
        years = [current_year, current_year + 1]
        with ThreadPoolExecutor(max_workers=4) as executor:
            for country in countries:
                for year in years:
                    executor.submit(self.get_holidays, country, year)
```
You can also use curl to fetch and cache holiday data:
```bash
# Fetch holidays for a country and year
curl -H "X-API-Key: YOUR_API_KEY" \
  "https://api.worlddataapi.com/v1/holidays/US?year=2025" \
  -o .cache/holidays_US_2025.json

# Simple cache check: refetch if the file is missing or older than a day
CACHE_FILE=".cache/holidays_US_2025.json"
if [ ! -f "$CACHE_FILE" ] || [ -n "$(find "$CACHE_FILE" -mtime +1 -print)" ]; then
  curl -H "X-API-Key: YOUR_API_KEY" \
    "https://api.worlddataapi.com/v1/holidays/US?year=2025" \
    -o "$CACHE_FILE"
fi
```
## Cache Invalidation Strategies
```javascript
class CacheManager {
  constructor(cache) {
    this.cache = cache;
    this.listeners = new Map();
  }

  // Manual invalidation
  invalidateByType(dataType) {
    this.cache.invalidate(dataType);
    this.notifyListeners(dataType);
  }

  // Time-based invalidation check
  startInvalidationTimer() {
    setInterval(
      () => {
        this.checkAndInvalidate();
      },
      60 * 60 * 1000,
    ); // Check every hour
  }

  checkAndInvalidate() {
    const now = Date.now();
    for (const [key, entry] of this.cache.memoryCache) {
      // Keys look like "/v1/countries:params" - pull the data type out of
      // the endpoint segment so the TTL lookup actually matches ttlConfig
      const endpoint = key.split(":")[0];
      const dataType = endpoint
        .split("/")
        .find((p) => p in this.cache.ttlConfig);
      const ttl = this.cache.ttlConfig[dataType] || 60 * 60 * 1000;
      if (now - entry.timestamp > ttl) {
        this.cache.memoryCache.delete(key);
        this.notifyListeners(dataType);
      }
    }
  }

  // Event-based invalidation (e.g., app version update)
  invalidateOnVersionChange(currentVersion) {
    const storedVersion = localStorage.getItem("app_version");
    if (storedVersion && storedVersion !== currentVersion) {
      // Clear all caches on app update
      this.cache.memoryCache.clear();
      for (let i = localStorage.length - 1; i >= 0; i--) {
        const key = localStorage.key(i);
        if (key?.startsWith("api:")) {
          localStorage.removeItem(key);
        }
      }
    }
    localStorage.setItem("app_version", currentVersion);
  }

  // Subscribe to cache updates
  subscribe(dataType, callback) {
    if (!this.listeners.has(dataType)) {
      this.listeners.set(dataType, new Set());
    }
    this.listeners.get(dataType).add(callback);
    return () => {
      this.listeners.get(dataType).delete(callback);
    };
  }

  notifyListeners(dataType) {
    const callbacks = this.listeners.get(dataType);
    if (callbacks) {
      callbacks.forEach((cb) => cb());
    }
  }
}
```
## Handling Cache Size Limits
```javascript
class BoundedCache {
  constructor(maxSize = 50 * 1024 * 1024) {
    // 50MB default
    this.maxSize = maxSize;
    this.currentSize = 0;
    this.cache = new Map();
    this.accessOrder = [];
  }

  set(key, data) {
    const size = this.estimateSize(data);

    // Make room if needed (LRU eviction)
    while (
      this.currentSize + size > this.maxSize &&
      this.accessOrder.length > 0
    ) {
      const oldestKey = this.accessOrder.shift();
      const oldEntry = this.cache.get(oldestKey);
      if (oldEntry) {
        this.currentSize -= oldEntry.size;
        this.cache.delete(oldestKey);
      }
    }

    // Remove old entry if updating
    if (this.cache.has(key)) {
      this.currentSize -= this.cache.get(key).size;
      this.accessOrder = this.accessOrder.filter((k) => k !== key);
    }

    // Add new entry
    this.cache.set(key, { data, size, timestamp: Date.now() });
    this.currentSize += size;
    this.accessOrder.push(key);
  }

  get(key) {
    const entry = this.cache.get(key);
    if (entry) {
      // Move to end of access order (most recently used)
      this.accessOrder = this.accessOrder.filter((k) => k !== key);
      this.accessOrder.push(key);
      return entry.data;
    }
    return null;
  }

  estimateSize(data) {
    // Rough estimate of JSON size in bytes
    return new Blob([JSON.stringify(data)]).size;
  }

  getStats() {
    return {
      entries: this.cache.size,
      sizeBytes: this.currentSize,
      sizeMB: (this.currentSize / 1024 / 1024).toFixed(2),
      capacityPercent: ((this.currentSize / this.maxSize) * 100).toFixed(1),
    };
  }
}
```
## Common Pitfalls
**Caching computed data with the wrong key.** Astronomy data (sunrise/sunset times) depends on both location and date. Caching by location alone returns yesterday's times. Always include all parameters that affect the response in your cache key.
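For instance, a sun-times cache key has to carry the date along with the coordinates. A minimal sketch (`buildCacheKey` is an illustrative helper, not part of the API client above):

```javascript
// Build a cache key from the endpoint plus ALL request parameters.
// Omitting `date` here would serve one day's sunrise times forever.
function buildCacheKey(endpoint, params = {}) {
  const sorted = Object.keys(params)
    .sort()
    .map((k) => `${k}=${params[k]}`)
    .join("&");
  return `${endpoint}:${sorted}`;
}

// Two different dates produce two different cache entries:
const a = buildCacheKey("/v1/sun", { lat: 51.5, lon: -0.1, date: "2025-06-01" });
const b = buildCacheKey("/v1/sun", { lat: 51.5, lon: -0.1, date: "2025-06-02" });
console.log(a !== b); // true - each day is cached separately
```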
**Treating all data types the same.** A 30-day TTL works for countries but misses holiday updates. A 1-hour TTL works for holidays but wastes API calls on static country data. Use data-type-specific TTLs as shown in this guide.
**Ignoring storage limits.** localStorage has a 5-10MB limit depending on browser. IndexedDB offers more space but requires async access. Service Worker caches can grow large. Implement LRU eviction before you hit limits.
**Not handling cache corruption.** JSON parse errors from corrupted cache entries can crash your app. Always wrap cache reads in try-catch and fall back to fetching fresh data.
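A small guard like this keeps one bad entry from taking down the app (a sketch; `safeParseEntry` is a name of my own, matching the `{ data, timestamp }` entry shape used throughout this guide):

```javascript
// Parse a cached entry, returning null instead of throwing on corruption.
function safeParseEntry(raw) {
  if (raw == null) return null;
  try {
    const entry = JSON.parse(raw);
    // Validate the shape we expect before trusting it
    if (typeof entry !== "object" || entry === null) return null;
    if (typeof entry.timestamp !== "number" || !("data" in entry)) return null;
    return entry;
  } catch {
    return null; // Corrupted JSON - caller falls back to a fresh fetch
  }
}

console.log(safeParseEntry('{"data":[1],"timestamp":123}').timestamp); // 123
console.log(safeParseEntry("{broken json")); // null
```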
**Caching error responses.** A 500 error cached for 30 days means 30 days of failures. Only cache successful responses (2xx status codes).
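The cheapest guard is to gate cache writes on the status code before anything is persisted, as this sketch shows (`cache.set` in the commented usage is assumed to exist):

```javascript
// Only successful (2xx) responses should ever be written to cache.
// This matches what `response.ok` returns in the Fetch API.
function isCacheableStatus(status) {
  return status >= 200 && status < 300;
}

// Usage inside a fetch wrapper (sketch):
// const response = await fetch(url);
// if (isCacheableStatus(response.status)) {
//   cache.set(key, await response.json());
// }

console.log(isCacheableStatus(200)); // true
console.log(isCacheableStatus(500)); // false
```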
**Forgetting about cache key collisions.** If you cache `/v1/holidays/US?year=2025` and `/v1/holidays/us?year=2025` separately, you waste storage. Normalize cache keys (lowercase, sorted params) to avoid duplicates.
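A normalization helper might look like this (a sketch; note that lowercasing values is only safe when the API treats them case-insensitively, as country codes here are assumed to be):

```javascript
// Normalize the endpoint (lowercase) and sort params so equivalent
// requests always map to a single cache entry.
function normalizeCacheKey(endpoint, params = {}) {
  const sorted = Object.keys(params)
    .sort()
    .map((k) => `${k.toLowerCase()}=${String(params[k]).toLowerCase()}`)
    .join("&");
  return `${endpoint.toLowerCase()}:${sorted}`;
}

// "US" and "us" now share one entry:
console.log(
  normalizeCacheKey("/v1/holidays/US", { year: 2025 }) ===
    normalizeCacheKey("/v1/holidays/us", { YEAR: 2025 }),
); // true
```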
**Not versioning caches.** When your app updates, old cached data formats may cause issues. Clear caches on app version changes or use versioned cache names.
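A versioned key prefix is one lightweight way to do this; old-format entries are simply never read back after an upgrade (a sketch; the prefix format and hardcoded `APP_VERSION` are my own conventions - in practice the version would come from your build system):

```javascript
// Bake the app version into every cache key so entries written by an
// older release become unreachable after an upgrade.
const APP_VERSION = "2.1.0"; // normally injected at build time

function versionedKey(key) {
  return `api:v${APP_VERSION}:${key}`;
}

console.log(versionedKey("/v1/countries:")); // "api:v2.1.0:/v1/countries:"
```

Pair this with a periodic sweep that deletes entries whose prefix does not match the current version, or they will linger in storage until evicted.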
## Summary

- **Match TTL to data volatility:** static data (countries) can cache for weeks; dynamic data (holidays) should check daily.
- **Use stale-while-revalidate:** return cached data immediately for better UX, refresh in background.
- **Layer your caches:** memory → localStorage → Service Worker for offline support.
- **Preload intelligently:** warm caches based on user context (their country, planned trips).
- **Handle failures gracefully:** stale data is better than no data when the network fails.
- **Manage cache size:** implement LRU eviction to stay within storage limits.
- **Version your caches:** clear on app updates to avoid stale-data issues.
```javascript
// Recommended cache architecture
const cacheConfig = {
  // Layer 1: Memory (instant, limited size)
  memory: new BoundedCache(10 * 1024 * 1024), // 10MB

  // Layer 2: localStorage (persisted, survives page reload)
  localStorage: {
    maxSize: 5 * 1024 * 1024, // 5MB typical limit
    prefix: "worlddata:",
  },

  // Layer 3: Service Worker (true offline support)
  serviceWorker: {
    cacheName: "worlddata-api-v1",
    maxAge: 30 * 24 * 60 * 60, // 30 days max
  },

  // TTLs by data type (in milliseconds)
  ttl: {
    countries: 30 * 24 * 60 * 60 * 1000,
    regions: 30 * 24 * 60 * 60 * 1000,
    cities: 7 * 24 * 60 * 60 * 1000,
    timezones: 7 * 24 * 60 * 60 * 1000,
    holidays: 24 * 60 * 60 * 1000,
    travel: 7 * 24 * 60 * 60 * 1000,
    sun: 24 * 60 * 60 * 1000,
    moon: 24 * 60 * 60 * 1000,
    businessDays: 24 * 60 * 60 * 1000,
  },
};
```
Effective caching transforms your API integration from a network bottleneck into a responsive, offline-capable feature. Start with data-type-specific TTLs, add stale-while-revalidate for better UX, and layer your caches for resilience. The patterns in this guide work for any reference data API, but they are particularly effective for data that changes infrequently like countries, timezones, and holidays.
Ready to implement these caching strategies? Get your free World Data API key and start building with countries, holidays, timezones, and more. The free tier includes 60 requests per day—enough to develop and test your caching implementation.
## Next Steps

- Adding International Holiday Support to Your App - caching holiday data
- How to Build a Timezone-Aware Application - timezone data patterns
- Building a Country Selector Dropdown - caching reference data