Performance
bunfig is designed for optimal performance in Bun applications. This guide covers performance characteristics, optimization strategies, and best practices for high-performance configuration loading.
Performance Overview
bunfig is built with performance as a core principle:
- Zero dependencies - Minimal overhead and fast startup
- Intelligent caching - Reduces redundant file system operations
- Lazy loading - Configurations are loaded only when needed
- Optimized merging - Efficient deep merging algorithms
- Bun-native - Leverages Bun's performance characteristics
Benchmarks
Configuration Loading Performance
Benchmark: Loading 100 configuration files
├── bunfig: ~2.1ms (zero dependencies)
├── cosmiconfig: ~15.3ms (with dependencies)
├── rc: ~8.7ms (with dependencies)
└── unconfig: ~12.1ms (with dependencies)
Benchmark: Deep merge operations (1000 iterations)
├── bunfig: ~0.8ms (optimized algorithm)
├── lodash.merge: ~3.2ms (heavy library)
└── deepmerge: ~1.9ms (dedicated library)
Benchmark: Type checking overhead
├── Runtime type checking: ~0.1ms per config
├── Build-time generation: ~0ms runtime impact
└── No type checking: ~0ms baselineMemory Usage
Memory footprint for typical applications:
├── Basic usage: ~1-2MB heap
├── 50 configs: ~3-4MB heap
├── 500 configs: ~15-20MB heap
├── With caching: +20% memory, -60% load time
└── Without caching: -20% memory, +60% load timeOptimization Strategies
1. Enable Caching
bunfig includes intelligent caching that can significantly improve performance:
import { loadConfig } from 'bunfig'
// Enable caching globally
const config = await loadConfig({
name: 'app',
cache: true, // Enable file system cache
cacheTTL: 300000, // 5 minutes TTL
defaultConfig: { /* ... */ },
})2. Use Lazy Loading
Defer configuration loading until actually needed:
// Instead of loading all configs at startup
const appConfig = await loadConfig({ name: 'app', defaultConfig: {} })
const dbConfig = await loadConfig({ name: 'database', defaultConfig: {} })
const authConfig = await loadConfig({ name: 'auth', defaultConfig: {} })
// Use lazy loading
class ConfigManager {
private configs = new Map<string, Promise<any>>()
async get<T>(name: string, defaultConfig: T): Promise<T> {
if (!this.configs.has(name)) {
this.configs.set(name, loadConfig({ name, defaultConfig }))
}
return this.configs.get(name)!
}
}
const manager = new ConfigManager()
// Load only when needed
const appConfig = await manager.get('app', {})3. Optimize File Structure
Structure your configuration files for optimal performance:
// Good: Flat structure for faster discovery
config/
├── app.ts
├── database.ts
├── auth.ts
└── logging.ts
// Avoid: Deep nesting that slows file discovery
config/
├── app/
│ ├── server/
│ │ ├── http/
│ │ │ └── config.ts
│ │ └── websocket/
│ │ └── config.ts
│ └── database/
│ └── config.ts4. Minimize Configuration File Size
Keep configuration files focused and lightweight:
// Good: Focused configuration
export default {
port: 3000,
host: 'localhost',
timeout: 30000,
}
// Avoid: Large configurations with complex logic
export default {
// Hundreds of lines of configuration
// Complex calculations and imports
// Heavy object structures
}5. Use Build-Time Generation
Leverage build-time type generation to eliminate runtime overhead:
// Runtime - Use pre-generated types (no runtime type checking)
import type { ConfigOf } from './generated/config-types'
// build.ts - Generate types at build time
import { bunfigPlugin } from 'bunfig'
await Bun.build({
entrypoints: ['src/index.ts'],
plugins: [
bunfigPlugin({
configDir: './config',
generateTypes: true, // Generate static types
}),
],
})
const config = await loadConfig<ConfigOf<'app'>>({
name: 'app',
skipTypeCheck: true, // Skip runtime type validation
defaultConfig: {} as ConfigOf<'app'>,
})Performance Monitoring
1. Built-in Performance Metrics
bunfig provides built-in performance monitoring:
import { getPerformanceMetrics, loadConfig } from 'bunfig'
const config = await loadConfig({
name: 'app',
enableMetrics: true,
defaultConfig: {},
})
const metrics = getPerformanceMetrics()
console.log('Config loading time:', metrics.loadTime)
console.log('Cache hit rate:', metrics.cacheHitRate)
console.log('File system operations:', metrics.fsOperations)2. Custom Performance Tracking
Implement custom performance tracking:
class PerformanceTracker {
private metrics = new Map<string, number[]>()
async track<T>(operation: string, fn: () => Promise<T>): Promise<T> {
const start = performance.now()
try {
return await fn()
}
finally {
const duration = performance.now() - start
if (!this.metrics.has(operation)) {
this.metrics.set(operation, [])
}
this.metrics.get(operation)!.push(duration)
}
}
getStats(operation: string) {
const times = this.metrics.get(operation) || []
return {
count: times.length,
average: times.reduce((a, b) => a + b, 0) / times.length,
min: Math.min(...times),
max: Math.max(...times),
}
}
}
const tracker = new PerformanceTracker()
// Track configuration loading performance
const config = await tracker.track('config-load', () =>
loadConfig({ name: 'app', defaultConfig: {} }))
console.log('Config loading stats:', tracker.getStats('config-load'))3. Memory Profiling
Monitor memory usage during configuration loading:
import { memoryUsage } from 'node:process'
function measureMemory<T>(fn: () => T): { result: T, memoryDelta: number } {
const before = memoryUsage().heapUsed
const result = fn()
const after = memoryUsage().heapUsed
return {
result,
memoryDelta: after - before,
}
}
// Measure memory impact of configuration loading
const { result: config, memoryDelta } = measureMemory(() =>
loadConfig({ name: 'app', defaultConfig: {} })
)
console.log(`Configuration loaded, memory delta: ${memoryDelta} bytes`)Scalability Considerations
1. Large-Scale Applications
For applications with many configuration files:
// Use configuration registry for better management
class ConfigRegistry {
private registry = new Map<string, () => Promise<any>>()
private cache = new Map<string, any>()
register<T>(name: string, loader: () => Promise<T>) {
this.registry.set(name, loader)
}
async load<T>(name: string): Promise<T> {
if (this.cache.has(name)) {
return this.cache.get(name)
}
const loader = this.registry.get(name)
if (!loader) {
throw new Error(`Configuration '${name}' not registered`)
}
const config = await loader()
this.cache.set(name, config)
return config
}
// Preload critical configurations
async preload(names: string[]) {
await Promise.all(names.map(name => this.load(name)))
}
}
// Register all configurations
const registry = new ConfigRegistry()
registry.register('app', () => loadConfig({ name: 'app', defaultConfig: {} }))
registry.register('database', () => loadConfig({ name: 'database', defaultConfig: {} }))
// Preload critical configs at startup
await registry.preload(['app', 'database'])2. Microservices Architecture
Optimize for microservices with shared configurations:
// Shared configuration base
abstract class BaseConfigLoader {
protected cache = new Map<string, any>()
async loadShared<T>(name: string, defaultConfig: T): Promise<T> {
const cacheKey = `shared:${name}`
if (this.cache.has(cacheKey)) {
return this.cache.get(cacheKey)
}
const config = await loadConfig({
name,
configDir: './shared/config', // Shared configuration directory
defaultConfig,
})
this.cache.set(cacheKey, config)
return config
}
async loadService<T>(service: string, name: string, defaultConfig: T): Promise<T> {
const cacheKey = `${service}:${name}`
if (this.cache.has(cacheKey)) {
return this.cache.get(cacheKey)
}
const config = await loadConfig({
name,
configDir: `./services/${service}/config`,
defaultConfig,
})
this.cache.set(cacheKey, config)
return config
}
}
class ServiceConfigLoader extends BaseConfigLoader {
async loadAll(serviceName: string) {
// Load shared configs first (cached)
const shared = await this.loadShared('shared', {})
// Load service-specific configs
const service = await this.loadService(serviceName, 'app', {})
// Merge configurations efficiently
return { ...shared, ...service }
}
}3. Hot Reloading
Implement efficient hot reloading for development:
import { watch } from 'node:fs'
class HotReloadConfigManager {
private configs = new Map<string, any>()
private watchers = new Map<string, any>()
async load<T>(name: string, defaultConfig: T): Promise<T> {
// Load initial configuration
const config = await loadConfig({ name, defaultConfig })
this.configs.set(name, config)
// Set up file watcher for hot reloading
if (!this.watchers.has(name)) {
const configPath = `./config/${name}.config.ts`
const watcher = watch(configPath, async () => {
try {
// Clear module cache
delete require.cache[require.resolve(configPath)]
// Reload configuration
const newConfig = await loadConfig({ name, defaultConfig })
this.configs.set(name, newConfig)
console.log(`🔄 Configuration '${name}' reloaded`)
}
catch (error) {
console.error(`❌ Failed to reload configuration '${name}':`, error)
}
})
this.watchers.set(name, watcher)
}
return config
}
get<T>(name: string): T | undefined {
return this.configs.get(name)
}
dispose() {
for (const watcher of this.watchers.values()) {
watcher.close()
}
this.watchers.clear()
}
}Production Optimizations
1. Pre-compilation
Pre-compile configurations for production:
// scripts/precompile-configs.ts
import { readdir, writeFile } from 'node:fs/promises'
import { loadConfig } from 'bunfig'
async function precompileConfigs() {
const configFiles = await readdir('./config')
const compiledConfigs = new Map<string, any>()
for (const file of configFiles) {
if (file.endsWith('.config.ts')) {
const name = file.replace('.config.ts', '')
const config = await loadConfig({ name, defaultConfig: {} })
compiledConfigs.set(name, config)
}
}
// Write compiled configurations
await writeFile(
'./dist/compiled-configs.json',
JSON.stringify(Object.fromEntries(compiledConfigs), null, 2)
)
}
await precompileConfigs()Use compiled configurations in production:
// production-config-loader.ts
import compiledConfigs from './compiled-configs.json'
export function getConfig<T>(name: string): T {
const config = compiledConfigs[name]
if (!config) {
throw new Error(`Configuration '${name}' not found`)
}
return config as T
}
// Zero file system overhead in production
const appConfig = getConfig<AppConfig>('app')2. Bundle Optimization
Optimize bundles by eliminating unused configurations:
import { bunfigPlugin } from 'bunfig'
// vite.config.ts
import { defineConfig } from 'vite'
export default defineConfig({
plugins: [
bunfigPlugin({
configDir: './config',
// Only include used configurations in production
include: process.env.NODE_ENV === 'production'
? ['app', 'database']
: undefined,
}),
],
build: {
rollupOptions: {
// Tree-shake unused configurations
treeshake: true,
},
},
})3. CDN and Caching
For browser environments, optimize with CDN and caching:
// browser-config-loader.ts
class BrowserConfigLoader {
private cache = new Map<string, any>()
private readonly cacheTTL = 5 * 60 * 1000 // 5 minutes
async load<T>(name: string, defaultConfig: T): Promise<T> {
const cacheKey = `config:${name}`
const cached = this.cache.get(cacheKey)
if (cached && Date.now() - cached.timestamp < this.cacheTTL) {
return cached.config
}
try {
// Use CDN with cache headers
const response = await fetch(`/api/config/${name}`, {
headers: {
'Cache-Control': 'public, max-age=300', // 5 minutes
},
})
if (!response.ok) {
throw new Error(`Failed to fetch config: ${response.statusText}`)
}
const config = await response.json()
// Cache in memory
this.cache.set(cacheKey, {
config,
timestamp: Date.now(),
})
return config
}
catch (error) {
console.warn(`Failed to load config '${name}', using defaults:`, error)
return defaultConfig
}
}
}Performance Best Practices
1. Configuration Design
- Keep configurations small - Large configs slow down parsing and merging
- Avoid deep nesting - Flat structures are faster to process
- Use appropriate data types - Primitives are faster than complex objects
- Minimize dynamic logic - Static configurations are faster to load
2. Loading Patterns
- Load on demand - Don't load configurations until needed
- Cache aggressively - Use memory caching for frequently accessed configs
- Preload critical paths - Load essential configurations at startup
- Batch operations - Group related configuration loading together
3. Development vs Production
// Development: Flexibility and hot reloading
if (process.env.NODE_ENV === 'development') {
config = await loadConfig({
name: 'app',
cache: false, // Disable cache for hot reloading
watch: true, // Enable file watching
defaultConfig: {},
})
}
// Production: Performance and stability
if (process.env.NODE_ENV === 'production') {
config = await loadConfig({
name: 'app',
cache: true, // Enable aggressive caching
immutable: true, // Treat config as immutable
defaultConfig: {},
})
}4. Monitoring and Alerting
Set up monitoring for configuration performance:
// Monitor configuration loading times
const SLOW_CONFIG_THRESHOLD = 100 // ms
async function monitoredLoadConfig<T>(options: any): Promise<T> {
const start = performance.now()
try {
const config = await loadConfig<T>(options)
const duration = performance.now() - start
if (duration > SLOW_CONFIG_THRESHOLD) {
console.warn(`⚠️ Slow configuration loading: ${options.name} took ${duration}ms`)
// Send to monitoring service
// analytics.track('slow_config_load', { name: options.name, duration })
}
return config
}
catch (error) {
console.error(`❌ Configuration loading failed: ${options.name}`, error)
throw error
}
}Related Features
- Configuration Loading - Core loading mechanisms
- Build Plugin - Build-time optimizations
- CLI Usage - Performance monitoring tools