PivotPHP Core v1.1.1+ introduces a JSON optimization system that significantly improves the performance of JSON operations through intelligent buffer pooling and automatic optimization decisions.
The JSON optimization system consists of two main components:
- JsonBuffer: High-performance buffer for JSON operations with capacity management
- JsonBufferPool: Intelligent pooling system with automatic optimization decisions
These work together to provide automatic performance improvements with zero configuration required while maintaining full backward compatibility.
The system is seamlessly integrated into the framework's core Response::json() method:
```php
// This code automatically benefits from pooling when appropriate
$response->json($data);
```

The system automatically activates pooling based on data characteristics:
- Arrays: 10 or more elements
- Objects: 5 or more properties
- Strings: Greater than 1KB in size
For smaller datasets, the system uses traditional json_encode() for optimal performance.
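The activation rules above can be modeled with a small sketch. Note that `shouldUsePooling()` is a hypothetical helper written for illustration only, not part of the framework's public API; the framework applies these checks internally inside Response::json():

```php
<?php
// Hypothetical model of the documented activation rules:
// arrays with 10+ elements, objects with 5+ properties,
// strings larger than 1KB. Everything else falls back to json_encode().
function shouldUsePooling(mixed $data): bool
{
    if (is_string($data)) {
        return strlen($data) > 1024;                    // > 1KB strings
    }
    if (is_array($data)) {
        return count($data) >= 10;                      // 10+ elements
    }
    if (is_object($data)) {
        return count(get_object_vars($data)) >= 5;      // 5+ properties
    }
    return false;                                       // scalars: json_encode()
}
```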
For advanced use cases, you can interact with the pooling system directly:
```php
use PivotPHP\Core\Json\Pool\JsonBufferPool;

// Direct encoding with pooling (always returns string)
$json = JsonBufferPool::encodeWithPool($data);

// Get a buffer for manual operations
$buffer = JsonBufferPool::getBuffer(4096);
$buffer->appendJson(['key' => 'value']);
$buffer->append(',');
$buffer->appendJson(['another' => 'value']);
$result = $buffer->finalize();
JsonBufferPool::returnBuffer($buffer);
```

The system exposes public constants for advanced configuration and testing:
```php
// Size estimation constants
JsonBufferPool::EMPTY_ARRAY_SIZE;         // 2
JsonBufferPool::SMALL_ARRAY_SIZE;         // 512
JsonBufferPool::MEDIUM_ARRAY_SIZE;        // 2048
JsonBufferPool::LARGE_ARRAY_SIZE;         // 8192
JsonBufferPool::XLARGE_ARRAY_SIZE;        // 32768

// Threshold constants
JsonBufferPool::SMALL_ARRAY_THRESHOLD;    // 10
JsonBufferPool::MEDIUM_ARRAY_THRESHOLD;   // 100
JsonBufferPool::LARGE_ARRAY_THRESHOLD;    // 1000

// Pooling decision thresholds
JsonBufferPool::POOLING_ARRAY_THRESHOLD;  // 10
JsonBufferPool::POOLING_OBJECT_THRESHOLD; // 5
JsonBufferPool::POOLING_STRING_THRESHOLD; // 1024

// Type-specific constants
JsonBufferPool::STRING_OVERHEAD;          // 20
JsonBufferPool::OBJECT_PROPERTY_OVERHEAD; // 50
JsonBufferPool::OBJECT_BASE_SIZE;         // 100
JsonBufferPool::BOOLEAN_OR_NULL_SIZE;     // 10
JsonBufferPool::NUMERIC_SIZE;             // 20
JsonBufferPool::DEFAULT_ESTIMATE;         // 100
JsonBufferPool::MIN_LARGE_BUFFER_SIZE;    // 65536
```

The pool can be configured for production workloads:
```php
JsonBufferPool::configure([
    'max_pool_size' => 200,       // Maximum buffers per pool
    'default_capacity' => 8192,   // Default buffer size (8KB)
    'size_categories' => [
        'small' => 2048,    // 2KB
        'medium' => 8192,   // 8KB
        'large' => 32768,   // 32KB
        'xlarge' => 131072  // 128KB
    ]
]);
```

The system provides comprehensive validation with precise error messages:
```php
try {
    JsonBufferPool::configure([
        'max_pool_size' => -1 // Invalid: negative value
    ]);
} catch (InvalidArgumentException $e) {
    echo $e->getMessage(); // "'max_pool_size' must be a positive integer"
}

try {
    JsonBufferPool::configure([
        'max_pool_size' => 'invalid' // Invalid: wrong type
    ]);
} catch (InvalidArgumentException $e) {
    echo $e->getMessage(); // "'max_pool_size' must be an integer"
}
```

The system provides comprehensive statistics for monitoring and optimization:
```php
$stats = JsonBufferPool::getStatistics();

echo "Reuse Rate: {$stats['reuse_rate']}%\n";
echo "Total Operations: {$stats['total_operations']}\n";
echo "Current Usage: {$stats['current_usage']} buffers\n";
echo "Peak Usage: {$stats['peak_usage']} buffers\n";

// Pool sizes by category
foreach ($stats['pool_sizes'] as $category => $count) {
    echo "{$category}: {$count} buffers\n";
}
```

In benchmarks, the system delivers:

- Sustained Throughput: 101,000+ operations per second
- Reuse Rate: 100% in high-frequency scenarios
- Memory Efficiency: Significant reduction in GC pressure
- Latency: Consistent performance under load
The system excels in:
- High-throughput APIs (1000+ requests/second)
- Microservices with frequent JSON responses
- Real-time applications with continuous data streaming
- Batch processing with large datasets
Buffers are organized into size-based pools:
- buffer_1024: 1KB buffers for small data
- buffer_4096: 4KB buffers for medium data
- buffer_16384: 16KB buffers for large data
- buffer_65536: 64KB buffers for extra-large data
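As an illustration, the mapping from a requested capacity to one of these pools might look like the sketch below. Here `poolKeyFor()` is a hypothetical function, and the actual rounding logic inside JsonBufferPool may differ:

```php
<?php
// Hypothetical sketch: pick the smallest standard pool whose buffers
// can hold the requested capacity; oversized requests fall through to
// the 64KB pool (real implementation details may differ).
function poolKeyFor(int $capacity): string
{
    foreach ([1024, 4096, 16384, 65536] as $size) {
        if ($capacity <= $size) {
            return "buffer_{$size}";
        }
    }
    return 'buffer_65536';
}
```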
The buffer lifecycle has four stages:

- Acquisition: Get a buffer from the appropriate pool, or create a new one
- Usage: Append JSON data with automatic expansion
- Finalization: Convert buffer contents to the final JSON string
- Return: Reset and return the buffer to its pool for reuse

Memory management features:

- Automatic Expansion: Buffers grow as needed
- Efficient Reset: Buffers are reset without reallocation
- Pool Limits: Configurable maximum pool sizes prevent memory bloat
- Garbage Collection: Unused buffers are automatically cleaned up
```php
$app->get('/api/users', function($req, $res) {
    $users = User::all(); // Array of 100+ users

    // Automatically uses pooling for large dataset
    return $res->json($users);
});
```

```php
$app->get('/api/metrics/live', function($req, $res) {
    $buffer = JsonBufferPool::getBuffer(32768); // 32KB buffer

    try {
        $buffer->append('{"metrics":[');

        $first = true;
        foreach ($this->streamMetrics() as $metric) {
            if (!$first) {
                $buffer->append(',');
            }
            $buffer->appendJson($metric);
            $first = false;
        }

        $buffer->append(']}');
        $json = $buffer->finalize();

        return $res->setHeader('Content-Type', 'application/json')
                   ->setBody($json);
    } finally {
        JsonBufferPool::returnBuffer($buffer);
    }
});
```

```php
$app->get('/health', function($req, $res) {
    $health = [
        'status' => 'ok',
        'json_pool' => JsonBufferPool::getStatistics(),
        'timestamp' => time()
    ];

    return $res->json($health);
});
```

```php
// High-traffic configuration
JsonBufferPool::configure([
    'max_pool_size' => 500,
    'default_capacity' => 16384,
    'size_categories' => [
        'small' => 4096,
        'medium' => 16384,
        'large' => 65536,
        'xlarge' => 262144
    ]
]);
```

Set up regular monitoring of pool statistics:
```php
// Add to your monitoring system
function checkJsonPoolHealth() {
    $stats = JsonBufferPool::getStatistics();

    // Alert if reuse rate is too low
    if ($stats['reuse_rate'] < 50 && $stats['total_operations'] > 1000) {
        log_warning("Low JSON pool reuse rate: {$stats['reuse_rate']}%");
    }

    // Alert if pool usage is growing without bounds
    if ($stats['current_usage'] > 1000) {
        log_warning("High JSON pool usage: {$stats['current_usage']} buffers");
    }

    return $stats;
}
```

The system includes robust error handling with automatic fallback:
```php
try {
    $json = JsonBufferPool::encodeWithPool($data);
} catch (\Throwable $e) {
    // Fallback to traditional encoding
    log_error("JSON pooling failed: " . $e->getMessage());
    $json = json_encode($data);
}
```

Common issues:

- Low Reuse Rate: Check if data sizes match pool categories
- High Memory Usage: Reduce max_pool_size or adjust size categories
- Performance Regression: Verify pooling is being used for appropriate data sizes
```php
// Enable detailed debugging
$debug = JsonBufferPool::getStatistics();
var_dump($debug['detailed_stats']);

// Clear pools for testing
JsonBufferPool::clearPools();
```

No migration is required! The system works automatically with existing code:
```php
// Before v1.1.1
$response->json($data); // Uses json_encode()

// After v1.1.1
$response->json($data); // Automatically uses pooling when beneficial
```

For applications wanting to maximize performance, consider:
- Configuring pool sizes for your specific workload
- Adding monitoring to track pool efficiency
- Using manual pooling for specialized use cases