Performance
Prism provides several features for speeding up test execution and analyzing test suite performance.
Parallel Execution
Run tests across multiple worker processes:
vendor/bin/prism test --parallel 4

Or use the short flag:
vendor/bin/prism test -j 4

How It Works
- Test files are divided into batches
- Each batch runs in a separate PHP process
- Results are collected and merged
- Total execution time is significantly reduced (the sketch below illustrates the batching idea)
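As a rough illustration of the batching idea (not Prism's actual implementation), the following shell sketch spreads test files across four parallel workers. It assumes JSON test files live under tests/ and uses a placeholder in place of the real per-batch runner:

# Sketch only: feed test files to 4 parallel workers, up to 25 files per batch.
# Each "worker" here just reports its batch size; Prism instead runs the batch
# in a separate PHP process and merges the results afterwards.
find tests -name '*.json' -print0 \
  | xargs -0 -n 25 -P 4 sh -c 'echo "worker $$ received $# test files"' worker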
Optimal Worker Count
# Use the CPU core count
vendor/bin/prism test -j $(nproc)
# Conservative (50% of cores)
vendor/bin/prism test -j $(($(nproc) / 2))
# Maximum performance (nproc --all reports all installed cores)
vendor/bin/prism test -j $(nproc --all)

Performance Comparison
# Sequential execution
vendor/bin/prism test
# Time: 45.2s
# Parallel with 4 workers
vendor/bin/prism test -j 4
# Time: 12.8s (3.5x faster)
# Parallel with 8 workers
vendor/bin/prism test -j 8
# Time: 7.1s (6.4x faster)

Performance Profiling
Identify the slowest tests with performance profiling:
vendor/bin/prism test --profile

Profile Output
Test Performance Profile
Slowest Tests:
  1. complex-nested-validation (2.145s)
  2. large-array-validation (1.832s)
  3. recursive-schema-test (1.456s)
  4. deep-object-validation (0.987s)
  5. pattern-matching-test (0.654s)
Total Tests: 1,247
Average Duration: 0.023s
Median Duration: 0.012s

Use Cases
- Identify bottlenecks: Find slow tests to optimize
- Monitor performance: Track test suite speed over time
- CI optimization: Prioritize fast tests in pipelines
Baseline Benchmarks
Save and compare performance baselines:
Save Baseline
vendor/bin/prism test --baseline

Save with a custom name:
vendor/bin/prism test --baseline "pre-optimization"

Compare Against Baseline
vendor/bin/prism test --compare

Compare against a named baseline:
vendor/bin/prism test --compare "pre-optimization"

Benchmark Output
Benchmark Comparison: default
Performance Changes:
  Total Duration: 45.2s → 38.7s (14.4% faster)
  Average Test: 0.036s → 0.031s (13.9% faster)
Significant Changes:
  ✓ complex-validation: 2.145s → 1.234s (42.5% faster)
  ✓ array-processing: 1.832s → 1.123s (38.7% faster)
  ✗ string-validation: 0.234s → 0.387s (65.4% slower)
Tests Analyzed: 1,247
Improved: 892 (71.5%)
Degraded: 123 (9.9%)
Unchanged: 232 (18.6%)

Workflow Example
# Save baseline before optimization
vendor/bin/prism test --baseline "before-cache"
# Make optimizations to the validator
vim src/Validator.php
# Compare performance
vendor/bin/prism test --compare "before-cache"

Incremental Testing
Run only changed tests for fast iteration:
vendor/bin/prism test --incremental

How It Works
- Tracks file modification times
- Identifies changed test files
- Only runs tests from changed files
- Significantly faster during development (see the shell sketch below for the underlying idea)
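The change-detection idea can be approximated in plain shell. The sketch below assumes GNU find and a marker file (.prism/last-run, a hypothetical name) touched after each run; Prism itself records per-file timestamps in .prism/incremental-cache.json instead:

# Sketch only: list test files modified since the previous run, then move the marker forward.
# Assumes a marker file from a previous run; create it once with: touch .prism/last-run
find tests -name '*.json' -newer .prism/last-run -print
touch .prism/last-run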
Example Workflow
# First run - all tests execute
vendor/bin/prism test --incremental
# Time: 45.2s
# Make a small change to one test file
vim tests/validation/strings.json
# Only tests from the changed file run
vendor/bin/prism test --incremental
# Time: 0.8s (56x faster)

Cache Management
The cache is stored in .prism/incremental-cache.json:
{ "tests/validation/strings.json": 1703001234, "tests/validation/numbers.json": 1703001156, "tests/validation/objects.json": 1703000987}Clear cache to force full run:
Clear the cache to force a full run:

rm .prism/incremental-cache.json

Combining Performance Features
Maximize performance by combining features:
Development Workflow
Fast iteration with incremental mode and watch mode:
vendor/bin/prism test --incremental --watch

CI Optimization
Parallel execution with profiling:
vendor/bin/prism test -j 8 --profile

Performance Monitoring
Baseline comparison with profiling:
vendor/bin/prism test --compare "main" --profileWatch Mode
Automatically re-run tests when files change:
vendor/bin/prism test --watch

How It Works
- Runs the initial test suite
- Monitors test files for changes
- Automatically re-runs tests on each change
- Continues until interrupted (Ctrl+C); see the sketch after this list for the underlying idea
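For intuition only, a watch loop can be sketched in shell with inotifywait (assumes Linux with inotify-tools installed). Prism's built-in --watch does this for you and needs no extra tooling:

# Sketch only: re-run the suite whenever something under tests/ changes.
while inotifywait -r -e modify,create,delete tests/; do
  vendor/bin/prism test --incremental
done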
Watch Output
Section titled “Watch Output”[12:34:56] Initial run completed (45.2s)[12:35:12] Watching for changes...[12:36:08] Change detected in tests/validation/strings.json[12:36:08] Re-running tests...[12:36:09] Tests completed (0.8s)[12:36:09] Watching for changes...Best Practices
Combine watch mode with other features:
# Watch with incremental testing
vendor/bin/prism test --watch --incremental
# Watch with a specific filter
vendor/bin/prism test --watch --filter "string"
# Watch with failures only
vendor/bin/prism test --watch --failures

Performance Best Practices
Section titled “Performance Best Practices”1. Use Parallel Execution in CI
- name: Run Tests
  run: vendor/bin/prism test -j ${{ steps.cpu.outputs.count }}
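The step above assumes an earlier workflow step with id cpu that exposes the runner's core count as an output. One portable way to compute that count in a CI shell (Linux or macOS runners), as a sketch:

# Core count: nproc on Linux, sysctl on macOS; fall back to 2 if neither is available.
CORES=$(nproc 2>/dev/null || sysctl -n hw.ncpu 2>/dev/null || echo 2)
vendor/bin/prism test -j "$CORES"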
2. Profile Regularly

Monitor test suite performance over time:
# Weekly baseline
vendor/bin/prism test --baseline "$(date +%Y-%m-%d)"
# Compare against last week's baseline (GNU date syntax)
vendor/bin/prism test --compare "$(date -d '7 days ago' +%Y-%m-%d)"
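To automate the weekly baseline, a crontab entry along these lines could work (hypothetical project path; note that % must be escaped in crontab):

# Hypothetical crontab entry: save a dated baseline every Monday at 06:00.
0 6 * * 1  cd /path/to/project && vendor/bin/prism test --baseline "$(date +\%Y-\%m-\%d)"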
3. Optimize Slow Tests

Use profiling to identify slow tests, then optimize them:
vendor/bin/prism test --profile | grep -A 10 "Slowest Tests"

4. Incremental During Development
Always use incremental mode when developing:
alias prism-dev="vendor/bin/prism test --incremental --watch"

5. Filter Before Profiling
Profile specific test subsets:
vendor/bin/prism test --filter "validation" --profile

Performance Metrics
Execution Time
Total time from start to finish:
vendor/bin/prism test
# Total execution time: 45.234s

Test Duration
Individual test execution times:
vendor/bin/prism test --profile
# Shows per-test durations

Throughput
Tests executed per second:
# 1,247 tests in 45.2s = 27.6 tests/second
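To compute throughput for your own run, divide the total test count by the total duration in seconds, for example:

# 1,247 tests / 45.2 s ≈ 27.6 tests/second
awk 'BEGIN { printf "%.1f tests/second\n", 1247 / 45.2 }'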
Next Steps

- Learn about advanced features
- Explore output formats
- See custom assertions