# NIP Test Suite

Comprehensive test suite for the NIP bootstrap system.
## Quick Start

Run all tests:

```shell
cd nip/tests
./run_all_tests.sh
```
## Test Structure

### Unit Tests

Individual component tests written in Nim:

- `test_recipes.nim` - Recipe parsing and validation
- `test_bootstrap_integration.nim` - Component integration tests

Run unit tests:

```shell
nim c -r test_recipes.nim
nim c -r test_bootstrap_integration.nim
```
### Integration Tests

Tests that verify components work together:

- `test_bootstrap_integration.nim` - Full integration test suite
  - RecipeManager initialization
  - Recipe loading and parsing
  - DownloadManager functionality
  - Checksum verification
  - InstallationManager operations
  - Archive extraction
  - Script execution
  - Error handling
  - Caching

Run integration tests:

```shell
nim c -r test_bootstrap_integration.nim
```
### End-to-End Tests

Complete workflow tests using the CLI:

- `test_e2e_bootstrap.sh` - E2E bootstrap flow tests
  - CLI command testing
  - Bootstrap list/info/recipes
  - Recipe validation
  - Tool detection
  - Container runtime detection
  - Documentation verification

Run e2e tests:

```shell
./test_e2e_bootstrap.sh
```
### Bootstrap Flow Tests

Real-world bootstrap scenarios:

- `test_bootstrap_flow.sh` - Complete bootstrap workflows
  - First-time installation
  - Tool verification
  - Update scenarios
  - Error recovery

Run flow tests:

```shell
./test_bootstrap_flow.sh
```
## Test Runner

The master test runner orchestrates all tests:

```shell
./run_all_tests.sh
```

This will:

- Check prerequisites (Nim, dependencies)
- Build NIP if needed
- Run all unit tests
- Run integration tests
- Run e2e tests
- Validate recipes
- Validate scripts
- Check documentation
- Generate a comprehensive report
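The orchestration pattern behind those steps can be sketched roughly as follows. This is a hypothetical illustration, not the actual contents of `run_all_tests.sh`; the `run_step` helper and step names are invented for the example.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a master-runner pattern; helper and step names
# are illustrative, not the actual contents of run_all_tests.sh.
set -u

RESULTS_DIR="/tmp/nip-test-results-$$"
mkdir -p "$RESULTS_DIR"
PASS=0
FAIL=0

run_step() {
    # Run one test command, capture its output in a log, and track pass/fail.
    local name="$1"; shift
    if "$@" > "$RESULTS_DIR/$name.log" 2>&1; then
        PASS=$((PASS + 1))
        echo "PASS: $name"
    else
        FAIL=$((FAIL + 1))
        echo "FAIL: $name (see $RESULTS_DIR/$name.log)"
    fi
}

# 'true' stands in for the real commands, e.g. nim c -r test_recipes.nim
run_step "unit-recipes" true
run_step "e2e-bootstrap" true

echo "passed: $PASS, failed: $FAIL"
```

Each step's output lands in its own log file under the per-run results directory, which matches the layout described under Test Results below.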
## Test Results

Test results are saved to `/tmp/nip-test-results-<pid>/`:

```
/tmp/nip-test-results-12345/
├── build.log                    # NIP build log
├── unit-test_recipes.nim.log    # Unit test logs
├── Integration Tests.log        # Integration test log
├── End-to-End Tests.log         # E2E test log
└── Bootstrap Flow Test.log      # Flow test log
```
## Running Specific Tests

### Recipe Parser Tests

```shell
nim c -r test_recipes.nim
```

### Integration Tests Only

```shell
nim c -r test_bootstrap_integration.nim
```

### E2E Tests Only

```shell
./test_e2e_bootstrap.sh
```

### Recipe Validation Only

```shell
# Validate all recipes
for recipe in ../recipes/*/minimal-*.kdl; do
  echo "Validating: $recipe"
  # Add validation command here
done
```
## Test Coverage

### Phase 2: Recipe System

- ✅ Recipe parsing (KDL format)
- ✅ Recipe validation
- ✅ Platform selection
- ✅ Download management
- ✅ Checksum verification (Blake2b-512)
- ✅ Archive extraction
- ✅ Script execution
- ✅ Installation verification
- ✅ Error handling
- ✅ Caching
- ✅ CLI integration
- ✅ Progress reporting

### Components Tested

- ✅ RecipeManager
- ✅ DownloadManager
- ✅ InstallationManager
- ✅ RecipeParser
- ✅ CLI commands
- ✅ Installation scripts
- ✅ Verification scripts

### Scenarios Tested

- ✅ Fresh installation
- ✅ Tool detection
- ✅ Missing tools handling
- ✅ Container runtime detection
- ✅ Recipe updates
- ✅ Error recovery
- ✅ Checksum mismatches
- ✅ Invalid recipes
- ✅ Missing files
## Prerequisites

### Required

- Nim compiler (1.6.0+)
- Bash shell
- Standard Unix tools (tar, gzip, etc.)

### Optional

- Podman or Docker (for container tests)
- Network access (for recipe updates)
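A quick preflight check along these lines can confirm the required tools before running the suite. This is an illustrative sketch; the real `run_all_tests.sh` may perform its prerequisite checks differently.

```shell
# Illustrative preflight check; the real runner may do this differently.
missing=0
for tool in nim bash tar gzip; do
    if command -v "$tool" > /dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "missing: $tool"
        missing=$((missing + 1))
    fi
done

# A container runtime is optional: report which one is available, if any.
if command -v podman > /dev/null 2>&1; then
    echo "container runtime: podman"
elif command -v docker > /dev/null 2>&1; then
    echo "container runtime: docker"
else
    echo "container runtime: none (container tests will be skipped)"
fi

[ "$missing" -eq 0 ] || echo "warning: $missing required tool(s) missing"
```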
## Continuous Integration

The test suite is designed for CI/CD integration:

```yaml
# Example GitHub Actions
- name: Run NIP Tests
  run: |
    cd nip/tests
    ./run_all_tests.sh
```

```yaml
# Example GitLab CI
test:
  script:
    - cd nip/tests
    - ./run_all_tests.sh
  artifacts:
    paths:
      - /tmp/nip-test-results-*/
    when: always
```
## Writing New Tests

### Unit Test Template

```nim
## Test description
import std/[unittest, os]
import ../src/nimpak/build/your_module

suite "Your Module Tests":
  setup:
    # Setup code
    discard

  teardown:
    # Cleanup code
    discard

  test "should do something":
    # Test code
    check someCondition == true
```
### Integration Test Template

```nim
proc testYourFeature(): bool =
  let start = startTest("Your feature")
  try:
    # Test code
    let result = yourFunction()
    if result == expected:
      endTest("Your feature", start, true)
      return true
    else:
      endTest("Your feature", start, false, "Unexpected result")
      return false
  except Exception as e:
    endTest("Your feature", start, false, e.msg)
    return false
```
### E2E Test Template

```shell
test_your_feature() {
    log_info "Test: Your feature"
    output=$($NIP_BIN your command 2>&1 || true)

    if echo "$output" | grep -q "expected output"; then
        log_success "Your feature works"
    else
        log_error "Your feature failed"
    fi
}
```
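The `log_info`/`log_success`/`log_error` helpers used in the template are assumed to come from the shared test harness; a minimal stand-in version might look like this (illustrative only, the actual scripts may define them differently):

```shell
# Minimal stand-ins for the logging helpers the E2E template relies on;
# the actual test harness may define these differently.
log_info()    { echo "[INFO] $*"; }
log_success() { echo "[ OK ] $*"; }
log_error()   { echo "[FAIL] $*" >&2; }

log_info "Test: example feature"
log_success "example feature works"
```

Sending `log_error` output to stderr keeps failures visible even when stdout is redirected to a log file.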
## Debugging Tests

### Enable Verbose Output

```shell
# Nim tests
nim c -r --verbosity:2 test_recipes.nim

# Shell tests
bash -x test_e2e_bootstrap.sh
```

### Run a Single Test

```shell
# Nim: std/unittest filters by test name patterns passed as program arguments
nim c -r test_recipes.nim "specific test name"

# Shell - edit the script to comment out other tests
```

### Check Test Logs

```shell
# View latest test results
ls -lt /tmp/nip-test-results-*/

# View a specific log
cat /tmp/nip-test-results-12345/Integration\ Tests.log
```
## Test Maintenance

### Adding New Tests

1. Create the test file in `nip/tests/`
2. Follow the naming convention: `test_*.nim` or `test_*.sh`
3. Add it to `run_all_tests.sh` if needed
4. Update this README

### Updating Tests

When adding new features:

- Add unit tests for new components
- Add integration tests for component interactions
- Add e2e tests for user-facing features
- Update test documentation

### Test Data

Test data and fixtures:

- Recipes: `../recipes/*/minimal-*.kdl`
- Scripts: `../recipes/*/scripts/*.sh`
- Schemas: `../recipes/schema/recipe.json`
## Troubleshooting

### Tests Fail to Build

```shell
# Check Nim installation
nim --version

# Clean and rebuild
rm -rf nimcache/
nim c test_recipes.nim
```

### Tests Fail to Run

```shell
# Check permissions
chmod +x test_*.sh

# Check the NIP binary
ls -l ../nip

# Rebuild NIP
cd ..
nim c -d:release nip.nim
```
### Network-Dependent Tests Fail

Some tests require network access:

- Recipe updates
- Git repository cloning

Run offline-safe tests only:

```shell
# Skip network tests
OFFLINE=1 ./run_all_tests.sh
```
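A test script can honor that flag with an early skip. The sketch below is illustrative: the `OFFLINE` variable matches the invocation above, but `test_recipe_update` is a hypothetical function name.

```shell
# Sketch: skip a network-dependent test when OFFLINE=1 is set.
# 'test_recipe_update' is a hypothetical example function.
test_recipe_update() {
    if [ "${OFFLINE:-0}" = "1" ]; then
        echo "SKIP: recipe update (offline mode)"
        return 0
    fi
    echo "RUN: recipe update"
    # ...network-dependent checks would go here...
}

OFFLINE=1 test_recipe_update
```

Returning 0 on skip keeps the overall suite green when offline mode is requested, rather than counting skipped tests as failures.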
## Performance

Test suite execution time (approximate):

- Unit tests: ~5 seconds
- Integration tests: ~10 seconds
- E2E tests: ~15 seconds
- Full suite: ~30-45 seconds
## Contributing

When contributing tests:

- Follow existing test patterns
- Use descriptive test names
- Include error messages
- Clean up test artifacts
- Update documentation
## Future Enhancements

Planned test improvements:

- Code coverage reporting
- Performance benchmarks
- Stress testing
- Multi-platform CI
- Container-based tests
- Mocking of external dependencies
- Parallel test execution
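Parallel test execution, one of the planned items, could be prototyped with background jobs and `wait`. This is a sketch only; `sleep 0` stands in for real test scripts, and the log naming is invented for the example.

```shell
#!/usr/bin/env bash
# Sketch: run independent test jobs in parallel and collect exit codes.
# 'sleep 0' stands in for real scripts such as ./test_e2e_bootstrap.sh.
pids=()
for name in unit integration e2e; do
    ( echo "running $name"; sleep 0 ) > "/tmp/nip-parallel-$name.$$.log" 2>&1 &
    pids+=("$!")
done

# wait on each job individually so per-job failures can be counted
failed=0
for pid in "${pids[@]}"; do
    wait "$pid" || failed=$((failed + 1))
done
echo "failed jobs: $failed"
```

This only pays off for jobs that are truly independent; tests sharing a cache or results directory would need per-job isolation first.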
## Summary

The NIP test suite provides comprehensive coverage of the bootstrap system:

- Unit tests verify individual components
- Integration tests verify components work together
- E2E tests verify user workflows
- The master runner orchestrates everything

Run `./run_all_tests.sh` to execute the complete test suite.