PHASE 27: THE GLYPH & THE GHOST (Visual Cortex Polish)
========================================================
- Replaced placeholder block font with full IBM VGA 8x16 bitmap (CP437)
- Implemented CRT scanline renderer for authentic terminal aesthetics
- Set Sovereign Blue background (0xFF401010) with Phosphor Amber text
- Added ANSI escape code stripper for clean graphical output (sketched below)
- Updated QEMU hints to include -device virtio-gpu-device
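The stripper itself is conceptually small. The following Nim sketch is illustrative only (stripAnsi is a hypothetical name, not the term.nim implementation); it shows the basic idea of skipping CSI sequences of the form ESC '[' parameters final-byte:

# Illustrative sketch only: drop CSI escape sequences, pass everything else through.
proc stripAnsi(s: string): string =
  result = newStringOfCap(s.len)
  var i = 0
  while i < s.len:
    if s[i] == '\e' and i + 1 < s.len and s[i + 1] == '[':
      i += 2
      while i < s.len and s[i] notin {'\x40' .. '\x7e'}:
        inc i                # skip parameter and intermediate bytes
      if i < s.len:
        inc i                # consume the final byte (e.g. 'm', 'H', 'J')
    else:
      result.add s[i]
      inc i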
Files:
- core/rumpk/libs/membrane/term.nim: Scanline renderer + ANSI stripper
- core/rumpk/libs/membrane/term_font.nim: Full VGA bitmap data
- src/nexus/forge.nim: QEMU device flag
- docs/dev/PHASE_26_VISUAL_CORTEX.md: Architecture documentation
PHASE 28: THE PLEDGE (Computable Trust)
========================================
- Implemented OpenBSD-style capability system for least-privilege execution
- Added promises bitmask to FiberObject for per-fiber capability tracking
- Created SYS_PLEDGE syscall (one-way capability ratchet)
- Enforced capability checks on all file operations (RPATH/WPATH); see the sketch after the capability list
- Extended SysTable with fn_pledge (120→128 bytes)
Capabilities:
- PLEDGE_STDIO (0x0001): Console I/O
- PLEDGE_RPATH (0x0002): Read Filesystem
- PLEDGE_WPATH (0x0004): Write Filesystem
- PLEDGE_INET (0x0008): Network Access
- PLEDGE_EXEC (0x0010): Execute/Spawn
- PLEDGE_ALL (0xFFFF...): Root (default)
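Enforcement boils down to a bitmask test on each syscall. A minimal Nim sketch using the constants above; the proc names, the shape of k_open, and the error value are assumptions, not the kernel.nim code:

const
  PLEDGE_STDIO = 0x0001'u64
  PLEDGE_RPATH = 0x0002'u64
  PLEDGE_WPATH = 0x0004'u64

type FiberObject = object
  promises: uint64            # capability bitmask; only ever narrowed

proc k_pledge(f: var FiberObject, want: uint64): int =
  # One-way ratchet: intersecting with the request can only drop bits.
  f.promises = f.promises and want
  0

proc hasPledge(f: FiberObject, needed: uint64): bool =
  (f.promises and needed) == needed

proc k_open(f: FiberObject, path: cstring, writable: bool): int =
  # Hypothetical enforcement point for RPATH/WPATH checks.
  let needed = if writable: PLEDGE_WPATH else: PLEDGE_RPATH
  if not hasPledge(f, needed):
    return -1                 # deny; the concrete error code is an assumption
  # ... the actual open path would continue here ...
  0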
Files:
- core/rumpk/core/fiber.nim: Added promises field
- core/rumpk/core/ion.nim: Capability constants + SysTable extension
- core/rumpk/core/kernel.nim: k_pledge + enforcement checks
- core/rumpk/libs/membrane/ion_client.nim: Userland ABI sync
- core/rumpk/libs/membrane/libc.nim: pledge() wrapper
- docs/dev/PHASE_28_THE_PLEDGE.md: Security model documentation
PHASE 29: THE HIVE (Userland Concurrency)
==========================================
- Implemented dynamic fiber spawning for isolated worker execution
- Created worker pool (8 concurrent fibers, 8KB stacks each)
- Added SYS_SPAWN (0x500) and SYS_JOIN (0x501) syscalls
- Generic worker trampoline for automatic cleanup on exit (sketched below)
- Workers inherit parent memory but have independent pledge contexts
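A rough sketch of the trampoline idea in Nim; the field names follow the descriptions above, but the cleanup details are assumptions, not the kernel.nim implementation:

type FiberObject = object
  # Only the fields relevant to this sketch are shown.
  user_entry: proc (arg: uint64) {.cdecl.}   # worker entry point
  user_arg: uint64                           # opaque argument for the worker
  finished: bool                             # assumed completion flag

proc workerTrampoline(f: ptr FiberObject) {.cdecl.} =
  # Every spawned fiber begins here rather than at the user entry point, so
  # the kernel regains control when the worker returns and can reclaim the
  # fiber no matter how the worker exited.
  f.user_entry(f.user_arg)
  f.finished = true    # lets a join() on this fiber observe completion
  # ... release the 8KB stack and return the pool slot (details assumed) ...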
Worker Model:
- spawn(entry, arg): Create isolated worker fiber
- join(fid): Wait for worker completion
- Workers start with PLEDGE_ALL, can voluntarily restrict
- Violations terminate worker, not parent shell
Files:
- core/rumpk/core/fiber.nim: user_entry/user_arg fields
- core/rumpk/core/kernel.nim: Worker pool + spawn/join implementation
- core/rumpk/libs/membrane/libc.nim: spawn()/join() wrappers
- docs/dev/PHASE_29_THE_HIVE.md: Concurrency architecture
STRATEGIC IMPACT
================
The Nexus now combines a distinct visual identity with a Zero-Trust execution model:
1. Visual identity (CRT aesthetics)
2. Capability-based security (pledge)
3. Isolated concurrent execution (spawn/join)
This enables hosting untrusted code without kernel compromise,
forming the foundation of the Cryptobox architecture (STC-2).
Example usage:
proc worker(arg: uint64) {.cdecl.} =
  discard pledge(PLEDGE_INET | PLEDGE_STDIO)
  http_get("https://example.com")

let fid = spawn(worker, 0)
discard join(fid)
# Shell retains full capabilities
Build: Validated on RISC-V (rumpk-riscv64.elf)
Status: Production-ready
NIP Test Suite
Comprehensive test suite for the NIP bootstrap system.
Quick Start
Run all tests:
cd nip/tests
./run_all_tests.sh
Test Structure
Unit Tests
Individual component tests written in Nim:
- test_recipes.nim - Recipe parsing and validation
- test_bootstrap_integration.nim - Component integration tests
Run unit tests:
nim c -r test_recipes.nim
nim c -r test_bootstrap_integration.nim
Integration Tests
Tests that verify components work together:
test_bootstrap_integration.nim - Full integration test suite:
- RecipeManager initialization
- Recipe loading and parsing
- DownloadManager functionality
- Checksum verification
- InstallationManager operations
- Archive extraction
- Script execution
- Error handling
- Caching
Run integration tests:
nim c -r test_bootstrap_integration.nim
End-to-End Tests
Complete workflow tests using the CLI:
test_e2e_bootstrap.sh - E2E bootstrap flow tests:
- CLI command testing
- Bootstrap list/info/recipes
- Recipe validation
- Tool detection
- Container runtime detection
- Documentation verification
Run e2e tests:
./test_e2e_bootstrap.sh
Bootstrap Flow Tests
Real-world bootstrap scenarios:
test_bootstrap_flow.sh - Complete bootstrap workflows:
- First-time installation
- Tool verification
- Update scenarios
- Error recovery
Run flow tests:
./test_bootstrap_flow.sh
Test Runner
The master test runner orchestrates all tests:
./run_all_tests.sh
This will (a rough sketch follows the list):
- Check prerequisites (Nim, dependencies)
- Build NIP if needed
- Run all unit tests
- Run integration tests
- Run e2e tests
- Validate recipes
- Validate scripts
- Check documentation
- Generate comprehensive report
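The runner itself is a shell script, but the flow is easy to sketch. The Nim outline below is illustrative only (the stage names and commands are assumptions) and mirrors the per-stage log files listed in the next section:

import std/[os, osproc, strformat]

let resultsDir = "/tmp/nip-test-results-" & $getCurrentProcessId()
createDir(resultsDir)

# Stages run in order; a failing stage is recorded but does not stop the rest.
let stages = [
  ("build",                 "nim c -d:release ../nip.nim"),
  ("unit-test_recipes.nim", "nim c -r test_recipes.nim"),
  ("Integration Tests",     "nim c -r test_bootstrap_integration.nim"),
  ("End-to-End Tests",      "bash test_e2e_bootstrap.sh"),
  ("Bootstrap Flow Test",   "bash test_bootstrap_flow.sh"),
]

var failed = 0
for (name, cmd) in stages:
  let (output, code) = execCmdEx(cmd)
  writeFile(resultsDir / name & ".log", output)   # one log per stage
  if code != 0: inc failed

echo fmt"{stages.len - failed}/{stages.len} stages passed; logs in {resultsDir}"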
Test Results
Test results are saved to /tmp/nip-test-results-<pid>/:
/tmp/nip-test-results-12345/
├── build.log # NIP build log
├── unit-test_recipes.nim.log # Unit test logs
├── Integration Tests.log # Integration test log
├── End-to-End Tests.log # E2E test log
└── Bootstrap Flow Test.log # Flow test log
Running Specific Tests
Recipe Parser Tests
nim c -r test_recipes.nim
Integration Tests Only
nim c -r test_bootstrap_integration.nim
E2E Tests Only
./test_e2e_bootstrap.sh
Recipe Validation Only
# Validate all recipes
for recipe in ../recipes/*/minimal-*.kdl; do
  echo "Validating: $recipe"
  # Add validation command here
done
Test Coverage
Phase 2: Recipe System
- ✅ Recipe parsing (KDL format)
- ✅ Recipe validation
- ✅ Platform selection
- ✅ Download management
- ✅ Checksum verification (Blake2b-512)
- ✅ Archive extraction
- ✅ Script execution
- ✅ Installation verification
- ✅ Error handling
- ✅ Caching
- ✅ CLI integration
- ✅ Progress reporting
Components Tested
- ✅ RecipeManager
- ✅ DownloadManager
- ✅ InstallationManager
- ✅ RecipeParser
- ✅ CLI commands
- ✅ Installation scripts
- ✅ Verification scripts
Scenarios Tested
- ✅ Fresh installation
- ✅ Tool detection
- ✅ Missing tools handling
- ✅ Container runtime detection
- ✅ Recipe updates
- ✅ Error recovery
- ✅ Checksum mismatches
- ✅ Invalid recipes
- ✅ Missing files
Prerequisites
Required
- Nim compiler (1.6.0+)
- Bash shell
- Standard Unix tools (tar, gzip, etc.)
Optional
- Podman or Docker (for container tests)
- Network access (for recipe updates)
Continuous Integration
The test suite is designed for CI/CD integration:
# Example GitHub Actions
- name: Run NIP Tests
  run: |
    cd nip/tests
    ./run_all_tests.sh

# Example GitLab CI
test:
  script:
    - cd nip/tests
    - ./run_all_tests.sh
  artifacts:
    paths:
      - /tmp/nip-test-results-*/
    when: always
Writing New Tests
Unit Test Template
## Test description
import std/[unittest, os]
import ../src/nimpak/build/your_module

suite "Your Module Tests":
  setup:
    # Setup code
    discard

  teardown:
    # Cleanup code
    discard

  test "should do something":
    # Test code
    check someCondition == true
Integration Test Template
proc testYourFeature(): bool =
  let start = startTest("Your feature")
  try:
    # Test code
    let result = yourFunction()
    if result == expected:
      endTest("Your feature", start, true)
      return true
    else:
      endTest("Your feature", start, false, "Unexpected result")
      return false
  except Exception as e:
    endTest("Your feature", start, false, e.msg)
    return false
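The template relies on startTest/endTest helpers, presumably provided by test_helpers.nim; their exact signatures are not shown in this README, so the following is only a plausible sketch:

import std/[times, strformat]

proc startTest(name: string): float =
  # Record a start timestamp so endTest can report the elapsed time.
  echo fmt"[ RUN  ] {name}"
  epochTime()

proc endTest(name: string, start: float, ok: bool, msg = "") =
  let elapsed = epochTime() - start
  if ok:
    echo fmt"[ PASS ] {name} ({elapsed:.2f}s)"
  else:
    echo fmt"[ FAIL ] {name} ({elapsed:.2f}s): {msg}"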
E2E Test Template
test_your_feature() {
  log_info "Test: Your feature"
  output=$($NIP_BIN your command 2>&1 || true)
  if echo "$output" | grep -q "expected output"; then
    log_success "Your feature works"
  else
    log_error "Your feature failed"
  fi
}
Debugging Tests
Enable Verbose Output
# Nim tests
nim c -r --verbosity:2 test_recipes.nim
# Shell tests
bash -x test_e2e_bootstrap.sh
Run Single Test
# Nim - pass the test name (a glob pattern) to the compiled test binary
nim c -r test_recipes.nim "specific test name"
# Shell - edit script to comment out other tests
Check Test Logs
# View latest test results
ls -lt /tmp/nip-test-results-*/
# View specific log
cat /tmp/nip-test-results-12345/Integration\ Tests.log
Test Maintenance
Adding New Tests
- Create test file in nip/tests/
- Follow naming convention: test_*.nim or test_*.sh
- Add to run_all_tests.sh if needed
- Update this README
Updating Tests
When adding new features:
- Add unit tests for new components
- Add integration tests for component interactions
- Add e2e tests for user-facing features
- Update test documentation
Test Data
Test data and fixtures:
- Recipes: ../recipes/*/minimal-*.kdl
- Scripts: ../recipes/*/scripts/*.sh
- Schemas: ../recipes/schema/recipe.json
Troubleshooting
Tests Fail to Build
# Check Nim installation
nim --version
# Clean and rebuild
rm -rf nimcache/
nim c test_recipes.nim
Tests Fail to Run
# Check permissions
chmod +x test_*.sh
# Check NIP binary
ls -l ../nip
# Rebuild NIP
cd ..
nim c -d:release nip.nim
Network-Dependent Tests Fail
Some tests require network access:
- Recipe updates
- Git repository cloning
Run offline-safe tests only:
# Skip network tests
OFFLINE=1 ./run_all_tests.sh
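A Nim test can honor the same flag with an environment check; a small illustrative pattern using std/unittest (the suite and test names are hypothetical):

import std/[unittest, os]

suite "Recipe updates":
  test "fetches the latest recipe index":
    if getEnv("OFFLINE") == "1":
      skip()                 # honor OFFLINE=1 by skipping network-bound tests
    else:
      # ... network-dependent assertions would go here ...
      check true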
Performance
Test suite execution time (approximate):
- Unit tests: ~5 seconds
- Integration tests: ~10 seconds
- E2E tests: ~15 seconds
- Full suite: ~30-45 seconds
Contributing
When contributing tests:
- Follow existing test patterns
- Add descriptive test names
- Include error messages
- Clean up test artifacts
- Update documentation
Future Enhancements
Planned test improvements:
- Code coverage reporting
- Performance benchmarks
- Stress testing
- Multi-platform CI
- Container-based tests
- Mock external dependencies
- Parallel test execution
Summary
The NIP test suite provides comprehensive coverage of the bootstrap system:
- Unit tests verify individual components
- Integration tests verify components work together
- E2E tests verify user workflows
- Master runner orchestrates everything
Run ./run_all_tests.sh to execute the complete test suite.