# Manifold GFN: 100% Accuracy on 100k Token Parity with O(1) Memory

1.4 MB model | Constant memory | Perfect generalization to 2000× training length
This repository hosts a trained Manifold model that demonstrates rule learning and extreme length generalization on the XOR/Parity task. Unlike transformers that scale quadratically with sequence length, Manifold uses continuous-time Hamiltonian dynamics to achieve O(1) memory complexity.
For the full framework, see the GitHub repository.
## What This Model Demonstrates

- ✅ Learns parity as a rule, not via memorization
- ✅ Generalizes to sequence lengths >2000× the training length (trained on L=20, tested up to L=100,000)
- ✅ Maintains constant memory usage (O(1) via symplectic integration)
- ✅ Uses continuous-time dynamics instead of attention mechanisms
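The O(1) memory claim follows from symplectic integration keeping a fixed-size phase-space state that is updated in place, step after step. As a minimal sketch (a toy Hamiltonian, not the Manifold implementation), a leapfrog step looks like:

```python
def leapfrog_step(q, p, grad_V, dt):
    """One leapfrog (symplectic) update for H(q, p) = p^2/2 + V(q).

    The state is a fixed-size (q, p) pair updated in place, so memory
    stays constant no matter how many steps are taken -- the property
    the benchmarks below rely on.
    """
    p = p - 0.5 * dt * grad_V(q)   # half-kick
    q = q + dt * p                 # drift
    p = p - 0.5 * dt * grad_V(q)   # half-kick
    return q, p

# Toy example: harmonic oscillator V(q) = q^2/2, so grad_V(q) = q.
q, p = 1.0, 0.0
for _ in range(10_000):
    q, p = leapfrog_step(q, p, lambda x: x, dt=0.01)

# H = p^2/2 + q^2/2 stays near its initial value 0.5 even after
# 10,000 steps -- symplectic integrators nearly conserve energy.
energy = 0.5 * (p ** 2 + q ** 2)
```

The same conservation property is what the "Stability" bullet in the Performance section refers to.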
## Benchmarks: Manifold vs Transformer
| Metric | Manifold GFN (1.4 MB) | MicroGPT Transformer |
|---|---|---|
| Model Size | 1.4 MB | ~2 MB |
| Training Length | 20 tokens | 20 tokens |
| Test Accuracy @ L=20 | 100% | 90% |
| Test Accuracy @ L=100 | 100% | 50% |
| Test Accuracy @ L=1,000 | 100% | 52% |
| Test Accuracy @ L=10,000 | 100% | OOM (Out of Memory) |
| Test Accuracy @ L=100,000 | 100% | OOM (Out of Memory) |
| VRAM @ L=100 | 30 MB | 85 MB |
| VRAM @ L=10,000 | 60 MB | >8 GB (quadratic) |
| Memory Complexity | O(1) | O(N²) |
Benchmarks were run on an NVIDIA GTX 1650 (4 GB VRAM). The transformer fails at L > 5,000 because the attention score matrix grows quadratically with sequence length.
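The quadratic blow-up is easy to verify with back-of-envelope arithmetic (assuming a single fp32 N×N score matrix; real transformers multiply this further by layers and heads):

```python
def attn_score_bytes(seq_len, bytes_per_float=4):
    """Memory for one N x N attention score matrix (fp32 assumed)."""
    return seq_len ** 2 * bytes_per_float

# At the training length the matrix is tiny...
kb_at_20 = attn_score_bytes(20) / 1024            # ~1.6 KB

# ...but it grows quadratically with sequence length:
gb_at_10k = attn_score_bytes(10_000) / 1024**3    # ~0.37 GB per matrix
gb_at_100k = attn_score_bytes(100_000) / 1024**3  # ~37 GB: hopeless on a 4 GB card
```

This is why the transformer rows in the table read OOM at L=10,000 and beyond, while a fixed-size ODE state is unaffected.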
## Architecture
Manifold is a next-generation neural architecture based on Riemannian Geometry and Hamiltonian Dynamics.
This particular model uses:
- Type: Manifold v2.6.2 (Symplectic Neural ODE)
- Dimensions: 128 (Latent), 16 (Coordinate)
- Depth: 6 Layers (Continuous Time)
- Heads: 4 (Isomeric Geodesic Flows)
- Integrator: Leapfrog (Symplectic)
- Physics:
- Active Inference: Enabled (Plasticity 0.1)
- Singularities: Enabled (Strength 5.0)
- Fractal Manifolds: Enabled
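For illustration, the hyperparameters above might be collected into a configuration like the following. The key names are hypothetical, not the Manifold library's actual schema; consult the GitHub repository for the real API:

```python
# Hypothetical config mirroring the hyperparameters listed above.
# Key names are illustrative only -- see the Manifold repository for
# the library's actual configuration schema.
config = {
    "version": "2.6.2",
    "latent_dim": 128,
    "coord_dim": 16,
    "depth": 6,                 # continuous-time layers
    "heads": 4,                 # isomeric geodesic flows
    "integrator": "leapfrog",   # symplectic
    "active_inference": {"enabled": True, "plasticity": 0.1},
    "singularities": {"enabled": True, "strength": 5.0},
    "fractal_manifolds": True,
}
```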
## Task

The model solves the Cumulative Parity (XOR) problem:

- Input: a binary sequence (e.g., `1 0 1 1 ...`)
- Output: the cumulative sum modulo 2 at each position
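The task has a one-line reference implementation. This sketch (independent of the model) can be used to generate targets or check predictions:

```python
def cumulative_parity(bits):
    """Cumulative XOR: out[i] = bits[0] ^ bits[1] ^ ... ^ bits[i]."""
    out, acc = [], 0
    for b in bits:
        acc ^= b          # running sum modulo 2
        out.append(acc)
    return out

# Example matching the task definition above:
cumulative_parity([1, 0, 1, 1])  # -> [1, 1, 0, 1]
```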
## Usage

1. Install the Manifold library:

   ```bash
   pip install gfn
   ```

2. Download the model files:

   ```bash
   # Clone this repository
   git clone https://huggingface.co/Manifold-Labs/manifold-gfn-xor-128d
   cd manifold-gfn-xor-128d
   ```

3. Run inference:

   ```bash
   # Test on sequences of length 100 (5 samples)
   python inference.py --length 100 --samples 5

   # Test on longer sequences (e.g., 1000 tokens)
   python inference.py --length 1000 --samples 3
   ```
Expected output:

```text
[*] Testing Parity/XOR (Length: 100)
------------------------------------------------------------
Seq 1: ✓ PASS
  Input:  00110000000101110110...
  Target: 00100000000110100100...
  Pred:   00100000000110100100...
------------------------------------------------------------
Total Success: 5/5
```
## Performance
- Accuracy: 100% on test set
- Length-Invariant Generalization: Verified up to 100,000 tokens
- Stability: Energy conserved via Hamiltonian constraint
## Links
- GitHub Repository: Manifold-Laboratory/manifold
- Documentation: Full technical details and API reference available in the repository
Manifold Laboratory | Neural ODEs meet Riemannian Geometry
License: Apache License 2.0