A clean-slate Layer-1 blockchain protocol built for the post-quantum era. Native Falcon-1024 from genesis. ORCA parallel execution. Adaptive HotStuff-2 BFT. RISC-V hardware-native execution. No classical elliptic-curve operation at any protocol layer.
LQ1 organises all protocol functions into seven isolated layers — from cryptographic primitives at the base to developer tooling at the top. Each exposes well-defined interfaces, enabling governance-driven upgrades without destabilising adjacent components. Click any layer to explore its simulator.
The application-facing layer exposing NexaScript smart contracts, native account abstraction wallets, cross-chain intent protocols, and the NexaForge developer toolkit. NexaForge provides compile-time static analysis for read/write set validation, formal verification support, and proof generation tooling.
Proprietary post-quantum signature aggregation and verification architecture. cPQC-Agg (Compact Post-Quantum Cryptographic Aggregation) produces constant-size quorum certificates of ~48 bytes amortised per validator with O(1) verification cost — based on the LaBRADOR lattice proof framework. Validators hold two PQ key pairs: a Falcon-1024 identity key and a cPQC-Agg consensus key.
Plonky3 — FRI-based ZK proof system. No trusted setup. Post-quantum secure under BLAKE3 hash collision resistance alone. Proof generation is asynchronous — blocks finalise through HotStuff-2 first, proofs attach in a subsequent round. ZK-verified mode: ~150ms parallelised for 50k tx. Mandatory for cross-chain bridge transactions.
Adaptive HotStuff-2 BFT with O(n) communication complexity. Tolerates up to 1/3 Byzantine validators. Propose → Vote → Commit. LQ1-VRF leader election (BLAKE3 + Falcon-1024). NexaNet uses QUIC transport with Kademlia DHT. IBLT delta propagation: <150ms effective global block propagation.
ORCA constructs a dependency DAG from declared read/write sets and schedules independent paths across all processor cores — eliminating rollback in over 98% of transactions. NexaVM: RISC-V RV64GC ahead-of-time compiled bytecode via LLVM. Near-native performance with hardware acceleration for cryptographic operations.
The base of all protocol trust. Signature and key-encapsulation primitives follow NIST post-quantum standards; hashing uses conservative, widely analysed designs. Falcon-1024 for all transaction and validator signatures (~1,280 bytes). ML-KEM-1024 for key encapsulation (FIPS 203). Poseidon2 for ZK-compatible state hashing. BLAKE3 for general hashing and FRI commitments. Zero classical elliptic-curve operations.
The global blockchain ecosystem carries trillions in value secured by classical cryptographic assumptions that quantum computing renders obsolete. The threat is architectural — embedded in every major deployed chain.
Bitcoin and Ethereum use secp256k1 ECDSA — broken in polynomial time by Shor's algorithm. Solana's Ed25519 signatures are equally vulnerable. EVM rollups depend on pairing-based cryptography whose security rests on elliptic-curve hardness. Retrofitting any of these chains requires consensus-breaking hard forks.
NIST finalised FIPS 203 (ML-KEM) and FIPS 205 (SLH-DSA) in 2024, with FIPS 206 (FN-DSA, the standardised form of Falcon) following in draft — together establishing the approved post-quantum primitive set for long-lived infrastructure. LQ1 integrates all three standard families from genesis as first-class primitives.
Beyond cryptography, a second structural failure: sequential execution. Even the most advanced rollups process transactions one-by-one, leaving multi-core validator hardware almost entirely idle. A protocol designed from genesis for parallel execution delivers order-of-magnitude throughput gains.
LQ1 is the only production-targeted protocol combining post-quantum security, hardware-native execution, massively parallel throughput, and BFT consensus from genesis. No currently deployed blockchain achieves all four properties simultaneously.
Every operation — transaction signing, validator consensus, key encapsulation, state hashing, and ZK proof generation — uses NIST-standardised quantum-resistant primitives exclusively. Zero classical elliptic-curve operations at any protocol layer.
| Purpose | Primitive | Standard | Key Property |
|---|---|---|---|
| Transaction Signatures | Falcon-1024 | FIPS 206 (draft) | ~1,280 byte sigs · lattice-based NTRU · HSM-isolated signing |
| Validator Signatures | Falcon-1024 | FIPS 206 (draft) | Identity key — transaction signing & peer authentication |
| Consensus Aggregation | cPQC-Agg | LaBRADOR | ~48B/validator amortised · O(1) verify · 128B Groth16 proof · 99.98% detection rate (k=80, n=200) |
| Key Encapsulation | ML-KEM-1024 | FIPS 203 | Formerly CRYSTALS-Kyber · IND-CCA2 · NIST Category 5 security |
| ZK-Friendly Hash | Poseidon2 | ZK-std | State tree · execution witness commitments · ZK circuit native |
| General Hash | BLAKE3 | IETF | Block headers · FRI commitments · peer auth · Merkle trees |
| Long-Term Recovery | SLH-DSA-128f | FIPS 205 | Archival key rotation · ~17KB sigs · not used in consensus hot path |
| VRF Leader Election | LQ1-VRF | Protocol | BLAKE3 PRF output + Falcon-1024 proof of knowledge · replaces ECVRF |
Validators hold two post-quantum key pairs: a Falcon-1024 identity key for transaction signing and peer authentication, and a cPQC-Agg consensus key for block voting. cPQC-Agg produces ~48 bytes amortised per validator with O(1) verification cost — allowing the validator set to scale without growing consensus message overhead. There is no classical elliptic-curve key anywhere in the validator credential set.
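The round-robin-free leader election above can be sketched as a hash-based lottery. This is a minimal illustration only: `blake2b` is a stdlib stand-in for BLAKE3, the Falcon-1024 proof of knowledge is omitted entirely, and the validator seeds are made up.

```python
import hashlib

def vrf_output(secret_seed: bytes, round_num: int) -> int:
    # PRF sketch: the real LQ1-VRF pairs a BLAKE3 output with a
    # Falcon-1024 proof of knowledge; blake2b is only a stand-in here.
    msg = secret_seed + round_num.to_bytes(8, "big")
    return int.from_bytes(hashlib.blake2b(msg, digest_size=32).digest(), "big")

def elect_leader(seeds: dict, round_num: int) -> str:
    # Lowest output wins the slot; ties break on validator id.
    return min(seeds, key=lambda v: (vrf_output(seeds[v], round_num), v))

# Hypothetical 4-validator set with fixed seeds for reproducibility.
seeds = {f"val-{i}": bytes([i]) * 32 for i in range(4)}
print(elect_leader(seeds, round_num=7))  # deterministic for a given round
```

Because the output is a pure function of seed and round, every honest node computes the same leader without extra communication.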
ORCA constructs a directed acyclic dependency graph from declared read/write state sets, then schedules independent execution paths across all available processor cores simultaneously — eliminating rollback in over 98% of transactions.
Every transaction declares the state keys it will read and write before execution. The NexaForge toolchain performs static analysis at compile time. Over-declaration is safe and recommended. Undeclared accesses trigger mandatory post-execution conflict detection and rollback.
ORCA builds a directed acyclic graph from declared state sets. Transactions with overlapping write sets are linked with dependency edges. Transactions with no shared state dependencies are disconnected — eligible for fully parallel execution across all CPU cores.
Disconnected DAG nodes execute concurrently. Unlike Block-STM's optimistic concurrency (Aptos), ORCA uses declared dependencies — avoiding speculative execution overhead entirely. Phase 1 simulation confirmed ~121,000 TPS on 12-core hardware.
After execution, ORCA performs a mandatory conflict check comparing actual state accesses against declared sets. Any undeclared access triggers rollback and re-queue. Conflict rate: <2% in practice. Rollback prevents malicious or faulty declarations from corrupting global state.
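The declare → schedule → execute → verify pipeline above can be sketched as follows. This is an illustrative toy, not the ORCA scheduler itself: the wave-grouping heuristic and the transaction tuples are assumptions for demonstration.

```python
from collections import defaultdict

def build_waves(txs):
    """Group transactions into parallel execution 'waves' from their
    declared read/write sets. txs: list of (tx_id, reads, writes).
    A tx depends on any earlier tx whose writes touch its reads or
    writes, or whose reads it overwrites."""
    wave_of = {}
    for i, (tid, reads, writes) in enumerate(txs):
        dep_wave = -1
        for pid, preads, pwrites in txs[:i]:
            if pwrites & (reads | writes) or writes & preads:
                dep_wave = max(dep_wave, wave_of[pid])
        wave_of[tid] = dep_wave + 1
    waves = defaultdict(list)
    for tid, w in wave_of.items():
        waves[w].append(tid)
    return [waves[w] for w in sorted(waves)]

def conflict_check(declared_writes, actual_writes):
    # Post-execution check: any undeclared write triggers rollback.
    return actual_writes <= declared_writes

txs = [
    ("t1", set(), {"a"}),   # writes key a
    ("t2", set(), {"b"}),   # no shared state with t1: same wave
    ("t3", {"a"}, {"c"}),   # reads a, so it must wait for t1
]
print(build_waves(txs))  # [['t1', 't2'], ['t3']]
```

Transactions inside a wave share no state and can run on separate cores; only the waves themselves are ordered.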
Linear O(n) communication complexity. Deterministic block finality. Tolerates up to 1/3 Byzantine validators. Post-quantum leader election via LQ1-VRF. Phase 3 live testnet: 232ms average finality on AWS multi-region infrastructure.
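The 1/3 fault tolerance follows from standard BFT quorum arithmetic, which a few lines make concrete (a generic sketch, not LQ1-specific code):

```python
def max_faults(n: int) -> int:
    # Largest Byzantine count f the protocol tolerates: n >= 3f + 1.
    return (n - 1) // 3

def quorum_size(n: int) -> int:
    # n - f votes suffice: any two quorums then overlap in at least
    # one honest validator, which is what prevents two conflicting
    # blocks from both gathering a quorum certificate.
    return n - max_faults(n)

for n in (4, 100, 200):
    print(n, max_faults(n), quorum_size(n))
```

For the 200-validator mainnet target this gives f = 66 tolerated faults and a 134-vote quorum.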
All performance figures are simulation-confirmed and live-testnet validated across three completed phases. No figures are theoretical without explicit labelling.
- ORCA parallel efficiency at 5% conflict, 8 execution lanes
- Average block finality, Virginia–Oregon, with real Falcon-1024 signatures
- Verification throughput per core (Rust prototype)
- Constraint count for Merkle + CBDS sub-circuits; 128-byte Groth16 proof; ~3ms verification
| Metric | Re-Execution | ZK-Verified |
|---|---|---|
| CPU/tx | ~0.05ms | ~10–30ms |
| Block (50k tx) | ~2,500ms | ~150ms |
| Proof size | None | 40–120KB |
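The block-level figures in the table follow directly from the per-transaction numbers, which a quick arithmetic check confirms:

```python
tx_per_block = 50_000
reexec_ms_per_tx = 0.05        # per-tx re-execution CPU cost from the table
zk_block_verify_ms = 150       # parallelised ZK verification per block

reexec_block_ms = tx_per_block * reexec_ms_per_tx
print(reexec_block_ms)                                  # 2500.0
print(round(reexec_block_ms / zk_block_verify_ms, 1))   # 16.7
```

So a verifier accepting the ZK proof checks a 50k-transaction block roughly 16x faster than re-executing it, at the cost of the 40–120KB proof.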
| Scheme | Quorum Certificate | Verify Cost |
|---|---|---|
| cPQC-Agg | ~48B/validator amortised | O(1) |
| Sequential signatures | grows linearly with validator set | O(n) |

Block propagation: <150ms effective global propagation via IBLT delta sync.
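Using the document's own figures (~1,280 bytes per Falcon-1024 signature, ~48 bytes amortised per validator under cPQC-Agg), the certificate-size gap at the 200-validator mainnet scale is easy to compute:

```python
FALCON_SIG_BYTES = 1280        # ~1,280-byte Falcon-1024 signature
AGG_BYTES_PER_VALIDATOR = 48   # ~48B/validator amortised cPQC-Agg figure

def quorum_cert_bytes(n_validators: int, aggregated: bool) -> int:
    # Sequential: one full Falcon signature per validator.
    # Aggregated: constant-size proof, amortised across the set.
    per = AGG_BYTES_PER_VALIDATOR if aggregated else FALCON_SIG_BYTES
    return n_validators * per

print(quorum_cert_bytes(200, aggregated=False))  # 256000
print(quorum_cert_bytes(200, aggregated=True))   # 9600
```

That is roughly a 27x reduction in per-block certificate bandwidth, and the gap widens as the validator set grows.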
Structured peer-to-peer overlay with three node categories: Validator Nodes (consensus participants), Full Nodes (independent verifiers), and Light Clients (DAS-based verification without full block download).
Multiplexed streams, 0-RTT connection establishment, no TCP head-of-line blocking. Mandatory for all validator-to-validator consensus messaging.
Structured peer discovery. Nodes locate peers in O(log n) time. Bootstrap nodes seed initial connections; routing is fully distributed thereafter.
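Kademlia's O(log n) lookup rests on its XOR distance metric, sketched below with toy one-byte node IDs (a generic illustration, not NexaNet code):

```python
def xor_distance(a: bytes, b: bytes) -> int:
    # Kademlia's metric: bitwise XOR of node IDs, read as an integer.
    return int.from_bytes(a, "big") ^ int.from_bytes(b, "big")

def closest_peers(target: bytes, peers, k: int = 2):
    # Each lookup step queries the k nearest known peers, roughly
    # halving the remaining ID-space distance, hence O(log n) hops.
    return sorted(peers, key=lambda p: xor_distance(target, p))[:k]

peers = [b"\x00", b"\x06", b"\x0f"]
print(closest_peers(b"\x07", peers))  # [b'\x06', b'\x00']
```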
Invertible Bloom Lookup Table delta synchronisation. Nodes exchange only differences vs local mempool — not full block payloads. Sub-150ms effective propagation.
Reed-Solomon erasure coding k=32, n=128. Any 32 of 128 shares reconstruct a complete block. DAS for light clients without full download.
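The "any k of n shares reconstruct the block" property comes from polynomial interpolation. The sketch below demonstrates it at toy scale (k=3, n=6 rather than 32/128) over a prime field; production Reed-Solomon implementations work over GF(2^8) instead.

```python
P = 2**31 - 1  # prime field for this demo only

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x`, mod P."""
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def encode(data, n):
    # Data symbols sit at x = 1..k; shares are evaluations at x = 1..n
    # of the degree-(k-1) polynomial through them.
    points = list(enumerate(data, start=1))
    return [(x, lagrange_eval(points, x)) for x in range(1, n + 1)]

data = [42, 7, 99]        # k = 3 symbols
shares = encode(data, n=6)
subset = shares[3:]       # pretend the first 3 shares were lost
recovered = [lagrange_eval(subset, x) for x in range(1, 4)]
print(recovered)  # [42, 7, 99]
```

Any 3 of the 6 shares pin down the same polynomial, so the original symbols are recoverable; scaled to k=32, n=128 this is what lets light clients sample availability without the full block.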
32–128 cores, 128–256 GB RAM, 10–40 Gbps network. Data-centre grade by design — deliberate tradeoff prioritising throughput. Light validator tier planned (4–8 cores, 16–32 GB RAM).
Up to 33% malicious validators. Multi-peer connections, adaptive slot timing, redundant propagation, geographically diverse validator set. Safety and liveness guaranteed below 1/3 threshold.
Smart contracts compile ahead-of-time into RISC-V RV64GC bytecode via LLVM, eliminating interpreted VM overhead. Hardware-native execution enables vectorised cryptographic operations and potential ASIC/FPGA acceleration.
Open instruction set. Contracts compile to deterministic machine-level bytecode. Full ISA enables CPU vectorisation for Falcon verification and Poseidon2 hashing.
Contracts compile to RISC-V bytecode before deployment. NexaForge static analysis validates read/write completeness and supports formal verification at compile time.
NexaVM generates execution witnesses for ZK-verified mode. Witnesses feed into Plonky3 FRI arithmetic circuit translation — enabling opt-in cryptographic proof of correct execution.
All execution fully deterministic — required for distributed consensus. Read/write sets enforced post-execution. Any undeclared access triggers ORCA rollback and re-queue.
LQ1's quantum-resistant, high-throughput architecture is purpose-built for use cases that cannot accept deferred security risk or throughput ceilings.
Sub-second finality and 100k+ TPS match global card-network capacity with quantum-safe security. Real-time settlement without deferred quantum-migration risk.
ZK-verified execution mode enables proof-of-correct-computation for AI inference — trustless AI output markets with cryptographic auditability.
High-frequency sensor data and micropayment settlements require LQ1's throughput and finality profile. Hardware-native execution supports IoT-scale transaction volumes.
Falcon-1024 / SLH-DSA credentials remain secure over decade-scale time horizons against future quantum adversaries. Long-term identity anchoring.
L1–L2 interface with mandatory ZK proof verification for bridge transactions. Auditable, secure cross-chain transfers with cryptographic guarantees.
NIST-standardised cryptographic primitives satisfy regulatory requirements for financial institutions, government systems, and critical digital infrastructure.
All performance claims are phase-validated. No figures are presented without phase attribution and measurement method.
Full protocol specification (v15.5), cPQC-Agg paper (v4.5), Rust simulator with real Falcon-1024 + ORCA. 232ms live TCP testnet finality achieved.
Core protocol in Go, single-node execution, full ORCA scheduler, NexaVM runtime. Validator onboarding begins.
Multi-node consensus, validator onboarding, stress testing. Public benchmark scripts. TGE milestone prep.
Production launch with 200 initial validators. TGE Q3 2027 with 2% initial circulating supply. cPQC-Agg paper peer-reviewed.
Every architectural claim in LQ1 is grounded in formal specification or controlled empirical measurement. No architectural decision was made without a formal security justification or performance model.
Founder of the Lahara Protocol Foundation and the driving force behind LQ1. Directed the complete research programme and architecture of the LQ1 protocol — including the system-level design of post-quantum cryptographic integration, the ORCA parallel execution framework, the proprietary cPQC-Agg signature aggregation architecture, and the deterministic coordination model.
The Lahara Protocol Foundation is the research body behind LQ1. The Foundation directed all protocol specification work — cryptographic system design, ORCA scheduler architecture, NexaVM execution model, and the cPQC-Agg signature aggregation scheme. Protocol specification v12.7 (2026) is publicly available. Reference implementation published at GitHub.
ORCA constructs a dependency DAG from declared transaction read/write sets. Transactions with no shared state execute concurrently across all CPU cores. Adjust parameters and observe parallel execution in real time.
Visualisation of the HotStuff-2 BFT protocol. Validator nodes vote in rounds — Propose → Vote → Commit. Byzantine validators are marked red. Quorum certificates aggregate via cPQC-Agg. Toggle Byzantine faults and observe protocol recovery.
Model validator incentives — staking, block rewards, slashing conditions, and fee distribution. The LQ1 economic model aligns validator incentives with network security, performance, and decentralisation.
Signing two conflicting blocks at the same height. Enforced deterministically. Full stake slash. No appeals.
Progressive slash rate for extended validator absence. Designed to incentivise uptime without catastrophic penalty for temporary failures.
Submitting blocks with invalid Falcon-1024 or cPQC-Agg signatures. Immediate detection via cryptographic verification. Stake at risk.
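The three slashing conditions above can be summarised as a penalty schedule. Every rate below is a placeholder assumption for illustration; LQ1's actual parameters are governance-defined and not stated in this document.

```python
def slash_fraction(offence: str, offline_epochs: int = 0) -> float:
    """Illustrative penalty schedule; all numeric rates are assumed."""
    if offence == "equivocation":
        return 1.0   # double-signing at the same height: full stake slash
    if offence == "invalid_signature":
        return 0.05  # assumed rate for invalid Falcon/cPQC-Agg signatures
    if offence == "downtime":
        # Progressive: grows with absence, capped below catastrophic loss.
        return min(0.001 * offline_epochs, 0.10)
    return 0.0

print(slash_fraction("equivocation"))        # 1.0
print(slash_fraction("downtime", 5))         # small, grows with absence
print(slash_fraction("downtime", 10_000))    # capped at the ceiling
```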
The Emergency Quantum Switch (EQS) is LQ1's cryptographic agility protocol — enabling live migration between post-quantum primitive sets without halting the network. Simulate a triggered EQS event and observe the protocol's response.
Governance oracle or cryptographic break disclosure triggers EQS flag. Propagated via emergency broadcast to all validators within 3 rounds.
Network enters dual-signature mode — both current and successor primitive sets accepted simultaneously. No downtime. Blocks continue finalising.
Validators rotate to new key pairs within governance-defined window. Key rotation is on-chain verifiable. Non-rotating validators face progressive exclusion.
All state commitments re-anchored under new cryptographic parameters. ZK proof system updated to successor hash function. Full backward compatibility maintained.
Legacy primitive support deprecated. Network operates exclusively on successor cryptographic stack. Migration logged on-chain with full auditability.
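The five EQS stages form a strictly forward-moving state machine, sketched below (phase names are paraphrased from the stages above; the transition structure is the only thing the sketch asserts):

```python
from enum import Enum, auto

class EqsPhase(Enum):
    NORMAL = auto()
    TRIGGERED = auto()        # governance flag broadcast to validators
    DUAL_SIGNATURE = auto()   # both primitive sets accepted, no downtime
    KEY_ROTATION = auto()     # validators rotate to successor key pairs
    REANCHORING = auto()      # state commitments re-anchored
    SUCCESSOR_ONLY = auto()   # legacy primitives deprecated

NEXT = {
    EqsPhase.NORMAL: EqsPhase.TRIGGERED,
    EqsPhase.TRIGGERED: EqsPhase.DUAL_SIGNATURE,
    EqsPhase.DUAL_SIGNATURE: EqsPhase.KEY_ROTATION,
    EqsPhase.KEY_ROTATION: EqsPhase.REANCHORING,
    EqsPhase.REANCHORING: EqsPhase.SUCCESSOR_ONLY,
}

def advance(phase: EqsPhase) -> EqsPhase:
    # Phases move strictly forward; the terminal phase is absorbing,
    # so there is no path back to the deprecated primitive set.
    return NEXT.get(phase, phase)
```

The absence of any backward edge is the point: once migration begins, the network can only converge on the successor stack.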