Biometric identity today trades privacy for convenience: a central database, a cloud MPC service, or a proprietary scanner. sable needs none of these. A Halo2 zero-knowledge proof — generated in ~250 ms on a phone — shows a verifier that your live face matches the enrolled template, revealing nothing else. Offline. No special hardware.
A face is not a password — it can't be rotated. Yet the industry ships architectures that centralise, cache, or transmit the one credential you can never replace. Breaches are irrevocable. Trust is assumed. Even the privacy-preserving alternatives rely on cloud MPC or a blockchain.
Centralised databases — a single breach exposes millions of irrevocable biometric records.
Cloud MPC — templates are sharded across nodes, but you still trust the operators and need always-on internet.
Blockchain ID — solves uniqueness, not privacy; requires gas fees and on-chain verification.
Hardware systems — iris scanners and secure enclaves lock deployment to controlled environments.
Biometrics never leave the device — enrolment produces a Pedersen commitment; the template stays on the phone.
Real zero-knowledge proofs over the biometric data itself — not statistical matching on encrypted fragments.
Offline verification via NFC or BLE — no cloud, no blockchain, no internet dependency.
Any smartphone camera works — screen-flash liveness detection prevents photo-replay without special hardware.
Enrol, authenticate, prove, verify, disclose. Click through a real sable flow.
user@phone sable enrol
[camera]  capturing face …              ✓ 512-d embedding
[quant]   quantising to field elements  ✓ 256 limbs
[hash]    Poseidon(template) …          ✓ 32 B digest
[commit]  Pedersen(template, r) …       ✓ 64 B commit

stored on device:
  · encrypted template (AES-GCM, device key)
  · commitment C = g^t · h^r
  · opening r (never leaves secure storage)

# Nothing sent off-device. No server sees anything.
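The commitment C = g^t · h^r from the enrolment transcript can be sketched with a toy Pedersen commitment. This is an illustrative sketch over a multiplicative group modulo a prime, not sable's actual curve or parameters; `commit` and `open_ok` are hypothetical names, and in a real setup the second base h must be generated so nobody knows its discrete log relative to g.

```python
# Toy Pedersen commitment over Z_p* (illustrative only; a real
# deployment would use an elliptic-curve group with SNARK-friendly
# parameters, and h would come from a nothing-up-my-sleeve procedure).
import secrets

# RFC 3526 2048-bit MODP group prime, generator 2
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7EDEE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF", 16)
G = 2
H = pow(G, secrets.randbelow(P - 1) + 1, P)  # toy second base

def commit(t: int, r: int) -> int:
    """C = g^t * h^r mod p: hiding (r is random) and binding."""
    return (pow(G, t, P) * pow(H, r, P)) % P

def open_ok(C: int, t: int, r: int) -> bool:
    return commit(t, r) == C

template = 123456789                 # stand-in for the quantised template
r = secrets.randbelow(P - 1)
C = commit(template, r)
assert open_ok(C, template, r)       # correct opening verifies
assert not open_ok(C, template + 1, r)   # any other template fails
```

The ZK proof in the later steps shows knowledge of (t, r) opening C without revealing either; this sketch only shows what the commitment itself guarantees.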
user@phone sable auth --verifier gate-42
[camera]   starting liveness sequence …
[flash]    screen → white   reflectance 0.82 ✓
[flash]    screen → red     reflectance 0.41 ✓
[flash]    screen → blue    reflectance 0.53 ✓
[classify] 3D face vs planar display ✓ real

liveness: pass (Tang et al. NDSS 2018 · controlled-illumination)

# The screen is the light source. No depth camera needed.
# Photos and phone-screen replays fail the reflectance model.
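A minimal sketch of the decision the liveness step makes. The two rules and all thresholds below are invented for illustration; the actual classifier is the controlled-illumination reflectance model of Tang et al., not this.

```python
# Illustrative stand-in for a controlled-illumination liveness check.
# Idea: a 3D face reflects each screen-flash colour differently, while
# a flat display replay tends toward a uniform response. Thresholds
# here are made up for the sketch.

FLASH_SEQUENCE = ["white", "red", "blue"]

def liveness_pass(reflectance: dict) -> bool:
    vals = [reflectance[c] for c in FLASH_SEQUENCE]
    # 1. each flash must elicit a plausible, measurable response
    if any(not 0.05 < v < 0.95 for v in vals):
        return False
    # 2. require colour-dependent variation, as a 3D face produces
    spread = max(vals) - min(vals)
    return spread > 0.15

# the values from the demo transcript pass
assert liveness_pass({"white": 0.82, "red": 0.41, "blue": 0.53})
# a near-uniform response, as from a flat screen, fails
assert not liveness_pass({"white": 0.50, "red": 0.49, "blue": 0.51})
```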
user@phone # generating Halo2 proof
[circuit] face-match (Poseidon + Pedersen opening)
[witness] live_template, stored_template, r
[prove]   Halo2 … ✓ 248 ms

proof size:   ~11 KB
public input: commitment C, context nonce
private:      template, opening, live scan

proof π generated (transparent setup — no trusted ceremony)

# The verifier learns: "this person owns a template that opens C,
# and that template matches a live, liveness-checked capture."
# Nothing else.
verifier@gate # receives proof over BLE, no internet
[ble]    received (C, π, nonce) · 11.2 KB
[verify] Halo2 verify … ✓ 1.8 ms

result: ACCEPT
learnt: this bearer controls commitment C
        proof is fresh (nonce matches challenge)

NOT learnt:
  · the template itself
  · the embedding
  · any feature of the face
  · the opening r

# Verification is fully offline. No server, no phone-home.
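The verifier's freshness logic can be sketched as follows. The Halo2 verification itself is stubbed out (`zk_verify` is a placeholder, and every name here is hypothetical); the point is that the nonce is issued per session and consumed on use, so a captured (C, π, nonce) tuple cannot be replayed at the same gate.

```python
# Sketch of a challenge-response verifier with single-use nonces.
# zk_verify stands in for real Halo2 verification with (C, nonce)
# as public input.
import secrets

def zk_verify(C: bytes, proof: bytes, nonce: bytes) -> bool:
    return True  # assume the cryptographic check passes, for the sketch

class Verifier:
    def __init__(self):
        self.nonce = None

    def challenge(self) -> bytes:
        self.nonce = secrets.token_bytes(16)   # fresh per session
        return self.nonce

    def accept(self, C: bytes, proof: bytes, nonce: bytes) -> bool:
        if nonce is None or nonce != self.nonce:
            return False                       # stale or replayed proof
        ok = zk_verify(C, proof, nonce)
        self.nonce = None                      # nonce is single-use
        return ok

v = Verifier()
n = v.challenge()
assert v.accept(b"C", b"proof", n)       # fresh proof: ACCEPT
assert not v.accept(b"C", b"proof", n)   # same proof again: rejected
```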
user@phone sable disclose --claim "age >= 18"
[credential] loading VC issued by gov-au-id
[bbs+]       generating selective disclosure proof …
[bind]       binding to biometric proof π ✓

proved:
  · bearer of commitment C holds a VC from gov-au-id
  · VC asserts date-of-birth <= 2008-04-13
  · VC is not revoked

NOT revealed:
  · actual DOB, name, address, passport number
  · any other attribute on the credential

# BBS+ signatures make "prove predicate, hide attribute" a primitive.
# Combined with the biometric proof: you prove you are the holder,
# and you prove one fact about the holder — nothing more.
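The claim "age >= 18" reduces to a public date cutoff: the hidden DOB is proved to lie on or before it, consistent with the transcript's "date-of-birth <= 2008-04-13". A small sketch of that reduction (the function name is hypothetical; the DOB itself stays inside the disclosure proof and never appears in code like this):

```python
# Computing the public cutoff for an "age >= min_age" predicate.
# Only the cutoff is public; the DOB is a private witness.
from datetime import date

def dob_cutoff(today: date, min_age: int) -> date:
    """Latest date of birth that still satisfies age >= min_age."""
    try:
        return today.replace(year=today.year - min_age)
    except ValueError:  # today is Feb 29 and the target year is not leap
        return today.replace(year=today.year - min_age, day=28)

# matches the demo: a check on 2026-04-13 yields cutoff 2008-04-13
assert dob_cutoff(date(2026, 4, 13), 18) == date(2008, 4, 13)
```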
Halo2 circuits prove that a live, quantised embedding opens a stored Pedersen commitment. The template stays secret. No trusted setup, no ceremony, no toxic waste.
Verification happens peer-to-peer over NFC or BLE. The verifier doesn't need a database, an API, or an internet connection. The phone holds the template; the verifier checks the proof.
Liveness uses controlled-illumination reflectance (Tang et al., NDSS 2018): the screen flashes colours, the camera measures how a 3D face reflects them differently from a flat display. Works on commodity phones.
Enrolment hashes and commits to the template. Authentication regenerates the embedding, proves equality under the commitment, and binds the proof to a fresh challenge from the verifier. Optional verifiable credentials carry attested attributes you can reveal selectively.
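One plausible reading of the quantisation step from the enrol transcript (512-d embedding → 256 limbs): quantise each coordinate to 16 bits and pack two per limb. The scheme below is an assumption for illustration, not sable's actual encoding.

```python
# Hypothetical quantisation: map each float in [-1, 1] to a 16-bit
# integer, then pack pairs into one limb. 512 floats -> 256 limbs,
# each well under any SNARK field modulus.

def quantise(embedding: list) -> list:
    assert len(embedding) % 2 == 0
    q = [round((x + 1.0) / 2.0 * 0xFFFF) for x in embedding]  # -> [0, 65535]
    return [(q[i] << 16) | q[i + 1] for i in range(0, len(q), 2)]

limbs = quantise([0.0] * 512)
assert len(limbs) == 256                  # matches the "256 limbs" in the demo
assert all(0 <= l < 2**32 for l in limbs)
```

Keeping limbs small like this is part of what keeps the Poseidon hash and Pedersen-opening circuit cheap enough for the ~250 ms on-phone proving time.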
SNARK-friendly primitives keep the circuit small enough to prove in ~250ms on a modern phone.
No trusted ceremony, no toxic waste, no "we promise nobody kept the τ." The proof system is auditable.
Controlled-illumination reflectance uses the screen as a light source. Defeats photo and screen replay without IR hardware.
A government-issued VC can be shown as "age >= 18" without revealing DOB — or any other attribute.
The VC disclosure is cryptographically bound to the biometric proof, so a stolen VC can't be replayed by another holder.
Apache 2.0, 519 tests (77 Halo2-specific), 89% coverage. Build it, fuzz it, read the circuits.
Halo2 ZK on the edge. Offline NFC/BLE verification. Selective disclosure. Built on commodity phone hardware, open source under Apache 2.0.
git clone https://codeberg.org/anuna/sable