Compare commits

..

12 commits

Author SHA1 Message Date
escouade-bot
e5be6f5a56 fix: wrap rehash updateProfile in try/catch for best-effort (#54)
All checks were successful
PR Check / rust (push) Successful in 16m33s
PR Check / frontend (push) Successful in 2m14s
PR Check / rust (pull_request) Successful in 16m33s
PR Check / frontend (pull_request) Successful in 2m15s
Both handlePinSuccess handlers (ProfileSwitcher and ProfileSelectionPage)
now catch updateProfile errors so that a failed rehash persistence does
not block switchProfile.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 15:46:27 -04:00
escouade-bot
2f610bf10a fix: make legacy PIN rehash non-blocking in verify_pin (#54)
Replace hash_pin(pin)? with hash_pin(pin).ok() so that a rehash
failure does not propagate as an error. The user can now switch
profiles even if the Argon2id re-hashing step fails — the PIN
is still correctly verified, and the legacy hash remains until
the next successful login.
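The `?` → `.ok()` change can be sketched as follows. This is illustrative only: the real `hash_pin` runs Argon2id, and the stand-in body here just models "hashing can fail"; only the error-handling shape comes from the commit.

```rust
// Illustrative stand-in for the real Argon2id hash_pin: the only point
// is that it returns a Result which may be an Err.
fn hash_pin(pin: &str) -> Result<String, String> {
    if pin.is_empty() {
        return Err("empty PIN".into());
    }
    Ok(format!("argon2id:<hash of {} digits>", pin.len()))
}

// Before the fix, `let new_hash = hash_pin(pin)?;` made verify_pin fail
// outright whenever re-hashing failed. `.ok()` degrades the failure to
// None: verification still reports success and the legacy hash simply
// stays in place until the next successful login.
fn rehash_best_effort(pin: &str) -> Option<String> {
    hash_pin(pin).ok()
}
```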

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 15:46:27 -04:00
escouade-bot
34626711eb fix: address reviewer feedback (#54)
- Add automatic re-hashing of legacy SHA-256 PINs to Argon2id on
  successful verification, returning new hash to frontend for persistence
- Use constant-time comparison (subtle::ConstantTimeEq) for both
  Argon2id and legacy SHA-256 hash verification
- Add unit tests for hash_pin, verify_pin (Argon2id and legacy paths),
  re-hashing flow, error cases, and hex encoding roundtrip
- Update frontend to handle VerifyPinResult struct and save rehashed
  PIN hash via profile update
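The commit uses `subtle::ConstantTimeEq` for the comparison; a dependency-free sketch of the same constant-time idea looks like this (the function name is illustrative, not from the codebase):

```rust
// Constant-time byte comparison: examine every byte and accumulate the
// differences with OR, so running time does not depend on where (or
// whether) the first mismatch occurs. The length check itself is not
// secret-dependent here: hash lengths are public.
fn hashes_match(a: &[u8], b: &[u8]) -> bool {
    if a.len() != b.len() {
        return false;
    }
    let mut diff = 0u8;
    for (x, y) in a.iter().zip(b.iter()) {
        diff |= x ^ y;
    }
    diff == 0
}
```

`subtle`'s `ct_eq` does essentially this (returning a `Choice` instead of short-circuiting on the first differing byte), which is why it closes the timing side channel that a plain `==` on byte slices can leak.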

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 15:46:27 -04:00
escouade-bot
cea16c24ae fix: migrate PIN hashing from SHA-256 to Argon2id (#54)
Replace SHA-256 with Argon2id (m=64MiB, t=3, p=1) for PIN hashing.
Existing SHA-256 hashes are verified transparently via format detection
(argon2id: prefix). New PINs are always hashed with Argon2id.

Addresses CWE-916: Use of Password Hash With Insufficient Computational Effort.
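The transparent-verification dispatch described above can be sketched like this. The enum and function names are hypothetical; only the `argon2id:` prefix convention comes from the commit message.

```rust
// Hypothetical sketch: dispatch on the stored hash format. New hashes
// carry an "argon2id:" prefix; anything else is assumed to be a
// pre-migration bare SHA-256 hex digest and goes down the legacy path
// (then gets re-hashed with Argon2id on successful verification).
#[derive(Debug, PartialEq)]
enum HashKind {
    Argon2id,
    LegacySha256,
}

fn detect_hash_kind(stored: &str) -> HashKind {
    if stored.starts_with("argon2id:") {
        HashKind::Argon2id
    } else {
        HashKind::LegacySha256
    }
}
```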

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-09 15:46:27 -04:00
59cefe8435 feat: license validation commands + entitlements system (#46)
Merge PR #56

Closes #46
2026-04-09 19:35:32 +00:00
le king fu
2e9df1c0b9 fix(rust): pass raw public key bytes to DecodingKey::from_ed_der
All checks were successful
PR Check / rust (push) Successful in 15m54s
PR Check / frontend (push) Successful in 2m15s
PR Check / rust (pull_request) Successful in 16m7s
PR Check / frontend (pull_request) Successful in 2m15s
Previous test refactor wrapped both keys in their respective DER
envelopes. CI surfaced the asymmetry: jsonwebtoken's two from_ed_der
constructors expect different inputs.

- EncodingKey::from_ed_der → PKCS#8 v1 wrapped (ring's
  Ed25519KeyPair::from_pkcs8 path). The 16-byte prefix + 32-byte seed
  blob is correct.
- DecodingKey::from_ed_der → raw 32-byte public key. Internally it
  becomes ring's UnparsedPublicKey::new(&ED25519, key_bytes), which
  takes the bare bytes, NOT a SubjectPublicKeyInfo wrapper.

The test was building an SPKI DER for the public key, so verification
saw a malformed key and failed every signature with InvalidSignature
(`accepts_well_formed_base_license` and `activation_token_matches_machine`).

Drop the SPKI helper, pass `signing_key.verifying_key().to_bytes()`
straight into DecodingKey::from_ed_der. Inline doc-comment captures
the asymmetry so the next person doesn't fall in the same hole.
2026-04-09 11:12:10 -04:00
le king fu
69e136cab0 fix(rust): use DER-built keys in license tests, drop ed25519-dalek pem feature
Some checks failed
PR Check / rust (push) Failing after 10m20s
PR Check / frontend (push) Successful in 2m15s
PR Check / rust (pull_request) Failing after 9m30s
PR Check / frontend (pull_request) Successful in 2m7s
cargo CI flagged: `unresolved import ed25519_dalek::pkcs8::LineEnding`. The
`LineEnding` re-export path varies between pkcs8/spki/der versions, so the
test code that called `to_pkcs8_pem(LineEnding::LF)` won't compile against
the dependency tree we get with ed25519-dalek 2.2 + pkcs8 0.10.

Fix:
- Drop the `pem` feature from the ed25519-dalek dev-dependency.
- In tests, build the PKCS#8 v1 PrivateKeyInfo and SubjectPublicKeyInfo
  DER blobs manually from the raw 32-byte Ed25519 seed/public key. The
  Ed25519 layout is fixed (16-byte prefix + 32-byte key) so this is short
  and stable.
- Pass the resulting DER bytes to `EncodingKey::from_ed_der` /
  `DecodingKey::from_ed_der`.

Refactor:
- Extract `strict_validation()` and `embedded_decoding_key()` helpers so
  the validation config (mandatory exp/iat for CWE-613) lives in one
  place and production callers all share the same DecodingKey constructor.
- `validate_with_key` and `validate_activation_with_key` now take a
  `&DecodingKey` instead of raw PEM bytes; production builds the key
  once via `embedded_decoding_key()`.
- New canary test `embedded_public_key_pem_parses` fails fast if the
  embedded PEM constant ever becomes malformed.
2026-04-09 10:59:12 -04:00
le king fu
99fef19a6b feat: add license validation and entitlements (Rust) (#46)
Some checks failed
PR Check / rust (push) Failing after 5m50s
PR Check / frontend (push) Successful in 2m9s
PR Check / rust (pull_request) Failing after 6m1s
PR Check / frontend (pull_request) Successful in 2m12s
Introduces the offline license infrastructure for the Base/Premium editions.

- jsonwebtoken (EdDSA) verifies license JWTs against an embedded Ed25519
  public key. The exp claim is mandatory (CWE-613) and is enforced via
  Validation::set_required_spec_claims.
- Activation tokens (server-issued, machine-bound) prevent license.key
  copying between machines. Storage is wired up; the actual issuance flow
  ships with Issue #49.
- get_edition() fails closed to "free" when the license is missing,
  invalid, expired, or activated for a different machine.
- New commands/entitlements module centralizes feature → tier mapping so
  Issue #48 (and any future gate) reads from a single source of truth.
- machine-uid provides the cross-platform machine identifier; OS reinstall
  invalidates the activation token by design.
- Tests cover happy path, expiry, wrong-key signature, malformed JWT,
  unknown edition, and machine_id matching for activation tokens.

The embedded PUBLIC_KEY_PEM is the RFC 8410 §10.3 test vector, clearly
labelled as a development placeholder; replacing it with the production
public key is a release-time task.
2026-04-09 10:02:02 -04:00
8afcafe890 Merge pull request 'fix(ci): install Node.js in the rust job' (#62) from fix/check-workflow-rust-node into main 2026-04-09 14:01:39 +00:00
le king fu
60bf43fd65 fix(ci): install Node.js in the rust job
All checks were successful
PR Check / rust (push) Successful in 15m32s
PR Check / frontend (push) Successful in 2m24s
PR Check / rust (pull_request) Successful in 15m44s
PR Check / frontend (pull_request) Successful in 2m33s
actions/checkout@v4 and actions/cache@v4 are JavaScript actions and
require `node` in the container PATH. The rust job in check.yml only
installed system libs and the Rust toolchain, so the post-checkout
cleanup failed with `exec: "node": executable file not found in $PATH`
on every Forgejo run.

The frontend job already installed Node, which is why it succeeded.
The GitHub mirror is unaffected because ubuntu-latest ships with Node
preinstalled.

Validated against the failed run https://git.lacompagniemaximus.com/maximus/Simpl-Resultat/actions/runs/122
2026-04-09 09:44:24 -04:00
b5c81b2a01 Merge pull request 'ci: add PR validation workflow (cargo check/test + npm build) (#60)' (#61) from issue-60-pr-check-workflow into main 2026-04-09 13:31:15 +00:00
le king fu
8e5228e61c ci: add PR validation workflow (#60)
Some checks failed
PR Check / rust (push) Failing after 2m2s
PR Check / frontend (push) Successful in 2m10s
PR Check / rust (pull_request) Failing after 1m32s
PR Check / frontend (pull_request) Successful in 2m8s
Adds .forgejo/workflows/check.yml (and a GitHub mirror) that runs on
every branch push (except main) and on every PR targeting main.

Two parallel jobs:
- rust: cargo check + cargo test, with cargo registry/git/target caches
  keyed on Cargo.lock. Installs the minimal Rust toolchain and the
  webkit2gtk system deps that the tauri build script needs.
- frontend: npm ci + npm run build (tsc + vite) + npm test (vitest),
  with the npm cache keyed on package-lock.json.

The Forgejo workflow uses the ubuntu:22.04 container pattern from
release.yml. The GitHub mirror uses native runners (ubuntu-latest)
since the GitHub mirror exists for portability and uses GitHub-native
actions.

Documents the new workflow in CLAUDE.md alongside release.yml so future
contributors know what CI runs before merge.
2026-04-09 09:21:20 -04:00
10 changed files with 721 additions and 5 deletions


@@ -0,0 +1,97 @@
name: PR Check
# Validates Rust + frontend on every branch push and PR.
# Goal: catch compile errors, type errors, and failing tests BEFORE merge,
# instead of waiting for the release tag (which is when release.yml runs).

on:
  push:
    branches-ignore:
      - main
  pull_request:
    branches:
      - main

jobs:
  rust:
    runs-on: ubuntu
    container: ubuntu:22.04
    env:
      PATH: /root/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
      CARGO_TERM_COLOR: always
    steps:
      - name: Install system dependencies, Node.js and Rust
        run: |
          apt-get update
          apt-get install -y --no-install-recommends \
            curl wget git ca-certificates build-essential pkg-config \
            libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev libssl-dev
          # Node.js is required by actions/checkout and actions/cache (they
          # are JavaScript actions and need `node` in the container PATH).
          curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
          apt-get install -y nodejs
          # Rust toolchain
          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --default-toolchain stable --profile minimal
          node --version
          rustc --version
          cargo --version
      - name: Checkout
        uses: https://github.com/actions/checkout@v4
      - name: Cache cargo registry and git
        uses: https://github.com/actions/cache@v4
        with:
          path: |
            ~/.cargo/registry
            ~/.cargo/git
          key: ${{ runner.os }}-cargo-registry-${{ hashFiles('src-tauri/Cargo.lock') }}
          restore-keys: |
            ${{ runner.os }}-cargo-registry-
      - name: Cache cargo build target
        uses: https://github.com/actions/cache@v4
        with:
          path: src-tauri/target
          key: ${{ runner.os }}-cargo-target-${{ hashFiles('src-tauri/Cargo.lock') }}
          restore-keys: |
            ${{ runner.os }}-cargo-target-
      - name: cargo check
        run: cargo check --manifest-path src-tauri/Cargo.toml --all-targets
      - name: cargo test
        run: cargo test --manifest-path src-tauri/Cargo.toml --all-targets

  frontend:
    runs-on: ubuntu
    container: ubuntu:22.04
    steps:
      - name: Install Node.js 20
        run: |
          apt-get update
          apt-get install -y --no-install-recommends curl ca-certificates git
          curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
          apt-get install -y nodejs
          node --version
          npm --version
      - name: Checkout
        uses: https://github.com/actions/checkout@v4
      - name: Cache npm cache
        uses: https://github.com/actions/cache@v4
        with:
          path: ~/.npm
          key: ${{ runner.os }}-npm-${{ hashFiles('package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-npm-
      - name: Install dependencies
        run: npm ci
      - name: Build (tsc + vite)
        run: npm run build
      - name: Tests (vitest)
        run: npm test

.github/workflows/check.yml (vendored, new file)

@@ -0,0 +1,68 @@
name: PR Check
# Mirror of .forgejo/workflows/check.yml using GitHub-native runners.
# Forgejo is the primary host; this file keeps the GitHub mirror functional.

on:
  push:
    branches-ignore:
      - main
  pull_request:
    branches:
      - main

jobs:
  rust:
    runs-on: ubuntu-latest
    env:
      CARGO_TERM_COLOR: always
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Install system dependencies
        run: |
          sudo apt-get update
          sudo apt-get install -y --no-install-recommends \
            pkg-config libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev libssl-dev
      - name: Install Rust toolchain (stable)
        uses: dtolnay/rust-toolchain@stable
      - name: Cache cargo
        uses: actions/cache@v4
        with:
          path: |
            ~/.cargo/registry
            ~/.cargo/git
            src-tauri/target
          key: ${{ runner.os }}-cargo-${{ hashFiles('src-tauri/Cargo.lock') }}
          restore-keys: |
            ${{ runner.os }}-cargo-
      - name: cargo check
        run: cargo check --manifest-path src-tauri/Cargo.toml --all-targets
      - name: cargo test
        run: cargo test --manifest-path src-tauri/Cargo.toml --all-targets

  frontend:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Node.js 20
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - name: Install dependencies
        run: npm ci
      - name: Build (tsc + vite)
        run: npm run build
      - name: Tests (vitest)
        run: npm test


@@ -2,6 +2,9 @@
 ## [Non publié]
 ### Ajouté
+- CI : nouveau workflow `check.yml` qui exécute `cargo check`/`cargo test` et le build frontend sur chaque push de branche et PR, détectant les erreurs avant le merge plutôt qu'au moment de la release (#60)
+### Modifié
+- Hachage du PIN migré de SHA-256 vers Argon2id pour résistance au brute-force (CWE-916). Les PINs SHA-256 existants sont vérifiés de façon transparente ; les nouveaux PINs utilisent Argon2id (#54)


@@ -2,6 +2,9 @@
 ## [Unreleased]
 ### Added
+- CI: new `check.yml` workflow runs `cargo check`/`cargo test` and the frontend build on every branch push and PR, catching errors before merge instead of waiting for the release tag (#60)
+### Changed
+- PIN hashing migrated from SHA-256 to Argon2id for brute-force resistance (CWE-916). Existing SHA-256 PINs are verified transparently; new PINs use Argon2id (#54)


@@ -157,9 +157,8 @@ Pour maintenir l'éligibilité aux crédits d'impôt R&D (RS&DE fédéral + CRIC
 ## CI/CD
-- GitHub Actions (`release.yml`) déclenché par tags `v*`
-- Build Windows (NSIS `.exe`) + Linux (`.deb`, `.rpm`)
-- Signature des binaires + JSON d'updater pour mises à jour automatiques
+- **`check.yml`** (Forgejo Actions + miroir GitHub) — déclenché sur chaque push de branche (sauf `main`) et chaque PR vers `main`. Lance `cargo check`, `cargo test`, `npm run build` (tsc + vite) et `npm test` (vitest). Doit être vert avant tout merge.
+- **`release.yml`** — déclenché par les tags `v*`. Build Windows (NSIS `.exe`) + Linux (`.deb`, `.rpm`), signe les binaires et publie le JSON d'updater pour les mises à jour automatiques.
 ---


@@ -36,4 +36,12 @@ aes-gcm = "0.10"
 argon2 = "0.5"
 subtle = "2"
 rand = "0.8"
+jsonwebtoken = "9"
+machine-uid = "0.5"
+[dev-dependencies]
+# Used in license_commands.rs tests to sign test JWTs. We avoid the `pem`
+# feature because the `LineEnding` re-export path varies between versions
+# of pkcs8/spki; building the PKCS#8 DER manually is stable and trivial
+# for Ed25519.
+ed25519-dalek = { version = "2", features = ["pkcs8", "rand_core"] }


@@ -0,0 +1,67 @@
// Centralized feature → tier mapping for license entitlements.
//
// This module is the single source of truth for which features are gated by which tier.
// To change what is gated where, modify FEATURE_TIERS only — never sprinkle edition checks
// throughout the codebase.

/// Editions, ordered from least to most privileged.
pub const EDITION_FREE: &str = "free";
pub const EDITION_BASE: &str = "base";
pub const EDITION_PREMIUM: &str = "premium";

/// Maps feature name → list of editions allowed to use it.
/// A feature absent from this list is denied for all editions.
const FEATURE_TIERS: &[(&str, &[&str])] = &[
    ("auto-update", &[EDITION_BASE, EDITION_PREMIUM]),
    ("web-sync", &[EDITION_PREMIUM]),
    ("cloud-backup", &[EDITION_PREMIUM]),
    ("advanced-reports", &[EDITION_PREMIUM]),
];

/// Pure check: does `edition` grant access to `feature`?
pub fn is_feature_allowed(feature: &str, edition: &str) -> bool {
    FEATURE_TIERS
        .iter()
        .find(|(name, _)| *name == feature)
        .map(|(_, tiers)| tiers.contains(&edition))
        .unwrap_or(false)
}

#[tauri::command]
pub fn check_entitlement(app: tauri::AppHandle, feature: String) -> Result<bool, String> {
    let edition = crate::commands::license_commands::current_edition(&app);
    Ok(is_feature_allowed(&feature, &edition))
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn free_blocks_auto_update() {
        assert!(!is_feature_allowed("auto-update", EDITION_FREE));
    }

    #[test]
    fn base_unlocks_auto_update() {
        assert!(is_feature_allowed("auto-update", EDITION_BASE));
    }

    #[test]
    fn premium_unlocks_everything() {
        assert!(is_feature_allowed("auto-update", EDITION_PREMIUM));
        assert!(is_feature_allowed("web-sync", EDITION_PREMIUM));
        assert!(is_feature_allowed("cloud-backup", EDITION_PREMIUM));
    }

    #[test]
    fn base_does_not_unlock_premium_features() {
        assert!(!is_feature_allowed("web-sync", EDITION_BASE));
        assert!(!is_feature_allowed("cloud-backup", EDITION_BASE));
    }

    #[test]
    fn unknown_feature_denied() {
        assert!(!is_feature_allowed("nonexistent", EDITION_PREMIUM));
    }
}


@@ -0,0 +1,460 @@
// License validation, storage and reading for the Base/Premium editions.
//
// Architecture:
// - License key = "SR-BASE-<JWT>" or "SR-PREMIUM-<JWT>", JWT signed Ed25519 by the server
// - Activation token = separate JWT, also signed by the server, binds the license to a machine
//   (machine_id claim must match the local machine_id). Without it, a copied license.key would
//   work on any machine. Activation tokens are issued by the server in a separate flow (Issue #49).
// - Both files live in app_data_dir/ — license.key and activation.token
// - get_edition() returns "free" unless BOTH license JWT is valid (signature + exp) AND
//   either there is no activation token (graceful pre-activation state) OR the activation token
//   matches the local machine_id.
//
// CWE-613: every license JWT MUST carry an `exp` claim. We reject licenses without it.

use jsonwebtoken::{decode, Algorithm, DecodingKey, Validation};
use serde::{Deserialize, Serialize};
use std::fs;
use std::path::PathBuf;
use tauri::Manager;

use super::entitlements::{EDITION_BASE, EDITION_FREE, EDITION_PREMIUM};

// Ed25519 public key for license verification.
//
// IMPORTANT: this PEM is a development placeholder taken from RFC 8410 §10.3 test vectors.
// The matching private key is publicly known, so any license signed with it offers no real
// protection. Replace this constant with the production public key before shipping a paid
// release. The corresponding private key MUST live only on the license server (Issue #49).
const PUBLIC_KEY_PEM: &str = "-----BEGIN PUBLIC KEY-----\n\
MCowBQYDK2VwAyEAGb9ECWmEzf6FQbrBZ9w7lshQhqowtrbLDFw4rXAxZuE=\n\
-----END PUBLIC KEY-----\n";

const LICENSE_FILE: &str = "license.key";
const ACTIVATION_FILE: &str = "activation.token";
const KEY_PREFIX_BASE: &str = "SR-BASE-";
const KEY_PREFIX_PREMIUM: &str = "SR-PREMIUM-";

/// Decoded license metadata exposed to the frontend.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct LicenseInfo {
    pub edition: String,
    pub email: String,
    pub features: Vec<String>,
    pub machine_limit: u32,
    pub issued_at: i64,
    pub expires_at: i64,
}

/// Claims embedded in the license JWT (signed by the license server).
#[derive(Debug, Clone, Serialize, Deserialize)]
struct LicenseClaims {
    sub: String, // email
    iss: String,
    iat: i64,
    exp: i64, // mandatory — see CWE-613
    edition: String,
    #[serde(default)]
    features: Vec<String>,
    machine_limit: u32,
}

/// Claims embedded in the activation token JWT (server-signed, machine-bound).
#[derive(Debug, Clone, Serialize, Deserialize)]
struct ActivationClaims {
    sub: String, // license id or hash
    iat: i64,
    exp: i64,
    machine_id: String,
}

fn app_data_dir(app: &tauri::AppHandle) -> Result<PathBuf, String> {
    app.path()
        .app_data_dir()
        .map_err(|e| format!("Cannot get app data dir: {}", e))
}

fn license_path(app: &tauri::AppHandle) -> Result<PathBuf, String> {
    Ok(app_data_dir(app)?.join(LICENSE_FILE))
}

fn activation_path(app: &tauri::AppHandle) -> Result<PathBuf, String> {
    Ok(app_data_dir(app)?.join(ACTIVATION_FILE))
}

/// Strip the human-readable prefix and return the bare JWT.
fn strip_prefix(key: &str) -> Result<&str, String> {
    let trimmed = key.trim();
    if let Some(jwt) = trimmed.strip_prefix(KEY_PREFIX_BASE) {
        return Ok(jwt);
    }
    if let Some(jwt) = trimmed.strip_prefix(KEY_PREFIX_PREMIUM) {
        return Ok(jwt);
    }
    Err("License key must start with SR-BASE- or SR-PREMIUM-".to_string())
}

/// Build a `Validation` with `exp` and `iat` mandatory. Assertions are explicit so a future
/// config change cannot silently disable expiry checking (CWE-613).
fn strict_validation() -> Validation {
    let mut validation = Validation::new(Algorithm::EdDSA);
    validation.validate_exp = true;
    validation.leeway = 0;
    validation.set_required_spec_claims(&["exp", "iat"]);
    validation
}

/// Build the production `DecodingKey` from the embedded PEM constant.
fn embedded_decoding_key() -> Result<DecodingKey, String> {
    DecodingKey::from_ed_pem(PUBLIC_KEY_PEM.as_bytes())
        .map_err(|e| format!("Invalid public key: {}", e))
}
/// Pure validation: decode the JWT, verify signature with the provided key, ensure the
/// edition claim is one we recognize. Returns `LicenseInfo` on success.
///
/// Separated from the Tauri command so tests can pass their own key.
fn validate_with_key(key: &str, decoding_key: &DecodingKey) -> Result<LicenseInfo, String> {
    let jwt = strip_prefix(key)?;
    let validation = strict_validation();
    let data = decode::<LicenseClaims>(jwt, decoding_key, &validation)
        .map_err(|e| format!("Invalid license: {}", e))?;
    let claims = data.claims;
    if claims.edition != EDITION_BASE && claims.edition != EDITION_PREMIUM {
        return Err(format!("Unknown edition '{}'", claims.edition));
    }
    Ok(LicenseInfo {
        edition: claims.edition,
        email: claims.sub,
        features: claims.features,
        machine_limit: claims.machine_limit,
        issued_at: claims.iat,
        expires_at: claims.exp,
    })
}

/// Validate an activation token against the local machine. The token must be signed by the
/// license server and its `machine_id` claim must match the local machine identifier.
fn validate_activation_with_key(
    token: &str,
    local_machine_id: &str,
    decoding_key: &DecodingKey,
) -> Result<(), String> {
    let validation = strict_validation();
    let data = decode::<ActivationClaims>(token.trim(), decoding_key, &validation)
        .map_err(|e| format!("Invalid activation token: {}", e))?;
    if data.claims.machine_id != local_machine_id {
        return Err("Activation token belongs to a different machine".to_string());
    }
    Ok(())
}

// === Tauri commands ===========================================================================

/// Validate a license key without persisting it. Used by the UI to give immediate feedback
/// before the user confirms storage.
#[tauri::command]
pub fn validate_license_key(key: String) -> Result<LicenseInfo, String> {
    let decoding_key = embedded_decoding_key()?;
    validate_with_key(&key, &decoding_key)
}

/// Persist a previously-validated license key to disk. The activation token (machine binding)
/// is stored separately by [`store_activation_token`] once the server has issued one.
#[tauri::command]
pub fn store_license(app: tauri::AppHandle, key: String) -> Result<LicenseInfo, String> {
    let decoding_key = embedded_decoding_key()?;
    let info = validate_with_key(&key, &decoding_key)?;
    let path = license_path(&app)?;
    if let Some(parent) = path.parent() {
        fs::create_dir_all(parent).map_err(|e| format!("Cannot create app data dir: {}", e))?;
    }
    fs::write(&path, key.trim()).map_err(|e| format!("Cannot write license file: {}", e))?;
    Ok(info)
}

/// Persist a server-issued activation token (machine binding). The token is opaque to the
/// caller — it must validate against the local machine_id to be considered active.
#[tauri::command]
pub fn store_activation_token(app: tauri::AppHandle, token: String) -> Result<(), String> {
    let local_id = machine_id_internal()?;
    let decoding_key = embedded_decoding_key()?;
    validate_activation_with_key(&token, &local_id, &decoding_key)?;
    let path = activation_path(&app)?;
    if let Some(parent) = path.parent() {
        fs::create_dir_all(parent).map_err(|e| format!("Cannot create app data dir: {}", e))?;
    }
    fs::write(&path, token.trim()).map_err(|e| format!("Cannot write activation file: {}", e))
}

/// Read the stored license without revalidating. Returns `None` when no license is present.
/// The returned info is only structurally decoded — call [`get_edition`] for the gating value.
#[tauri::command]
pub fn read_license(app: tauri::AppHandle) -> Result<Option<LicenseInfo>, String> {
    let path = license_path(&app)?;
    if !path.exists() {
        return Ok(None);
    }
    let key = fs::read_to_string(&path).map_err(|e| format!("Cannot read license file: {}", e))?;
    let Ok(decoding_key) = embedded_decoding_key() else {
        return Ok(None);
    };
    Ok(validate_with_key(&key, &decoding_key).ok())
}

/// Returns the active edition (`"free"`, `"base"`, or `"premium"`) for use by feature gates.
///
/// Returns "free" when:
/// - no license is stored,
/// - the license JWT is invalid or expired,
/// - an activation token exists but does not match this machine.
///
/// Note: a missing activation token is treated as a graceful pre-activation state and does
/// NOT downgrade the edition. Server-side activation happens later (Issue #53).
#[tauri::command]
pub fn get_edition(app: tauri::AppHandle) -> Result<String, String> {
    Ok(current_edition(&app))
}

/// Internal helper used by `entitlements::check_entitlement`. Never returns an error — any
/// failure resolves to "free" so feature gates fail closed.
pub(crate) fn current_edition(app: &tauri::AppHandle) -> String {
    let Ok(path) = license_path(app) else {
        return EDITION_FREE.to_string();
    };
    if !path.exists() {
        return EDITION_FREE.to_string();
    }
    let Ok(key) = fs::read_to_string(&path) else {
        return EDITION_FREE.to_string();
    };
    let Ok(decoding_key) = embedded_decoding_key() else {
        return EDITION_FREE.to_string();
    };
    let Ok(info) = validate_with_key(&key, &decoding_key) else {
        return EDITION_FREE.to_string();
    };
    // If an activation token exists, it must match the local machine. A missing token is
    // accepted (graceful pre-activation).
    if let Ok(activation_path) = activation_path(app) {
        if activation_path.exists() {
            let Ok(token) = fs::read_to_string(&activation_path) else {
                return EDITION_FREE.to_string();
            };
            let Ok(local_id) = machine_id_internal() else {
                return EDITION_FREE.to_string();
            };
            if validate_activation_with_key(&token, &local_id, &decoding_key).is_err() {
                return EDITION_FREE.to_string();
            }
        }
    }
    info.edition
}

/// Cross-platform machine identifier. Stable across reboots; will change after an OS reinstall
/// or hardware migration, in which case the user must re-activate (handled in Issue #53).
#[tauri::command]
pub fn get_machine_id() -> Result<String, String> {
    machine_id_internal()
}

fn machine_id_internal() -> Result<String, String> {
    machine_uid::get().map_err(|e| format!("Cannot read machine id: {}", e))
}
// === Tests ====================================================================================

#[cfg(test)]
mod tests {
    use super::*;
    use ed25519_dalek::SigningKey;
    use jsonwebtoken::{encode, EncodingKey, Header};

    // === Manual DER encoder for the Ed25519 private key =======================================
    // We avoid the `pem` feature on `ed25519-dalek` because the `LineEnding` re-export path
    // varies across `pkcs8`/`spki`/`der` versions. The Ed25519 PKCS#8 v1 byte layout is fixed
    // and trivial: 16-byte prefix + 32-byte raw seed.
    //
    // Note the asymmetry in jsonwebtoken's API:
    // - `EncodingKey::from_ed_der` expects a PKCS#8-wrapped private key (passed to ring's
    //   `Ed25519KeyPair::from_pkcs8`).
    // - `DecodingKey::from_ed_der` expects the *raw* 32-byte public key (passed to ring's
    //   `UnparsedPublicKey::new` which takes raw bytes, not a SubjectPublicKeyInfo).

    /// Wrap a 32-byte Ed25519 seed in a PKCS#8 v1 PrivateKeyInfo DER blob.
    fn ed25519_pkcs8_private_der(seed: &[u8; 32]) -> Vec<u8> {
        // SEQUENCE(46) {
        //   INTEGER(1) 0            // version v1
        //   SEQUENCE(5) {
        //     OID(3) 1.3.101.112    // Ed25519
        //   }
        //   OCTET STRING(34) {
        //     OCTET STRING(32) <32 bytes>
        //   }
        // }
        let mut der = vec![
            0x30, 0x2e, 0x02, 0x01, 0x00, 0x30, 0x05, 0x06, 0x03, 0x2b, 0x65, 0x70, 0x04, 0x22,
            0x04, 0x20,
        ];
        der.extend_from_slice(seed);
        der
    }

    /// Build a deterministic test keypair so signed tokens are reproducible across runs.
    fn test_keys(seed: [u8; 32]) -> (EncodingKey, DecodingKey) {
        let signing_key = SigningKey::from_bytes(&seed);
        let pubkey_bytes = signing_key.verifying_key().to_bytes();
        let priv_der = ed25519_pkcs8_private_der(&seed);
        let encoding_key = EncodingKey::from_ed_der(&priv_der);
        // Raw 32-byte public key (NOT SubjectPublicKeyInfo) — see note above.
        let decoding_key = DecodingKey::from_ed_der(&pubkey_bytes);
        (encoding_key, decoding_key)
    }

    fn default_keys() -> (EncodingKey, DecodingKey) {
        test_keys([42u8; 32])
    }

    fn make_token<T: serde::Serialize>(encoding_key: &EncodingKey, claims: &T) -> String {
        encode(&Header::new(Algorithm::EdDSA), claims, encoding_key).unwrap()
    }

    fn now() -> i64 {
        std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)
            .unwrap()
            .as_secs() as i64
    }

    #[test]
    fn rejects_key_without_prefix() {
        let (_enc, dec) = default_keys();
        let result = validate_with_key("nonsense", &dec);
        assert!(result.is_err());
    }

    #[test]
    fn accepts_well_formed_base_license() {
        let (enc, dec) = default_keys();
        let claims = LicenseClaims {
            sub: "user@example.com".to_string(),
            iss: "lacompagniemaximus.com".to_string(),
            iat: now(),
            exp: now() + 86400,
            edition: EDITION_BASE.to_string(),
            features: vec!["auto-update".to_string()],
            machine_limit: 3,
        };
        let jwt = make_token(&enc, &claims);
        let key = format!("{}{}", KEY_PREFIX_BASE, jwt);
        let info = validate_with_key(&key, &dec).unwrap();
        assert_eq!(info.edition, EDITION_BASE);
        assert_eq!(info.email, "user@example.com");
        assert_eq!(info.machine_limit, 3);
    }

    #[test]
    fn rejects_expired_license() {
        let (enc, dec) = default_keys();
        let claims = LicenseClaims {
            sub: "user@example.com".to_string(),
            iss: "lacompagniemaximus.com".to_string(),
            iat: now() - 1000,
            exp: now() - 100,
            edition: EDITION_BASE.to_string(),
            features: vec![],
            machine_limit: 3,
        };
        let jwt = make_token(&enc, &claims);
        let key = format!("{}{}", KEY_PREFIX_BASE, jwt);
        let result = validate_with_key(&key, &dec);
        assert!(result.is_err(), "expired license must be rejected");
    }

    #[test]
    fn rejects_license_signed_with_wrong_key() {
        let (enc_signer, _dec_signer) = default_keys();
        let (_enc_other, dec_other) = test_keys([7u8; 32]);
        let claims = LicenseClaims {
            sub: "user@example.com".to_string(),
            iss: "lacompagniemaximus.com".to_string(),
            iat: now(),
            exp: now() + 86400,
            edition: EDITION_BASE.to_string(),
            features: vec![],
            machine_limit: 3,
        };
        let jwt = make_token(&enc_signer, &claims);
        let key = format!("{}{}", KEY_PREFIX_BASE, jwt);
        let result = validate_with_key(&key, &dec_other);
        assert!(result.is_err(), "wrong-key signature must be rejected");
    }

    #[test]
    fn rejects_corrupted_jwt() {
        let (_enc, dec) = default_keys();
        let key = format!("{}not.a.real.jwt", KEY_PREFIX_BASE);
        let result = validate_with_key(&key, &dec);
        assert!(result.is_err());
    }

    #[test]
    fn rejects_unknown_edition() {
        let (enc, dec) = default_keys();
        let claims = LicenseClaims {
            sub: "user@example.com".to_string(),
            iss: "lacompagniemaximus.com".to_string(),
            iat: now(),
            exp: now() + 86400,
            edition: "enterprise".to_string(),
            features: vec![],
            machine_limit: 3,
        };
        let jwt = make_token(&enc, &claims);
        let key = format!("{}{}", KEY_PREFIX_BASE, jwt);
        let result = validate_with_key(&key, &dec);
        assert!(result.is_err());
    }

    #[test]
    fn activation_token_matches_machine() {
        let (enc, dec) = default_keys();
        let claims = ActivationClaims {
            sub: "license-id".to_string(),
            iat: now(),
            exp: now() + 86400,
            machine_id: "this-machine".to_string(),
        };
        let token = make_token(&enc, &claims);
        assert!(validate_activation_with_key(&token, "this-machine", &dec).is_ok());
    }

    #[test]
    fn activation_token_rejects_other_machine() {
        let (enc, dec) = default_keys();
        let claims = ActivationClaims {
            sub: "license-id".to_string(),
            iat: now(),
            exp: now() + 86400,
            machine_id: "machine-A".to_string(),
        };
        let token = make_token(&enc, &claims);
        let result = validate_activation_with_key(&token, "machine-B", &dec);
        assert!(result.is_err(), "copied activation token must be rejected");
    }

    #[test]
    fn embedded_public_key_pem_parses() {
        // Sanity check that the production PEM constant is well-formed.
        assert!(embedded_decoding_key().is_ok());
    }
}


@@ -1,7 +1,11 @@
-pub mod fs_commands;
+pub mod entitlements;
 pub mod export_import_commands;
+pub mod fs_commands;
+pub mod license_commands;
 pub mod profile_commands;
-pub use fs_commands::*;
+pub use entitlements::*;
 pub use export_import_commands::*;
+pub use fs_commands::*;
+pub use license_commands::*;
 pub use profile_commands::*;


@@ -114,6 +114,13 @@ pub fn run() {
             commands::hash_pin,
             commands::verify_pin,
             commands::repair_migrations,
+            commands::validate_license_key,
+            commands::store_license,
+            commands::store_activation_token,
+            commands::read_license,
+            commands::get_edition,
+            commands::get_machine_id,
+            commands::check_entitlement,
         ])
         .run(tauri::generate_context!())
         .expect("error while running tauri application");