
The Nexun Infrastructure Report: How Seven Automated Checks and a Signed JSON Prove Our No-Logs Claim Every Week

A deep dive into the weekly Nexun Infrastructure Report: what each of the seven automated checks proves, why we publicly disclose "scary-looking" columns, the signed JSON payload and how to verify the SHA-256 yourself.

Published 2026-04-14 · Nexun Team

Most VPN "transparency reports" are annual PDFs written by the same marketing department that wrote the sales page. Ours is different. Every week, a scheduled job on our production infrastructure runs seven automated checks against the live system, writes the results into a database row as a signed JSON payload, and publishes it at nexun.io/transparency/infrastructure. The result is a cryptographically verifiable attestation of what the Nexun backend actually looks like right now — and you can compare any two weeks to see nothing has changed.
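The pipeline above can be sketched in a few lines. This is an illustrative shape only — the function name, field names, and report structure here are assumptions for the example, not Nexun's actual code:

```python
import hashlib
import json
from datetime import datetime, timezone

def attest(checks: dict) -> dict:
    """Build a weekly attestation row (hypothetical shape, for illustration)."""
    now = datetime.now(timezone.utc)
    payload = {
        "schema_version": 2,
        "generated_at": now.isoformat(),
        "checks": checks,
        "overall_pass": all(checks.values()),
    }
    # Canonical JSON: sorted keys, no extra whitespace -- same bytes every time,
    # so the SHA-256 is reproducible by anyone with the payload.
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    return {
        # %G/%V are the ISO year and ISO week, matching IDs like 2026-W16-093318
        "report_id": f"{now.strftime('%G-W%V')}-{now.strftime('%H%M%S')}",
        "payload": payload,
        "sha256": hashlib.sha256(canonical).hexdigest(),
    }
```

The key property is that the hash is fixed at creation time over the canonical bytes, so any later edit to the payload is detectable.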

See the live Infrastructure Report
Weekly SHA-256 attestation with 7 automated checks

What the report card shows you at a glance

At the top of /transparency/infrastructure you see the report card. It contains four things: a Report ID such as 2026-W16-093318 (ISO year, ISO week, timestamp), a generated-at date down to the second, an overall status pill ("All checks passed" or "Needs review"), and a 64-character SHA-256 content hash. That hash is computed over the canonical JSON payload of the report. Change a single byte of the payload and the hash becomes a completely different string. That is how you know the page you are reading has not been tampered with.

The seven checks, explained

Below the header is a summary block with seven rows, each marked pass or fail. Every row is generated by a different automated probe that runs against the live machine — not a static config file. Here is what each one proves:

  • No log files on disk — the probe scans /var/log/wireguard, /var/log/nexun and /var/log/openvpn. If any of those directories exist or contain files, the check fails. A passing result tells you there is nowhere on disk for connection logs to land even if something tried to write them.
  • API surface enumerated — FastAPI introspects its own routing table and lists every mounted prefix with its HTTP methods (for example /admin has 33 endpoints, /beta has 62, /transparency has 5). The point is not the numbers — it is that the full shape of the API is public. There are no hidden routes.
  • Data inventory documented — the probe reads the live PostgreSQL schema for vpn_users, vpn_subscriptions, vpn_support_tickets and billing_events and classifies every column (identifier, contact, payment, operational, timestamp, credential, unknown). If a new column is added in production and it has not been documented in the inventory, the check fails.
  • User tables have no IP columns — the probe searches every user-related table for columns named client_ip, ip_address, ip, remote_ip, created_ip, last_ip or source_ip. Finding even one would fail the check. The passing result means the database literally has no column where an IP could be stored.
  • Database query logging disabled — PostgreSQL settings are read live: log_duration=off, log_statement=none, log_connections=off, log_disconnections=off, log_min_duration_statement=-1. If anyone ever flipped one of those on to "debug something", the report would catch it within 7 days.
  • No analytics SDKs in dependencies — the probe parses the exact Python requirements installed on the running API (22 packages) and searches for a deny-list: sentry-sdk, datadog, newrelic, mixpanel, segment-analytics-python, posthog, rollbar, bugsnag, google-analytics, amplitude, heap, fullstory. A passing result means none of the usual silent data-siphon libraries are in the environment.
  • No audit or log extensions installed — the probe queries pg_extension against a deny-list: pgaudit, pg_stat_statements, pg_qualstats, auto_explain. These are Postgres extensions that silently record query history. Only pgcrypto and plpgsql are installed — both required, neither logs anything.
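To make the "no IP columns" check concrete, here is a minimal sketch of what such a probe could look like. In production the table-to-columns mapping would be read from `information_schema.columns` on the live database; the function below is a hypothetical illustration of the deny-list logic only:

```python
# Deny-list of column names the check searches for in user-related tables.
IP_COLUMN_DENYLIST = {
    "client_ip", "ip_address", "ip", "remote_ip",
    "created_ip", "last_ip", "source_ip",
}

def check_no_ip_columns(schema: dict) -> tuple:
    """Return (passed, findings) for a {table: [column, ...]} mapping.

    A single matching column name anywhere in the schema fails the check.
    """
    findings = [
        f"{table}.{column}"
        for table, columns in schema.items()
        for column in columns
        if column.lower() in IP_COLUMN_DENYLIST
    ]
    return (not findings, findings)
```

Because the check enumerates live schema rather than a config file, adding an IP column anywhere in production would flip the result on the next weekly run.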

The schema disclosures — why we show you the "scary" columns

The report also carries a short disclosure section. Some of our columns have names that look alarming next to a "no logs" claim. Rather than bury them, we explain every one of them on the same page — hiding them would defeat the entire point.

  • vpn_support_tickets.log_data — a debug bundle a user can voluntarily attach to a support ticket. It is gated by the include_logs boolean on the same row. It is never written for normal VPN traffic; it is only filled when someone ticks "include logs" in the support form themselves.
  • vpn_users.test_password — a plaintext password for App Store and Google Play reviewer accounts. Apple and Google both require a working test login to approve app updates. A database CHECK constraint (chk_test_password_only_for_test) makes it physically impossible to write this column on any row where is_test_account is false.
  • vpn_users.notes — a free-text field for operator notes (refund context, support follow-ups). Never visible to other users; only readable by level-9 admins. Any GDPR access request returns this field as part of the user export.
  • vpn_subscriptions.raw — the exact webhook payload we receive from Stripe / Apple / Google for each subscription event, stored verbatim so we can reconcile billing disputes. It contains no information beyond what the payment provider already has about the same transaction.

The full JSON payload and how to verify it yourself

At the bottom of the report page we render the exact JSON document that the SHA-256 was computed over. Anyone can scroll through it: the checks, the findings arrays, every table and every column in the data inventory, the Postgres settings, the installed extensions, the dependency list and the summary. The format is stable (schema_version: 2) so two reports from different weeks can be diffed directly. To verify the hash yourself you canonicalise the JSON (sorted keys, no extra whitespace) and compute a SHA-256 over the bytes. If your result matches the hash in the header card, the report is intact. If it does not match, something is wrong and we would like to know about it.
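The verification recipe above fits in a few lines of Python. This sketch assumes you have pasted the payload from the page into a Python dict (for example via `json.loads`); the function name is ours, not part of any Nexun tooling:

```python
import hashlib
import json

def verify_report(payload: dict, expected_hash: str) -> bool:
    """Recompute the SHA-256 over the canonical JSON and compare.

    Canonical form: sorted keys, no extra whitespace, UTF-8 bytes.
    """
    canonical = json.dumps(
        payload, sort_keys=True, separators=(",", ":")
    ).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest() == expected_hash
```

If `verify_report` returns `True`, the payload you are reading is byte-for-byte the one that was hashed; if it returns `False`, either the payload or the displayed hash has been altered.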

What changes between weeks

Because the report is generated automatically from the live system, it will change when the system genuinely changes. A typical week might see the total_packages number tick up or down as we update dependencies, or endpoint_count change inside /beta as we add or retire beta features. What should never change is the set of categories ("no IP columns", "no query logging", "no analytics SDKs", "no audit extensions") — any change there would flip the overall_pass flag to false and show a red "Needs review" pill instead of green. If you ever see that happen, we will owe you an explanation, and the report is designed to make the explanation unavoidable.
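Because the payload format is stable, the "set of categories should never change" property is itself checkable. A minimal sketch, assuming each payload carries a `checks` mapping of check name to result (an assumption for this example, not the documented schema):

```python
def categories_changed(report_a: dict, report_b: dict) -> set:
    """Return check names present in one weekly report but not the other.

    An empty set means the two weeks ran exactly the same set of checks;
    counters inside each check (package totals, endpoint counts) may still differ.
    """
    return set(report_a["checks"]) ^ set(report_b["checks"])
```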

Why this is more honest than an annual PDF

A yearly PDF is a snapshot written by humans. It can be re-edited, re-worded, and quietly re-uploaded. A weekly signed attestation generated from the machine itself cannot. The database row for each report is keyed by its ISO week; the hash is fixed at creation; and every line of the payload is a direct read from the live infrastructure. If we ever had to silently soften a claim, the checks would fail and the page would show it. That is the real difference between promising privacy and proving it.

How to use the report as a user

Open nexun.io/transparency to see the current week at a glance: canary colour, infrastructure status pill, SHA-256 fingerprint. Click through to /transparency/infrastructure to read every check. Bookmark the page and reopen it in a week — the Report ID (2026-Wxx-...) and the hash should both have changed, proving the attestation is live. If anything in the "never-log" section or the check list changes without a public announcement from us, treat it as a reason to ask questions. That is exactly what this report is for.


FAQ

How do I reproduce the SHA-256 hash myself?

Copy the full JSON payload shown at the bottom of /transparency/infrastructure, canonicalise it with sorted keys and UTF-8 encoding (Python: json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()), then compute sha256 over those bytes. The 64-character hex string you get should equal the hash displayed in the header card. If it does, the report you read is exactly the report we signed.

Why is test_password allowed to be plaintext?

Because Apple and Google both require their reviewers to log in during app review, and a hashed password cannot be used for that. We isolate the risk with a CHECK constraint: chk_test_password_only_for_test physically prevents a plaintext password from being stored on any row except one where is_test_account is true. That means the column exists, but it can only ever hold the credentials for accounts that are not real users.
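The invariant the CHECK constraint enforces can be mirrored in application code. This is a hypothetical illustration of the rule, not Nexun's implementation — the real guarantee lives in the database constraint itself:

```python
def validate_test_password(is_test_account: bool, test_password) -> None:
    """Mirror of the chk_test_password_only_for_test rule (illustrative):
    a non-null test_password is only valid when is_test_account is true.
    """
    if test_password is not None and not is_test_account:
        raise ValueError("test_password may only be set on test accounts")
```

Enforcing the rule at the database layer rather than only in code means even a buggy or malicious write path cannot attach a plaintext password to a real user's row.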

What would a failing report look like?

The header pill would turn red with "Needs review" instead of green, the failing check row would flip from pass to fail, and the findings array underneath would list the specific item that triggered the failure (a file that appeared in /var/log, an IP column found in a user table, an unexpected Postgres extension installed, and so on). The SHA-256 hash of a failing report would still be valid — it would just be signing a different, honest story about the system that week.
