Exposure Notification Reference Key Server

In our continued effort to help governments and health authorities during the COVID-19 pandemic, we have authored an open source reference implementation of an Exposure Notification Key Server.

The reference server in this repository implements the Exposure Notifications API and provides reference code for working with the Android and iOS apps built by public health authorities. The reference server source code is available on GitHub and can be deployed on any infrastructure or cloud provider selected by a public health authority.

Our hope is that by making this privacy-preserving server implementation available to health authorities, we can enable their developers to use the open source code to get started quickly.

Overview

The server is responsible for the following functions:

  • Accepting the temporary exposure keys of affected users from mobile devices.

  • Validating the temporary exposure keys using the device attestation API.

  • Storing the temporary exposure keys in a database.

  • Periodically generating incremental files that mobile devices download to perform the key matching algorithm on-device.

  • Sending a public key to devices, and digitally signing the incremental files with a private key (a signing sketch follows this list).

  • Periodically deleting old temporary exposure keys. After 14 days (or a configured time period), the exposure keys can no longer be matched to a device.
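
Because the signing step above is central to how devices trust the downloaded files, here is a minimal sketch of what it involves: the export format carries an ECDSA signature (P-256 over SHA-256) that devices verify with the public key distributed to them. This is an illustration only, not the server's actual code; in a real deployment the private key would be held in a key management service.

    package main

    import (
        "crypto/ecdsa"
        "crypto/elliptic"
        "crypto/rand"
        "crypto/sha256"
        "fmt"
    )

    // signExport signs a serialized export archive so that devices holding
    // the matching public key can verify the download before running the
    // key matching algorithm.
    func signExport(priv *ecdsa.PrivateKey, export []byte) ([]byte, error) {
        digest := sha256.Sum256(export)
        // ASN.1/DER-encoded ECDSA signature over the SHA-256 digest.
        return ecdsa.SignASN1(rand.Reader, priv, digest[:])
    }

    func main() {
        // Hypothetical key for illustration; production keys live in a KMS.
        priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }
        sig, err := signExport(priv, []byte("example export bytes"))
        if err != nil {
            panic(err)
        }
        fmt.Printf("signature: %x\n", sig)
    }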

Tutorials and reference documentation

You can read tutorials on deploying and using the reference Exposure Notification Key Server in the project's documentation on GitHub.

Issues and Questions

You can open a GitHub Issue. Please include as much detail as you can to help us address your concern. If you wish to reach out privately, you can send an e-mail to [email protected].

Contributing to this project

Contributions to this project are welcomed. For more information about contributing to this project, see the contribution guidelines.

Comments
  • Failed to create files for batch


    Creating index file for batch 17: creating file /index.txt in bucket exposure-notification-export-cxxdc: storage.Writer.Close: googleapi: Error 400: Bucket is requester pays bucket but no user project provided., required.
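
    For context, that 400 is Cloud Storage rejecting a write to a Requester Pays bucket when no billing project is supplied. Purely as an illustration (not the exporter's actual code), the Go storage client attaches a billing project to bucket operations like this; the project name below is a placeholder:

        package main

        import (
            "context"
            "log"

            "cloud.google.com/go/storage"
        )

        func main() {
            ctx := context.Background()
            client, err := storage.NewClient(ctx)
            if err != nil {
                log.Fatal(err)
            }
            defer client.Close()

            // Without UserProject, writes to a Requester Pays bucket fail with
            // "Bucket is requester pays bucket but no user project provided".
            bucket := client.Bucket("exposure-notification-export-cxxdc").UserProject("my-billing-project")
            w := bucket.Object("index.txt").NewWriter(ctx)
            if _, err := w.Write([]byte("placeholder index contents\n")); err != nil {
                log.Fatal(err)
            }
            if err := w.Close(); err != nil {
                log.Fatal(err)
            }
        }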

  • Broadcasting a list of infected persons is not GDPR-compliant


    Describe the bug

    The whole concept of this server is to store a list of keys from infected COVID-19 persons and send it to everyone. I know the list is supposed to be anonymous, but locally you can match the IDs to a person. Even a beginner software developer could modify the mobile app (open source and open APIs are good things, but they are transparent and modifiable) to store a GPS position and timestamp with every key it observes, and since the modified app never has to be uploaded to the Store, Google cannot check this. When the list of infected patients' keys (or their IDs, even if you encrypt everything 100 times) is sent from the server, the modified app can find out where and when it detected each infected person. If such a modified app is distributed, you can even build a whole database of keys.

    This huge data privacy leak was already mentioned in the DP-3T white paper: https://github.com/DP-3T/documents/blob/master/DP3T%20White%20Paper.pdf

    "Infected individuals. The centralised and decentralised contact tracing systems share the inherent privacy limitation that they can be exploited by an eavesdropper to learn whether an individual user got infected and by a tech-savvy user to reveal which individuals in their contact list might be infected now. However, the centralised design does not allow proactive and retroactive linkage attacks by tech-savvy users to learn which contacts are infected because the server never reveals the EphIDs of infected users."

    The so-called "retroactive linkage attacks by tech-savvy users" are a huge problem!

    Furthermore, the whole thing is not GDPR-compliant and does not conform to the EU recommendations: https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32020H0518&from=EN

    (16) With particular regard to the use of COVID-19 mobile warning and prevention applications, the following principles should be observed: (1) safeguards ensuring respect for fundamental rights and prevention of stigmatization, in particular applicable rules governing protection of personal data and confidentiality of communications; (4) effective cybersecurity requirements to protect the availability, authenticity, integrity, and confidentiality of data; (5) the expiration of measures taken and the deletion of personal data obtained through these measures when the pandemic is declared to be under control, at the latest; (6) uploading of proximity data in case of a confirmed infection and appropriate methods of warning persons who have been in close contact with the infected person, who shall remain anonymous; and (7) transparency requirements on the privacy settings to ensure trust into the applications.

    To Reproduce: broadcasting of the keys.

    Expected behavior: no broadcasting of infected keys combined with open source and open APIs.

    Desktop: all OSs, all versions.

    Smartphone: all smartphones.

    Additional context: major data privacy leak.

  • Error on terraform apply on vpcaccess.googleapis.com service


    Question

    I am trying to deploy the exposure server on our own GCP account. I am pretty new to GCP and Terraform, and I got stuck on an error after running terraform apply. Billing is enabled on our GCP account, and I followed every step from the official docs on deploying with Terraform, but then I ran into the errors below.

    null_resource.migrate (local-exec): Finished Step #1 - "migrate"
    null_resource.migrate (local-exec): PUSH
    null_resource.migrate (local-exec): DONE
    null_resource.migrate (local-exec): -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    
    null_resource.migrate (local-exec): ID                                    CREATE_TIME                DURATION  SOURCE                                                                                            IMAGES  STATUS
    null_resource.migrate (local-exec): 16082944-ccd9-4c81-bdd1-741514d590ed  2020-08-30T16:32:31+00:00  1M52S     gs://gaen-project-name_cloudbuild/source/1598805146.595107-09c406fd75a34731976a71eb5e92cc9c.tgz  -       SUCCESS
    null_resource.migrate: Creation complete after 2m1s [id=1281597494475427654]
    
    Error: Error running command './../scripts/build': exit status 1. Output: ✋ Uncommitted local changes!
    
    
    
    
    Error: Request "Enable Project Service \"vpcaccess.googleapis.com\" for project \"gaen-project-name\"" returned error: failed to send enable services request: googleapi: Error 400: Another activation or deactivation is in progress for the following service(s): 
    vpcaccess.googleapis.com and project gaen-project-name., failedPrecondition
    
      on main.tf line 27, in resource "google_project_service" "services":
      27: resource "google_project_service" "services" {
    

    Below are the commands I have executed so far.

    // history
       66  git clone [email protected]:google/exposure-notifications-server.git
       67  cd exposure-notifications-server/
       68  git checkout tags/v0.5.1
       69  git status
       70  unset GOOGLE_APPLICATION_CREDENTIALS
       71  export PROJECT_ID="gaen-project-name"
       72  gcloud auth login && gcloud auth application-default login
       73  cd terraform/
       74  echo "project = \"${PROJECT_ID}\"" >> ./terraform.tfvars
       75  echo 'generate_cron_schedule = "*/15 * * * *"' >> ./terraform.tfvars
       76  cat terraform.tfvars
       77  gsutil mb -p ${PROJECT_ID} gs://${PROJECT_ID}-tf-state
       78  cat <<EOF > ./state.tf
    terraform {
      backend "gcs" {
        bucket = "${PROJECT_ID}-tf-state"
      }
    }
    EOF
    
       79  terraform init
       80  history
       81  terraform apply
    
    

    Any thoughts on how I can continue?

  • Error in setting up CLEANUP_TTL to delete older export files


    To stay within the maximum file limit for iOS and Android, we reset the export config to a minimum period of 24 hours so that fewer files are generated. The index file still contains references to the older files, which we tried to clear by setting CLEANUP_TTL to a lower value of 24 hours to get rid of the older batch files, but the cleanup jobs always fail with this error: "error processing cutoff time: cleanup ttl is less than configured minimum ttl"

    Checking the code base, "CLEANUP_TTL" seems to be set in three different services: "export", "cleanup exposure", and "cleanup export".

    Please advise.
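
    For reference, the error message suggests the cleanup services enforce a floor on CLEANUP_TTL, so lowering it below that floor is rejected before any cutoff is computed. A hedged sketch of the kind of guard that would produce this error follows; the 10-day minimum and the function name are assumptions, not the server's actual values:

        package main

        import (
            "errors"
            "fmt"
            "os"
            "time"
        )

        // cutoffTime turns CLEANUP_TTL into a deletion cutoff, refusing TTLs
        // below a configured minimum.
        func cutoffTime(minTTL time.Duration) (time.Time, error) {
            ttl, err := time.ParseDuration(os.Getenv("CLEANUP_TTL")) // e.g. "24h"
            if err != nil {
                return time.Time{}, fmt.Errorf("parsing CLEANUP_TTL: %w", err)
            }
            if ttl < minTTL {
                // The condition behind "cleanup ttl is less than configured minimum ttl".
                return time.Time{}, errors.New("cleanup ttl is less than configured minimum ttl")
            }
            return time.Now().Add(-ttl), nil
        }

        func main() {
            cutoff, err := cutoffTime(10 * 24 * time.Hour) // hypothetical 10-day floor
            if err != nil {
                fmt.Println("error processing cutoff time:", err)
                return
            }
            fmt.Println("deleting exports older than", cutoff)
        }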

  • Add tests for Google Cloud Storage storage


    We need to:

    1. Create a bucket
    2. Grant the prow runner service account permission to manage objects in that bucket
    3. Set GOOGLE_CLOUD_BUCKET on the prow workers (a rough test sketch follows)
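
    As a rough sketch of what such a test could look like, assuming the bucket named by GOOGLE_CLOUD_BUCKET already exists and the CI service account can manage objects in it (test and object names below are illustrative, not the repo's existing test layout):

        package storage_test

        import (
            "context"
            "io"
            "os"
            "testing"

            "cloud.google.com/go/storage"
        )

        // TestGoogleCloudStorageRoundTrip writes an object, reads it back,
        // and cleans up; it skips when no test bucket is configured.
        func TestGoogleCloudStorageRoundTrip(t *testing.T) {
            bucket := os.Getenv("GOOGLE_CLOUD_BUCKET")
            if bucket == "" {
                t.Skip("GOOGLE_CLOUD_BUCKET not set")
            }

            ctx := context.Background()
            client, err := storage.NewClient(ctx)
            if err != nil {
                t.Fatal(err)
            }
            defer client.Close()

            obj := client.Bucket(bucket).Object("gcs-roundtrip-test-object")
            w := obj.NewWriter(ctx)
            if _, err := w.Write([]byte("hello")); err != nil {
                t.Fatal(err)
            }
            if err := w.Close(); err != nil {
                t.Fatal(err)
            }
            defer obj.Delete(ctx)

            r, err := obj.NewReader(ctx)
            if err != nil {
                t.Fatal(err)
            }
            defer r.Close()
            got, err := io.ReadAll(r)
            if err != nil {
                t.Fatal(err)
            }
            if string(got) != "hello" {
                t.Errorf("got %q, want %q", got, "hello")
            }
        }
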
  • Add staticcheck linter


    scripts/presubmit.sh: fix typo

    scripts/presubmit.sh: add newlines for consistency

    ci: add staticcheck to presubmit.sh

    This depends on a new docker image, which will need to be manually pushed by an administrator (CC @sethvargo)

    Fixes #443

    all: fix staticcheck warnings

  • Posting keys from previously working clients results in error 400


    Recently, submitting exposure keys from clients that were successfully working before started returning: Error 400: unable to read request data: exposure keys have overlapping intervals. Any pointers on how to trace this and find the root cause?
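
    For illustration only: each submitted key carries a rolling start interval number (10-minute windows since the Unix epoch) and a rolling period, and the publish endpoint rejects uploads whose validity intervals overlap. The sketch below shows the kind of check that produces this message; the struct and field names are assumptions, not the server's actual types.

        package main

        import (
            "errors"
            "fmt"
            "sort"
        )

        // exposureKey models just the two fields needed for the interval check.
        type exposureKey struct {
            RollingStartNumber int32 // 10-minute interval index since the Unix epoch
            RollingPeriod      int32 // number of intervals the key was valid for
        }

        // checkIntervals rejects key sets whose validity windows overlap.
        func checkIntervals(keys []exposureKey) error {
            sort.Slice(keys, func(i, j int) bool {
                return keys[i].RollingStartNumber < keys[j].RollingStartNumber
            })
            for i := 1; i < len(keys); i++ {
                prev := keys[i-1]
                if prev.RollingStartNumber+prev.RollingPeriod > keys[i].RollingStartNumber {
                    return errors.New("exposure keys have overlapping intervals")
                }
            }
            return nil
        }

        func main() {
            keys := []exposureKey{
                {RollingStartNumber: 2650000, RollingPeriod: 144},
                {RollingStartNumber: 2650100, RollingPeriod: 144}, // overlaps the previous key
            }
            fmt.Println(checkIntervals(keys)) // exposure keys have overlapping intervals
        }

    A reasonable first step is therefore to log the rolling start numbers and periods in a rejected payload and look for two keys that claim the same window.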

  • Deployment fails with Cloud Run Error


    Deployment fails with Cloud Run Error: ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information.
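
    For what it's worth, Cloud Run emits exactly this error when the container does not bind the port named in the PORT environment variable, or crashes before it starts listening. A minimal sketch of a compliant entrypoint, purely illustrative and not the exposure server's actual startup code:

        package main

        import (
            "fmt"
            "log"
            "net/http"
            "os"
        )

        func main() {
            // Cloud Run injects PORT; the fallback is only for local runs.
            port := os.Getenv("PORT")
            if port == "" {
                port = "8080"
            }
            http.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
                fmt.Fprintln(w, "ok")
            })
            log.Fatal(http.ListenAndServe(":"+port, nil))
        }

    The revision logs mentioned in the error usually show whether the process exited early or simply listened on the wrong port.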

  • Fix/terraform regions


    The infrastructure deployment must target a region that supports both Cloud Run and App Engine, otherwise terraform fails with errors. I've added a locals entry mapping each valid Cloud Run region to the associated App Engine region. This has the side effect of preventing deployments to invalid regions from progressing past the "terraform plan" stage.

  • SafetyNet nonce is not randomly-generated by server


    Location: internal/android/nonce.go

    Problem: the nonce is generated based on values selected by the client app. Without being random, the attestation may be replayed to the server. This concern is consistent with the SafetyNet documentation on obtaining a nonce.

    Proposed Fix: I think we need the client to call the server to get a nonce, then reply with the same nonce. This will introduce some statefulness on the server. Alternatively, we could generate PRF(K, expiry_time, other_metadata) so that we need not store the nonce server-side: this gives the attacker a window for replay, but I think it is acceptable.

    The nonce could look like expiry_time||other_metadata||PRF(...)
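
    A rough sketch of the stateless variant proposed above, with HMAC-SHA256 standing in for the PRF; the encoding and key handling here are assumptions for illustration, not a concrete design for this repository:

        package main

        import (
            "crypto/hmac"
            "crypto/sha256"
            "encoding/binary"
            "fmt"
            "time"
        )

        // makeNonce builds expiry_time || metadata || HMAC(K, expiry_time || metadata).
        func makeNonce(key, metadata []byte, expiry time.Time) []byte {
            payload := make([]byte, 8, 8+len(metadata))
            binary.BigEndian.PutUint64(payload, uint64(expiry.Unix()))
            payload = append(payload, metadata...)

            mac := hmac.New(sha256.New, key)
            mac.Write(payload)
            return append(payload, mac.Sum(nil)...)
        }

        // verifyNonce recomputes the MAC and checks the expiry, so the server keeps
        // no per-nonce state; replay is possible only until the expiry passes.
        func verifyNonce(key, nonce []byte, metadataLen int) bool {
            if len(nonce) != 8+metadataLen+sha256.Size {
                return false
            }
            payload := nonce[:8+metadataLen]
            expiry := time.Unix(int64(binary.BigEndian.Uint64(payload[:8])), 0)
            if time.Now().After(expiry) {
                return false
            }
            mac := hmac.New(sha256.New, key)
            mac.Write(payload)
            return hmac.Equal(mac.Sum(nil), nonce[8+metadataLen:])
        }

        func main() {
            key := []byte("server-side-secret-for-illustration")
            n := makeNonce(key, []byte("app-meta"), time.Now().Add(10*time.Minute))
            fmt.Println(verifyNonce(key, n, len("app-meta"))) // true
        }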

    Note: there are options opts.MinValidTime and opts.MaxValidTime in the code; they may also mitigate replays to some extent. I'm not sure what happens if the nonce is re-used while the timestamp still validates: the SafetyNet page warns that attackers could gather attestation blobs.

    Note also that the use of opts.MinValidTime and opts.MaxValidTime is currently optional in the code.

  • Terraform e2e: add back custom db deletion step


    Unfortunately, terraform doesn't like #1097, complaining about cyclic dependencies:

    Error: Cycle: module.en.google_sql_user.user, module.en.google_sql_database.db, module.en.google_sql_database_instance.db-inst, module.en.google_sql_ssl_cert.db-cert
    

    https://prow.k8s.io/view/gcs/oss-prow/logs/ci-en-server-terraform-smoke/1318009362538565632#1:build-log.txt%3A88

    Restore the custom db deletion step to make the terraform smoke test work.
