Two Pillars of Privacy Preservation

Make a difference


This year’s competition will challenge teams from around the world on two specific tasks: 1) homomorphic encryption and 2) differential privacy. We are seeking practical answers for applying these two widely adopted privacy-computing methods to real-life applications in which blockchain technology has become essential to effective business operations and management in today’s and tomorrow’s digital economy.

Homomorphic Encryption

Homomorphic encryption “is a form of encryption that permits users to perform computations on its encrypted data without first decrypting it. These resulting computations are left in an encrypted form which, when decrypted, result in an identical output to that produced had the operations been performed on the unencrypted data. Homomorphic encryption can be used for privacy-preserving outsourced storage and computation. This allows data to be encrypted and out-sourced to commercial cloud environments [such as distributed blockchain applications] for processing, all while encrypted.” (https://en.wikipedia.org/wiki/Homomorphic_encryption)
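To make the idea concrete, here is a minimal sketch of the homomorphic property using "textbook" (unpadded) RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. The parameters below are tiny and deliberately insecure; this is an illustration of the concept only, not of any scheme the competition may specify.

```python
# Toy multiplicative homomorphism with textbook RSA (insecure parameters).
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 7, 11
c1, c2 = encrypt(m1), encrypt(m2)

# Computing on ciphertexts: multiplying them multiplies the plaintexts mod n,
# without ever decrypting the inputs.
product_cipher = (c1 * c2) % n
assert decrypt(product_cipher) == (m1 * m2) % n
```

Fully homomorphic schemes extend this idea to support both addition and multiplication on encrypted data, which is what makes general outsourced computation possible.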

Task 1: Homomorphic encryption

Background:

Goal:

Challenge:

Evaluation:

Dataset:

Reference:

 

License: This challenge requires every participating team to share their algorithms, code and/or binaries under the BSD 3-Clause open-source license. The organizers will upload all submitted algorithms, code and/or binaries to a GitHub repository or the like under the BSD 3-Clause License right after the competition results are announced. By submitting their algorithms, code and/or binaries, the participants automatically consent to allow the challenge organizers to release their algorithms, code and/or binaries under the BSD 3-Clause License.

Differential Privacy 

Differential Privacy “is a mathematical framework for ensuring the privacy of individuals’ data in the presence of data analysis. It works by adding random noise to the data in such a way that the privacy of individuals is protected while still allowing for meaningful analysis. This method guarantees that an attacker who has access to the data and the analysis results cannot determine the values of any individual’s data with high accuracy.” (ChatGPT)
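The standard way to realize this guarantee for numeric queries is the Laplace mechanism: add noise drawn from a Laplace distribution whose scale is the query's sensitivity divided by the privacy parameter epsilon. The sketch below, using only the Python standard library, answers a counting query (sensitivity 1) with epsilon-differential privacy; the dataset and predicate are made-up illustrations, not part of the competition specification.

```python
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    # The difference of two independent exponentials with mean `scale`
    # is distributed Laplace(0, scale).
    return rng.expovariate(1 / scale) - rng.expovariate(1 / scale)

def private_count(records, predicate, epsilon: float, rng: random.Random) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one individual's
    record changes the true count by at most 1, so Laplace noise with
    scale = 1/epsilon suffices for the epsilon-DP guarantee.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1 / epsilon, rng)

# Hypothetical example: how many people in the dataset are over 40?
ages = [23, 45, 31, 67, 52, 38, 44, 29]          # true count is 4
rng = random.Random(0)
noisy_answer = private_count(ages, lambda a: a > 40, epsilon=1.0, rng=rng)
```

Because the noise has mean zero, the noisy answer is unbiased: individual answers fluctuate, but aggregate analyses remain meaningful while no single record can be pinned down with high confidence.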

Task 2: Differential Privacy

Background:

Goal:

Challenge:

Evaluation:

Dataset:

Reference:

 

License: This challenge requires every participating team to share their algorithms, code and/or binaries under the BSD 3-Clause open-source license. The organizers will upload all submitted algorithms, code and/or binaries to a GitHub repository or the like under the BSD 3-Clause License right after the competition results are announced. By submitting their algorithms, code and/or binaries, the participants automatically consent to allow the challenge organizers to release their algorithms, code and/or binaries under the BSD 3-Clause License.