Week 7 - Solution

Assignment week 7

1. When calculating the sensitivity in ε-Differential Privacy, where the value to be derived
from the data points is a d-dimensional vector, identify the normalisation technique used.
(Notations are the same as used in the lecture)
a. Manhattan normalisation
b. Euclidean normalisation
c. Max normalisation
d. Min-max normalisation
e. Sigmoid normalisation
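
For intuition, below is a minimal sketch of how the ℓ1 (Manhattan) sensitivity of a d-dimensional query could be estimated over a toy dataset; the query, the data, and the removal-based notion of neighbouring datasets are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def l1_sensitivity(query, dataset):
    """Estimate the L1 (Manhattan) sensitivity of a vector-valued query over a fixed
    toy dataset by removing one record at a time and taking the largest L1 change."""
    base = query(dataset)
    worst = 0.0
    for i in range(len(dataset)):
        neighbour = np.delete(dataset, i, axis=0)                   # dataset with record i removed
        worst = max(worst, np.abs(query(neighbour) - base).sum())   # L1 distance between outputs
    return worst

# Toy 3-dimensional counting query: per-column sums over a small binary dataset.
data = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
print(l1_sensitivity(lambda d: d.sum(axis=0), data))   # prints 2: removing any row changes two counts by 1
```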

2. In (ε, δ)-Differential Privacy, what does δ = 0 imply? (Notations are the same as used in
the lecture)
a. The equation P(M(x) ∈ S) ≤ e^ε · P(M(x') ∈ S) should hold for some of the subsets S
b. The equation P(M(x) ∈ S) ≤ e^ε · P(M(x') ∈ S) should hold for most of the subsets S
c. The equation P(M(x) ∈ S) ≤ e^ε · P(M(x') ∈ S) should hold for all of the subsets S
d. The equation P(M(x) ∈ S) ≤ e^ε · P(M(x') ∈ S) should hold for none of the subsets S
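
For reference, the (ε, δ) guarantee the options refer to is the standard definition below, written here in LaTeX rather than the lecture's exact notation; setting δ = 0 removes the additive slack and leaves only the multiplicative e^ε bound.

```latex
% (ε, δ)-Differential Privacy: for all neighbouring datasets x, x' and all measurable sets S,
\[
  \Pr\bigl[ M(x) \in S \bigr] \;\le\; e^{\varepsilon}\, \Pr\bigl[ M(x') \in S \bigr] + \delta .
\]
% With δ = 0 there is no additive slack left, i.e. the pure ε-DP guarantee.
```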

3. How do the utilities vary in the Laplacian mechanism vs the Gaussian mechanism in a
higher-dimensional differential privacy setting?
a. As the dimension increases, the Gaussian mechanism requires quadratically more noise than the Laplacian mechanism, decreasing the utility
b. As the dimension increases, the Gaussian mechanism requires quadratically less noise than the Laplacian mechanism, decreasing the utility
c. As the dimension increases, the Gaussian mechanism requires quadratically less noise than the Laplacian mechanism, increasing the utility
d. As the dimension increases, the Gaussian mechanism requires quadratically more noise than the Laplacian mechanism, increasing the utility
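
A rough numerical comparison is sketched below, under the illustrative assumption that each of the d output coordinates has sensitivity 1 (so the ℓ1 sensitivity is d and the ℓ2 sensitivity is √d); the noise calibrations are the standard ones and not necessarily the lecture's exact constants.

```python
import numpy as np

# Compare per-coordinate noise scales of the Laplace and Gaussian mechanisms as the
# output dimension d grows, assuming per-coordinate sensitivity 1 (Δ1 = d, Δ2 = √d).
eps, delta = 1.0, 1e-5
for d in [1, 10, 100, 1000]:
    laplace_scale = d / eps                                                # b = Δ1 / ε
    gaussian_sigma = np.sqrt(d) * np.sqrt(2 * np.log(1.25 / delta)) / eps  # σ = Δ2·√(2 ln(1.25/δ)) / ε
    print(d, laplace_scale, gaussian_sigma)
```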

4. The _____ property ensures that privacy-protected data _____ its privacy guarantee
after a function is applied to it.
a. i. Post-processing ii. Retains
b. i. Post-processing ii. Loses
c. i. Composition ii. Retains
d. i. Composition ii. Loses
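
A minimal post-processing sketch with illustrative parameters: once a value has been released through a differentially private mechanism, any data-independent function applied afterwards keeps the same guarantee.

```python
import numpy as np

# A count released with Laplace noise calibrated to sensitivity/ε is ε-DP; clamping and
# rounding the released value does not touch the raw data, so the ε guarantee is retained.
rng = np.random.default_rng()
true_count, eps, sensitivity = 42, 1.0, 1
noisy = true_count + rng.laplace(scale=sensitivity / eps)   # ε-DP release
post_processed = max(0, round(noisy))                       # still ε-DP: no new access to the data
print(noisy, post_processed)
```
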
5. After using k mechanisms to obtain k (ε, δ)-differentially private variations of a
dataset, the combined leakage observed from these k mechanisms can be minimised by:
a. Using Laplacian Mechanism
b. Using Gaussian Mechanism
c. Using Uniform Mechanism
d. Using Exponential Mechanism
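
For context on how leakage accumulates across k mechanisms, here is a small sketch of two standard composition bounds (basic and advanced); the exact bounds used in the lecture may differ.

```python
import math

# Composing k (ε, δ)-DP mechanisms gives (kε, kδ)-DP by basic composition; advanced
# composition gives roughly (ε·√(2k ln(1/δ')) + kε(e^ε − 1), kδ + δ')-DP for a chosen δ' > 0.
def basic_composition(k, eps):
    return k * eps

def advanced_composition(k, eps, delta_prime):
    return math.sqrt(2 * k * math.log(1 / delta_prime)) * eps + k * eps * (math.exp(eps) - 1)

k, eps = 50, 0.1
print(basic_composition(k, eps))            # 5.0
print(advanced_composition(k, eps, 1e-6))   # a tighter total ε for large k
```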

6. In a buyer-seller problem, given n buyers and n valuations by the buyers, what is the
total revenue given a price p?
a. p · Σ_{i=1}^{n} A_i, where A_i = 1 if v_i ≥ p and A_i = 0 otherwise
b. p · Σ_{i=1}^{n} A_i, where A_i = 0 if v_i ≥ p and A_i = 1 otherwise
c. pn
d. p(n − 1)
e. p(1/n)
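
Below is a minimal sketch of the standard revenue computation at a posted price p, with illustrative valuations: every buyer whose valuation is at least p buys one unit at price p.

```python
def revenue(p, valuations):
    """Total revenue at price p: each buyer with valuation v_i >= p buys one unit."""
    return p * sum(1 for v in valuations if v >= p)

# Toy example (illustrative values only): 5 buyers with the valuations below.
vals = [3.0, 7.5, 4.2, 9.0, 5.5]
print(revenue(5.0, vals))   # 3 buyers have v_i >= 5.0, so revenue = 5.0 * 3 = 15.0
```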

7. In the exponential mechanism used to calculate the price that maximises the revenue, identify the
correct statement for the scenario where two unequal prices result in the same revenue:
a. Both prices have an unequal probability of being selected
b. Both prices have an equal probability of being selected
c. A higher price has a higher probability of being chosen due to normalisation
d. A lower price has a higher probability of being chosen due to normalisation
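
Here is a small sketch of the exponential mechanism over a set of candidate prices, using revenue as the score; the candidate prices, valuations, and sensitivity bound are illustrative assumptions. Since the selection weight depends only on the score, two prices with the same revenue receive the same weight.

```python
import numpy as np

# The exponential mechanism picks a candidate with probability proportional to
# exp(ε · score / (2 · Δscore)); here the score of a price is the revenue at that price.
def exponential_mechanism(prices, valuations, eps, sensitivity):
    scores = np.array([p * np.sum(np.array(valuations) >= p) for p in prices])  # revenue per price
    weights = np.exp(eps * scores / (2 * sensitivity))
    probs = weights / weights.sum()          # equal revenue ⇒ equal selection probability
    return np.random.choice(prices, p=probs)

vals = [3.0, 7.5, 4.2, 9.0, 5.5]
# sensitivity = 9.0 is a loose illustrative bound on how much one buyer can change the revenue.
print(exponential_mechanism([4.0, 5.0, 6.0, 8.0], vals, eps=1.0, sensitivity=9.0))
```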

8. In a classification problem, if a data point lies on a hyperplane that perfectly separates
the two classes, the probability of the data point belonging to class A is:
a. 25%
b. 50%
c. 75%
d. 100 %
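
As a quick check, assuming a logistic (sigmoid) classifier, a point lying exactly on the separating hyperplane satisfies w·x + b = 0:

```python
import math

# A point exactly on the separating hyperplane has w·x + b = 0, so the predicted
# class probability is sigmoid(0).
sigmoid = lambda z: 1 / (1 + math.exp(-z))
print(sigmoid(0.0))   # 0.5
```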

9. In a vanilla Principal Component Analysis method, the reconstruction loss of a protected
group is _______ than the remaining data before resampling and _______ than the
remaining data after resampling.
a. Higher, higher
b. Higher, lower
c. Lower, higher
d. Lower, lower
10. The goal of Fair PCA is to find a PCA solution U, where U = [Ua, Ub], such that the
reconstruction losses of the two groups A and B, where A is the protected group, are:
a. Equal
b. Unequal
c. The protected group has a lower reconstruction loss
d. The protected group has a higher reconstruction loss
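
To make the reconstruction-loss comparison behind questions 9 and 10 concrete, here is a toy sketch with synthetic data, assuming "reconstruction loss" means the mean squared error after projecting onto the top-k principal components of the pooled data; the group sizes and distributions are illustrative only.

```python
import numpy as np

# With vanilla PCA the smaller (protected) group typically ends up with a higher loss,
# since the principal directions are dominated by the larger group; Fair PCA instead
# seeks a projection that equalises the two losses.
rng = np.random.default_rng(0)
group_a = rng.normal(size=(50, 5)) @ rng.normal(size=(5, 5))    # protected group (smaller)
group_b = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))   # remaining data (larger)

X = np.vstack([group_a, group_b])
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
U = Vt[:2].T                                                    # top-2 principal directions

def recon_loss(G):
    Gc = G - mean
    return np.mean(np.sum((Gc - Gc @ U @ U.T) ** 2, axis=1))

print(recon_loss(group_a), recon_loss(group_b))
```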

11. In an ideal situation where the models are completely fair, the different parity values are:
a. Approach 0
b. 1
c. Approach 1
d. 0

12. Match the following:

i.   P(M(x) = 1 | x ∈ C) − P(M(x) = 1)                      a. Fair Logistic Regression
ii.  P(M(x) = 1 | y = 1 and C) − P(M(x) = 1 | y = 1)        b. Statistical Parity
iii. P(M(x) = 1 | C = 1) − P(M(x) = 1 | C = 0)              c. Equality of Opportunity

a. i. - a, ii. - b, iii. - c
b. i. - b, ii. - a, iii. - c
c. i. - c, ii. - a, iii. - b
d. i. - b, ii. - c, iii. - a
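
A small empirical sketch of the three quantities i-iii on toy arrays (illustrative only), where C marks protected-group membership, y the true label, and y_hat the model output M(x):

```python
import numpy as np

# Toy predictions for 8 points; the first four belong to the protected group C.
y     = np.array([1, 0, 1, 1, 0, 1, 0, 1])
y_hat = np.array([1, 0, 1, 0, 0, 1, 1, 1])
C     = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=bool)

gap_i   = y_hat[C].mean() - y_hat.mean()                        # P(M=1 | x ∈ C) − P(M=1)
gap_ii  = y_hat[(y == 1) & C].mean() - y_hat[y == 1].mean()     # P(M=1 | y=1, C) − P(M=1 | y=1)
gap_iii = y_hat[C].mean() - y_hat[~C].mean()                    # P(M=1 | C=1) − P(M=1 | C=0)

print(gap_i, gap_ii, gap_iii)   # for a completely fair model these gaps approach 0
```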
