
PREVENTING PROFILE SENSITIVE INFORMATION
DISCLOSURES ON OSN

Presented By: Ms. Priti N. Rathod, ME Student
Guided By: Prof. G. P. Chakote (Guide), Prof. M. B. Kalkumbe (Co-guide)
CONTENT
1. INTRODUCTION
2. OBJECTIVE
3. MOTIVATION
4. LITERATURE SURVEY
5. ISSUES
6. PROBLEM STATEMENT
7. SA DISCLOSURE EFFECT
8. PROPOSED SYSTEM
9. SYSTEM ARCHITECTURE
10. ADVANTAGES OF PROPOSED SYSTEM
11. APPLICATION
12. CONCLUSION
13. FUTURE WORK
14. REFERENCES
15. SNAPSHOTS
INTRODUCTION
OBJECTIVE

• Preserve the privacy of user profiles.
• Provide minimum information loss.
• Minimize crimes on the Internet.
MOTIVATION

• Growing rate of crimes on the Internet.
• Preserving users' privacy.
• Enabling secure communication between users.
• Protection from hackers and other unknown users.
LITERATURE SURVEY

1. "Inferring Privacy Information From Social Networks" — Jianming He, Wesley W. Chu, and Zhenyu (Victor) Liu, 2010.
   Methodology: Bayesian networks.
   Description: Uses a Bayesian network approach to infer private information via friendship links.
   Advantage: Results reveal that personal attributes can be inferred with high accuracy, especially when people are connected by strong relationships.

2. "Community-Enhanced De-anonymization of Online Social Networks" — Shirin Nilizadeh, Apu Kapadia, and Yong-Yeol Ahn, 2014.
   Methodology: Divide-and-conquer approach.
   Description: Re-identifies users by partitioning the network into communities and performing a two-stage mapping.
   Advantage: Reduces the anonymity of users.

3. "De-anonymizing Social Networks" — A. Narayanan and V. Shmatikov, 2009.
   Methodology: Divide-and-conquer approach.
   Description: De-anonymization algorithms that re-identify users in a social network.
   Advantage: Reduces the anonymity of users.

4. "Curso: Protect Yourself from the Curse of Attribute Inference" — Eunsu Ryu, Yao Rong, Jie Li, and Ashwin Machanavajjhala, 2013.
   Methodology: (1) Social-Attribute Network model, (2) deterministic algorithm, (3) utility functions.
   Description: Results indicate that a user's sensitive information can be inferred from friendship information and group membership, and that disclosure of one user's hidden information can breach her friends' privacy.
   Advantage: Determines whether Alice's sensitive attribute can be inferred from public information in Alice's neighborhood, and whether making Alice's sensitive attribute public leads to the disclosure of sensitive information of another user, Bob, in Alice's neighborhood.
ISSUES

• The existing scheme cannot reasonably balance privacy and data utility.

• It cannot detect collective attacks in diverse large-scale social networks.
PROBLEM STATEMENT

• A challenging task in an OSN is to protect the privacy of users' profiles and communications.

• The system addresses privacy-preserving user profiles and secure communication in online social networks (OSN).
SENSITIVE ATTRIBUTE DISCLOSURE EFFECT

Figure: Effects of disclosure of sensitive attributes on data privacy


PROPOSED SYSTEM

• Analyze the factors that have a direct impact on the privacy of a user profile, i.e., sensitivity and visibility.

• Sensitivity is the risk associated with a profile attribute.

• Visibility is a factor that has a direct impact on user privacy.

• A data sanitization technique is used to sanitize users' sensitive data; it hides the most sensitive attributes: DOB, gender, email, and contact number.

• Finally, the overall privacy disclosure score is computed to restrict users' sharing behavior.
Sensitivity of profile attributes

• Sensitivity shows the risk associated with an attribute of the user.

• As the sensitivity of an attribute increases, the risk posed by disclosing that attribute also increases.

Sensitivity is calculated using the following equation, where N is the total number of users and Ri is the number of users who have made attribute i public:

    Sensitivity = (N - Ri) / N

For example:

    Sensitivity_DOB = (N - DOBpublic_cnt) / N
                    = (546 - 191) / 546
                    = 0.65018
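The worked example above can be sketched as follows; `N` and `dob_public_cnt` are the counts from the slide's example dataset, and the function name is illustrative.

```python
# Sensitivity = (N - Ri) / N, where N is the total number of users and
# Ri is the number of users who made attribute i public.
def sensitivity(n_users: int, public_count: int) -> float:
    """Risk of an attribute: highest when few users make it public."""
    return (n_users - public_count) / n_users

N = 546               # total users (from the example)
dob_public_cnt = 191  # users whose DOB is public

print(round(sensitivity(N, dob_public_cnt), 5))  # 0.65018
```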
Visibility of profile attributes

• Visibility determines how widely accessible a user's attributes are in an online social network.

The visibility V of a profile item i for user j is calculated using the following equation, where Ri is the public count for the attribute, N is the number of users, and n is the number of attributes:

    V = (Ri * Rj) / (N * n)

For example:

    V_DOB = (DOBpublic_cnt * Rj) / (N * n)
          = (191 * 0.698) / (546 * 8)
          = 0.0305
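A minimal sketch of this computation, with `Rj` (the per-user accessibility ratio) taken as the 0.698 value from the slide's example; the function name is illustrative.

```python
# V = (Ri * Rj) / (N * n): how widely attribute i of user j is exposed,
# where Ri is the public count for the attribute, Rj the user's
# accessibility ratio, N the number of users, n the number of attributes.
def visibility(public_count: int, r_j: float, n_users: int, n_attrs: int) -> float:
    return (public_count * r_j) / (n_users * n_attrs)

v_dob = visibility(191, 0.698, 546, 8)
print(round(v_dob, 5))  # 0.03052
```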
Privacy disclosure score

• Finally, we compute the overall privacy disclosure score (PDS) from the sensitivity and visibility of the attributes, to raise awareness among OSN users.

• A profile that shares the most sensitive information is at greater risk than one that shares less sensitive information.

The PDS for user j is the visibility-weighted sum of the attribute sensitivities:

    PQj = Σi (Vi × Sensitivityi)

For example:

    PQj = Vdob*dobSensitivity + Vgender*genderSensitivity +
          Vcontact*contactSensitivity + Vemail*emailSensitivity +
          Vaddress*addressSensitivity + VcurrentCity*currentCitySensitivity +
          Vedu*educationSensitivity + Vjob*jobDetailsSensitivity
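As a sketch, the score for one user can be computed from the Table 4.1 sensitivities and that user's visibility scores; the dictionary keys are illustrative names, and the visibility values below are those tabulated for user A.

```python
# Privacy disclosure score: PQj = sum_i V_ij * S_i, the visibility-weighted
# sum of attribute sensitivities for user j.
sensitivity = {
    "dob": 0.65018, "gender": 0.68864, "contact": 0.8956, "email": 0.80769,
    "address": 0.47985, "current_city": 0.47619,
    "job_details": 0.51465, "education": 0.49267,
}
visibility_user_a = {
    "dob": 0.1749, "gender": 0.1556, "contact": 0.0521, "email": 0.0961,
    "address": 0.26, "current_city": 0.2619,
    "job_details": 0.2426, "education": 0.2536,
}
pds = sum(visibility_user_a[a] * sensitivity[a] for a in sensitivity)
print(round(pds, 3))  # close to the 0.845 reported for user A
```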
Table 4.1 Sensitivity score for user profile attributes

    Attribute      Sensitivity
    DOB            0.65018
    Gender         0.68864
    Contact        0.8956
    Email          0.80769
    Address        0.47985
    Current city   0.47619
    Job details    0.51465
    Education      0.49267

Table 6.3 Attribute Visibility Score

    User  Gender   DOB     Contact  Email   Address  Current city  Education  Job details
    A     0.1556   0.1749  0.0521   0.0961  0.26     0.2619        0.2536     0.2426
    B     0.2724   0.306   0.0913   0.1682  0.4551   0.4583        0.4439     0.4246
    C     0.1945   0.2186  0.0652   0.1201  0.325    0.1309        0.1268     0.1213
    D     0.0389   0.0437  0.013    0.024   0.0650   0.0654        0.0634     0.0606
    E     0.1556   0.1749  0.0521   0.0961  0.26     0.2619        0.2536     0.2426
    F     0.1167   0.1311  0.0391   0.0721  0.195    0.1964        0.1902     0.182
    G     0.1556   0.1749  0.0521   0.0961  0.26     0.2619        0.2536     0.2426
    H     0.1167   0.1311  0.0391   0.0721  0.195    0.1964        0.1902     0.182
    I     0.2335   0.2623  0.0783   0.1442  0.3901   0.3928        0.3804     0.364
    J     0.0778   0.0874  0.026    0.048   0.13     0.1309        0.1268     0.1213

Table 6.4 Privacy Disclosure Score

    User  Gender  DOB    Contact  Email   Address  Current city  Education  Job details  Privacy score
    A     0.175   0.156  0.0521   0.0961  0.26     0.2619        0.243      0.2536       0.845
    B     0.306   0.272  0.0913   0.1682  0.4551   0.4583        0.425      0.4439       1.478
    C     0.219   0.195  0.0652   0.1201  0.325    0.1309        0.121      0.1268       1.056
    D     0.044   0.039  0.013    0.024   0.0655   0.0654        0.061      0.0634       0.211
    E     0.175   0.156  0.0521   0.0961  0.26     0.2619        0.243      0.2536       0.845
    F     0.131   0.117  0.0391   0.0721  0.195    0.1964        0.182      0.1902       0.634
    G     0.175   0.156  0.0521   0.0961  0.26     0.2619        0.243      0.2536       0.845
    H     0.131   0.117  0.0391   0.0721  0.195    0.1964        0.182      0.1902       0.634
    I     0.262   0.234  0.0783   0.1442  0.3901   0.3928        0.364      0.3804       1.267
    J     0.087   0.078  0.026    0.048   0.13     0.1309        0.121      0.1268       0.422
SYSTEM ARCHITECTURE

Figure: Proposed system architecture
SYSTEM ARCHITECTURE
Preventing profile sensitive information disclosure
using a data sanitization technique

Input:  X, sensitive attributes (SAs), non-sensitive attributes (NSAs)
Output: X with sensitive attributes hidden

1: Read profile data from DB
2: Sort profile data by sensitivity value
3: if there is no overlap between SAs and NSAs then
4:     remove the SAs
5: else
6:     remove the difference set between SAs and NSAs and
       modify the user profile attributes
7: end if
8: return X
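The steps above can be sketched as follows, assuming the profile X is a dict and the SA/NSA sets hold attribute names; the field names and the "hidden" placeholder are illustrative.

```python
# Sketch of the sanitization routine: hide sensitive attributes (SAs)
# from a profile before release, keeping non-sensitive attributes (NSAs).
def sanitize(profile: dict, sas: set, nsas: set) -> dict:
    sanitized = dict(profile)  # work on a copy of X
    overlap = sas & nsas
    if not overlap:
        # No overlap between SAs and NSAs: remove the SAs outright.
        for attr in sas:
            sanitized.pop(attr, None)
    else:
        # Otherwise remove the difference set and mask the overlapping values.
        for attr in sas - nsas:
            sanitized.pop(attr, None)
        for attr in overlap:
            sanitized[attr] = "hidden"
    return sanitized

profile = {"name": "Alice", "dob": "01-01-1990", "email": "a@example.com"}
print(sanitize(profile, sas={"dob", "email"}, nsas={"name"}))
# {'name': 'Alice'}
```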
ADVANTAGES OF PROPOSED SYSTEM

• The proposed system can reasonably balance privacy and data utility.

• Third-party users cannot obtain the information needed to accurately predict sensitive attributes.

• Secure communication reduces the chance of attacks.
APPLICATION

• Can be used on any social networking site.

• Helps preserve the privacy of a user's sensitive information.

• Helps preserve the privacy of users' communication.
CONCLUSION

• The proposed system provides more protection for personal data by hiding it during account creation.

• It increases the overall security of social media data, and the accessibility of information to unknown users is narrowed down.

• Using data sanitization, we are able to effectively sanitize social network data prior to release.
FUTURE WORK

• As an extension to this work, the proposed recommender system could consider users' perspectives on the sensitivity of their data, deciding which personal information can be made available and which part should be hidden at the time of account creation.

• Filter out sensitive content, such as users' personal photos and videos, from the general content before computing the privacy quotient.
REFERENCES

[1] Abdullah Al Hasib. Threats of online social networks. IJCSNS International Journal of Computer Science and Network Security, 9(11):288–293, 2009.
[2] Lei Jin, Hassan Takabi, and James B. D. Joshi. Towards active detection of identity clone attacks on online social networks. In Proceedings of the First ACM Conference on Data and Application Security and Privacy, pages 27–38. ACM, 2011.
[3] Chi Zhang, Jinyuan Sun, Xiaoyan Zhu, and Yuguang Fang. Privacy and security for online social networks: challenges and opportunities. IEEE Network, 24(4):13–18, 2010.
[4] Hongyu Gao, Jun Hu, Tuo Huang, Jingnan Wang, and Yan Chen. Security issues in online social networks. IEEE Internet Computing, 15(4):56–63, 2011.
[5] Michael Beye, Arjan Jeckmans, Zekeriya Erkin, Pieter Hartel, Reginald Lagendijk, and Qiang Tang. Literature overview: privacy in online social networks. 2010.
[6] Judith DeCew. Privacy. In Edward N. Zalta, editor, The Stanford Encyclopedia of Philosophy, 2012.
[7] Gail-Joon Ahn, Mohamed Shehab, and Anna Squicciarini. Security and privacy in social networks. IEEE Internet Computing, 15(3):10–12, 2011.
[8] Prateek Joshi and C.-C. Jay Kuo. Security and privacy in online social networks: a survey. In Multimedia and Expo (ICME), 2011 IEEE International Conference on, pages 1–6. IEEE, 2011.
[9] Gergely Biczók and Pern Hui Chia. Interdependent privacy: let me share your data. In Financial Cryptography and Data Security, pages 338–353. Springer, 2013.
User login page

Figure: Login page

User home page

Figure: User home page

Admin page: raw dataset with the basic information of all users

Figure: Raw dataset with the basic information of all users
ANALYSIS

Sensitivity graph

Graph: Sensitivity graph

Hidden information as seen by the user

Figure: Profile page visibility
COMPARISON OF USER PROFILE

In the existing system, the profile is visible to all users present on the OSN; in the proposed system, the visibility of the user profile is restricted.

Fig. 6.1 Profile page visibility
PRIVACY ON USER PROFILES

Graph: Visibility score calculation for 10 users

Graph: Privacy disclosure score calculation for 10 users




THANK YOU
