
KCA031

Privacy & Security in Online Social Media

UNIT-2
UNIT-2 Syllabus

Trust Management in Online Social Networks: Trust and Policies, Trust and
Reputation Systems, Trust in Online Social Media, Trust Properties, Trust
Components, Social Trust and Social Capital, Trust Evaluation Models, Trust,
credibility, and reputations in social systems; Online social media and Policing;
Information privacy disclosure, revelation, and its effects in OSM and online
social networks; Phishing in OSM & Identifying fraudulent entities in online
social networks.
DEFINITION OF TRUST
• With the development of human societies, trust has played an important role in
people's lives, including in their relationships, their families, their businesses and
their systems of social management. With the development of science and scientific
knowledge, different branches of science focused on human behavioral analysis and
human interaction analysis started to study the concept of trust. Trust has different
definitions in different scientific fields. Here, a brief overview is provided of the
definition of trust in psychology, sociology, economics and, of particular relevance
to this unit, computer science.
DEFINITION OF TRUST

• Trust in psychology: being confident about information received from another party in an
uncertain environmental state. Psychologists also define trust as 'the subjective probability by
which an individual expects that another performs a given action on which its welfare depends'.

• Trust in sociology: sociological studies focus mainly on trust in society or in social relations.

• Trust in economics: trust is defined as 'the property of a business relationship, such that
reliance can be placed on the business partners and the business transactions developed with
them'.
TRUST IN COMPUTER SCIENCE
The concept of trust is widely used in computer science. Work on trust in computer
science falls into four major categories:

i) Policy-based trust, which covers studies on network security credentials, security
policies and trust languages;

ii) Reputation-based trust, which includes research on trust in peer-to-peer networks
and grids, and trust metrics in a web of trust;

iii) General models of trust, encompassing research addressing general considerations
and properties of trust and software engineering; and

iv) Trust in information resources, which focuses on trust concerns on the Web, the
Semantic Web and information filtering based on trust.


Definition for trust in OSNs:

Trust also plays a significant role in the online activities of users of platforms such as
Online Social Networks (OSNs).

"Trust provides information about with whom we should share information, from
whom we should accept information and what considerations to give to information
from people when aggregating or filtering data." There are many applications for trust
in OSNs, including: social spammer detection, fake news detection, retweet behavior
detection, and recommender systems. All these applications require predicting the
trust relations among users.
Trust Properties
In the literature, trust has been computed in several ways depending on the properties considered.
• Trust can be direct: This property says that trust is based on direct interactions, experiences or observations between the
trustor and the trustee.
• Trust can be indirect: The trustor and the trustee here don't have any past experiences or interactions. Trust here is built
on the opinions and recommendations of other nodes; we then speak of transitive trust (a minimal sketch is given after this list).
• Trust can be local: It depends on the trustor/trustee pair considered and differs from one pair to another, which means
that a node i can trust a node j while another node k can distrust the same node j.
• Trust can be global: Global trust, also called reputation, means that every node has a unique trust value in the network
which can be known by all other nodes.
• Trust should be asymmetric: Two people tied by a relationship may have different levels of trust in each other. The fact
that A trusts B does not imply that B should trust A [13].
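A minimal sketch (in Python, with hypothetical users and values) of how these properties can be represented: direct trust as a weighted, directed edge, asymmetry as independent edges in each direction, and indirect/transitive trust as a combination of weights along a path. Multiplying weights along the path is only one common combination rule, not the only one.

```python
# Trust as a weighted, directed graph (hypothetical values in [0, 1]).
direct_trust = {
    ("alice", "bob"): 0.9,   # alice's direct trust in bob
    ("bob", "alice"): 0.4,   # asymmetric: bob trusts alice less
    ("bob", "carol"): 0.8,
}

def transitive_trust(path):
    """Combine direct trust along a path by multiplication (one common choice)."""
    score = 1.0
    for src, dst in zip(path, path[1:]):
        score *= direct_trust.get((src, dst), 0.0)
    return score

# alice has no direct edge to carol, so her trust in carol is indirect,
# built on bob's recommendation: 0.9 * 0.8 = 0.72
print(transitive_trust(["alice", "bob", "carol"]))
```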
Trust Properties

• Trust should be subjective: Trust is inherently a personal opinion which is based on
various factors or evidence, and some of those may carry more weight than others.
• Trust can be objective: In some cases, such as when trust is computed based on the QoS
properties of a device.
• Trust can be context-dependent: The trust of a node i in a node j varies from one context
to another.
• Trust can be a composite property: Trust is really a composition of many different
attributes: reliability, dependability, honesty, truthfulness, security, competence, and
timeliness, which may have to be considered depending on the environment in which
trust has been specified.
Trust Properties

• Trust can depend on history: This property implies that past experience may influence the present level of
trust.
• Trust should be dynamic: Trust changes non-monotonically with time. It may be periodically refreshed or
revoked, and must be able to adapt to the changing conditions of the environment in which the trust decision
was made.
TRUST AND POLICIES
• The standard notion of trust connected to systems refers to "the expectation that a device or system will faithfully
behave in a particular manner to fulfill its intended purpose." The notion of "system" trust is supported by both
software- and hardware-based solutions. These solutions follow a "strong and crisp" approach based on security
mechanisms to create "trusted," or rather, trustworthy, systems that overcome technical failures as well as malicious
attacks. In this kind of policy-based trust management, we can describe the conditions necessary to obtain trust, and we
can also prescribe actions and outcomes if certain conditions are met. Policies frequently involve the exchange or
verification of credentials, which are information issued (and sometimes endorsed using a digital signature) by one
entity, and may describe qualities or features of another entity. A comprehensive trust management scheme of this kind
is PolicyMaker, whose trust management policies specify the trusted behaviors and trust relationships.
• A security policy is responsible for assigning credentials to entities, delegating trust to third parties, and reasoning about
users' access rights (a minimal sketch of such a credential check is given below).
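A minimal sketch of policy-style credential checking, not PolicyMaker's actual language: the issuer names, attribute names and data layout are assumptions made only for illustration. The policy grants a right only when the presented credential comes from a trusted issuer and asserts the required attribute.

```python
# Hypothetical policy check: trust is granted only if a credential is issued
# by a trusted issuer and asserts the attribute the policy requires.
TRUSTED_ISSUERS = {"campus-CA", "gov-ID-authority"}   # assumed issuer names

def satisfies_policy(credential: dict, required_attribute: str) -> bool:
    """Return True if the credential meets the policy's conditions."""
    return (credential.get("issuer") in TRUSTED_ISSUERS
            and required_attribute in credential.get("attributes", []))

alice_credential = {"issuer": "campus-CA", "attributes": ["student", "library-access"]}
print(satisfies_policy(alice_credential, "library-access"))   # True under this policy
```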
Trust and Reputation Systems
• The notion of trust involving users is derived from psychology, with a standard definition according to which
trust is "a subjective expectation an agent has about another's future behavior based on the history of
their encounters." This implies that trust is inherently subjective and relational.

Let us explore the probabilistic definition of trust and the cognitive definition of trust.
Probabilistic definition of trust
• According to the probabilistic definition, trust is
"the subjective probability by which an individual, A, expects that another individual, B, performs a given
action on which its welfare depends."
We can also refer to this definition as reliability trust, since it includes the concept of dependence on the trustee, and the
reliability of the trustee, as seen by the trustor.
According to the cognitive definition, trust is

• "a mental state, a complex attitude of an agent x towards another agent y
about the behavior/action relevant for the result (goal)."
Both probabilistic and cognitive definitions share that trust is based on a directed
relationship established from a trustor to a trustee.

Their interdependence is characterized by the fact that:


(i) the interests of the two parties are related, and
(ii) they cannot be achieved without relying on each other.
However, the relationship is not a trust relationship if these two conditions do not exist.
So, as two members interact with each other frequently, their relationship strengthens
and trust evolves based on their experience. Trust increases between members if the
experience is positive and decreases otherwise.
Always remember, trust can be lost quickly:
“[trust] is hard to build and easy to lose:
a single violation of trust can destroy years of slowly accumulated credibility.” In
addition to interdependence, the risk aspect connected to interactions among users has
to be taken into account.
Concept of reliability trust

"it is possible that the value of the damage (in case of failure) is too high to choose a given
decision branch, and this independently either from the probability of the failure (even if it
is very low) or from the possible payoff (even if it is very high).
In other words, that danger might seem to the agent an intolerable risk." From this
definition, it emerges that having high (reliability) trust in a person is not, by itself, enough
to decide to depend on that person in a given situation.
Decision trust

• "the extent to which one party is willing to depend on something or somebody in a given
situation with a feeling of relative security, even though negative consequences are possible."
Trust management follows a "soft and social"
approach
It is based on trust values gathered and shared by a distributed community. In this sense:
• Trust can be direct or based on recommendations.
• Direct trust is based on the direct experience of the member with the other party.
• Recommendation-based trust is connected to reputation.
• Reputation is a social evaluation of a target entity's attitude towards socially desirable behavior which
circulates in the society (and may or may not be agreed upon by each of the entities in the society).
• Reputation is an assessment based on the history of interactions with, or observations of, an entity, either
directly with the evaluator (personal experience) or as reported by others (recommendations or third-party
verification).
• Recommendations may be received through a chain in a network of friends, so the problem for the user is to
be able to evaluate various types of trust opinions.
Collaborative filtering

• Collaborative filtering techniques are the most popular methods used in recommender systems. The
task in collaborative filtering is to predict the utility of items to a particular user based on a database of
ratings from a sample or population of other users. Unfortunately,
a collaborative filtering system performs poorly when there are insufficient common ratings available
between users; this is commonly known as the cold start problem. To overcome this problem,
purely trust-based approaches to recommendation have been introduced. These approaches
assume a trust network among users and make recommendations based on the ratings of the users that are
directly or indirectly trusted by the target user (a minimal sketch is given after this paragraph).
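A minimal sketch (hypothetical users, items and trust values) of the trust-based idea just described: instead of weighting neighbours by rating similarity, the prediction weights each neighbour's rating by the target user's trust in that neighbour.

```python
# Trust-aware recommendation sketch: predict a rating for the target user
# ("alice") as a trust-weighted average of the ratings of users she trusts.
ratings = {               # user -> {item: rating}
    "bob":   {"movie1": 4.0, "movie2": 2.0},
    "carol": {"movie1": 5.0},
}
trust = {"bob": 0.9, "carol": 0.5}   # alice's (assumed) trust in each user

def predict(item):
    """Trust-weighted average of the ratings given by trusted users."""
    num = sum(trust[u] * r[item] for u, r in ratings.items() if item in r)
    den = sum(trust[u] for u, r in ratings.items() if item in r)
    return num / den if den else None

# The prediction (~4.36) is pulled toward bob's rating, since alice trusts bob more.
print(predict("movie1"))
```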
A general characteristic of trust systems vs
Reputation Systems
• A general characteristic of trust systems is that they can be used to derive local and

subjective measures of trust, meaning that different agents can derive different trust in the

same entity.
• On the contrary, reputation systems provide global reputation scores, meaning that all the

members in a community will see the same reputation score for a particular agent.
• Another characteristic of trust systems is that they can analyze multiple hops of trust

propagation/transitivity, whereas reputation systems normally compute scores based on direct

input from members in the community, which is not based on propagation/transitivity


Some systems have characteristics of being both a reputation system and a trust
system.

• In particular, the multi-agent system paradigm and the huge evolution of e-commerce are
factors that contributed to the increase of interest in trust and reputation.

• On the theoretical side, trust has been formalized as a computational concept in multi-agent
systems, with high-level models proposed based on social and psychological factors.

• Such work offers a good analysis of what should be taken into consideration to develop
trust, and how that relates to previous experience.

Psychological factors in developing a model for
trust in multi-agent systems
• The more recent and implemented techniques which are at the basis of trust and reputation models can be
classified using many different dimensions. Using this classification, we distinguish between numerical/
statistical and machine learning techniques, heuristical techniques, and behavioral techniques.
• Numerical/statistical and machine learning techniques focus on providing mathematical models for trust
management.
• Heuristical techniques focus on defining a practical model for implementing robust trust systems.
• Behavioral models focus on user behavior in the community.
• Techniques involving simple summation or averaging of ratings, techniques based on more complex numerical
functions, Bayesian systems, and belief models are the major examples of purely numerical/statistical techniques.
• In Bayesian systems, binary ratings are used to assess trust by statistically updating beta probability density
functions (a minimal sketch is given after this list).
• In belief models, a consumer's belief regarding the truth of a rating statement is also factored into the trust
computation.
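A minimal sketch of the Bayesian (beta) idea mentioned above, with made-up interaction counts: binary outcomes update a Beta(alpha, beta) density, and its expected value serves as the trust/reputation score.

```python
# Beta-reputation sketch: positive/negative interaction counts update a
# Beta(alpha, beta) density whose mean is used as the trust score.

def beta_trust(positive: int, negative: int) -> float:
    """Expected value of Beta(positive + 1, negative + 1)."""
    alpha = positive + 1
    beta = negative + 1
    return alpha / (alpha + beta)

# 8 good interactions and 2 bad ones -> score 0.75
print(beta_trust(8, 2))
```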
Various techniques combining beliefs can be adopted.

• For example, in one such model, the information stored by an agent about direct interactions is a set of values that reflect
the quality of these interactions. Only the most recent experiences with each concrete partner are considered for the
calculations. When direct information is available, it is considered the only source to determine the trust of the target agent.
• Subjective logic represents a practical belief calculus which can be used for the calculative analysis of trust networks.
• Solutions based on machine learning typically exploit techniques such as Artificial Neural Networks (ANNs) and Hidden
Markov Models (HMMs) for computing and predicting trust.
• Heuristics-based solutions, less complex than statistical and machine learning ones, aim to define a practical, robust, and
easy-to-understand and easy-to-deploy trust management system.
• Among these solutions is PeerTrust, a decentralized reputation-based trust supporting framework for P2P
environments.
The FIRE model is based on the view that most trust information can be
categorized into four main sources:

• Direct experience, witness information, role-based rules, and third-party references.

• FIRE integrates those four sources of information and is able to provide trust metrics in a wide variety of
situations.
• A reliability value, based on rating reliability and deviation reliability, is used to counteract the uncertainty due to
the instability of agents.
• Behavioral trust is evaluated based on two types of trust: conversation trust and propagation trust.
• Conversation trust specifies how long and how frequently two members communicate with each other. Longer
and more frequent communication indicates more trust between the two parties (a minimal scoring sketch is given after this list).
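A minimal sketch of a conversation-trust score; the normalization caps and the equal weighting of duration and frequency are assumptions made for illustration, not the formula of any specific model.

```python
# Conversation trust grows with how long and how often two members
# communicate, capped at 1.0 (assumed caps and weights).

def conversation_trust(total_minutes: float, messages_per_week: float,
                       minutes_cap: float = 600.0, rate_cap: float = 50.0) -> float:
    """Combine normalized duration and frequency into a score in [0, 1]."""
    duration_score = min(total_minutes / minutes_cap, 1.0)
    frequency_score = min(messages_per_week / rate_cap, 1.0)
    return 0.5 * duration_score + 0.5 * frequency_score

# Two members who have talked 300 minutes in total and exchange 25 messages/week -> 0.5
print(conversation_trust(300, 25))
```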
Propagation trust

• Propagating information obtained from one member to various other members indicates that a high degree of trust is being
placed on the information and, implicitly, on its source. One related approach is a method based on the PageRank algorithm
for propagating both trust and distrust; it identifies four different methods for propagating the net belief values, namely
direct propagation, co-citation, transpose, and coupling.

• The Advogato maximum flow trust metric aims at discovering which users are trusted by members of an online community
and which are not.

• Trust is computed through one centralized community server and considered relative to a seed of users enjoying supreme
trust. Local group trust metrics compute sets of agents trusted by those being part of the trust seed. Advogato only assigns
boolean values indicating presence or absence of trust.

• It is a global trust algorithm which uses the same trusted nodes to make the trust calculation for all users. This makes the
algorithm suitable for P2P networks. The trust inference algorithm has been released under a free software license.

• Flow models do not always require the sum of the reputation/trust scores to be constant. One such example is the EigenTrust
model, which computes agent trust scores in P2P networks through repeated and iterative multiplication and aggregation of
trust scores along transitive chains until the trust scores for all agent members of the P2P community converge to stable
values (a minimal sketch of this iteration follows below).
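A minimal sketch of an EigenTrust-style iteration in Python/NumPy, with a hypothetical 3-peer local trust matrix and without the pre-trusted peers and damping used in the full algorithm: local trust values are row-normalized and repeatedly aggregated along transitive chains until the global scores converge.

```python
import numpy as np

# Hypothetical local trust matrix: entry [i, j] = how much peer i trusts peer j.
local = np.array([[0.0, 4.0, 1.0],
                  [2.0, 0.0, 3.0],
                  [5.0, 1.0, 0.0]])

C = local / local.sum(axis=1, keepdims=True)   # row-normalize local trust

t = np.full(3, 1.0 / 3)                        # start from uniform trust
for _ in range(50):                            # power iteration
    t_next = C.T @ t                           # aggregate trust along edges
    if np.allclose(t_next, t, atol=1e-9):
        break
    t = t_next

print(t)   # converged global trust scores for the three peers
```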
TRUST COMPONENTS
• Trust in online social network services is both a micro- and macro-level phenomenon, generated by the interplay between
the users (the group of micro-level actors) and the network (the macro-level actor). In this vision, the network participants,
the social network service itself, and the Web 2.0 technologies can be considered as objects of trust.

• Both at the micro- and macro-level, social components of trust can emerge from the cognitive, emotive, and behavioral
aspects of trust.

• At the micro (interpersonal) level, trust can be interpreted either as dyadic trust (direct trust between two entities) or as
social trust (a property of social groups).

• When referring to dyadic trust, the cognitive aspect refers to a rational decision to place trust in a trustee based on
qualitative characteristics of the trustee her/himself, such as competence, ability, integrity, honesty, and benevolence (e.g., a
user having received a lot of recommendations or endorsements about his competencies by reputed experts on LinkedIn, or a
top trending user on Twitter).
Social Capital and Social Trust
Social Capital Definition

• Social capital refers to the links and bonds formed through friendships and acquaintances. These links can form
through friendship groups, i.e. knowing a friend of a friend. Or, they can occur through daily social interactions. For
example, a conversation with the person sitting next to you on the train.

• To put it another way, social capital is the social ties that we develop throughout our lives. Whether it is knowing the
right person to contact in finance to get an invoice through or the right teacher who can help with coursework.
Key Points

1. Social capital is the development of relationships that help contribute to a more efficient production of goods and
services.

2. There are three types of social capital – bonding, bridging, and linking.

3. Social capital can make or break businesses. By having a wide range of connections, some are able to thrive as
they are able to get work done more effectively and efficiently.
Key Points
1. Capital itself is defined as an asset that helps improve the efficiency of production. Social capital, therefore,
is defined as the social assets that help improve the efficiency of production.

2. We can look at it as a social interaction that allows us to build new connections and networks. Or, it can
simply be developing interpersonal relationships with others that help with the efficiency of production. This
may be through good relationships with suppliers, or even colleagues and employees. This relationship may
help motivate workers through friendship ties, or, create more flexible suppliers that are willing to
accommodate.

3. Social capital can help us explain why some firms experience superior managerial performance, efficient
supply chains, mergers and acquisitions, as well as the improved performance of diverse groups.
The 3 Types of Social Capital

1. Bonding Social Capital

2. Bridging Social Capital

3. Linking Social Capital


Bonding Social Capital

• Bonding social capital describes the connections between similar groups of people that share the same characteristics. This

might be age, hobbies, relationships, sports teams, or another variable that helps to create a bond between two people or a

group of people.
• Bonding is the strongest type of social capital as a close relationship between two people is formed. This might come from

closely working with a colleague for years and developing a close personal relationship with them. Alternatively, it might

come through sitting next to the same person at each local basketball match, thereby developing a close relationship.
• Typically, these bonds form through the development of social ties. In other words, friendships. However, they also include

family members, as well as neighbors. It is through these connections that people are willing to help each other out and gain

‘social capital’ among their peers. We are more likely to help someone and go out of our way for someone we have a bond

with – as opposed to someone we know nothing about.


Bridging Social Capital

• Bridging social capital differs from bonding in the fact that the ties are not so strong. Instead, the links come from weaker connections such

as friends of friends, or colleagues and associates. The connection is ‘bridged’ in the fact that one person is introduced to another through

an intermediary. That intermediary is effectively ‘the bridge’ that brings the two parties together.

• This form of social capital tends to differ from bonding in the fact that there is greater diversity. Bonding tends to occur between two people

or groups of people that have similar interests or characteristics. For instance, you tend to find rich celebrities hanging out with other rich

celebrities, who introduce each other to more rich celebrities.

• Bridging brings two or more people together who would otherwise not connect – even though they are from similar groups and have the

same interests. This may be friends of friends, people who live on the other side of the world, or being introduced to a new colleague –

thereby ‘bridging’ the social gap.

• One of the important aspects of bridging is that it is horizontal. In other words, social capital is developed between people from the same

socioeconomic group – which contrasts with ‘linking social capital’, which develops vertically between different socioeconomic groups.
Linking Social Capital

• Linking social capital is an extension of bridging. Whereas bridging occurs horizontally, i.e. between people of a similar socioeconomic power or
hierarchy, linking occurs vertically – i.e. between different socioeconomic groups.

• Those who are in similar socioeconomic groups are often referred to as a ‘community’. It is outside of those communities that linking takes place. For
instance, a footballer may be introduced to an underprivileged boy. The two may develop a relationship and hence be able to leverage a far larger level
of resources than previously possible. If we take an example and consider two socioeconomic classes – the upper class and the working class – there are
millions of people in each, yet many of them will stick to their own socioeconomic groups. Through linking, they are able to bridge the gap between the
two and develop new contacts and ties across ‘social boundaries’.

• Another example could include the CEO of a big company. They probably have no idea what work is like for the day-to-day staff at the bottom. Yet a

lower-level manager may introduce them to the staff whereby social capital is formed between the CEO and a general worker. The CEO may benefit

from understanding their staff better, so can introduce better working practices to help motivate staff.

• At the same time, the lower worker may benefit through advice and connections that the CEO may provide. In turn, we have what is known as a

vertical bridge between two groups to generate social capital.


Social Capital Examples

• Social capital is ever-present in our everyday lives. Examples include:
SOCIAL TRUST VERSUS SOCIAL CAPITAL

• This study focuses on social trust – not on social capital. Social capital is generally defined as a type of resource

embedded in relationships among individuals that facilitates cooperative and collaborative actions within society

(Coleman 1988; Putnam 1993). As Arrow (1999) identified, the term ‘capital’, as in physical and human capital, has a

connotation of deliberate sacrifice in the present for benefit in the future. In general, however, people pursue social

relationships, not for future benefits at the expense of present ones, but for the intrinsic value of these social

relationships. From the literature, it appears that studies have failed to note that while physical and human capital are

allocated through the price mechanism in the market, there is no mechanism that allocates social capital. Without an

allocation mechanism, social capital as a resource faces operational issues. These differences mentioned briefly here

make clear that the argument for equivalence of social capital with physical and human capital is tenuous at best.
SOCIAL TRUST VERSUS SOCIAL CAPITAL

• Although there is controversy over the specific components of the relationships that generate social capital, these components include

social trust, social networks, and memberships of voluntary associations (Coleman 1988; Putnam 1993). All the components of social

capital are expected to bring benefits to society, yet social trust appears to be the most important component of social capital

(Fukuyama 1995; Paldam and Svendsen 2000; Newton 2013). Furthermore, benefits of social networks and their memberships are less

clear, particularly when social capital is compared across countries. The meaning of social networks can differ by country. In group (or

collectivist) societies such as Korea, a large number of social networks or voluntary associations have high limited (or in-group) trust

among their members. Limited trust has negative effects on interpersonal trust among strangers. When group members satisfy their

needs for interpersonal interactions among in-group members, they have limited motivation and experience to deal with people outside

their group (Ermisch and Gambetta 2010; Algan and Cahuc 2013; Delhey and Newton 2003). Hence, increases in limited trust through

membership in social networks may not facilitate cooperation and collaboration within society, since it lowers generalized trust.
SOCIAL TRUST VERSUS SOCIAL CAPITAL

• As a practical matter, a number of worldwide organizations have measured social trust across

countries in a consistent manner over time. This book makes use of data obtained in that research.

However, few organizations measure social capital as such, so there is a paucity of available data on

social capital across countries. Samsung Economic Research Institute (SERI) estimated the levels of

social capital of 72 countries in 2008 (Lee and Jeong 2009), with the components of social capital

including social trust, norms, networks, and social structure. However, SERI’s 2008 estimates

excluded Singapore and Hong Kong, and SERI has not estimated social capital since its first

estimation. This diminishes the utility of SERI data for the present study.
Social Media Phishing

Scams have been around since the dawn of man. Way back before the Internet, a time known to
historians as the Jurassic Era, scammers used primitive analog methods, such as stealing credit card
applications out of your IRL mailbox and calling you on the telephone to trick you into giving out your
credit card number or other personal information—enough to commit identity theft.
Phishers
• Who would have known back then how far we’d progress in a few
short years? One day the Internet came along and nothing was the
same. Not only has it opened our world to 24-hour-a-day online
immersion, it’s also opened up a new world for scammers, some of
whom have become known as “phishers.”
Social media phishing

• Social media phishing is a type of fraud in which users receive an enticing invitation to click on an
infected link or provide personal information. While the primary attack vector for “regular” phishing is
email, social media phishing is – you guessed it – primarily perpetrated through social media sites. And
as social media replaces email – at least in our personal lives, if not at work – social media phishing is
becoming the greater danger. Make sure you know what to do if you click on a phishing link.


Why Hackers love social media

• Fraudsters can use social media to target hundreds of thousands of people at once while blending in with
the crowd. What makes social media so attractive to phishers is the sheer number of people on social
media:
• Facebook – 2 billion users
• Instagram – 700 million users
• Twitter – 328 million users
• Snapchat – 150 million users (most likely your kid is on this one)
SOCIAL MEDIA PHISHING SCAMS BY THE NUMBERS

• This incredible volume of social media users is now being reflected in social media phishing
attacks. Here are some more numbers:
• Instances of social media phishing jumped 500% by the end of 2016.
• Fraudulent accounts across sites like Twitter and Facebook increased 100% from the third to
fourth quarter.
• 20% increase in Facebook and Twitter spam from Q3 to Q4 2016.
• Over the past year, the number of phishing attempts on social media networks like Facebook,
Twitter, Instagram and LinkedIn has exploded 150%.
EXAMPLES OF SOCIAL PHISHING SCAMS

Here are some of the most common social media scams circulating today:

• Fake customer service accounts on Twitter (also known as “angler phishing”)

• Fake comments on popular posts

• Fake live-stream videos

• Fake online discounts

• Fake online surveys and contests

Last year, a particularly successful Facebook scam cost an Australian woman $450,000. The team of scammers created a Facebook profile of a

nonexistent doctor – complete with a profile picture stolen from an actual doctor – and sent the victim a friend request, which she accepted.

After gaining her trust, the “doctor” claimed that while traveling, he had inadvertently tried to enter Australia with $1.5 million that was being

held by customs. He urgently needed $3000 to have the money returned to him. She agreed to pay, which started a string of subsequent

payments she made to cover additional fees. By the time she realized she was being scammed, she had made 33 payments, which law

enforcement was powerless to retrieve.


TIPS TO AVOID BEING A VICTIM TO SOCIAL PHISHING

Fortunately, you can avoid falling victim to phishing scams by following these best practices:

• As with emailing phishing, the best prevention tip is to think before you click. There’s a reason for the term “clickbait.”

The best phishers hone their craft to try and bait you with links that not only grab your attention, but urge you to click

immediately.

• Whatever you’re using to access your social media accounts, update it! This applies to Web browsers, firmware, apps,

antivirus software, operating systems, and iOS. The reason this is so important is because developers provide patches

and updates as they discover vulnerabilities. Unfortunately, hackers seek to exploit these same vulnerabilities as they’re

discovered. Failing to download updates puts you directly in the line of fire.

• Regularly review your privacy settings. Social media sites have a way of quietly changing privacy settings without you

noticing. Remain aware of how your profile and the content you upload are being seen by others.
TIPS TO AVOID BEING A VICTIM TO SOCIAL PHISHING

• Never download unsolicited software, click on URLs in e-mails, or click on popups that appear

while you are browsing. Social networks don’t usually need extra bits of software to be

downloaded on your computer.

• Look for the secure address of the web page (HTTPS) to keep your communications,

identity and web browsing private. And pay particular attention to shortened links (through services

like Bit.ly, Tiny.cc, etc.), which are commonly used by scammers; a minimal sketch of expanding

such a link before clicking is given after this list.

• And what about your kids who are probably on every social media site, including a bunch you’ve

not even heard of? You must educate them in these best practices too, so they don’t get phished.
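A minimal sketch of one way to check a shortened link before clicking it, using the Python requests library; the bit.ly URL shown is a hypothetical placeholder, not a real link.

```python
# Expand a shortened link by following redirects with a HEAD request,
# then inspect the final destination before deciding to visit it.
import requests

def expand_url(short_url: str) -> str:
    """Return the final URL a shortened link resolves to."""
    response = requests.head(short_url, allow_redirects=True, timeout=5)
    return response.url

# Hypothetical example: see where a shortened link really points.
print(expand_url("https://bit.ly/example"))
```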
WHAT TO DO IF YOU ARE A VICTIM OF SOCIAL
PHISHING

If you have been hooked by some clever social media phishing, then you need to do some damage control. Start

by treating this as a case of identity theft, mainly because that’s what it can lead to if you don’t act.

• Shut down your computer immediately.

• Change your passwords, using a different computer.

• Put a fraud alert on your credit file at Equifax, TransUnion, and Experian.

• Call your bank and report if you gave out your debit or credit card information.

• Report an account hijacking immediately if you cannot log in to any of your accounts.
