Managing a Trusted Community
Human societies have generally been successful in managing trust and reputation. People have always self-organized into resilient communities for information exchange, and these self-organizing models have stood the test of time. In this section we describe an approach that models trust through the exchange of reputation and trust information over social networks.
Utilizing Social Networks: Community Vouchers
Earlier we described a web-of-trust model. Several attempts have been made to create viable web-of-trust models, such as Pretty Good Privacy (PGP). The models proposed to date suffer from several problems, such as the collusion and identity problems already noted. Problems also arise because, in every community, members depend on other members for trust: recommendations are not weighted, and newly enrolled members may have difficulty finding any existing members whose recommendations are believable. Furthermore, the notion of negative recommendations, which is essential in defending against malicious members, does not exist in the existing web-of-trust models. We propose calling all forms of recommendations vouchers, because we see a recommendation as a way for one member to vouch, or refuse to vouch, for another member.
In Figures 2a and 2b, each solid line between two members represents an existing trust relationship. Figure 2a represents a community that has accreted naturally into three clusters: a cluster between two members labeled Cluster_i, a random network cluster labeled Cluster_r, and a mesh network labeled Cluster_m.
If member F intends to establish a trust relationship with member A, any combination of the following existing trust relationships can be utilized by A to provide evidence for trusting F: (1) the trust relationship between F and I, because A accepts I's vouchers, (2) the trust relationship between F and D, because A accepts D's vouchers, or (3) the trust relationship between F and S, because A accepts S's vouchers.
Of course, in general A has no a priori knowledge about any of these relationships, and it faces the problem of how to expeditiously obtain the evidence it needs about F's trustworthiness. The solution? A should query all of the presently on-line community members it considers trustworthy; those members then send back whatever vouchers they hold. Whether another member has the evidence A needs depends on that member's connectivity and experience within the community. Figure 2 depicts S as more highly connected than the other members, so S can likely respond to more requests.
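This query step can be sketched in a few lines. The `Voucher` shape, the function names, and the `ask` callable standing in for the network request are our own illustrative assumptions, not structures defined in the text:

```python
from dataclasses import dataclass

@dataclass
class Voucher:
    issuer: str    # the member vouching (or warning)
    subject: str   # the member the voucher is about
    value: float   # positive = endorsement, negative = warning

def collect_vouchers(trusted_online, subject, ask):
    """Query every trusted, on-line member for evidence about `subject`.

    `ask` stands in for the network request; it returns a Voucher,
    or None when the queried member has no evidence to offer.
    """
    vouchers = []
    for member in trusted_online:
        v = ask(member, subject)
        if v is not None:
            vouchers.append(v)
    return vouchers
```

A highly connected member such as S is simply more likely than its peers to return a non-empty answer to such a query.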
By responding to more requests, S can build a reputation for faster, more complete, or more authoritative responses, because it has access to more sources of evidence (S may instead build a negative reputation by passing along information that later proves inaccurate). A member such as S builds its reputation through the natural evolution of relationships within social communities, and these reputations can be used to weight trust decisions.
In addition to informing decisions on whether or not to trust another member, the weight factor can also be used to determine trust levels, which fluctuate dynamically. For example, a voucher from a highly connected member such as S may make F more trustworthy to A; F is therefore given a higher trust value in A's trust list, because S is likely to have more evidence about F's past behavior. And by reporting helpful vouchers more frequently than other members do, S raises its own trust level.
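One minimal way to realize this dynamic adjustment is an additive update applied whenever a voucher is later judged helpful or unhelpful. The step size and clamping range below are illustrative assumptions; the text only requires that trust levels rise and fall with observed behavior:

```python
def update_trust(trust, member, helpful, step=0.1, lo=-1.0, hi=1.0):
    """Nudge `member`'s trust level up after a helpful voucher, down after
    an unhelpful one, clamped to [lo, hi]. The additive rule and the
    parameter values are a sketch, not the paper's prescription."""
    delta = step if helpful else -step
    trust[member] = max(lo, min(hi, trust.get(member, 0.0) + delta))
    return trust[member]
```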
There are many ways to represent levels of trust. One representation can be trust rings, where each node creates several rings around it that represent levels of trust, and the device sorts other members into the most appropriate ring. A member places the peer members it trusts the most (that is, those that it maintains the closest relationships with) in the inner-most ring, followed by its friends in the next ring, and so forth. A neutral ring denotes a neutral trust state, while the outer rings represent mistrust. A member can move between rings depending on the change in its relationship with other members. Trust, from a member's point of view, can have the following form:
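A trust-ring table can be as simple as a ring index per peer plus a weight per ring. The ring count and weight values below are illustrative choices of ours, not values given in the text:

```python
class TrustRings:
    """Concentric trust rings around one member. Ring 0 is the inner-most
    (most trusted); the middle ring is neutral; outer rings denote
    mistrust. Peers not yet placed default to the neutral ring."""

    def __init__(self, alphas=(5.0, 2.5, 1.0, 0.5, 0.0)):
        self.alphas = alphas              # per-ring weight, inner to outer
        self.neutral = len(alphas) // 2   # index of the neutral ring
        self.ring = {}                    # peer -> ring index

    def place(self, peer, ring_index):
        """Move a peer to a new ring as the relationship changes."""
        self.ring[peer] = ring_index

    def weight(self, peer):
        """Weight of the ring the peer currently occupies."""
        return self.alphas[self.ring.get(peer, self.neutral)]
```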
τ_AB = Σ_{x ∈ V_B} α_{r(x)} · τ_xB
where τ_AB denotes how much member A trusts member B (τ_AB < 0 denotes mistrust), V_B is the set of members vouching for B, τ_xB is member x's trust level for B, and α_{r(x)} denotes the weight of the trust ring in which A places member x. Using this equation, if member A receives several vouchers for member B, A will use the trust rings of the members vouching for B to resolve conflicting vouchers. A then uses its trust calculation to decide whether to transfer B between rings. Highly connected members tend to migrate toward the innermost ring (at least while the vouchers they provide report good information over time), so their opinions carry more weight. Trust rings provide a simplified way of assigning trust for a node, especially when the node does not have the computing power to assign and maintain individual values for all nodes it encounters, and they offer the flexibility of grouping nodes according to trust levels.
As an example, in Figure 2b, node L trusts member M with a value of 3, while J mistrusts M with a value of -2. Member A places L and J at levels 5 and 2.5, respectively. When A computes the trust level for M, the negative endorsement from member J is subtracted from the positive endorsement from member L. This yields a positive trust value τ_AM, though one probably not high enough to place M in A's inner ring, because of J's negative endorsement. In addition, the trust levels (τ_AB) are dynamic variables, which makes the equation adaptive to changes in the communities (for example, nodes moving out of current communities or forming new ones).
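The arithmetic of this example can be checked directly; the numbers are the ones given in the text, while the variable names are ours:

```python
# Weights A assigns to the rings holding L and J, and their vouchers for M.
alpha = {"L": 5.0, "J": 2.5}    # A's ring weights for L and J
tau   = {"L": 3.0, "J": -2.0}   # L endorses M (+3); J mistrusts M (-2)

# tau_AM = sum over vouching members x of alpha_x * tau_xM
tau_AM = sum(alpha[x] * tau[x] for x in alpha)
print(tau_AM)   # 5.0*3.0 + 2.5*(-2.0) = 10.0: positive, but diminished by J
```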
A member's cumulative rating across all other members' rings represents its reputation within the community. Reputation requires maintaining a relationship history, so it is not free. We believe reputation makes a good default trust value when no other information is available.
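Under the ring model, this cumulative rating is simply the sum of the ring weights a member receives across the community. A minimal sketch, in which the per-member weight tables are hypothetical stand-ins for each member's ring placements:

```python
def reputation(ring_weight_tables, subject):
    """Sum, across every member's ring table, the ring weight that member
    assigns to `subject`. Members with no entry contribute nothing."""
    return sum(table.get(subject, 0.0) for table in ring_weight_tables.values())
```

For instance, if A weights S at 5.0 and L weights S at 2.5 while J has no history with S, S's community-wide reputation under this sketch is 7.5.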
Micro and Macro Trust System
In addition to trust between members, we extend the trust model to a second dimension, in which trust is calculated vertically (in layers) and reconciled with horizontal trust (for example, between nodes in virtual networks). Figure 3 illustrates the concept with a system composed of three entities, each comprising multiple components. For example, entity A consists of a user, a device, a virtual machine, a host operating system, and an application running on the device. Vertical trust relationships exist between the respective components of entity A, while trust also exists horizontally, such as between entities.
In existing technologies, individual components completely trust all the components of other trusted entities, which increases the risk when a component of a trusted entity is compromised. Our approach, in contrast, applies trust propagation among individual components inside entities as well as among entities as a whole, and trust levels among different components within an entity can be independent. For example, assume that entity A is running a web browser while entity B is running a web server with a database backend. The trust that A's browser places in B's database is then a function of A's own trust level for the web server and a trust voucher on the database, while entities A and B as a whole may have a different level of trust between themselves. We assume no explicitly defined trust level between the application and the device; however, trust can propagate from the application to the device through the user, the operating system (OS), or a virtual machine (VM). For such a system to be usable, it is essential that the network remain manageable enough that its interconnections can still be modeled as social networks, where each entity can be treated as a community.
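Propagation from the application to the device through the user, the OS, or a VM can be sketched as a best-path search over a component graph, where each edge carries a direct trust value in [0, 1] and a path's trust is the product of its edges. The graph, the values, and the product rule are all illustrative assumptions, not the paper's prescribed computation:

```python
def propagate_trust(edges, src, dst):
    """Derived trust from src to dst: the maximum, over all paths, of the
    product of edge trust values. With values in [0, 1], derived trust can
    never exceed any direct trust level along the path it came from."""
    best = {src: 1.0}
    frontier = [src]
    while frontier:
        node = frontier.pop()
        for (a, b), t in edges.items():
            if a == node and best[node] * t > best.get(b, 0.0):
                best[b] = best[node] * t
                frontier.append(b)
    return best.get(dst, 0.0)

# Entity A's components: the application reaches the device via the OS
# or via the user (no explicit application-device trust is defined).
edges = {
    ("app", "os"): 0.9, ("os", "device"): 0.8,
    ("app", "user"): 0.7, ("user", "device"): 0.9,
}
print(propagate_trust(edges, "app", "device"))  # best path is app -> os -> device
```

The product rule is one natural choice here: it guarantees that indirect trust degrades monotonically with every hop, which matches the intuition that a derived relationship should never be stronger than its weakest intermediary chain.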