People use social networking tools to figure out who they can trust and rely on for decision making. By the end of this decade, power and influence will shift largely from people with money and nominal power to people with the best reputations and trust networks. That is, peer networks will confer legitimacy on people emerging from the grassroots.
This shift is already happening, gradually creating a new power and influence equilibrium with new checks and balances. It will seem dramatic when its tipping point occurs, even though we're living through it now.
Everyone gets a chance to participate in large or small ways, giving a voice to what we once called "the silent majority."
(Okay, I started with the bottom line. The following is a relatively brief summary of how I got there, deserving much longer treatment from really smart people.)
When we need help with decision making, we get recommendations from people we trust, that trust built on some combination of personal experience and reputation. That's the way humans work, nothing new about that. We talk about reputation being one's greatest asset.
Reputation is contextual; that is, you might trust someone when it comes to dry cleaners but not politics. However, I'm going to simplify this thing by avoiding the issue right now. (Yes, that might be short-sighted on my part.) I'll also defer a prerequisite, the need for persistent and verifiable identity: the need to prove you are who you say you are.
In real life, personal networks are pretty small, maybe in the hundreds. Mass media plays a role in shaping reputation for a small number of people, including celebrities and politicians. A very small number of people have influence in this environment.
Internet culture and technology change this dramatically:
Okay, so we want to be able to see whom we might trust, maybe by seeing some explicit measurement, or maybe something implicit, like their history and who trusts them.
We already see various forms of reputation and recommendation systems evolving, often with mixtures of pre-selected experts or professionals. Amazon and Consumer Reports Online do a good job of this. (Disclaimer: I'm on the board of Consumers, because their record for integrity is close to perfect.)
Wikipedia does a very good job of this, mostly by having lots of people keep an eye on articles, particularly the more controversial ones. There are ongoing issues, which people are addressing in good faith as they develop new methods to improve information quality and reliability.
We also see reputation and influence created by persistent works, reflected in social networking sites including Facebook, LinkedIn, and Google Social. Such systems show history and context, which play into trust, and display connections to other people. Those connections are not normally trust relationships; they're "weak ties," which also play into trustworthiness.
Cory Doctorow postulates a kind of trustworthiness currency called "whuffie." You trust someone, maybe want to reward them for something, you give them points. Turns out that there's an experimental repository of whuffie, thewhuffiebank.org. While this sounds facetious, it's a very simple solution to the complex problem of tracking trust.
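The point-granting idea is simple enough to sketch. Here's a minimal, purely illustrative ledger in Python; the class and method names are my own invention, not how thewhuffiebank.org or Doctorow's fiction actually defines whuffie:

```python
# Illustrative sketch of a whuffie-style reputation ledger.
# Granting points is not zero-sum: the giver loses nothing,
# because reputation isn't a scarce resource like money.

class WhuffieLedger:
    def __init__(self):
        self.balances = {}  # person -> accumulated points

    def grant(self, giver, receiver, points):
        """Record that `giver` awarded `points` of esteem to `receiver`."""
        self.balances[receiver] = self.balances.get(receiver, 0) + points

    def balance(self, person):
        """Current reputation score; unknown people start at zero."""
        return self.balances.get(person, 0)


ledger = WhuffieLedger()
ledger.grant("alice", "bob", 10)
ledger.grant("carol", "bob", 5)
print(ledger.balance("bob"))  # bob has accumulated 15 points
```

Of course, a real system would also need to weight grants by the giver's own reputation; otherwise a crowd of throwaway accounts could inflate anyone's score.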
The most prominent experiment in directly measuring trust is Unvarnished, very recently launched in beta form. You rate what trust you have in specific individuals, and they might rate you. Unvarnished is pretty controversial, and is already attracting a lot of legal speculation. They're trying to address all the problems related to the trustworthiness of the information they receive, and if they succeed, they might become very successful.
That last example raises an issue all such systems share: they can be very easy to game. Any such system is vulnerable to disinformation attacks, wherein smart enough people figure out how to fake good or bad ratings. There are a number of very successful groups who are really good at such disinformation in conventional media. Often they're called "front groups," "influence peddlers," or "astroturfers." One good watchdog over such groups is the Center for Media and Democracy.
One metric of trust is transitive; that is, it considers the trustworthiness of the people who trust someone. If person A trusts you, and person B trusts A, then that might affect how one measures your trustworthiness. However, that gets really complicated when the web of trust involves seven billion people, or even a few thousand. It's a research problem.
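The transitive idea can be sketched concretely. Below is a minimal toy in Python, under assumptions of my own: trust is a score between 0 and 1, derived trust decays by a damping factor with each extra hop, and we keep the best score found. Real research systems are far more sophisticated:

```python
# Toy transitive trust: if A trusts B and B trusts C, then A derives
# some damped trust in C. `damping` and `max_hops` are illustrative
# assumptions, not parameters of any real system.

def transitive_trust(direct, source, target, damping=0.5, max_hops=3):
    """Best trust score from `source` to `target` over a web of direct
    ratings, where `direct[a][b]` is a's direct trust in b (0..1)."""
    best = {source: 1.0}          # highest score found per person
    frontier = {source: 1.0}      # people reached on the last hop
    for hop in range(max_hops):
        factor = 1.0 if hop == 0 else damping  # decay indirect hops
        nxt = {}
        for person, score in frontier.items():
            for other, weight in direct.get(person, {}).items():
                derived = score * weight * factor
                if derived > best.get(other, 0.0):
                    best[other] = derived
                    nxt[other] = derived
        frontier = nxt
    return best.get(target, 0.0)


web = {"A": {"B": 0.9}, "B": {"C": 0.8}}
print(transitive_trust(web, "A", "B"))  # 0.9  (direct rating)
print(transitive_trust(web, "A", "C"))  # 0.36 (0.9 * 0.8, damped by 0.5)
```

Even this toy hints at why the problem is hard at scale: the number of paths explodes, and an attacker who controls a few well-trusted accounts can manufacture high derived scores for anyone.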
How do we trust the custodians of trustworthiness? We need to have some confidence that they're not fiddling the ratings, that they're reasonably secure. After all, trust and reputation are really valuable assets.
I think the solution lies in a network of trust and reputation systems. We're seeing the evolution of a number of different ways of measuring trust, which reflects a human reality; different people think of trust in different ways.
Also, the repositories of trust information are the banks in which we store this big asset. Like any bank, a repository holding a lot of this kind of currency wields a lot of power. Having some competition provides some checks and balances.
We need to be able to move around the currency of trust, whatever that turns out to be, like we move money from one bank to another. That suggests the need for interchange standards, and ethical standards that require the release of that information when requested. Perhaps there's a need for new law in this area.
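To make the interchange idea concrete, here's a sketch of what a portable trust record might look like, serialized as JSON so any repository could read it. Every field name here is a hypothetical assumption on my part; a real standard would also need the verifiable-identity piece I deferred earlier, such as a cryptographic signature:

```python
# Hypothetical portable trust record, serialized as JSON.
# All field names are illustrative; no real interchange standard
# for trust data exists yet, which is the point of the paragraph above.

import json


def export_trust_record(rater, ratee, score, context):
    """Serialize one trust rating so it can move between repositories."""
    record = {
        "version": 1,
        "rater": rater,        # would need verifiable identity in practice
        "ratee": ratee,
        "score": score,        # trust level, 0.0 to 1.0
        "context": context,    # reputation is contextual, so say which
    }
    return json.dumps(record, sort_keys=True)


def import_trust_record(payload):
    """Parse and sanity-check a record received from another repository."""
    record = json.loads(payload)
    if not 0.0 <= record["score"] <= 1.0:
        raise ValueError("score out of range")
    return record


payload = export_trust_record("alice", "bob", 0.9, "dry cleaning")
print(import_trust_record(payload)["score"])  # 0.9
```

The round trip is the whole point: if every repository can export and import records like this, your accumulated trust isn't locked into one bank.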
Restating the bottom line: we are already seeing a shift in power and influence, a big wave whose significance we'll see by the end of this decade. Right now, it's like the moment before a tsunami, when the water draws away from the shore; it's time to get ahead of that curve.
Follow Craig Newmark on Twitter: www.twitter.com/craignewmark