
Establishing Intranet Content Credibility

By Paul Chin

Originally published in Intranet Journal (02-Sep-2005)



How much faith would you put in information if you didn't know where it came from? Intranet owners are hoping it's a lot.

Intranets contain large volumes of content and numerous points of data entry, all of it posted by a slew of multi-disciplinary personnel who use various sources to build their content—including their own expertise. Intranet users, for their part, are asked to place their faith in this information without knowing anything about the content's credibility or the person who put it together. Did they do their fact-checking? Are their sources reliable? For all users know, it could have been cobbled together by a summer intern using the system for the first time.

The credibility of the content providers translates to the credibility of the content, which in turn translates to the credibility of the intranet as a whole. When users rely on intranet content in their decision-making process—decisions that can have a significant impact on the business—that content had better be accurate. But how do we know it is?


Social Versus Digital Information Gathering

Information gathering used to be a very social and interactive process. When we needed to know something, we would seek out the resident expert on the subject and pepper them with questions. We either met in person or spoke on the telephone. While this process took a bit more time, we at least had an idea of where the content came from and an overall impression of the person, or persons, who gave it to us.

Once we port all of this content onto an intranet, which speeds up information gathering and retrieval, we lose that face-to-face interaction. We lose the ability to ask follow-up questions and are forced to put our faith in the efforts of relative strangers. The content could have come from anywhere and from anyone. Who's to say that their standard of proof is as high, or that they're as conscientious with their fact-checking, as you are? What if their content is based on someone else's already spotty research?

The main problem—especially with newly developed systems—is that users might not regard the intranet as an established authority on their information needs. For example, users searching the company intranet might find a wealth of anonymous content related to their ongoing projects and contracts, but they won't place the same level of trust in that content as they would if it had been given to them by the team leader of the project or contract.

The answer to this is to build up credibility among those providing intranet content. Every piece of content should be given a "face" by specifying the content's source—whether internal knowledge assets or external information sources. This eliminates the anonymity often associated with intranet content. Users are far more likely to trust known authorities who have provided reliable, high-quality content in the past.


Beware the Halo Effect

Establishing credibility isn't easy. It takes time to gain users' trust and only a single bad piece of content to lose it. Regaining their trust—especially if they were negatively affected by using that content—will be extremely difficult. Unfortunately, users tend to judge the worth and credibility of an intranet as a whole based on very limited information. Psychologists refer to this as the halo effect, defined as "the tendency to assign generally positive or generally negative traits to a person after observing one specific positive or negative trait, respectively."

Psychologist Edward Lee Thorndike described the halo effect in his 1920 paper A Constant Error in Psychological Ratings as "a problem that arises in data collection when there is carry-over from one judgment to another." He said that it's "an extension of an overall impression of a person (or one particular outstanding trait) to influence the total judgment of that person. The effect is to evaluate an individual high on many traits because of a belief that the individual is high on one trait."

This same halo effect can cause minor intranet inconsistencies to be regarded as the norm rather than the exception. Intranet credibility will suffer because a flaw in one small section can taint the entire system, and users will likely label the whole intranet unreliable. Take as an example the highly publicized story of Jayson Blair, the former New York Times reporter who was discovered to have committed journalistic fraud by plagiarizing stories and faking quotes and interviews. Or the "Memogate" affair involving CBS anchor Dan Rather. Two highly respected news sources—The New York Times and 60 Minutes—were given black eyes because of isolated incidents.


Implementing an Approval Process

The issue of content credibility boils down to the path the content takes from conception to publication. When information is obtained through an external source—such as content purchased through third-party information vendors—the content takes a very direct route from source to intranet. Users trust that the originating provider went through their own fact-checking process before making it available to others.

But when content is developed in-house, there are often two opposing schools of thought: that internal content providers, as responsible employees of the organization, did their jobs properly and should be taken at their word; or that the providers and their sources are unknown so the content's credibility is unknown. Hopefully, experience will show that the former prevails over the latter.

Some intranet owners have had success maintaining a high level of content quality by filtering all incoming content through an approval process. This form of content quality control is similar to a magazine or newspaper's editorial process, whereby a story is passed from writer to assistant editor to editor-in-chief. It can be handled manually or through a built-in feature of a commercial CMS tool. Approval workflow usually follows this path (a simplified sketch of such a workflow appears after these steps):

  1. A primary intranet content provider gathers or develops content for their section of the system.
  2. Upon completion, this content is placed in an "on-hold" location, separate from production intranet content, and locked so that others can't make any modifications.
  3. An authority or content editor for that particular intranet section reviews the content sitting in the "on-hold" area.
  4. Content that passes their inspection is placed "off-hold" and sent to production. Content which doesn't meet standards is sent back to the originator for reworking.
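
To make this process concrete, here is a minimal illustrative sketch of such a workflow in Python. It is not tied to any particular CMS product; the ContentItem class, the status names, and the function names are all hypothetical, and a commercial CMS would typically enforce these state transitions through its own built-in workflow engine rather than custom code.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Status(Enum):
        DRAFT = auto()      # step 1: being gathered or developed by the provider
        ON_HOLD = auto()    # step 2: locked and awaiting review by the section editor
        PUBLISHED = auto()  # step 4: approved and moved to the production intranet
        REJECTED = auto()   # step 4: sent back to the originator for reworking

    @dataclass
    class ContentItem:
        title: str
        author: str
        body: str
        status: Status = Status.DRAFT

    def submit_for_review(item: ContentItem) -> None:
        """Step 2: place completed content on hold so others can't modify it."""
        if item.status is not Status.DRAFT:
            raise ValueError("only draft content can be submitted for review")
        item.status = Status.ON_HOLD

    def review(item: ContentItem, approved: bool) -> None:
        """Steps 3 and 4: the section editor inspects on-hold content and either
        publishes it to production or returns it to the originator."""
        if item.status is not Status.ON_HOLD:
            raise ValueError("only on-hold content can be reviewed")
        item.status = Status.PUBLISHED if approved else Status.REJECTED

    # Example: a provider drafts an item, submits it, and an editor approves it.
    item = ContentItem("Q3 contract summary", "j.doe", "...")
    submit_for_review(item)
    review(item, approved=True)
    print(item.status)  # Status.PUBLISHED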

While this high level of quality control gives users a greater sense of the content's credibility—knowing that they don't have to cross-check each piece of information themselves—there's one very big caveat with implementing an approval process: it must never hinder the timely posting of new information.

An approval process may work well for small- to medium-sized intranets with moderate traffic, but large enterprise-level intranets with high content traffic might suffer a bottleneck effect if there's a disproportionate ratio of editors to incoming content. Intranet owners need to carefully balance the need to channel content through a knowledgeable quality-control person or group against the need to make information available in a timely manner. The last thing you want is high volumes of content sitting in limbo waiting to be verified.


Tips for Maintaining Content and System Credibility
  • Always state the source of content.
  • Always state the date the content was originally posted, when it was last changed, and who changed it (a simple record structure illustrating these fields appears after this list).
  • Choose content providers carefully.
  • Maintain an "about us" page for each of the core content provider groups to give users an idea of their background and expertise.
  • If you're unsure about any piece of content, err on the side of caution. The consequences of posting faulty information far outweigh the fear of missing out on information.
  • If an approval process is implemented, make sure that there are enough editors to handle the volume of incoming content to avoid creating a bottleneck. Each section of an intranet should have its own editor(s) in order to maintain the flow of information.
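
As a minimal illustration of the first two tips, the sketch below shows the kind of provenance fields a content record might carry. The ContentRecord structure and its field names are hypothetical, shown here only to make the metadata explicit rather than to prescribe any particular implementation.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class ContentRecord:
        title: str
        body: str
        source: str                              # where the information came from
        author: str                              # the content provider's "face"
        posted_on: date                          # when the content was originally posted
        last_changed_on: Optional[date] = None   # when it was last changed
        last_changed_by: Optional[str] = None    # who changed it

    # Example: a fully attributed piece of content; nothing about it is anonymous.
    record = ContentRecord(
        title="Vendor escalation procedure",
        body="...",
        source="Internal support team runbook",
        author="A. Smith, Support Operations",
        posted_on=date(2005, 9, 2),
    )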


The Impact of Wikis on Content Credibility

Wikis such as JotSpot, Socialtext, and Atlassian's Confluence are becoming increasingly popular as enterprise-level tools because of their ease of use and, as their name implies, quick content turnaround times (wiki comes from the Hawaiian wiki wiki, meaning "quick"). But are wikis well suited as a replacement for content-managed intranets?

The argument against using a wiki as an organization's primary content management system is the same one used by many detractors of the online, collaborative encyclopedia Wikipedia: it lacks credibility—especially when compared to, say, the Encyclopedia Britannica. No one knows who input the content, and no one knows who changed it.

The main difference between a wiki and a traditional content-managed intranet is that wikis are peer-edited, whereas content-managed intranets are usually expert-edited.

Peer-Edited
  • Users edit users.
  • Quicker turnaround times, since everyone edits everyone else.
  • Arguably, more eyes translate to more accurate content.

Expert-Edited
  • One, or several, authorities edit content providers.
  • Fewer editors verifying large volumes of content may create a backlog.
  • Might be seen by some users as more reliable content.

A wiki, however, is only as strong as the users feeding it. If users trust the efforts and expertise of their peers, then a wiki can be a valuable enterprise tool. But if it's misused, it will become a case of the blind leading the blind—or worse, complete bedlam.

The fiasco with The Los Angeles Times' short-lived "Wikitorial" illustrates the dangers of allowing anyone and everyone to edit content. In June 2005, the well-respected newspaper launched a media experiment allowing readers of its Web site to edit its online editorials. Perhaps expecting the best of Web users, the paper got the worst. The Wikitorial was bombarded with obscenities, pornography, and everything in between. It was launched on a Friday; by the following Monday morning it had been taken offline. It lasted only three short days.

The Los Angeles Times' example of "when good wikis go bad" is an extreme case and differs from intranet-based wikis in that it was open to the general public. We expect that internal intranet users would be more responsible in their use of a wiki. But even internal wiki content may leave some users scratching their heads, wondering about its credibility. A user with only an elementary understanding of a particular subject might, with the best of intentions, add content to a wiki that proves to be erroneous once an authority reads it. In the meantime, another user who didn't know any better might have accessed that content and taken it as fact.


Closing Thoughts

There are so many tools available—both fresh and well-established—that have potential business applications. We have our traditional content-managed intranets, enterprise information portals, wikis, blogs, and klogs, to name just a few. But the true worth of a system is based on the credibility of the people providing the information, the quality of the content, and the trust users have in these content providers.

Because of the prevalence of wikis, blogs, and the Internet in general, the standard of proof might not be what it once was when well-respected and well-established news organizations were the primary sources of information. Nowadays, anyone can publish just about anything short of libel. This is why intranet content should never remain anonymous; it must have a face, and this is especially true of digital content developed in-house. Users should have an idea of who's behind their content and where the information was sourced. They shouldn't be forced to rely on the supposed expertise of strangers.


Copyright © 2005 Paul Chin. All rights reserved.
Reproduction of this article in whole or part in any form without prior written permission of Paul Chin is prohibited.