PART III: This is the third installment in a three-part series explaining the three elements of knowledge management.
Parts I and II described how process maps show where knowledge is used, created, and stored, and how people are the creators and holders of corporate knowledge.
Part III focuses on technology as an enabler for knowledge management.
Mining, capture, and processing are critical to the proper implementation of knowledge management.
"Do I really need technology to capture my company's intellectual property?"
The answer: maybe not. So why bother with it, or with IT (information technology)? That is a question information technology managers answer each year during the budget allocation process.
While technology is not required and should never be the focus of a knowledge management initiative, it has, in fact, enabled the capture and large-scale sharing of some of this precious resource we call knowledge. As Tom Stewart of Fortune Magazine stated, "Technology without people won't work, but people without technology won't scale." So, the bottom line is that to implement an effective global knowledge management strategy, information technology must be utilized as an enabler.
What does technology enable? Technology is absolutely necessary to participate in the global game of knowledge management. Technology lets us:
- Record and store mass quantities of data
- Crawl the mass quantities of data
- Be notified automatically of changes in preferred knowledge stores
- Find and contact people (peers, competitors, customers, vendors) globally
- Record, monitor, and dynamically update good practices
- Maintain global, running dialogues with peers and mentors.
Technology is what allows us to even consider the utopia of making connections possible from anywhere, at any time, to anyone. There are a number of technologies to watch in this arena.
Knowledge, maintained and used properly, frames a problem so that it can be inspected and solved more effectively.
Internet and intranets
The emergence of the Internet and intranets has been the crowning glory to the world of knowledge management. Many would go so far as to say that without this connecting network and its associated browser technology, the majority of knowledge management systems would be doomed to failure.
The Internet is not new. In fact, in terms of technology years, some might even consider it to be old, given its early beginnings at the US Department of Defense in the 1960s. However, the ubiquity of a common delivery mechanism for knowledge and sharing is what is new. This infrastructure has become the common platform for aggregating content.
Portals
Portals are everywhere; everyone claims to be one. In many ways this has added to the knowledge management confusion, even though portals have been described as "an attempt to bring some order to the chaos of the Web" and the "killer app (application) of knowledge management."
Portal technology is focused on delivering the right information to the right person at the right time. While some believe this is impossible, portals have certainly come a long way in helping to organize information on the Web, corporate intranets, and other public or proprietary information sources. Portals began in the early days of the World Wide Web as a way to organize access to the vast amounts of available information.
Enterprise information portals (EIPs) have extended this concept inside the walls of corporations. Merrill Lynch first defined this concept in a November 1998 report as "a window, courtesy of the basic Web browser, into all of an organization's information assets and applications." It is a lofty goal: the holy grail of corporate information technology and organizational knowledge.
Regardless, a carefully chosen and well-implemented enterprise information portal can make a significant impact on an organization's utilization of its information capital. Today most EIP products offer a single interface to:
- Access external data/information
- Access internal data/information
- Use search tools
- Launch applications
- Collaborate via tools such as e-mail, video conferencing, and newsgroups/chat rooms.
In general, these tools help with information organization and access. However, other flavors of portals are popping up on the market almost daily. The most popular of these at the moment seem to be portal interfaces that promote business-to-business transactions. Exploration and production (E&P) organizations are trying to make sense of the multitude of options.
There are many choices to be made in the portal marketplace. One must carefully consider the goals of an EIP implementation and select a product to achieve those goals. Some products focus on access to structured data, while others focus on unstructured data. Others differentiate themselves by the customers they target: some appeal to the entire organization, while others focus on functional groups, business decision makers, or vertical industries.
The bottom line, however, is to enable an environment in which each individual is comfortable. This is often dependent on what an individual or team does, and their corresponding work style(s). Keep in mind that one cannot always have a "one size fits all" approach. Minimally, any implementation of an enterprise information portal must provide the facility for individuals or teams to personalize this space if it is expected to be the interface into the world of information and knowledge.
Process management tools
Process management tools enable one to capture crucial process information as described in earlier articles of this series. Without information technology supporting process management, the eight-foot-long piece of brown paper on which your team so carefully documented its processes becomes a dust collector in the corner of the team facilitator's office.
These tools can range from very sophisticated process and task authoring/routing tools to the other extreme of merely creating Web pages that textually document what the organization considers as best practices. As organizations mature and strive to turn their best practices into living, breathing, easily maintained knowledge, one expects more uptake of electronic process management tools in the E&P industry.
Collaboration tools
Collaboration tools have matured immensely over the last 15 years and are so ubiquitous that some may wonder why we even include them here. Such tools associated with knowledge management include e-mail, discussion groups/newsgroups, video conferencing, and application sharing. Some would even throw everything associated with knowledge management into this category.
The key to successful use of collaboration technologies lies not in the tools selected, but in the efforts put forth to ensure appropriate use and careful acknowledgement of the change management that must go along with use of these technologies.
Appropriate use does not mean that organizations can reduce their travel budget to zero because everyone has access to e-mail, video conferencing, and application sharing. Larry Prusak, director of IBM's Institute for Knowledge Management, comments: "Technology cannot cause knowledge networks, it can only enable them...Unless people meet, trust decays, and entropy sets in."
Search tools
Search tools, too, have become relatively commonplace due to their growth via the World Wide Web. Most popular search engines are one of two types:
- Search engines or meta-search engines: These systems typically use some form of spider software or crawlers to constantly search the Web for new information. Resulting uniform resource locators (URLs) are sent back to the search engine. Some search engines index every word of a document, while others simply index the title. This type of search engine requires limited human intervention to build its content.
- Subject directories: These engines require significant human intervention. Typically, editors select Web pages to index and categorize. These Web pages are often organized into hierarchical subject categories. Some specialized subject guides only look at certain topics.
These search engines usually return a list of Web pages with a subjective indicator of their relevancy to your query. As this technology evolves, you can look for products that answer questions posed, rather than requiring the user to specify a Boolean-based query. These offerings grew out of early natural-language query systems; they use forms of artificial intelligence to understand the context surrounding their "hits" and apply that understanding when judging the relevancy of results.
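The word-indexing behavior described above can be sketched as a toy inverted index. This is a minimal illustration, not any particular engine's implementation; the documents and names are hypothetical, and real engines add crawling, stemming, and relevancy ranking on top of this core structure:

```python
import re
from collections import defaultdict

def build_index(docs):
    """Map each word to the set of documents containing it
    (the 'index every word' approach described above)."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index, query):
    """Simple Boolean AND query: documents containing every query word."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical corpus for illustration
docs = {
    "page1": "Portals organize access to corporate knowledge",
    "page2": "Search engines crawl the Web for knowledge",
    "page3": "Collaboration tools include e-mail and video conferencing",
}
index = build_index(docs)
print(search(index, "knowledge"))        # pages 1 and 2
print(search(index, "crawl knowledge"))  # page 2 only
```

A subject directory, by contrast, would replace `build_index` with human editors assigning pages to hierarchical categories.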
XML and XSL
Unlike the other topics that have been covered here, extensible markup language (XML) and extensible style language (XSL) are technologies, not products that a single vendor can provide as off-the-shelf items. As the world of connectivity has become more widespread, expectations for application and data connectivity have increased as well.
Sophisticated users require an approach for enabling spontaneous information exchange in this well-connected world. This requires that the exchanging parties agree on what each piece of information means, as well as spontaneously agreeing on the organization of the information. With planned data exchange that can be mapped and organized, this is not difficult.
But the reality is that we want to exchange information without having to plan everything with our information technology departments ahead of time. This is where XML and XSL enter the picture. XML defines a mechanism
for exchanging structured information over intranets, extranets, and the Internet, whereas XSL defines the presentation of the content defined in XML.
XML is a system for defining, validating, and sharing document or data formats. This language uses tags to distinguish document structures and attributes to encode extra document information. With XML, tags do not have to be pre-defined. They are interpreted as the document is received. This enables the sharing of information without regard to the native database structures of a typical relational environment.
XML was not designed to be a standardized way of coding text. It is based on the realization that it would be impossible to devise a single coding scheme suitable for all applications, industries, and languages. Instead, XML passes information about the component parts of a document to another computer system. The receiving system reads this structure and uses it to interpret data content. The strategy ensures that data will be transferable to a wide range of hardware and software environments.
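A short sketch can make this concrete. The parser below, using Python's standard library, reads a small hypothetical well-header document whose tags are not pre-defined anywhere; the receiving system discovers the structure from the tags themselves, just as described above:

```python
import xml.etree.ElementTree as ET

# A hypothetical E&P document; none of these tags come from a shared schema.
doc = """
<well id="W-101" operator="ExampleCo">
  <name>North Field 7</name>
  <depth unit="m">3250</depth>
</well>
"""

root = ET.fromstring(doc)

# The structure is interpreted as the document is received.
print(root.tag, root.attrib)              # the root element and its attributes
print(root.find("name").text)             # North Field 7
print(root.find("depth").attrib["unit"])  # m
```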
XSL, on the other hand, is a system for specifying presentation, for example when rendering XML content as HTML pages. This standard is purposely independent of the content being communicated, such that the formatting can be applied without contaminating the content. This separation is particularly useful when there is a need to present the same information to different consumers based on their workflow, role, or history. This is very applicable to data in the E&P industry.
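As a sketch of that separation, the hypothetical XSLT stylesheet below presents a well document (one with `name` and `depth` elements) as an HTML fragment; all formatting lives in the stylesheet, and the XML content is never touched:

```xml
<?xml version="1.0"?>
<!-- Hypothetical stylesheet: formats a <well> document as HTML.
     The source XML carries no presentation information of its own. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/well">
    <html>
      <body>
        <h1><xsl:value-of select="name"/></h1>
        <p>Total depth: <xsl:value-of select="depth"/>
           <xsl:value-of select="depth/@unit"/></p>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```

Presenting the same well data to a different consumer, say a driller rather than a geologist, would mean swapping in a different stylesheet, not changing the data.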
Medieval to future
The authors do not intend to imply that the technologies covered in this article represent a complete list. There are other technologies available that can be used to further knowledge management. The trap to avoid, however, is assuming that implemented technology means knowledge management is robust and thriving.
Verna Allee uses technology "as sort of a litmus test to see where you are in your learning about knowledge management. If a company is babbling on about technology, then it is a beginner."
But as knowledge management evolves within the enterprise, it learns to harness the immense power of technology and activate the value network of dynamic exchanges among its own people, suppliers, and customers. So, while knowledge management has been with us since medieval times, as evidenced by the master and apprentice exchanges, we are witnesses now to times where technology tools make global speed-of-light knowledge exchanges possible. Knowledge, though stored within one mind, can become ubiquitous at the touch of a button.