Knowledge has not been treated as a mathematical element, whereas information has long been regarded as a measurable mathematical quantity[1][2]. Can we measure knowledge in the same way as information?

Introduction

Before discussing why knowledge is mathematically intractable, we illustrate a few differences between information and knowledge. The knowledge management paper of [3] investigated the Data, Information, Knowledge and Wisdom (DIKW) layers. In this article we omit the discussion of Wisdom, an abstract concept that falls outside the scope of science. The paper [3] shows that knowledge sits one level above information from the perspective of human mental processing. Data are primitive signals sensed from internal and external sources, information is the valuable data selected from those primitives, and knowledge is information together with the relationships among its pieces, so that knowledge carries additional structure such as an ordering by importance. This importance ordering indicates which pieces of information matter more than others. Knowledge therefore implicitly contains ordering information that plain information does not: every bit of an information sequence has the same importance. Consequently, knowledge must be constructed from more than one symbol, whereas information can be treated symbol by symbol, independently.

Measuring information

In information theory, the probability of each symbol, p_k, is the fundamental element of processing. For example, the amount of information carried by a given symbol s_k is I(s_k) = \log(1/p_k), where the logarithmic measure is used for mathematical tractability. The lower the probability of the event represented by a symbol, the higher its information content. The other fundamental measure of information is the entropy, given by


 H_i(S) = E[I(s_k)] = \sum_{k=1}^{K} p_k \log(1/p_k)

where S = \{ s_1, s_2, ..., s_K \} is a symbol space.
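
As a concrete illustration, the following is a minimal Python sketch that computes the self-information of each symbol and the resulting entropy, assuming a made-up four-symbol distribution and base-2 logarithms so that the quantities are in bits:

 import math
 
 # Toy symbol space S = {s_1, ..., s_4}; the probabilities p_k are made up for illustration.
 probs = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}
 
 def self_information(p):
     """Amount of information I(s_k) = log2(1/p_k), in bits."""
     return math.log2(1.0 / p)
 
 # Entropy H_i(S) = E[I(s_k)] = sum_k p_k * log2(1/p_k).
 H_i = sum(p * self_information(p) for p in probs.values())
 
 for symbol, p in probs.items():
     print(f"I({symbol}) = {self_information(p):.3f} bits")
 print(f"H_i(S) = {H_i:.3f} bits")  # 1.750 bits for this distribution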

Suggested Way to Measure Knowledge Entropy

Now let us discuss how the amount of knowledge can be measured. Since each piece of knowledge consists of more than one piece of information and cannot be measured independently, we cannot define an 'absolute' amount of knowledge for each knowledge symbol. Instead of defining the amount of each knowledge symbol, let us therefore define the entropy of knowledge as follows:


 H_k(S) = E[I(s_k) \mid s_k \in \{\text{the importance set}\}]

where we assume that each knowledge symbol can be assigned to one of two sets: the important knowledge and the useless knowledge.
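
Continuing the toy example above, the following minimal sketch computes the proposed knowledge entropy as the expected self-information conditioned on membership in a hypothetical importance set; which symbols are labelled important is purely an assumption made for illustration:

 import math
 
 # Same made-up distribution as above; the importance labels are assumed for illustration only.
 probs = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}
 important = {"s1", "s2"}  # hypothetical importance set; the rest are treated as useless
 
 def self_information(p):
     return math.log2(1.0 / p)
 
 # H_k(S) = E[I(s_k) | s_k in the importance set]
 #        = sum over important symbols of p_k * I(s_k), normalized by P(importance set).
 p_important = sum(probs[s] for s in important)
 H_k = sum(probs[s] * self_information(probs[s]) for s in important) / p_important
 
 H_i = sum(p * self_information(p) for p in probs.values())
 print(f"H_k(S) = {H_k:.3f} bits, H_i(S) = {H_i:.3f} bits")  # 1.333 vs 1.750 bits here

For this particular labelling the computed knowledge entropy comes out smaller than the information entropy, which is consistent with the inequality discussed below.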

Although, for analytical simplicity, we categorize knowledge into only two sets in this article, according to whether it is important or useless, more general cases can be treated in a straightforward manner. Because we order all knowledge factors into just two sets, labelled 1 for important and 0 for useless, and assume equal importance within each set, the entropy of knowledge is smaller than the entropy of information computed over the same number of factors:


 H_k(S) \leq H_i(S)

A remaining question, then, is whether there is any way to specify the amount of each individual piece of knowledge.

References

The following items are the citations for this chapter. They will later be converted to the BibTeX system in LaTeX.

  1. Cover and Thomas, Elements of Information Theory
  2. Simon Haykin, Digital Communications
  3. Gene Bellinger, Durval Castro, Anthony Mills, Data, Information, Knowledge, and Wisdom
