
Pragmatic Semiotic Information

From OeisWiki

Author: Jon Awbrey

Semiotic information is the information content of signs as conceived within the semeiotic or sign-relational framework developed by Charles Sanders Peirce.

Once over quickly

Information • What's it good for?

The good of information is its use in reducing our uncertainty about an issue which comes before us.  But uncertainty comes in many flavors and so the information which serves to reduce uncertainty can be applied in several ways.  The situations of uncertainty human agents commonly find themselves facing have been investigated under many headings, literally for ages, and the categories subtle thinkers arrived at long before the dawn of modern information theory still have their uses in setting the stage of an introduction.

Picking an example of a subtle thinker almost at random, the philosopher‑scientist Immanuel Kant surveyed the questions of human existence within the span of the following three axes.

  • What's true?
  • What's to do?
  • What's to hope?

The third question is a bit too subtle for the present frame of discussion but the first and second are easily recognizable as staking out the two main axes of information theory, namely, the dual dimensions of information and control.  Roughly the same space of concerns is elsewhere spanned by the dual axes of competence and performance, specification and optimization, or just plain knowledge and skill.

A question of what's true is a descriptive question and there exist what are called descriptive sciences devoted to answering descriptive questions about any domain of phenomena one might care to name.

A question of what's to do, in other words, what must be done by way of achieving a given aim, is a normative question and there exist what are called normative sciences devoted to answering normative questions about any domain of problems one might care to address.

Since information plays its role on a stage set by uncertainty, a big part of saying what information is will necessarily involve saying what uncertainty is.  There is little chance the vagaries of a word like uncertainty, given the nuances of its ordinary, poetic, and technical uses, can be corralled by a single pen, but there do exist established models and formal theories which manage to address definable aspects of uncertainty and these do have enough uses to make them worth looking into.

What is information that a sign may bear it?

Three more questions arise at this juncture.

  • How is a sign empowered to contain information?
  • What is the practical context of communication?
  • Why do we care about these bits of information?

A very rough answer to these questions might begin as follows.

Human beings are initially concerned solely with their own lives but then a world obtrudes on their subjective existence and so they find themselves forced to take an interest in the objective realities of its nature.

In pragmatic terms our initial aim, concern, interest, object, or pragma is expressed by the verbal infinitive to live, but the infinitive is soon reified into the derivative substantial forms of nature, reality, the world, and so on.  Against that backdrop we find ourselves cast as the protagonists on a scene of uncertainty.

The situation may be pictured as a juncture from which a manifold of options fan out before us.  It may be an issue of “truth”, “duty”, or “hope”, the last codifying a special type of uncertainty as to what regulative principle has any chance of success, but the chief uncertainty is that we are called on to make a choice and all too often we have very little clue which of the options is most fit to pick.

Just to make up a discrete example, let us suppose the cardinality of the choices before us is a finite integer n, and just to make it fully concrete let us say n = 5.  Figure 1 affords a rough picture of the situation.

Pragmatic Semiotic Information • Figure 1.png

This pictures a juncture, represented by a node from which 5 options for the outcome of a conduct fan out, and we have no clue as to which it must be.  In a sense the degree of the node, in this case 5, measures the uncertainty we have at that point.

The Figure illustrates the minimal sort of setting in which a sign can make any sense at all.  A sign has significance for an agent, interpreter, or observer because its actualization, its being given or its being present, serves to reduce the uncertainty of a decision the agent has to make, whether it concerns the actions the agent ought to take in order to achieve some objective of interest, or whether it concerns the predicates the agent ought to treat as being true of some object or situation in the world.

Where is information bred?

  • In reality or in its stead?

The way signs enter the scene is shown in Figure 2.

Pragmatic Semiotic Information • Figure 2.png

The Figure illustrates a scene of uncertainty which has been augmented by a classification.

In the pattern of classification shown the first three outcomes fall under the sign “A” and the next two outcomes fall under the sign “B”.

  • If the outcomes are things potentially true of an object or situation then the signs may be read as nomens (terms) or notions (concepts) in a relevant empirical or theoretical scheme, in effect, as predicates and predictors of the outcomes.
  • If the outcomes are things potentially worth doing to achieve a goal then the signs may be read as bits of advice or other indicators telling us what actions to try in a situation, relative to our active goals.

This will give us a practical framework for talking about information and signs in regard to communication, decision, and the uncertainties thereof.

Sense and Obliviscence

In taking up a study of signs from a pragmatic point of view we naturally follow the advice of the pragmatic maxim as to the way to make the relationship between our concepts and their objects as clear as necessary.  When it comes to our concept of the objects called signs we expand our conception of signs to a conception of their practical effects, conceiving the manifold of experiments and experiences involved in the use of signs.

In forming that expansion we bring to light many kinds of signs glossed over in the more conventional focus on words spoken and words written, that is, language in the strictest sense.  Signs in pragmatic perspective encompass all the data of the senses (dots) we take as informing us about inner and outer worlds, along with the concepts and terms we use to reason about everything from worlds of being to fields of action.

Ironically enough, we have just arrived at one of the junctures where it is tempting to try collapsing the triadic sign relation into a dyadic relation.  For if sense data were so closely identified with objects that we could scarcely imagine how they might be discrepant then we might imagine one role of beings could be eliminated from our picture of the world.

If that were true then the only things we'd need to bother informing ourselves about, via the inspection of sense data, would be yet more sense data, past, present, or prospective, nothing but sense data.  And that is the special form to which we frequently find the idea of an information channel being reduced, namely, to a source with nothing more to tell us about than its own conceivable conducts or its own potential issues.

Uncertainty Measured

As a matter of fact, at least in the discrete types of cases we are currently considering, it would be possible to use the degree of a node, the number of paths fanning out from it, as a measure of uncertainty at that point.  That would give us a multiplicative measure of uncertainty rather than the sorts of additive measures we are more used to thinking about — no doubt someone would eventually think of taking logarithms to bring measures back to familiar ground — but that is getting ahead of the story.

To illustrate how multiplicative measures of multiplicity, variety, or uncertainty would work out, let us take up a simpler example, one where the main choice point has a degree of four.  Figure 3 gives us the picture.

Pragmatic Semiotic Information • Figure 3.png

Uncertainty Multiplied

In our minds' eyes we imagined ourselves coming to a fork in the road and seeing four paths diverge from that point.  Suppose a survey of the scene ahead now shows each path reaching a point where another decision has to be made, this time a choice between two alternatives.  Figure 4 gives us the picture so far.

Pragmatic Semiotic Information • Figure 4.png

The Figure illustrates the fact that the compound uncertainty, 4 × 2 = 8, is the product of the two component uncertainties, 4 and 2.  To convert that to an additive measure, one simply takes logarithms to a convenient base, say 2, and thus arrives at the not too astounding fact that the uncertainty of the first choice is log2 4 = 2 bits, the uncertainty of the next choice is log2 2 = 1 bit, and the total uncertainty is log2 8 = 3 bits.
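The arithmetic of the compound decision can be checked in a few lines.  The following sketch (Python is my choice here, purely for illustration) multiplies the two degrees and converts the result to bits:

```python
import math

# Degrees of the two successive choice points (Figure 4):
# a four-way fork followed by a two-way fork on each path.
first, second = 4, 2

# Multiplicative measure: total number of distinct outcomes.
compound = first * second          # 4 * 2 = 8

# Taking logarithms (base 2) turns products into sums of bits.
bits_first = math.log2(first)      # 2.0 bits
bits_second = math.log2(second)    # 1.0 bit
bits_total = math.log2(compound)   # 3.0 bits

# Logarithms convert the multiplicative measure to an additive one.
assert bits_total == bits_first + bits_second
print(compound, bits_first, bits_second, bits_total)  # 8 2.0 1.0 3.0
```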

Uncertainty Moderated

In many ways the provision of information, a process serving to reduce uncertainty, is the inverse process to the kind of uncertainty augmentation taking place in compound decisions.  By way of illustrating the relation in question, let us return to our initial example.

A set of signs enters on a setup like this as a system of middle terms, a collection of signs that one may regard, aptly enough, as constellating a medium.

Pragmatic Semiotic Information • Figure 5.png

The language or medium here is the set of signs {“A”, “B”}. On the assumption that the initial 5 outcomes are equally likely, one may associate a frequency distribution (k1, k2) = (3, 2) and thus a probability distribution (p1, p2) = (3/5, 2/5) = (0.6, 0.4) with this language, and thus define a communication channel.
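The passage from frequencies to probabilities is mechanical enough to write out as a small sketch (Python, with names like `freqs` chosen here for exposition, not taken from the article):

```python
from fractions import Fraction

# Frequency distribution of outcomes over the signs "A" and "B" (Figure 5):
# 3 of the 5 equally likely outcomes fall under "A", 2 under "B".
freqs = {"A": 3, "B": 2}
total = sum(freqs.values())

# Probability distribution induced by the classification.
probs = {sign: Fraction(k, total) for sign, k in freqs.items()}
print(probs)  # {'A': Fraction(3, 5), 'B': Fraction(2, 5)}
```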

The most important thing here is really just to get a handle on the conditions for the possibility of signs making sense, but once we have this much of a setup we find that we can begin to construct some rough and ready bits of information-theoretic furniture, like measures of uncertainty, channel capacity, and the amount of information that can be associated with the reception or the recognition of a single sign.

Information Channeled

Suppose we find ourselves in the classification‑augmented or sign‑enhanced situation of uncertainty shown in Figure 5.  What difference does it make to our state of information regarding the objective outcome if we heed one or the other of the two signs, “A” or “B”, at least, operating on the charitable assumption we grasp the significance of each sign?

  • Under the sign “A” our uncertainty is reduced from 5 to 3.
  • Under the sign “B” our uncertainty is reduced from 5 to 2.

The above characteristics of the relation between uncertainty and information allow us to define the information capacity of a communication channel as the average uncertainty reduction on receiving a sign, a formula with the splendid mnemonic “AURORAS”.

Information Recapped

Reflection on the inverse relation between uncertainty and information led us to define the information capacity of a communication channel as the average uncertainty reduction on receiving a sign, taking the acronym auroras as a reminder of the definition.

To see how channel capacity is computed in a concrete case let's return to the scene of uncertainty shown in Figure 5.

For the sake of the illustration let's assume we are dealing with the observational type of uncertainty and operating under the descriptive reading of signs, where the reception of a sign says something about what's true of our situation.  Then we have the following cases.

  • On receiving the message “A” the additive measure of uncertainty is reduced from log2 5 ≈ 2.322 bits to log2 3 ≈ 1.585 bits, so the net reduction is log2(5/3) ≈ 0.737 bits.
  • On receiving the message “B” the additive measure of uncertainty is reduced from log2 5 ≈ 2.322 bits to log2 2 = 1 bit, so the net reduction is log2(5/2) ≈ 1.322 bits.

The average uncertainty reduction per sign of the language is computed by taking a weighted average of the reductions occurring in the channel, where the weight of each reduction is the number of options or outcomes falling under the associated sign.

The uncertainty reduction log2(5/3) ≈ 0.737 bits, occurring under the sign “A”, is assigned a weight of 3.

The uncertainty reduction log2(5/2) ≈ 1.322 bits, occurring under the sign “B”, is assigned a weight of 2.

Finally, the weighted average of the two reductions is computed as follows.

  (3 × log2(5/3) + 2 × log2(5/2)) / 5  ≈  (3 × 0.737 + 2 × 1.322) / 5  ≈  0.971 bits

Extracting the pattern of calculation yields the following worksheet for computing the capacity of a two‑symbol channel with frequencies partitioned as (k1, k2), where k1 + k2 = k.

Capacity of a channel {“A”, “B”} bearing the odds of 60 “A” to 40 “B”:

  (k1/k) log2(k/k1) + (k2/k) log2(k/k2)  =  0.6 log2(5/3) + 0.4 log2(5/2)  ≈  0.971 bits
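The worksheet pattern lends itself to a short computation.  The sketch below (Python, using the frequency names k1, k2 from the running example) carries it out:

```python
import math

# Worksheet for the capacity of the two-symbol channel {"A", "B"}:
# 5 equally likely outcomes split 3 under "A" and 2 under "B".
k1, k2 = 3, 2
k = k1 + k2

# Uncertainty reductions on receiving each sign, in bits.
reduction_A = math.log2(k) - math.log2(k1)   # log2(5/3)
reduction_B = math.log2(k) - math.log2(k2)   # log2(5/2)

# Weighted average of the reductions, with weights k1 and k2:
# the average uncertainty reduction on receiving a sign (AURORAS).
capacity = (k1 * reduction_A + k2 * reduction_B) / k
print(round(capacity, 3))  # 0.971
```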

In other words, the capacity of the channel is slightly under 1 bit.  That makes intuitive sense in as much as 3 against 2 is a near‑even split of 5 and the measure of the channel capacity, otherwise known as the entropy, is especially designed to attain its maximum of 1 bit when a two‑way partition is split 50‑50, that is, when the distribution is uniform.
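The closing observation can be verified directly: the entropy of a two‑way split p : (1 − p) peaks at 1 bit when p = 1/2, and at p = 0.6 it reproduces the capacity just computed.  A minimal check in Python (the helper name `entropy` is mine, not the article's):

```python
import math

def entropy(p: float) -> float:
    """Entropy in bits of a two-way partition split p : (1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(round(entropy(0.5), 3))  # 1.0 -- the even split attains the maximum
print(round(entropy(0.6), 3))  # 0.971 -- the 3-to-2 split, matching the capacity
```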

Bibliography

  • Peirce, C.S. (1867), “Upon Logical Comprehension and Extension”.
