SOFTMARK AG

Cognitive Computing

Cognition


In the intelligent evaluation of complex situations and in the optimizing control of company processes, the software agent needs cognitive strategies equal to or even superior to those of human beings. It must therefore be investigated to what degree human cognition is affected by disadvantages that skillful software strategies can avoid:

1st mistake: Reductionism

Thought patterns in the western, industrialized hemisphere follow a mechanical, causal-analytical world view based on the ideas and mathematical theories of Isaac Newton and René Descartes. Matter is considered to be the basis of all being, and the material world is seen as a multitude of separate objects that combine to form one giant machine. Complex phenomena are understood by being reduced to their individual building blocks, a method called reductionism. This approach of going back to individual components is equated with the scientific method itself.

As early as the 20th century, this mechanical world view proved to have distinct limitations, and physicists began to move away from the big-machine model and to see the world as a harmonious whole, a network of relationships.

2nd mistake: Determinism

Human thought is to a large extent dominated by predefined patterns and the perpetual endeavor to sort new data into these existing target patterns. This circumstance is a disadvantage, because the human mind tends to always shift toward defined, long-familiar thought patterns whenever corresponding informational content is presented.

The moment the human mind is confronted with a particular situation, the categorizing mechanism kicks in and attempts to classify the event in some way. As soon as a classification appears possible, it is carried out, and patterns form. If new informational content arrives that does not fit into one of the existing patterns, we often do not question the pattern but the content itself. The human mind continues to manipulate the information until it more or less fits into one of the defined patterns.
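The forced-classification tendency described above can be sketched in code. The data and category names below are purely hypothetical; the point is the contrast between a classifier that must assign every input to an existing pattern and one that is allowed to admit "no fit":

```python
def nearest_pattern(value, patterns):
    """Always returns the closest existing pattern, however poor the fit."""
    return min(patterns, key=lambda name: abs(patterns[name] - value))

def nearest_pattern_or_none(value, patterns, threshold):
    """Returns None when no pattern is close enough -- a signal that a new
    pattern may be needed instead of bending the input to fit an old one."""
    name = nearest_pattern(value, patterns)
    return name if abs(patterns[name] - value) <= threshold else None

# Hypothetical one-dimensional "thought patterns"
patterns = {"small": 1.0, "medium": 5.0, "large": 9.0}

print(nearest_pattern(100.0, patterns))             # forced into "large" anyway
print(nearest_pattern_or_none(100.0, patterns, 2))  # None: none of the patterns fit
```

The first function mirrors the behavior criticized in the text: classification always succeeds, so the pattern set is never questioned. The second admits that the input may lie outside all known patterns.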

To this end, the human mind also uses improper methods that not only reinforce the whole system but drive it toward absurdity. Patterns form into chains of patterns in which situations are permanently and inextricably linked to one another. There is nothing in the system of human thought that can effectively combat or even interrupt such pattern sequences.

3rd mistake: Volatility

The human mind is erratic and thinks in extremes. Mental transitions in the course of decision-making are not nuanced, but occur in steps that are much too broad and often influenced by the person’s emotional state. This volatility is a disadvantage of human thought, as it causes abrupt switches among several thought patterns. People generally make too little effort to examine the transitions between the conceptual patterns in sufficient detail. Under time constraints and through negligence, we often choose the wrong pattern and fail to get the optimal benefit from the information presented.

4th mistake: Too little diversity in perception

The sequence in which information arrives in the mind determines human thought. Data is sorted into patterns in the order it arrives, which keeps the mind from taking full advantage of the available information. In addition, the human mind depends on attention. In order to address a defined situation, the mind has to adjust to the corresponding circumstances. Without proper attention, thought is not possible.

The goal of focusing our attention is to activate memories and in doing so create a memory level where the current data and statements can be evaluated and sorted. Due to the necessity of attention, however, the scope of information a human being can cognitively process will always be limited. Human attention is also frequently subject to random circumstances rather than deliberate focusing, which can be reinforced by key terms.

The human mind has to limit its memory level to be able to become attentive to a particular area at all. It only works when it does not access all of its memories at once, which would not be possible anyway in light of the huge amount of data. The more we gear our attention toward a particular situation, the greater the chance that we focus too intently on already defined patterns and lose our ability to see other perspectives.

Also, the mind accepts only those perceptions and thoughts that are logical and make sense. Seemingly nonsensical connections are rejected from the start, even if these connections that seem nonsensical a priori do prove to make logical sense at a later point in time, a kind of a posteriori logic, so to speak. In this context, we speak of paradoxes. These very paradoxes are what gives human perception the innovative-creative impulse to draw the logical and ultimately right conclusion from a supposedly illogical point of view. This creates new approaches to solutions that would not materialize with common linear thinking.

Innovative solutions, therefore, are often the result of initially illogical perspectives of situations. The logic of a solution only becomes apparent after the fact. This situation is strikingly illustrated in data mining systems, which generate data patterns as a result of mathematical distributions. These patterns very clearly express the represented frequencies, but never the reasons for their respective occurrences.

“While the association of toothbrushes and toothpaste is immediately apparent, we are astonished by the connections between shower gel and hair shampoo, which reveal a typical paradox: expected effects are dismissed as trivial, while surprising effects are neglected because the underlying connection remains hidden. The goal is to support the thought process, which initially occurs upstream and then parallel to the technical analysis, in such a way that the formulation of the question, thesis generation, verification, and falsification follow rational principles and are documented transparently with the analysis results. In addition, a necessarily creative process is to be initiated, in which users break free of well-worn thought pathways and find their way to new questions and ultimately design options.”
(Dr. Nicolas Bissantz, Prof. Hannig, “Stand und Weiterentwicklung softwaregestützter Datenanalyse im betriebswirtschaftlichen Umfeld” [Status and development of software-supported data analysis in the business management environment], 2001)
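The point that mined patterns express frequencies but never reasons can be made concrete with a minimal association-counting sketch. The transaction data below is invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping baskets
baskets = [
    {"toothbrush", "toothpaste"},
    {"toothbrush", "toothpaste", "shampoo"},
    {"shower gel", "shampoo"},
    {"shower gel", "shampoo", "toothpaste"},
]

# Count how often each pair of items co-occurs in a basket
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The mining result is a pure frequency table: it reports THAT
# "shower gel" and "shampoo" co-occur, but nothing about WHY
# (a shared shopping trip? a promotion? habit?).
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

However the counts turn out, the output is a list of frequencies; any causal interpretation must be supplied by the human analyst, which is exactly the gap the quoted passage describes.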

5th mistake: Lack of goal recognition

The system is scanned until a defect is found. The defect is removed, and the next defect is located (repair-service behavior). As with a novice chess player, planning has no overall direction but remains continuous piecework without an overarching concept.

6th mistake: Limitation to sections of the overall situation

Huge amounts of data are collected that yield enormous lists but show hardly any relationships. That way, they cannot be put into any kind of order, and the dynamics of the system, especially in the temporal dimension, remain unrecognized. This phenomenon occurs especially in today’s information technology, where virtually unmanageable amounts of data are collected with the available technological possibilities, but the contextual dimension remains hidden in most cases.

7th mistake: One-sided focus

We lock in on a particular focus that was correctly identified, but leads us to overlook grave consequences in other areas. According to the principle “pressure generates counter-pressure”, it is not possible to make a change to one part of the system without possibly causing corresponding reactions somewhere else.

8th mistake: Disregarded side effects

Trapped in one-dimensional thinking, we take a very “targeted”, i.e. linear approach to finding appropriate measures for system improvement without branching out. Side effects in other areas are not sufficiently taken into account.

9th mistake: Tendency to over-correct

Initially we often proceed with much hesitation. When the system does not react, we intervene forcefully, only to come to a complete halt at the first unintended repercussion. Over-correcting is a logical consequence of the common current practice of generating present strategies without a clear picture of the goals we are aiming for in the future.
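The over-correction pattern has a well-known counterpart in feedback control: a correction proportional to the current error converges when the gain is modest, but a gain that is too aggressive makes every correction overshoot and enlarge the error. The system and gain values below are purely illustrative:

```python
def control(start, target, gain, steps):
    """Repeatedly correct by gain * current error; return the error after each step."""
    x = start
    errors = []
    for _ in range(steps):
        x += gain * (target - x)   # the "correction"
        errors.append(abs(target - x))
    return errors

cautious = control(0.0, 10.0, gain=0.5, steps=6)   # moderate correction: error shrinks
overshoot = control(0.0, 10.0, gain=2.5, steps=6)  # over-correction: error grows each step

print(cautious[-1] < cautious[0])    # True: the system settles toward the target
print(overshoot[-1] > overshoot[0])  # True: the system oscillates ever more wildly
```

With gain 0.5 the error halves at every step; with gain 2.5 each “fix” overshoots the target and multiplies the error by 1.5, the numeric analogue of lurching between hesitation and drastic intervention.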

10th mistake: Tendency toward authoritarian behavior

The power of being entitled to change the system and the belief that we have understood it lead to dictatorial behavior that is absolutely unsuitable for complex systems. Such systems need soft, pliable behavior that adapts with the flow. The rules by which a system must abide cannot be categorically imposed from the outside, but must follow naturally from the system itself over time. This way we will ultimately have a self-controlling and self-regulating system.
