Cybernetics

R. Vaughn Spring 2002

 

Cybernetics derives from the Greek word kybernetes, meaning "steersman." It has many definitions, but the one I prefer most is that it is the "study of feedback" (Littlejohn, 2001). W. Ross Ashby (1956) offered a more complete definition, however; he said that cybernetics is the formal study of all possible machines and a discipline with its own foundations. The domain also touches on all the traditional disciplines, including mathematics, technology, and the social sciences. This is why one of the major sources for the development of its theory is the transdisciplinary study of complex systems, and more specifically the "sciences of complexity" (Heylighen et al., 1999). The history of cybernetics can be traced to the early 1940s and 1950s, when a series of meetings sponsored by the Macy Foundation was convened on "circular causal and feedback mechanisms in biological and social systems." During those meetings Norbert Wiener coined the name "cybernetics" for the discipline and defined it as "control and communication in the animal and the machine," to which we now add: in society and in individual human beings. The field grew in part out of Shannon's information theory, which was designed to perfect the transmission of information through communication channels. As a later development, around 1970 Heinz von Foerster distinguished first-order from second-order cybernetics: the study of observed systems and the study of observing systems, respectively. The latter emphasizes how observers construct models of the systems with which they interact.

 

Cybernetics focuses on how systems function: how they control their actions and how they communicate with other systems or with their own components. Cybernetics helped give rise to new fields such as cognitive science and neurobiology and has been useful in formulating dozens of ideas and pieces of applied mathematics. A very simple cybernetic system consists of what Littlejohn (2001) calls a sensor, a comparator, and an activator. The sensor provides feedback to the comparator, which decides whether the machine is off course. The comparator then gives guidance to the activator, which produces an output that in some way affects the environment; that effect is sensed again as feedback. This concept of feedback is essential to cybernetics.
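To make the loop concrete, here is a minimal sketch in Python of a thermostat built from Littlejohn's three parts. The function names, the update rule, and all numeric constants are my own illustrative assumptions, not anything given in the source.

```python
# Littlejohn's sensor -> comparator -> activator loop, modeled as a
# thermostat regulating room temperature (all numbers are illustrative).

def sensor(room_temp):
    """Measure the current state of the environment."""
    return room_temp

def comparator(measured, setpoint):
    """Decide whether the system is off course; return a correction signal."""
    return setpoint - measured  # positive -> too cold, negative -> too warm

def activator(error):
    """Act on the environment based on the comparator's guidance."""
    return 1.0 if error > 0 else 0.0  # heater on or off

def run(setpoint=20.0, room_temp=15.0, steps=30):
    history = []
    for _ in range(steps):
        error = comparator(sensor(room_temp), setpoint)
        heat = activator(error)
        # The activator's output changes the environment, which the sensor
        # reads on the next pass -- this closing of the circle is the feedback.
        room_temp += 0.5 * heat - 0.1  # heating minus ambient heat loss
        history.append(room_temp)
    return history

temps = run()
```

Because the output (heat) feeds back through the environment into the sensor, the room settles near the set point and hovers there, which is exactly the self-regulation the paragraph describes.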

 

However, some feedback systems are more complete than others. The simplest distinction is between active and passive behavior: active behavior originates within the system, while passive behavior comes from outside stimulation. Sneezing is a passive behavior, but giving a friend a high five is an active behavior. Active behavior can be further divided into purposeful and random behavior. If I move my hand, it could be just a random action; if I move my hand in order to emphasize a point, the behavior is purposeful. Purposeful behavior, in turn, comes in different levels of complexity. In a simple system, an organism responds to feedback by turning on or off; a light switch, for example, has only two states. Complex systems, by contrast, use feedback to determine what they will do, and in this manner they are able to adjust and adapt. Complex systems may also be predictable or unpredictable. The feedback processed by complex or even simple systems can itself be classified: feedback can be negative or positive. If I kissed a girl and got slapped, the feedback would be negative, and I would probably not do it again because my system would respond by avoiding that outcome. A reciprocated kiss, however, would be positive feedback, and I would probably do it again more often. This is important to system growth because it implies that learning is taking place.

 

The theory of cybernetics also describes three feedback states: the steady state, the growth state, and the change state. In a steady state, when negative outcomes occur the system acknowledges the problem and returns to normalcy to balance out the situation. An example might be a classroom. Let's say a teacher is receiving negative feedback from the students that the tests are too hard or the lectures are too boring. To return the class to a positive atmosphere, he or she might provide a study guide and incorporate a movie or two into the lectures. The second state is growth. In a growth state, a system begins to deviate, and positive feedback causes the deviation to accelerate. For example, suppose a kid thinks it is funny to always make fun of his friend's weight. His friend is irritated but doesn't act too upset, so the kid continues to make fun of him every day. Their friendship will increasingly disintegrate until it falls apart. In this situation the positive feedback was not good feedback; it produced growth, but growth in the wrong direction. Negative feedback, such as the friend expressing his dissatisfaction with the fat jokes, would be needed for the friendship system to maintain its balance. The third state is change. This state involves both negative and positive feedback, because the system is constantly changing yet adapting, a remarkable trait of both human and non-human systems. An example could take place in a relationship. Let's say Mark and Susan begin having problems. Mark decides to become more supportive, and it helps their relationship. However, Susan begins to take advantage of Mark's approach, and conflicts arise again. So Mark decides a change is needed and begins to criticize Susan's actions. When Susan becomes more affectionate with Mark, he reduces his criticism. At this point the system has moved to a new state of somewhat less supportiveness and more scrutiny.
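The three states can be illustrated with a toy linear system in which feedback acts on the deviation from a set point. The update rule, gains, and set points below are assumptions chosen purely for illustration.

```python
# Steady state: negative feedback damps a disturbance back to the set point.
# Growth state: positive feedback makes the deviation accelerate.
# Change state: negative feedback, but toward a NEW set point, so the
# system settles into a different equilibrium (Mark and Susan's new balance).

def evolve(x0, gain, setpoint=0.0, steps=20):
    """Each step, feedback acts on the state's deviation from the set point:
    a negative gain corrects the deviation, a positive gain amplifies it."""
    x = x0
    for _ in range(steps):
        x += gain * (x - setpoint)
    return x

steady = evolve(x0=5.0, gain=-0.5)                 # deviation dies out
growth = evolve(x0=5.0, gain=+0.2, steps=10)       # deviation accelerates
change = evolve(x0=5.0, gain=-0.5, setpoint=2.0)   # settles at a new state
```

The same mechanism, feedback on deviation, produces all three behaviors; only the sign of the gain and the location of the set point differ.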

 

While cybernetics yields many important concepts concerning the feedback process, it also has a great deal of relevance to more complex systems. A system that is part of a larger, more complex system, or of the environment itself, is called a subsystem. In a complex system there may be multiple feedback loops that form networks. The most consistent rule of such a feedback loop is that output always returns as feedback input. Cybernetics helps us explain this larger, more complex model because it accounts for concepts such as self-regulation, interdependence, wholeness, and interchange with the environment. Cybernetics is a critical way of thinking that emphasizes circular causality, not just the basic statement that one thing happens because it was caused by another thing. In the real world, what goes around comes back around like a boomerang. From an observer's perspective, however, it is often difficult to see how this circular pattern works in a system. The phrase "second-order cybernetics" was therefore coined as a tool to help explain this phenomenon.

 

Second-order cybernetics helps explain how the observation process itself is a system in which feedback loops are established between the observer and the observed (von Foerster, 1981). The observer cannot help becoming a cybernetic system himself or herself, because knowing is built from what is learned, we learn in part from observations, and those observations are in turn affected by what is seen. A powerful idea in second-order cybernetics is that a system both affects and is affected by its observer. This matters because humans often have a hard time accepting that we are not separate from what we observe. Another important concept is structural coupling: two systems may affect each other mutually, because when we observe a system we are shaped by the history and structure of that system. An example is trying to explain to someone how your memory works while, at that very moment, you are remembering the subject you are speaking about. So two systems can indeed have mutual effects.
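Structural coupling, two systems that mutually shape each other, can be sketched as a pair of states where each one updates as a function of the other. The linear coupling rule and the coupling strength below are purely illustrative assumptions.

```python
# Two mutually coupled systems: each state is nudged toward the other's.
# Neither can change without changing the other, so over time they
# converge on a shared history -- a toy picture of structural coupling.

a, b = 1.0, 0.0  # observer and observed start in different states
for _ in range(50):
    # Simultaneous update: each new state depends on the other's old state.
    a, b = a + 0.1 * (b - a), b + 0.1 * (a - b)
```

After enough mutual influence the two states become indistinguishable, which mirrors the essay's point that the observer cannot stand apart from what is observed.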

 

Although the theory of cybernetics can be relatively complex, it can be evaluated at many levels. A theory is a scientific account of a phenomenon, and cybernetics fits this definition because it explains the phenomenon of regulation and control in a system. Another way to evaluate the theory is against specific criteria. For example, a theory must be corrigible: we must be able to jeopardize it by making observations. Cybernetics is easy to test in this way because networks exist in every aspect of life, so we can observe networks and verify how systems adjust and change as cybernetics predicts. A theory should also be able to explain the outcome of a series of events, so cybernetics must be able to reduce our uncertainty about the outcome of a system and about how the system gauges its effects. Cybernetics meets this criterion because it offers feedback models in place of hazy examples; given those models, you can plug in a system and it will fit appropriately into one of them.

 

In the real world this theory can help explain how a simple mechanism responds to feedback, such as how a heater switches on or off. It can also explain more complex systems, like the relationship between a boss and coworkers. The theory is relevant and practical at any level because it optimizes the transmission of feedback through any sort of communication channel, and it explains how the system functions.

 

Works Cited

 

Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society. Houghton Mifflin, 1954.

Scott, Bernard. Cybernetics and the Social Sciences. Sept. 2001: 1-7.

Fuchs, W.R. Cybernetics for the Modern Mind. New York, 1971.

Heylighen, F. What Are Cybernetics and Systems Science? Principia Cybernetica Web, Oct. 1999.

Littlejohn, S. Theories of Human Communication. Wadsworth, 2001.

Ashby, W. Ross. An Introduction to Cybernetics. Chapman & Hall, 1956.

von Foerster, Heinz. Observing Systems. Intersystems Publications, 1981.