Cybernetics

The word cybernetics comes from the Greek word kybernetes, meaning rudder or pilot, the person or device that steers a boat or, by extension, supports human governance. The word was first used by Plato in Alcibiades I to signify the governance of people. In the 1830s, the French physicist Ampère used it to describe the science of civil government. Norbert Wiener defined cybernetics as “the study of control and communication in the animal and the machine”.

Cybernetics is concerned with concepts at the core of understanding complex systems, such as learning, cognition, adaptation, emergence, communication, and efficiency. Cybernetics has been influenced by, and in turn has applications in, fields as diverse as psychology and control theory, philosophy and mechanical engineering, architecture and evolutionary biology, and social sciences and electrical engineering.

It is little wonder that philosophers and scientists have different definitions of cybernetics. Cybernetics is “the art of creating equilibrium in a world of constraints and possibilities,” according to the philosopher Ernst von Glasersfeld. The famous mathematician Andrey Nikolaevich Kolmogorov defined cybernetics as the “science concerned with the study of systems of any nature which are capable of receiving, storing, and processing information so as to use it for control.”

In recent years, scientists have shown some reluctance to use the term cybernetics because the discipline covers a very broad range of concepts and applications in so many areas of human endeavor. Nevertheless, core concepts of cybernetics, such as feedback, are essential for understanding complex systems, simply because such systems have to adapt their behavior based on feedback from the environment they operate in. Two feedback loops allow the system to learn and to adapt: one, used frequently, makes small adjustments and enables learning, while the other, used less frequently, senses the need to replace obsolete information with new information, thus enabling adaptation. According to Ashby, learning implies that a system discovers patterns of successful behavior in the environment it operates in and repeats successful actions while avoiding unsuccessful ones. Adaptation means that the system learns a new pattern of behavior after recognizing that the environment has changed and the old pattern is no longer successful.
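A minimal sketch of these two loops is given below; it is not taken from the text, and the class name, learning rate, error window, and threshold are all illustrative assumptions. The frequent inner loop nudges an internal estimate toward each observation (learning), while the infrequent outer loop watches a window of recent errors and discards the estimate once it has become obsolete (adaptation).

```python
# A sketch of a two-loop controller: a fast inner loop that makes small
# corrective adjustments (learning) and a slower outer loop that detects a
# persistent mismatch with the environment and replaces the learned model
# (adaptation). All names and parameters are hypothetical.

class TwoLoopController:
    def __init__(self, learning_rate=0.1, error_window=20, adapt_threshold=5.0):
        self.estimate = 0.0            # the system's current model of the environment
        self.learning_rate = learning_rate
        self.recent_errors = []        # history inspected by the slower, outer loop
        self.error_window = error_window
        self.adapt_threshold = adapt_threshold

    def step(self, observation):
        error = observation - self.estimate

        # Inner loop (frequent): small adjustment toward the observation -- learning.
        self.estimate += self.learning_rate * error

        # Outer loop (infrequent): if errors stay large over a whole window, the old
        # pattern is judged obsolete and the model is replaced -- adaptation.
        self.recent_errors.append(abs(error))
        if len(self.recent_errors) >= self.error_window:
            mean_error = sum(self.recent_errors) / len(self.recent_errors)
            if mean_error > self.adapt_threshold:
                self.estimate = observation   # discard the obsolete estimate
            self.recent_errors.clear()
        return self.estimate


if __name__ == "__main__":
    import random
    controller = TwoLoopController()
    # The environment changes abruptly halfway through; the outer loop adapts.
    for t in range(200):
        target = 10.0 if t < 100 else 50.0
        controller.step(target + random.gauss(0, 0.5))
    print(f"final estimate: {controller.estimate:.2f}")   # close to 50.0
```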

Ashby defines a machine as a system whose internal state, together with the state of the environment, dictates the next state. The regulator is the element controlling the evolution of the system; it does so by using feedback to assess how far the system deviates from a prescribed behavior and by reacting to disturbances in its environment. The regulator must have information linking cause and effect in the system's environment. The repertoire of actions available to the regulator should reflect the variety of the perturbations; this is Ashby's Law of Requisite Variety. Faced with an unforeseen sequence of events, we have the option to increase the variety in the regulator or to reduce the variety in the system being regulated. Based on these principles, we expect that an isolated dynamic system obeying unchanging laws will adapt to its environment. As a corollary, it follows that only the ensemble consisting of the system and its environment can rightfully be called self-organizing.
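The law can be illustrated with a small, hypothetical sketch; the function regulate, the outcome table, and the numeric disturbances below are assumptions for illustration, not part of Ashby's text. A regulator that can match every disturbance with a counter-action holds the outcome at a single value, while a regulator with fewer actions than there are disturbances cannot.

```python
# Illustration of Ashby's Law of Requisite Variety: a regulator can hold the
# outcome constant only if its repertoire of actions has at least as much
# variety as the set of disturbances. Names and tables are hypothetical.

def regulate(disturbances, actions, outcome):
    """For each disturbance, pick the action that best keeps the outcome on target.
    Returns the set of outcomes actually produced; perfect regulation yields one."""
    produced = set()
    for d in disturbances:
        # The regulator uses its knowledge of cause and effect (the outcome table)
        # to select the best available counter-action for disturbance d.
        best = min(actions, key=lambda a: outcome(d, a))
        produced.add(outcome(d, best))
    return produced

# Outcome is the deviation of the system from its prescribed behavior:
# an action exactly matching the disturbance cancels it (deviation 0).
outcome = lambda d, a: abs(d - a)

disturbances = [0, 1, 2, 3]

# Regulator with requisite variety: four actions for four disturbances.
print(regulate(disturbances, actions=[0, 1, 2, 3], outcome=outcome))   # {0}

# Regulator with insufficient variety: only two actions, so some disturbances
# cannot be fully countered and the variety of outcomes grows.
print(regulate(disturbances, actions=[0, 2], outcome=outcome))         # {0, 1}
```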