More on Cybernetics.

Cybernetics: from the Greek word for “steering” (a kybernetes is a steersman).

Therefore, study of control and regulation.

Steer toward the goal and correct the error or deviation from the goal.

System perspective:

■ System has a goal

■ System is maneuvered toward its stated goal

■ Environment deflects the aim

■ System provides “feedback”

■ System measures deviance:

|required_state – current_state|

■ System takes action to mitigate the deviance

■ Repeat
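A minimal sketch of this loop (the goal, gain, and disturbance values are invented for illustration):

```python
# Minimal steering loop: measure the deviance from the goal, act to reduce it,
# repeat. The environment keeps deflecting the aim, so the loop never stops.
goal_heading = 90.0      # required_state (degrees)
heading = 70.0           # current_state
gain = 0.5               # how strongly the controller corrects the deviance

for step in range(10):
    deviance = goal_heading - heading    # |required_state - current_state| is its magnitude
    heading += gain * deviance           # corrective action steers toward the goal
    heading -= 2.0                       # the environment (wind, current) deflects the aim
    print(f"step {step}: heading={heading:.1f}, |deviance|={abs(goal_heading - heading):.1f}")
```

With a constant disturbance the loop settles near, but not exactly on, the goal, which anticipates the later point that control is never exact.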

Steering as a feedback loop

Thermostat as a controller.

● Environment acts on the thermostat

● It has only two internal states with which to communicate with the rest of the system: more/less, i.e. yes/no.

● The environment is far more complex than a thermostat. If the thermostat sits near a heat source or a heat sink, it is totally ineffective at controlling the ambient temperature.

Total variety of a thermostat = 2

● A better regulator is a human controller with more complexity. e.g.

♦ my comfort: (too hot / just right / too cold, variety = 3)

♦ others’ comfort: (same as above, variety = 3)

♦ state of storm doors: (open or closed, variety = 2)

♦ state of guest room door: (as above, variety = 2)

♦ state of 8 windows: (each open or closed, total variety = 2^8 = 256)

♦ state of 3 upstairs bedroom doors: (as above, variety = 2^3 = 8)

Total complexity (variety) = 3 × 3 × 2 × 2 × 256 × 8 = 73,728 (worked out in the sketch after this list).

● With this much variety a human controller can regulate the environment more accurately, but still never exactly. Conclusion: the system to be controlled has more complexity (variety) than its controller.

● I may employ a set of communicating sensors and a computer to control the temperature, and as a controller I would then have even more variety, but the control would still never be exactly accurate.
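A worked version of the variety count above, as a small sketch (the counts simply restate the list):

```python
# Total variety of the human controller = the product of the varieties of the
# independent states the controller can distinguish and act on.
my_comfort      = 3        # too hot / just right / too cold
others_comfort  = 3
storm_doors     = 2        # open / closed
guest_room_door = 2
windows         = 2 ** 8   # 8 windows, each open or closed = 256
bedroom_doors   = 2 ** 3   # 3 upstairs bedroom doors = 8

total_variety = (my_comfort * others_comfort * storm_doors *
                 guest_room_door * windows * bedroom_doors)
print(total_variety)       # 73728, versus a thermostat's variety of 2
```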

Law of requisite variety: for a system to be stable under perturbation of its inputs, the control mechanism must have at least as much variety (complexity) as the system to be controlled.

● Only variety can destroy variety.
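A small counting sketch of the law (the disturbance and response counts are illustrative): when the regulator has fewer responses than the environment has disturbances, some outcome variety necessarily survives.

```python
import math

# Counting form of the law of requisite variety: the best a regulator can do
# is reduce the outcome variety to (disturbance variety) / (regulator variety).
def residual_variety(disturbances: int, responses: int) -> int:
    """Minimum number of distinct outcomes the regulator cannot collapse further."""
    return math.ceil(disturbances / responses)

print(residual_variety(disturbances=73728, responses=2))      # thermostat: 36864 outcomes remain
print(residual_variety(disturbances=73728, responses=73728))  # matched variety: 1, i.e. full control
```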

Conant-Ashby theorem: Every good regulator of a system must be a model of that system.

● To be maximally successful, every regulator must be isomorphic to the system being controlled.

Another example.

■ Management has a model of the system it is embedded in.

■ Model has no more variety than management.

■ Management has no more variety than the system.

Thus the best description of the system available to management is what the manager conceives of it through the model, and because the model has less variety than the system, that description is always inadequate.
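A sketch of why the low-variety model is always inadequate (the state counts and the mapping are invented for illustration): distinct system states that the model maps to the same description are indistinguishable to the manager, so no control based on the model can respond to them differently.

```python
# The manager's model collapses many distinct system states into a few model
# states; states that share a model state cannot be told apart by the manager.
def model(system_state: int) -> str:
    # illustrative 2-state model of a 100-state system
    return "ok" if system_state < 90 else "alarm"

system_states = range(100)
model_states = {model(s) for s in system_states}
print(len(model_states), "model states for", len(system_states), "system states")
```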


What is the “system” that managers see? System definition.

Basically four possibilities. Consider a university as a system.

1. Reductionist view. The system is the sum total of the actions of its parts.

Number of graduating students = f(t, c, s, m), where

t = level of current technology used in class

c = class size (smaller classes)

s = amount of scholarship available

m = market relevancy of courses
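A toy version of this view, as a sketch (the linear form and all weights are invented for illustration): the system’s output is treated as a simple function of the parts, with no interaction between them.

```python
# Reductionist sketch: number of graduates as a plain function of the listed
# factors. The functional form and every weight are illustrative assumptions.
def graduating_students(technology, class_size, scholarship, relevancy):
    return 50 * technology - 2 * class_size + 0.01 * scholarship + 40 * relevancy

print(graduating_students(technology=0.8, class_size=30, scholarship=100_000, relevancy=0.9))
```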

2. Statistical view. System as a “Disorganized complexity”.

Random phenomena, e.g.:

General state of the economy

Job opportunities in the global market

Local job opportunities

Personal motivation level
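A sketch of the statistical view (all distributions are invented for illustration): the same output is treated as an aggregate of random influences, so only its statistics, not individual cases, can be predicted.

```python
import random

# Disorganized complexity: treat the listed influences as random variables and
# look only at aggregate behaviour over many students.
def one_student_graduates() -> bool:
    economy    = random.gauss(0.0, 1.0)   # general state of the economy
    global_job = random.gauss(0.0, 1.0)   # global job market
    local_job  = random.gauss(0.0, 1.0)   # local job opportunities
    motivation = random.gauss(0.5, 1.0)   # personal motivation level
    return economy + global_job + local_job + motivation > 0

samples = [one_student_graduates() for _ in range(10_000)]
print(sum(samples) / len(samples))        # roughly 0.6: a statistical, not individual, prediction
```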

3. Holistic system. The university should be seen as an ensemble of students, professors, and administrators.

4. Relativistic system. The university as an operational entity is shaped by how the students, managers, and professors themselves are affected by what they believe, and by how important those beliefs are to them. Grapevine effect; self-fulfilling scenarios.

All these views are correct, but in each case we speak of a different thing: four ways of looking at the same system, each yielding a different idea of it.

Variety Engineering.

From the point of view of variety, the operation and its environment have far more variety than the manager, and the manager’s model has still less.

Thus, we may require a filter F to attenuate the variety flowing from the system up to the manager, and an amplifier to expand the manager’s responses back down toward the system.

But if the system model has low variety, the manager’s responses cannot be amplified arbitrarily: amplification will also produce noise, and the signal-to-noise ratio will deteriorate.

Filter F: may delete all long-term patterns, presenting only short-term, immediate results. This can be dangerous.

Amplifier: the manager may have little understanding of what is going on in the system. He might suggest several actions (amplification), but they may not all be relevant.
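A sketch of both devices together (the readings and the action table are invented for illustration): the filter attenuates the variety flowing up from the operation to the manager, and the amplifier expands the manager’s few decisions into many concrete actions.

```python
# Variety engineering sketch: attenuate upward-flowing detail, amplify
# downward-flowing decisions.
readings = [18.2, 18.4, 25.1, 18.3, 18.5, 24.8]   # high-variety detail from the operation

def filter_F(readings: list[float]) -> str:
    """Attenuator: collapse many readings into one short-term summary (long-term patterns are lost)."""
    return "too hot" if max(readings) > 24 else "ok"

def amplifier(decision: str) -> list[str]:
    """Amplifier: expand one managerial decision into several concrete actions (not all necessarily relevant)."""
    actions = {"too hot": ["open windows", "close storm doors", "lower thermostat"], "ok": []}
    return actions[decision]

summary = filter_F(readings)
print(summary, "->", amplifier(summary))
```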

Manager or consultant as a “problem solver”.

■ Problems do appear in an operational context.

■ In a system we may have several interlocking operational centers, each managed by a management team guided by its own particular local model.

■ Therefore, attempting to “solve” a single problem may cause more problems to surface, resulting in a “mess”.

■ A manager’s job is not to “solve” problems but to do “mess management”: to steer an operational unit through plausible mess-type situations.

Some basic laws of Cybernetics:

1.  Law of self-organization: A complex system tends to maintain its organization or tends to organize itself.

Interaction among the parts is the crucial element. Proper matching of parts is essential for self-organization and for growth.

1A. Complex systems have a set of stability points separated by regions of instabilities.

To push a system beyond its current level of maturity, you must push it past the unstable region that borders its current stable state.
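A sketch of these two statements together (the potential and the push sizes are invented for illustration): a system with two stable points separated by an unstable region falls back under a small push and reaches the new stable point only when pushed past the instability.

```python
# Two stable points (x = -1 and x = +1) separated by an unstable point (x = 0),
# modelled as gradient descent on the double-well potential V(x) = (x^2 - 1)^2.
def settle(x: float, steps: int = 2000, rate: float = 0.01) -> float:
    for _ in range(steps):
        x -= rate * 4 * x * (x * x - 1)   # follow -dV/dx toward the nearest stable point
    return round(x, 3)

print(settle(-1.0 + 0.4))   # small push: falls back to the old stable point, -1.0
print(settle(-1.0 + 1.4))   # push past the unstable region: settles at the new point, +1.0
```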

2.  Law of Feedback: The output of a complex system is dominated by the feedback and, within wide limits, the input is irrelevant.

E.g., in a class we want to teach “mathematics”. The feedback loop ought to measure the students’ mathematical reasoning capability. Some professors would use grades, personal enthusiasm, charisma, and the students’ own excitement at intellectual growth as feedback to produce mathematical reasoning ability. Most professors may not bother, suggesting that (a) mathematical reasoning is not an important output of the class, or (b) it may be important for some, but not for most; if it were, we would have appropriate feedback loops for it.

2A. All outputs that are important to the system will have associated feedback loops.

Equivalently: any desired result that has no systemic feedback loop will not be achieved; any system that lacks a feedback loop for a desired output is hopelessly lost.

3. Law of requisite variety: Given a system and some controller of that system, the amount of control attainable is absolutely limited by the variety of the controller.