Model of hierarchical complexity

The model of hierarchical complexity is a framework for scoring how complex a behavior is. It quantifies the order of hierarchical complexity of a task based on mathematical principles of how the information is organized and of information science. This model has been developed by Michael Commons and others since the 1980s.

Overview

The model of hierarchical complexity (MHC), which has been presented as a formal theory,[1] is a framework for scoring how complex a behavior is. Developed by Michael Lamport Commons,[2] it quantifies the order of hierarchical complexity of a task based on mathematical principles of how the information is organized,[3] and of information science.[4] Its forerunner was the General Stage Model.[5] It is a model in mathematical psychology.

Behaviors that may be scored include those of individual humans or their social groupings (e.g., organizations, governments, societies), animals, or machines. It enables scoring the hierarchical complexity of task accomplishment in any domain. It is based on the very simple notions that higher-order task actions are (a) defined in terms of the next lower ones (creating hierarchy), (b) organize those lower-order actions, and (c) do so in a non-arbitrary way (which differentiates them from simple chains of behavior and ensures a match between the model-designated orders and real-world orders). It is cross-culturally and cross-species valid. The reason it applies cross-culturally is that the scoring is based on the mathematical complexity of the hierarchical organization of information. Scoring does not depend upon the content of the information (e.g., what is done, said, written, or analyzed) but upon how the information is organized.

The MHC is a non-mentalistic model of developmental stages. It specifies 15 orders of hierarchical complexity and their corresponding stages. It differs from previous proposals about developmental stage applied to humans.[6] Instead of attributing behavioral changes across a person's age to the development of mental structures or schemata, this model posits that sequences of task behaviors form hierarchies that become increasingly complex. Because less complex tasks must be completed and practiced before more complex tasks can be acquired, this accounts for the developmental changes seen, for example, in individual persons' performance of complex tasks. (For example, a person cannot perform arithmetic until the numeral representations of numbers are learned, and cannot multiply sums of numbers until addition is learned.) Furthermore, previous theories of stage have confounded the stimulus and response in assessing stage by simply scoring responses and ignoring the task or stimulus.

The model of hierarchical complexity separates the task or stimulus from the performance. The participant's performance on a task of a given complexity represents the stage of developmental complexity.

Vertical complexity of tasks performed

One major basis for this developmental theory is task analysis. The study of ideal tasks, including their instantiation in the real world, has been the basis of the branch of stimulus control called psychophysics. Tasks are defined as sequences of contingencies, each presenting stimuli and each requiring a behavior or a sequence of behaviors that must occur in some non-arbitrary fashion. The complexity of behaviors necessary to complete a task can be specified using the horizontal complexity and vertical complexity definitions described below. Behavior is examined with respect to the analytically-known complexity of the task.

Tasks are quantal in nature. They are either completed correctly or not completed at all. There is no intermediate state (tertium non datur). For this reason, the Model characterizes all stages as P-hard and functionally distinct. The orders of hierarchical complexity are quantized like the electron atomic orbitals around the nucleus. Each task difficulty has an order of hierarchical complexity required to complete it correctly, corresponding to the atomic Slater eigenstate. Since tasks of a given quantified order of hierarchical complexity require actions of a given order of hierarchical complexity to perform them, the stage of the participant's task performance is equivalent to the order of complexity of the successfully completed task. The quantal feature of tasks is thus particularly instrumental in stage assessment because the scores obtained for stages are likewise discrete.

Every task contains a multitude of subtasks (Overton, 1990). When the subtasks are carried out by the participant in a required order, the task in question is successfully completed. Therefore, the model asserts that all tasks fit in some configured sequence of tasks, making it possible to precisely determine the hierarchical order of task complexity. Tasks vary in complexity in two ways: either as horizontal (involving classical information); or as vertical (involving hierarchical information).

Horizontal complexity

Classical information describes the number of "yes–no" questions it takes to do a task. For example, if one asked a person across the room whether a penny came up heads when they flipped it, their saying "heads" would transmit 1 bit of "horizontal" information. If there were 2 pennies, one would have to ask at least two questions, one about each penny. Hence, each additional 1-bit question would add another bit. Let us say they had a four-faced top with the faces numbered 1, 2, 3, and 4. Instead of spinning it, they tossed it against a backboard as one does with dice in a game of craps. Again, there would be 2 bits. One could ask them whether the face had an even number. If it did, one would then ask if it were a 2. Horizontal complexity, then, is the sum of bits required by just such tasks as these.
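
Horizontal complexity can be summarized as the number of yes–no questions needed to single out one of several equally likely outcomes. A minimal sketch in Python, assuming equally likely outcomes (the function name is illustrative, not from the source):

```python
import math

def horizontal_complexity_bits(num_outcomes: int) -> float:
    """Bits of classical information needed to identify one of
    `num_outcomes` equally likely outcomes (the number of yes-no questions)."""
    return math.log2(num_outcomes)

# One coin flip: 2 outcomes -> 1 bit.
print(horizontal_complexity_bits(2))   # 1.0
# Two coins: 4 equally likely joint outcomes -> 2 bits (one question per coin).
print(horizontal_complexity_bits(4))   # 2.0
# A four-faced top: 4 faces -> 2 bits (e.g., "is it even?", then "is it 2?").
print(horizontal_complexity_bits(4))   # 2.0
```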

Vertical complexity

Hierarchical complexity refers to the number of recursions that the coordinating actions must perform on a set of primary elements. Actions at a higher order of hierarchical complexity: (a) are defined in terms of actions at the next lower order of hierarchical complexity; (b) organize and transform the lower-order actions (see Figure 2); (c) produce organizations of lower-order actions that are qualitatively new and not arbitrary, and cannot be accomplished by those lower-order actions alone. Once these conditions have been met, we say the higher-order action coordinates the actions of the next lower order.

To illustrate how lower actions get organized into more hierarchically complex actions, let us turn to a simple example. Completing the entire operation 3 × (4 + 1) constitutes a task requiring the distributive act. That act non-arbitrarily orders adding and multiplying to coordinate them. The distributive act is therefore one order more hierarchically complex than the acts of adding and multiplying alone; it indicates the singular proper sequence of the simpler actions. Although simply adding results in the same answer, people who can do both display a greater freedom of mental functioning. Additional layers of abstraction can be applied. Thus, the order of complexity of the task is determined through analyzing the demands of each task by breaking it down into its constituent parts.

The hierarchical complexity of a task refers to the number of concatenation operations it contains, that is, the number of recursions that the coordinating actions must perform. An order-three task has three concatenation operations. A task of order three operates on one or more tasks of vertical order two and a task of order two operates on one or more tasks of vertical order one (the simplest tasks).
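
The recursive character of vertical complexity can be sketched in code. The following is a minimal illustration, assuming tasks are represented as trees in which a higher-order task non-arbitrarily coordinates its next-lower-order subtasks. It computes relative orders only (the absolute MHC orders 0–15 depend on where the primitive elements sit), and the class and function names are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """A task that non-arbitrarily coordinates zero or more next-lower-order subtasks."""
    name: str
    subtasks: List["Task"] = field(default_factory=list)

def order_of_hierarchical_complexity(task: Task) -> int:
    """Order 1 for a primitive task; each layer of coordination adds one order."""
    if not task.subtasks:
        return 1
    return 1 + max(order_of_hierarchical_complexity(s) for s in task.subtasks)

# 3 x (4 + 1): the distributive act coordinates adding and multiplying,
# so it is one order more hierarchically complex than either act alone.
adding = Task("add")
multiplying = Task("multiply")
distributing = Task("distribute", [adding, multiplying])
print(order_of_hierarchical_complexity(distributing))  # 2 (relative to the primitives used here)
```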

Stages of development

The notion of stage is fundamental in the description of human, organismic, and machine evolution.[7] Previously it has been defined in some ad hoc ways. Here, it is described formally in terms of the Model of Hierarchical Complexity (MHC).

Formal definition of stage

Since actions are defined inductively, so is the function h, known as the order of hierarchical complexity. To each action A, we wish to associate a notion of that action's hierarchical complexity, h(A). Given a collection of actions 𝒜 and a participant S performing the actions in 𝒜, the stage of performance of S on 𝒜 is the highest order of the actions in 𝒜 completed successfully at least once, i.e., stage(S, 𝒜) = max{h(A) | A ∈ 𝒜 and A completed successfully by S}. Thus, the notion of stage is discontinuous, having the same transitional gaps as the orders of hierarchical complexity. This is in accordance with previous definitions.[8]
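
A minimal sketch of the definition above, assuming the order function h and the record of successful completions are supplied by the scorer; the task names and orders in the example are hypothetical:

```python
def stage_of_performance(actions, h, completed_successfully):
    """stage(S, A) = max{ h(A) : A in the collection and S completed A successfully }.

    actions: collection of actions/tasks attempted by participant S
    h: function giving each action's order of hierarchical complexity
    completed_successfully: predicate, True if S completed the action at least once
    """
    orders = [h(a) for a in actions if completed_successfully(a)]
    return max(orders) if orders else None

# Illustrative (hypothetical) data: tasks attempted by S and their orders.
attempted = {"counting": 6, "long division": 8, "algebra with one unknown": 10}
passed = {"counting", "long division"}
print(stage_of_performance(attempted, h=attempted.get,
                           completed_successfully=lambda a: a in passed))  # 8
```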

Because MHC stages are conceptualized in terms of the hierarchical complexity of tasks rather than in terms of mental representations (as in Piaget's stages), the highest stage represents successful performances on the most hierarchically complex tasks rather than intellectual maturity. Table 1 gives descriptions of each stage.

Stages of hierarchical complexity

Table 1. Stages described in the Model of Hierarchical Complexity
Order or stage | What they do | How they do it | End result
0 – calculatory | Exact computation only, no generalization | Human-made programs manipulate 0, 1, not 2 or 3 | Minimal human result. Literal, unreasoning computer programs (at Turing's alpha layer) act in a way analogous to this stage.
1 – sensory or motor | Discriminate in a rote fashion, stimuli generalization, move | Move limbs, lips, toes, eyes, elbows, head; view objects or move | Discriminative establishing and conditioned reinforcing stimuli
2 – circular sensory-motor | Form open-ended proper classes | Reach, touch, grab, shake objects, circular babble | Open-ended proper classes, phonemes, archiphonemes
3 – sensory-motor | Form concepts | Respond to stimuli in a class successfully and non-stochastically | Morphemes, concepts
4 – nominal | Find relations among concepts; use names | Find relations among concepts; use names | Single words: ejaculatives & exclamations, verbs, nouns, number names, letter names
5 – sentential | Imitate and acquire sequences; follow short sequential acts | Generalize match-dependent task actions; chain words | Various forms of pronouns: subject (I), object (me), possessive adjective (my), possessive pronoun (mine), and reflexive (myself) for various persons (I, you, he, she, it, we, y'all, they)
6 – preoperational | Make simple deductions; follow lists of sequential acts; tell stories | Count events and objects; connect the dots; combine numbers and simple propositions | Connectives: as, when, then, why, before; products of simple operations
7 – primary | Simple logical deduction and empirical rules involving time sequence; simple arithmetic | Adds, subtracts, multiplies, divides, counts, proves, does series of tasks on own | Times, places, counts acts, actors, arithmetic outcome, sequence from calculation
8 – concrete | Carry out full arithmetic, form cliques, plan deals | Does long division, short division, follows complex social rules, ignores simple social rules, takes and coordinates perspective of other and self | Interrelations, social events, what happened among others, reasonable deals, history, geography
9 – abstract | Discriminate variables such as stereotypes; logical quantification (none, some, all) | Form variables out of finite classes; make and quantify propositions | Variable time, place, act, actor, state, type; quantifiers (all, none, some); categorical assertions (e.g., "We all die")
10 – formal | Argue using empirical or logical evidence; logic is linear, one-dimensional | Solve problems with one unknown using algebra, logic and empiricism | Relationships (for example: causality) are formed out of variables; words: linear, logical, one-dimensional, if then, thus, therefore, because; correct scientific solutions
11 – systematic | Construct multivariate systems and matrices | Coordinates more than one variable as input; considers relationships in contexts | Events and concepts situated in a multivariate context; systems are formed out of relations; systems: legal, societal, corporate, economic, national
12 – metasystematic | Construct multi-systems and metasystems out of disparate systems | Create metasystems out of systems; compare systems and perspectives; name properties of systems: e.g. homomorphic, isomorphic, complete, consistent (such as tested by consistency proofs), commensurable | Metasystems and supersystems are formed out of systems of relationships
13 – paradigmatic | Fit metasystems together to form new paradigms | Synthesize metasystems | Paradigms are formed out of multiple metasystems
14 – cross-paradigmatic | Fit paradigms together to form new fields | Form new fields by crossing paradigms | New fields are formed out of multiple paradigms
15 – meta-cross-paradigmatic (performative-recursive)[9] | Observes and understands that, by virtue of the cross-paradigms that account for their dynamics, disparate entities ranging from the universe, to paradigms, to species, to social metasystems, to individuals, for example, by their nature and/or with volition, perform recursive procession actions upon themselves, which transform them while and by performing each recursion; transformation may be "positive" or "negative." | |

Relationship with Piaget's theory

There are some commonalities between the Piagetian and Commons's notions of stage, and many more differences. In both, one finds:

  1. Higher-order actions are defined in terms of lower-order actions. This forces the hierarchical nature of the relations, makes the higher-order tasks include the lower ones, and requires that lower-order actions are hierarchically contained within the definitions of the higher-order tasks.
  2. Actions of a higher order of complexity organize those lower-order actions. This makes them more powerful. Lower-order actions are organized by the actions with a higher order of complexity, i.e., the more complex tasks.

What Commons et al. (1998) have added includes:

  1. Actions of a higher order of complexity organize those lower-order actions in a non-arbitrary way.

This makes it possible for the Model's application to meet real-world requirements, both empirical and analytic. Arbitrary organization of lower-order actions, which is possible in Piagetian theory despite its hierarchical definition structure, leaves the functional relations among tasks of differing complexity ill-defined.

Moreover, the model is consistent with the neo-Piagetian theories of cognitive development. According to these theories, progression to higher stages or levels of cognitive development is caused by increases in processing efficiency and working memory capacity. That is, higher-order stages place increasingly higher demands on these functions of information processing, so that their order of appearance reflects the information processing possibilities at successive ages (Demetriou, 1998).

The following dimensions are inherent in the application:

  1. Task and performance are separated.
  2. All tasks have an order of hierarchical complexity.
  3. There is only one sequence of orders of hierarchical complexity.
  4. Hence, there is structure of the whole for ideal tasks and actions.
  5. There are transitional gaps between the orders of hierarchical complexity.
  6. Stage is defined as the most hierarchically complex task solved.
  7. There are discrete gaps in Rasch Scaled Stage of Performance.
  8. Performance stage differs from one task area to another.
  9. There is no structure of the whole—horizontal décalage—for performance. It is not inconsistency in thinking within a developmental stage. Décalage is the normal modal state of affairs.

Orders and corresponding stages

The MHC specifies 15 orders of hierarchical complexity and their corresponding stages, showing that each of Piaget's substages is, in fact, a robustly hard stage. Commons also adds four postformal stages: Systematic stage 11, Metasystematic stage 12, Paradigmatic stage 13, and Cross-paradigmatic stage 14. It may be that Piaget's consolidated formal stage is the same as the systematic stage. There is one other difference in the orders and stages: the sentential stage 5 was added, at Fischer's suggestion (1981, personal communication), citing Biggs & Collis (1982). The sequence is as follows: (0) calculatory, (1) sensory & motor, (2) circular sensory-motor, (3) sensory-motor, (4) nominal, the new (5) sentential, (6) preoperational, (7) primary, (8) concrete, (9) abstract, (10) formal, and the four postformal stages: (11) systematic, (12) metasystematic, (13) paradigmatic, and (14) cross-paradigmatic. The first four stages (0–3) correspond to Piaget's sensorimotor stage, at which infants and very young children perform. Adolescents and adults can perform at any of the subsequent stages. MHC stages 4 through 5 correspond to Piaget's pre-operational stage; 6 through 8 correspond to his concrete operational stage; and 9 through 11 correspond to his formal operational stage.

The three highest stages in the MHC are not represented in Piaget's model. These stages have extensively influenced the field of Positive Adult Development. Few individuals perform at stages above formal operations. More complex behaviors characterize multiple-system models.[10] Some adults are said to develop alternatives to, and perspectives on, formal operations: they use formal operations within a "higher" system of operations and transcend the limitations of formal operations. In any case, these theories argue for, and present converging evidence that, some adults use forms of reasoning more complex than the formal operations with which Piaget's model ended. However, these innovations cannot exactly be labelled postformal thought as such; see a recent critique.[11]

Empirical research using the model

The MHC has a broad range of applicability. The mathematical foundation of the model makes it an excellent research tool to be used by anyone examining task performance that is organized into stages. It is designed to assess development based on the order of complexity which the individual utilizes to organize information. The MHC offers a singular mathematical method of measuring stages in any domain because the tasks presented can contain any kind of information. The model thus allows for a standard quantitative analysis of developmental complexity in any cultural setting. Other advantages of this model include its avoidance of mentalistic or contextual explanations, as well as its use of purely quantitative principles which are universally applicable in any context.

The Model has been applied to a large range of domains (see the list of examples below). In one representative study, Commons, Goodheart, and Dawson (1997) found, using Rasch (1980) analysis, that the hierarchical complexity of a given task predicts the stage of performance, the correlation being r = 0.92. Correlations of similar magnitude have been found in a number of the studies.

List of examples

List of examples of tasks studied using the Model of Hierarchical Complexity or Fischer’s Skill Theory (1980):

Michael Commons's Model of Hierarchical Complexity Patents

Intelligent control with hierarchical stacked neural networks

Patent number: 9129218

Type: Grant

Filed: July 18, 2014

Issued: September 8, 2015

Inventors: Michael Lamport Commons, Mitzi Sturgeon White

Invention Summary

This invention relates to the use of hierarchical stacked neural networks that develop new tasks and learn by processing information in a way modeled on how cognitive development proceeds in the human brain, applied to identifying atypical messages, for example, spam messages in email and similar services. Neural networks are useful in constructing systems that learn and make complex decisions in the same way the brain does.

This invention applies models of the ordered stages that the brain moves through during development, which enable it to execute increasingly complex tasks at higher stages, to the problem of identifying atypical messages such as email spam. In this process, the actions performed at a given stage are built by ordering, altering, and combining the tasks executed at the preceding stage. As a result, at each stage of development more complicated tasks can be executed than at the preceding stage.
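
As a rough illustration of the stacked architecture described here, the sketch below builds a stack in which each level operates only on the output of the level beneath it. The layer sizes, tanh activation, numpy implementation, and the reading of the top output as an "atypicality" score are assumptions for illustration, not details taken from the patent:

```python
import numpy as np

class StackedLevel:
    """One level in a (hypothetical) hierarchical stack: a small feed-forward layer
    whose output becomes the input of the next-higher level."""
    def __init__(self, in_dim: int, out_dim: int, rng: np.random.Generator):
        self.w = rng.normal(scale=0.1, size=(in_dim, out_dim))
        self.b = np.zeros(out_dim)

    def forward(self, x: np.ndarray) -> np.ndarray:
        return np.tanh(x @ self.w + self.b)

class HierarchicalStack:
    """Higher levels operate only on the outputs of the level below, mirroring the
    idea that higher-order tasks coordinate the tasks of the next lower order."""
    def __init__(self, dims):
        rng = np.random.default_rng(0)
        self.levels = [StackedLevel(d_in, d_out, rng) for d_in, d_out in zip(dims, dims[1:])]

    def forward(self, x: np.ndarray) -> np.ndarray:
        for level in self.levels:
            x = level.forward(x)   # each level's output feeds the level above it
        return x

# Illustrative use: a raw feature vector for a message passes through three levels.
stack = HierarchicalStack([32, 16, 8, 1])
score = stack.forward(np.random.default_rng(1).normal(size=32))
print(score)  # a single "atypicality" score at the top of the stack (toy example)
```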

Implications

It is an object of the present invention to provide hierarchical stacked neural networks that overcome the limitations of the neural networks of the prior art. It is another object of the present invention to provide linked but architecturally distinct hierarchical stacked neural networks that simulate the brain's capacity to organize lower-order actions hierarchically by combining, ordering, and transforming the actions to produce new, more complex higher-stage actions.

Another aim of the invention is to provide hierarchical stacked neural networks that are ordered in a non-arbitrary way, so that tasks executed by neural networks at a higher level are the result of a concatenation of tasks executed by lower-level networks in the hierarchy. In addition, the tasks executed by a neural network in the stacked hierarchy are a result of amalgamating, ordering, and altering tasks executed by the neural network that precedes it at a lower level in the stacked hierarchy. Furthermore, another aim of the invention is that neural networks at higher levels in the hierarchy execute more complex actions and tasks than the neural networks that precede them at lower levels in the hierarchy.

Intelligent control with hierarchical stacked neural networks

Patent number: 9053431

Type: Grant

Filed: July 2, 2014

Issued: June 9, 2015

Inventor: Michael Lamport Commons

This is a system and a method of identifying an abnormally deviant message. An ordered set of words within the message is recognized. The set of words observed within the message is associated with a set of anticipated words, the set of anticipated words having semantic characteristics. A set of grammatical structures illustrated in the message is recognized, based on the ordered set of words and the semantic characteristics of the corresponding set of anticipated words. A cognitive noise vector, comprising a quantitative measure of the deviation between the grammatical structures illustrated in the message and an expected measure of grammatical structures for a message of that type, is then discerned. The cognitive noise vector could be processed by higher levels of the neural network and/or an outer processor.
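
A toy sketch of how a cognitive noise vector might be formed, assuming the message's grammatical structures have already been tagged and that anticipated structure frequencies for messages of this type are available; the structure labels and numbers below are invented for illustration:

```python
from collections import Counter

def cognitive_noise_vector(observed_structures, anticipated_frequencies):
    """Toy sketch: the 'noise' for each grammatical structure is how far its
    observed frequency in the message deviates from the anticipated frequency."""
    observed = Counter(observed_structures)
    total = sum(observed.values()) or 1
    return {
        structure: observed[structure] / total - expected
        for structure, expected in anticipated_frequencies.items()
    }

# Hypothetical structures tagged for one message, and anticipated proportions
# for messages of this type (values invented for illustration).
observed = ["subj-verb-obj", "subj-verb-obj", "verb-verb", "fragment"]
anticipated = {"subj-verb-obj": 0.8, "verb-verb": 0.0, "fragment": 0.05}
print(cognitive_noise_vector(observed, anticipated))
# Large positive entries (e.g., "verb-verb", "fragment") suggest a deviant message.
```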

Implications

An aim of this invention is to provide hierarchical stacked neural networks that are ordered in a non-arbitrary way, so that tasks executed by neural networks at a higher level are the result of a concatenation of tasks executed by lower-level networks in the hierarchy. That is, lower-level neural networks send output that is useful as input to the higher levels.

This invention provides an architecture of hierarchically linked neural networks, created for spam filtering, stacked one on top of the other. Every neural network in the hierarchical stack keeps track not only of the data it can glean from the input, as in prior-art neural networks, but also concentrates on "cognitive noise" and develops an error vector or some other means of determining the degree of imperfection in the information transmitted.

In this invention, higher-level neural networks interact with lower-level neural networks in the hierarchical stacked neural network. The higher-level neural networks respond to the lower-level neural networks to calibrate connection weights, thus improving the precision of the tasks executed at the lower levels. The higher-level neural networks can also demand that additional information be fed to the lowest neural network in the stacked hierarchy.
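
The top-down calibration of lower-level connection weights could look roughly like the following sketch, which uses an invented, crude adjustment rule purely to illustrate the idea of a higher level feeding an error signal back down the stack; it is not the patent's actual mechanism:

```python
import numpy as np

def top_down_calibration(lower_weights, lower_output, higher_target, lr=0.01):
    """Toy sketch of a higher level nudging a lower level's connection weights.

    lower_weights: (in_dim, out_dim) weight matrix of the lower-level network
    lower_output:  what the lower level actually produced for some input
    higher_target: what the higher level decided the lower level should have produced
    """
    error = higher_target - lower_output            # feedback signal from the higher level
    # Scale the columns of the lower-level weights in proportion to the error
    # on each of its output units (a crude, illustrative calibration rule).
    return lower_weights + lr * lower_weights * error

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))
out = np.array([0.2, 0.9, 0.1])
target = np.array([0.0, 1.0, 0.0])
w_new = top_down_calibration(w, out, target)
print(np.round(w_new - w, 4))   # small adjustments pushed down from the higher level
```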

Intelligent control with hierarchical stacked neural networks

Patent number: 9015093

Type: Grant

Filed: October 25, 2011

Issued: April 21, 2015

Inventor: Michael Lamport Commons

This is a method of processing information which involves: receiving a message; processing the message with a trained artificial neural network based processor that has at least one set of outputs representing information in a non-arbitrary organization of actions, based on the architecture of the processor and its training; representing as a noise vector at least one data pattern in the message which is represented incompletely in the non-arbitrary organization of actions; analyzing the noise vector distinctly from the trained artificial neural network; searching at least one database; and developing an output in dependence on said analyzing and said searching.

The present invention relates to the field of cognitive neural networks, and to hierarchical stacked neural networks configured to imitate human intelligence.

Implications

One goal of the invention is to provide linked but architecturally distinguishable hierarchical stacked neural networks that emulate the capacity of the brain to rearrange lower-order actions hierarchically by combining, ordering, and changing the tasks to produce new, highly complex higher-stage actions. These lower levels of neural networks complete simpler tasks than the higher levels.

Furthermore, this invention also provides hierarchical stacked neural networks that are ordered in a non-arbitrary manner, so that tasks executed by neural networks at a higher level are the consequence of a concatenation of tasks executed by lower-level networks in the hierarchy. That is, lower-level neural networks provide output that is useful as input to the higher levels.

Intelligent control with hierarchical stacked neural networks

Patent number: 8788441

Type: Grant

Filed: November 3, 2009

Issued: July 22, 2014

Inventors: Michael Lamport Commons, Mitzi Sturgeon White

It is a continuation of the previous patent.

Summary and Implications

The goal of the invention is to provide hierarchical stacked neural networks that overcome the limitations of the neural networks of the prior art. Another aim of the invention is to provide associated but distinguishable hierarchical stacked neural networks that imitate the brain's capacity to arrange lower-order actions hierarchically by combining, ordering, and changing the tasks to develop more complex higher-stage tasks.

Another aim is to provide hierarchical stacked neural networks which are ordered in a non-arbitrary way, so that actions executed by neural networks at a higher level are the result of a concatenation of tasks executed by lower-level networks in the hierarchy. In addition, another aim is that the tasks executed by a neural network in the stacked hierarchy are a result of amalgamating, ordering, and altering tasks executed by the neural network which precedes it at a lower level in the stacked hierarchy.

Furthermore, neural networks at higher levels in the hierarchy execute more complex tasks than the neural networks that precede them at a lower level in the hierarchy.

This invention provides an architecture of hierarchically linked, distinguishable neural networks stacked one on top of the other. Every neural network in the hierarchical stack uses the neuron-based methodology of prior-art neural networks. The tasks that each neural network executes, and the order in which they are executed, are based on human cognitive development.

Intelligent control with hierarchical stacked neural networks

Patent number: 8775341

Type: Grant

Filed: October 25, 2011

Issued: July 8, 2014

Inventor: Michael Lamport Commons

It is a structure and method of identifying an abnormal message. An organized set of words within the message is identified. The set of words observed within the message is associated with a corresponding set of anticipated words, the set of anticipated words having semantic characteristics. A set of grammatical structures illustrated in the message is identified, based on the ordered set of words and the semantic characteristics of the corresponding set of anticipated words. A cognitive noise vector encompassing a quantitative measure of the deviation between the grammatical structures illustrated in the message and an anticipated measure of grammatical structures for a message of that type is then discerned. The cognitive noise vector may be processed by higher levels of the neural network and/or an outer processor.

Implications

In this invention, lower-level neural networks interact with higher-level neural networks in the hierarchical stacked neural network. The higher-level neural networks respond to the lower-level neural networks to regulate coupling weights, thereby boosting the precision of the tasks executed at the lower levels. The higher-level neural networks can also demand that more information be fed to the lowest neural network in the stacked hierarchy.

Another aim of this invention is to deliver linked but architecturally distinguishable hierarchical stacked neural networks which imitate the brain's capacity to organize lower-order actions hierarchically by amalgamating, ordering, and altering the tasks to develop complex higher-stage actions. As a result, lower levels of neural networks complete easier tasks than higher levels. For example, in spam filtering, lower levels would concentrate on identifying text as text, distinguishing text into letters, and arranging text into strings of letters, while higher-level neural networks would identify and understand words, and the highest levels would identify a surplus of poorly structured words or sentences.

Furthermore, another goal of the invention is to provide hierarchical stacked neural networks that are ordered in a non-arbitrary manner, so that tasks executed by neural networks at a higher level are the result of a coupling of tasks executed by lower-level networks in the hierarchy. That is, lower-level neural networks can give output that is useful as input to the higher levels.
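
The division of labor across levels described above (letters, then strings of letters, then words, then a measure of poorly structured words) can be illustrated with a toy pipeline; the vocabulary, the sample message, and the 0.5 threshold are invented for illustration:

```python
import re

def level_1_letters(raw: str) -> str:
    """Lowest level: keep only letter-like characters and whitespace."""
    return re.sub(r"[^a-zA-Z\s]", " ", raw)

def level_2_strings(cleaned: str) -> list:
    """Next level: arrange letters into strings of letters (candidate words)."""
    return cleaned.lower().split()

def level_3_words(tokens: list, vocabulary: set) -> float:
    """Higher level: what proportion of candidate words are not recognized words?"""
    if not tokens:
        return 0.0
    unknown = sum(1 for t in tokens if t not in vocabulary)
    return unknown / len(tokens)

# Hypothetical vocabulary and message; the 0.5 threshold is invented for illustration.
vocab = {"meeting", "tomorrow", "free", "win", "now", "the", "at"}
message = "W1n fr33 m0ney n0w!!!"
surplus = level_3_words(level_2_strings(level_1_letters(message)), vocab)
print(surplus, "-> flag as deviant" if surplus > 0.5 else "-> looks typical")
```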

Intelligent control with hierarchical stacked neural networks

Patent number: 7613663

Type: Grant

Filed: December 18, 2006

Issued: November 3, 2009

Inventors: Michael Lamport Commons, Mitzi Sturgeon White

Summary and Implications

The goal of the invention is to provide hierarchical stacked neural networks that overcome the limitations of the neural networks of the prior art. Another goal is to provide associated but architecturally different hierarchical stacked neural networks which imitate the brain's capacity to arrange lower-order actions hierarchically by incorporating, ordering, and altering the tasks to develop new, more complex higher-stage actions. This invention also provides hierarchical stacked neural networks which are ordered in a non-arbitrary manner, so that tasks executed by neural networks at a higher level are the consequence of a coupling of actions executed by lower-level networks in the hierarchy. Another aim is that the tasks executed by a neural network in the stacked hierarchy are a result of amalgamating, ordering, and altering tasks executed by the neural network that precedes it at a lower level in the stacked hierarchy. It is another aim of the model that neural networks at higher levels in the hierarchy execute more complex tasks than the neural networks that precede them at a lower level in the hierarchy.

Intelligent control with hierarchical stacked neural networks

Patent number: 7152051

Type: Grant

Filed: September 30, 2002

Issued: December 19, 2006

Inventors: Michael Lamport Commons, Mitzi Sturgeon White

Summary and Implications

The invention provides hierarchical stacked neural networks which overcome the limitations of the neural networks of the prior art. It also provides linked but architecturally distinct hierarchical stacked neural networks that imitate the capacity of the brain to organize lower-order actions hierarchically by ordering, combining, and altering the actions to develop more complex higher-stage tasks.

Moreover, the invention also provides hierarchical stacked neural networks which are ordered in a non-arbitrary manner, so that tasks performed by neural networks at a higher level are the result of a concatenation of actions executed by lower-level networks in the hierarchy. Another aim of the invention is that the tasks executed by a neural network in the stacked hierarchy are a consequence of amalgamating, ordering, and altering tasks executed by the neural network that precedes it at a lower level in the stacked hierarchy. In addition, another aim of the invention is that the neural networks at higher levels in the hierarchy execute more complex actions than the neural networks that precede them at a lower level in the hierarchy.[13]

Implications of the Model of Hierarchical Complexity

The model of hierarchical complexity developed by Michael Commons provides a standard way of examining the universal patterns of development and evolution. It is a quantitative behavioral developmental theory.[14] The model is used to score the complexity of behavior. Behaviors that may be scored include those of individual humans or their social groupings, other living beings, and/or machines. It leads to scoring the hierarchical complexity of task accomplishment in all kinds of domains. The idea behind the model is that higher-order task actions are (a) described in terms of the next lower ones (creating hierarchy), (b) organize those lower-order actions, and (c) do so in a non-arbitrary way (differentiating them from simple chains of behavior and causing a match between the model-designated orders and real-world orders). The scoring depends upon how the information is organized and not upon the content of the information (e.g., what is done, said, or written). The main idea behind the model was that mathematical psychology could be used to predict the future.

People and institutes from all the major continents of the world except Africa use the model of hierarchical complexity. Because the model is very simple and is based on analysis of tasks and not just performances, it is dynamic.[15]

Furthermore, with the help of the model, it is possible to quantify the occurrence and progression of transition processes in task performances at any order of hierarchical complexity.[16]

Criticisms

The descriptions of stages 13–15 have been described as insufficiently precise.[17]

References

  1. Commons & Pekker, 2007
  2. (Commons, Trudeau, Stein, Richards, & Krause, 1998)
  3. (Coombs, Dawes, & Tversky, 1970)
  4. (Commons & Richards, 1984a, 1984b; Lindsay & Norman, 1977; Commons & Rodriguez, 1990, 1993)
  5. (Commons & Richards, 1984a, 1984b)
  6. (e.g., Inhelder & Piaget, 1958)
  7. (Commons et al., 1998; Commons & Miller, 2001; Commons & Pekker, 2007)
  8. http://www.academia.edu/3210966/Toward_Defining_Order_15_and_Describing_Its_Performance_for_the_Model_of_Hierarchical_Complexity_draft_
  9. (Kallio, 1995; Kallio & Helkama, 1991)
  10. Kallio, E. (2011). Integrative thinking is the key: An evaluation of current research into the development of thinking in adults. Theory & Psychology, 21(6), 785–801.
  11. Evernden, R. (2013). Mastering Complexity to Drive EA Productivity. Cutter Consortium Executive Report, Vol. 16, No. 1. http://www.cutter.com/content/architecture/fulltext/reports/2013/01/index.html
  12. "Patents by Inventor Michael Lamport Commons - Justia Patents Database". patents.justia.com. Retrieved 2015-09-21.
  13. Commons, M.L. "Introduction to the Model of Hierarchical Complexity and Its Relationship to Postformal Action" (PDF).
  14. Commons, M. "Advances in the model of hierarchical complexity (MHC)" (PDF).
  15. Ross, S. "Fractal Model of Nonlinear Hierarchical Complexity: Measuring Transition Dynamics as Fractals of Themselves" (PDF).
  16. http://www.academia.edu/3210966/Toward_Defining_Order_15_and_Describing_Its_Performance_for_the_Model_of_Hierarchical_Complexity_draft_

Copyright permissions

Portions of this article are from Applying the Model of Hierarchical Complexity by Commons, M.L., Miller, P.M., Goodheart, E.A., Danaher-Gilpin, D., Locicero, A., Ross, S.N. Unpublished manuscript. Copyright 2007 by Dare Association, Inc. Available from Dare Institute, commons@tiac.net. Reproduced and adapted with permission of the publisher. Portions of this article are also from "Introduction to the Model of Hierarchical Complexity" by M.L. Commons, in the Behavioral Development Bulletin, 13, 1–6 (http://www.behavioral-development-bulletin.com/). Copyright 2007 Martha Pelaez. Reproduced with permission of the publisher.
