Self-Organization Far-From-Equilibrium

Hurricanes are examples of a system far-from-equilibrium as they are driven by runaway positive feedback loops

Far-from-equilibrium self-organization is a hypothesis that describes the process of self-organization as taking place in a critical phase transition space between order and chaos, when the system is far from its equilibrium.[1] The essence of the theory of far-from-equilibrium pattern formation is that new forms of organization emerge when a system is driven far from its stable basin of attraction.[2] Far-from-equilibrium behavior is ubiquitous. The scope of phenomena investigated makes the research of far-from-equilibrium systems an intrinsically interdisciplinary activity that crosses between the physics community and researchers in biology, chemistry, the social sciences, applied mathematics, meteorology and engineering.[3]

Organization

Organization is an ordered arrangement of the elements within a system that enables them to function. As such, we can loosely equate it to the concept of order. Both order and organization are highly abstract concepts, neither of which is well defined within the language of mathematics and science, but probably the most powerful method we have for formalizing them is through the theory of symmetry.
The theory of symmetry within mathematics is an ancient area of interest originally coming from classical geometry, but within modern mathematics and physics it has been abstracted to the concept of invariance.[4] In this way, symmetry describes how two things are the same under some transformation. For example, if we take two coins, one showing heads and the other tails, simply flipping one of the coins over will give it the same state as the other. Thus we do not need two pieces of information to describe the states within this system. We can describe it in terms of just one state and a flipping transformation that, when performed, gives us the other state.
Now suppose that instead of two coins we had an apple and an orange. There is no transformation we know of that can map an apple to an orange. They are different things. There is no trivial symmetry or order between them, and thus we need at least two distinct pieces of information to describe this system. Because this second system requires more bits of information to describe its state, we can say it has higher statistical entropy.[5] In this way we can talk about and quantify order and randomness in terms of information theory. Ordered systems can be described in terms of these transformations, which we encode in equations; ordered systems are governed by equations whereas random systems are not. Because there is no correlation between the elements' states in random systems, they are instead governed by probability theory, the branch of mathematics that analyzes random phenomena.
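To make this counting of information concrete, the short Python sketch below compares the number of bits needed to specify the state of the coin pair, where the flip symmetry constrains which configurations are possible, with the bits needed for two unrelated objects that must each be specified independently. The state labels and the helper function are illustrative assumptions, not part of the original example.

import math

def bits_to_specify(n_configurations: int) -> float:
    """Minimum number of bits needed to single out one of n equally likely configurations."""
    return math.log2(n_configurations)

# Two coins related by a flip symmetry: knowing one coin's face fixes the other,
# so only the configurations (heads, tails) and (tails, heads) can occur.
coin_pair_configs = [("H", "T"), ("T", "H")]

# Two unrelated objects, each of which can independently be in one of two states;
# no transformation maps one object's state onto the other's.
unrelated_configs = [(a, b) for a in ("apple-1", "apple-2") for b in ("orange-1", "orange-2")]

print(bits_to_specify(len(coin_pair_configs)))   # 1.0 bit  -- the symmetry halves the description
print(bits_to_specify(len(unrelated_configs)))   # 2.0 bits -- two independent pieces of information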

Order & Randomness

Complex systems are by any definition nonlinear. Complexity is always a product of an irreducible interaction or interplay between two or more things. If we can do away with this core dynamic and interplay, then we simply have a linear system. If the system is homogeneous and everything can be reduced to one level, then it might be a complicated system, but it is certainly not a complex system. Thus, one of the main findings of complexity theory is that complexity is found at what is sometimes called the interesting in-between.[6] If we take some parameter of a system, say its rate of change or its degree of diversity, and turn this parameter fully up, what we often get is randomness, a continuous change or total diversity of states without any pattern. If we turn it fully down, we get complete stasis and homogeneity with very stable and simple patterns. It is often the case that with too much order the system becomes governed by a simple set of symmetries, while too much disorder results in randomness and the system becomes subject to statistical regularities. It is only between the two that we get complexity. On either side of this, there is a single dominant regime or attractor that will come to govern the system's behavior.
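The article does not name a specific model, but the logistic map is a standard minimal illustration of this idea: a single parameter r that, turned down, gives a stable fixed point, turned up, gives effectively random wandering, and in between produces intricate periodic structure near the onset of chaos. The parameter values below are illustrative choices.

def logistic_trajectory(r: float, x0: float = 0.2, n_transient: int = 500, n_keep: int = 8):
    """Iterate x -> r*x*(1-x), discard transients, and return the long-run states visited."""
    x = x0
    for _ in range(n_transient):
        x = r * x * (1 - x)
    states = []
    for _ in range(n_keep):
        x = r * x * (1 - x)
        states.append(round(x, 4))
    return states

# Parameter turned "down": the system settles onto a single fixed point (complete stasis).
print("r = 2.80:", logistic_trajectory(2.80))
# The in-between region near the onset of chaos: a periodic pattern that keeps splitting.
print("r = 3.55:", logistic_trajectory(3.55))
# Parameter turned "up": aperiodic, effectively random wandering over the whole interval.
print("r = 3.99:", logistic_trajectory(3.99))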

Edge-Of-Chaos

Complexity is often a product of an in-between state, falling at the boundary or edges of stable attractors

It is only when a system is far from its equilibrium, away from one of these stable attractor regimes, that we get a phase transition area representing the interplay between the two regimes. In this space, the system is much more sensitive to small fluctuations that can take it into either basin of attraction.[7] This phase transition area is also called the edge-of-chaos. The phrase edge-of-chaos was first used to describe a transition phenomenon discovered by the computer scientist Christopher Langton, who found a small area within the rule space of cellular automata conducive to producing cellular automata capable of universal computation. At around the same time, the physicist James Crutchfield and others used the phrase "onset of chaos" to describe more or less the same concept.
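Langton's original experiments varied a parameter over large cellular automaton rule spaces; as a simpler stand-in, the sketch below runs three elementary cellular automata that are commonly used to illustrate the same three regimes: a rule that produces a simple uniform pattern, a rule that behaves effectively randomly, and rule 110, which produces interacting localized structures and is known to be capable of universal computation. The grid width and number of steps are arbitrary choices.

def step(cells, rule):
    """One synchronous update of an elementary cellular automaton (wrap-around edges)."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def run(rule, width=64, steps=24):
    """Evolve from a single seed cell and render the space-time diagram as text."""
    cells = [0] * width
    cells[width // 2] = 1
    rows = []
    for _ in range(steps):
        rows.append("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)
    return "\n".join(rows)

# Rule 250: quickly produces a simple, uniform ordered pattern (too much order).
# Rule 30:  effectively random behavior (too much disorder).
# Rule 110: the in-between regime -- localized structures that propagate and interact.
for rule in (250, 30, 110):
    print(f"--- rule {rule} ---")
    print(run(rule))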
In the sciences in general, the phrase has come to be used as a metaphor for the idea that some physical, biological and social systems operate in a region between order and either complete randomness or chaos, where complexity is maximal.[8] The edge-of-chaos concept remains mainly theoretical and somewhat controversial, but it is often posited that self-organization and evolution can only really happen in this phase transition space.
There may be a number of different interpretations for why this is so, but one way of understanding it is that self-organization requires entropy and evolution requires variety. With external intervention, we can take a well-ordered system and simply reconfigure it by transferring energy to it from some other external source; in this way we go from one ordered regime to another without the need for entropy to enable the process, only an input of energy. But as we know, self-organization does not happen in this fashion. It is generated internally at the local level, and this process requires the presence of entropy and randomness so that elements are available for reconfiguration into a new regime through feedback loops that originate as weak signals or fluctuations.[9]

Different Theories

A number of different researchers have posited theories around this process of self-organization far-from-equilibrium. The principle of "order from noise" was formulated by the cybernetician Heinz von Foerster in 1960. It notes that self-organization is facilitated by random perturbations and noise that let the system explore a variety of states in its state space. A similar principle was presented by Ilya Prigogine as "order through fluctuations" or "order out of chaos."[10] The physicist Per Bak also looked at this phenomenon in terms of what he called self-organized criticality, the mechanism by which complex systems tend to maintain themselves at this critical edge. Many of these theories point to the need for both entropy and variety in order for the system to keep adapting and evolving over a prolonged period of time.
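Per Bak's canonical example is the sandpile model, in which slowly adding grains drives the pile toward a critical state where avalanches of all sizes occur. The sketch below is a minimal version of that model; the grid size, number of grains, and toppling threshold are illustrative choices rather than anything specified in the article.

import random

def sandpile_avalanches(size=20, grains=5000, threshold=4, seed=1):
    """Drop grains one at a time onto a grid; any site reaching the threshold topples,
    passing one grain to each neighbor (grains falling off the edge are lost).
    Returns the avalanche size (number of topplings) triggered by each grain."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(grains):
        i, j = random.randrange(size), random.randrange(size)
        grid[i][j] += 1
        topplings = 0
        unstable = [(i, j)]
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < threshold:
                continue
            grid[x][y] -= threshold
            topplings += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    unstable.append((nx, ny))
        avalanche_sizes.append(topplings)
    return avalanche_sizes

sizes = [s for s in sandpile_avalanches() if s > 0]
print(f"avalanches: {len(sizes)}, largest: {max(sizes)}, mean size: {sum(sizes) / len(sizes):.1f}")
# After an initial loading phase the pile hovers near its critical state, and the avalanche
# sizes span many scales -- the heavy-tailed signature of self-organized criticality.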

Cite this article as: Joss Colchester, "Self-Organization Far-From-Equilibrium," in Complexity Labs, July 15, 2014, http://complexitylabs.io/self-organization-far-from-equilibrium/.

  1. View Source
  2. View Source
  3. View Source
  4. View Source
  5. View Source
  6. View Source
  7. View Source
  8. View Source
  9. View Source
  10. View Source