Abstract

Discussions of synchronicity tend to focus either on the meaningful content of the experience, or on speculation about possible mechanisms underlying the phenomena. The present paper suggests that the symbolic or meaningful content of some synchronistic phenomena is itself governed by identifiable dynamics associated with the emergence of symbol systems generally. Specifically, these dynamics are associated with complex dynamical systems theory and give rise to phenomena governed by power laws such as Zipf’s law. It is suggested that synchronicities, which display distinctly symbolic features, behave in ways that conform to power-law distributions in which highly coupled systems form rare outlier aggregations referred to as "dragon kings". This terminology is explained and related to the experience of synchronistic phenomena.


  • GalacticFederation
    3 years ago

    Zipf’s Law

    Had he known of them, Jung might have appreciated a set of mathematical formulations that have the important quality of describing a wide variety of phenomena with no intrinsic connection to one another. There is, if you will, something transcendental about them, in the philosophical sense of being universal conditions of the world, without reference to specific states of affairs. The ones I am particularly interested in, in relation to the symbol, are power-law distributions, particularly Zipf’s law, but also including the fractal geometry of Mandelbrot (1981, 1997) and the scale-free structures of networks. Let me note that all of these patterns involve, amongst other characteristics, a relationship to scaling phenomena, or what is usually referred to as scale-invariance. This means that these patterns apply regardless of the scale at which the phenomena are analyzed.

    Zipf’s law is named after the American linguist George Kingsley Zipf (1902–1950). He was something of a polymath or a dilettante, depending on your point of view, who began by examining variations in the size of cities. He discovered that, within a given geographic area, the sizes of population concentrations, from small villages to large cities, are governed by a so-called power-law distribution. In the case of cities, the frequency of agglomerations of a given size s followed a deceptively simple 1/s distribution.

    The outcome of this calculation looks like the graph schematically depicted in Figure 1a. However, when the results are converted to a doubly logarithmic graph (log-log plot), the chart looks like Fig. 1b, which exhibits the characteristic linearity of a power-law distribution. Zipf’s next step, and the result for which he is remembered, was to examine the frequency of words in a text. He found that the frequency of words in a text, ranked according to their abundance, fell like 1/r as a function of their rank r. In a log-log plot, the slope of the resulting linear function is then minus one.
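    As an illustration of how such a log-log slope is read off in practice, the short sketch below generates an exactly Zipfian rank-frequency list f(r) = C/r and recovers the exponent by least-squares regression on the logged data. The constant C and the rank range are arbitrary choices for the example:

```python
import math

# Illustrative data: an exactly Zipfian rank-frequency relation f(r) = C / r.
C = 1000.0
ranks = range(1, 101)
freqs = [C / r for r in ranks]

# Log-log transform, then fit the slope by ordinary least squares --
# the numerical equivalent of reading the line off a log-log plot.
xs = [math.log(r) for r in ranks]
ys = [math.log(f) for f in freqs]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)

print(round(slope, 6))  # -1.0 for pure 1/r data
```

    For pure 1/r data the fitted slope is exactly minus one; empirical word counts scatter around that line rather than sitting on it.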

    In addition to his work on the statistics of word frequency, Zipf proposed a model for the generation of lexicons, or symbol systems, that he referred to as the principle of least effort. Briefly, the idea here was simply that both the listener and the speaker in an exchange of signs would seek to minimize their expenditure of energy – that is, put in the least effort (Zipf 1949). This means that a kind of negotiation would take place between the parties of an exchange, in which each sought the greatest level of understanding for the least effort.

    Needless to say, the simplest way to achieve this goal is to have a shared lexicon of exact one-to-one relations between the elements of the lexicon and the objects they reference. However, this approach entails massive memory requirements to ensure the least ambiguity. It is the way in which most animals, other than humans, communicate. The monkey cry that designates the presence of a snake is distinct from the cry that designates an eagle. But while some animals can learn fairly large lexicons in captivity, and under well-controlled conditions, we also know that in the wild the upper bound for, say, the bonobo, perhaps the most cognitively advanced primate short of humans, is on the order of about 40 “words”, with little or no syntax. These lexicons are essentially indexical rather than symbolic, in the terms used by Peirce and Deacon.

    Explicitly drawing on Deacon’s and Peirce’s distinction between symbols and indexes, Ferrer-i-Cancho and Sole (2003) simulated the development of a lexicon beyond the indexical level and concluded that Zipf’s law is not simply a descriptive tool. Rather, it is a necessary emergent property of symbolic systems which, they also demonstrated, exist at what is known as a phase transition – a condition analogous to water turning to steam or freezing into ice.

    However, the symbolic phase space in this instance has the added feature that the symbolic system proper remains in the phase space and does not resolve either into indexicals or into meaningless randomness. This feature, which entails a significant degree of referential ambiguity, was, they speculate, a likely contributing factor in the evolution of language, because it allows a limited lexicon to refer to a larger set of objects. In the presentation of their findings one can see how a phase transition emerges where the effort of the speaker becomes roughly equal to the effort of the listener (Fig. 2).
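    To make the speaker–listener trade-off concrete, here is a minimal sketch of the kind of cost function such least-effort models minimize. The association matrix, the equiprobable objects, the weight lambda, and the exact form of the cost are illustrative assumptions, not the precise formulation of Ferrer-i-Cancho and Sole (2003): speaker effort is taken as the entropy of the signals, hearer effort as the residual ambiguity of objects given a signal.

```python
import math

def entropy(ps):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log(p, 2) for p in ps if p > 0)

# Toy binary association matrix a[i][j] = 1 iff signal i can refer to
# object j.  The matrix is illustrative, not taken from the paper.
a = [
    [1, 1, 0],   # signal 0 is ambiguous between objects 0 and 1
    [0, 0, 1],   # signal 1 refers unambiguously to object 2
]
n_sig, n_obj = len(a), len(a[0])

# Joint distribution: objects equiprobable, each object's probability
# shared evenly among the signals that can name it (an assumption).
joint = [[0.0] * n_obj for _ in range(n_sig)]
for j in range(n_obj):
    namers = [i for i in range(n_sig) if a[i][j]]
    for i in namers:
        joint[i][j] = 1.0 / (n_obj * len(namers))

p_s = [sum(row) for row in joint]              # marginal over signals
H_S = entropy(p_s)                             # speaker effort

# Hearer effort: expected ambiguity H(R|S) of objects given a signal.
H_R_given_S = sum(
    p_s[i] * entropy([joint[i][j] / p_s[i] for j in range(n_obj)])
    for i in range(n_sig) if p_s[i] > 0
)

lam = 0.5                                      # speaker/hearer weight
omega = lam * H_R_given_S + (1 - lam) * H_S    # combined cost
print(round(omega, 4))
```

    An ambiguous signal lowers the speaker's entropy cost but raises the hearer's decoding cost; sweeping lambda between 0 and 1 shifts the optimal lexicon between the two extremes, which is where the phase transition appears in the full model.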

    The model of Ferrer-i-Cancho and Sole (2003) created a very abstract and idealized understanding of a language or symbol system. They have gone further in other papers, to examine the emergence of syntax and also to argue that semantic content may follow Zipf’s law as well. This is a more controversial claim, but it has received some support from other researchers. For instance, Vogt (2004) adds a dimension to this discussion in that he enlarges the set of possible symbolic structures by examining the ways in which referential tokens can be aggregated into categories. Once again, the principle of least effort is at work, but the objective is to locate the category that best discriminates one reference from others.

    Vogt refers to the conceptual structures of symbolic systems in terms of their density: symbolic density. He argues that in a search for appropriate categorical structures, the principle of least effort will motivate movement through a hierarchy of increasingly dense categories. Furthermore, this hierarchy of category density can be subsumed under a Zipf-Mandelbrot power law. This is exactly my argument regarding Jung’s system: the complex, the archetype, synchronicity, and even the notion of the Self are scale-invariant symbolic structures of increasing density that should, by virtue of their symbolic nature, as well as the curious breadth of power-law-like phenomena, fall under Zipf’s law.
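    For reference, the Zipf-Mandelbrot law invoked here generalizes Zipf's 1/r form by adding two parameters, a rank offset q and an exponent s:

```latex
f(r) = \frac{C}{(r + q)^{s}}
```

    Setting q = 0 and s = 1 recovers Zipf's original law; the extra parameters let the same family fit rank distributions that bend away from the pure 1/r line at low ranks.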

    A question can be raised at this point, however. If synchronicities are simply at the high end of a power law, why do they carry the level of meaning and affective impact that they typically do? A large earthquake, after all, is, as Sornette (2003) has remarked, simply a small earthquake that keeps going. But if the same principle applies to synchronicities, why does one have the experience of a “rupture of time” (Main 2004)?

    Ironically, part of the answer is already available in the work of Ferrer-i-Cancho and Sole (2003), in their discussion of the emergence of language. As was mentioned, this process consists in the formation of a phase transition in which an entirely new and distinct regime emerges as symbols overwhelm the earlier simple indexical reference. This transition is, in no small measure, a catastrophe in the technical sense of the word – and perhaps in practice as well, insofar as it likely catapulted the genus Homo into an entirely different life-world, to the general detriment of other organisms.