In the natural world, we often see similar solutions evolve independently across many species, because the solution space for challenges such as movement tends to be fairly small. This phenomenon, known as convergent evolution, illustrates that nature tends to converge on a small set of optimal strategies when faced with similar types of problems. One such strategy is the development of systems for coordination. The ability to act as a unified whole turns out to be a very useful adaptation for any complex organism. Let’s take a look at why that is.
To thrive, animals must coordinate the actions of countless cells, tissues, and organs. This coordination is made possible by the nervous system and the brain, which integrate sensory input, process information, and orchestrate responses. Without such systems, a complex organism would collapse into chaos. Imagine a human body where each organ acted independently: the heart pumps without regard for oxygen levels, the lungs breathe without synchronizing with the muscles, and the limbs move without direction. Such an organism would have a very short existence. Coordination proves to be essential for orchestrating complex dynamic systems.
Of course, not all large organisms require such intricate systems. Take the Armillaria ostoyae, a fungus that spans thousands of acres. This organism thrives in a relatively static environment, relying on a network of mycelium to absorb nutrients and reproduce. Its structure is homogeneous, and its ability to adapt to rapid change is limited. While it is vast, it lacks the adaptability of animals. The need for coordination arises from the demands of the environment and the complexity of the tasks at hand. In dynamic, unpredictable environments, the ability to act in coordinated fashion becomes a survival imperative.
We can extend this principle beyond individual organisms to societies, which can be thought of as metaorganisms. Just as cells and organs work together within a body, individuals within a society collaborate to achieve shared goals. Societies, like organisms, compete for resources, and their competition exerts selective pressure. Those that can effectively coordinate labor and resources are more likely to persist and thrive. In small societies, coordination can be relatively simple. A tribe might have a leader who helps organize tasks, but much of the work is distributed among autonomous individuals, each specializing in a specific role, like hunting, crafting, or farming. The structure is flat, and communication is direct.
However, as societies grow, so too does the need for more sophisticated coordination. The transformation from a small tribe to a large civilization is a shift where quantity transforms into quality. With more people comes greater specialization, and with specialization comes interdependence. A blacksmith in a small town might work independently, but in a large society, blacksmiths become part of a broader network of producers, traders, and consumers. This interdependence demands systems to manage complexity, much like a nervous system manages the complexity of a multicellular organism. A group of people specializing in a particular profession is akin to an organ within a living organism.
This pattern emerges in all types of human organizations, from companies to governments. In a small team, direct communication suffices. Each member knows their role, and decisions can be made collaboratively. But as the organization grows, the lines of communication multiply quadratically: n people have n(n−1)/2 possible pairwise channels, so five people have 10, fifty have 1,225, and five hundred have 124,750. What works for five people becomes unmanageable for fifty, and impossible for five hundred. At this point, delegation becomes necessary. Departments form, each with its own leader, and these leaders coordinate with one another. Such hierarchical structure necessarily emerges as a solution to the problem of scale. It mirrors the way an organism relies on a brain and nervous system to manage its many parts.
The need for coordination, in turn, gives rise to the need for authority. Authority is not inherently oppressive; it is a tool for managing complexity. In a software development project, for example, dozens of individuals might work on interconnected tasks. Frontend developers rely on backend developers to provide data, while backend developers depend on database administrators to manage information. If one team member fails to deliver, the entire project can stall. To prevent such breakdowns, the team must agree on shared norms, schedules, and decision-making processes. These agreements require a team lead to take charge in order to resolve disputes, set priorities, and ensure that everyone is aligned. This authority is not arbitrary; it emerges from the practical demands of coordination.
The same principle applies to large-scale industries. Modern factories, with their complex machinery and hundreds of workers, cannot function without a clear chain of command. Independent action gives way to combined action, and combined action requires organization. Authority, in this context, is not a top-down imposition but a bottom-up necessity. It arises because of the material conditions of production dictated by the scale, complexity, and interdependence of tasks.
Critics of authority often argue for absolute autonomy, but such arguments overlook the real and tangible need for coordination. Authority and autonomy are not opposites; they exist on a spectrum, and their balance shifts with the needs of the group. In a small, simple society, autonomy might dominate. In a large, complex one, authority becomes indispensable. To reject authority outright is to ignore the lessons of both biology and history: that coordination is the foundation of complexity, and that complexity, in turn, demands systems to manage it.
Authority, far from being a mere social construct, is a natural response to the challenges of scale and complexity. It is not inherently good or evil. Rather, it is an effective tool for addressing the needs of the group and the demands of the environment.
I have been reading about complexity science and, as @Sodium_nitride@lemmygrad.ml mentioned, I have been thinking about whether we can use insights from Complexity Science to help us build some general model or rule for human organization. This is definitely a question that has interested me lately, and if anyone wants to share thoughts or resources I’d be down. Your post here re-sparked my interest!
I thought I’d give an overview of the literature I’m familiar with, and try to make some connections with what OP wrote, and/or the goal that Sodium_nitride wrote. What follows is mostly an information dump, and I’m sorry for hijacking your post. It’s just that this topic really excites me! Hopefully some of this may be useful food for thought.
There is, of course, cybernetics and the work of Ashby and Beer. Most Hexbearites know of Project Cybersyn in Chile, so Beer should be familiar. I think the big ideas out of that classical cybernetics are Ashby’s Law of Requisite Variety (which I will mention again below) and Beer’s Viable System Model (VSM). I’m actually not familiar enough with the VSM to give a decent overview, but an overview written by Raul Espejo, someone who worked on Project Cybersyn with Beer, can be found here. If other comrades wish to give their understanding of the VSM, feel free! I'd love to hear it.
Moving on from Cybernetics and toward more contemporary complex systems theory, the model/theory I’m most familiar with is that presented by Thurner, Hanel, and Klimek in their Introduction to the Theory of Complex Systems. Since Complexity Science is an emerging field, this is just one of the attempts I know of to give a general definition of what a complex system is and how to model it. This can be a good starting point for those who want to think about actually modeling complexity, but by no means is it the final word. It doesn’t mention agent-based modeling, for example, which is something that could be added to simulate people/institutions and their actions.
The big idea of this book is its attempt at explaining complex systems as a co-evolving multilayer network. Complex systems are:
a.) networks, they consist of entities/components (nodes) with interconnections/relationships (links), they are
b.) co-evolving, both the entities (nodes) and relationships (links) dynamically change, and their evolution depends on each other, i.e. the dynamics of the entities depend on other entities and their interrelationships, and the dynamics of the interrelationships depend on the entities, and
c.) the networks that describe complex systems are multilayered. A multilayer network just means there are multiple types of relationships in the network; you can express each relationship type with its own network (and the entities/nodes are also present on each layer). You can imagine one network having “trade relations”, another network for “communication relations”, etc. The evolution of each relationship in a network-layer also depends on the other network-layers. A minimal code sketch of this structure follows below.
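To make the multilayer idea concrete, here is a minimal Python sketch of my own. The names and update rule are made up for illustration; this is not Thurner et al.'s actual formalism, just the shape of the data structure: a shared node set, one link-dictionary per relationship type, and a toy co-evolution step where each layer's links are nudged by the other layers.

```python
# A toy co-evolving multilayer network. All names and dynamics here are
# invented for illustration; node-state dynamics are omitted for brevity.
import random

# Shared set of entities (nodes), present on every layer.
nodes = ["farm", "smithy", "market", "council"]

# Each layer is one relationship type: a dict mapping (i, j) -> link weight.
layers = {
    "trade":         {("farm", "market"): 1.0, ("smithy", "market"): 0.5},
    "communication": {("council", "farm"): 0.8, ("council", "smithy"): 0.8},
}

def co_evolve_step(layers, rate=0.1):
    """One toy update: each link drifts toward the average strength of the
    same pair on the *other* layers (cross-layer coupling), plus noise."""
    new_layers = {}
    for name, links in layers.items():
        new_links = {}
        for pair, w in links.items():
            others = [ol.get(pair, 0.0) for oname, ol in layers.items() if oname != name]
            coupling = sum(others) / len(others) if others else 0.0
            new_links[pair] = w + rate * (coupling - w) + random.gauss(0, 0.01)
        new_layers[name] = new_links
    return new_layers

for _ in range(10):
    layers = co_evolve_step(layers)
print(layers["trade"])
```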
The book develops a general model for this (it can be rather mathematical) and tries to use it for describing some common features of complex systems: punctuated equilibrium, self-organization, robustness, resilience, statistics on collapse, etc.
I’ve thought a bit about using this type of framework for describing, as Sodium_nitride has mentioned, some common framework for organization. But brainstorming and sharing ideas would definitely help, especially since this isn’t my day job and one person can only think of so much. My first interest was thinking about complexity science as a way to talk about modes of production more generally.
A different direction is the work of Joshua Epstein, who has a book called Generative Social Science: Studies in Agent-Based Computational Modeling. It doesn’t use Thurner et al.’s framework described above; it focuses on various ways of using agent-based modeling to generate observed social dynamics. So think of Conway’s Game of Life, but for social rules. This is also like @Sodium_nitride@lemmygrad.ml's idea of coming up with rules for organization and then studying what emerges from them.
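In that spirit, here is the classic toy example of simple micro-rules generating a macro-pattern: a Schelling-style segregation model. To be clear, this is not taken from Epstein's book; it's just the standard minimal agent-based model, where a mild local preference produces striking clustering at the population level.

```python
# Minimal Schelling-style agent-based model: agents move if fewer than
# THRESHOLD of their neighbors share their type. All parameters are arbitrary.
import random

SIZE, THRESHOLD, STEPS = 20, 0.4, 50
grid = [[random.choice(["A", "B", None]) for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(x, y):
    """True if under THRESHOLD of this agent's occupied neighbors match it."""
    me = grid[y][x]
    same = total = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == dy == 0:
                continue
            n = grid[(y + dy) % SIZE][(x + dx) % SIZE]  # torus wrap-around
            if n is not None:
                total += 1
                same += (n == me)
    return total > 0 and same / total < THRESHOLD

for _ in range(STEPS):
    movers = [(x, y) for y in range(SIZE) for x in range(SIZE)
              if grid[y][x] is not None and unhappy(x, y)]
    empties = [(x, y) for y in range(SIZE) for x in range(SIZE) if grid[y][x] is None]
    random.shuffle(movers)
    for (x, y) in movers:
        if not empties:
            break
        ex, ey = empties.pop(random.randrange(len(empties)))
        grid[ey][ex], grid[y][x] = grid[y][x], None  # move agent to empty cell
        empties.append((x, y))

# Print the final grid; clusters of A and B emerge from purely local rules.
print("\n".join("".join(c or "." for c in row) for row in grid))
```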
There is similar work by Axtell where he uses agent-based modeling to generate observable statistics on firm sizes.
The takeaway is that if one can find a description of organizational rules, then one can simulate them to study emergent patterns and statistics that aren’t obvious from the rules themselves. These various works go into the details of how.
My focus on agent-based modeling and complexity science has taken a backseat for now; I want to learn more political economy and imperialism. I also think I was focusing too much on the micro-to-macro approach. As Marx says, “the complete body is easier to study than its cells”. There are divides over the approach of using micro-descriptions to generate macro-results. It isn’t that micro-dynamics are unimportant; the idea is that if you want to focus on the macro-level, then in many cases you can start at the macro-level and do not need a full (or accurate) micro-model. This is also where the idea of universality comes into play: various different micro-models can all give the same emergent macro-descriptions. There is a bit of redundancy, so to speak, in our micro-models. So some people say to just start with a macro-model if that’s the focus of your study.
A paper I recently read that has directed me away from focusing too much on micro-level descriptions is Software in the Natural World: A Computational Approach to Hierarchical Emergence, which links together computation science, information theory, and complexity science. A big question this paper asks is: under what contexts do you need a micro-model to explain an emergent macro-model? It finds that in certain situations you do not need to refer to a micro-model in order to generate/predict macro-data. The “macro-world” is “closed off” from the “micro-world”. This is where causal, informational, and computational closure come into play.
If anyone is familiar with physics, then this isn’t surprising. You don’t need statistical mechanics to describe thermodynamics and make predictions. You don’t need quantum mechanics to make predictions about the macro world using Newtonian physics. And Anwar Shaikh would similarly argue that you don’t need microeconomics to understand macroeconomics. But it isn’t that micro-models are useless or don’t have a place; it just depends on what you need the model for.
I've begun to take a look at your provided resources. They seem pretty interesting, thanks for sharing.
I very much like the idea of thinking of complex systems as multilayer networks, and I'd go even further and say that self-organizing systems tend to exhibit a fractal nature. Organization within one layer becomes a substrate for the next level of abstraction, and at each layer there is a need to optimize energy and space use that drives its evolution, so you end up with self-similar patterns emerging at different levels of complexity. I ended up writing a bunch on the subject here, might be of interest. I work up to politics in chapter 8, but start with thermodynamics and the evolution of complex systems in nature as the basis: https://theunconductedchorus.com/
The idea of decoupling macro and micro models makes sense to me as well, and it can be framed in terms of abstraction. Our own minds create an abstract version of the world from the sensory data, and it's very effective for accomplishing tasks in our daily lives. It's a perfect example of how a model does not need to express all the underlying complexity to be useful.
I very much like your analogy of the mind creating an abstract version of the world. And the paper on emergence has a section on Hopfield networks. Without coming across as reductionist, I think there is something to the idea that our thoughts, mental formations, the "computations" in our mind, are some "macro" emergent "model" that can be analyzed without a detailed understanding of the micro-configurations. This is very much related to the concept of entropy, as you're most likely aware. The book by Thurner et al. also has an entire chapter dedicated to what entropy means in a complex system (and how it can be calculated, of course). The "software" of thought, or computation, is a "higher-level abstraction" that can be run on multiple types of "hardware", which determine the specific micro-configurations.
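For anyone curious, here is a minimal Hopfield network sketch. It uses only the textbook formulation (Hebbian weights storing patterns, asynchronous sign-updates relaxing into an attractor), not anything specific from the paper. The recovered "memory" is a macro-level fact you can check without tracking which micro-trajectory the updates took, which is the intuition above.

```python
# Textbook Hopfield network: store patterns via Hebbian weights, then recover
# one from a corrupted copy. Sizes and seeds are arbitrary choices of mine.
import numpy as np

rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))        # three stored "memories"

# Hebbian learning rule: W = (1/N) * sum of outer products, zero diagonal.
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

# Start from pattern 0 with 30% of its entries flipped.
state = patterns[0].copy()
flip = rng.choice(N, size=30, replace=False)
state[flip] *= -1

# Asynchronous updates: s_i <- sign(sum_j W_ij s_j), one random unit at a time.
for _ in range(5 * N):
    i = rng.integers(N)
    state[i] = 1 if W[i] @ state >= 0 else -1

# Overlap near 1.0 means the macro-level attractor was recovered.
print("overlap with stored pattern:", (state @ patterns[0]) / N)
```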
If you are interested in computation theory (it's a new field to me), then the paper on emergence, Software in the Natural World may really interest you.
And if you are interested in the nexus of Hegelian dialectics and computation theory, then you may enjoy this paper on Hegel, computation, and self-reference.
Going back to multilayer networks. I've thought about using that framework to combine physical constraints (like energy usage, etc.) with political-economic networks (labor, commodity, money flows) to come up with some way to model modes of production. This would be similar to what the anthropologist Eugene Ruyle has written about.
If using multilayer networks, the "relations of production" that help differentiate the various modes can be expressed as various types of links between the nodes (whether they be individuals, firms(?), "some general unit of production", etc.). If one could find a way to generally model how humans organize themselves and build institutions, then perhaps that could be encoded in a network's links as well.
But... I got really bogged down in the details and making it work in practice was a bit harder. I have more to say about it, though, if anyone is interested.
I'll give your link a read, it sounds interesting!!
It sounds like we've been thinking about a lot of very similar stuff recently. :)
Very much agree with what you're saying, and thanks for the link. It's great to see more material on the subject. I really enjoyed skimming through the paper you linked in the last comment, which talks about applying a computational approach to understanding hierarchies and viewing macroscopic processes as computationally closed if their behavior can be fully described by a coarse-grained model of their microscopic components. If you haven't read it, I can highly recommend Hofstadter's I Am a Strange Loop. I think he does a great job arguing that high-level phenomena such as the patterns of our thoughts are substrate independent.
The problem of modelling modes of production and social organization is an interesting one to be sure. In my mind the two things are inherently tied, as the mode of production tends to shape our social relations, and vice versa. It's kind of a recursive network that evolves over time. The view of a society as an organism is helpful here as well, since you can model different social structures within society as organs within the organism. This is where the ideas from the Software in the Natural World paper come into play again.
Some other writers in complexity can also be thought-provoking. As I’ve mentioned, complexity science is an emerging field so I don’t think a single school of thought has dominated? That also means that I, as a lay-person, may just be following quacks. So keep that in mind.
Here are a few other writers on complexity in case anyone is interested. Something to note is that you'll find a bit of anti-Sovietism in these writers. It's like they have this Hayekian worldview where they see socialism as unable to cope with complexity, and this is why the USSR failed, etc. But immanent critique of the field is a good way forward, so learning how these authors think about complexity can still be useful.
My first introduction to thinking about complexity was this article on complexity, scale, and cybernetic-communism. It is leftist, but still anti-Soviet. There are many citations included, though, if you want to go down some rabbit holes. If you are up for the mathematics, the final section is an exploration of mathematical measures of complexity.
Some main ideas that are cited in the above article:
1.) Studies using historical data from the Seshat Global History Databank suggest that the growth of complex societies follows a repeating two-phase cycle. In the first phase, societies grow in scale but not in information-capacity; any increase in complexity is simply due to increasing scale. Their given information-capacity induces a “scale threshold”, a maximum scale beyond which the society cannot progress, and this causes the society to ‘stagnate’. This stagnation is the second phase, where the scale remains the same but information-capacity (may) grow. If a society can advance its information-capacity, then it can continue to grow in scale until it meets the next scale threshold, and so on.
I’ve thought about this being another view (in terms of complexity and information) of the Marxist idea that quantitative growth eventually forces qualitative transformation, or another way of thinking about the quantity-to-quality idea in dialectics.
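To make the two-phase cycle in point 1 concrete, here is a toy illustration of my own (not the Seshat analysis, just assumed dynamics): scale grows until it hits a threshold set by the current information-capacity, then stalls while capacity slowly improves, producing a repeating staircase.

```python
# Toy two-phase cycle: growth in scale up to a capacity-set threshold,
# then stagnation while capacity grows. All rates are arbitrary assumptions.
scale, capacity = 1.0, 10.0
history = []
for t in range(200):
    if scale < capacity:      # phase 1: scale grows toward the threshold
        scale *= 1.05
    else:                     # phase 2: stagnation; information-capacity grows
        capacity *= 1.02
    history.append(round(scale, 1))
print(history[::20])          # staircase: bursts of growth, then plateaus
```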
2.) In order to build cybernetic-communism we need better “instruments of complexity” beyond money and markets. In the most general definition, complexity is a measure of how much information it takes to “describe” a system. The last section of the article goes into some attempts at quantifying this, and some criticisms of complexity measures such as KL divergence, Kolmogorov complexity, etc.
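For reference, the standard definitions of two of the measures mentioned, in my own notation (the article may present them differently):

```latex
% Kullback-Leibler divergence of P from Q (the extra bits needed to encode
% samples from P using a code built for Q):
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

% Kolmogorov complexity of a string x: the length of the shortest program p
% that outputs x on a fixed universal machine U (uncomputable in general):
K(x) = \min \{\, |p| : U(p) = x \,\}
```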
I’ve also listened to some episodes of the General Intellect Unit podcast where they discuss cybernetics from a leftist perspective (but still manage to be anti-Soviet). I can't give a general recommendation, but it is another resource.
Other writers in complexity science that I have found are Alexander Siegenfeld and Yaneer Bar-Yam. Bar-Yam is the founding president of the New England Complex Systems Institute, and he definitely has his own ‘school of thought’ within complexity science. You may find it fruitful to go through some of his arguments, even if only to build a better critique. He would definitely fit into the ‘anti-authority’ camp, and would view hierarchies as a limitation on the information-capacity of a network, and hence a limitation on complexity. So he would be a counter to your views on systems. You can definitely find anti-Sovietism in his work as well.
Siegenfeld and Bar-Yam wrote an introductory paper to complex systems that may be of interest, and doesn’t require mathematical knowledge.
One big idea from this paper is a more intuitive understanding of what complexity is. You can think of complexity as being like the division of labor that you mentioned: it is the correlation of various functions between a system’s parts. It isn’t randomness, and it isn’t uniform cohesion.
I think the biggest idea to take away from Bar-Yam’s work is a scale-based extension of Ashby’s Law of Requisite Variety that we find in cybernetics. This means that the complexity of a system is a function of its components, their interactions, and the scale of the system. A system may be very complex at a large scale but lack the necessary complexity at small scales, and so on. This is a view of complexity that focuses on Ashby’s idea of variety: the number of possible actions, or states, that a system can take. Bar-Yam extends variety to include scale, so there are small-scale actions (actions of single individuals) and large-scale actions (of a state).
Ashby’s Law (with or without taking into account Scale) is about autopoiesis, a system’s ability to maintain itself in an environment. According to Ashby’s Law, a system in an environment (which itself is also a system) must be able to react or respond to actions from the environment at the appropriate scale. For example, the climate is an environment that our mode of production (system) is within (and also part of). Climate change creates certain actions (wildfires, global temperature increase, flooding) at a large (regional to global) scale. Capitalism, to maintain itself, has to respond to each environmental action at its scale. If it fails, then changes occur within Capitalism, the system. The system may adapt, evolve, change the environment, or fragment. For socialism (a system) to survive within its environment (global capitalism) it must have the appropriate responses, at the appropriate scale, to respond to capitalism’s "attacks". The variety of a system must match or exceed the variety of its environment in order to survive. And this applies at all scales.
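In information-theoretic form (the standard cybernetics statement, plus the multiscale extension as I understand Bar-Yam's framing), this can be sketched as:

```latex
% Ashby's Law of Requisite Variety: a regulator R can reduce the entropy of
% outcomes O, driven by disturbances D, by at most its own variety:
H(O) \;\geq\; H(D) - H(R)

% Multiscale extension (my reading of Bar-Yam): the system must match the
% environment's complexity profile at every scale k, not just in aggregate:
C_{\mathrm{system}}(k) \;\geq\; C_{\mathrm{environment}}(k) \qquad \forall\, k
```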
The scale-based version of Ashby's Law suggests that sometimes a system can “flex” its variety at a particular scale in order to outcompete another system (or its environment). One example that’s given to illustrate this is guerilla warfare. Certain environments may favor smaller-scale actions over large-scale actions: guerilla fighters may have more variety (actions/states/options) at a small scale compared to larger armies. Other environments, like open fields, may favor larger armies, so a large army will have more variety at a large scale than a small one.
Other papers by Bar-Yam then talk about ways of calculating the complexity, or variety I suppose, of a system at its various scales.
At this point, I’m not certain how accepted these ideas are in complexity science. They may be approaching quack? But I’ve found them interesting to chew on, and try to incorporate into my understanding of Marxism and systems.