Category: Methodology
Posted by: Aaron
Evolution by natural selection describes a process by which replicators (things that make copies of themselves) tend to increase in number while competing for limited resources, so that replicators with features altered by random mutation have (context-dependent) variable rates of replication. The result is that those alterations that foster increased replication in the prevailing local context become more prevalent in that population. This view requires identification of the replicator, and genes (or some smaller fragments of DNA) are the natural candidate for life on Earth as we know it. The idea of group selection, earlier crude versions of which were reasonably dismissed, has made a rebound. The new versions (sometimes more accurately referred to as multilevel selection) have been called upon to explain culture, morality, altruism, cooperation, and similar phenomena in a seemingly plausible way. But most proposed new versions of group selection are also flawed, and some other explanation is needed for these social phenomena.
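
To make the mechanism concrete, here is a minimal sketch (in Python; the fitness values, mutation rate, and population cap are made up for illustration) of replicators competing under limited resources:

```python
import random

def generation(population, fitness, mu=0.01):
    """One generation: replication proportional to fitness under a
    fixed population cap (the limited resource), then random mutation."""
    weights = [fitness[r] for r in population]
    offspring = random.choices(population, weights=weights, k=len(population))
    types = list(fitness)
    return [random.choice(types) if random.random() < mu else r
            for r in offspring]

fitness = {"A": 1.0, "B": 1.1}      # B replicates slightly faster here
pop = ["A"] * 90 + ["B"] * 10
for _ in range(200):
    pop = generation(pop, fitness)
print(pop.count("B") / len(pop))    # B typically comes to dominate
```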

» Read More

Category: Methodology
Posted by: Aaron
Complexity science purports to shift the focus from describing states to describing processes, yet this conceptual shift has been much slower and more difficult than many would let on. The amount of self-anointed complexity research that still relies on equilibrium analysis reveals how little the field has actually broken free of its methodological roots. Complex adaptive systems are exactly those that self-organize in a way that allows them to maintain functionality, cohesion, or other such properties while being continuously in flux. If a system really reaches equilibrium, then it's a bad candidate for being a complex system. But we don't need to throw out all our old concepts in the pursuit of new, dynamic ones; we can use them as a springboard for developing complexity science. This post is an attempt to analogize the equilibrium concept in a way that directly shifts the focus from states to processes.

» Read More

Category: Methodology
Posted by: Aaron
A new project I'm working on aims to establish a formalism for converting between and combining (updating) beliefs measured in just about any available way. Probabilistic, Boolean-based beliefs with Bayesian updating are so dominant an approach that anything else seems like a niche belief representation. Alternatives such as fuzzy truth values and Dempster-Shafer beliefs (which incorporate uncertainty) have their subdomain applications in specialized information-system components, but those components then cannot work with other components that represent beliefs in a different way. Furthermore, several of these alternative representations do not yet have consistent updating rules, or clear guidelines for how to apply and interpret combined beliefs. So I want to fix that.
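
As a taste of one of the alternative updating rules mentioned above, here is a minimal sketch of Dempster's rule of combination (the frame of discernment and the mass values are hypothetical examples):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions whose focal elements
    are frozensets over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb        # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sources over the frame {rain, dry}; the numbers are made up.
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "dry"}): 0.4}
m2 = {frozenset({"dry"}): 0.3, frozenset({"rain", "dry"}): 0.7}
print(dempster_combine(m1, m2))
```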

» Read More

Category: Methodology
Posted by: Aaron
Distance, as it is usually thought of, is between two points (i.e., dyadic). It can be discrete, as in the minimal number of grid spaces an agent must traverse to travel from location A to location B (with diagonal moves allowed, this is the Chebyshev distance). For standard real-valued spaces it is the length of the shortest line connecting the two points. We can generalize this to higher dimensions by measuring distances between manifolds (strings, curves, solids, etc.), but to be a metric a distance always has to satisfy the same criteria: 1) non-negative, and zero only for identical points, 2) symmetric, and 3) obeying the triangle inequality. Now, let's say I want to measure how far apart the elements of a set are, say three points in a 2D plane. Can we come up with a formula for the distance among these three points?
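
Two natural candidates, offered here only as a starting point rather than as the answer, are the sum and the mean of the pairwise distances:

```python
from itertools import combinations
from math import dist  # Python 3.8+

def pairwise_sum(points):
    """Sum of all pairwise Euclidean distances (the perimeter, for 3 points)."""
    return sum(dist(p, q) for p, q in combinations(points, 2))

def mean_pairwise(points):
    """Average pairwise distance; reduces to ordinary distance for 2 points."""
    pairs = list(combinations(points, 2))
    return sum(dist(p, q) for p, q in pairs) / len(pairs)

pts = [(0, 0), (3, 0), (0, 4)]          # a 3-4-5 triangle
print(pairwise_sum(pts), mean_pairwise(pts))  # 12.0 4.0
```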

» Read More

Category: Methodology
Posted by: Aaron
The idea of layered networks is quite straightforward: the objects in your model are related to each other in more than one kind of way. This distinguishes a layered network from a multigraph, in which there are multiple connections of the same type. In some cases the edge types represent different relational features; for example, one could have a model with people as nodes, the ability to see each other (i.e., being in line of sight) as one kind of edge, and the ability to hear each other (i.e., being within natural hearing range) as another. Clearly these are different information paths with different properties: the kinds of information carried, speeds, possible distances, reciprocity (directedness), etc. Combining a city's road, power, and water networks on the same graph is another straightforward example. For these sorts of heterogeneous communication networks many of the common measures (such as path length and out-component) have already been adapted. But others (such as community structure and betweenness centrality) need a deeper look.
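
As a minimal illustration, here is one way to represent a two-layer network with NetworkX, storing the relation type as an edge attribute so that per-layer measures like path length can be computed on each layer separately (the node names and layers are invented for the example):

```python
import networkx as nx

# Two layers over the same people: "sight" edges and "sound" edges.
G = nx.MultiGraph()
G.add_edge("ana", "bob", layer="sight")
G.add_edge("ana", "bob", layer="sound")
G.add_edge("bob", "cal", layer="sound")

def layer_subgraph(G, layer):
    """Extract the single-layer graph for per-layer analysis."""
    H = nx.Graph()
    H.add_nodes_from(G.nodes)
    H.add_edges_from((u, v) for u, v, d in G.edges(data=True)
                     if d["layer"] == layer)
    return H

sound = layer_subgraph(G, "sound")
print(nx.shortest_path_length(sound, "ana", "cal"))  # 2 hops via sound
```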

» Read More

Category: Methodology
Posted by: Aaron
The dominant discrete geometry for 2D simulation environments is the square grid, mostly because it's the default and is conceptually and methodologically simple. There are, however, some advantages to using a hexagonal grid in 2D: hexes also tessellate (tile) the plane, and the centers of all six neighboring hexes are equidistant (see Hex in NetLogo). In three dimensions, however, the cube is the only regular polyhedron that tessellates space, and hence the corner effects are inescapable…until now! By considering discrete geometries as networks of connections, it is possible to build a 3D hexagonal-like (actually dodecahedral) geometry in which all neighbors are equidistant.
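
One way to realize such a geometry, assuming a lattice reading of the idea (not necessarily the construction in the full post), is the face-centered cubic lattice: its cells are rhombic dodecahedra, and every interior site has twelve equidistant nearest neighbors. A minimal sketch:

```python
from itertools import product

def fcc_points(n):
    """Integer points with an even coordinate sum form a face-centered
    cubic lattice; its cells are rhombic dodecahedra."""
    return [p for p in product(range(n), repeat=3) if sum(p) % 2 == 0]

def neighbor_map(points):
    """Link every site to its nearest-neighbor sites, all at the same
    distance (sqrt(2)); interior sites have exactly twelve."""
    pts = set(points)
    offsets = [o for o in product((-1, 0, 1), repeat=3)
               if sum(abs(c) for c in o) == 2]     # the 12 FCC offsets
    return {p: [q for o in offsets
                if (q := tuple(a + b for a, b in zip(p, o))) in pts]
            for p in points}

nbrs = neighbor_map(fcc_points(6))
print(max(len(v) for v in nbrs.values()))  # 12 equidistant neighbors
```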

» Read More

Category: Methodology
Posted by: Aaron
The topologies of 2D agent-based models are almost completely dominated by bounded planes and torus surfaces (aka wrapping edges, aka periodic boundaries). These may be reasonable approximations for some systems, but certainly not for all. There are at least a few things I'd like to model that occur on the surface of something roughly spherical (e.g., on the Earth). Spheres are topologically like bounded planes in that neither has holes, but they are also like tori in that, as surfaces of three-dimensional objects, they have no edges for agents to run into. In order to facilitate a growth in popularity of sphere (or ellipsoid) models, I present a technique (and eventually code) for generating such models in Java or NetLogo.
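
As a preview of the general problem, one standard way to spread patch centers roughly evenly over a sphere is the Fibonacci (golden-angle) lattice, sketched here in Python for brevity (this is one common approach, not necessarily the technique presented in the full post):

```python
from math import pi, sqrt, sin, cos, acos

def fibonacci_sphere(n):
    """Spread n points nearly evenly over the unit sphere using the
    golden-angle (Fibonacci) lattice."""
    golden = pi * (3 - sqrt(5))          # golden angle in radians
    pts = []
    for i in range(n):
        z = 1 - 2 * (i + 0.5) / n        # even spacing in z
        r = sqrt(1 - z * z)
        theta = golden * i
        pts.append((r * cos(theta), r * sin(theta), z))
    return pts

def great_circle(p, q):
    """Distance along the sphere's surface between two unit vectors."""
    return acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q)))))

patches = fibonacci_sphere(500)
print(great_circle(patches[0], patches[1]))
```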

» Read More

Category: Methodology
Posted by: Aaron
There have been many attempts to define culture, each with its own spin reflecting what work the definition needs to do. My favorites are Boyd & Richerson's and Page & Bednar's, though they are quite different from each other. What the existing definitions have in common is that they attempt to capture what culture is rather than what culture does. My claim is that culture is properly understood as systematic patterns in how things are done, and my focus is on measuring cultural distances rather than simply identifying cultural components or providing culture-based explanations. My measurement scheme analyzes similarities in the differences of how things are done: the "same difference" criterion. This measurement does not help explain cultures and their differences, but it does identify which behavioral features need a cultural explanation. A sketch of the technique follows.
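
Purely as an illustrative placeholder (this is a guess at one possible operationalization, not the actual technique), one could encode each group's behaviors as a numeric vector and score how similarly two groups deviate from a common baseline:

```python
from math import sqrt

def diff(u, v):
    """Componentwise difference between two behavior vectors."""
    return [a - b for a, b in zip(u, v)]

def cosine(u, v):
    """Cosine similarity: 1.0 means the two difference vectors point
    the same way, i.e., the groups differ in the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

baseline = [5.0, 2.0, 7.0]   # made-up behavioral measurements
group_a  = [6.0, 1.0, 9.0]
group_b  = [6.5, 1.2, 8.8]

# Near 1.0: A and B deviate from the baseline in the same way, a
# systematic pattern rather than independent variation.
print(cosine(diff(group_a, baseline), diff(group_b, baseline)))
```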

» Read More

Category: Methodology
Posted by: Aaron
One virtue of the differential-equation method of system dynamics is that, if one wants to know the system state at some point in the future, one can often just plug in the appropriate t value and get the system state at that time (given the specified starting conditions). Not always, though. In some applications (such as in chaos theory) the system must be iterated step by step; there is no way to just "skip ahead" to a particular time. Agent-based models have this property too: one specifies the initial conditions, and then, to find out what happens at some arbitrary future time, one must actually run the simulation through all the intermediate steps. What a pain! The situation is not hopeless, however – there might be some shortcuts we can exploit, and I've got a few ideas.
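
The contrast in a minimal Python sketch (the equations and parameters are generic textbook examples, not from any particular model):

```python
from math import exp

# Closed form: for dx/dt = -k*x the state at any t is one evaluation.
def decay_at(t, x0=1.0, k=0.5):
    return x0 * exp(-k * t)

# No closed form: the chaotic logistic map must be stepped through.
def logistic_at(n, x0=0.2, r=3.9):
    x = x0
    for _ in range(n):        # every intermediate step is required
        x = r * x * (1 - x)
    return x

print(decay_at(100.0))        # jump straight to t = 100
print(logistic_at(100))       # must run all 100 iterations
```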

» Read More

Category: Methodology
Posted by: Aaron
The tipping points technique under development by yours truly (see here then here) is intended to be something akin to a statistical technique in the following sense: it uses data from a system (real or modeled), determines which model of a different kind (a Markov model) would also generate the observed data, and then conveys information about the imputed model that is not directly ascertainable from the data. So, as with descriptions of statistical modeling techniques, I need to provide a description of how to generate the Markov model from the original data and the properties/caveats of doing it in different ways.
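
As a first, deliberately naive example of generating the Markov model, one can impute first-order transition probabilities by normalizing the observed transition counts (the state sequence below is made up):

```python
from collections import Counter, defaultdict

def estimate_markov(sequence):
    """Impute a first-order Markov chain from an observed state
    sequence by normalizing the observed transition counts."""
    counts = defaultdict(Counter)
    for current, nxt in zip(sequence, sequence[1:]):
        counts[current][nxt] += 1
    return {s: {t: c / sum(row.values()) for t, c in row.items()}
            for s, row in counts.items()}

# A made-up trajectory of discretized system states.
observed = ["low", "low", "high", "high", "high", "low", "high"]
print(estimate_markov(observed))
# {'low': {'low': 0.33.., 'high': 0.66..}, 'high': {'high': 0.66.., 'low': 0.33..}}
```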

» Read More