as you simulate others' behavior within your own neural matrix, enough recursion seems to lead to the occasional behavior leaking out and joining the overall manifold. this is obvious, even intuitive: the more you predict someone else's actions, the more that Particular Causal Chain becomes embedded within your larger behavioral matrices. given that modelling others seems to be something people do natively, does this mean that all conversation will eventually reach a sort of simulated equilibrium? is there a perfect final point for a given group of people at which their hive mind contains such endlessly recursed simulations of each other that they simply become a superorganism?