• Correspondence Matrices (Paper)

  • Evolutionary Logic

         The logistic function is foundational in machine learning because it is both non-linear and continuous. The output of the logistic function is fed into the activation function, and the overall result corresponds to an observable such as a truth valuation. This paper is concerned with evolving logical expressions and gives a way, using propositional logic, to manipulate the logistic function by way of the Heaviside function (the activation function).
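
         As a minimal sketch of that composition (assuming the standard logistic σ(x) = 1/(1 + e^-x) and a Heaviside-style step thresholded at σ = 1/2; the function names below are illustrative, not taken from the paper):

             import math

             def logistic(x: float) -> float:
                 # Standard logistic: smooth, non-linear, continuous.
                 return 1.0 / (1.0 + math.exp(-x))

             def activation(y: float) -> int:
                 # Heaviside-style step on the logistic output,
                 # undefined exactly at the threshold (see below).
                 if y > 0.5:
                     return 1   # expression valuated true
                 if y < 0.5:
                     return 0   # expression valuated false
                 raise ValueError("undefined at the threshold y = 0.5")

             def truth_valuation(x: float) -> int:
                 # Feed the logistic output into the activation to
                 # obtain the observable truth value.
                 return activation(logistic(x))

             print(truth_valuation(2.0))   # 1, read as true
             print(truth_valuation(-2.0))  # 0, read as false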

         This form of evolutionary logic can be applied to the LMs [here], which evolve into particular CMs and allow for the time-dependent control of phenomena such as logical switching. Going further, we can (possibly) chain together logical expressions via the entanglement of LMs, so that the evolution of one logical expression has a predictable effect on its entangled partner expression. This might give some insight into how distributed systems may co-evolve, at least logically.
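
         The LM/CM machinery itself is defined in the linked paper rather than here, so the sketch below only illustrates the logical-switching idea: a time-dependent input driving the logistic crosses the threshold and the expression's truth value flips. The input schedule is invented purely for illustration.

             import math

             def valuation(x: float):
                 # Binary truth valuation of a logistic-driven expression;
                 # None marks the undefined threshold point.
                 y = 1.0 / (1.0 + math.exp(-x))
                 return 1 if y > 0.5 else 0 if y < 0.5 else None

             def switching_history(x_of_t, times):
                 # Track the valuation as the (assumed) time-dependent
                 # input evolves; a change between entries is a switch.
                 return [(t, valuation(x_of_t(t))) for t in times]

             # Illustrative input that crosses the threshold at t = 5:
             # the expression switches from false (0) to true (1) there,
             # passing through the undefined point (None) at t = 5.
             print(switching_history(lambda t: t - 5.0, range(11)))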

         There is an interesting effect in turning the logistic function, as pictured, into a binary one that is undefined at 0. While this is done so that a logical expression, given equal probabilities, has an equal probability of being true or false, it has the consequence that there is some probability that a positively valuated logical expression is indistinguishable from a negatively valuated one, i.e. 0 = 1. This has to do with measurement precision and appears to give rise to some inherent logical uncertainty associated with any differentiable function such as the logistic function.
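
         One way to state this more precisely (a sketch; the precision symbol ε below is introduced here, not in the original):

             B(x) = \begin{cases} 1, & x > 0, \\ 0, & x < 0, \\ \text{undefined}, & x = 0, \end{cases}
             \qquad \sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma(0) = \tfrac{1}{2}.

         Any measurement of x carries some finite precision ε > 0, so every input with |x| < ε is indistinguishable from x = 0; within that band the valuations B = 0 and B = 1 cannot be told apart, and the probability of landing in the band is the inherent logical uncertainty referred to above.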
