Gradient descent of evolution: the potential for genetic teleportation
Evolution is the ultimate optimisation algorithm, primarily because of the strength of its cost function: real life itself, with an effectively infinite number of variables and parameters that ultimately coalesce into two desirable outcomes, survival and reproduction.
In this sense, we can see that to generate new-generation AI systems beyond what is inspired by biology, effective cost functions are key. While we recognise that no cost function will ever be as good as the simulation of real life, we should strive to make the best cost function possible, and hence increase the probability of generating the best next-generation technologies.
However, evolution has a great flaw: its iterative, step-wise algorithm necessitates a pathway to the optimised solution. This means that the best solution evolution can arrive at is limited by the need for a feasible pathway to that solution. If no such pathway exists, then even if a solution is 100x better than the current standard, evolution will never be able to reach it. This is the Achilles heel of evolution, which can only really be solved through the process of genetic teleportation. Genetic teleportation removes the need for a suitable pathway: by first theoretically deriving the ultimate solution, we can engineer the genes required to get to that solution. In other words, if there is no path across the chasm to the other side, we teleport there instead.
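To make the chasm concrete, here is a toy sketch in Python; the one-dimensional fitness landscape f, the starting point, and the step size are all hypothetical choices for illustration, not a model of any real genome. Plain step-wise descent gets trapped in a shallow local minimum, while "teleporting" to a theoretically derived optimum skips the missing pathway entirely.

```python
def f(x):
    # hypothetical 1-D fitness landscape (lower is fitter): a shallow
    # local minimum near x ~ +1 and a deeper global minimum near x ~ -1
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    # analytic derivative of f
    return 4 * x * (x**2 - 1) + 0.3

# step-wise descent: evolution needs a feasible downhill pathway
x = 1.2
for _ in range(200):
    x -= 0.05 * grad(x)

# the walker settles in the shallow basin on the positive side, even
# though a far better optimum exists at negative x with no downhill
# route to it from here
trapped = x

# "genetic teleportation": derive the better optimum theoretically and
# jump straight to it -- no pathway required
teleported = -1.0
```

The ridge between the two basins plays the role of the chasm: every small step out of the ditch raises the cost, so an iterative optimiser can never cross it.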
However, I would like to offer a counter-argument, and say that genetic teleportation can occur naturally, outside the realm of deliberate human engineering or any theological ideas about human creation.
The key here is the learning rate, a term borrowed from machine learning. In biology, we can see the learning rate as the rate of mutation, or the rate of variation in the population. What differs from machine learning, however, is that the learning rate in evolution can vary over time depending on environmental conditions. In particular, when there is a sudden change in the environment, resulting in a mismatch between organism and environment, there is a sudden increase in variation, with a spike in mutations. One example of this is the Cambrian explosion, where there was a sudden increase in variation and speciation to adapt to an environment that had changed so drastically. In a sense, genetic teleportation works on the principle that when we arrive at a local minimum in the optimisation of survival, there is a sudden, drastic increase in the learning rate / mutation rate, such that we skip the intermediate stages of a feature (which carry negative side effects without any selective advantage) and jump straight to the state of the feature that yields a massive selective advantage.
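This stall-then-spike idea can be sketched as a simple hybrid optimiser; the landscape is the same hypothetical one-dimensional f as above, and the stall threshold, spike width, and mutant batch size are arbitrary illustrative choices. When small-step descent flatlines in a local minimum, variation temporarily spikes, and selection keeps a mutant only if it has landed somewhere fitter.

```python
import random

def f(x):
    # hypothetical 1-D fitness landscape (lower is fitter): shallow
    # local minimum near x ~ +1, deeper global minimum near x ~ -1
    return (x**2 - 1)**2 + 0.3 * x

def grad(x):
    return 4 * x * (x**2 - 1) + 0.3

random.seed(0)           # reproducible run
x, lr = 1.2, 0.05        # start in the basin of the shallow minimum
stalled = False
for _ in range(500):
    g = grad(x)
    if abs(g) < 1e-3:
        stalled = True   # mismatch: descent has flatlined in a local minimum
    if stalled:
        # "Cambrian" spike: temporarily crank up variation and let
        # selection keep the fittest of a batch of wild mutants
        mutants = [x + random.gauss(0, 2.0) for _ in range(20)]
        best = min(mutants, key=f)
        if f(best) < f(x):
            x = best             # a mutant landed in a fitter basin
            stalled = False      # normal small-step descent resumes
    else:
        x -= lr * g              # ordinary low-mutation-rate descent
```

Because mutants are only accepted when strictly fitter, the intermediate ridge states (worse fitness, no selective advantage) are never occupied: the population jumps over them.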
Another important concept to note is that the power of random mutation and variation means that the location of a species on the evolutionary gradient descent should be represented not by a point but by a distribution. Imagine a ball covered in a blanket rolling down a hill. If the ball gets stuck in a ditch, that is no problem: if the blanket is larger than the ditch, the ball can teleport to any point under the blanket, and in this case it can teleport to the part of the blanket that lies outside the ditch. From this perspective, an increased learning rate / mutation rate simply means a larger blanket, and a larger blanket means we can escape a larger chasm of local minima.
The world truly runs on optimisation, and as we see in viruses, optimisation is not limited to living things: anything can be optimised. This is supported by the Free Energy Principle, which extends the concept of optimisation beyond living creatures, so that we can consider even a drop of oil in water to have achieved an optimisation of dynamic equilibrium, staying separated from the water.