"What if Consciousness is a vast and non-computable complex system of non-local GANs feedback loops?", Klee, 2019
After spending almost 2 years as a digital nomad, surfing and skateboarding the world, I have now completed the prophecy and arrived at my final destination. I am now a computer science researcher and software engineering manager at the Quantum Gravity Research Institute.
While working half of my time on our campus in Malibu, CA and the other half in our brand new space in Maui, HI, I get to focus on many projects. A very cool one is the simulation of our theory of reality, which is based on a discrete spacetime (just like frames in a movie, thought of as the simulation we live in) running on top of a topological quantum neural network, all without leaving out the emergent complex-system phenomenon that is consciousness. Intrigued? Stay tuned 😜 Meanwhile, enjoy the weekly references and tips I will be throwing at you 😘
Natural selection is the world optimizing for survival on Earth. Every life form on Earth is a solution generated by evolution's algorithm, which evolves a population of individuals over generations, optimizing for survival. Genetic algorithms are inspired by nature and evolution, which makes them very cool! Artificial neural networks ("NN") are also modeled on biology, so we can think of evolution as the best general-purpose learning algorithm we've seen so far, and of the brain as the best general-purpose problem solver we know so far.
This is one way to describe a genetic algorithm ("GA"):
- The algorithm begins by creating a random initial population.
- Select the fittest individuals (those with the best fitness scores) to be the parents of the next generation. Randomly select some of the less fit individuals to be parents as well, increasing the chance of finding a global optimum instead of getting stuck in a local one.
- Crossover the selected parents, creating new individuals. Each child has a chance of carrying a random mutation of its genes.
- Calculate the average population fitness. Rinse and repeat.
- When the average population fitness is ~0, stop evolving.
This is the pseudocode:
START
Generate the initial population
Compute fitness
REPEAT
    Selection
    Crossover
    Mutation
    Compute fitness
UNTIL population has converged
STOP
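The steps above can be sketched in a few lines of Python. This is a toy example (all problem details here are my own assumptions, not QGR's setup): each individual is a vector of numbers, and fitness is the sum of squares, so evolution drives the population toward all zeros and the average fitness toward ~0, matching the stopping rule.

```python
import random

GENOME_LEN, POP_SIZE, MUT_RATE = 8, 50, 0.1

def fitness(ind):
    # Lower is better; 0 means a perfect individual.
    return sum(x * x for x in ind)

def crossover(a, b):
    # Single-point crossover of two parents.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def mutate(ind):
    # Each gene has a small chance of a random perturbation.
    return [x + random.gauss(0, 0.5) if random.random() < MUT_RATE else x
            for x in ind]

def evolve(generations=200):
    # Generate the random initial population.
    pop = [[random.uniform(-5, 5) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness)
        # Selection: keep the fittest half, plus a few random less fit
        # individuals to help escape local optima.
        parents = ranked[:POP_SIZE // 2] + random.sample(ranked[POP_SIZE // 2:], 5)
        # Crossover + mutation to build the next generation.
        pop = [mutate(crossover(*random.sample(parents, 2)))
               for _ in range(POP_SIZE)]
        # Rinse and repeat until the average fitness is ~0
        # (or we run out of generations).
        if sum(map(fitness, pop)) / POP_SIZE < 1e-3:
            break
    return min(pop, key=fitness)

best = evolve()
```

Because mutation keeps injecting noise, the loop may hit the generation limit before the average fitness strictly reaches ~0; in practice people often shrink the mutation scale over time or keep the best individuals unchanged (elitism).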
GAs have been known for decades, but they are still hot as an optimization technique. Combined with neural networks, they can help find the best hyper-parameters by creating a population of many NNs and letting it evolve.
This is exactly what we are doing at QGR, I will be telling you more soon.
This w33k's References
- Are You Living in a Computer Simulation?, Nick Bostrom.
- AdS/CFT as a deep Boltzmann machine.
- The Fastest Way of Computing All Universes, Schmidhuber.
- The Free Will Theorem, Conway & Kochen.