Integrating Evolutionary Program Learning with General Cognition (e.g. porting MOSES to Atomspace)

MOSES (Meta-Optimizing Semantic Evolutionary Search, http://metacog.org/doc.html) is conceptually part of OpenCog, but currently exists as a separate codebase from the rest of OpenCog (though programs learned by MOSES can in some cases be imported into the OpenCog Atomspace for post-analysis)…

A current thread of R&D work involves porting MOSES into the Atomspace; early-stage docs and code are here:

Initially this involves a bunch of implementation work, but once the software bits of the port are done, the payoff is that it will then be “straightforward” to explore using MOSES together with PLN, the Pattern Miner, ECAN and other aspects of OpenCog AI…
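
To make the payoff a bit more concrete, here is a minimal, hypothetical sketch (not the actual port code) of what it means for a MOSES-learned program to live in the Atomspace: a Boolean combo tree such as and(or($x1 $x2) not($x3)) becomes a nest of ordinary logical links over PredicateNodes, i.e. a first-class Atom that PLN, the Pattern Miner and ECAN can operate on directly. It assumes the classic OpenCog Python bindings (opencog.atomspace); the feature names x1, x2, x3 are invented for illustration.

```python
# Hypothetical illustration: a MOSES-style Boolean program, e.g.
#   and(or($x1 $x2) not($x3)),
# represented directly in the Atomspace as nested logical links, so that
# PLN, the Pattern Miner, ECAN etc. can treat the learned program as data.
# Assumes the classic OpenCog Python bindings (opencog.atomspace);
# the feature names x1..x3 are made up for the example.

from opencog.atomspace import AtomSpace, types

atomspace = AtomSpace()

# Boolean input features become PredicateNodes.
x1 = atomspace.add_node(types.PredicateNode, "x1")
x2 = atomspace.add_node(types.PredicateNode, "x2")
x3 = atomspace.add_node(types.PredicateNode, "x3")

# The learned program and(or(x1 x2) not(x3)) as a nest of ordinary links,
# i.e. a first-class Atom rather than an external combo tree.
program = atomspace.add_link(types.AndLink, [
    atomspace.add_link(types.OrLink, [x1, x2]),
    atomspace.add_link(types.NotLink, [x3]),
])

print(program)
```

Once learned programs have this form, “using MOSES together with PLN” stops being an import/export exercise and becomes ordinary Atomspace processing.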

So the subtler goal here is to bring evolutionary program learning, probabilistic logical inference, attention allocation etc. together into one Big Bad Learning Process…

Conceptually, this is part of the push toward a single unified learning process for AGI, as described e.g. in “Toward a Formal Model of Cognitive Synergy” (arXiv:1703.04361).
