OpenCog
Open source artificial intelligence
Original author(s) | OpenCog Developers |
---|---|
Developer(s) | OpenCog Foundation |
Initial release | 21 January 2008[1] |
Written in | C++, Python, Scheme |
Platform | Linux |
Type | Artificial general intelligence |
License | GNU Affero General Public License |
Website | opencog |
OpenCog is a project that aims to build an open source artificial intelligence framework. OpenCog Prime is an architecture for robot and virtual embodied cognition that defines a set of interacting components designed to give rise to human-equivalent artificial general intelligence (AGI) as an emergent phenomenon of the whole system.[2] OpenCog Prime's design is primarily the work of Ben Goertzel, while the OpenCog framework is intended as a generic platform for broad-based AGI research. Research using OpenCog has been published in journals and presented at conferences and workshops, including the annual Conference on Artificial General Intelligence. OpenCog is released under the terms of the GNU Affero General Public License.
OpenCog is in use by more than 50 companies, including Huawei and Cisco.[3]
Origin
OpenCog was originally based on the 2008 release of the source code of the proprietary "Novamente Cognition Engine" (NCE) of Novamente LLC. The original NCE code is discussed in the PLN book (see Sources below). Ongoing development of OpenCog is supported by the Artificial General Intelligence Research Institute (AGIRI), the Google Summer of Code project, Hanson Robotics, SingularityNET and others.
Components
OpenCog consists of:
- A graph database, dubbed the AtomSpace, that holds "atoms" (that is, terms, atomic formulas, sentences and relationships) together with their "values" (valuations or interpretations, which can be thought of as per-atom key-value databases). An example of a value would be a truth value. Atoms are globally unique, immutable and indexed (searchable); values are fleeting and changeable. A minimal sketch of this arrangement is given after this list.
- A collection of pre-defined atoms, termed Atomese, used for generic knowledge representation, such as conceptual graphs and semantic networks, as well as to represent and store the rules (in the sense of term rewriting) needed to manipulate such graphs.
- A collection of pre-defined atoms that encode a type subsystem, including type constructors and function types. These are used to specify the types of variables, terms and expressions, and are used to specify the structure of generic graphs containing variables.
- A collection of pre-defined atoms that encode both functional and imperative programming styles. These include lambda abstraction, for binding free variables into bound variables, and beta reduction, for applying such abstractions to arguments; a toy illustration follows this list.
- A collection of pre-defined atoms that encode a satisfiability modulo theories solver, built in as a part of a generic graph query engine, for performing graph and hypergraph pattern matching (isomorphic subgraph discovery). This generalizes the idea of a structured query language (SQL) to the domain of generic graphical queries; it is an extended form of a graph query language. A toy query in this style is sketched after this list.
- A generic rule engine, including a forward chainer and a backward chainer, that is able to chain together rules. The rules are exactly the graph queries of the graph query subsystem, so the rule engine loosely resembles a query planner. It is designed to allow different kinds of inference engines and reasoning systems to be implemented, such as Bayesian inference or fuzzy logic, as well as practical tools such as constraint solvers or motion planners. A toy forward chainer is sketched after this list.
- An attention allocation subsystem based on economic theory, termed ECAN.[4] This subsystem is used to control the combinatorial explosion of search possibilities encountered during inference and chaining.
- An implementation of a probabilistic reasoning engine based on probabilistic logic networks (PLN). The current implementation uses the rule engine to chain together specific rules of logical inference (such as modus ponens), together with specific mathematical formulas assigning a probability and a confidence to each deduction. This subsystem can be thought of as a certain kind of proof assistant that works with a modified form of Bayesian inference. A simplified deduction formula is sketched after this list.
- A probabilistic genetic program evolver called Meta-Optimizing Semantic Evolutionary Search, or MOSES.[5] This is used to discover collections of short Atomese programs that accomplish tasks; these can be thought of as performing a kind of decision tree learning, resulting in a kind of decision forest, or rather a generalization thereof. A toy evolutionary search loop is sketched after this list.
- A natural language input system built on Link Grammar and partly inspired by both Meaning-Text Theory and Dick Hudson's Word Grammar, which encodes semantic and syntactic relations in Atomese.
- A natural language generation system.[6]
- Interfaces to Hanson Robotics robots, including emotion modelling[8] via OpenPsi.[7] This includes the Loving AI project, which is used to demonstrate meditation techniques.
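The AtomSpace design described above can be illustrated with a minimal Python sketch. This is not the OpenCog API; the class names, the "truth" key and the (strength, confidence) pair below are illustrative assumptions only. The point is that atoms are interned (globally unique and immutable), while values live in a separate, mutable per-atom key-value table.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Atom:
    """An immutable node or link, identified by its type, name and outgoing set."""
    type: str
    name: str = ""
    outgoing: tuple = ()          # child atoms, for link-style atoms


class ToyAtomSpace:
    """Interns atoms so each is stored exactly once, and keeps a separate,
    mutable key-value table of values (e.g. a truth value) per atom."""

    def __init__(self):
        self._atoms = {}          # atom -> atom (interning gives global uniqueness)
        self._values = {}         # atom -> {key: value}

    def add(self, atom):
        return self._atoms.setdefault(atom, atom)

    def set_value(self, atom, key, value):
        self._values.setdefault(self.add(atom), {})[key] = value

    def get_value(self, atom, key):
        return self._values.get(atom, {}).get(key)


space = ToyAtomSpace()
cat = space.add(Atom("Concept", "cat"))
animal = space.add(Atom("Concept", "animal"))
edge = space.add(Atom("Inheritance", outgoing=(cat, animal)))   # a semantic-network edge
space.set_value(edge, "truth", (0.9, 0.8))                      # (strength, confidence)
print(space.get_value(edge, "truth"))                           # (0.9, 0.8)
```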
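A lambda abstraction and its beta reduction can be sketched in the same toy style, with nested Python tuples standing in for graph terms; none of the names below ("Lambda", "Plus", "$x") are actual Atomese syntax, they are assumptions for illustration.

```python
def substitute(term, var, value):
    """Replace every occurrence of `var` in a nested-tuple term with `value`."""
    if term == var:
        return value
    if isinstance(term, tuple):
        return tuple(substitute(t, var, value) for t in term)
    return term


# ("Lambda", variable, body) stands in for a lambda-abstraction atom.
add_one = ("Lambda", "$x", ("Plus", "$x", 1))


def beta_reduce(lam, arg):
    """Apply a lambda term to an argument by substituting it for the bound variable."""
    _, var, body = lam
    return substitute(body, var, arg)


print(beta_reduce(add_one, 41))   # ('Plus', 41, 1)
```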
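The graph-query idea can be sketched as follows: a query is a pattern containing variables, and the answer is every binding of those variables under which the pattern matches stored relationships. This is a toy stand-in for OpenCog's pattern matcher, using invented tuple-shaped facts rather than real atoms.

```python
facts = {
    ("Inheritance", "cat", "animal"),
    ("Inheritance", "dog", "animal"),
    ("Inheritance", "animal", "living-thing"),
}


def match(pattern, fact, bindings):
    """Unify one pattern tuple against one fact tuple, extending the bindings."""
    new = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("$"):   # a variable
            if new.setdefault(p, f) != f:
                return None
        elif p != f:                                   # a mismatched constant
            return None
    return new


def query(pattern):
    """Return every variable binding under which the pattern matches some fact."""
    return [b for fact in facts if (b := match(pattern, fact, {})) is not None]


print(query(("Inheritance", "$x", "animal")))
# e.g. [{'$x': 'cat'}, {'$x': 'dog'}]  (order varies; facts is a set)
```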
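In the same toy setting, a forward chainer amounts to repeatedly applying a rewrite rule (here, transitivity of Inheritance) until no new facts appear; the actual rule engine generalizes this by drawing its rules from the graph-query subsystem. The sketch below is illustrative only.

```python
def forward_chain(facts):
    """Apply the transitivity rule repeatedly until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for (_, a, b) in list(facts):
            for (_, c, d) in list(facts):
                if b == c and ("Inheritance", a, d) not in facts:
                    facts.add(("Inheritance", a, d))
                    changed = True
    return facts


derived = forward_chain({("Inheritance", "cat", "animal"),
                         ("Inheritance", "animal", "living-thing")})
print(("Inheritance", "cat", "living-thing") in derived)   # True
```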
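As an example of the kind of formula PLN attaches to an inference step, the strength of a deduction from A→B and B→C to A→C can be estimated under an independence assumption roughly as in the sketch below. This simplification handles strengths only; the full PLN rules (see the PLN book in Sources) also propagate confidence values, and the exact formulas used by OpenCog may differ.

```python
def deduction_strength(sAB, sBC, sB, sC):
    """Estimate P(C|A) from P(B|A), P(C|B) and the term probabilities P(B), P(C),
    assuming independence (strengths only; confidences are not handled here)."""
    if sB >= 1.0:
        return sC
    return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB)


# "cats are mammals" and "mammals are animals" yield an estimate for "cats are animals".
print(round(deduction_strength(sAB=0.9, sBC=0.9, sB=0.2, sC=0.3), 3))   # 0.825
```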
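Finally, program evolution in the spirit of MOSES can be hinted at with a deliberately tiny mutate-and-keep-the-best loop over Boolean candidate programs. The operator set, target function and scoring below are invented for illustration and bear no relation to the actual MOSES algorithm; they only show the idea of scoring candidate programs and retaining better ones.

```python
import random

random.seed(0)
OPS = ("and", "or")


def evaluate(prog, a, b, c):
    """Interpret a tiny two-operator program of the shape (a OP1 b) OP2 c."""
    op1, op2 = prog
    x = (a and b) if op1 == "and" else (a or b)
    return (x and c) if op2 == "and" else (x or c)


def score(prog):
    """Fitness: agreement with the target function a AND (b OR c) on all inputs."""
    cases = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    return sum(evaluate(prog, a, b, c) == (a and (b or c)) for a, b, c in cases)


best = (random.choice(OPS), random.choice(OPS))
for _ in range(100):
    # mutate one operator of the current best and keep the mutant if it scores better
    i = random.randrange(2)
    mutant = tuple(random.choice(OPS) if j == i else op for j, op in enumerate(best))
    if score(mutant) > score(best):
        best = mutant

print(best, score(best))   # prints the best candidate program found and its score
```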
Organization and funding
In 2008, the Machine Intelligence Research Institute (MIRI), formerly the Singularity Institute for Artificial Intelligence (SIAI), sponsored several researchers and engineers. Many contributions from the open source community have been made since OpenCog's involvement in the Google Summer of Code in 2008 and 2009. MIRI no longer supports OpenCog.[9] OpenCog has received funding and support from several sources, including the Hong Kong government, Hong Kong Polytechnic University, the Jeffrey Epstein VI Foundation[10] and Hanson Robotics. The OpenCog project is currently affiliated with SingularityNET and Hanson Robotics.
Applications
As with other cognitive architectures, the main purpose is to create virtual humans, that is, three-dimensional avatar characters that mimic behaviors such as emotions, gestures and learning. For example, the software includes an emotion module precisely because humans have emotions. The underlying assumption is that artificial general intelligence can be realized by simulating human intelligence.[11]
The OpenCog project's own description lists further possible applications, in the direction of natural language processing and the simulation of a dog.[12]
Sources
- Hart, D.; Goertzel, B. (2008). "OpenCog: A Software Framework for Integrative Artificial General Intelligence". Proceedings of the First AGI Conference. http://www.agiri.org/OpenCog_AGI-08.pdf.
- Goertzel, B.; Iklé, M.; Goertzel, I. F.; Heljakka, A. (2009). Probabilistic Logic Networks: A Comprehensive Framework for Uncertain Inference. Springer. ISBN 978-0-387-76871-7.
References
- "OpenCog Release". 21 January 2008. http://bazaar.launchpad.net/~opencog-dev/opencog/trunk/changes/2.
- "OpenCog: Open-Source Artificial General Intelligence for Virtual Worlds". CyberTech News. 2009-03-06. http://www.cybertechnews.org/?p=915.
- Rogers, Stewart (2017-12-07). "SingularityNET talks collaborative AI as its token sale hits 400% oversubscription". VentureBeat. https://venturebeat.com/2017/12/07/singularitynet-talks-collaborative-ai-as-its-token-sale-hits-400-oversubscription/.
- "Economic Attention Allocation". OpenCog Wiki. https://wiki.opencog.org/w/ECAN.
- "MOSES". OpenCog Wiki. https://wiki.opencog.org/w/MOSES.
- "Natural Language Generation". OpenCog Wiki. https://wiki.opencog.org/w/Natural_language_generation.
- "OpenPsi". OpenCog Wiki. https://wiki.opencog.org/w/OpenPsi.
- "Emotion modeling". Hanson Robotics Wiki. http://wiki.hansonrobotics.com/w/Emotion_modeling.
- Goertzel, Ben (2010-10-29). "The Singularity Institute's Scary Idea (and Why I Don't Buy It)". The Multiverse According to Ben. http://multiverseaccordingtoben.blogspot.com/2010/10/singularity-institutes-scary-idea-and.html.
- "Even after his arrest, scientists were more than happy to take money from Jeffrey Epstein". Fast Company. 2019-07-11. https://www.fastcompany.com/90375335/jeffrey-epsteins-money-was-accepted-by-scientists-even-after-arrest.
- Burden, David; Savin-Baden, Maggi (24 January 2019). Virtual Humans: Today and Tomorrow. CRC Press. ISBN 978-1-351-36526-0. https://books.google.com/books?id=ySiFDwAAQBAJ&pg=PT303. Retrieved 25 August 2020.
- Goertzel, Ben; Pennachin, Cassio; Geisweiller, Nil (8 July 2014). Engineering General Intelligence, Part 1: A Path to Advanced AGI via Embodied Learning and Cognitive Synergy. Springer. pp. 23–. ISBN 978-94-6239-027-0. https://books.google.com/books?id=5Wm5BQAAQBAJ&pg=PA23.
External links
- OpenCog Wiki
- CogPrime: An Integrative Architecture for Embodied Artificial General Intelligence
- OpenCog: An Open Source Software Framework & A Design & Vision for Advanced AGI. Video on YouTube, given at Monash University, Australia, September 2011. Adam Ford.
- Video introduction to OpenCog by Ben Goertzel. Video on YouTube. Goertzel speaks on OpenCog in Tai Po, Hong Kong, December 2011. Adam Ford.
- Ben Goertzel - the future of AGI - OpenCog development in Asia. Video on YouTube. Adam Ford.