Software:OpenCog

From HandWiki
Short description: Project for an open source artificial intelligence framework

OpenCog
Open Source Artificial Intelligence

  • Original author(s): OpenCog Developers
  • Developer(s): OpenCog Foundation
  • Initial release: 21 January 2008[1]
  • Written in: C++, Python, Scheme
  • Platform: Linux
  • Type: Artificial general intelligence
  • License: GNU Affero General Public License
  • Website: opencog.org
OpenCog is a project that aims to build an open source artificial intelligence framework. OpenCog Prime is an architecture for robot and virtual embodied cognition that defines a set of interacting components designed to give rise to human-equivalent artificial general intelligence (AGI) as an emergent phenomenon of the whole system.[2] OpenCog Prime's design is primarily the work of Ben Goertzel while the OpenCog framework is intended as a generic framework for broad-based AGI research. Research utilizing OpenCog has been published in journals and presented at conferences and workshops including the annual Conference on Artificial General Intelligence. OpenCog is released under the terms of the GNU Affero General Public License.

OpenCog is in use by more than 50 companies, including Huawei and Cisco.[3]

Origin

OpenCog was originally based on the release in 2008 of the source code of the proprietary "Novamente Cognition Engine" (NCE) of Novamente LLC. The original NCE code is discussed in the PLN book (ref below). Ongoing development of OpenCog is supported by Artificial General Intelligence Research Institute (AGIRI), the Google Summer of Code project, Hanson Robotics, SingularityNET and others.

Components

OpenCog consists of:

  • A collection of pre-defined atoms, termed Atomese, used for generic knowledge representation, such as conceptual graphs and semantic networks, as well as to represent and store the rules (in the sense of term rewriting) needed to manipulate such graphs.
  • A collection of pre-defined atoms that encode a type subsystem, including type constructors and function types. These are used to specify the types of variables, terms and expressions, and are used to specify the structure of generic graphs containing variables.
  • A collection of pre-defined atoms that encode both functional and imperative programming styles. These include the lambda abstraction for binding free variables into bound variables, as well as for performing beta reduction.
  • A collection of pre-defined atoms that encode a satisfiability modulo theories solver, built in as a part of a generic graph query engine, for performing graph and hypergraph pattern matching (isomorphic subgraph discovery). This generalizes the idea of a structured query language (SQL) to the domain of generic graphical queries; it is an extended form of a graph query language.
  • An attention allocation subsystem based on economic theory, termed ECAN.[4] This subsystem is used to control the combinatorial explosion of search possibilities that are met during inference and chaining.
  • An implementation of a probabilistic reasoning engine based on probabilistic logic networks (PLN). The current implementation uses the rule engine to chain together specific rules of logical inference (such as modus ponens), together with some very specific mathematical formulas assigning a probability and a confidence to each deduction. This subsystem can be thought of as a certain kind of proof assistant that works with a modified form of Bayesian inference.
  • A probabilistic genetic program evolver called Meta-Optimizing Semantic Evolutionary Search, or MOSES.[5] This is used to discover collections of short Atomese programs that accomplish tasks; these can be thought of as performing a kind of decision tree learning, resulting in a kind of decision forest, or rather, a generalization thereof.
  • A natural language input system based on Link Grammar, partly inspired by both Meaning-Text Theory and Dick Hudson's Word Grammar, which encodes semantic and syntactic relations in Atomese.
  • A natural language generation system.[6]
  • An implementation of Psi-Theory for handling emotional states, drives and urges, dubbed OpenPsi.[7]
  • Interfaces to Hanson Robotics robots, including emotion modelling[8] via OpenPsi. This includes the Loving AI project, used to demonstrate meditation techniques.
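The Atomese knowledge representation and the graph query engine described above can be illustrated with a toy sketch. This is not the real OpenCog API: the `Node` and `Link` classes and the list-based atomspace below are simplified stand-ins assumed only for illustration, and the "query" is a crude filter rather than a genuine isomorphic-subgraph search.

```python
# Toy sketch of OpenCog-style hypergraph knowledge representation.
# Not the real OpenCog API; simplified stand-ins for illustration only.

class Node:
    """A typed, named vertex, e.g. (ConceptNode "cat")."""
    def __init__(self, type_, name):
        self.type, self.name = type_, name
    def __repr__(self):
        return f'({self.type} "{self.name}")'

class Link:
    """A typed hyperedge connecting other atoms."""
    def __init__(self, type_, *outgoing):
        self.type, self.outgoing = type_, outgoing
    def __repr__(self):
        return f'({self.type} ' + " ".join(map(repr, self.outgoing)) + ")"

# A tiny semantic network: "cats are mammals", "mammals are animals".
cat = Node("ConceptNode", "cat")
mammal = Node("ConceptNode", "mammal")
animal = Node("ConceptNode", "animal")
atomspace = [
    Link("InheritanceLink", cat, mammal),
    Link("InheritanceLink", mammal, animal),
]

# A crude pattern query: find every X with (InheritanceLink X mammal),
# echoing what the real pattern matcher does over the whole hypergraph.
matches = [l.outgoing[0] for l in atomspace
           if l.type == "InheritanceLink" and l.outgoing[1] is mammal]
print(matches)  # [(ConceptNode "cat")]
```

In the real system, queries are themselves atoms (graphs containing variables) stored in the atomspace, which is what lets the rule engine treat rewrite rules as data.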
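The PLN reasoning engine mentioned above chains inference rules whose conclusions carry probabilities. As a hedged illustration, the sketch below uses the simplified independence-based form of the PLN deduction rule; the actual engine uses more refined strength and confidence formulas, so treat this as a sketch of the idea, not the implementation.

```python
# Simplified independence-based deduction in the style of PLN:
# given P(B|A), P(C|B) and the priors P(B), P(C), estimate P(C|A).

def deduction_strength(sAB, sBC, sB, sC):
    """Estimate P(C|A) assuming independence outside the A->B->C chain."""
    if sB >= 1.0:          # degenerate prior: everything is a B
        return sC
    return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB)

# "cats are mammals" (strength 0.9), "mammals are animals" (0.95),
# with illustrative priors P(mammal) = 0.1 and P(animal) = 0.2:
s = deduction_strength(0.9, 0.95, 0.1, 0.2)
print(round(s, 3))  # 0.867
```

The first term covers the case where A is in fact a B; the second estimates how often A reaches C without going through B, which is why the priors enter the formula.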
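MOSES, described above, searches the space of short programs for one that accomplishes a task. A minimal sketch of that idea is brute-force enumeration over a handful of candidate boolean programs scored against a truth table; real MOSES instead evolves normalized program trees with semantic-aware operators, so the candidates and names below are purely illustrative.

```python
# Minimal sketch of program search in the spirit of MOSES: score a few
# tiny boolean "programs" against a target truth table and keep the best.

# Target behavior to learn: XOR over two boolean inputs.
data = [((a, b), a ^ b) for a in (0, 1) for b in (0, 1)]

# Candidate programs (illustrative, not real MOSES program trees).
candidates = {
    "a and b": lambda a, b: a & b,
    "a or b": lambda a, b: a | b,
    "(a or b) and not (a and b)": lambda a, b: (a | b) & (1 - (a & b)),
}

def score(prog):
    """Number of truth-table rows the program gets right."""
    return sum(prog(*x) == y for x, y in data)

best = max(candidates, key=lambda name: score(candidates[name]))
print(best)  # (a or b) and not (a and b)
```

The winning candidate reproduces all four rows of the XOR table; in OpenCog the discovered programs are emitted as Atomese, which is what lets the rest of the system reason over them like any other knowledge.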

Organization and funding

In 2008, the Machine Intelligence Research Institute (MIRI), formerly known as the Singularity Institute for Artificial Intelligence (SIAI), sponsored several researchers and engineers working on OpenCog. Many contributions from the open source community have been made since OpenCog's participation in the Google Summer of Code in 2008 and 2009. MIRI no longer supports OpenCog.[9] OpenCog has received funding and support from several sources, including the Hong Kong government, Hong Kong Polytechnic University, the Jeffrey Epstein VI Foundation[10] and Hanson Robotics. The OpenCog project is currently affiliated with SingularityNET and Hanson Robotics.

Applications

Like other cognitive architectures, OpenCog's main application is the creation of virtual humans: three-dimensional avatar characters intended to mimic human behaviors such as emotion, gesture and learning. The emotion module, for example, exists because the characters are meant to model human emotional behavior; the underlying premise is that artificial general intelligence can be realized by simulating human intelligence.[11]

The OpenCog project's self-description lists further possible applications in the direction of natural language processing and the simulation of a dog.[12]

References

  1. "OpenCog Release". 21 January 2008. http://bazaar.launchpad.net/~opencog-dev/opencog/trunk/changes/2. 
  2. "OpenCog: Open-Source Artificial General Intelligence for Virtual Worlds | CyberTech News". 2009-03-06. http://www.cybertechnews.org/?p=915. 
  3. Rogers, Stewart (2017-12-07). "SingularityNET talks collaborative AI as its token sale hits 400% oversubscription". VentureBeat. https://venturebeat.com/2017/12/07/singularitynet-talks-collaborative-ai-as-its-token-sale-hits-400-oversubscription/. 
  4. "Economic Attention Allocation". https://wiki.opencog.org/w/ECAN. 
  5. "MOSES". https://wiki.opencog.org/w/MOSES. 
  6. "Natural Language Generation". https://wiki.opencog.org/w/Natural_language_generation. 
  7. "OpenPsi". https://wiki.opencog.org/w/OpenPsi. 
  8. "Emotion modeling - Hanson Robotics Wiki". http://wiki.hansonrobotics.com/w/Emotion_modeling. 
  9. Ben Goertzel (2010-10-29). "The Singularity Institute's Scary Idea (and Why I Don't Buy It)". The Multiverse According to Ben. http://multiverseaccordingtoben.blogspot.com/2010/10/singularity-institutes-scary-idea-and.html. 
  10. "Even after his arrest, scientists were more than happy to take money from Jeffrey Epstein". Fast Company. Jul 11, 2019. https://www.fastcompany.com/90375335/jeffrey-epsteins-money-was-accepted-by-scientists-even-after-arrest. 
  11. David Burden; Maggi Savin-Baden (24 January 2019). Virtual Humans: Today and Tomorrow. CRC Press. ISBN 978-1-351-36526-0. https://books.google.com/books?id=ySiFDwAAQBAJ&pg=PT303. Retrieved 25 August 2020. 
  12. Ben Goertzel; Cassio Pennachin; Nil Geisweiller (8 July 2014). Engineering General Intelligence, Part 1: A Path to Advanced AGI via Embodied Learning and Cognitive Synergy. Springer. pp. 23–. ISBN 978-94-6239-027-0. https://books.google.com/books?id=5Wm5BQAAQBAJ&pg=PA23. 
