Posted over 16 years ago
The decision process for the 2009 GSoC projects has been completed. You can read Ben’s announcement on the opencog-soc Google group.
The accepted projects are:
Joel Lehman - Extending MOSES to evolve Recurrent Neural Networks
David Kilgore - Python Interfaces For OpenCog Framework API
Ruiting Lian - Natural Language Generation using RelEx and the Link Parser
Rui Liu - Application [...]
|
Posted over 16 years ago
As a kid, and even in the first few years of University, I used to have trouble understanding why things needed to be explained in detail. Essays were difficult because I’d take the point I was trying to make and think of it like a logic problem:
This interesting fact and this analysis, thus this is [...]
|
Posted over 16 years ago
We are happy to announce that the SIAI has been selected again this year to participate in the Google Summer of Code program as a mentoring organization. GSoC is an annual program that awards successful student contributors a 4500 USD summer stipend to work on open source and free software projects for three months. Around [...]
|
Posted over 16 years ago
A bit of corpus linguistics is performed to examine the mutual information distribution of word pairs.
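As a rough illustration of the idea (not the post's actual pipeline), the sketch below estimates pointwise mutual information for adjacent word pairs from raw counts; the tokenisation and the toy sentence are assumptions made for the example.

```python
# Hypothetical sketch: pointwise mutual information (PMI) of adjacent word pairs.
# PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) ), estimated from raw counts.
import math
from collections import Counter

def pmi_of_pairs(tokens):
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni = sum(unigrams.values())
    n_bi = sum(bigrams.values())
    pmi = {}
    for (x, y), count in bigrams.items():
        p_xy = count / n_bi
        p_x = unigrams[x] / n_uni
        p_y = unigrams[y] / n_uni
        pmi[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return pmi

if __name__ == "__main__":
    text = "the cat sat on the mat and the cat slept"
    scores = pmi_of_pairs(text.split())
    for pair, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(pair, round(score, 3))
```

Plotting the resulting scores over a large corpus gives the kind of mutual information distribution the post examines.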
|
Posted over 16 years ago
I’m currently at the tail end of KiwiFoo, a version of O’Reilly’s Foocamp in New Zealand. I hosted a talk about OpenCog which, as inevitably happens, turned into some interesting philosophical discussions about learning vs. memory.
I was also interviewed for the TVNZ7 show Media7 about OpenCog, so hopefully it will show up on national TV [...]
|
Posted over 16 years ago
Jared Wigmore has just finished implementing a prototype connection to the Ubigraph dynamic visualisation tool. It’s really neat! Currently available only in his personal bzr branch ( lp:~jared-wigmore/opencog/misc ), but it should be pushed to staging eventually. This follows on from the visualisation stuff I did for Ben’s presentation at the Singularity Summit, demonstrating first [...]
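For readers unfamiliar with Ubigraph: it is driven over XML-RPC, so a connection of this sort could, in outline, look like the sketch below. This is a generic illustration of talking to a local Ubigraph server from Python, not Jared's actual bridge; it assumes Ubigraph's standard methods (new_vertex, new_edge, set_vertex_attribute) on the default port, and the label values are made up.

```python
# Hedged sketch (not the OpenCog bridge): driving a local Ubigraph server over XML-RPC.
# Assumes the ubigraph_server binary is running on its default port (20738).
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://127.0.0.1:20738/RPC2")
G = server.ubigraph

G.clear()                                    # start from an empty scene
a = G.new_vertex()                           # vertices could stand for Atoms
b = G.new_vertex()
G.set_vertex_attribute(a, "label", "ConceptNode:cat")
G.set_vertex_attribute(b, "label", "ConceptNode:animal")
e = G.new_edge(a, b)                         # an edge for, say, an InheritanceLink
G.set_edge_attribute(e, "arrow", "true")
```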
|
Posted over 16 years ago
I’ve recently been tinkering with a mechanism for determining word senses based on their grammatical usage. This has me pretty excited because, so far, it seems to be reasonably accurate (i.e. not terrible) and lightning-fast. I’m doing this with some heavy statistical NLP work, computing statistical correlations between word senses and syntax — specifically, [...]
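To make the general idea concrete (this is a generic illustration, not the author's actual method or data structures), one could correlate senses with syntactic contexts by counting sense/context co-occurrences in a sense-tagged corpus and then scoring candidate senses against the contexts observed around a new occurrence. The class names, context strings, and smoothing choice below are all assumptions for the example.

```python
# Generic illustration: score word senses by how strongly they correlate with
# observed syntactic contexts, using smoothed log-probabilities estimated from
# a sense-tagged training corpus.
import math
from collections import Counter, defaultdict

class SenseScorer:
    def __init__(self):
        self.sense_counts = Counter()                 # count(sense)
        self.pair_counts = defaultdict(Counter)       # count(sense, context)

    def observe(self, sense, contexts):
        """Record one sense-tagged occurrence with its syntactic contexts."""
        self.sense_counts[sense] += 1
        for ctx in contexts:
            self.pair_counts[sense][ctx] += 1

    def score(self, sense, contexts):
        """Sum of log p(context | sense), with add-one smoothing."""
        total = sum(self.pair_counts[sense].values()) or 1
        vocab = len({c for ctr in self.pair_counts.values() for c in ctr})
        return sum(
            math.log((self.pair_counts[sense][c] + 1) / (total + vocab + 1))
            for c in contexts
        )

    def best_sense(self, senses, contexts):
        return max(senses, key=lambda s: self.score(s, contexts))

scorer = SenseScorer()
scorer.observe("bank/finance", ["subject-of:lend", "modified-by:central"])
scorer.observe("bank/river", ["modified-by:muddy", "object-of:erode"])
print(scorer.best_sense(["bank/finance", "bank/river"],
                        ["modified-by:central"]))    # -> bank/finance
```

The "contexts" could be anything syntactic: the word's part of speech, its grammatical head, or a full link-grammar disjunct.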
|
Posted over 16 years ago by Cat Allman, Open Source Team
By Ben Goertzel, PhD, Director of Research, SIAI

This summer OpenCog was chosen by Google to participate in the Google Summer of Code™ program: Google funded 11 students from around the world to work under the supervision of experienced mentors associated with the OpenCog project and the associated OpenBiomind project.

OpenCog is a large AI software project with hugely ambitious goals (you can't get much more ambitious than "creating powerful AI at the human level and beyond") and a lot of "moving parts" -- and the most successful OpenCog GSoC projects seemed to be the ones that split off "summer-sized chunks" from the whole project, which were meaningful and important in themselves and yet also formed part of the larger OpenCog endeavor, moving toward greater and greater general intelligence.

Many of the GSoC projects were outstanding, but perhaps the most dramatically successful (in my own personal view) was Filip Maric's project (mentored by Predrag Janicic), which pioneered an entirely new approach to natural language parsing technology. The core parsing algorithm of the link parser, a popular open-source English parser used within OpenCog's RelEx language processing subsystem, was replaced with a novel parsing algorithm based on a Boolean satisfiability solver. The good news is that it actually works: it finds the best parses of a sentence faster than the old, standard parsing algorithm and, most importantly, it provides excellent avenues for future integration of NL parsing with semantic analysis and other aspects of language-utilizing AI systems. This work was very successful but needs a couple more months of effort to be fully wrapped up; Filip, after a brief break, has recently resumed working on it and will continue throughout November and December.

Cesar Maracondes, working with Joel Pitt, made a lot of progress on porting the code of the Probabilistic Logic Networks (PLN) probabilistic reasoning system from a proprietary codebase to the open-source OpenCog codebase, resolving numerous software design issues along the way. This work was very important, as PLN is a key aspect of OpenCog's long-term AI plans. Along the way, Cesar also helped port OpenCog to MacOS.

There were also two extremely successful projects involving OpenBiomind, a sister project to OpenCog:

* Bhavesh Sanghvi (working with Murilo Queiroz) designed and implemented a Java user interface to the OpenBiomind bioinformatics toolkit, an important step which should greatly increase the appeal of the toolkit within the biological community (not all biologists are willing to use command-line tools, no matter how powerful).

* Paul Cao (working with Lucio Coelho) implemented a new machine learning technique within OpenBiomind, in which recursive feature selection is combined with OpenBiomind's novel "model ensemble based important features analysis." The empirical results on real bio datasets seem good. This is novel scientific research embodied in working open-source code, and it should be a real asset to scientists doing biological data analysis.

And the list goes on and on: in this short post I can't come close to doing justice to all that was done, but please see our site for more details!

All in all, we are very grateful to Google for creating the GSoC program and including us in it. Thanks to Google, and most of all to the students and mentors involved.
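To give a flavour of the "parsing as Boolean satisfiability" idea mentioned above (a toy illustration only, not Filip's actual encoding): linkage choices become Boolean variables, parser constraints become clauses, and a SAT solver searches for a consistent assignment. The variables and constraints below are invented for the example, and a tiny brute-force search stands in for a real SAT solver.

```python
# Toy flavour of parsing-as-SAT: each word must pick exactly one linkage choice,
# some choices are mutually incompatible, and we look for a satisfying assignment.
from itertools import product

# Boolean variables (1-indexed, DIMACS-style):
#   1 = "saw" uses its verb linkage
#   2 = "saw" uses its noun linkage
#   3 = "the" links leftward to "saw"
cnf = [
    [1, 2],      # "saw" picks at least one linkage
    [-1, -2],    # ...and at most one
    [-3, 2],     # a determiner link into "saw" requires the noun linkage
    [3],         # the determiner must link somewhere
]

def satisfy(cnf, n_vars):
    """Return the first assignment (var -> bool) satisfying all clauses."""
    for bits in product([False, True], repeat=n_vars):
        assign = dict(enumerate(bits, start=1))
        if all(any(assign[abs(lit)] == (lit > 0) for lit in clause)
               for clause in cnf):
            return assign
    return None

print(satisfy(cnf, 3))   # -> {1: False, 2: True, 3: True}
```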
|
Posted almost 17 years ago
Joel Pitt has done some experiments testing first-order PLN inference in OpenCog, on some very simple data.
These experiments don’t use the indefinite probability formulas but rather the good old-fashioned SimpleTruthValue PLN formulas.
What they involve is using PLN to extrapolate indirect word associations from direct word associations mined from text (by some statistical text mining [...]
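For readers who haven't seen the SimpleTruthValue formulas, the sketch below shows the strength part of the standard independence-based first-order deduction formula applied to the word-association case. It is an illustration of the formula, not the code Joel ran; the confidence/count handling is omitted and the toy numbers are made up.

```python
# Rough sketch of first-order PLN deduction on simple strength values.
# Given A->B and B->C, estimate A->C under an independence assumption:
#   s_AC = s_AB * s_BC + (1 - s_AB) * (s_C - s_B * s_BC) / (1 - s_B)

def deduction_strength(s_ab, s_bc, s_b, s_c):
    """P(C|A) from P(B|A), P(C|B), P(B), P(C), assuming independence."""
    if s_b >= 1.0:
        return s_c
    return s_ab * s_bc + (1.0 - s_ab) * (s_c - s_b * s_bc) / (1.0 - s_b)

# Toy use: "cat" is associated with "pet" (0.5), "pet" with "dog" (0.5);
# the term probabilities of "pet" and "dog" are both 0.2.
s_ac = deduction_strength(s_ab=0.5, s_bc=0.5, s_b=0.2, s_c=0.2)
print(s_ac)   # -> 0.3125, an indirect "cat" -> "dog" association strength
```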
|
Posted almost 17 years ago
A few weeks back Ben announced he’d be running IRC tutorial sessions on OpenCogPrime. Last night was the second tutorial, on the topic of knowledge representation - introducing people to the basic concepts of the AtomSpace, such as Atoms, Nodes, and Links, and how the various types of each represent things in OCP. If [...]
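As a stand-alone illustration of that Atom/Node/Link vocabulary (this is not OpenCog's actual AtomSpace API, just a toy model of the representation style), one might sketch it like this: Nodes are Atoms identified by a name, Links are Atoms whose outgoing set points at other Atoms, and the AtomSpace is the container that holds them.

```python
# Toy sketch of the AtomSpace representation style described above; NOT the
# real OpenCog API, just a minimal stand-alone model of Nodes and Links.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Atom:
    type: str                                  # e.g. "ConceptNode", "InheritanceLink"

@dataclass(frozen=True)
class Node(Atom):
    name: str = ""                             # Nodes are Atoms identified by a name

@dataclass(frozen=True)
class Link(Atom):
    outgoing: Tuple[Atom, ...] = ()            # Links are Atoms pointing at other Atoms

class AtomSpace:
    """A minimal container that deduplicates Atoms, hypergraph-style."""
    def __init__(self):
        self._atoms = set()

    def add(self, atom):
        self._atoms.add(atom)
        return atom

# "cat" inherits from "animal", represented as a Link over two Nodes.
space = AtomSpace()
cat = space.add(Node("ConceptNode", "cat"))
animal = space.add(Node("ConceptNode", "animal"))
space.add(Link("InheritanceLink", (cat, animal)))
```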
|