Case File
efta-efta00830443 | DOJ Data Set 9 | Other

From: Ben Goertzel

Date: Unknown
Source: DOJ Data Set 9
Reference: efta-efta00830443
Pages: 3
Persons: 0
Integrity: No Hash Available


Extracted Text (OCR)

EFTA Disclosure
Text extracted via OCR from the original document. May contain errors from the scanning process.
From: Ben Goertzel
To: Jeffrey Epstein <[email protected]>
Subject: Re: Brainfood...
Date: Thu, 31 Mar 2016 09:39:28 +0000
Attachments: OpenCog_Plan_Slides_v4.pptx; Budget_and_Teams.xlsx

And a bit more project-planning stuff for our envisioned 3-year OpenCog initiative (courtesy of Cassio)

On Tue, Mar 29, 2016 at 11:37 PM, Ben Goertzel czMIII I> wrote:

> Hi Jeffrey,
>
> So, here is some food for thought and discussion...
>
> We are working on getting OpenCog to control the Hanson Robotics robot heads for embodied (or at least "emheadied"!) natural language dialogue. This should be working in Summer 2016, maybe sooner. We hope to demonstrate this to some extent at AGI-16, which will be in New York this July. To give you some flavor of this work I attach our Hanson Robotics / OpenCog software plan for the next few months (this is work funded by Hanson Robotics).
>
> In parallel with that, we have been finally getting some examples of cognitive synergy to work in OpenCog.
>
> This paper of ours from AGI-14 ended with a bit of a whimper
> http://agi-conf.org/2014/wp-content/uploads/2014/08/harrigan-guiding-agi14.pdf
> however we have recently actually gotten this to work. I.e., we have gotten ECAN (neural-net-type activation spreading) to successfully prune the PLN inference tree. If all goes well we will submit a paper on this to AGI-16 next week. This shows attention allocation and inference working together. Yay!
>
> This paper of ours from AGI-15 shows PLN being used to estimate the truth value of a pattern recognized in data by MOSES (our evolutionary program learning algorithm)
> http://agi-conf.org/2015/wp-content/uploads/2015/07/agi15_goertzel_speculative.pdf
> It takes about 24 hours to run in the current OpenCog version. Obviously this is not OK but it's a start.
>
> Next step is to put these two things together, and use ECAN to help PLN inference pruning on nontrivial examples like the one from the above AGI-15 paper.
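[Editor's note] The ECAN-pruning idea mentioned above can be sketched in a few lines. This is not OpenCog code; the function names, the STI ("short-term importance") dictionary, and the toy premise graph are all invented for illustration. The assumed mechanism: ECAN spreads activation over atoms, and the PLN backward chainer then expands only the highest-importance premises at each node instead of the full branching factor.

```python
import heapq

def spread_activation(sti, links, decay=0.5):
    """One round of ECAN-style activation spreading: each link (src, dst)
    passes a decayed fraction of src's importance to dst."""
    new_sti = dict(sti)
    for src, dst in links:
        new_sti[dst] = new_sti.get(dst, 0.0) + decay * sti.get(src, 0.0)
    return new_sti

def pruned_expand(goal, premises_of, sti, beam=2):
    """Expand a backward-inference tree, keeping only the `beam`
    highest-STI candidate premises at each node (the rest are pruned)."""
    tree = {goal: []}
    frontier = [goal]
    while frontier:
        node = frontier.pop()
        candidates = premises_of.get(node, [])
        kept = heapq.nlargest(beam, candidates, key=lambda a: sti.get(a, 0.0))
        tree[node] = kept
        frontier.extend(p for p in kept if p not in tree)
    return tree

# Toy example: goal G has three candidate premises; with beam=2 the
# low-importance premise B never enters the tree.
premises = {"G": ["A", "B", "C"]}
sti = {"A": 0.9, "B": 0.1, "C": 0.5}
tree = pruned_expand("G", premises, sti, beam=2)  # G expands to A and C; B is pruned
```

In the real system the attention values would themselves be updated by spreading as inference proceeds; here the two steps are shown separately to keep the sketch minimal.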
> The attached PPT on some of our biology work contains two parts --
>
> first part is "straightforward" AI-wise but has quite interesting results biology-wise: i.e. we used MOSES to figure out genes that distinguish supercentenarians from healthy 80 year olds
>
> second part is more interesting AI-wise ... using PLN to generalize from MOSES models
>
> Between these papers we see the first concrete examples of "cognitive synergy" in OpenCog. They are a bit "toy" but they are also real and actually work in the codebase now ;)
>
> Finally, the other PPT attached shows the plan that Jim Rutt and Cassio and I are aiming to carry out over the next 3 years. I am not sure if you intersected with Jim at SFI or not...
> http://www.santafe.edu/about/people/profile/Jim%20Rutt
>
> We are aiming to raise $6M for 3 years of work, but I imagine we could do it for $4M over 3 years if we made copious use of inexpensive overseas folks. Jim will put in at least $500K of the funds himself (his foundation gives out $500K/year, and he is willing to put at least 1/3 of that into OpenCog, maybe more).
>
> The goal of this plan is to make OpenCog-based embodied NLP dialogue work extremely well and hold a conversation demonstrating obvious understanding of itself and its environment and situation. At the core this will require getting cognitive synergy between ECAN, MOSES and PLN to work in the context of visual/auditory perception and NLP dialogue. This will be both a huge AGI advance and an amazing demo that will open pretty much any door we want.
>
> Jim Rutt was a good find, not only because he has been donating a bit of $$ (about $150K last year, none to me personally but he has funded Nil Geisweiller and Cassio), but because he brings a management and organization and business sensibility/experience that complements me well. And he knows enough science and CS and has put in enough time to thoroughly understand the OpenCog/CogPrime design...
> thx!
> Ben
>
> --
> Ben Goertzel, PhD
> http://goertzel.org

Ben Goertzel, PhD
http://goertzel.org

"I am Ubik. Before the universe was, I am. I made the suns. I made the worlds. I created the lives and the places they inhabit; I move them here, I put them there. They go as I say, then do as I tell them. I am the word and my name is never spoken, the name which no one knows. I am called Ubik, but that is not my name. I am. I shall always be." -- Ubik
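[Editor's note] The MOSES gene-classification work the email describes (evolving models that separate supercentenarians from healthy 80-year-olds) can be caricatured with a minimal sketch. MOSES actually evolves small program trees with far more machinery; this shows only the core loop (mutate a candidate model, keep it if fitter), using invented binary gene features and an AND-rule hypothesis space. All names and data here are hypothetical.

```python
import random

def accuracy(rule, samples):
    """Fraction of (features, label) pairs an AND-rule classifies correctly.
    The rule predicts True iff every gene in it is expressed (truthy)."""
    hits = sum(all(f[g] for g in rule) == label for f, label in samples)
    return hits / len(samples)

def hill_climb(genes, samples, steps=200, seed=0):
    """Stochastic hill-climbing over AND-rules: toggle one gene at a time,
    keep the change only if classification accuracy improves."""
    rng = random.Random(seed)
    best = frozenset()              # empty rule predicts True for everything
    best_fit = accuracy(best, samples)
    for _ in range(steps):
        g = rng.choice(genes)
        cand = best ^ {g}           # toggle one gene in/out of the rule
        fit = accuracy(cand, samples)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

# Toy data: label is fully determined by the (invented) gene g1.
genes = ["g1", "g2"]
samples = [({"g1": 1, "g2": 0}, True), ({"g1": 0, "g2": 1}, False),
           ({"g1": 1, "g2": 1}, True), ({"g1": 0, "g2": 0}, False)]
best, fit = hill_climb(genes, samples)  # recovers the rule {g1}
```

The "PLN generalizing from MOSES models" step the email mentions would then take such learned rules as premises for further probabilistic inference; that part is not sketched here.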

Technical Artifacts (5)


Email addresses, URLs, phone numbers, and other technical indicators extracted from this document.

URL: http://agi-conf.org/2014/wp-content/uploads/2014/08/harrigan-guiding-agi14.pdf
URL: http://agi-conf.org/2015/wp-content/uploads/2015/07/agi15_goertzel_speculative.pdf
URL: http://goertzel.org
URL: http://www.santafe.edu/about/people/profile/Jim%20Rutt

Related Documents (6)

Dept. of Justice | Correspondence | Mar 1, 2010

EFTA00700552 - Goertzel-Arel 'Robot Toddler' AGI Proposal for Epstein Foundation

DOJ-released document from Data Set 9 containing the draft research and development proposal prepared by AI researchers Ben Goertzel and Itamar Arel for the Jeffrey Epstein VI Foundation. The proposal outlines a $3 million project to develop a 'robotic AGI toddler' — an artificial general intelligence system 'with the rough general intelligence of a human 3-4 year old child, demonstrated via embodiment in virtual world characters and humanoid robots.' Arel was designated as primary investigator, contributing his DeSTIN (Deep SpatioTemporal Inference Network) facial recognition system developed at the University of Tennessee using graduate student labor. Goertzel, who received direct salary from the Epstein Foundation, proposed the project and suggested additional hundreds of thousands in funding for Arel's research assistants. This document is central to understanding Epstein's funding of cutting-edge AI research through academic intermediaries.

DOJ Data Set 10 | Correspondence | Unknown

EFTA Document EFTA01803291

DOJ Data Set 10 | Other | Unknown

EFTA01984902

DOJ Data Set 11 | Other | Unknown

EFTA02577674

DOJ Data Set 10 | Correspondence | Unknown

EFTA Document EFTA01818976

DOJ Data Set 9 | Other | Unknown

Subject: Re: SDNY News Clips Wednesday, July 31, 2019

From: To: Subject: Re: SDNY News Clips Wednesday, July 31, 2019
Date: Wed, 31 Jul 2019 23:27:22 +0000

Ha, really? In that case pretty sure I've seen the filing but will take a look. Thanks

Sent from my iPhone

On Jul 31, 2019, at 7:24 PM, ) < > wrote:

That article is a reference to a government filing from over a month ago (Spencer Kuvin seems especially interested in being quoted in belated but inflammatory fashion on these issues) — but in any event, the NDGA filing from then is attached.

From: Sent: Wednesday, July 31, 2019 17:14 To: Subject: FW: SDNY News Clips Wednesday, July 31, 2019

It looks like NDGa just filed something in the CVRA litigation — do you have a copy by any chance?

From: Sent: Wednesday, July 31, 2019 5:12 PM Cc: Subject: SDNY News Clips Wednesday, July 31, 2019

SDNY News Clips Wednesday, July 31, 2019 Contents Public Corruption. 2 Epstein. 2 Collins. 18 Securities and Commodities Fraud. 20 Stewart 20 Thompson. 22 Pinto-Thomaz. 24 Narco

