Case File
Date
Unknown
Source
DOJ Data Set 9
Reference
efta-efta00669052
Pages
2
Persons
0
Extracted Text (OCR)
Text extracted via OCR from the original document. May contain errors from the scanning process.
From: Kevin Slavin
To: Joscha Bach
Cc: Jeffrey Epstein, Kevin Slavin, takashi ikegami, Greg Borenstein
Subject: Re: MDF
Date: Wed, 23 Oct 2013 15:40:50 +0000
An experiment that I would like to see one day (and one that, as far as I know, no one has tried yet):
equip a subject with an augmented reality display, for instance Google Glass, and continuously feed a visual
depiction of auditory input into a corner of the display. The input should transform the result of a filtered
Fourier analysis of the sounds around the subject into regular colors and patterns that can easily be discerned
visually. At the same time, plug the ears of the subject (for instance, with noise canceling earplugs and white
noise). With a little training, subjects should be able to read typical patterns (for instance, many phonemes)
consciously from their sound overlay. But after a few weeks: Could a portion of the visual cortex adapt to the
statistical properties of the sound overlay so completely that the subject could literally perceive sounds via
their eyes? Could we see music? Could we make use of induced synesthesia to partially replace a lost
modality?
It's not exactly what you're proposing, but are you familiar with Neil Harbisson's work/life:
http://en.wikipedia.org/wiki/Neil_Harbisson
He's an artist, primarily -- and has been living this way for quite a while now, but I'm not aware of anyone
(including Neil himself) who has looked into how it has actually altered his perception...
Also I am just warming up to run a class next semester called -- for the moment -- Animal Superpowers (new
name TK). Fifteen students, each one picks an animal, goes deep into how they perceive the world, and then
builds the sensory apparatus to allow a human user to understand the world as that animal does.
It draws from an old Area/Code project that was never built, called "Ant City" -- in which individual players
were ants, physically situated in a real city, responding to digital pheromone trails. In the meantime the artist
Chris Woebken did a series (called Animal Superpowers) that approximates this. While I love Chris' work,
it's mostly about how the ant "sees" as opposed to how the ant perceives and understands the world (e.g.,
pheromones). I am interested in experimenting with human augmentation to provide (or augment?) these
additional forms of perception.
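
For concreteness, here is a toy model of the digital-pheromone mechanic described above -- a minimal sketch of my own, not Area/Code's actual design. The grid size, evaporation rate, and helper names (deposit, tick, sense) are all hypothetical:

import numpy as np

GRID = (50, 50)        # coarse city grid (assumed, not from the original project)
EVAPORATION = 0.95     # fraction of pheromone surviving each tick (assumed)

field = np.zeros(GRID)

def deposit(pos, amount=1.0):
    """A player ('ant') leaves pheromone at their current cell."""
    field[pos] += amount

def tick():
    """Trails fade each time step, so only reinforced paths persist."""
    global field
    field *= EVAPORATION

def sense(pos):
    """What a player perceives: total pheromone in the surrounding 3x3 cells."""
    r, c = pos
    return field[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].sum()

# Usage: one player walks a diagonal; the trail decays behind them.
for step in range(10):
    deposit((step, step))
    tick()
print(sense((9, 9)))   # strongest near the walker's most recent cells

The point of the evaporation step is stigmergy: unreinforced trails vanish, so persistent paths encode collective behavior rather than any single player's route.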
Prior to Chris' work, I had wanted to run such a class at ITP in 2005, but the students didn't have the hardcore
sensor background... I'm talking about it now with Joe Paradiso in the Responsive Environments group, about
connecting this to his class and the work they are doing (which includes working with a sensor-instrumented
cranberry bog here in Massachusetts).
If there are ways in which these kinds of experiments can fold into / draw from what we're discussing here, I'd
welcome it.
On Wed, Oct 23, 2013 at 10:41 AM, Joscha Bach wrote:
On 22.10.2013 at 16:01, Jeffrey Epstein wrote:
> I would add the possibility that each differentiated input has its own encrypted algorithm, and looking at it
> from too high an altitude provides little info about each one; i.e., optic nerve encryption is different than nasal
> receptors. Maybe even a one-time code that allows only the individual to access certain stored info.
Indeed! Each individual will form its own code, for each modality. On the other hand, these codes do not
simply diverge, but they are the result of the individual's adaptation to its own (changing, developing,
deteriorating) physiology. The nervous system is designed to extract structure based on the statistical
properties of the input, and to compensate for defects. For instance, replacing the fine-grained input provided
by the many receptors of the cochlea with a crude implant (today's models sample only a handful of
frequencies) will usually result in a subjective experience of continuous auditory perception; splicing the data
of a few pixels into the optic nerve of a blind person may allocate those pixels their correct positions within the
visual field. An interesting question: what are the limits of the plasticity of the sensory modalities? For
instance, could we switch modalities to some extent?
More than a hundred years ago, Stratton did a famous experiment in which he wore glasses that turned the world
upside down (using prisms). After a few days, his brain adapted and he would perceive everything as being
upright again.
An experiment that I would like to see one day (and one that, as far as I know, no one has tried yet):
equip a subject with an augmented reality display, for instance Google Glass, and continuously feed a visual
depiction of auditory input into a corner of the display. The input should transform the result of a filtered
Fourier analysis of the sounds around the subject into regular colors and patterns that can easily be discerned
visually. At the same time, plug the ears of the subject (for instance, with noise canceling earplugs and white
noise). With a little training, subjects should be able to read typical patterns (for instance, many phonemes)
consciously from their sound overlay. But after a few weeks: Could a portion of the visual cortex adapt to the
statistical properties of the sound overlay so completely that the subject could literally perceive sounds via
their eyes? Could we see music? Could we make use of induced synesthesia to partially replace a lost
modality?
Cheers,
Joscha
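
As a concrete reading of the overlay pipeline proposed above, here is a minimal sketch: it assumes a mono microphone feed and a generic RGB buffer blitted into a display corner, and is not tied to the actual Glass API. The sample rate, frame size, band count, and helper names (frame_to_bands, bands_to_patch) are all assumptions for illustration:

import numpy as np

SAMPLE_RATE = 16000    # assumed mono microphone rate
FRAME_SIZE = 512       # ~32 ms analysis window
N_BANDS = 16           # coarse frequency bands, easy to learn visually

def frame_to_bands(frame):
    """Filtered Fourier analysis: windowed FFT, magnitudes pooled into bands."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    return np.array([band.mean() for band in np.array_split(spectrum, N_BANDS)])

def bands_to_patch(bands, size=64):
    """Map band energies to regular colors and patterns: one vertical stripe
    per band, hue indexed by frequency, brightness by energy."""
    levels = bands / (bands.max() + 1e-9)          # normalize to 0..1
    patch = np.zeros((size, size, 3))
    stripe = size // N_BANDS
    for i, level in enumerate(levels):
        hue = i / N_BANDS                          # low freqs red, high freqs blue
        patch[:, i * stripe:(i + 1) * stripe] = level * np.array([1 - hue, 0.2, hue])
    return (patch * 255).astype(np.uint8)

# Usage: each microphone frame becomes a small image for the display corner;
# capture and rendering are left to the host AR system.
frame = np.random.randn(FRAME_SIZE)               # stand-in for a real mic frame
corner_image = bands_to_patch(frame_to_bands(frame))

The stripe layout is deliberately regular: the hypothesis in the email is that a stable, learnable statistical mapping (not a faithful spectrogram) is what would let the visual cortex adapt to the auditory structure.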