DS10: EFTA01621007 (.mov)
DOJ EFTA Dataset 10 (Seized Media)·Wednesday, January 1, 2025·25m
0:00
The question is that there is such a gap like that.
0:03
If you don't believe in supernatural phenomena, there has to be.
0:09
But there are a lot of gaps in our understanding as to what that might be.
0:14
So we're trying to work out a nice continuous...
0:19
It might take a trillion planets each with exactly the right conditions.
0:23
Well, so, you know, it could. If there's a bottom, if there's a hard step.
0:27
I suppose the hard step is making some ribosome that can replicate itself.
0:32
That, in principle, could be so rare that, exactly like you said, we could be unique.
0:36
But there are a lot of possible universes, so it could be any of those.
0:40
It could be. Or, you know, on the other hand, it could be that every step along the way is very easy.
0:46
And it's almost inevitable given a suitable planet.
0:49
In which case, you know, there might be hundreds of millions of planets with life.
0:54
What's your intuition at the moment?
0:57
So I used to refuse to answer that question, but...
1:01
Oh, sorry.
1:02
I'm sort of coming around to the idea that all the steps might be easy.
1:07
Because, and this is purely based on extrapolation,
1:11
that every time we look at a particular step that looks hard,
1:16
because we have no idea how to solve it...
1:18
There's a solution.
1:18
Eventually, there's a solution, and it ends up looking really easy.
1:23
And there haven't been any exceptions to that so far, so...
1:26
Okay.
1:27
If that continues.
1:29
Right.
1:32
And it was...
1:33
How have you been, first of all?
1:35
Great.
1:36
Good.
1:37
He's alive.
1:38
He almost died with something.
1:39
I heard.
1:41
But was it as unexpected, almost dying?
1:44
Yeah, it was a close one.
1:46
Really?
1:46
Well, they have a new pay order.
1:49
No, he's moving on.
1:49
Oh, that sounds...
1:51
Yeah, well, they were washing it for acidity,
1:54
because it was...
1:55
Right.
1:55
And didn't expect to even think of anything,
2:00
so this spring, he was having an appointment in the spring,
2:02
at Kaua.
2:03
I think Norseks would be nice, but they insisted on that.
2:07
Would you like some coffee?
2:09
Would you like some coffee?
2:10
It just busted one night?
2:17
No.
2:17
Yeah.
2:18
No.
2:19
What's that?
2:20
It busted one night?
2:21
It blew.
2:22
I mean, so it wasn't that you felt a little sick over a period of time?
2:25
No, no.
2:25
It must have been leaking, because it was two days.
2:28
Yeah, but one night...
2:30
But they fixed it by drilling a little hole here
2:38
and putting this whole thing in, which is...
2:40
Really?
2:41
It's amazing.
2:42
Yeah, there was a stent, and that was not open.
2:45
I think that was not open.
2:46
So there's no stock in all of this.
2:49
You have new internal wiring done through the exit block?
2:54
Right.
2:55
Well, they just put up another pipe inside the old...
2:58
Okay.
2:58
...ballooned out for it.
3:01
That's running like a new conduit?
3:03
A new conduit in the old conduit?
3:05
Yeah.
3:05
Okay.
3:06
It's really like it inflates on the inside.
3:08
Thank you.
3:09
Wow.
3:10
Some coffee, can I offer you?
3:12
Coffee?
3:13
Some coffee?
3:14
Oh.
3:15
I don't know if you're making tea, but...
3:17
If you have a tea, otherwise...
3:19
So the thing is to be near a place like MGH, right?
3:22
Well, that's why I think the...
3:24
That's where I work.
3:26
The original surgeon didn't want to go to Dubai.
3:27
He's told me it's a great place to work.
3:31
Yes.
3:32
If something bad happens, I guess it's good to be able to...
3:34
It's good to be able to...
3:35
Yeah, it's really good.
3:37
Yeah.
3:39
What hospital do you have on the list?
3:43
Well, it's really...
3:43
It's an...
3:44
Well, in New York, it's easy.
3:46
I have my selection.
3:48
But being in the...
3:48
Is Mount Sinai still good?
3:49
That's the one my father was in.
3:50
Yeah, mostly...
3:51
There's lots of good hospitals, as long as there's no traffic.
3:55
If it's...
3:56
If they're lighting the Christmas tree, you're dead no matter what happens.
3:59
Because you can't get anywhere, unless the hospital's in your house.
4:02
But since I live in the islands, the Virgin Islands, it was tricky because I had the flu
4:06
and I fainted.
4:07
And I said, how am I going to get...
4:08
There's no hospital for hundreds of miles.
4:12
There's a hospital there.
4:13
Just in name only.
4:14
What about a helicopter lift?
4:16
But to where?
4:17
I can...
4:18
St. Thomas is...
4:19
Yes, what's the range of a helicopter?
4:21
Well, I can go to St. Thomas, but there's...
4:23
Nothing new.
4:23
It's good.
4:24
They practice this form of medicine we refer to as unga bunga.
4:30
Which is not the latest.
4:32
It has sort of a...
4:33
Their version of a stent is a bone through the nose.
4:37
Yes?
4:41
Miami.
4:42
Same.
4:42
Miami.
4:43
But that's two and a half hours by plane.
4:46
I didn't like those.
4:47
The idea of traveling.
4:49
No tunnel.
4:52
Marvin, when you're looking at the concept, how do you describe an artificial intelligence
4:57
deception?
5:00
What's the first...
5:00
The basics?
5:01
Because anything that is intelligent...
5:04
If you're trying to design the AI systems, at some point you would have said that you
5:08
wanted to have a system that was smart as a two-year-old or a three-year-old.
5:14
Now, most three-year-olds turn out to lie.
5:17
And one of the signs, psychologically, right, is the fact that you know it's intelligent
5:21
if it's able to lie.
5:27
Yeah?
5:29
The kids not...
5:31
The kids don't...
5:33
Kids who never lie aren't very smart; they're not very intelligent.
5:37
Right?
5:37
Laura?
5:38
But that kind of lying is what's called confabulation.
5:42
It's more like trying to plead...
5:44
Well, once they get a little older, then they do, definitely.
5:47
At some point, when they say that the shit on the floor wasn't me, it was the dog.
5:51
That's confabulation.
5:52
Okay.
5:52
I mean, that's a kind of lying.
5:54
But it's a kind of lying.
5:55
It's a deception.
5:56
That's where I started to go.
5:57
It's a definite deception where you're trying to...
6:00
You send out a fake message to see if you get an answer that has low...
6:03
High benefit for low cost.
6:05
There was a case when I was in medical school, at Bellevue, the psychiatric division.
6:14
And there was this psychiatrist who was demonstrating how this drunk guy was lying,
6:21
was, you know, confabulating.
6:22
Right.
6:23
So he said, you know, we had a great time last night.
6:26
Oh, I'm not blonde, cute.
6:27
I think Drake was great.
6:29
Oh, yeah, yeah, yeah.
6:30
And then about a second later, he said, hey, doc, who's pulling whose leg?
6:39
Anyway.
6:40
I was reading something about young chimpanzees playing cooperatively, but I can't remember
6:49
what age that is.
6:51
They guess what the other one wants, and they tend to share things.
6:56
And then when they're a little older, they stop that.
7:00
But that's more cooperation and the lack of cooperation than actual deception.
7:06
Well, guessing that somebody else wants something is...
7:10
That must be pretty abstract.
7:11
That's abstract, but you're not sending out a false message.
7:15
You're trying to, I think, interpret what the other person's mind, reading the other
7:21
person's mind, what the group, what's potentially the group.
7:24
Oh, right.
7:25
Great.
7:25
So this article didn't say when they start to lie.
7:30
So if you're making an intelligent...
7:31
Pretending that you don't have the food or something.
7:34
So I was trying to think, if when Danny was building a thinking machine and you were building
7:38
artificial intelligence things, did you guys ever think about the necessity for a machine
7:43
to lie, to, in fact, be intelligent?
7:46
Because it would be a lower-cost solution to some of its problems.
7:50
Give an answer that you'd accept and leave it alone.
7:54
As opposed...
7:54
Danny's expression was he always wanted to make a computer that was proud of him.
7:58
Of Danny?
7:59
Yes.
8:00
Of the program?
8:01
And in the game theory, how do you think about, not only in biology, you're dealing
8:12
with deception all the time and all types of signals, right?
8:15
The cells are sending out HIV or anything else is sending out a deceptive signal, so it's
8:22
accepted.
8:24
Sorry?
8:25
Poker.
8:27
Poker's bluffing, right?
8:28
Most of the time.
8:29
Bluff, that's right.
8:30
Yes?
8:31
Yeah.
8:31
Well, Push Singh had a thesis where a robot was actually helping another one build a chair.
8:42
I don't think any of those early projects involved trying to fool them.
8:52
I made it up without your help.
8:53
It was hard enough to get them to do anything.
8:56
Right.
8:56
So we never got into deception and that sort of thing.
9:00
I wonder what age children start to lie.
9:08
Or in game strategy, at what basic level do cells start to send out fake messages, disinformation,
9:16
or camouflage?
9:16
Yeah, that's interesting.
9:18
I mean, we already have models where cells are in competition with each other.
9:23
So basically, a model for a cell can grow by eating its neighbors by sucking out some
9:31
of their molecules.
9:32
So once you've got competition, it seems like the next step should be what you're talking
9:36
about, right?
9:37
Yes.
9:38
So what Seth describes is free energy.
9:40
When you're trying to take the free energy in your cell argument, that's what you're doing,
9:44
right?
9:44
You're sort of getting all the benefit of that cell's growth, cheaper, by eating
9:50
it.
9:50
So it seems that the first strategy would be to hide, whether you hide in junk or whether
9:57
in computer programs, I would think that if you assume that the underlying structure was
10:04
an algorithm, whether it be protein folding or any type of program, if your algorithm was
10:10
known to the competition or the predator, then you're dead meat by definition because once
10:18
he can read you, there's no reason, he shouldn't be able to figure out the strategy to get
10:23
your free energy that you've built up.
10:25
So your first strategy would be to either hide by filling your surroundings with junk, increasing
10:30
the noise relative to the signal.
10:31
Running is useful.
10:33
Yes.
10:34
Yes.
10:35
That's not deception.
10:36
When the SETI people build these big antennas, they talk to people 10,000 light years away.
10:46
Do they send jokes?
10:49
What could you send that says, I'm not here, don't bother?
10:55
That's right.
10:56
Nobody here but us chickens.
10:58
Yes.
10:59
So one possibility is if you have a, there's a possibility of mistakes.
11:08
So, you know, we're in some repeated cooperative relationship, we're choosing to help each other,
11:13
but sometimes when I try to be cooperative and do the cooperative thing, a mistake happens
11:18
and actually I'm selfish by accident.
11:21
If you could tell, you know, you could differentiate accidents from real defections, that's great.
11:26
You know, and so then selection would favor strategies that sort of forgive mistakes because
11:32
they're not predictive of what's going to happen next.
11:35
But this isn't exactly an example of the thing that he's talking about where once an agent
11:39
has in place a strategy that says, I forgive accidents, then it opens the door for a new
11:44
strategy that makes accidents on purpose.
11:47
That's right.
11:48
That's a good idea.
11:49
So I make you believe I didn't mean it.
11:52
You know, I didn't mean it.
11:53
Right.
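The forgiveness dynamic just described can be sketched as a noisy iterated prisoner's dilemma: strict tit-for-tat gets locked into mutual retaliation after an accidental defection, while a "generous" variant that sometimes overlooks a defection recovers. The payoff matrix, noise rate, and forgiveness probability below are illustrative assumptions, not figures from the conversation.

```python
import random

# Strict tit-for-tat vs. "generous" tit-for-tat under noise: an intended
# cooperation occasionally flips to a defection by accident, and only the
# generous variant ever forgives. All numbers are illustrative assumptions.
COOP, DEFECT = "C", "D"
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
NOISE = 0.05        # chance an intended cooperation flips to a defection
FORGIVE = 0.3       # generous tit-for-tat: overlook a defection this often

def play(forgiving, rounds=10_000, seed=0):
    """Mean per-round payoff for two tit-for-tat players (optionally generous)."""
    rng = random.Random(seed)
    last = [COOP, COOP]
    total = 0
    for _ in range(rounds):
        moves = []
        for me in (0, 1):
            m = last[1 - me]                      # copy opponent's last move
            if m == DEFECT and forgiving and rng.random() < FORGIVE:
                m = COOP                          # forgive the slight
            if m == COOP and rng.random() < NOISE:
                m = DEFECT                        # accidental defection
            moves.append(m)
        total += sum(PAYOFF[(moves[0], moves[1])])
        last = moves
    return total / (2 * rounds)

print("strict TFT under noise:  ", play(forgiving=False))
print("generous TFT under noise:", play(forgiving=True))
```

With these numbers the generous variant should come out ahead: for the strict pair, mutual defection is an absorbing state once both players have retaliated, while forgiveness lets the generous pair climb back to cooperation.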
11:54
But eBay does that.
11:55
So eBay has a reputation concept where some of the frauds should be detected by a third
12:02
party.
12:04
It reminds me, in Japan, it used to be that if you were drunk and got in an accident while
12:09
driving, that was an extenuating circumstance.
12:11
Of course I got in an accident while driving.
12:12
I've been drinking.
12:14
Oh, okay, okay.
12:15
Sorry, I interrupted you.
12:16
Please.
12:17
You're sending the genome for a crocodile.
12:23
What?
12:26
But in terms of... has anyone really done work on the differences or similarities between
12:38
biological viruses' deception strategies and computer viruses' deceptions? Because
12:44
my sense is that they're very similar.
12:47
Whether they're, again, noise to signal or signal to noise.
12:50
Other ways of making believe you're, as two computers, sort of shake hands, right?
12:57
The concept is, I have to know who you are.
13:00
Over repeated interaction, you have to understand you're really friendly.
13:03
There should be some signals back and forth, not only saying, I'm still in contact with you,
13:08
but when someone's eavesdropping, a third cell or a third computer, you want to make sure
13:08
that they can't understand what you're saying, and if they tamper with our
13:19
communication, protein-wise or any type of, I should be able to tell that someone, in fact,
13:24
was looking or playing.
13:26
And that, that three-party game becomes very complicated.
13:29
So the analogy between computational deception, signals, and biological signals, I haven't
13:36
seen it.
13:36
I don't know if you did.
13:37
Yeah, no, I don't know.
13:39
Viruses always latch on to cellular receptors, right, which are usually used for cells to talk
13:44
to each other.
13:45
So they're sort of using that channel to go in and infect cells.
13:49
But I haven't ever heard of, say, a cell having a fake receptor that it could use to, you
13:59
know, suck in and kill a virus, right?
14:02
The immune system, the immune system.
14:04
Well, but it's, it's not tricking the virus, right?
14:07
It's killing the virus.
14:11
Does the immune system trick, I don't know, does the immune system trick the virus?
14:15
No, it detects, detects virus and figures out that it's not what it's supposed to be.
14:19
It's not something that I'm ready to say.
14:20
If the virus would tell the story, the virus would say, they tricked me.
14:24
Everything was fine.
14:26
It seemed like a perfectly good cell to infect, and then they came along and dismantled me.
14:35
And sometimes the immune system turns on, so that you get this autoimmune disease.
14:42
So let's go back to bacteria.
14:44
So they would have these restriction enzymes.
14:46
So they would actually mask their own sites in the genome.
14:51
And then if the virus comes in and is not protected in these places, they could cut it there.
14:57
And so the counter-deception of the virus would be to mask the sites.
15:01
I don't know if they...
15:02
Or just not have them.
15:03
Or not have them.
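The restriction-modification story above can be sketched in a few lines: the host methylates ("masks") the recognition sites in its own genome, cuts any unmethylated copy of the site, and the virus's counter-move is simply not to carry the site at all. The EcoRI-style site and the sequences below are invented for illustration.

```python
# Toy restriction-modification system: cut DNA at every unmasked recognition
# site. Site and sequences are invented for illustration.
SITE = "GAATTC"          # an EcoRI-style recognition sequence

def restrict(dna, methylated_positions):
    """Return cut positions: every occurrence of SITE not masked by methylation."""
    cuts = []
    i = dna.find(SITE)
    while i != -1:
        if i not in methylated_positions:
            cuts.append(i)
        i = dna.find(SITE, i + 1)
    return cuts

host = "CCGAATTCGGTTGAATTCAA"
host_methylated = {2, 12}            # the host masks its own sites

naive_virus = "TTGAATTCCC"           # carries the site, unmethylated
evolved_virus = "TTGTATTCCC"         # the counter-move: the site is gone

print(restrict(host, host_methylated))      # [] - own genome is safe
print(restrict(naive_virus, set()))         # [2] - cut at the exposed site
print(restrict(evolved_virus, set()))       # [] - nothing to cut
```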
15:05
So then...
15:06
Thank you very much.
15:10
Maybe...
15:11
Maybe a few hundred, maybe less than 1,000 base pairs.
15:16
What's the smallest virus?
15:20
Do you have virus particles that depend on other viruses?
15:23
Less than 1,000 bases, I would say?
15:25
There are...
15:26
Yeah, so the...
15:28
What's a parasitic or associative virus?
15:31
Viroids.
15:39
What are they called?
15:40
Viroids.
15:41
Oh, so it's pre-virus, or is it a virus?
15:44
It's just...
15:45
There's no delineation.
15:45
The little RNA that just sort of...
15:47
It carries along on an existing virus.
15:52
It might get packaged in the same particle.
15:57
So it doesn't really...
15:58
It's like a parasite.
15:59
It doesn't encode for its own viral shell or anything.
16:02
Right.
16:02
The virus will hook up to the...
16:04
It uses cellular machinery for its replication
16:08
and then get recognized by something in the viral code
16:12
and get packaged and transmitted.
16:14
But if you were going to try to find
16:17
sort of the early form of a deceptive practice
16:19
in biology, what would...
16:21
How would you think about it?
16:24
So going, like, way back, so...
16:26
What is this?
16:28
So...
16:29
So I don't know.
16:32
I mean, you know, we were talking about
16:34
these things emerging spontaneously
16:37
in experiments with just RNA.
16:40
But then, you know, once you have things compartmentalized
16:43
in cells with membranes.
16:49
It's not really obvious how you would make viruses
16:52
in that situation.
16:56
They have to go from cell to cell.
16:58
Right.
16:59
And it's hard.
17:00
It's hard for RNA.
17:02
But maybe just because we don't really understand
17:05
how it, you know, can happen.
17:07
I mean, there are RNAs that can recognize membrane surfaces,
17:10
so maybe that would be enough in some way.
17:14
In artificial life,
17:16
like computerized artificial life,
17:17
I guess in...
17:19
There's Tom Ray's TIERRA program.
17:22
This is, again, a long time ago.
17:23
One of the things that...
17:24
I mean, I'm not saying this is a great model
17:26
for artificial life,
17:27
but one thing that did happen
17:28
is it's basically a program
17:30
where you have these pieces of code
17:31
that are competing for resources
17:33
to reproduce themselves.
17:35
So you have self-reproducing pieces of code,
17:36
and then they compete for resources
17:39
to see who does better.
17:41
But one thing that happened very rapidly
17:43
was these kind of virus-like pieces of code
17:46
that basically they were shorter.
17:49
They had less code there than the other ones,
17:50
and they were just using some other existing
17:53
self-replicating piece of code
17:55
to replicate themselves.
17:57
So it was really...
17:59
I mean, it was a very viral kind of behavior,
18:01
and it happened almost immediately.
18:04
It didn't take...
18:05
You know, after just a few...
18:06
Like, a few generations.
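The dynamic described, short code borrowing a full replicator's copy routine, can be sketched as a toy simulation. This is not Tom Ray's actual TIERRA code; the code lengths, memory capacity, and reproduction rule are invented for illustration.

```python
import random

# Toy TIERRA-like model: every replicator pays for its own code length, so a
# short "parasite" that borrows a host's copy routine out-reproduces its hosts.
rng = random.Random(1)
CAPACITY = 1000                     # total "memory" the population competes for
HOST_LEN, PARASITE_LEN = 100, 20    # parasites carry far less code

class Creature:
    def __init__(self, length, parasite):
        self.length = length        # code size: the cost of copying itself
        self.parasite = parasite    # True if it lacks its own copy routine

def step(pop):
    hosts_present = any(not c.parasite for c in pop)
    offspring = []
    for c in pop:
        if c.parasite and not hosts_present:
            continue                # a parasite needs a host's copy routine
        # shorter code copies faster: reproduction chance falls with length
        if rng.random() < 10.0 / c.length:
            offspring.append(Creature(c.length, c.parasite))
    pop = pop + offspring
    rng.shuffle(pop)                # random culling back down to capacity
    while sum(c.length for c in pop) > CAPACITY:
        pop.pop()
    return pop

pop = ([Creature(HOST_LEN, False) for _ in range(8)] +
       [Creature(PARASITE_LEN, True) for _ in range(5)])
for _ in range(200):
    pop = step(pop)
parasites = sum(c.parasite for c in pop)
print(f"{parasites} parasites vs {len(pop) - parasites} hosts after 200 steps")
```

As in the anecdote, the parasites take over within a few generations because they reproduce at a fraction of the cost.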
18:08
What surprised me so much
18:09
is you take all the computers in the world,
18:10
you take the worldwide web,
18:12
and all of that stuff,
18:12
and it's a huge amount of computing power,
18:15
but there's no spontaneous
18:16
emergence of viruses yet.
18:18
All the viruses are made
18:19
artificially by people.
18:21
All the defense against viruses
18:23
are made artificially by people.
18:26
Is that true, Martin?
18:28
I guess.
18:28
You don't have the sources of variation.
18:38
Yeah, it's...
18:39
I mean, unless you consider
18:40
Microsoft Word, actually,
18:41
to be alive.
18:43
I talked to IBM people.
18:47
I mean, so the immune system
18:48
that they have
18:49
is also made by people.
18:51
I mean, I think once they had
18:52
an automatic immune system,
18:56
that could be a program
18:57
that could also produce viruses
18:58
with variation, for example.
19:00
It also might be hard
19:01
to install new software.
19:02
If your computer had its own
19:06
self-generated autoimmune system.
19:11
It's like, no way
19:12
are you installing new software?
19:13
Yeah, I'm not doing that.
19:14
Forget it, right?
19:16
But it's very funny
19:17
because the computer scientists
19:18
tell me they're lost,
19:19
the battle with the hackers
19:21
is kind of lost
19:22
because you always get the message,
19:23
update the software.
19:25
Half of the people
19:25
update the software,
19:26
half don't.
19:27
And the hackers
19:28
just have to look at that
19:29
to see where the weakness is,
19:31
and then it can infect
19:32
the people
19:33
who did not update it.
19:35
Yeah.
19:36
Well, particularly since
19:37
the message to update the software
19:38
is being sent to you
19:39
by a hacker
19:39
who is actually sending you
19:40
the software.
19:41
I always get these messages
19:44
saying I should update.
19:45
How do I know this?
19:47
Well, biologically,
19:49
how does that happen?
19:50
In the virus world,
19:53
when it says
19:53
you should change yourself,
19:55
you usually have to ask
19:56
your friends or your neighbors,
19:57
is this a safe cell?
19:58
There's always,
19:59
normally a reputation
20:00
authority.
20:02
So in the system,
20:04
how do cells attempt to say,
20:06
is this cell,
20:07
is the instructions,
20:08
is the signal I'm receiving
20:09
a real signal?
20:16
I guess one of the arguments
20:17
is if you find
20:18
you're a cell
20:19
in the wrong place,
20:20
you're a liver cell
20:21
and you show up
20:21
in your pancreas,
20:22
you should kill yourself.
20:24
Right?
20:24
Because your neighbors
20:26
seem to have to be
20:27
on your wavelength.
20:29
Is that not a good idea?
20:30
Is that accurate?
20:31
No, I mean, you know,
20:32
probably most cancer cells,
20:34
sort of escape from a tumor
20:36
and try to colonize
20:37
somewhere else,
20:38
and end up getting killed
20:39
because they're in
20:40
the wrong place.
20:41
They're not sending out
20:41
the right signals.
20:42
They clearly don't belong.
20:45
And so then the surrounding
20:46
cells send out signals
20:47
telling them to kill themselves.
20:51
You know,
20:52
of course,
20:52
that breaks down
20:53
as soon as you get a mutant
20:54
that doesn't start
20:54
killing itself.
20:55
That's sort of the opposite
20:59
of deception
20:59
in some sense.
21:00
It's an escape
21:01
where, as opposed
21:01
to tricking someone,
21:03
you kind of trick yourself
21:04
into not listening
21:05
to the signals
21:06
that are being sent
21:09
to you to kill yourself.
21:12
There's no real fake.
21:14
The default strategy?
21:15
That's a real strategy.
21:16
You're turning off a strategy,
21:17
you're not sending out
21:19
any fake message.
21:20
I think deception
21:21
has to have fake
21:22
something.
21:22
So a lot of,
21:31
let's see,
21:32
when viruses infect cells,
21:34
the immune system
21:40
recognizes typically
21:43
that cell
21:43
as an infected cell
21:44
because it would be
21:45
displaying bits
21:46
of the virus
21:47
on its surface.
21:48
And a lot of viruses
21:49
do things.
21:51
Essentially,
21:52
they send out signals
21:53
saying,
21:53
no, no, it's okay.
21:54
Right.
21:56
So that's a very
21:57
common strategy.
21:57
Everything's all right.
21:58
Just sort of a first
21:59
deception.
22:00
There's no intent.
22:02
But do they send out,
22:03
how is that done?
22:05
Do they send out
22:05
random signals
22:06
and eventually
22:07
some of them live?
22:08
No, no, no.
22:09
They'll make molecules
22:10
that mimic
22:11
normal cellular components
22:13
which are signals
22:15
that, you know,
22:16
might do things
22:21
like prevent
22:21
the display
22:22
of those viral peptides
22:23
or, you know,
22:26
otherwise they'll put
22:27
signals out
22:27
on the cell surface
22:28
that will make
22:30
that cell look
22:31
normal.
22:32
Normal.
22:33
Right.
22:34
So it seems to be
22:35
very similar
22:36
to computer things.
22:37
That's what computers do.
22:39
I was going to say
22:39
that computer things
22:40
are very similar
22:41
to what's happening
22:41
in nature.
22:44
I mean,
22:46
actually,
22:46
in some sense,
22:46
just even
22:47
when Apple,
22:48
if Apple sends you
22:49
free software updates,
22:51
in some sense,
22:51
it's just a virus,
22:52
right?
22:52
Because updating
22:53
your software
22:54
on an older computer
22:55
inevitably makes it
22:56
run more slowly.
22:56
And their goal
22:58
is simply to,
22:58
like,
22:58
get so much
22:59
new software
23:00
to the computer
23:00
that you have
23:01
to buy a new computer.
23:03
I mean,
23:03
this is actually true.
23:04
I mean,
23:04
this is not actually,
23:05
it's not even,
23:07
it's true of Apple as well.
23:08
Every new Microsoft
23:09
was slower than
23:10
the previous one.
23:11
Yeah.
23:11
So updating
23:14
your software,
23:14
I mean,
23:15
it's all kind of
23:15
like a scam
23:16
on the part
23:17
of the computer companies
23:18
to, like,
23:18
put their new software
23:19
into our computers
23:20
so that we have
23:21
to buy a new computer.
23:21
It's actually a scam
23:22
with the future computers
23:23
in mind.
23:25
Yeah,
23:25
the future computer
23:27
had an intent
23:28
to be born.
23:30
That would be
23:31
one of the arguments.
23:32
There's an interesting
23:33
difference,
23:33
it seems to me,
23:34
between the biological
23:35
viruses and the
23:36
computer viruses,
23:37
just sort of
23:37
what you were saying,
23:38
but it's that
23:39
where the biological
23:46
ones are emergent
23:48
as a result
23:49
of selection.
23:50
Evolution versus
23:51
intelligent design.
23:52
Exactly.
23:53
Exactly.
23:54
And so...
23:54
Well, yeah,
23:57
some of the most
23:57
virulent viruses
23:58
were not supposed
23:59
to be virulent,
23:59
right?
24:00
Right.
24:01
What was this one
24:02
in the 1990s?
24:03
This guy,
24:04
he just meant it
24:05
to kind of, like,
24:06
slightly infect
24:07
a whole bunch
24:07
of computers
24:08
and just lay low,
24:09
but because he
24:10
made a mistake
24:10
in programming it,
24:11
it infected
24:12
a bunch of computers
24:12
and then went viral
24:13
and, like,
24:14
infected vast quantities
24:15
of computers.
24:16
So the effect
24:17
was not intended
24:19
to, like,
24:20
bring down
24:20
all these computers.
24:21
But has there ever
24:22
been a...
24:22
So you were saying
24:23
there's never been
24:24
a naturally
24:25
emergent virus,
24:26
but has there
24:27
ever been
24:28
just an error
24:30
in a designed virus
24:31
that actually
24:32
made it better?
24:33
Not a human error.
24:34
Well, that's...
24:34
That would describe...
24:35
But that was a human
24:37
error in the design
24:38
of the virus.
24:39
Yeah.
24:39
But you could imagine,
24:40
like, a copying error
24:41
that actually made it worse.
24:44
Does that ever happen?
24:45
So I believe...
24:46
So this...
24:47
You must be the expert,
24:48
but I believe, like,
24:49
computer programs
24:50
that people write
24:51
cannot mutate
24:53
the way viruses' DNA
24:55
and RNA
24:55
do in order to give
24:56
something reasonable.
24:58
Well, they're living
24:58
in an environment
24:59
with all this error correction.
25:01
Well, not really.
25:02
I mean, they just actually...
25:02
It's just very accurate.
25:03
The transcription
25:04
is extremely accurate.
25:05
No, no, but what I mean
25:06
is if you write
25:07
on a C program
25:08
and you say,
25:08
now I make a random...
25:09
I flip a bit somewhere.
25:10
Yeah, yeah.
25:11
That will never do
25:12
anything better, you know?
25:14
They are not...
736 segments · Transcribed by Epstein Pipeline (faster-whisper)