1
00:00:00,636 --> 00:00:01,540
Podcasting 2.0.

2
00:00:02,428 --> 00:00:03,684
24th, 2020.

3
00:00:05,372 --> 00:00:06,788
Perceptron.

4
00:00:10,524 --> 00:00:12,772
♪ Bumming self-pacemaking ♪

5
00:00:13,500 --> 00:00:14,244
the country.

6
00:00:18,492 --> 00:00:20,292
It all goes down, we've got the rag.

7
00:00:21,212 --> 00:00:21,764
point over

8
00:00:22,268 --> 00:00:22,948
in the boardroom

9
00:00:24,700 --> 00:00:27,140
2.0. In fact, I'm going to be

10
00:00:27,548 --> 00:00:29,796
only boardroom that is not a part of

11
00:00:30,300 --> 00:00:30,980
I'm about to be.

12
00:00:32,284 --> 00:00:32,804
Curry here.

13
00:00:35,164 --> 00:00:36,356
The man who will.

14
00:00:36,764 --> 00:00:37,732
Your weights are cornered.

15
00:00:38,236 --> 00:00:40,068
Say hello to my

16
00:00:40,412 --> 00:00:42,180
And on the other end, the one.

17
00:00:42,684 --> 00:00:43,364
Only five.

18
00:00:45,116 --> 00:00:47,332
Aaaaaaaah!

19
00:00:50,012 --> 00:00:51,076
Oh, you're, uh...

20
00:00:52,860 --> 00:00:57,700
Get that done, even with the cedar fever. Yeah.

21
00:00:58,524 --> 00:00:59,800
The cedar fever.

22
00:00:59,956 --> 00:01:06,332
And the only thing that will cure it is more cowbell. I'm trying. I'm working on it.

23
00:01:06,836 --> 00:01:09,244
This, uh, you've...

24
00:01:10,164 --> 00:01:17,116
You don't get knocked out like this very often. No, this is the – no, no, no. I had this in January, I think. I don't get knocked out by much.

25
00:01:17,620 --> 00:01:22,396
No, you don't. What do you mean my node isn't live in the splits? What does this mean?

26
00:01:22,772 --> 00:01:29,532
Eric PP says, your node isn't live in the splits. What does that mean? Isn't in the live splits.

27
00:01:30,164 --> 00:01:34,332
Oh, well, that's okay. Oh, it's not in the live item.

28
00:01:35,924 --> 00:01:41,596
about. No wonder I'm not getting live boosts. Hmm. Hold on a second. Let's see.

29
00:01:42,516 --> 00:01:47,676
Let me see. It's not in there, really? Let me see. Value?

30
00:01:53,332 --> 00:01:54,428
What is this?

31
00:01:56,788 --> 00:01:57,308
Oh, I see.

32
00:01:59,084 --> 00:02:05,364
Okay, well, that would make sense. Let's see. What am I? Adam at getalby.com?

33
00:02:07,948 --> 00:02:14,260
This is the listen to people type on their keyboards podcast. I really don't care.

34
00:02:16,716 --> 00:02:22,644
I don't care. Yeah, because I say that because I'm doing the same thing. Yeah, I really don't care. I don't care what anybody thinks of me.

35
00:02:23,340 --> 00:02:25,492
All right, so let me do this.

36
00:02:26,220 --> 00:02:31,092
Well, no wonder things weren't working. What? Yeah, thanks, Eric.

37
00:02:31,564 --> 00:02:36,244
Yeah, really. Sorry to derail the show. No, you did it, man.

38
00:02:36,780 --> 00:02:45,876
You were running and Eric just took his leg out. Yeah, I was good to go. Everything's fine. But also the way I read it, I mean, this is not how he wrote it, but.

39
00:02:46,444 --> 00:02:48,340
Let me see what he said.

40
00:02:49,228 --> 00:02:51,988
I've got to read it. The way I received it was.

41
00:02:52,972 --> 00:02:53,748
uh,

42
00:02:54,348 --> 00:02:58,800
Your node isn't in the live splits again! See, that's how I read it.

43
00:02:59,724 --> 00:03:01,012
Thank you.

44
00:03:01,836 --> 00:03:02,516
And.

45
00:03:02,956 --> 00:03:09,364
And that is obviously the problem where I wasn't getting live boosts on the last show. It makes so much sense now. Okay.

46
00:03:10,124 --> 00:03:28,340
This was a gift from Sir PP. No, of course it's a gift. And I'm just telling you how I feel. And my wife was mean to me a minute ago. Oh, yeah. I'm so sorry. Yeah, I say, I'm going to do the show. And she's huffing and puffing and doing the sheets. And doing the sheets sucks, putting the new sheets on.

47
00:03:28,780 --> 00:03:35,860
Say, you should have just asked me. Whatever. I'm like, oh, okay. And it's just, whatever. I'm like, did you just say whatever to me?

48
00:03:36,396 --> 00:03:39,220
You got whatevered? I got whatevered. Like, okay.

49
00:03:39,692 --> 00:03:51,540
All right. Well, this is what is that? So isn't this not worth even talking about? Yeah, but you whatevered me, man. Yeah, it's like we'll just deal with this after the show. Yeah, well, and and have.

50
00:03:52,044 --> 00:03:56,276
Have a good show. Not just whatever, Joe.

51
00:03:57,164 --> 00:04:00,000
Hello, boardroom. How y'all doing?

52
00:04:00,380 --> 00:04:03,748
This is Podcasting 2.0, where we discuss all things podcasting.

53
00:04:04,476 --> 00:04:05,988
And sheets. And sheets.

54
00:04:06,364 --> 00:04:10,372
And sheets and pod pings and all kinds of stuff. Yes.

55
00:04:10,876 --> 00:04:12,740
Had a fun.

56
00:04:13,724 --> 00:04:16,484
A town hall. A town hall.

57
00:04:18,524 --> 00:04:20,260
Wednesday. Wasn't that Wednesday?

58
00:04:20,796 --> 00:04:23,140
with Alex Sanfilippo.

59
00:04:24,540 --> 00:04:25,220
Town Hall.

60
00:04:26,140 --> 00:04:31,780
Yeah, he put together a town hall. He'd done some survey, I guess. This all happens on LinkedIn, I'm sure.

61
00:04:32,220 --> 00:04:36,036
How do you do – what is the difference between – like what are the criteria that –

62
00:04:36,860 --> 00:04:56,996
delineates a town hall from a fireside chat? Oh, a town hall, I mean, it's all Zoom, so it's basically a Zoom call. But the town hall had multiple people speaking. And that was actually a pretty good question, because some people were like, what kind of town hall is this? I thought I got to stand up and ask questions.

63
00:04:57,980 --> 00:05:00,200
No questions.

64
00:05:00,324 --> 00:05:02,188
It was about the...

65
00:05:02,820 --> 00:05:13,196
about the issue of people spamming to get guests onto podcasts, which we had discussed in a previous board meeting to which I had suggested the booking tag.

66
00:05:15,204 --> 00:05:20,492
And so, you know, Tom Rossi was on and, you know, it was actually quite good.

67
00:05:21,252 --> 00:05:24,940
What did Tommy have to say? Tommy R.

68
00:05:25,604 --> 00:05:27,596
He said.

69
00:05:28,004 --> 00:05:28,716
Well, here's...

70
00:05:29,220 --> 00:05:34,924
His contribution was, you know, hey, we're taking out the email addresses from the RSS feeds.

71
00:05:35,300 --> 00:05:36,332
Okay.

72
00:05:36,804 --> 00:05:43,436
And then I came in and said, yes, and that's very good. And so what we're doing is we're suggesting putting a tag in called the booking tag.

73
00:05:44,388 --> 00:05:59,700
And, you know, it was kind of a scripted thing. It was like, here's the problem, here's the interim solutions, here's the solutions. And I think overall it was really good. And now Alex and Daniel J. Lewis.

74
00:05:59,888 --> 00:06:00,504
working

75
00:06:02,512 --> 00:06:07,064
that. Yeah. So it was kind of an industry rah-rah thing. I thought it was quite good.

76
00:06:07,568 --> 00:06:23,864
Not as scripted as the interview that Pod News had with the lady from Spotify. Oh, I have not. I've been just working nonstop this morning. I'm being whatever. I just, you know, rocking and rolling. I'm getting.

77
00:06:24,080 --> 00:06:41,688
live servers up and running and Godcaster videos. And I've been so behind because of this cedar fever. So I did not listen to it. How was that? What was the interview about? Who is she? What lady from Spotify? Who's on first? I don't remember her name. It was.

78
00:06:42,384 --> 00:06:44,760
It's basically one of those interviews.

79
00:06:45,232 --> 00:06:48,248
where I'm going to send you questions.

80
00:06:49,296 --> 00:06:59,600
This is what it sounded like to me. I'm going to send you questions. You record them, send them back to me. Oh, goodness. Record the answers, and then we'll splice it all together. I hate those. You could hear almost everything but the paper.

81
00:06:59,724 --> 00:07:00,276
that she was

82
00:07:00,812 --> 00:07:07,060
Crinkle. Yeah, I mean, it was all... it was fairly forgettable. What was it about, though? What was the...

83
00:07:07,724 --> 00:07:08,660
How to get.

84
00:07:09,996 --> 00:07:13,396
how to get boosted in their

85
00:07:14,860 --> 00:07:16,180
or something.

86
00:07:16,972 --> 00:07:34,548
How do you get spotlighted or whatever? Yeah, the bottom line was basically you can't. How to get spotlighted. You need to find out who's on the staff and take them out for drinks. Like the way it's supposed to be.

87
00:07:35,436 --> 00:07:39,860
I mean, there's it sounded to me like they do it the same way Apple does it, which is they just.

88
00:07:40,844 --> 00:07:42,004
Roll dice and pick.

89
00:07:42,380 --> 00:07:59,600
pick somebody from a list, or, you know, somebody on the inside and you get an attaboy or something. No, I don't think there's any way to break in from the outside. You know, the whole problem is it's an editorial team. They start to run it like they're overlords and it's not, it's not bad. That's just.

90
00:07:59,724 --> 00:08:09,556
That's how you do it when you're running editorial. You determine and everybody hates you except the people that got highlighted and everyone else thinks you're a douche. That's the job.

91
00:08:10,060 --> 00:08:15,572
That's the joke. I mean, any algo that you put out there, somebody is going to figure that thing out and game it. I mean, it's just you.

92
00:08:17,932 --> 00:08:25,108
No, it's right. But algos suck. I'd rather have it be editorial.

93
00:08:25,868 --> 00:08:37,556
In fact, you know, it's like, why does Spotify do editorial? Apple does editorial, but not a single other podcast app does editorial, which I've been complaining about for years.

94
00:08:38,156 --> 00:08:38,868
Yo!

95
00:08:39,276 --> 00:08:42,356
Have an opinion. We should do an editorial. Have an opinion.

96
00:08:42,700 --> 00:08:46,260
We mean we. Should we do a podcast index?

97
00:08:47,116 --> 00:08:54,740
Do you want to be hated? I'm not interested in that. We're already hated, right? Pick a topic.

98
00:08:55,148 --> 00:08:59,900
I have no time to do editorial for Podcast Index.

99
00:09:00,248 --> 00:09:17,024
I mean, I don't mean we would actually do it. I mean, we would like throw an LLM at it and just pretend we're doing it. Just to get taken out for drinks, basically. Yeah, exactly. Yeah, sure. We would be just as effective as the Podcast Academy or whatever that is.

100
00:09:18,264 --> 00:09:32,224
No, but isn't that the truth, though? I have never understood this. For some reason, podcast app developers, and maybe it is the I don't want to be hated thing, but if you want to delight your users.

101
00:09:33,464 --> 00:09:57,184
then look at what your users are doing and do editorial. Everyone else is doing it. Spotify and Apple literally are pushing shows that they think are good. You can be your one-man editorial team and you can delight your users with all kinds of suggestions. No one does that. There's this kind of like, well, it has to be equal for all, mom. Like, no.

102
00:09:57,784 --> 00:09:59,900
No, if you want your app to be successful,

103
00:09:59,900 --> 00:10:11,488
you have to have other things. And those other things are highlights, spotlight, blue light, moonlight, whatever you want to do. Rachel Maddow app. Hello.

104
00:10:12,344 --> 00:10:13,984
I've been saying this forever.

105
00:10:14,456 --> 00:10:16,160
This is exactly what it should be.

106
00:10:16,632 --> 00:10:28,128
One day somebody's just going to vibe code a Rachel Maddow app and stick it in the app store or something, and it'll just be for you, your special app. Yeah, you don't have to get through the app store.

107
00:10:28,600 --> 00:10:36,032
Do you have Rachel Maddow's permission to do this? Well, you know, that's an interesting topic because I don't know what other.

108
00:10:36,760 --> 00:10:39,968
Podcast app developers are seeing out there, but.

109
00:10:40,568 --> 00:10:43,616
On the Godcaster side of things...

110
00:10:44,184 --> 00:10:45,696
Paul has been telling us that

111
00:10:47,768 --> 00:10:52,448
App Store approvals are taking longer and longer.

112
00:10:53,592 --> 00:10:54,144
Thanks, guys.

113
00:10:58,008 --> 00:10:59,200
Slop. Like AI.

114
00:10:59,292 --> 00:11:03,460
Oh, very good point. Yeah. Well, we know from.

115
00:11:04,476 --> 00:11:05,796
customers.

116
00:11:06,556 --> 00:11:11,076
that the, I call them the wrapper apps, but what are they called? Container apps.

117
00:11:12,060 --> 00:11:13,348
I think is what they're called. Yeah, yeah.

118
00:11:13,852 --> 00:11:16,420
They're having a real hard time.

119
00:11:17,212 --> 00:11:19,396
And what you're describing is like?

120
00:11:20,124 --> 00:11:26,052
Somebody just takes your website and essentially wraps an app around it. Yeah. But it's really just pulling your –

121
00:11:27,708 --> 00:11:41,796
Yeah, you have to have a certain amount of unique qualities in the app or native qualities. But then they also run bots and algos across the whole app store and say, well, this is pretty similar to this one.

122
00:11:42,428 --> 00:11:44,740
It's become very difficult.

123
00:11:46,812 --> 00:11:51,108
Which is kind of infuriating to me on one level because –

124
00:11:53,468 --> 00:11:53,956
As.

125
00:11:54,364 --> 00:11:57,284
I've written a couple of iOS apps in the past.

126
00:11:59,056 --> 00:11:59,608
never written

127
00:12:01,296 --> 00:12:03,960
definitely done a couple of iOS apps and

128
00:12:04,752 --> 00:12:07,160
One of them that...

129
00:12:07,504 --> 00:12:09,176
did was for

130
00:12:09,648 --> 00:12:11,192
company I worked for and

131
00:12:13,296 --> 00:12:15,128
We did a lot of recruiting.

132
00:12:17,200 --> 00:12:19,608
college students to get them into our industry.

133
00:12:21,072 --> 00:12:21,976
a lot of work with that.

134
00:12:23,184 --> 00:12:23,768
Um...

135
00:12:24,528 --> 00:12:28,440
So I wrote this app. Essentially, it was an app.

136
00:12:29,264 --> 00:12:30,168
focused on our

137
00:12:31,024 --> 00:12:35,544
It was branded with our business, but it was all like, it was a recruiting app.

138
00:12:37,872 --> 00:12:38,392
It was a way that.

139
00:12:39,664 --> 00:12:45,976
could install our app and see what our upcoming events were going to be and all this kind of stuff and register.

140
00:12:46,320 --> 00:12:47,096
A lot of work.

141
00:12:48,720 --> 00:12:50,936
I spent weeks writing this.

142
00:12:51,504 --> 00:12:52,472
Amen.

143
00:12:53,456 --> 00:12:59,400
It worked great. It had a server backend to do all this kind of stuff. And then Apple just flat rejected it.

144
00:13:02,116 --> 00:13:09,292
They said this is just a promotional advertisement for your business. Yeah, isn't that the whole idea?

145
00:13:09,892 --> 00:13:14,604
I was like, well, but no, it's got all this functionality to it. I mean, like this is no different than.

146
00:13:15,204 --> 00:13:18,380
You know, any sort of like event registration type app.

147
00:13:19,556 --> 00:13:21,964
tons of examples of this. No, sorry.

148
00:13:24,132 --> 00:13:31,756
It's just – and then you just have these wrapper apps, yeah, like you're talking about that just really just wrap a website and stick it out there.

149
00:13:32,132 --> 00:13:33,900
somehow they just sail through.

150
00:13:35,268 --> 00:13:38,828
Yeah, you know, if they hadn't been such a-holes about...

151
00:13:39,716 --> 00:13:42,348
Web apps, we wouldn't have this problem.

152
00:13:42,820 --> 00:13:50,028
You could just install a web app as simply as an app from the App Store, which was the original idea.

153
00:13:50,756 --> 00:13:55,916
You know, when Steve... it was when I met Steve Jobs.

154
00:13:56,932 --> 00:13:59,500
Did you say that? Did you say that to Tina?

155
00:13:59,560 --> 00:14:10,256
Complain about the sheets, but listen here, when I met Steve Jobs... She's like, don't hit me with your Steve Jobs crap again. I'm gonna do that. I'm gonna do that.

156
00:14:10,696 --> 00:14:14,192
Whatever with your Steve Jobs.

157
00:14:16,264 --> 00:14:18,224
When I met Steve Jobs.

158
00:14:18,984 --> 00:14:19,600
Uh,

159
00:14:19,944 --> 00:14:33,296
This was the iPod Touch days. This was the device. This was the dream. The iPod Touch, it would not be connected to a phone network, and it would be web apps. That was the whole idea, and it was a really good idea.

160
00:14:33,800 --> 00:14:39,056
But when I met Steve Jobs, he was, I noticed right, he was yelling at people.

161
00:14:40,200 --> 00:14:40,752
Wi-Fi.

162
00:14:43,208 --> 00:14:59,500
was so mad. I'm not sure exactly what they messed up with Wi-Fi, but something had happened with Wi-Fi, with the protocol or how it switched. I'm not sure exactly what it was, but his dream was crumbling.

163
00:14:59,848 --> 00:15:06,192
And this is why he had to eventually do a deal with AT&T. And AT&T, as you recall, was the first partner.

164
00:15:07,016 --> 00:15:14,672
rolled out the iPhone. You couldn't get it for any other network. It had to be AT&T. And he also never wanted an app store, which in hindsight.

165
00:15:15,112 --> 00:15:19,312
Yeah, it's a huge money maker, the App Store.

166
00:15:20,456 --> 00:15:38,544
And I think you're right. I mean, the way we're suffering, or rather, the way you're suffering from the generated slop, you know, they've got to be inundated with this stuff. But it's to the point where people can't even get a developer account approved within five days.

167
00:15:39,592 --> 00:15:41,808
It takes some – some have been waiting for months.

168
00:15:43,272 --> 00:15:44,048
blessed.

169
00:15:44,616 --> 00:15:45,584
The last app.

170
00:15:46,120 --> 00:15:51,760
release on Android that Paul did, I think it took over a week to get approved.

171
00:15:52,328 --> 00:15:56,688
Oh, really? Oh, my goodness. Didn't it? It took a while, yeah.

172
00:15:57,352 --> 00:15:58,900
It took a while. Maybe not.

173
00:15:58,992 --> 00:16:06,008
Maybe not over a week, but it was close. I mean, it was quite a number of days. And that's a definite down.

174
00:16:06,928 --> 00:16:07,576
*Burp*

175
00:16:08,336 --> 00:16:10,584
That's a definite increase in time.

176
00:16:13,584 --> 00:16:16,024
recently, because it used to be much quicker before.

177
00:16:17,904 --> 00:16:20,216
Listen to this.

178
00:16:20,848 --> 00:16:22,712
I think I've received my first.

179
00:16:24,016 --> 00:16:25,240
Boost spam.

180
00:16:28,528 --> 00:16:32,088
So thanks, Eric P.P. Now that I put my.

181
00:16:33,008 --> 00:16:38,776
My split in the – my node in the split, here's what I got. Orangefriend.com.

182
00:16:39,216 --> 00:16:42,616
to swap to from Bitcoin on LN and other cryptocurrencies.

183
00:16:43,088 --> 00:16:52,792
Compare instant exchanges and P2P markets now with no KYC prepaid cards. Satogram is what it's called. It's a Satogram.

184
00:16:53,584 --> 00:16:56,408
How much was it? One Satoshi.

185
00:16:57,968 --> 00:16:59,600
I am a huge fan of this.

186
00:17:01,388 --> 00:17:08,276
You can spam my lightning node all you want. Just wear it out. Yeah. I love it. Yes.

187
00:17:09,580 --> 00:17:13,972
Paying me? Paying me to spam me? Oh, yeah. Bring it on, baby.

188
00:17:14,380 --> 00:17:16,916
Yeah, eventually those sats will be worth something.

189
00:17:17,260 --> 00:17:19,668
We're so back.

190
00:17:21,676 --> 00:17:22,356
Yeah.

191
00:17:22,892 --> 00:17:23,508
Yeah.

192
00:17:25,676 --> 00:17:30,772
Chad F. Chad F. In the hairpin. Very nice.

193
00:17:31,468 --> 00:17:32,948
Like, um...

194
00:17:34,636 --> 00:17:36,180
sort of the big nun.

195
00:17:36,940 --> 00:17:39,764
One of the big non-podcast stories.

196
00:17:41,708 --> 00:17:42,708
This week was...

197
00:17:43,436 --> 00:17:44,788
Tim Cook retiring.

198
00:17:46,604 --> 00:17:48,340
And I just like.

199
00:17:50,828 --> 00:17:58,772
Well, Marco will be happy. He thought Tim Cook was a huge traitor to liberalism.

200
00:17:59,868 --> 00:18:08,772
It's like, if you look across, everybody's acting like Tim Cook was... I mean, I don't care. Who cares?

201
00:18:09,660 --> 00:18:11,524
But just seeing a...

202
00:18:12,060 --> 00:18:27,012
point that's made over and over is everybody is saying, well, you know, no matter what you think about Tim Cook, he was like a business genius who will never be matched again because of him taking Apple, you know, up to being a $4 trillion market cap.

203
00:18:27,708 --> 00:18:28,260
This is...

204
00:18:29,116 --> 00:18:32,900
If you look across the whole tech industry, there are many companies that have.

205
00:18:33,372 --> 00:18:42,244
Yeah.

206
00:18:42,972 --> 00:18:59,500
Runs that they'll never have again, because it's not necessarily about being any particular genius. Well, I'm not saying he's bad at his job. I'm just saying that, like, when you literally spend almost a trillion dollars on stock.

207
00:18:59,592 --> 00:19:00,080
Buybacks.

208
00:19:02,120 --> 00:19:13,424
You're going to shoot that. That number is going to go way up. Well, also, he's a supply chain guy. So he did really good things with the supply chain.

209
00:19:14,568 --> 00:19:19,120
Okay, so Eric PP says you can actually turn stuff off. Oh, that's very cool

210
00:19:20,776 --> 00:19:23,152
if you want to, you can turn that stuff off.

211
00:19:24,296 --> 00:19:41,456
What off? Spam. You can turn spam off under a certain number. Hide boost amounts below. See, there you go. Oh, you got a threshold. Oh, yeah. Well, this is, of course, this is Helipad. This is one of the best pieces of software in the universe. It's a spam assassin for boosts.

212
00:19:42,984 --> 00:19:48,816
Is anybody – there's probably one dude out there still running Spam Assassin on like a box in his closet.

213
00:19:50,024 --> 00:19:59,500
You know, the other day I had one of those moments where I'm like, I should probably check and see if I really got all the Bitcoin off of that old laptop.

214
00:20:00,136 --> 00:20:07,952
You ever done one of those? Yeah, where you have a panic moment, yeah. Well, it's just like, you know, because I famously sold 65 Bitcoin at $900.

215
00:20:08,360 --> 00:20:16,080
Like, let me just go see. And so it was Bitcoin Core QT running on a MacBook Air.

216
00:20:17,096 --> 00:20:31,024
So it wouldn't even connect to peers or anything. But I'm like, oh, OK, I can see, you know, I could see all the addresses. And yeah, there was like 19 sats here or there. But then there was all these transfers to AC Android wallet. I'm like, huh?

217
00:20:31,784 --> 00:20:32,784
I wonder.

218
00:20:33,256 --> 00:20:47,984
Oh, you have a mystery wallet somewhere. Like 30 Bitcoin. Like, I've got to find this thing. And so I'm talking to Tina, and this was before the whatever. This is when she was – maybe that's the reason for the whatever at the end of this story.

219
00:20:48,424 --> 00:20:54,320
And an Android phone. So this must have been around 2013.

220
00:20:54,728 --> 00:20:59,300
You keep all your old devices, though. Every single one.

221
00:20:59,936 --> 00:21:06,984
She says, well, when I was dating you, I remember you had been to the strip joint.

222
00:21:08,544 --> 00:21:26,920
And your Nokia E71 was left in the Uber. I said, okay, so the E71. How can you make it rain with Bitcoin at the strip club? Well, no, but this was the Nokia. She was just trying to help me identify devices. And I know that. Baby, trust me. I got Bitcoin. I'll send it to you.

223
00:21:27,328 --> 00:21:31,336
This is the yellow rose. It was some business thing. I wasn't really.

224
00:21:31,712 --> 00:21:33,608
wasn't really going for the strippers.

225
00:21:34,784 --> 00:21:36,136
And.

226
00:21:36,736 --> 00:21:43,592
And so I was like, okay. And I said, but I had an iPhone and an iPhone 4.

227
00:21:43,968 --> 00:21:46,184
And I'm not quite sure what came after that, so...

228
00:21:46,528 --> 00:21:53,032
You know, I go into the bin and literally the phones are in order. The Nokia E71.

229
00:21:54,048 --> 00:21:57,672
iPhone 4 and right in between that was an old Pixel.

230
00:21:59,980 --> 00:22:06,228
Bingo. I mean, how awesome is that? And she said, I will never complain ever again about you keeping all of your old crap.

231
00:22:06,668 --> 00:22:11,380
Bitcoin on this thing. I bet. I bet. Yeah. Well, and guess what? Zero.

232
00:22:11,948 --> 00:22:26,452
0.00. Of course, 0.00. Oh, yeah. We all have the story of selling – like I sold six Bitcoin at $1,800 and thought I was a genius. Yeah, it's – whatever.

233
00:22:27,020 --> 00:22:28,916
Yes. So live and learn.

234
00:22:29,516 --> 00:22:30,900
So this is interesting.

235
00:22:31,468 --> 00:22:32,692
Uh...

236
00:22:33,260 --> 00:22:50,964
RSS payment boost, TrueFans. Okay, TrueFans. It's coming through a little odd, but it's coming through. Cook missed every tech cycle. Search, AI, EV, and glasses. Also cloud, and the list goes on. Not a genius. He is Apple's Ballmer.

237
00:22:52,780 --> 00:22:59,600
Well, here's this. That's the most Sam Sethi thing I've ever heard. Yeah. Yes. Now, here's the only thing about this new guy. And.

238
00:23:00,332 --> 00:23:20,884
And unfortunately, his name has too many syllables. You know, Steve Jobs, Tim Cook. This guy came in. He has like too many syllables in his last name. I don't remember. What is his last name? Ternus. Ternus. OK, it's going to be hard. John Ternus. Is it John Ternus? John Ternus. John Ternus. So he's the hardware guy. Well.

239
00:23:21,484 --> 00:23:41,844
That's interesting, because if I've learned anything over the past 18 months, particularly in the last six, you know, Apple has their unified memory, and their own silicon is highly usable.

240
00:23:42,220 --> 00:23:42,964
for the AI.

241
00:23:44,492 --> 00:24:00,200
Oh, yeah. Yeah. And, you know, they've had all kinds of AI-capable type chips in the phones. They may have an incredible... and they've held off. You know, it's like they've had a few misses. They've pulled back from the AI nonsense. And boy.

242
00:24:00,420 --> 00:24:20,300
I can't blame him, because that Samsung that I got... Yes, it always sounds like, hi, I'm Bixby. Oh, it's some stupid agent. Yeah. Would you like for me to recognize you talking to me automatically? No. And get off. You can't even take it off the phone. Bixby. That's the Samsung AI.

243
00:24:21,156 --> 00:24:33,868
Everybody has got every company is making some agent and giving it some stupid cutesy name. It's like you should ask our agent, you know.

244
00:24:34,532 --> 00:24:41,836
you know, to Todd. I was like, I don't want to, I'm not going to talk to your agent by name. Just quit it. Yeah, exactly.

245
00:24:42,340 --> 00:24:43,532
Um,

246
00:24:44,164 --> 00:24:56,940
But I'm seeing the revolution unfolding. You know, this new, I haven't been able to test it, the DeepSeek V4. There's a cloud tag on Ollama. It's not actually there yet.

247
00:24:57,444 --> 00:24:59,500
This thing's going to be pretty interesting.

248
00:24:59,500 --> 00:25:02,224
With one million token context.

249
00:25:05,128 --> 00:25:10,288
I haven't even heard of this one. Yeah, yeah, DeepSeek V4.

250
00:25:11,112 --> 00:25:17,296
Let me see. The V4 Pro, 1.6 trillion parameters. Yeah, 1.6 trillion parameters.

251
00:25:18,728 --> 00:25:29,040
V4 Flash, 284 billion parameters, 13 billion active. Open weight. MIT license, 1 million token context window.

252
00:25:29,736 --> 00:25:30,224
Uh...

253
00:25:30,696 --> 00:25:32,880
trained on domestic Chinese chips.

254
00:25:33,992 --> 00:25:34,928
um,

255
00:25:36,168 --> 00:25:58,800
You know, this could be something very interesting. These are huge. These models are giant. So I've really become kind of 1.6 trillion parameter model. Yeah, that's pretty big. But, you know, this, what is it, together.ai? They're the ones that have been most stable for me of all the rented GPU outfits. This is.

256
00:25:58,860 --> 00:26:00,596
This has changed my life.

257
00:26:01,132 --> 00:26:02,548
Yeah.

258
00:26:03,180 --> 00:26:26,068
You know, it's like, oh, there's a five hour senatorial testimony on C-SPAN. OK, robot, go download it. Download it. OK, run it through the fastest, most awesomest whisper model you have. Do word by word. Throw the JSON onto your drive and then summarize.

259
00:26:27,372 --> 00:26:36,468
a summary. All right. Well, that sounds interesting. Any good fun quotes you can get there for some clips, you know. And that costs 15 cents an hour.
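(A minimal sketch of the transcription workflow described above, assuming yt-dlp and the faster-whisper library are installed; the summarize step is left as a stub since the show doesn't say which model or endpoint handles it.)

```python
# Sketch: grab a long video, transcribe it word by word with Whisper,
# dump the JSON to disk, then hand the text off for a summary.
# Assumptions: yt-dlp and faster-whisper are installed; summarize() is a stub.
import json
import subprocess
from faster_whisper import WhisperModel

URL = "https://example.com/hearing"   # hypothetical C-SPAN URL
AUDIO = "hearing.mp3"

# Download just the audio track.
subprocess.run(["yt-dlp", "-x", "--audio-format", "mp3", "-o", AUDIO, URL], check=True)

# Word-level transcription ("word by word").
model = WhisperModel("large-v3")
segments, _ = model.transcribe(AUDIO, word_timestamps=True)

words = [
    {"word": w.word, "start": w.start, "end": w.end}
    for seg in segments
    for w in (seg.words or [])
]

# "Throw the JSON onto your drive."
with open("hearing_words.json", "w") as f:
    json.dump(words, f)

def summarize(text: str) -> str:
    # Stub: send the transcript to whatever rented LLM endpoint you use.
    raise NotImplementedError

print(summarize(" ".join(w["word"] for w in words)))
```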

260
00:26:36,908 --> 00:26:42,164
Yeah, and all you would have to do then is just publish it to Spreaker and you can start getting ad revenue.

261
00:26:42,604 --> 00:26:58,300
But the point is, so if you want to use a model like this, you can run it pretty efficiently from, and I don't even know what together, I don't even know what their model is, their business model. I'm sure you're using some dude's gaming computer. Isn't that kind of what they all are?

262
00:26:58,616 --> 00:27:02,112
You know the new thing, right?

263
00:27:06,616 --> 00:27:07,776
call it.

264
00:27:10,488 --> 00:27:11,328
distributed

265
00:27:12,216 --> 00:27:13,568
Proxy.

266
00:27:15,736 --> 00:27:17,344
I can't remember the name, but.

267
00:27:18,232 --> 00:27:27,232
Well, serverless inference is one of the catchphrases. No, this is a scam tactic where you –

268
00:27:29,688 --> 00:27:32,864
where you sign up for something and you're giving.

269
00:27:33,656 --> 00:27:35,744
You're giving your...

270
00:27:36,536 --> 00:27:42,048
You're giving a third-party access to use your computer as a proxy? Yes.

271
00:27:44,376 --> 00:27:47,872
Nefarious actors can distribute their workload and not come from.

272
00:27:48,408 --> 00:27:48,928
the center.

273
00:27:49,464 --> 00:27:58,100
of system numbers. That's also cool. Yes. Yeah, this is like, people are cracking down on this all over the place because people are, they're using it like.

274
00:27:58,256 --> 00:28:01,176
installing Roku apps that have this thing embedded.

275
00:28:02,064 --> 00:28:12,248
Oh, yeah. Suddenly your television is being used for a proxy for a bot army. Yeah, that's good. And bingo, Bitwarden gets popped. Yeah.

276
00:28:12,752 --> 00:28:13,912
How about that, huh?

277
00:28:14,576 --> 00:28:19,896
Yeah, I mean this stuff is probably coming through these distributed proxy networks now.

278
00:28:21,456 --> 00:28:32,472
as their attack vector because they're so, like, it's all over the world. And what you need, like what we do in the Podcast Index, you know, we clearly, we...

279
00:28:32,848 --> 00:28:34,936
block a ton of

280
00:28:37,104 --> 00:28:38,520
of data centers.

281
00:28:39,312 --> 00:28:47,480
a lot of heuristic looking at traffic that's coming from a data center, that's coming from an IP that's related to an autonomous system.

282
00:28:48,656 --> 00:28:51,352
which is connected to a data center provider.

283
00:28:52,176 --> 00:28:53,688
And so, like...

284
00:28:54,160 --> 00:28:57,700
If you have a browser user agent.

285
00:28:58,176 --> 00:28:59,688
claiming to be Chrome

286
00:29:00,736 --> 00:29:08,360
coming from AWS. Yeah, you're no good. You're no good. Yeah, you're not Chrome. You know, you're an imposter.
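(A rough sketch of the heuristic being described: a data-center IP presenting a browser user agent gets treated as an imposter. The CIDR list uses RFC 5737 documentation ranges as placeholders, not real cloud-provider blocks.)

```python
# Heuristic sketch: a "browser" user agent arriving from a data-center IP
# is flagged. The CIDR list here is a placeholder; a real deployment would
# load published cloud-provider ranges instead.
import ipaddress

DATACENTER_RANGES = [ipaddress.ip_network(c) for c in ("203.0.113.0/24", "198.51.100.0/24")]

def is_datacenter(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in DATACENTER_RANGES)

def looks_like_browser(user_agent: str) -> bool:
    return any(token in user_agent for token in ("Chrome/", "Firefox/", "Safari/"))

def should_block(ip: str, user_agent: str) -> bool:
    # Claims to be Chrome but comes from a data center: "you're not Chrome."
    return looks_like_browser(user_agent) and is_datacenter(ip)

print(should_block("203.0.113.7", "Mozilla/5.0 ... Chrome/126.0"))  # True
```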

287
00:29:10,240 --> 00:29:26,120
Yeah, one of the huge benefits of running Omarchy is, and I guess you could do it with the Mac too, but all my bot-based stuff, I pipe it all through my home desktop. A home IP coming from a home machine looks legit.

288
00:29:27,328 --> 00:29:32,040
And I've had – I'm currently moving my freedom controller to my –

289
00:29:32,416 --> 00:29:33,608
to my house.

290
00:29:34,464 --> 00:29:49,224
to Ubuntu or to my podcast rig. You should run it on the same machine that's running our Alby Hub because it's working so well. What? Is Alby Hub working? Is it not working? Not for Comic Strip Blogger. He complained again.

291
00:29:51,456 --> 00:29:58,000
What? I got to see this. Podcast.

292
00:29:58,124 --> 00:29:59,764
L index, LN.

293
00:30:00,748 --> 00:30:02,228
it.

294
00:30:02,924 --> 00:30:10,900
I don't understand, because it's up and running. Well, I've got machine two here, and it is running right here.

295
00:30:14,284 --> 00:30:15,796
18

296
00:30:16,428 --> 00:30:18,132
1842. What is 1842?

297
00:30:19,596 --> 00:30:22,676
I don't know what 1842 is. It was a good year for wine.

298
00:30:24,204 --> 00:30:26,580
It was a good year.

299
00:30:27,564 --> 00:30:28,724
generated boost

300
00:30:29,132 --> 00:30:30,420
Oh well.

301
00:30:31,468 --> 00:30:55,796
I don't know what to say. It's up and running. I mean it's just sitting right here doing its thing. Anyway, so you're moving the Freedom Controller over to your own home machine. Yeah, because everybody's doing this. Everybody's blocking data center traffic, and my Freedom Controller for many years has been running in Linode. Right. And so increasingly I'm getting 403 just denied. Yeah, yeah, yeah, yeah. And so I can't save articles into my archive anymore.

302
00:30:55,884 --> 00:30:57,900
to have to move it. Luckily, we...

303
00:30:58,248 --> 00:31:01,680
We facilitated for this a long time ago and.

304
00:31:02,344 --> 00:31:05,360
you can use. So one of the things the Freedom Controller can do is it'll

305
00:31:06,728 --> 00:31:07,760
set up a bucket.

306
00:31:09,224 --> 00:31:10,832
an S3 or somewhere.

307
00:31:11,432 --> 00:31:13,072
you can hit the bucket.

308
00:31:13,704 --> 00:31:17,456
And it'll bounce you to whatever the current IP address is.

309
00:31:17,992 --> 00:31:22,576
of where the Freedom Controller actually is. Oh, cool. Yeah, it's fine.
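(A sketch of the bounce mechanism being described, assuming boto3 and an S3 bucket with static-website hosting enabled; the bucket name, key, and ipify lookup are illustrative, not the actual Freedom Controller setup.)

```python
# Sketch: periodically publish the machine's current public IP as an S3
# website redirect, so clients can hit a stable bucket URL and get bounced
# to wherever the server actually lives right now.
# Assumptions: boto3 credentials are configured, the bucket has static
# website hosting enabled, and the names below are hypothetical.
import boto3
import requests

BUCKET = "example-controller-redirect"   # hypothetical bucket
KEY = "current"                          # hypothetical object key

public_ip = requests.get("https://api.ipify.org", timeout=10).text.strip()

s3 = boto3.client("s3")
s3.put_object(
    Bucket=BUCKET,
    Key=KEY,
    Body=b"",
    # Clients hitting the website endpoint for this key get a 301 here.
    WebsiteRedirectLocation=f"http://{public_ip}/",
)
```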

310
00:31:23,816 --> 00:31:24,880
just fine this way.

311
00:31:25,320 --> 00:31:26,672
But I don't know. It's...

312
00:31:27,048 --> 00:31:28,592
Like, this is a...

313
00:31:29,640 --> 00:31:45,712
This is becoming a real problem for attackers, so now they're going to have to figure out a way to take over residential IPs, and that's behind all of this. They just do that in the routers, right? Aren't the routers complete pieces of crap that are just all Swiss cheese? Oh, yeah, for sure. Yeah.

314
00:31:46,312 --> 00:31:48,752
Your router has about as much, you know.

315
00:31:49,512 --> 00:31:52,592
If anybody sits down with a commercial router.

316
00:31:54,856 --> 00:31:55,472
more than

317
00:31:57,824 --> 00:32:05,192
You can just imagine that people are stripping the firmware out of those things, just running them through an LLM to find bugs. Right.

318
00:32:05,664 --> 00:32:06,376
As we're talking.

319
00:32:07,008 --> 00:32:14,152
This is a change in the world that people are not very aware of, and it's going to mean more

320
00:32:15,232 --> 00:32:22,152
pain, but also more people will be needed to reconfigure everything. The whole centralized model is under attack.

321
00:32:22,944 --> 00:32:24,360
Yes, the Slopocalypse.

322
00:32:24,864 --> 00:32:32,520
Slop apocalypse. Yes. Yeah. It's, it's, it's happening and we're not ready. We're not ready for it. Yeah. Really not.

323
00:32:35,168 --> 00:32:35,752
You know,

324
00:32:38,240 --> 00:32:39,400
It's so hard.

325
00:32:39,776 --> 00:32:41,640
to find this stuff.

326
00:32:44,064 --> 00:32:46,248
Part of it is just – part of it is –

327
00:32:47,328 --> 00:32:48,936
There's going to be some amount of.

328
00:32:49,344 --> 00:32:50,152
slop

329
00:32:52,128 --> 00:32:53,864
is just kind of undetectable.

330
00:32:56,224 --> 00:32:57,700
Automate your way around it.

331
00:32:58,368 --> 00:33:00,648
I mean, I've been—this is—

332
00:33:01,472 --> 00:33:04,872
been focused almost squarely on this for

333
00:33:05,696 --> 00:33:07,272
couple of, you know, two, three weeks now.

334
00:33:07,968 --> 00:33:08,744
you

335
00:33:10,752 --> 00:33:11,656
tell you that

336
00:33:13,792 --> 00:33:19,464
I'm in model training mode right now. That's the next step, and that's what I'm working on.

337
00:33:19,936 --> 00:33:21,064
is complicated.

338
00:33:24,928 --> 00:33:25,800
So...

339
00:33:26,208 --> 00:33:30,152
We can talk about it if you want to. Well, I just want to answer.

340
00:33:30,976 --> 00:33:39,144
He's in the boardroom. What even is slop? By the way, whenever I play an AI-generated song on No Agenda in...

341
00:33:40,096 --> 00:33:43,304
show mix, he's the first one to say AI minus.

342
00:33:44,960 --> 00:33:47,944
So you clearly know what it is.

343
00:33:49,024 --> 00:33:57,064
Slop is what pigs eat, and it's usually a lot of the same of it. I think that's the definition.

344
00:33:59,232 --> 00:34:02,312
And this is the thing.

345
00:34:03,232 --> 00:34:05,096
That we can't.

346
00:34:05,728 --> 00:34:06,984
ever get around.

347
00:34:09,504 --> 00:34:14,376
I'm talking about more than just technology. This goes way, this goes all the way back to.

348
00:34:14,912 --> 00:34:15,912
famous.

349
00:34:17,024 --> 00:34:17,640
of I don't know.

350
00:34:18,528 --> 00:34:21,896
I can't define pornography, but I know it when I see it. Yeah, yeah.

351
00:34:22,560 --> 00:34:25,896
There's a lot of – there are many things like this.

352
00:34:26,272 --> 00:34:27,112
the world.

353
00:34:27,808 --> 00:34:28,776
Buh-uh-uh.

354
00:34:29,312 --> 00:34:31,752
You know, one thing is just language in general.

355
00:34:32,224 --> 00:34:33,736
You know, we talk, we, we.

356
00:34:35,296 --> 00:34:37,448
what we were talking about last week, but it brought up

357
00:34:38,336 --> 00:34:39,304
idea

358
00:34:39,936 --> 00:34:43,624
It reminded me of Noam Chomsky's work on language.

359
00:34:44,160 --> 00:34:47,912
Before Noam Chomsky was the sort of –

360
00:34:49,408 --> 00:34:55,336
Anti-war. Before he was friends with Epstein, you mean? Oh, that's – yeah.

361
00:34:56,352 --> 00:34:57,900
Um,

362
00:34:58,440 --> 00:35:03,536
Before all that, before he was the lovable kook.

363
00:35:04,904 --> 00:35:05,552
Um...

364
00:35:06,216 --> 00:35:09,488
His big contribution was...

365
00:35:09,896 --> 00:35:11,344
in the world of

366
00:35:11,912 --> 00:35:14,128
language and philology.

367
00:35:15,112 --> 00:35:16,688
Phalology?

368
00:35:17,800 --> 00:35:20,336
P-H-I-L-O-L-O-G-Y. Hold on.

369
00:35:20,936 --> 00:35:25,168
Book of Knowledge, give me the definition of philology.

370
00:35:30,824 --> 00:35:45,520
According to the Book of Knowledge, philology is the study of language in written historical sources, combining linguistics with literary criticism and historical analysis to understand texts and their cultural contexts.

371
00:35:46,280 --> 00:35:47,056
Those

372
00:35:47,624 --> 00:35:48,784
It has been written.

373
00:35:49,480 --> 00:35:51,472
Hey, can I vibe code or what?

374
00:35:54,440 --> 00:35:57,900
It's just – it would be like I can't –

375
00:35:59,048 --> 00:36:02,128
I can't help wishing that that sounded like.

376
00:36:02,632 --> 00:36:04,624
the, the priest.

377
00:36:05,736 --> 00:36:13,776
Monty Python and the Holy Grail reading from the Book of Armaments. I mean...

378
00:36:14,376 --> 00:36:16,368
I got it where I wanted it to be.

379
00:36:16,744 --> 00:36:28,944
That one caught me off guard. Has that been on No Agenda? Because I introduced it a couple of shows ago. So I have a little interface with a push-to-talk button.

380
00:36:29,704 --> 00:36:47,504
And it's piped in exactly the way I want it. And to bridge over the super fast whisper, but then the FFmpeg process, which takes the voice and then adds the echo to it, I added the little page scribbling.
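(A rough sketch of the FFmpeg step being described, adding an echo to the synthesized voice and mixing a page-scribble sound under it; filenames and filter settings are guesses, not the actual rig's values.)

```python
# Sketch of the FFmpeg step: take the TTS voice, add an echo, and mix a
# little page-scribbling sound underneath. Filenames and filter parameters
# are illustrative, not the show's actual settings.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "voice.wav",       # hypothetical TTS output
        "-i", "scribble.wav",    # hypothetical page-scribble effect
        "-filter_complex",
        # aecho adds the echo to the voice, then amix blends the scribble under it.
        "[0:a]aecho=0.8:0.9:60|120:0.4|0.3[v];[v][1:a]amix=inputs=2:duration=first[out]",
        "-map", "[out]",
        "book_of_knowledge.wav",
    ],
    check=True,
)
```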

381
00:36:48,648 --> 00:36:50,128
This is...

382
00:36:50,536 --> 00:36:57,400
I'm dangerous with this stuff. That's A-plus. That's A-plus work, brother. Thank you. Okay.

383
00:36:57,524 --> 00:37:12,860
You know what it is. I like being – this is why I don't listen to No Agenda because I like being surprised by your antics. Philology. OK. But that's where Chomsky's chops came from.

384
00:37:13,652 --> 00:37:15,964
That's where he...

385
00:37:16,308 --> 00:37:19,548
sharpened up his chops was in language.

386
00:37:20,244 --> 00:37:23,228
And that's...

387
00:37:23,604 --> 00:37:27,868
What he said, what his big contribution was, and I think it's still.

388
00:37:28,948 --> 00:37:33,724
pretty much the defining characteristic of language.

389
00:37:34,612 --> 00:37:35,516
astrology

390
00:37:36,404 --> 00:37:37,180
is that

391
00:37:38,772 --> 00:37:40,860
He said that language.

392
00:37:41,940 --> 00:37:45,276
It appears to be something built into.

393
00:37:47,220 --> 00:37:49,532
Like it's not something that is learned.

394
00:37:50,196 --> 00:37:54,940
So you could say, okay, is language a general knowledge thing?

395
00:37:55,380 --> 00:37:56,572
Where we...

396
00:37:57,220 --> 00:38:00,684
learn it the same way we learn how to drive a car.

397
00:38:01,380 --> 00:38:01,900
the same way.

398
00:38:03,844 --> 00:38:04,364
Um.

399
00:38:05,572 --> 00:38:08,012
put silverware in a certain order on a table.

400
00:38:09,732 --> 00:38:10,540
He says no.

401
00:38:11,172 --> 00:38:13,996
The language is something that's unique.

402
00:38:14,788 --> 00:38:16,780
wired into the human mind.

403
00:38:17,156 --> 00:38:18,412
way that

404
00:38:18,820 --> 00:38:24,844
seems to be pre-existing. So you could say that humans are...

405
00:38:25,572 --> 00:38:28,684
are born with the faculty.

406
00:38:29,380 --> 00:38:31,596
the property of language.

407
00:38:32,196 --> 00:38:33,100
construction.

408
00:38:34,340 --> 00:38:39,116
we understand what he, I think, if I'm not mistaken, I think the term.

409
00:38:39,684 --> 00:38:41,452
used was universal grammar.

410
00:38:44,516 --> 00:38:49,772
He said that this spans all different languages.

411
00:38:50,404 --> 00:38:54,188
On the outside, it may look like Mandarin Chinese and

412
00:38:55,460 --> 00:38:56,900
Italian are

413
00:38:57,440 --> 00:39:11,144
But on the inside, it's really all tokenized. Is that what you're getting? It's all token. Yeah. It's an API. But it looks like from the outside, those are completely different languages.

414
00:39:11,488 --> 00:39:15,080
But what's happening in the mind is something.

415
00:39:15,648 --> 00:39:17,576
Chomsky called universal grammar.

416
00:39:19,072 --> 00:39:19,720
So.

417
00:39:20,064 --> 00:39:21,480
this idea

418
00:39:24,384 --> 00:39:26,088
universal set of rules.

419
00:39:31,584 --> 00:39:37,096
We all share this structure, no matter what the overarching language is.

420
00:39:38,304 --> 00:39:42,024
We all operate on a subject, an object.

421
00:39:44,480 --> 00:39:46,184
and other parts of grammar.

422
00:39:46,848 --> 00:39:49,416
that even though they may be in a different order.

423
00:39:50,048 --> 00:39:54,280
where they may be a little bit different. We all understand what these things are, you know? Yeah.

424
00:39:55,296 --> 00:39:56,900
And so...

425
00:39:57,376 --> 00:39:58,504
Uh...

426
00:39:58,848 --> 00:40:01,928
I don't know if I said this before, but it's sort of like a...

427
00:40:02,400 --> 00:40:05,288
Like mere Christianity. Like there's.

428
00:40:05,888 --> 00:40:20,968
There's a lot of different theologies, lots of different doctrines, all these kinds of things. But the core of what Christianity is, it's like these two or three things. Yeah, like Jesus come quickly. Everyone in Christianity is thinking that. Yeah, it's just, you know, so there's...

429
00:40:22,912 --> 00:40:24,744
That's this, that's...

430
00:40:25,280 --> 00:40:26,920
When it comes to language,

431
00:40:29,696 --> 00:40:34,056
As I'm working through the train, this how to train.

432
00:40:34,656 --> 00:40:40,584
this model thing, I'm, you know, beginning to sort of see

433
00:40:41,824 --> 00:40:43,368
pattern here.

434
00:40:45,152 --> 00:40:48,616
That goes back to what we were talking about earlier, which is.

435
00:40:49,536 --> 00:40:51,752
These things are hard to define.

436
00:40:52,992 --> 00:40:54,248
but we know what they are.

437
00:40:56,000 --> 00:40:56,900
Um,

438
00:40:57,376 --> 00:41:00,648
If I gave flashcards...

439
00:41:01,696 --> 00:41:02,568
to

440
00:41:03,680 --> 00:41:06,344
Everyone in the boardroom.

441
00:41:06,880 --> 00:41:10,152
And gave him 20 flashcards and said...

442
00:41:10,592 --> 00:41:12,840
Okay, and then played a – played –

443
00:41:13,728 --> 00:41:20,584
14 seconds of each podcast and say, okay, here's the art, here's the title, here's the description of the podcast.

444
00:41:21,056 --> 00:41:23,048
Here's 14 seconds of audio.

445
00:41:25,440 --> 00:41:26,696
20 out of 20.

446
00:41:27,616 --> 00:41:32,840
would identify every single one of them perfectly as either AI slop.

447
00:41:34,624 --> 00:41:37,096
just a normal podcast. I agree.

448
00:41:38,656 --> 00:41:44,776
Now, when you try to take that and put it into an LLM, into a model.

449
00:41:45,248 --> 00:41:45,928
It is.

450
00:41:47,328 --> 00:41:49,736
You hit a barrier.

451
00:41:50,432 --> 00:41:57,100
And so – and I think that the issue has something to do – so we've talked about Jacques a little

452
00:41:57,100 --> 00:42:04,336
about Jacques Ellul before. Yes, a good old Jacques. Many times. Yes. Yes. Professor Jacques.

453
00:42:05,352 --> 00:42:08,560
He's an interesting guy.

454
00:42:09,800 --> 00:42:14,608
had his two seminal works, Technological Society and Propaganda.

455
00:42:15,176 --> 00:42:19,152
One of the key takeaways from his work, Propaganda.

456
00:42:20,136 --> 00:42:24,528
Is that if you're trying to convince someone or a group of people that your idea is right.

457
00:42:25,096 --> 00:42:28,464
If that's what you're doing as your propaganda.

458
00:42:30,216 --> 00:42:33,872
then what you're doing is not propaganda or you're not very good at it.

459
00:42:34,888 --> 00:42:37,456
Right. Because.

460
00:42:38,536 --> 00:42:40,272
Attempts at propaganda, like.

461
00:42:41,768 --> 00:42:49,616
Attempts of propaganda by someone who's not good at it, they'll try to do that. They'll try to explain to you why they are right.

462
00:42:50,056 --> 00:42:54,608
But what he says is propaganda is not about ideas. It's about actions.

463
00:42:56,488 --> 00:42:58,000
Real propaganda.

464
00:42:59,564 --> 00:43:10,516
does not care about the ideas in any shape, form, or fashion. They change over time. They're different every day. The news cycle changes. It's all kind of irrelevant.

465
00:43:10,860 --> 00:43:15,028
All that matters is whether you drive a particular outcome.

466
00:43:15,468 --> 00:43:16,180
action.

467
00:43:17,164 --> 00:43:21,556
and say a thing and then that provokes an action in the listener.

468
00:43:23,084 --> 00:43:25,588
Your job is done. That is propaganda.

469
00:43:26,092 --> 00:43:29,524
Would you like to hear a readout?

470
00:43:29,900 --> 00:43:32,308
of one of my agents that

471
00:43:33,132 --> 00:43:36,372
searches for YouTube videos and has been trained.

472
00:43:36,940 --> 00:43:39,444
to avoid AI slop videos.

473
00:43:41,548 --> 00:43:43,060
Okay, red flags.

474
00:43:43,628 --> 00:43:49,268
Telltale AI phrasing. So it only does it on Whisper transcripts.

475
00:43:49,708 --> 00:43:50,388
Personally, I think.

476
00:43:51,276 --> 00:43:52,948
Let me break this down.

477
00:43:53,324 --> 00:43:58,000
What many people don't realize, if you take a step back and think about it,

478
00:43:58,156 --> 00:44:01,588
One thing that immediately stands out from my perspective

479
00:44:02,092 --> 00:44:23,892
What this really suggests is deep dive, red flags in the channel, no real person or camera or identifiable name or title, generic aggregator channel names, stock footage with TTS narration, monotone, flat, overly uniform anchor voice, synthetic rhythm, no natural reading errors, no pacing variations.

480
00:44:24,716 --> 00:44:43,604
The rule, scan first few lines of transcript before anything. If the patterns are there, don't cut the audio. The content might still be useful research, but the AI voice never goes on air. Adam caught seven AI slop clips on NA 1857, rejected them all on content. So that's how my agent is now identifying this stuff.
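(A toy version of the "scan the first few lines of the transcript" rule read out above; the phrase list comes from the red flags mentioned on the show, and the two-hit threshold is an assumption.)

```python
# Toy version of the agent rule above: scan the first few lines of a Whisper
# transcript for telltale AI phrasing before anything else. The phrases are
# the ones read out on the show; the "two hits" threshold is an assumption.
AI_TELLS = [
    "let me break this down",
    "what many people don't realize",
    "one thing that immediately stands out",
    "what this really suggests",
    "deep dive",
]

def looks_like_slop(transcript: str, lines_to_scan: int = 5, threshold: int = 2) -> bool:
    head = " ".join(transcript.splitlines()[:lines_to_scan]).lower()
    hits = sum(phrase in head for phrase in AI_TELLS)
    return hits >= threshold

sample = "Let me break this down.\nWhat many people don't realize is...\n..."
print(looks_like_slop(sample))  # True: two telltale phrases in the first lines
```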

481
00:44:44,748 --> 00:44:47,732
Which sounds kind of human-esque, really.

482
00:44:49,004 --> 00:44:52,628
Yeah, and the issue is now do that for –

483
00:44:53,004 --> 00:44:57,800
A few hundred podcasts an hour. A minute. Oh, an hour. Yeah, right. Yeah, exactly. Exactly.

484
00:44:58,052 --> 00:45:04,876
And so this is the problem. So when you're restricted to only sort of –

485
00:45:06,308 --> 00:45:08,076
When you're restricted to only language.

486
00:45:09,828 --> 00:45:12,300
don't have the processing horsepower.

487
00:45:12,996 --> 00:45:15,532
listen to all the audio.

488
00:45:17,124 --> 00:45:28,780
It is very difficult. But isn't it amazing, though, that with three watts that powers my entire brain, I can do it in seconds? And this was Chomsky's.

489
00:45:30,500 --> 00:45:33,004
genius was that

490
00:45:33,828 --> 00:45:35,308
It's not math.

491
00:45:36,644 --> 00:45:40,652
The way our minds work is not a math problem.

492
00:45:42,660 --> 00:45:49,516
It's a meth problem. Yeah, sometimes, yes.

493
00:45:50,564 --> 00:45:51,340
So.

494
00:45:52,836 --> 00:45:54,220
It's this like.

495
00:45:55,428 --> 00:45:56,620
ideas

496
00:45:57,744 --> 00:45:59,704
I think the nature of what –

497
00:46:01,200 --> 00:46:03,512
take if you take sort of what Jacques Ellul was saying

498
00:46:05,584 --> 00:46:08,152
He's saying that action...

499
00:46:09,680 --> 00:46:10,808
action

500
00:46:11,440 --> 00:46:17,336
Words and actions are – they share sort of a congruent relationship.

501
00:46:18,384 --> 00:46:20,088
They function together.

502
00:46:20,912 --> 00:46:25,432
Words drive action, and action drives words.

503
00:46:26,256 --> 00:46:27,832
have a relationship.

504
00:46:28,720 --> 00:46:30,904
that is mysterious.

505
00:46:31,824 --> 00:46:34,296
But it is there.

506
00:46:38,064 --> 00:46:45,880
When we talk to people we have relationships with, it really matters what the ideas are because we want to be on the same page with each other.

507
00:46:46,416 --> 00:46:48,696
In a way that something like propaganda doesn't care.

508
00:46:49,392 --> 00:46:54,456
Propaganda just cares about actions, but we don't when we're having real relationships.

509
00:46:55,568 --> 00:46:56,248
Um,

510
00:46:57,456 --> 00:46:58,840
you know, there's...

511
00:46:59,184 --> 00:47:05,208
There's this fundamental paradox with language that the words themselves sort of don't matter.

512
00:47:05,648 --> 00:47:07,608
But at the same time, they very much do.

513
00:47:09,264 --> 00:47:15,096
And I think that's really what you bump up against when you try to make a definition.

514
00:47:15,888 --> 00:47:17,592
Because in one sense, the.

515
00:47:18,000 --> 00:47:21,912
the definition of something that is so hard to put your finger on.

516
00:47:22,320 --> 00:47:26,648
In one sense, the words don't matter because words are kind of fungible.

517
00:47:27,216 --> 00:47:30,840
The English word is five, but the Spanish word is cinco.

518
00:47:31,216 --> 00:47:44,344
They're interchangeable. Also, the meaning of words change over time. Yeah. So the words are just – the words are a conduit for meaning, and that conduit can take many different shapes.

519
00:47:44,848 --> 00:47:50,424
square, round, oval, whatever. Essentially, the language is not the point.

520
00:47:51,088 --> 00:47:53,368
The ideas are, the meanings are the point.

521
00:47:54,544 --> 00:47:57,300
But then there's this other aspect.

522
00:47:58,384 --> 00:48:01,432
where the words themselves are incredibly important.

523
00:48:02,224 --> 00:48:03,896
Like, in fact, they're critical.

524
00:48:05,488 --> 00:48:06,040
Um,

525
00:48:06,384 --> 00:48:15,256
So on the broader scope, the words don't matter, but when it comes to conveying the meaning to another person or another language, passing sort of…

526
00:48:15,664 --> 00:48:21,208
The utility of those words you choose is a critically important aspect of it. Mm-hmm.

527
00:48:22,256 --> 00:48:27,160
So there is like a fundamental paradox that exists within language.

528
00:48:28,528 --> 00:48:49,272
And this is why I brought up the propaganda reference, because when it comes to propaganda, word choice is everything, because the actions you desire as outcomes from your words are a result of the emotional stirring of those words that you use, of those words that you deliver as they sound to the hearer.

529
00:48:50,224 --> 00:48:53,432
So like a good example of this.

530
00:48:54,928 --> 00:48:57,400
Think back to this most recent election.

531
00:48:57,748 --> 00:49:04,060
One of my favorite examples is J.D. Vance. There was the sort of like...

532
00:49:04,532 --> 00:49:09,980
When he was nominated as a running mate, all the typical attacks started, political attacks.

533
00:49:10,324 --> 00:49:14,300
The one thing that came up was that people started calling him, quote, weird.

534
00:49:14,900 --> 00:49:17,084
Do you remember that? Of course.

535
00:49:18,292 --> 00:49:23,932
Weird is a word that young people use to mean that a person makes them sexually.

536
00:49:24,276 --> 00:49:25,020
uncomfortable.

537
00:49:26,164 --> 00:49:29,116
That is what, that is, I have kids.

538
00:49:29,652 --> 00:49:35,356
That is what they mean when they say that person is weird.

539
00:49:37,076 --> 00:49:38,396
That is not a, that is.

540
00:49:39,092 --> 00:49:42,460
That is not something that older people without kids,

541
00:49:43,252 --> 00:49:46,236
or people with older kids, would be familiar with.

542
00:49:47,892 --> 00:49:48,380
Um,

543
00:49:49,300 --> 00:49:50,140
that went like

544
00:49:50,708 --> 00:49:54,172
If a young kid is watching like a movie with a sex thing, they're like,

545
00:49:54,580 --> 00:49:55,708
That's weird.

546
00:49:56,404 --> 00:49:57,500
That is the, that is.

547
00:49:57,528 --> 00:50:04,256
what they mean. You would. And so like, you're only going to know this if you have children that are like in their mid twenties or younger.

548
00:50:05,048 --> 00:50:08,224
And so research definitely went into the use of that word.

549
00:50:09,400 --> 00:50:12,256
And it was chosen to convey that exact meaning.

550
00:50:13,208 --> 00:50:14,400
that generation of people.

551
00:50:16,024 --> 00:50:17,344
You know, so like.

552
00:50:18,488 --> 00:50:20,128
The language.

553
00:50:20,536 --> 00:50:22,144
evokes actions.

554
00:50:23,576 --> 00:50:25,120
is one aspect of it.

555
00:50:25,688 --> 00:50:28,960
The language and then the theme.

556
00:50:29,592 --> 00:50:31,520
The meanings though...

557
00:50:32,920 --> 00:50:35,808
The words convey can change over time.

558
00:50:36,440 --> 00:50:40,096
So as I'm going through this model training thing.

559
00:50:43,128 --> 00:50:48,224
hit this point when I'm interacting with LLM where it asks me

560
00:50:48,600 --> 00:50:49,856
Like it says.

561
00:50:50,200 --> 00:50:52,992
Do you want me to spot check the output of?

562
00:50:53,976 --> 00:50:55,296
blah, blah that it's doing.

563
00:50:56,184 --> 00:50:57,500
And I just stopped.

564
00:50:58,584 --> 00:51:02,176
I'm like, what do you mean? What does that mean? Spot check.

565
00:51:04,280 --> 00:51:04,768
Like what?

566
00:51:06,008 --> 00:51:08,000
What does it mean? I have no idea.

567
00:51:11,000 --> 00:51:15,712
Like, what does it mean it's going to look at something and...

568
00:51:16,920 --> 00:51:18,336
Tell me that it's right?

569
00:51:19,480 --> 00:51:21,312
but in what parameters? Like what?

570
00:51:21,688 --> 00:51:24,096
Like this is the problem. We're –

571
00:51:24,504 --> 00:51:26,688
It's trained on our language. Yeah.

572
00:51:27,192 --> 00:51:35,744
And so it's talking like us. It's trying to communicate something to you. But when I say, hey, Adam, do you want me to spot check this?

573
00:51:36,216 --> 00:51:37,856
You can instantly get it.

574
00:51:38,744 --> 00:51:41,312
You have an understanding of what I mean.

575
00:51:41,912 --> 00:51:46,784
Just innately. We're on the same page just by default.

576
00:51:47,544 --> 00:51:51,360
But when the machine asks me, do you want me to spot check it?

577
00:51:52,504 --> 00:51:57,300
I'm really, I was just sort of paralyzed for a minute. I think you might be overthinking it.

578
00:51:57,328 --> 00:52:00,312
Doesn't it mean exactly the same thing as if I said it to you?

579
00:52:00,720 --> 00:52:02,648
No, because what if I say yes?

580
00:52:03,216 --> 00:52:09,048
It says, do you want me to spot check this data? If I say yes, what am I communicating to it?

581
00:52:09,456 --> 00:52:10,424
I'm not sure.

582
00:52:11,088 --> 00:52:28,056
Because if I say yes, is it going to say – what is it going to then do? What would you expect it to do? But see, here's the question. But the question though is – and this is the action part. This is the part that I'm getting to is that –

583
00:52:28,752 --> 00:52:35,992
If I say yes and without knowing exactly what it intends to do.

584
00:52:36,528 --> 00:52:38,520
What is it going to do?

585
00:52:40,304 --> 00:52:41,496
Hold on a sec.

586
00:52:42,992 --> 00:52:52,120
If you said it to me, no, if I said to you, do you want me to spot check this, Dave? And you said yes, what would you expect me to do?

587
00:52:52,752 --> 00:52:56,600
And why would you know what I was suggesting?

588
00:52:57,808 --> 00:52:58,424
So.

589
00:52:59,568 --> 00:53:00,792
We would have a context.

590
00:53:02,672 --> 00:53:05,208
If I'm saying – if you say –

591
00:53:06,608 --> 00:53:13,496
Hey, Dave, this feed is not updating in the index, and I think that there may be a

592
00:53:14,032 --> 00:53:15,672
few more of this type of feed

593
00:53:17,584 --> 00:53:18,200
you spot check.

594
00:53:18,832 --> 00:53:19,448
Okay.

595
00:53:20,624 --> 00:53:24,472
I would immediately know what.

596
00:53:25,200 --> 00:53:28,408
you are communicating to me and what actions you expect me to take.

597
00:53:28,752 --> 00:53:38,264
Okay, so apply that to your example. You were doing something and the LLM says, do you want me to spot check this? What was the process you were doing?

598
00:53:38,672 --> 00:53:40,280
Right. And so what the.

599
00:53:41,168 --> 00:53:45,048
What I have found through this process.

600
00:53:45,584 --> 00:53:47,832
is the process of model training.

601
00:53:50,864 --> 00:53:55,864
is that a good 90% of the work of model training goes into preparing your data correctly.

602
00:53:56,560 --> 00:53:57,800
training data set.

603
00:53:59,844 --> 00:54:01,932
Having a having a.

604
00:54:04,964 --> 00:54:08,588
having an excellent training data set.

605
00:54:10,244 --> 00:54:12,300
is literally 90% of the job.

606
00:54:17,284 --> 00:54:19,084
I had to go through a lot of steps.

607
00:54:20,804 --> 00:54:24,588
The first one was building what James wanted, which was the dead feed.

608
00:54:25,156 --> 00:54:26,604
problematic feed

609
00:54:30,084 --> 00:54:32,492
That is now built and is functioning.

610
00:54:36,004 --> 00:54:38,892
So I can't release it.

611
00:54:39,780 --> 00:54:41,644
really as a public link i don't think

612
00:54:42,020 --> 00:54:50,956
calls. It's got DMCA stuff in there. We've been asked to remove it. So I don't think we can release that publicly, but I think it would be okay to release it for research

613
00:54:51,716 --> 00:54:55,948
purposes. So somebody would have to like contact me and ask for it.

614
00:54:56,612 --> 00:54:57,800
Chad F.

615
00:54:58,212 --> 00:55:01,260
says hey adam can you spot check this boostogram done

616
00:55:05,252 --> 00:55:10,092
Context understood. Human to human communication complete.

617
00:55:10,468 --> 00:55:12,492
Yeah.

618
00:55:13,636 --> 00:55:21,228
And so if somebody wants it, James or anybody else, call me. We'll work it out. I'll work out how to get it to you.

619
00:55:22,820 --> 00:55:24,172
So that was step one.

620
00:55:26,084 --> 00:55:32,076
as I'm working through what it required was to put a bunch of new SQL, SQL.

621
00:55:32,516 --> 00:55:33,452
Uh...

622
00:55:34,052 --> 00:55:38,828
statements into the index to get all this data out in the correct way.

623
00:55:40,484 --> 00:55:52,620
export it into a SQLite database just like the other one. What I wanted is two SQLite databases, the standard SQLite database that we export every week and this new problematic database that we're going to use as the...

624
00:55:53,444 --> 00:55:54,956
Test. Corpus.

625
00:55:55,620 --> 00:55:57,036
feeds. Yeah.
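
A minimal sketch of what that two-database export could look like, assuming a hypothetical feeds table with a problematic flag; the real index schema and the weekly export job are not shown here, and all the names are made up.

```python
import sqlite3

# Hypothetical source: the weekly full export. Table and column names are
# assumptions for illustration, not the real Podcast Index schema.
src = sqlite3.connect("podcastindex_feeds.db")
dst = sqlite3.connect("problematic_feeds.db")

dst.execute("""
    CREATE TABLE IF NOT EXISTS feeds (
        id     INTEGER PRIMARY KEY,
        url    TEXT,
        title  TEXT,
        reason TEXT
    )
""")

# Copy only the flagged feeds into the separate test-corpus database.
rows = src.execute(
    "SELECT id, url, title, reason FROM feeds WHERE problematic = 1"
)
dst.executemany("INSERT OR REPLACE INTO feeds VALUES (?, ?, ?, ?)", rows)

dst.commit()
src.close()
dst.close()
```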

626
00:55:58,500 --> 00:56:02,156
And so as I'm going through this.

627
00:56:02,628 --> 00:56:05,100
You know you're going to get criddled on this.

628
00:56:05,476 --> 00:56:09,388
That's okay. That's fine. Podcast Index is deplatforming people.

629
00:56:09,796 --> 00:56:10,540
you

630
00:56:11,684 --> 00:56:13,004
Wrong!

631
00:56:13,540 --> 00:56:25,548
I've been criddled before, and I'll be criddled again. That's fine. That's not a problem. So as I'm working through this, I exported the data, and there was a problem.

632
00:56:26,660 --> 00:56:27,308
Thank you.

633
00:56:29,604 --> 00:56:30,956
SQLite database.

634
00:56:32,292 --> 00:56:33,612
initially came out.

635
00:56:35,940 --> 00:56:36,844
corrupt.

636
00:56:38,660 --> 00:56:42,188
The LLM gave me a list of issues.

637
00:56:42,628 --> 00:56:49,708
It was like the SQLite database is unreadable, but also the CSV that we exported.

638
00:56:51,204 --> 00:56:57,500
was missing, it was like missing a field. It gave me like four or five different things that

639
00:56:58,328 --> 00:56:59,488
were wrong.

640
00:57:00,568 --> 00:57:04,064
And then it asked me if I wanted it to...

641
00:57:05,656 --> 00:57:09,120
X, Y, and Z and spot check.

642
00:57:14,168 --> 00:57:16,256
I don't know what you're referring to.

643
00:57:18,232 --> 00:57:23,808
And it said it in such a way that I was left with the impression that if I say yes.

644
00:57:24,472 --> 00:57:28,480
then it's going to perform an action that I may or may not want.

645
00:57:31,512 --> 00:57:32,064
Amen.

646
00:57:33,656 --> 00:57:37,728
This is vague. I don't know what we're doing here anymore.

647
00:57:40,632 --> 00:57:45,152
So this – but this is all part and parcel to …

648
00:57:47,352 --> 00:57:50,688
the difficulty of

649
00:57:51,096 --> 00:57:52,512
saying to

650
00:57:55,192 --> 00:57:57,200
of having to describe

651
00:57:57,260 --> 00:58:00,244
something that is basically indescribable.

652
00:58:02,156 --> 00:58:07,508
Like you, what your example was, you gave it, you gave a list of things, you say.

653
00:58:08,428 --> 00:58:14,868
This tone, this kind of, you know, these phrases. I didn't actually give it that list. It came up with that list itself. I didn't give it that list.

654
00:58:16,012 --> 00:58:18,164
Well, you came up with a set of...

655
00:58:18,860 --> 00:58:22,228
heuristic phrases. No, I didn't.

656
00:58:22,668 --> 00:58:23,892
It did that itself.

657
00:58:24,588 --> 00:58:30,132
OK, excuse me. Based upon a feedback loop of me saying probably 10 times, that's slop.

658
00:58:32,780 --> 00:58:37,460
And we're all, to a certain extent, going through this same exercise.

659
00:58:38,764 --> 00:58:42,196
We are trying to come up with heuristics.

660
00:58:42,700 --> 00:58:45,428
Yeah. To identify some stuff.

661
00:58:47,052 --> 00:58:48,180
But what we're going to end up with.

662
00:58:49,356 --> 00:58:51,252
And this is always how heuristics work.

663
00:58:51,852 --> 00:58:55,348
What we're going to end up with is the same thing that happened in the past.

664
00:58:55,980 --> 00:58:57,200
phishing scams.

665
00:59:00,300 --> 00:59:02,932
The way that any...

666
00:59:03,628 --> 00:59:04,308
adversary

667
00:59:04,748 --> 00:59:10,004
consider AI, in this example, I'll consider AI slop our vague adversary.

668
00:59:10,508 --> 00:59:16,052
The way any adversary combats heuristics is they shrink.

669
00:59:17,356 --> 00:59:18,388
sample size.

670
00:59:19,756 --> 00:59:21,236
get smaller and smaller and smaller.

671
00:59:22,636 --> 00:59:31,124
So there's just simply not enough information there for you to make a, for you to pass a confidence threshold.

672
00:59:32,524 --> 00:59:34,644
that sample size.

673
00:59:35,308 --> 00:59:38,260
something like email spam or email phishing

674
00:59:38,860 --> 00:59:41,012
It can just be the number of words they use. Yeah.

675
00:59:41,644 --> 00:59:48,724
So they may just pig butchering. Perfect example. Yeah. Hey, did you go? Are you going to be at the tennis practice today? Yeah.

676
00:59:49,356 --> 00:59:52,756
There's simply not enough information there. Yeah. There's just not.

677
00:59:53,452 --> 00:59:57,400
But when you're dealing with something like AI-produced podcasts…

678
00:59:58,516 --> 01:00:01,276
Well, now you have many variables.

679
01:00:01,684 --> 01:00:04,956
You have all the metadata that goes into the podcast itself.

680
01:00:05,428 --> 01:00:07,164
You have the voice of.

681
01:00:08,116 --> 01:00:10,044
the AI slop presenter.

682
01:00:10,452 --> 01:00:11,836
have the content.

683
01:00:13,076 --> 01:00:16,220
There's tons of variables you can push and pull.

684
01:00:16,756 --> 01:00:17,244
round

685
01:00:18,196 --> 01:00:23,420
But for each one of those, there's a set of heuristics involved, and you can refine it.

686
01:00:24,852 --> 01:00:25,468
down

687
01:00:26,100 --> 01:00:26,908
Over time,

688
01:00:27,444 --> 01:00:30,140
point where it is indistinguishable.

689
01:00:30,580 --> 01:00:33,308
Because if I get on, I've...

690
01:00:34,388 --> 01:00:35,100
This is...

691
01:00:35,956 --> 01:00:38,076
what I've been learning all week this week.

692
01:00:39,924 --> 01:00:42,972
What I'm trying to do is create what's called a LoRA.

693
01:00:43,476 --> 01:00:44,060
A LoRA adapter.

694
01:00:47,316 --> 01:00:48,380
So the way this will work.

695
01:00:48,756 --> 01:00:56,828
Is that a Lord of the Rings adapter? Yeah. I think it stands for.

696
01:01:00,436 --> 01:01:06,300
A book of knowledge. What does L-O-R-A stand for in AI?

697
01:01:12,628 --> 01:01:18,204
No, it's having a hard time. According to the book of knowledge, LoRA stands for low-rank adaptation.

698
01:01:18,900 --> 01:01:25,020
Parameter-efficient fine-tuning technique for large language models and other deep neural networks.

699
01:01:26,388 --> 01:01:28,668
Thus, it has been written.
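
For what it's worth, the "low-rank" part is just this: instead of updating a big weight matrix W directly, you learn two skinny matrices B and A whose product is the update. A toy numpy sketch, with the dimensions and the alpha scale made up purely for illustration:

```python
import numpy as np

d_out, d_in, r = 4096, 4096, 16      # r is the LoRA rank, much smaller than the width
W = np.random.randn(d_out, d_in)     # frozen base weight (illustrative values)
A = np.random.randn(r, d_in) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))             # starts at zero, so the adapter begins as a no-op
alpha = 32                           # scaling hyperparameter

delta_W = (alpha / r) * (B @ A)      # the adapter: a low-rank update to W
W_effective = W + delta_W            # what the serving side effectively applies

# The win: B and A together hold far fewer numbers than W itself.
print(W.size, "base parameters vs", A.size + B.size, "adapter parameters")
```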

700
01:01:30,836 --> 01:01:38,716
Low-rank adaptation. That's it. That's it. Nailed it. Good work. So you – the LoRA – the LoRA adapter.

701
01:01:39,540 --> 01:01:46,012
is a way to add information into a language model.

702
01:01:46,452 --> 01:01:47,612
A base model.

703
01:01:48,212 --> 01:01:52,348
You're actually making adjustments to the weights themselves.

704
01:01:52,788 --> 01:01:57,200
This is fascinating. I love this.

705
01:01:57,324 --> 01:02:03,028
So the quote-unquote the weights, you're really talking about…

706
01:02:03,564 --> 01:02:13,332
you're talking about blocks of, you can think of an LLM as a stack, sort of.

707
01:02:13,804 --> 01:02:17,492
It's sort of like a stack of matrices. There's a...

708
01:02:18,444 --> 01:02:20,212
that are layered into each other.

709
01:02:20,684 --> 01:02:22,996
And those layers are distributed.

710
01:02:23,436 --> 01:02:24,980
blocks across.

711
01:02:25,324 --> 01:02:26,420
across the model.

712
01:02:27,308 --> 01:02:34,228
And one of the parts of the model is the – which is the best name in all of computer science is the percept.

713
01:02:36,332 --> 01:02:42,132
Perceptron. Yeah, so you have the attention layer, which feeds into a four...

714
01:02:42,892 --> 01:02:44,180
They call it like a feed-forward.

715
01:02:45,228 --> 01:02:51,316
a feed-forward function, and then that goes into a, uh...

716
01:02:53,164 --> 01:02:54,004
perceptron right

717
01:02:56,624 --> 01:03:04,568
And essentially all it is is it sounds fancier than it really is. It's just a – it's a block of matrices that –

718
01:03:05,168 --> 01:03:10,648
try to make associations with the attention between words.

719
01:03:12,304 --> 01:03:16,280
So it really is just matrix math.
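
That "attention layer feeding into a feed-forward block" really is a short stack of matrix multiplies. A stripped-down, single-head, untrained sketch in numpy, with all dimensions made up:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

seq, d, d_ff = 8, 64, 256              # tokens, model width, feed-forward width
x = np.random.randn(seq, d)            # token embeddings for one sequence

# Attention: project to queries/keys/values, then mix tokens by similarity.
Wq, Wk, Wv = (np.random.randn(d, d) for _ in range(3))
q, k, v = x @ Wq, x @ Wk, x @ Wv
attn = softmax(q @ k.T / np.sqrt(d)) @ v

# The feed-forward / "perceptron" part: two more matrices applied per token.
W1, W2 = np.random.randn(d, d_ff), np.random.randn(d_ff, d)
out = np.maximum(attn @ W1, 0) @ W2    # ReLU in between, then project back down

print(out.shape)                       # (8, 64): same shape, new representation
```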

720
01:03:18,032 --> 01:03:22,456
So the LoRA adapter.

721
01:03:24,496 --> 01:03:27,576
is a set of

722
01:03:28,176 --> 01:03:29,464
Matrices.

723
01:03:30,448 --> 01:03:34,168
you pull out of a training data set.

724
01:03:34,736 --> 01:03:38,616
You add them into the.

725
01:03:39,344 --> 01:03:41,976
LLM server, so something like llama.cpp.

726
01:03:43,792 --> 01:03:44,472
add that

727
01:03:45,168 --> 01:03:47,352
add that LoRA adapter in.

728
01:03:48,944 --> 01:03:52,536
and to the serving harness and then it will

729
01:03:53,936 --> 01:03:55,384
do an additive.

730
01:03:55,888 --> 01:03:56,400
to

731
01:03:56,684 --> 01:03:59,572
those new blocks

732
01:04:00,524 --> 01:04:07,444
into the matrices of the large language model. It essentially puts it through it. The serving harness.

733
01:04:08,332 --> 01:04:12,564
Yeah, so like something like – you can think of like an agent harness would be something like Claude.

734
01:04:13,228 --> 01:04:13,748
Okay.

735
01:04:14,316 --> 01:04:18,548
The server harness would just be the same thing on the server side. Okay.

736
01:04:18,988 --> 01:04:27,188
So it's not changing the base model. So you're going to take a model, something like Qwen 3.6 or something.

737
01:04:28,972 --> 01:04:32,916
Let's say you take the Qwen 3.6 35B model.

738
01:04:34,444 --> 01:04:37,300
It's a mixture of experts model, 3 billion active parameters.

739
01:04:38,060 --> 01:04:40,148
take that model as your base.

740
01:04:42,764 --> 01:04:44,244
You are going to.

741
01:04:44,748 --> 01:04:50,324
essentially take another set of weights that are trained on your own data set. And you put that in front of it.

742
01:04:50,988 --> 01:04:56,700
put that beside it. So now that it's now every time it does, uh, every time it runs through.

743
01:04:57,400 --> 01:05:01,664
perceptron it is also going to add these other weights into

744
01:05:02,040 --> 01:05:03,200
Hmm.

745
01:05:04,280 --> 01:05:08,480
in a way that sort of nudges the output token in a certain direction.

746
01:05:10,200 --> 01:05:25,472
And the good thing about this, which is really nice, is you don't have to change. Doing it this way, you're not retraining the entire model. You're just adding this extra component, and that extra component is small. It's 500 megs or less.

747
01:05:25,880 --> 01:05:27,296
and you can just swap it out.
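
A rough sketch of the swap-it-out part using the Hugging Face PEFT library; the base model id and adapter path are placeholders, and serving through llama.cpp would instead use its own adapter-loading mechanism on a converted adapter file, which isn't shown here.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "some-org/some-base-model"      # placeholder base model id
ADAPTER = "./slop-classifier-lora"     # placeholder path to the small adapter

tok = AutoTokenizer.from_pretrained(BASE)
base = AutoModelForCausalLM.from_pretrained(BASE)

# Attach the LoRA weights on top of the frozen base. Retraining next month
# just means pointing this line at a new adapter directory.
model = PeftModel.from_pretrained(base, ADAPTER)

prompt = "Classify this feed description as slop or not: ..."
out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```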

748
01:05:27,672 --> 01:05:36,768
So you can train a new one every month if you wanted to. Very cool. And just add that in. Yeah. So like if you – the reasons you would want to do this –

749
01:05:37,656 --> 01:05:38,176
are

750
01:05:39,832 --> 01:05:40,832
specificity.

751
01:05:41,880 --> 01:05:46,080
Or high customization, because you can do things like this with like rag.

752
01:05:46,520 --> 01:05:48,896
retrieval augmented generation.

753
01:05:49,496 --> 01:05:50,816
You take a database of

754
01:05:52,376 --> 01:05:53,568
and sort of mix.

755
01:05:54,296 --> 01:05:56,700
mix that kind of you you run the output

756
01:05:57,176 --> 01:06:00,704
through the rag as a filter on the output side.

757
01:06:01,848 --> 01:06:03,744
Or you could...

758
01:06:04,088 --> 01:06:06,080
pile everything into the, into the.

759
01:06:06,616 --> 01:06:08,192
prompt into the context.

760
01:06:09,272 --> 01:06:10,016
which you.

761
01:06:10,552 --> 01:06:12,800
quickly run out of space. Right.

762
01:06:13,528 --> 01:06:16,096
But the LoRA adapter...

763
01:06:17,784 --> 01:06:20,768
This would be called fine-tuning the model.

764
01:06:22,328 --> 01:06:23,552
this?

765
01:06:25,048 --> 01:06:27,008
can do if you do this well

766
01:06:27,384 --> 01:06:27,968
and fail.

767
01:06:28,600 --> 01:06:31,104
It can easily fail if you don't have the right training data.

768
01:06:35,832 --> 01:06:43,520
Eric says, is this better than old machine learning techniques like random forests or neural nets? I mean, you know.

769
01:06:44,760 --> 01:06:56,100
Transformer models are really just an evolution of neural networks. So this is going to, it's very similar. Well, I think I have a different question. Between the server harness.

770
01:06:59,268 --> 01:07:00,172
The LoRA

771
01:07:01,636 --> 01:07:04,972
When is the last time you took Melissa on a date?

772
01:07:08,036 --> 01:07:08,492
set.

773
01:07:08,996 --> 01:07:09,900
There.

774
01:07:10,340 --> 01:07:27,788
There will be no dates until school's out. I'm just saying. Before you get a whatever. Yeah. Luckily, I've changed the sheets. That's my job. I do not have to worry about it. Nanny. All right.

775
01:07:28,132 --> 01:07:29,516
Um,

776
01:07:30,500 --> 01:07:37,548
Good question, though. It's been a while. We're due. Just saying. As soon as school's out.

777
01:07:39,044 --> 01:07:40,012
Um,

778
01:07:41,028 --> 01:07:47,340
So yeah, Eric, it handles language better because it's built for...

779
01:07:48,100 --> 01:07:49,068
It's built for that sort of thing.

780
01:07:49,892 --> 01:07:50,700
tokenized input

781
01:07:51,940 --> 01:07:53,196
But the...

782
01:07:55,172 --> 01:07:57,400
training if you do it right.

783
01:07:57,748 --> 01:08:00,092
If you do it wrong, here's what can happen.

784
01:08:01,716 --> 01:08:03,004
training set is poor.

785
01:08:04,724 --> 01:08:08,828
or if it's too small or too big.

786
01:08:09,812 --> 01:08:10,972
you can end up

787
01:08:12,244 --> 01:08:15,004
just making the model memorize things.

788
01:08:16,404 --> 01:08:20,828
So really to make sure that you're doing a good job of this, you have to…

789
01:08:21,172 --> 01:08:24,188
You have to do a lot of testing on the backside. Spot check.

790
01:08:25,620 --> 01:08:26,652
Spot checks, yes.

791
01:08:27,668 --> 01:08:29,276
You have to do it. You have to do.

792
01:08:29,716 --> 01:08:32,252
a lot of testing to set to

793
01:08:32,980 --> 01:08:36,188
Make sure that it's not just memorizing.

794
01:08:36,756 --> 01:08:38,940
The training data. Ah, yes.
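
The testing-on-the-backside idea is basically: hold some labeled feeds out of training and spot check the fine-tuned model against them. If it aces the examples it trained on but falls apart on the held-out ones, it memorized instead of generalizing. A hedged sketch; the classify function stands in for whatever the fine-tuned model actually does:

```python
import random

def evaluate(classify, examples):
    """Fraction of (text, label) pairs the model gets right."""
    correct = sum(1 for text, label in examples if classify(text) == label)
    return correct / len(examples)

def spot_check(classify, labeled, holdout_frac=0.1, seed=0):
    """labeled: list of (feed_text, 'slop' | 'good') pairs, hand-labeled."""
    labeled = list(labeled)
    random.Random(seed).shuffle(labeled)
    cut = int(len(labeled) * holdout_frac)
    heldout, train = labeled[:cut], labeled[cut:]
    train_acc = evaluate(classify, train)      # will usually look great
    heldout_acc = evaluate(classify, heldout)  # the number that actually matters
    # A big gap between these two is the memorization failure mode.
    return train_acc, heldout_acc
```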

795
01:08:40,148 --> 01:08:46,172
Because what you're looking for is a balance of weights, of weighting.

796
01:08:47,220 --> 01:08:48,540
word connections.

797
01:08:49,108 --> 01:08:51,676
And so the best way I saw this explained was.

798
01:08:53,460 --> 01:08:53,916
You can see.

799
01:08:57,140 --> 01:08:58,460
well-balanced model.

800
01:08:59,796 --> 01:09:05,340
You can do the underlying matrix math. Yes, exactly, Eric. That's right. It's called overfitting.

801
01:09:06,548 --> 01:09:10,044
You can do the underlying matrix math.

802
01:09:11,028 --> 01:09:13,500
you can literally come out with.

803
01:09:14,196 --> 01:09:17,596
and see a visual representation.

804
01:09:18,932 --> 01:09:19,964
of

805
01:09:21,716 --> 01:09:23,196
King.

806
01:09:24,244 --> 01:09:28,604
is to man what queen is to woman

807
01:09:29,460 --> 01:09:30,748
Like you can say.

808
01:09:31,156 --> 01:09:35,996
You can take the relationship in the matrix of king.

809
01:09:36,372 --> 01:09:37,276
to male.

810
01:09:38,132 --> 01:09:48,540
And then you can do the exact same math on the word queen, and it will predict that it's going to be woman. Okay. So that's a good sort of like... What is woman?

811
01:09:49,844 --> 01:09:55,964
Our Supreme Court justices can't answer that. Yeah, this like.

812
01:09:56,852 --> 01:09:59,932
But if you overtrained it, you may end up with

813
01:10:00,340 --> 01:10:09,724
You know, Queen equals Elizabeth. Yes. See what I'm saying? Like you would you would end up with a model that has memorized the training set. And so.
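
The king/queen check is something you can actually run against embedding vectors. A small sketch using gensim's downloadable GloVe vectors (assumes the gensim package and the glove-wiki-gigaword-50 download); a well-balanced model puts queen at the top, while an overtrained one would have locked onto whatever its training set happened to say:

```python
import gensim.downloader as api

# Small pretrained word vectors, just for the analogy demo.
vectors = api.load("glove-wiki-gigaword-50")

# king - man + woman ~= ?  A well-balanced model generalizes the relationship.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3)
print(result)  # "queen" should be the top hit
```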

814
01:10:10,868 --> 01:10:14,588
This is where we're headed.

815
01:10:15,060 --> 01:10:17,340
If we can, if I can do this right.

816
01:10:18,356 --> 01:10:19,964
We will end up with.

817
01:10:21,524 --> 01:10:22,364
A

818
01:10:23,028 --> 01:10:25,372
highly customized model

819
01:10:26,068 --> 01:10:27,036
is an expert.

820
01:10:27,540 --> 01:10:28,348
Identifying.

821
01:10:29,332 --> 01:10:29,916
slop.

822
01:10:31,252 --> 01:10:33,852
Not just slop, but slop and spam.

823
01:10:39,284 --> 01:10:40,860
content these kinds of things

824
01:10:41,332 --> 01:10:42,524
All these things we don't want.

825
01:10:43,732 --> 01:10:48,316
We can have an expert that can hit those things.

826
01:10:49,044 --> 01:10:50,332
and identify them.

827
01:10:51,508 --> 01:10:53,820
In a way that is.

828
01:10:56,388 --> 01:10:57,932
forward moving

829
01:10:58,660 --> 01:11:00,492
Whereas just doing it the way that.

830
01:11:01,188 --> 01:11:15,884
moment by piling more and more and more stuff into the context window is just not scalable. Right. And then eventually this will wind up as an endpoint that people can query or a flag or something that can be used by people who use the index.

831
01:11:17,380 --> 01:11:18,796
That's the idea, isn't it?

832
01:11:19,364 --> 01:11:25,260
Oh, it already is. I mean, like the reason – so if you hit the problematic endpoint right now, report.

833
01:11:26,884 --> 01:11:29,196
The recent problematic endpoint.

834
01:11:29,732 --> 01:11:34,060
You can already see what it was classified as and the reasoning.

835
01:11:35,204 --> 01:11:38,476
about the reason. Is that open or do you have to have a key for that?

836
01:11:43,140 --> 01:11:45,068
So what it's doing is it's training.

837
01:11:46,660 --> 01:11:47,340
So.

838
01:11:48,004 --> 01:11:49,772
taking all that into

839
01:11:50,212 --> 01:11:51,948
Taking all that context in.

840
01:11:53,092 --> 01:11:56,400
What I've been hitting is...

841
01:11:57,548 --> 01:12:01,300
The training set design is so complex.

842
01:12:03,788 --> 01:12:04,564
I need.

843
01:12:05,036 --> 01:12:06,900
I'm aiming for about...

844
01:12:07,596 --> 01:12:09,172
25,000.

845
01:12:09,516 --> 01:12:11,732
quote, good podcasts.

846
01:12:12,748 --> 01:12:14,516
Good ones. Okay, good ones.

847
01:12:15,532 --> 01:12:17,556
And those need to range.

848
01:12:19,372 --> 01:12:22,548
across all the different ways that we know that they do.

849
01:12:23,596 --> 01:12:24,660
Everything from...

850
01:12:25,036 --> 01:12:37,492
Yeah, everything from Joe Rogan to – Oystein Berger. To Oystein Berger. Yeah, exactly. Oystein Berger. To Bowls with Buds. Mm-hmm.

851
01:12:38,156 --> 01:12:55,668
To Pod News Weekly Review, to a person reading a bedtime story on Buzzsprout, all to some teenager reading their term paper on Anchor. All of those are really – all of those are legit podcasts.
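
One way to think about those 25,000 "good" examples is as a stratified sample: a target count for each very different flavor of legit show, written out as a labeled training file. A sketch with category names and counts invented purely for illustration:

```python
import json
import random

# Hypothetical per-category targets; the real slices and counts would come from
# however the index team decides to carve up "good" podcasts.
TARGETS = {
    "professional_talk": 8000,
    "hobbyist_interview": 6000,
    "read_aloud_fiction": 4000,
    "non_english": 4000,
    "amateur_but_human": 3000,   # the teenager-reading-a-term-paper bucket
}

def build_corpus(candidates, out_path="good_training_set.jsonl", seed=0):
    """candidates: dict mapping category -> list of feed/episode text samples."""
    rng = random.Random(seed)
    with open(out_path, "w") as f:
        for category, n in TARGETS.items():
            pool = candidates.get(category, [])
            for text in rng.sample(pool, min(n, len(pool))):
                row = {"label": "good", "category": category, "text": text}
                f.write(json.dumps(row) + "\n")
```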

852
01:12:56,388 --> 01:13:00,492
But then somehow identify the difference between.

853
01:13:01,060 --> 01:13:07,852
Between a 17 year old in Delhi reading their report from school.

854
01:13:09,412 --> 01:13:12,972
And differentiate that as good against.

855
01:13:14,436 --> 01:13:16,684
AI generated voice.

856
01:13:17,380 --> 01:13:18,828
speaker

857
01:13:19,396 --> 01:13:21,740
reading a bunch of history nonsense off of wiki

858
01:13:22,212 --> 01:13:22,700
Right.

859
01:13:24,228 --> 01:13:30,828
Interesting challenge. Yes, it is very – it has been a challenge. I mean it's not –

860
01:13:31,844 --> 01:13:32,716
where it's not.

861
01:13:34,308 --> 01:13:35,532
I still got a few weeks to go.

862
01:13:37,412 --> 01:13:39,020
Realistically, it's probably

863
01:13:40,644 --> 01:13:41,708
three weeks to get

864
01:13:42,212 --> 01:13:43,820
first shot at this.

865
01:13:44,708 --> 01:13:48,908
Because if your training data set is wrong.

866
01:13:49,284 --> 01:13:53,228
Or if it's poor, you're just going to end up training the model that…

867
01:13:53,636 --> 01:13:54,796
Everything from Spreaker is –

868
01:13:55,792 --> 01:13:59,640
Yeah. And that's just – but that's not true. Well, it kind of is.

869
01:14:03,664 --> 01:14:05,176
It's like, you know, or

870
01:14:05,744 --> 01:14:10,744
Like if, you know, if Spreaker and the other, here's what would probably help.

871
01:14:11,472 --> 01:14:18,328
If they put a flag in, this is a free account. That would probably be a good signal.

872
01:14:19,856 --> 01:14:21,144
for your training.

873
01:14:22,704 --> 01:14:23,448
Let me give you a great example.

874
01:14:28,656 --> 01:14:31,160
Here you go. Here's one.

875
01:14:32,144 --> 01:14:34,776
Is it a number I can call, asks James?

876
01:14:35,440 --> 01:14:37,624
No, I don't think so.

877
01:14:38,064 --> 01:14:55,500
There's no numbers in this one. I'll find one of those. If you care about predictions and props, right now, it's all about playoff pressure from the hard work. Always. Okay, hold on. Always do pre-rolls. But here's the twist. We're only going... AI. Oh, yeah. And I'm Luna.

878
01:14:55,656 --> 01:15:18,864
So excited. Now at McDonald's, a McDouble is $2.50. So you can get your gym gains on or just get lunch. I can't shuttle. It's almost here. Yeah, yeah, McDonald's. All right. They made some money off of us. All right, here we go. Hey, welcome back to Learn Tamil with Vexingo. I'm Lucas.

879
01:15:19,240 --> 01:15:31,280
And I'm Luna. So excited for episode 13. I flop. Yeah, we've been on a roll. Today, I thought we'd do something a little different. Oh, goodness.

880
01:15:32,776 --> 01:15:35,088
Hold on. Let me do something here.

881
01:15:37,032 --> 01:15:37,712
Uh...

882
01:15:38,792 --> 01:15:42,352
analyze this podcast.

883
01:15:44,168 --> 01:15:48,112
AI slop or not. I'm just going to give it to my robot.

884
01:15:48,712 --> 01:15:51,536
It'll take a minute, obviously.

885
01:15:52,040 --> 01:15:53,840
My robot is nothing like what you're building.

886
01:15:54,560 --> 01:15:58,184
But anyone who's listening heard that right away.

887
01:15:59,072 --> 01:16:05,384
Oh, yeah, that's what I'm saying. This is the difference between the built-in language.

888
01:16:06,592 --> 01:16:11,112
and emotional action comprehension that humans have versus machines.

889
01:16:11,808 --> 01:16:16,200
We just know things and we don't know how we know them. We just do.

890
01:16:17,664 --> 01:16:19,240
We just know.

891
01:16:19,904 --> 01:16:24,232
And nobody can explain the – nobody can explain how.

892
01:16:24,928 --> 01:16:25,576
Um,

893
01:16:27,296 --> 01:16:34,088
And everybody's going to pass that test, every single one of us. But AI is really going to struggle with this.

894
01:16:34,848 --> 01:16:35,624
And it.

895
01:16:36,128 --> 01:16:39,112
Well, right off the bat, it'll struggle with the ad.

896
01:16:40,576 --> 01:16:41,064
Exactly.

897
01:16:41,408 --> 01:16:44,200
It has to get past the ads.

898
01:16:44,672 --> 01:16:45,224
point

899
01:16:45,984 --> 01:16:48,424
wonder if my robot will figure that out.

900
01:16:49,248 --> 01:16:53,096
Well, and this – so here's sort of the last –

901
01:16:55,136 --> 01:16:56,840
point I want to make is, I think.

902
01:16:57,952 --> 01:17:00,680
Here is an Achilles heel.

903
01:17:01,056 --> 01:17:03,560
of how we discuss these things today.

904
01:17:08,256 --> 01:17:10,248
to

905
01:17:11,264 --> 01:17:12,968
about

906
01:17:13,600 --> 01:17:15,688
AI in terms of

907
01:17:16,704 --> 01:17:21,384
what it's not capable of doing or where its flaws are right now.

908
01:17:23,104 --> 01:17:23,912
That is just.

909
01:17:24,320 --> 01:17:26,376
that is very short-sighted.

910
01:17:26,912 --> 01:17:35,688
What I think we have to do, and this is why I'm trying to do it this way. What I think we have to do is assume.

911
01:17:37,696 --> 01:17:42,120
These AI-generated podcasts are going to get so good.

912
01:17:43,232 --> 01:17:44,968
We cannot tell the difference.

913
01:17:46,944 --> 01:17:51,080
We have to assume that they are going to eventually be perfect.

914
01:17:52,800 --> 01:17:53,832
you know, in.

915
01:17:54,208 --> 01:17:55,100
their tone.

916
01:17:56,952 --> 01:17:58,112
word choice.

917
01:17:58,100 --> 01:18:00,032
going to

918
01:18:00,440 --> 01:18:13,504
evade any sort of detection. This is what I meant by saying they're just going to reduce the attack surface. They're going to stop doing the things that sound unnatural.

919
01:18:14,872 --> 01:18:23,680
And the uncanny valley is going to shrink down to such a small degree that we're not going to be able to just do heuristics on them. What we're going to have to do...

920
01:18:24,216 --> 01:18:25,056
is

921
01:18:25,528 --> 01:18:27,072
is look at

922
01:18:27,672 --> 01:18:29,664
whole thing in a Bayesian way.

923
01:18:30,456 --> 01:18:33,056
where we're doing it.

924
01:18:33,880 --> 01:18:35,136
We're taking...

925
01:18:36,536 --> 01:18:39,168
things into account that they can't stop.
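
Looking at the whole thing in a Bayesian way roughly means: no single heuristic gets to decide, every signal just nudges the odds up or down. A toy log-odds sketch with made-up signal weights; real weights would have to be fit from labeled data:

```python
import math

# Hypothetical likelihood ratios: how much more often each signal shows up in
# slop feeds than in legit ones. Illustrative numbers only.
SIGNAL_WEIGHTS = {
    "hosted_on_flagged_platform": 3.0,
    "same_two_tts_voices_everywhere": 8.0,
    "templated_titles_across_many_feeds": 6.0,
    "no_identifiable_humans": 4.0,
    "has_ads": 1.2,   # deliberately weak: plenty of legit shows run ads
}

def slop_probability(signals, prior=0.05):
    """Combine observed boolean signals into a posterior probability of slop."""
    log_odds = math.log(prior / (1 - prior))
    for name, present in signals.items():
        if present and name in SIGNAL_WEIGHTS:
            log_odds += math.log(SIGNAL_WEIGHTS[name])
    return 1 / (1 + math.exp(-log_odds))

print(slop_probability({"same_two_tts_voices_everywhere": True,
                        "templated_titles_across_many_feeds": True,
                        "no_identifiable_humans": True}))
```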

926
01:18:39,672 --> 01:18:43,200
Would you like the verdict here from my robot?

927
01:18:43,800 --> 01:18:54,900
Almost certainly AI slop. The evidence, 55 red flags, 55 podcasts across every major language. No human operation produces 55 language courses.

928
01:18:55,472 --> 01:18:56,024
Farm.

929
01:18:56,464 --> 01:19:16,664
Two warm voices. Exactly two TTS voices across all 55 languages. Real language instruction requires native speakers for each language. Hosted on Spreaker, huh? See, I told you. Listen to this. It says, hosted on Spreaker, one of the platforms Dave flagged for hosting in cloned spam feeds on last week's show.

930
01:19:17,872 --> 01:19:19,352
Holy crap.

931
01:19:19,920 --> 01:19:32,824
No team, no instructors named, no methodology, no credentials, completely anonymous. Generic Fexingo branding across all 55. Template operation, learn, insert language with Fexingo for every single one.

932
01:19:33,168 --> 01:19:54,100
zero information about who runs it, no company location, no people, just an email address, the math, 55 languages times even 10 episodes each, 550 episodes of audio with two voices, that's TTS. The pattern matches exactly what we screened for, generic channel names, no identifiable humans, and possibly broad output from a tiny operation and a template applied across dozens of instances.

933
01:19:54,100 --> 01:20:04,316
Podcast spam farm using AI-generated TTS language lessons to harvest ad revenue across 55 feeds on Spreaker. Textbook slop.

934
01:20:04,756 --> 01:20:05,308
Okay.

935
01:20:06,964 --> 01:20:11,708
Great analysis, but let me throw a twist in there. Okay.

936
01:20:12,916 --> 01:20:15,676
One of the things I've been seeing a ton of lately

937
01:20:16,052 --> 01:20:17,884
Are people taking LibriVox recordings,

938
01:20:18,292 --> 01:20:19,868
reposting them on Spreaker

939
01:20:20,276 --> 01:20:21,308
for ad revenue.

940
01:20:21,844 --> 01:20:22,620
Thank you.

941
01:20:24,052 --> 01:20:25,468
So that's theft.

942
01:20:26,356 --> 01:20:32,188
Well, not – no, actually not legally because LibriVox uses a public domain license.

943
01:20:32,788 --> 01:20:35,708
They literally say you can do whatever you want with it.

944
01:20:36,276 --> 01:20:36,988
stop you.

945
01:20:37,652 --> 01:20:41,212
So legally it's fine.

946
01:20:42,004 --> 01:20:47,228
except that you're going to have 100 different people posting the same exact thing over.

947
01:20:50,804 --> 01:20:54,364
Here's what I would actually enjoy.

948
01:20:55,156 --> 01:20:58,076
Because LibriVox, since it's volunteers.

949
01:20:58,484 --> 01:21:02,812
Many of the recordings are horrible. Yeah, it's actually an improvement.

950
01:21:03,636 --> 01:21:07,804
A really well done AI narrator.

951
01:21:08,692 --> 01:21:11,324
I mean like a high-quality –

952
01:21:12,628 --> 01:21:14,428
A.I. narrator would be better.

953
01:21:15,572 --> 01:21:20,220
than many of the LibriVox recordings, and I would prefer to listen to the AI version.

954
01:21:20,596 --> 01:21:34,652
If it was well done, then I would prefer it over the LibriVox human version. But aren't we now down to what I've always said is some people will actually want this for certain reasons, and that's very – we all need our own –

955
01:21:37,204 --> 01:21:39,804
Percepticon, uh, Perceptitron.

956
01:21:40,148 --> 01:21:50,844
I need my own perceptron. Yes. So see, and this is the issue is like just because it's AI generated does not mean.

957
01:21:51,636 --> 01:21:54,400
is slop and nobody wants it. But...

958
01:21:54,524 --> 01:21:59,108
If you ask somebody, is this slop? They will be able to tell you.

959
01:22:00,636 --> 01:22:07,300
Sir Bemrose has – They will be able to tell you, and for reasons they don't even themselves know. Sir Bemrose has a good point.

960
01:22:07,772 --> 01:22:12,868
One of the most powerful signals for AI slop is that the podcast has ads.

961
01:22:13,212 --> 01:22:21,636
And that's another thing. It's like you don't want to overtrain on that either. Yeah, exactly. Because then the model will be like –

962
01:22:22,012 --> 01:22:26,500
Oh, yeah. We got ads must be slop. Yeah, but that's not true.

963
01:22:27,388 --> 01:22:33,796
And if there was a really well-done AI-voiced version of…

964
01:22:34,844 --> 01:22:35,332
Um...

965
01:22:36,380 --> 01:22:49,284
You know, Frankenstein, Mary Shelley's Frankenstein. If it was way better than the LibriVox recording, I would not only enjoy it and want to listen to it, but also not mind if it had pre-rolls in it.

966
01:22:50,236 --> 01:22:55,300
Well, back to my point. Isn't it ultimately that we all need?

967
01:22:56,096 --> 01:22:57,288
own perceptron.

968
01:22:58,880 --> 01:22:59,656
really do.

969
01:23:01,120 --> 01:23:10,728
Because you're saying things that are really important here. I wouldn't want to hear that. I want to hear some crappy Frankenstein. But I wouldn't want to be. No, I don't want to hear that.

970
01:23:11,328 --> 01:23:17,256
So isn't that kind of the point where we have to be, that we all have a robot that we train for stuff that we want?

971
01:23:17,728 --> 01:23:24,456
Customized. Customized, yeah, customized. But in the interim, while there's, while there's sort of like...

972
01:23:25,600 --> 01:23:31,592
Brokers of pipelines of content, brokers of content like Podcast Index.

973
01:23:32,160 --> 01:23:33,096
plus podcast direct.

974
01:23:33,440 --> 01:23:35,176
these kinds of things.

975
01:23:36,736 --> 01:23:38,024
We can't –

976
01:23:39,200 --> 01:23:41,512
As a conduit.

977
01:23:41,920 --> 01:23:54,472
we have to be able to continue to conduit. Yes, yeah, true. We can't be overrun with 700 people posting, you know, LibriVox recordings of Frankenstein with a couple of pre-rolls.

978
01:23:55,648 --> 01:23:58,696
It's just not – it's just – that's – it's absurd.

979
01:24:00,960 --> 01:24:16,904
Anyway, so that, I mean, that's, I think that's where, it's difficult. We're gonna, you know, we can try to hit the high level and train, you know, train this model to get the obvious junk, like the one I just posted a while ago, but.

980
01:24:17,568 --> 01:24:21,256
Yes.

981
01:24:21,696 --> 01:24:22,376
you

982
01:24:22,944 --> 01:24:27,272
That's obviously junk that nobody wants. It's the equivalent of punch the monkey.

983
01:24:28,032 --> 01:24:31,688
Oh, goodness. But those were good for a while. Those were fun. I've tried.

984
01:24:32,064 --> 01:24:36,840
Everyone's punched the monkey.

985
01:24:37,504 --> 01:24:42,536
So you actually wouldn't mind a good TTS reading of Frankenstein.

986
01:24:44,384 --> 01:24:47,336
Oh, yeah, like if it was a high-quality, like—

987
01:24:48,288 --> 01:24:50,312
When we get – when –

988
01:24:51,232 --> 01:24:55,300
When TTS models get to the point where they can really deliver.

989
01:24:56,224 --> 01:24:56,744
quality

990
01:24:58,112 --> 01:24:59,624
like NotebookLM.

991
01:25:00,192 --> 01:25:01,832
get to that point, oh yeah, I'd love it.

992
01:25:02,624 --> 01:25:04,104
I'd love that because...

993
01:25:04,512 --> 01:25:11,656
Because I love audiobooks, and I like LibriVox, but so many of the recordings are just so bad, they're hard to listen to.

994
01:25:12,320 --> 01:25:19,528
So I asked my robot to find a reading of Frankenstein that is well-read by TTS, good enough, even though it is AI slop.

995
01:25:21,472 --> 01:25:26,120
That would be a perceptron that you might program, right?

996
01:25:27,872 --> 01:25:28,744
What's not there yet?

997
01:25:30,144 --> 01:25:36,552
It does have my podcast index API key and secret.

998
01:25:37,824 --> 01:25:46,088
Oh, it doesn't? Okay. I'll look forward to the server crash. In a .env file. Don't worry. In a .env file.

999
01:25:46,560 --> 01:25:47,624
It's not posting it anywhere.

1000
01:25:48,576 --> 01:25:55,200
Until you get a CI/CD pipeline hack. I don't... I don't do CI/CD anything.

1001
01:25:55,644 --> 01:25:57,892
CDC. Whatever.

1002
01:25:59,100 --> 01:26:00,932
I don't do any of that stuff.

1003
01:26:01,692 --> 01:26:06,628
Well, I hope I don't cause any problems. A lot of people have keys.

1004
01:26:07,356 --> 01:26:12,100
Every time I run pip install, I'm just like.

1005
01:26:12,700 --> 01:26:13,540
Ha ha ha.

1006
01:26:15,036 --> 01:26:19,940
Well, isn't it just, like you said, every package manager, NPM, all of this stuff.

1007
01:26:21,116 --> 01:26:26,756
Yeah, they're ticking time bombs inside your machine just waiting to crap on your front porch.

1008
01:26:27,260 --> 01:26:28,036
Yeah.

1009
01:26:29,660 --> 01:26:33,924
Uh, well, you are doing important work.

1010
01:26:34,940 --> 01:26:44,644
I'm doing work. I don't know if it's important. I think it is. I think it is important. Let me see. It's going to get a two-minute sample into my show folder here.

1011
01:26:46,748 --> 01:26:49,124
I got my mom pretty happy with my robot.

1012
01:26:50,044 --> 01:26:52,996
I really want your little –

1013
01:26:53,852 --> 01:26:54,700
guy that

1014
01:26:54,888 --> 01:27:01,520
runs out there and gets stories and stuff. No, no, no, not that one. But the one that just analyzed the.

1015
01:27:02,248 --> 01:27:02,960
podcast.

1016
01:27:03,688 --> 01:27:05,456
Oh, that's my favorite guy.

1017
01:27:06,088 --> 01:27:25,100
Yeah, I like that guy. Yeah, no, that guy is good. Okay. I want a copy of that guy so I can make it work on my behalf. Here, my guy has got something for you. Letters. Frankenstein or the Modern Prometheus by Mary Wollstonecraft Shelley. This is a LibriVox recording. All LibriVox recordings are in the public domain. Do you think that's real or is that slop?

1018
01:27:27,464 --> 01:27:44,368
real to me. According to my robot, it's a TTS. For more information or to volunteer, please visit LibriVox.org. Recording by Caden Clegg, thelunarisland.blogspot. That sounds pretty real to me. Sounds real to me. Let me see. Some people just sound like robots.

1019
01:27:46,504 --> 01:27:53,400
Yeah. I just think that's real. I think it's real. OK, we think it's real.

1020
01:27:53,556 --> 01:28:12,796
He's talking through a tube. Loser robot. I have to call my any AI I call a robot. It reminds me. Don't get involved with this thing. It's a robot. Don't take it seriously. Yeah. No, you can't. You can't. That's dangerous before you know what you said. OK, good night. Talk to you tomorrow.

1021
01:28:14,132 --> 01:28:27,868
You have no idea how many people are doing that, brother. It's bad. Hey, we're over time, man. I should have gotten you out eons ago. I'm sorry about that. Yes, we're way over time. Do you have a few minutes to do some thank yous? Yeah, sure. Why are you talking through a tube?

1022
01:28:28,404 --> 01:28:30,716
I'm doing a LibriVox recording.

1023
01:28:32,660 --> 01:28:52,600
Value for Value podcast, which means we deliver the value. We hope you enjoyed today's board meeting and we would like to receive some value back. Lots of people contribute time, talent and treasure. And here we thank people throughout the show for all kinds of things they're doing. I mean, it started for me right off the bat with Eric PP with Helipad, the

1024
01:28:52,692 --> 01:28:56,924
best software package in the universe. And then, of course, he also scolded me.

1025
01:28:57,332 --> 01:28:57,980
you

1026
01:28:59,284 --> 01:29:20,604
But on the treasure side, this benefits the podcastindex.org infrastructure directly. So if you send us boostagrams or if you send us PayPals, which you can do by going to podcastindex.org at the bottom, a big red donate button, you hit that and you can send us some fiat fund coupons. So there's 333 from Chad F. He's asked me to spot check the boostagram.

1027
01:29:20,884 --> 01:29:26,748
C-Loss on Linux 2220. Oh, interesting.

1028
01:29:27,508 --> 01:29:31,676
fountain. It might have been a row of ducks, but the fountain somehow is rounding up or down.

1029
01:29:32,340 --> 01:29:35,356
He says, firmware update available. This is probably when you crashed.

1030
01:29:36,468 --> 01:29:37,148
Um...

1031
01:29:37,588 --> 01:29:52,800
That was it. Yeah. Here I got another 59 sats. Interesting from true fans. Co-listening and chatting with Sam Sethi, talking about beverages. Tea from Assam, India, and rosé wine from Provence, France. Do ye.

1032
01:29:53,372 --> 01:29:55,140
And it stopped at the Y.

1033
01:29:55,772 --> 01:30:15,844
Cool. Good. They're talking about things completely unrelated to the show. Yes. From Podcast Guru 333 from Eric P.P. Pew. Yes, we got the Pew. There's the 170 sats. Fetch metadata. Oh, what is this little fetch metadata thing? Hold on.

1034
01:30:16,188 --> 01:30:19,812
Oh, wow. Okay. I'm sorry.

1035
01:30:20,348 --> 01:30:23,844
Oh, goodness gracious. This is awesome, Eric. Share with the class. Yes.

1036
01:30:24,220 --> 01:30:47,120
So on a weird lightning invoice that comes in that I wasn't able to read, there's a little button now in Helipad which says, fetch metadata. And then, boom, it translates it. And so now I see, indeed, a row of ducks, 2222 from Lyceum, co-listening with Sam Sethi, talking about beverages, tea, blah, blah, blah.

1037
01:30:47,120 --> 01:30:51,900
Do you have the ducks in a row? Cheers. I don't know exactly what's happening, but I love it.

1038
01:30:52,568 --> 01:31:11,328
Cool. Yeah, whatever's happening is awesome. Great. 6390 from Sam at truefans.fm. Yeah, that was about Cook missing every cycle. Saltacrayon333. It's been months. Howdy, Node, he says. 333 from Chad F. This Boostergram brought to you by EricPP++.

1039
01:31:11,704 --> 01:31:30,784
2220 from C-Loss on Linux, AI-generated Boostagram spam. And, okay, so then there's that actual spam I got. And now I hit the delimiter. So we're in a good place. Thanks, everybody. That was fun. And Eric PP, Helipad, rocking it, man. Love it.

1040
01:31:31,288 --> 01:31:32,064
Thank you.

1041
01:31:32,536 --> 01:31:33,248
Trying to get my

1042
01:31:34,648 --> 01:31:36,544
I think pulled up here. Oh, yeah. Okay.

1043
01:31:37,080 --> 01:31:44,512
What do we got here? I guess. Oh, that's a I hate the way it mixes all the PayPal stuff together.

1044
01:31:46,264 --> 01:31:46,848
can get.

1045
01:31:47,608 --> 01:31:49,312
head around that. All right, here we go.

1046
01:31:50,520 --> 01:31:52,200
Oh, oh, look.

1047
01:31:53,092 --> 01:31:53,708
Buzzsprout.

1048
01:31:55,652 --> 01:31:58,028
Holy moly.

1049
01:31:58,724 --> 01:31:59,596
Sorry.

1050
01:32:00,356 --> 01:32:02,348
20-inch blades on the Impala.

1051
01:32:02,820 --> 01:32:09,388
Yo, baller boys and girls from Buzzsprout, thank you. Thank you. Tom Rossi. It's not even end of the month yet.

1052
01:32:11,364 --> 01:32:15,340
Uh, Tom Rossi also does conference calls.

1053
01:32:15,940 --> 01:32:17,004
does for free.

1054
01:32:17,700 --> 01:32:28,236
for free. Michael Goggin, five bucks. Thank you, Michael. A thousand to five in a blink of an eye. Jorge Hernandez, five dollars. Thank you, Jorge.

1055
01:32:29,028 --> 01:32:32,012
Christopher Reamer, $10. Thank you, Christopher.

1056
01:32:33,444 --> 01:32:34,636
Yeah, Cohen Glotzbach.

1057
01:32:35,204 --> 01:32:38,988
Thank you, Colin. James Sullivan, $10.

1058
01:32:39,684 --> 01:32:43,820
And Randall Black, $5. That's our PayPal. Let me see what we got on.

1059
01:32:44,324 --> 01:32:45,036
boosts here.

1060
01:32:45,924 --> 01:32:52,300
Oh, we got Oscar. Oscar Mary, 20,000 sats. That's through Fountain. He says, sorry for the disruption, guys.

1061
01:32:52,904 --> 01:32:56,272
That's no problem. No worries. No worries, brother. No worries.

1062
01:32:56,712 --> 01:33:05,136
I will throw no stones. Comic strip blogger, delimiter, 20,000 sats through fountain. He says.

1063
01:33:07,048 --> 01:33:10,736
Today, I want to recommend a podcast from the podcast Morning Chat.

1064
01:33:11,208 --> 01:33:17,936
www.podpage.com slash pmc slash which is quote

1065
01:33:18,472 --> 01:33:26,832
Daily morning show for creators, by creators. Ever wonder how top content creators and podcasters keep their shows fresh and engaging?

1066
01:33:28,456 --> 01:33:31,568
podcast morning chat hosted by Mark Roenick

1067
01:33:32,168 --> 01:33:40,848
Suggested by Martin Lindeskoog. Yo, CSP, AI Arch Wizard, $15.52.

1068
01:33:42,056 --> 01:33:43,568
Thank you very much, Comicster Blogger.

1069
01:33:45,064 --> 01:33:52,400
That's it. Everything else is just river bitcoins, man. So I hit my – this has not happened. This happened twice now.

1070
01:33:53,420 --> 01:33:58,068
Ever since the wonderful upgrade, I've hit my limit on the... ...Claude Code.

1071
01:33:59,372 --> 01:34:04,020
Oh, did you see? I saw that. Yeah, I saw the GitHub.

1072
01:34:04,748 --> 01:34:08,020
is now charging per token, and they've gotten rid of their all-you-can-eat.

1073
01:34:08,812 --> 01:34:13,844
GitHub is charging per token? What does GitHub do with tokens? GitHub Copilot.

1074
01:34:14,252 --> 01:34:14,868
Ah!

1075
01:34:15,788 --> 01:34:21,044
So now I can upgrade or wait an hour and 54 minutes. What is this?

1076
01:34:22,540 --> 01:34:47,092
Falling apart. But I thought I had extra credits, man. You created... We need some credits. Hey, man, I need some credits. How come you can't give me some credits? This is no good. Any credits? Resets at 4:10 p.m. So predictable. So predictable. Well, I mean, I predicted this. I said it was going to happen. But now I'm actually... Push your model. Now I'm kind of pissed. Like, hold on, man.

1077
01:34:47,116 --> 01:34:52,100
Hold on, man. I can get some more money. Hold on. Are you on the 100?

1078
01:34:52,192 --> 01:34:54,216
plan or the 200? I'm on the hundred plan.

1079
01:34:54,880 --> 01:34:57,768
Nope, gotta go to the two. I don't want to go to the two, man.

1080
01:34:59,100 --> 01:35:19,688
This is the best way to... This is token inflation is what this is. Tokenflation. This is tokenflation. Tokenflation. Yeah, the best way to do it because you don't actually have to raise prices. You just inflate the amount of tokens that you use with each request and they just run out sooner. It's great.

1081
01:35:20,064 --> 01:35:23,368
But what I don't understand is I thought I had...

1082
01:35:24,800 --> 01:35:27,208
I thought I had extra credits, man.

1083
01:35:28,064 --> 01:35:34,056
Your 20 becomes 100, then your 100 becomes 200, and next thing you know, bada-boom, you're on the API plan.

1084
01:35:35,136 --> 01:35:39,944
Oh, well, the API plan, I mean, that's the one that's crazy. Oh, you'll go broke.

1085
01:35:40,320 --> 01:35:49,288
Oh, goodness gracious. Oh, no. Let me just take a look at here. What is this? Billing. Here we go. Billing. But I have extra credits.

1086
01:35:50,112 --> 01:35:52,500
I am literally a whore.

1087
01:35:52,944 --> 01:35:56,568
Let me see. Okay, man. All right.

1088
01:35:57,040 --> 01:36:11,832
I've never run out of tokens on the Claude $100 plan until this past week, and I hit my limit for the first time. And I've changed nothing about the way I work. It's happened twice for me today.

1089
01:36:12,240 --> 01:36:23,352
Yeah. Well, this sucks. It's Opus 4.7, and they changed the default. If I'm not mistaken, check your settings.

1090
01:36:23,856 --> 01:36:29,880
I believe they changed the default to 4.7. To super high effort. Oh.

1091
01:36:30,384 --> 01:36:37,944
So make sure your effort is the same, and you may want to change back to like 4.6 or 4.5. So where do you do that? Effort slash effort.

1092
01:36:38,672 --> 01:36:43,192
And then what do I say? Low, medium, high, max, auto.

1093
01:36:44,112 --> 01:36:50,744
is what I've always used. But I noticed today when I opened it that it was auto set to extra high.

1094
01:36:52,960 --> 01:36:53,864
you

1095
01:36:54,432 --> 01:36:57,640
Yeah, you're right, Cotton Gin. I've heard that.

1096
01:36:58,464 --> 01:36:59,656
4.7 people

1097
01:37:00,032 --> 01:37:06,216
So how do you set the model? Is it model? Yes, model. Yeah. Opus. Okay, model.

1098
01:37:07,136 --> 01:37:17,896
What is it supposed to be? It's not auto-completing for me. 4.6. But do you just point in 4.6? Does it do that? No. Model 4.6 not found.

1099
01:37:18,432 --> 01:37:19,912
Opus 4.6.

1100
01:37:20,608 --> 01:37:22,472
Opus is it

1101
01:37:22,912 --> 01:37:24,392
4.6

1102
01:37:28,160 --> 01:37:28,680
I don't know.

1103
01:37:32,192 --> 01:37:32,808
Anyway.

1104
01:37:33,632 --> 01:37:38,152
Dave, we're – I'm telling you, the local model –

1105
01:37:38,592 --> 01:37:40,552
Oh no, that's the future.

1106
01:37:41,184 --> 01:37:42,056
Try.

1107
01:37:42,464 --> 01:37:44,424
Whenever you get a chance.

1108
01:37:45,280 --> 01:37:46,056
Try.

1109
01:37:46,400 --> 01:37:47,112
um,

1110
01:37:49,536 --> 01:37:53,200
Try Qwen 3.6, the 35B.

1111
01:37:53,612 --> 01:37:54,932
A3B model.

1112
01:37:57,676 --> 01:37:58,708
is

1113
01:37:59,052 --> 01:37:59,700
is bull

1114
01:38:00,076 --> 01:38:02,772
3.6.

1115
01:38:03,596 --> 01:38:07,316
Qwen 3.6, the 35B, A3B model.

1116
01:38:07,724 --> 01:38:09,428
A3B.

1117
01:38:09,836 --> 01:38:16,372
model. Okay. Use it on OpenCode, run it locally, and run it on OpenCode. It is really real.

1118
01:38:17,324 --> 01:38:23,284
Awesome. And it's so fast. I mean, it is like it is giving me output.

1119
01:38:24,332 --> 01:38:25,972
almost before I hit the enter key.

1120
01:38:26,604 --> 01:38:28,596
It is lightning fast.

1121
01:38:29,260 --> 01:38:37,268
And this concludes another episode of what the heck were Dave and Adam talking about? It's a beautiful thing.

1122
01:38:40,396 --> 01:38:42,420
Dave, thank you, brother. Have a great weekend.

1123
01:38:44,236 --> 01:38:46,932
Done. Or whatever.

1124
01:38:52,848 --> 01:38:53,400
Thank you.

1125
01:39:01,904 --> 01:39:04,536
You have been listening to Podcasting 2.0.

1126
01:39:06,128 --> 01:39:06,744
And this is.

1127
01:39:08,304 --> 01:39:09,720
Go Pugs!

1128
01:39:12,528 --> 01:39:14,104
note all you want.
