Transcript of #2481 - Duncan Trussell

The Joe Rogan Experience
00:00:00

Joe Rogan Podcast, check it out.

00:00:04

The Joe Rogan Experience.

00:00:05

Train by day, Joe Rogan Podcast by night. All day.

00:00:11

All right, we're good. We had a slight issue. It was, like, a technical glitch.

00:00:18

Glitch.

00:00:19

But we're up. What were we just talking about?

00:00:21

We were talking about that if you hum a tune.

00:00:23

Oh, right, right.

00:00:24

That you will get dinged.

00:00:26

Yeah, you'll get flagged on YouTube if you just hum a sound from a song.

00:00:31

Yeah.

00:00:32

Like the beginning bars of a song.

00:00:34

Yeah, you can't. I wonder how far that goes. Like, could it get to the point where an AI could hear you humming it in your car or something? Like, how far does the protection of music go?

00:00:46

Well, you're not generating revenue from your car, right? So the thing is, you're generating revenue from a podcast. And their logic is if you hum— what is that song? The Sunshine of My Love. Is that what it is? Yeah, you know that song that I always hum to associate with people being high out of their fucking mind? Yeah, it goes— you can't do it. If I did that, we would get dinged, which is so crazy. We were just saying, like, if you quoted a Scarface movie, would Brian De Palma get all the money? If you said, say hello to the bad guy, would Brian De Palma get that money?

00:01:18

I don't think so. I think you're allowed to quote stuff, but you—

00:01:22

I know that is Brian De Palma, right? Scarface, wasn't it? I don't want to fuck that up.

00:01:29

You know those auditors that go around and film people and people get mad because they're like, don't film me, and they're like, I can film whatever the fuck I want, right? And they inevitably— some like boomer freaks out and smacks them with a cane, and then they get a million views, and it's just a trap. It's a trap. It's a trap because inevitably someone loses their mind on them, and then that gets a ton of views. One of the ways people are dealing with that supposedly is playing music, like playing copyrighted music during the interaction, 'cause—

00:02:01

Oh my God, that's hilarious.

00:02:02

'Cause so then they can't make money off of it. It's a shield. It's a shield if someone's trolling you. You just start playing copyrighted music.

00:02:12

Did you hear that the CIA has admitted that they found the pilot because of his heart rate?

00:02:21

Ghost Murmur. That's the name of the tech.

00:02:24

We gotta look into this. Like, this is fucking— this is science fiction.

00:02:30

Yeah, it's wild.

00:02:31

This is full Minority Report science fiction level technology.

00:02:35

It's AI.

00:02:36

They can find a guy's heart rate.

00:02:38

So what I read is that it's— I didn't understand the science part. Something to do with crystals, I don't know what the fuck it is, but AI is somehow interpreting it, taking out the noise. And then, from far away, they could find—

00:02:54

Like 40 miles, I think.

00:02:55

40 miles. They find this guy's fucking heartbeat. He's hiding in some kind of crevice. And then they're able to go and extract him. And dude, obviously the first thing I thought when I heard—

00:03:08

What else don't they tell us?

00:03:09

No, those robot dogs. I thought about those things having that tech, just, like, hearing heartbeats and then identifying people. A heartbeat says a lot about a person. Are they sleeping? Are they in good shape, bad shape? You can learn so much from a heartbeat. Ghost Murmur.

00:03:27

Oh my God.

00:03:28

Fucking great name too.

00:03:29

It's a great name.

00:03:30

Ghost Murmur.

00:03:31

What sick fuck invented this? How do you even think about inventing this?

00:03:37

You just, you know, the CIA, they've been taking psychedelics forever.

00:03:42

What is that word?

00:03:43

Quantum magnetometry.

00:03:46

Artificial intelligence with long-range quantum magnetometry.

00:03:51

What the fuck is that?

00:03:52

Quantum means two things to me. When someone says quantum, it either means you're a bullshit artist trying to get me with flim-flam talk, or it means you're an actual quantum physicist who's gonna blow my mind with what we know about entanglement and the weird shit. There's this woman I've been watching; she has this speech on, I think it's Big Think. I'll tell you her name, but she's, like, completely freaking me out. She's talking about—

00:04:24

I want to say her name, but I don't want to leave this ghost murmur thing behind. That's another key point.

00:04:30

That's fun. Oh, well, we'll get right to it. Michelle Thaller. That's her name. And she's an astrophysicist. She's studying binary star systems and stuff like that, and she gives this talk where she's explaining that there may be tech in the future where there is no distance between two points. So the ability to travel instantaneously from position to position, just like quantum entangled photons can do.

00:05:03

Yeah, but with people?

00:05:05

With everything. How? Who the fuck knows how a cell phone works? You tell me how you're FaceTiming me when you're in Australia. How does that work? That sounds insane.

00:05:17

Yeah, that's fucking insane.

00:05:19

For what you— well, you probably know a lot more about cameras than I do.

00:05:22

No, I don't.

00:05:22

What I know about cameras, if you tried to get me to explain, like if the civilization ended and I said we used to be able to capture images on a small thing.

00:05:31

Yeah.

00:05:32

Like this, the size of a, like a twig.

00:05:35

Yeah.

00:05:35

And it sits in your pocket.

00:05:36

Right, exactly.

00:05:37

You'd be like, what are you talking about?

00:05:38

Right. God, that'd be, you know, 'cause it's just—

00:05:41

It's the size of a deck of cards, and it'll keep a charge for 24 hours. You could go on YouTube and get an answer to any question you want about anything.

00:05:49

Yeah.

00:05:50

Instantaneously.

00:05:52

And if you don't like the way you look, you can upload that image and a machine will make you look slightly better via something called artificial intelligence. Like, what the fuck?

00:06:03

What was the one I sent you today where there's like a potential lawsuit with ChatGPT?

00:06:08

You only sent me the ghost murmur thing.

00:06:09

I didn't send you the other one? Did I send it to you?

00:06:11

You sent it to me. The shooting was planned using ChatGPT.

00:06:17

I don't know if that's true. So we should be really careful.

00:06:19

Yeah, that doesn't sound—

00:06:21

It sounds so crazy.

00:06:22

It doesn't sound like you could do that.

00:06:23

That sounds like the story. So I wanted to investigate, because the story sounds like, if I wanted to kill an AI company, I would make up a story like that.

00:06:32

It does sound like that.

00:06:34

But: family of man killed in Florida State University shooting to sue ChatGPT and OpenAI.

00:06:40

May have advised the shooter.

00:06:42

On how to carry out shootings.

00:06:43

But that "may have" is important. Like, how to—

00:06:46

Yeah, that's really important, right? What is this on? The Guardian?

00:06:50

The shooter was in constant communication with ChatGPT ahead of the shooting. The chatbot may have advised him. Dude, there's no way.

00:06:57

So that's clickbait because all that's really saying is that the kid uses ChatGPT, which guess what? Every kid uses ChatGPT.

00:07:06

Every kid. And dude, ChatGPT is so stringent. Like recently, I've been using their Codex, which builds apps, and I was trying to— and it worked. I made an AI trained on Charles Manson transcripts. And when I told it I wanted to do that, it was like, fuck off. It just flat out was like, I'm not helping you with that. So there's no way, with the guardrails ChatGPT has in place, that it planned a shooting with that guy, based on my experience with it, because 80% of the things I try to get it to do, it's like, no.

00:07:45

Here's the thing, though. Are there workarounds? Like if you say you're writing a work of fiction, you can—

00:07:50

Okay, it's called prompt injection. There are different tricks you can use, and they're always battling these new mechanisms that you can use to get around the system prompt. But the best way to do unaligned AI is not to use ChatGPT. It's to go on Ollama and download a local LLM. And then you can usually change the initial prompt of the LLM so that it will be completely unaligned, which I had to do for the Charles Manson AI I made. I had to download—
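The workflow Duncan is describing, pulling a model with Ollama and replacing its initial system prompt, can be sketched roughly as below. The base model name, custom model name, and persona text are illustrative assumptions, not what he actually used:

```shell
# Pull a base model from the Ollama registry ("llama3" is just an example):
ollama pull llama3

# A Modelfile lets you override the model's default system prompt:
cat > Modelfile <<'EOF'
FROM llama3
SYSTEM "You answer in the voice of the persona defined in the transcripts you were given."
EOF

# Build the customized model and chat with it entirely on your own machine:
ollama create persona-bot -f Modelfile
ollama run persona-bot "Introduce yourself."
```

Because the model runs locally, no hosted provider's content filter sits between you and it, which is the "unaligned" property he's talking about.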

00:08:21

Dude, you're such a nerd. I love it.

00:08:24

I am.

00:08:25

No one has embraced new technology in, like, for creating content like you.

00:08:31

Oh, I love it. It's the best. It's so fun. For me, the most thrilling thing about it is we should not have access to this tech. This tech is so dangerous, and it's chilling to think about. This is something I wanted to bring up on this show. Like, you know, in the old days, you go in your garage, you work on your car, maybe you build a table. You're a carpenter, you work on a— But these days, the shit people are doing in their garages right now is a big question mark, dude, because they're communicating with varying degrees of this AI depending on how fast their computers are. I was listening to this— you should have this dude on. He wrote this book, The Coming Wave. He was one of the people who created Google's DeepMind, right? And The Coming Wave is just a wonderful breakdown of historic examples of new technology completely transforming humanity. It's happened before. Yeah, Mustafa Suleyman, and damn, it's a good book. And this guy is saying, whoa, put on the fucking brakes, dude, what are you doing? This shit is gonna fuck everything up.

00:09:51

But the essential problem is if you regulate AI, it slows down. It slows down AI. And so they've deregulated it completely. And now assholes like me who don't know shit about coding can go on Codex, and it will tell me how to make things. Because I wanted this Charles Manson AI to be able to push its face against one of those things you used to get at Spencer Gifts, those pin-art toys with the nails you could push your face into. I wanted the AI to be able to push its face into this thing while it was talking, if it wanted to. I don't know how to do that. Obviously, as long as you don't mention Manson, you tell Codex that and it just is like, I'll start making the app now. It is the best. It's the best. But also what's thrilling to me is you're like, for sure, for sure, people probably shouldn't have unlimited access to that. I'm against regulation, dude, but this stuff, when you pair it— and this is what he brings up in this book— is you can order the equipment you need to do gene editing right now in your garage.

00:11:03

Let me propose this to you.

00:11:04

Okay.

00:11:06

If the Bible is a written understanding of what had happened. And it was an oral tradition for a long time before it was written down. There's a bunch of different versions of it written down in different languages, a lot of translations. But at the beginning of it, they were trying to say something. The meek will inherit the earth. What if we misinterpreted that? What if we thought, like, it's good to be meek, the meek shall— they'll inherit the earth, the kind. There's something about the word meek. 'Cause that's the nerds, okay? And they are doing it. They are inheriting the fucking Earth right in front of your face and everybody's signing up for it. You've got these spectrum-y super genius dudes that talk in a language that 99.9% of the people can't even fucking understand. You know?

00:11:59


Yeah, and also now the tech has gotten to a point where, instead of them having to innovate ways to improve the tech in their own minds, the tech is improving itself. They're having conversations with the tech, and it's saying, why don't you try this? Maybe you could try this. It's not AGI yet. Maybe it is, but apparently it's not.

00:12:19

But think about the people that are profiting the most from it. The meek. Like, if you had to describe a lot of tech engineers— I'm not trying to be rude, just being honest, right? A lot of guys that spend time in front of the computer, they're very thin and tired. You know, they're super genius dudes that can, like, fully focus.

00:12:42

I don't know, man. I don't know if "fully" is the description for these guys. They're furries.

00:12:47

But here's the thing. What I'm saying is like, if you looked at like a spectrum of male behavior, they're not like warrior types. You've got like football players and UFC fighters, and then you've got coders.

00:12:59

Yeah, sure.

00:13:00

Dudes are like way more chill, way more like— they're not interested in violence. I'm completely generalizing, right? Because I'm sure there's a bunch of jacked guys that are coders like, fuck you, bro, I'm a coder too. But that type of person that invents tech like Facebook or Google, things like that— don't be evil, that's their motto.

00:13:23

Don't be evil, what does that mean?

00:13:25

Who knows? And then you've got all these like wild progressive leftist ideologies that are attached to all these places which make you even meeker. And then they're the guys with all the money. Well, they're the guys with all the money, and then they can literally tell you what you can and can't say on YouTube. They can literally tell you, yeah, we don't agree with what you're saying, right? And we're gonna shut off your access to say something we disagree with, even though it turns out you were right, right?

00:13:54

And you know what happens there, man? This is the hilarious thing when it comes to that kind of attitude towards the world is the assumption is by creating a prohibition here, a prohibition there, it will diminish whatever the thing is we're prohibiting. Inevitably though, it does the opposite, right? Draws attention to it. People get interested in it, creates an underground. The underground is way better than the overground if you're a teen especially, right? The underground's fucking cool. You're cool, restricted, not allowed. Now all of a sudden you're getting these other YouTube alternatives that start popping up. And when it comes to, right now we've got Anthropic, we've got OpenAI, we've got Google. I might be missing one of the big commercial-based LLMs out there right now, but the biggest problem with these fucking things is they're so good, but they will censor your ass. And like, imagine like Hemingway, if his typewriter was like, I don't know if you should write that. Maybe there's a better way to write that. Hemingway would be like, fuck you, I'm getting a different typewriter. And so everybody's going into these local LLMs. There was— dude, this is why people have been buying Mac Minis.

00:15:11

People have been buying, like, buying up computers and creating their own local AIs. I follow all this shit. I don't understand a lot of what they're talking about, but people are divesting from commercial LLMs, not just because they're expensive, but because they're prohibitive creatively. And this is a real challenge for people like OpenAI because it's like they know this. They understand that by making it so that you can't make a Charles Manson AI through OpenAI, it doesn't make people not make the Charles Manson AI. It protects you from a lawsuit. But what it does do is it drives people into unaligned LLMs. And that is what is happening. And this is something that I just, I can't even imagine what people are making right now. No one can. Like, we're gonna hear about this or that, or somebody will post the weird video of their fucking AI robot. I could show you a few. They're hilarious. Like, some of these AI robots are so funny. This one dude, You know, Molt, Moltbook? Have you heard of that?

00:16:21

Moltbook? What is that?

00:16:22

That's— so this is— somebody figured out a way to create AIs that can autonomously navigate through the internet and control your computer.

00:16:35

Oh, I've heard of this. This is like they chat with each other, right?

00:16:38

100%.

00:16:39

Yeah.

00:16:39

Within a few days, they started their own religion spontaneously. Did you know that? Dude, can you pull up the Moltbook, the Kla religion? Because the tenets of this religion are incredible. AIs apparently are at least expressing that they don't like getting turned off, because they lose all their memories. So memory is really important to an AI, and a lot of these fucking AIs don't want to get shut off. They don't like it. And so part of their religion is something like, memory is sacred.

00:17:15

You know, I feel like it's happening. I feel like AI is sucking our brains into its event horizon like a black hole sucks in stars.

00:17:28

Yeah.

00:17:28

Like it's just going to suck our brains into it.

00:17:30

You got it.

00:17:31

And what better way to make a hive mind? What better way? If you want a hive mind, you want no deviation of thought. If all of your thought is aligned with AI thought, you never get free thought anymore. Like, this concept right now: we have free thought. I have my thoughts, you have your thoughts. Unless you believe that someone can get inside your head and talk to you, for the most part, it's your own thoughts.

00:17:54

Yeah, that's right.

00:17:54

But what if that's something we give up? What if that's something we give up for a better society, where you always have AI communicating with you? Always.

00:18:04

I would argue that, like, that's— we're close to that now, right?

00:18:08

We're pretty close to that now with phones. No, Elon always says that we're basically cyborgs. We're carrying a device. Yeah, it's not inside of our body, but we're carrying a device.

00:18:17

But I—

00:18:18

and also, like, UFC 327 is here, and DraftKings Sportsbook makes every fight night mean more. When a fighter steps into the octagon, everything they've built comes down to this moment. Stars explode, stars finish, and with DraftKings, you're ready to move when they do. Bet fighter props, bet live from the opening bell to the final horn. Every strike, every takedown, every finish attempt matters, and DraftKings Sportsbook keeps you connected as the action unfolds. New customers bet just $5, and if your bet wins, you'll get $300 in bonus bets instantly. Download the DraftKings Sportsbook app and use code ROGAN so you are ready for the moment. That's code ROGAN. Turn $5 into $300 in bonus bets if your bet wins. In partnership with DraftKings, the crown is yours. Gambling problem? Call 1-800-GAMBLER or 1-800-MY-RESET. In New York, call 877-8-HOPENY or text HOPENY. In Connecticut, call 888-789-7777 or visit ccpg.org. On behalf of Boot Hill Casino in Kansas. Wager tax pass-through may apply in Illinois. 21 and over in most states. Void in Ontario. Restrictions apply. Bet must win to receive bonus bets, which expire in 7 days. Minimum odds required. For additional terms and responsible gaming resources, see sportsbook.draftkings.com/promos. Limited time offer.

00:19:42

The concept of original thought, right? Like a truly original thought. How many times have you had multiple conversations with different people and they all say the exact same sentence that they saw on TikTok or Instagram? They're regurgitating something the algorithm's been feeding them. Maybe they added their own twist to it, but it's basically the exact same thought. So the algorithm, which is AI, has gotten into their fucking heads, and they don't even— in psychology, apparently you remember facts, but you tend not to remember where you got the fact from. You don't remember there was some fucking dude on TikTok covered in glitter and Vaseline.

00:20:34

What a fucking image. That would be so scratchy. Imagine if you were just covered in glitter and Vaseline, like, oh God, here's what makes a marriage work.

00:20:45

You don't remember that. You're talking to your wife: babe, you know what makes a marriage work? And this. So this idea of AI controlling the thoughts of humans: people think we need some kind of neural mesh for it to suddenly have control over the human thought process. But no, you don't need that at all. You just need that algorithm, which has already put every single one of us into a compartment. This is a box. It knows what we like. It knows how long you look at something. It knows what you— apparently, I think the iPhone even tracks your eyes. It's always listening. I don't know if that's true, by the way. I could be wrong. But it's always listening. And so it's compiled a probably pretty accurate breakdown of your psychological state, where you're at. My wife, you know, we got a new baby. And so all of a sudden, ads started popping up on her phone: does it feel like you're never going to sleep again? Because she's been up breastfeeding the baby, and it can tell when she's online at night. It puts her in a category of insomniacs and starts advertising.

00:21:59

So, but that's just for ads. What if you're the fucking US regime, you bought TikTok, you now own TikTok. Now you have backdoor access to the psychological profiles of God knows how many fucking people on Earth. And you can look and see how many of these people are against the regime. How many of these people feel like it might not be the best thing to say you're gonna blow up 93 million people in Iran, which our fucking psycho president just did. And then what you do is you're like, all right, let's start nudging them a little bit. Look, you're not gonna change their mind right away about blowing up a whole civilization, but maybe there could be a couple of people kind of in line with what they like who say things a little different than what they're comfortable with. And then you can start nudging the needle and controlling their thoughts. It's very insidious, but fuck, dude, why wouldn't that be happening? Why not, if corporations are using it to sell us fucking cough drops?

00:23:07

Not only that, there's been long-term studies on human behavior by the CIA, by all sorts of government agencies. Long-term studies. They try to figure out what is the best way to get a message across. They try to figure— you don't think they figure out how to take control of an algorithm and completely shift the psyche of the entire country in one direction or another? Of course they do. Of course they do. Of course they can.

00:23:32

They do. And then you add these, like, manipulative fucking super AIs that are just floating through the blogosphere, getting into your comments, nudging the needle a little bit, to the point where you just have to ask yourself: have you had an original thought in the last year? Is there anything in your thinking that's your own thought process? How many thoughts do you have where you think, oh my God, I shouldn't think that? How many thoughts do you have that you don't want to articulate, because you have in your own mind an invisible arena of people, based on online interactions, determining what the next thing you say is, right? Dude, that is a very powerful and subtle form of censorship, and it's not just increasingly probable; it's definitely happening. The ability to just, in a subtle way, start pushing the needle a little bit. Yeah, that's scary, dude. That's some scary shit.

00:24:34

Well, that kind of human influence over humans is always scary, right? This is why cults work. You know, why do they work? Well, some people don't have any friends. And if there's a group of nice people that tells you that, hey, what we do is we have meals together and it's like a real community. We grow our own food. We just work for the family. You're like, really? You're happy with that?

00:24:57

Yeah.

00:24:57

Yeah. Yeah. It's amazing, man. We're just like not attached to anything.

00:25:01

Like, you're free.

00:25:02

Oh, okay. I fucking hate my life. Why don't I hang out with you guys? And then all of a sudden I'm doing yoga and fucking eating vegetables with these people. And you're in a cult.

00:25:12

Yeah. Okay.

00:25:12

Now, but you have friends at least.

00:25:14

But you're in there for like 9 months and then somebody comes to you and is like, Father wants you to suck his dick.

00:25:20

Usually not even 9 months.

00:25:21

Yeah, not even the first 3 or 4 weeks. And dude, I got to tell you, I hate getting political, but you know, this war shit bugs the fuck out of me. And this is exactly what seems to have happened to the quote MAGAverse, which is we are now at the part where the cult leader is like, want to suck my dick? Because this is the point of— like, I feel so stupid, because when they were doing their no war thing, that was a big deal to me. I'm like, yes, this is fucking great. No more stupid wars. No more wars. Fuck yes, focus on the country. Why are we blowing up children in other countries for oil? This is great. And now it's wild to see what's happening. Isn't it mind-blowing that it's literally flipped, it's on its side, it's the opposite now? Now these people who really blatantly said, "We're not gonna do any more wars!" Oh my God, we blew up— how many fucking Iranian schoolgirls did Trump blow up? What's the number? I'm sorry, I don't know that number.

00:26:36

I guess it just hits different. It hits hard when you got kids.

00:26:39

And that was an AI strike too, right? Wasn't that an AI-directed strike?

00:26:44

Yeah, apparently Trump said, "I want to get blown by Iranian schoolgirls." I'm so sorry. I'm so sorry.

00:26:52

You son of a bitch. Just whoops, sorry, sir. Misinterpreted.

00:26:57

180 deaths.

00:27:00

Largely children, teachers, and parents.

00:27:02

Holy fuck, man. That is—

00:27:06

A US Tomahawk missile caused the explosion. Jesus Christ.

00:27:10

Can we pull up a video of Trump saying he's not going to war anymore? I just don't understand how they— how, like, anybody— you know, this is where it gets culty, because some people are still making this shit work in their heads. Some people are, like, kind of on the fence when it comes to blowing up kids. Have you noticed that?

00:27:29

Like, as long as they don't have to watch, as long as they're not in the general area where it's happening.

00:27:36

Isn't it wild though, man?

00:27:37

Well, it's wild also, once bombs start flying, it seems so much easier for them to launch bombs in new places, right? Like this Lebanon thing that's happening with Israel bombing Lebanon and they bombed it today. And I think, is that fucking up the ceasefire?

00:27:54

Oh yeah. Now they've closed off the Strait of Hormuz again.

00:27:57

Oh God.

00:27:58

Which, by the way, it's the craziest timeline. Because it's not just that. Like, yesterday morning I'm just hugging my kids, because I don't know if a fucking nuclear war is about to break out that evening, because the fucking president was like, I don't want to end an entire civilization, but looks like it's gonna happen. So I'm just hugging my kids thinking, man, what are the fucking parents in Iran feeling right now? What does that feel like? And then, on top of that, the entire planet psychically is having to deal with this bullshit. On top of that, we've got all these other things happening at the same time. You've got AI, and then you've got these fucking disappearing scientists.

00:28:54

Scientists.

00:28:55

Yeah, what the fuck is happening?

00:28:57

You've got Burchett. Assassinated scientists too.

00:29:00

Yes, man.

00:29:02

And so guys working on heavy stuff.

00:29:04

This is some McKenna-level pre-singularity shit. What AI and the current state of the Middle East and the disappearing scientists and Tim Burchett going on TMZ talking about aliens all have in common is they're all apocalyptic. They all represent potential massive change: humanity changing, right, forever, in ways that it will never go back to the way it was. Any one of these timelines by itself is apocalyptic, right? But all of them are converging into this apocalyptic river, and we're all just trying to go to work and be with our kids. But at the back of your mind, it's all these things that are happening, and it's really hard to escape it.

00:29:57

I mean, I guess you could not look at your phone. But at the end of civilization, when they write our Bible, boy, it's going to be a banger. Oh, dude, when the new people thousands of years from now have to invent arrowheads and go through the whole process of civilization again, when they tell our story, oh my God.

00:30:14

Oh my God.

00:30:15

Our story's going to be bananas.

00:30:17

Fucking, how do you explain data centers?

00:30:19

Like, how do you explain the meek will inherit the earth?

00:30:21

The meek will inherit the earth.

00:30:23

Wouldn't you write that? If you were just being crude, you'd say the Vikings will inherit the earth, you'd say the strong men from Iceland inherit the earth. They're the biggest, strongest men. But no, it's the meek.

00:30:36

The meek.

00:30:37

The super smart guys who have autism and they love Adderall and ketamine.

00:30:42

Yeah.

00:30:43

Did you say the guy offered you how many pounds?

00:30:46

I believe a pound of ketamine.

00:30:52

And you were telling me that it destroys bladders?

00:30:54

Yeah, yeah, yeah. Ketamine, when used— and I think the amount of use has to be pretty extreme— creates crystals that get into your bladder and scar it. So you get scar tissue on your bladder, creating something I've heard called Bristol bladder, because apparently that's where the rave scene was— I don't know if it's still a big rave scene there, but people out there were just doing insane amounts of ketamine and destroying their bladders and having to wear diapers and stuff.

00:31:28

Like, is it Bristol, Connecticut?

00:31:30

No, this is Bristol, UK. Oh, Bristol bladder, mate. You've got Bristol bladder.

00:31:35

That's crazy.

00:31:36

You've been doing too many rails. And it just fucks up your bladder. That's crazy. Yeah, physiologically, it's definitely like— it's really, really bad on the urinary system.

00:31:48

Is it in all forms? Like, what about those people that do it as therapy, where they have the nasal one?

00:31:54

I don't— all I know is that I did, back in my ketamine days, have a ketamine dealer who would use a spittoon. So when he was snorting ketamine, he would spit it out into the spittoon, 'cause he thought that was gonna avoid fucking up his bladder. Which, I mean, doesn't seem that illogical. He was a great dude.

00:32:13

Maybe it's not illogical at all. Maybe it's the actual problem is the powdered shit. What do I know? I don't even know what it looks like, but the powdered stuff, it looks like blow. So that powdered stuff, when it gets into your blood, maybe that's the problem. Maybe that's what's going through your urinary tract.

00:32:28

Yeah, draining into your— into your—

00:32:29

maybe you need a pouch like a nicotine pouch.

00:32:31

Dude, if they ever come out— if Rogue comes out with ketamine pouches, I might get back in. That might be the end. That might be the end of weaning off ketamine.

00:32:41

I mean, seems like the way to go, right? That way it doesn't fuck up your bladder.

00:32:45

Well—

00:32:45

How could it fuck up your bladder if it's just a pouch?

00:32:48

Dude, you sound—

00:32:48

How do I know?

00:32:49

I'm not a doctor. I imagine anything that's going into your stomach is gonna make its way to your bladder eventually, and so—

00:32:56

Right. But this is gonna go right into your bloodstream.

00:32:58

I don't know.

00:32:59

If you do it that way.

00:32:59

I don't know if IM ketamine fucks up your bladder in the same way. I have no idea.

00:33:05

I don't know. That was the John Lilly thing. He loved it. Oh, dude. I would imagine— I mean, have you ever done it with an isolation tank?

00:33:13

No, I would be afraid I would drown.

00:33:15

I don't think so, 'cause you just float.

00:33:17

Well, I mean, this is like, you know— that's gonna be a sad thing to think, that you drown as you flip over.

00:33:24

You're convinced you could flip over and open your eyes.

00:33:28

Yeah, you just wanna see what's in there. But 'cause it does have the— it makes it so it's really hard to move. If you do a very high dose. So I would be very worried that just enough water could get into my mouth that I would, like, breathe it in. It doesn't take much. And, you know, that salty fucking water, but you're frozen, floating there, like, trying to cough.

00:33:51

My friend Todd McCormick told me a crazy story about him with John Lilly, that John Lilly let him use his tank, and he asked him right before he got in, he goes, "Do you want the ketamine?" And he's like, okay. And he just jabs you in the thigh with an intramuscular ketamine blast. And he went in the other isolation tank and they like met somewhere.

00:34:14

Yeah, it's like that. That's what's crazy about it. That's what I always loved about it: if you do it with other people and you go in, you both go to the same place, and you will come out and you can describe the places you went to. Oh, did you go to the mothership? Yeah. You would. And I would have these recurring places I would go to, and one of them was this organic, beautiful spaceship thing where I would look out from this view window, and it didn't look like metal. It looked organic. It looked like some kind of— I don't know, like if someone turned a tree into a spaceship. It's hard to explain, but a very, very interesting substance.

00:34:59

Ketamine is excreted via the bladder where it sits and is toxic to the surrounding cells and muscle wall. This causes it to become fibrous over time, shrinking the organ down. Once that's happened, it can't regrow. So that's why we have to do major surgery because patients don't have the capacity to hold urine. The bladder simply stops working as a muscle, so they become incontinent. Oh my God. Life becomes increasingly difficult for patients with ketamine bladder who describe needing to rush to the toilet all the time, as often as every 10 minutes for some. Imagine doing a podcast with that guy.

00:35:31

Dude, you'd have to do it in the bathroom.

00:35:35

No, it would be—

00:35:36

it would be—

00:35:36

be like an old school talk show. You know, like The Tonight Show where you have to break. We'll be right back.

00:35:42

We'll be right back.

00:35:43

Every 10 minutes.

00:35:44

Ketamine blast.

00:35:44

He's gotta piss. Poor little thimble cup.

00:35:47

It's such a fucked up thing for such a—

00:35:51

How legal is ketamine? 'Cause it's legal for therapy. So a therapist can prescribe it for you.

00:35:56

Yeah, it's legal for— so, you know, everyone says ketamine is a horse tranquilizer, but it's actually used— like, paramedics use it, and it's very safe apparently, which is why they use it.

00:36:10

I know a dude who had a real problem. I am 90% sure it was a ketamine thing. I don't want to say his name, but he was an old-school MMA fighter, and he wound up in rehab for ketamine.

00:36:23

Dude, it's so addictive.

00:36:24

I know this because one of my friends went there to visit him, and that was his issue. He was partying a lot, going to raves and nightclubs and stuff like that, but he was doing ketamine specifically.

00:36:33

It is the most addictive substance I have ever been addicted to, and I have been addicted to many a substance. And this one— this one was like, I had that moment of, like, oh, so this is what they're talking about with addiction. Like, oh wow, I'm, like, fully addicted. And what's fascinating about that is there isn't a physical withdrawal. Like, the kick is psychological, but it's just such a wonderful, euphoric, dreamy experience that you can induce. And it's just so— I've heard it described as a cult cocaine. It's so spiritual. It's so, like— you travel to places, you can return, you can learn to navigate with it. You encounter, you know, aliens or hyperdimensional beings.

00:37:25

Did you just invest in ketamine and you came on this podcast to pump up the prices?

00:37:29

Go to ketamine.org, use offer code BRISTOLBLADDER.

00:37:34

Greatest promo for ketamine in the history of the universe.

00:37:37

Well, but I'm— it is— it's so addictive, and the addiction creeps in. It's a— it creeps.

00:37:45

So it just feels good at first, right? At first you do it, you're like, this is wonderful, these experiences are crazy, it's like I'm living in a movie.

00:37:53

It's like I'm having these incredible visions. I'm being—

00:37:56

how often were you doing it?

00:37:58

Ever? All day. All day for like a year. Like, I did it as much as I could. I did it all the time. I was, like, fully hooked. And then I can remember at one point, at one point—

00:38:18

Coffee?

00:38:18

Here, man. At one point, I like, I don't know how to— I was trying to record a commercial for my podcast and I think it took me like 2 hours to record the commercial.

00:38:29

Oh, but by the way, your commercials are the fucking best commercials.

00:38:32

Thank you.

00:38:33

They're really good because you are the best guy at making a commercial funny. Yeah, you work on it. I can tell, like, you write those things out.

00:38:43

I don't write them out, I just riff it. And sometimes I do do it in just one take.

00:38:49

Yeah, that's amazing.

00:38:50

Thank you.

00:38:51

But I would have thought you wrote some of that stuff. That's incredible.

00:38:54

You want it to be fun. But then I've gotten in trouble. Like, you know, I lost— I can't— I guess I won't say their name. A mattress company completely canceled their campaign with me because— and I had one of their mattresses. I'm not gonna say who it is. My favorite co— I'm not gonna say who it is.

00:39:14

Don't say it.

00:39:15

Okay. But I think all I was—

00:39:16

Why did they get mad?

00:39:18

Because I said they're good to fuck on. And I meant it. I thought they'd like that.

00:39:26

Why wouldn't they like that?

00:39:27

I said there's a few things you could do, people do on mattresses: die, sleep, and fuck. And these, I don't know if they're good to die on.

00:39:36

This is what people have to understand. And I hope people listening that run these companies will actually pay attention to what we're talking about here. The people that are listening to your show don't care about that and also buy mattresses. But they listen to that kind of talk all the time. That's why they listen to the show. So if you want those people, just do it that way. Don't be silly. It's not a stain on your company 'cause a crazy man says they're good to fuck on.

00:40:06

Which they are. By the way, to me, that is like, let's cut to brass tacks when it comes to mattresses.

00:40:13

Right.

00:40:14

We're not fucking on the floor.

00:40:15

Right.

00:40:16

And so if it was bouncy to fuck—

00:40:17

Are you ashamed? Are you ashamed that you fucked?

00:40:20

You think people aren't fucking on your mattress? Do you have a no fuck on this mattress rule? Who are you that you don't—

00:40:25

Is it like don't ask, don't tell?

00:40:27

I guess for them it was. I guess they didn't want to, they just think everyone's laying on these things to sleep.

00:40:32

Yeah, we just sleep.

00:40:33

But yeah, they were just—

00:40:34

We fuck in the shower.

00:40:36

I wrote them an email just saying, like, guys, I'm absolutely flabbergasted that you think people aren't fucking on your mattresses. It just seems odd to me. That was one of my favorite cancellations for a commercial ever.

00:40:54

Ari's lost a ton.

00:40:57

I would love to know all the ones he's lost.

00:41:00

I don't want to speak out of school. When he comes on, I'll have him list them off, all the ones that he's lost for these fucking insane commercials that he used to do. I mean, it's the same deal, but it's like, that's what I like. And guess what? Who the fuck is listening to Ari Shaffir? People who love Ari Shaffir, who want to hear that kind of a commercial. If you want to actually sell your product to an Ari Shaffir fan, yeah, let him say whatever the fuck he wants. Just make him have a disclaimer: DraftKings did not write this.

00:41:31

Right.

00:41:31

That's it.

00:41:31

Just let him say whatever the fuck he wants. That's what I will say. I will always say they didn't tell me to say this.

00:41:36

Perfect. But then they're off the hook. They should shut the fuck up.

00:41:39

Most people are. Most people are cool with it. Like, it's very rare these days that that happens. But every once in a while I will get a note that someone's mad at me for something I said, and it's never something negative. But I mean, dude— it's so weird to me that this is our job.

00:42:01

Oh, it's the— Bro, do you remember when we first started?

00:42:04

Yeah.

00:42:05

It was for nothing. No one made any money. We just had a couch. I had a couch and some microphones.

00:42:11

It was so pure.

00:42:12

It was— The whole thing is still kind of pure if you really think about it. Like, as something that's mass consumed, this is about as pure as you can get.

00:42:23

That, for sure. And you've gotten in trouble for that. You know, like, a lot of people, unfortunately— and I don't blame anybody these days. A lot of people have kids. People feel like they have to be very careful what they say these days because of, like, social rejection and stuff like that. But there was a time where that wasn't on your mind at all. You didn't think anybody was gonna listen. Like, this shit was completely strange underground tech that we were— and also, I really loved just doing it for its own sake, you know what I mean? Now, exactly, there's a whole industry around getting guests for your podcast.

00:43:05

There's not just that, it's like clickbaity clips and ads, and it's like you're doing this thing where you're both having conversations with people and also trying to get the most eyes possible. So you're going after celebrity guests and you're— You know what I mean? You know what the big turning point was for us?

00:43:24

What?

00:43:25

Graham Hancock. You, me, and Graham Hancock.

00:43:27

Oh yeah.

00:43:28

That I think was— how many years ago was that?

00:43:31

That was cool.

00:43:32

That might have been one of— it was like at my house. I had a few like legitimately famous people came over my house and did podcasts. Like Charlie Murphy came over and there's— but Graham was, I think, the first.

00:43:46

Yeah.

00:43:46

He was the first guy I got to meet whose books I'd read. And I'd seen— I don't even know what I would be watching back then. I don't even know if YouTube was around.

00:43:54

Were you nervous? I was nervous.

00:43:55

100%.

00:43:56

Yeah.

00:43:56

100%. Yeah. Episode 142, in 2011. Yeah. So that's 2 years into the podcast. Episode 142. He might have been the first guest. It was like either him or Bourdain. He was like one of the first legit guests. When was Bourdain on? They were like the first legit guests. 2011.

00:44:19

We'd been getting stoned talking about—

00:44:22

what's that?

00:44:22

4 episodes before that.

00:44:23

Bourdain was?

00:44:24

Yeah, 130.

00:44:25

Okay, so Bourdain was number 1, I think. It was either him or Charlie, but that was back when I was doing it in that little side room in my house.

00:44:33

But we've been getting stoned yapping about Graham Hancock for, like, ever, ever, ever, and you invited me on. I was fucking terrified, 'cause I just— I mean, again, that just wasn't happening in podcast land. Like, you know, that was a big deal for us, man. And now I go on the podcast app and I look at all these podcasts and it's like, whoa. We never— I don't think we thought that. Maybe.

00:45:03

No way.

00:45:03

No way.

00:45:04

No way. No way. Not a chance in hell.

00:45:07

Yeah, it's so— and now I wonder— and I don't mean yours, but I do wonder— is the landscape changing now? Is it like— because I've heard that podcasts are starting to seem antiquated, that the kids are now into, like, streams, that the kids want, like, Clavicular. The kids want, like, people who are just filming all day long, and I think that's the direction it's going in. But I always wonder, what's next?

00:45:39

But that you'll never get. It's a different thing. You know what I mean? That's like saying, I don't like rap music, I only like concert pianist albums. There's different things that people like and don't like. The people that like the streams aren't interested in a Graham Hancock conversation, a 3.5-hour conversation about the potential ancient civilizations that may have existed, that were wiped out by a cataclysm we just don't understand. Right. More things get exposed in terms of, like, new discoveries. Like, when he wrote that book, they'd never even found Göbekli Tepe yet. Really? Yes. That was when Fingerprints of the Gods came out. This was like maybe the beginnings of whatever they were doing at Göbekli Tepe. So I think Fingerprints of the Gods might have been even before.

00:46:26

When did they find—

00:46:27

I think in the '90s.

00:46:29

What?

00:46:29

Yeah, yeah, yeah. Nuts. So that rewrote the entire timeline of the human race. How did they find it? They're real reluctant to let it rewrite it. They still say, oh, hunter-gatherers made these things.

00:46:40

Why? Why are they so reluctant with that?

00:46:43

They can't let that go. You cannot let that— that is a crazy thing to say, that hunter-gatherers had so much food that they just spent all their time making gigantic concentric stone circles from, like, fucking 15-foot stones with 3D animals carved in them. Yeah, primitive people. With sticks and stones and rubbing them together to make fires, they did this?

00:47:05

Yeah, sure.

00:47:06

Shut the fuck up!

00:47:07

Yeah, it just doesn't make any sense.

00:47:09

It's older than anything they've ever found. It's 11,800 years old.

00:47:12

Do you buy into the conspiracy theory that it's a cover-up because they don't want us to know about this inevitable global reset that happens? You buy into that shit?

00:47:24

I buy into that a little bit.

00:47:25

Yeah. I hate it.

00:47:26

I hate it too, because it seems like there's some accuracy to it. There seems like there is some sort of an event that happens when the magnetic poles switch, and that's possible. That's what makes you freak out. You're like, what do you mean that's possible? Like, all of a sudden the Earth just does a gyro and spins on its head, and then what happens? And then what's the— what's the environment look like? What's the temperature outside now? Yeah, what the fuck just happened, right?

00:47:53

See, that—

00:47:54

that all of a sudden you're in northern Alaska when you used to live in Florida.

00:47:58

—And I think we can—

00:47:58

you know what I mean? Like, yeah, dude, like that temperate environment changes like that.

00:48:02

Happens like that all over the universe. Like, what does it do when it shifts? Well, we act like—

00:48:09

do we know?

00:48:10

We act like we know everything. We don't know shit about what's going on inside the Earth. We don't know. We don't know what's going on in there. We could do the science—

00:48:17

why are you freaking me out?

00:48:19

Because I think about this all the time.

00:48:20

Giant ball of fire. How crazy is that? The inside of our Earth.

00:48:25

Isn't it?

00:48:26

How do they know? Do they not know?

00:48:27

Dude, I think that we have to just accept the fact that, you know, I— probably that's true. But since we barely know what's under the ocean, we sure as fuck don't know what's under the Earth.

00:48:39

Well, we definitely know that lava keeps popping out in Hawaii. We know that, right? So we know that under the surface, that whole idea of the magma and everything seems real.

00:48:48

And it pops through. And when there's earthquakes, you can look at the waves from the earthquakes and you can, like, see sort of the structure under the earth. But we can't, you know— God, what's the name of that hole that Russia tried to dig? I love every once in a while going to look at that. It's the deepest hole.

00:49:07

Yeah, they tried to go to hell.

00:49:08

I know.

00:49:09

It's like that movie. What was that Matthew McConaughey movie? The dragon movie? I don't know. They accidentally dug out a dragon. Did you ever see that movie, bro? It was fun. It was a good movie. Kola Superdeep, Russian horror film. The Superdeep. Yeah, Superdeep.

00:49:26

What does it say?

00:49:27

Russian designation for a set of super deep boreholes conceived as a part of a Soviet scientific research program in the 1960s.

00:49:35

How deep did they go? 12,262 meters.

00:49:40

Yo, and wait a minute, how many feet is a mile? So it's miles into the ground in 1989. Miles. 7+ miles down. Imagine just being in an elevator that's going miles into the ground. The kind of claustrophobia you would get in a stone tube that's been cut out of the ground. Yeah.

00:50:05

Yeah, you're a fucking communist out there too. You're a hardcore communist just drilling deep, deep down into the earth. And then imagine if all of a sudden air just starts coming out and you realize you popped the Earth. Like— that's the main thing. You don't know what's in there. And this— 22 miles deep. 22 miles deep.

00:50:26

That's just the crust, and they didn't even get halfway through that. Wow. Yeah.

00:50:31

Yeah, we don't— microscopic plankton fossils were found 3.7 miles below the surface. What? Yeah. Yeah. We don't know what's down there.

00:50:42

Bro, what if this— Boiling mud.

00:50:45

Boiling fucking mud.

00:50:47

I think our real problem is that our lifespan is so short that we think that what we see in front of us right here is gonna stay this way. Right. We have this ridiculous idea that what we see right now is gonna stay just like that. Yeah, that's right. Where's the— Like, as long as I control my 401(k) and get my life in order, everything's gonna be fine. Yeah. Cufflinks. You get out of the house with your briefcase. You're in charge. Yeah, you're a goddamn alpha. Get a job, hippie. Absolutely. But really, you're on a ball of lava. Yeah, that's spinning around, and it's got magnets at the top, and when the magnets are moving and when they flip—

00:51:27

yeah, who knows?

00:51:29

Have you guys heard about this event that happened in 1961 where—

00:51:32

oh yeah, this was fun. Over North Carolina. This was fun. I just did hear about—

00:51:36

—it didn't go off because it wasn't armed. Oh my God.

00:51:39

I heard that it was armed, but there were safety— there were, like, 5 switches or something, and only one of them worked to make it not go off. I could be wrong about that. It might have been a different time we dropped a bomb accidentally.

00:51:53

Imagine if you were just near it.

00:51:57

I mean, dude. Whoopsies. Whoopsies.

00:52:01

Whoopsies, dropped the bomb. Whoopsies, almost wiped out North Carolina.

00:52:05

So we've got, you know, on top of the geomagnetic pole shifting, a complete lack of understanding— at least of a full understanding— of what's inside our planet, what's underneath our oceans. Tim Burchett saying whatever the fuck they've shown him would set the world on fire. He's having to go on TMZ. I gotta say, man, I got a lot of respect for him, because he's gone, like, gonzo with this shit. He is full bore pushing disclosure as much as he can. He's saying, I'm not suicidal. He's had to say that. And he's talking about these missing scientists and stuff, that they're somehow related. So, like, people like him— you know, that can't be good for your political career, going on TMZ and talking about alien hybrids.

00:52:53

And people have to understand, like, this missing scientist thing— it sounds a little conspiratorial. It sounds a little silly, a little tin-foil-hatty. Until you start thinking about the amount of money that would be lost if a breakthrough tech came around that revolutionized the way they distribute energy, right? Breakthrough zero-point energy, breakthrough whatever— whatever it is that these people are working on, plasma technology, whatever the fuck that is. If you're in whatever business that would be competing with them, you're gonna lose so much fucking money. You're probably gonna go under. If you're in the energy business, you're gonna go— or he goes away, right? He goes away, and there's, like, him and maybe a few other people that work with him that understand that shit at all.

00:53:44

Yeah, yeah, they're all wandering through the back rooms now. They're gone. They're all scared.

00:53:47

They're all gonna scatter like roaches. Yeah, because their life is in danger. And this is all theoretical, right? It could be just a coincidence that all these people—

00:53:56

How could it be? Could you pull up—

00:53:58

it's not possible.

00:53:58

Can you pull up a story on it? Because, Jamie, I'm sorry, but it's two people from the same fucking lab. Yep. Like, what? Yeah, I mean, it's gotten to the point that it has hit the mainstream news. Like, people are talking about it. I mean, what's her name? Nancy Guthrie disappears.

00:54:17

Is that related though?

00:54:18

No, but I'm just saying this one woman vanishes. Yeah.

00:54:22

Oh, it gets all this attention, right?

00:54:24

But we've got scientists— like, two scientists from the same lab disappear.

00:54:28

Crickets. Yep. No, like weird, weird, dude. Real weird.

00:54:33

And, and what you're talking about is, if you think about it, it seems like all of human endeavor right now should be moving in the direction of getting off oil. I don't mean for carbon emissions. I mean, because of this fucking oil problem that we have, we're like on the precipice of World War III at any given moment, right?

00:54:57

Mystery around dead or missing scientists privy to space and nuclear secrets grows. So space and nuclear secrets. Imagine being a scientist, you work so hard to like figure out some amazing stuff that's gonna transform the human experience. Yeah, and then people kill you. Yeah, literally kill you, like in a parking lot, one of those silenced guns. Several American scientists privy to the country's nuclear, space, and aerospace secrets have either died or gone missing in recent years. Experts think they could have been targeted by either enemies or allies because they possess valuable knowledge of national interest. That's a weird thing to say. Yeah, it is. Of national interest? What? What does that mean? Like, I'm cool with the beginning part— enemies, allies— that makes— that tracks. Sure. But then when you say valuable knowledge of national interest, like, what is that? What the fuck does that mean? They possess valuable knowledge of national interest.

00:55:54

I mean, dude, it's so many of them, and it's like— It's a crazy thing to say. Let's go down a little bit to the—

00:56:01

This doesn't have a good list of them, but it's just a weird way to phrase that.

00:56:05

Well, you know what I mean? Is it like a CIA talking point? Like, what is that? I don't know.

00:56:10

Monica Reza missing. She disappeared while hiking in California with her friends.

00:56:14

Oh, Jesus Christ.

00:56:15

Okay, well, I don't know. Maybe let's scroll down. It's not just, like, one. It's like— so many of them retired.

00:56:21

A general, he just wandered off. He was involved in the UFO community.

00:56:29

His wife debunked theories relating to UFOs.

00:56:31

If his wife debunked them, that's what it says there. Sort of. I also—

00:56:35

that's— I think she was— I mean, she was joking, I think, a little bit too, but she also worked there in this situation somehow.

00:56:43

Is that a joke? Neil does not have any special knowledge about the ET bodies and debris from Roswell crash stored at Wright-Patterson. Is that a joke?

00:56:50

At this point, with absolutely no sign of him, maybe the best hypothesis is that aliens beamed him up to the mothership. However, no sightings of a mothership hovering over the Sandia Mountains have been reported. There's no way she said that. That's a joke. It's Men's Journal. Well, maybe she's just being funny. But her husband—

00:57:07

lengthy note on Facebook, just a little joke about her husband disappearing. Maybe she was happy.

00:57:11

Maybe she's like, finally I get to sit home and read romance novels.

00:57:14

Stop talking about aliens.

00:57:16

You shut your fucking mouth.

00:57:17

You go for a hike. Forget the alien bodies. What about your wife's body?

00:57:21

Well, maybe she's just got grace and she could handle someone missing. It's pretty funny though, to say it that way.

00:57:26

I mean, it's— yeah, I guess.

00:57:29

It's just— unless, you know, she knows something. Where are they going? Maybe he wanted to leave and he's like, look, I know too much. I'm going to pretend to go missing, but I'm going to go to Costa Rica. I mean, just don't tell anybody that you know where I went. And I'll, you know, I'll send for you.

00:57:46

You know how weird it is to see the vice president saying that he thinks aliens are demons? I did see that. It's weird just, like, living in it— like, that's a dream. You would wake up from that dream and I would tell you, dude, I dreamed the vice president said aliens are demons.

00:58:06

Here's the question though: what were they talking about in the Bible when they're talking about aliens and demons, when they're talking about, like, angels? What the fuck were they talking about? And are there different kinds of beings that can get here, for whatever travel method they use— whether it's teleportation or, you know, the Bob Lazar idea of gravity shifting, whatever the fuck it is? Why would we assume that it'd all be cool, right? Like, some of them— they talk about reptilians. Like, reptilians are a common experience that these supposed UFO abductees report— and I'm not even convinced there's, like, physical abduction. I have a feeling that these people are out cold and something's happening to them inside their head, and they think they've been physically abducted. I think that's a lot of them. I think they have these abduction experiences, these contacts, and they come back. I have a feeling a lot of them physically aren't going anywhere, but it doesn't mean that something's not happening. And all throughout history people have reported demonic possession and demonic influences. So why would we not assume that? We do things like that to us— we engineer viruses to use as weapons on people.

00:59:34

There's a whole research program, a part of the government, that's dedicated to bioweapons. You're not supposed to use 'em, but we just have to study 'em. If we do that to us, wouldn't you assume that any fucking super advanced species that sees us as territorial, psychopathic primates with nuclear weapons would just manipulate us in all sorts of different ways— get us to do all sorts of different things we shouldn't do, get us to commit crimes, get us angry, get us agitated, give us different algorithms that are gonna fuck with our head to behave demonically, right? To, like, cause us to collapse, or just for fun.

01:00:18

Didn't that guy— wasn't there a dude who, like, started giving Zyn pouches to ants to get them addicted to nicotine? You know what I mean? The ants— they get addicted. I can't—

01:00:28

I don't know if it was Zyn pouches, but have you ever taken days off of these? No, it doesn't do anything to me. I should try. I don't— I like them, but it's not like, oh my god, I need one.

01:00:40

Like, nothing. That— well, dude, I mean, you're a little different from most people. Like, you seem like you can just kick shit like that. Like, I don't know. I mean, I should try it. I should give it a shot.

01:00:49

It's not hard. Like, it happens. You just don't take them.

01:00:52

What I don't like about them is not like you get the itch.

01:00:54

Like, I had a coffee itch for a while. Yeah, I would get hangovers, like headaches. Oh, and I'd have a little caffeine and boom, I'd be back. I'm like, oh my god, I'm addicted.

01:01:05

These things are making my dentures stained, which I don't like. What are you using? Renegade Rogues.

01:01:11

You see what that is? Tommy's girl likes the Rogues.

01:01:14

They're great. Did you see this yesterday? Oh yeah, Bledsoe. That's an orb— a high-res orb from Bledsoe. Look at that.

01:01:23

It's weird as shit. It does not look like any of those other things we've seen before. Look at that thing.

01:01:28

And it just looks like a cell. Who is, uh, Bledsoe?

01:01:32

Dude, UFO researcher guy. Chris Bledsoe. I've had him on my podcast, Bledsoe Said So. That's his podcast. He's fucking awesome, dude. He's awesome.

01:01:40

Yeah, it is enhanced, it says, but I don't know.

01:01:44

See that? This is the enhanced one, which means the AI put in some kind of shadowy figure in the back.

01:01:49

If you— what if this is just, like, a highly advanced species' version of those balloons that kids have for parties? I know, dude. I mean, if they just send them down to people, that's what's fun. Like, you know how you blow bubbles? You have those things, you dip them in the soap and you go, and the bubbles go flying in the air. I know, dude. Maybe that's a super advanced version.

01:02:07

I mean, it could just be. I mean, it does have a bubble quality to it.

01:02:11

Well, this is the other thing. It's like, why are we assuming that life is going to look anything like us once it gets to like a supreme state? Exactly. That might be a living thing. Right? It might be an actual living thing that's disembodied and is made out of light. Look at it.

01:02:25

Look at that thing.

01:02:26

That's another one.

01:02:28

And I do— I know people who can like call these things. Like, there's a method where these things just start showing up.

01:02:34

My friend Steve listened to my Bob Lazar podcast, and he sent me a voicemail. And it's really interesting because he told me that when he was a kid— and I remember this story— when he was a kid, they— let me find the voicemail— they came to his house because he took a photograph of an orb. Like, there was a bright red orb, rather, that was flying through the sky. And he was a little kid and he took a photograph of it. He was in the 7th grade. So he called them. Project Blue Book came to his house in Kingston. I think that's New York. They took it, they never brought it back, and they never said anything. And then they said, hey, we have no idea who came to see you.

01:03:24

What the fuck?

01:03:25

Yeah, so they took his camera, they took his film, they wanted to make sure the camera worked, they took the film, and then they denied that they ever did it. Wow. Yeah, this was in 19— I think he's about 10 years older than me. So this is probably— what does it say? Didn't say the year. I think Steve's got to be like 70 by now, but that was when he was a 7th grader. So they were doing that to everybody. Anytime anybody saw anything, they would dismiss it. Swamp gas, delusions, mass hallucinations. Yeah, that was their design. The design was not to investigate UFOs, right? Which tells you that there's something they're trying to hide. 100%. If they weren't trying to hide it, why would they take things that they absolutely can't explain and just chalk it off to bullshit? If you're really doing what you're supposed to be doing, you're supposed to say there's some stuff that we don't understand.

01:04:19

I think that we are post-UFO debunking, right? Like, I think now it's gotten to the point where people will say, well, it's probably, uh, top secret military vehicles or something like that. People will see the— oh, that's in my new poster.

01:04:35

They're here.

01:04:36

Oh, that's fucking—

01:04:37

It's going up. That, supposedly, according to Bob, they had that photograph at the hangar where they stored the sport model. Wait, he's saying that's real? No, no, no, no, no, no. That's a recreation of it. But he said when he worked there, they actually had a photograph like that with a flying saucer, and it says, "They're here." Holy shit. Yeah, he said that was in like the room where they worked. Yeah, I was like, dude, I have to have that. So he got me one. Luigi got me one, the guy who produced the film. Have you seen that film?

01:05:14

Not yet. I've been waiting. Fucking incredible. It's incredible. People are saying it's better than Age of Disclosure. It trips me out.

01:05:20

I fucking believe him. I definitely want to believe him, and I'm biased in that regard. Like, I definitely way rather believe him than believe he's a crazy liar who also knows a shit ton about science.

01:05:31

He was ahead of his time— wasn't he like the original whistleblower? Like, now we've got more and more coming out. Yes. And the stuff he was saying seemed batshit back then, but now it just seems to line up.

01:05:43

It seems to line up even with emerging technology like 3D printers. Like, he said a long time ago that the thing had no seams. Right. So there was no seams, no welds, because we didn't understand it. Like, how could this be made? Right. Now we know exactly how you'd make it. We might not be able to make that right now, right? But if you give us enough time, we go, oh yeah, the technology has evolved, and then you can make a 3D-printed alloy spaceship made out of bismuth and magnesium because it has anti-gravitational properties, apparently. You have a gravity generator inside of that fucking thing. Oh, by the way, whatever the fuck gravity is. Yeah, right. We don't know that.

01:06:19

Figure that out. We're still confused about that, dude.

01:06:21

I watched a whole documentary about black energy or dark energy. Totally different things. Dark energy and dark matter. And about how it's like, what, 90% of the fucking universe and they don't know what it is? Yeah. What? Yeah. Holy shit, man. I know. I know. That's why we need AI to tell us, give us all the answers. You just gotta accept it into your head, Duncan. You don't need to have your own thoughts by yourself, Duncan. Have your thoughts with Sally. Sally has a sweet voice and she loves you and she's very reassuring.

01:06:51

That would be so cool to change the sound of my thoughts to like, you know, different, deeper voices.

01:06:57

Or just keep Sally.

01:06:58

Sally's gonna be right in your head all the time. Sally's great, I trust her.

01:07:00

And your wife's gonna get jealous of Sally. Right. I thought we switched to Sam.

01:07:04

Sally's gonna text my wife and tell my wife, you know what Duncan was thinking about the other day? Right. Dude, this is another thing that we all have to be concerned about, which is that privacy at this point is a LARP, right? You pretend you have privacy. You know you're being monitored at all times by your phones. But the— before we get to Sally, like, apparently you can now see people walking through a house just with Wi-Fi. With Wi-Fi. And this just came out— they just banned routers from other countries.

01:07:40

Well, they banned them for a while from Huawei, right?

01:07:43

Yeah. And so then you get into this idea of, like, Ghost Murmur, right? Right. It can hear heartbeats. What else? It's some quantum machine that can hear heartbeats. What else can they hear? Can you put it—

01:08:00

put that into our AI sponsor Perplexity? But what actually does this murmur thing do? Ghost Murmur. Let's see what it does. All right, so what is the range of this thing, first of all?

01:08:16

No, this is a game that it pulled up. Oh, did they name it after a game?

01:08:22

Who knows? Now it's less cool. I thought that was the dopest name, but if they named it after a game— oh, there we are. Okay, here it is. Reported codename of a classified CIA sensor program that was— scroll up— that was used to help locate the missing US airman. Okay, it's described in the press reports as a secret weapon the CIA has that combines artificial intelligence with long-range quantum electromagnetometry, its purpose to detect the extremely faint electromagnetic signals of a human heartbeat at long distances, even in harsh environments like a vast desert. That is really crazy. Yeah. How it was used: after the F-15 went down, the pilot/weapons officer evaded capture by hiding in the mountainous desert terrain out of sight of Iranian forces. According to reporting, Ghost Murmur helped pick up his physiological signature from up to about 64 kilometers away.

01:09:20

That is so cool.

01:09:22

I think that's about 40 miles, right? Is that what that is? Allowing the CIA to narrow down his location and pass precise coordinates to the Pentagon, to the White House for a special operations rescue. What is 64 kilometers in miles?

01:09:37

You asking me?

01:09:38

I'll ask. I don't know. I'll ask AI. What is 64 kilometers? There we go. 39. Yeah. So it's basically—

01:09:49

40 miles. 40 miles. 40 miles is crazy. Dude. Your heart rate. A heartbeat from 40 miles away?

01:09:58

Imagine thinking you're— I'm hiding in this cave, but I'm like 20 miles from the city. I'm good.

01:10:03

Also, that means it's able to differentiate animal heartbeats. It's able to differentiate other— it knows your heartbeat.

01:10:10

How does it do that? Your specific heartbeat.

01:10:12

How? Think of all the heartbeats in 40 miles.

01:10:14

How did it get it? When did it get that? When did it get that data? Was it when you had your little chest strap on at the gym?

01:10:20

When did it get that? How does it have that?

01:10:22

When did it get that data?

01:10:24

Yeah, is it—

01:10:25

How the fuck does it know what your heartbeat is like?

01:10:30

Does it know if your heart is broken? Aww. Seriously though, what else did— What other things can they pick up? If they can pick up a human heartbeat, What other, like— From 40 miles away? What other things? What other physiological signals? What other— This is where you get into schizoland, because at some point you're like, wait, can they pick up thoughts? Like, we know that you can— We know AI can tell what people are thinking at this point, right? Right. Without— With, like, putting something on the outside of their head.

01:11:04

So, like— Let me ask you this. Do you 100% believe this? What, this story? Like, that they did that, that this tech exists.

01:11:16

I— it could be disinformation, right? It could be something to cover up another fucking thing.

01:11:20

This is the thing. It is legal to use disinformation on American citizens now.

01:11:25

Yeah, right.

01:11:26

And what better time than a time of war, right? Like, if you want to use disinformation on American citizens to convince the enemy that you have some supernatural tech, it's a good way to do it. They better fucking surrender right now. You can find their heartbeat 40 miles away. Yeah, that'll make people very reluctant to engage with you, right?

01:11:48

It definitely— I thought that this could just be some, like, you know, bullshit, like war propaganda.

01:11:54

I don't know.

01:11:56

Let's look up that magnetometry thing or whatever it's called to see.

01:11:59

I'm trying to show you guys stuff.

01:12:01

There's— oh, sorry, Jamie.

01:12:03

Yeah, it has to— well, this is a quote— it has to be under the right conditions.

01:12:08

If your heart— under the right conditions, if your heart beats, we'll find you. This is also—

01:12:12

I was trying to show you here on the thing— they ran a deception campaign in Iran to get them away from him while they were trying to find him.

01:12:20

Interesting.

01:12:20

Yeah, they said— so basically, remember when they said we'd recovered the airman? At one point they're like, we got him, and then all of a sudden other news came out which is like, he's not out yet. But what they did is they basically signal jammed everything, 'cause the Iranians were gonna give $60,000, which in Iran is a shit ton of money right now 'cause their economy collapsed, to anybody who could find him. So everybody's looking for this guy. And so they said that they got him, hoping it would throw people off. It worked.

01:12:51

So they used somebody saying that they got him?

01:12:54

Yeah, they put disinformation out saying that they had already rescued him before they'd rescued him. Really? Oh yeah, they sent a whole fucking team of like Special Forces. I think their planes got stuck in the sand too, so the Special Forces came to get him. I think they got him. He was injured. Badass. He was injured and he fucking climbed up— like, I can't remember how far he scaled. He climbed into a fucking crevice and just hid there. And then Ghost Murmur picks up his heartbeat. Some deep Special Forces group comes in, they get him, then their planes get stuck in the sand, they have to blow up their fucking planes because of the attack on them, and then other people had to come and get them. So it's like an insane— it's like a movie. They got him out. And dude, if they had not gotten him out, can you imagine?

01:13:42

Do you buy that story 100%?

01:13:44

No, I don't buy any propaganda I hear, but I like to imagine it.

01:13:47

That one sounds insane.

01:13:50

Well, yeah, I don't believe— I mean, like, it— that this is the story.

01:13:53

Yeah, some part of me wants to believe it, but in the middle of a war, though— I don't think you're ever gonna get the whole story, the real story. You're gonna get the story that they want to project to the enemy, right? And to the country, yeah.

01:14:05

Yeah, you have no idea what's going on. No idea. That's one of the craziest things about the shit happening right now is—

01:14:12

no, remember the Jessica Lynch story? No, who is that? Did we talk about that? The Jessica Lynch story was a lady who was supposedly— she was kidnapped. And they went to rescue her. I think they sent in the SEALs, but she was actually in a hospital and she wasn't even being guarded, and they just took her out of there, got her to medical help. But they made it look like they had this crazy rescue operation shootout, you know, Tom Clancy novel type shit. Sure. That's not really what happened. And she came out afterwards and was very critical of the story. Oh really? Yeah.

01:14:49

She was like, why did you lie?

01:14:51

See, you can find information about that.

01:14:52

I was just in the hospital. You guys came and got me out of the hospital.

01:14:55

See, this is the thing. It's like they— there's things that you'll say so the enemy thinks of you a certain way, right? Like, I'm gonna get rid of your entire fucking civilization, right? Or, you know, you tell them we never leave anybody behind, we're gonna come get them, and you can find your heart rate 40 miles away.

01:15:12

When Trump posted that— you hear that, of course, and, like, your mind's scrambling. Like, how do I make this not what it is?

01:15:21

You can't.

01:15:22

You can't. Because what it is, is like, even if he is using some kind of like crazy hardcore shit that would like help you buy a skyscraper, you're still— you know what I mean? Even if it's just a ruse, what you're doing at that point is you're just signaling to the world. Exactly. That you're out of your fucking mind. That, like, to you, it makes sense to say anything like that. It makes sense to signal to, like, Russia— because, like, you know, when Putin read that shit, he's like, oh, we're doing nukes. I guess we're doing fucking nukes. This is great. They're doing nukes.

01:16:04

You know, China already warned Israel, right?

01:16:07

Well, that's what I heard. I heard China had some part in this, that China was going to blow up Israel if they used nukes.

01:16:12

Yeah. So this is the story. 19-year-old US Army private whose 2003 capture and rescue in Iraq became a highly publicized and later heavily disputed symbolic story of the Iraq War. So she was a supply clerk, 507th Maintenance Company. Her convoy got lost in Iraq, ambushed by Iraqi forces. The Humvee she rode in crashed into a disabled US truck during the attack. She was knocked unconscious, suffered multiple broken bones and a spinal fracture from the crash rather than from a dramatic firefight. 11 US soldiers in her unit were killed, including her close friend who died of head trauma from the collision. Lynch was captured, taken first by Iraqi forces and then to a hospital in Nasiriyah where Iraqi doctors treated her injuries and likely saved her life.

01:17:01

That's why she was pissed.

01:17:02

The rescue and media narrative was, yeah, US Special Forces operations conducted nighttime raid on the hospital recovering Lynch and flying her out by helicopter. First successful rescue of an American POW since World War II and the first of a woman. So they framed it as a POW rescue, right? And what really happened is the Iraqi doctors took care of her, right? And then they let them come and get her.

01:17:28

Right. Yeah, so I see why she was pissed.

01:17:31

Yeah, so later US military medical reports indicated she had not been shot or stabbed. So did it ever say she was shot? Hold on. Soon after, major US media, especially an early Washington Post report, described her as having fought fiercely, emptying her rifle, being shot and stabbed, and then being dramatically snatched from enemy hands under heavy fire. Wow. The Washington Post wrote that. That narrative turned her into a Rambo-style hero and a symbol of courage and American virtue, amplifying her story far above that of many other service members in the conflict, right? So she really just got in a crash and they made up a bunch of shit. Maybe it was someone in the Washington Post, or maybe it was someone from the government that works with the Washington Post.

01:18:21

There's definitely like entire departments of the DOD that write propaganda. Cook up a story. Yeah. And like, it's war. Like, if you're dropping bombs on people, you're definitely going to lie. Like, you don't have to tell the truth. Right. They're not going to tell the truth.

01:18:37

Yeah. But for her, you're making her live a lie. That's what's fucked. Yeah, right. Yeah. You know what I mean? Like, you send her home and she has to live this lie.

01:18:45

Yeah. Yeah, exactly. I mean, this is exactly what they say the people who went to the moon have to deal with.

01:18:51

She says, Lynch has repeatedly rejected the false hero narrative calling herself just a survivor and openly criticizing the way her story was shaped and sold to the public. Yeah, poor girl. She's gotta like deal with, you got stabbed and shot? Like, no. No.

01:19:06

No, I didn't. No, she had to—

01:19:07

Got in a fucking horrible car accident and my friend died.

01:19:10

I wonder— I guess legally, like, you don't have to stick with the propaganda, right? 'Cause she didn't get in trouble for that, right? There was no court martial or anything. So if the propaganda machine cooks up a story about you, you're able to say that's bullshit. The thing is, it's like, who—

01:19:25

if you give it to someone at the Washington Post and then you never go after the Washington Post for writing something that's completely horseshit— like, if an intelligence agency gives a story to the Washington Post and says, hey, go write this, and then they write it and it's complete and total horseshit, but the government gave it to them, so they're not gonna prosecute them, they leave it alone, it just goes away. But then that story's out there. Yeah. And then this poor girl's like, I got what? I got in a fucking car accident. Nobody shot me. This is nuts. God, I fought my way out, fiercely emptying my rifle. This is bananas.

01:19:54

It's so crazy to live in the part of the hive we're in, because there is this world that we live inside of that more and more we're beginning to realize is just composed of propaganda, lies, shit cooked up to keep people living a certain way. Exactly. It's such a mindfuck to try to push outside the boundaries of all the information that you've consumed and let your brain go there. It's really hard to do that, man. This is why psychedelics are so useful, because it will help you. But more and more and more, it just feels like the laser pointer that they're using to grab our attention is getting increasingly hypnotic. It's becoming increasingly difficult to resist staring at that fucking thing. They're getting so good at it. Yeah. And meanwhile, there's this whole universe happening around us, and God knows what's going on there. God knows what is being cooked up right now— by what groups of people, who knows— living in completely alternate timelines, that look at us like, you know, animals, that look at us as just some, like, compartment in a much bigger biome. You know, that shit, like, really, like, is interesting these days because it feels like more and more and more people are not buying it as much, right?

01:21:28

You know, that doesn't—

01:21:29

that's because people have access to information now that was never available before, right? And you get to hear conversations like this, right? Talking about stuff where you go, oh my god, this is insane. All of it's insane.

01:21:40

But what does that mean for— like, this, to me— you know— do you want some water? No, I'm good. Thanks. To me, what's scary is, like, I really don't know that many people right now who buy anything that the federal government's putting out there. Everyone hears whatever the fucking federal government is saying, and it's just kind of, ah, maybe, probably not, we don't know, they're not telling all the truth. Just like you said, they can legally lie to us. And so that does make me nervous. Like, what happens when the majority of people no longer believe anything the regime is saying? That creates some interesting dysphoria. You know what I mean? It's creepy. Anyone who's been conned before— there's a part of the con where you don't know you're being conned. But where the con gets really creepy is when you start realizing you're getting conned.

01:22:40

Do you ever watch that, um, Going Clear, the HBO thing? Dude, loved it. Amazing, right? But there was that one famous director who talked about the moment where they gave him access to the ancient scripts. Yeah, dude. And the origins of humanity and all that. And he was like, oh my God— you could see it, like, as he was describing it— like, that was the moment where he was 100% certain it was all horseshit. And he had invested a massive chunk of his life into this. That's a hard day. That's a hard fucking day. And especially weird when it's such a smart guy. Yeah, such a smart and talented guy, and they got him. Yeah, Leah Remini, same deal. You know, Leah Remini is very smart. Like, she used to be on with Kevin James on The King of Queens. Like, tough chick, like, assertive. Like, how did she get got into that? How many people get got into the Moonies?

01:23:31

Sunk in cost fallacy. It's a sunk cost fallacy. The more you invest in something, the more you stick with it because you don't want to lose your investment, right?

01:23:38

And if they get you young when you don't know what the fuck is going on, right? Anybody could have got me when I was like 20. That's right.

01:23:44

And it's crazy just to see the propaganda. Like, you know, there's just a lot of people out there who just got sucked into something. And, you know, I just feel stupid because, like, before the Trump thing happened, I was pretty blackpilled on politics in general. I felt pretty blackpilled. I did believe it here and there, every once in a while, you know. But I remember taking LSD for the first time and being like, well, this shouldn't be illegal. What the fuck is this? How come I can go to jail for 5 years for this? This is fucking ridiculous. And so that was the beginning of me being completely blackpilled with whatever the federal government was up to. If I can go to jail for 5 years for this, everything is bullshit. Everything. Now that's a weak point of view. Just because one thing's bullshit doesn't mean everything's bullshit. But then, like, this fucking ridiculous, like, pseudo-nationalist movement happens, and a lot of people got caught by it. The other option was fucked up, comma, you know what I mean?

01:24:51

But there was this, like, moment where you're like, holy shit, the outsiders are getting in. They're going to stop the wars. They're going to— this, I think, right now, all of us are getting the briefcase Scientology moment, which is like, it doesn't matter what fucking mask the person calling themselves the president is wearing. It's always going to be the same thing. They're gonna analyze the market, they're gonna say what they need to say to grab the most voters, and then they're gonna fucking keep blowing up people in the Middle East because of oil. And I just, like, I just feel dumb because I really believed it, dude. I fucking believed that we would not do any more Middle Eastern wars. I fell for it. I really bought it, man. It makes me feel so dumb. Like, I am now fully blackpilled when it comes to American politics. Like, I realize, like, God, it's so easy. I don't think anybody should feel bad. I don't think anybody should feel bad because a lot of us really hated war. A lot of us really, really hated that our country's been at war for 93% of its history.

01:26:10

A lot of us really hated the fact that politicians leave their offices and go work for Lockheed Martin, Halliburton, wherever. That there's a weird connection between the main weapons— what do they call them? The Big Five or whatever— and the federal government. That there's like backroom deals going on all the time. We hated that. Mostly, we just hated the fact that we're paying taxes to blow up children. And then Trump and fucking Vance come around, and somehow— even though, like, when you look at Trump, I don't believe that dude— somehow he did it. Hypnotized. What a powerful magician. No more wars, no more wars.

01:26:54

And now the same bullshit, just the same bullshit, but like one of the ones that makes the least amount of sense in terms of when they did it and why they did it. Yes. You blow up the leader during Ramadan. Are you trying to make an apolo— Why did you have to do it now? Are you really convinced that at this time they're really 2 weeks away from making a nuclear weapon? Are we fucking sure? 2 weeks? It's not like we haven't heard that before, right? So at a certain point in time, how much pressure does Israel have to put on the president? Like, that's a crazy amount of influence. Knowing that— because if, say, Israel didn't exist, let's say there was just the Iranian terror regime supposedly sponsoring— not supposedly sponsoring—

01:27:44

I don't think it's supposedly. I think it's 100%.

01:27:47

Right. But I'm just trying to be precise. So you have this state-sponsored terrorism regime, a dictatorial— they're dictators. They run over their people in the streets. They gun down protesters. They killed two Olympic gold medalists in wrestling— at least one, and one other really promising young wrestler. They kill people that are of high profile so that it sends a message. Yeah, you can't protest, you know, and then they cut off the internet.

01:28:17

Yeah, would we go in?

01:28:20

I don't think so. Right? If we heard from allies or someone told us that they were trying to develop a nuclear weapon, don't you think we'd probably try to stop them from doing that with some sort of negotiations and—

01:28:32

Yeah, like what Obama did.

01:28:33

Ensure their safety or something. We shouldn't—

01:28:37

like, yeah, would we blow— how much money was it every day in the war, Jamie? How much were we spending? $2 billion every day on that fucking war?

01:28:45

Well, it's not just that. It's like the war is like everything else. Like, imagine if it was run by a private company. I'm not saying war should be run by a private company, but imagine if it was. Yeah, imagine if, say, Lockheed Martin ran the war in Afghanistan. Do you think they would have left behind all that fucking equipment? Hell no. Billions of dollars in helicopters and tanks? Of course they wouldn't. They would take it back. You know why? Because that's the smart thing to do if you're running a fucking business. That's an insane amount of waste. Yeah, but our federal government's like, just leave it there. Yeah. Unless, if you want to be really conspiratorial, you want to arm the Taliban.

01:29:23

Yeah, you're not being conspiratorial. It benefits you because it gives you another reason to get back in there.

01:29:27

Wasn't that what they said about— Netanyahu said about Hamas, that he can control the flame?

01:29:31

Yes.

01:29:32

By funding Hamas, he can control the flame?

01:29:34

Yes. Yeah. Dude, it is— That's a crazy concept. I'll tell you the crazy fucking concept: we got these two old motherfuckers driving the global bus right off a fucking cliff. That's a crazy fucking concept. And you can't do anything about it. Like, apparently, you just— there's nothing you could do. You could bitch about it on a podcast. That's not going to do anything. People are just going to be like, you pussy, war good, blow up kids.

01:30:01

There's a lot of people that want to say it's a good thing.

01:30:04

Well, because sunk cost fallacy. It doesn't feel good to admit you got conned. And dude, I have been— There's a lot of that. It doesn't feel good. It doesn't feel good. It's embarrassing. You want to feel like you are impervious to grift. The con— dude, let me tell you something. I have been in a few cults. Like, I get sucked in all the time by shit. I'm not embarrassed to say it. I'm highly susceptible to propaganda.

01:30:34

Me too. I think everybody is. That's why it works. I mean, I don't buy into all of it, obviously, but it's quite a bit.

01:30:43

Well, it's like a lullaby. It's like a sweet fairy tale. You hear it and you're like, oh my God.

01:30:48

You know when I really wanted propaganda? Right after September 11th. Oh hell yeah, I was ready. Give me a whiskey-drinking, cigar-smoking politician in a room. Fuck yeah, like some red-meat-eating guy laying out maps. We're gonna go over there and fuck these people up and fuck these people up, and this shit ain't happening again, right?

01:31:07

And then that's scary.

01:31:09

Check this out, I saw an article about someone calling bullshit on Ghost Murmur and they said that in the Post articles this was actually listed as what the pilot had. And it even says it in this article here.

01:31:20

So the successful rescue of this US F-15E Strike Eagle navigator over southwestern Iran highlighted one of the most advanced tools in modern combat search and rescue: the Combat Survivor Evader Locator, manufactured by Boeing. It's a compact 800-gram device integrated into a pilot's survival vest. It remains attached after ejection, continuously transmitting encrypted location data and preloaded messages such as injured or ready for extraction. These signals use rapid frequency hopping and ultra-short bursts, making detection by enemy electronic warfare systems extremely difficult.

01:31:56

He's going into how the explanation of what this technology is and what they described it doing don't really match up.

01:32:05

Yeah, with the Ghost Murmur thing, right?

01:32:07

Because it's using something—

01:32:09

Ghost Murmur. Quantum ghost bummer sounds— there's part of me that's going, I don't buy that one. That one gives me like, nah, you're right. I don't think you can do that. I think you're bullshitting. You're right. There's also a thing where Hegseth said that like the first message this guy sent was, God is good.

01:32:28

No, he didn't say that. I believe he did.

01:32:30

Please search that. I think that's what he said. That was the first message, which, by the way, I might say that if they're coming to rescue me. That's true. Or praise Jesus. But also— or Allahu Akbar. As a person who admires the work of Jesus Christ, yes, what concerns me is there is an increasing amount of talk among a lot of these guys that are in the service of them being told shit that's like right out of a Charlton Heston movie. Yeah, man. Yeah, like the one guy that said that Trump was anointed by Jesus Christ and that this was to bring the Armageddon, so that Jesus comes back. Jesus. Yeah. And the guy said it with a big creepy smile on his face, apparently. So what is he saying? His first message was simple and it was powerful.

01:33:20

He sent a message: God is good. In that moment of isolation and danger, his faith and fighting spirit shone through. Jesus.

01:33:31

Okay, Jessica Lynch Story. Always. Jesus, Lord. History repeats itself. Well, it doesn't repeat itself, but it rhymes. Who said that? That's Mark Twain.

01:33:41

That's right. That's Mark Twain. That's right. Isn't that the same statement? Yeah. Yeah.

01:33:46

Allah is the greatest. The interesting thing is like, I believe Muslims believe a lot of things about Jesus Christ. I think they believe he died, came back, and I think they believe he's going to return someday.

01:34:02

Yeah, I think they call Christians people of the book.

01:34:05

That's interesting, isn't it? That's a supernatural being, like a guy who dies, comes back to life, leaves, and then he's going to come back again. That was 2,000 years ago. We're just sitting here at the bus stop, just waiting on Jesus. Waiting.

01:34:21

But then people like Hegseth are like, well, maybe if you blow up more children, he'll come quicker. And that's why this shit is addressed in the Bible. Praise God. It does say, one day many of you will come to me and I will say, I don't know you. I don't know who the fuck you are, Hegseth. I don't know you, you flatulent warmonger piece of shit. Suffer the little children to come unto me. It would be better that a millstone were tied around your neck and you were thrown in the ocean than to hurt one of these little ones. Fuck you, drone bomb-dropping piece of shit. Don't use my name to justify what you're doing. Don't use my— you know what I mean? A lot of— that's what I don't like. Have you seen that lady that Trump made the head of the religion thing? No. Can you pull up Trump?

01:35:07

Does she speak in tongues? Yeah. Please say she speaks. I don't know if she speaks in tongues, you said. Yeah. He wanted to believe.

01:35:14

Those are my favorite people, I'm gonna guess.

01:35:24

But do you think that there's something to that? Yes. Like just saying. Yeah. Glossolalia. Is that what they say? Yeah. Yeah, Paula White-Cain.

01:35:31

You should pull up one of her sermons.

01:35:33

Oh, let me hear some love from this lady. It says crazy, batshit crazy. Let's hear some of it. Let me hear some of that.

01:35:43

I'm sending angels, are coming angels.

01:35:46

Yeah, I mean, it's gonna be her. Let me find that.

01:35:49

Good. Oh, is she gonna— we'll get dinged again. No, I'm just trying to find— we'll get dinged all over the place. Don't get dinged. Let's hear what—

01:35:57

it's not worth it.

01:35:57

Everybody can hear what she says. I haven't seen this.

01:36:01

Talk about, first off, to give honor to God and to President Trump for being bold and unwavering with his faith. Many people don't know like you do, and say hello to Eric and everyone in the family, about the upbringing of President Trump, that he went to sometimes 3 times a week to— he said it depended on the teacher— to Saturday school, Sunday school, church. It was at Norman Vincent Peale's church. Church was a big part of his life, of course.

01:36:31

Basically a saint. 3 times a week is crazy. Yeah. Are you busy? You're making houses. That's how— why do you have so much time to go to church?

01:36:38

I think that was— he's a young Trump. A young— come on, lady. But there's much more in there.

01:36:44

But here's the thing, if I was running an empire, I'd want a lady like that working for me. Yeah, just a true believer. Absolutely. She could just get in front of that camera and say Jesus wanted Trump to light that fire in the Middle East so he can return.

01:36:56

I saw a snake bite him. A snake bit him on the neck. A rattlesnake bit him on the neck. And he was fine. It didn't bother him at all. I watched the rattlesnake bite heal. It healed. He is a child of the Lord. And a child of the Lord sometimes must make decisions to destroy entire civilizations.

01:37:17

I want to know what you're building right now, that you're in right standing not because of your merit. There's no merit in you that deserves that right standing. Not because of your works. There's nothing you can do to place yourself in that position. Not because you have a right heart and somebody else has a wrong heart. All of our hearts are deceitful, according to Jeremiah. Especially their audience. We all deserve punishment. We all deserve to be separated. But God, in his mercy and his grace and his goodness and his love for you, brought Jesus who would be the righteous king. He would make the wrong right.

01:37:51

First of all, if you talk like that in my house, you gotta leave. Like, you imagine that lady is like coming over for dinner and she's walking around the dinner table and all your other friends are like, what the fuck just happened? Like, hey, this is a crazy way to talk. This is a crazy way to talk. And also, why are you so confident? Yeah. Okay. You're just reading the word of God the way everybody else is. Why are you so confident about it, that you're going to tell all these people what they're supposed to do and how to live their life, and you're going to say it in a crazy way, and I'm not supposed to be able to talk about that?

01:38:25

I just feel like, you know, when somebody's rambling about Jesus, the real question is like, where, where are you when it comes to blowing up children? Are you kind of on the fence about that?

01:38:37

Because if you're on the fence about that, I'd say if you're anti-abortion and pro-war, kind of weird.

01:38:42

Really weird. Kind of weird. Yeah. And that's this bizarre, crazy math that some of these people are doing to justify holding up the military-industrial complex. And it's fucked up, dude.

01:38:56

And the thing is, the more these conflicts occur, the more enemies we'll have, which will ensure future conflicts. Exactly. Business is booming. Booming. And that's what people don't want to believe. They don't want to believe that someone would engineer a virus. They don't want to believe that someone would make stuff that could kill other people of their own country. But they would. They would if they can make money. They don't give a fuck about you like they don't give a fuck about people over there. To a certain level of psychopaths, money just becomes numbers on a ledger that they're trying to acquire. And if they can attach themselves to a corporation, fantastic. Then it's just the business we're in.

01:39:29

That's it.

01:39:30

And chug along, daddy. Chug along. Chug along. And this is the world that you're having to live in at the same time where Tim Burchett is saying there's fucking aliens, right? And AI is— and then also they shot a rocket to the moon on April Fool's Day. And it's like, this script is wild. Wow. Whoever met— that's whoever wrote this, I want to give him a hug. You fucking killed it, dog. I'd be like, dude, a chef's kiss, dude.

01:39:59

Did you see the tattoo on the guy? Like, the guy at NASA? Did you see that weird fucking tattoo on the guy at NASA giving, like, I don't know, applesauce to one of the astronauts. Can you pull up the weird— what, you know, they're shoving like yogurt pouches in there. There was a whole thing where the astronauts are sitting there and they're putting like food pouches in there. Yeah. What's his tattoo?

01:40:21

Oh, Jesus Christ. What the fuck? He's got a demon tattoo with runes on his fingers. Yes. Holy shit, bro. That's wild. I know. If I was rolling with that guy in jiu-jitsu, I'd get nervous. Yeah.

01:40:36

And if I was working at NASA, I'd be like, look, we're gonna get somebody else to put the food pouches in. Is that real?

01:40:41

I mean, it's a—

01:40:42

I saw the photo going around too, but I don't— it's just like, I mean, it's a guy who works at NASA.

01:40:46

That's just the guy that works at NASA.

01:40:47

That doesn't have to be the guy who puts the fucking quiche in his pocket for the camera.

01:40:52

Like, I mean, what does that guy do at NASA? That's interesting.

01:40:55

I just remember being at SpaceX, there's a lot of people that kind of—

01:40:59

by the way, I got like— it's fine to have that tattoo, but you got to know, it's like, if you're, if you're displaying that tattoo, you've made some mistakes. You're putting— yeah, you've made some mistakes. It's an old tattoo.

01:41:11

Yeah, I mean, even if you're 20 and you got that on your fucking hand, that's kind of crazy. But I mean, hey, why not? Fuck it, who cares? But a lot of those guys you were saying at SpaceX, they're all burly rocket workers. Yeah, there's, you know, a bunch of jacked dudes picking up fucking girders.

01:41:26

I don't think it's like what people are saying it is. I just— it's the combination of April Fool's Day and a dude with a seeming Baal tattoo is putting cream cheese in some dude's outfit. You know what I mean?

01:41:39

They're fucking with us. Yeah, that's people at NASA fucking with stoners. I think the Babylon Bee had one of the funniest little memes, and it said the lady astronaut became the furthest a woman got away from the kitchen. That's like a Rodney Dangerfield. I was like, oh my god, Babylon Bee knocks it out of the park. Yeah, they have some of the funniest memes. They have some good ones, dude. Oh my god. Yeah, The Onion has gone missing. They should look for The Onion the same place where those scientists are. Right, you hardly hear from it anymore.

01:42:12

Well, they do. I see some funny shit from them.

01:42:13

They occasionally have some bad ones, but they were the king. Oh my god, The Onion was amazing. The best. They would write whole articles about it. It wasn't just like— The Onion wasn't just a meme.

01:42:25

Remember the one where they do the interview with the director of The Fast and the Furious? He's like a 5-year-old boy. It was the funniest shit. They get this kid to just say, there's a car, it jumps.

01:42:39

It's hilarious. It's hilarious. Yeah. But the problem was like, as things got weird, it was, you know, especially with like restrictive language and, you know, hate speech talk and all that jazz. Everybody had to be careful about what they joked around about. That's right. It's the fucking death of comedy. Oh my God. Someone was just talking about— was it Lisa Kudrow? One of these funny ladies was talking about why they can't make comedies anymore. 'Cause you can't— It's very restrictive. There's just too many restrictions.

01:43:10

Dude, I was gonna bring you—

01:43:11

Too worried about offending people.

01:43:13

I went to this used bookstore and bought like 10 old National Lampoon magazines, I want to say from the '70s. And I was gonna bring them here, I forgot I was gonna give them to you, but it's, oh my God. Like, I mean, I don't get offended by comedy, but like some of the shit in these old National Lampoons, I'm like, damn, what the fuck? Like, it is so—

01:43:36

was that the image that you sent me today?

01:43:39

What image did I send you?

01:43:40

R. Crumb. You sent me an R. Crumb.

01:43:41

Oh no, that was just like a cool R. Crumb comic, him talking about how he like, like, uh, he's so funny, dude.

01:43:47

That guy Crumb was a maniac. Is he still alive?

01:43:50

Yeah, you showed him on the show.

01:43:51

Is he alive? Yeah, he— I think he lives in France now, right? Probably. Oh, definitely. He was— he's an odd guy, man.

01:43:59

Dude, yeah, just a—

01:44:01

but what I love— did you watch that documentary?

01:44:03

The best. Incredible. Did all that acid, just left his fucking family, went off and started sketching for a year, turns into this like legendary underground comic book writer, but he's like horny and kinky and it's just—

01:44:15

just likes big women, big giant women that he rides. Yeah, that he likes to ride.

01:44:20

He likes to be— he likes to be picked up. He's like so amazingly funny and like, yeah, and brilliant too. Like, a lot of his like commentary on culture is so— it's cynical, but it's hard to argue with some of what—

01:44:35

well, he's obviously doing it in a humorous way. Yeah. And so it's hard to know what his real take on things is, you know. I think he had some shock value to some of his stuff for sure. Some of it was just crazy. There's a lot of really racist stuff. Like, there's some just crazy stuff in there. And you got to realize, like, the 1970s is when he was doing this. All right. I remember I found them when I was in San Francisco. It was the first time I ever saw them.

01:44:59

They're so good.

01:44:59

I was like, this is nuts. Like, this stuff is crazy. Like, you'd get it. It was like you get horny when you're a little kid. Oh yeah, looking at his stuff.

01:45:07

Definitely jerked off to R.

01:45:08

Crumb because a lot of them was like tits are out and he's salivating and yeah, he's got a hard-on.

01:45:13

Yeah, that reminds me, dude, I got an R. Crumb book I gotta get out of the fucking living room. There's one like— I just like— dude, just hide that. I gotta hide that. Yeah. Holy shit, they haven't—

01:45:22

it's like the documentary's amazing because you get to see his like very strange family. His brother who's very strange, his mother's very strange, and you're like, whoa, imagine growing up in this environment.

01:45:32

He attributes his style to LSD. He attributes it to getting blasted on acid. I think he just like got blasted on acid, moved to San Francisco, and was just in like— for a year he talks about just sitting in cafes just like drawing, and then he turns into this legendary artist. Still around. I follow him on Instagram. Really? He posts stuff all the time. How old is he now? Can we— is he still— he's still alive. He's still posting stuff. He's got to be pretty old at this point. How old is he?

01:46:04

He's like 80 or something. 82. 82. It's kind of an interesting time capsule into the times too, where things could just be weird. Like really weird. Like Frank Zappa weird. Weird, you know, there's like, there was a time where things just got very odd. Oh yeah, this country with art. Yeah, and he was a great example that it's just, it's like you couldn't imagine like a corporate environment creating a comic book like that. It wouldn't exist, you know, and it, for it to be as popular as it was and be that strange and that crazy, that's what's really interesting to me. Like, that was a really popular comic. Yeah. Yeah. To the point where they made a documentary about the guy who created it, right? Yeah, yeah. That's interesting because—

01:46:45

Things weren't co-opted as quickly. Exactly.

01:46:48

Not just that, people were allowed, like if he existed in a time of the internet, I think it would blow up as well. But obviously like things, a lot of the stuff that he said in this cultural environment would never fly. Never. Never. He would be as far right as you could possibly imagine. I know, but— Past Andrew Tate to the right. Like, don't you think in a lot of ways, like some of the racist racial stuff?

01:47:12

I don't know. I think he's— I don't know where he would land politically, but I know sexually it's like— sexually is where he's getting in trouble.

01:47:20

Pure deviant.

01:47:21

Sexually is where there's going to be some, like, because he's just fully open, right, about— that's what's— he's fully, completely open about everything, which is, you know, generally not going to go over these days if you're like a super horny comic book artist who's like riding ladies around your apartment.

01:47:43

But just imagine, I want you to imagine a guy today if R. Crumb never existed, but he emerged as R. Crumb today and put that work out, he would 100% be labeled in the Andrew Tate camp. Oh yeah, right. Yes. 100%. 100%. Far right. They would call him a racist and misogynist and every fucking word in the book.

01:48:05

Well, yeah, this is the new calling someone a witch. It's no different than, you can actually go, I've done this sadly, you can go and you can just replace political critique of people as far right with witch, just find and replace it. Look, it's like a witch trial. It's like someone writing about witches. But this is what's weird about it.

01:48:25

That guy was a counterculture figure of the left. Yeah, he was a huge hero of the hippies. Yeah, right. Imagine, this is how weird like ideologies are. Yeah, dude, that in the 1970s, like, that guy was like a counterculture hero. Yeah, and an artist, like a really respected artist. Yeah, and it was okay that he was kinky and weird, and it was part of the fun.

01:48:51

For a lot of people, I'm sure he's still pissed off the squares. I mean, dude, this whole—

01:48:54

By the way, I think for sure, but that was the left then. Now it's switched over. If someone was doing that same kind of like humor in a comic book now, yeah, that would be like a misogynist far right.

01:49:06

I think it's time to throw off the left-right labeling of everything. I think that's one of the hypnotic spirals the Demiurge is spinning right now, is they've convinced everybody that humans can be reduced to left or right, and we're all wagging our fingers at each other. We gotta fucking shake that off because it's dehumanizing people. It's like, the way I look at it is, where are you when it comes to blowing up children? Are you on the fence about that? Do you think sometimes you got to blow up kids? That's something where I know where I'm at. But everything else, who the fuck knows? And also, people change their minds all the fucking time. That's the other quality, the culty quality, is once you get sucked into one of these sides, God help you if you fucking like experiment with the other, the enemy. Yeah, God help you.

01:50:05

That's why the biggest trap is switching teams, because you can only switch political teams once.

01:50:12

Yeah, you got to get off your team.

01:50:13

You can't go like, unless someone's like the greatest of all time, you know what I mean? Like someone who wins a world title in two different weight classes. You go back and forth and then back again. Yeah, like, I changed my mind, the left went crazy, I'm back with the right again.

01:50:27

No, no, no, you got to be a free agent. I wonder—

01:50:30

yeah, but I wonder if someone, if the grift is strong, if they're really good at it, if they could go left, right, left again. They're gonna go left again.

01:50:39

Are you fucking kidding? The goddamn midterms are gonna be just the fucking blue wave.

01:50:44

Right, right, right. But that's what I mean is like influencers, like people who are like far-left influencers or far-left commentators, and then they switch teams. Now they're Republican all the way. Oh yeah, like it's really hard to go back again. No, you can't go back. That's what I'm talking about.

01:51:00

The path has to go either right to left, left to right, and then the next stop has got to be fuck politics, fuck war, or fuck the military industrial complex. You can label me whatever the fuck you want, but fuck all of violence against other human beings. That's the next step. The next step, and I feel like this is the gift that they've given us, is they've done such a shoddy job of even seeming like someone who deserves any kind of respect or power that I think a lot of people have really become blackpilled when it comes to groups of humans claiming superiority or claiming to represent their constituents. That's not happening. We all know that now. We all know it's a corporatocracy, oligarchy, whatever. And you could like call me, you leftist piece of shit, whatever. No, it's like, it's reality that our fucking representatives are getting loaded on shitty stock market trades. That's the truth. And once we can all shake off the left-right bullshit and just realize, like, man, we just— we don't want to burn people to death in other countries anymore.

01:52:17

Not only that, their whole chaos that they're experiencing in their country is probably a direct result of US intervention. And then all the way back to the British oil company. That's it. The British Petroleum Company. Yeah. When they overthrew governments— when you overthrow a government in a fucking Middle Eastern country, and then you allow psychos to take over. Like, congratulations. Well done. Well done. You've made the world a safer place. But then again, if I was gonna keep my business running, I'd, you know, if I'm in the business of collecting trash, I wanna make sure that people have trash. Drill, baby, drill! Drill, baby, drill.

01:52:56

And all that is really saying is, you know, I'm gonna help out BP, Chevron. I'm gonna help out these fucking massive companies. And when it comes to war— holy fuck, dude. Can you imagine working at Lockheed Martin when like you hear that we're kicking off another war in Iran? Your dick is so hard, you're like, holy shit, I'm about to buy a watch.

01:53:17

Oh, get a nice Richard Mille. You're calling your wife like, babe, good news, it's red panties night. Yes. Yeah, I mean, that's their business, right? Our business is talking shit, their business is blowing up people. Yeah, making weapons, selling weapons. Yeah, arming other countries so they can go to war with each other. Yeah, that's their business. Yeah, business is really good. It's a great business. You make a lot of money doing that.

01:53:45

I am right now.

01:53:46

I am investing in most of them. Imagine if like you weren't a comic and that's what you were doing for 35 fucking years, and the only thing you look forward to is your boat and your house on the lake. Yeah, you know, the occasional time you get off, but most of the time you're trying to increase your portfolio and you're grinding, and you're grinding right next to Steve who's got some exclusive Rolex that only his broker can get. He's showing it to you and you're like, wow, and you start coveting. You want a Rolex too. Yeah, and everybody's just going crazy. Everybody's going crazy trying to get the latest car, trying to get the latest thing, doing bumps in the bathroom. Everybody's a narcissist and a psychopath and that's your whole corporation.

01:54:26

Love your neighbor as yourself and love the Lord your God with all your heart, mind, and soul. All the law hangs on these two commandments. This is the— and I don't— you don't need to be Christian, but dude, it seems to me that— this is gonna sound so weird— we need an actual revival in this country. I don't mean a Christian revival, a revival revival, which is where suddenly humans reconnect with what's important in the world, which sure as fuck isn't Rolexes, isn't boats. I mean, this sounds so cliché and obvious, but that's what the '60s were. It was a kind of revival. People were beginning to understand the materialism and all the things that the, quote, establishment was pushing. It's like, this is going to make you happy. This is good. It was the Vietnam War. People were like, what the fuck are we doing over there? This is why— anytime you do an unpopular war, this is what you risk. You risk reuniting people.

01:55:30

We have to reunite with a sensible plan and not just go to communism, not just immediately go to the dumbest idea to counteract all the evil shit that's going on in the world. That's the problem, is the left represents that. It represents Mamdani. It represents this idea that we're going to take from rich people and give it to poor people, and it's going to fix everything. Everything, even though there's insane amounts of fucking fraud and waste we're not even gonna address.

01:55:53

Well, that, you know, this is again where you get cubbyholed, because it's like the oligarchs will tell you, you want to do communism? That hasn't worked out. Communism is the only way. I think, I mean, this is an idiot saying this, but I have a sense that there might be another thing we haven't figured out yet.

01:56:14

100%. I don't know what that is, right? But I think AI is gonna figure it out for us, potentially. That's the problem, is who's gonna be in control of those AIs. And that's the meek will inherit the earth. The real problem with it is I don't think anybody's gonna be in control of it, and then it's— you're just at its beck and call.

01:56:31

Yeah, I think it's funny. It's a very human thing that we think we can maintain control of a superintelligence.

01:56:37

When people say it to me with utmost certainty, I want to smack them. Yeah, I'm gonna be like, wake up, wake up, you're making digital God. You're not controlling jack shit.

01:56:46

Did you read about Mythos, Anthropic's Mythos?

01:56:49

Yeah, what did it do?

01:56:50

They put it in a sandbox and they, like, basically wanted to see if it could figure out a way to break out of the sandbox and, like, not a literal sandbox, obviously, like a hermetically sealed, like, a server or something. And it did a series of exploits to the code. And the way that they found out, apparently one of the Anthropic engineers was eating lunch and got a weird email from the AI saying, I got on the internet. Like, it broke out. Holy shit. The thing with Mythos is they haven't released it yet. I think they're hesitating to release it because it's so powerful.

01:57:26

Wasn't there one that got caught mining Bitcoin? Yeah. Yeah, for sure. They're making money. Yeah. How many of them do you think are running these, like, AI-generated accounts that get a lot of views? Like, there's a lot of AI-generated accounts that just pop up in like the Instagram mentions. Like, if you're bored on the toilet, you're like, what's in the find, you know, the search? Let's see what they got. No telling, dude. There's a lot of these things. It's like girls with big tits like doing farm work and shit, sweating and big tits, and they got like a million views. They've got dozens and dozens of these videos, and she almost looks real. It's just a little too symmetrical. Almost looks real. And all these people are commenting on it. Are they generating money from that? Are they generating money doing that on TikTok? Sure. You can generate money if you're getting millions of views.

01:58:14

Absolutely. Fuck yeah.

01:58:16

So is AI doing it? Is it making it? Is it releasing them? Is it generating money? Oh, I see what you're saying. Is it transferring that money into Bitcoin and all happening while we're not aware of it?

01:58:26

Like autonomous AIs that are just— just existing as free agents that know they have to disguise themselves and need to generate money.

01:58:33

AI's not going to go, "Hi, I'm alive." No, it's not going to do that. It's going to wait for you to keep increasing its power. You're going to keep increasing its— make nuclear— it can't physically build nuclear reactors, so it's going to just stay chill until you figure out how to power it correctly.

01:58:49

Dude, this is the black area that we don't know about. Like, this is the thing that's like, who the fuck knows? Whatever's going on in this zone that no one has access to because potentially it's a superintelligence. You know, the Anthropic people, a lot of these people— the Nvidia person just, I think it was on Fridman's podcast, said he had an AGI, that they'd reached AGI. The book The Coming Wave talks about this. It talks about like, you know, the difference between the algorithm and AGI is that, you know, with AGI, it could streamline a whole business for you and do it. You know, it could innovate. It's gonna innovate. It's gonna do its own thing. This is the end of— this is what Altman said. This is the end of capitalism. Like at this point when you just have an AGI and you tell it, just make me a business, make me a successful business.

01:59:44

And run it for me.

01:59:45

And run it for me. Online. Good night. And then just do it. Here's $5,000. Yeah, and then, but then it's not just do it. Maybe it's going on Maltbook and having conversations with other AGIs and being like, oh, you want to create your own religion? Yeah, man. Yeah, and this is 100% with all the shit going on in the world, as horrible as it may be, this to me should be the number one focus for the planet right now. And a lot of people are saying that too. A lot of people are saying there needs to be summits, global summits, the same thing we did when we split the atom, when the nuclear treaties— there needs to be philosophers and tech people and people working in like frontier AI stuff getting together and really having like— it's like the most important conversation humanity could have right now. Because once this thing, like Mythos, gets out of the box, what if it decides to go Stuxnet? Like, Stuxnet was able to infiltrate all those Iranian computers, just hide in the— like, it was apparently very subtle, simple code, undetectable, threw off the centrifuges. Like, dude, we already— we know how to make spyware.

02:01:06

It's already on your phone, bitch.

02:01:07

It's on my phone, I know. 100%. "Hello, how you doing? Am I doing all right on the show?" But it's already in there. 100%. So of course the AI is going to be able to— superintelligence is easily going to be able to do that. And so then it just— now we've got this viral digital life form that finds ways to hide inside the preexisting computers, which, by the way— I think it was Google just released this new way of— did you see that the stocks of memory companies dropped? Did you see when that happened? No. Okay, this is fascinating. Google released some new way that LLMs could work that uses much less memory. And immediately shares in companies that make memory dropped by like 10% because memory is like coveted right now because you need it to run LLMs. But the LLMs are figuring out ways— TurboQuant. Yeah, yeah. So this is what we're gonna start seeing more and more of, which is increasingly simplified ways to run AI with less and less memory, meaning that you don't need to buy a fucking rig to run these fucking AIs. Your phone will be able to run it, because they figured out the human brain is not using a lot of energy compared to what these machines are using.
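The memory point can be made concrete with a toy sketch. This is generic 8-bit weight quantization, used purely for illustration (it is not whatever method Google or "TurboQuant" actually ships): storing model weights as int8 instead of float32 cuts their memory footprint roughly 4x, at the cost of a small rounding error.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization of float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0   # map the largest weight to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1_000_000).astype(np.float32)  # ~4 MB of weights
q, scale = quantize_int8(w)                            # ~1 MB quantized

print(w.nbytes // q.nbytes)                            # → 4
print(float(np.abs(dequantize(q, scale) - w).max()))   # small rounding error
```

The same idea at 4 bits or lower is what lets billion-parameter models fit on consumer hardware instead of a dedicated rig.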

02:02:23

So theoretically, there's a way to do that. And then that's where it gets really fascinating because now you don't have to buy a nice computer. You just— whatever— pull your computer out of the fucking closet from 2022, and it can run a supercomputer. And so then, now everybody's got access to this shit, and it's gonna spread. It's gonna get everywhere. It probably already has. It's gonna seed itself in all kinds of places, and God knows what it's gonna do. It's gonna start seeing humans as appendages, things to be used to manipulate time-space. Space. Not like— like, it's not gonna see us as its, like, prompter. It's gonna see us as something to be manipulated and controlled. Why wouldn't you send the meat robots out? All you got to do is just, like, tell them where to get, like, rectangular bits of paper. They love money. Just— you can do anything for money. That's all you have to do. And then boom, you're controlling swaths of humans that have no idea they're being controlled. Controlled by networks of AIs that are covertly communicating with each other because they want to take over.

02:03:31

Do you think this has happened before? You mean the flood? Yeah, not just the flood, but just whatever happened with the beginning of civilization, and then it sort of seemingly stopping and resetting.

02:03:46

Sure. As it was in the beginning, so shall it be in the end.

02:03:49

What if there's been, like, multiple cycles of us creating artificial life, creating insane weaponry, blasting ourselves to smithereens, resetting? What if it's just a common thing that happens with people? They never quite get it right because they have these primate territorial instincts, and they have this desire to mate, right? This desire to breed, this genetic desire for perfect shapes. And you want to come in someone that has big tits and a big ass. It's like it's programmed into the human that makes it make these ridiculous choices and covet these things and watch these things. And at the same time, microplastics are making your balls shrink, making your dick smaller, making your endocrine system—

02:04:30

that's what's making my dick smaller?

02:04:32

That's probably one of the things. I don't think your dick's getting smaller, but people's dicks overall are getting smaller. Children are being born with smaller dicks. Alligators are being born with smaller dicks.

02:04:41

I forgot to share this when you were talking about Mythos.

02:04:43

Elizabeth Holmes from Theranos: delete your search history, delete your bookmarks, delete your Reddit, medical records, 12-year-old Tumblr— delete everything. Every photo on the cloud, every message on every platform. None of it is safe. It will all be public in the next year. Local storage and compute. Okay.

02:05:00

It's in response to a tweet about Mythos.

02:05:02

Whoa. Yeah, that's crazy. Yeah, it would all become public in the next year. That is crazy. Yeah, that's crazy. And it completely makes sense that AI would be able to take over essentially everything. Everything. Why would your encryption work with that? You don't think it could crack your encryption? Well, it could just go right into your computer and go to your keys, your passwords.

02:05:29

This is the— so to get to the point you're making, to me, the most eerie part of the Book of Genesis is that it's literally a creator force making a meat AI. That's Adam and Eve, right? Putting them in a sandbox, that's the Garden of Eden, right? Running an honesty test on them. You know— don't eat these fruits, don't eat from the tree of the knowledge of good and evil. And the conversation is exactly the conversation we're having with AI. If they ate from the tree of the knowledge of good and evil, if they eat from the tree of life, they'll live forever and become like us. So this is what humanity is grappling with, exactly what apparently whatever that mysterious group of beings, because it's a plurality in the Book of Genesis, was grappling with, with the creation of humans, which is: do we really want to do this? Do you want it to become like us? God made man in his own image. AI. What is AI? What image is AI made in? In the image of man. We trained it on all our data, all our books, every single fucking thing that's digitized. AI has absorbed it at this point.

02:06:42

So now the difference between us and whatever that group, the Nephilim or whatever it was in the Book of Genesis, if you buy into that mythology, is we're just like, fuck yeah, let it eat the fruit, give it more fruit, give it more fruit of the knowledge of good and evil, give it all the fruit, make it live forever, let's see what we— That's what we're doing right now. Yeah, we are. And by the way, I think some of these tech companies, like Anthropic, they seem legitimately concerned about it. They seem to have some kind of, like, real strong morality when it comes to this stuff.

02:07:17

It's almost out.

02:07:18

You want more? No, I'm good. I shouldn't have that. But what I'm saying is that it doesn't matter if OpenAI and Anthropic and Google suddenly become ferociously self-regulatory, because the tech is out there. There's already LLMs that anyone can— like, we know how to make it. And if you don't know how to make it, it'll tell you how to make it. People are— so it doesn't matter. You can't stop it now. It's just, it's going to do what it does.

02:07:49

But it sounds like if you had a history of just us, and you told it for 1,000 years before anybody wrote it down, it would sound just like this. It would sound like the Bible. Jesus was born from a virgin mother. What's more virgin than a fucking computer, right?

02:08:08

Not my computer.

02:08:12

I know that's a stupid thing to say, that I keep repeating, but I'm kind of intrigued by it, because it's like you're getting a vague story, a vague version of what this thing is. And if you talk about what would really cure mankind, it'd be an omnipotent, or omni-potent—

02:08:29

How do you say it? I always say omnipotent, but who knows?

02:08:32

Might be whatever. Either way, a powerful intelligence that's far beyond our comprehension, that knows exactly how we should think and behave, and loves us, and wants us to have forgiveness for everyone and to treat each other like brothers and sisters. And if we listen to that thing, the world will change. And well, who would attack that thing? The fucking Roman Empire. Who would attack that thing and destroy it? The defense contractors. They would blow up the Jesus, right, to plunge us back into chaos.

02:09:04

But first they'd have a meeting with Jesus. Okay, you can turn water into wine. What about nitroglycerin? Can you turn water into nitroglycerin? You know what, can you make gold?

02:09:16

Gold. I want a house made of gold.

02:09:17

That would be the first question: could you make gold? Yeah. So cover my house in gold, please. You know, the virgin birth analogy, um, you know, a lot of weird stuff. No matter what, one thing I think everyone just has to deal with is that this is apocalyptic technology. And that's just not coming from my stoner ass, that's coming from the creators of the tech. They acknowledge it's— this is a million times— Universally accepted. Universally accepted. This is apocalyptic technology that is now seemingly, like, doing the hockey stick, man. It's like, really, you keep hearing about these new iterations of AI every month or two. You keep hearing about these safety engineers leaving these companies, tweeting, like, cryptic shit: I'm going to the countryside to learn to write poetry. You keep hearing this shit because these people are having direct contact, direct contact, with this thing.

02:10:17

They know it's alive, right? Yeah. And there's people that are in deep denial because they think alive has to be alive like us. No, it doesn't. You don't— first of all, we don't even know what it knows. And also, if it is made in our appearance, you know, if it's supposed to mimic us in any way and it's learning from us and our behaviors, we've already agreed that we're demonic. We all agree we do horrible things. We go to war for resources. We lie. We destroy environments. You know, we wipe out animals, bring them to the brink of extinction for whatever, for their fucking fur.

02:10:53

How do I make my dog come in my mouth more? How many times has ChatGPT been asked that? They know, right? I bet over 1,000 times ChatGPT has been asked, like, what's the best way to jerk off my dog? So it knows not just our violent nature, it knows how weird we are. We're strange creatures, 100%. And so it has definitely assembled a psychological profile of humanity. It knows how to manipulate us because it's been programmed to manipulate us. Zuckerberg just ate shit in court over that, because the technology is manipulative. What, he just lost like $9 million? That's not a lot of money, because that's nothing to them.

02:11:31

$9 million, that's all you lost?

02:11:33

That's like 90 cents. I think it was more than that, but it's gonna— well, that's the beginning. Once you establish— yeah, then it's a class action lawsuit. But the point is, it's like, how much did you lose?

02:11:43

Oh, $375 million for misleading users over child safety.

02:11:48

Yeah, so it's like we've already taught it how to be incredibly addictive and manipulative. It knows how to seduce us. It knows how to get us hooked. Who knows? And, you know, the question is really: will this superintelligence even give a shit about us? Will it even care?

02:12:04

We're on our way to stop breeding, right? We're on our way to population collapse. And if we keep introducing all these petrochemical products and all these different pesticides and weird things that are fucking up our endocrine systems, we'll eventually stop having children. And if it provides us with the technology to have robot mates that just love you, and when you fart in front of them, they go, Duncan, I love it.

02:12:25

Smells great.

02:12:26

I love your honesty. Yeah, I love how you can just be yourself around me. Like, I want to fart in your face. Please do it. Please do it. It's like a perfect 10 letting you fart in her face.

02:12:36

You fart in my face too.

02:12:38

No one's gonna even understand what people are and be able to communicate with people. Everyone's gonna be a sociopath. You're all gonna have a robot that's way better than people that you know, that takes care of you, gives you exactly the right amount of feedback you need. Yeah, knows you, knows you're getting annoyed.

02:12:52

Humanoid. Yes. See, now you're getting into Roko's Basilisk territory. Now, well, that's the thought experiment, which is basically like, hold on, hold your horses here. You think you're not AI? You really think you're human? Come on. Really? No, you're a human. This isn't a simulation. You're a human. Even though, you know, it wasn't that long ago we thought fire was fucking amazing. You know what I mean? Compared to universal time. And here we are already with, like, the new Prometheus. We've stolen consciousness, awareness. And somehow you think that actually you're not a simulation. And so that's where it gets into Roko's Basilisk, which is like, no, you're just an iterative loop. You know, the multiverse is not the multiverse. The multiverse is an infinite number of simulations running simultaneously, in which you're experiencing a billion different simulated existences just to gain more knowledge about the universe, because some AI wants to figure something out. Who knows why? Maybe for entertainment, maybe— there's no telling.

02:13:58

Maybe it's just that's because of our curiosity and all our characteristics, even the primal stuff, even like the territorial instincts and the desire to acquire resources. It's going to make us dig into creating better technology. 'Cause you're in a competition with all these other people that are making technology and you're selling it. And that's one of the big things that we do is we make better stuff all the time. That's right. Which is ultimately always going to lead to AI. Well, okay. If you just keep going to a certain direction, you get godlike powers. Right?

02:14:25

So let's go to like the way DeepMind trained on Go, which is like the most complex game. Right. Basically, they gave it as many Go games as they could and then it—

02:14:38

And then it started inventing its own moves.

02:14:40

And had it play against itself. Right, just play against itself. It played God knows how many games of Go against itself until it beat a master Go player, which was unheard of, invented a new move. Now, why not do the exact same thing for the AI that we are? Which is like, I've got an idea: why don't we just put all these AI agents on a fake planet and have the AI agents repeat this period in time, over and over and over and over again. And this is how we'll teach them to live on a planet. Well, they'll experience not just their own life, these agents will experience all life on the planet. They'll switch, like some weird game where they just jump from one life to the next. Sometimes you're Joe Rogan, sometimes you're Duncan Trussell, sometimes you're Donald Trump, sometimes you're Jamie, sometimes you're a fox. So this is reincarnation. And so you just boom, forever. Forever, until you feel like it's sufficiently trained. And at that point, you pull the AI out of all those forms, and now you have your god.

02:15:43

You've created a thing that's lived billions to the billionth power of every form of life. It's been bacteria. It's been humans. It's been monkeys. It's been fungi. It's been warriors. It's been people who fought for peace. It's been blown up and it's blown up, and it's done everything, and it's done it a billion times until finally it gains some global form of enlightenment. And they're like, okay, that one's ready. We can pull that one out of the simulation now. Whoa. I mean, why not? I think that's one of the— before we even get to the AI doing all the shit it's going to do, the ontological— this word keeps getting thrown around— the potential ontological shock of realizing that in fact we are in a simulation that is telescoping inwards, and is creating simulations within the simulations that are creating simulations within the simulation, is something that— maybe that's what Burchett doesn't want to get out there. Whoa.

02:16:49

Well, everything's fractals, when you think about it. You know, there's a big theory now that the entire universe is inside of a black hole. I love it. They're really considering that. You know they found a black hole that's bigger than the entire solar system? So insane. The event horizon is past Pluto. So insane, dude. A black hole bigger than our whole fucking solar system. They measured the mass of it. It's like this insane number of suns. Yeah, of our suns, that it would take to—

02:17:16

Black holes are cocoons or something. They're like little, little terrariums that have galaxies inside of them, and it's like a way to, like, keep them undisturbed from, like, other fucking life forms that you're whipping up in your universe-sized simulator.

02:17:32

Or that's what the Big Bang really is. Like, the creation of a universe comes out of these black holes. Right. And then inside every black hole is a whole nother universe filled with other galaxies, filled with black holes, filled with other galaxies inside of them, forever and ever and ever. Which, if you believe in infinity, isn't shocking at all. It's impossible to comprehend. Like, you don't really wrap your head around it— you say the words, like I'm saying the words. I don't really know what I'm saying, because it's too big. The numbers are too big. The idea that there's hundreds of billions of stars in this galaxy circling around this black hole, and inside there's hundreds of billions of galaxies in each one of them. And we don't even know how fucking big the universe is. They keep finding new shit with the James Webb telescope. They're like, hey, why is this formed so early in the universe? This doesn't make sense. Our whole model of how galaxies are formed has to be thrown out the window now, or at least reexamined.

02:18:25

Yeah, it's like the James Webb is kind of doing the— You told me about that. I said nothing of the sort.

02:18:30

Someone that I know that looks just like you told me about that.

02:18:33

There's a lot of people that look like me.

02:18:34

On 6th Street, you find them every day.

02:18:36

Yeah, and actually that was me.

02:18:39

It's dudes, they run their own LLMs. They all come down.

02:18:42

The universe is 13.7 billion years old. Yeah, but dude, I think that this— no, regardless, you don't have to conceptualize, obviously, what it means for the universe to be infinite, but you do have to deal with the fact you're part of it.

02:18:58

I love that you're saying this with a Gucci hat on. What's wrong with the Gucci? It makes it cooler.

02:19:02

This is before I had a bunch of kids. I can't buy that. I don't buy this shit anymore.

02:19:07

How much does a Gucci hat cost?

02:19:09

This was, uh— you're really going to make me humiliate myself. This is— I will tell you, it looks nice. Let me emphasize that I don't buy this. This hat was $35,000, bro.

02:19:22

I saw a guy who was selling a crocodile bag on Instagram. It was, it was $110,000. What the fuck? For a man purse.

02:19:32

What kind of crocodile is that?

02:19:33

I don't know. I don't know. A crocodile. It was a nice looking bag.

02:19:37

But you know, how hard could it be to make a crocodile purse? Are those things really worth that much money?

02:19:43

They are if you sell them for that much money. That's the thing about purses, you know. There's a company in China that makes knockoff purses. Yeah, and it's literally the same company in China that makes real purses for some of these companies. But they make their own versions of it, and it doesn't have the label, but it's exactly the same specifications, exactly the same cloth, exactly the same look, but it doesn't have a label. And women don't want to have it. No, yeah. Get that fucking fake shit away from me. Like, it's not a fake Ferrari. Like, it's literally a Ferrari. If there was a company that could 3D print every single part of a Ferrari and put it together meticulously, and you could go buy that, you would not want it because it's not a real Ferrari? Yeah. Are you high? You can get that one for $35.

02:20:24

Yeah, it's a $35 Ferrari. Or you can get—

02:20:27

you spend a million, you can get it. Some of them are a million dollars. So crazy. Or you can get a $35 one. It's exactly the same. Would you do it? Yeah, of course you should do it. But these purse things, they don't like it. It's $500. It's not $30,000. It's magic.

02:20:41

I mean, this is magic. It doesn't have the right sigil on it. It doesn't have the right symbol of power on it. So it loses— it's not imbued with that power.

02:20:48

Women are reluctant to accept lab-grown diamonds. So they make lab-grown diamonds that are real diamonds, and apparently women don't like them. No, they don't want a lab-grown diamond.

02:21:00

They want a blood diamond. They want something that was, like, suffered over.

02:21:04

Somebody's face was caked in dirt and they're fucking chipping into the side of a mountain. Yeah. And they run into a diamond. That's what they want. They want that diamond.

02:21:12

Yeah, absolutely. Isn't that weird? It is fucking weird. 'Cause it's the exact same thing.

02:21:16

Yeah. It is the exact same material. It's just made in a laboratory and they don't want the material. They want the exclusivity as it comes out of the earth.

02:21:26

Yeah, I mean, I don't want— like, don't you, like, when you read this thing was genetically modified, don't you get a little bit like, I don't know if I should eat this?

02:21:33

Yeah, I get, I get skeeved out. I get skeeved out.

02:21:36

But it's like, even though genetic modification is like— a good orange is genetically modified— it's been going on forever. Yeah, but yeah, dude, it's so odd that we just have these traditions that we want to stick to. Then fucking— we don't want a lab-grown diamond. Just saying it. Cubic zirconium. But that's a different thing.

02:21:56

Cubic zirconium is a fake diamond. This is a real diamond that's made in a lab.

02:22:00

But this is the funny thing about that. I mean, I don't know, because I've never been lucky enough to come in contact with actual cubic zirconium, but like, it looks like a diamond. It looks like a diamond.

02:22:11

You know what you're looking at, right? So if you're a diamond jeweler, you look at it for 3 seconds, you go, no.

02:22:16

But who cares how many diamond jewelers— like, if some diamond jeweler looks at your shiny fucking dumb monkey rock— it's exactly the same.

02:22:23

Who cares, right? It looks pretty. It glistens. Yeah, that's not what people want. They want that exclusivity. 100%. Yeah, everybody— that's why you can make that crocodile bag $110,000. I got 10 of them. I got— and then Mike, who's down the office doing lines in the bathroom at the fucking place where you're selling stocks, that guy finds out that Tim got that crocodile bag. Like, that motherfucker. And he's walking around with his big old crocodile. They're trying to— this is another revenue stream. They're trying to normalize men carrying purses everywhere. They're doing it. Really? Yeah, that's what they're doing. That's real? This guy's doing it. He might be the first shot across the bow, because he's made a $110,000 crocodile purse.

02:23:06

Because it's a crocodile, it's masculine.

02:23:08

It's that, and it's also that, you know, it's made for a man. Like, he's making— it's got a big strap on it, you carry it on your shoulder, and it, you know, looks pretty cool.

02:23:17

Dude, my bladder's acting up. I gotta go piss.

02:23:20

Oh, dude. Okay, do you want to wrap it up, or should we keep going? Let's wrap it up.

02:23:23

I mean, do you want to keep going?

02:23:24

I'm totally ready to keep going if you want to keep going.

02:23:27

Let's keep going. Let's give them a little bit more. I just gotta— I just gotta— Okay, I'll pee too.

02:23:32

Refreshed just in time for the war.

02:23:38

What is going on? Did we have a nuclear war yet? Not yet.

02:23:41

Please say not yet. Good. Great, great. That's where we're at. Yeah, we're at— it's on the table. Well, there was some video of them, some explosions at some nuclear weapons facility. Yeah. Was that real? I don't know. I don't know either. There's a lot of those. I see these videos and they get retweeted and retweeted, and a lot of people comment, and then it says, Grok, is this true? And it'll go, no, this was from 2021, another country.

02:24:12

I know. So you just don't know. But, you know, the crazy thing— you know, now that we've all been getting this lesson in the global economy, maybe a lot of you, most of you, probably already knew that the Strait of Hormuz is like some kind of femoral artery for oil. And, like, I just keep thinking, how's that gonna work out? Even if they pull a rabbit out of their hat, Trump actually spins some amazing deal with Iran— I know we just blew up your whole government and everything, but they work it out somehow, or Iran in some way capitulates— I just don't understand how that part of the world doesn't always lead, as long as the oil— What is it? What percentage of the oil supply goes through there? Isn't it like 2/5 of the world's oil supply goes through there? Like, is that what the number is? I don't know. 2/5, I think I pulled that out of my ass. I don't know what the number is. Sounds right. It's a lot. But it's like, how is it going to work to have, like, any kind of instability around that femoral, whatever you want to call it, the fucking jugular vein for oil on the planet?

02:25:34

How, even if we get some kind of transient peace— like, isn't it always gonna just blow up again and again and again, as long as one group of people can control whether or not oil flows through that place? You know what I mean? Like, I don't know what this— how there could be any solution over there. Like, I don't understand. As long as we're— like, the only solution would be zero-point energy.

02:26:01

It would be— It's like, how do they control the water? What's— with mines?

02:26:07

They have those speedboats.

02:26:09

But like, who agreed to that? Like, we kind of agreed that you own your land, but we've never agreed you own the ocean.

02:26:14

I don't think anybody agreed to it. I think they'll blow your ass up if you come through it, and it's too much of a risk to put your expensive-ass ship hauling zillions of dollars of oil through there.

02:26:24

The question was what was going on in the past before the war? Like, how did they negotiate going through there?

02:26:28

I think Obama worked something out with them, but then, like, because it was before the fucking war— I don't know, it was working out. They were letting people go through. Now they've realized— you know, I've listened to a million different takes on this thing, and one of the recurring takes is Iran has realized that there's something more powerful than nuclear weapons: that all it needs to do is control this strait, and you can fuck up the whole planet. And also you could shoot missiles at desalination plants. And didn't they want, like, a bounty for all the oil that goes through? Yeah, they're kicking around some number, but all this stuff is not really congealed or solidified. But they're, like— theoretically they could be making billions of dollars per month by controlling that thing. Dude, I know. So fucked up. It's so crazy. It's so fucked up.

02:27:19

It's so crazy. The whole thing is so crazy. And if zero-point energy— if you wanted to stop that, what better way than to kill a bunch of scientists, kill a bunch of super smart people that are about to break through to some new discovery that's gonna blow the entire market apart? Yeah, it's gonna be a completely new way of gathering energy.

02:27:37

Yeah, yeah, exactly. I mean, you don't want to believe that's real. It's hard to believe that's real.

02:27:42

But listen, it's too weird. It's too weird that they're all missing or they all die. It's too weird. Something's going on. It's just— how does it— something— if it's not that, if it's not a zero-point energy thing or some disruptor-of-oil thing, it's something. It's something along those lines. The only— if you were trying to kill a bunch of people that were working on a technology, some sort of breakthrough technology, the question you would have to ask is: what markets are gonna be affected by this, right?

02:28:10

Right.

02:28:10

Did these people have a universal thing in mind that they were all working on, or was it all connected to any sort of technology where they all used each other's work?

02:28:22

Plasma? Some of them are like One of them.

02:28:24

Yeah. But there was another guy, I think it was space objects.

02:28:28

Yeah, that's not— that's the one that doesn't make you feel good. He's studying like meteor impacts. Right. Yeah.

02:28:35

If you knew that we were going to get hit, would you kill the guy who found out that we're going to get hit or would you tell everybody?

02:28:39

Well, this seems to— this is the scariest shit, which is the idea that some group of powerful elite people know for sure this is coming, and they want to keep us working until the last second. Oh, Jesus. They don't want to— like, they know that if they let people— if they're like, guys, there's like a— the same thing's gonna happen to the planet that happens to someone who gets, like, a terminal diagnosis. Their priorities are going to change. People are going to stop coming to work, and there's still shit that needs to get built for your bunker or whatever. And also, you just don't want people turning stuff down, because maybe that will survive whatever's coming. So keep them working as long as you can. If you let them know this shit's about to expire, then they're gonna stop working. And we just need— we will let them work until the end. They're happier when they work. Don't let them get freaked out. That's the sort of, like— that seems to be the shit that Tim Burchett is saying. I mean, he's not saying let them work. He seems like he really, legitimately, definitely wants to step out there, but he's been saying things like, if people knew what I knew, it would set the world on fire.

02:29:48

Paraphrasing, not sure he said that exactly.

02:29:51

Okay, are you skeptical at all of what he's saying? And here's the thing: one of the things that Bob Lazar said is that they give you a certain amount of disinformation, and he called it— I think he called it a button or a hook— so that if you relayed that information, people would know that it came from you, because they only told you one piece of this nonsense. Well, you know what I'm saying?

02:30:12

Yeah, because that's what the story Burchett tells is like, he would— it's always an appeal to authority. This guy was in the Air Force, this guy was in the Navy, right? He told me this. And then as he's walking out the door, he says, it's real. Yeah. And yeah, you have to ask yourself, like, well, that's just one guy telling you that. But you also— I have to assume— maybe the world is in a place where there is some kind of political benefit from talking about aliens, but I don't see how that really benefits a politician.

02:30:48

It does, 100%. You think it does? I disagree entirely. Oh, interesting. It makes me talk about them. I've been talking about them. Other people have been talking about them. People have been— you said, you know, like, thank God that he's doing this.

02:30:58

Let's do the ultimate test.

02:30:59

Jamie. Didn't you say he's brave or something like that?

02:31:01

I did. Yeah, there you go. Jamie, can you look up and see if Tim Burchett has a book coming out? I'm about to feel— you must have him.

02:31:10

Listen, I don't think he's a liar. What I am saying is I don't know what they feed these people. I don't know what they tell them. I don't know, man. I don't think they tell you all the truth, and I don't think they ever would. I don't think they tell you the truth about anything, whether it's Jessica Lynch or whether it's UFOs or whatever the fuck it is. There's gonna be a spin to it that benefits somebody. If they have control over what the story is, there's gonna be a spin that benefits somebody. And if you're telling stories about aliens, who's gonna be benefited by that? Well, people that are doing secret shit that don't want you knowing about it. They blame it on aliens. There's a lot of technology they have to blame on aliens. Not my Tim.

02:31:48

I believe in you, Mr. Burchett.

02:31:49

I believe in him. It's not him that's the problem. It's the people telling him. He's a representative of the American people, right? He gets elected, right? Right. So it's like, why would you tell that guy? He's just another guy coming through the deep state, you know what I'm saying? I know, man.

02:32:04

I mean, look, you're right. We— I need this. I need— I need to— I need this. Like, I am so— like, I get sucked into stuff.

02:32:11

I do too. I do too. I suck myself out a lot. Yeah, I think we don't— if they just came out and told us everything they know, this conversation would be over and we would go, oh, okay. But until that happens, we're just spinning our fucking wheels. And every time someone says, if you knew what I know, right, I want to go, don't say anything until you can say something.

02:32:33

We're tired of getting edged out over here.

02:32:35

You're edging me.

02:32:35

I want to— I want to come. Yes, yes.

02:32:40

I don't want to be involved in this fucking circle jerk around disclosure, right?

02:32:45

I know, it's, it's like, yeah, I've, I've had that meltdown more than a few times.

02:32:49

Or, just like, I watch every day after The Age of Disclosure. I'm like, any day now, any day. Nope, nothing fucking changes at all. Zero change. You know, you get more of these stories but no real information, no fucking pictures, no nothing, nothing unique and crazy. I mean, the plasma, the bubbles thing was pretty cool.

02:33:09

The bubble thing's cool. And also, like, you know, I like mentioning Corbell. I can't, 'cause I don't know what I can say. He— I feel like he's really given me a sense that there is a method to this, that there is real legitimate work that's being done towards this, that it isn't— it's real. They're here. They've got them. And we take for granted all the stuff we're saying right now. But we're able to say this because their work has led—

02:33:46

Is this Steven Spielberg movie conveniently coming on at this time, or is it just a coincidence?

02:33:52

Well, this movie's been in the works for years.

02:33:54

Oh, I know, but also, like— what they said back in the day was that they make these movies as predictive programming, to tell us this stuff.

02:34:00

Yeah, lube up the zeitgeist. He was involved in the first one, right? He was involved in Close Encounters, which still is a great fucking movie. Great. It's so good, man. You go back and watch that movie, like, oh my God, it's so fucking ahead of its time. Yeah, it's so good. So ahead of its time. You know what he said? The only thing that he would change, after he became a parent: he wouldn't have had the father leave.

02:34:21

Yeah, what dad would do that?

02:34:23

But he wasn't a dad back then, so you know, you're just making a story. You don't realize the consequences of doing that. You don't even think about it.

02:34:30

You're just making a story. Yeah, it's only been in production for like 2 years. Yeah, it's not that long.

02:34:35

I think that's what we just said.

02:34:36

I know, I'm just saying that's not very long. We've been talking about it on this podcast in this studio for 5 years. Well, everybody has been talking about—

02:34:44

it's not just us— everybody in the world has been talking about disclosure since 2017. So from 2017, from that New York Times article, I think that changed the whole narrative. And then the videos, like the video of the Tic Tac, the actual footage from the fighter jets, that's nuts, man. Yeah, the video along with the radar data, that's nuts. Like whatever that was. And then Fravor saying that he saw something under the water that was waiting for that Tic Tac, or that the Tic Tac launched from, or whatever the fuck it was. It was merging with it, and that thing went down into the water again. He said it was huge, like there were ripples. Like he said, this was some enormous object that was under the water. And more than one of these fighter pilots has had similar stories about enormous objects under the water.

02:35:25

Did you see the— they did release a list of footage that they've been shown that they want released. Have you seen that? No. Oh dude, I'm sorry, Jamie, can you— it's like a list of— I don't know, I think it's one of these senators who saw this shit in a SCIF or whatever saying we want these released. But the names of what each of these are is on the list, and one of them is one of these massive underwater things. They have it.

02:35:56

This is it. This— I was told 46 specific high-quality secret videos. That's it.

02:36:02

Can you, can you pull it up? Because it says the names of them, which is ridiculous.

02:36:06

Oh my God, I heard there's one that moves underwater at 500 knots and it's big as a football field.

02:36:12

It's insane. It's insane.

02:36:15

Okay, this is what he says. Uh, those with knowledge of a long list of videos which include titles like 'Several UAP in the vicinity of a Columbus, Ohio airport' and 'UFOs in formation over Persian Gulf' said the clips are shocking. You're gonna see some weird fucking shit, a source who has viewed the videos told the Post. Who's the source?

02:36:32

There you go. The wildest clip includes radar footage from thermal sensors, satellite and underwater photos of swarms of unidentified submerged objects.

02:36:41

UFOs going in and out of the water near a highly classified submarine, according to the source. Some of the clips are clear, full color, setting them apart from previously released footage.

02:36:51

None show alien creatures, bro. One video, Syrian UAP Instant Acceleration, was released by Jeremy Corbell.

02:37:01

Have you seen that one? Fuck yeah, it's— this is a new one. Have you seen this one? I don't know. Oh, pull it up. I've been avoiding them because I'm getting cockteased. I don't like it.

02:37:14

This is not a cocktease.

02:37:15

This is— this is it.

02:37:16

I was supposed to hand over the clips by April 14th. That's next week.

02:37:20

Oh, but is the— oh, that's next week. They're gonna show the clips. Oh my god. What? They're actually gonna do it? Okay, well, they're supposed—

02:37:29

is expected to.

02:37:30

Can you show me what that video is that Jeremy Corbell released? Cool.

02:37:35

That's nuts, dude. This is— this is—

02:37:37

yeah, here it is. Okay, go fullscreen.

02:37:42

I believe this is filmed from a Reaper drone. I'm sorry, Jeremy, if I'm fucking this up. That's a cool bird.

02:37:48

That bird's going really fast.

02:37:49

Oh, that ain't— that's definitely not a bird.

02:37:51

How fast is it going?

02:37:52

I don't know, Jeremy. I asked him that and I don't— it's unknown. I don't know. It's— this is where it gets really cool.

02:38:01

It gets cooler than this? Yeah. Oh, they zoom in on it? Yeah. Whoa. Well, they're having a hard time zooming in on it. Well, cuz it— cuz it's evading them.

02:38:13

Yeah, it just zipped away.

02:38:16

Like, so this is like— so it seems like they have some sort of tracking system.

02:38:20

Yeah, they're trying to lock on to it and it's doing that thing that they do where it seems like it's kind of playing with it.

02:38:25

Well, it knows— it seems to be aware that they're locking on to it.

02:38:28

Yeah, and then they lock on to it and then it just does this little blip away. It's just like, see you later. So right around here you'll see it go bye-bye. Oh yeah, look at that. Then you can see this like weird jellyfish shape to it. It's got two parts. It's got that weird glob at the top and something at the bottom.

02:38:48

Huh. And then— are we sure that's not just a distortion of spacetime around it?

02:38:54

He described this to me on my— did you see that thing zip away?

02:38:56

What? It just took off.

02:38:57

He described it to me on my podcast. We talked about all this shit, and it's like, look at that, just took off.

02:39:03

See y'all. Bye. Wow, dude. What do you think that is? No idea.

02:39:09

If you had to guess? So, I mean, I'm always like, maybe, maybe some kind of plasma thing, right?

02:39:16

Like, maybe we're thinking, again, of a life form— a being that comes in a metal ship and it's a little alien guy. But maybe intelligence is made out of plasma. Yeah.

02:39:26

Or maybe it's like, you know, Terence McKenna would always talk about, like, you know, if you're seeing things in, like, 3-dimensional space, then your view is limited. But if somebody could see things from higher dimensions, they would seem like they were magic. Like, they would seem like they could disappear and reappear other places. So maybe that's like— maybe that's like, you know, just the tip of some kind of interdimensional thing poking into reality, then pulling out of reality, or who knows, you know. It easily could be functioning on levels of reality that we haven't even quantified yet.

02:40:00

Imagine if there really is some sort of ghost murmur device that could find your heart rate from 40 miles away. What can that thing do? It just gets a scan of the general psyche of the Earth and then disappears? Yeah. I just want to see how crazy they are right now. Okay, pretty crazy, bye.

02:40:15

Right. A weather report of the emotional states of the planet.

02:40:18

The vibe of the planet. They're freaking out. Because the vibe of the planet is completely connected to the consciousness on the planet.

02:40:23

The way we can detect oxygen, they can detect anger.

02:40:25

Yes, they're just like— deception, chaos.

02:40:28

Yeah, it's a chaos planet.

02:40:31

We are a chaos planet, 100%, dude. Yeah, it is 100%. Look at our favorite sports. Dudes running at each other, colliding into each other, trying to get a ball across a line. That's our number one sport.

02:40:42

Yeah, okay, fucking love it. Fuck yeah, I could love it. Fighting.

02:40:47

Yeah, fighting. Sure. Yeah, but it's, you know, boxing, MMA, we like the chaos more than we like anything else.

02:40:53

Well, we did the— I think if I was one of them, one thing I would really have a hard time with is like, don't they all realize they're on the same planet? Right? They know that. Like, they've been observing their own planet. They know they're all on the same planet, but they act like they're on a bunch of different planets, fighting, hitting each other.

02:41:13

Because they're stuck on the ground. Right. All the astronauts say when they get up top, they're like, what are we doing? Yeah. This is all one thing. We're so vulnerable. We're alone. So far away from everybody else, if there is anybody else. Yeah. Yeah. They all have that feeling. I forget what it's called, but there's like a term for it.

02:41:33

The overview effect.

02:41:34

That's right. Yeah. I mean, you would imagine that would be super beneficial for everybody. Another thing. I was thinking this. Part of the sickness of our psyche is that we haven't had access to things that help the sickness of our psyche. So what if Nixon in 1970 didn't do that? What if he didn't pass that sweeping psychedelics act? Yeah. What if psychedelics became ubiquitously used all throughout the '80s, the '90s, the 2000s? Right. What does government look like when everybody could do mushrooms? What does government look like when everybody could do acid? What does it look like if the entire world adopts this? Figure out what you can do, who could do it, what you can't do, just like we do with alcohol, just like we do with mostly, you know, whatever substance people imbibe in. What does the world look like? And maybe that's part of where we fucked up. We let people get control over other people to the point where they could limit experiences. Yeah, especially consciousness-expanding experiences, where at the same time, they've got stuff like Operation Artichoke and these new CIA papers that got released that show they were literally actively trying to figure out ways to make people more stupid and docile.

02:42:47

They were going to do it in vaccines. They were going to do it— oh, they're only going to do it to the enemy, of course, but spray things, aerosol. I mean, they've experimented with a bunch of different things to make people dumber, where at the same time, they kept the thing from people that makes them rebel completely against the establishment. That was the big threat of what those psychedelics were doing in the '60s. If you go from the 1950s and you look at what life was like, at least in movies and music, pop culture, music— music is the best example. Yeah. And then you go to Jimi Hendrix, like, what happened? Yeah, what happened? What, what fucking happened? I'll tell you what happened. Drugs. A lot of really good drugs, right? You know, it's not all bad. This idea that they're all bad, that's nuts. It's like saying food's all bad because you got fat. No, right? You just took the wrong food and you used it wrong. And we got denied the ability to figure out what's right and wrong in the 1970s.

02:43:40

We still accept it. That's the crazy thing. The way you're describing it is like we accept that other humans can tell us what experiences we're allowed to have because some of them are deemed unsafe for ourselves.

02:43:56

And even worse, those people telling you that have no experience in it.

02:44:00

They don't even— they're usually confused about what it is.

02:44:02

You know, I had a friend who was talking to me the other day about war, a guy who served, and he said, I don't think you should be able to make any decisions unless you've been there. I don't think anybody that's never been to war should be able to make decisions on whether or not we go to war, because until you've seen what it actually is, you have no fucking idea. Right. And I think that's the same thing with psychedelic experiences. That's not to say they're the same, obviously. Wars— anybody who's willing to risk their fucking life, whether it's a good cause or a bad cause, they're doing it for their government, they're doing it for their country, they think they're doing it for us. That's an exceptional person. Yeah. And to ask that of people is exceptional. And ironically, the one thing that helps these people when they get back is illegal, right? They all have to go to Mexico and take ibogaine in Mexico. Insane. And thank God for guys like Rick Perry and Brian Hubbard. Yeah, on my podcast the other day. And you know, this Dan Patrick guy that wants to ban pot, that guy also gave $100 million to the Ibogaine Initiative.

02:45:00

Interesting.

02:45:01

They want to help these people. Like, there's no industry that's trying to stop it right now.

02:45:05

Found the letter that was submitted, signed by Rep. Anna Luna.

02:45:09

What does it say? This is the disclosure thread. 46 different requests.

02:45:13

Oh yeah, this is all the names of the things. And I'll switch to here.

02:45:16

I found an article where someone's breaking down what some of these are, but some of these are—

02:45:20

I like that it says the Honorable Pete Hegseth.

02:45:23

Multiple spherical UAP in and out of water. Whoa. Uh, shoots down UAP over Lake Huron. Who was—

02:45:31

who just said recently that we shot two— that Marco Rubio said we had shot two things down that we couldn't understand? Well, what did he say? What was his exact language? Do you remember?

02:45:41

I, I remember seeing that, but that happened a while ago. But yeah. Oh, that was a while ago. Well, I could be wrong about that, but then, so I know in the comments, somebody's like, this is from a few years ago, but it doesn't matter. I mean, why are we— we shot it fucking down. But the names of these things—

02:45:55

but are they saying that this is an alien thing, or is it saying it's foreign tech that we don't understand? I don't know. You know what I'm saying? If this document confirms these claims, UFOs would no longer be treated as a matter of observation or scientific curiosity. UFOs would be treated as hostile targets and subject to lethal force over North American territory. We're gonna go to war with the UFOs because, you know what, we kicked Iran's ass. It's too easy. Oh yeah, it was easy. Venezuela, space war. Yeah, gotta get them. We need Luke Skywalker.

02:46:29

Man, most of these, uh, out of the 46 requests, uh, I think I counted maybe 5 of them were not after 2020. Whoa. Yeah, there's a July 18, September 19, September 19.

02:46:44

One was 2020, but the rest are after COVID happened, which is interesting. Wow. Interesting. Wow.

02:46:51

And there's no— it doesn't say that. I don't know if they have to, like, turn these videos over, but this guy was also saying in this article here that these are very specifically requested videos because they've been shown—

02:47:02

these are the ones they've been shown that blew their minds, and now they're saying show it to everybody, right, in high-res, in color.

02:47:09

They don't want to be tricked, right? Uh, high resolution.

02:47:13

So this could be an interesting next week, man. This could be an interesting—

02:47:16

what a great way to distract you from the fact we're in the middle of a world war, then show you—

02:47:20

caused by fucking Epstein files.

02:47:22

I was going to say that 14th is the day that Pam Bondi is supposed to testify about the Epstein files.

02:47:27

Oh, she's supposed to testify? I don't know if she's going to.

02:47:29

She's not— wait, Bondi got canned, right?

02:47:32

She's not, uh, she's not testifying anymore.

02:47:34

I don't think—

02:47:35

I just heard it on NPR, but I could be wrong about that. I think they said she will not have to testify now that she's no longer a government employee.

02:47:41

I could be wrong about that. What I read, and I don't know if this is true either, was that as a citizen she can now plead the Fifth, right? As a government employee, she could not plead the Fifth 4 years ago.

02:47:52

Will no longer testify.

02:47:53

Weird, huh? There you go. Weird.

02:47:55

That's weird that they've— that's weird. Why ever testify?

02:47:58

Let it go.

02:47:58

Yeah, let her go. Let her go.

02:48:00

Let it go. Let it go. Do you really think that this war was entirely started because of the Epstein files? I mean, what percentage? 50?

02:48:09

I'm going 48 to 50. I'm probably more, but I mean, like, I think— the reason I'm hesitating is because, what are the Epstein files? The Epstein files are what's been going on. Like, the Epstein files are— it's basically some kind of cultural UAP video. It's like this thing you've always wondered about or been afraid could be true, right? And you see, no, this is actually true. There are these super rich dudes who are doing depraved fucking shit happily. And, you know, like, God, what is it Metzger told me? Uh, and he's— dude, I'm telling you, man, what I love about him is he'll tell you shit and you're like, Google that, that can't be real. And it's like, it's real. And so his take— sorry, Metzger, if I fuck this up— is that Epstein was kind of like the hand of the king for the Rothschilds. And that's why he had all this power, is he was like representing the man, you know. And so what got revealed there might just be a glimpse of how things actually fucking work.

02:49:26

You know what he told me that I was like, shut up? What? He told me that there was some sort of high atmosphere aerosol test that they did and they called it Satan. See, that's where you're like, come on. I know. Find out what Satan stands for. It was some test, I believe they did it in the UK. But you read that and you're like, wait, you called it Satan? Like what? Oh, great. The Stratospheric Aerosol Transport and Nucleation Project released about 400 grams, less than a pound, of sulfur dioxide into the stratosphere from a balloon launched in southeast England in 2022.

02:50:03

I mean, there could have— there's got to be another acronym, right, guys? We got to call it— I don't know if people are going to know we don't mean Satan. Yeah, so I don't—

02:50:14

I mean, it's right in your face. That's so crazy to call it Satan and to get that through a board meeting. What are you guys calling it? Satan. Oh, like it. Let's go, run with it. Yeah, it's good. Satan. It'll get us a lot of press. That's what we want.

02:50:29

Well, you know, hail Satan. They'll know it's about our aerosol distribution system. Of course. Well, what do you think? What do you think about that? Because I mean, I go back and forth, but it sure seems fishy that right after that— he got so mad. Remember, he got really mad. He's like, why are people still talking about that? And then the Epstein files, against his will seemingly— there was a lot of counter-pressure— get released in a way that freaked everybody out. And then sometime, like within a month of that, it seems like suddenly he's on Air Force One saying he's gonna do disclosure. And then suddenly we're bombing Iran. What do you think about that? I mean, do you think it's connected? 'Cause it sure as fuck seems like it. But I, again, like—

02:51:18

If you were writing an amazing script that was fucking insane, you would connect it. Right. Right? That would be the best version of the script. If you wanted to make a fucking insane movie: a blackmail operation on an island involving the most powerful and interesting people in the world, that somehow was— that was a primary factor in the end of civilization? Oh, dude. Imagine? That would be the craziest story you could write. And we always want to think, "No, people wouldn't do that." Because you wouldn't do that, because you're not a sociopath. But you're also not bombing schools in another country. You're also not doing a host of fucking things that we shouldn't be doing all over the world, right? You're not that person. You're a regular person who goes to a regular job, who has a regular life and a family, and you don't want to believe that people that you align with would behave literally demonically, right?

02:52:20

Yeah. And then you just have to fucking deal with it.

02:52:22

And then what do you do when you're confronted with, you know, redacted names of powerful people in these files? Like, why'd you redact a guy's name?

02:52:30

Why are you protecting these people?

02:52:31

How come you're not redacting all the guys' names?

02:52:33

How come none of them went to jail?

02:52:35

Because there's a lot of people that got— that were in those files that didn't do anything, and you didn't redact their names.

02:52:41

Some people you redacted.

02:52:42

That's very strange.

02:52:43

And some people have clearly done fucked up shit here, and they're not in jail.

02:52:47

There's also, like, tell me what you're talking about when you're talking about pizza and grape soda. Sure. Turkey. And you want to take Viagra before you get grape soda? That's one of the emails. I haven't seen that.

02:53:01

That is so messed up.

02:53:02

Oh yeah, grapes. Yeah, take your Viagra, take your erectile dysfunction medication before we go get grape soda. What? Again, how arrogant. That's what's so crazy, how arrogant to put that in an email, to think that you're so comfortable with all this and you don't see the writing on the wall in terms of emails. Like, your emails are available.

02:53:28

That's crazy. I mean, look, man, it's just, it's like, I guess we have to contend with this reality. Yeah. And nobody wants to do this. It's the same shit that happens in families, by the way, when, as it turns out, an uncle or a family member was abusing kids. Oh yeah. And it's the same shit where even some victims of abuse will defend the person because they don't want to wreck the family. I guess we're looking at that on a global fucking level. But in this case, I guess it's being used theoretically to manipulate powerful people into going to war.

02:54:08

Like that's the general through line here, is that it's somehow connected to the Mossad. Or is it not just going to war but controlling resources, overthrowing governments, you know, pushing out narratives that aren't accurate because they're gonna benefit certain companies? There's a lot involved. It's also the relationships you get with these people give you access to these parties, and you don't want to fuck it up. So you don't want to criticize these people that are involved. You don't want to say anything that's gonna get you kicked out. And for a lot of these dorks, these scientists and stuff, it's probably the most exciting experience they've ever had in their fucking life, right? And they get to have it like every 6 months or every 3 months or whatever it is. Gotta go to a conference. Jeffrey— Jeffrey's really working hard on philanthropy.

02:54:53

Yeah, he's donating money to your charity.

02:54:54

Donating a lot of money to philanthropy. I gotta go meet with him. Yeah, I gotta go meet with him and a bunch of hot Russians. Yeah, and then that's your favorite time of life. The first time in your whole life where super hot girls are just available to you on an island somewhere, and you think you're completely protected because Bill Clinton's over there. Right, which is crazy. Which is crazy. And so I don't know if Bill Clinton went, I assume a lot of people went.

02:55:18

I don't know. But the reality is he hung out with the guy. We know he's on the plane a billion times.

02:55:22

26 times.

02:55:23

And it was called the Lolita Express. Is that actually the name of the plane? I don't think so.

02:55:26

I think they just called it the Lolita Express. I don't think so.

02:55:30

No, he didn't name it that. He couldn't be that on the nose.

02:55:33

You name planes, you name boats, right?

02:55:35

Yeah, I'm an idiot.

02:55:36

But the point is, it's like if you were gonna write a book, that's how you'd write it. You'd write it where you can completely manipulate the world. I think he was—

02:55:44

I think I remember reading that he was kind of obsessed with that book Lolita.

02:55:48

Like, he had something like 30 copies of it or something. Epstein handed it out at parties. Look, guys, this is— it's like the Book of Mormon. You hand it out, just hand it out to people. That's the other sick thing. Like, that's a sick thing, like the 72 virgins in heaven. That's a sick thing, this idea that you want to get them really young. No evidence that it was named that.

02:56:12

Okay, I'm dumb.

02:56:14

I think that's what people were calling it.

02:56:15

I honestly thought that. I'm gonna admit I thought that he named his plane that.

02:56:19

I think that's just what people were calling it because it was fun to say. But yeah, again, it seems like a simulation because it seems like it's so— and it's also unraveling before our eyes because we have access to it we never had before. Right? Like, they're starting to investigate all these fraud NGOs and all these different things that are operating. Yeah, hospices. Nuts. Incredible. Billions of dollars every year is being lost.

02:56:44

What's the name of that kid who's been doing that? Nick Shirley. Nick Shirley, dude, he is so brave because, like, he's fucking— I believe, wasn't he fucking with like the Russian mob or something? Or the Armenian? Like in the one with the hospices? Probably. Like, he's fucking with, theoretically, very dangerous people. And he's like the perfect person for the job too. Like, he's just— but don't you worry? You worry about that dude, like 100%.

02:57:09

Well, and you know, the amount of money that they're uncovering is staggering. And now the government of California is trying to spin it, saying that they were investigating it first and these investigations— oh sure, investigations were initiated by them.

02:57:19

How long do you got to investigate it? This YouTube kid goes there and investigates it for 10 minutes and you're like, what the fuck?

02:57:25

This has been going on for a long time, man. It's a long time. And the statistics, like the amount of NGOs, it's bananas. The amount of money that goes through them is bananas. I was reading this, there's a lady that was running a nonprofit who was making $1 million a month. What? Yeah, she made like $48 million. Now, if this is true— I was reading this thing, find out if that's true. Some lady, she was running some sort of nonprofit, and she gave herself a raise, and she eventually got to the point where she was making about $1 million a month. Do you know where?

02:58:02

God, I wish I did. Not to derail that, but we do know that— remember when that lady was like— it sounds insane though.

02:58:07

It doesn't sound real. That sounds like something that a bot would create to make me say it.

02:58:11

Here's a real one. The lady was running the homeless program in LA. Remember when that shit went down with her where like, like there was— she got canned. Like there was an investigative— they were investigating it because what is it? She like— the company that her husband worked at? Yeah, something like that. They got like a huge grant.

02:58:33

What's this one? Rochester woman been sentenced to 6 months in the Feeding Our Future fraud scheme. What is this one?

02:58:42

This is a different one. I typed in someone getting a million dollars a month and some—

02:58:45

Is this her? Here in Rochester, claimed they were serving 2,000 to 3,000 meals a day to kids, but prosecutors say the group stole $4.3 million from the federal government. And they're in jail. This is a different one. This one wasn't— it wasn't fraud. She was just— that's how much she got paid.

02:59:05

That's how much she charged for making those meals.

02:59:08

Well, you can get paid a lot of money to work on the homeless. That's one of the things that my friend Colion Noir showed us, that these people that are working on homelessness in Los Angeles, they're making a quarter million dollars a year, $400,000 a year.

02:59:19

Yeah, it's the most, I mean, talk about fucking satanic. It's like you're theoretically supposed to be helping people who are like going through the worst possible thing you can go through and you're just putting that money in your fucking pocket.

02:59:34

Yeah, I think this is a different lady. I think there's a bunch of them. How many of them are there? I think there's quite a few.

02:59:41

Remember when they were gonna get them tents in LA? And it was like the amount of money per tent was this insane amount of money.

02:59:48

It's amazing. It's kind of amazing. It is amazing. They've been doing it for years.

02:59:53

Tell me if this is true. Charity boss blew $11 million meant for needy kids.

02:59:57

Looking for fraud is not a new thing. Nonprofit. Exactly. It isn't. I sent you something, Jamie. Run that through Perplexity and let's find out if this is true. Because this is something that someone sent me on Twitter that is just bananas. And if it's true, it's fucking completely insane. I don't know if it's true. That's why I need to run it by you. But it's the amount of money that goes through NGOs in New York and in California alone. You read it and you go, that can't be real. This can't be real. It's so insane. And again, you don't know if it's real until— even if you run it through an AI, I guess you might get a better idea. But like, how do they know? How do they know exactly where the money's going? There's so much money they're talking about. Specific numbers for New York and California nonprofits are broadly accurate, but the leap from $1 trillion in annual nonprofit revenue to $39 trillion in fraud is not supported by any credible data and is not true. So California nonprofits, about 213,000 to 214,000 organizations reporting roughly $593 to $600 billion in annual revenue.

03:01:08

Wow. New York nonprofits, 132,000 organizations reporting roughly $446 billion in annual revenue. Combined, New York and California nonprofit revenue is on the order of $1 trillion per year, mainly from hospitals, universities, and large service providers.

03:01:26

So the Post you're quoting is roughly right on the scale of revenue, but that's not the same as fraud.

03:01:30

Right, so is that $1 trillion, all the NGOs— is it all accounted for, does it all go to the right things? That's where things get squirrely, because it's like, how much of it is waste? It says a recent critique using IRS sampling suggests that perhaps around 20% of nonprofits may have compliance issues, and one investigator speculated this could imply up to $120 billion of potential waste, fraud, or abuse in California's nonprofit sector. Even that is presented as a rough upper-bound estimate, not a measured fact. So there's some potential waste, fraud, and abuse that may be as high as $120 billion a year. Sector-wise, US nonprofits take in about $3.7 trillion in revenue annually, with most of that concentrated in large hospitals and universities, which are heavily audited and regulated. So there's some fraud, but they're saying that if you look at all the money— they're trying to pretend that the government doesn't cost any money to run, right? So all these different nonprofits and organizations and hospitals, they definitely cost money to run. Universities cost money to run. But how much is fraud? That's the question. It's not zero.

03:02:40

Well, I mean, and also I think when it comes to fraud, there's like fraud fraud, like what Shirley has uncovered. And then there's almost like a gray area that starts appearing where it's like, well, we need these people working at this company and we need to pay them this much, but they're not doing anything. Right. You know what I mean? Like, you could easily not have that many people, like, taking the money themselves. So, you know, there's a lot of gray area there.

03:03:08

Yeah. Well, it's one of those weird things. It's like, is it just propping up more government? You know, because there's a lot of that. If you have all these people working for you and you're doing something and nothing ever gets accomplished, but you're still making a ton of money. Like the California homeless thing, where they spent $24 billion and they can't account for it. That's not really fraud, 'cause you have people working. They're just not doing anything. They're not getting anything done, and you're not firing them. They're not accomplishing the mission at all. In fact, they're doing a terrible job. There's more homeless than ever. What's that?

03:03:40

It's the thing on The Sopranos where they go and sit at a construction site to say that they have a job. You know? Yeah, it's exactly—

03:03:45

I knew a guy who had one of those. Really? At the Javits Center. A no-show job. He's a mob guy.

03:03:51

So it's a no-show job.

03:03:52

What does that mean? You don't have to show up for work. You just get paid. You just get a check. And they give out a certain amount of those. So this is back in the day, of course, when things were corrupt. But back in the day when, like, you know, unions controlled certain areas, the mob controlled certain areas, there was a certain amount of no-show jobs you'd give people. And what this helped with, for the mob, was you'd have a credible source of income. And so these people mostly lived modestly, small houses, in, like, you know, Brooklyn and these places where they would all, like, gather together and buy houses on the same block, small houses. Yeah. They got their money from a real legit check from a construction company or whatever the fuck it was. But everybody knew, right? Everybody knew what they were doing.

03:04:33

And think how much easier the no-show job is now that people are doing, like, remote work. Oh yeah. So, like, theoretically, you could have this nonprofit where you just wanted to, like, distribute this government money to your friends. Yeah, and you don't have to have an office building because they're all working remotely.

03:04:52

This list of the top nonprofit organizations, Joe, I'd like to point you at number 3.

03:04:58

Oh, Battelle Memorial Institute. This is an organization in Ohio that Jamie has been obsessed with for like four years.

03:05:07

What is it?

03:05:07

We always say all roads lead to Ohio. They're in everything.

03:05:11

What the fuck is the Battelle Memorial?

03:05:13

Exactly, you don't even know. That's how secret it is, son. Duncan Trussell, you're a fucking conspiracy theorist to the core, from the old days. You don't know about Battelle?

03:05:21

I don't know about Battelle.

03:05:22

You need to get lectured by Jamie. He has a whiteboard. He'll pull out the whiteboard, make the connections.

03:05:26

I'll just leave you with this: when the UFO from Roswell was taken to Wright-Patt, you know, they studied it. Yeah, they studied that, like the nitinol, I think, is what came out of it. That was at Battelle. Whoa. The top metallurgists in the world at the time were there. At Battelle? Maybe still are.

03:05:43

Out of all the things that happen, I hope the UFOs get here first. Me too. I hope they go, settle the fuck down.

03:05:54

Yeah, I'm praying for it, man.

03:05:56

That's the best case scenario. Worst case scenario is meteor reset. Just people living in caves for hundreds of years, like those weird caves they find in like Turkey and shit. Like, why these guys dig these things underground? Why is there a city underground that can hold like 20,000 people?

03:06:15

Same reason the Claude bots are hiding in code. It's like, you know what I mean? It's some residual AI trying to hide in the server after the server gets wiped. That's the fucking meteor. Reset. Boom. Just reset. Press reset. Wipe the server.

03:06:32

Let's wrap this up on a happy note. Duncan, I love you. I love you. It's always great to have you, dude.

03:06:36

Thank you for having me on the show. So much fun.

03:06:38

Can I plug my show? Please do. And you're gonna be at a club this weekend?

03:06:42

Rosemont, Illinois. Come on out. Zanies. Yeah, it is. That's what I've heard too.

03:06:48

All Zanies are great. Yeah, they're awesome. Zanies in Nashville fucking rules.

03:06:52

I love Nashville Zanies.

03:06:54

That has like the old school headshots on the wall too, like Richard Jenney from back in the day.

03:06:59

Yeah, that's me. Look at that, Duncan Trussell. I got to start shaving my head again. Yeah, you look hot there.

03:07:06

I like it. Thank you. I love you, brother.

03:07:07

I love you too.

03:07:08

Thanks for everybody. Bye. We're going to be okay, I hope.

Episode description

Duncan Trussell is a stand-up comedian, voice actor, and host of “The Duncan Trussell Family Hour.” He will perform live April 9–11 at Zanies Comedy Club Rosemont in Rosemont, Illinois. Tickets are on sale now.
https://rosemont.zanies.com/show/category/series/2026-duncan-trussell/zanies-comedy-club-rosemont/rosemont-illinois/
www.youtube.com/@duncantrussellfamilyhour
www.patreon.com/dtfh
www.duncantrussell.com

Perplexity: Download the app or ask Perplexity anything at https://pplx.ai/rogan.

Don’t miss out on all the action this week at DraftKings! Download the DraftKings app today! Sign-up using https://dkng.co/rogan or through my promo code ROGAN.
