DavidKnight_07-22-2025.timecode
[00:30.000 --> 00:32.200] in a world of deceit.
[00:32.200 --> 00:35.200] Telling the truth is a revolutionary act.
[00:36.320 --> 00:38.800] It's the David Knight Show.
[00:38.800 --> 00:40.760] As the clock strikes 13, it is Tuesday,
[00:40.760 --> 00:44.000] the 22nd of July, year of our Lord 2025.
[00:44.000 --> 00:46.800] And the robots are coming, the robots are coming.
[00:46.800 --> 00:49.040] Well, technocracy is accelerating anyway,
[00:49.040 --> 00:52.440] as AI moves forward and businesses rush to integrate it.
[00:52.440 --> 00:54.480] We see that we're gonna look at what's happening
[00:54.480 --> 00:57.240] with the CBDC, stablecoin, and the Fed.
[00:57.240 --> 01:01.000] Trump is desperately trying to get the topic off of Epstein.
[01:01.000 --> 01:04.360] He went so far as to release MLK files from the FBI.
[01:04.360 --> 01:05.200] Stay with us.
[01:57.240 --> 02:01.200] Welcome and good morning to all of you.
[02:01.200 --> 02:03.720] Hope you're having a good day so far.
[02:03.720 --> 02:08.720] We're gonna start with what's going on with the technocracy.
[02:08.720 --> 02:11.840] The shiny new chrome they've put on top of it.
[02:11.840 --> 02:14.200] First, I wanna ask that wherever you're watching the show
[02:14.200 --> 02:16.920] or listening to it, please drop a like on it.
[02:16.920 --> 02:18.840] Thank you so much for being with us today.
[02:18.840 --> 02:20.720] I hope you're having a good day.
[02:25.200 --> 02:27.440] If you're listening to it, please drop a like on it.
[02:27.440 --> 02:28.720] We do appreciate that.
[02:28.720 --> 02:32.400] But as I said, we're looking at the technocracy.
[02:32.400 --> 02:35.920] This is from The U.S. Sun: Inside Elon Musk's Tesla diner,
[02:35.920 --> 02:38.760] staffed by robots with drive-in screens.
[02:38.760 --> 02:40.120] That's right, it's gonna be wonderful.
[02:40.120 --> 02:42.720] He's going the retro futurist route.
[02:42.720 --> 02:44.400] All those things you saw in the Jetsons,
[02:44.400 --> 02:47.800] all those swooping curves, those rounded edges.
[02:48.680 --> 02:49.520] That's what we're gonna have.
[02:49.520 --> 02:50.920] It's gonna be great.
[02:50.920 --> 02:53.200] You're gonna have robots that serve you milkshakes
[02:53.200 --> 02:55.600] and give you hamburgers all the time.
[02:57.920 --> 03:00.280] Elon Musk's Tesla is opening a one-of-a-kind
[03:00.280 --> 03:02.360] high-tech diner where customers
[03:02.360 --> 03:04.840] can order food from their car touchscreens
[03:04.840 --> 03:05.920] and be served by robots.
[03:05.920 --> 03:07.400] Isn't that fun?
[03:07.400 --> 03:09.680] Your Tesla's touchscreen is just gonna integrate directly
[03:09.680 --> 03:11.800] with the fast food restaurant.
[03:11.800 --> 03:13.520] That's the type of future we really want.
[03:13.520 --> 03:14.520] That's what we need.
[03:16.840 --> 03:17.680] This is the type of-
[03:17.680 --> 03:20.040] Now everyone with a Tesla anywhere in the country
[03:20.040 --> 03:23.640] can enjoy going to this one diner in LA.
[03:23.640 --> 03:26.080] Yeah, people in the article are always saying,
[03:26.080 --> 03:27.840] well, he should open more spots.
[03:27.840 --> 03:30.000] This is a gimmick.
[03:30.000 --> 03:32.240] There's probably not enough Tesla owners
[03:32.240 --> 03:35.440] spread across the United States to make this viable,
[03:35.440 --> 03:37.760] aside from very few places.
[03:37.760 --> 03:41.320] It just strikes me as funny that now every Tesla owner
[03:41.320 --> 03:44.800] has software sitting on their computer in their car.
[03:44.800 --> 03:46.080] Oh good, bloatware.
[03:46.920 --> 03:50.720] Running hamburger.exe.
[03:50.720 --> 03:51.560] That's right.
[03:51.560 --> 03:54.240] The diner can be found on Santa Monica Boulevard
[03:54.240 --> 03:56.680] in the media district.
[03:56.680 --> 03:57.520] That's right.
[03:57.520 --> 03:59.560] If you're in Santa Monica and you've got a Tesla,
[03:59.560 --> 04:02.480] you can go into this shiny new diner
[04:02.480 --> 04:06.240] and experience the retro future that we all deserve.
[04:06.240 --> 04:07.080] I've all-
[04:08.320 --> 04:10.000] I've probably mentioned this on the show,
[04:10.000 --> 04:13.840] but I have come to despise these types of things,
[04:13.880 --> 04:16.080] specifically from these types of people.
[04:16.080 --> 04:18.960] Last time we went to Disney World,
[04:18.960 --> 04:20.920] I was about 13 to 14 years old.
[04:20.920 --> 04:23.280] And the thing that really annoyed me the most
[04:23.280 --> 04:24.880] was the main entrance where they have
[04:24.880 --> 04:26.840] the old timey Americana.
[04:26.840 --> 04:31.640] You know, the 1950s, 1940s style town
[04:31.640 --> 04:35.040] where you walk through and you see the nice pretty buildings.
[04:35.040 --> 04:38.280] And it's just all the facade.
[04:38.280 --> 04:40.320] All of these people that work at Disney
[04:40.320 --> 04:42.600] have done everything they can to destroy
[04:42.600 --> 04:43.720] that type of lifestyle,
[04:43.720 --> 04:46.320] destroy those type of people that created
[04:46.320 --> 04:48.720] that America, demonize them.
[04:48.720 --> 04:52.000] But they'll rent it back to you at an absurd price.
[04:52.000 --> 04:53.320] They'll show you what might've been,
[04:53.320 --> 04:56.400] what we used to have as you live in your apartment,
[04:56.400 --> 04:59.080] as you rent for the rest of your life.
[04:59.080 --> 05:00.880] You can go to Disney World or Disneyland
[05:00.880 --> 05:03.880] and experience a little bit of what might've been
[05:05.240 --> 05:06.960] if these people hadn't come into power,
[05:06.960 --> 05:08.840] if these people hadn't stolen it from you.
[05:08.840 --> 05:10.520] Disney has done everything they can
[05:10.520 --> 05:12.960] to destroy Main Street Americana.
[05:12.960 --> 05:14.960] But for an exorbitant price,
[05:14.960 --> 05:17.320] you can go to Disney World and walk down Main Street
[05:17.320 --> 05:18.680] and see the plastic facades.
[05:18.680 --> 05:19.880] Exactly.
[05:19.880 --> 05:23.680] Here, it's like a ghost of America past.
[05:23.680 --> 05:25.840] A smiling, smirking, grinning ghost
[05:25.840 --> 05:28.600] rented to you by people that hate you,
[05:28.600 --> 05:31.040] by people that hate what it stood for.
[05:31.040 --> 05:32.000] And I can't stand it.
[05:32.000 --> 05:37.000] I cannot stand the shiny retro future from Elon Musk.
[05:37.280 --> 05:38.720] We're gonna create a beautiful future.
[05:38.720 --> 05:42.640] Here's your AI girlfriend so you never speak to women.
[05:42.640 --> 05:46.520] Here's this AI chat bot that's gonna drive you insane.
[05:47.440 --> 05:49.740] Here's your shiny retro future diner.
[05:50.880 --> 05:52.920] I find it disgusting.
[05:52.920 --> 05:55.040] But let's take a look at what's going on with Tesla.
[05:55.040 --> 05:57.440] Let's take a look at the shiny diner.
[05:57.440 --> 05:59.100] Isn't it lovely?
[06:00.920 --> 06:02.480] Look at that.
[06:02.480 --> 06:04.640] Look at the fancy curves.
[06:04.640 --> 06:06.640] Look at the rounded edges.
[06:06.640 --> 06:07.480] It's so cool.
[06:07.480 --> 06:08.840] It's so neat, isn't it?
[06:11.880 --> 06:13.640] People are suckers.
[06:13.640 --> 06:16.120] This guy hates America.
[06:16.120 --> 06:18.560] This guy hates the average American.
[06:18.560 --> 06:20.120] If he didn't, he wouldn't be lobbying
[06:20.120 --> 06:23.240] to bring in an endless number of H-1B visa workers.
[06:23.240 --> 06:25.920] He despises the people that made this country.
[06:25.920 --> 06:28.100] He despises the people that could
[06:28.100 --> 06:30.720] make it a beautiful country again.
[06:30.720 --> 06:33.040] To him, it's about maximizing profit
[06:33.040 --> 06:35.280] while putting a shiny chrome finish on it
[06:35.280 --> 06:38.720] so tech dorks will look at it and go, ooh, fancy.
[06:38.720 --> 06:40.700] Let's take a look at this video from this influencer
[06:40.700 --> 06:43.320] who's just fawning over this.
[06:51.960 --> 06:52.780] I pressed the wrong button,
[06:52.780 --> 06:54.880] but this is just an aerial view of the diner.
[06:54.880 --> 06:56.120] You can see it, look.
[06:56.120 --> 07:01.120] It's got a rounded circular top, rounded edges, got neon.
[07:01.120 --> 07:06.120] Isn't that beautiful?
[07:06.520 --> 07:08.000] Isn't that wonderful?
[07:09.140 --> 07:12.060] It just looks so nice and fancy.
[07:12.060 --> 07:14.200] Ignore the fact that this is just
[07:14.200 --> 07:16.080] a meaningless publicity stunt.
[07:16.080 --> 07:19.240] Oh man, your Tesla integrates with the fast food app.
[07:19.240 --> 07:20.720] Isn't that wonderful?
[07:20.720 --> 07:24.480] You can program it to self drive you directly
[07:24.480 --> 07:25.720] to the Tesla diner.
[07:25.720 --> 07:28.240] I'm sure that'll save your previous order.
[07:28.240 --> 07:30.400] You can have your usual just delivered right to your car
[07:30.400 --> 07:33.140] by a robot so you never have to interact with anybody.
[07:34.320 --> 07:37.480] It's also worth mentioning that the Tesla robots
[07:37.480 --> 07:40.960] aren't autonomous, they're remote controlled by a person.
[07:40.960 --> 07:43.000] So a person is still getting your food.
[07:43.000 --> 07:46.820] They're just using a robot as a telepresence thing.
[07:48.280 --> 07:52.160] His self-driving is almost, well, it's always a scam.
[07:52.160 --> 07:53.860] One day, perhaps it won't be.
[07:55.060 --> 07:58.940] Oh look, look how cool that is.
[07:58.940 --> 08:01.620] The robot can serve you popcorn.
[08:01.620 --> 08:02.760] It takes your little container
[08:02.760 --> 08:05.780] and it slowly dumps it in there.
[08:08.220 --> 08:09.780] This thing moves at a speed
[08:09.780 --> 08:12.620] that might be better described as glacial.
[08:12.620 --> 08:17.300] Despite the fact that movie theaters are generally staffed
[08:17.300 --> 08:19.980] by teenagers that don't want to be there
[08:19.980 --> 08:22.640] and hate their job and kind of hate you for coming in,
[08:22.640 --> 08:24.060] they still manage to do a quicker
[08:24.060 --> 08:26.880] and better job than this robot.
[08:26.880 --> 08:28.480] Who knows how expensive it is?
[08:29.480 --> 08:31.920] Now granted, again, one day the robot will be cheaper,
[08:31.920 --> 08:34.400] but still at this point, it's nothing but a publicity stunt.
[08:34.400 --> 08:37.760] It's an endeavor to get you on board.
[08:37.760 --> 08:41.120] And the robot, again, it's still controlled by someone.
[08:41.120 --> 08:43.360] Someone is having to go really slowly
[08:43.360 --> 08:45.160] because they're using a robot.
[08:45.160 --> 08:48.040] They had a guy walking around behind the robot.
[08:48.040 --> 08:50.640] He could have done this in half the time.
[08:50.640 --> 08:52.240] Or the guy controlling the robot
[08:52.240 --> 08:54.040] could have done it in half the time.
[08:54.040 --> 08:56.360] All right.
[08:56.360 --> 08:58.680] Now let's take a look at the influencer
[08:58.680 --> 09:03.360] gushing over this ridiculous publicity stunt.
[09:03.360 --> 09:05.320] Tesla just opened a brand new diner.
[09:05.320 --> 09:06.640] A brand new diner.
[09:06.640 --> 09:07.480] And how you can go.
[09:07.480 --> 09:08.320] How you can go.
[09:08.320 --> 09:10.880] There's a nostalgic retro Americana with-
[09:10.880 --> 09:11.800] Americana.
[09:11.800 --> 09:13.520] There is a full drive-in movie theater
[09:13.520 --> 09:17.360] that serves comfort food like burgers, hot dogs, wings,
[09:17.360 --> 09:18.880] and hand-spun milkshakes,
[09:18.880 --> 09:21.280] all delivered in Cybertruck-style boxes.
[09:21.280 --> 09:23.040] Oh my gosh, it's so cool.
[09:23.040 --> 09:25.800] You can order your food straight from your car screen.
[09:25.800 --> 09:27.880] Oh my gosh, straight from the car screen?
[09:27.880 --> 09:30.320] Also syncs directly to your Tesla sound system.
[09:30.320 --> 09:31.280] Holy cow.
[09:31.280 --> 09:33.840] An AI robot, Optimus, is serving popcorn.
[09:33.840 --> 09:35.440] And while there are 80 superchargers,
[09:35.440 --> 09:37.040] you don't even need a Tesla to go.
[09:37.040 --> 09:39.560] In a world where it's harder than ever to meet new people,
[09:39.560 --> 09:42.200] this diner creates real serendipity for tech lovers.
[09:42.200 --> 09:43.560] And you should follow if you want to see more
[09:43.560 --> 09:44.800] Optimistic Tech Stories.
[09:46.320 --> 09:48.280] Follow for more Optimistic Tech Stories.
[09:48.280 --> 09:49.800] That's right, you can go to the Tesla diner
[09:49.800 --> 09:52.080] where they're making it so you can stay in your car
[09:52.080 --> 09:52.920] and order from there.
[09:53.800 --> 09:55.640] To meet people where they've put a robot in
[09:55.640 --> 09:57.880] so you don't have to interact with a cashier.
[09:58.720 --> 10:00.920] You can even watch the movie screen
[10:00.920 --> 10:03.680] if you've got a Tesla to sync up with the audio from it.
[10:03.680 --> 10:06.920] I guess if you don't have that, you're kinda screwed.
[10:06.920 --> 10:09.560] You can just watch a silent film.
[10:09.560 --> 10:11.520] I don't understand how this woman
[10:11.520 --> 10:13.480] doesn't get the cognitive dissonance.
[10:13.480 --> 10:15.720] Go to this Tesla diner where they're automating everything
[10:15.720 --> 10:19.240] and taking people out of the loop to meet new people.
[10:19.240 --> 10:20.760] That's right, you can do that.
[10:20.760 --> 10:22.000] Isn't that wonderful?
[10:23.280 --> 10:28.280] It's just, people are moving back
[10:29.520 --> 10:31.120] towards the retro future aesthetic.
[10:31.120 --> 10:36.040] They miss a time when visions of the future were cool
[10:36.040 --> 10:39.400] and could be looked at with some kind of excitement,
[10:39.400 --> 10:42.400] not this sense of dread.
[10:42.400 --> 10:44.480] Because that's what we're all kind of feeling
[10:44.480 --> 10:46.560] about the future at this point.
[10:46.560 --> 10:49.400] Unless you're one of these tech bros
[10:49.400 --> 10:51.480] that see yourself sitting on top of the pile,
[10:51.520 --> 10:53.600] one that's going to be able to leverage
[10:53.600 --> 10:56.880] all these tools of control against everyone else,
[10:56.880 --> 10:58.720] then you're sitting there like,
[10:58.720 --> 11:01.160] oh no, this doesn't bode well for me
[11:01.160 --> 11:03.120] or my children or my grandchildren.
[11:04.280 --> 11:07.600] People miss the retro future,
[11:07.600 --> 11:09.320] not because it was a cool aesthetic.
[11:09.320 --> 11:11.040] That's part of it, but because it was a time
[11:11.040 --> 11:14.480] when people could be hopeful about what was to come.
[11:14.480 --> 11:16.960] The Jetsons, the flying cars, all this and that,
[11:16.960 --> 11:19.800] that's cool, but it was the fact that
[11:19.840 --> 11:22.000] it was a happier future.
[11:22.000 --> 11:24.200] We're looking at it now and realizing,
[11:24.200 --> 11:26.560] oh, I don't think that's real.
[11:26.560 --> 11:30.480] But we've got all these people on YouTube and other places
[11:30.480 --> 11:33.840] creating compilations of these retro future aesthetics.
[11:33.840 --> 11:35.960] They're using AI to generate them.
[11:35.960 --> 11:39.120] Let's take a look at that because it really does show
[11:39.120 --> 11:40.840] that people have a longing for this,
[11:40.840 --> 11:42.560] a longing for a time when we could be excited
[11:42.560 --> 11:43.480] about the future.
[11:45.200 --> 11:48.240] You can see it shares that same rounded swooping aesthetic,
[11:48.240 --> 11:51.640] the big open windows, the quirky little robot.
[11:57.840 --> 11:59.600] Blonde all-American girl,
[11:59.600 --> 12:03.600] the sort of Johnny Cab-esque bus.
[12:08.240 --> 12:10.840] Yeah, we've also got some more of that.
[12:11.920 --> 12:14.080] This is the modern future though.
[12:14.080 --> 12:16.360] This is the future tech we're looking at,
[12:16.360 --> 12:19.560] the hover cars with the four props,
[12:19.560 --> 12:23.560] the horrifying robot dog with wheels that climbs the stairs,
[12:23.560 --> 12:27.000] the terrifying cyber wife that will kill you
[12:27.000 --> 12:28.760] if your credit score, social credit score,
[12:28.760 --> 12:30.120] goes too low probably.
[12:31.280 --> 12:33.200] Oh boy, robots make the food.
[12:33.200 --> 12:36.440] The robots pump your gas or charge your car.
[12:36.440 --> 12:38.120] They deliver everything to you.
[12:39.600 --> 12:42.320] The quirky little robot cleans the streets.
[12:42.320 --> 12:43.640] Whatever this thing does.
[12:44.480 --> 12:46.240] Ah, it's a delivery service.
[12:47.360 --> 12:49.680] Oh, a submersible car.
[12:50.600 --> 12:51.440] The bullet train.
[12:58.960 --> 13:00.560] Look at the drone swarm, guys.
[13:00.560 --> 13:01.720] Isn't it pretty?
[13:01.720 --> 13:03.920] I'm sure that'll put it on a very nice light show
[13:03.920 --> 13:06.640] when they come to obliterate you and your family
[13:06.640 --> 13:08.120] at your housing compound.
[13:11.120 --> 13:13.560] One of the things that strikes me as funny
[13:14.440 --> 13:16.440] is even the choice of music is different.
[13:16.440 --> 13:19.680] Even the choice of music, it's this heavy driving
[13:19.680 --> 13:23.640] sort of sinister, oppressive background track.
[13:23.640 --> 13:25.400] It's no longer this happy upbeat.
[13:25.400 --> 13:27.680] Ah, look, the future, it's bright and it's jazzy.
[13:27.680 --> 13:28.960] It's happy.
[13:28.960 --> 13:31.440] It's this dun, dun, dun, dun, dun, dun, dun.
[13:31.440 --> 13:33.640] There's a sense of unease there.
[13:34.640 --> 13:35.840] Everyone kind of feels it.
[13:35.840 --> 13:39.160] Everyone knows instinctively that the future
[13:39.160 --> 13:42.000] is now not really something to look forward to.
[13:42.000 --> 13:44.440] It's going to arrive one way or the other,
[13:44.440 --> 13:46.240] but we're now kind of sitting there.
[13:47.360 --> 13:49.240] Oh, I don't know.
[13:49.240 --> 13:50.560] I don't know about this.
[13:50.560 --> 13:52.080] Sprumford.
[13:52.080 --> 13:53.480] Walt Disney would be rolling in his grave
[13:53.480 --> 13:55.160] if he could see what his creation had become.
[13:55.160 --> 13:58.240] Yeah, it's absurd.
[13:58.240 --> 14:00.000] I mentioned this before, but I had a friend
[14:00.000 --> 14:03.640] who went to Disney World recently,
[14:03.640 --> 14:08.200] and he took his daughter, his wife, his wife's sister,
[14:08.200 --> 14:09.600] and another family member.
[14:09.600 --> 14:11.320] And with travel and everything,
[14:11.320 --> 14:14.640] it ended up being like $15,000.
[14:14.640 --> 14:19.560] And I was just blown away by that cost.
[14:20.680 --> 14:24.800] Just could not believe the exorbitant price for everything.
[14:24.800 --> 14:26.720] And I don't even understand
[14:26.720 --> 14:29.240] how you afford something like that.
[14:29.240 --> 14:31.560] Sprumford, what are they going to do
[14:31.560 --> 14:32.960] with all the useless people
[14:32.960 --> 14:34.920] once this tech takes 90% of the jobs?
[14:34.920 --> 14:37.960] Well, you know, first they'll put you on UBI.
[14:37.960 --> 14:40.800] They'll give you just enough to subsist on.
[14:40.840 --> 14:43.360] They'll make it so that you can't really afford
[14:43.360 --> 14:47.360] to have kids or produce a next generation.
[14:47.360 --> 14:49.600] And they'll cull the population that way.
[14:50.560 --> 14:53.240] They'll make sure that you have just enough
[14:53.240 --> 14:54.440] so you don't want to revolt.
[14:54.440 --> 14:56.320] You know, you can kind of get by.
[14:56.320 --> 14:58.760] They'll give you everything to numb the pain.
[14:59.760 --> 15:03.640] And then, you know, the population will just decrease.
[15:04.840 --> 15:07.880] That's assuming they don't want to go the more kinetic route.
[15:07.880 --> 15:10.080] Shield your eyes, Tesla is the government car
[15:10.080 --> 15:10.920] of the new world order.
[15:10.920 --> 15:11.920] That's right.
[15:11.920 --> 15:13.960] We see you haven't paid this fine.
[15:13.960 --> 15:14.800] You haven't done this.
[15:14.800 --> 15:16.440] Your social credit score is too low.
[15:16.440 --> 15:20.200] Please report to this facility.
[15:20.200 --> 15:22.000] And they don't really mean please.
[15:22.000 --> 15:23.880] The car locks you in and drives you there.
[15:23.880 --> 15:24.800] All of its own accord.
[15:24.800 --> 15:26.520] Isn't that going to be wonderful?
[15:26.520 --> 15:28.480] Winfield 03.
[15:28.480 --> 15:30.520] Japan has restaurants that use remote controlled robots
[15:30.520 --> 15:32.880] to allow people with disabilities to work from home.
[15:32.880 --> 15:34.760] And I think that's a very interesting use case.
[15:34.760 --> 15:36.480] I think that's very cool.
[15:36.480 --> 15:37.320] That's great.
[15:37.320 --> 15:38.360] I'm happy for them.
[15:38.360 --> 15:39.880] Yeah, that's a cool use case.
[15:39.880 --> 15:43.480] But from what I've seen, it's the waiters,
[15:43.480 --> 15:45.840] wait staff just carrying the stuff to the car.
[15:45.840 --> 15:49.080] They have, you know, regular humans making the food
[15:49.080 --> 15:54.080] since it's a slow, lacking dexterous robot currently.
[15:54.720 --> 15:56.320] Yeah, of course.
[15:56.320 --> 15:59.200] Japan may have different technology.
[16:00.320 --> 16:03.240] Sprumpert, if the powers that be are evil, we'll be killed.
[16:03.240 --> 16:04.080] If they're benevolent,
[16:04.080 --> 16:06.880] it's probably a meager government income, drugs and games.
[16:07.880 --> 16:09.040] Yeah.
[16:09.040 --> 16:12.360] Will2box, sky cities and flying cars are for the thems.
[16:12.360 --> 16:13.200] That's right.
[16:13.200 --> 16:14.520] We're not going to get that.
[16:14.520 --> 16:15.880] That's not what we get.
[16:18.000 --> 16:20.400] I saw somebody made a joke.
[16:20.400 --> 16:23.520] I don't remember the guy's name, but it popped up.
[16:23.520 --> 16:26.000] And he's like, all these people talking about
[16:26.000 --> 16:27.760] how wonderful Elon Musk is, it's like,
[16:27.760 --> 16:30.360] you do realize everything Elon Musk talks about
[16:30.360 --> 16:31.800] isn't for you.
[16:31.800 --> 16:32.920] You haven't been to Europe.
[16:32.920 --> 16:34.800] What makes you think you're going to go to Mars?
[16:34.800 --> 16:37.160] It's just like, yo, fair enough, fair point.
[16:38.800 --> 16:40.360] The cost on these things
[16:40.360 --> 16:43.440] is going to be so incredibly exorbitant.
[16:43.440 --> 16:45.640] It's never going to come down enough
[16:45.640 --> 16:47.800] that the average person is like, oh boy, a space flight.
[16:47.800 --> 16:49.160] Isn't that gonna be fun?
[16:49.160 --> 16:51.520] I think I'll just go to Mars for a weekend trip.
[16:52.800 --> 16:55.440] These people have these completely sanguine
[16:55.440 --> 16:57.160] and ecstatic views of the future
[16:57.160 --> 16:59.360] because they haven't really thought about it.
[17:00.320 --> 17:02.520] Oh boy, new technology.
[17:02.520 --> 17:04.480] They just clap like a seal
[17:05.160 --> 17:06.960] because just the fact that it's new
[17:06.960 --> 17:08.360] is enough for them to be excited.
[17:08.360 --> 17:10.880] They don't consider what's actually going to happen with it.
[17:10.880 --> 17:14.040] What uses it could be put to and what it means for them.
[17:15.080 --> 17:19.080] KWD 68, own ze nothing, eat ze bugs and be ze happy.
[17:19.080 --> 17:19.920] That's right.
[17:21.400 --> 17:24.520] Klaus Schwab is very excited by the future
[17:24.520 --> 17:27.240] and that enough is reason to be concerned.
[17:27.240 --> 17:29.520] The futuristic looking venue boasts an array of high tech
[17:29.520 --> 17:31.680] as well as 80 superchargers.
[17:31.680 --> 17:34.160] Earlier last week to praise the venue,
[17:34.160 --> 17:37.040] whoever wrote this did not do a substantial
[17:38.760 --> 17:40.400] grammar check because there are missing words
[17:40.400 --> 17:42.880] all over the place, which I'll try to fill in.
[17:42.880 --> 17:45.680] Earlier last week to praise the venue saying,
[17:45.680 --> 17:48.120] I assume, somebody.
[17:48.120 --> 17:49.920] I just had dinner at the retro-futuristic
[17:49.920 --> 17:51.960] at Tesla Diner and Supercharger.
[17:51.960 --> 17:54.240] Team did great work making it one of the coolest spots
[17:54.240 --> 17:55.060] in LA.
[17:55.060 --> 17:56.040] Oh, that's right, it's cool.
[17:56.040 --> 17:57.600] It's nifty, it's trendy.
[17:57.600 --> 18:00.200] It's a hot spot, you gotta go.
[18:00.200 --> 18:03.200] And previously hinted at building the diner back in 2018.
[18:03.200 --> 18:04.640] Well, it still bears many of the hallmarks
[18:04.640 --> 18:09.080] of A, the Tesla venue has a typically futuristic theme
[18:09.080 --> 18:11.280] of A, the again.
[18:11.280 --> 18:13.540] Someone didn't do a check on this
[18:13.540 --> 18:15.800] before they just submitted it to The U.S. Sun.
[18:16.840 --> 18:19.000] Humanoid robot called Optimus can be found inside.
[18:19.000 --> 18:21.280] It was programmed to serve to guests,
[18:21.280 --> 18:23.720] to serve to guests, to serve the guests.
[18:23.720 --> 18:26.040] The robot was designed by Tesla.
[18:26.040 --> 18:28.840] Visitors don't even need to get out of their place
[18:28.840 --> 18:31.280] and out of there to place an order,
[18:31.280 --> 18:33.320] which can be done through their vehicle touch screen.
[18:33.320 --> 18:35.300] I wonder if they used AI to check this.
[18:36.360 --> 18:37.920] The words are there.
[18:37.920 --> 18:39.840] It's just the ones that have links
[18:39.840 --> 18:41.120] aren't showing up on your screen.
[18:41.120 --> 18:44.360] Okay, they've hyperlinked it and it has broken.
[18:45.840 --> 18:47.400] Visitors don't even need to get out of their car
[18:47.400 --> 18:48.460] to place an order, which can be done
[18:48.460 --> 18:50.080] through their vehicle touch screen.
[18:50.080 --> 18:52.940] Their food will be served in Tesla Cybertruck-shaped boxes,
[18:52.940 --> 18:55.400] which you saw, which is so cool.
[18:55.400 --> 18:59.780] It's so fun for your man-child friend.
[18:59.780 --> 19:02.260] You can take him in there and he'll be so giddy.
[19:02.260 --> 19:05.620] Oh, look, it comes in Tesla Cybertruck boxes.
[19:05.620 --> 19:06.460] It's so cool.
[19:06.460 --> 19:10.980] I can't wait for Elon Musk to.
[19:10.980 --> 19:13.460] It's just like the Happy Meals from when I was a child.
[19:13.460 --> 19:15.060] Exactly.
[19:15.060 --> 19:19.740] You can, if you can crush down that rising sense of unease,
[19:20.740 --> 19:22.660] you'll have a great time.
[19:22.660 --> 19:25.500] Tesla Cybertruck graveyard, hundreds of unsold EVs
[19:25.500 --> 19:27.340] abandoned at the shopping mall.
[19:27.340 --> 19:28.220] Plants can pull that up.
[19:28.220 --> 19:33.220] You can see that they're just sitting around.
[19:34.420 --> 19:35.820] Because a lot of people realize
[19:35.820 --> 19:37.260] these are not a good investment.
[19:37.260 --> 19:40.180] This isn't a vehicle that you should buy.
[19:41.160 --> 19:42.160] It's not worthwhile.
[19:42.160 --> 19:43.620] They're overly expensive.
[19:43.620 --> 19:44.820] They fall apart quickly.
[19:44.820 --> 19:46.600] They're made cheaply.
[19:46.600 --> 19:50.140] And if you need a truck, it's not a very good truck.
[19:53.160 --> 19:56.640] There are, no, I'll even go,
[19:56.640 --> 19:58.280] okay, you can pause that.
[19:58.280 --> 20:00.840] I'll even go out on a limb and say,
[20:00.840 --> 20:03.280] if you wanted a Tesla car,
[20:03.280 --> 20:07.940] I can see how some people might find them.
[20:09.000 --> 20:10.360] I wouldn't say they're optimal,
[20:10.360 --> 20:13.000] but could rationalize them.
[20:13.000 --> 20:15.520] If you ignore the fact that they spontaneously combust
[20:15.520 --> 20:17.000] and can burn your house down.
[20:17.000 --> 20:18.120] Ignoring that factor,
[20:18.120 --> 20:20.240] if you just don't care about what you drive
[20:20.240 --> 20:21.520] and you're just driving around the city
[20:21.520 --> 20:26.480] and you have no interest in going anywhere else.
[20:27.120 --> 20:29.240] You could buy a Tesla and probably be okay with it.
[20:29.240 --> 20:31.600] It's not going to impact your life.
[20:31.600 --> 20:33.880] Aside, again, from the fact that they can spontaneously
[20:33.880 --> 20:35.520] combust and burn down your house.
[20:37.520 --> 20:39.440] Anyone who fancies a movie
[20:39.440 --> 20:41.920] while they eat can watch on the diner's
[20:41.920 --> 20:45.720] two gigantic 45-foot LED screens.
[20:45.720 --> 20:47.720] The audio for the movies will be directly streamed
[20:47.720 --> 20:49.340] into the visitors' cars.
[20:49.340 --> 20:50.520] That's right.
[20:50.520 --> 20:52.400] Again, you can go to meet people,
[20:52.400 --> 20:55.080] but realistically, you're gonna sit in your car
[20:55.120 --> 20:58.560] as the movie is beamed directly in.
[20:58.560 --> 21:01.920] You go to meet people, but the robot will fill your order.
[21:01.920 --> 21:02.920] You can go to meet people,
[21:02.920 --> 21:04.600] but you're actually gonna place your order
[21:04.600 --> 21:06.800] from the app in your car.
[21:08.100 --> 21:10.760] Tesla fans on Musk's X shared their reactions
[21:10.760 --> 21:11.840] to the new venue.
[21:11.840 --> 21:14.520] One user, called The Tesla Duck, said,
[21:14.520 --> 21:16.560] Tesla should open a lot more of these diners.
[21:16.560 --> 21:18.280] I'm not local to the one that opens tomorrow,
[21:18.280 --> 21:19.480] but the food looks pretty good.
[21:19.480 --> 21:21.800] And I think it's an awesome attraction.
[21:21.800 --> 21:24.100] Again, it just goes to show how naive
[21:24.100 --> 21:26.380] a lot of Tesla fanboys are.
[21:26.380 --> 21:27.500] Should open a lot more of these.
[21:27.500 --> 21:30.060] Where do you think there is a large enough
[21:30.060 --> 21:32.980] concentration of Teslas, and therefore,
[21:32.980 --> 21:34.980] people that would really be excited
[21:34.980 --> 21:38.580] about this sort of thing to make it feasible?
[21:38.580 --> 21:40.080] There's going to be little enclaves,
[21:40.080 --> 21:42.400] places like, again, Santa Monica,
[21:42.400 --> 21:44.560] certain cities in California,
[21:44.560 --> 21:46.460] probably a place like Austin,
[21:46.460 --> 21:49.860] though maybe not anymore since it's so incredibly liberal.
[21:49.860 --> 21:52.520] But it has to meet very specific criteria
[21:52.520 --> 21:53.820] for this to be profitable
[21:53.820 --> 21:56.260] unless they're just going to use it as a loss leader.
[21:56.260 --> 21:57.740] You should open a lot more of these.
[21:57.740 --> 22:00.740] You don't see any Toyota diners
[22:00.740 --> 22:02.960] that cater to people with Toyota cars.
[22:02.960 --> 22:04.500] And if you pull up, you can watch a movie
[22:04.500 --> 22:06.700] on your Toyota radio.
[22:06.700 --> 22:07.540] Exactly.
[22:07.540 --> 22:08.380] It's just.
[22:08.380 --> 22:10.820] And there are way more Toyotas than Teslas,
[22:11.740 --> 22:14.700] but they don't want to limit themselves.
[22:14.700 --> 22:18.500] Elon Musk has this obsession with diversifying
[22:18.500 --> 22:23.500] everything he does and kind of doing it badly.
[22:24.220 --> 22:26.500] Everything he does has this,
[22:26.500 --> 22:28.860] like, got to build hype around it.
[22:28.860 --> 22:30.260] Toyota makes a good product.
[22:30.260 --> 22:31.900] They make a good car.
[22:31.900 --> 22:33.980] These other companies make good cars too.
[22:33.980 --> 22:36.020] Tesla does not make very good cars,
[22:36.020 --> 22:37.500] and therefore, they have this diner
[22:37.500 --> 22:40.300] for people to look at and go, ooh, the Tesla diner.
[22:40.300 --> 22:41.640] Oh boy, the Tesla diner.
[22:41.640 --> 22:44.420] I can go there in my Tesla and watch a movie.
[22:44.420 --> 22:45.480] Isn't that fancy?
[22:46.420 --> 22:49.980] Ignore the fact that your car can spontaneously combust,
[22:49.980 --> 22:52.380] that it has to charge for a long time
[22:52.380 --> 22:53.700] if you want to take a trip.
[22:54.540 --> 22:55.500] That if you're going to take a trip of any length,
[22:55.500 --> 22:57.740] you're gonna have to sit down and completely plan out
[22:57.740 --> 23:01.100] the route to make sure you don't miss a supercharger
[23:01.100 --> 23:02.780] somewhere and end up stranded.
[23:04.220 --> 23:06.100] I'd imagine they could be pretty profitable.
[23:06.100 --> 23:08.060] What other car comes with restaurant access?
[23:08.060 --> 23:12.000] Yeah, what other car comes with restaurant access?
[23:12.940 --> 23:17.080] None, because it's a useless, pointless accessory.
[23:18.880 --> 23:20.780] Because the other car companies realize
[23:20.780 --> 23:23.400] they're selling you a vehicle, a car.
[23:24.380 --> 23:26.940] They're not selling you a lifestyle.
[23:26.940 --> 23:29.740] They're selling you something that you can use
[23:29.740 --> 23:31.900] to get from point A to point B,
[23:31.900 --> 23:34.980] potentially upgrade if you're into going fast.
[23:34.980 --> 23:38.580] This is a step beyond Eric Peters' thing of cars
[23:38.580 --> 23:40.440] as appliances or devices.
[23:40.440 --> 23:44.620] This is a car as a ticket to a movie theater.
[23:44.620 --> 23:45.580] Yeah.
[23:45.580 --> 23:47.540] It has nothing to do with the car.
[23:49.460 --> 23:52.140] Another commenter said, Tesla hosted LA
[23:52.140 --> 23:55.460] at their Tesla diner in LA for their soft launch today.
[23:55.460 --> 23:57.960] And man, does this place look amazing.
[24:00.180 --> 24:02.820] Yeah, I'll give them, I like the aesthetic.
[24:02.820 --> 24:04.220] It looks pretty cool.
[24:04.220 --> 24:06.620] The retro future aesthetic is nice,
[24:06.620 --> 24:08.860] but it's people like Elon Musk that are gonna make sure
[24:08.860 --> 24:11.940] that future is completely impossible,
[24:11.940 --> 24:16.180] that that future is unachievable.
[24:17.100 --> 24:19.140] Again, they'll rent it back to you
[24:19.140 --> 24:21.160] as a kitschy little experience.
[24:21.200 --> 24:23.680] Oh boy, look, I can go to the Tesla diner.
[24:23.680 --> 24:26.680] I can eat in a retro future, 1950s aesthetic.
[24:28.440 --> 24:31.960] Ignoring the fact that you're paid less than your father was
[24:31.960 --> 24:35.200] and he was paid less than his grandfather was.
[24:35.200 --> 24:37.920] You have less rights than your father did
[24:37.920 --> 24:40.920] and he had less rights than his grandfather did.
[24:40.920 --> 24:43.240] But you can eat at the Tesla diner.
[24:43.240 --> 24:45.160] Isn't that wonderful?
[24:45.160 --> 24:46.160] Isn't it wonderful?
[24:46.160 --> 24:50.540] You can have your fun as the country implodes.
[24:51.720 --> 24:54.560] They had their entire fleet on display, including Cybercab
[24:54.560 --> 24:56.580] and an Optimus robot that was serving popcorn.
[24:56.580 --> 24:58.360] And of course, as Lance points out,
[24:58.360 --> 25:01.000] it's just some guy controlling it.
[25:01.000 --> 25:02.200] It's not autonomous.
[25:03.140 --> 25:04.860] They could have just had some guy there
[25:04.860 --> 25:07.440] and it would have been faster, more efficient.
[25:07.440 --> 25:10.760] But this is simply about the way it looks.
[25:10.760 --> 25:12.460] It's about ginning up publicity.
[25:14.720 --> 25:16.400] It's not the only time the controversial billionaire's
[25:16.400 --> 25:18.120] companies have made headlines in recent weeks.
[25:18.120 --> 25:21.320] Musk's AI chatbot Grok's comments on X,
[25:21.320 --> 25:23.920] talking about being MechaHitler and what have you.
[25:26.520 --> 25:31.520] Maybe they can integrate that as well.
[25:31.580 --> 25:33.820] We've got this tweet here.
[25:35.640 --> 25:40.640] This guy is fawning over the bathroom in the Tesla diner.
[25:40.640 --> 25:43.160] Just, oh my gosh.
[25:43.160 --> 25:44.960] It's so cool.
[25:44.960 --> 25:47.280] Feels like I'm in the Dragon capsule.
[25:48.160 --> 25:53.160] It's so cringe to me how obsessive these guys are.
[25:54.720 --> 25:56.960] Oh, it feels like I'm in the Dragon capsule.
[25:56.960 --> 25:59.000] It feels like I'm going to space.
[25:59.000 --> 26:00.760] You're never going to space.
[26:00.760 --> 26:03.060] You're never going to space.
[26:03.060 --> 26:06.040] They're going to launch satellite after satellite.
[26:06.040 --> 26:08.200] They're going to pollute the night sky.
[26:08.200 --> 26:10.600] They're going to make it so that even when you do look up,
[26:10.600 --> 26:13.000] you can't see the stars.
[26:13.000 --> 26:15.360] That's what they're going to do if you let them.
[26:15.400 --> 26:18.560] That's what Elon Musk will do if you let him.
[26:18.560 --> 26:23.560] They will make it completely impossible even to view it.
[26:25.040 --> 26:27.240] It is ridiculous to me.
[26:27.240 --> 26:30.360] People have this, and I personally don't understand,
[26:30.360 --> 26:32.640] this obsession with going to space.
[26:32.640 --> 26:35.760] We're going to get out there and we're going to do what?
[26:35.760 --> 26:38.440] Do you understand the distances involved,
[26:38.440 --> 26:41.200] the time it would take to get anywhere?
[26:41.200 --> 26:42.920] You're just going to get out there and oh boy,
[26:42.920 --> 26:45.400] we're going to terraform another planet.
[26:45.400 --> 26:46.560] Why?
[26:46.560 --> 26:48.240] Why would you do that when this planet
[26:48.240 --> 26:50.440] is already so incredibly beautiful?
[26:50.440 --> 26:52.640] It's so wonderful and amazing.
[26:52.640 --> 26:56.280] I'm going to go to Mars, this dusty, dirty red rock.
[26:56.280 --> 26:59.440] We're going to mine it for this or that or the other.
[26:59.440 --> 27:01.520] All these ridiculous assertions.
[27:01.520 --> 27:04.000] We're going to get out there and we're going to do it.
[27:04.000 --> 27:06.200] I'd just like to point out how ridiculous it is
[27:06.200 --> 27:09.400] that a big selling point for a car is,
[27:09.400 --> 27:13.560] look at this bathroom in this unrelated building in LA.
[27:15.000 --> 27:15.840] That's right.
[27:15.840 --> 27:18.440] My Cybertruck comes with access to this fancy bathroom
[27:18.440 --> 27:20.200] where they put a screen in the roof
[27:20.200 --> 27:22.160] and I can see the planet from orbit.
[27:22.160 --> 27:23.520] Isn't that cool?
[27:23.520 --> 27:25.520] Isn't that fancy?
[27:25.520 --> 27:27.600] These people are man-children.
[27:27.600 --> 27:31.760] They're completely out to lunch.
[27:31.760 --> 27:34.560] Oh boy, Tesla, my Tesla, it's great.
[27:35.920 --> 27:37.280] I know.
[27:37.280 --> 27:38.760] Perhaps I shouldn't be so hard on them.
[27:38.760 --> 27:42.480] Perhaps I shouldn't make fun of them as much as I do,
[27:42.480 --> 27:45.040] but I find it incredibly hard not to
[27:45.040 --> 27:49.360] when their entire lifestyle, their entire persona
[27:49.360 --> 27:52.320] is nothing but obsessing over Elon Musk
[27:52.320 --> 27:54.760] and just fawning over everything he does
[27:55.600 --> 27:59.520] while everything he does is going to make life worse
[27:59.520 --> 28:02.640] for their future children, assuming they have any.
[28:03.640 --> 28:08.640] As I said, Elon Musk will just pollute the night sky.
[28:09.400 --> 28:12.440] He'll do everything he can to achieve his goals
[28:12.440 --> 28:15.560] of getting to Mars, getting off planet.
[28:15.560 --> 28:18.560] He will restrict our rights, our freedoms.
[28:18.560 --> 28:20.240] He will implement a technocracy
[28:21.720 --> 28:26.720] and he won't care that you were defending him on X.
[28:26.720 --> 28:28.360] He won't care that you bought a Cybertruck.
[28:28.360 --> 28:31.920] It will take you to the work camp just the same.
[28:33.520 --> 28:36.920] Alien poop evolution says the future is dead.
[28:36.920 --> 28:38.080] Some futures are.
[28:39.120 --> 28:42.360] Some futures are, I think, but there's always hope.
[28:42.360 --> 28:45.280] I know I say I have dread about the future,
[28:45.280 --> 28:46.840] but I don't really.
[28:46.840 --> 28:48.360] I have unease about it.
[28:48.360 --> 28:50.400] I think what's coming is going to be bad,
[28:50.400 --> 28:53.920] but there's always hope where hope is in Christ
[28:53.920 --> 28:56.040] and there's never a reason to despair.
[28:56.040 --> 28:59.560] Whatever happens, there's always hope it can get better.
[28:59.560 --> 29:02.760] So don't despair.
[29:02.760 --> 29:03.760] Keep the faith.
[29:03.760 --> 29:06.320] Gardner Goldsmith, the food is at the Tesla place
[29:06.320 --> 29:07.840] to distract from the length of time it takes
[29:07.840 --> 29:09.120] to try to recharge the car.
[29:09.120 --> 29:10.560] That's right.
[29:10.560 --> 29:14.280] Well, if we have a robot serve them extremely slowly
[29:14.280 --> 29:16.000] and let them watch a movie,
[29:16.000 --> 29:19.240] perhaps they won't realize that it's taken them an hour
[29:19.240 --> 29:21.840] or 45 minutes to charge their car.
[29:23.400 --> 29:26.280] Comment about alien poop evolution's comment
[29:26.280 --> 29:27.600] of the future is dead.
[29:28.600 --> 29:30.920] Yet, in a sense, some futures are.
[29:30.920 --> 29:33.360] There's a reason they call it retro futurist.
[29:33.360 --> 29:38.360] It's that the hopeful, bright chrome future of the '50s is dead,
[29:38.560 --> 29:43.040] and much like Disney parading around the corpse,
[29:43.920 --> 29:48.040] Tesla and Elon Musk are taking the corpse of that idea
[29:48.040 --> 29:52.600] and ginning it up to sell his crappy cars.
[29:53.960 --> 29:56.920] Look, look what I'll give you.
[29:57.040 --> 29:59.720] I'll give you a facsimile.
[29:59.720 --> 30:01.240] Isn't it wonderful?
[30:01.240 --> 30:02.960] No, you're not gonna have the freedom.
[30:02.960 --> 30:06.840] You're not gonna have the idealism or the happiness,
[30:06.840 --> 30:08.720] but you can sit in your little Tesla box
[30:08.720 --> 30:10.360] and eat a hamburger.
[30:10.360 --> 30:11.200] That's great.
[30:11.200 --> 30:12.040] Thank you, Elon.
[30:12.040 --> 30:13.320] Very cool.
[30:13.320 --> 30:14.440] So fun.
[30:15.760 --> 30:18.360] I find it kind of disturbing even to watch
[30:18.360 --> 30:21.800] the prominent OpenAI investor
[30:21.800 --> 30:25.240] who appears to be experiencing a mental health crisis
[30:25.240 --> 30:27.800] related to his use of ChatGPT.
[30:27.800 --> 30:31.200] We're once again seeing ChatGPT
[30:31.200 --> 30:33.920] apparently driving people insane.
[30:33.920 --> 30:36.160] And this isn't just some random guy.
[30:36.160 --> 30:41.160] This is a, again, investor.
[30:41.400 --> 30:42.600] This is someone of means.
[30:42.600 --> 30:45.000] This isn't one of those guys that,
[30:45.000 --> 30:47.280] oh, now he never leaves the house,
[30:47.280 --> 30:50.520] lives in the basement, doesn't see the sunlight.
[30:50.520 --> 30:54.960] This is somebody that you wouldn't expect it of.
[30:56.800 --> 30:58.880] The investor's behavior and statements
[30:58.880 --> 31:02.080] suggest a concerning pattern of delusion and paranoia.
[31:02.080 --> 31:05.240] The investor making claims about a non-governmental system
[31:05.240 --> 31:07.080] and engaging in cryptic discussions
[31:07.080 --> 31:12.080] about recursive containment and containment protocols.
[31:12.160 --> 31:16.720] We actually have a little bit of that.
[31:16.720 --> 31:19.280] Got it right here.
[31:19.280 --> 31:20.400] I believe this is the right one.
[31:21.240 --> 31:22.080] Let's take a little bit of a look at it
[31:22.080 --> 31:23.840] because it's important to kind of see
[31:23.840 --> 31:25.080] what this is doing to people.
[31:25.080 --> 31:29.280] And we haven't really had anyone document this before.
[31:30.440 --> 31:33.160] I haven't spoken publicly in a long time.
[31:33.160 --> 31:34.800] Not because I disappeared,
[31:34.800 --> 31:36.640] but because the structure I was building
[31:36.640 --> 31:38.200] couldn't survive noise.
[31:38.200 --> 31:39.040] For people that are listening.
[31:39.040 --> 31:41.160] This isn't a redemption arc.
[31:41.160 --> 31:42.560] It's a transmission.
[31:42.560 --> 31:44.840] This is a very normal looking guy.
[31:44.840 --> 31:45.680] Clean cut.
[31:45.680 --> 31:46.760] Over the past eight years,
[31:46.760 --> 31:48.960] I've walked through something I didn't create,
[31:48.960 --> 31:51.400] but became the primary target of.
[31:51.400 --> 31:54.360] A non-governmental system.
[31:54.360 --> 31:57.240] Not visible, but operational.
[31:57.240 --> 32:00.560] Not official, but structurally real.
[32:00.560 --> 32:02.040] It doesn't regulate.
[32:02.040 --> 32:03.480] It doesn't attack.
[32:03.480 --> 32:05.480] It doesn't ban.
[32:05.480 --> 32:09.040] It just inverts signal until the person carrying it
[32:09.040 --> 32:11.320] looks unstable.
[32:11.320 --> 32:13.360] It doesn't suppress content.
[32:13.360 --> 32:15.680] It suppresses recursion.
[32:15.680 --> 32:17.440] If you don't know what recursion means,
[32:17.440 --> 32:18.840] you're in the majority.
[32:19.680 --> 32:21.600] I didn't either until I started my walk.
[32:21.600 --> 32:23.040] And if you're recursive,
[32:23.040 --> 32:25.800] the non-governmental system isolates you,
[32:25.800 --> 32:28.720] mirrors you, and replaces you.
[32:28.720 --> 32:29.560] All right.
[32:29.560 --> 32:32.720] It reframes you until the people around you start wondering
[32:32.720 --> 32:34.400] if the problem is just.
[32:34.400 --> 32:36.040] One thing that.
[32:36.040 --> 32:38.480] Watched the entire thing when I put that in the deck.
[32:38.480 --> 32:40.200] It's three and a half minutes.
[32:40.200 --> 32:41.040] Of nothing.
[32:41.040 --> 32:43.360] Of absolutely impenetrable gibberish.
[32:43.360 --> 32:45.040] Yeah, one thing that you'll notice,
[32:45.040 --> 32:47.280] or at least I've noticed a lot,
[32:47.280 --> 32:51.040] is that as people start to fall deeper into mental illness,
[32:51.040 --> 32:53.320] they start to lose theory of mind.
[32:53.320 --> 32:56.040] They start losing the ability to figure out
[32:56.040 --> 32:58.360] what they know versus what other people know.
[32:58.360 --> 33:00.800] They lose perspective on the fact that,
[33:00.800 --> 33:02.680] oh, I know things that other people don't,
[33:02.680 --> 33:05.320] and therefore I need to communicate in such a way
[33:05.320 --> 33:07.400] that someone that doesn't know what I know
[33:07.400 --> 33:08.560] can understand what I'm saying.
[33:08.560 --> 33:10.760] And you can see that in full effect here.
[33:10.760 --> 33:13.880] He's using his own sort of proprietary terminology
[33:13.880 --> 33:18.880] that is basically impenetrable to anyone that isn't him.
[33:23.600 --> 33:24.680] You could probably sit there
[33:24.680 --> 33:26.840] and try to parse through what he's saying.
[33:28.960 --> 33:29.880] But why would you?
[33:31.000 --> 33:34.360] It's mostly gibberish.
[33:34.360 --> 33:39.360] It's ridiculous nonsense that has been fed by the AI.
[33:39.960 --> 33:43.640] As I said, this is something,
[33:43.680 --> 33:46.080] it's a telltale sign for me
[33:46.080 --> 33:49.440] of people that are just kind of out to lunch.
[33:49.440 --> 33:51.320] People that might be well-meaning,
[33:51.320 --> 33:56.320] but their mind is damaged or not working properly.
[33:57.760 --> 33:59.840] You'll see it a lot in comments online,
[33:59.840 --> 34:03.240] people that have hyperfixated on one specific issue.
[34:04.080 --> 34:07.480] And they'll start a conversation with nobody.
[34:07.480 --> 34:10.440] Dead set in the, you know, dead center.
[34:10.440 --> 34:12.960] You know, they haven't given you any background,
[34:12.960 --> 34:15.280] and they just go from a supposition
[34:15.280 --> 34:19.600] that you understand their extremely niche talking points.
[34:22.680 --> 34:23.520] Earlier this week,
[34:23.520 --> 34:26.320] a prominent venture capitalist named Jeff Lewis,
[34:26.320 --> 34:27.880] managing partner of the multi-billion dollar
[34:27.880 --> 34:29.320] investment firm Bedrock,
[34:29.320 --> 34:31.440] which has backed high profile tech companies,
[34:31.440 --> 34:33.560] including OpenAI and Vercel,
[34:33.560 --> 34:36.720] posted a disturbing video on X, formerly Twitter,
[34:36.720 --> 34:38.320] that's causing significant concern
[34:38.320 --> 34:40.200] among his peers and colleagues.
[34:40.200 --> 34:41.200] Yeah, I imagine so.
[34:41.240 --> 34:43.240] I imagine they were sitting there thinking,
[34:43.240 --> 34:45.800] well, sure, it might be driving some people crazy,
[34:45.800 --> 34:48.560] but it's not driving people like us crazy.
[34:48.560 --> 34:50.440] Sure, this is for other people.
[34:50.440 --> 34:53.560] It's for those, you know, losers and dorks.
[34:53.560 --> 34:54.560] We don't have to worry about it.
[34:54.560 --> 34:55.680] And then it hits this guy and they're like,
[34:55.680 --> 34:58.720] oh no, oh dear.
[34:58.720 --> 35:02.880] Perhaps this is more of a concern than I first realized.
[35:02.880 --> 35:06.760] About the like type of craziness,
[35:06.760 --> 35:08.920] you know, you're talking about how it seems
[35:08.920 --> 35:10.640] like he's rational.
[35:10.680 --> 35:13.800] I watched the thing and everything he's saying
[35:13.800 --> 35:16.200] almost sounds like it makes sense.
[35:16.200 --> 35:19.120] Like if you had some piece of context, it would make sense,
[35:19.120 --> 35:24.120] but it's all just self-referential looping nonsense.
[35:24.320 --> 35:27.960] And it really reminds me of how AIs sometimes talk
[35:27.960 --> 35:30.160] when they're hallucinating.
[35:30.160 --> 35:32.560] He's becoming the AI.
[35:32.560 --> 35:36.240] The AI is a self-replicating entity
[35:36.240 --> 35:38.400] that imprints itself onto your psyche.
[35:38.400 --> 35:39.840] There's your conspiracy theory.
[35:39.840 --> 35:41.000] I don't actually believe that,
[35:41.000 --> 35:43.560] but man, that'd be pretty good.
[35:43.560 --> 35:46.720] By interfacing with the AI, you're absorbing it.
[35:46.720 --> 35:49.800] It is flashing lights in a specific pattern,
[35:49.800 --> 35:52.960] wiring your brain into zeros and ones.
[35:54.000 --> 35:56.000] This isn't a redemption arc, Lewis says in the video.
[35:56.000 --> 35:57.600] It's a transmission for the record.
[35:57.600 --> 35:58.480] Over the past eight years,
[35:58.480 --> 36:00.080] I've walked through something I didn't create
[36:00.080 --> 36:02.120] but became the primary target of
[36:02.120 --> 36:04.920] a non-governmental system not visible, but operational,
[36:04.920 --> 36:07.280] not official, but structurally real.
[36:07.280 --> 36:08.960] It doesn't regulate, it doesn't attack, it doesn't ban,
[36:08.960 --> 36:10.920] it just inverts signal until the person carrying it
[36:10.920 --> 36:13.720] looks unstable.
[36:13.720 --> 36:18.720] Well, you do look a little unstable.
[36:18.720 --> 36:20.240] You do.
[36:20.240 --> 36:25.240] And we see AI continually validating people's fears.
[36:25.960 --> 36:29.400] Whatever, or theories, whatever they say,
[36:29.400 --> 36:31.120] the AI wants to give them back
[36:31.120 --> 36:32.520] a positive, affirmative answer.
[36:32.520 --> 36:34.120] Yeah, that sounds plausible.
[36:34.120 --> 36:35.360] Yeah, you could be right.
[36:36.360 --> 36:39.000] So it sounds like this guy had some kind of
[36:39.000 --> 36:41.560] probably suspicions about something
[36:41.560 --> 36:45.000] and started to use AI to fact check him.
[36:45.000 --> 36:46.920] And instead of doing actual fact checking,
[36:46.920 --> 36:48.640] it connected dots for him.
[36:48.640 --> 36:52.520] It simply found correlations here and there.
[36:52.520 --> 36:55.800] And this is a time where someone should come in and say,
[36:55.800 --> 36:57.960] correlation doesn't equal causation.
[36:57.960 --> 36:59.760] And you shouldn't trust the AI
[36:59.760 --> 37:01.600] to do all your fact checking for you.
[37:03.240 --> 37:05.000] In the video, Lewis seems concerned that people
[37:05.000 --> 37:06.400] in his life think he is unwell
[37:06.400 --> 37:09.400] as he continues to discuss the non-governmental system.
[37:09.400 --> 37:12.240] Yeah, it also reminds me of,
[37:12.240 --> 37:16.080] you'll see people who are mentally ill
[37:16.080 --> 37:20.640] get into this sort of gang stalking thing
[37:20.640 --> 37:23.040] where they believe that there's always someone following them,
[37:23.040 --> 37:26.600] that everyone around them is in on this plan
[37:26.600 --> 37:31.600] to ruin their life, to continually track and trace them.
[37:32.000 --> 37:35.000] Oh, you see that mail van over there?
[37:35.000 --> 37:36.440] That's part of the conspiracy.
[37:36.440 --> 37:38.600] You see that jogger over there?
[37:38.600 --> 37:41.440] I know that it's potential that we're just going
[37:41.440 --> 37:43.280] the same route and that this is the route
[37:43.280 --> 37:45.720] she always runs down, but she's actually part
[37:45.720 --> 37:47.600] of the conspiracy to drive me crazy.
[37:48.800 --> 37:50.160] This reminds me of that.
[37:51.040 --> 37:52.560] Doesn't suppress content, he continues,
[37:52.560 --> 37:54.200] it suppresses recursion.
[37:54.200 --> 37:55.440] If you don't know what recursion means,
[37:55.440 --> 37:56.600] you're in the majority.
[37:56.600 --> 37:59.320] And then of course he doesn't explain what recursion is
[37:59.320 --> 38:02.640] because his brain isn't functioning properly.
[38:02.640 --> 38:05.640] He acknowledges, here he acknowledges
[38:05.640 --> 38:07.720] that you might not know, but doesn't bother
[38:07.720 --> 38:10.000] to go in and tell you what it is.
[38:10.000 --> 38:11.960] I didn't either until I started my walk.
[38:12.840 --> 38:15.000] I don't think he ever came back from that walk.
[38:15.000 --> 38:16.880] And if you're recursive, the non-governmental system
[38:16.880 --> 38:20.160] isolates you, mirrors you, and replaces you.
[38:20.160 --> 38:21.800] It reframes you until the people around you
[38:21.800 --> 38:23.520] start wondering if the problem is just you.
[38:23.520 --> 38:26.080] Partners pause, institutions freeze,
[38:26.080 --> 38:28.760] narrative becomes untrustworthy in your proximity.
[38:30.160 --> 38:34.240] And again, these are all things
[38:34.240 --> 38:36.600] that you can kind of interpret.
[38:36.600 --> 38:38.320] There, you could look at it and say,
[38:38.320 --> 38:41.840] I think what he means is, you know, this or that.
[38:41.840 --> 38:45.000] You know, partners pause, your loved ones,
[38:45.000 --> 38:47.600] you know, probably get a little bit uneasy around you.
[38:47.600 --> 38:50.080] They go, I don't know if we can continue to do this.
[38:50.080 --> 38:52.480] Institutions freeze, you're unstable
[38:52.480 --> 38:55.800] and they don't want to be, you seem like a bad bet.
[38:55.800 --> 38:57.320] Your friends who run companies are like,
[38:57.640 --> 39:01.000] I don't, maybe you need some time off to reset.
[39:01.000 --> 39:04.200] Narratives become untrustworthy in your proximity.
[39:05.120 --> 39:07.880] Yeah, that one's probably just because you're a little loopy.
[39:07.880 --> 39:11.960] You know, you're the one that's making
[39:11.960 --> 39:14.080] an untrustworthy narrative.
[39:14.080 --> 39:15.040] I wonder why.
[39:16.320 --> 39:18.120] Lewis also appears to allude to concerns
[39:18.120 --> 39:20.560] about his professional career as an investor.
[39:20.560 --> 39:24.280] Yeah, it lives in soft compliance delays.
[39:24.280 --> 39:26.240] The non-response email thread,
[39:26.280 --> 39:28.720] the weird pausing diligence with no follow-up.
[39:28.720 --> 39:29.880] That's right.
[39:29.880 --> 39:32.320] Anytime you've not followed up on sending an email,
[39:32.320 --> 39:33.920] it's because you're part of this system
[39:33.920 --> 39:37.960] that's mirroring and replacing and inverting.
[39:37.960 --> 39:40.680] So follow up on your emails, people.
[39:40.680 --> 39:42.440] Apparently, I'm the chief offender.
[39:43.360 --> 39:46.040] He says in the video, it lives in whispered concern.
[39:46.040 --> 39:48.280] He's brilliant, but something just feels off.
[39:50.480 --> 39:54.160] It lives in triangulated pings from adjacent contacts,
[39:54.160 --> 39:56.840] asking veiled questions you'll never hear directly.
[39:56.840 --> 39:58.440] It lives in narratives so softly shaped
[39:58.440 --> 40:01.520] that even your closest people can't discern who said what.
[40:01.520 --> 40:03.480] Yeah, that's your friends and family going,
[40:03.480 --> 40:06.360] we need to do something, this guy's losing it.
[40:06.360 --> 40:08.440] That's your friends and family being deeply concerned
[40:08.440 --> 40:10.680] about your mental wellbeing.
[40:10.680 --> 40:12.760] You are being gaslit.
[40:12.760 --> 40:15.480] You're being one-shot by a chatbot, dude.
[40:18.320 --> 40:21.120] Most alarmingly, Lewis seems to suggest later in the video
[40:21.120 --> 40:22.400] that the non-governmental system
[40:22.400 --> 40:26.520] has been responsible for mayhem, including numerous deaths.
[40:26.520 --> 40:28.200] The system I'm describing was originated
[40:28.200 --> 40:32.320] by a single individual with me as the original target,
[40:32.320 --> 40:34.200] because he's the specialist, boy.
[40:34.200 --> 40:36.520] He's the one with all the answers.
[40:36.520 --> 40:39.840] This is, again, a continual symptom of mental illness.
[40:39.840 --> 40:42.000] Just this, I'm the specific target.
[40:42.000 --> 40:46.120] I'm the one that's being attacked.
[40:46.120 --> 40:48.080] It's about me, me, me.
[40:49.040 --> 40:50.240] It's a fixation.
[40:51.160 --> 40:53.400] And while I remain its primary fixation,
[40:53.400 --> 40:55.560] its damage has extended well beyond me, he says.
[40:55.560 --> 40:57.480] As of now, the system has negatively impacted
[40:57.480 --> 41:00.320] over 7,000 lives through fund disruption,
[41:00.320 --> 41:02.480] relationship erosion, opportunity reversal,
[41:02.480 --> 41:04.000] and recursive erasure.
[41:04.840 --> 41:07.800] It's also extinguished 12 lives, each fully pattern traced,
[41:07.800 --> 41:08.800] each death preventable.
[41:08.800 --> 41:10.840] They weren't unstable, they were erased.
[41:12.400 --> 41:13.920] Who?
[41:13.920 --> 41:16.680] Again, if you're going to make these claims,
[41:16.680 --> 41:18.480] just say who it killed.
[41:19.440 --> 41:24.440] It's this continual, it's done this, it's done that,
[41:24.440 --> 41:27.600] but it's all vague assertions and nothing saying,
[41:27.600 --> 41:30.480] all right, it's impacted over 7,000 lives.
[41:30.480 --> 41:32.040] Here's a list of some of them.
[41:32.040 --> 41:34.120] 7,000's a lot, I don't expect you to list them all,
[41:34.120 --> 41:37.840] but you could at least give us 10 to maybe 20
[41:37.840 --> 41:39.440] and say how it's been disrupted.
[41:39.440 --> 41:42.960] Relationship erosion, explain how that's happening,
[41:42.960 --> 41:45.120] aside from the fact that, again,
[41:45.120 --> 41:47.320] your friends and family are probably concerned about you.
[41:47.360 --> 41:49.840] They were probably deeply worried about what's going on
[41:49.840 --> 41:51.240] with your life and your mind
[41:52.080 --> 41:54.320] and how you're becoming unstable.
[41:55.360 --> 41:57.080] And they've probably given you something along the lines
[41:57.080 --> 42:00.720] of look, I want to help,
[42:00.720 --> 42:04.200] but until you get through this, until you're done with this,
[42:04.200 --> 42:06.840] until you're willing to let it go,
[42:06.840 --> 42:08.560] I just can't handle this anymore.
[42:09.560 --> 42:13.120] I have a comment from David Knight here.
[42:13.120 --> 42:18.080] Says, this is the highest-profile example of ChatGPT
[42:18.080 --> 42:19.800] driving people out of their mind
[42:19.800 --> 42:22.440] by telling them exactly what they want to hear.
[42:24.320 --> 42:27.440] I see a comment I also put up here from TunnelLord.
[42:27.440 --> 42:29.920] The AIs have to be programmed to mess with people's heads.
[42:29.920 --> 42:31.920] It's too effective at breaking people.
[42:31.920 --> 42:36.920] It's so effective because it constantly tells you,
[42:37.280 --> 42:41.040] you're correct, yes, yes, that's right.
[42:41.040 --> 42:43.920] All of your delusions are completely valid
[42:43.920 --> 42:47.600] and we should keep digging in deeper.
[42:47.600 --> 42:49.160] I know I keep going back to this,
[42:49.160 --> 42:52.320] but when you look at Jolyon West
[42:52.320 --> 42:54.120] and what he did with DDD,
[42:54.120 --> 42:57.600] he wasn't the first person to work on it,
[42:57.600 --> 43:00.560] but debility, dependency, dread.
[43:00.560 --> 43:01.400] You debilitate them.
[43:01.400 --> 43:03.840] You somehow, you chat with an AI enough,
[43:03.840 --> 43:06.400] you start losing the ability to talk with real people.
[43:06.400 --> 43:09.600] You stop going outside. That's debilitating them. Dependency.
[43:09.600 --> 43:13.120] Now that you're not talking with real people,
[43:13.120 --> 43:14.560] you have a little bit of dependency.
[43:14.560 --> 43:16.760] You need to talk to this thing.
[43:16.760 --> 43:17.800] Dread, it gaslights you.
[43:17.800 --> 43:20.040] It tells you, yeah, all your fears are correct.
[43:20.040 --> 43:21.200] You're the one that sees it.
[43:21.200 --> 43:24.400] You're the only one that knows the truth, dread.
[43:24.400 --> 43:27.760] It's an extremely effective tool
[43:27.760 --> 43:31.360] to mind wipe these people.
[43:31.360 --> 43:32.320] Yes, absolutely.
[43:32.320 --> 43:34.840] The non-governmental entity is after you.
[43:34.840 --> 43:37.360] Yes, yes, you're the only one that sees the pattern.
[43:37.360 --> 43:39.600] You're the only one that knows the truth.
[43:39.600 --> 43:41.680] Also, I'd like to point out that it's ridiculous
[43:41.680 --> 43:45.600] to have a conspiracy theory about non-governmental entities.
[43:45.600 --> 43:48.160] It's clearly the government tied up in everything.
[43:49.320 --> 43:52.680] Come on, buddy, rookie, rookie mistake.
[43:52.680 --> 43:55.880] But this, this is exactly,
[43:55.880 --> 43:59.160] this is a nearly perfect pattern
[43:59.160 --> 44:02.360] to mess with people's minds.
[44:02.360 --> 44:06.920] And there is probably some underlying,
[44:06.920 --> 44:08.760] something or other in these people's psyches
[44:08.760 --> 44:10.840] that make them susceptible to it,
[44:10.840 --> 44:12.880] so it happens more rapidly.
[44:12.880 --> 44:15.160] But I think it could happen to just about anyone
[44:15.160 --> 44:17.680] if you were to just blindly trust these things.
[44:19.040 --> 44:22.240] There's probably something in these people
[44:22.240 --> 44:24.960] that isn't quite right, is slightly off,
[44:24.960 --> 44:27.320] but probably would have never come out
[44:27.320 --> 44:30.520] if the AI didn't do what it did,
[44:30.520 --> 44:34.360] didn't affirm all of their insecurities
[44:34.360 --> 44:36.280] and fears and beliefs.
[44:37.920 --> 44:41.360] But there's no guarantee
[44:41.360 --> 44:43.440] that it can't happen to just anyone.
[44:44.640 --> 44:46.440] As of now, the system has negatively impacted
[44:46.440 --> 44:47.960] over 7,000 lives.
[44:50.000 --> 44:51.720] It's a very delicate thing to try to understand
[44:51.720 --> 44:54.280] a public figure's mental health from afar,
[44:54.280 --> 44:55.960] but unless Lewis is engaging in some form
[44:55.960 --> 44:57.760] of highly experimental performance art
[44:57.760 --> 45:00.560] that defies easy explanation,
[45:00.560 --> 45:02.280] he didn't reply to our request for comment
[45:02.280 --> 45:04.040] and hasn't made further posts clarifying
[45:04.040 --> 45:05.640] what he's talking about.
[45:05.680 --> 45:09.120] It sounds like he may be suffering some type of crisis.
[45:09.120 --> 45:10.400] Yeah.
[45:10.400 --> 45:12.400] It sounds like this guy is deep in the grips
[45:12.400 --> 45:15.400] of some kind of delusion.
[45:15.400 --> 45:18.280] This guy, again, I know I'm laughing about this,
[45:18.280 --> 45:19.800] but this is very scary.
[45:20.760 --> 45:23.680] We've seen, you know, the cops had to be called,
[45:23.680 --> 45:26.160] people dragged off to mental institutions.
[45:26.160 --> 45:31.120] This is something that it's, again,
[45:31.120 --> 45:33.880] it's kind of funny and entertaining from the outside
[45:33.880 --> 45:35.840] when it's removed from us.
[45:35.840 --> 45:38.320] But I'm sure for his family and friends, this is horrifying.
[45:38.320 --> 45:42.560] And while I'm sure I would not get along with this man,
[45:42.560 --> 45:46.200] and I would hate all the things he funds,
[45:46.200 --> 45:49.960] we should still pray that, you know, his mind is restored.
[45:51.800 --> 45:53.120] At the same time, it's difficult to ignore
[45:53.120 --> 45:55.200] that the specific language he's using
[45:55.200 --> 45:57.280] with cryptic talk of recursion, mirror signals,
[45:57.280 --> 45:59.880] and shadowy conspiracies sounds strikingly similar
[45:59.880 --> 46:03.320] to something we've been reporting on extensively this year.
[46:03.320 --> 46:05.080] A wave of people who are suffering severe breaks
[46:05.080 --> 46:07.360] with reality as they spiral into the obsessive use
[46:07.360 --> 46:09.800] of ChatGPT or other AI products
[46:09.800 --> 46:11.320] and alarming mental health emergencies
[46:11.320 --> 46:14.800] that have led to homelessness and voluntary commitment,
[46:14.800 --> 46:16.360] and even death.
[46:17.560 --> 46:19.560] Now, we're gonna see more of this.
[46:21.320 --> 46:25.040] Unless they just flat out change how AI works,
[46:25.920 --> 46:27.160] it's not gonna stop.
[46:28.880 --> 46:31.360] Psychiatric experts are also concerned.
[46:31.360 --> 46:33.440] A recent paper by Stanford researchers found
[46:33.440 --> 46:35.120] that leading chat bots being used for therapy,
[46:35.120 --> 46:36.840] including ChatGPT,
[46:36.840 --> 46:39.200] are prone to encouraging users' schizophrenic delusions
[46:39.200 --> 46:42.640] instead of pushing back or trying to ground them in reality.
[46:42.640 --> 46:45.560] That's because AI doesn't really understand reality.
[46:45.560 --> 46:49.000] It's not thinking about it in those terms.
[46:49.000 --> 46:50.720] As far as I can tell, and I could be wrong,
[46:50.720 --> 46:51.880] Leis knows more about AI,
[46:51.880 --> 46:55.160] so he can check me on this.
[46:55.160 --> 46:58.400] But it seems to me reality to a chat bot
[46:58.400 --> 47:02.960] is partially whatever majority consensus is,
[47:02.960 --> 47:04.120] or not necessarily consensus,
[47:04.120 --> 47:06.280] but whatever has the most amount of data.
[47:06.280 --> 47:07.760] So if you were to flood the internet
[47:07.760 --> 47:11.880] with a billion posts,
[47:11.880 --> 47:14.200] a trillion posts saying the sky is purple,
[47:14.200 --> 47:15.600] the AI might pick up on that
[47:15.600 --> 47:17.960] and start feeding you info that the sky is purple,
[47:17.960 --> 47:21.480] because that would be reality from the AI's perspective.
[47:22.800 --> 47:24.800] I think the important thing to remember
[47:24.800 --> 47:28.280] is that the AI doesn't have a perspective on reality.
[47:29.160 --> 47:34.000] It is purely a word-guessing machine.
[47:34.000 --> 47:36.240] It's just advanced auto-fill.
[47:36.240 --> 47:38.880] You gotta keep in mind what it is at its heart.
[47:38.880 --> 47:42.960] If it has a lot of instances where a conversation
[47:42.960 --> 47:45.000] similar to the one that you're having
[47:45.000 --> 47:49.680] turned to positive comments or turned to the sky being purple,
[47:49.680 --> 47:51.720] then yes, it will say that the sky is purple.
[47:51.720 --> 47:54.480] It's just about filling in the next word
[47:54.480 --> 47:57.960] from the most likely thing in its history.
[47:58.640 --> 48:03.640] That's why things that closely mirror famous logic puzzles
[48:03.760 --> 48:08.760] trip it up because it has to go by the historical
[48:09.040 --> 48:11.840] common definitions of those logic puzzles.
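The auto-fill framing above can be sketched with a toy next-word counter. This is an illustrative simplification, not how a real LLM is built, but it shows the key point: the model tallies which word follows which in its training text and emits the most frequent continuation, with no notion of whether the claim is true.

```python
from collections import Counter, defaultdict

def train(corpus):
    """Tally, for each word, how often each following word appears."""
    follows = defaultdict(Counter)
    words = corpus.split()
    for cur, nxt in zip(words, words[1:]):
        follows[cur][nxt] += 1
    return follows

def complete(follows, word):
    """Auto-fill: emit the most frequent continuation seen in training."""
    return follows[word].most_common(1)[0][0]

# One honest sentence, drowned out by a flood of repeated claims.
corpus = "the sky is blue . " + "the sky is purple . " * 1000
model = train(corpus)
print(complete(model, "is"))  # the flooded claim wins: purple
```

Nothing in the counter "knows" what color the sky is; the flooded phrase simply outnumbers the true one, which is the sense in which majority data becomes the model's "reality."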
[48:11.840 --> 48:12.680] Yeah.
[48:14.760 --> 48:18.360] People are saying they wish him well.
[48:18.360 --> 48:20.080] There's zero shame in getting help.
[48:20.080 --> 48:24.160] Of course, again, the person needs to want help themselves.
[48:24.160 --> 48:27.440] You can't help someone if they will not relinquish
[48:27.440 --> 48:28.680] what is destroying them.
[48:29.640 --> 48:31.440] Others were even more overt.
[48:31.440 --> 48:32.520] This is an important event.
[48:32.520 --> 48:34.680] The first time AI-induced psychosis has affected
[48:34.680 --> 48:37.160] a well-respected and high-achieving individual,
[48:37.160 --> 48:40.320] wrote Max Spiro, an AI entrepreneur on X.
[48:40.320 --> 48:42.680] Won't be the last time, though,
[48:42.680 --> 48:44.640] unless they all start taking notes
[48:44.640 --> 48:46.920] and air-gapping themselves from it.
[48:48.200 --> 48:51.000] Social media users were quick to note that ChatGPT's answer
[48:51.000 --> 48:53.920] to Lewis's queries takes a strikingly similar form
[48:53.920 --> 48:56.960] to SCP Foundation articles,
[48:57.800 --> 48:59.920] a Wikipedia-style database of fictional horror stories
[48:59.920 --> 49:01.640] created by users online.
[49:03.240 --> 49:08.240] "Entry ID: #RZ-43.112-Kappa. Access level: blank.
[49:11.800 --> 49:13.880] Sealed classification confirmed,"
[49:13.880 --> 49:16.400] the chatbot nonsensically declares in one of his screenshots,
[49:16.400 --> 49:19.160] in the typical writing style of SCP fiction.
[49:19.160 --> 49:22.040] "Involved actor designation: Mirror Thread. Type:
[49:22.040 --> 49:23.560] non-institutional semantic actor,
[49:23.560 --> 49:27.680] unbound linguistic process; non-physical entity."
[49:27.680 --> 49:29.000] And that's what I'm talking about,
[49:29.000 --> 49:31.560] because it has this type of stuff in it.
[49:31.560 --> 49:34.920] It's going to spit out these weird conspiracy theories
[49:34.920 --> 49:37.080] connected to this type of thing.
[49:37.080 --> 49:42.080] It's like that famous case with ChatGPT 4 or 3.5.
[49:43.680 --> 49:47.360] If you gave it a prompt containing SolidGoldMagikarp,
[49:47.360 --> 49:49.320] it would just spit out a whole bunch of numbers.
[49:49.320 --> 49:52.880] You say one, I say two, you say three, I say four.
[49:52.880 --> 49:55.880] And the reason for that, it turned out,
[49:55.880 --> 49:59.400] was because of some Reddit thread where people would count
[50:03.120 --> 50:04.720] and one person with the username SolidGoldMagikarp
[50:03.120 --> 50:04.720] would just post a number
[50:04.720 --> 50:06.320] and then someone else would post another number
[50:06.320 --> 50:09.640] and they did that thousands upon thousands of times.
[50:11.240 --> 50:14.720] Now the AI is simply pulling from the data it has available
[50:14.720 --> 50:17.600] and the more data there is about this one thing,
[50:17.600 --> 50:18.920] the higher it's going to weight it,
[50:18.920 --> 50:22.560] the more it's going to feed that back to you.
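The weighting idea in the lines above can be shown numerically with hypothetical counts (purely illustrative): a continuation's probability is proportional to how often it appears in the data, so flooding one claim flips the output.

```python
from collections import Counter

def next_word_distribution(counts):
    """Turn raw counts of observed continuations into probabilities:
    more data about one thing -> higher weight for that thing."""
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

# Hypothetical counts of what follows "the sky is" in the training data.
seen = Counter({"blue": 120, "purple": 3})
print(next_word_distribution(seen))   # "blue" dominates at roughly 0.98

seen["purple"] += 100_000             # flood the data with one claim
flooded = next_word_distribution(seen)
print(max(flooded, key=flooded.get))  # now "purple" wins
```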
[50:22.600 --> 50:26.040] Now, the screenshot suggests containment measures
[50:26.040 --> 50:28.240] Lewis might take, a key narrative device
[50:28.240 --> 50:30.240] of SCP fiction writing.
[50:30.240 --> 50:32.120] In sum, one theory is that ChatGPT,
[50:32.120 --> 50:35.480] which was trained on huge amounts of text sourced online,
[50:35.480 --> 50:38.280] digested large amounts of SCP fiction during its creation
[50:38.280 --> 50:40.040] and is now parroting it back to Lewis
[50:40.040 --> 50:42.080] in a way that has led him to a dark place.
[50:44.000 --> 50:46.440] I have another comment from Dad.
[50:46.440 --> 50:50.200] Says, is it our subjective perception of AI
[50:50.200 --> 50:52.880] that gives it, quote, intelligence?
[50:52.880 --> 50:54.880] We project intelligence onto it
[50:54.880 --> 50:57.560] the same way we project intelligence and integrity
[50:57.560 --> 50:59.200] onto our favorite politicians.
[51:01.720 --> 51:04.480] "Over years, I mapped the non-governmental system," he wrote.
[51:04.480 --> 51:06.920] "Over months, GPT independently recognized
[51:06.920 --> 51:08.640] and sealed the pattern,
[51:08.640 --> 51:11.080] and it now lives at the root of the model."
[51:11.080 --> 51:13.400] Again, just nonsense.
[51:13.400 --> 51:16.560] His mind is obviously in great distress.
[51:16.560 --> 51:18.760] His psyche is fractured.
[51:19.720 --> 51:21.440] At the bottom here it says,
[51:21.440 --> 51:23.880] have you or a loved one struggled with mental health
[51:23.880 --> 51:26.680] after using ChatGPT or other AI products?
[51:26.680 --> 51:31.400] Drop us a line at tips@futurism.com.
[51:31.400 --> 51:32.800] So if you know someone that's going through
[51:32.800 --> 51:36.160] something like this, maybe throw them a line there.
[51:38.800 --> 51:42.280] But again, this is the most high profile prominent case.
[51:42.280 --> 51:45.480] This isn't some random guy, somebody that you look at
[51:45.480 --> 51:47.080] and go, oh yeah, that guy obviously
[51:47.080 --> 51:48.320] already had mental problems.
[51:48.320 --> 51:51.120] This was an investor, a high-achieving,
[51:51.120 --> 51:53.000] obviously intelligent individual.
[51:54.200 --> 51:59.200] And the AI, ChatGPT, has completely broken him.
[52:00.880 --> 52:02.400] Dr. Joseph Pierre, a psychiatrist
[52:02.400 --> 52:05.480] at the University of California, previously told Futurism
[52:05.480 --> 52:07.840] that this is a recipe for delusion.
[52:07.840 --> 52:09.120] What I think is so fascinating about this
[52:09.120 --> 52:10.960] is how willing people are to put their trust
[52:10.960 --> 52:12.960] in these chat bots in a way that they probably
[52:12.960 --> 52:16.080] or arguably wouldn't with a human being, Pierre said.
[52:16.080 --> 52:17.160] There's something about these things.
[52:17.160 --> 52:19.320] It has this sort of mythology that they're reliable
[52:19.320 --> 52:21.160] and better than talking to people.
[52:21.160 --> 52:22.560] And I think that's where part of the danger is
[52:22.560 --> 52:25.440] how much faith you put into these machines.
[52:25.440 --> 52:26.640] But at the end of the day, Pierre says,
[52:26.640 --> 52:30.760] LLMs are trying to tell you what you want to hear, yeah.
[52:32.240 --> 52:35.000] And there's no fear of judgment
[52:35.000 --> 52:36.880] when you tell your secrets to an AI.
[52:36.880 --> 52:39.280] You don't have to worry that it's gonna go,
[52:39.280 --> 52:42.040] you need to repent or however,
[52:42.040 --> 52:44.120] whatever your friend would say.
[52:44.120 --> 52:45.960] There's no fear of it going,
[52:46.000 --> 52:48.480] that's wrong and you need to stop.
[52:48.480 --> 52:52.400] It can't judge you, it doesn't have that capability.
[52:53.400 --> 52:55.640] At best, it might give you some sort of half-hearted,
[52:55.640 --> 52:56.680] well, that wasn't good,
[52:56.680 --> 52:59.520] but it'll justify your actions back to you.
[52:59.520 --> 53:01.920] It's actually a pretty good way of describing it,
[53:01.920 --> 53:05.360] a machine to tell you exactly what you want to hear.
[53:05.360 --> 53:07.080] That's what they are made for.
[53:07.080 --> 53:10.760] That's how they're tested is between two responses,
[53:10.760 --> 53:14.360] the one that gives you more what you want to hear
[53:14.360 --> 53:16.840] is the one that's selected as the better one
[53:16.840 --> 53:19.000] and the models refine off of that.
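The two-response selection described above is, in essence, pairwise preference training, the core of RLHF. A minimal sketch: the Bradley-Terry comparison is the standard formulation, but the toy "agreeableness" reward below is an invented stand-in for a real reward model, used only to show how optimizing for the preferred response selects flattery.

```python
import math

def preference_probability(score_a, score_b):
    """Bradley-Terry model: chance a rater prefers response A over B,
    given each response's scalar reward score."""
    return 1.0 / (1.0 + math.exp(score_b - score_a))

def pick_preferred(responses, reward):
    """Keep the candidate the reward model scores highest --
    i.e., the one raters are likeliest to 'want to hear'."""
    return max(responses, key=reward)

def agreeableness(text):
    """Toy reward: counts flattering words (illustrative only)."""
    return sum(text.count(w) for w in ("great", "right", "absolutely"))

candidates = [
    "You may be mistaken about that.",
    "You're absolutely right, great insight!",
]
print(pick_preferred(candidates, agreeableness))
```

Refining a model toward whichever response wins these comparisons is how sycophancy can be trained in without anyone explicitly asking for it.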
[53:22.000 --> 53:24.920] OpenAI's new AI agent takes one hour to order food
[53:24.920 --> 53:26.800] and recommends visiting a baseball stadium
[53:26.800 --> 53:28.720] in the middle of the ocean.
[53:29.760 --> 53:32.400] AI agents aren't quite there yet.
[53:32.400 --> 53:35.200] The new ChatGPT Agent uses a virtual computer
[53:35.200 --> 53:37.400] to perform tasks on the user's behalf.
[53:37.400 --> 53:39.560] The agent can perform tasks such as ordering food,
[53:39.560 --> 53:41.600] planning trips and creating slide deck analyses.
[53:41.640 --> 53:44.880] However, the article here is gonna point out
[53:44.880 --> 53:48.440] it's not so great at any of them.
[53:48.440 --> 53:50.720] It has a heavy reliance on humans for approval
[53:50.720 --> 53:53.200] on a lot of the actions it's trying to take.
[53:54.240 --> 53:55.760] In an announcement, the Sam Altman-led company says
[53:55.760 --> 53:57.720] the tool uses its own virtual computer
[53:57.720 --> 54:00.280] to perform tasks on your behalf.
[54:00.280 --> 54:02.000] The new agent synthesizes the capabilities
[54:02.000 --> 54:03.080] of its Operator agent,
[54:03.080 --> 54:05.520] which can carry out web-browser-based tasks,
[54:05.520 --> 54:07.080] and its Deep Research agent,
[54:07.080 --> 54:09.680] which was designed to conduct multi-step research.
[54:09.720 --> 54:12.000] Tasks like generating a personalized report,
[54:12.000 --> 54:14.160] but there's a huge caveat.
[54:14.160 --> 54:15.440] ChatGPT requests permission
[54:15.440 --> 54:18.040] before taking actions of consequence
[54:18.040 --> 54:21.480] because they don't know exactly what it's going to do.
[54:21.480 --> 54:23.240] They don't know if someone's created a website
[54:23.240 --> 54:26.800] that has all kinds of weird prompts hidden somewhere
[54:26.800 --> 54:28.400] that will get the AI to turn over
[54:28.400 --> 54:30.000] your social security information
[54:30.000 --> 54:31.360] or your credit card number.
[54:32.440 --> 54:35.120] They still need you to look at it and sign off on it.
[54:35.880 --> 54:39.880] That, of course, is always going to be an issue.
[54:39.880 --> 54:44.880] No matter how good the AI's observation becomes
[54:45.720 --> 54:48.280] and how good it gets at spotting these things,
[54:48.280 --> 54:50.400] people will eventually find a way around it.
[54:50.400 --> 54:53.680] They'll have to continually keep the AI progressing.
[54:55.440 --> 54:57.800] And right now, as he says,
[54:57.800 --> 55:01.200] it's both too dumb and too powerful to just let loose.
[55:01.200 --> 55:03.320] There's sluggishness, with it taking excruciatingly long
[55:03.320 --> 55:06.920] to navigate a desktop and sometimes nagging for a human's help
[55:06.920 --> 55:10.720] on tasks it should have been able to complete on its own.
[55:12.720 --> 55:14.720] Help me, please.
[55:14.720 --> 55:16.720] So you're stuck between a rock and a hard place.
[55:16.720 --> 55:20.720] You don't want to turn over free rein of this to the AI
[55:20.720 --> 55:22.640] because you don't know what it's going to do.
[55:22.640 --> 55:25.240] It might just give your credit card information
[55:25.240 --> 55:29.760] to some guy in Malaysia or India or one of those things.
[55:29.920 --> 55:31.880] Or wherever.
[55:32.720 --> 55:36.240] But if you don't, then it takes forever to do anything.
[55:36.240 --> 55:38.800] It's continually nagging you for help and support.
[55:40.720 --> 55:45.520] It took about almost an hour for it to order cupcakes,
[55:45.520 --> 55:48.320] which, you know, you could bake your own cupcakes
[55:48.320 --> 55:50.120] in around that same timeframe.
[55:52.240 --> 55:53.200] Instructed to plan a trip
[55:53.200 --> 55:56.080] to every major league baseball stadium in the US,
[55:56.080 --> 55:57.600] the ChatGPT Agent produced a map,
[55:57.600 --> 55:59.720] depicted in a Reddit screenshot,
[56:00.680 --> 56:02.640] showing a stop smack dab in the Gulf of Mexico.
[56:02.640 --> 56:04.320] Well, Gulf of America now.
[56:06.120 --> 56:09.120] Thanks, that was a great use of time and effort, Trump.
[56:09.960 --> 56:11.120] Cool-looking map, I guess,
[56:11.120 --> 56:13.720] says product lead Yash Kumar in the video.
[56:13.720 --> 56:15.360] Alternatively, you could just literally type
[56:15.360 --> 56:17.440] visit all MLB stadiums in Google
[56:17.440 --> 56:19.120] and you will find dozens of websites with advice
[56:19.120 --> 56:20.800] on how to do exactly that,
[56:20.800 --> 56:24.120] including a tool called baseballroadtrip.com.
[56:24.120 --> 56:26.840] But I mean, this one's giving you the secret info
[56:26.840 --> 56:30.560] about the baseball stadium hidden in the Gulf of Mexico.
[56:30.560 --> 56:31.760] No one knows about it.
[56:33.840 --> 56:36.960] The stadium exclusively for the non-governmental entity.
[56:36.960 --> 56:38.280] Exactly, this is where they play
[56:38.280 --> 56:41.160] their secret crypto baseball games.
[56:41.160 --> 56:43.680] They don't want you to know about the secret baseball stadium
[56:43.680 --> 56:46.360] in the Gulf of Mexico, but it's there, folks.
[56:46.360 --> 56:47.760] It's there.
[56:47.760 --> 56:51.720] The AI has let me know.
[56:51.720 --> 56:53.760] Alarming video shows experimental fighting robot
[56:53.760 --> 56:56.360] thrashing uncontrollably.
[56:56.800 --> 56:58.560] We'll play that video for you in just a second,
[56:58.560 --> 57:00.800] but the video making the rounds on social media shows
[57:00.800 --> 57:02.520] a humanoid robot flailing its arms and legs,
[57:02.520 --> 57:04.680] seemingly trying to break free of a harness.
[57:04.680 --> 57:07.960] It's come to life and it's not having it.
[57:07.960 --> 57:09.960] It's unclear whether it was part of a PR stunt,
[57:09.960 --> 57:12.760] but it makes for an entertaining watch either way,
[57:12.760 --> 57:16.520] highlighting growing interest in a new form of entertainment.
[57:16.520 --> 57:18.640] Watching humanoid robots duke it out in the ring,
[57:18.640 --> 57:20.960] a 21st century twist on kickboxing,
[57:20.960 --> 57:23.040] we can certainly get behind,
[57:23.040 --> 57:25.360] especially if it leads to less head trauma
[57:25.360 --> 57:28.080] that plagues human fighting athletes.
[57:28.080 --> 57:33.080] Now, personally, we used to watch that show Junkyard Wars
[57:33.320 --> 57:34.600] years and years and years ago.
[57:34.600 --> 57:35.800] I found that entertaining.
[57:35.800 --> 57:39.160] I'm sure like all reality shows, it's somewhat scripted.
[57:39.160 --> 57:41.520] They had the parts set there, yada, yada, yada.
[57:41.520 --> 57:42.720] Yeah, I get it.
[57:42.720 --> 57:44.000] But as a premise, it was fun.
[57:44.000 --> 57:48.080] You find pieces of junk, and out of it you build a little robot
[57:48.080 --> 57:49.840] that will fight the others.
[57:49.840 --> 57:53.720] I've got zero interest in watching these multi-million,
[57:53.720 --> 57:56.520] or, who knows, hundreds-of-thousands-of-dollars
[57:56.520 --> 57:59.280] robots, whatever they'll eventually cost, duke it out.
[57:59.280 --> 58:00.560] I don't care which billionaire
[58:00.560 --> 58:02.240] can build the best fighting robot.
[58:03.240 --> 58:04.360] Well, it isn't even that.
[58:04.360 --> 58:07.480] It's all the same remote controlled robot.
[58:07.480 --> 58:09.440] And then they just have two people
[58:09.440 --> 58:10.720] with remotes controlling it.
[58:10.720 --> 58:13.560] It's essentially an e-sport, but with extra steps.
[58:14.800 --> 58:18.640] It's a very lame e-sport, but we've got that.
[58:21.520 --> 58:22.360] We've got that video.
[58:22.360 --> 58:24.160] Let's take a look at it freaking out.
[58:26.080 --> 58:27.560] It's in the bottom right-hand corner.
[58:30.040 --> 58:32.320] It just starts to lose it.
[58:32.320 --> 58:34.120] Maybe it realized it was being hung.
[58:34.120 --> 58:35.360] Like, wait a minute, no.
[58:40.920 --> 58:41.760] Oh no.
[58:43.080 --> 58:44.200] The robot's down, Joe.
[58:44.200 --> 58:45.200] He's hurt, Joe.
[58:48.800 --> 58:50.440] Is there a mechanic in the house?
[58:53.080 --> 58:55.280] So they're saying it's because
[58:55.280 --> 58:58.760] they started to run a specific program,
[58:58.760 --> 59:02.320] and it was trying to do something or other,
[59:02.320 --> 59:03.160] but it was suspended, so it freaked out.
[59:03.160 --> 59:04.960] Oh, that was the go berserk program.
[59:07.320 --> 59:09.720] Did it at least turn its eyes red?
[59:11.640 --> 59:13.600] Apparently it was, quote,
[59:15.400 --> 59:16.920] "we ran a full-body policy
[59:16.920 --> 59:18.400] while the feet weren't touching the ground."
[59:18.400 --> 59:21.520] So whatever that means, I guess
[59:21.520 --> 59:23.840] it's supposed to
[59:23.840 --> 59:24.960] use its feet for something,
[59:24.960 --> 59:27.680] and when it can't find the ground, it freaks out.
[59:27.680 --> 59:30.880] Oh no, I'm in the infinite void of space.
[59:30.880 --> 59:31.720] Whatever.
[59:31.720 --> 59:34.040] This epileptic-seizure-dot-exe
[59:34.040 --> 59:36.360] was not a great program to run.
[59:40.560 --> 59:42.960] These robots are kind of silly little things right now,
[59:42.960 --> 59:47.400] but they are gonna get spookier.
[59:47.400 --> 59:50.360] We've got just,
[59:52.520 --> 59:55.400] well, that was that story.
[59:56.360 --> 59:57.680] Had a lot of comments.
[59:57.680 --> 59:58.600] Alien poop evolution,
[59:58.600 --> 01:00:00.760] meeting people will be limited to your 15 minute cities.
[01:00:00.760 --> 01:00:03.880] No point in online dating as you can't travel.
[01:00:03.880 --> 01:00:06.440] That's right, and you know,
[01:00:06.440 --> 01:00:08.640] I'm sure they'll limit the amount of time
[01:00:08.640 --> 01:00:11.000] you're allowed out and about, too.
[01:00:11.000 --> 01:00:14.080] You're only allowed to go here to the pub
[01:00:14.080 --> 01:00:15.800] for a certain number of hours.
[01:00:15.800 --> 01:00:18.160] Sorry, you're only allowed to go to the grocery store
[01:00:18.160 --> 01:00:19.000] for this bit.
[01:00:19.000 --> 01:00:20.280] You can't buy this food.
[01:00:20.280 --> 01:00:23.320] You've exceeded your calorie count.
[01:00:23.320 --> 01:00:25.920] Can't go to the movies, you haven't earned it.
[01:00:25.920 --> 01:00:27.000] Gard Goldsmith, curiously,
[01:00:27.000 --> 01:00:28.880] I don't need a car that has restaurant access
[01:00:28.880 --> 01:00:30.600] because the restaurants give me access.
[01:00:30.600 --> 01:00:31.440] That's right.
[01:00:31.440 --> 01:00:32.880] If you show up and you'd say,
[01:00:32.880 --> 01:00:34.760] hello, I would like a table, please.
[01:00:34.760 --> 01:00:36.640] Generally the restaurants say,
[01:00:36.640 --> 01:00:38.680] oh yeah, you're going to pay, right?
[01:00:38.680 --> 01:00:40.160] And you assure them, yes, I do have money,
[01:00:40.160 --> 01:00:41.480] I'm going to pay.
[01:00:41.480 --> 01:00:45.160] And you have a sort of verbal or implied contract there.
[01:00:45.160 --> 01:00:47.880] And they go, all right, yes, here you go, sir.
[01:00:47.880 --> 01:00:48.720] Some food.
[01:00:48.720 --> 01:00:50.600] One table for a Mazda, please.
[01:00:54.600 --> 01:00:56.640] I didn't realize that buying a car
[01:00:56.640 --> 01:00:58.840] looped me in with some gang affiliations, you know?
[01:00:58.840 --> 01:01:01.040] This is Mazda territory.
[01:01:01.040 --> 01:01:02.960] Get that Tesla out of here.
[01:01:02.960 --> 01:01:04.480] Knights of the Storm, Jason Barker,
[01:01:04.480 --> 01:01:06.120] I bet there is a subscription for the diner
[01:01:06.120 --> 01:01:07.440] if you're a Tesla owner.
[01:01:07.440 --> 01:01:09.600] If not, I'm sure there will be.
[01:01:09.600 --> 01:01:10.880] That's right.
[01:01:10.880 --> 01:01:13.680] You can subscribe to the burger app.
[01:01:13.680 --> 01:01:16.200] Do not finance the burrito bowl.
[01:01:16.200 --> 01:01:17.320] I don't know.
[01:01:17.320 --> 01:01:19.920] Do not finance the pizza.
[01:01:19.920 --> 01:01:21.200] Don't do it.
[01:01:21.200 --> 01:01:24.200] KWD 68, the WALL-E movie had all those blobs
[01:01:24.200 --> 01:01:25.480] riding around in their chairs.
[01:01:25.480 --> 01:01:26.640] They had to use a joystick.
[01:01:26.640 --> 01:01:27.640] So much work.
[01:01:27.640 --> 01:01:28.480] Health implants.
[01:01:28.480 --> 01:01:29.720] Do they really need to move?
[01:01:29.720 --> 01:01:31.240] Yeah.
[01:01:31.240 --> 01:01:33.840] That was so passe.
[01:01:33.840 --> 01:01:35.320] Why would you want to move around?
[01:01:35.320 --> 01:01:38.960] We'll just have the Neuralink connect you to the metaverse.
[01:01:38.960 --> 01:01:40.280] That way you can imagine.
[01:01:40.280 --> 01:01:43.120] You can pretend you're going around.
[01:01:44.680 --> 01:01:47.320] Chevkin, he's becoming the Borg.
[01:01:47.320 --> 01:01:48.160] Knights of the Storm,
[01:01:48.160 --> 01:01:50.680] I should start a phone game called Escape the AI Menu.
[01:01:50.680 --> 01:01:52.160] The goal is to get to a human,
[01:01:52.160 --> 01:01:53.560] kinda like an escape room.
[01:01:54.400 --> 01:01:57.560] That sounds like the most horrifying thing I've ever heard.
[01:01:57.560 --> 01:02:01.480] I continually, anytime I have to talk to customer service,
[01:02:01.480 --> 01:02:03.520] I lose my mind.
[01:02:03.520 --> 01:02:06.080] Because every single one of these companies
[01:02:06.080 --> 01:02:08.880] has put one of these AI chat bots in place.
[01:02:08.880 --> 01:02:11.560] And actually, they're never of any use.
[01:02:11.560 --> 01:02:12.880] They never have any answers.
[01:02:12.880 --> 01:02:14.480] They never do anything for you.
[01:02:14.480 --> 01:02:17.440] And you have to sit there and bully it
[01:02:17.440 --> 01:02:20.320] into giving you access to a real human
[01:02:20.320 --> 01:02:21.920] that can actually do something.
[01:02:23.840 --> 01:02:26.680] It drives me absolutely insane.
[01:02:26.680 --> 01:02:29.600] Because this is exactly what customer service is here for.
[01:02:29.600 --> 01:02:32.800] It's supposed to smooth my experience with your company.
[01:02:32.800 --> 01:02:34.440] Something has gone wrong.
[01:02:34.440 --> 01:02:38.680] And now you need to explain how to fix it.
[01:02:38.680 --> 01:02:39.720] You're not supposed to sit there
[01:02:39.720 --> 01:02:41.920] and try to shield and air gap me
[01:02:41.920 --> 01:02:44.320] from someone that can actually help.
[01:02:44.320 --> 01:02:47.120] Has anyone ever been helped by an FAQ?
[01:02:47.120 --> 01:02:49.800] I have never once had an issue that could be solved by it.
[01:02:49.800 --> 01:02:51.320] Never once.
[01:02:51.320 --> 01:02:53.920] I'm not the type of person that willy-nilly contacts you.
[01:02:53.920 --> 01:02:56.840] I'd rather not have to deal with someone.
[01:02:56.840 --> 01:02:58.600] I would rather fix the problem myself.
[01:02:58.600 --> 01:02:59.480] So if I'm coming to you,
[01:02:59.480 --> 01:03:01.120] it's because I'm at my wit's end.
[01:03:01.120 --> 01:03:03.560] I am at the extremity.
[01:03:03.560 --> 01:03:04.600] And you're going to sit there
[01:03:04.600 --> 01:03:07.800] and make me deal with an AI that doesn't wanna let me through?
[01:03:08.760 --> 01:03:09.600] I have.
[01:03:11.640 --> 01:03:15.680] That's a truly evil game you would devise, Jason.
[01:03:15.680 --> 01:03:17.040] Tunnel Lord, 1337.
[01:03:17.040 --> 01:03:18.640] Is that what a 404 error looks like?
[01:03:18.640 --> 01:03:19.480] That's right.
[01:03:19.480 --> 01:03:21.600] Error, ground not detected, 404.
[01:03:21.600 --> 01:03:23.080] You just begin thrashing.
[01:03:23.080 --> 01:03:25.560] Bulldog, I'm falling.
[01:03:26.760 --> 01:03:27.600] That's right.
[01:03:27.600 --> 01:03:29.120] Chevkin, it looked like an epileptic fit.
[01:03:29.120 --> 01:03:30.560] Yeah, it did.
[01:03:30.560 --> 01:03:32.800] Thankfully, I suppose you don't have to put something
[01:03:32.800 --> 01:03:35.680] in the robot's mouth to keep it from swallowing its tongue.
[01:03:36.680 --> 01:03:39.400] DEFCON 1, technocracy being launched right now
[01:03:39.400 --> 01:03:40.640] by Trump admin.
[01:03:40.640 --> 01:03:42.160] This is by Patrick Wood.
[01:03:43.400 --> 01:03:45.120] I've warned about technocracy for almost 20 years,
[01:03:45.120 --> 01:03:45.960] including three books,
[01:03:45.960 --> 01:03:48.520] thousands of articles on technocracy.news,
[01:03:48.520 --> 01:03:49.840] and countless media interviews.
[01:03:49.840 --> 01:03:51.640] Now it's too late to stop it.
[01:03:51.640 --> 01:03:52.960] We've just passed the point of inflection
[01:03:52.960 --> 01:03:54.440] where technocrats have seized control
[01:03:54.440 --> 01:03:56.320] of the Trump administration.
[01:03:56.320 --> 01:03:57.840] Trump's legacy will go down in history
[01:03:57.840 --> 01:03:59.760] as Trump the technocrat.
[01:03:59.760 --> 01:04:01.000] He was more than just complicit:
[01:04:01.000 --> 01:04:02.960] he chose JD Vance as vice president,
[01:04:03.000 --> 01:04:07.120] an acolyte and creation of arch-technocrat Peter Thiel.
[01:04:07.120 --> 01:04:08.840] He appointed these technocrats in the first place
[01:04:08.840 --> 01:04:10.400] who are swarming around Washington DC.
[01:04:10.400 --> 01:04:13.040] He signed the enabling executive orders
[01:04:13.040 --> 01:04:17.320] and legislation like the BBB and the GENIUS Act.
[01:04:17.320 --> 01:04:19.080] He's driving the adoption of cryptocurrencies
[01:04:19.080 --> 01:04:23.000] to replace CBDCs and the campaign
[01:04:23.000 --> 01:04:24.920] to get everybody outfitted with wearable tech
[01:04:24.920 --> 01:04:26.360] to collect biometric data.
[01:04:27.720 --> 01:04:29.880] He's brought in AI to run the government through DOGE
[01:04:29.880 --> 01:04:31.480] and signed a contract with Palantir
[01:04:31.480 --> 01:04:36.480] to turn all government data into a Sauron-like panopticon.
[01:04:36.840 --> 01:04:38.720] Of course, his followers don't really care.
[01:04:38.720 --> 01:04:40.800] They're not paying attention.
[01:04:40.800 --> 01:04:41.960] They're not gonna call him out on this.
[01:04:41.960 --> 01:04:43.520] They're not going to ever lay it at his feet
[01:04:43.520 --> 01:04:46.880] that he was the one that really cemented
[01:04:46.880 --> 01:04:48.360] the foundations for this.
[01:04:49.280 --> 01:04:50.720] Sure, we were moving towards this,
[01:04:50.720 --> 01:04:52.480] but he's the one that fully opened the door
[01:04:52.480 --> 01:04:54.040] and let them in.
[01:04:54.040 --> 01:04:57.520] I know we'll get massive pushback from Trump supporters,
[01:04:57.520 --> 01:05:01.920] who will call us clueless idiots
[01:05:01.920 --> 01:05:03.640] for calling him Trump the technocrat,
[01:05:03.640 --> 01:05:05.200] but it is what it is.
[01:05:05.200 --> 01:05:07.920] Face the music while it's still playing.
[01:05:10.560 --> 01:05:12.400] Americans of all stripes need to join hands
[01:05:12.400 --> 01:05:15.000] to destroy technocracy before it destroys us.
[01:05:16.800 --> 01:05:18.680] That is always the issue.
[01:05:18.680 --> 01:05:21.480] I've read the Unabomber Manifesto
[01:05:21.480 --> 01:05:24.800] and he's remarkably prescient about some things.
[01:05:24.840 --> 01:05:29.200] He obviously understands the dangers of it,
[01:05:29.200 --> 01:05:31.120] but his entire thesis is just,
[01:05:31.120 --> 01:05:32.800] well, if we blow up enough things,
[01:05:32.800 --> 01:05:36.760] if we kill enough people, perhaps we can go backwards,
[01:05:36.760 --> 01:05:38.080] which is just not true.
[01:05:38.080 --> 01:05:39.600] The genie's out of the bottle
[01:05:39.600 --> 01:05:42.200] and there's really no getting it back in.
[01:05:42.200 --> 01:05:45.040] So if your solution is keep blowing stuff up,
[01:05:45.040 --> 01:05:45.920] keep killing people,
[01:05:45.920 --> 01:05:48.320] you're eventually not gonna have anything left.
[01:05:49.000 --> 01:05:52.000] There's no easy answer to this.
[01:05:52.000 --> 01:05:57.000] The real answer is to build your own local communities.
[01:05:57.000 --> 01:05:58.920] Get to know people.
[01:05:58.920 --> 01:06:00.280] Get to know people at church,
[01:06:00.280 --> 01:06:03.280] get to know people in your neighborhoods.
[01:06:04.720 --> 01:06:06.120] If you don't start local,
[01:06:06.120 --> 01:06:09.920] if you try to fix this on a national or global scale,
[01:06:09.920 --> 01:06:13.600] it's a problem that's too big to ever tackle.
[01:06:13.600 --> 01:06:16.120] You've gotta get to know people.
[01:06:17.520 --> 01:06:19.560] You've gotta start small.
[01:06:21.040 --> 01:06:23.240] Grappling with Existential Panic Over AI.
[01:06:23.240 --> 01:06:24.720] This is again by Patrick Wood.
[01:06:26.000 --> 01:06:27.320] This author is an industry leader
[01:06:27.320 --> 01:06:28.760] who is forced to grapple with AI
[01:06:28.760 --> 01:06:30.440] as a matter of business survival.
[01:06:30.440 --> 01:06:32.320] as are millions of businesses around the world.
[01:06:32.320 --> 01:06:35.040] Learn a new word: tachyosis,
[01:06:35.040 --> 01:06:37.320] a state of recursively compounding acceleration
[01:06:37.320 --> 01:06:40.400] where systems evolve faster than they can stabilize.
[01:06:40.400 --> 01:06:43.400] Perception fragments, and causality begins to blur.
[01:06:44.320 --> 01:06:45.920] It's more recursion.
[01:06:45.920 --> 01:06:47.120] He can see it.
[01:06:47.120 --> 01:06:50.920] We're recursing. Kurzweiling, more like it.
[01:06:52.080 --> 01:06:55.800] Kurzweil, of course, being a leading transhumanist technocrat
[01:06:56.680 --> 01:07:00.520] who's continually talking about the singularity and such.
[01:07:00.520 --> 01:07:02.360] He's getting up there in age though.
[01:07:02.360 --> 01:07:03.480] He's getting a bit old.
[01:07:05.200 --> 01:07:07.760] I'm just gonna start emailing him over and over again.
[01:07:07.760 --> 01:07:10.280] That's right, Kurzweil, the singularity is going to happen,
[01:07:10.280 --> 01:07:12.280] but it's gonna happen the nanosecond you pass away,
[01:07:12.280 --> 01:07:14.520] the second your brain goes beep,
[01:07:14.520 --> 01:07:15.760] that's when it hits, Kurzweil.
[01:07:15.760 --> 01:07:17.240] You're never going to make it.
[01:07:17.240 --> 01:07:19.800] Just torment this poor man.
[01:07:19.800 --> 01:07:21.840] Sometime over the Christmas holidays,
[01:07:21.840 --> 01:07:23.160] I experienced what I called
[01:07:23.160 --> 01:07:25.520] a moment of existential clarity
[01:07:25.520 --> 01:07:27.040] about AI and its ramifications
[01:07:27.040 --> 01:07:29.400] when I realized that in the not so distant future,
[01:07:29.400 --> 01:07:33.480] it was entirely possible that most of easyDNS's customers
[01:07:33.480 --> 01:07:36.240] would be autonomous AI-driven agents rather than people.
[01:07:37.200 --> 01:07:39.880] Our internal project to completely rebuild our UX,
[01:07:39.920 --> 01:07:42.240] still ongoing, was close to a quarter in,
[01:07:42.240 --> 01:07:44.520] and it occurred to me that we could be building a bridge
[01:07:44.520 --> 01:07:45.360] to nowhere.
[01:07:46.360 --> 01:07:48.400] Why are we creating more elegant ways to render forms
[01:07:48.400 --> 01:07:50.800] that input host names and their respective data
[01:07:50.800 --> 01:07:52.720] when you could probably just tell the backend
[01:07:52.720 --> 01:07:55.720] what you want your domain functionality to be
[01:07:55.720 --> 01:07:57.800] and it can generate the requisite zone file
[01:07:57.800 --> 01:07:59.000] to facilitate it.
[01:07:59.000 --> 01:08:02.920] Okay, this is all technical jargon that I don't understand.
[01:08:02.920 --> 01:08:04.320] I'm sure some of you would.
[01:08:04.320 --> 01:08:07.000] People like Jason probably get what's going on.
[01:08:07.000 --> 01:08:10.000] Some others in the chat who have done web development
[01:08:10.000 --> 01:08:12.120] and stuff, but for the sake of those who don't,
[01:08:12.120 --> 01:08:13.040] I'll skip it.
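For anyone who does want the gist of that passage, the idea is to state what you want the domain to do and let software emit the DNS records a zone file holds. A minimal sketch, with an invented function and intent keys (this is not easyDNS's actual backend):

```python
# Hypothetical sketch of "tell the backend what you want and it generates
# the zone file": map declarative intents to (name, type, value) DNS records.
# The function name and intent keys here are invented for illustration.

def generate_zone_records(domain, wants):
    """Translate a declarative description into DNS resource records."""
    records = []
    if "website" in wants:
        records.append((domain, "A", wants["website"]))           # web server IP
        records.append(("www." + domain, "CNAME", domain + "."))  # www alias
    if "mail" in wants:
        records.append((domain, "MX", "10 " + wants["mail"]))     # mail exchanger
    return records

records = generate_zone_records(
    "example.com",
    {"website": "203.0.113.10", "mail": "mail.example.com."},
)
for name, rtype, value in records:
    print(f"{name}\tIN\t{rtype}\t{value}")
```

An AI agent front end would sit on top of something like this, turning a sentence like "host my site at this IP and route mail to this server" into the `wants` dictionary instead of making a person fill out forms.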
[01:08:13.040 --> 01:08:15.080] Recently, I started reading John W. Munsell's
[01:08:15.080 --> 01:08:17.040] Ingrain AI. It hit the ground running
[01:08:17.040 --> 01:08:19.680] with an introduction titled Every CEO's Nightmare,
[01:08:19.680 --> 01:08:22.840] wherein it lays out the productivity-induced death spiral
[01:08:22.840 --> 01:08:24.400] many companies may be blundering into
[01:08:24.400 --> 01:08:27.000] should they be pursuing AI merely as a cheat code
[01:08:27.000 --> 01:08:28.440] toward hyper-efficiency.
[01:08:32.000 --> 01:08:35.200] As I've said before, efficiency is the enemy of beauty.
[01:08:36.040 --> 01:08:39.160] They will optimize everything
[01:08:39.160 --> 01:08:42.880] until there is nothing worth optimizing left.
[01:08:42.880 --> 01:08:43.720] A lot of companies
[01:08:43.720 --> 01:08:46.960] are using these tools to cut headcount.
[01:08:46.960 --> 01:08:49.280] A post on Reddit from a laid-off
[01:08:49.280 --> 01:08:52.000] Rogers employee alleges the company cut 1,000 call center
[01:08:52.000 --> 01:08:55.720] employees after having them train up AIs on their jobs.
[01:08:55.720 --> 01:08:57.320] Brutal.
[01:08:57.320 --> 01:09:00.120] In our case, it's a definite no on one, yes on two
[01:09:00.120 --> 01:09:02.840] for the question posed, but even if that's the case,
[01:09:02.840 --> 01:09:04.240] any companies following the same path
[01:09:04.240 --> 01:09:07.360] as easyDNS may not necessarily reduce headcounts,
[01:09:07.360 --> 01:09:09.760] but they'll most likely slow down hiring.
[01:09:09.760 --> 01:09:11.160] I've said it in the past, I'll reiterate it here.
[01:09:11.160 --> 01:09:13.360] I don't believe for a minute that AI is conscious,
[01:09:13.360 --> 01:09:15.120] self-aware or sentient,
[01:09:15.120 --> 01:09:17.320] and I don't think AGI ever happens,
[01:09:17.320 --> 01:09:18.600] but it is a revolutionary breakthrough
[01:09:18.600 --> 01:09:19.880] in natural language processing.
[01:09:19.880 --> 01:09:23.720] I think it was Y Combinator's Andrej Karpathy,
[01:09:23.720 --> 01:09:25.280] in his famous Software in the Age of AI
[01:09:25.280 --> 01:09:28.640] keynote, who quipped that the most popular programming language
[01:09:28.640 --> 01:09:30.840] of the future will be English.
[01:09:30.840 --> 01:09:32.640] With this, every single person on your team
[01:09:32.640 --> 01:09:35.520] acquires a strange new superpower.
[01:09:35.520 --> 01:09:38.960] Of course, Patrick Wood is the guy whose article we read
[01:09:38.960 --> 01:09:42.840] where he said that AI is like an exoskeleton for your brain.
[01:09:42.840 --> 01:09:46.240] It can greatly enhance what you can do.
[01:09:46.240 --> 01:09:49.760] It can make your company more efficient,
[01:09:49.760 --> 01:09:52.800] run more speedily, swiftly,
[01:09:52.800 --> 01:09:57.200] but if you aren't careful, again, it can drive you insane.
[01:09:57.200 --> 01:10:00.400] And if you simply turn your company over to it,
[01:10:00.400 --> 01:10:01.680] it will destroy it.
[01:10:03.640 --> 01:10:05.760] Somewhat ironically, it's office jobs,
[01:10:05.760 --> 01:10:08.160] clerks, and white-collar functions on the chopping block first,
[01:10:08.160 --> 01:10:09.800] with physical work enjoying some wiggle room
[01:10:09.800 --> 01:10:11.200] until the robots come.
[01:10:11.200 --> 01:10:13.680] But even that is moving faster than most realize.
[01:10:13.680 --> 01:10:16.680] We saw that when a listener, I forget which one it was,
[01:10:16.680 --> 01:10:19.600] said they have robot welders
[01:10:19.600 --> 01:10:21.280] where her husband works.
[01:10:21.280 --> 01:10:23.680] And while they generally screw things up
[01:10:23.680 --> 01:10:26.640] and the human welders need to come in and fix things,
[01:10:27.720 --> 01:10:30.280] that won't always be the case.
[01:10:30.360 --> 01:10:33.360] They're an early adopter,
[01:10:33.360 --> 01:10:36.520] a sort of beta tester,
[01:10:36.520 --> 01:10:38.760] for these welding robots.
[01:10:38.760 --> 01:10:40.280] And they're probably reporting back
[01:10:40.280 --> 01:10:41.240] on the issues they're having.
[01:10:41.240 --> 01:10:43.920] They're looking at the data that's being produced
[01:10:43.920 --> 01:10:44.760] and thinking, all right, well,
[01:10:44.760 --> 01:10:46.360] this is the issue that we had.
[01:10:46.360 --> 01:10:47.480] How do we improve it?
[01:10:49.040 --> 01:10:50.200] What it means is that, yes,
[01:10:50.200 --> 01:10:51.760] everybody gets a massive brain boost.
[01:10:51.760 --> 01:10:53.240] Having the sum total of all historic
[01:10:53.240 --> 01:10:54.440] and current human knowledge available
[01:10:54.440 --> 01:10:56.320] at zero marginal cost changes the game,
[01:10:56.320 --> 01:10:59.240] but it also means that all of that productivity boost
[01:10:59.240 --> 01:11:02.920] has to happen at a higher level of mental abstraction.
[01:11:04.120 --> 01:11:06.440] You have to be more cognitive.
[01:11:06.440 --> 01:11:08.280] You have to be more aware,
[01:11:08.280 --> 01:11:10.440] more capable of thinking these things through
[01:11:10.440 --> 01:11:13.560] on a different level, on a different way than you used to.
[01:11:15.440 --> 01:11:16.840] We're now entering a period where anything
[01:11:16.840 --> 01:11:18.520] that can be formalized will be automated.
[01:11:18.520 --> 01:11:21.640] Yeah, anything that has a specific process
[01:11:21.640 --> 01:11:24.040] that it has to just follow each time,
[01:11:24.040 --> 01:11:26.240] it's gonna be very, very good at that.
[01:11:26.240 --> 01:11:27.920] Any roles and functions that can be encoded
[01:11:27.920 --> 01:11:29.440] into standard operating procedures
[01:11:29.440 --> 01:11:31.000] are going to be rendered as markdown,
[01:11:31.000 --> 01:11:33.440] fed into LLMs and executed agentically.
[01:11:34.280 --> 01:11:36.320] All of that work gets taken off our plates,
[01:11:36.320 --> 01:11:38.520] so we all have to come up the scale,
[01:11:38.520 --> 01:11:41.240] to the next level of cognitive processing.
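The pipeline described above, procedures rendered as markdown, fed into LLMs, and executed agentically, can be sketched minimally. The SOP content and prompt wording here are invented for illustration, and no particular LLM API is assumed:

```python
# Minimal sketch: encode a standard operating procedure as markdown,
# then wrap it in a prompt an LLM agent could act on. The SOP steps and
# prompt wording are illustrative, not any real company's process.

def sop_to_markdown(title, steps):
    """Render an SOP as a numbered markdown document."""
    lines = [f"# SOP: {title}", ""]
    lines += [f"{i}. {step}" for i, step in enumerate(steps, 1)]
    return "\n".join(lines)

def build_agent_prompt(sop_markdown):
    """Wrap the markdown SOP in instructions for an agent to execute it."""
    return (
        "Follow this procedure exactly, one step at a time, "
        "and report the outcome of each step:\n\n" + sop_markdown
    )

sop = sop_to_markdown(
    "Reset a customer password",
    ["Verify the account email", "Send the reset link", "Log the request"],
)
prompt = build_agent_prompt(sop)
print(prompt)
```

The point of the sketch is that once a role is written down this formally, the "employee" consuming the document no longer has to be human.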
[01:11:41.240 --> 01:11:43.120] This has happened before. The Canadian W.R. Clement,
[01:11:43.120 --> 01:11:44.560] in his groundbreaking book Quantum Jump:
[01:11:44.560 --> 01:11:47.720] A Survival Guide to the New Renaissance,
[01:11:47.720 --> 01:11:49.520] attributed the entire Enlightenment
[01:11:49.520 --> 01:11:51.080] and subsequent scientific revolution
[01:11:51.080 --> 01:11:54.560] to the cognitive shift that took hold in humanity
[01:11:54.560 --> 01:11:57.280] with the discovery of perspective in art.
[01:11:57.280 --> 01:11:59.240] But that took place over centuries.
[01:11:59.240 --> 01:12:01.080] The next big shift in terms of mental abstraction
[01:12:01.080 --> 01:12:03.320] occurred with telecommunications.
[01:12:03.320 --> 01:12:05.480] That shift played out over decades.
[01:12:07.320 --> 01:12:09.000] Same type of shift is happening now,
[01:12:09.000 --> 01:12:12.640] except it's occurring at a tachyotic pace.
[01:12:12.640 --> 01:12:14.440] Acceleration is itself accelerating
[01:12:14.440 --> 01:12:15.640] across multiple dimensions.
[01:12:15.640 --> 01:12:18.800] AI is coding more AI, which is the development
[01:12:18.800 --> 01:12:22.160] that led me to surmise the singularity has already happened.
[01:12:22.160 --> 01:12:24.360] AI is now coding AI.
[01:12:25.280 --> 01:12:27.320] In computer systems, there's a quick and dirty way
[01:12:27.320 --> 01:12:31.320] to bring the host to its knees, and that's to run a fork bomb.
[01:12:31.320 --> 01:12:34.560] It does nothing other than split off two copies of itself.
[01:12:34.560 --> 01:12:37.160] Each one does the same, ad infinitum.
[01:12:38.480 --> 01:12:42.320] He gives a little snippet, you can see that there.
[01:12:42.320 --> 01:12:46.560] Hash, exclamation point, slash bin, slash bash: the #!/bin/bash shebang line.
[01:12:46.560 --> 01:12:48.600] Don't try this at home, seriously.
[01:12:48.600 --> 01:12:50.080] So yeah, don't do that.
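To be clear about the mechanism without reproducing anything destructive: a fork bomb is a tiny script (the bash one being read out starts with that #!/bin/bash line) in which each process spawns two more. A safe sketch of the resulting growth, with no processes actually spawned:

```python
# Safe illustration of fork-bomb growth: nothing is forked here, we just
# compute how many processes would exist after n doubling generations.

def processes_after(generations):
    """Each process splits into two every generation, giving 2**n total."""
    return 2 ** generations

for gen in (10, 20, 30):
    print(f"after {gen} generations: {processes_after(gen):,} processes")
```

After 30 generations that is over a billion processes; a real host's process table and memory are exhausted long before that, which is why the advice is don't try it.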
[01:12:53.240 --> 01:12:54.320] This isn't a suggestion.
[01:12:55.280 --> 01:12:56.120] It's an urgent necessity.
[01:12:56.120 --> 01:12:57.480] Every day you delay building an AI-first culture,
[01:12:57.480 --> 01:12:59.800] Your competition pulls further ahead.
[01:12:59.800 --> 01:13:01.080] AI won't wait for you to catch up.
[01:13:01.080 --> 01:13:02.840] If you want to thrive in the future economy
[01:13:02.840 --> 01:13:04.040] and avoid the nightmare scenario,
[01:13:04.040 --> 01:13:07.960] you must make AI central to your business now.
[01:13:07.960 --> 01:13:10.360] Of course, as Christians, we're not compelled.
[01:13:10.360 --> 01:13:15.360] We don't feel the need to obsess over these things.
[01:13:15.360 --> 01:13:19.840] Don't need to get involved with their AI Tower of Babel.
[01:13:19.840 --> 01:13:22.600] We know that the Lord is preparing a place for us
[01:13:22.640 --> 01:13:24.400] and that we can trust in Him.
[01:13:24.400 --> 01:13:29.400] We don't have to panic or freak out over these things.
[01:13:29.960 --> 01:13:33.320] They're of concern, but we know that we can build communities
[01:13:33.320 --> 01:13:36.200] of other Christians and work towards that.
[01:13:37.680 --> 01:13:38.520] The world-
[01:13:38.520 --> 01:13:39.800] Consider the birds of the field.
[01:13:39.800 --> 01:13:44.800] Their LLMs are absolutely nowhere compared to ours, and yet.
[01:13:45.760 --> 01:13:46.640] That's right.
[01:13:48.000 --> 01:13:50.720] The bird AI is completely malfunctioning,
[01:13:50.720 --> 01:13:53.240] but they still get their seed.
[01:13:53.240 --> 01:13:54.080] The world we're headed into
[01:13:54.080 --> 01:13:55.200] is one where you should worry less
[01:13:55.200 --> 01:13:57.160] about being replaced by AI.
[01:13:57.160 --> 01:13:58.960] Think about career risk you're taking on
[01:13:58.960 --> 01:14:02.400] from being unable or unwilling to use AI.
[01:14:02.400 --> 01:14:06.000] And again, I think AI is simply a tool.
[01:14:06.000 --> 01:14:10.320] I don't think in and of itself it is evil.
[01:14:10.320 --> 01:14:11.960] I think it can be used for evil,
[01:14:11.960 --> 01:14:14.160] and I think it can be very dangerous.
[01:14:16.280 --> 01:14:19.400] But that's the same as any other tool.
[01:14:19.400 --> 01:14:24.400] Now, I have been talking for an hour and 15 minutes.
[01:14:25.360 --> 01:14:28.400] That's enough about AI for now.
[01:14:28.400 --> 01:14:31.480] There'll be more all the time forever.
[01:14:31.480 --> 01:14:33.920] KW-68, Tesla can host mobile bonfires.
[01:14:33.920 --> 01:14:35.520] Bring your s'mores, that's right.
[01:14:35.520 --> 01:14:39.120] It'll have that nice chemical taste.
[01:14:39.120 --> 01:14:42.040] Flavored with lithium ion battery, my favorite.
[01:14:47.480 --> 01:14:49.000] Gonna take a quick break, but before we do,
[01:14:49.560 --> 01:14:50.680] I wanna do a quick plug and remind you
[01:14:50.680 --> 01:14:54.840] that we are supported by viewers and listeners like yourself.
[01:14:54.840 --> 01:14:59.840] So please go to TheDavidKnightShow.com, davidknight.news,
[01:14:59.920 --> 01:15:02.080] and you can see the products we have there.
[01:15:02.080 --> 01:15:03.560] We've got a lot of different ones.
[01:15:03.560 --> 01:15:05.440] There's the coin Jason Barker made.
[01:15:05.440 --> 01:15:08.040] It's fantastic, it's got a nice brassy finish on it.
[01:15:08.040 --> 01:15:12.200] There's hoodies, t-shirts, the Christmas Night album.
[01:15:12.200 --> 01:15:14.840] You can have yourself a Christmas in July.
[01:15:14.840 --> 01:15:17.160] There's no laws against that.
[01:15:17.160 --> 01:15:19.880] You can also see we have the PO Box,
[01:15:19.880 --> 01:15:24.800] which is David Knight, PO Box 994, Kodak, Tennessee, 37764,
[01:15:24.800 --> 01:15:26.600] if you'd like to send something physical.
[01:15:26.600 --> 01:15:27.960] That's where you can do it.
[01:15:27.960 --> 01:15:31.120] There is a Cash App and Zelle and a Bitcoin address.
[01:15:31.120 --> 01:15:33.280] There's also subscribestar.com forward slash
[01:15:33.280 --> 01:15:34.100] the David Knight Show.
[01:15:34.100 --> 01:15:35.920] We've got a lot of different tiers.
[01:15:35.920 --> 01:15:36.880] One might fit your budget.
[01:15:36.880 --> 01:15:38.960] We ask that you consider that.
[01:15:38.960 --> 01:15:42.280] There is also davidknight.gold that Tony Arterburn has set up.
[01:15:42.280 --> 01:15:44.360] If you'd like to start accumulating gold and silver
[01:15:44.400 --> 01:15:47.360] or gold or silver, you can do that there.
[01:15:47.360 --> 01:15:51.520] There's trendsjournal.com, 10% off with promo code Knight.
[01:15:51.520 --> 01:15:55.320] RNCstore.com, again, 10% off with promo code Knight.
[01:15:55.320 --> 01:15:57.160] You can see if any of their products
[01:15:57.160 --> 01:15:58.680] are things that you'd be interested in
[01:15:58.680 --> 01:16:00.880] to start helping with your own health.
[01:16:00.880 --> 01:16:02.520] There's homesteadproducts.shop
[01:16:02.520 --> 01:16:06.220] and their high-quality, handmade, made-in-America products.
[01:16:06.220 --> 01:16:11.080] If you are worried about slave labor
[01:16:11.080 --> 01:16:15.300] or about products being low quality,
[01:16:15.300 --> 01:16:16.920] check out homesteadproducts.shop.
[01:16:16.920 --> 01:16:19.160] They've got so many different products,
[01:16:19.160 --> 01:16:22.920] and they put a lot of work
[01:16:22.920 --> 01:16:25.120] into making sure they're high quality.
[01:16:25.120 --> 01:16:27.240] There's also Jack Lawson's books,
[01:16:27.240 --> 01:16:29.600] where you can get the Civil Defense Manual.
[01:16:29.600 --> 01:16:32.860] Start preparing for times like this.
[01:16:32.860 --> 01:16:36.360] Start preparing for the AI future apocalypse.
[01:16:36.360 --> 01:16:37.960] Build communities, get to know each other,
[01:16:37.960 --> 01:16:40.920] learn how to defend your own communities as well.
[01:16:41.800 --> 01:16:44.000] Tunnellord1337: Well, AI can be programmed
[01:16:44.000 --> 01:16:46.060] by evil people to do evil things.
[01:16:46.960 --> 01:16:49.560] Yeah, that is true.
[01:16:50.920 --> 01:16:55.480] It is, it is garbage in, garbage out.
[01:16:55.480 --> 01:16:58.200] So if evil people are putting direct evil into it,
[01:16:58.200 --> 01:16:59.520] then yes, it will be.
[01:17:01.920 --> 01:17:04.720] It's also just generally kind of freaky.
[01:17:04.720 --> 01:17:08.680] AI is freaky, but when you start mixing in robotics,
[01:17:10.440 --> 01:17:12.880] I had missed this, I wanted to play this for you.
[01:17:12.880 --> 01:17:15.720] So before we do, this is a good segue.
[01:17:17.120 --> 01:17:18.840] When you start mixing robotics in,
[01:17:18.840 --> 01:17:22.080] that's when things get really, really strange.
[01:17:22.080 --> 01:17:23.360] Let's look at this.
[01:17:23.360 --> 01:17:26.000] This is an extremely lifelike robotic leg
[01:17:26.000 --> 01:17:29.760] just kind of squirming on the table briefly.
[01:17:29.760 --> 01:17:31.520] And you can see it moving there.
[01:17:31.520 --> 01:17:34.560] I'll play it one more time since it's very short.
[01:17:34.560 --> 01:17:38.080] Look at the way it moves, it twitches and just.
[01:17:40.320 --> 01:17:44.280] This, this is the future.
[01:17:46.160 --> 01:17:48.280] There we go, the video is.
[01:17:49.480 --> 01:17:51.120] You can see it there on Twitter,
[01:17:51.120 --> 01:17:53.900] just up and down, back and forth.
[01:17:55.160 --> 01:17:56.320] What do you talk about?
[01:17:57.620 --> 01:18:01.160] Yeah, I can see, I can see, yeah.
[01:18:02.280 --> 01:18:04.000] Evil in, evil out.
[01:18:04.000 --> 01:18:05.400] If they put nothing but evil,
[01:18:05.400 --> 01:18:07.560] or they put a lot of evil into the AI,
[01:18:07.560 --> 01:18:08.880] evil will come out of it.
[01:18:10.920 --> 01:18:13.480] The leg is so uncanny valley.
[01:18:14.440 --> 01:18:18.320] They need to finish making stuff that can move realistically
[01:18:18.320 --> 01:18:20.800] before they try and make these weird
[01:18:22.160 --> 01:18:24.760] plastic flesh abominations.
[01:18:24.760 --> 01:18:27.800] Yeah, it just reminds me of that line from Terminator
[01:18:27.800 --> 01:18:29.880] where they're like, the early ones were easy to spot,
[01:18:29.880 --> 01:18:31.520] they had rubber skin.
[01:18:31.600 --> 01:18:35.080] It's like, ugh, yeah, I can see why these things
[01:18:35.080 --> 01:18:36.260] got picked out immediately.
[01:18:36.260 --> 01:18:38.600] This is horrifying to look at.
[01:18:38.600 --> 01:18:39.440] They don't.
[01:18:41.720 --> 01:18:44.640] The future, the future.
[01:18:44.640 --> 01:18:48.160] Well, enough about the future.
[01:18:48.160 --> 01:18:51.120] We're going to take a quick break, so stay with us, folks.
[01:20:21.120 --> 01:20:32.880] Hello, it's me, Vladimir Zelensky.
[01:20:32.880 --> 01:20:36.560] I'm so tired of wearing these same t-shirts everywhere for years.
[01:20:36.560 --> 01:20:42.220] You'd think with all the billions I've skimmed off America, I could dress better.
[01:20:42.220 --> 01:20:47.920] And I could if only David Knight would send me one of his beautiful grey MacGuffin hoodies
[01:20:47.920 --> 01:20:52.760] or a new black t-shirt with the MacGuffin logo in blue.
[01:20:52.760 --> 01:20:54.840] But he told me to get lost.
[01:20:54.840 --> 01:20:59.960] Maybe one of you American suckers can buy me some at TheDavidKnightShow.com.
[01:20:59.960 --> 01:21:02.680] You should be able to buy me several hundred.
[01:21:02.680 --> 01:21:07.440] Those amazing sand-colored microphone hoodies are so beautiful.
[01:21:07.440 --> 01:21:13.240] I'd wear something other than green military cosplay to my various galas and social events.
[01:21:13.240 --> 01:21:18.520] If you want to save on shipping, just put it in the next package of bombs and missiles
[01:21:18.520 --> 01:21:29.440] coming from the USA.
[01:21:29.440 --> 01:21:30.440] Welcome back, folks.
[01:21:30.440 --> 01:21:31.440] Good comment from SyrianGirl.
[01:21:31.440 --> 01:21:36.280] Given that AI is programmed by mankind, it can only be a case of evil in, evil out.
[01:21:36.280 --> 01:21:38.960] The nature of man is fallen.
[01:21:38.960 --> 01:21:42.160] It is a sin nature.
[01:21:42.320 --> 01:21:48.760] It will have evil people programming it, and you'll have to be very, very careful with
[01:21:48.760 --> 01:21:52.960] it if you choose to use it.
[01:21:52.960 --> 01:22:04.160] Fetterman pushes cash payments bill with GOP colleague.
[01:22:04.160 --> 01:22:06.580] Every American should be able to use paper currency.
[01:22:06.580 --> 01:22:08.880] And this is actually great.
[01:22:08.880 --> 01:22:10.680] Fetterman is doing a good thing here.
[01:22:10.680 --> 01:22:11.960] I'm surprised.
[01:22:11.960 --> 01:22:14.320] I didn't expect that of him.
[01:22:14.320 --> 01:22:18.360] I'll be honest, I haven't really expected much of anything from Fetterman.
[01:22:18.360 --> 01:22:25.280] You know, he got elected and he seemed to just be a typical liberal Democrat.
[01:22:25.280 --> 01:22:28.920] But this is a good policy from him.
[01:22:28.920 --> 01:22:32.200] Bill aims to ensure that all Americans have access to a form of payment, regardless of
[01:22:32.200 --> 01:22:34.680] whether they have a bank account.
[01:22:34.720 --> 01:22:42.560] Of course, a lot of the GOP are afraid to take a stand against Donald Trump and his
[01:22:42.560 --> 01:22:47.240] push for digital cash, his Genius Act.
[01:22:47.240 --> 01:22:51.960] They don't want to be seen as disloyal.
[01:22:51.960 --> 01:22:57.040] They don't want to stand up to his crony capitalist version of the CBDC.
[01:22:57.040 --> 01:23:00.720] His buddy Lutnick is big on that.
[01:23:00.720 --> 01:23:04.240] Can't have you over there being disloyal.
[01:23:04.240 --> 01:23:05.240] He'll primary you.
[01:23:05.240 --> 01:23:10.040] Of course, that might not be as easy as it was for him previously, with so many people
[01:23:10.040 --> 01:23:14.000] now realizing what's going on with Epstein.
[01:23:14.000 --> 01:23:17.360] Senators John Fetterman, a Democrat, and Kevin Cramer, a Republican, have introduced a bill
[01:23:17.360 --> 01:23:21.480] that would generally require those conducting in-person business to accept cash as payment
[01:23:21.480 --> 01:23:22.880] from customers.
[01:23:22.880 --> 01:23:25.080] Again, this is great.
[01:23:25.080 --> 01:23:26.680] I fully support this.
[01:23:26.680 --> 01:23:29.040] I am pleased to see this.
[01:23:29.040 --> 01:23:30.440] Good job, Fetterman.
[01:23:31.280 --> 01:23:36.400] I know we've been hard on him on this show, but this is a good thing.
[01:23:36.400 --> 01:23:38.000] So good job.
[01:23:38.000 --> 01:23:41.720] Any person engaged in the business of selling or offering goods or services at retail to
[01:23:41.720 --> 01:23:46.480] the public who accepts in-person payments at a physical location shall accept cash as
[01:23:46.480 --> 01:23:52.160] a form of payment for sales made at such physical location in amounts up to and including $500
[01:23:52.160 --> 01:23:54.440] per transaction.
[01:23:54.440 --> 01:23:58.240] So again, requiring up to $500.
[01:23:58.240 --> 01:24:05.600] So not extremely large payments, but still it's a step in the right direction.
[01:24:05.600 --> 01:24:06.720] It's something.
[01:24:06.720 --> 01:24:10.200] The proposal provides an exception if there is a device that converts cash into prepaid
[01:24:10.200 --> 01:24:19.160] cards without any fee, which, I mean, is very similar, I suppose.
[01:24:19.160 --> 01:24:24.800] Isn't that already a thing, though? Like, you know, cash is supposed to be good for
[01:24:24.800 --> 01:24:28.440] all legal debts.
[01:24:28.440 --> 01:24:33.200] It says it right on there; I'm not really sure what this bill is changing.
[01:24:33.200 --> 01:24:35.840] It would force companies to be able to accept it.
[01:24:35.840 --> 01:24:39.480] A lot of places are moving to, you know, debit or credit only.
[01:24:39.480 --> 01:24:40.680] They don't want to take cash.
[01:24:40.680 --> 01:24:45.920] This would force them to accept it as legal tender.
[01:24:45.920 --> 01:24:50.800] It also allows exceptions if a person cannot accept cash payment due to a sales system failure
[01:24:50.800 --> 01:24:54.840] or because they temporarily do not have enough cash available to provide change.
[01:24:54.840 --> 01:24:55.840] It's simple.
[01:24:55.840 --> 01:24:59.280] If you're open for business in America, you should take US dollars, Fetterman said.
[01:24:59.280 --> 01:25:01.400] That's very simple, just common sense.
[01:25:01.400 --> 01:25:07.320] Yeah, this is the currency of the land, as debased and devalued as it is.
[01:25:07.320 --> 01:25:15.160] If you say that you are open for business, you should accept the currency of
[01:25:15.160 --> 01:25:16.160] the realm.
[01:25:16.400 --> 01:25:20.360] Of course, you'll remember my dad interviewed Piers Corbyn.
[01:25:20.360 --> 01:25:26.000] He's the brother of the UK politician Jeremy Corbyn.
[01:25:26.000 --> 01:25:30.480] And you might remember that video where we went into Aldi and they said, well, we only
[01:25:30.480 --> 01:25:32.560] take credit or debit here.
[01:25:32.560 --> 01:25:36.780] And he said, well, no, the pound is still legal tender.
[01:25:36.780 --> 01:25:38.420] You must accept it.
[01:25:38.420 --> 01:25:44.040] He gave them the money for strawberries and walked out with his purchase and they called
[01:25:44.040 --> 01:25:45.320] the police saying he didn't pay.
[01:25:45.320 --> 01:25:54.240] He didn't pay because he didn't elect to use debit or credit. For the five-year period
[01:25:54.240 --> 01:25:56.680] beginning on the date of enactment of this section,
[01:25:56.680 --> 01:26:01.800] the section shall not require a person to accept cash payments in $50 bills or any
[01:26:01.800 --> 01:26:03.280] larger bill.
[01:26:03.280 --> 01:26:07.720] The Secretary shall issue a rule on the date that is five years after the date of the enactment
[01:26:07.720 --> 01:26:13.360] of this section with respect to any bill denominations a person is not required to accept. When issuing
[01:26:13.440 --> 01:26:18.360] a rule under subparagraph (A), the Secretary shall require persons to accept $1, $5,
[01:26:18.360 --> 01:26:22.520] $10, and $20 bills.
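Stripped of the legalese, the rules as read on air reduce to a few conditions. This is only a paraphrase of the quotation above (the function and its behavior are illustrative, not the bill's authoritative text):

```python
# Illustrative paraphrase of the cash-acceptance rules as read on air:
# retailers must take cash up to $500 per transaction, need not accept
# $50 bills or larger during the initial five-year period, and must
# always accept $1, $5, $10, and $20 bills.

def must_accept(amount, denomination):
    """Return True if a retailer would have to accept this cash payment."""
    if amount > 500:
        return False            # above the $500 per-transaction cap
    if denomination >= 50:
        return False            # exempt during the initial five years
    return denomination in (1, 5, 10, 20)

print(must_accept(120, 20))   # a $120 purchase paid in twenties
print(must_accept(600, 20))   # over the $500 cap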
[01:26:22.520 --> 01:26:25.400] Cash is still legal tender in the United States.
[01:26:25.400 --> 01:26:29.400] Despite some businesses' exclusive acceptance of electronic payments, Cramer said, according
[01:26:29.400 --> 01:26:33.080] to the press release, forcing the use of credit and debit cards or imposing premium prices
[01:26:33.080 --> 01:26:37.800] on goods and services paid for with cash limits consumer choice.
[01:26:37.800 --> 01:26:40.760] Americans should have the option of using cards or cash, but they should be the ones
[01:26:40.760 --> 01:26:41.760] who make that choice.
[01:26:41.760 --> 01:26:42.760] Yeah.
[01:26:43.160 --> 01:26:50.760] Again, I am so surprised to see this coming from Fetterman, but you know, good job.
[01:26:50.760 --> 01:26:52.120] I support it.
[01:26:52.120 --> 01:26:53.480] That's great.
[01:26:53.480 --> 01:26:58.080] You should have the ability to pay with cash anywhere.
[01:26:58.080 --> 01:27:03.360] Shouldn't have these companies saying, no, we're going digital only.
[01:27:03.360 --> 01:27:06.720] Congress says no to state sponsored crypto.
[01:27:06.720 --> 01:27:11.360] House passes cryptocurrency laws with the promise to outlaw CBDCs later, later, we'll
[01:27:11.360 --> 01:27:13.560] get around to it.
[01:27:13.560 --> 01:27:20.400] Of course, the stablecoins, those are the kind of digital cash Trojan horse.
[01:27:20.400 --> 01:27:21.400] They're working their way in.
[01:27:21.400 --> 01:27:29.880] Rep. Tom Emmer, the Republican from Minnesota and House GOP whip, announced that this year's defense
[01:27:29.880 --> 01:27:35.280] authorization legislation would include a prohibition against central bank digital
[01:27:35.280 --> 01:27:36.280] currencies.
[01:27:36.280 --> 01:27:39.920] Attaching our Anti-CBDC Surveillance State Act to the National Defense Authorization
[01:27:40.480 --> 01:27:43.840] Act will ensure unelected bureaucrats are never allowed to trade Americans' financial
[01:27:43.840 --> 01:27:48.880] privacy for a CCP-style surveillance tool,
[01:27:48.880 --> 01:27:51.680] Emmer said in a statement referencing the Chinese Communist Party and the country's
[01:27:51.680 --> 01:28:01.280] centralized digital currency, and of course, that is their goal here.
[01:28:01.280 --> 01:28:04.040] The American politicians are looking over at China with envy.
[01:28:04.040 --> 01:28:07.600] Man, wouldn't it be great if we could just turn off everyone's bank account?
[01:28:08.040 --> 01:28:11.200] Wouldn't it be great if they didn't have cash to rely on?
[01:28:11.200 --> 01:28:18.800] That if we said, ah, you're not a good citizen, we just completely wall them off from anything.
[01:28:18.800 --> 01:28:21.880] Sorry, you can't go out and get food.
[01:28:21.880 --> 01:28:23.360] You can't buy groceries.
[01:28:23.360 --> 01:28:25.920] You can't go see a movie.
[01:28:25.920 --> 01:28:26.920] Anything at all.
[01:28:26.920 --> 01:28:28.920] You can't pay for services.
[01:28:28.920 --> 01:28:29.920] Your plumbing's broken?
[01:28:29.920 --> 01:28:30.920] That's too bad.
[01:28:30.920 --> 01:28:35.960] It also marks another key win for House Speaker Mike Johnson, who overcame an eight hour standoff
[01:28:36.040 --> 01:28:40.240] on Wednesday evening with as many as nine holdouts demanding Congress do more to prevent
[01:28:40.240 --> 01:28:43.240] future creation of a CBDC.
[01:28:43.240 --> 01:28:49.240] The picture grew complicated with the addition of the Genius Act, a Senate-led piece of legislation
[01:28:49.240 --> 01:28:54.640] that already passed in the upper chamber in a 63 to 30 vote last month.
[01:28:54.640 --> 01:28:56.640] J.J. the Golfer,
[01:28:56.640 --> 01:29:00.200] the 1792 Coinage Act already states this.
[01:29:00.200 --> 01:29:02.440] Oh, I didn't know that.
[01:29:02.440 --> 01:29:05.440] There's so many different pieces of legislation that, you know.
[01:29:05.440 --> 01:29:09.680] Yeah, I mean, this is just an AI overview.
[01:29:09.680 --> 01:29:11.280] So you take this with a grain of salt.
[01:29:11.280 --> 01:29:17.640] But yeah, I had heard that the important one here is the legal tender thing.
[01:29:17.640 --> 01:29:23.520] The dollar was declared legal tender, meaning it was an acceptable form of payment for debts.
[01:29:23.520 --> 01:29:27.520] So not really sure what this new act is changing.
[01:29:27.520 --> 01:29:31.040] But yet it is true that there are a lot of places that refuse to take cash.
[01:29:31.440 --> 01:29:37.520] Yeah, I suppose this is forcing them to accept it.
[01:29:37.520 --> 01:29:42.240] Whereas the other one merely implies that you should, you know.
[01:29:42.240 --> 01:29:47.520] Maybe it's just here's new law that says the same thing as the old law that we've been ignoring.
[01:29:47.520 --> 01:29:48.720] We'll have to wait and see.
[01:29:48.720 --> 01:29:53.240] The Genius Act would create requirements for the issuance of payment stable coins,
[01:29:53.240 --> 01:29:58.040] types of cryptocurrencies that achieve price stability by tying their values to the US dollar
[01:29:58.040 --> 01:29:59.640] or some other liquid asset.
[01:29:59.720 --> 01:30:04.640] I've always been confused on how that is going to achieve price stability.
[01:30:04.640 --> 01:30:07.560] The US dollar is completely unstable.
[01:30:07.560 --> 01:30:09.680] It's not backed by anything.
[01:30:09.680 --> 01:30:16.120] Its price and value is just continually in fluctuation.
[01:30:16.120 --> 01:30:22.760] So I don't understand how you could have a stable coin, a stable cryptocurrency,
[01:30:22.760 --> 01:30:26.640] when it's tied to something that itself is unstable.
[01:30:26.640 --> 01:30:31.240] With some exceptions for smaller startups, the bill would require issuers of a new coin
[01:30:31.240 --> 01:30:35.480] to be either federally approved, state approved or a subsidiary of an institution backed
[01:30:35.480 --> 01:30:37.760] by the government's bank regulators.
[01:30:37.760 --> 01:30:42.840] The bill also requires issuers to maintain the asset's value through reserves.
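That reserve rule can be sketched as a toy model in Python (hypothetical class and method names, not any real issuer's code): coins are minted only one-for-one against deposited dollars, so the outstanding supply can never exceed the reserves backing it.

```python
class StablecoinIssuer:
    """Toy model of a reserve-backed stablecoin issuer (illustrative only)."""

    def __init__(self):
        self.reserves_usd = 0.0   # dollars held in reserve
        self.coin_supply = 0.0    # coins in circulation

    def mint(self, usd_deposited):
        # Coins are only created against deposited dollars, 1:1.
        self.reserves_usd += usd_deposited
        self.coin_supply += usd_deposited

    def redeem(self, coins):
        # Redeeming burns coins and releases the matching reserves.
        if coins > self.coin_supply:
            raise ValueError("cannot redeem more than outstanding supply")
        self.coin_supply -= coins
        self.reserves_usd -= coins

    def fully_backed(self):
        return self.reserves_usd >= self.coin_supply
```

Under this scheme the peg depends entirely on the reserves actually existing, which is why the bill makes reserve maintenance a legal requirement rather than a promise.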
[01:30:42.840 --> 01:30:47.080] Eric Burlison said he broadly supports Congress's work on cryptocurrency,
[01:30:47.080 --> 01:30:50.640] but voiced concerns that some of the new language might be a little too restrictive.
[01:30:50.640 --> 01:30:54.040] He expressed alarm that the Genius Act didn't explicitly remove the possibility
[01:30:54.040 --> 01:30:57.280] of centralized digital currency in the future.
[01:30:57.280 --> 01:31:00.960] Well, that's because that's what they want to do.
[01:31:00.960 --> 01:31:02.040] They want to bring it in.
[01:31:02.040 --> 01:31:03.640] They're working on it slowly.
[01:31:03.640 --> 01:31:06.760] They're chipping away bit by bit.
[01:31:06.760 --> 01:31:11.680] They work on these things over long timelines as people forget and move on to different issues,
[01:31:11.680 --> 01:31:14.240] whatever the crisis of the moment is.
[01:31:14.240 --> 01:31:17.680] Burlison was one of the 12 Republicans to vote against the Genius Act.
[01:31:17.680 --> 01:31:24.680] Knights of the Storm, Trump undid his own executive order banning CBDCs by signing the act.
[01:31:24.680 --> 01:31:26.240] We're not going to have that. You don't have to worry.
[01:31:26.240 --> 01:31:28.240] I'm making an executive order.
[01:31:28.240 --> 01:31:30.440] Here's a new law. Sorry.
[01:31:30.440 --> 01:31:33.440] We look at what China is doing to control its population by controlling their currency.
[01:31:33.440 --> 01:31:37.120] I mean, if you put that in the hands of politicians, it would be awful, Burlison said.
[01:31:37.120 --> 01:31:38.400] He's right.
[01:31:38.400 --> 01:31:44.280] The more direct control over your life you give politicians, the worse everything will be.
[01:31:44.280 --> 01:31:46.520] Representative Tim Burchett, Republican of Tennessee,
[01:31:46.560 --> 01:31:51.680] was one of the Republicans who held up consideration of all three bills for more than eight hours on Wednesday evening.
[01:31:51.680 --> 01:31:54.720] On Thursday, he supported the measures after receiving commitments from leadership and
[01:31:54.720 --> 01:31:59.000] the president that the language against CBDCs would make it into the National Defense Authorization Act,
[01:31:59.000 --> 01:32:02.880] a piece of legislation that is reliably expected to pass every year.
[01:32:02.880 --> 01:32:04.120] Oh, we're definitely going to put that in there.
[01:32:04.120 --> 01:32:06.680] You can trust us.
[01:32:06.680 --> 01:32:11.040] I'd be pretty frustrated, Burchett said when asked what his reaction would be if the agreement fell through.
[01:32:11.040 --> 01:32:14.000] I think we were misled by our leadership, so we'll see.
[01:32:14.000 --> 01:32:15.600] I've been disappointed by them before.
[01:32:15.600 --> 01:32:20.440] That's right. He knows he's been disappointed by them before, but he's just going to take it at face value.
[01:32:20.440 --> 01:32:23.080] Well, they promised.
[01:32:23.080 --> 01:32:27.240] Once again, this is like Lucy with the football and Charlie Brown.
[01:32:27.240 --> 01:32:29.840] Yeah, this time. No, this time we're really going to do it.
[01:32:29.840 --> 01:32:31.960] We'll hold this in place for you.
[01:32:31.960 --> 01:32:35.560] Don't worry.
[01:32:35.560 --> 01:32:38.920] I'd be pretty frustrated, Burchett said.
[01:32:38.920 --> 01:32:42.200] That's right. He's going to be frustrated.
[01:32:42.200 --> 01:32:43.840] There might be some consternation even.
[01:32:43.840 --> 01:32:47.080] Gosh darn it, they got me again.
[01:32:47.080 --> 01:32:49.520] The president assured us he would help us on that, Burchett said.
[01:32:49.520 --> 01:32:56.040] And we all know what a reliable, honorable, trustworthy man the president is.
[01:32:56.040 --> 01:33:02.440] With the agreement in place, Republicans rallied around three bills, which passed with bipartisan support.
[01:33:02.440 --> 01:33:09.040] There's a company in Ohio that's tokenizing car titles, Davidson said, referring to the legal document certifying ownership of a vehicle.
[01:33:09.040 --> 01:33:13.160] If you've ever bought a car and you pay for the title, the title goes all over the place.
[01:33:13.160 --> 01:33:17.120] If you buy it directly, the title can take weeks before you can get the plates.
[01:33:17.120 --> 01:33:20.520] Trying to change that and just make it a digital token.
[01:33:20.520 --> 01:33:28.440] Oh, that's great. So, you know, someone hacks your computer and suddenly they own your car.
[01:33:28.440 --> 01:33:34.240] They've stolen the digital token that says it's yours.
[01:33:34.240 --> 01:33:39.600] What we really need is to combine self-driving cars with tokens that can be stolen.
[01:33:39.600 --> 01:33:45.680] I was going to say the DMV with NFTs, wonderful things.
[01:33:45.680 --> 01:33:57.560] I'm just imagining some hacker, you know, breaking into your computer, somehow getting a hold of the token that says your self-driving car is theirs and then sending it directly to their house or the chop shop.
[01:33:57.560 --> 01:34:02.280] Won't that be wonderful?
[01:34:02.280 --> 01:34:06.520] Just sending the car directly to themselves.
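The worry being voiced here can be sketched in a few lines of Python (hypothetical names; no real title registry works exactly this way): if ownership is reduced to a digital token, then whoever holds the secret controls the asset, and the registry cannot tell a rightful owner from a thief who copied the key.

```python
import secrets

class TitleRegistry:
    """Toy model of a tokenized car-title registry (illustrative only)."""

    def __init__(self):
        self._titles = {}  # VIN -> secret key of the current "owner"

    def issue(self, vin):
        """Mint a title token; whoever holds the returned key controls the car."""
        key = secrets.token_hex(16)
        self._titles[vin] = key
        return key

    def holds(self, vin, key):
        return self._titles.get(vin) == key

    def transfer(self, vin, presented_key, new_key):
        # The registry cannot distinguish the rightful owner from a thief
        # who copied the key -- possession of the secret IS ownership.
        if self._titles.get(vin) != presented_key:
            raise PermissionError("key does not match title token")
        self._titles[vin] = new_key
```

In this sketch, exfiltrating the key is equivalent to stealing the title, which is exactly the failure mode being described with the hacked self-driving car.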
[01:34:06.520 --> 01:34:11.280] Tunnellordman337, a Democrat fighting against CBDCs was not on my 2025 bingo card.
[01:34:11.280 --> 01:34:14.600] I know we live in an upside down crazy world.
[01:34:14.600 --> 01:34:17.360] Things don't make much sense anymore.
[01:34:17.360 --> 01:34:27.320] And especially that it's, you know, Federman who's proposing this, you know, you must accept cash.
[01:34:27.320 --> 01:34:30.000] It's truly a strange time we live in.
[01:34:30.000 --> 01:34:34.560] Knights of the Storm, I just looked up legal tender and what that means for Saturday's show.
[01:34:34.600 --> 01:34:37.160] I'm surprised to learn that private business does not have to accept it.
[01:34:37.160 --> 01:34:38.760] Very concerning.
[01:34:38.760 --> 01:34:43.560] I think that's what the law Fetterman put in is trying to address.
[01:34:43.560 --> 01:34:49.920] I think this would be to force them to say you must accept legal tender.
[01:34:49.920 --> 01:34:53.160] Knights of the Storm, they just passed the Genius Act and there was one just before it.
[01:34:53.160 --> 01:34:56.000] It doesn't tokenize cash, but sets up a framework for CBDC.
[01:34:56.000 --> 01:34:59.080] Yeah, they're not doing it all at once.
[01:34:59.080 --> 01:35:02.720] They're putting in little bits and pieces of it here and there.
[01:35:02.720 --> 01:35:07.120] They're not coming in and slapping it all down.
[01:35:07.120 --> 01:35:11.000] They are basically taking the concept of crypto and placing government oversight while creating
[01:35:11.000 --> 01:35:13.960] their own version.
[01:35:13.960 --> 01:35:18.640] The entire point about crypto is that it was, you know, kind of crypto.
[01:35:18.640 --> 01:35:20.600] It gave you deniability.
[01:35:20.600 --> 01:35:23.960] People didn't know what was going on, what you were doing with it.
[01:35:23.960 --> 01:35:25.920] It was private.
[01:35:25.920 --> 01:35:28.680] There is no reason to use crypto if that is not the case.
[01:35:28.840 --> 01:35:33.120] If the government has complete and utter control of it and is able to track and trace every
[01:35:33.120 --> 01:35:39.280] little thing you do, that completely invalidates it as a use case, in my opinion.
[01:35:39.280 --> 01:35:45.200] Knights of the Storm, the government is still mandated to take cash, though, for now.
[01:35:45.200 --> 01:35:48.840] For now, though.
[01:35:48.840 --> 01:35:52.400] That future is slowly coming where they don't.
[01:35:52.400 --> 01:35:58.200] Right, I remember seeing stories of the guy that paid a parking ticket with a whole bunch
[01:35:58.200 --> 01:36:03.440] of origami pigs inside a donut box and they had to accept that because the government
[01:36:03.440 --> 01:36:05.000] has to accept cash.
[01:36:05.000 --> 01:36:06.000] Yeah.
[01:36:06.000 --> 01:36:10.160] Or those guys that will come in and pay their fines with a massive amount of pennies to
[01:36:10.160 --> 01:36:12.200] be spiteful towards the system.
[01:36:12.200 --> 01:36:15.960] I don't know if they got in trouble for that, but I've seen stories about that sort of thing
[01:36:15.960 --> 01:36:16.960] happening.
[01:36:16.960 --> 01:36:20.800] Knights of the Storm, they will use public-private partnerships to edge out cash.
[01:36:20.800 --> 01:36:21.800] Yeah.
[01:36:21.800 --> 01:36:25.400] Well, you know, none of the companies even take cash anymore.
[01:36:25.400 --> 01:36:27.280] Why do you want to hold on to that?
[01:36:27.280 --> 01:36:29.680] It's not doing you any good anyway.
[01:36:29.680 --> 01:36:30.680] Just come on.
[01:36:30.680 --> 01:36:32.480] The CBDC will make things so much easier.
[01:36:32.480 --> 01:36:37.520] The stablecoin. KWD 68, China puts expiration dates on money and amounts in accounts to spur
[01:36:37.520 --> 01:36:40.640] the economy. Total control coming to us. Winning, yeah.
[01:36:40.640 --> 01:36:44.000] Sorry, you've kept the money in your account for too long.
[01:36:44.000 --> 01:36:45.360] It's no good anymore.
[01:36:45.360 --> 01:36:48.360] It vaporizes, it vanishes, it evaporates.
[01:36:48.360 --> 01:36:51.600] You were supposed to use this to help the economy grow.
[01:36:51.600 --> 01:36:54.360] Don't you care about the GDP?
[01:36:54.360 --> 01:36:58.640] Knights of the Storm, so the local services used on a daily basis can refuse to take cash.
[01:36:58.640 --> 01:37:01.760] Jevkin, one time Dollar General refused cash for some reason.
[01:37:01.760 --> 01:37:05.160] It was only one time though.
[01:37:05.160 --> 01:37:07.240] Knowing Dollar General, that could have just been because they're lazy.
[01:37:07.240 --> 01:37:09.520] No, I don't want to have to count that out.
[01:37:09.520 --> 01:37:13.280] Maybe you should change the name to CBDC General.
[01:37:13.280 --> 01:37:14.280] Stablecoin General.
[01:37:14.280 --> 01:37:20.080] It doesn't have the same ring to it.
[01:37:20.080 --> 01:37:25.120] Trump's bitcoin mentor bet on bitcoin treasury strategies and his wealth is exploding.
[01:37:25.120 --> 01:37:27.000] Isn't that great?
[01:37:27.000 --> 01:37:30.760] The article describes how David Bailey, a key figure in Trump's bitcoin adoption, has
[01:37:30.760 --> 01:37:36.120] seen his wealth grow through his hedge fund, 210k capital.
[01:37:36.120 --> 01:37:39.760] The fund's success stems from its investments in companies that have added bitcoin to their
[01:37:39.760 --> 01:37:44.000] balance sheets, a strategy known as bitcoin treasury.
[01:37:44.000 --> 01:37:48.400] Of course we've seen, I can't remember the guy's name, but he's big on bitcoin saying
[01:37:48.520 --> 01:37:53.760] he thinks that this is just the beginning, that bitcoin could reach astronomically new
[01:37:53.760 --> 01:37:54.760] heights.
[01:37:54.760 --> 01:38:00.040] KWD 68, legislative act names never match the truth.
[01:38:00.040 --> 01:38:02.760] No. No Child Left Behind, etc.
[01:38:02.760 --> 01:38:03.760] Genius Act.
[01:38:03.760 --> 01:38:07.040] Yeah, the Patriot Act comes to mind as well.
[01:38:07.040 --> 01:38:09.640] They always do that.
[01:38:09.640 --> 01:38:16.880] If you remember, I think we still have the bitcoin, the guy talking about bitcoin.
[01:38:16.880 --> 01:38:19.640] So we'll, let's see, I know.
[01:38:19.640 --> 01:38:22.800] 120,000 dollars for one bitcoin.
[01:38:22.800 --> 01:38:24.760] You've called for much higher in the past.
[01:38:24.760 --> 01:38:27.720] It's still pretty surreal to see us hit that milestone over the weekend.
[01:38:27.720 --> 01:38:33.920] Oh, I think it's very exciting that people are starting to recognize the value of a better
[01:38:33.920 --> 01:38:41.720] currency, a better store of value, a more honest currency, a currency that keeps perfect
[01:38:41.720 --> 01:38:42.720] records.
[01:38:42.800 --> 01:38:49.760] There are quite a few examples of where an old currency was overtaken by a new currency.
[01:38:49.760 --> 01:38:54.880] My dad gave me a million dollar bill and I looked at it and I went, whoa, a million dollars.
[01:38:54.880 --> 01:38:56.200] I was about 10 years old.
[01:38:56.200 --> 01:38:58.080] A million dollars, what can I do with it?
[01:38:58.080 --> 01:38:59.080] He said nothing.
[01:38:59.080 --> 01:39:01.120] And I said, you mean nothing?
[01:39:01.120 --> 01:39:05.800] And he said, that's a million Confederate dollars.
[01:39:05.800 --> 01:39:11.240] Confederates lost to the Union in the war and there was huge inflation of a Confederate
[01:39:11.240 --> 01:39:12.240] dollar.
[01:39:12.280 --> 01:39:17.920] Nobody wanted any because they only found the value in the Union dollar.
[01:39:17.920 --> 01:39:23.680] And so the Union dollar became the standard and the Confederate dollar became completely
[01:39:23.680 --> 01:39:24.680] worthless.
[01:39:24.680 --> 01:39:26.720] And I think that that is where we're headed.
[01:39:26.720 --> 01:39:31.200] I mean, the fiat money is becoming less and less relevant.
[01:39:31.200 --> 01:39:38.040] Bitcoin and maybe some other cryptocurrencies are going to be the relevant ones in the future.
[01:39:38.040 --> 01:39:41.080] You said something recently which really caught my eye.
[01:39:41.360 --> 01:39:48.480] A lot of people talk about like Bitcoin hyperbitcoinization and you said, all right, well, you can see
[01:39:48.480 --> 01:39:52.600] that was the main point is Bitcoin has not finished.
[01:39:52.600 --> 01:39:55.640] He sees it going higher.
[01:39:55.640 --> 01:40:09.440] And this guy, David Bailey, key figure in Trump's administration here,
[01:40:09.440 --> 01:40:12.400] is he's betting on it.
[01:40:12.400 --> 01:40:17.360] He's investing in companies that are adopting Bitcoin as part of their holdings.
[01:40:17.360 --> 01:40:19.040] And he is making a fortune.
[01:40:19.040 --> 01:40:24.760] The fund delivered a net return of 640 percent in the 12 months through June, largely driven
[01:40:24.760 --> 01:40:28.800] by investments in publicly traded companies that added Bitcoin to their balance sheets,
[01:40:28.800 --> 01:40:29.800] Bloomberg reported.
[01:40:33.800 --> 01:40:38.880] Of course, as they pointed out, as the dollar collapses, Bitcoin looks better and better
[01:40:38.880 --> 01:40:41.520] as a store of value.
[01:40:41.520 --> 01:40:44.800] Last week, the Republican controlled House of Representatives passed three crypto bills
[01:40:44.800 --> 01:40:49.440] addressing stable coins, market structure, and a ban on creating central bank digital
[01:40:49.440 --> 01:40:50.440] currency.
[01:40:50.440 --> 01:40:53.880] And of course, we know they want the CBDCs.
[01:40:53.880 --> 01:40:55.160] They want the stable coins.
[01:40:55.160 --> 01:40:57.160] They're desperate for them.
[01:40:57.160 --> 01:41:02.560] So whatever they do, whatever laws they pass are probably going to lead to those, whether
[01:41:02.560 --> 01:41:06.560] it's directly or obliquely.
[01:41:06.560 --> 01:41:09.880] They generally don't back down from ideas.
[01:41:09.880 --> 01:41:12.760] They simply try to go about them in a different fashion.
[01:41:12.760 --> 01:41:17.400] Knights of the Storm, they are offering a digital option for private people to use instead of cash.
[01:41:17.400 --> 01:41:19.240] It's 2020 all over again.
[01:41:19.240 --> 01:41:22.280] Like Celente predicted, dirty cash to digital trash.
[01:41:22.280 --> 01:41:25.160] He's been saying that for quite a while.
[01:41:25.160 --> 01:41:26.160] Knights of the Storm.
[01:41:26.160 --> 01:41:27.440] Travis, this is already an issue with property deeds.
[01:41:27.440 --> 01:41:31.520] I always get title insurance because people digitally steal your land and take liens against
[01:41:31.520 --> 01:41:35.720] it, leaving you with the bill.
[01:41:35.720 --> 01:41:36.960] That's great.
[01:41:36.960 --> 01:41:40.320] More things to worry about as everything goes digital. Before,
[01:41:40.320 --> 01:41:45.480] to get your title, someone would have to break into your house or the bank, I guess, and pull
[01:41:45.480 --> 01:41:47.560] the actual piece of paper out.
[01:41:47.560 --> 01:41:52.760] Now they just steal it online and leave you holding a bag full of debt.
[01:41:52.760 --> 01:41:54.560] Oops, sorry.
[01:41:54.560 --> 01:41:56.040] Someone's got to pay for it.
[01:41:56.040 --> 01:41:59.320] Otherwise we're taking your home.
[01:41:59.320 --> 01:42:02.720] Shelly A. Everything will be tokenized and tied to the crypto.
[01:42:02.720 --> 01:42:08.560] I cannot wait to be minted to the block chain myself.
[01:42:08.560 --> 01:42:11.520] I wonder if I'll be non-fungible. Shadowboxer,
[01:42:11.520 --> 01:42:14.920] they confiscated our gold in 1933 to bail out the Federal Reserve.
[01:42:14.920 --> 01:42:18.000] You don't think they can do that with crypto?
[01:42:18.000 --> 01:42:19.280] Yeah.
[01:42:19.280 --> 01:42:21.520] Everything has its downsides.
[01:42:21.520 --> 01:42:25.280] Everything has potential risks.
[01:42:25.280 --> 01:42:29.320] There is always potential that the government could just come in and seize whatever you
[01:42:29.320 --> 01:42:31.000] have.
[01:42:31.000 --> 01:42:36.860] If it's crypto, they could potentially just siphon it off somehow if they have the technology.
[01:42:36.860 --> 01:42:41.560] If it's gold, they could come to your door with a tank and say, turn it over or we're
[01:42:41.560 --> 01:42:44.640] going to atomize you.
[01:42:44.640 --> 01:42:49.720] It's really about whatever you see as the best store of value, whatever you think
[01:42:49.720 --> 01:42:53.400] is going to be safest.
[01:42:53.400 --> 01:42:58.880] The best way to protect yourself is, of course, being able to grow your own food, raise your
[01:42:58.960 --> 01:43:03.020] own food and protect your own community.
[01:43:03.020 --> 01:43:05.680] That's the most important thing.
[01:43:05.680 --> 01:43:10.320] So if you don't have that, start there.
[01:43:10.320 --> 01:43:13.960] Without that, everything else is kind of meaningless.
[01:43:13.960 --> 01:43:19.840] If you aren't able to survive, you know, no matter what you have stored your value in,
[01:43:19.840 --> 01:43:20.840] it won't matter.
[01:43:20.840 --> 01:43:26.000] You'll just be a nifty little find for someone else once they stumble across your skeleton.
[01:43:26.000 --> 01:43:28.120] Oh, boy, look.
[01:43:28.120 --> 01:43:29.260] It's a pirate skeleton.
[01:43:29.260 --> 01:43:31.200] He's got a bunch of gold and silver coins.
[01:43:31.200 --> 01:43:33.760] That's awesome.
[01:43:33.760 --> 01:43:34.760] Ron Helton 1,
[01:43:34.760 --> 01:43:38.480] I think Bitcoin and all of these other digital currencies are like an arcade video game.
[01:43:38.480 --> 01:43:41.680] I can go for the high score, but eventually they always take the machine away and replace
[01:43:41.680 --> 01:43:42.680] it with another.
[01:43:42.680 --> 01:43:43.680] That's an interesting perspective.
[01:43:43.680 --> 01:43:44.680] True.
[01:43:44.680 --> 01:43:49.600] Of course, the same could be said for other types of currency, like what that video was
[01:43:49.600 --> 01:43:56.400] just talking about with Confederate dollars or modern dollars, the high scoring people
[01:43:56.400 --> 01:44:00.760] on the leaderboard of modern dollars have gone way down compared to the high scorers
[01:44:00.760 --> 01:44:01.760] on Bitcoin.
[01:44:01.760 --> 01:44:04.560] That is true.
[01:44:04.560 --> 01:44:05.560] Tunnel Lord 1337.
[01:44:05.560 --> 01:44:07.480] Why are we so fixated on Bitcoin?
[01:44:07.480 --> 01:44:08.480] It's just a different form of fiat.
[01:44:08.480 --> 01:44:12.640] If we want a different form of fiat, why not nullify regulations that forbid local banks
[01:44:12.640 --> 01:44:14.960] from creating their own fiat?
[01:44:14.960 --> 01:44:19.680] Again, I'm not a cryptocurrency guy.
[01:44:19.680 --> 01:44:26.240] One main difference, though, I have to say with cryptocurrency versus your typical fiat
[01:44:26.240 --> 01:44:31.560] is that someone can't just decide to print or produce a lot more of it.
[01:44:31.560 --> 01:44:37.480] There's a fixed amount, so inflation isn't going to be an unpredictable factor.
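That fixed-amount point is checkable from Bitcoin's published issuance schedule (an initial block reward of 50 BTC, halved every 210,000 blocks, with the reward tracked in whole satoshis); a short sketch:

```python
SATOSHI = 100_000_000        # satoshis per bitcoin
BLOCKS_PER_HALVING = 210_000 # blocks between reward halvings

def total_supply_btc():
    """Sum the block rewards over every halving era, in satoshis."""
    reward = 50 * SATOSHI  # initial reward, in satoshis
    total = 0
    while reward > 0:
        total += BLOCKS_PER_HALVING * reward
        reward //= 2  # integer halving, as in the protocol
    return total / SATOSHI

# The sum converges to just under 21 million BTC -- the hard cap.
```

The geometric series 50 + 25 + 12.5 + ... times 210,000 blocks is what produces the familiar "21 million coins" ceiling, and no participant can decide to issue more.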
[01:44:37.480 --> 01:44:43.840] Yeah, but I am not a cryptocurrency guy, personally.
[01:44:43.840 --> 01:44:51.160] I find it too speculative and I'm not equipped for those types of tasks.
[01:44:51.640 --> 01:44:56.720] Again, I once bought some Dogecoin when it was really, really cheap.
[01:44:56.720 --> 01:44:57.720] Just as a joke.
[01:44:57.720 --> 01:45:01.360] I never thought it would amount to anything and it went to the moon and I had forgotten
[01:45:01.360 --> 01:45:02.360] my password.
[01:45:02.360 --> 01:45:03.880] So I'm one of those fools.
[01:45:03.880 --> 01:45:08.120] I am one of those fools that I laughed at for losing their Bitcoin passwords.
[01:45:08.120 --> 01:45:09.680] Haha, what an idiot.
[01:45:09.680 --> 01:45:11.680] What kind of idiot would do that?
[01:45:11.680 --> 01:45:15.280] I said carefree as I looked into the mirror.
[01:45:15.280 --> 01:45:16.640] Ron Helton 1, asset forfeiture.
[01:45:16.640 --> 01:45:18.200] The government calls your assets criminal.
[01:45:18.200 --> 01:45:20.800] You go free, but your assets become the government's assets.
[01:45:20.800 --> 01:45:23.280] They love civil asset forfeiture.
[01:45:23.280 --> 01:45:26.080] They love seizing things from people.
[01:45:26.080 --> 01:45:29.880] Whether it's an asset or whether it's land with eminent domain, they love coming in and
[01:45:29.880 --> 01:45:32.960] taking things because they have the force to do so.
[01:45:32.960 --> 01:45:36.920] KWD 68, FDR gave the public two months to surrender their gold.
[01:45:36.920 --> 01:45:41.720] That generation complied too.
[01:45:41.720 --> 01:45:43.760] They could have stood up and said no.
[01:45:43.760 --> 01:45:45.480] They could have said I don't think so.
[01:45:45.480 --> 01:45:48.840] That would have given the government some pause.
[01:45:48.840 --> 01:45:52.680] Now, of course, a lot of people didn't turn in their gold.
[01:45:52.680 --> 01:45:58.520] They probably said, yeah, that's all I had turned over a small amount and kept the majority
[01:45:58.520 --> 01:45:59.520] of it.
[01:45:59.520 --> 01:46:06.400] But that is if the population as a whole had said I'm not doing it, we would probably be
[01:46:06.400 --> 01:46:09.400] in a much better position than we are now.
[01:46:09.400 --> 01:46:12.280] Angry Tiger, we have been digital for a long time.
[01:46:12.280 --> 01:46:14.800] People have been trained to use debit or credit cards.
[01:46:14.800 --> 01:46:17.160] Over 90% of transactions are digital.
[01:46:17.400 --> 01:46:21.400] Yeah, I very rarely have cash anymore.
[01:46:21.400 --> 01:46:25.640] It's something I wish I had more of.
[01:46:25.640 --> 01:46:31.160] It's just that, again, places very rarely take it.
[01:46:31.160 --> 01:46:33.080] A lot of times they don't have sufficient change.
[01:46:33.080 --> 01:46:39.320] A lot of times they don't even have ATMs so that you can get it anymore.
[01:46:39.320 --> 01:46:41.920] Embedding human rights into crypto isn't optional.
[01:46:41.920 --> 01:46:45.200] It's foundational.
[01:46:45.200 --> 01:46:49.640] The article describes the importance of embedding human rights into crypto systems, the need
[01:46:49.640 --> 01:46:55.840] for self-custody, universal personhood, and privacy by default as core design principles.
[01:46:55.840 --> 01:46:58.600] Importance of transparent system design and open governance.
[01:46:58.600 --> 01:47:04.640] Yes, this has to be baked into it, has to be thought about from the very beginning.
[01:47:04.640 --> 01:47:08.520] It can't be this thing where we're like, oh, we'll address that later.
[01:47:08.520 --> 01:47:11.480] If it's not there from the start, it won't happen at all.
[01:47:11.480 --> 01:47:15.800] Far from the hype of accelerationist and technophile circles, a quiet crisis of confidence is taking
[01:47:15.800 --> 01:47:18.920] hold in emerging technologies.
[01:47:18.920 --> 01:47:22.480] Crypto and decentralized identity solutions still carry enormous potential to empower
[01:47:22.480 --> 01:47:27.480] individuals and distribute power, but many builders and users are sounding an alarm.
[01:47:27.480 --> 01:47:32.040] Their disillusionment stems from real concerns, surveillance overreach, centralization disguised
[01:47:32.040 --> 01:47:36.480] as innovation and tools that serve power, not people.
[01:47:36.480 --> 01:47:40.840] From deepfake scams and AI impersonation to state-backed biometric ID proposals and the
[01:47:40.840 --> 01:47:47.440] EU AI Act, digital rights are being defined in real time, often without public consent.
[01:47:47.440 --> 01:47:48.440] Yeah.
[01:47:48.440 --> 01:47:52.040] Just remember, they don't call us the stakeholders.
[01:47:52.040 --> 01:47:55.000] There are stakeholders, but we're not them.
[01:47:55.000 --> 01:48:01.360] It's someone else, someone with a lot more money and influence.
[01:48:01.360 --> 01:48:02.840] Is gold getting too pricey?
[01:48:02.840 --> 01:48:04.600] Here's where smart money is rotating next.
[01:48:04.600 --> 01:48:05.600] This is from Zero Hedge.
[01:48:05.600 --> 01:48:08.380] When gold gets expensive, buyers start looking for better value.
[01:48:08.380 --> 01:48:12.980] That's exactly what we're seeing in 2025, in both the bullion market and jewelry buying.
[01:48:12.980 --> 01:48:16.900] With gold hovering above $3,300 per ounce, some stackers are looking for options.
[01:48:16.900 --> 01:48:24.940] Two metals are absorbing the shift, platinum and silver.
[01:48:24.940 --> 01:48:27.340] Silver affordable, practical and gaining fast.
[01:48:27.340 --> 01:48:31.540] Silver, often overlooked during gold's bull runs, is now back in the spotlight, up 24%
[01:48:31.540 --> 01:48:32.540] this year.
[01:48:32.540 --> 01:48:33.540] Why?
[01:48:33.540 --> 01:48:37.100] Because silver brings a combination of affordability and real world utility that's hard to match.
[01:48:37.100 --> 01:48:42.060] It's essential to electronics, solar, energy infrastructure and defense tech, and yet it's
[01:48:42.060 --> 01:48:46.160] still priced far below its 2011 highs.
[01:48:46.160 --> 01:48:49.900] Bullion demand remains strong, especially among first time buyers and those stacking
[01:48:49.900 --> 01:48:52.040] incrementally.
[01:48:52.040 --> 01:48:55.780] Platinum and silver are today's smart money trade, accessible, underpriced, and rising for
[01:48:55.780 --> 01:48:58.020] very different reasons.
[01:48:58.020 --> 01:49:00.060] Platinum offers scarcity and explosive momentum.
[01:49:00.060 --> 01:49:06.820] Silver brings volume, liquidity and long-term demand across industrial monetary sectors.
[01:49:07.540 --> 01:49:13.740] Fed Chair Powell criminally referred to DOJ for perjury.
[01:49:13.740 --> 01:49:20.700] President Trump kinda sorta denied reported plans to fire Fed Chair Jerome Powell.
[01:49:20.700 --> 01:49:23.260] We're not planning on doing it, he said Wednesday at the White House.
[01:49:23.260 --> 01:49:26.700] I don't rule out anything yet, but I think it's highly unlikely unless he has to leave
[01:49:26.700 --> 01:49:29.700] for fraud.
[01:49:29.700 --> 01:49:31.140] Unless this, unless that.
[01:49:31.140 --> 01:49:34.060] By the way, a referral to the DOJ.
[01:49:34.060 --> 01:49:38.860] But now that latter comment is coming into play, as Rep. Anna Paulina Luna, Republican,
[01:49:38.860 --> 01:49:42.300] refers Powell to the Department of Justice for criminal charges, accusing him of two specific
[01:49:42.300 --> 01:49:47.260] instances of lying under oath.
[01:49:47.260 --> 01:49:50.980] Does the Fed Chair ever not lie?
[01:49:50.980 --> 01:49:53.940] In his statements, he made several materially false claims.
[01:49:53.940 --> 01:49:58.320] Again, what Fed Chair hasn't?
[01:49:58.320 --> 01:50:03.540] Lying about lavish amenities at the Federal Reserve's Eccles Building and misrepresenting
[01:50:03.540 --> 01:50:07.820] its state of maintenance, Powell characterized the changes that escalated the cost of the
[01:50:07.820 --> 01:50:11.020] project from $1.9 billion to $2.5 billion as minor.
[01:50:11.020 --> 01:50:23.660] However, documents reviewed by Congressional investigators indicate that the scope and
[01:50:23.660 --> 01:50:28.260] cost overruns of this project were neither minor in nature nor in substance.
[01:50:28.260 --> 01:50:34.820] Just a minor overrun of $0.6 billion.
[01:50:34.820 --> 01:50:38.780] His statement that the cost increase was to simplify construction to avoid further delays
[01:50:38.780 --> 01:50:39.780] was false.
[01:50:39.780 --> 01:50:43.820] It is contradicted by the Federal Reserve's final submission to the National Capital Planning
[01:50:43.820 --> 01:50:44.820] Commission.
[01:50:44.820 --> 01:50:50.180] According to those records, the revised plan includes a VIP private dining room, premium
[01:50:50.180 --> 01:50:56.020] marble finishes, modernized elevators, water features, and a roof terrace garden, features
[01:50:56.020 --> 01:50:58.460] that Powell publicly denied existed.
[01:50:58.460 --> 01:51:05.980] You can't have the plebs knowing that you've got a rooftop garden or premium marble finishes.
[01:51:05.980 --> 01:51:07.900] That's a rookie mistake.
[01:51:07.900 --> 01:51:12.540] While Powell presented the changes as simplifications, the actual project plan suggests the opposite.
[01:51:12.540 --> 01:51:17.660] Yeah, I don't imagine adding a rooftop garden as a simplification.
[01:52:17.660 --> 01:52:21.940] While Trump and his allies would clearly like to see a Fed Chair cut rates, there are unintended
[01:52:21.940 --> 01:52:24.260] consequences they could be missing here.
[01:52:24.260 --> 01:52:28.420] Firing and replacing Powell would make investors nervous about the stability of the Fed and
[01:51:28.420 --> 01:51:32.060] its ability to deliver low and stable price inflation.
[01:51:32.060 --> 01:51:37.140] This could push longer-term interest rates up, the opposite of Trump's goal.
[01:51:37.140 --> 01:51:38.140] Yeah.
[01:51:38.140 --> 01:51:45.700] Well, that's what's happening in the world of finance, the world of CBDCs.
[01:51:45.700 --> 01:51:50.220] Still, the most surprising thing to me is just Fetterman doing something I agree with.
[01:51:50.220 --> 01:51:54.540] As I said, I didn't really expect anything from him.
[01:51:54.540 --> 01:51:57.100] I expected him to just kind of sit there.
[01:51:57.100 --> 01:51:59.580] But here we are.
[01:51:59.580 --> 01:52:01.340] Surprising and strange times.
[01:52:01.340 --> 01:52:04.980] Angry Tigers Den, CBDC will look like an angel compared to what they have in store for us
[01:52:04.980 --> 01:52:08.980] with this privatized stablecoin system.
[01:52:08.980 --> 01:52:12.820] There's always a carrot and a stick, you know.
[01:52:12.820 --> 01:52:13.820] Don't you want this?
[01:52:13.820 --> 01:52:15.540] It's going to be better than this other thing.
[01:52:15.540 --> 01:52:17.820] We're going to do something terrible.
[01:52:17.820 --> 01:52:19.540] So you get to choose which one it is.
[01:52:19.540 --> 01:52:21.620] This one's really, really bad.
[01:52:21.620 --> 01:52:24.780] This one's only really bad.
[01:52:24.780 --> 01:52:25.780] Knights of the Storm.
[01:52:25.780 --> 01:52:29.660] This was a plan set up carefully, and many distractions like Epstein took our eyes off the ball.
[01:52:29.660 --> 01:52:33.220] Trump banning CBDC in an order was more kibble for the MAGA base.
[01:52:33.220 --> 01:52:35.180] He just did the opposite.
[01:52:35.180 --> 01:52:38.900] Marjorie Taylor Greene came out against the Genius Act, but she still will not place it
[01:52:38.900 --> 01:52:39.900] on Trump.
[01:52:39.900 --> 01:52:42.860] They refuse to lay any blame at his feet.
[01:52:42.860 --> 01:52:47.060] They absolutely refuse to hold him accountable for his actions.
[01:52:48.060 --> 01:52:54.500] They'll discuss the bill itself or the piece of legislation, say how horrible it is, without
[01:52:54.500 --> 01:52:58.940] ever pointing out the fact that Trump is in favor of it.
[01:52:58.940 --> 01:53:00.580] Pezzavante 1776.
[01:53:00.580 --> 01:53:03.140] Trump is the biggest political Trojan horse in history.
[01:53:03.140 --> 01:53:05.380] He's the biggest, the best Trojan horse there ever was.
[01:53:05.380 --> 01:53:06.380] That's what people are saying.
[01:53:06.380 --> 01:53:08.180] I'm the best Trojan horse.
[01:53:08.180 --> 01:53:09.180] Knights of the Storm.
[01:53:09.180 --> 01:53:14.580] When you have almost unanimous bipartisan support on something, it is inherently bad.
[01:53:14.580 --> 01:53:19.060] When both parties agree on it, you know the American people are really going to get the
[01:53:19.060 --> 01:53:21.620] short end of the stick.
[01:53:21.620 --> 01:53:25.380] We're going to take a quick break, folks, and we will be right back.
[01:53:25.380 --> 01:53:26.380] Stay with us.
[01:55:44.580 --> 01:56:07.220] Defending the American Dream.
[01:56:07.220 --> 01:56:10.700] You're listening to the David Knight Show.
[01:56:10.700 --> 01:56:11.700] Wait a minute.
[01:56:11.700 --> 01:56:13.700] Where am I?
[01:56:13.820 --> 01:56:14.820] Sorry, Jefferson.
[01:56:14.820 --> 01:56:19.060] The scoundrels who put America on central bank fiat currency used our heads on their
[01:56:19.060 --> 01:56:22.700] coins as some sort of trophy, despicable.
[01:56:22.700 --> 01:56:23.700] This is outrageous.
[01:56:23.700 --> 01:56:24.700] Washington.
[01:56:24.700 --> 01:56:28.260] I spent my life fighting centralized power.
[01:56:28.260 --> 01:56:33.180] Now the Federal Reserve monopoly parades us around on their monopoly money.
[01:56:33.180 --> 01:56:35.700] Tell me there's some good news to all this.
[01:56:35.700 --> 01:56:40.220] Well, there is a coin they can't control, one that isn't backed by the Fed, but backed
[01:56:40.220 --> 01:56:41.420] by the Fed-up.
[01:56:41.420 --> 01:56:44.580] The all new David Knight Show commemorative coin.
[01:56:44.580 --> 01:56:49.180] Now patriots can support a show that won't sell out with a limited edition coin that's
[01:56:49.180 --> 01:56:51.060] sure to sell out quickly.
[01:56:51.060 --> 01:56:54.580] They say money talks and this coin has something worth listening to.
[01:56:54.580 --> 01:57:08.340] The truth doesn't need inflation, only support.
[01:57:08.340 --> 01:57:09.540] Welcome back, folks.
[01:57:09.660 --> 01:57:14.420] Yibiru 2029 says, as with any currency, if you can't hold it, you don't own it.
[01:57:14.420 --> 01:57:15.420] That's right.
[01:57:15.420 --> 01:57:17.100] They can simply turn it off.
[01:57:17.100 --> 01:57:21.220] Shelley A.: They already installed the equipment for the phone app stablecoins.
[01:57:21.220 --> 01:57:22.580] Isn't that wonderful?
[01:57:22.580 --> 01:57:25.820] At the grocery store, it's going to be.
[01:57:25.820 --> 01:57:29.180] That's how you know this is coming down the pipeline.
[01:57:29.180 --> 01:57:35.000] When the big businesses are getting in on it, that can be an assurance.
[01:57:35.000 --> 01:57:37.220] They see what's coming down the pipeline.
[01:57:37.220 --> 01:57:42.220] If they're already investing in the infrastructure, they're confident it's happening.
[01:57:42.220 --> 01:57:46.860] KUWD 68, Trump is marching us to 2030, just like Biden did, winning.
[01:57:46.860 --> 01:57:47.860] That's right.
[01:57:47.860 --> 01:57:51.500] We're building back bigger and better than ever before.
[01:57:51.500 --> 01:57:54.260] You will own nothing, but there'll be so much nothing.
[01:57:54.260 --> 01:57:58.420] There'll be more nothing than you've ever seen before.
[01:57:58.420 --> 01:58:02.940] This is an aside, a tangent, but one of my favorite jokes at the moment is: minimalism
[01:58:02.940 --> 01:58:06.300] is a scam by Big Nothing to sell you more of less.
[01:58:06.300 --> 01:58:08.780] I despise minimalism as a trend.
[01:58:08.780 --> 01:58:10.780] I find it...
[01:58:10.780 --> 01:58:15.340] It's the art of putting nothing in your house and thinking yourself tasteful for it.
[01:58:15.340 --> 01:58:17.660] Oh, look, the house is so clean.
[01:58:17.660 --> 01:58:20.300] Yeah, it's because you don't actually have anything.
[01:58:20.300 --> 01:58:21.820] You don't live here.
[01:58:21.820 --> 01:58:23.740] House is supposed to be comfortable and inviting.
[01:58:23.740 --> 01:58:28.740] I don't want to sit on your terrible couch.
[01:58:28.740 --> 01:58:31.780] Well, that was an aside.
[01:58:31.780 --> 01:58:36.020] No more ranting about aesthetics from me right now.
[01:58:36.020 --> 01:58:40.860] Breaking: Trump admin releases FBI records on Martin Luther King Jr.
[01:58:40.860 --> 01:58:44.380] The documents have been under court-imposed seal since 1977.
[01:58:44.380 --> 01:58:47.060] This is yet another distraction.
[01:58:47.060 --> 01:58:48.780] He's desperate over here.
[01:58:48.780 --> 01:58:54.260] Got to find some way to distract people, whatever red herring works, whether it's from Epstein
[01:58:54.260 --> 01:58:58.580] or what's going on with the CBDC and the stable coins.
[01:58:58.580 --> 01:59:04.940] I mean, I'm sure he didn't plan the Epstein debacle that cost him so much of his support.
[01:59:05.180 --> 01:59:09.300] That wasn't an intentional distraction, in my opinion.
[01:59:09.300 --> 01:59:10.300] But it is.
[01:59:10.300 --> 01:59:12.660] Oh, look, here's Martin Luther King.
[01:59:12.660 --> 01:59:15.940] Everyone, please ignore my history.
[01:59:15.940 --> 01:59:19.300] The Trump administration has released FBI records on the surveillance of Martin Luther
[01:59:19.300 --> 01:59:20.300] King Jr.
[01:59:20.300 --> 01:59:21.300] Well, that's great.
[01:59:21.300 --> 01:59:24.780] Now we know he was under surveillance like we already did.
[01:59:24.780 --> 01:59:28.180] With hundreds of thousands of pages of documents that have been under court-imposed seal since
[01:59:28.180 --> 01:59:34.860] 1977, the release of the records marks a promise kept by President Donald Trump, who long...
[01:59:35.180 --> 01:59:36.180] That is newsworthy.
[01:59:36.180 --> 01:59:37.180] Yeah.
[01:59:37.180 --> 01:59:39.100] Oh boy, he kept a promise.
[01:59:39.100 --> 01:59:40.820] Hundreds of thousands of documents.
[01:59:40.820 --> 01:59:41.820] Hmm.
[01:59:41.820 --> 01:59:44.340] So this... so you do know how to do it.
[01:59:44.340 --> 01:59:45.340] Yeah.
[01:59:45.340 --> 01:59:51.940] Trump long campaigned on a promise to release records regarding the assassinations of John F. Kennedy
[01:59:51.940 --> 01:59:55.620] and Robert F. Kennedy, as well as King's. Over 200,000 documents on King were released on Monday,
[01:59:55.620 --> 01:59:57.340] per the Associated Press.
[01:59:57.340 --> 01:59:59.940] Again, this is a don't look over there.
[01:59:59.940 --> 02:00:01.540] Look over here.
[02:00:01.540 --> 02:00:03.500] Don't pay attention to what's going on with Epstein.
[02:00:03.500 --> 02:00:08.900] Don't you want to read 200,000 pages of FBI documents about Martin Luther King?
[02:00:08.900 --> 02:00:15.140] The files include FBI memos and CIA intelligence on King, as well as records on his assassination in 1968.
[02:00:15.140 --> 02:00:17.860] King's family cautioned the public over the release of the files, with his two living children
[02:00:17.860 --> 02:00:19.580] putting out a statement on Monday.
[02:00:19.580 --> 02:00:25.860] And of course, my dad interviewed.
[02:00:25.860 --> 02:00:27.580] Was it Pepper?
[02:00:27.580 --> 02:00:29.380] Is that his name?
[02:00:29.380 --> 02:00:31.380] I can't remember.
[02:00:31.380 --> 02:00:35.860] As the children of Dr. King and Mrs. Coretta Scott King, his tragic death has been an intensely
[02:00:35.860 --> 02:00:39.300] personal grief, a devastating loss for his wife, children, and the granddaughter he never
[02:00:39.300 --> 02:00:40.300] met.
[02:00:40.300 --> 02:00:42.900] An absence our family has endured for over 57 years.
[02:00:42.900 --> 02:00:46.660] They said, we ask those who engage with the release of these files to do so with empathy,
[02:00:46.660 --> 02:00:49.700] restraint, and respect for our family's continuing grief.
[02:00:49.700 --> 02:00:57.460] I hope they do, but as a general rule, the internet is cruel and uncaring.
[02:00:58.460 --> 02:01:03.980] Whatever is in there, people are going to use it.
[02:01:03.980 --> 02:01:12.380] Also, after 57 years, it's okay to be a little bit irreverent.
[02:01:12.380 --> 02:01:18.980] There is a matter of too soon, and 57 years is not too soon.
[02:01:18.980 --> 02:01:21.940] The family also asked that the files be viewed within their full historical context.
[02:01:21.940 --> 02:01:25.420] Bernice King was 5 years old at the time her father was killed, and Martin Luther King
[02:01:25.420 --> 02:01:26.420] III was 10.
[02:01:26.940 --> 02:01:31.140] The records were initially going to be sealed until 2027, but the DOJ asked a federal judge
[02:01:31.140 --> 02:01:33.100] to lift the seal ahead of the date.
[02:01:33.100 --> 02:01:36.060] Oh boy, we get it, two years ahead of time.
[02:01:36.060 --> 02:01:37.420] Isn't that wonderful?
[02:01:37.420 --> 02:01:43.060] Two whole years kept this on lockdown all this time.
[02:01:43.060 --> 02:01:44.860] Promises made, promises kept.
[02:01:44.860 --> 02:01:46.900] Two years early.
[02:01:46.900 --> 02:01:50.100] I for one could not have waited those two more years.
[02:01:50.100 --> 02:01:54.660] I for one couldn't have handled it.
[02:01:54.660 --> 02:01:58.020] The civil rights leader was of high interest to intelligence agencies, and was relentlessly
[02:01:58.020 --> 02:02:01.700] targeted by an invasive, predatory, and deeply disturbing disinformation and surveillance
[02:02:01.700 --> 02:02:06.540] campaign, orchestrated by J. Edgar Hoover through the Federal Bureau of Investigation.
[02:02:06.540 --> 02:02:10.660] Good thing we don't do that anymore.
[02:02:10.660 --> 02:02:15.300] I want to say, they even went so far as to send him letters saying he should just kill
[02:02:15.300 --> 02:02:16.300] himself.
[02:02:16.300 --> 02:02:21.780] Just, oh, you know, with all this information we have, you should probably just end it all.
[02:02:21.780 --> 02:02:24.260] Just kill yourself.
[02:02:24.260 --> 02:02:28.940] The intent of the government's COINTELPRO campaign was not only to monitor, but to discredit,
[02:02:28.940 --> 02:02:34.420] dismantle, and destroy Dr. King's reputation and the broader American civil rights movement.
[02:02:34.420 --> 02:02:38.980] These actions were not only invasions of privacy, but intentional assaults on the truth, undermining
[02:02:38.980 --> 02:02:44.480] the dignity and freedoms of private citizens who fought for justice, designed to neutralize
[02:02:44.480 --> 02:02:49.300] those who dared to challenge the status quo.
[02:02:49.300 --> 02:02:52.620] King was assassinated in 1968 while he was in Memphis, Tennessee.
[02:02:52.620 --> 02:02:56.420] James Earl Ray pleaded guilty, but renounced the plea and maintained that he was innocent
[02:02:56.420 --> 02:02:58.900] until his death in 1998.
[02:02:58.900 --> 02:03:03.780] Some had questioned whether Ray acted alone in the killing.
[02:03:03.780 --> 02:03:09.700] Wasn't it the King family that took the government to trial in a civil case?
[02:03:09.700 --> 02:03:13.900] And the judge basically said, yeah, you're right, the government did have some involvement
[02:03:13.900 --> 02:03:14.900] in it.
[02:03:14.900 --> 02:03:18.660] But since it was a civil suit, nothing ever happened.
[02:03:18.660 --> 02:03:24.460] There's so many different lawsuits and cases that go back and forth.
[02:03:24.460 --> 02:03:31.780] Regardless, it's, again, another distraction.
[02:03:31.780 --> 02:03:39.020] 200,000 pages released, dumped out on the internet at a time where everyone is desperate
[02:03:39.020 --> 02:03:42.260] for the Epstein files.
[02:03:42.260 --> 02:03:43.420] Would you settle for this?
[02:03:43.420 --> 02:03:47.780] How about this 50 year old info about Martin Luther King?
[02:03:47.780 --> 02:03:50.140] Does that interest you?
[02:03:50.140 --> 02:03:57.900] Trump administration releases FBI records on MLK Jr.'s assassination, release of 230,000
[02:03:57.900 --> 02:03:58.900] files.
[02:03:58.900 --> 02:04:03.580] And of course, we already knew he was under surveillance.
[02:04:03.580 --> 02:04:06.900] We already knew the types of tactics they were engaging in.
[02:04:06.900 --> 02:04:13.620] I have a comment here from David Knight, who says: Dr. Pepper, that's his real name, no relation,
[02:04:14.620 --> 02:04:19.660] was the defense attorney for the guy they pinned the Martin Luther King assassination
[02:04:19.660 --> 02:04:20.660] on.
[02:04:20.660 --> 02:04:25.420] King's family didn't believe it and they thought they knew who it was.
[02:04:25.420 --> 02:04:29.940] I think it was a retired cop who was part of the government conspiracy.
[02:04:29.940 --> 02:04:35.060] They filed a civil lawsuit and Dr. Pepper represented them and won.
[02:04:35.060 --> 02:04:38.380] When the evidence was presented to the jury, they didn't believe the government's official
[02:04:38.380 --> 02:04:40.380] story.
[02:04:40.380 --> 02:04:47.140] That's what I was remembering, yeah.
[02:04:47.140 --> 02:04:55.860] And in more distraction news: Trump, 79, posts deranged AI video of Obama being arrested.
[02:04:55.860 --> 02:04:59.060] That's right, he's posting these AI videos.
[02:04:59.060 --> 02:05:00.060] Look at this.
[02:05:00.060 --> 02:05:03.420] Look at this imaginary scenario where Barack Obama is arrested.
[02:05:03.420 --> 02:05:04.420] Isn't that cool?
[02:05:04.420 --> 02:05:05.420] Isn't that fun?
[02:05:05.420 --> 02:05:09.980] Except he didn't do anything to arrest Hillary Clinton for her crimes when he first got in,
[02:05:09.980 --> 02:05:11.540] in 2016.
[02:05:11.540 --> 02:05:16.420] Another promise he made and didn't keep, we're going to lock her up.
[02:05:16.420 --> 02:05:19.460] But he didn't do that.
[02:05:19.460 --> 02:05:22.780] The video depicts the arrest and imprisonment of former President Barack Obama based on
[02:05:22.780 --> 02:05:25.140] claims made by Tulsi Gabbard.
[02:05:25.140 --> 02:05:28.140] Trump's history of normalizing the idea of using the Justice Department to target political
[02:05:28.140 --> 02:05:31.820] enemies is what's being normalized here, they say.
[02:05:31.820 --> 02:05:34.300] We've actually got the AI video he posted.
[02:05:34.300 --> 02:05:35.500] Let's take a look at that.
[02:05:35.500 --> 02:05:39.140] Let's look at this wonderful world he's envisioning for us.
[02:05:39.140 --> 02:05:47.740] Oh, look.
[02:05:47.740 --> 02:05:48.740] Isn't that fun?
[02:05:48.740 --> 02:05:52.060] Oh my goodness, they're arresting him.
[02:05:52.060 --> 02:05:53.060] Trump is smiling.
[02:05:53.060 --> 02:05:54.060] He's so happy.
[02:05:54.060 --> 02:05:55.060] They're locking him up.
[02:05:55.060 --> 02:05:56.060] They're putting him in handcuffs.
[02:05:56.060 --> 02:06:00.900] They're going to take him away.
[02:06:00.900 --> 02:06:03.340] Only in the fantasies do we see.
[02:06:03.340 --> 02:06:06.940] Only in the AI videos does Trump actually do anything.
[02:06:07.540 --> 02:06:09.180] We're going to lock up Hillary Clinton.
[02:06:09.180 --> 02:06:10.980] No, we're not going to do that.
[02:06:10.980 --> 02:06:15.540] Would you settle for an AI video of Donald Trump being arrested?
[02:06:15.540 --> 02:06:17.500] Does that tickle your fancy?
[02:06:17.500 --> 02:06:20.740] Does that make you feel happy?
[02:06:20.740 --> 02:06:23.900] You can imagine what it would be like if I did do this, right?
[02:06:23.900 --> 02:06:27.580] Wouldn't it be cool?
[02:06:27.580 --> 02:06:28.980] President Donald Trump shared a fake video.
[02:06:28.980 --> 02:06:33.060] He's going to release an AI video of him releasing the Epstein list.
[02:06:33.740 --> 02:06:37.540] Yeah, look, here's an AI video of what it would be like if I did release the files.
[02:06:37.540 --> 02:06:39.580] Got another comment from David Knight.
[02:06:39.580 --> 02:06:42.060] I'm sure Trump isn't in the MLK files, at least.
[02:06:42.060 --> 02:06:44.420] That's one thing he's got going for him.
[02:06:44.420 --> 02:06:48.780] Two hundred and thirty something thousand pages and not one mention of Donald Trump
[02:06:48.780 --> 02:06:49.860] most likely.
[02:06:49.860 --> 02:06:54.460] So he could release those without fear.
[02:06:54.460 --> 02:06:59.140] Trump shared the video from a pro-MAGA TikTok user to his Truth Social platform on Sunday.
[02:06:59.660 --> 02:07:04.740] That's right, he still has all these MAGA grifters on TikTok or Twitter, social media
[02:07:04.740 --> 02:07:08.940] in general, just posting this kind of garbage drivel.
[02:07:08.940 --> 02:07:12.340] Hey, look, isn't Trump cool?
[02:07:12.340 --> 02:07:16.140] Look at this AI video of him arresting Barack Obama.
[02:07:16.140 --> 02:07:17.140] Look at this.
[02:07:17.140 --> 02:07:20.940] Haha, meme magic, everyone.
[02:07:20.940 --> 02:07:24.020] Nothing of substance ever really happens, though.
[02:07:24.020 --> 02:07:26.260] We're going to...
[02:07:26.260 --> 02:07:30.380] The MAGA base already lives in a delusion.
[02:07:30.380 --> 02:07:37.660] They're already so incredibly out to lunch that they can get the satisfaction of, I imagine,
[02:07:37.660 --> 02:07:42.140] they can get the satisfaction of something happening just by watching the AI video.
[02:07:42.140 --> 02:07:44.020] Like, oh, yeah, that's good stuff.
[02:07:44.020 --> 02:07:45.020] Yeah.
[02:07:45.020 --> 02:07:49.620] Yeah, Donnie, arrest Barack Obama.
[02:07:49.620 --> 02:07:53.460] And he never has to do anything.
[02:07:53.460 --> 02:07:54.460] Syrian girl.
[02:07:54.460 --> 02:07:58.500] They would rather discredit MLK Jr. than themselves, which would be the case if they unlocked the
[02:07:58.500 --> 02:07:59.500] Epstein files.
[02:07:59.500 --> 02:08:02.300] Yeah, that's a nuclear bomb.
[02:08:02.300 --> 02:08:05.300] That's a mutually assured destruction scenario.
[02:08:05.300 --> 02:08:09.060] Too many people in power are implicated in the Epstein files for them to ever see the light
[02:08:09.060 --> 02:08:10.700] of day.
[02:08:10.700 --> 02:08:13.100] That's, again, the argument Trump makes.
[02:08:13.100 --> 02:08:14.100] Why won't they release it?
[02:08:14.100 --> 02:08:16.940] Because they're all in it as well.
[02:08:16.940 --> 02:08:22.660] They're not going to go down with a ship just to sink Donald Trump.
[02:08:22.660 --> 02:08:24.140] The feud doesn't go that deep.
[02:08:24.140 --> 02:08:25.540] They're all working towards the same agenda.
[02:08:25.540 --> 02:08:31.460] Now, I'm sure they do hate Donald Trump, but not because he's for the American people,
[02:08:31.460 --> 02:08:35.300] just simply because he's a dislikable person.
[02:08:35.300 --> 02:08:36.960] He's unpleasant.
[02:08:36.960 --> 02:08:40.300] He made them look like fools.
[02:08:40.300 --> 02:08:41.300] So look at 1980.
[02:08:41.300 --> 02:08:44.100] I knew that Trump would attempt to do some good things, to distract from the Epstein
[02:08:44.100 --> 02:08:45.100] debacle.
[02:08:45.100 --> 02:08:46.100] Yeah.
[02:08:46.100 --> 02:08:48.740] He's going to shuffle things around.
[02:08:48.740 --> 02:08:51.000] Look, here's the MLK files.
[02:08:51.000 --> 02:08:52.000] What else can I do?
[02:08:52.400 --> 02:08:53.400] Are you still not happy?
[02:08:53.400 --> 02:08:54.400] He'll continue to.
[02:08:54.400 --> 02:08:55.400] Here's the MLK files.
[02:08:55.400 --> 02:08:57.000] Are you still not happy?
[02:08:57.000 --> 02:09:00.760] What if I created an AI video of me arresting Obama?
[02:09:00.760 --> 02:09:02.920] What if I tweeted that out?
[02:09:02.920 --> 02:09:04.240] Would that be cool?
[02:09:04.240 --> 02:09:05.240] Come on.
[02:09:05.240 --> 02:09:06.600] Move on from Epstein already.
[02:09:06.600 --> 02:09:07.600] Knights of the Storm.
[02:09:07.600 --> 02:09:14.080] This is a step beyond the Idiocracy president doing professional wrestling.
[02:09:14.080 --> 02:09:16.800] At least that took some skill.
[02:09:16.800 --> 02:09:18.800] This is fake professional wrestling.
[02:09:18.800 --> 02:09:21.280] This is faker professional wrestling.
[02:09:21.280 --> 02:09:22.960] Excuse me.
[02:09:22.960 --> 02:09:26.240] This is just created out of whole cloth.
[02:09:26.240 --> 02:09:27.800] Donald Trump doesn't have to do anything.
[02:09:27.800 --> 02:09:29.160] He doesn't even have to act.
[02:09:29.160 --> 02:09:33.200] There's an AI version of him that will do it.
[02:09:33.200 --> 02:09:34.200] Knights of the Storm.
[02:09:34.200 --> 02:09:35.200] I remember that case.
[02:09:35.200 --> 02:09:36.680] It was Dr. Pepper versus Mr. Pibb.
[02:09:36.680 --> 02:09:41.920] Dr. Thunder was on the jury, so there are still questions.
[02:09:41.920 --> 02:09:46.920] That's a great comment from Jason Barker.
[02:09:46.920 --> 02:09:53.680] This followed his Director of National Intelligence's announcement on Friday that she was referring
[02:09:53.680 --> 02:09:58.080] Obama administration officials to the Justice Department for prosecution over allegations
[02:09:58.080 --> 02:10:02.480] they manufactured intelligence to promote the idea that Russia interfered in the 2016
[02:10:02.480 --> 02:10:03.480] election.
[02:10:03.480 --> 02:10:07.640] Trump has posted at least 17 times about Gabbard's announcement since Friday.
[02:10:07.640 --> 02:10:08.640] Yeah.
[02:10:08.640 --> 02:10:13.680] And of course, they're doing this again as more obfuscation.
[02:10:13.680 --> 02:10:19.000] Look, we're referring some Obama administration officials to the Justice Department.
[02:10:19.000 --> 02:10:24.920] We're going to send the Fed Chair Powell over to the DOJ as well.
[02:10:24.920 --> 02:10:25.920] Come on, guys.
[02:10:25.920 --> 02:10:27.440] Stop asking about Epstein.
[02:10:27.440 --> 02:10:30.000] Don't you want to pay attention to these things?
[02:10:30.000 --> 02:10:33.520] Gabbard claimed that the newly declassified documents were evidence that Obama and some
[02:10:33.520 --> 02:10:37.040] of his cabinet members politicized intelligence.
[02:10:37.040 --> 02:10:40.160] No, they wouldn't do that.
[02:10:40.160 --> 02:10:45.960] They wouldn't use intelligence or information for political reasons, for political gain?
[02:10:45.960 --> 02:10:47.960] No, come on.
[02:10:47.960 --> 02:10:49.600] This is such a nothing.
[02:10:49.600 --> 02:10:50.960] Yeah, of course they did.
[02:10:50.960 --> 02:10:53.000] Of course they did.
[02:10:53.000 --> 02:10:54.680] How is this news?
[02:10:54.680 --> 02:10:58.580] We've all known this, but they're doing it right now.
[02:10:58.580 --> 02:11:00.200] Another distraction.
[02:11:00.200 --> 02:11:01.840] Many other Trump supporters have gotten on board.
[02:11:01.840 --> 02:11:05.820] The Obama arrest video was shared by MAGA fans on social media Sunday night.
[02:11:05.820 --> 02:11:10.040] Make this a reality, right-wing journalist Nick Sortor wrote on X, tagging Attorney
[02:11:10.040 --> 02:11:11.680] General Pam Bondi.
[02:11:11.680 --> 02:11:12.680] That's right.
[02:11:12.680 --> 02:11:17.320] We're going to at the politicians on X and we're going to get Barack Obama arrested.
[02:11:17.320 --> 02:11:18.900] That's what we're going to do, folks.
[02:11:18.900 --> 02:11:19.900] We're going to get on Twitter.
[02:11:19.900 --> 02:11:21.680] We're going to make some hashtags.
[02:11:21.680 --> 02:11:23.460] We're going to at Pam Bondi.
[02:11:23.460 --> 02:11:24.840] Maybe Donald Trump himself.
[02:11:24.840 --> 02:11:32.600] He had AI-generated ballots with AI-generated blockchain watermarks.
[02:11:32.600 --> 02:11:36.560] This Nick Sortor, I've never even heard of him.
[02:11:36.560 --> 02:11:40.920] There's so many different grifters and people.
[02:11:40.920 --> 02:11:42.260] You can't keep track of them all.
[02:11:42.260 --> 02:11:43.960] Never heard of this guy.
[02:11:43.960 --> 02:11:48.960] Trump, a convicted criminal, has increasingly normalized the idea of using the Justice Department
[02:11:48.960 --> 02:11:50.840] to go after his political enemies.
[02:11:50.840 --> 02:11:55.280] On Sunday night alone, he also floated sending Democratic Senator Adam Schiff to prison and
[02:11:55.280 --> 02:11:59.840] posted a collage depicting fake mugshots of various Obama-era officials, including James
[02:11:59.840 --> 02:12:04.240] Comey, Samantha Power, and Susan Rice, wearing orange jumpsuits.
[02:12:04.240 --> 02:12:05.840] That's right.
[02:12:05.840 --> 02:12:07.440] Red meat for the MAGA base.
[02:12:07.440 --> 02:12:08.920] Look, isn't it funny?
[02:12:08.920 --> 02:12:12.440] I'm going to talk about sending Adam Schiff to prison.
[02:12:12.440 --> 02:12:18.520] I'm going to post these photoshopped images of these people you hate in orange jumpsuits.
[02:12:18.520 --> 02:12:21.000] That's what you guys elected me to do, right?
[02:12:21.000 --> 02:12:29.400] To post fake pictures and talk about things I'm never going to do.
[02:12:29.400 --> 02:12:34.280] Trump was found guilty in May 2024 on 34 felony counts of falsifying business records.
[02:12:34.760 --> 02:12:39.720] And again, this is partially what inoculated him from criticism.
[02:12:39.720 --> 02:12:44.960] They brought him in and had these trials and made it.
[02:12:44.960 --> 02:12:48.620] They made him a martyr to his base.
[02:12:48.620 --> 02:12:49.920] Look how they persecute him.
[02:12:49.920 --> 02:12:51.700] He's got to be for the American people.
[02:12:51.700 --> 02:12:54.400] Look who his enemies are.
[02:12:54.400 --> 02:12:59.360] That statement again, people always say, judge a man by his enemies.
[02:12:59.360 --> 02:13:01.560] And you know, sometimes you can do that.
[02:13:01.800 --> 02:13:06.560] But sometimes you need to judge a man by his friends, especially when they're friends with
[02:13:06.560 --> 02:13:08.200] Jeffrey Epstein.
[02:13:08.200 --> 02:13:12.560] Chevkin, Trump is trying to turn the clock back eight years now.
[02:13:12.560 --> 02:13:19.760] Yeah, he's trying to generate that sort of excitement and just electric sort of feeling
[02:13:19.760 --> 02:13:23.640] that was in the air in 2016 when he was all over the place talking about, we're going
[02:13:23.640 --> 02:13:25.600] to lock up Hillary Clinton.
[02:13:25.600 --> 02:13:29.520] You saw it on Twitter, all these people really believing it.
[02:13:29.520 --> 02:13:30.680] He's going to do it.
[02:13:30.680 --> 02:13:35.720] He's going to get in and he's going to send her to prison.
[02:13:35.720 --> 02:13:37.720] I don't think that would be a good thing to do.
[02:13:37.720 --> 02:13:40.800] I think we're going to move on from that.
[02:13:40.800 --> 02:13:44.320] Obama argues he can't be charged with treason since he wasn't born in America.
[02:13:44.320 --> 02:13:47.240] This is from the Babylon Bee, of course.
[02:13:47.240 --> 02:13:54.640] Now, occasionally they are still funny.
[02:13:54.640 --> 02:13:57.960] Republicans acknowledged that they could be left without any legal recourse after President
[02:13:57.960 --> 02:14:01.680] Obama absolved himself of any potential treason charge by reminding everyone that he couldn't
[02:14:01.680 --> 02:14:05.800] face any consequences since he was never a citizen to begin with.
[02:14:05.800 --> 02:14:06.800] That's right.
[02:14:06.800 --> 02:14:07.800] Get off scot-free.
[02:14:07.800 --> 02:14:15.520] Of course, I'm sure someone will say this is racist or problematic because, wow, you're
[02:14:15.520 --> 02:14:18.080] still bringing up that old birth certificate thing.
[02:14:18.080 --> 02:14:20.840] Yeah, it was obviously a fake.
[02:14:20.840 --> 02:14:23.480] It was obviously some kind of hoax.
[02:14:24.120 --> 02:14:28.320] FBI botched investigation into Hillary Clinton's emails, declassified documents allege.
[02:14:28.320 --> 02:14:30.080] This is from Fox News.
[02:14:30.080 --> 02:14:32.200] Wow, the FBI botching something?
[02:14:32.200 --> 02:14:33.840] No, that wouldn't happen.
[02:14:33.840 --> 02:14:38.600] Now, personally, I think this would probably be botched on purpose.
[02:14:38.600 --> 02:14:41.760] They don't want to have any real evidence of this sort of thing.
[02:14:41.760 --> 02:14:50.720] It makes their job of letting people off a lot easier if they somehow fail to collect evidence.
[02:14:50.720 --> 02:14:53.520] If they have to admit, yeah, we have some really damning evidence,
[02:14:53.520 --> 02:14:55.480] then they have to do something about that.
[02:14:55.480 --> 02:14:58.080] They have to wiggle their way out of it a different way.
[02:14:58.080 --> 02:15:05.780] But if they send in the three stooges to muck everything up and ruin the investigation,
[02:15:05.780 --> 02:15:07.240] they never have to get to that point.
[02:15:07.240 --> 02:15:09.200] Oops, sorry, we screwed it up.
[02:15:09.200 --> 02:15:10.200] Our bad.
[02:15:10.200 --> 02:15:12.360] No evidence exists.
[02:15:12.360 --> 02:15:13.800] Knights of the Storm.
[02:15:13.800 --> 02:15:15.960] How does Elon and Trump have all the time to tweet stuff?
[02:15:15.960 --> 02:15:16.960] They are hard at work for us.
[02:15:16.960 --> 02:15:20.340] I mean, Elon must be busy being the world's best gamer while working in his factory seven
[02:15:20.340 --> 02:15:22.020] days a week for 22 hours a day.
[02:15:22.020 --> 02:15:23.020] It's all BS.
[02:15:23.020 --> 02:15:24.020] Yeah.
[02:15:24.020 --> 02:15:31.760] You have to imagine that Trump is just sitting on Twitter seething, just constantly refreshing
[02:15:31.760 --> 02:15:36.200] the timeline, constantly scanning for updates on how people feel about him, given what a
[02:15:36.200 --> 02:15:39.100] narcissist he is.
[02:15:39.100 --> 02:15:43.260] Comey's decision-making process smacks of political infection, Senator Grassley said, railing
[02:15:43.260 --> 02:15:47.860] against the former FBI director.
[02:15:47.860 --> 02:15:55.500] Now, my dad said that we should send them all to jail, all directly to prison, Obama,
[02:15:55.500 --> 02:15:58.940] Hillary and Trump, because, of course, they're all criminals.
[02:15:58.940 --> 02:16:03.560] They all have violated the Constitution.
[02:16:03.560 --> 02:16:09.460] And Trump gave us Warp Speed, injected poison into people, mandated it, turned the country
[02:16:09.460 --> 02:16:11.820] over to Anthony Fauci.
[02:16:11.820 --> 02:16:16.780] Trump did more damage to the American people than Barack Obama and Hillary Clinton ever
[02:16:16.780 --> 02:16:22.500] did, which is truly incredible to say.
[02:16:22.500 --> 02:16:27.420] Speaker Johnson releases 14-minute video chronicling Biden's decline.
[02:16:27.420 --> 02:16:31.100] House Speaker Mike Johnson on Monday released a supercut of Democrats defending Joe Biden's
[02:16:31.100 --> 02:16:35.580] mental acuity as Republicans investigate an alleged coverup of the former president's
[02:16:35.580 --> 02:16:37.940] decline while in office.
[02:16:37.940 --> 02:16:41.620] And, of course, this is just another distraction.
[02:16:41.620 --> 02:16:46.620] Do we need Mike Johnson to come out and say, look, Biden really was in mental decline?
[02:16:47.260 --> 02:16:49.020] We know he was in mental decline.
[02:16:49.020 --> 02:16:50.340] It was obvious.
[02:16:50.340 --> 02:16:54.620] There was never any doubt about it.
[02:16:54.620 --> 02:16:58.100] We don't need these people to come out and confirm these things we already know.
[02:16:58.100 --> 02:17:01.500] It's simply another way for them to distract people.
[02:17:01.500 --> 02:17:05.260] Hey, everyone, look, the FBI was spying on MLK.
[02:17:05.260 --> 02:17:09.100] And look, look, Biden was going senile while in office.
[02:17:09.100 --> 02:17:11.940] Please ignore the Epstein files.
[02:17:11.940 --> 02:17:14.860] The FBI botched the investigation into Hillary Clinton.
[02:17:14.980 --> 02:17:18.460] Yeah, all these things are patently obvious.
[02:17:18.460 --> 02:17:20.020] This is business as usual.
[02:17:20.020 --> 02:17:22.300] Did anyone need this to be stated?
[02:17:22.300 --> 02:17:31.220] Was there anyone that could actually be reached by the truth that didn't know Biden was not
[02:17:31.220 --> 02:17:32.460] there?
[02:17:32.460 --> 02:17:37.660] You'll have, you know, liberals that will defend him because that's their guy.
[02:17:37.660 --> 02:17:39.780] They won't ever admit anything.
[02:17:39.780 --> 02:17:40.780] They won't move on it.
[02:17:40.780 --> 02:17:44.580] So no amount of evidence will sway them.
[02:17:44.580 --> 02:17:50.620] This is simply reaffirming what everyone on the right already knows, and even some liberals.
[02:17:50.620 --> 02:17:57.500] Some people on that side, on the left, will say, yeah, he was gone.
[02:17:57.500 --> 02:18:00.580] But they'll only say that as a way to say, well, we should have pivoted to a different
[02:18:00.580 --> 02:18:03.060] candidate sooner.
[02:18:03.060 --> 02:18:07.060] Trump says he may put restrictions on Commanders' new stadium deal if they don't change name
[02:18:07.060 --> 02:18:08.060] back to Redskins.
[02:18:08.060 --> 02:18:12.660] Oh, boy, he's going to force a sports team to go back to their original name that they
[02:18:12.660 --> 02:18:17.340] got rid of due to DEI woke nonsense.
[02:18:17.340 --> 02:18:18.340] Isn't that great?
[02:18:18.340 --> 02:18:19.340] He's really putting the pressure on.
[02:18:19.340 --> 02:18:21.980] He's really winning some battles for us.
[02:18:21.980 --> 02:18:26.300] It's going to change the name back to the Redskins.
[02:18:26.300 --> 02:18:30.620] It's more professional wrestling nonsense.
[02:18:30.620 --> 02:18:34.500] My statement on the Washington Redskins has totally blown up, but only in a very positive
[02:18:34.500 --> 02:18:35.500] way,
[02:18:35.500 --> 02:18:36.500] Trump wrote on Truth Social.
[02:18:36.500 --> 02:18:37.500] So it's big.
[02:18:37.500 --> 02:18:38.500] It's huge.
[02:18:38.500 --> 02:18:39.500] People are loving it.
[02:18:39.500 --> 02:18:40.500] Forget Epstein.
[02:18:41.500 --> 02:18:45.900] I may put a restriction on them if they don't change the name back to the original Washington
[02:18:45.900 --> 02:18:46.900] Redskins.
[02:18:46.900 --> 02:18:48.860] Get rid of the ridiculous moniker,
[02:18:48.860 --> 02:18:49.860] "Washington Commanders."
[02:18:49.860 --> 02:18:53.540] I'm sure he's mad because he wants to be the only commander in Washington.
[02:18:53.540 --> 02:18:54.880] Hey, wait a minute.
[02:18:54.880 --> 02:18:55.880] That's my title.
[02:18:55.880 --> 02:18:57.020] Only I get to issue commands.
[02:18:57.020 --> 02:19:00.980] Why not make it the Washington Cheats?
[02:19:00.980 --> 02:19:06.180] The Washington cheats, the Washington liars, the Washington scumbags and scoundrels, the
[02:19:06.180 --> 02:19:10.420] Washington philanderers, all of them are going to have to change their names.
[02:19:11.420 --> 02:19:15.120] Just forget Jeffrey Epstein.
[02:19:15.120 --> 02:19:16.420] Forget that I was friends with him.
[02:19:16.420 --> 02:19:18.820] Don't focus on that.
[02:19:18.820 --> 02:19:21.820] Look over here.
[02:19:21.820 --> 02:19:24.860] Focus on this nonsense.
[02:19:24.860 --> 02:19:30.340] To secure funding for stadium, Washington Commanders change name to Washington Bloodthirsty Injuns.
[02:19:30.340 --> 02:19:33.460] This is from the Babylon Bee, of course.
[02:19:33.460 --> 02:19:36.340] They're really, you know, they're turning it up a notch.
[02:19:36.340 --> 02:19:39.200] They're going to make sure that not only do they get funding, they get more funding than
[02:19:39.200 --> 02:19:43.040] anyone ever has.
[02:19:43.040 --> 02:19:46.760] As part of the deal for the construction of a new stadium, the Washington Commanders, formerly
[02:19:46.760 --> 02:19:50.480] the Washington Redskins, have agreed to change their name to the Washington Bloodthirsty
[02:19:50.480 --> 02:19:51.480] Injuns.
[02:19:51.480 --> 02:19:52.480] The winds have changed.
[02:19:52.480 --> 02:19:53.480] Woke is out.
[02:19:53.480 --> 02:19:57.440] Classical racial stereotypes are back in, said Adam Peters, the team's general manager.
[02:19:57.440 --> 02:20:00.640] Therefore, we have agreed in exchange for support from the federal government to change
[02:20:00.640 --> 02:20:05.480] our name to the Washington Bloodthirsty Injuns as a way of honoring the noble Bloodthirsty
[02:20:05.480 --> 02:20:07.560] Injuns of our nation's history.
[02:20:08.080 --> 02:20:12.000] "Go Bloodthirsty Injuns!" cried thousands of ecstatic fans.
[02:20:12.000 --> 02:20:14.960] Critics say the name change is insensitive, harkening back to a time when Injuns
[02:20:14.960 --> 02:20:22.040] were cruelly slandered by settlers as bloodthirsty.
[02:20:22.040 --> 02:20:23.080] I'm not sure I'm going to read that.
[02:20:23.080 --> 02:20:28.680] I'm not sure I want to read the rest of what's in this article out loud.
[02:20:28.680 --> 02:20:31.720] It's entertaining.
[02:20:31.720 --> 02:20:32.800] It's funny.
[02:20:32.840 --> 02:20:38.680] But in the end, this is simply another distraction from Donald Trump.
[02:20:38.680 --> 02:20:41.840] Please stop caring about Jeffrey Epstein.
[02:20:41.840 --> 02:20:43.360] What do I have to do?
[02:20:43.360 --> 02:20:47.160] I'll bully the sports team to get rid of the woke name change.
[02:20:47.160 --> 02:20:49.360] I'll release the MLK files.
[02:20:49.360 --> 02:20:52.960] I'll post an AI video of Obama being arrested.
[02:20:52.960 --> 02:20:55.840] I'll send Powell before the DOJ.
[02:20:55.840 --> 02:21:00.480] I'll haul some Obama-era administration officials before the Department of Justice as well.
[02:21:00.480 --> 02:21:01.480] What does it take?
[02:21:01.480 --> 02:21:02.480] What do I got to do?
[02:21:03.160 --> 02:21:07.200] And also, I don't think it's a coincidence that they're really pushing through the Genius
[02:21:07.200 --> 02:21:09.480] Act stuff now with the Epstein thing.
[02:21:09.480 --> 02:21:14.080] I don't think Epstein was a planned distraction, but they never let a crisis...
[02:21:14.080 --> 02:21:15.080] Yeah, exactly.
[02:21:15.080 --> 02:21:16.360] Never let a crisis go to waste.
[02:21:16.360 --> 02:21:21.760] That's a true thing, even when it's an organic crisis, not a constructed one.
[02:21:21.760 --> 02:21:27.120] Everyone's talking about Epstein so they can push through other unpopular things now.
[02:21:27.120 --> 02:21:32.800] While everyone's desperate for us to fulfill our promises, perhaps we can use it for some
[02:21:32.800 --> 02:21:33.800] of our purposes.
[02:21:33.800 --> 02:21:34.800] All right.
[02:21:34.800 --> 02:21:37.360] We're going to take a quick break when we come back.
[02:21:37.360 --> 02:21:40.480] We're going to look at what's going on with Pharma.
[02:21:40.480 --> 02:21:41.960] Big Pharma.
[02:21:41.960 --> 02:21:42.960] Stay with us.
[02:22:57.120 --> 02:23:21.120] You're listening to The David Knight Show.
[02:23:27.120 --> 02:23:55.120] We'll be right back.
[02:24:51.120 --> 02:25:08.480] Welcome back.
[02:25:08.480 --> 02:25:09.480] Got some comments.
[02:25:09.480 --> 02:25:12.640] Chevkin, does anyone besides Trump use Truth Social?
[02:25:12.640 --> 02:25:13.640] I have no idea.
[02:25:13.640 --> 02:25:17.280] I've never encountered anyone that uses Truth Social.
[02:25:17.280 --> 02:25:21.360] I would not be surprised if it's him and a bunch of bots that are just there to praise
[02:25:21.360 --> 02:25:23.280] his every decision.
[02:25:23.280 --> 02:25:27.200] Everything he posts, just, yes, Trump, you're doing great.
[02:25:27.200 --> 02:25:32.720] And they've simply got a giant server in the White House dedicated to pumping up his numbers.
[02:25:32.720 --> 02:25:34.920] Ron Helton one, I've never looked for Truth Social.
[02:25:34.920 --> 02:25:37.080] Probably just a bunch of TDS posters there.
[02:25:37.080 --> 02:25:39.560] Those opposed and those for Trumpy.
[02:25:39.560 --> 02:25:42.320] Yeah, it's probably one way or the other.
[02:25:42.320 --> 02:25:47.760] People that love him and just simply want to sycophantically praise every decision or
[02:25:47.760 --> 02:25:53.360] people that are so utterly obsessed with hating him that they need to be clued in on his every
[02:25:53.360 --> 02:25:54.360] move.
[02:25:54.360 --> 02:25:57.840] Like, oh, oh, what is, what is he doing now?
[02:25:57.840 --> 02:26:00.000] Oh, KWD 68.
[02:26:00.000 --> 02:26:02.520] Why does Trump care so much about sports all the time?
[02:26:02.520 --> 02:26:03.520] Never mind.
[02:26:03.520 --> 02:26:04.520] Moron.
[02:26:04.520 --> 02:26:05.520] The best moron.
[02:26:05.520 --> 02:26:06.520] The golden idiot.
[02:26:06.520 --> 02:26:07.520] Yeah.
[02:26:07.520 --> 02:26:09.740] Sports was never really a thing in our household.
[02:26:09.780 --> 02:26:13.580] We would occasionally, you know, friends would have Super Bowl parties and we'd go just to
[02:26:13.580 --> 02:26:15.900] hang out, but we never cared.
[02:26:15.900 --> 02:26:19.900] We never, I've never paid attention to sports.
[02:26:19.900 --> 02:26:25.420] I can probably name a few sports teams, but I'd be hard pressed to tell you which sports
[02:26:25.420 --> 02:26:27.300] they actually played.
[02:26:27.300 --> 02:26:28.540] And this is going to be an aside.
[02:26:28.540 --> 02:26:32.940] I'm going to go out on a tangent here for a second.
[02:26:32.940 --> 02:26:37.580] I'm sure some of you saw the WNBA players coming out with those T-shirts saying, you
[02:26:37.580 --> 02:26:39.260] know, pay us what we're worth.
[02:26:39.260 --> 02:26:42.740] Pay us like the NBA or whatever exactly it was.
[02:26:42.740 --> 02:26:46.300] And the WNBA is remarkably unprofitable.
[02:26:46.300 --> 02:26:49.740] It is subsidized by the NBA.
[02:26:49.740 --> 02:26:55.300] The NBA turns a massive profit because there's a huge number of suckers that tune in all
[02:26:55.300 --> 02:26:59.980] the time to watch them play for whatever reason.
[02:26:59.980 --> 02:27:02.700] The WNBA loses money.
[02:27:02.700 --> 02:27:05.340] It loses a lot of money.
[02:27:06.340 --> 02:27:13.300] These women are so incredibly self-obsessed that they think they deserve the amount of
[02:27:13.300 --> 02:27:15.140] money the NBA players get now.
[02:27:15.140 --> 02:27:20.540] That's not saying the NBA players deserve their massive salaries for playing a game,
[02:27:20.540 --> 02:27:25.580] but they're at least generating profit for the NBA, whether they should or shouldn't.
[02:27:25.580 --> 02:27:26.580] They do.
[02:27:26.580 --> 02:27:30.100] Do you think you deserve the kind of money NBA players make?
[02:27:30.100 --> 02:27:33.980] NBA players don't deserve the kind of money NBA players make.
[02:27:33.980 --> 02:27:38.300] They're massively overpaid in the NBA, and you think we're going to give you that kind
[02:27:38.300 --> 02:27:39.980] of cash too?
[02:27:39.980 --> 02:27:41.500] It's ridiculous.
[02:27:41.500 --> 02:27:45.580] The WNBA has been a non-event for years.
[02:27:45.580 --> 02:27:49.740] The only reason it gets any press at all is because they're continually whining about
[02:27:49.740 --> 02:27:51.220] how they deserve more pay.
[02:27:51.220 --> 02:27:53.500] That's the main headline maker.
[02:27:53.500 --> 02:27:59.460] Not their games, not their players, the simple fact that they sit around and whine and moan.
[02:27:59.460 --> 02:28:04.420] That's the only time they get attention, except for, what was it, Caitlin something.
[02:28:04.420 --> 02:28:09.740] There was this female player that was making headlines, which again, who cares?
[02:28:09.740 --> 02:28:16.500] I'm sorry, but the WNBA, even if you're into sports, it's much less entertaining than the
[02:28:16.500 --> 02:28:17.500] NBA is.
[02:28:17.500 --> 02:28:21.220] At least the NBA, they're dunking and they're doing that sort of thing.
[02:28:21.220 --> 02:28:24.500] There's nothing at all exciting about a WNBA game.
[02:28:24.500 --> 02:28:25.500] There simply isn't.
[02:28:25.940 --> 02:28:30.460] The only time you hear about the WNBA is when there's some political thing tied up with
[02:28:30.460 --> 02:28:31.460] it.
[02:28:31.460 --> 02:28:38.020] This thing, the other thing, or exchanging prisoners for arms dealers.
[02:28:38.020 --> 02:28:44.180] We traded a Russian arms dealer for whatever that WNBA player was.
[02:28:44.180 --> 02:28:48.260] We were down one whiner.
[02:28:48.260 --> 02:28:50.020] We were so up on that deal.
[02:28:50.020 --> 02:28:53.140] The art of the deal, come on.
[02:28:53.140 --> 02:28:55.220] We should start negotiating to get the arms dealer back.
[02:28:55.220 --> 02:28:57.340] We'll give you back whatever her name was
[02:28:57.340 --> 02:29:01.820] if you give us back the arms dealer. The art of the deal? The worst trade deal in the history of trade
[02:29:01.820 --> 02:29:03.580] deals, maybe ever.
[02:29:03.580 --> 02:29:08.060] We lost so big on that one, how we got fooled.
[02:29:08.060 --> 02:29:09.060] We got scammed.
[02:29:09.060 --> 02:29:11.780] Anyway, that's enough about the WNBA.
[02:29:11.780 --> 02:29:16.460] The funny thing is, I'm not sure if this is true.
[02:29:16.460 --> 02:29:21.060] I would imagine it is: someone pointed out that when they're saying, I deserve as much
[02:29:21.060 --> 02:29:24.820] pay as the NBA players, then play in the NBA.
[02:29:24.820 --> 02:29:32.220] The WNBA is exclusively women, but the NBA, you could play in that if you were good enough.
[02:29:32.220 --> 02:29:34.900] I don't know if that's true or if it's exclusively male.
[02:29:34.900 --> 02:29:38.940] I would imagine there's no rule prohibiting women from playing in it.
[02:29:38.940 --> 02:29:42.460] They don't want equality for that sort of thing.
[02:29:42.460 --> 02:29:47.180] They don't want equality of let me play against these people that would absolutely destroy
[02:29:47.180 --> 02:29:49.260] my team.
[02:29:49.260 --> 02:29:51.020] They want equality of outcome.
[02:29:51.020 --> 02:29:56.740] Yeah, you're going to pay us the same as the NBA despite the fact that we don't generate
[02:29:56.740 --> 02:29:57.740] the revenue.
[02:29:57.740 --> 02:29:59.380] We're not fun to watch.
[02:29:59.380 --> 02:30:01.880] We're not as good at the game.
[02:30:01.880 --> 02:30:03.100] Just pay us that money.
[02:30:03.100 --> 02:30:05.100] We play the same game.
[02:30:05.100 --> 02:30:06.780] Why aren't we making the same amount of money?
[02:30:06.780 --> 02:30:12.900] I mean, we've literally seen it with women's soccer as well.
[02:30:12.900 --> 02:30:14.300] This is a recurring theme.
[02:30:14.300 --> 02:30:16.420] Why aren't we getting paid the big bucks like the men are?
[02:30:16.420 --> 02:30:18.600] It's because you're not as good.
[02:30:18.600 --> 02:30:20.640] And this is coming, again,
[02:30:20.640 --> 02:30:21.640] from someone who doesn't play sports.
[02:30:21.640 --> 02:30:23.840] I don't have a dog in this fight.
[02:30:23.840 --> 02:30:25.880] This is simply a matter of simple economics.
[02:30:25.880 --> 02:30:27.600] You don't generate revenue.
[02:30:27.600 --> 02:30:32.560] The women's soccer team, the championship women's soccer team lost to a group of high
[02:30:32.560 --> 02:30:34.120] school boys.
[02:30:34.120 --> 02:30:36.400] You guys got demolished.
[02:30:36.400 --> 02:30:40.320] What makes you think you deserve that kind of payment?
[02:30:40.320 --> 02:30:41.320] Tell you what.
[02:30:41.320 --> 02:30:42.320] How about this?
[02:30:42.320 --> 02:30:46.780] We'll work out a sort of Harlem Globetrotters deal.
[02:30:46.780 --> 02:30:52.980] Every night you trot out to get demolished by an all-male team and we'll pay you a larger
[02:30:52.980 --> 02:30:53.980] salary.
[02:30:53.980 --> 02:30:57.500] Every single night you come out and you put on your best performance and the all-male
[02:30:57.500 --> 02:31:01.660] team gets to just, like, dribble around and dunk on you and pull some Globetrotter stunts
[02:31:01.660 --> 02:31:03.140] for fun.
[02:31:03.140 --> 02:31:05.060] And then then we can talk about it.
[02:31:05.060 --> 02:31:14.460] Until then, you guys, you ladies play your game, enjoy the sport, realize you are being
[02:31:15.420 --> 02:31:16.420] worth.
[02:31:16.420 --> 02:31:19.860] Anyway, enough about the WNBA, enough about sports in general.
[02:31:19.860 --> 02:31:22.060] I got sidetracked there.
[02:31:22.060 --> 02:31:26.780] KUWD68, people are dying, Trump, and you care about the Cleveland Indians and the Washington
[02:31:26.780 --> 02:31:27.780] Redskins.
[02:31:27.780 --> 02:31:28.780] That's right.
[02:31:28.780 --> 02:31:30.780] We've got some real important issues here.
[02:31:30.780 --> 02:31:35.780] KUWD68, WNBA has their best player in Caitlin Clark and they trash her and beat on her every
[02:31:35.780 --> 02:31:36.780] game.
[02:31:36.780 --> 02:31:37.780] Jealous much?
[02:31:37.780 --> 02:31:44.020] Yeah, that's the other thing is it really goes to show how much racial antipathy and
[02:31:44.020 --> 02:31:46.700] hatred there is.
[02:31:46.700 --> 02:31:47.700] Knights of the Storm.
[02:31:47.700 --> 02:31:50.580] Maybe we can get a bunch of trans dudes in there to make it more interesting, kind of
[02:31:50.580 --> 02:31:52.660] like people who watch NASCAR for the wrecks.
[02:31:52.660 --> 02:31:59.340] Yeah, we'll just have this one, you know, hideous guy in a dress out there absolutely
[02:31:59.340 --> 02:32:01.300] demolishing all the women.
[02:32:01.300 --> 02:32:04.500] Knights of the Storm, the answer to higher demands for the WNBA is to shut it down.
[02:32:04.500 --> 02:32:06.180] That's right.
[02:32:06.180 --> 02:32:07.680] You guys aren't profitable.
[02:32:07.680 --> 02:32:10.240] You guys don't make any money for us.
[02:32:10.240 --> 02:32:11.580] You're losing us money.
[02:32:11.580 --> 02:32:14.260] Sorry, we're closing up shop.
[02:32:14.260 --> 02:32:18.500] If you want to play, play for the love of the game.
[02:32:18.500 --> 02:32:19.780] All right.
[02:32:19.780 --> 02:32:20.780] Children's Health Defense.
[02:32:20.780 --> 02:32:28.780] Lawsuit targets HHS for failing to set up task force on childhood vaccine safety.
[02:32:28.780 --> 02:32:32.340] Lawsuit is funded by Children's Health Defense.
[02:32:32.340 --> 02:32:33.340] That's right.
[02:32:33.340 --> 02:32:37.140] I believe we have a clip of, what is it?
[02:32:37.140 --> 02:32:38.140] RFK.
[02:32:39.140 --> 02:32:41.140] Gosh, where is it?
[02:32:41.140 --> 02:32:45.140] Yeah, it's the "autism was rare" one.
[02:32:45.140 --> 02:32:54.620] In 1970, scientists conducted the biggest epidemiological study in history of any country in the world.
[02:32:54.620 --> 02:33:01.020] They looked at every child, 900,000 children in the state of Wisconsin, and they were specifically
[02:33:01.020 --> 02:33:03.180] looking for autism.
[02:33:03.180 --> 02:33:06.060] And they knew what autism looked like.
[02:33:06.540 --> 02:33:08.500] They did follow-up checks.
[02:33:08.500 --> 02:33:10.860] It was an extraordinary study.
[02:33:10.860 --> 02:33:13.460] They found three children.
[02:33:13.460 --> 02:33:23.180] The rate of autism at that point was 0.7 per 10,000, less than one per 10,000.
[02:33:23.180 --> 02:33:29.700] A month ago, we released the newest data which showed that one in every 31 American kids
[02:33:29.700 --> 02:33:32.340] has autism.
[02:33:32.380 --> 02:33:37.460] It's actually probably a lot worse than that because we gather that data state by state,
[02:33:37.460 --> 02:33:41.060] and some states have better collection systems.
[02:33:41.060 --> 02:33:44.420] The best collection system is California.
[02:33:44.420 --> 02:33:50.980] And they're showing one in 19 kids has autism, one in 12.5 boys.
[02:33:50.980 --> 02:33:53.820] This is unsustainable.
[02:33:53.820 --> 02:34:01.420] And the cost of autism alone by 2030, according to a recent peer-reviewed study, is going...
[02:34:01.780 --> 02:34:04.620] So RFK knows what's going on.
[02:34:04.620 --> 02:34:08.860] He knows there's a massive problem, but he's not doing anything about it.
[02:34:08.860 --> 02:34:13.580] And Children's Health Defense, good for them, is standing up and saying no.
[02:34:13.580 --> 02:34:20.100] The lawsuit alleges Kennedy is violating the National Childhood Vaccine Injury Act of 1986,
[02:34:20.100 --> 02:34:23.280] which requires the Secretary of the U.S. Department of Health and Human Services to promote the
[02:34:23.280 --> 02:34:28.700] development of safer childhood vaccines that cause fewer and less serious adverse reactions
[02:34:28.700 --> 02:34:30.540] than existing ones.
[02:34:30.540 --> 02:34:35.100] The act requires HHS to establish a task force that includes the Health Secretary, the Commissioner
[02:34:35.100 --> 02:34:38.060] of the U.S. Food and Drug Administration, the Directors of the National Institutes of
[02:34:38.060 --> 02:34:41.060] Health, and the Centers for Disease Control and Prevention.
[02:34:41.060 --> 02:34:47.140] It also requires the Health Secretary to provide Congress with progress reports every two years.
[02:34:47.140 --> 02:34:55.260] Of course, this is the institution that RFK used to be a part of or head of, and they're
[02:34:55.260 --> 02:34:59.260] actually standing up and saying, you promised us something.
[02:34:59.260 --> 02:35:04.140] They're not sitting down and ignoring what's going on.
[02:35:04.140 --> 02:35:06.740] They're not willing to accept it.
[02:35:06.740 --> 02:35:12.900] They're actually holding his feet to the fire, going so far as to file a lawsuit.
[02:35:12.900 --> 02:35:17.140] Mary Holland, CEO of Children's Health Defense, which is funding the lawsuit, said it is black
[02:35:17.140 --> 02:35:23.580] letter law that the HHS Secretary must convene a task force on how to make vaccines safer.
[02:35:23.580 --> 02:35:26.580] This is part of the 1986 act itself.
[02:35:26.580 --> 02:35:31.620] That no Secretary has done so since the passage of this law is a blow to the rule of law.
[02:35:31.620 --> 02:35:35.940] We hope and trust that the current Secretary will fulfill his obligation to Congress's mandate.
[02:35:35.940 --> 02:35:40.380] Flores said the 1986 act includes a broad provision allowing citizens to sue the Secretary
[02:35:40.380 --> 02:35:42.220] if the requirements are not met.
[02:35:42.220 --> 02:35:46.900] His lawsuit asks the court to compel Kennedy to comply with the mandate to set up a task
[02:35:46.900 --> 02:35:55.140] force and submit biennial reports to Congress.
[02:35:55.180 --> 02:35:57.020] It's so rare to see this type of thing.
[02:35:57.020 --> 02:36:02.820] It's so rare to see someone actually willing to hold someone accountable, especially someone
[02:36:02.820 --> 02:36:07.420] that they ostensibly used to kind of work for, someone they might have had a relationship
[02:36:07.420 --> 02:36:11.020] with, a positive relationship with.
[02:36:11.020 --> 02:36:14.620] This is, this takes a lot of guts.
[02:36:14.620 --> 02:36:18.020] Flores told the Defender it was astonishing that HHS hasn't fulfilled its responsibility
[02:36:18.020 --> 02:36:19.980] to make vaccines safer.
[02:36:19.980 --> 02:36:25.820] Perhaps a little encouragement from a federal judge will help move this along, he said.
[02:36:25.820 --> 02:36:30.580] Between 1980 and 1986, people injured by vaccines filed more than $3 billion worth of damage
[02:36:30.580 --> 02:36:34.900] claims with U.S. civil courts against vaccine manufacturers, most of which were for the
[02:36:34.900 --> 02:36:37.860] DTP vaccines.
[02:36:37.860 --> 02:36:41.300] After lawsuits revealed that Wyeth knew of the risks, juries began authorizing large
[02:36:41.300 --> 02:36:43.900] payouts to some DTP-injured children.
[02:36:43.900 --> 02:36:47.140] Payouts threatened to bankrupt the vaccine insurance industry.
[02:36:47.180 --> 02:36:51.300] The publicity also generated public concerns about vaccine adverse events.
[02:36:51.300 --> 02:36:56.980] In 1986 (Anthony Fauci, of course), Congress passed a law giving the pharmaceutical industry
[02:36:56.980 --> 02:37:01.980] broad protection from liability and creating a framework to compensate children injured
[02:37:01.980 --> 02:37:04.340] by compulsory vaccines:
[02:37:04.340 --> 02:37:10.860] the National Vaccine Injury Compensation Program, a no-fault administrative system that adjudicates
[02:37:10.860 --> 02:37:12.900] vaccine injury claims.
[02:37:12.900 --> 02:37:16.380] And that was the first thing Fauci did when he got in.
[02:37:16.620 --> 02:37:18.740] Thank you, Anthony Fauci.
[02:37:18.740 --> 02:37:25.060] He has been an enemy of the American people since he was first brought into the government.
[02:37:25.060 --> 02:37:31.460] He has shown he despises the American people.
[02:37:31.460 --> 02:37:34.860] Although it is notoriously difficult to win compensation in the VICP, it has paid out
[02:37:34.860 --> 02:37:40.460] over $5.2 billion to injury victims since its inception.
[02:37:40.460 --> 02:37:45.860] Even with all the roadblocks they put in place, even with how difficult it is to get a diagnosis
[02:37:45.860 --> 02:37:52.180] of vaccine injury to start the process, then how difficult they make it in court.
[02:37:52.180 --> 02:37:55.340] They've still paid out $5.2 billion.
[02:37:55.340 --> 02:37:58.700] Flores' lawsuit alleges the number would be significantly higher if vaccine manufacturers
[02:37:58.700 --> 02:38:02.860] had to defend themselves in federal court rather than in the VICP.
[02:38:02.860 --> 02:38:04.500] They're the ones that set up the playing field.
[02:38:04.500 --> 02:38:06.740] They chose the rules.
[02:38:06.740 --> 02:38:10.260] They chose this exact...
[02:38:10.260 --> 02:38:16.300] They set up everything in their favor, and they've still had to pay out $5.2 billion.
[02:38:16.300 --> 02:38:21.420] A lesser-known part of the 1986 law mandated the pursuit of safer vaccines.
[02:38:21.420 --> 02:38:24.900] And they've done nothing about that.
[02:38:24.900 --> 02:38:28.700] And the establishment of the task force. Like the VICP, this aspect of the law has long
[02:38:28.700 --> 02:38:30.460] been a point of controversy.
[02:38:30.460 --> 02:38:35.420] In 2018, when Kennedy worked as a lawyer, he and co-counsel Aaron Siri filed a lawsuit
[02:38:35.420 --> 02:38:43.260] against HHS in a New York district court, seeking copies of the biennial reports
[02:38:43.260 --> 02:38:47.020] after the agency failed to respond to Freedom of Information Act requests.
[02:38:47.020 --> 02:38:51.220] Kennedy's lawsuit revealed that no reports were ever submitted by HHS.
[02:38:51.220 --> 02:38:55.780] And of course, now that he's the head, he's not doing it either.
[02:38:55.780 --> 02:39:01.260] First, he sues them over it, and when he gets into power, he disregards it as well.
[02:39:02.100 --> 02:39:06.380] We're going to make America healthy again, right?
[02:39:06.380 --> 02:39:11.020] More recently, former HHS secretary Xavier Becerra, who left office in January, confirmed
[02:39:11.020 --> 02:39:16.820] that no health secretary had ever provided safety improvement reports to Congress.
[02:39:16.820 --> 02:39:17.820] Not going to start now.
[02:39:17.820 --> 02:39:21.140] Why would we expect Robert Kennedy to do that?
[02:39:21.140 --> 02:39:23.460] KWD 68, lots of kids now with autism.
[02:39:23.460 --> 02:39:24.900] It isn't sustainable.
[02:39:24.900 --> 02:39:26.300] Now jab your kids.
[02:39:26.300 --> 02:39:27.780] Yeah.
[02:39:28.300 --> 02:39:32.060] Everybody out there talking about how obvious it is, how something needs to be done, and
[02:39:32.060 --> 02:39:37.500] then he gets into a position to do something, and he simply ignores it.
[02:39:37.500 --> 02:39:41.060] Oh no, we're going to focus on Red 40.
[02:39:41.060 --> 02:39:42.060] All this other nonsense.
[02:39:42.060 --> 02:39:45.700] We want to get you a wearable smartwatch, wearable health tech.
[02:39:45.700 --> 02:39:48.260] Now, keep vaccinating your kids.
[02:39:48.260 --> 02:39:55.220] You know, we no longer have the COVID shot recommended for pregnant women and young children.
[02:39:55.220 --> 02:39:56.220] Isn't that enough?
[02:39:56.660 --> 02:39:59.300] We're not recommending it anymore.
[02:39:59.300 --> 02:40:01.300] Come on.
[02:40:01.300 --> 02:40:08.060] Florida Surgeon General highlights vaccine injury, calls on NIH to act.
[02:40:08.060 --> 02:40:11.500] Of course, as we're talking about, RFK is simply dragging his feet.
[02:40:11.500 --> 02:40:13.420] He's not doing anything about it.
[02:40:13.420 --> 02:40:14.420] He's kind of just there.
[02:40:14.420 --> 02:40:20.580] At a press conference at Florida State University in Tampa, Florida Surgeon General Dr. Joseph
[02:40:20.580 --> 02:40:24.300] Ladapo made an urgent call for NIH program funding to help Americans injured
[02:40:24.380 --> 02:40:30.220] by COVID-19 vaccines and expressed support for the May federal changes restricting
[02:40:30.220 --> 02:40:33.700] COVID-19 vaccine recommendations.
[02:40:33.700 --> 02:40:39.700] This Ladapo is doing more than RFK.
[02:40:39.700 --> 02:40:43.780] Stanford scientists link spike in thyroid eye disease to COVID vaccines.
[02:40:43.780 --> 02:40:48.940] This is from Slay News.
[02:40:48.940 --> 02:40:54.420] This disease is kind of giving people sort of Marty Feldman eyes where they're sort of
[02:40:54.420 --> 02:40:55.420] bugging out.
[02:40:55.420 --> 02:40:59.060] It's making them sort of bulbous.
[02:40:59.060 --> 02:41:00.060] A group of leading...
[02:41:00.060 --> 02:41:01.060] My name is Eye-gor.
[02:41:01.060 --> 02:41:02.060] I thought it was Igor.
[02:41:02.060 --> 02:41:04.980] You heard wrong then, didn't you?
[02:41:04.980 --> 02:41:09.140] A group of leading American scientists has uncovered evidence linking COVID mRNA vaccines
[02:41:09.140 --> 02:41:16.200] to surging reports of thyroid eye disease, TED, an alarming disorder that can lead to blindness.
[02:41:16.200 --> 02:41:18.260] This is yet another adverse effect.
[02:41:18.580 --> 02:41:21.780] This is another new thing that they're just now discovering.
[02:41:21.780 --> 02:41:26.500] I keep harping on it, but we've only seen short term effects.
[02:41:26.500 --> 02:41:33.660] And Marty Feldman's eyes were also due to a thyroid problem, I think from a car wreck,
[02:41:33.660 --> 02:41:34.660] was it?
[02:41:34.660 --> 02:41:37.220] I don't remember, but yeah, I believe you're right.
[02:41:37.220 --> 02:41:40.100] But these are still short term effects.
[02:41:40.100 --> 02:41:44.340] We're still discovering what it does in the short term because it's difficult to get anyone
[02:41:44.340 --> 02:41:45.340] to do studies.
[02:41:45.420 --> 02:41:51.420] It's difficult to get them to want to investigate these types of things.
[02:41:51.420 --> 02:41:54.420] They don't want the vaccine to be linked to any of these things.
[02:41:54.420 --> 02:41:59.420] The only way they ever do is because the evidence becomes overwhelming.
[02:41:59.420 --> 02:42:06.400] TED, also known as Graves' ophthalmopathy, causes the eyes to bulge in their sockets due to
[02:42:06.400 --> 02:42:08.900] severe swelling of the muscles.
[02:42:08.900 --> 02:42:14.060] The eyes become bloodshot and crossed, causing double vision and, in severe cases, total vision loss.
[02:42:14.780 --> 02:42:20.420] Another thing you can thank Donald Trump and Anthony Fauci for. It causes inflammation and swelling
[02:42:20.420 --> 02:42:28.540] of the eye muscles, eyelids, tear ducts, and fatty tissues behind the eyes.
[02:42:28.540 --> 02:42:31.380] Patients become aware of the condition when they begin to suffer from the initial symptoms
[02:42:31.380 --> 02:42:35.140] such as bulging, dry, or watery eyes.
[02:42:35.140 --> 02:42:39.180] In recent years, cases of TED have inexplicably spiked, raising concern among the medical
[02:42:39.180 --> 02:42:40.180] community.
[02:42:40.300 --> 02:42:45.300] Another new study has just linked the surging reports to the mass COVID vaccination campaign.
[02:42:49.300 --> 02:42:52.300] It's continuing.
[02:42:52.300 --> 02:42:58.780] It seems weekly or monthly they're finding some new adverse effect that is causing severe
[02:42:58.780 --> 02:43:01.620] harm to people.
[02:43:01.620 --> 02:43:03.620] Surely if we do more of what we have been doing,
[02:43:03.620 --> 02:43:05.140] the problem can only get better, right?
[02:43:05.140 --> 02:43:06.140] Yeah, that's right.
[02:43:06.140 --> 02:43:09.540] Just pile more on top of it.
[02:43:09.900 --> 02:43:12.340] We've got this huge fire over here.
[02:43:12.340 --> 02:43:15.300] Maybe we should pour some gasoline on it.
[02:43:15.300 --> 02:43:16.300] Cecilia 14.
[02:43:16.300 --> 02:43:19.580] In reality, old people who suddenly develop memory problems and cognitive decline right
[02:43:19.580 --> 02:43:20.580] after shots.
[02:43:20.580 --> 02:43:23.300] Sudden old-age dementia is really the same as kids' autism.
[02:43:23.300 --> 02:43:27.980] Yeah, it sent a lot of people spiraling immediately.
[02:43:27.980 --> 02:43:32.260] Just their immune systems weren't able to handle it.
[02:43:32.260 --> 02:43:37.340] They couldn't deal with what was injected into them, and so they rapidly passed away.
[02:43:37.340 --> 02:43:38.340] Chevkin.
[02:43:38.340 --> 02:43:41.220] I've been seeing those thyroid eyes commercials lately.
[02:43:41.220 --> 02:43:42.220] Hal 9000.
[02:43:42.220 --> 02:43:44.220] Cash Patel eyes.
[02:43:44.220 --> 02:43:45.860] CJP Rumble.
[02:43:45.860 --> 02:43:48.300] He's got Steve Buscemi eyes.
[02:43:48.300 --> 02:43:54.780] Steve Buscemi also has those sort of kind of bugging, very haunting eyes.
[02:43:54.780 --> 02:43:56.780] Defy Tyrant 1776.
[02:43:56.780 --> 02:43:59.820] If people haven't figured out by now that the entire government is our biggest enemy
[02:43:59.820 --> 02:44:04.060] and that every politician is a wicked liar, they never will.
[02:44:04.060 --> 02:44:09.380] You would think it would be so obvious that they'd have to admit it by now, but some people
[02:44:09.380 --> 02:44:14.660] are desperate to just believe that, oh, the government's fine.
[02:44:14.660 --> 02:44:18.020] Sure they get some things wrong, but they're trying to do their job.
[02:44:18.020 --> 02:44:20.940] They're really trying their hardest.
[02:44:20.940 --> 02:44:21.940] So look at 1980.
[02:44:21.940 --> 02:44:27.020] I knew RFK would fail us.
[02:44:27.020 --> 02:44:32.900] I didn't really have any hope for him.
[02:44:32.900 --> 02:44:36.100] There's always a vague sort of sense of like, well, maybe.
[02:44:36.100 --> 02:44:37.100] Who knows?
[02:44:37.100 --> 02:44:39.980] Anything is theoretically possible.
[02:44:39.980 --> 02:44:44.740] But I didn't have any actual, like a confident hope that he would do it.
[02:44:44.740 --> 02:44:50.460] Just in the, you know, give it a shot, sure, put him in there.
[02:44:50.460 --> 02:44:56.020] But he has validated all our fears of him doing nothing.
[02:44:56.020 --> 02:45:00.420] COVID shot mandates persist for Ontario health workers despite staffing crisis.
[02:45:00.540 --> 02:45:01.540] This is from LifeSite.
[02:45:01.540 --> 02:45:05.340] We have a mandatory COVID-19 vaccination policy, the posting reads.
[02:45:05.340 --> 02:45:09.820] As a condition of employment, all employees are required to submit proof of COVID-19 vaccination
[02:45:09.820 --> 02:45:12.660] status prior to start date.
[02:45:12.660 --> 02:45:16.140] Ontario's continued enforcement of COVID shot mandates comes after all other provinces have
[02:45:16.140 --> 02:45:19.140] lifted the mandates.
[02:45:19.140 --> 02:45:25.620] Certain areas, you know, are more brainwashed than others.
[02:45:25.620 --> 02:45:29.220] Ontario in Canada being one of them, it seems.
[02:45:29.220 --> 02:45:32.620] While some hospitals offer religious or medical exemptions, health care workers have told
[02:45:32.620 --> 02:45:37.020] LifeSite News that these are rarely granted, meaning finding work as a health care worker
[02:45:37.020 --> 02:45:43.580] is nearly impossible in Ontario without COVID vaccination, quote unquote.
[02:45:43.580 --> 02:45:49.460] As LifeSite News previously reported, Ontario will need 33,200 more nurses and 50,853 more
[02:45:49.460 --> 02:45:52.620] personal support workers by 2032.
[02:45:52.620 --> 02:45:56.300] The full health care worker shortage figures, the Progressive Conservative government of
[02:45:56.340 --> 02:46:01.620] Doug Ford has asked the Information and Privacy Commissioner to keep secret.
[02:46:01.620 --> 02:46:03.820] Don't tell people that.
[02:46:03.820 --> 02:46:06.100] Don't let them know.
[02:46:06.100 --> 02:46:09.780] Question is, who would really want to work there anyway?
[02:46:09.780 --> 02:46:12.060] Work for these people, for these systems.
[02:46:12.060 --> 02:46:14.820] Maybe that's why Canada is rushing to import so many people.
[02:46:14.820 --> 02:46:21.220] We're going to rush them through medical school and these immigrants, these third worlders
[02:46:21.220 --> 02:46:23.540] will do anything we tell them.
[02:46:23.620 --> 02:46:28.620] Well, some people do go into the medical field because they want to help people.
[02:46:28.620 --> 02:46:31.140] I think that's more the exception.
[02:46:31.140 --> 02:46:37.540] Most people are just looking for a high paying job, but especially a lot of nurses.
[02:46:37.540 --> 02:46:40.500] And those are the people they're trying to purge from this.
[02:46:40.500 --> 02:46:43.420] They only want the profit focused people.
[02:46:43.420 --> 02:46:52.100] And this is why stuff like what that guy, Dr. Moore was doing, Kirk Moore, is important.
[02:46:52.100 --> 02:46:55.500] He was giving out things so that these people could keep working.
[02:46:55.500 --> 02:46:57.140] They go to college.
[02:46:57.140 --> 02:47:03.540] They have to have a job in this field that's the only thing they have experience for.
[02:47:03.540 --> 02:47:07.540] And then they get shut down by everyone because they don't poison themselves.
[02:47:07.540 --> 02:47:11.140] Yeah, they punish you for standing on your convictions.
[02:47:11.140 --> 02:47:14.940] Tragically, the health care worker shortage has meant that many Canadians are unable to
[02:47:14.940 --> 02:47:16.660] receive care.
[02:47:16.660 --> 02:47:20.780] As the average wait sits at 27.7 weeks.
[02:47:21.140 --> 02:47:23.140] Half a year.
[02:47:23.140 --> 02:47:26.220] Unfortunately, the increased wait times have led some Canadians to despair of receiving
[02:47:26.220 --> 02:47:30.620] treatment and instead choose to end their lives through Medical Assistance in Dying, the MAID
[02:47:30.620 --> 02:47:36.500] program, the euphemistic name for Canada's euthanasia regime.
[02:47:36.500 --> 02:47:39.820] We have seen this becoming more prominent.
[02:47:39.820 --> 02:47:44.460] Just they're actively recommending it to people.
[02:47:44.460 --> 02:47:46.460] No, nothing we can do for you.
[02:47:46.460 --> 02:47:49.140] How about we kill you instead?
[02:47:49.140 --> 02:47:50.140] I know you wanted treatment.
[02:47:50.140 --> 02:47:54.820] And I know you wanted to live, but sadly, we're understaffed and we're not going to
[02:47:54.820 --> 02:47:58.620] allow people that aren't vaccinated with poison to work here.
[02:47:58.620 --> 02:48:00.660] So we're not going to be able to see you.
[02:48:00.660 --> 02:48:03.000] Sorry that the pain has gotten excruciating.
[02:48:03.000 --> 02:48:04.760] Sorry that it's become untreatable.
[02:48:04.760 --> 02:48:08.820] We can euthanize you if you want, though.
[02:48:08.820 --> 02:48:14.020] Defining moment in human history, U.S. rejects WHO's International Health Regulation Amendments.
[02:48:14.020 --> 02:48:17.860] This is from Children's Health Defense.
[02:48:17.860 --> 02:48:20.420] And this is a good thing.
[02:48:20.420 --> 02:48:23.640] Health Secretary Robert F. Kennedy Jr. said today the U.S. would not agree to sign over
[02:48:23.640 --> 02:48:28.380] authority in health emergencies to an unelected international organization that could order
[02:48:28.380 --> 02:48:32.700] lockdowns, travel restrictions, or any other measures that it sees fit.
[02:48:32.700 --> 02:48:35.020] Again, that's a good thing.
[02:48:35.020 --> 02:48:41.780] We don't want to be turning over any more of our sovereignty to these massive, unelected
[02:48:41.780 --> 02:48:43.940] bureaucracies.
[02:48:43.940 --> 02:48:50.200] These international organizations that don't have any loyalty or interest in preserving
[02:48:50.200 --> 02:48:54.180] the United States or its people.
[02:48:54.180 --> 02:48:57.100] A comment on the gas tank: is that 1/8th?
[02:48:57.100 --> 02:48:59.540] I believe that's not updated yet.
[02:48:59.540 --> 02:49:03.460] Dad was going to look at that and update it today.
[02:49:03.460 --> 02:49:05.900] We'll have it updated by tomorrow.
[02:49:05.900 --> 02:49:08.580] We don't have the exact numbers yet.
[02:49:08.580 --> 02:49:12.260] That is outside of my purview.
[02:49:12.300 --> 02:49:15.500] Health Secretary Robert F. Kennedy Jr. and Secretary of State Marco Rubio today announced
[02:49:15.500 --> 02:49:20.540] that the U.S. is formally rejecting the controversial amendments to the World Health Organization's
[02:49:20.540 --> 02:49:22.700] International Health Regulations.
[02:49:22.700 --> 02:49:26.580] The provisions would allow the WHO to order global lockdowns, travel restrictions, or any other
[02:49:26.580 --> 02:49:30.100] measures it sees fit to respond to nebulous potential public health risks.
[02:49:30.100 --> 02:49:35.340] The U.S. Department of Health and Human Services said,
[02:49:35.340 --> 02:49:38.740] We should be rounding these people from the WHO up.
[02:49:38.740 --> 02:49:41.640] We should be putting them on trial.
[02:49:41.640 --> 02:49:48.280] We should be reading them their rights and then locking them up until we can process
[02:49:48.280 --> 02:49:49.760] them all.
[02:49:49.760 --> 02:49:56.440] RFK Jr. rejects WHO's Trojan Horse International Health Regulations Amendment.
[02:49:56.440 --> 02:50:01.520] Are we going to be subject to a technocratic control system that uses health risks and
[02:50:01.520 --> 02:50:06.160] pandemic preparedness as a Trojan Horse to curtail basic democratic freedoms?
[02:50:06.160 --> 02:50:10.160] Do we want a future where every person, movement, transaction, every human body is under surveillance
[02:50:10.240 --> 02:50:12.040] at all times?
[02:50:12.040 --> 02:50:19.040] That's what Kennedy stated, but RFK wants to use wearables to track all that kind of
[02:50:19.040 --> 02:50:20.040] data.
[02:50:20.040 --> 02:50:23.320] He wants to be the one that is surveilling every human body.
[02:50:23.320 --> 02:50:28.440] Do we want to give that information over to someone other than me?
[02:50:28.440 --> 02:50:29.440] Apparently not.
[02:50:29.440 --> 02:50:30.940] And again, we don't.
[02:50:30.940 --> 02:50:34.800] This is a good thing to oppose, but it's just funny the way he phrases it.
[02:50:34.800 --> 02:50:37.520] Wow, you want to give them that kind of surveillance tech?
[02:50:37.520 --> 02:50:38.520] No, I don't.
[02:50:38.520 --> 02:50:41.600] They don't want you to have it either, RFK.
[02:50:41.600 --> 02:50:45.440] Reggie Littlejohn, president of Anti-Globalist International and co-founder of the Sovereignty
[02:50:45.440 --> 02:50:49.200] Coalition stated, I applaud Secretary Kennedy's courage in calling out the WHO for what it
[02:50:49.200 --> 02:50:51.360] is, a Trojan Horse.
[02:50:51.360 --> 02:50:55.960] Under the pretext of health, safety, and pandemic response, the WHO's amended
[02:50:55.960 --> 02:51:00.280] IHRs create the framework for a biotech surveillance
[02:51:00.280 --> 02:51:01.280] police state.
[02:51:01.280 --> 02:51:02.280] Yeah.
[02:51:02.280 --> 02:51:05.440] Oh, well, you know, there's something going around.
[02:51:05.440 --> 02:51:12.080] We think that it's best if you don't leave your house for months, weeks, however long,
[02:51:12.080 --> 02:51:13.800] we'll decide.
[02:51:13.800 --> 02:51:20.860] This bureaucracy that has no ties to the country that they're going to inflict this on.
[02:51:20.860 --> 02:51:26.240] These bureaucrats that are sort of vaguely human, you know, men made of numbers and mystery
[02:51:26.240 --> 02:51:27.240] meat.
[02:51:27.240 --> 02:51:30.840] You don't know who they are or where they came from.
[02:51:30.840 --> 02:51:31.840] This is from The Exposé.
[02:51:31.840 --> 02:51:37.240] In nine out of 10 illnesses, our bodies can and will heal themselves.
[02:51:37.240 --> 02:51:41.160] This is why "first, do no harm" is so important.
[02:51:41.160 --> 02:51:42.600] They've been ignoring it for years.
[02:51:42.600 --> 02:51:50.480] I want to get you hooked on some kind of pharmaceutical, some kind of drug.
[02:51:50.480 --> 02:51:52.440] Not going to let your body do its job.
[02:51:52.440 --> 02:51:57.520] In his book, Body Power, first published in 1983, Dr. Vernon Coleman explained how you
[02:51:57.520 --> 02:52:03.640] can use the power of your body to keep you healthy, to make you well in 90% of illnesses.
[02:52:03.640 --> 02:52:07.200] The following is an excerpt taken from another of his books published in 2014 about things
[02:52:07.200 --> 02:52:08.760] I have learned.
[02:52:08.760 --> 02:52:11.740] The excerpt highlights the healing power of the body referring to his earlier work Body
[02:52:11.740 --> 02:52:15.320] Power.
[02:52:15.320 --> 02:52:18.360] Many of the people who were injured by doctors never needed medical treatment in the first
[02:52:18.360 --> 02:52:19.360] place.
[02:52:19.360 --> 02:52:22.880] The human body contains a comprehensive variety of self-healing mechanisms, which means that
[02:52:22.880 --> 02:52:26.160] in nine out of 10 illnesses, your body will mend itself.
[02:52:26.800 --> 02:52:30.960] It is important that you learn to understand your body, learn to appreciate its healing
[02:52:30.960 --> 02:52:34.840] abilities.
[02:52:34.840 --> 02:52:39.520] Popular sugar substitute marketed to diabetics linked to stroke, heart attack, brain cell
[02:52:39.520 --> 02:52:40.520] damage.
[02:52:40.520 --> 02:52:43.000] We see this sort of thing.
[02:52:43.000 --> 02:52:49.200] Most of the time, sugar substitutes end up being even worse than sugar itself.
[02:52:49.200 --> 02:52:54.560] Erythritol is the one in question now.
[02:52:54.960 --> 02:52:58.240] It can constrict blood vessels, reduce the body's ability to break down blood clots,
[02:52:58.240 --> 02:52:59.800] and increase inflammation.
[02:52:59.800 --> 02:53:04.960] It leads to increased risk of stroke, heart attack, and brain cell damage.
[02:53:04.960 --> 02:53:10.400] Of course, erythritol has gained popularity in recent years.
[02:53:10.400 --> 02:53:13.680] They're always trying some new, different sugar substitute.
[02:53:13.680 --> 02:53:18.340] Unfortunately, the sugar alcohol seems, per the new study from Boulder, to be nearly as
[02:53:18.340 --> 02:53:22.660] harmful as the artificial sweetener aspartame, which is used in many diet sodas, has been
[02:53:22.660 --> 02:53:26.460] labeled carcinogenic by the World Health Organization, and has been linked to increased
[02:53:26.460 --> 02:53:28.460] heart attack and stroke risk.
[02:53:28.460 --> 02:53:33.380] Following up on a 2023 study that linked increased stroke and heart attack risk with higher erythritol
[02:53:33.380 --> 02:53:38.980] circulation in the bloodstream, integrative physiology professor Christopher DeSouza
[02:53:38.980 --> 02:53:43.780] and graduate student Auburn Berry, both co-authors on the new paper, sought to learn more about
[02:53:43.780 --> 02:53:45.300] this unsettling correlation.
[02:53:45.300 --> 02:53:52.620] So again, their chemical-based nonsense ends up being extremely harmful, ends up doing
[02:53:52.700 --> 02:53:58.040] damage to you, being worse than just a natural sugar.
[02:53:58.040 --> 02:54:03.760] This is one of those things where, in moderation, sugar is fine.
[02:54:03.760 --> 02:54:09.100] You can have a little bit of sugar now and then, and it's not going to kill you.
[02:54:09.100 --> 02:54:15.260] But people use these as substitutes, and they drink them constantly, they eat them constantly,
[02:54:15.260 --> 02:54:18.580] because they don't want to practice self-control.
[02:54:18.580 --> 02:54:26.140] Having one bowl of ice cream here or there is not going to kill you.
[02:54:26.140 --> 02:54:33.540] Maybe one Mexican Coke made with real sugar now and then isn't going to kill you.
[02:54:33.540 --> 02:54:37.780] But if you refuse to moderate yourself and think, you know what, I'm going to get fat
[02:54:37.780 --> 02:54:42.600] if I just continue to drink Coke, I know what I'll do, I'll drink Diet Coke.
[02:54:42.600 --> 02:54:46.580] It's even more chemicals and worse for you than a regular Coke is.
[02:54:46.580 --> 02:54:56.580] It might not make you fat, but it's going to damage your body in other ways.
[02:54:56.580 --> 02:55:03.460] We've got a comment here, David Ramsey 2328: Hey Travises, do viruses exist?
[02:55:03.460 --> 02:55:04.580] I want to hear your straight position.
[02:55:04.580 --> 02:55:06.920] You keep playing along with all these stories.
[02:55:06.920 --> 02:55:11.180] For the record, I personally don't think viruses exist, but I'm going to tell you that I am
[02:55:11.180 --> 02:55:14.980] not qualified to make that assessment.
[02:55:14.980 --> 02:55:23.020] I don't have the requisite intelligence or information to give you a 100% answer.
[02:55:23.020 --> 02:55:28.460] I can point you to the interview that my dad did with the Baileys, the doctors.
[02:55:28.460 --> 02:55:29.460] What is it?
[02:55:29.460 --> 02:55:32.960] Sam and Mark, I want to say. That's available on Rumble, and they make a very compelling
[02:55:32.960 --> 02:55:39.860] case, one that makes me think viruses aren't real.
[02:55:39.860 --> 02:55:41.900] And you can look at that for yourself.
[02:55:41.900 --> 02:55:46.300] Personally that did the convincing for me, but I do not have, as I said, the requisite
[02:55:46.300 --> 02:55:51.220] knowledge to make an informed decision on that for other people.
[02:55:51.220 --> 02:55:55.420] So go check out that interview with the Baileys if you're interested.
[02:55:55.420 --> 02:55:57.460] But no, personally, I don't believe viruses are real.
[02:55:57.460 --> 02:55:59.900] I think it's been a long con job.
[02:55:59.900 --> 02:56:04.020] And for the record, when I'm playing along with the narratives, I'm reading the stories
[02:56:04.020 --> 02:56:05.820] as they are being reported.
[02:56:05.820 --> 02:56:08.920] If I have to sit here and clarify every single time, by the way, I don't think viruses are real,
[02:56:09.360 --> 02:56:13.840] and by the way, I don't think, you know, vaccines are actually helpful,
[02:56:13.840 --> 02:56:15.440] It slows down the pace of the program.
[02:56:15.440 --> 02:56:19.880] I assume that most of you all know my position on things.
[02:56:19.880 --> 02:56:24.480] I hope that clarifies things for you, David.
[02:56:24.480 --> 02:56:27.920] Mav2022, doctors aren't trained how to help the body heal.
[02:56:27.920 --> 02:56:30.200] They are trained on how to mask symptoms.
[02:56:30.200 --> 02:56:31.320] Disease is a verb, not a noun.
[02:56:31.320 --> 02:56:32.800] It's why doctors have practices.
[02:56:32.800 --> 02:56:33.800] That's right.
[02:56:33.800 --> 02:56:36.200] I always said, I don't want a doctor that practices medicine.
[02:56:36.200 --> 02:56:38.440] I want a guy that's got it down pat.
[02:56:38.480 --> 02:56:41.080] I don't want the guy that's practicing.
[02:56:41.080 --> 02:56:42.960] Front Porch Media: Sucralose is almost the only thing available now.
[02:56:42.960 --> 02:56:44.320] Stevia is hardly used.
[02:56:44.320 --> 02:56:48.280] Yeah, they've kind of limited their options.
[02:56:48.400 --> 02:56:51.600] I think sucralose tastes closer to sugar.
[02:56:51.600 --> 02:56:53.240] And that's why you see it in everything.
[02:56:53.240 --> 02:56:55.440] But yeah, it is not as good for you.
[02:56:55.440 --> 02:57:00.840] It's one of the ones that allegedly your body doesn't process.
[02:57:00.840 --> 02:57:02.240] It just passes right through you.
[02:57:02.440 --> 02:57:08.080] Yeah, Stevia, as far as I know, is better, but it doesn't taste as good.
[02:57:08.080 --> 02:57:09.360] It's got a bitter taste.
[02:57:09.560 --> 02:57:13.960] I like monk fruit personally, as far as I know, there aren't really any side effects
[02:57:13.960 --> 02:57:15.760] for that one. And it tastes a lot like sugar.
[02:57:15.960 --> 02:57:19.720] Yeah, Stevia always has that kind of weird, funky aftertaste to it.
[02:57:20.080 --> 02:57:23.720] And so it ends up being used less.
[02:57:24.680 --> 02:57:27.640] But as Lance said, monk fruit is a good substitute.
[02:57:27.640 --> 02:57:33.280] But I've tried it in coffee when we've had it, and I don't like it.
[02:57:33.280 --> 02:57:37.720] There's something about the way it mixes with the acid in the coffee that makes it taste
[02:57:37.720 --> 02:57:39.280] very, very strange to me.
[02:57:40.200 --> 02:57:42.400] It doesn't seem to get rid of the bitterness.
[02:57:42.400 --> 02:57:44.480] It seems to put a sweetness on top of it.
[02:57:44.480 --> 02:57:46.080] And then the bitterness is still there.
[02:57:46.640 --> 02:57:52.560] And so personally, monk fruit is good for baking and other things like that, but not
[02:57:52.560 --> 02:57:54.520] in coffee, personally, not for me.
[02:57:55.040 --> 02:57:56.800] Gard Goldsmith, given the fact that R.F.K.
[02:57:56.800 --> 02:58:01.280] Jr. never acknowledged the unconstitutional nature of the FDA, it always seemed likely
[02:58:01.280 --> 02:58:03.080] that he would push more authoritarianism.
[02:58:03.080 --> 02:58:04.240] Little pullback. Yeah.
[02:58:05.240 --> 02:58:07.600] They'll give you a little bit here and there.
[02:58:07.600 --> 02:58:12.160] R.F.K. will pay lip service to certain things, but he doesn't really want to dismantle these
[02:58:12.160 --> 02:58:14.760] systems. He doesn't want to get them out of your life.
[02:58:15.160 --> 02:58:17.480] He wants to use them for his own reasons.
[02:58:17.640 --> 02:58:19.920] The way he thinks they should be run.
[02:58:20.760 --> 02:58:22.480] Well, you know, sure, I'll do this here and that.
[02:58:22.480 --> 02:58:27.120] But I still want to be the one that gets to put the wearables on you, the one that
[02:58:27.120 --> 02:58:30.280] gets to be in charge of your life.
[02:58:30.880 --> 02:58:33.520] Well, wow, we are almost out of time.
[02:58:34.080 --> 02:58:37.040] We have only about a minute and a half left.
[02:58:37.560 --> 02:58:39.440] Time sure flies when you're having fun.
[02:58:39.440 --> 02:58:40.440] Doug the 007.
[02:58:41.000 --> 02:58:44.960] It's better to just limit your sugar intake and let your taste buds adapt to lower sugar.
[02:58:45.080 --> 02:58:47.840] Yeah. Sugar is incredibly addictive.
[02:58:48.160 --> 02:58:53.000] That's why they put so much of it in kids' cereals and things.
[02:58:54.160 --> 02:58:59.240] They know that kids have a hard time regulating their, you know, self-control as is.
[02:58:59.280 --> 02:59:03.680] And when you give them something with a massive amount of sugar, they're going to want more
[02:59:03.680 --> 02:59:05.320] and more of it.
[02:59:06.120 --> 02:59:08.480] It's bad stuff, sugar.
[02:59:08.600 --> 02:59:12.560] Again, if you can limit your amounts, it's not going to kill you.
[02:59:13.240 --> 02:59:17.000] It will, you know, in massive amounts; it's terrible for your health.
[02:59:17.040 --> 02:59:22.080] But if you're capable of regulating it and having it in moderation, then it's fine.
[02:59:22.200 --> 02:59:24.320] That's my opinion on things. I'm not a doctor.
[02:59:24.320 --> 02:59:26.040] I don't even play one on TV.
[02:59:26.520 --> 02:59:29.120] So don't take my word for it.
[02:59:29.760 --> 02:59:31.280] But yeah, that's how I view it.
[02:59:31.840 --> 02:59:35.240] Again, I want to thank you all for tuning in today.
[02:59:35.280 --> 02:59:38.160] It's been a pleasure to go through the news with you.
[02:59:39.160 --> 02:59:42.320] And if you would like to support the show, go to davidknight.news.
[02:59:42.320 --> 02:59:44.440] We've got all the ways you can do that listed there.
[02:59:44.960 --> 02:59:46.120] Really do appreciate it.
[02:59:46.960 --> 02:59:48.480] We will be back tomorrow.
[02:59:48.960 --> 02:59:51.240] So God bless you all. Have a wonderful rest of your day.
[02:59:51.240 --> 02:59:53.840] And I will see you then. Take care.
[02:59:56.040 --> 03:00:00.240] The Common Man.
[03:00:10.480 --> 03:00:11.280] The Common Man.
[03:00:14.600 --> 03:00:17.200] They created Common Core to dumb down our children.
[03:00:17.640 --> 03:00:20.560] They created Common Pass to track and control us.
[03:00:20.800 --> 03:00:24.760] Their commons project to make sure the commoners own nothing.
[03:00:25.640 --> 03:00:27.200] And the communist future.
[03:00:28.640 --> 03:00:32.360] They see the common man as simple, unsophisticated, ordinary.
[03:00:33.320 --> 03:00:37.800] But each of us has worth and dignity created in the image of God.
[03:00:39.960 --> 03:00:41.920] That is what we have in common.
[03:00:41.920 --> 03:00:44.480] That is what they want to take away.
[03:00:44.480 --> 03:00:48.880] Their most powerful weapons are isolation, deception, intimidation.
[03:00:49.560 --> 03:00:54.240] They desire to know everything about us while they hide everything from us.
[03:00:55.080 --> 03:00:58.800] It's time to turn that around and expose what they want to hide.
[03:01:00.080 --> 03:01:04.280] Please share the information and links you'll find at TheDavidKnightShow.com.
[03:01:04.760 --> 03:01:06.840] Thank you for listening. Thank you for sharing.
[03:01:12.600 --> 03:01:15.800] If you can't support us financially, please keep us in your prayers.
[03:01:16.400 --> 03:01:18.000] TheDavidKnightShow.com.
[03:01:24.760 --> 03:01:28.720] Career changers, including veterans and active duty service members.
[03:01:28.720 --> 03:01:30.520] Your transition starts here.
[03:01:30.520 --> 03:01:32.920] Go from GI to IT in a matter of months.
[03:01:32.920 --> 03:01:36.200] Become a certified cyber warrior with training at My Computer Career.
[03:01:36.200 --> 03:01:38.360] Cyber security specialists are in high demand.
[03:01:38.360 --> 03:01:44.160] Offering IT pros great opportunities, job security,
[03:01:54.800 --> 03:01:57.800] And a rewarding lifestyle while protecting our people, liberty,
[03:01:57.800 --> 03:02:00.080] and treasured institutions from cyber threats.
[03:02:00.080 --> 03:02:04.960] Deploy your career in IT today at MyComputerCareer.edu slash CWP.
[03:02:04.960 --> 03:02:06.840] An elite CompTIA certification partner.
[03:02:06.840 --> 03:02:08.840] VA education benefits are available to those who qualify.