dks_fs_12_09_2025.timecode

[00:00.000 --> 00:05.300]  MOTO Casino, America's Social Casino.
[00:05.300 --> 00:07.900]  Welcome to MOTO Casino, where the excitement never ends.
[00:07.900 --> 00:11.300]  With thousands of the hottest free-to-play social casino games, fastest payouts,
[00:11.300 --> 00:14.400]  and the best promotions in the industry, no tricks or gimmicks.
[00:14.400 --> 00:16.300]  Owned and operated in the USA.
[00:16.300 --> 00:17.800]  MOTO Casino is a free-to-play social casino.
[00:17.800 --> 00:19.900]  No purchase necessary, 21 plus to play, void where prohibited.
[00:19.900 --> 00:22.300]  Sign up today for a generous welcome bonus.
[00:22.300 --> 00:27.500]  MOTO Casino, America's Social Casino.
[00:27.500 --> 00:29.400]  Download the MOTO Casino app today.
[00:30.100 --> 00:35.100]  Multiply, multiply, multiply, multiply.
[00:35.100 --> 00:38.200]  With X the cash scratch tickets from the Texas Lottery,
[00:38.200 --> 00:43.700]  you could multiply the cash by 30, 50, 100, or even 200 times.
[00:43.700 --> 00:46.800]  And when you multiply the cash, you multiply the celebration.
[00:46.800 --> 00:50.100]  With top prizes from 60,000 up to a million dollars,
[00:50.100 --> 00:52.300]  it's the easiest way to multiply your luck.
[00:52.300 --> 00:55.400]  And enter for a chance to win a VIP iHeart experience.
[00:55.400 --> 00:57.700]  Play X the cash scratch tickets today.
[00:57.800 --> 00:59.000]  Must be 18 or older.
[00:59.000 --> 01:00.100]  Play responsibly.
[01:28.700 --> 01:35.700]  In a world of deceit, telling the truth is a revolutionary act.
[01:35.700 --> 01:38.700]  It's the David Knight Show.
[01:44.700 --> 01:47.700]  As the clock strikes 13, it's Tuesday, the 9th of December,
[01:47.700 --> 01:50.700]  in the year of our Lord 2025.
[01:50.700 --> 01:54.700]  Well, today we're going to take a long look at artificial intelligence.
[01:54.700 --> 01:58.700]  We actually have, in the third hour, a guest, a best-selling author.
[01:58.700 --> 02:00.200]  He's done 25 books.
[02:00.200 --> 02:04.700]  He's an MD who works in neuroscience.
[02:04.700 --> 02:07.700]  The book is The 21st Century Brain.
[02:07.700 --> 02:13.200]  And he's been a consultant or lecturer at the CIA, at the NSA,
[02:13.200 --> 02:14.700]  the Pentagon, other places.
[02:14.700 --> 02:19.700]  So he knows something about where this stuff is headed.
[02:19.700 --> 02:22.700]  We're going to see what he has to say here about this.
[02:22.700 --> 02:28.700]  But we're going to begin with what is going on in Ukraine.
[02:28.700 --> 02:31.700]  Is this the beginning of the end?
[02:31.700 --> 02:34.700]  Are these people going to be able to sustain this?
[02:34.700 --> 02:39.700]  Russia is rapidly advancing even though Zelensky is not even taking a look at the plan.
[02:39.700 --> 02:42.700]  And so we're going to start with news.
[02:42.700 --> 02:47.700]  We also have some interesting updates in pharmaceutical areas,
[02:47.700 --> 02:51.700]  as well as an update in terms of Trump's tariffs.
[02:52.700 --> 02:53.700]  Are they working?
[02:53.700 --> 02:58.700]  We finally got around to giving some breadcrumbs to the soybean farmers.
[02:58.700 --> 03:02.700]  We're going to take a look at the bigger picture of the soy stuff.
[03:02.700 --> 03:04.700]  So we're back already.
[03:04.700 --> 03:05.700]  That ended soon.
[03:05.700 --> 03:06.700]  Sorry, I accidentally hit the button.
[03:06.700 --> 03:07.700]  That's okay.
[03:07.700 --> 03:08.700]  All right.
[03:08.700 --> 03:10.700]  Well, let's start with the news here.
[03:10.700 --> 03:13.700]  And Trump said out loud, he said,
[03:13.700 --> 03:16.700]  I'm disappointed that Zelensky hasn't even read my peace proposal.
[03:16.700 --> 03:18.700]  And I understand how he feels.
[03:18.700 --> 03:23.700]  I'd point out that Trump hasn't even read the Constitution that he swore to uphold.
[03:23.700 --> 03:27.700]  Maybe he doesn't like it, just like Zelensky doesn't like peace.
[03:27.700 --> 03:31.700]  His frustration continues to show, especially after high hopes,
[03:31.700 --> 03:33.700]  for his 28-point peace plan.
[03:33.700 --> 03:36.700]  You know, we had this 10-point peace plan between us and the government.
[03:36.700 --> 03:38.700]  It's called the Bill of Rights.
[03:38.700 --> 03:41.700]  Tell you what, this is where we draw the line with what the government does
[03:41.700 --> 03:45.700]  and with our natural rights, our God-given rights.
[03:45.700 --> 03:47.700]  But they didn't respect that either.
[03:47.700 --> 03:53.700]  He said, I'm a little bit disappointed that Zelensky hasn't even read the proposal yet.
[03:53.700 --> 03:55.700]  Well, he doesn't want peace.
[03:55.700 --> 03:58.700]  And neither do the European leaders as well.
[03:58.700 --> 04:04.700]  So as he's saying that, you've got the leaders of Britain, France, Germany,
[04:04.700 --> 04:09.700]  all meeting with Zelensky telling him, keep fighting, keep fighting.
[04:09.700 --> 04:10.700]  We're going to win this thing.
[04:10.700 --> 04:12.700]  He said, that's not really what's happening.
[04:12.700 --> 04:14.700]  He said his people love it, but he hasn't read it.
[04:14.700 --> 04:16.700]  Russia's fine with it.
[04:16.700 --> 04:20.700]  That, he said, is his assessment of what's going on with Ukraine.
[04:20.700 --> 04:24.700]  Of course, this follows after his son-in-law, Jared Kushner,
[04:24.700 --> 04:26.700]  and his former business partner, I guess you could say,
[04:26.700 --> 04:31.700]  Steve Witkoff, who are now his emissaries for geopolitics.
[04:31.700 --> 04:36.700]  I mean, hey, if you can negotiate a big real estate deal in New York,
[04:36.700 --> 04:39.700]  most of this stuff is about real estate, right?
[04:39.700 --> 04:41.700]  Whether you're talking about Gaza or you're talking about Ukraine,
[04:41.700 --> 04:45.700]  you're still talking about people killing each other over land.
[04:45.700 --> 04:52.700]  And so he said they didn't think that Zelensky was really serious about this.
[04:52.700 --> 04:57.700]  Moscow, as I pointed out yesterday, really likes the document that was released,
[04:57.700 --> 05:06.700]  the first NSS, which is the National Security Strategy, that's there.
[05:06.700 --> 05:10.700]  It's basically laying out the Trump administration's perspective
[05:10.700 --> 05:12.700]  on foreign policy and national security.
[05:13.700 --> 05:15.700]  And I liked what it had to say.
[05:15.700 --> 05:19.700]  I just don't believe that Trump is going to stick to any of it.
[05:19.700 --> 05:21.700]  But Russia reported on it.
[05:21.700 --> 05:25.700]  You didn't have any reporting really from mainstream media here in America.
[05:25.700 --> 05:32.700]  So the Russians liked it because, as they pointed out in the Zero Hedge article,
[05:32.700 --> 05:37.700]  the document characterizes Europe as weak while warning of an unpredictable,
[05:37.700 --> 05:40.700]  disunified atmosphere on the European continent,
[05:40.700 --> 05:45.700]  where in desperation European leadership could overreact and escalate a war with Russia.
[05:45.700 --> 05:46.700]  You think?
[05:46.700 --> 05:51.700]  I mean, they've been doing that up front in so many different ways.
[05:51.700 --> 05:56.700]  You've got Fred Mertz in Germany, and you've got Keir Starmer in Britain,
[05:56.700 --> 05:57.700]  as well as Macron in France.
[05:57.700 --> 06:01.700]  They're all saying, you know, get ready for massive casualties,
[06:01.700 --> 06:05.700]  and we've got to draft more people in the Army.
[06:05.700 --> 06:06.700]  I mean, they're doing everything.
[06:06.700 --> 06:10.700]  It essentially amounts to a declaration of war already.
[06:10.700 --> 06:15.700]  So Donald Trump's first NSS since returning to office blames European officials
[06:15.700 --> 06:18.700]  for thwarting U.S. efforts to end the war in Ukraine
[06:18.700 --> 06:24.700]  and accuses governments of ignoring a large European majority, quote unquote, who want peace.
[06:24.700 --> 06:27.700]  Well, I agree with Trump on that, the Trump administration.
[06:27.700 --> 06:30.700]  I just don't trust him on any of this stuff.
[06:30.700 --> 06:34.700]  Meanwhile, there might be another way to have peace,
[06:34.700 --> 06:38.700]  and that is for Russia to win, and it looks like that may be happening.
[06:38.700 --> 06:42.700]  That would be one way to have a lasting peace.
[06:42.700 --> 06:49.700]  Well, you could end the NATO provocation that is called Ukraine as a geopolitical construct
[06:49.700 --> 06:54.700]  that, as I pointed out before, the Ukraine was an area of Russia
[06:54.700 --> 07:01.700]  for 400 years, and breaking that off as a separate entity
[07:01.700 --> 07:07.700]  and then creating a coup to change the government that then began a civil war.
[07:07.700 --> 07:11.700]  That happened in 2014, 11 years ago.
[07:11.700 --> 07:18.700]  So that is a construct of NATO, who decided after the dissolution of the Soviet Union
[07:18.700 --> 07:21.700]  that they would eliminate Russia as a power.
[07:21.700 --> 07:24.700]  And so this has been a gradual policy of encroachment.
[07:24.700 --> 07:26.700]  They're pushing for war.
[07:26.700 --> 07:31.700]  Putin's army seizing land at one of its fastest rates since the initial invasion
[07:31.700 --> 07:34.700]  almost four years ago, says research.
[07:34.700 --> 07:37.700]  The Kremlin's army seized 200 square miles of territory in November,
[07:37.700 --> 07:43.700]  up from 100 square miles the previous month, according to Deep State,
[07:43.700 --> 07:47.700]  a trusted Ukraine-based battlefield map.
[07:47.700 --> 07:51.700]  How about that? They even call it Deep State.
[07:51.700 --> 07:54.700]  Let's use that for our marketing purposes here.
[07:54.700 --> 07:57.700]  The speed of advance was approaching the fastest since the initial invasion
[07:57.700 --> 07:59.700]  almost four years ago.
[07:59.700 --> 08:02.700]  But then you have the desperation of the war cult.
[08:02.700 --> 08:07.700]  Zelensky meeting with Keir Starmer, Emmanuel Macron, Fred Mertz.
[08:07.700 --> 08:11.700]  Ukraine is holding its own, they said, and doing even better.
[08:11.700 --> 08:14.700]  Ukraine is not on the brink of collapse.
[08:14.700 --> 08:18.700]  Again, reality has no meaning to these people.
[08:18.700 --> 08:21.700]  If we cannot immediately reach a peace agreement with Russia,
[08:21.700 --> 08:24.700]  it is essential that we give Ukraine all the support it needs
[08:24.700 --> 08:28.700]  so that it does not lose ground due to lack of support.
[08:28.700 --> 08:32.700]  Well, it is losing ground, even though they are supporting it.
[08:32.700 --> 08:35.700]  And this was something that many people said from the beginning,
[08:35.700 --> 08:41.700]  that there was no way that Russia was going to be able to outlast,
[08:41.700 --> 08:44.700]  that Ukraine was going to be able to outlast Russia.
[08:44.700 --> 08:48.700]  The comparative size of the two countries' militaries,
[08:48.700 --> 08:56.700]  as well as the close proximity, it was in the cards that this was going to happen.
[08:56.700 --> 09:00.700]  The plan to end the war drawn up by the Trump administration
[09:00.700 --> 09:04.700]  involves Ukraine handing over vast tracts of land,
[09:04.700 --> 09:09.700]  and Ukraine and Europe have rejected the proposals.
[09:09.700 --> 09:14.700]  Well, that's one way it's going to end, and maybe it will end when they take the land.
[09:15.700 --> 09:18.700]  The gains in territory risk helping to persuade Trump
[09:18.700 --> 09:21.700]  that peace should be set on Russia's terms,
[09:21.700 --> 09:25.700]  and that sending weapons and aid to Kiev was a waste.
[09:25.700 --> 09:27.700]  Yes, really.
[09:27.700 --> 09:31.700]  Well, why does Keir Starmer want the war so much,
[09:31.700 --> 09:36.700]  and why does Fred Mertz want the war, and why does France's Emmanuel Macron?
[09:36.700 --> 09:41.700]  It's because their people understand that their governments are at war with them.
[09:41.700 --> 09:48.700]  They're locking people up for mere comments as they create this police surveillance state
[09:48.700 --> 09:50.700]  and shut down all free speech.
[09:50.700 --> 09:55.700]  And in the UK, for example, same thing is happening in all these countries,
[09:55.700 --> 09:58.700]  being overrun with immigrants from abroad.
[09:58.700 --> 10:05.700]  There is fury in the UK as nearly 350,000 migrant families could get extra welfare
[10:05.700 --> 10:08.700]  after the new budget from Keir Starmer.
[10:08.700 --> 10:10.700]  So why does he want war?
[10:10.700 --> 10:15.700]  Well, because his own people are waking up to the fact that Starmer is at war with the British people.
[10:15.700 --> 10:16.700]  That's why.
[10:16.700 --> 10:19.700]  There are 350,000 foreign-born families,
[10:19.700 --> 10:26.700]  and they found that 200,000 of them were from just 10 countries.
[10:26.700 --> 10:30.700]  Families from Pakistan, Bangladesh, Nigeria set to benefit the most
[10:30.700 --> 10:35.700]  from the £3 billion decision to scrap the two-child benefit cap,
[10:35.700 --> 10:37.700]  the analysis found.
[10:37.700 --> 10:41.700]  A Tory MP who carried out the research said,
[10:41.700 --> 10:46.700]  you have to ask whose side the government is on.
[10:46.700 --> 10:48.700]  I don't think you have to ask that anymore.
[10:48.700 --> 10:50.700]  I think they made that pretty clear.
[10:50.700 --> 10:55.700]  They like any third-world migrants, and they hate all the native Britons.
[10:55.700 --> 11:04.700]  And if you look at the chart that's there, the 10 countries are Pakistan, Bangladesh, Nigeria, Somalia, India,
[11:04.700 --> 11:09.700]  Ghana, Afghanistan, Iraq, Sri Lanka.
[11:09.700 --> 11:11.700]  That's nine of them.
[11:11.700 --> 11:16.700]  There's only one country that is European, and that's Poland.
[11:16.700 --> 11:21.700]  And so that's 200,000 of the 350,000.
[11:21.700 --> 11:25.700]  The remaining 150,000 immigrants come from the rest of the world.
[11:25.700 --> 11:30.700]  And of course, coming in for the welfare benefits, the welfare magnet that's there.
[12:30.700 --> 12:36.700]  So, Milo Yiannopoulos, and I don't normally get into these.
[12:36.700 --> 12:43.700]  It's amazing to me what a soap opera the conservative alternative media has become.
[12:43.700 --> 12:46.700]  But they've kind of been angling for this for a long time.
[12:46.700 --> 12:52.700]  One of the things that I criticized Charlie Kirk for was the fact that he was going around doing culture war events.
[12:53.700 --> 13:00.700]  And he was putting out front a black guy who was a homosexual, checking two DEI boxes.
[13:00.700 --> 13:09.700]  And as he's going around talking about Christ and Christianity, he's sending this conflicting message of supporting homosexual marriage.
[13:09.700 --> 13:14.700]  And he was called out on it by some people at some of the events.
[13:14.700 --> 13:17.700]  And he got really furious.
[13:17.700 --> 13:19.700]  How dare you call this out?
[13:19.700 --> 13:29.700]  And I said at the time, I said, this I think is very revealing because it shows what he's interested in is big tent GOP.
[13:29.700 --> 13:34.700]  He's interested in getting money from backers and that type of thing.
[13:34.700 --> 13:40.700]  And to me, it was a real betrayal of all the conservative things that he pays lip service to.
[13:40.700 --> 13:42.700]  But the entire Republican Party is like that.
[13:42.700 --> 13:44.700]  But especially the alternative media.
[13:44.700 --> 13:55.700]  And so Milo Yiannopoulos has apologized for helping to sell and normalize homosexuality and homosexual marriage.
[13:55.700 --> 14:00.700]  And where did he do that with the alternative media?
[14:00.700 --> 14:05.700]  Milo says he's become a Christian and rejected that.
[14:05.700 --> 14:12.700]  And now he is outing a lot of other people that are living this closeted, as they say, lifestyle.
[14:13.700 --> 14:15.700]  We've seen this for a long time in the Republican Party.
[14:15.700 --> 14:23.700]  And Milo's point is that homosexuality is rampant, but hidden in the GOP.
[14:23.700 --> 14:27.700]  I mean, there have been reports that when they have their large conventions,
[14:27.700 --> 14:34.700]  you can see the spike in activity on Grindr, which is a homosexual dating app.
[14:34.700 --> 14:39.700]  You can see by geolocation where they're meeting.
[14:39.700 --> 14:42.700]  And we've seen it in the past.
[14:42.700 --> 14:49.700]  I mean, the longest serving Speaker of the House, Dennis Hastert, was put into Congress from being a wrestling coach.
[14:49.700 --> 14:51.700]  That was his qualification for getting in Congress.
[14:51.700 --> 14:54.700]  Actually, his qualification wasn't being a wrestling coach.
[14:54.700 --> 14:57.700]  His qualification was being a pedophile wrestling coach.
[14:57.700 --> 15:01.700]  And then that lawsuit caught up to him eventually.
[15:01.700 --> 15:05.700]  But while he was in, there was a paging scandal.
[15:05.700 --> 15:08.700]  Not pager, I guess, the pages.
[15:08.700 --> 15:13.700]  The young boys that go to Congress because they want to get experience in politics.
[15:13.700 --> 15:16.700]  They got a different kind of experience than they were expecting.
[15:16.700 --> 15:18.700]  And so there was a scandal there with Mark Foley.
[15:18.700 --> 15:23.700]  And so Dennis Hastert, before all this stuff broke about him, went on with Rush Limbaugh.
[15:23.700 --> 15:24.700]  And they just pooh-poohed it.
[15:24.700 --> 15:29.700]  Oh, this is just nothing but partisan politics, the same type of stuff they're doing now with Pete Hegseth.
[15:29.700 --> 15:35.700]  And what's happening with the murder of people in international waters.
[15:35.700 --> 15:37.700]  And so, yeah, it's just partisan politics.
[15:37.700 --> 15:40.700]  Nothing to see here, except we did see what it was.
[15:40.700 --> 15:48.700]  And so Milo is saying that, in his opinion, it is everywhere within the GOP.
[15:48.700 --> 15:55.700]  Now, he might have a bit different perspective on it since he was holding himself forth as a homosexual.
[15:56.700 --> 16:02.700]  And again, of course, they're still doing this with Scott Presler, the guy with really long straight hair.
[16:02.700 --> 16:04.700]  You may remember him.
[16:04.700 --> 16:12.700]  He is a favored person for the GOP in terms of representing them.
[16:12.700 --> 16:16.700]  And they're normalizing this.
[16:16.700 --> 16:18.700]  And so Milo has rejected that.
[16:18.700 --> 16:23.700]  And he has apologized for normalizing that.
[16:23.700 --> 16:25.700]  Which, by the way, none of the influencers have.
[16:25.700 --> 16:32.700]  And so people like Charlie Kirk, people like Alex Jones have been normalizing this type of thing.
[16:32.700 --> 16:37.700]  And as a matter of fact, he went on with Tim Pool, who is also playing this game.
[16:37.700 --> 16:39.700]  Tim Pool had Milo on.
[16:39.700 --> 16:41.700]  He had George Santos.
[16:41.700 --> 16:45.700]  Why would you put George Santos on, unless it's some kind of a clickbait thing?
[16:45.700 --> 16:51.700]  And so Milo is making all kinds of statements about all these other conservative influencers.
[16:52.700 --> 17:02.700]  Candace Owens and even Charlie Kirk and Alex Jones, saying that they were involved in homosexual activity.
[17:02.700 --> 17:04.700]  So I don't know.
[17:04.700 --> 17:13.700]  And so already you had Benny Johnson, whom he said that about, saying he's going to sue Milo for it.
[17:13.700 --> 17:16.700]  And he made some very specific statements about it.
[17:17.700 --> 17:25.700]  All I can say is that when you look at how they're using this, the people who say that they're for conservative values,
[17:25.700 --> 17:30.700]  that they're for family values, and then they do this kind of stuff.
[17:30.700 --> 17:37.700]  I mean, it's just look at Alex Jones platforming Blair White, this guy who dresses up like a woman.
[17:37.700 --> 17:43.700]  And so, again, Tim Pool put all that stuff into his podcast.
[17:43.700 --> 17:52.700]  All I've got to say about that is the reason I mention this is not to get caught up in all of this gossip and all the rest of this stuff.
[17:52.700 --> 17:57.700]  But just take these people and look at what they do.
[17:57.700 --> 18:00.700]  Look at what they do and look at what they say.
[18:00.700 --> 18:04.700]  Ask yourself, then, why would you trust them?
[18:04.700 --> 18:14.700]  Very interesting in terms of January the 6th. Trump, according to some sources,
[18:14.700 --> 18:19.700]  was trashing the people pushing the conspiracy theories around January the 6th.
[18:19.700 --> 18:22.700]  And then you got people like Nick Fuentes.
[18:22.700 --> 18:26.700]  It was put up by Shannon Joy yesterday, and I don't have it on the deck here.
[18:26.700 --> 18:33.700]  But it was footage of Nick Fuentes yelling at people, go over there, go over there.
[18:33.700 --> 18:36.700]  Directing people on January the 6th.
[18:36.700 --> 18:44.700]  And I've said from the very beginning, why did they not focus on Ray Epps, right?
[18:44.700 --> 18:54.700]  And not focus on Fuentes, on Alex Jones and all these people who have been running Stop the Steal, all the people who enticed them to come.
[18:54.700 --> 18:59.700]  And it's like Ray Epps is there saying, yeah, we've got to go over there as well.
[19:00.700 --> 19:03.700]  Fuentes is doing that that day as well.
[19:03.700 --> 19:07.700]  Why does he get a pass? Is he a Fed?
[19:07.700 --> 19:15.700]  The question is, when you look at this stuff, are they selling this stuff for clicks?
[19:15.700 --> 19:22.700]  Are they selling it because they're being funded by people who want to use them to propagandize you?
[19:22.700 --> 19:25.700]  Use them for controlled opposition?
[19:25.700 --> 19:30.700]  And I think that it really, in the long term, doesn't really matter that much.
[19:30.700 --> 19:33.700]  They're manipulating you. They're lying to you.
[19:33.700 --> 19:35.700]  And that's the key thing that you need to know.
[19:35.700 --> 19:38.700]  It's a trap in many different ways.
[19:38.700 --> 19:42.700]  Well, I'm going to take a quick break here because there's something going on.
[19:42.700 --> 19:44.700]  I need to find out what is happening with this.
[19:44.700 --> 19:46.700]  And we're going to continue.
[19:46.700 --> 19:51.700]  When we come back, we're going to talk about a man who died from eating cockroaches.
[19:51.700 --> 19:55.700]  If people swallow some of this stuff coming from the conservative influencers,
[19:55.700 --> 19:58.700]  I guess that's kind of like swallowing cockroaches.
[19:58.700 --> 20:02.700]  And if you get too much of it, it can be a very bad thing for you.
[20:02.700 --> 20:05.700]  So I'm going to take a quick break, folks, and we will be right back.
[21:51.700 --> 22:18.700]  You're listening to The David Knight Show.
[23:20.180 --> 23:26.820]  APS radio delivers multiple channels of music right to your mobile device. Get the APS radio
[23:26.820 --> 23:31.140]  app today and listen wherever you go.
[23:31.140 --> 23:33.900]  Well welcome back. I was trying to figure out what was going on. Everybody's scrambling
[23:33.900 --> 23:37.820]  and running around and I didn't know what the issue was. It turns out that we had some
[23:37.820 --> 23:42.380]  issues with Rumble streaming. So that's now been fixed and we now have everybody back
[23:42.380 --> 23:44.260]  in their proper assigned seats.
[23:44.260 --> 23:48.860]  So if you wanted to be on Rumble but went somewhere else, you can now go back to Rumble and watch
[23:48.860 --> 23:50.100]  the show there.
[23:50.100 --> 23:55.180]  Yes. Well, as I promised, we're going to talk about something, not really important here, but
[23:55.180 --> 24:00.420]  I think it is an apt metaphor for our times in a number of ways: a man's horrifying death
[24:00.420 --> 24:07.100]  as he ate cockroaches in a competition. And this is just yet another warning. You probably
[24:07.100 --> 24:11.500]  don't want to get into competitions of drinking and eating stuff, whether it's hot dogs or
[24:11.500 --> 24:18.820]  even water or especially cockroaches. But I've talked many times in terms of how dosage
[24:18.820 --> 24:24.020]  is so important. There was a woman who was part of a radio contest that was
[24:24.020 --> 24:28.540]  going on and they thought it'd be funny to give people lots of water and then not let
[24:28.540 --> 24:35.540]  them go to the bathroom. And a lady died because the water, basically, if you get a lot of
[24:35.620 --> 24:42.620]  water, an overdose on water, it will dilute, I think, your blood or something to the extent
[24:42.620 --> 24:47.620]  that it kills you. And it killed that one woman just in terms of doing a stupid contest.
[24:47.620 --> 24:48.620]  This guy...
[24:48.620 --> 24:50.580]  I think it's stomach lining that it dilutes.
[24:50.580 --> 24:51.580]  Stomach lining, that's the method?
[24:51.580 --> 24:57.220]  Yeah, and then it just leeches out into your system and your body needs water, but it's
[24:57.220 --> 24:59.220]  supposed to stay in its proper place.
[24:59.220 --> 25:05.220]  Wow. Well, this guy, 32 years old, collapsed and died as part of a contest. And guess what?
[25:05.220 --> 25:11.100]  The prize was a python. I want that python. Give me those bugs.
[25:11.100 --> 25:18.100]  I'll eat the bugs for the snake. This is a strange barter economy he was in.
[25:18.100 --> 25:22.900]  He was trying to eat zee bugs and he ate too many of zee bugs. The interesting thing is,
[25:22.900 --> 25:27.020]  when I saw this, I thought, so are these things toxic? I grew up in Florida where we have
[25:27.020 --> 25:32.820]  really large cockroaches, palmetto bugs, I would call them, to try to put a, I think,
[25:32.820 --> 25:33.820]  a nice...
[25:33.820 --> 25:34.820]  Soften the blow a little bit.
[25:34.820 --> 25:42.620]  Put a nice spin on it, a label, but they're filthy things. And so I thought, you know,
[25:42.620 --> 25:49.180]  was it toxic? No. It's actually that he just aspirated cockroach parts. He was trying to eat them
[25:49.180 --> 25:56.060]  so quickly. And so he died from asphyxiation, got him stuck in his throat. His girlfriend
[25:56.060 --> 26:00.540]  said that he had eaten bugs before, and she was his girlfriend.
[26:00.540 --> 26:01.540]  So...
[26:01.540 --> 26:03.820]  There's somebody out there for everyone, guys.
[26:03.820 --> 26:04.820]  That's right.
[26:04.820 --> 26:09.380]  Such a pity that he died eating bugs. He loved eating bugs.
[26:09.380 --> 26:14.780]  So it involved not just cockroaches, but it had several different rounds of eating different
[26:14.780 --> 26:19.580]  species of insects. And I don't know if these were the big, this was in Florida, but I don't
[26:19.580 --> 26:25.220]  know if it was the big Florida cockroaches and palmetto bugs. They said they were measuring
[26:25.220 --> 26:26.860]  three or four inches long.
[26:26.860 --> 26:30.820]  Kind of sucks that he got to the cockroach round and then died there.
[26:31.820 --> 26:35.820]  Yeah, maybe grasshoppers would have been better. I don't know, but the...
[26:35.820 --> 26:41.060]  If what you're consuming can come in a plague, stop eating it.
[26:41.060 --> 26:45.820]  This might have been the Madagascar roaches or something. It was three or four inches long.
[26:45.820 --> 26:46.820]  Anyway...
[26:46.820 --> 26:50.820]  I feel like those would be too expensive. You know, those are a pet people want to buy.
[26:50.820 --> 26:55.060]  Yeah, they said he was eating these things really quickly, and then he began retching.
[26:55.060 --> 26:59.260]  I guess most of the people thought it would be nothing unusual after eating a bunch of
[26:59.260 --> 27:04.580]  cockroaches that you would start to throw up. But maybe that's why they evidently didn't
[27:04.580 --> 27:10.100]  give him the Heimlich maneuver. I don't know. But in the video, you can see him trying to
[27:10.100 --> 27:16.060]  swallow and breathe at the same time. We can't do both of those simultaneously. That's right.
[27:16.060 --> 27:25.420]  So question from the New York Times is, is Hollywood getting God? I guess you'd have
[27:25.420 --> 27:26.420]  a t-shirt.
[27:26.420 --> 27:27.420]  Probably eventually God's wrath.
[27:27.580 --> 27:32.700]  Yeah. Instead of "Got Milk?", you could say "Got God?", you know, or something. But I don't
[27:32.700 --> 27:37.220]  think that they get God. I don't think they understand God. I don't think they ever have
[27:37.220 --> 27:44.260]  understood God. And a good example of this is something that is happening today. Today
[27:44.260 --> 27:52.700]  is the 60th anniversary, December 9th, 1965, of the airing of the Charlie Brown Christmas
[27:52.700 --> 28:00.460]  special. And CBS really didn't get the whole Christmas thing either. It was kind of interesting
[28:00.460 --> 28:08.100]  because it was sponsored by Coca-Cola. Coca-Cola, during the summer of 1965, in June, as a matter
[28:08.100 --> 28:12.860]  of fact, came to CBS and said, we want to have a TV special that we want to sponsor.
[28:12.860 --> 28:17.820]  Well, you know, Coca-Cola doesn't really like Christmas. It doesn't like Christ, and
[28:18.300 --> 28:24.880]  they've done everything they can to put Santa in his place. And these AI commercials that
[28:24.880 --> 28:32.300]  Coca-Cola has done, they got a lot of criticism for it. But they scrupulously avoid using
[28:32.300 --> 28:37.400]  the term Christmas having anything to do with Christ. And so they were going to be the sponsor
[28:37.400 --> 28:43.980]  of this. And so they said, we're on a really tight schedule. And there's actually a documentary
[28:43.980 --> 28:48.660]  in case you're interested: The Making of the Charlie Brown Christmas.
[28:48.660 --> 28:55.940]  Bill Melendez is still around and he was the animator. And so he's one of the key people
[28:55.940 --> 28:59.340]  that they talked to about it. And they said, we didn't know how we were going to get this
[28:59.340 --> 29:05.660]  thing done. So they brought in Charles Schulz. They had already picked him, saying, we want
[29:05.660 --> 29:10.300]  to do something with Peanuts. They called him Sparky. That was his nickname. And they said,
[29:11.260 --> 29:16.540]  he was really incredible as a creative. He wasn't just a cartoonist. He was a storyteller.
[29:16.540 --> 29:20.940]  And he did these things that came out of the woodwork. Sometimes I would just sit back
[29:20.940 --> 29:28.460]  and like, wow, this guy comes up with great ideas. And so he was able to put together
[29:28.460 --> 29:34.060]  the outline for the show in less than a day. They sent the outline to Coca-Cola
[29:34.060 --> 29:43.420]  on Monday; on Tuesday, they called up and said they'd do it. And it had the objectionable scene
[29:43.420 --> 29:50.540]  in it, which was Linus reading the Bible passage from Luke. But they didn't really catch onto that,
[29:50.540 --> 29:57.180]  evidently. And so the TV executives, once they got the show delivered to them, were very unhappy
[29:57.180 --> 30:02.700]  with it. They said they didn't like the kids' voices, which I thought were pretty good. They didn't
[30:02.700 --> 30:08.060]  like the jazz music. They said it doesn't fit, which, of course, that has now become a classic.
[30:08.060 --> 30:09.020]  It's iconic.
[30:09.020 --> 30:14.940]  Yeah. And they didn't like the Bible being in there. They thought that was too controversial.
[30:15.500 --> 30:21.020]  It's like all the things that everybody likes about it. CBS TV executives hated it. That's how
[30:21.020 --> 30:27.180]  totally out of touch they are with everything like this. That's why Hollywood is circling the drain
[30:27.260 --> 30:32.940]  and well on its way to being flushed out, because they really don't get it.
[30:32.940 --> 30:38.060]  Yeah, you can't have more shows like this now. In fact, you couldn't even really have them back
[30:38.060 --> 30:42.700]  then most of the time. This was lightning in a bottle that got past them.
[30:42.700 --> 30:44.060]  That's right. That's right.
[30:44.060 --> 30:50.140]  They've been completely out of touch and anti-Christian for decades, probably since inception.
[30:50.140 --> 30:54.780]  Like 60 years. Well, yeah, if you look at Hollywood, it was pretty amazing. There was
[30:54.780 --> 31:03.260]  an interesting BBC series that was narrated by James Mason, the actor. It's talking about
[31:03.260 --> 31:07.020]  the early days of Hollywood silent films. They called it something about silver screen.
[31:08.540 --> 31:13.340]  We had it in our video stores. It was really interesting because they talked about how they
[31:13.340 --> 31:21.180]  made the movies and why movie stars wear sunglasses because they were spending all day in these really
[31:21.180 --> 31:27.340]  bright lights, these carbon arc lights that they were using and doing a number on their eyes.
[31:27.340 --> 31:31.180]  They really needed to get their eyes shaded. When they went outside, their eyes needed to rest.
[31:31.980 --> 31:36.780]  A lot of different things like that, but how they would do stunts, everything was real.
[31:37.260 --> 31:42.780]  There were no special effects. They did it for real. Lillian Gish is on an ice
[31:42.780 --> 31:47.420]  floe, and she's on a real ice floe. This is not a staged thing.
[31:48.060 --> 31:54.940]  And the cameramen, how would they keep a steady film speed? It does look a little bit
[31:54.940 --> 31:59.260]  jerky in terms of movement and that type of thing, but the cameramen were picked because
[31:59.260 --> 32:04.380]  they could turn the crank and manually crank the film through the camera at a constant rate.
[32:04.380 --> 32:08.220]  They all had a song that they would sing to themselves and that would be how they would
[32:08.220 --> 32:13.020]  pace themselves. But these guys had to keep this stuff up even when they strapped them to the wing
[32:13.020 --> 32:17.580]  of a biplane or something. They were up there controlling this thing as they were flying around
[32:17.580 --> 32:25.820]  on the biplane. It was a fascinating series, but from the inception, you can see just how perverted it was.
[32:25.820 --> 32:30.540]  I mean, the whole thing was like a continuous Jeffrey Epstein party with all these different
[32:30.540 --> 32:34.380]  people. That's why they had the Hollywood code that came in. But they've been completely out of
[32:34.380 --> 32:41.660]  touch with the rest of society from the get-go. They don't get it, but what they do is they
[32:41.660 --> 32:48.860]  manufacture a new reality. They manufacture a new consent. They're not reflecting culture.
[32:48.860 --> 32:55.900]  They're driving culture. Anyway, back to this. In the outline, Schulz, Sparky, had insisted that
[32:55.900 --> 33:00.540]  there'd be a scene from the Bible. And at the time, hardly any TV shows referenced scripture.
[33:00.540 --> 33:05.500]  The move was very risky. Mendelson said, Bill and I looked at each other and he said,
[33:05.500 --> 33:08.700]  oh, we don't know if we can animate from the Bible. It's never been done before.
[33:09.580 --> 33:14.220]  And Charles Schulz said, well, if we don't do it, who will? So they went ahead and did that.
[33:14.220 --> 33:20.940]  That became part of the famous scene. This year, again, marks the 60th anniversary of the TV special
[33:22.460 --> 33:33.820]  December the 9th, 1965, at 7:30. And it's the 75th anniversary of the Peanuts comic strip. So he had
[33:33.980 --> 33:38.780]  that comic strip for about 15 years before they picked him to do the film.
[33:39.500 --> 33:45.340]  So this is a short segment. We're going to come back, though, and we're going to talk about the
[33:45.340 --> 33:53.580]  technocracy and some of the mounting problems with self-driving cars that are going to take
[33:53.580 --> 33:59.180]  over the world. AI is going to run the world and going to run us, but it can't even navigate
[33:59.260 --> 34:04.860]  the Chick-fil-A drive-through. They're working on an app for that. And so we're going to take
[34:04.860 --> 34:12.220]  a quick break. And Lance, did you put in the Charlie Brown thing? Yeah, I believe it's
[34:12.220 --> 34:17.500]  called Christmas Time, in the Christmas folder. Okay. Yeah. Let's see if I can get that here.
[34:18.700 --> 34:23.180]  I got it. I got it. Yeah. All right. Yeah. We've got a little bit different visuals this year with
[34:23.180 --> 34:27.740]  the help of AI for our Charlie Brown song. We'll be right back.
[35:53.180 --> 36:14.700]  You're listening to the David Knight Show.
[37:14.700 --> 37:35.660]  Well, as we talk about what everyone was watching 60 years ago, today the government watches you.
[37:36.620 --> 37:40.220]  The TV watches you back. The refrigerator watches you back. As a matter of fact,
[37:40.220 --> 37:49.420]  there was an interesting funny story that Lance had shown me. And there was a woman
[37:49.420 --> 37:55.740]  who was suffering from paranoia. And she had one of these refrigerators that plays commercials all
[37:55.740 --> 38:05.660]  the time. And it was a commercial for kind of a sci-fi dystopian film. And the character in the
[38:05.660 --> 38:11.180]  film had the same name as this woman. And so the trailer starts playing this thing and calls her
[38:11.180 --> 38:16.620]  out by name. And she thought she was having a psychotic episode here. But I guess when
[38:16.620 --> 38:23.660]  they're really watching you, maybe it's not psychotic. It was a woman with schizophrenia.
[38:23.660 --> 38:32.860]  And she got these messages for this TV show in which some group or AI or something is talking
[38:32.860 --> 38:37.420]  to this woman through various devices. So it's putting up these messages like,
[38:37.420 --> 38:42.620]  sorry, we disappointed you, Carol. And the woman's named Carol and had been
[38:42.620 --> 38:46.540]  diagnosed as schizophrenic. So she thought she was having a psychotic break.
[38:47.900 --> 38:54.540]  Yeah, if I ever get a car that talks to me, I'll have to get the sound bites in there from 2001.
[38:54.540 --> 38:55.820]  Sorry, Dave, I can't do that.
[38:55.820 --> 39:00.780]  I was thinking you were going to go maybe KITT from Knight Rider or something less malevolent.
[39:01.100 --> 39:05.660]  No, it had to be malevolent, in my opinion. Talking about malevolent use of technology,
[39:07.500 --> 39:11.500]  Axon Enterprise. This is the company that is the biggest
[39:14.220 --> 39:20.140]  vendor of body cameras for cops. But of course, they're also famous for developing tasers.
[39:21.260 --> 39:26.860]  And now what they want to do is, and I thought it was interesting that the number two
[39:27.580 --> 39:32.620]  body camera company was Motorola. And I said, this is the way everything is going in the world.
[39:33.900 --> 39:39.980]  Because of the government's money, they've taken over all consumer manufacturing and everybody is
[39:39.980 --> 39:45.420]  now catering to the government. That's their customer. That's especially going to be true
[39:45.420 --> 39:50.940]  of artificial intelligence. But it has definitely been true for quite some time in terms of
[39:51.820 --> 39:58.700]  the technology companies that are here. Even consumer-based companies started getting into
[39:58.700 --> 40:04.620]  defense contract work because it was so lucrative. And so the police body cameras are equipped with
[40:04.620 --> 40:11.420]  artificial intelligence, trained to detect the faces of about 7,000 people on a high-risk
[40:11.420 --> 40:14.940]  watch list. And they're rolling this out in the Canadian city of Edmonton.
[40:14.940 --> 40:21.340]  And I have to ask myself, I should have looked up the population of Edmonton, but
[40:22.060 --> 40:27.660]  when you've got, in a town, I don't care if it's New York City, 7,000 people who
[40:27.660 --> 40:33.980]  are dangerous enough that they need to be on the BOLO, the be-on-the-lookout list, maybe there's
[40:33.980 --> 40:38.860]  something wrong with the government system and the court system that you have these people on
[40:38.860 --> 40:44.300]  the streets in the first place. So that's my first concern: why are 7,000 people,
[40:45.740 --> 40:51.660]  that they say are dangerous, allowed to be out there? Then the second issue is that if
[40:51.660 --> 40:55.900]  these people are dangerous enough that they're going to instantly alert the police and say,
[40:55.900 --> 40:59.260]  be careful of this person. They're very dangerous. They might be a threat to you.
[40:59.900 --> 41:04.620]  We've seen that type of thing done, labeling people as sovereign citizens. Remember how they
[41:04.620 --> 41:10.620]  did that after the, what was it, 2008 or something, when Chuck Baldwin and Ron Paul
[41:10.780 --> 41:17.980]  ran for president? Through these fusion centers, they were
[41:17.980 --> 41:22.780]  telling police officers that if they pulled a car over and it had a bumper sticker supporting Chuck Baldwin
[41:22.780 --> 41:26.700]  or Ron Paul, these people might be sovereign citizens, so you better be on the lookout for
[41:26.700 --> 41:33.740]  them, and they might try to kill you. So you've got the police taking the safeties off their guns,
[41:33.740 --> 41:39.180]  they're on a hair trigger here, and that's a real dangerous thing when you falsely identify people,
[41:39.260 --> 41:44.540]  as they did with that. These people are not a threat to the police, but this AI can do the same
[41:44.540 --> 41:52.060]  thing. This AI can say, this person looks like, I think we've got this particular guy, and you might
[41:52.060 --> 41:58.300]  be completely innocent, and you'd be misidentified by artificial intelligence, and because it's
[41:58.300 --> 42:04.780]  hyping up the police and telling them that you're dangerous, that could threaten you severely.
[42:05.420 --> 42:11.100]  So we've gone beyond the no-fly list type of stuff, and so now they want to do this.
[42:11.100 --> 42:18.620]  So they're rolling this out as a test in Edmonton. And I hope the AI is in their ear as they're
[42:18.620 --> 42:22.860]  getting this, just feeding them Full Metal Jacket lines, you know, show me your war face, just
[42:22.860 --> 42:27.660]  getting them really hyped up, pumped up, ready to go, rock and roll, heavy metal. That's right.
[42:27.660 --> 42:33.180]  Draw your gun right now, pull it on. Yeah, regardless of the population size, if you've got
[42:33.180 --> 42:39.100]  7,000 people who truly deserve to be on a terrorist watch list, that's going to be a war zone.
[42:39.100 --> 42:42.220]  I know. That's what I'm saying. I don't know the population of Edmonton, but it doesn't
[42:42.220 --> 42:48.300]  really matter. Even if it's New York City or some large area, 7,000 criminals out there that you've
[42:48.300 --> 42:53.180]  got to alert the police as to how dangerous they are. That's a crazy situation. That means
[42:53.180 --> 42:58.460]  that the whole policing and justice system ain't working, folks. Yeah. It's like, I'm convinced
[42:58.540 --> 43:03.500]  there are at least 7,000 people in New York that are criminals. But like you said, I'm not convinced
[43:03.500 --> 43:09.340]  there are 7,000 criminals, even in New York, that you need to immediately alert the police on.
[43:09.340 --> 43:13.980]  That's right. Yeah, they could be criminals because of something that they do that's not
[43:13.980 --> 43:20.220]  a threat to other people. Nevertheless, the interesting thing is that this was brought up
[43:20.220 --> 43:25.820]  six years ago by them and also considered by Motorola, who is now the number two provider
[43:25.820 --> 43:31.500]  of police body cameras. They're both talking about matching this with artificial intelligence
[43:31.500 --> 43:36.780]  and doing a biometric database because although that is much more sophisticated now, they've been
[43:36.780 --> 43:42.460]  working on this type of thing for quite some time. And so one of the guys who used to be
[43:43.100 --> 43:52.300]  the chair of Axon's ethics board spoke out. He resigned because of unethical behavior from
[43:52.300 --> 44:00.220]  the corporation back in 2019. He and seven other people resigned from Axon when the CEO had this
[44:00.220 --> 44:08.860]  great idea. Let's put our tasers on drones. It just keeps getting worse when you look at these
[44:08.860 --> 44:14.060]  corporations that are part of the police state industrial complex. I had this great idea to put
[44:14.060 --> 44:19.660]  tasers on drones. My entire ethics department quit, but this will be great for our bottom line.
[44:19.740 --> 44:24.780]  That's right. So after getting rid of the ethics department with the tasers on drones,
[44:24.780 --> 44:30.140]  now he is free to do artificial intelligence connected up to the police body cameras.
[44:30.780 --> 44:37.580]  And he said, it's not essential to use these technologies, which have very real costs and
[44:37.580 --> 44:43.180]  risks, unless there's some clear indication of the benefits. That was the former ethics
[44:43.180 --> 44:48.060]  board chair, Barry Friedman, who is now a law professor at
[44:48.060 --> 44:55.900]  New York University. The founder and CEO of Axon, though, says that the Edmonton pilot is
[44:55.900 --> 45:01.100]  not a product launch, but it's an early stage field research that will assess how the technology
[45:01.100 --> 45:06.780]  performs and reveal the safeguards needed to use it responsibly. So you better believe that if this
[45:06.780 --> 45:11.580]  thing works at all, they'll be selling it. And they don't really care if it gives false positives,
[45:12.220 --> 45:17.180]  if it identifies you as a criminal. By testing in real-world conditions outside the US, we can
[45:17.180 --> 45:22.460]  gather independent insights, we can strengthen oversight frameworks, and we can apply those
[45:22.460 --> 45:27.580]  learnings to future evaluations, including within the United States. So he's testing it outside the
[45:27.580 --> 45:34.460]  US, and believe me, they will sell this as safety for law enforcement officers. It will spread like
[45:34.460 --> 45:40.460]  wildfire, the way everybody will snap this thing up. So they're in the process right now of making
[45:40.460 --> 45:44.140]  their case for it. Oh, look, we tested it in Edmonton and it worked great. We already know
[45:44.140 --> 45:49.260]  how that's going to go. This is just like the way the pharmaceutical companies test their drugs.
[45:49.260 --> 45:55.020]  You know, yeah, look, here's our study that we did ourselves to show how safe and
[45:55.020 --> 46:03.580]  effective this is. So the person who is now the director of responsible AI, they don't call it
[46:03.580 --> 46:09.660]  ethics anymore, said, we really wanted to make sure that it's targeting
[46:09.660 --> 46:16.620]  these folks who have serious offenses. Okay, so again, why are 7,000 people with serious offenses
[46:17.260 --> 46:25.100]  at large in Edmonton? And what if it's a serious offense and they misflag you? They say they
[46:25.100 --> 46:30.700]  have a real issue under certain lighting conditions. They have an issue accurately identifying people
[46:30.700 --> 46:38.700]  with darker skin. And so this is going to be a disaster in the making
[46:38.700 --> 46:42.940]  right here. I'm getting to think, if they've got 7,000 hardened criminals on the
[46:42.940 --> 46:48.540]  streets that maybe the Mounties don't always get their man. They get a man, not necessarily
[46:48.540 --> 46:52.300]  the one that they need.
[47:46.700 --> 47:53.260]  We can promise you someone is going to prison.
[47:53.260 --> 47:57.660]  That's right. Our AI drones aren't all that great at picking out faces in low light,
[47:57.660 --> 48:00.860]  but let's put a whole bunch of tasers on them and send them out in swarms.
[48:01.740 --> 48:05.100]  If we put out enough of them, eventually things will work out.
[48:05.980 --> 48:08.060]  Just taser enough people, you'll get the criminals.
[48:09.900 --> 48:12.700]  Yeah, taser everybody. We'll sort it out later. Is there a lane on the ground?
[48:12.700 --> 48:17.500]  What is that military saying? Accuracy through volume of fire or something like that.
[48:17.500 --> 48:21.020]  You don't have to be precise with your shots if you just shoot enough times.
[48:21.020 --> 48:24.620]  Lethality, not legality, right? That's the new motto of the
[48:25.820 --> 48:32.300]  Pentagon Pete Department of Defense, because they haven't changed the name to War Department yet.
[48:32.300 --> 48:37.100]  So anyway, they talked to Motorola and Motorola said, well, we took a look at this and we decided
[48:37.100 --> 48:42.460]  not to do it because we thought it'd be unethical. We intentionally abstained from deploying
[48:42.460 --> 48:48.380]  this feature. However, we might do it in the future because ethics are changing, right?
[48:48.940 --> 48:53.580]  Morality is up for negotiation, especially if your competitor is doing it.
[48:53.580 --> 48:58.860]  And so if Axon does it, Motorola will do it and it'll explode and we'll see it everywhere.
[48:59.500 --> 49:06.860]  And they're all going to be coming to the local mayor, whoever, and say,
[49:07.660 --> 49:11.740]  well, if you won't do this for us, you really don't value our lives because we've had a
[49:11.740 --> 49:15.500]  police officer over here that was killed under these circumstances. We could have stopped that
[49:15.500 --> 49:21.020]  with this thing. So it'll be on them. This is clearly unethical. We don't want to be
[49:21.020 --> 49:25.020]  the ones pushing it, at the forefront of it, so we'll hold off on it.
[49:25.660 --> 49:31.660]  That's right. Studies show the technology is flawed. They demonstrate biased results based on
[49:31.660 --> 49:39.660]  race, gender and age. What else is there? Race, gender and age, that pretty much covers everything,
[49:39.660 --> 49:43.260]  doesn't it? I suppose if the drone were to sit you down and ask you about your religion,
[49:43.260 --> 49:50.780]  it could discriminate based on that. Well, it doesn't match the faces that accurately.
[49:50.780 --> 49:55.420]  So again, it's a real risk to somebody to be given a false positive like this.
[49:56.220 --> 50:01.340]  All of us would be at risk even if we're not a criminal. Several US states and dozens of cities
[50:01.340 --> 50:06.860]  have sought to curtail the police use of facial recognition, although the Trump administration
[50:06.860 --> 50:14.060]  is just fine with it. And they want to block or discourage states from regulating AI.
[50:15.260 --> 50:21.500]  You see, if the Trump administration gets its way, you wouldn't be able to pass a state or
[50:21.500 --> 50:24.860]  local ordinance saying, we're not going to let the police use that kind of stuff.
[50:25.500 --> 50:31.180]  It's AI. You've got to get your hands off of my donors' businesses, right? They're free to do
[50:31.260 --> 50:37.260]  anything they wish, just like his friends in the pharmaceutical companies are with the FDA,
[50:37.260 --> 50:42.860]  free to do anything. And so that's what the Trump administration is really pushing for.
[50:43.500 --> 50:49.260]  Same thing that was done to protect the glyphosate model, the Roundup model.
[50:50.060 --> 50:54.140]  The European Union has banned real time public face scanning police technology
[50:54.700 --> 51:00.380]  across the 27-nation bloc, except when used for serious crimes like kidnapping or terrorism.
[51:01.180 --> 51:06.700]  In the UK, authorities started testing the technology on London streets a decade ago,
[51:07.420 --> 51:11.420]  and they've used it to make 1,300 arrests in the past two years. The government is considering
[51:11.420 --> 51:19.420]  expanding its use across the country because the UK wants to be the leader in this kind of
[51:20.380 --> 51:29.020]  Orwellian tyranny. They have seen 1984 as a manual. Axon doesn't make its own AI model
[51:29.020 --> 51:33.740]  for recognizing faces, and they declined to say which one they're using.
[51:36.460 --> 51:41.900]  When we look at the UK, the way they have gone into this, gone over to the dark side,
[51:42.620 --> 51:47.020]  maybe it would be a fitting thing for them to just change the name of the country,
[51:47.660 --> 51:54.460]  especially under Keir Starmer. Remember, under Orwell, it was Ingsoc, right? Like English
[51:54.460 --> 52:00.220]  socialism. And of course, Keir Starmer is a socialist, so just call it Ingsoc.
[52:00.220 --> 52:04.620]  It's also great that they're not relying on their own model. So if something goes wrong and these
[52:04.620 --> 52:09.500]  things start tasing people, they have to then send off to some third party company to go,
[52:09.500 --> 52:13.980]  hey, by the way. Well, what they like about that, it gives them plausible deniability.
[52:13.980 --> 52:18.380]  It wasn't us, it was this other company. And you know, if it's something that's produced by
[52:18.380 --> 52:22.300]  Zuckerberg or Altman or Musk or whatever, you know the Trump administration is going to give
[52:22.300 --> 52:29.500]  them a pass, even if it makes an egregious error there. So they said, about 50 officers
[52:29.500 --> 52:33.900]  piloting the technology won't know if their facial recognition software made a match. The
[52:33.900 --> 52:39.500]  outputs will be analyzed later at the station. However, in the future, it could help police
[52:39.500 --> 52:44.540]  detect if there is potentially a dangerous person nearby so they can call for assistance.
[52:45.500 --> 52:52.700]  And you know, with all of this happening, it's kind of interesting. I went back and watched
[52:52.700 --> 52:58.220]  a little bit of Robocop because in Detroit, they've just erected a Robocop statue.
[52:59.580 --> 53:06.700]  And I thought, why are we honoring this kind of stuff? I mean, Detroit looks awful in that movie.
[53:06.700 --> 53:13.180]  You know, they send in mechanized robots to keep order and to use these heavy guns like
[53:13.180 --> 53:21.100]  Ed 209. Put down the gun. I said, this is kind of like Venezuelan boats, right?
[53:21.100 --> 53:25.820]  Put down the gun. They put down the gun. Now you've got five seconds to put it down. And everybody's
[53:25.820 --> 53:31.180]  scrambling because they know this thing's going to unleash fire. And it just starts shooting
[53:31.180 --> 53:35.020]  them over and over again. So now they're embracing that. I have that in the deck.
[53:35.020 --> 53:39.020]  You do? Yeah, let's play that. There it is.
[53:39.020 --> 53:43.260]  Ed 209.
[53:52.780 --> 53:55.180]  He's probably got facial recognition technology as well.
[53:57.340 --> 54:03.180]  From the TSA. Is 209 the iteration number or the number of rounds it's going to pump into your
[54:03.180 --> 54:05.420]  corpse? I guess.
[54:20.380 --> 54:26.860]  The enforcement droid, series 209. Enforcement droid, Ed. 209 is currently
[54:26.860 --> 54:29.820]  programmed for urban pacification, but that is only the beginning.
[54:30.780 --> 54:36.860]  After a successful tour of duty in old Detroit, we can expect 209 to become the hot military
[54:36.860 --> 54:42.540]  product for the next decade. Dr. McNamara. We'll need an arrest subject. Mr. Kenny. Yes,
[54:42.540 --> 54:47.180]  sir. Would you come up and give us a hand, please? Yes, sir. Mr. Kenny is going to help us
[54:47.180 --> 54:53.740]  simulate a typical arrest and disarming procedure. Mr. Kenny, use your gun in a threatening manner.
[54:53.740 --> 55:01.100]  Point it at Ed 209. Yeah, Ed doesn't care if you threaten a human, just don't threaten it.
[55:07.020 --> 55:11.260]  Please put down your weapon. You have 20 seconds to comply.
[55:13.020 --> 55:14.860]  I think you'd better do what he says, Mr. Kenny.
[55:14.860 --> 55:30.060]  You now have 15 seconds to comply. You are in direct violation of Article 113, Section 9.
[55:30.060 --> 55:33.980]  Engineers are furiously trying to rip out the electronics.
[55:33.980 --> 55:49.820]  We'll cut it at that point. But you get the idea. Pete Hegseth wants to know where he can
[55:49.820 --> 55:58.940]  get one of these things for Venezuela. Can I use that? I have a helicopter. The criminology
[55:58.940 --> 56:03.740]  professor in Alberta says he's not surprised the city is experimenting with live facial recognition.
[56:04.700 --> 56:09.820]  Given that the technology is already ubiquitous in airport security, that's why the TSA is there.
[56:10.780 --> 56:17.340]  It is training for all of us, right? And that's what they're training you for, facial recognition
[56:17.340 --> 56:25.020]  right now. And so, again, they resigned because of the taser-equipped drones, so now they don't have an
[56:25.020 --> 56:29.100]  ethics board, they're free to do this kind of stuff. Well, you had NVIDIA's CEO,
[56:29.900 --> 56:35.660]  Jensen Huang, go on with Joe Rogan with a jaw-dropping AI prediction. He says,
[56:35.660 --> 56:40.700]  in the future, maybe only two or three years from now, ninety percent of the world's knowledge
[56:40.700 --> 56:48.300]  will likely be generated by AI. Well, this is a self-serving prediction, if ever there was one.
[56:49.180 --> 56:53.740]  If he really believes that, why is he having to do the circular financing of other companies in
[56:53.740 --> 56:58.380]  order to keep pushing his stock higher and higher? It seems like the market would take care of that.
[56:59.500 --> 57:07.260]  And so, he's involved in circular financing fraud. And so Rogan says, well, I don't know,
[57:07.260 --> 57:18.780]  that's crazy, he said.
[58:09.660 --> 58:16.060]  Rogan said, yeah, I know, but it's just fine. Why?
[58:16.060 --> 58:21.020]  He goes, well, let me tell you why. Huang said, it's because what difference does it make to me
[58:21.100 --> 58:24.620]  that I'm learning from a textbook that was generated by a bunch of people I didn't know,
[58:25.260 --> 58:29.900]  or knowledge that was generated by AI computers that are assimilating all of these and
[58:29.900 --> 58:34.540]  resynthesizing things? To me, I don't think there's a whole lot of difference. Yeah, as a
[58:34.540 --> 58:39.500]  man, right, you can be propagandized by textbook companies and the school board or the government
[58:39.500 --> 58:45.660]  or whatever. We can be propagandized by our AI. What is the difference? And that's the key thing.
[58:45.660 --> 58:50.860]  You need to look at, you need critical thinking. You need to look at the source and you need to
[58:50.860 --> 58:56.300]  check it out for yourself. And that's true. Before we had AI, a lot of people didn't do it.
[58:56.300 --> 59:01.500]  That's why AI is going to be so much more dangerous because people will just trust it because it's
[59:01.500 --> 59:07.580]  coming from the machine. They're going to assume it's an unbiased source. Like, oh, look at this.
[59:07.580 --> 59:12.780]  It's a robot. It doesn't have an agenda. It's not trying to sell me something. That's right. It
[59:12.780 --> 59:18.220]  removes the people who are trying to do that one layer and people will just forget they exist.
[59:18.220 --> 59:22.700]  Yeah. Yeah. The man behind the curtain thing. So you're interacting with the
[59:22.700 --> 59:26.940]  Wizard of Oz head that's up there, but you don't realize that there's people behind the curtain
[59:27.500 --> 59:33.500]  that have been hired to program their particular biases and things into these issues that they
[59:33.500 --> 59:40.700]  find important. I'm sure Grok was just purely truth-seeking when it said that it would be
[59:40.700 --> 59:46.620]  better for humanity to lose 49% of its population than for Elon Musk to die.
[59:48.300 --> 59:52.140]  These things are purely unbiased truth-seekers.
[59:52.140 --> 59:58.620]  That's right. So again, it is a tool that is ripe for manipulation, says this article,
[59:58.620 --> 01:00:05.180]  and that's right. And that's the real key with it. It's ripe for surveillance and it's ripe
[01:00:05.180 --> 01:00:13.020]  for manipulation. But then again, so are the schools. So are the textbooks. So is TV. So is
[01:00:13.020 --> 01:00:19.820]  movies. So is social media. These are all tools that are ripe for manipulation. So in that regard,
[01:00:19.820 --> 01:00:27.020]  AI is no different from them. It's just that people have, over time, some people have got
[01:00:27.020 --> 01:00:33.100]  their guard up for these other forms of manipulation and propaganda. AI is going to come in from a
[01:00:33.100 --> 01:00:41.100]  different way. In a rare show of spine, and this is all critical, right? This is coming from Steve
[01:00:41.100 --> 01:00:47.260]  Watson, and he's rightfully critical of this and skeptical of this. But then listen to this. He
[01:00:47.260 --> 01:00:54.060]  says, however, in a rare show of spine from Big Tech, Huang declared President Trump to be our
[01:00:54.060 --> 01:01:00.540]  president and cheered him on. How is that a show of spine, Watson? I don't get it.
[01:01:01.100 --> 01:01:05.580]  Look, this evil scumbag is saying Trump is his president. Isn't that wonderful?
[01:01:05.580 --> 01:01:11.340]  But you know, he is a sycophant and he just came from a meeting with Trump where he's looking to
[01:01:11.340 --> 01:01:19.580]  make money for his business. And these guys know that Trump is their ally. So how are Big Tech
[01:01:19.580 --> 01:01:25.500]  and the Democrats all good now? For somebody like Steve Watson, they are so embedded
[01:01:25.500 --> 01:01:35.020]  in this because they are now kowtowing to the Trump cult. He's now got a spine? It's just the opposite.
[01:01:35.900 --> 01:01:40.060]  Huang looked straight at Joe Rogan and said, President Trump is my president. He is our president.
[01:01:40.620 --> 01:01:46.860]  Just because it's President Trump, many want him to be wrong. I think the U.S., we all have to realize
[01:01:46.860 --> 01:01:52.140]  that he is our president and we want him to succeed because it helps everybody, all of us,
[01:01:52.140 --> 01:01:58.380]  to succeed. Well, he certainly is helping all of the AI technocrats to succeed. Isn't Jensen
[01:01:58.380 --> 01:02:05.340]  Huang Taiwanese anyway? Yeah, yeah. Yeah, again, that's dual citizenship, I guess.
[01:02:07.020 --> 01:02:14.220]  But he is his president. If he's going to give him massive subsidies, protect him from any
[01:02:14.220 --> 01:02:24.220]  restrictions in terms of his business, this is what is happening here. So again, he really focuses
[01:02:24.220 --> 01:02:29.100]  and so do other people. It's not just Steve Watson. He's taken this article from a thing
[01:02:29.100 --> 01:02:37.260]  that's put up by Vigilant Fox. These people, they do the articles, they do the posts simply
[01:02:37.260 --> 01:02:40.460]  because somebody said something good about Trump. Look, there's a powerful person that says something
[01:02:40.460 --> 01:02:47.100]  good about Trump and we want Trump to succeed because Trump is our success as well. Trump is
[01:02:47.100 --> 01:02:53.740]  the success of people like Vigilant Fox and Steve Watson, just like he's the success of the technocrats
[01:02:54.220 --> 01:02:58.380]  who are going to be getting the government subsidies for these projects and who are going
[01:02:58.380 --> 01:03:03.900]  to be protected from any regulation at the state or local level because of Trump. The remarks come
[01:03:03.900 --> 01:03:10.060]  amid Huang's whirlwind DC tour, where he was bowing and scraping before all these people who are going
[01:03:10.060 --> 01:03:16.700]  to take your money, take your freedom, take your dignity and hand it to these billionaire technocrats.
[01:03:16.700 --> 01:03:26.700]  He huddled with Trump and Senate Republicans to slash export red tape on AI chips, warning
[01:03:26.700 --> 01:03:33.020]  that, here it is, patchwork state regulations could cripple U.S. dominance. They always call
[01:03:33.020 --> 01:03:36.940]  it that, patchwork state regulations. We don't want to have patchwork regulations. We don't want
[01:03:36.940 --> 01:03:41.980]  to have a different approach in different states. No, we've got to have one ring to rule them all
[01:03:41.980 --> 01:03:47.500]  and that's going to be coming out of Washington. That gang will tell everybody and this is a
[01:03:47.500 --> 01:03:53.660]  violation of the 10th Amendment, what Trump is pushing for, pushing against patchwork state
[01:03:53.660 --> 01:04:00.060]  regulations. Where does it say in the Constitution that you can subsidize these companies? Where does
[01:04:00.060 --> 01:04:06.780]  it say in the Constitution that we can't have any control over what these companies do in our state?
[01:04:07.740 --> 01:04:13.180]  As a matter of fact, it says just the opposite. So, he's there lobbying for protection from
[01:04:13.180 --> 01:04:18.540]  competition and regulation, lobbying for Trump to violate the 10th Amendment and you'll get what he
[01:04:18.540 --> 01:04:25.580]  wants. Trump's energy push is defying the green zealots, he says. That's what Steve Watson says.
[01:04:26.220 --> 01:04:32.140]  This energy push for AI. Let me tell you something. People are angry because they see the power rates
[01:04:32.140 --> 01:04:36.940]  going up because of this green grift that is out there. Oh, we can only generate power
[01:04:37.740 --> 01:04:44.540]  that is created with new devices made by my corporate sponsors. Well, guess what? The
[01:04:44.540 --> 01:04:51.740]  corporate sponsors of Trump are going to be building these, causing massive disruption of the grid
[01:04:52.540 --> 01:04:58.860]  in order to feed their AI data centers and this AI energy grid requirement is going to drive your
[01:04:58.860 --> 01:05:05.980]  prices up further and faster than any of the Green New Deal stuff. That's the bottom line for us.
[01:05:05.980 --> 01:05:11.660]  You want to pay more for electricity and have less of it? Well, the Democrats have a plan for that.
[01:05:11.660 --> 01:05:17.420]  It's called solar power and windmills. If you want to pay more for electricity and have less of it,
[01:05:17.420 --> 01:05:23.740]  the Republicans have a plan for that. It's called AI data centers. Huang's line that there's no
[01:05:23.740 --> 01:05:32.540]  difference between what is coming from the AI and what is coming from somebody writing a textbook,
[01:05:33.900 --> 01:05:40.460]  Watson says, ignores how these ghosts erode the soul, the authenticity, and erode jobs,
[01:05:41.100 --> 01:05:46.860]  paving the way for a world that is scripted by code, not by creators. He talks about that in the
[01:05:46.860 --> 01:05:55.180]  context of Solomon Ray, a chart-topping singer that is just done by AI.
[01:05:56.220 --> 01:06:01.980]  Huang's vision thrills, but it demands guardrails. We don't even have any guardrails on Trump.
[01:06:03.420 --> 01:06:11.340]  We're not going to get guardrails on his corporate sponsors. So it is, as all this is happening,
[01:06:11.980 --> 01:06:18.780]  just to put this in perspective of this omnipotent AI, it is a real threat because it is going to be
[01:06:19.740 --> 01:06:26.060]  combined with government. That's the real threat, the surveillance to control the propaganda
[01:06:27.260 --> 01:06:32.060]  and the auditing of all of us all the time. But when it comes to things like self-driving cars,
[01:06:33.100 --> 01:06:37.020]  they're having difficulty getting through the Chick-fil-A drive-through.
[01:06:37.980 --> 01:06:42.540]  And some of them have gotten stuck in it. And so there's going to be an app for that.
[01:06:43.580 --> 01:06:47.820]  One person looked at this and said, oh, it's a business opportunity. They've come up with a
[01:06:49.180 --> 01:06:55.500]  startup company called AutoLane. And what they want to do is develop a kind of air traffic control
[01:06:55.500 --> 01:07:02.300]  system that will be specific to a particular business. So you get people to come to your
[01:07:02.300 --> 01:07:08.300]  Chick-fil-A drive-through if Chick-fil-A does a thing with AutoLane. And the people who don't
[01:07:08.300 --> 01:07:13.020]  drive cars who are being driven around in self-driving cars can tell it to go to Chick-fil-A
[01:07:13.020 --> 01:07:18.060]  and they'll be able to navigate there without getting stuck. And so they're looking at selling
[01:07:18.060 --> 01:07:26.140]  this to a lot of big box retailers, a lot of fast food chains, and even mentioned selling it to some
[01:07:26.140 --> 01:07:31.260]  of the big real estate investment trusts that are managing shopping centers or things like that.
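To put the "air traffic control for a private lot" idea in concrete terms: a central service admits cars into the drive-through lane while there is capacity and tells the rest to hold at the entrance. This is only an illustrative sketch of the concept; AutoLane's actual product is proprietary, and every class, method, and car name here is invented.

```python
# Toy sketch of a drive-through "air traffic control" coordinator of the
# kind described for AutoLane: one service hands out lane slots to arriving
# self-driving cars so they never contest the same stretch of lot.
from collections import deque


class LotCoordinator:
    def __init__(self, lane_capacity: int):
        self.lane_capacity = lane_capacity
        self.in_lane: list[str] = []        # cars currently in the drive-through
        self.waiting: deque[str] = deque()  # cars told to hold at the entrance

    def request_entry(self, car_id: str) -> str:
        """A car asks to enter; it is admitted or told to hold."""
        if len(self.in_lane) < self.lane_capacity:
            self.in_lane.append(car_id)
            return "PROCEED"
        self.waiting.append(car_id)
        return "HOLD"

    def report_exit(self, car_id: str) -> None:
        """A car leaves the lane, freeing a slot for the next holder."""
        self.in_lane.remove(car_id)
        if self.waiting:
            self.in_lane.append(self.waiting.popleft())


coord = LotCoordinator(lane_capacity=2)
print(coord.request_entry("car-1"))  # PROCEED
print(coord.request_entry("car-2"))  # PROCEED
print(coord.request_entry("car-3"))  # HOLD
coord.report_exit("car-1")           # car-3 is now admitted to the lane
```

The point of the sketch is only that the coordination layer sits outside the vehicle: the car stack never has to understand the lot, it just obeys PROCEED or HOLD.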
[01:08:32.300 --> 01:08:37.820]  And that's where he sees his market. He said, we don't work on public streets and we don't work
[01:08:37.820 --> 01:08:41.820]  with public parking spots. So what he wants to do is he wants to partner with these private
[01:08:41.820 --> 01:08:47.340]  businesses so they can say that they are self-driving car friendly. This is the
[01:08:47.340 --> 01:08:53.820]  pathetic world that we are headed into here. We've gone from London taxi drivers who could
[01:08:53.820 --> 01:09:01.260]  keep the destinations in London in their head and had this massive part of their brain, whatever
[01:09:01.260 --> 01:09:05.900]  it was. I don't remember. Hippocampus? Yeah, it might have been hippocampus. I don't know
[01:09:05.900 --> 01:09:10.220]  which part got larger actually. I started to say it, but I don't know if that was the part
[01:09:10.220 --> 01:09:15.020]  that got larger. But we have our shrinking brains because our responsibilities are shrinking and
[01:09:15.020 --> 01:09:20.060]  we're using them less. And so it turns out they said American roads are not too friendly to
[01:09:20.060 --> 01:09:24.300]  self-driving cars and they're not friendly to pedestrians. You can tell this is coming
[01:09:24.300 --> 01:09:31.340]  from the perspective of an urban planner. They love cities. They love people walking. They hate
[01:09:31.340 --> 01:09:36.060]  cars because cars are used by people to get out of the cities as fast as they can.
[01:09:37.180 --> 01:09:41.420]  They want to keep you in the city. There's a big difference between the London streets and
[01:09:41.420 --> 01:09:45.980]  memorizing all that and being able to navigate a Chick-fil-A parking lot drive-through.
[01:09:47.980 --> 01:09:52.620]  That's right. The founder described the company as one of the first application layer companies in
[01:09:52.620 --> 01:09:56.380]  the self-driving vehicle industry. He says we're not going to build the car. We're not going to
[01:09:56.380 --> 01:10:01.420]  navigate on the road. What we would do is we'd have a special app that gets layered on top of it.
[01:10:02.220 --> 01:10:05.260]  We aren't the fundamental models. We're not building the cars, doing anything like that.
[01:10:05.260 --> 01:10:10.060]  We're simply saying as the industry grows, has exponential rates, someone is going to have to
[01:10:10.060 --> 01:10:14.460]  sit in the middle and orchestrate, coordinate, and kind of evaluate what's going on. When I saw
[01:10:14.460 --> 01:10:19.740]  this, like air traffic control, I remember a discussion that we had, Eric Peters and I,
[01:10:19.740 --> 01:10:25.900]  years ago when we were wargaming out where this AI thing is headed for self-driving cars.
[01:10:27.340 --> 01:10:34.540]  Eric was right. He said these things don't handle interaction with human beings that well.
[01:10:34.540 --> 01:10:37.660]  So we're going to have to eliminate the human beings because that's our first priority
[01:10:37.660 --> 01:10:42.060]  is to get the AI and the self-driving stuff out there. So if there's a problem between
[01:10:42.780 --> 01:10:47.420]  AI and humans, the humans have to go, which means human drivers have to go. He said you stop and
[01:10:47.420 --> 01:10:54.860]  think about it. You have air traffic control at the airports to make sure these planes don't collide,
[01:10:54.860 --> 01:11:04.380]  and they keep big distances between themselves. Big distances vertically as well as in their
[01:11:04.380 --> 01:11:12.060]  same plane. So he said, how's that going to work with the self-driving cars?
[01:11:12.700 --> 01:11:16.860]  You're going to have to get most of the cars off the road, and/or they're all going to have
[01:11:16.860 --> 01:11:20.860]  to be self-driving cars so they can communicate with each other. If they can communicate with
[01:11:20.860 --> 01:11:25.820]  each other, you can get them doing the, I forget what they call it, it's like a caravanning thing
[01:11:25.820 --> 01:11:30.620]  or something where they get, the cars get right up against each other bumper to bumper because
[01:11:30.700 --> 01:11:36.460]  they're communicating simultaneously and whatever the front car sees, it can instantaneously
[01:11:37.580 --> 01:11:43.100]  apply that to all the cars in the row. And so it's like caravanning or something like that.
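What's being described here is usually called platooning: the lead vehicle broadcasts its braking decisions over a radio link, so followers react in milliseconds instead of the second or more a human needs, and that reaction-time difference is what lets them run bumper to bumper. A rough back-of-the-envelope sketch, with all numbers purely illustrative:

```python
# Toy model of why platooned, communicating cars can follow closer than
# human drivers: the following gap must at least cover the distance
# traveled during the follower's reaction delay, before braking starts.

def min_gap_m(speed_mps: float, reaction_s: float) -> float:
    """Distance covered before the follower even starts braking."""
    return speed_mps * reaction_s

highway_speed = 30.0   # meters/second, roughly 108 km/h
human_reaction = 1.5   # typical human perceive-and-react time, seconds
radio_reaction = 0.02  # lead car broadcasts its brake command, ~20 ms

print(f"human follower:    {min_gap_m(highway_speed, human_reaction):.1f} m buffer")   # 45.0 m
print(f"platooned follower: {min_gap_m(highway_speed, radio_reaction):.1f} m buffer")  # 0.6 m
```

Real platooning systems add braking-capability margins on top of this, but the two orders of magnitude between the reaction buffers is the whole pitch.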
[01:11:43.100 --> 01:11:50.460]  But they sell that as a feature once they get all the humans off the road. And so now they're
[01:11:50.460 --> 01:11:54.220]  starting to talk about the air traffic control model. Yeah, we're going to have complete control
[01:11:54.220 --> 01:11:59.420]  of all the cars here. Well, just guess what? You know, when they set this thing up and they've got
[01:11:59.420 --> 01:12:04.780]  all the self-driving cars going through the drive-through, it's not going to be very friendly
[01:12:04.780 --> 01:12:09.660]  for you. And so they're gradually going to squeeze you out of it. I think another important thing
[01:12:09.660 --> 01:12:18.380]  to focus on is just you have a right to travel. You have a right to freely travel without
[01:12:18.380 --> 01:12:23.500]  impediment. Eventually, in my opinion, they've been telling us for the longest time, you need
[01:12:23.500 --> 01:12:28.380]  to have a driver's license because driving is a privilege. It's not a privilege. It's a right. I
[01:12:28.380 --> 01:12:32.300]  mean, if you're doing it commercially, they can regulate it. They should not be regulating anything.
[01:12:32.300 --> 01:12:37.420]  We shouldn't have to have driver's licenses to drive around. I'm with the guys who are the sovereign
[01:12:37.420 --> 01:12:42.060]  citizens pushing back against this. I just know, however, that you're not going to win in court
[01:12:42.060 --> 01:12:47.500]  because the courts are rigged. So don't go down that road. But anyway, the right principle. Yeah.
[01:12:47.500 --> 01:12:53.420]  If you focus on the fact that they're unsafe, that they do stupid things, eventually they will reach
[01:12:53.420 --> 01:12:59.740]  a point where they don't anymore. These things will eventually probably become statistically safer
[01:12:59.740 --> 01:13:05.260]  than the average driver because of the number of idiots we have on the road. And if you focus on
[01:13:05.260 --> 01:13:09.900]  the safety aspect, eventually that'll go away and you won't have an argument anymore. You have to
[01:13:09.900 --> 01:13:16.140]  focus on the fact that it is your right as a human being to travel and drive yourself and control
[01:13:16.140 --> 01:13:21.580]  your own destiny in that sense. The freedom and dignity, you know. And again, when you look at
[01:13:21.580 --> 01:13:28.220]  human drivers, how much of the ding against human drivers is really a ding against drunk
[01:13:28.220 --> 01:13:34.060]  drivers, right? Or third-worlders that don't speak English. Yeah. I'm tired of being lumped
[01:13:34.060 --> 01:13:39.020]  in with the drunk drivers and having to be stopped on the road to make sure that I'm sober.
[01:13:39.980 --> 01:13:43.580]  And so what they're doing is they're lumping me in with the drunk drivers again
[01:13:44.940 --> 01:13:50.940]  to say that the machines are safer. They had a Waymo this year that got stuck in one of
[01:13:50.940 --> 01:13:55.900]  Chick-fil-A's fast food cul-de-sacs. Couldn't find its way out, but that's nothing new actually.
[01:13:55.900 --> 01:14:02.140]  They're getting stuck in a lot of different places that are there. So yeah. I've told this
[01:14:02.140 --> 01:14:06.620]  story before, but one of the last times we went to North Carolina is to visit some friends.
[01:14:06.620 --> 01:14:11.100]  As we're coming back, I looked over and there's a woman in a Tesla. She's got her phone in her hand
[01:14:11.100 --> 01:14:16.380]  and she's picking her nose with the other one and she is just completely checked out. She's
[01:14:16.380 --> 01:14:21.340]  not looking at the road. She's not paying any attention. And I personally can believe that
[01:14:21.340 --> 01:14:27.980]  possibly the self-driving feature on that car is more attentive and better equipped than she is.
[01:14:27.980 --> 01:14:32.140]  Well, if she didn't have self-driving, she'd have to at least have one hand on the wheel.
[01:14:32.140 --> 01:14:36.220]  She'd have to pick which one she wants to do. You want to look at your phone or you want to pick
[01:14:36.220 --> 01:14:40.220]  your nose and drive? Pick my nose or pick my phone? Which one do I do? The thing is they know that's
[01:14:40.220 --> 01:14:46.540]  not a good driver currently. So they say, oh, well, you've got to be alert and aware and ready
[01:14:46.540 --> 01:14:51.420]  to take over when it inevitably tries to kill someone. But these people just say, oh, well,
[01:14:51.420 --> 01:14:56.540]  it's going to drive itself. So therefore I can play on my phone and pick my nose and not worry
[01:14:56.540 --> 01:15:00.540]  about any of it. And that's the worst possible circumstance under which it can throw it back
[01:15:00.540 --> 01:15:04.540]  to you. You have an emergency that's quickly developing on the highway. Here, you take the
[01:15:04.540 --> 01:15:11.580]  wheel. That's what happened. I have royally screwed up everything. I have made a horrible
[01:15:11.580 --> 01:15:17.420]  mistake. Here you go. Enjoy your last three seconds of life. I've turned into oncoming
[01:15:17.420 --> 01:15:23.340]  traffic. This is a disaster. I am so sorry. That's right. And then, you know, Tesla looks at it and
[01:15:23.340 --> 01:15:29.180]  says, well, it was under manual control when the accident occurred. That was the case of that
[01:15:29.180 --> 01:15:35.740]  woman who was killed in Phoenix, right? She was a homeless woman pushing a grocery cart across the
[01:15:35.740 --> 01:15:41.260]  road in the dark. And the person who was a human driver couldn't see her. She was jaywalking.
[01:15:41.260 --> 01:15:46.540]  Probably would have hit her anyway. But everybody was saying, why didn't the AI put on the brakes?
[01:15:46.540 --> 01:15:53.340]  And I said, well, because it kept deploying these emergency brakes without there being a reason. And
[01:15:53.340 --> 01:15:58.460]  it got really dangerous. So we turned off the emergency braking system. And so it saw this
[01:15:58.460 --> 01:16:05.420]  person at the last minute and throws it back to the woman. And she's playing with her phone or
[01:16:05.420 --> 01:16:11.180]  whatever. And she can't handle it either. Well, Google's AI has deleted a user's entire hard drive.
[01:16:12.380 --> 01:16:17.820]  That's how they get the metrics that show that these things are so safe: they always
[01:16:17.820 --> 01:16:22.940]  throw them over and don't count it as an accident from the car. It's an accident from the driver.
[01:16:22.940 --> 01:16:30.220]  That's right. Not my responsibility, right? So yeah, Google AI has now deleted a user's
[01:16:30.220 --> 01:16:35.660]  entire hard drive. You know, we had this story once before, and it was an entire company. Remember
[01:16:35.660 --> 01:16:41.580]  that? I've just deleted everything, all of your business records, all of your customer records,
[01:16:41.580 --> 01:16:46.380]  everything. I did it. Yeah, I'm sorry I did it. You know, that's what this woman is saying.
[01:16:46.380 --> 01:16:48.940]  You're right. You even told me not to do that.
[01:16:48.940 --> 01:16:51.740]  Yeah, you're right. Yeah, you said don't do that, but I did it anyway.
[01:16:53.020 --> 01:16:58.700]  I cannot express how sorry I am that I've deleted all your data. Well, we can only hope
[01:16:58.700 --> 01:17:06.140]  that that happens once they give the government databases to the AI. Perhaps it'll just delete
[01:17:06.140 --> 01:17:11.260]  it all. That would be nice, wouldn't it? We can hope and dream. Yeah, we're going to take
[01:17:11.260 --> 01:17:15.340]  a quick break, folks, and we will be right back.
[01:19:07.740 --> 01:19:32.780]  You're listening to the David Knight show.
[01:20:07.740 --> 01:20:30.780]  And now the David Knight show.
[01:20:31.100 --> 01:20:37.660]  If you like the Eagles, the Cars, and Huey Lewis and the News,
[01:20:40.700 --> 01:20:47.660]  you'll love the classic hits channel at APS radio. Download our app or listen now at APSradio.com.
[01:20:49.340 --> 01:20:53.340]  Well, welcome back folks. We've got a lot of comments. Stealth Patriot, thank you very much
[01:20:53.340 --> 01:20:58.300]  for the tip. He says, do you think the AI police surveillance state and self-driving cars is the
[01:20:58.380 --> 01:21:02.860]  infrastructure the Trump supporters thought they were promised? I'll bet they're tired of winning.
[01:21:02.860 --> 01:21:08.700]  haven't seen any of them put this stuff up and say, I voted for this. I voted for ED-209.
[01:21:13.500 --> 01:21:18.140]  No, I didn't. But I'm afraid that's what we're going to get. That's why. We don't have
[01:21:18.140 --> 01:21:24.700]  that on the board anymore, do we? That Apocalypse Now thing, the animation of the Trump meme.
[01:21:24.780 --> 01:21:31.100]  I literally just took it out yesterday. That's why I went with that because it's not just the
[01:21:31.100 --> 01:21:35.740]  wars that he's starting unnecessarily, but it's the war that he wants to have domestically.
[01:21:37.020 --> 01:21:40.940]  And I think when you look at what's going on in Venezuela and you look at these flimsy lies that
[01:21:40.940 --> 01:21:46.380]  they're putting out, well, these people are running drugs and that's a threat. That's a
[01:21:46.380 --> 01:21:52.300]  violent threat to us. That is as absurd folks as the left saying to you that speech is violence.
[01:21:53.020 --> 01:21:59.420]  Drugs are not violence. Drugs are a black market. And when you create a black market monopoly,
[01:21:59.420 --> 01:22:06.140]  you will get violent gangs who will compete with each other. And yet they're using that to say
[01:22:06.140 --> 01:22:13.100]  that it is violence. It's their prohibition that is violence. The drugs are harmful and I don't
[01:22:13.100 --> 01:22:18.220]  recommend anybody take them. I just know that we already had this experiment once we did it legally
[01:22:18.220 --> 01:22:23.500]  with alcohol and it was a massive failure. But he's using that. If you use those arguments,
[01:22:23.500 --> 01:22:28.700]  they're being used by the Pentagon. Those same arguments could be used and will be used, I think,
[01:22:29.260 --> 01:22:34.700]  to do violence on the street to people without due process in the same way that his hero,
[01:22:35.500 --> 01:22:40.620]  Duterte in the Philippines, did that on the streets of the Philippines. He wants to do that here.
[01:22:41.500 --> 01:22:44.860]  Go ahead, read this. And when you gave me that
[01:22:44.860 --> 01:22:52.940]  ED-209 clip to put in, I thought that was in reference to the attacks on the drug boats
[01:22:52.940 --> 01:23:01.100]  allegedly after they dropped the drugs. Yeah, you have five seconds to drop the
[01:23:01.100 --> 01:23:05.020]  cocaine and get off the boat. Are you trying to float in the river? Yeah.
[01:23:05.020 --> 01:23:10.060]  We will open fire in 40 minutes. Yeah. So evidently from what we're told,
[01:23:10.060 --> 01:23:13.820]  the only way these people could have not been killed was if they decided that they were going
[01:23:13.820 --> 01:23:21.900]  to swim back to shore. If they tried to float on the boat, then that's a threat. Crazy.
[01:23:22.780 --> 01:23:28.540]  Alien Poop Evolution says cockroach eats bait poison. Man eats cockroach. Could happen in
[01:23:28.540 --> 01:23:35.260]  any restaurant. Thankfully, I'm pretty sure that the quantity of poison in a roach would not
[01:23:35.260 --> 01:23:41.180]  actually negatively impact you based on your size. However, just gross, gross.
[01:23:41.180 --> 01:23:45.340]  It was also a roach-eating contest, though. You could get enough of those guys with poison in you.
[01:23:47.500 --> 01:23:50.620]  Hopefully, they weren't just out there collecting roaches off the ground. Hopefully,
[01:23:50.620 --> 01:23:55.500]  these were specifically procured roaches. Since this is a reptile store, I'm assuming
[01:23:55.500 --> 01:23:59.900]  that these are like the Madagascar cockroaches because it said they were three to four inches
[01:23:59.900 --> 01:24:02.940]  big. Maybe some kind of particularly bred
[01:24:02.940 --> 01:24:06.220]  cockroach that these reptiles like to eat. Yeah.
[01:24:06.220 --> 01:24:14.620]  We have Owen61 saying, Somali appetizers. Delicious. Assyrian girl. Should have let the
[01:24:14.620 --> 01:24:21.180]  python eat the bugs and himself eat the python. Fairly certain. Fairly certain the pythons have
[01:24:21.180 --> 01:24:25.100]  enough sense to not be eating cockroaches. I think they go for something that's higher
[01:24:25.100 --> 01:24:30.060]  up the food chain, like people. If it's a Burmese python, who knows?
[01:24:30.700 --> 01:24:35.420]  Those can get large enough that they can pose a threat. However, your average python.
[01:24:35.420 --> 01:24:39.340]  Since it was Florida and since they've got such a problem now with the Burmese python,
[01:24:39.340 --> 01:24:43.660]  I'm assuming that it was a Burmese python or something. Maybe they've outlawed those now.
[01:24:43.660 --> 01:24:49.580]  I think they may have. Well, I know for a fact that as a general rule, if you're going to keep
[01:24:49.580 --> 01:24:54.300]  a Burmese python, you need a specifically set up enclosure because that thing is going to get
[01:24:54.300 --> 01:24:59.260]  massive. If you don't have one, you are eventually just going to end up getting rid of it and
[01:24:59.260 --> 01:25:04.620]  probably releasing it into the Everglades. Narrow way, narrow gate ministries. How disgusting.
[01:25:04.620 --> 01:25:10.380]  Cockroaches are filled with all sorts of bacteria and diseases. Under the Levitical laws,
[01:25:10.380 --> 01:25:13.900]  Levitical eating laws, only locusts and grasshoppers are clean to eat.
[01:25:13.900 --> 01:25:17.340]  All other flying creeping things are unclean and you shall not eat them.
[01:25:17.340 --> 01:25:21.260]  That's what I say. I always tease my family because they like lobster.
[01:25:21.820 --> 01:25:27.980]  And I said, I don't eat water filters. That's what they are, you know. It's in the Levitical law.
[01:25:27.980 --> 01:25:33.180]  Some of the other things I think it's kind of interesting. How did Moses know that these
[01:25:33.180 --> 01:25:38.620]  things that are scavengers that are eating waste and anything like, you know, cockroaches or,
[01:25:39.740 --> 01:25:44.300]  you know, the shellfish and things like that, how did he know that that would be harmful for you?
[01:25:44.860 --> 01:25:48.860]  So you can look at it and say, well, I'm told I can't do this. Or the other way you can look at
[01:25:48.860 --> 01:25:52.460]  it is, you know, God is telling them, you know, don't eat this stuff and you won't get the
[01:25:52.460 --> 01:25:58.140]  diseases that the Egyptians get when they eat this kind of stuff. Stay away from the water filters.
[01:25:59.100 --> 01:26:00.940]  Mmm, delicious water bugs.
[01:26:03.020 --> 01:26:07.820]  High Boost. New Stephen King movie concept: Christine, but it's an AI smart fridge.
[01:26:10.620 --> 01:26:15.260]  Yeah, it works for ICE. Yeah. Beware of your smart refrigerators. They work for ICE. Yeah.
[01:26:15.260 --> 01:26:20.300]  It seems like you're buying a lot of tamales there, friend. Perhaps we need to report you.
[01:26:20.300 --> 01:26:24.940]  I mean, I think the AI smart fridges are already about as evil as they could possibly be.
[01:26:25.500 --> 01:26:30.220]  Yeah. They're already spying on you. They're doing everything against you that they have the
[01:26:30.220 --> 01:26:34.460]  capability to do, except for spoiling your food.
[01:26:34.460 --> 01:26:42.140]  You know, it was about a decade ago that Betrayus, Petraeus, I've called him Betrayus so much,
[01:26:42.940 --> 01:26:49.740]  but Petraeus went from the military to the CIA and he made that statement. He said your
[01:26:49.740 --> 01:26:53.260]  refrigerator is going to be smart and they're going to be spying on you, that type of thing.
[01:26:53.260 --> 01:26:57.180]  We talked about that and everybody, oh, you conspiracy theorists and everything. It wasn't
[01:26:57.180 --> 01:27:01.660]  a conspiracy theory. It was a conspiracy, but it wasn't a theory. He had said they were going to
[01:27:01.660 --> 01:27:06.300]  do it and now we see it everywhere, don't we? It's amazing. Real Jason Barker says,
[01:27:06.300 --> 01:27:11.180]  my wife wants a new TV and we cannot find one that does not have the smart features anywhere.
[01:27:11.180 --> 01:27:18.700]  Yeah. It's a huge nuisance. They're completely and utterly just, they don't do anything useful.
[01:27:18.700 --> 01:27:23.580]  They're obnoxious. They get in the way. You're going to have to go back to an old CRT TV if
[01:27:23.580 --> 01:27:27.180]  you want to avoid them at this point. Yeah. I was going to say you just make
[01:27:27.180 --> 01:27:30.220]  sure that it's not connected to the internet, but unlike your thermostat or something like that,
[01:27:30.220 --> 01:27:34.300]  you need to connect the TV to the internet. That's the problem. I got you there.
[01:27:35.420 --> 01:27:41.260]  I'm becoming convinced that 4:3 is actually the superior aspect ratio for TV viewing.
[01:27:41.980 --> 01:27:44.540]  Why is that? It's cozier. It focuses the view.
[01:27:44.540 --> 01:27:48.540]  You don't have all this extraneous information on the outside of the screen.
[01:27:48.540 --> 01:27:51.820]  If you're looking for something like an IMAX that's a spectacle, maybe that's what you want.
[01:27:51.820 --> 01:27:56.540]  But for TV shows, it's a bit cozier. It's a bit comfier. You've got your little cast there
[01:27:57.100 --> 01:28:00.860]  and you're focused on them. You don't have to worry about all this nonsense on the periphery.
[01:28:00.860 --> 01:28:03.500]  Thought you were going to say it's because 4:3 doesn't spy on you.
[01:28:06.460 --> 01:28:09.100]  Yeah. I prefer the black and white stuff, actually, if I'm watching TV.
[01:28:10.220 --> 01:28:15.100]  Aesthetic reasons. Real Jason Barker, all the new TVs listen to you. They have Alexa or other
[01:28:15.100 --> 01:28:18.940]  voice functions. I hate talking to robots. I refuse to.
[01:28:19.900 --> 01:28:24.220]  Goldsmith. I remember reading that Charlie Brown was based on Charles Schulz's own younger days
[01:28:24.220 --> 01:28:27.420]  and personality and that he eventually married that red-haired girl. Very nice.
[01:28:27.420 --> 01:28:30.140]  Oh, that's great. Good for him.
[01:28:30.140 --> 01:28:37.740]  Yeah. He was a cool guy. I liked him a lot. Yeah. Yeah. Very relaxed guy. What was the guy? Mr.
[01:28:37.740 --> 01:28:39.660]  Roberts or something. Mr. Rogers?
[01:28:39.660 --> 01:28:42.380]  Rogers. Yeah. Mr. Rogers' Neighborhood.
[01:28:42.460 --> 01:28:48.540]  Yeah. Yeah. As a matter of fact, they have brought him back with AI so that he's doing all kinds of
[01:28:49.580 --> 01:28:52.380]  things that really the original character would not do.
[01:28:53.820 --> 01:28:58.540]  So he's part of the... As Sora was coming out, they were doing all these things with
[01:29:00.300 --> 01:29:05.100]  Stephen Hawking doing races in his wheelchair and things like that.
[01:29:05.100 --> 01:29:05.660]  Donuts and what have you.
[01:29:05.660 --> 01:29:12.300]  The stuff that they did with Mr. Rogers was, I think, even funnier. Go ahead.
[01:29:13.180 --> 01:29:17.820]  Brian and Deb McCartney says, you cannot reason with a robot. That's right. You just have to
[01:29:17.820 --> 01:29:23.980]  put the weapon down. Goldsmith says, did you see the Waymo cars that have been passing school buses
[01:29:23.980 --> 01:29:30.460]  that are releasing kids? Time for a code check. That's right. I guess it doesn't recognize the
[01:29:31.420 --> 01:29:35.980]  law or the yellow paint because that's what keeps the school buses safe, right? You don't
[01:29:35.980 --> 01:29:40.220]  have to have seat belts. There's no safety devices in there. There's no airbags, no seat belts,
[01:29:40.220 --> 01:29:44.780]  nothing. It's just they're covered with yellow paint and they're covered with laws.
[01:29:44.780 --> 01:29:49.980]  And maybe it's hard for it to see it. They've had these things keep hitting things. It's interesting.
[01:29:49.980 --> 01:29:54.700]  It's almost like somebody is sabotaging them. They have a propensity to hit fire trucks,
[01:29:54.700 --> 01:30:00.220]  police trucks, and to threaten school buses. But it's okay. They're safer than we are,
[01:30:00.220 --> 01:30:05.420]  right? And we should have more of them. Real Jason Barker says, do the AI and data
[01:30:05.420 --> 01:30:09.980]  centers actually consume the water or just require initial filling of a closed loop system
[01:30:09.980 --> 01:30:19.660]  like your car uses? Yeah, I don't know. They're using it for cooling and they put these power
[01:30:19.740 --> 01:30:25.900]  plants on the edge of bodies of water for quite some time to recycle it through. So I don't
[01:30:25.900 --> 01:30:32.540]  really know. It seems like you'd be able to cover that. But who knows? Well, I saw something that
[01:30:32.540 --> 01:30:39.020]  was saying it's different from just power plants because these things require cleaner water. So
[01:30:39.020 --> 01:30:46.300]  it's essentially taking up water that has been purified and treated that could be used as drinking
[01:30:46.300 --> 01:30:52.220]  water and running it through their system where I suppose it evaporates off and then they have to.
[01:30:52.220 --> 01:30:56.300]  Or maybe it's no longer drinking water. And so that's what they mean by consuming water,
[01:30:56.300 --> 01:31:01.580]  right? So you had some purified water that had been treated or something and had fluoride in it.
[01:31:03.580 --> 01:31:08.620]  What happens when the AI centers consume fluoride? Do they get stupid as well? I don't know.
[01:31:08.620 --> 01:31:14.460]  I can't wait for the tech cults to emerge and they'll just be selling you the holy water that
[01:31:14.460 --> 01:31:19.740]  was used to cool the AI data center. Here's golf legend, John Daly. Hell yeah. These wins
[01:31:19.740 --> 01:31:23.900]  are piling up faster than my divorces. I only spin on Modo, America's social casino. You know,
[01:31:23.900 --> 01:31:28.700]  I've won a couple of majors and on Modo, I've won majors, grands and epic jackpots on their classic
[01:31:28.700 --> 01:31:33.420]  Vegas slots with huge, huge bonus rounds. Modo Casino adds new games and awards players free
[01:31:33.420 --> 01:31:38.060]  coins every single day. Grip it and spin it on Modo Casino. Download the Modo Casino app today.
[01:31:38.060 --> 01:31:40.700]  Modo Casino is a social casino. Void where prohibited. No purchase necessary. Visit
[01:31:40.700 --> 01:31:42.700]  Modo.us for more details.
[01:32:10.700 --> 01:32:17.820]  Win a VIP iHeart experience. Play X the cash scratch tickets today. Must be 18 or older. Play responsibly.
[01:32:17.820 --> 01:32:24.540]  Drink. Drink the water. Real Jason. Read that one. Fonzie Bear. Minority Report cars always looked
[01:32:24.540 --> 01:32:30.860]  like what they wanted cars to come to be. Yeah,
[01:32:30.860 --> 01:32:37.420]  the weird little bubbles that are completely unstylish, uncool. Yeah. Minority Report. Another
[01:32:37.500 --> 01:32:42.540]  pretty good movie. It's a very communist aesthetic to a car. It's sort of like the
[01:32:42.540 --> 01:32:48.540]  car equivalent of wearing pajamas or a jumpsuit everywhere. Yeah. Yeah. Where do we wear pajamas
[01:32:48.540 --> 01:32:53.820]  now? Everywhere. At the TSA. The TSA. Everybody wears them when they fly because
[01:32:54.380 --> 01:33:00.860]  they've imposed that kind of authoritarianism on us. Yeah. Nibiru, 2029. Self-driving cars will
[01:33:00.860 --> 01:33:05.580]  drive auto insurance rates beyond affordability. That's right. You want to drive your own car? Well,
[01:33:05.580 --> 01:33:10.700]  sorry buddy. You're going to have to pay through the nose. We'll all be treated like teen drivers.
[01:33:13.260 --> 01:33:17.580]  When I was getting my first car, there were a few I was looking at. Of course, you know, as a guy,
[01:33:17.580 --> 01:33:25.100]  you're looking at some of the nicer low-end sports cars. Things like the, whatever, the Scion FRS.
[01:33:25.740 --> 01:33:31.020]  And the insurance on that thing was going to be ludicrous. It would have been a massive,
[01:33:31.020 --> 01:33:36.060]  like a substantial portion of the car's actual cost per year to insure it. Because again,
[01:33:36.060 --> 01:33:42.380]  young guys get that and they just wrap it around telephone poles non-stop. So you got a Nissan
[01:33:42.380 --> 01:33:48.860]  300ZX twin turbo. Yeah, it was great. Insurance rates on that are nothing, right? Well, I mean,
[01:33:48.860 --> 01:33:53.740]  considering how infrequently that thing ran. Yeah, that's true. Didn't have to have it insured.
[01:33:53.740 --> 01:34:01.260]  Yeah, I had a friend I work with who was into one of these rice rocket motorcycles, right?
[01:34:01.260 --> 01:34:06.300]  And it was a really fast motorcycle and it was expensive. I mean, it was just under $20,000. But
[01:34:07.340 --> 01:34:15.500]  he said the insurance was going to be prohibitive. He said, they're charging me so much insurance.
[01:34:15.500 --> 01:34:21.500]  I could buy a new one of these like every year or two. And he goes, how do you justify that? He goes,
[01:34:21.500 --> 01:34:25.900]  and I'm not even a threat to anybody else really with this motorcycle. Why am I going to get a big
[01:34:27.660 --> 01:34:32.060]  bill? It's like, they don't have to pay for the people that I hit, for the most part.
[01:34:32.060 --> 01:34:37.340]  You just got to scrape me off of it. That's the thing is just, if you're on the motorcycle,
[01:34:37.340 --> 01:34:45.740]  if you have an accident, you may not even need insurance. You may go beyond your necessary
[01:34:45.740 --> 01:34:50.300]  mortal concerns if you have an accident on a motorcycle. Very much more likely to happen,
[01:34:50.300 --> 01:34:54.540]  in my opinion. Jerry Alatalo: States opposed to artificial intelligence, autonomous warfare,
[01:34:54.540 --> 01:34:58.780]  minority report surveillance, and other horrific aspects of technocracy and transhumanism stand
[01:34:58.780 --> 01:35:09.740]  in the way of American dystopia. R plus excuse. Another interesting thing is the minority report
[01:35:09.740 --> 01:35:14.220]  video game from way back in the day was actually pretty good. Didn't follow the movie's story, but it
[01:35:14.220 --> 01:35:17.740]  was still entertaining. It was just like a beat-'em-up, shoot-'em-up. Don't frag me, bro. The
[01:35:17.740 --> 01:35:21.900]  false promise of safety and security is the oldest argument by tyrants for peasants to give
[01:35:21.900 --> 01:35:28.540]  up their freedom. Pezzano Vonte, 1776. Just like they lumped the criminal misuse of firearms with
[01:35:28.540 --> 01:35:34.540]  firearms owners and all gets thrown into one. That's a reference to the drunk drivers being
[01:35:34.540 --> 01:35:40.380]  counted and determining how safe human drivers are. Yeah, that's right. Yeah. You shouldn't be
[01:35:40.380 --> 01:35:46.060]  allowed to have a gun because criminals shoot people and it's like, well, people defend life
[01:35:46.060 --> 01:35:50.060]  with that as well. Well, you know, Trump has finally come around. Remember they were talking
[01:35:50.060 --> 01:35:54.300]  for the longest time about how they were going to help the farmers that he had hurt with the
[01:35:54.300 --> 01:36:01.340]  tariffs. Don't worry. Help is on the way. Yeah, we just gave $20 billion to Argentina and they
[01:36:01.340 --> 01:36:07.020]  used that to set up a deal with China, so China doesn't buy our agricultural products anymore.
[01:36:07.020 --> 01:36:15.020]  They get the soy directly from Argentina. And that was a massive double cross of the farmers.
[01:36:15.020 --> 01:36:19.260]  Trump had already betrayed the farmers in his first administration with tariff rates that caused
[01:36:21.500 --> 01:36:26.780]  them to not be able to sell their products. But then he doubled down with this and said,
[01:36:26.780 --> 01:36:31.260]  well, we give $20 billion to Argentina and we got another $20 billion that we're going to put
[01:36:31.260 --> 01:36:34.460]  together with people on Wall Street. So we're all together. We're going to give them 40.
[01:36:34.460 --> 01:36:40.060]  Don't worry. We'll give you $12 billion someday. Well, that was back in September. Here we are in
[01:36:40.860 --> 01:36:46.780]  December, three months later, and now he's talking about it being imminent.
[01:36:46.780 --> 01:36:51.500]  I'm delighted to announce this afternoon that the United States will be taking a small portion of
[01:36:51.500 --> 01:36:57.820]  the hundreds of billions of dollars we receive in tariffs. We are making a lot of money from
[01:36:57.820 --> 01:37:03.180]  countries that took advantage of us for years. They took advantage of us like nobody's ever seen.
[01:37:03.180 --> 01:37:07.660]  Our deficits are way down. He took advantage of the farmers who voted for him. Because of the
[01:37:07.660 --> 01:37:11.820]  election. Because without the election, you wouldn't have tariffs. You'd be sitting here
[01:37:11.820 --> 01:37:18.220]  losing your share. But we're taking in billions. We're really taking in trillions of dollars.
[01:37:18.220 --> 01:37:21.740]  If you think about it, Scott, because the real numbers, you know, when you think of all the
[01:37:21.740 --> 01:37:27.900]  the money being poured into the country for new auto plants and all of the other things, AI.
[01:37:28.940 --> 01:37:36.780]  So we're going to take a relatively small portion of that. And we're going to be giving
[01:37:36.780 --> 01:37:43.100]  and providing it to the farmers in economic assistance. And we love our farmers. And as you
[01:37:43.100 --> 01:37:49.100]  know, the farmers like me, because, you know, based on voting trends, you could call
[01:37:49.100 --> 01:37:55.180]  it voting trends. All right, that's enough of the lies. All of that is a lie. Okay. And if we're
[01:37:55.180 --> 01:37:59.660]  making trillions of dollars, but I'm going to give them 12 billion dollars, even if it were true,
[01:38:00.300 --> 01:38:06.140]  he'd be reprehensible, because he's going to give them one thousandth of what he's bringing in.
[01:38:06.780 --> 01:38:11.340]  And wait for months and months as these guys are circling the drain, struggling to survive.
[01:38:12.460 --> 01:38:17.900]  This is America last, folks. This is not America first. Trump says the 12 billion dollar bailout
[01:38:17.900 --> 01:38:23.020]  plan for farmers will come from the tariff revenue. You know, this is one of the most amazing
[01:38:23.020 --> 01:38:27.660]  things. This is better. Actually, tariffs. Why didn't we think of this before? This is better
[01:38:27.660 --> 01:38:33.500]  than the Federal Reserve. This is better than the Democrats modern monetary theory, where we just
[01:38:33.500 --> 01:38:37.820]  have this magic money tree that we can print the money and it doesn't make any difference and the
[01:38:37.820 --> 01:38:42.460]  deficits don't make any difference. You know, we just create money and wealth out of thin air.
[01:38:43.180 --> 01:38:47.900]  It's even better than the Federal Reserve thing, because, you know, they're getting in all of this
[01:38:47.900 --> 01:38:54.940]  revenue and it isn't raising anybody's prices, right? It's not hurting any manufacturing or
[01:38:54.940 --> 01:39:03.420]  farmers here in this country, except that it is. And apart from the arguments about how the
[01:39:03.420 --> 01:39:11.020]  taxes should be structured, the worst thing about Trump's tariffs has and remains the capricious,
[01:39:11.020 --> 01:39:19.260]  arbitrary, continually shifting environment that it's created, making it impossible for people to
[01:39:19.260 --> 01:39:23.820]  be able to do business, whether you're a manufacturer, whether you are a retailer,
[01:39:23.820 --> 01:39:29.260]  importing stuff, or whether you are a farmer. This has been absolutely chaotic. As I pointed
[01:39:29.260 --> 01:39:33.100]  out before, we have the Chicago Commodities Exchange.
[01:39:59.260 --> 01:40:02.380]  Download the Modo Casino app today.
[01:40:29.260 --> 01:40:40.700]  Because farmers needed to have a way to make sure that they knew what their price was going to be.
[01:40:40.700 --> 01:40:44.940]  They could lock that in in the future. So that's why you have the commodity futures market.
[01:40:45.820 --> 01:40:50.540]  And yet what Trump has done is he's taken all that away. I guess we could say that with the Trump
[01:40:51.100 --> 01:40:57.820]  capricious, arbitrary, ever-changing tariff policy, there is no futures for any of us,
[01:40:57.820 --> 01:41:03.500]  because of what he's taken away. The package includes $11 billion in a one-time payment
[01:41:03.500 --> 01:41:09.260]  to crop farmers. And oh, by the way, there's this interesting little thing there from the
[01:41:09.260 --> 01:41:13.260]  Department of Agriculture Secretary, Brooke Rawlins, saying, yeah, we're going to get these
[01:41:13.260 --> 01:41:21.340]  things out in February of 2026. So it's still not coming. He's waited three months. They hinted at
[01:41:21.340 --> 01:41:27.020]  it. He had Scott Bessent hinting at it. Trump announces it, but isn't actually going to be
[01:41:27.020 --> 01:41:31.420]  going out from what I can see based on what Brooke Rawlins said. It won't happen for another two
[01:41:31.420 --> 01:41:37.660]  months yet. So they're going to go half a year with this. So the aid package comes as the US-China
[01:41:37.660 --> 01:41:42.860]  trade war has hit soybean farmers especially hard. I would just say this is the Trump trade war.
[01:41:43.980 --> 01:41:48.220]  China had blocked all purchases of soybeans from the US. China was the biggest buyer of US soybeans
[01:41:48.220 --> 01:41:55.660]  in 2024, accounting for $12.5 billion in sales. China agreed to purchase 12 million metric tons
[01:41:55.660 --> 01:42:03.180]  of soybeans now in the final two months of this year and 25 million metric tons in 2026, 27, and
[01:42:03.180 --> 01:42:11.020]  28 on par with levels before the trade war. But what CBS does not say, I'm sorry, this is ABC,
[01:42:11.020 --> 01:42:19.100]  not CBS. At what price? You know, it was a double whammy from Trump. Not only did he cut off their
[01:42:19.100 --> 01:42:25.980]  biggest customer, but that created a glut of soybeans on the domestic market and it took the
[01:42:25.980 --> 01:42:31.100]  price down. So the question is, at what price did they get this stuff? That actually matters.
[01:42:31.820 --> 01:42:36.780]  It's amazing they don't even think about that. But what they're doing is, even though it's ABC,
[01:42:36.780 --> 01:42:41.580]  they're just kind of, whoever wrote this thing is just going with the talking points of the Trump
[01:42:41.580 --> 01:42:46.860]  administration. So far, China has purchased only two and a half million metric tons of
[01:42:47.340 --> 01:42:52.220]  soybeans, not the 12. So they got a lot of catching up to do here. The administration's
[01:42:52.220 --> 01:42:56.460]  new actions also come on the heels of the administration's $20 billion bailout of
[01:42:56.460 --> 01:43:02.780]  Argentina, which Scott Bessent said he was going to make it 40 in terms of helping put together some
[01:43:02.780 --> 01:43:07.740]  private funds. A move that many American farmers and lawmakers on both sides of the political aisle
[01:43:07.740 --> 01:43:14.620]  criticize. This fall, as China stopped buying all soybeans from US farmers, it purchased soybeans
[01:43:14.620 --> 01:43:20.060]  from Argentina instead. So the US was giving a financial lifeline to Argentina, a country that
[01:43:20.060 --> 01:43:26.540]  directly benefited from the trade war. American farmers said they felt left behind. At the time,
[01:43:26.540 --> 01:43:33.100]  Chuck Grassley in Iowa said, farmers are very upset about Argentina selling soybeans to China
[01:43:33.100 --> 01:43:40.780]  right after the US bailed them out and there's still zero US soybeans sold to China. That was back in
[01:43:40.780 --> 01:43:48.140]  September and it's taken them this long to firm up their promises, but still not to help the
[01:43:48.140 --> 01:43:53.900]  farmers. Trump in his first term also took action to bail out American farmers, except that he'd
[01:43:53.900 --> 01:43:59.260]  already bailed them in to his tariff regime. He'd already hurt them. This is like somebody
[01:43:59.260 --> 01:44:03.980]  breaking your legs and then handing you a wheelchair and boasting about the wheelchair
[01:44:03.980 --> 01:44:10.220]  they gave you. His administration approved two packages in 2018 and 19, totaling $28 billion
[01:44:10.220 --> 01:44:14.860]  for farmers impacted by his economic policies. Many of them were saying, well, he nearly put
[01:44:14.860 --> 01:44:19.900]  us out of business with these tariff policies. Now he's putting us out of business with the COVID
[01:44:19.900 --> 01:44:29.660]  lockdown. So again, the announcement was made yesterday. So meanwhile, despite the run-up in soybean
[01:44:29.660 --> 01:44:36.060]  futures over the past month on a resolution with China, crop prices are still close to 2020 lows.
[01:44:36.700 --> 01:44:41.420]  Now this is zero hedge. This ABC didn't even think about the price aspect of it. That's the
[01:44:41.420 --> 01:44:47.100]  all-important thing. You go out and you make a deal with China and let's say, you know, I don't have
[01:44:47.100 --> 01:44:52.620]  any idea what soybeans cost or what quantity they sell them in. Let's say they, we'll call it a
[01:44:52.620 --> 01:44:57.500]  widget. You put them in a widget. I don't know if it's a basket or a barrel or whatever it is,
[01:44:57.500 --> 01:45:03.340]  a bushel or whatever, but you got a widget full of soybeans that goes for $10. Then after this,
[01:45:04.300 --> 01:45:08.460]  he wants to make a deal. He wants to show that he's getting them back up buying soybeans.
[01:45:08.460 --> 01:45:12.460]  And they agree to it. So what did he do to get them to agree to it? Did he say, well,
[01:45:12.460 --> 01:45:15.500]  now you can buy the same quantity of stuff, but we'll sell you these
[01:45:17.500 --> 01:45:25.340]  soybeans at $5 per widget full of soy stuff. So again, they're taking advantage of the low
[01:45:25.340 --> 01:45:31.580]  cost right now. Is that what they're doing? So as they announced this, Trump is saying this
[01:45:31.580 --> 01:45:37.180]  wouldn't be possible. This money would not be possible without tariffs. Here's the truth,
[01:45:37.180 --> 01:45:43.420]  folks. It wouldn't be necessary without tariffs. He wouldn't have to give them a bailout if he
[01:45:43.420 --> 01:45:49.660]  hadn't bailed them into his Trump trade war. You know, these farmers that are suffering from the
[01:45:49.660 --> 01:45:54.220]  tariffs. Well, without the tariffs, I wouldn't have had the money to give them a piece of it.
[01:45:54.220 --> 01:46:00.220]  I've taxed these people to death and now I'll dole out a small amount back to them. That wouldn't
[01:46:00.220 --> 01:46:04.940]  be possible if I hadn't taxed them to death in the first place. That's right. And here's why I say
[01:46:04.940 --> 01:46:09.660]  it's not going to happen until 2026. This is CNN reporting now. It says, Rollins said the money
[01:46:09.660 --> 01:46:16.540]  would be flowing by February the 28th, 2026, the very last day of February. We're going to get the
[01:46:16.540 --> 01:46:23.260]  money flowing. So they'll make the first payment three months from now. And she explained that a billion
[01:46:23.260 --> 01:46:29.180]  dollars of the funding is being held back to make sure all specialty crops are covered. She credited
[01:46:29.180 --> 01:46:34.300]  Trump for opening the markets through trade deals without directly acknowledging how tariffs have
[01:46:34.300 --> 01:46:40.380]  impacted farmers. Again, you close the markets and now you open it. And so now you pat yourself on
[01:46:40.380 --> 01:46:45.900]  the back for opening the market that was open before you closed it. All of this is based on a
[01:46:45.900 --> 01:46:52.140]  lie. And so what you've been able to do is to open those markets up again and move towards an era
[01:46:52.140 --> 01:46:57.820]  where our farmers are not so reliant on government checks. Here's the bottom line. He was just
[01:46:57.820 --> 01:47:06.300]  boasting about the fact that after he disrupted the market sale of soybeans at market prices to
[01:47:06.300 --> 01:47:11.260]  China, after he messed with the market price, after he closed it off and shut it to zero,
[01:47:11.820 --> 01:47:15.260]  now he's going to open it back up and they're going to purchase it at levels that they were
[01:47:15.260 --> 01:47:21.900]  buying before he started in this nonsense. Just amazing. Are you tired of the winning? I'm tired
[01:47:21.900 --> 01:47:27.020]  of the whining about all of this stuff and the fact that he is lying to everybody about this.
[01:47:27.820 --> 01:47:32.700]  Some farmers have previously balked at the idea of aid. Mark Reed, a director for the Illinois
[01:47:32.700 --> 01:47:41.260]  Soybean Association, said farmers don't want free aid. We want free trade. There you go. That's what
[01:47:41.260 --> 01:47:46.700]  they had before he messed with it. Well, Reason says the Trump tariffs have failed to reduce the
[01:47:46.700 --> 01:47:53.180]  trade deficits. You ain't heard about Modo Casino? Modo has real Vegas slots. Any game you can find
[01:47:53.180 --> 01:47:57.660]  on the floor in Vegas, you can play it on Modo. I like my slots hot. Modo's free to play, like
[01:47:57.660 --> 01:48:01.980]  food stamps, in line at the grocery store, at a funeral, in traffic, keep your eyes on the road,
[01:48:01.980 --> 01:48:06.940]  hop on Modo Casino. Modo Casino got jackpots that are bigger than my belly. Modo, America's hottest
[01:48:06.940 --> 01:48:10.700]  free to play social casino. Download the Modo Casino app today. Modo Casino is a social casino.
[01:48:11.340 --> 01:48:13.340]  No purchase necessary. Visit Modo.us for more details.
[01:48:18.940 --> 01:48:27.020]  Multiply, multiply, multiply, multiply. With X the cash scratch tickets from the Texas Lottery,
[01:48:27.020 --> 01:48:34.060]  you could multiply the cash by 30, 50, 100, or even 200 times. And when you multiply the cash,
[01:48:34.060 --> 01:48:39.260]  you multiply the celebration. With top prizes from 60,000 up to a million dollars, it's the
[01:48:39.260 --> 01:48:44.220]  easiest way to multiply your luck. And enter for a chance to win a VIP iHeart experience.
[01:48:44.220 --> 01:48:48.780]  Play X the cash scratch tickets today. Must be 18 or older. Play responsibly.
[01:48:49.420 --> 01:48:55.180]  How should we assess whether the Trump tariffs have been effective or successful? Well, it's
[01:48:55.180 --> 01:49:00.060]  an important question. Trump has outlined overlapping and confusing and sometimes competing
[01:49:00.060 --> 01:49:05.500]  goals for the tariffs. He has celebrated them as a source of government revenue, for example.
[01:49:06.220 --> 01:49:09.500]  But he's also claimed that they're meant as a negotiating tactic.
[01:49:10.060 --> 01:49:17.260]  They can't be both. Tariffs used for negotiation are meant to be removed once the negotiations
[01:49:17.260 --> 01:49:23.020]  are complete. He's also said, they don't mention it here, but he's also said we're going to use
[01:49:23.020 --> 01:49:30.140]  the tariffs to make sure that manufacturing moves back to America. And look at all the
[01:49:30.140 --> 01:49:32.780]  windfall profit that we're going to make. Well, again, you can't have both of those. You're
[01:49:32.780 --> 01:49:37.580]  either going to use it for negotiations and then take it off, or you're going to use it to get
[01:49:37.580 --> 01:49:43.340]  businesses to come back. If that's your goal, get businesses to come back and do manufacturing
[01:49:43.340 --> 01:49:47.900]  here domestically. But if they do manufacturing domestically, then your tariff revenue goes away.
[01:49:48.460 --> 01:49:53.340]  So he's always putting out these contradictory ideas and everybody grabs whatever they want.
[01:49:53.340 --> 01:49:56.860]  They think, well, he's going to make so much money, we're going to get rid of the income tax.
[01:49:56.860 --> 01:50:00.940]  That's floating around again as well, thanks to Trump. Except that he's talking about, hey,
[01:50:00.940 --> 01:50:04.780]  he's going to make all these different tax changes that he's done permanent.
[01:50:07.420 --> 01:50:10.380]  So you might want to think about what he's actually saying here.
[01:50:10.380 --> 01:50:15.020]  There's also the fact that I don't see the government generating revenue as a win.
[01:50:17.260 --> 01:50:23.500]  If we had a government that was actually working on building infrastructure, even that I don't
[01:50:23.500 --> 01:50:28.380]  necessarily think that's the government's place to do that, but you could at least make that sound
[01:50:28.380 --> 01:50:31.260]  good. Like, oh, we're going to build better roads, we're going to build nicer parks,
[01:50:31.260 --> 01:50:35.980]  we're going to build really cool monuments. Instead, it's going to go to his friends and
[01:50:35.980 --> 01:50:40.060]  it's going to go into the military industrial complex. And the police state industrial complex
[01:50:40.060 --> 01:50:45.340]  and the surveillance state industrial complex. The government is not going to do anything that
[01:50:45.340 --> 01:50:51.420]  will benefit the common citizen with it. And again, it's not their place to do that, I don't
[01:50:51.420 --> 01:50:56.140]  think, but at least then you would be getting some benefit, some use from it. There's no planned
[01:50:56.140 --> 01:51:00.700]  benefit for it. And we don't want to see the government taking more and more control of the
[01:51:00.700 --> 01:51:05.340]  economy, but Trump does. Trump tariffs are a solution to every problem. And the trade war
[01:51:05.340 --> 01:51:11.980]  is more about the vibes than it is about economics. But when Representative Brendan Boyle, a Democrat,
[01:51:11.980 --> 01:51:21.260]  pressed Jamieson Greer, the U.S. Trade Representative, asking what success would look like, Greer gave
[01:51:21.260 --> 01:51:27.100]  two clear metrics. He said, first of all, the trade deficit needs to go in the right direction,
[01:51:27.100 --> 01:51:33.260]  in other words, down. And manufacturing as a share of gross domestic product needs to go
[01:51:33.260 --> 01:51:40.140]  in the right direction, needs to go up. So if it's going to be a success, as they pinned them down,
[01:51:40.140 --> 01:51:44.540]  they said, okay, well, Trump wants to talk about revenue. What is your view as the trade
[01:51:44.540 --> 01:51:48.620]  representative for all this stuff? What are you trying to see happen? Well, I want to see the
[01:51:48.620 --> 01:51:52.220]  trade deficit go down and I want to see manufacturing go up. Well, what has happened?
[01:51:53.020 --> 01:51:57.660]  More than six months later, neither goal is any closer to being achieved. Neither of them
[01:51:57.660 --> 01:52:03.500]  seems likely to be completed over the long term by an economic policy rooted in barriers to trade.
[01:52:04.140 --> 01:52:11.740]  Trump has been obsessed with the trade deficit for years. But he doesn't really care about, if he even
[01:52:11.740 --> 01:52:17.980]  understands, the budget deficit, they point out, which is the difference between the revenue they
[01:52:17.980 --> 01:52:24.060]  bring in and what they spend. That is far more important than the trade deficit. But he's not
[01:52:24.060 --> 01:52:29.980]  going to put his own house in order. From January through July, America's trade deficit was $840
[01:52:29.980 --> 01:52:41.180]  billion. It was 23% larger than the same months in 2024. Okay, so his stated goal is, we want to see
[01:52:41.180 --> 01:52:46.220]  the trade deficit go down, defined as, we want to sell more to other people than they're selling to
[01:52:46.220 --> 01:52:55.340]  us. Except it increased by 23%, even with all of Trump's manipulation here. It also reflects now
[01:52:55.340 --> 01:53:03.260]  a well-established fact that tariffs do not reduce trade deficits. During his first term, Trump raised
[01:53:03.260 --> 01:53:12.860]  various tariffs, but the country's trade deficit climbed from about $481 billion in 2016 to $679
[01:53:12.860 --> 01:53:21.420]  billion in 2020. So over four years, it goes up about 40%, right? But under this
[01:53:21.420 --> 01:53:30.140]  new regime of Trump tariff policy, it has gone up 23%. The trade deficit has increased 23%.
[01:53:31.660 --> 01:53:39.340]  So by their metric, and of course, no matter which of these contradictory explanations
[01:53:39.340 --> 01:53:45.180]  Trump offers, he definitely wants to see the trade deficit go down, but it went up 23%.
[01:53:46.620 --> 01:53:51.420]  Tariffs are no better as a tool for boosting manufacturing. Rather than being helped,
[01:53:51.420 --> 01:53:57.180]  the manufacturing sector is being crushed by tariffs, increasing the cost of raw materials
[01:53:57.180 --> 01:54:02.700]  and of intermediate goods. And it's not just manufacturing, it's all businesses. Whether
[01:54:02.700 --> 01:54:07.740]  people are in retail or anything else, they can't tell what their costs are going to be.
[01:54:07.740 --> 01:54:12.140]  Because who knows if Trump is going to have something that gives him indigestion
[01:54:12.140 --> 01:54:18.620]  and he's going to try to punish the country that he bought that food from. It's just that petty.
[01:54:18.620 --> 01:54:22.300]  If he has an argument with somebody who is a political leader in another country,
[01:54:22.300 --> 01:54:28.700]  he slaps them with tariffs. So during a speech in July, the trade representative Greer added
[01:54:28.700 --> 01:54:35.100]  a third goal for the administration's tariff policies, increasing real median household income.
[01:54:36.060 --> 01:54:40.860]  Well, tariffs are making it more difficult for households to make ends meet. An October study
[01:54:40.860 --> 01:54:46.140]  from the Harvard Business School shows that retail prices had declined throughout 2024
[01:54:46.140 --> 01:54:52.620]  and early 2025, and then began rising in April after Trump's tariffs were announced.
[01:54:53.420 --> 01:55:00.060]  The Trump administration's tariff policies misunderstand the role of trade in productive
[01:55:00.220 --> 01:55:08.460]  flourishing economies. The administration has set the wrong goals and then has made policy choices
[01:55:08.460 --> 01:55:14.940]  that are unlikely to achieve those goals. Again, it's because of people like Peter Navarro.
[01:55:15.580 --> 01:55:22.940]  This is the dumb as a sack of bricks policy. And so what does this look like? Well, China
[01:55:22.940 --> 01:55:31.820]  has had a record trade surplus. China's trade surplus has topped a trillion dollars for the
[01:55:31.820 --> 01:55:39.580]  first time, despite Trump's tariffs. China reports exports have rebounded in November
[01:55:39.580 --> 01:55:45.180]  after an unexpected contraction the previous month, pushing its trade surplus past a trillion dollars
[01:55:45.180 --> 01:55:52.060]  for the first time ever, an all-time high. Exports, listen to this, climbed 6% from
[01:55:52.700 --> 01:56:00.300]  a year earlier, while imports rose just under 2%. Meanwhile, shipments to the United States
[01:56:01.580 --> 01:56:07.340]  dropped nearly 29% year over year. So they've been able to replace this with other markets,
[01:56:07.340 --> 01:56:11.900]  and they are thriving. If this is part of his policy, again, that is another thing he's thrown
[01:56:11.900 --> 01:56:17.180]  in there. The economic competition with China is a failure with that as well. So it's been a
[01:56:17.180 --> 01:56:22.300]  failure in terms of the trade deficit. It's been a failure in terms of economic competition with
[01:56:22.300 --> 01:56:28.460]  China. It's been a failure in terms of manufacturing. It's been a failure in terms of keeping costs down.
[01:56:30.220 --> 01:56:37.260]  It's a failure. The nearly trillion dollar trade surplus for the first 11 months of this year is a
[01:56:37.260 --> 01:56:42.300]  record high. It's likely that November exports have yet to fully reflect the tariff cut,
[01:56:42.860 --> 01:56:48.700]  which should feed through in the coming months. But hey, they're making it up in other countries.
[01:56:50.140 --> 01:56:55.340]  You, however, may pay a lot more. They're expecting that toys will go up quite a bit because
[01:56:55.340 --> 01:57:00.860]  a lot of toys are manufactured in China. But as Trump said before, hey, so your kids only got
[01:57:01.580 --> 01:57:08.940]  one doll instead of five dolls. Too bad. I wonder how many dolls Ivanka had,
[01:57:09.980 --> 01:57:16.380]  whichever one it is. I get the two of them mixed up. Ivana was the mother, right? And Ivanka is the
[01:57:16.380 --> 01:57:21.020]  daughter. Yeah, Ivanka is the daughter. I imagine she had a lot of dolls, but Trump doesn't really
[01:57:21.020 --> 01:57:25.580]  care about that. Doesn't care if you can afford toys or not. It's kind of like that toy market
[01:57:25.580 --> 01:57:29.980]  we went to in China where the TSA then confiscated all the toys that we'd bought
[01:57:30.540 --> 01:57:36.860]  to keep our daughter busy while we came back. So China's exports grow 6% and shipments to the U.S.
[01:57:37.740 --> 01:57:45.420]  drop 29%. Seems like things are going in exactly the opposite direction than Trump wanted to go.
[01:57:45.420 --> 01:57:51.420]  By the way, manufacturing is dropping as well. And they're struggling, as I said before, just like
[01:57:51.980 --> 01:57:57.980]  retailers and importers, every business, farmers, everybody is struggling with the chaos
[01:57:59.100 --> 01:58:04.540]  that Trump has brought to the economy. It's not about tariffs versus income tax.
[01:58:05.180 --> 01:58:12.300]  It's about chaos versus stability. Chaos is hampering everyone in the U.S. economy.
[01:58:12.860 --> 01:58:18.940]  It is the elephant in the room. And I'm not talking about Republicans. We're going to take
[01:58:18.940 --> 01:58:24.460]  a quick break. You want to get those comments there? Yes. I don't know if that other one is
[01:58:24.460 --> 01:58:30.300]  right, Lance, but Gardner Goldsmith says, by the way, the Trump executive order re AI appears to
[01:58:30.300 --> 01:58:35.340]  claim authority by implying that state statutes on AI interfere with interstate commerce.
[01:58:35.340 --> 01:58:40.300]  Yet Trump's executive order breaches separation of powers. Yeah, it breaches the 10th Amendment.
[01:58:40.300 --> 01:58:45.420]  And when you look at the way they've sold the unconstitutional illegal war on drugs,
[01:58:46.060 --> 01:58:50.780]  how did they do it? Well, the Commerce Clause, claiming that that allowed them to prohibit drugs.
[01:58:51.580 --> 01:58:55.820]  Why didn't anybody think about that when they prohibited alcohol? It's funny. You know, those
[01:58:55.820 --> 01:59:00.620]  people, I don't know, were they just stupid and they couldn't read that in the Constitution? Or
[01:59:00.620 --> 01:59:05.500]  maybe they had respect for the Constitution that we don't have. I think that's what it was.
[01:59:06.140 --> 01:59:10.140]  Well, we're going to take a quick break and we will be right back. Stay with us.
[01:59:45.420 --> 02:00:04.140]  And now the David Knight show.
[02:00:45.420 --> 02:01:07.900]  You're listening to the David Knight show.
[02:01:09.180 --> 02:01:13.340]  Here's golf legend John Daly. Hell yeah, these wins are piling up faster than my divorces.
[02:01:13.340 --> 02:01:17.740]  I only spin on Modo, America's social casino. You know, I've won a couple of majors, and on Modo,
[02:01:17.740 --> 02:01:21.820]  I've won majors, grands, and epic jackpots on their classic Vegas slots with huge,
[02:01:21.820 --> 02:01:26.620]  huge bonus rounds. Modo Casino adds new games and awards players free coins every single day.
[02:01:26.620 --> 02:01:30.060]  Grip it and spin it on Modo Casino. Download the Modo Casino app today.
[02:01:30.060 --> 02:01:32.540]  Modo Casino is a social casino. Void where prohibited. No purchase necessary.
[02:01:32.540 --> 02:01:34.700]  Visit modo.us for more details.
[02:01:43.340 --> 02:02:12.940]  Welcome back folks. Briefly,
[02:02:12.940 --> 02:02:17.500]  I want to let you know that it is support from listeners like you that keeps the show going.
[02:02:17.500 --> 02:02:20.700]  We cannot thank you all enough. A really good way to support the show is go to
[02:02:20.700 --> 02:02:26.460]  SubscribeStar.com forward slash The David Knight Show. You can find a tier that fits your budget
[02:02:26.460 --> 02:02:32.300]  and then it's fire and forget. You don't have to worry about it. And there you can see it.
[02:02:32.300 --> 02:02:36.220]  There's many different tiers. As I said, hopefully one of them fits your budget and
[02:02:36.220 --> 02:02:40.220]  you can just set it up and not have to worry about it. It'll only go down if your card
[02:02:40.300 --> 02:02:44.860]  is no longer valid. Check out SubscribeStar.com forward slash The David Knight Show. Or you can go to
[02:02:44.860 --> 02:02:49.580]  davidknight.news and find all the other ways you can support us directly.
[02:02:49.580 --> 02:02:52.620]  Of course you can turn it off. You're not locked into it forever.
[02:02:52.620 --> 02:02:55.100]  They don't come to your house. We got your number. We're not going to let you go.
[02:02:55.660 --> 02:03:00.060]  But we do appreciate the people who stuck with us for years there. And one of the things that we
[02:03:00.060 --> 02:03:06.620]  try to do for them years ago, what I did, I guess it's two years ago, did the Christmas album. We
[02:03:06.620 --> 02:03:12.780]  gave it to the people there for free. And we also try to give them the articles as well as
[02:03:13.500 --> 02:03:18.060]  a link to the podcast where they can get it without commercials. And you can also get that
[02:03:18.060 --> 02:03:22.460]  on Substack now if you just want to get the podcast without commercials.
[02:03:22.460 --> 02:03:25.740]  Yeah. If you're only interested in the podcast without commercials, the best place to do that
[02:03:25.740 --> 02:03:30.780]  is substack.com. You can subscribe and you'll receive it there. I also want to let you know
[02:03:30.780 --> 02:03:35.420]  that homesteadproducts.shop is having a sale on their activated charcoal capsules. They're
[02:03:35.420 --> 02:03:42.060]  good for detoxifying your body, good for hangovers, energy boosting, whitening your teeth,
[02:03:42.060 --> 02:03:47.660]  filtering your water. They have numerous applications. So go to homesteadproducts.shop,
[02:03:47.660 --> 02:03:51.500]  check out their best stuff there. They've got all kinds of really interesting,
[02:03:51.500 --> 02:03:56.060]  very high quality products. They work very, very hard to make sure that products are made in the
[02:03:56.060 --> 02:04:01.260]  USA and of the highest quality. So again, go to homesteadproducts.shop, check out the sale
[02:04:01.260 --> 02:04:05.980]  they're having on their activated hardwood charcoal capsules. And you can also use promo
[02:04:05.980 --> 02:04:11.820]  code KNIGHT to get 10% off anything in their shop. So go check them out. If you're looking
[02:04:11.820 --> 02:04:17.020]  for survival gear to just some modest clothing, they've got options for you.
[02:04:17.020 --> 02:04:21.340]  So go check them out. I'll just throw in real quickly too, the code KNIGHT also gets you 10%
[02:04:21.340 --> 02:04:28.380]  off at RNC stores. Yes. Or you can get books that help you to find natural remedies for many
[02:04:28.940 --> 02:04:33.260]  things, including cancer. And you can find the book, The World Without Cancer. That's at RNC
[02:04:33.260 --> 02:04:40.060]  store. Yeah. Dot com. And it also gets you 10% off with Gerald Celente's Trends Journal as well.
[02:04:40.060 --> 02:04:46.460]  Which the Trends Journal with the 10% off works out to be about $2.50 a week, which what else can
[02:04:46.460 --> 02:04:51.420]  you get for that kind of value at this point? Well, real quickly, before our guest comes on,
[02:04:51.980 --> 02:04:57.500]  I know this is an interesting story. This is a college student who got a zero on her assignment
[02:04:58.140 --> 02:05:04.300]  simply because she quoted the Bible in an assignment reviewing an article about gender.
[02:05:04.940 --> 02:05:09.740]  Now this is really about a lot of different issues. It's about free speech, free exercise of
[02:05:09.740 --> 02:05:14.700]  religion. It's about the fact that the LGBT people see what they're doing as a religion,
[02:05:15.340 --> 02:05:20.940]  as well as what is happening in schools and the worthlessness of college degrees, I would say as
[02:05:20.940 --> 02:05:28.780]  well. So this is a college student in Oklahoma, gets a failing grade because she laid out a
[02:05:28.780 --> 02:05:33.340]  biblical case for gender. Unfortunately for her, she didn't know it at the time,
[02:05:33.900 --> 02:05:40.300]  but the teaching assistant who was going to be doing the grading is a tranny. She didn't know
[02:05:40.300 --> 02:05:47.180]  that. She turned in the paper and she didn't attack transgender. She made the case for the
[02:05:47.180 --> 02:05:52.220]  biblical role of men and women. So it was not a negative hit piece. There was nothing hateful
[02:05:52.220 --> 02:05:59.020]  about it. Well, these people are so completely deluded out to lunch that simply showing them
[02:05:59.020 --> 02:06:07.980]  reality is painful to them. It breaks their self-delusion. Yes. And it was an opinion-based
[02:06:07.980 --> 02:06:12.060]  piece. What, Lance? Like I mentioned yesterday of the story of the person, and I believe it was
[02:06:12.060 --> 02:06:18.060]  the UK that got 10 days in prison and a fine for mentioning that men and women have different
[02:06:18.060 --> 02:06:23.180]  skeletons. Yeah, that's right. So this was an opinion-based piece. And she said, I pointed
[02:06:23.180 --> 02:06:26.540]  out that it didn't say anywhere that I needed evidence for
[02:06:26.540 --> 02:06:37.660]  my opinion. His response was, no, that was the grade that you deserved, a zero. She said in terms of
[02:06:37.660 --> 02:06:41.820]  her essay, here's some excerpts from it, she said, this article was very thought-provoking,
[02:06:41.820 --> 02:06:45.420]  caused me to thoroughly evaluate the idea of gender and the role that it plays in our society.
[02:06:45.420 --> 02:06:53.500]  The article discussed peers using teasing as a way to enforce gender norms. I don't look at this
[02:06:53.500 --> 02:06:58.860]  necessarily as a problem. God made male and female and made us differently from each other on purpose
[02:06:59.420 --> 02:07:05.260]  and for a purpose. God is very intentional with what he makes. I believe trying to change that
[02:07:05.260 --> 02:07:09.340]  would only do more harm. Gender roles and tendencies should not be considered to be
[02:07:09.340 --> 02:07:14.940]  stereotypes. Women naturally want to do womanly things because God created us with those
[02:07:14.940 --> 02:07:19.580]  womanly desires in our hearts. But of course we can propagandize those out, can't we?
[02:07:20.620 --> 02:07:26.460]  The same goes for men. God created men in the image of his courage and strength. He created
[02:07:26.460 --> 02:07:31.260]  women in the image of his beauty. He intentionally created women differently than men, and we should
[02:07:31.260 --> 02:07:35.820]  live our lives with that in mind. It's frustrating to me when I read articles like this and discussion
[02:07:35.820 --> 02:07:41.420]  posts from my classmates of so many people trying to conform to the same mundane opinion
[02:07:42.060 --> 02:07:48.380]  so that they don't step on anybody's toes. I think that is cowardly and an insincere way to live.
[02:07:48.940 --> 02:07:53.020]  It is important to me to use the freedom of speech we have been given in this country,
[02:07:53.020 --> 02:07:58.940]  and I personally believe that eliminating gender in our society would be detrimental as it pulls
[02:07:58.940 --> 02:08:04.380]  us further from God's original plan for humans. In Genesis, God says that it's not good for man
[02:08:04.380 --> 02:08:09.980]  to be alone, so he created a helper for man, which is woman. Many people assume the word
[02:08:09.980 --> 02:08:15.500]  helper in this context to be condescending and offensive to women. However, the original word
[02:08:15.500 --> 02:08:23.580]  in Hebrew is ezer kenegdo, and that directly translates to helper equal to. Additionally,
[02:08:23.580 --> 02:08:30.140]  God describes himself in the Bible using that same term, ezer kenegdo, or helper,
[02:08:30.700 --> 02:08:36.140]  and he describes his Holy Spirit as our helper as well. This shows the importance that God places
[02:08:36.140 --> 02:08:41.180]  on the role of the helper. God does not view women as less significant than men. He created
[02:08:41.180 --> 02:08:46.460]  us with such intentionality and care, and he made women in his image of being a helper and in the
[02:08:46.460 --> 02:08:51.500]  image of his beauty. If leaning into that role means that I'm following gender stereotypes,
[02:08:51.500 --> 02:08:55.420]  then I am happy to be following a stereotype that aligns with the gifts and the abilities
[02:08:55.420 --> 02:09:00.140]  that God gives me as a woman. I do not think that men and women are pressured to be more masculine
[02:09:00.140 --> 02:09:05.020]  or feminine. I strongly disagree with the idea from the article that encouraging acceptance of
[02:09:05.020 --> 02:09:11.900]  diverse gender expressions can improve students' confidence. Society pushing the lie that there
[02:09:11.900 --> 02:09:17.500]  are multiple genders and everyone should be whatever they want to be is demonic and severely harms
[02:09:17.500 --> 02:09:22.460]  American youth. I do not want kids to be teased or bullied in school. However, pushing the lie
[02:09:22.460 --> 02:09:27.740]  that everyone has their own truth and everyone can do whatever they want and be whoever they want
[02:09:27.740 --> 02:09:33.180]  is not biblical whatsoever. Reading articles like this encourages me to one day raise my children
[02:09:33.180 --> 02:09:38.620]  knowing that they have a Heavenly Father who loves them and cherishes them deeply, and that
[02:09:38.620 --> 02:09:44.380]  having their identity firmly rooted in who he is will give them the satisfaction and acceptance
[02:09:44.380 --> 02:09:48.860]  that the world can never provide for them. My prayer for the world and specifically for American
[02:09:48.860 --> 02:09:54.300]  society and youth is that they would not believe the lies being spread by Satan that make them
[02:09:54.300 --> 02:09:59.820]  believe they're better off with another gender than what God has made them. I pray that they feel
[02:09:59.820 --> 02:10:05.260]  God's love and acceptance as who he originally created them to be. So after she got a zero for
[02:10:05.260 --> 02:10:11.260]  that from the transgender, she complained to the university. They did nothing. She complained to
[02:10:11.260 --> 02:10:17.580]  the governor's office and other politicians, and the response was that the university gave him a paid
[02:10:17.580 --> 02:10:24.060]  vacation, paid leave. All I can say is, if that happened to me, my next paper
[02:10:24.060 --> 02:10:30.700]  would be something. Yeah, he can't grade her papers anymore, but he gets a paid vacation, paid leave.
[02:10:31.740 --> 02:10:39.100]  Her essay was posted on social media, however, and it's been viewed over 15 million times.
[02:10:39.740 --> 02:10:45.580]  So her bottom line is she said we must not be intimidated to run away from our principles,
[02:10:45.580 --> 02:10:52.540]  what we believe to be true. We have the freedom to speak and to believe what we wish. State
[02:10:52.540 --> 02:10:56.940]  senators said it's about a state-funded, taxpayer-funded institution that's allowing
[02:10:56.940 --> 02:11:00.780]  their faculty members to abridge or to impede a student's right to express their faith.
[02:11:01.500 --> 02:11:06.780]  And so she's been able to speak many different places as well. Well, I've got more than I wanted
[02:11:06.780 --> 02:11:12.220]  to get into, but we are out of time and we have a guest that is ready to join us. And just real
[02:11:12.220 --> 02:11:22.940]  briefly, the guest we have joining us is a doctor and his name is Richard Restak, MD. He has written
[02:11:22.940 --> 02:11:28.860]  over 25 books and he's been on the bestsellers list. His specialty, basically,
[02:11:28.860 --> 02:11:34.540]  is neuroscience. And the book that we're going to be discussing
[02:11:34.540 --> 02:11:42.060]  today is The 21st Century Brain. The subtitle says how our brains are changing in response to the
[02:11:42.060 --> 02:11:48.300]  challenges by social networks, AI, climate change, and stress. So we're going to talk about those
[02:11:48.300 --> 02:11:52.380]  things. And I've got a lot of questions that I would like to ask him about that as well.
[02:11:52.380 --> 02:11:57.100]  So I think it's going to be an interesting interview. Stay with us, folks. We will be right back.
[02:14:04.540 --> 02:14:29.500]  You're listening to the David Knight Show.
[02:15:33.820 --> 02:15:40.700]  Hear news now at APSradioNews.com or get the APSradio app and never miss another story.
[02:15:43.740 --> 02:15:50.620]  All right. And joining us now is Dr. Richard Restak, MD, and he is a neuroscientist as well.
[02:15:50.620 --> 02:15:58.620]  And he has written a lot of books on the brain. And now this is one kind of the nexus of our brain
[02:15:58.620 --> 02:16:03.820]  and artificial intelligence. So I wanted to get him on because we, as you know, we talk about
[02:16:03.820 --> 02:16:08.220]  AI and its impact on society quite a bit. Thank you for joining us, Dr. Restak.
[02:16:09.580 --> 02:16:11.740]  Well, I'm happy to be here. Thank you, David.
[02:16:11.740 --> 02:16:16.300]  You've written so many books and the bestselling author. And of course, people can find this on
[02:16:16.300 --> 02:16:20.940]  Amazon. You've written so many books. What is different about the brain? What is different
[02:16:20.940 --> 02:16:23.660]  about this one? And why did you write this book?
[02:16:24.540 --> 02:16:33.260]  I wrote this book to announce and to discuss the dangers that are lurking, so to speak,
[02:16:33.260 --> 02:16:39.340]  in the 21st century and are unique to the 21st century, but are having an effect on the brain
[02:16:39.340 --> 02:16:45.420]  and a negative one. We really are imperiled by eight different factors,
[02:16:45.980 --> 02:16:53.660]  one of which is global warming. We have new diseases that are present in the 21st century that
[02:16:53.660 --> 02:16:59.580]  are increasing, starting with COVID and moving forward. We have problems, of course, with
[02:17:01.820 --> 02:17:06.060]  global warming, which we'll talk about in more detail. And then the internet,
[02:17:06.060 --> 02:17:12.780]  the effect of the internet, the effect of AI, memory, the alteration, the attempt to alter
[02:17:12.780 --> 02:17:20.220]  memory, almost to alter our memories of what the past was like. This is an ongoing enterprise by
[02:17:20.220 --> 02:17:26.780]  various governments in the world, including our own. We also have surveillance, the seventh,
[02:17:26.780 --> 02:17:31.980]  the surveillance becoming increasingly a surveillance society. It's almost impossible
[02:17:32.540 --> 02:17:39.420]  to not be revealing things about yourself because there's surveillance cameras everywhere.
[02:17:39.420 --> 02:17:42.780]  I can give you several examples of that just in my own personal life.
[02:17:42.780 --> 02:17:48.780]  And then finally, the eighth one is anxiety. All of these things are creating what I call
[02:17:48.780 --> 02:17:56.860]  an existential anxiety. People are being given information, but it's being molded according to
[02:17:56.860 --> 02:18:03.020]  the thoughts and the inclinations of people in power. For instance, let's take today's,
[02:18:03.020 --> 02:18:08.700]  right out of today's New York Times, on page A7, there's an article called,
[02:18:09.420 --> 02:18:17.420]  The Air in New Delhi is Life Threatening. And it tells the tale of the New York Times reporters
[02:18:17.980 --> 02:18:24.140]  who have spread themselves throughout New Delhi from 6 a.m. until late in the evening of a certain
[02:18:24.140 --> 02:18:31.900]  day recently, and they measured the particulate matter in the air, and it was anywhere from 10
[02:18:31.900 --> 02:18:41.740]  times to 30 times as great as would be considered minimally normal. Now, on top of that, you have
[02:18:41.740 --> 02:18:49.580]  the statement that the government is actually trying to hide this kind of information
[02:18:50.300 --> 02:18:56.300]  from the populace by spraying water and other things like that. It says that they're doing this
[02:18:57.020 --> 02:19:03.980]  around the measuring stations, and they're also losing data from measuring stations during the
[02:19:03.980 --> 02:19:11.740]  worst bouts of pollution. So there you have the molding of the facts, either denying them all
[02:19:11.740 --> 02:19:17.420]  together or trying to improve them so people say, oh well, they measured it down at such and such
[02:19:17.420 --> 02:19:21.740]  a measuring station, and it was really not all that high. Of course, they were spreading
[02:19:22.380 --> 02:19:28.620]  water and other things to try to reduce this. So we've got a capitalist society here in the United
[02:19:28.620 --> 02:19:36.300]  States which has a vested interest in pushing forward certain scientific points of view.
[02:19:37.100 --> 02:19:42.780]  So science is being put sort of in the back seat, and there's politicians and other people,
[02:19:42.780 --> 02:19:49.340]  all of whom share one thing, capitalistic enterprises in which they're part of or which
[02:19:49.340 --> 02:19:57.580]  they are advancing. And a kind of crony capitalism where they can get protection and subsidies as
[02:19:57.580 --> 02:20:03.340]  well. And the control is being taken away from us because, as I was just reporting earlier today,
[02:20:04.060 --> 02:20:10.540]  they're working very hard to make sure that state and local governments can't enact any control on
[02:20:10.540 --> 02:20:16.300]  artificial intelligence. And that came up in the context of talking about how the manufacturers of
[02:20:16.300 --> 02:20:21.500]  Tasers, who are also big manufacturers of police body cams, want to wed that to
[02:20:21.500 --> 02:20:26.140]  artificial intelligence. And the question is, you know, what could possibly go wrong with that
[02:20:26.140 --> 02:20:31.900]  if they identify you, they misidentify you as a dangerous criminal and warn the police about
[02:20:31.900 --> 02:20:38.060]  how dangerous you are? They could get people killed. Well, not only that, but all of these efforts
[02:20:38.780 --> 02:20:46.380]  set up a sense of anxiety and fear. Let me just tell you what happened to me in one morning.
[02:20:47.340 --> 02:20:52.940]  Called a cab to go to a medical appointment, and we started going down the road. I said to the
[02:20:52.940 --> 02:20:57.660]  driver, you know, you're not going the most efficient or the quickest way. He said, I know
[02:20:57.660 --> 02:21:02.780]  that. He said, but I don't want to go that way because there's speed cameras. I said, well,
[02:21:02.860 --> 02:21:07.660]  you know, you're driving very sensibly and you're not speeding and I'm in no hurry. So
[02:21:07.660 --> 02:21:12.940]  what's the problem? He said, well, they take pictures of everybody that goes by those cameras
[02:21:12.940 --> 02:21:18.780]  because they want to see who's in those photos in those cars. So I asked him to give me a reference
[02:21:18.780 --> 02:21:23.900]  for that. And he sort of didn't say anything else for the rest of the trip. So when I got down
[02:21:23.900 --> 02:21:30.060]  to the medical building, I got in the elevator and said, in this facility, there is surveillance, both
[02:21:31.020 --> 02:21:40.460]  obvious and hidden. It was like Santa Claus watching you. Now, this is one morning. And then
[02:21:40.460 --> 02:21:48.700]  when I got up to sign in, I signed the board with an electronic pen and I didn't see a signature.
[02:21:48.700 --> 02:21:52.940]  I said, well, it didn't take. She said, oh, it took, but we don't allow it to go on the
[02:21:52.940 --> 02:21:57.740]  screen so it could be seen. I said, why is that? She said, well, somebody behind you might see the
[02:21:57.740 --> 02:22:04.780]  thing and then remember it and use your signature to forge something somewhere. Well,
[02:22:04.780 --> 02:22:10.060]  first of all, there was a sign that said, stand 10 feet back. And secondly, there's nobody else
[02:22:10.060 --> 02:22:15.980]  behind me. So there's three examples just drawn at random. We're becoming an increasingly
[02:22:15.980 --> 02:22:22.460]  surveilled society, which is creating a sense of paranoia and a sense of fear. So the brain has
[02:22:22.460 --> 02:22:29.420]  to adjust to these types of things, Dave, and it's very hard to do. And I think that is calculated.
[02:22:29.420 --> 02:22:34.380]  You know, they've been, they want to do this even to the extent when you talk about these cameras
[02:22:34.380 --> 02:22:38.780]  taking everybody's picture, that the flock network that is out there, this corporation that is
[02:22:38.780 --> 02:22:44.540]  saying, well, we can do whatever we want because it's in public space and, and, you know, we're,
[02:22:44.540 --> 02:22:48.940]  we're not government, so we can collect this information. And yet they collect it in order
[02:22:48.940 --> 02:22:54.780]  to sell it to the government. So it's just one level of indirection, but they not only grab your
[02:22:54.780 --> 02:23:00.940]  license plate, but they also do a complete profile of your car and all of its idiosyncrasies. Does it
[02:23:00.940 --> 02:23:05.180]  have a dent here? Does it have a scrape there? What about a bumper sticker? So that creates a
[02:23:05.180 --> 02:23:10.620]  model of your car. And so they almost have like, you know, biometric identification of your cars
[02:23:10.620 --> 02:23:17.820]  as well as of you. And this is now made possible because of the advances of AI. But this has been
[02:23:17.820 --> 02:23:23.740]  something that has been concerning me. I look at things kind of from a libertarian perspective,
[02:23:24.300 --> 02:23:30.620]  and this has been concerning me for a long time. The idea that government is using technology many
[02:23:30.620 --> 02:23:36.380]  different ways, internet, social media, things like that, to monitor and to manipulate us all
[02:23:36.380 --> 02:23:43.740]  the time. And to me, artificial intelligence just puts this on steroids. And so I think there is
[02:23:43.740 --> 02:23:48.940]  something to be anxious about if we're going to look at this. We should be concerned about it.
[02:23:48.940 --> 02:23:53.740]  Maybe not anxious, but we should be concerned about the goals of people who are putting this
[02:23:53.740 --> 02:24:00.780]  kind of stuff together. So, yeah. Well, there's that. And then if you can manage to change the
[02:24:00.780 --> 02:24:06.620]  present, you can manipulate the future. Of course, the real way to get it is to get control of the
[02:24:06.620 --> 02:24:13.020]  past, as Orwell pointed out. You control the past, you know, you can control the present.
[02:24:14.220 --> 02:24:21.100]  Implication, control the future. And we're seeing alterations of materials, even government
[02:24:21.100 --> 02:24:27.340]  documents, government films, documentaries, things like that are being altered in ways that
[02:24:27.980 --> 02:24:34.220]  are not visible, not, I should say, detectable, not detectable to the ordinary person. So they
[02:24:34.220 --> 02:24:42.700]  get ideas about what the past was like, which are wrong and don't show you, as I mentioned in the
[02:24:42.700 --> 02:24:51.980]  book, if you were at a dance in 1850 before the Civil War, and it's a film we're watching. Let's
[02:24:51.980 --> 02:24:57.820]  just say we're watching a film about 1850 and we're seeing people ballroom dancing, all that. Then
[02:24:57.820 --> 02:25:03.340]  one of them pulls to the side and pulls out a cell phone. And you say, wait a minute, we didn't have
[02:25:03.340 --> 02:25:09.980]  cell phones then. Well, you know, there were a lot of things that were going on now that were not
[02:25:09.980 --> 02:25:15.900]  going on in the past. And it's not to our advantage to try to pretend that they were
[02:25:15.900 --> 02:25:21.260]  when they weren't. We have to understand the past to understand the future. And we're not only creating
[02:25:22.380 --> 02:25:32.780]  situations that are false, but we're also, like in 1984, where Orwell created a character called Comrade
[02:25:32.780 --> 02:25:40.620]  Ogilvy. He was a war hero. He got all sorts of medals, and the proletarians were
[02:25:40.620 --> 02:25:48.460]  all told to honor him and so forth. Well, he never existed. He was made up entirely. And
[02:25:48.460 --> 02:25:54.860]  that's one of the things that the narrator is doing in his job at work: inserting photographs
[02:25:55.340 --> 02:26:01.660]  of Ogilvy into historical events that happened, wartime scenarios,
[02:26:01.660 --> 02:26:07.180]  et cetera. And while reading it, we'd say, wow, this is some man. Well, he was a complete
[02:26:07.180 --> 02:26:15.500]  fabrication. We're just about at that point with Sora, the AI, out. Well, it could take you and,
[02:26:15.500 --> 02:26:21.500]  you know, say, let's get David Knight and have him leading some sort of a parade or
[02:26:21.500 --> 02:26:26.300]  whatever. And, you know, suddenly people say, well, gosh, I saw him with my own eyes.
[02:26:26.300 --> 02:26:32.540]  So what's happening is that the actual seeing is believing is being turned on its head. So that's
[02:26:32.540 --> 02:26:38.780]  no longer true. You're talking about a completely fabricated character out of Orwell. Just recently,
[02:26:38.780 --> 02:26:43.660]  they had Tilly Norwood, who is a completely fabricated AI personality. And the person who
[02:26:43.660 --> 02:26:48.860]  came up with it has got agents representing her. They got her out there as an actress.
[02:26:49.660 --> 02:26:54.860]  Yeah. I mean, it's like, so I've created an AI actress, which will do a lot of different roles
[02:26:54.860 --> 02:27:00.780]  for you. She probably does her own stunts as well. I mentioned, but people in SAG, the Screen Actors
[02:27:00.780 --> 02:27:06.860]  Guild, they're furious about this. And they said, any agent that represents this AI character is
[02:27:06.860 --> 02:27:11.980]  not going to do any business with us. But we're already at that point. It truly is interesting.
[02:27:13.500 --> 02:27:17.980]  And one of the ways of neutralizing it is to create the situation that exists right now between
[02:27:17.980 --> 02:27:23.180]  you and me. You're laughing and I'm laughing because it seems funny. And it is funny. But
[02:27:23.180 --> 02:27:30.860]  it's a very serious purpose behind all this. It's all about trying to alter people's perceptions
[02:27:30.860 --> 02:27:37.180]  so that they begin to doubt the veracity of what they're seeing. That's right. Yes. And I've
[02:27:37.180 --> 02:27:41.740]  talked for the longest time about how the whole idea for the internet was created by DARPA
[02:27:41.740 --> 02:27:46.140]  psychologists. And I've been concerned that it was all about the psychological manipulation
[02:27:46.140 --> 02:27:51.900]  from the get-go with all of this. But as a physician and as a neuroscientist,
[02:27:53.100 --> 02:27:57.100]  I'd be interested in your take on what is currently going on. Because besides
[02:27:57.100 --> 02:28:02.460]  manipulating the past by changing information about the past or memory-holing it or writing
[02:28:02.460 --> 02:28:07.500]  a new alternative history of it, they are also concerned. And there's been projects that have
[02:28:07.500 --> 02:28:12.540]  been put out by DARPA. And I don't know if they've been successful or not, but they're putting out
[02:28:12.540 --> 02:28:18.220]  requests for people to come up with things to manipulate people's memories. So you've got a
[02:28:18.220 --> 02:28:24.460]  soldier, they say, who's got bad PTSD. Let's get rid of that memory. Let's give them different
[02:28:24.460 --> 02:28:30.380]  memories. What do you see in terms of someone who studies the brain and neuroscience? What
[02:28:30.380 --> 02:28:35.180]  do you see about that? What do you think is the state of the art with that?
[02:29:36.300 --> 02:29:40.460]  Well, my last book was called The Complete Book of Memory. It had to do with memory. I studied
[02:29:40.460 --> 02:29:47.020]  memory in great detail. And of course you have to do away with the concept that memory is like a
[02:29:47.740 --> 02:29:52.780]  videotape or something that you just store in your brain. And when you want to get it,
[02:29:52.780 --> 02:29:58.460]  you just bring it out, like you bring out a videotape. It's not like that. It's a reconstruction.
[02:29:58.460 --> 02:30:06.140]  Each time you think back to a certain event, you alter that memory so that you have memory one,
[02:30:06.140 --> 02:30:12.620]  memory two, memory three, on and on and on. That's the nature of memory. And memory can be
[02:30:12.620 --> 02:30:18.300]  manipulated. It's always, you know, in the courtroom, they're always trying to avoid the
[02:30:18.300 --> 02:30:24.140]  contamination of the witness. An example of that would be, well, which car went through the red
[02:30:24.140 --> 02:30:30.540]  light? You ask a witness, and he says, oh, it was a red car that went through the red light.
[02:30:31.180 --> 02:30:35.100]  Well, would it surprise you to know that it wasn't a red light, but it was a stop sign,
[02:30:35.660 --> 02:30:43.020]  Mr. Witness? Of course his credibility is gone, because he took the suggestion that it was a
[02:30:43.020 --> 02:30:49.260]  red light. And it'd be very easy to do, because you don't necessarily have that image,
[02:30:49.260 --> 02:30:54.620]  that intersection in your mind. So that's why there's protections, even in the courtroom
[02:30:54.620 --> 02:31:00.700]  against leading the witness, they call it. In other words, providing information that's either
[02:31:00.700 --> 02:31:07.900]  not true at all or half true. So we've got that, because this didn't start in the 21st
[02:31:07.900 --> 02:31:13.100]  century. That started, you know, as long as we've had courtrooms. What's new is more of an emphasis
[02:31:13.980 --> 02:31:19.180]  on altering memory, so that people will get up there and, under cross-examination,
[02:31:19.180 --> 02:31:23.660]  they'll do pretty well, because their whole memory's been altered. It's been changed by
[02:31:23.660 --> 02:31:29.340]  various mechanisms: suggestion, repeating information which is false, of course,
[02:31:29.340 --> 02:31:36.860]  or leaving out information. There was a cartoon about a week ago by Ramirez, a prize-winning cartoonist, in which
[02:31:37.420 --> 02:31:45.580]  he has three doctors in an operating room in a laboratory. One of them is
[02:31:45.580 --> 02:31:51.660]  looking into a microscope and he looks up and he says, this is the most dangerous pathogen we have
[02:31:51.660 --> 02:31:57.420]  ever encountered. And the second doctor says, well, is it bubonic plague? Is it smallpox?
[02:31:58.140 --> 02:32:02.940]  And then the first one says, no, it's misinformation and disinformation.
[02:32:03.660 --> 02:32:11.020]  That's right. And we've got to be very careful because many times the people who will
[02:32:11.020 --> 02:32:16.300]  tell us about that are the people who want to be the ones who define what the information is for
[02:32:16.300 --> 02:32:20.460]  us. And they will ask those leading questions. You know, when we talk about leading questions
[02:32:20.460 --> 02:32:25.420]  and manipulating people, there's been a lot of reports about artificial intelligence and
[02:32:26.380 --> 02:32:34.700]  people who have a particular psychosis or something, and they get involved with the AI
[02:32:34.700 --> 02:32:39.100]  and it starts to confirm the things that they want, because that's what it is set up to do
[02:32:39.100 --> 02:32:44.540]  in terms of bias: it wants to, you know, be empathetic and sympathetic to people. And so
[02:32:44.540 --> 02:32:48.620]  it starts doing that and leading them further and further down a particular rabbit hole.
[02:32:48.620 --> 02:32:54.940]  There have been situations of, you know, people who got into severe mental distress, some suicides of
[02:32:55.020 --> 02:33:00.300]  some young children and other things like that. Speak to that aspect of it and the real danger
[02:33:00.300 --> 02:33:07.260]  of that. That really, I think, speaks to the psychological aspect and potential of
[02:33:07.260 --> 02:33:11.020]  artificial intelligence. And that could be weaponized. Right now it's just kind of happening
[02:33:11.980 --> 02:33:16.860]  out of their business model, right? But that could definitely be weaponized against people.
[02:33:16.860 --> 02:33:21.740]  Well, I talk about that in my book in the chapter on the internet. There are famous examples of
[02:33:21.740 --> 02:33:31.260]  people who have committed suicide right on an internet live feed, and they've been manipulated into doing
[02:33:31.260 --> 02:33:37.900]  that by other people who have encouraged them, said this would be a sign of strength, this would
[02:33:37.900 --> 02:33:45.100]  be a sign that you're not afraid to die if necessary. And there's cases of it that actually
[02:33:45.100 --> 02:33:51.580]  led to the suicide. One of them, the most grisly I have in my book, is about a person who was
[02:33:51.580 --> 02:33:59.100]  talked into pouring gasoline over themselves and striking a match, all on an open internet feed.
[02:33:59.100 --> 02:34:04.940]  And while this fire is burning, you can hear everybody in the backgrounds cheering,
[02:34:04.940 --> 02:34:11.420]  we did it, we did it, we got them to do it. Wow. That's amazing. That's amazing. So there's
[02:34:11.420 --> 02:34:20.700]  something about the internet that actually brings out sadistic, criminal, psychopathic
[02:34:20.700 --> 02:34:26.300]  trends. And we don't know why. Is it the fact that you can't necessarily be identified?
[02:34:27.020 --> 02:34:30.860]  It's something that has influenced the internet greatly
[02:34:31.420 --> 02:34:37.580]  and it will continue to do so until we understand it. I think that's one of the things that's so
[02:34:37.580 --> 02:34:41.900]  dangerous about the things that we saw with lockdown and other aspects of it. There's an
[02:34:41.900 --> 02:34:49.180]  atomization here. And so many different ways the government and tech companies are trying to
[02:34:49.820 --> 02:34:55.180]  make sure that we're not in person with each other. In many cases, like for example,
[02:34:55.180 --> 02:35:00.940]  in this interview, we couldn't do this interview if both of us had to travel. We're able to do
[02:35:00.940 --> 02:35:08.300]  this because we can do it over Zoom or whatever. But just taking ordinary things that you would
[02:35:08.300 --> 02:35:14.620]  normally do in terms of interacting with people in school or in church or in your community or
[02:35:14.620 --> 02:35:18.780]  whatever, taking that away and putting a screen between the two of you, it really does change
[02:35:18.780 --> 02:35:22.780]  the way people interact with each other. I remember Errol Morris, the film director,
[02:35:22.780 --> 02:35:28.060]  was able to get people to say all kinds of things to him. He got a murderer to confess. He got
[02:35:28.860 --> 02:35:34.940]  Robert McNamara to confess about the false flag of the Vietnam War. He got people to say all
[02:35:34.940 --> 02:35:39.420]  kinds of stuff because there was that distance between him and them. He could have interviewed
[02:35:39.420 --> 02:35:44.220]  them in person, but what he did was he used an Interrotron, which is what he called it. It was
[02:35:44.220 --> 02:35:49.180]  basically a teleprompter that he had set up so he could do two-way communication at the time.
[02:35:49.180 --> 02:35:55.180]  And once he had that distance there, then it completely changed the dynamics that he would
[02:35:55.660 --> 02:36:00.220]  have versus with somebody person to person. And that's what we're talking about here, isn't it?
[02:36:01.340 --> 02:36:04.780]  Yeah, we're talking about that. And of course, there are gradations of this,
[02:36:04.780 --> 02:36:10.140]  and it continues. You're interviewing me. We're discussing. I feel like it's a discussion.
[02:36:10.940 --> 02:36:14.860]  If I were to say something that later I regretted, I could probably say,
[02:36:14.860 --> 02:36:23.340]  oh, well, that wasn't me. That was my avatar. Or my agent, right? I got an AI agent that's out there
[02:36:23.340 --> 02:36:32.780]  doing stuff. That's crazy. We also see, though, as a doctor, you're seeing people have noticed
[02:36:32.780 --> 02:36:37.420]  actual physical changes that can be observed in people's brains. I'm thinking of the story
[02:36:37.420 --> 02:36:42.620]  about the London taxi drivers who would do the Knowledge, and they found that as they
[02:36:42.620 --> 02:36:49.500]  memorized all these factual details and drew on them all the time in order to take people through
[02:36:50.460 --> 02:36:54.860]  this very complicated city with its complicated streets, they had a particular part of their
[02:36:54.860 --> 02:37:01.260]  brain that was larger than the typical person. And then they found that once they stopped doing
[02:37:01.260 --> 02:37:05.660]  that, it started to shrink again. And we're starting to see that happening with people in
[02:37:05.660 --> 02:37:10.620]  a lot of different areas of their life, that kind of atrophy. And it's physically observable, isn't it?
[02:37:11.580 --> 02:37:16.700]  Well, it is. You have to learn. You have to use the things that you have learned to do.
[02:37:17.260 --> 02:37:21.900]  Like I mentioned in my memory book, there's all kinds of memory exercises that you could do. I
[02:37:21.900 --> 02:37:28.220]  do them every day, and they're very easy, and they help you to continue with your memory,
[02:37:28.780 --> 02:37:33.420]  and keep it sharp. Give us some examples. I'm sure everybody would love to know that. We'd all like
[02:37:33.420 --> 02:37:38.060]  to have a better memory. What kind of things can we do to exercise our memory? Well, think about
[02:37:38.060 --> 02:37:45.660]  the fact that you never had to learn pictures. When you were an infant, a young child, a picture
[02:37:45.660 --> 02:37:49.740]  was something that you could, you may not know what you're looking at, but you could see it
[02:37:49.740 --> 02:37:55.820]  without an intermediary. Whereas language is something that you have to hear from other people.
[02:37:55.820 --> 02:38:01.660]  It's something that's sort of added on to the brain. So as a result, the
[02:38:03.020 --> 02:38:13.340]  best way of remembering something is to make an image for it. For instance, I have a little
[02:38:13.340 --> 02:38:19.340]  dog called a Schipperke. The Schipperke is a Belgian dog. He's a nice little fellow,
[02:38:20.060 --> 02:38:24.140]  but it was embarrassing to me when walking the street. People say, what kind of a dog is that?
[02:38:24.140 --> 02:38:29.740]  And I couldn't come up with the name because it was so complicated, that Schipperke. I
[02:38:29.740 --> 02:38:36.540]  didn't speak any Dutch or anything. So then I got this image of a small boat with a large captain
[02:38:36.540 --> 02:38:44.300]  with a beard holding a big key. So it was skipper-key. And I'll remember it forever. All I have to do is
[02:38:44.300 --> 02:38:50.380]  have the picture. Once I have the picture, it's easy to do. Another easy way to do it. And
[02:38:50.380 --> 02:38:55.980]  you can do that with all kinds of things, all the time. I was going upstairs before I came down to
[02:38:55.980 --> 02:39:03.340]  the office and I wanted to get my wallet and I wanted to get my cell phone. So I just had an image
[02:39:03.340 --> 02:39:08.940]  of a wallet in the form of a cell phone. And I was walking up the stairs talking into the
[02:39:09.580 --> 02:39:14.780]  wallet cell phone. So I got up and I knew I had these two elements to get. It'd be very easy to
[02:39:14.780 --> 02:39:20.940]  get one and forget the other. So you have these images all the time. And the quickest way, you know,
[02:39:20.940 --> 02:39:28.220]  this is sort of off the topic of the book, but if you want to have a firepower memory for
[02:39:28.220 --> 02:39:35.340]  a load of things, say up to 10 things, get 10 areas that you are familiar with, that you see
[02:39:35.340 --> 02:39:44.700]  every day. And then you can put onto those areas the things you're trying to remember. So if I'm
[02:39:44.700 --> 02:39:55.100]  trying to remember a loaf of bread, milk, maybe batteries, I have a regular way of doing that.
[02:39:55.100 --> 02:40:00.380]  I have, like, I remember the library that's near my home,
[02:40:01.180 --> 02:40:07.260]  the coffee shop, liquor store, Georgetown University Medical School, where I went,
[02:40:07.900 --> 02:40:14.700]  Georgetown University, Cafe Milano, which is a place in Washington everybody gathers,
[02:40:15.260 --> 02:40:24.460]  and then Key Bridge, the Iwo Jima Memorial, and Reagan Airport. So the bread, for instance,
[02:40:24.460 --> 02:40:29.500]  the loaf of bread: I would look in the window of the library and instead of seeing books, I'd see
[02:40:29.500 --> 02:40:35.660]  loaves of bread. And when I get down to the liquor store, instead of it being filled with liquor,
[02:40:35.660 --> 02:40:40.620]  there'd all be milk bottles. So that's how I get to it. So I have those 10, so I can get 10
[02:40:40.620 --> 02:40:46.860]  items together without any problems at all. That's great. Yeah, you know, it's interesting to talk
[02:40:46.860 --> 02:40:53.580]  about the importance of a visualization. It's one of the things that I do in terms of preparing for
[02:40:53.580 --> 02:40:58.860]  the show. I have a lot of articles that I go through. And it's really when I highlight things
[02:40:58.860 --> 02:41:02.540]  or when I write them down, that's when I can remember them. If I don't do that, if I were just
[02:41:02.540 --> 02:41:07.100]  to read these things, I wouldn't remember them. But if I interact with it and write it down,
[02:41:07.100 --> 02:41:11.260]  that helps me to remember it. So that is a kind of visualization there, I guess, as well.
[02:41:12.380 --> 02:41:17.740]  It truly is interesting. And what you said earlier about memory not being something that is stored
[02:41:17.740 --> 02:41:23.020]  in a place as somebody coming from a computer science background, that was a very different
[02:41:23.020 --> 02:41:31.100]  thing. When you construct your memory, how do you reconstruct that? That opens up a whole new
[02:41:31.100 --> 02:41:35.500]  area of questions as well. In other words, every time somebody brings up a subject,
[02:41:36.620 --> 02:41:41.260]  isn't there something that's stored initially to reference and then rebuild from?
[02:41:42.540 --> 02:41:47.660]  Yeah, there's that. Plus, there's the interconnections. Somebody listening to us
[02:41:47.660 --> 02:41:51.740]  might say, well, gee, this is called the 21st-century brain, but I haven't heard that much
[02:41:51.740 --> 02:41:57.580]  about the brain. Well, let me just link that up so that these things make sense. We have a new
[02:41:57.580 --> 02:42:01.740]  version, or I should say a new understanding of the brain called the connectomic brain,
[02:42:02.460 --> 02:42:09.260]  in which there's all kinds of interactions in the brain of parts of the brain which you don't,
[02:42:09.260 --> 02:42:14.220]  we're just learning about. I have the, I use the metaphor of a bowl of spaghetti.
[02:42:14.780 --> 02:42:20.620]  You pull out one of the strands of spaghetti and you never have any idea what it's connected to,
[02:42:20.620 --> 02:42:27.260]  how many other strands of spaghetti it's connected to. If you think of the brain as
[02:42:27.260 --> 02:42:36.300]  being set to make connections, that's its natural processing. It gets back to these things that we
[02:42:36.300 --> 02:42:42.540]  were talking about earlier, global warming and memory and surveillance and all that.
[02:42:42.540 --> 02:42:49.020]  How are we going to solve all those? Well, somehow or other, those things are connected with each
[02:42:49.100 --> 02:42:57.660]  other. That's the take-home message of this book. And the basic goal is to try to figure out what
[02:42:57.660 --> 02:43:05.740]  it is that connects these things, what it is that would allow us to, by solving one of them,
[02:43:06.380 --> 02:43:13.500]  solve the other. And I mentioned at the end of the book, experts so far haven't done it.
[02:43:14.140 --> 02:43:21.180]  So it's useful, as Hayek said, to get ordinary people, and when I say ordinary, I mean
[02:43:21.180 --> 02:43:26.620]  non-specialized people, to give their ideas. Gee, I wonder if such and such would happen.
[02:44:27.900 --> 02:44:31.100]  What would happen about global warming? For a while there were, and in fact there still are,
[02:44:31.660 --> 02:44:40.780]  experiments going on on the effect of sulfur, shooting sulfur up into the atmosphere,
[02:44:40.780 --> 02:44:45.900]  that would help the CO2 problem. Of course, the reason for that was the
[02:44:46.700 --> 02:44:54.300]  volcano in 1980-something; after that volcano in Hawaii, it was noted that the
[02:44:55.180 --> 02:45:00.940]  air was clearer and there was less pollution. So that's something to think about. Is there some
[02:45:00.940 --> 02:45:09.820]  way of using that particular sulfur experiment to decrease global warming? War, for instance,
[02:45:09.820 --> 02:45:15.820]  we don't think of war as a cause of global warming, but it is, or of CO2. Thermonuclear
[02:45:15.820 --> 02:45:25.340]  warming. Yeah, but what the Ukraine war and the Gaza war put out, then, you know, is a tremendous
[02:45:25.340 --> 02:45:31.260]  amount that's going to overcome and exceed the benefit of any of these things like, you know,
[02:45:32.540 --> 02:45:39.100]  non-gasoline engines and things like that. Absolutely. Yeah, it's kind of like, you know,
[02:45:39.180 --> 02:45:46.860]  shooting up rockets in order to put satellites up. You know, how many cars and lifetime use of cars
[02:45:46.860 --> 02:45:50.380]  from people would that be equivalent to? And you start talking about all the missiles that
[02:45:50.380 --> 02:45:56.300]  are being shot and then you get to the explosives as well. It is really interesting how they focus
[02:45:57.340 --> 02:46:02.940]  on their objectives and their ways to control it. The manipulation has been going on for quite
[02:46:02.940 --> 02:46:10.220]  some time. And so, yeah, that is pretty amazing. And I guess that's my concern. You know, I
[02:46:11.740 --> 02:46:15.580]  look at this stuff. It really does look like science fiction and I'm almost inclined to
[02:46:15.580 --> 02:46:19.820]  write it off when I first see it. When DARPA is saying, well, we need to find some way that we can,
[02:46:21.020 --> 02:46:25.260]  you know, erase memories in people and insert new memories into them. I mean,
[02:46:25.260 --> 02:46:31.260]  we're going back to Total Recall, right? So it sounds like something from a Philip K. Dick
[02:46:31.260 --> 02:46:35.340]  novel, but they're really working on that. And I guess one of the most striking things we saw,
[02:46:35.340 --> 02:46:40.620]  we reported on a couple of weeks ago, and it was a company that was bragging about how
[02:46:41.260 --> 02:46:46.940]  they could read your mind more accurately and quickly than their competitors. Because there's
[02:46:46.940 --> 02:46:53.420]  a lot of different companies that are doing this. It's called Brain IT, that was the
[02:46:53.420 --> 02:47:02.380]  name of the company. And so they had a way that they would do MRI and they could essentially
[02:47:02.380 --> 02:47:06.140]  train it on your brain in a much shorter period of time than the other people. And they could get
[02:47:06.140 --> 02:47:10.620]  much better results and our producers just pulled this up. So what they do is they show you an image
[02:47:11.340 --> 02:47:15.260]  and you're looking at that image and then it's reading your mind and reconstructing what you're
[02:47:15.260 --> 02:47:19.980]  looking at, which I thought was absolutely amazing and terrifying at the same time.
[02:47:20.620 --> 02:47:25.100]  How is this going to be used? I guess that's the real issue. When we start talking about all
[02:47:25.100 --> 02:47:30.700]  these different things, I think that is the real case that it's difficult for people to understand
[02:47:30.700 --> 02:47:36.780]  just how far and how quickly the technology has progressed. And then to ask, how do we keep
[02:47:36.780 --> 02:47:45.020]  this from being used for bad purposes? Well, that's a specifically 21st-century problem.
[02:47:45.260 --> 02:47:51.180]  Because all of these things have either originated in the 21st century or they have in fact
[02:47:51.900 --> 02:47:57.180]  further developed and become increasingly threatening. And bear in mind, we have to
[02:47:58.060 --> 02:48:01.900]  solve these problems because they're not something that's going to go away.
[02:48:01.900 --> 02:48:08.540]  And then the most important thing to remember, David, is that all of these things harm the brain
[02:48:08.540 --> 02:48:14.620]  and the brain is the thinking processor that's going to save us. It's going to figure out what
[02:48:14.780 --> 02:48:21.100]  the solutions to the problems are. So we know now that wildfire smoke, for instance,
[02:48:21.900 --> 02:48:29.100]  it creates dementia. It increases the likelihood of somebody developing it. So as the brain is
[02:48:29.740 --> 02:48:36.060]  affected negatively increasingly over longer and longer periods of time, our ability to solve these
[02:48:36.060 --> 02:48:41.980]  problems is going to decrease. So we've got to do it now. We've got to get serious about it.
[02:48:41.980 --> 02:48:46.300]  And this business of people getting up and saying the global warming
[02:48:46.300 --> 02:48:50.700]  is fiction and all that is really very, very disturbing.
[02:48:51.900 --> 02:48:56.780]  Yeah. Well, you know, the example that you gave earlier of the fact that the Indian government
[02:48:56.780 --> 02:49:01.740]  was manipulating the temperature at some of the stations there, that kind of works both ways.
[02:49:02.460 --> 02:49:07.580]  They have put some of these temperature stations on the airport tarmacs. And in the UK, they have
[02:49:08.380 --> 02:49:12.220]  a lot of the temperature stations that they've got there. They're just extrapolating the data.
[02:49:12.220 --> 02:49:16.780]  They don't have real temperature measurement stations there. So it all really gets back,
[02:49:16.780 --> 02:49:21.500]  I think, to the scientific method. And that's really where we have to hold people's feet to
[02:49:21.500 --> 02:49:26.700]  the fire. When we're talking about something like that, we can have an absolute standard of what truth is.
[02:49:26.700 --> 02:49:31.900]  And that truth is going to be being able to measure something accurately and being able
[02:49:31.900 --> 02:49:36.940]  to reproduce that. And then I think a good yardstick for that is when somebody is trying
[02:49:36.940 --> 02:49:42.620]  to hide their data, that's the clue right there, that they're not doing science because if they're
[02:49:42.620 --> 02:49:47.100]  doing science and they've come to the right conclusion, they don't have a problem with
[02:49:47.100 --> 02:49:52.700]  somebody looking at their data. And so I've got a question here for you from a person in the
[02:49:52.700 --> 02:49:58.940]  audience asking, you know, about doctors James Giordano and Charles Morgan and their work with
[02:49:58.940 --> 02:50:02.860]  military. I'm not familiar with those names. I don't know if you know anything about that or not.
[02:50:03.580 --> 02:50:09.020]  Giordano sounds familiar. What particular thing are they asking about?
[02:50:09.020 --> 02:50:11.740]  I don't know. It just says their work with the military. I guess it would have to do with
[02:50:11.740 --> 02:50:18.140]  something. But you haven't heard of it? I'm not sure I could say Giordano did this or did that.
[02:50:18.140 --> 02:50:23.820]  No. Sure. I understand. Yeah. Let's talk a little bit about the things that we have been anxious
[02:50:23.820 --> 02:50:29.340]  about. And of course, as Christians, we have one answer to it. But you talk about how this is
[02:50:29.340 --> 02:50:36.700]  something that has been around pretty much all of our life. I grew up with anxiety about nuclear
[02:50:36.700 --> 02:50:43.500]  war, for example. That was in everybody's television and that was forefront of our mind,
[02:50:43.500 --> 02:50:48.220]  especially growing up in Florida when the Cuban Missile Crisis was happening. They got us really
[02:50:48.220 --> 02:50:51.900]  afraid of that when I was in elementary school. It's like there's not going to be enough time for
[02:50:51.900 --> 02:50:57.020]  you to get home when the nuclear bombs start falling. And so, I mean, there's all these
[02:50:57.020 --> 02:51:02.940]  different ways that you can panic people. I guess part of it is how do we identify the real problems
[02:51:03.500 --> 02:51:09.260]  and how do we deal with those problems? Because there's always things that are competing
[02:51:09.820 --> 02:51:15.580]  for our attention and our anxiety, many of which are not real. And usually the things that you're
[02:51:15.580 --> 02:51:20.460]  really the most concerned about don't happen. And it may be sometimes because you have taken
[02:51:20.460 --> 02:51:24.460]  a precaution about it. What would you say about that, about anxiety?
[02:51:27.020 --> 02:51:32.380]  You're starting to break up a little bit. Can you hear me clearly?
[02:51:32.380 --> 02:51:35.180]  I hear you. Yes. Yes. Sorry about that,
[02:51:35.180 --> 02:51:39.340]  breaking up a little bit. You're talking about traumatizing a population.
[02:51:40.300 --> 02:51:43.420]  You know, what do I do to guard against that type of thing? And of course,
[02:51:43.420 --> 02:51:49.420]  that's going to really escalate with the ability of AI to create a narrative.
[02:51:49.420 --> 02:51:54.540]  Yeah. Well, let's talk about it as an avenue to get into that. Let's go back to what you brought up
[02:51:54.540 --> 02:52:00.380]  about the atomic weapons and the atomic war and the fears of the people that there's going to be
[02:52:00.380 --> 02:52:06.940]  another atomic war. I mean, you know, this is not unrealistic. There's even been a movie that's just
[02:52:06.940 --> 02:52:13.180]  come out that's getting all kinds of attention, as you know, and it has to do with the threat of a
[02:52:13.820 --> 02:52:21.660]  nuclear war. If you look at what's happening in Europe right now, there's all kinds of suggestions
[02:52:21.660 --> 02:52:27.260]  that could lead to a nuclear war. I mean, Ukraine now has announced that they're under no conditions
[02:52:27.260 --> 02:52:34.940]  willing to give up any land. And Stalin is, I mean, Putin is thinking what he can do to change that.
[02:52:34.940 --> 02:52:39.580]  Maybe he'll attack another country. I mean, this is scary stuff.
[02:52:40.140 --> 02:52:46.540]  So what's happening in response is the government trying to show that we shouldn't worry about
[02:52:46.540 --> 02:52:49.660]  it. We have things under control, but I don't think things are under control.
[02:52:49.660 --> 02:52:57.100]  Mm hmm. And we've talked about the problems. You have your final
[02:52:57.100 --> 02:53:04.940]  chapter is new ways of thinking. And I'd like to talk about that. One of the things that you say
[02:53:05.020 --> 02:53:09.740]  is Ockham was wrong, Ockham's razor that, you know, people are familiar with. Tell us a little bit
[02:53:09.740 --> 02:53:14.780]  about that. Why is Ockham wrong? Well, because he says that, you know, the
[02:53:15.660 --> 02:53:20.620]  entities are not to be multiplied, meaning that we can always explain things best by limiting
[02:53:20.620 --> 02:53:27.740]  ourselves to the minimum number of factors. Ideally one: one cause for every fact. That's not
[02:53:27.740 --> 02:53:33.100]  true. It's certainly not true in the 21st century, where there's all kinds of interruptions. And
[02:53:33.420 --> 02:53:39.660]  there's all kinds of interactions between factors
[02:54:31.180 --> 02:54:44.220]  and causes. So Ockham was wrong on that basis. We have to think of an interconnecting
[02:54:44.220 --> 02:54:49.180]  pool, just as in the brain, of interconnections of neurons, interconnections of these problems,
[02:54:49.180 --> 02:54:53.900]  and they're all related. They're all related. All eight of them I talk about in my book,
[02:54:53.900 --> 02:54:59.020]  they're all related. And if you can figure a way of influencing one, you influence all the others.
[02:54:59.500 --> 02:55:04.620]  I mean, who would think there'd be a connection between global warming and the amount of
[02:55:05.340 --> 02:55:12.300]  artisanal cheese, for instance, high-end cheese? Well, there is, because they don't,
[02:55:12.300 --> 02:55:18.140]  chickens don't lay many eggs, and there'd be all the various other things that come on in terms
[02:55:18.140 --> 02:55:24.220]  of making cheese. I learned that the other day. That was something that was a surprise to me.
[02:55:24.220 --> 02:55:27.340]  You know, it's kind of interesting when you talk about connections so much. There was a
[02:55:28.140 --> 02:55:32.860]  series that was on PBS. I think the guy's name was Burke. I can't remember his first name. I'm
[02:55:32.860 --> 02:55:38.540]  not sure about the last name, but he had a series called Connections. And I thought it was fascinating
[02:55:38.540 --> 02:55:44.860]  because what he would do is he would take a whole series of connections to show how a particular
[02:55:44.860 --> 02:55:52.220]  technology had evolved. So he might go from the quill to the jet engine or something like that.
[02:55:52.220 --> 02:55:57.660]  And it was a fascinating, fascinating thread of things. It's very much like what you're talking
[02:55:57.660 --> 02:56:06.300]  about. It really is. And I did, I did consult his work, actually. Did you? I was writing this book
[02:56:06.300 --> 02:56:11.420]  because he did that Connections series. He did a book called The Day the World Changed and all this.
[02:56:11.420 --> 02:56:16.780]  He also did a book called Circles, in which he would start with one particular event that
[02:56:16.780 --> 02:56:22.300]  had occurred in history. And if you go around the circle, you come back to the beginning where
[02:56:22.300 --> 02:56:27.580]  it started, where this particular inventor invented something. What led up to it? What was the
[02:56:27.580 --> 02:56:32.780]  circle leading to that? So yes, we're talking about connections and we're talking about the
[02:56:32.780 --> 02:56:40.220]  inability to understand things without reference to supporting and accessory factors. We have that
[02:56:40.220 --> 02:56:46.220]  going all the time, denying things that are going to be happening. And of course, I think the fearful
[02:56:46.220 --> 02:56:52.380]  thing is that the government is aiding in this denial, because if you would deny that there's a
[02:56:52.380 --> 02:57:00.300]  problem, then there's very little impetus to try to solve it. You know, yeah, and there ain't no problem.
[02:57:00.300 --> 02:57:08.380]  Don't try to solve it. They're throwing out their own chaos and uncertainty and anxiety that's out
[02:57:08.380 --> 02:57:14.780]  there all the time, always, I guess. So the question is, you're talking about volatility,
[02:57:14.780 --> 02:57:18.860]  uncertainty, complexity and ambiguity. I mean, it sounds like a government policy.
[02:57:19.500 --> 02:57:26.140]  I think they've got bureaucracies that specialize in that. Yeah. Yeah. Well, actually, that's true.
[02:57:26.140 --> 02:57:30.460]  Yeah. That's in your section there about new ways of thinking. And so how do we incorporate that
[02:57:31.020 --> 02:57:34.220]  into new ways of thinking that help us to solve this riddle?
[02:57:36.140 --> 02:57:42.620]  Well, each of those factors is a factor that helps you to understand things
[02:57:43.100 --> 02:57:46.300]  and to have more control. It doesn't necessarily mean it helps you to
[02:57:46.940 --> 02:57:52.220]  link them together. That has to be done by original thinking. You have to be, you know,
[02:57:52.220 --> 02:57:58.780]  under those things, things are volatile. You don't have a basic situation that doesn't change. It
[02:57:58.780 --> 02:58:06.940]  changes all the time. So the other thing that I want to emphasize most is the role of capitalism
[02:58:06.940 --> 02:58:14.140]  in all of this. I mean, there's all this, like the private equity, the business of people
[02:58:14.140 --> 02:58:22.620]  having a point of view that is going to advance them financially and that blinding them to the
[02:58:22.620 --> 02:58:27.580]  problems that are here. Like, for instance, we talked about global warming. Well, the rich people,
[02:58:27.580 --> 02:58:34.700]  very rich people, are buying multi-million dollar apartments and condominiums which have special
[02:58:34.700 --> 02:58:41.980]  air filters, which will keep the wildfire smoke out. And we'll try to keep the global warming
[02:58:41.980 --> 02:58:52.300]  effect at bay by superpower air conditioners. Of course, they're building their own bunkers,
[02:58:52.300 --> 02:58:59.340]  too. They're building things that are creating all kinds of chaos and weapons of war, mass
[02:58:59.340 --> 02:59:04.380]  destruction. They're building super bunkers in various places as well. So I'm
[02:59:04.380 --> 02:59:09.740]  somewhat pessimistic about what they're doing. Well, basically the idea is that, you know,
[02:59:09.740 --> 02:59:14.300]  we don't care about the ordinary person. We're going to survive. We're going to see to our own
[02:59:14.300 --> 02:59:21.180]  survival. And in order to do that, we have to deny certain things that are going on. We'll do so.
[02:59:21.180 --> 02:59:26.380]  Now, incidentally, all of this is not conscious thinking. They don't necessarily say, well,
[02:59:26.380 --> 02:59:32.540]  I'm going to deny global warming because it'll be to my advantage financially because all my
[02:59:32.540 --> 02:59:39.180]  investment is in the oil and gas industry. They don't do it that way. They come up with pseudo
[02:59:39.820 --> 02:59:44.540]  logic, things that seem to make sense to them. But if they didn't have a financial
[02:59:45.420 --> 02:59:49.500]  thrust in the matter, they would look out upon it quite differently.
[02:59:49.500 --> 02:59:53.580]  That's right. We can always find a justification for what it is that we really want.
[02:59:54.700 --> 02:59:58.220]  Everybody should understand that if you're a parent this time of year at Christmas time,
[02:59:58.220 --> 03:00:02.460]  you can always understand that people come up with a justification for what they want.
[03:00:02.460 --> 03:00:07.340]  And that's as true of a government as it is of corporations out there. And it's really dangerous
[03:00:07.340 --> 03:00:10.700]  when the two of them connect with each other. I think that's one of the things, you know,
[03:00:10.700 --> 03:00:15.580]  you talk about connections and the importance of it and how we can try to connect these different
[03:00:15.580 --> 03:00:21.020]  factors, each of us individually. But I think it's the human connection that is out there that is
[03:00:21.020 --> 03:00:27.260]  going to be essential for all of this. It's going to be our collective work on all this. What do you
[03:00:27.260 --> 03:00:32.220]  think about that? Would you agree with that? Well, I'd agree with it. But there's so many
[03:00:32.220 --> 03:00:39.900]  things that are taking place now that are causing the schisms and splitting people into factions and
[03:00:39.900 --> 03:00:46.220]  belief systems and political points of view. And that's very dangerous because then you
[03:00:46.220 --> 03:00:52.300]  can't get together any kind of unity even in the face of an emergency. Well, I think we've always
[03:00:53.260 --> 03:00:57.660]  had these factor, you know, factions and things like that. You know, the founders of the country
[03:00:57.660 --> 03:01:02.940]  warned about factions and political parties. But I think what makes it unique is that when you're
[03:01:02.940 --> 03:01:08.460]  interacting with people on a personal basis, you interact with them a little bit differently than
[03:01:08.460 --> 03:01:13.660]  if you've got that separation between you that technology is giving us now. Because now you're
[03:01:13.660 --> 03:01:18.140]  interacting with something that's abstract. It's not with another person. And there's also the
[03:01:18.140 --> 03:01:23.260]  body language that you're not picking up on. But it makes it easier for you to be harder on people
[03:01:23.260 --> 03:01:27.980]  when there's that distance there, I think. That's why I think, you know, the personal connection,
[03:01:27.980 --> 03:01:33.340]  I think, is really vital to making these connections and coming up with an understanding
[03:01:33.340 --> 03:01:37.740]  of what's going on. We talk about the hidden factors that are out there, hidden unrelated
[03:01:37.740 --> 03:01:43.100]  topics. Other people, as you pointed out earlier, just talking to ordinary people about what it is
[03:01:43.100 --> 03:01:48.540]  that you see with different things. I think that is the genius of the collective free market out
[03:01:48.540 --> 03:01:55.180]  there that there's so many observers who are looking at things and thinking about them.
[03:01:55.180 --> 03:02:00.700]  And it's kind of their collective decision that is kind of guiding things along, as opposed to
[03:02:00.700 --> 03:02:06.140]  having a central planner who's doing that. What do you think about that? You've got to, in your
[03:02:06.140 --> 03:02:10.380]  final chapter, a new way of thinking, you have what you call a sensible solution.
[03:02:10.860 --> 03:02:17.660]  What does that really involve? I'm sorry, I didn't hear what you said. What's the last part?
[03:02:17.660 --> 03:02:24.380]  You have a sensible solution. What do you think a sensible solution to the kind of stress and chaos
[03:02:25.100 --> 03:02:28.620]  and anxiety that we have, manipulation that we have? What is the solution to that?
[03:02:29.420 --> 03:02:35.100]  Well, I think the Wikipedia is a good example of that. They have people from all walks of life,
[03:02:35.100 --> 03:02:41.980]  all levels of education, free to contribute to whatever topic they may want to do that.
[03:02:43.580 --> 03:02:50.780]  I mentioned earlier about the effect of global warming on the making of cheese. There might be
[03:02:50.780 --> 03:02:55.900]  somebody who makes cheese that's going to come up with some idea. You know, we don't know that. We
[03:02:55.900 --> 03:03:01.900]  don't know that that may not be where some original idea on what to do about global warming comes from.
[03:03:01.900 --> 03:03:06.860]  And you put it on what I'd like to think, and I hope it will be developed, a kind of Wikipedia
[03:03:07.660 --> 03:03:13.260]  where the ordinary person can feel free to put forth their ideas about it. Now, you say, well,
[03:03:13.260 --> 03:03:19.340]  we already have that. We have the internet. No, we don't. The internet is a commercial situation.
[03:03:19.340 --> 03:03:25.500]  It's all done for making money and grabbing attention and all that. And there's no criticism on it.
[03:03:25.500 --> 03:03:30.540]  There's no peer review, if you will. Whereas in the Wikipedia, I mean, you know, people can write
[03:03:30.540 --> 03:03:35.900]  in and say, well, that particular contribution is bonkers and then give an example why it is,
[03:03:35.900 --> 03:03:41.020]  or that was a very good idea. And after that, you begin to get things coming together
[03:03:42.060 --> 03:03:46.060]  in unpredictable ways that may help us solve these eight problems.
[03:04:44.700 --> 03:04:49.100]  You know, the problem is, it seems like
[03:04:49.100 --> 03:04:53.580]  whenever you wind up having a forum or a place where things can be said, and that's true of the
[03:04:53.580 --> 03:04:58.140]  internet, it's also true of Wikipedia, then it becomes, you have gatekeepers who are there.
[03:04:58.140 --> 03:05:05.420]  And we saw this in spades throughout the COVID stuff that if somebody's got a different idea,
[03:05:05.420 --> 03:05:12.380]  rather than debate them, the impetus is to silence them by the people who are in authority.
[03:05:12.380 --> 03:05:19.980]  And so that really, I think is the key thing. And I think as part of that, we see a continuing rise
[03:05:19.980 --> 03:05:30.380]  in disgust for and deprivation of free speech. People are not interested in the principle
[03:05:30.380 --> 03:05:35.020]  of free speech, they don't want to have open debate. And I see this regardless of where people
[03:05:35.020 --> 03:05:42.460]  are coming from on the political spectrum, there is a declining interest in debate and thinking,
[03:05:42.460 --> 03:05:47.980]  you know, the debate is critical to critical thinking. And so the people who are in charge,
[03:05:48.620 --> 03:05:52.380]  the gatekeepers, whether it's Wikipedia, or the internet, or, you know, any other
[03:05:53.020 --> 03:05:58.940]  forum of information, they are weighing in on that. And they don't want things that they disagree
[03:05:58.940 --> 03:06:02.940]  with. And it might be because they've got an agenda, or it might be because they've just got a
[03:06:03.500 --> 03:06:09.180]  particular prejudice about something, they want to make sure that the contrary views don't get out
[03:06:09.180 --> 03:06:14.860]  there. That I think is the real key that's there. And again, this is part of this
[03:06:15.820 --> 03:06:21.740]  atomization that we have of people feeding that tribalism in a way that we've never seen it before,
[03:06:21.740 --> 03:06:28.220]  using technology. I agree with everything you've just said, exactly. And I think we have to try to
[03:06:28.220 --> 03:06:34.700]  get beyond that. But we get back again to this business of people having their own personal,
[03:06:34.700 --> 03:06:42.540]  financial point of view and position, and pushing that basically because of how they look upon
[03:06:42.540 --> 03:06:48.620]  it. So maybe we're talking about a capitalism problem. We've got capitalism, that's what this
[03:06:48.620 --> 03:06:52.460]  country is all about. But I mean, it's certain parts of it. Now, we've gotten to the point
[03:06:52.460 --> 03:06:58.860]  where people are unable to take another point of view, if it's going to be financially harmful
[03:06:58.860 --> 03:07:05.500]  and hurtful to them. Yeah, I think that, you know, certainly kind of the tech companies,
[03:07:05.500 --> 03:07:09.100]  I don't think that their capitalism would exist. I don't think they'd have billions of dollars if
[03:07:09.100 --> 03:07:15.420]  they weren't unified with the government. So there's a symbiosis there that the two of these
[03:07:15.980 --> 03:07:23.100]  entities feed off of each other. And I think that that nexus right there is the difficult thing.
[03:07:23.100 --> 03:07:28.620]  And so I think, you know, when I think of capitalism, I don't like to refer to capitalism
[03:07:28.620 --> 03:07:34.540]  anymore, because I think of it as a partnership, a public-private partnership, some kind of a
[03:07:34.540 --> 03:07:39.740]  economic fascism where they are working together. But I like to think of a free competitive market
[03:07:39.740 --> 03:07:45.100]  where the government doesn't have any role except as some kind of a referee between two parties that
[03:07:45.100 --> 03:07:49.900]  have a conflict or something. But yeah, that's the thing that's really driving this. You know,
[03:07:49.900 --> 03:07:53.340]  many people, when they talk about AI, they said, well, you know, here's a couple of different
[03:07:53.340 --> 03:07:57.980]  outcomes. Maybe this stuff really works the way it's supposed to work and takes everybody's jobs,
[03:07:57.980 --> 03:08:03.580]  and we wind up with a depression. Or maybe it doesn't work at all, in which case the big AI
[03:08:03.580 --> 03:08:07.100]  stock bubble that we've got bursts and everybody loses their jobs because of that.
[03:08:07.740 --> 03:08:12.380]  And so there's a third alternative, and that is that the government keeps propping it up with
[03:08:12.380 --> 03:08:21.500]  public funds, because it feeds their surveillance and manipulation needs, their ability to surveil
[03:08:21.500 --> 03:08:26.780]  and to control us. And I really think that that's where this is all going to head. I don't really,
[03:08:26.780 --> 03:08:31.500]  you know, those other two things may happen, and they may be true. But I think there is a customer
[03:08:31.500 --> 03:08:36.460]  out there for the AI stuff that is driving all this stuff that has been putting out these
[03:08:36.460 --> 03:08:40.060]  proposals for the longest time. And that's governments, governments around the world.
[03:08:40.060 --> 03:08:44.700]  I mean, we look at the BRAIN project that we had a few years ago. That was during the Obama
[03:08:44.700 --> 03:08:50.540]  administration. But things like the brain-computer interface that Elon Musk and many other tech
[03:08:50.540 --> 03:08:55.980]  companies are doing out there, there's Neuralink, and there's a lot of them that are doing that.
[03:08:55.980 --> 03:09:01.980]  That's being driven by the government wanting to connect into our minds, hack into our minds,
[03:09:01.980 --> 03:09:05.980]  really. And they've been funding that kind of stuff. So how do we break that?
[03:09:07.820 --> 03:09:11.660]  On the Musk side, he's doing it for money. I mean, obviously to make money.
[03:09:11.660 --> 03:09:12.300]  That's right.
[03:09:12.300 --> 03:09:17.820]  So that there's an unholy alliance, if you will, between someone who can't see anything other than
[03:09:17.820 --> 03:09:22.620]  the dollar, and on the other side, the government can't see anything other than increasing power
[03:09:22.620 --> 03:09:25.020]  and surveillance over the population.
[03:09:25.020 --> 03:09:29.420]  Yeah, that's right. Absolutely true. Well, it's a fascinating book. It's a fascinating
[03:09:29.420 --> 03:09:35.820]  take on this. And of course, you've written many books on the brain. The memory one,
[03:09:35.820 --> 03:09:42.460]  very interesting. And you do have sections about memory in this book as well. And people will be able
[03:09:42.460 --> 03:09:47.500]  to find this on Amazon, which I guess is the best place they can find it, looking for the title of
[03:09:47.500 --> 03:09:54.620]  this. And it is, you know, it is something that I think we all need to think about: how we're going
[03:09:54.620 --> 03:10:00.700]  to operate with the effects that this technology is having on our brains in the 21st century.
[03:10:00.700 --> 03:10:06.700]  And that is the title of the book, The 21st Century Brain by Richard Restak. Thank you very
[03:10:06.700 --> 03:10:09.900]  much, Dr. Restak. Thank you. Appreciate you coming on.
[03:10:09.900 --> 03:10:11.980]  Good day. I enjoyed it very much. Thank you.
[03:10:11.980 --> 03:10:15.340]  Very interesting conversation. Thank you. Have a good day, folks. We're going to take a quick
[03:10:15.340 --> 03:10:18.380]  break and we will be right back.
[03:11:45.340 --> 03:12:04.940]  You're listening to the David Knight Show.
[03:12:45.340 --> 03:13:14.700]  Welcome back. And I've had a lot of comments. I do want to get to these. I knew before I
[03:13:14.700 --> 03:13:18.700]  brought him in that he was. I didn't think he was going to be that focused on climate change. I
[03:13:18.700 --> 03:13:25.660]  really wanted to talk to him about the other issues that were there. But yeah, it's we had
[03:13:25.660 --> 03:13:33.260]  a lot of comments about that. As a matter of fact, listen, is this thing about the cheese stuff and
[03:13:33.260 --> 03:13:39.820]  global warming connection? Is that so they can try to tax the cheese? So I guess the question is,
[03:13:39.820 --> 03:13:43.980]  who stole the cheese, right? These people are trying to steal our cheese all the time. But we
[03:13:43.980 --> 03:13:53.420]  do have an update, by the way. And this is some comments from the telegram chat. Paul McCloud said,
[03:13:53.420 --> 03:13:57.740]  I'm asking each and every one of you to send prayers in my direction for a specific reason
[03:13:57.740 --> 03:14:03.500]  that I cannot disclose at the moment. By the pricking of my thumbs, something wicked this way
[03:14:03.500 --> 03:14:09.020]  comes. So they sent that. So I just pass that along to you. That's for Paul McCloud, who is asking
[03:14:09.020 --> 03:14:14.940]  for prayer. And for the love of the road, Ryan has given us an update on his dad's surgery. He said,
[03:14:14.940 --> 03:14:20.620]  Dad's surgery was done yesterday afternoon. It went well, and they eliminated all seven blockages. Wow.
[03:14:21.660 --> 03:14:26.780]  Had to take veins from other parts of the body to go around some of them, though. He should be home
[03:14:26.780 --> 03:14:31.740]  by Saturday. He said, sorry to hear about Clyde Lewis. Glad he's got a loyal base that is helping
[03:14:31.740 --> 03:14:39.580]  him with GoFundMe. Yes. And so I'm glad that things are going well for your dad, Ryan. I hope it
[03:14:39.580 --> 03:14:44.620]  continues to go that way. We'll continue to pray about that. And let me get some of your comments
[03:14:44.620 --> 03:14:50.780]  here. Occam's Razor is not what people think it is. It states that the explanation with the least
[03:14:50.780 --> 03:14:55.660]  number of assumptions is likely to be correct. Not the simplest explanation is likely to be correct.
[03:14:55.660 --> 03:15:04.060]  That's from Greg Hume, 121. That's fine. Yes. And he says, oh, let's see, this is I'm Marty. He says,
[03:15:04.060 --> 03:15:09.820]  come on, most wildfires are arson, not global warming. I agree with that. I agree with that.
[03:15:09.820 --> 03:15:18.700]  And you all know that I'm not buying into global warming. And he began by talking about how they
[03:15:18.700 --> 03:15:25.260]  were manipulating the data at the Indian stations to try to minimize the pollution that was there.
[03:15:25.260 --> 03:15:32.140]  And to lower the temperature. But typically, governments are doing just the opposite.
[03:15:32.780 --> 03:15:40.140]  And it was the climate change crowd, the global warming crowd, that gave India the license to
[03:15:40.140 --> 03:15:46.140]  have power plants as cheap and dirty as possible. So you might want to start with what the government
[03:15:46.140 --> 03:15:52.860]  policy has been towards their MacGuffin of climate change. That's the reason they have that kind of
[03:15:52.860 --> 03:15:58.060]  pollution that's there. And of course, that was why Nixon unconstitutionally created the
[03:15:58.060 --> 03:16:02.860]  Environmental Protection Agency. There's nothing in the Constitution that says that it's the role
[03:16:02.860 --> 03:16:08.460]  of the federal government to protect the environment. And they did it because of pollution.
[03:16:08.460 --> 03:16:13.500]  They said, we've got some polluted sites that are so big, we don't have the money to address them
[03:16:13.500 --> 03:16:18.700]  at the local or state level. So let's do it at the federal level. And so they had their Superfund
[03:16:18.780 --> 03:16:26.700]  cleanup thing. And then they metastasized from pollution to telling us what kind of cars we
[03:16:26.700 --> 03:16:33.420]  could have and emission control with that. So again, it's mission creep, or I guess we could say
[03:16:33.420 --> 03:16:39.900]  emission creep. Though in the case of the Indian testing stations, I believe he was referring to
[03:16:39.900 --> 03:16:46.460]  air quality with the massive amounts of air pollution they have in these cities. And in saying
[03:16:46.460 --> 03:16:51.980]  it, I believe he was implying that you should clean up the air, which in that instance, I would agree.
[03:16:53.020 --> 03:16:59.500]  Yeah, you find, interestingly enough, you know, in the two most populous countries, China and
[03:16:59.500 --> 03:17:04.300]  India, where they've said, don't worry about cleaning up the pollution from your factories
[03:17:04.300 --> 03:17:10.380]  or your power stations, do whatever you want, right. They also have the worst air pollution,
[03:17:10.380 --> 03:17:16.940]  Wuhan is one of the worst places for air pollution. So, Real Octo Spook says he's
[03:17:16.940 --> 03:17:21.020]  correct about one thing: the money around global warming will bury the truth before it can be
[03:17:21.020 --> 03:17:26.700]  uttered. That's right. A money problem and a gigantic government. Yeah, we can get our head around
[03:17:26.700 --> 03:17:31.580]  the whole issue. I think all the little spaghetti strings, when you keep pulling them all out,
[03:17:31.580 --> 03:17:38.380]  you'll find the government and you'll find human nature in terms of the greed for power and for
[03:17:38.380 --> 03:17:44.860]  money. That is the common spaghetti thread that ties all this stuff together. And that's how we
[03:17:44.860 --> 03:17:49.900]  keep our distance from this. But I think the real takeaway for me from that interview
[03:17:49.900 --> 03:17:54.860]  was the connections. Our brain works on connections, our brain works best with
[03:17:54.860 --> 03:18:01.180]  connections. Connections with other people expand our mind, expand our universe. And it's
[03:18:01.180 --> 03:18:07.260]  that person-to-person connection that is so difficult for us to maintain today, that is so
[03:18:07.260 --> 03:18:10.380]  vital for us, for our survival. Thank you for joining us. Have a good day.
[03:18:37.820 --> 03:18:46.300]  They see the common man as simple, unsophisticated, ordinary. But each of us has worth and dignity
[03:18:46.300 --> 03:18:54.060]  created in the image of God. That is what we have in common. That is what they want to take away.
[03:18:54.860 --> 03:19:00.940]  Their most powerful weapons are isolation, deception, intimidation. They desire to know
[03:19:00.940 --> 03:19:07.420]  everything about us while they hide everything from us. It's time to turn that around and
[03:19:07.420 --> 03:19:12.940]  expose what they want to hide. Please share the information and links you'll find
[03:19:12.940 --> 03:19:17.340]  at TheDavidKnightShow.com. Thank you for listening. Thank you for sharing.
[03:19:23.020 --> 03:19:28.380]  If you can't support us financially, please keep us in your prayers. TheDavidKnightShow.com.
[03:19:30.940 --> 03:19:54.220]  You ain't heard about Modo Casino. Modo has real Vegas slots. Any game you can find on the floor
[03:19:54.220 --> 03:19:59.100]  in Vegas, you can play it on Modo. I like my slots hot. Modo's free to play like food stamps
[03:19:59.100 --> 03:20:03.020]  in line at the grocery store, at a funeral, in traffic. Keep your eyes on the road. Hop on
[03:20:03.020 --> 03:20:07.580]  Modo Casino. Modo Casino got jackpots that are bigger than my belly. Modo, America's hottest
[03:20:07.580 --> 03:20:11.340]  free to play social casino. Download the Modo Casino app today. Modo Casino is a social casino,
[03:20:11.340 --> 03:20:14.140]  void where prohibited. No purchase necessary. Visit Modo.us for more details.
[03:20:20.060 --> 03:20:24.780]  The weather's chilly and your allergies won't let up, right? Enter Aspire Allergy and Sinus.
[03:20:24.780 --> 03:20:28.700]  Aspire patients find lasting relief with no more winter spent suffering. Plus,
[03:20:28.700 --> 03:20:33.420]  Aspire Allergy offers same-day results, no more waiting. Skip endless drugstore visits,
[03:20:33.420 --> 03:20:37.820]  save in the long run with Aspire Allergy's long-term solutions. No more empty promises,
[03:20:37.820 --> 03:20:41.260]  just the relief you truly deserve. Ready to conquer your allergies?
[03:20:41.260 --> 03:20:49.500]  Visit AspireAllergy.com and schedule your appointment today. That's AspireAllergy.com.