dr_restack_12_09_2025.timecode
[00:00.000 --> 00:05.300] MOTO Casino, America's Social Casino.
[00:05.300 --> 00:07.900] Welcome to MOTO Casino, where the excitement never ends.
[00:07.900 --> 00:11.300] With thousands of the hottest free-to-play social casino games, fastest payouts,
[00:11.300 --> 00:13.000] and the best promotions in the industry.
[00:13.000 --> 00:14.400] No tricks or gimmicks.
[00:14.400 --> 00:16.300] Owned and operated in the USA.
[00:16.300 --> 00:17.800] MOTO Casino is a free-to-play social casino.
[00:17.800 --> 00:18.500] No purchase necessary.
[00:18.500 --> 00:19.300] 21 plus to play.
[00:19.300 --> 00:19.900] Avoid where prohibited.
[00:19.900 --> 00:22.300] Sign up today for a generous welcome bonus.
[00:22.300 --> 00:27.500] MOTO Casino, America's Social Casino.
[00:27.500 --> 00:29.400] Download the MOTO Casino app today.
[00:30.100 --> 00:35.100] Multiply, multiply, multiply, multiply.
[00:35.100 --> 00:39.800] With X the cash scratch tickets from the Texas Lottery, you could multiply the cash
[00:39.800 --> 00:43.700] by 30, 50, 100, or even 200 times.
[00:43.700 --> 00:46.800] And when you multiply the cash, you multiply the celebration.
[00:46.800 --> 00:50.100] With top prizes from 60,000 up to a million dollars,
[00:50.100 --> 00:52.300] it's the easiest way to multiply your luck.
[00:52.300 --> 00:55.400] And enter for a chance to win a VIP iHeart experience.
[00:55.400 --> 00:57.700] Play X the cash scratch tickets today.
[00:57.800 --> 00:59.000] Must be 18 or older.
[00:59.000 --> 01:00.100] Play responsibly.
[01:07.300 --> 01:11.600] All right, and joining us now is Dr. Richard Restak, MD.
[01:11.600 --> 01:14.200] And he is a neuroscientist as well.
[01:14.200 --> 01:17.700] And he has written a lot of books on the brain.
[01:17.700 --> 01:24.100] And now this one is kind of at the nexus of our brain and artificial intelligence.
[01:24.200 --> 01:26.400] So I wanted to get him on because we, as you know,
[01:26.400 --> 01:30.300] we talk about AI and its impact on society quite a bit.
[01:30.300 --> 01:31.700] Thank you for joining us, Dr. Restak.
[01:33.300 --> 01:34.300] Well, I'm happy to be here.
[01:34.300 --> 01:35.200] Thank you, David.
[01:35.200 --> 01:38.500] You've written so many books, and you're a best-selling author.
[01:38.500 --> 01:40.500] And of course, people can find this one on Amazon.
[01:42.500 --> 01:45.900] What is different about this one?
[01:45.900 --> 01:47.200] And why did you write this book?
[01:48.100 --> 01:56.000] I wrote this book to announce and to discuss the dangers that are lurking,
[01:56.000 --> 02:00.900] so to speak, in the 21st century, and are unique to the 21st century,
[02:00.900 --> 02:04.600] but are having an effect on the brain, and a negative one.
[02:04.600 --> 02:09.600] So we really are imperiled by eight different factors.
[02:09.600 --> 02:17.100] We have new diseases that are present in the 21st century
[02:17.100 --> 02:21.900] and that are increasing, starting with COVID and moving forward.
[02:21.900 --> 02:26.800] We have problems, of course, with global warming,
[02:26.800 --> 02:28.600] which we'll talk about in more detail.
[02:28.600 --> 02:30.800] And then the Internet, the effect of the Internet,
[02:30.800 --> 02:36.800] the effect of AI; memory, the attempt to alter memory,
[02:36.800 --> 02:40.300] almost to alter our memories of what the past was like.
[02:40.300 --> 02:45.300] This is an ongoing enterprise by various governments in the world,
[02:45.300 --> 02:47.100] including our own.
[02:47.100 --> 02:54.300] We also have surveillance, the seventh: we're becoming increasingly a surveillance society.
[02:54.300 --> 02:59.700] It's almost impossible not to be revealing things about yourself,
[02:59.700 --> 03:03.000] because there are surveillance cameras everywhere.
[03:03.000 --> 03:06.300] I can give you several examples of that just in my own personal life.
[03:06.300 --> 03:09.800] And then finally, the eighth one is anxiety.
[03:09.800 --> 03:15.100] All of these things are creating what I call an existential anxiety.
[03:15.100 --> 03:17.900] People are being given information,
[03:17.900 --> 03:21.800] but it's being molded according to the thoughts
[03:21.800 --> 03:24.700] and the inclinations of people in power.
[03:24.700 --> 03:29.000] For instance, let's take today's, right out of today's New York Times:
[03:29.000 --> 03:33.000] on page A7, there's an article called
[03:33.000 --> 03:36.700] "The Air in New Delhi Is Life-Threatening."
[03:36.700 --> 03:41.500] And it tells the tale of the New York Times reporters
[03:41.500 --> 03:44.300] who spread themselves throughout New Delhi
[03:44.300 --> 03:49.800] from 6 a.m. until late in the evening of a certain day recently.
[03:49.800 --> 03:54.000] And they measured the particulate matter in the air,
[03:54.000 --> 03:57.500] and it was anywhere from 10 times to 30 times
[03:57.500 --> 04:03.100] as great as what would be considered minimally normal.
[04:03.100 --> 04:07.300] Now, on top of that, they state
[04:07.300 --> 04:13.900] that the government is actually trying to hide this
[04:13.900 --> 04:18.700] from the populace by spraying water and other things like that.
[04:18.700 --> 04:23.100] It says that they're doing this around the measuring stations.
[04:23.100 --> 04:27.300] They're also losing data from measuring stations
[04:27.300 --> 04:29.900] during the worst periods of pollution.
[04:29.900 --> 04:34.200] So there you have the molding of the facts,
[04:34.200 --> 04:38.100] either denying them altogether or trying to improve them,
[04:38.100 --> 04:42.200] so people say, oh, well, they measured it down at such-and-such a measuring station,
[04:42.200 --> 04:44.100] and it was really not all that high.
[04:44.100 --> 04:49.000] Well, of course, they were spraying water and other things to try to reduce it.
[04:49.000 --> 04:53.100] So we've got a capitalist society here in the United States
[04:53.100 --> 05:00.700] which has a vested interest in pushing forward certain scientific points of view.
[05:00.700 --> 05:03.800] So science is being put sort of in the back seat,
[05:03.800 --> 05:08.400] and there are politicians and other people, all of whom share one thing:
[05:08.400 --> 05:16.000] capitalistic enterprises which they're part of or which they are advancing.
[05:16.000 --> 05:21.600] A kind of crony capitalism, where they can get protection and subsidies as well.
[05:21.600 --> 05:25.200] And the control is being taken away from us because,
[05:25.200 --> 05:27.600] as I was reporting earlier today,
[05:27.600 --> 05:31.200] they're working very hard to make sure that state and local governments
[05:31.200 --> 05:35.000] can't enact any controls on artificial intelligence.
[05:35.100 --> 05:40.900] And that came up in the context of talking about how the manufacturers of Tasers,
[05:40.900 --> 05:43.900] also big manufacturers of police body cams,
[05:43.900 --> 05:47.000] want to wed that to artificial intelligence.
[05:47.000 --> 05:49.700] And the question is, you know, what could possibly go wrong with that?
[05:49.700 --> 05:53.800] If they misidentify you as a dangerous criminal
[05:53.800 --> 05:59.300] and warn the police about how dangerous you are, they could get people killed.
[05:59.300 --> 06:07.500] Well, not only that, but all of these efforts set up a sense of anxiety and fear.
[06:07.500 --> 06:10.900] Let me just tell you what happened to me one morning.
[06:10.900 --> 06:13.600] I called a cab to go to a medical appointment,
[06:13.600 --> 06:15.900] and we started going down the road.
[06:15.900 --> 06:20.700] I said to the driver, you know, you're not going the most efficient or the quickest way.
[06:20.700 --> 06:21.900] He said, I know that.
[06:21.900 --> 06:25.800] He said, but I don't want to go that way because there's speed cameras.
[06:25.800 --> 06:29.400] I said, well, you know, you're driving very sensibly and you're not speeding.
[06:29.400 --> 06:30.800] And I'm in no hurry.
[06:30.800 --> 06:32.600] So what's the problem?
[06:32.600 --> 06:36.500] He said, well, they take pictures of everybody that goes by those cameras
[06:36.500 --> 06:40.400] because they want to see who's in those photos in those cars.
[06:40.400 --> 06:42.700] So I asked him to give me a reference for that.
[06:42.700 --> 06:46.800] And he sort of didn't say anything else for the rest of the trip.
[06:46.800 --> 06:50.400] So when I got to the medical building, I got in the elevator, and a voice said,
[06:50.400 --> 06:56.100] in this facility, there is surveillance, both obvious and hidden.
[06:58.600 --> 07:03.800] And Santa Claus was watching, you know. This is all one morning.
[07:03.800 --> 07:09.500] And then when I got up to sign in, I signed the board with an electronic
[07:09.500 --> 07:12.200] pen, and I didn't see any signature.
[07:12.200 --> 07:13.700] I said, well, it didn't take.
[07:13.700 --> 07:17.000] She said, oh, it took, but we don't allow it to go on the screen
[07:17.000 --> 07:19.300] where it could be seen. I said, why is that?
[07:19.300 --> 07:23.000] She said, well, somebody behind you might see it and then remember it
[07:23.000 --> 07:28.300] and use your signature to forge something somewhere.
[07:28.300 --> 07:32.200] Well, first of all, there was a sign that said stand 10 feet back.
[07:32.200 --> 07:34.600] And secondly, there's nobody else behind me.
[07:34.600 --> 07:38.600] So there are three examples, just drawn at random, that show we're becoming an
[07:38.600 --> 07:43.800] increasingly surveilled society, which is creating a sense of paranoia
[07:43.800 --> 07:45.300] and a sense of fear.
[07:45.300 --> 07:48.500] So the brain has to adjust to these types of things, Dave.
[07:48.500 --> 07:51.100] And it's very hard to do.
[07:51.100 --> 07:53.000] And I think that is calculated.
[07:53.000 --> 07:56.900] You know, they want to do this even to the extent, when you
[07:56.900 --> 08:00.200] talk about these cameras taking everybody's picture, of the Flock network
[08:00.200 --> 08:03.000] that is out there, this corporation that is saying, well, we can do
[08:03.000 --> 08:08.200] whatever we want because it's in public space, and, you know,
[08:08.200 --> 08:11.500] we're not the government, so we can collect this information. And yet they
[08:11.500 --> 08:13.900] collect it in order to sell it to the government.
[08:14.000 --> 08:18.800] So it's just one level indirect, but they not only grab your license
[08:18.800 --> 08:23.300] plate, but they also do a complete profile of your car and all of its
[08:23.300 --> 08:24.200] idiosyncrasies.
[08:24.200 --> 08:25.500] Does it have a dent here?
[08:25.500 --> 08:26.500] Does it have a scrape there?
[08:26.500 --> 08:28.000] What about a bumper sticker?
[08:28.000 --> 08:30.200] So it creates a model of your car.
[08:30.200 --> 08:33.500] And so they almost have, like, you know, a biometric identification of your
[08:33.500 --> 08:36.000] car as well as of you.
[08:36.000 --> 08:40.700] And this is now made possible because of the advances of AI.
[08:40.700 --> 08:44.200] But this has been something that has been concerning me.
[08:44.800 --> 08:50.000] I look at things kind of from a libertarian perspective, and this has been concerning me for a long time:
[08:50.000 --> 08:54.800] the idea that government is using technology in many different ways,
[08:54.800 --> 08:59.300] the internet, social media, things like that, to monitor and to manipulate
[08:59.300 --> 09:04.300] us all the time. And to me, artificial intelligence just puts this on
[09:04.300 --> 09:11.500] steroids. And so I think there is something to be anxious about if we're going to look at this.
[09:11.500 --> 09:12.500] We should be concerned about it.
[09:12.500 --> 09:16.800] Maybe not anxious, but we should be concerned about the goals of people
[09:16.800 --> 09:18.300] who are putting this kind of stuff together.
[09:19.400 --> 09:24.100] So, yeah, there's that, and then there's this: if you can manage to change
[09:24.100 --> 09:29.100] the present, you can manipulate the future. But the real way to do it
[09:29.100 --> 09:32.500] is to get control of the past, as Orwell pointed out.
[09:32.500 --> 09:33.900] Yes, you control the past.
[09:33.900 --> 09:38.700] You know, you can control the present, and by implication control
[09:38.700 --> 09:44.600] the future. And we've seen alterations of materials, even government
[09:44.600 --> 09:49.300] documents, government films, documentaries, things like that, being
[09:49.300 --> 09:55.500] altered in ways that are not, I should say, detectable
[09:55.500 --> 09:59.600] to the ordinary person, so they get ideas about what the past
[09:59.600 --> 10:06.200] was like which are wrong. As I mentioned in the
[10:06.200 --> 10:13.600] book, suppose you were at a dance in 1850, before the Civil War, and it's
[10:13.600 --> 10:15.300] a film we're watching.
[10:15.300 --> 10:19.600] Let's just say we're watching a film about 1850, and we're seeing people
[10:19.600 --> 10:23.100] ballroom dancing and all that. Then one of them pulls to the side and pulls
[10:23.100 --> 10:26.400] out a cell phone, and you say, wait a minute.
[10:26.400 --> 10:30.700] We didn't have cell phones then. Well, you know, there were a lot of things
[10:31.700 --> 10:37.300] going on in the past that we're now told were not going on, and it's not to our advantage
[10:37.300 --> 10:41.900] to pretend that they weren't. We have to understand the past
[10:41.900 --> 10:49.000] to understand the future. And we're not only creating situations that are false,
[10:49.000 --> 10:56.300] but we're also, like in 1984, where Orwell created a character called Comrade
[10:56.300 --> 10:59.200] Ogilvy. He was a war hero.
[10:59.200 --> 11:05.300] He got all sorts of medals, and the proletariat were all told to
[11:05.300 --> 11:06.700] honor him and so forth.
[11:06.700 --> 11:09.000] Well, he never existed.
[11:09.000 --> 11:13.100] He was made up entirely, and that's one of the things that the
[11:13.100 --> 11:19.100] narrator is doing in his job: filling in photographs,
[11:19.100 --> 11:25.100] inserting Ogilvy into historical events that happened, wartime scenarios,
[11:25.200 --> 11:29.500] etc. And anyone reading it will say, wow, this is some man.
[11:29.500 --> 11:31.800] Well, he was a complete fabrication.
[11:31.800 --> 11:37.500] We're just about at that point now, with Sora, the AI, out.
[12:35.700 --> 12:39.900] Well, and it could take you, you know, and say, let's get David Knight
[12:39.900 --> 12:48.400] and have him leading some sort of a parade of whatever, and, you know,
[12:48.400 --> 12:49.700] suddenly people say, well, gosh, I saw it with my own eyes.
[12:49.700 --> 12:56.000] So what's happening is that "seeing is believing" is being turned on its head.
[12:56.000 --> 12:57.800] It's no longer true.
[12:57.800 --> 13:01.200] You're talking about a completely fabricated character out of Orwell.
[13:01.200 --> 13:05.500] Just recently they had Tilly Norwood, who is a completely fabricated
[13:05.500 --> 13:11.000] AI personality, and the person who came up with her has got agents representing her.
[13:11.000 --> 13:12.800] They've got her out there as an actress.
[13:13.800 --> 13:19.100] It's like, so, I've created an AI actress which will do a lot of different roles for you.
[13:19.100 --> 13:21.400] She probably does her own stunts as well.
[13:21.400 --> 13:25.500] The people in SAG, the Screen Actors Guild, are furious
[13:25.500 --> 13:31.300] about this and said any agent representing this AI character is not going to do any
[13:31.300 --> 13:34.200] business with us. But we're already at that point.
[13:34.200 --> 13:35.800] It truly is interesting.
[13:35.800 --> 13:42.200] Yeah, and one of the ways of neutralizing it is to create the situation that exists right now between you and me.
[13:42.200 --> 13:46.400] You're laughing and I'm laughing because it seems funny, and it is funny,
[13:46.400 --> 13:49.600] but there's a very serious purpose behind all this.
[13:49.600 --> 13:54.300] Yes, it's all about trying to alter people's perceptions,
[13:54.300 --> 13:58.600] so they begin to doubt the veracity of what they're seeing.
[13:58.600 --> 13:59.500] That's right.
[13:59.500 --> 14:04.000] Yes, and I've talked for the longest time about how the whole idea for
[14:04.000 --> 14:07.500] the internet was created by DARPA psychologists, and I've been concerned
[14:07.500 --> 14:10.900] that it was all about psychological manipulation from the get-go with
[14:10.900 --> 14:17.500] all of this. But as a physician and as a neuroscientist, I'd be interested
[14:17.500 --> 14:20.900] in your take on, you know, what is currently going on. Because besides
[14:20.900 --> 14:24.700] manipulating the past by changing information about the past, or, you
[14:24.700 --> 14:28.700] know, memory-holing it, or writing a new alternative history of it,
[14:28.700 --> 14:31.900] they are also concerned with memory itself. There have been projects put out by
[14:31.900 --> 14:35.800] DARPA, and I don't know if they've been successful or not, but they,
[14:35.800 --> 14:38.900] you know, have put out requests for people to come up with things to
[14:39.000 --> 14:41.000] manipulate people's memories.
[14:41.000 --> 14:45.600] So you've got a soldier, they say, who's got bad PTSD.
[14:45.600 --> 14:47.300] Let's get rid of that memory.
[14:47.300 --> 14:49.700] Let's give them different memories.
[14:49.700 --> 14:54.000] What do you see, in terms of someone who studies the brain and neuroscience?
[14:54.000 --> 14:59.900] What do you think is the state of the art with that?
[14:59.900 --> 15:02.400] Well, my last book was called The Complete Guide to Memory.
[15:02.400 --> 15:03.400] It had to do with memory.
[15:03.400 --> 15:06.300] I studied memory in great detail.
[15:06.300 --> 15:10.200] And of course you have to do away with the concept that memory is
[15:10.200 --> 15:14.400] like a videotape or something that you just store in your brain,
[15:14.400 --> 15:17.400] and when you want to get it, you just bring it out,
[15:17.400 --> 15:19.300] like you bring out a videotape.
[15:19.300 --> 15:22.100] It's not like that. It's a reconstruction.
[15:22.100 --> 15:28.100] Each time you think back to a certain event, you alter that memory, so
[15:28.100 --> 15:33.000] that you have memory one, memory two, memory three, on and on and on.
[15:33.000 --> 15:35.500] That's the nature of memory.
[15:35.500 --> 15:37.600] Memory can be manipulated.
[15:37.600 --> 15:40.300] It's always an issue, you know, in the courtroom.
[15:40.300 --> 15:44.100] They're always trying to avoid the contamination of the witness.
[15:44.100 --> 15:49.300] An example would be: which car went through the red light?
[15:49.300 --> 15:54.800] You ask a witness, and he says, oh, it was a red car that went through the red light.
[15:54.800 --> 15:59.300] Well, would it surprise you to know that it wasn't a red light, it was a stop sign?
[15:59.300 --> 16:03.400] Mr. Witness, of course, his credibility is gone,
[16:03.400 --> 16:07.800] because he took the suggestion that it was a red light.
[16:07.800 --> 16:11.700] It would be very easy to do, because you don't necessarily have
[16:11.700 --> 16:15.200] that image of that intersection in your mind.
[16:15.200 --> 16:19.900] So that's why there are protections, even in the courtroom, against leading
[16:19.900 --> 16:23.300] the witness, as it's called; in other words, providing information
[16:23.300 --> 16:27.000] that's either not true at all or half true.
[16:27.000 --> 16:32.000] So we've got that, because this didn't start in the 21st century;
[16:32.000 --> 16:35.300] it started, you know, as long as we've had courtrooms.
[16:35.300 --> 16:39.300] But there's more of an emphasis now on altering memory,
[16:39.300 --> 16:42.700] so that people will get up there, and under cross-examination
[16:42.700 --> 16:45.700] they'll do pretty well, because their whole memory has been altered.
[16:45.700 --> 16:50.400] It's been changed by various mechanisms: suggestion, repeating the
[16:50.400 --> 16:55.000] information, which is false, of course; which is the misinformation.
[16:55.000 --> 17:01.000] There was a cartoon about a week ago by Ramirez, the Pulitzer Prize winner:
[17:01.000 --> 17:07.400] three doctors in a laboratory.
[17:07.400 --> 17:11.800] One of them is looking into a microscope, and he looks up
[17:11.800 --> 17:16.300] and says, this is the most dangerous pathogen we have ever encountered.
[17:16.900 --> 17:21.200] And the second doctor says, well, is it bubonic plague? Is it smallpox?
[17:21.800 --> 17:26.600] And then the first one says, no, it's misinformation and disinformation.
[17:27.500 --> 17:33.900] And we've got to be very careful, because many times the people who
[17:33.900 --> 17:37.300] will tell us about that are the people who want to be the ones who
[17:37.300 --> 17:41.900] define what the information is for us, and they will ask those leading
[17:41.900 --> 17:44.700] questions. You know, we talk about leading questions and manipulating
[17:44.700 --> 17:49.100] people. There have been a lot of reports about artificial intelligence
[17:49.100 --> 17:55.900] and people who have a particular psychosis or something, and
[17:57.000 --> 18:00.700] they get involved with the AI, and it starts to confirm the things
[18:00.700 --> 18:03.300] that they want, because that's what it is set up to do in terms of
[18:03.300 --> 18:07.500] bias; it wants to, you know, be empathetic and sympathetic to people,
[18:07.500 --> 18:10.900] and so it starts doing that and leading them further and further down
[18:10.900 --> 18:15.300] a particular rabbit hole. There have been situations of, you know, people
[18:15.300 --> 18:19.300] getting into severe mental distress, some suicides of some young children,
[18:19.300 --> 18:23.500] and other things like that. Speak to that aspect of it, and the real
[18:23.600 --> 18:28.600] danger of that, which I think really speaks to the
[18:28.600 --> 18:32.000] psychological aspect and potential of artificial intelligence, and
[18:32.000 --> 18:35.800] that could be weaponized. Right now it's just kind of happening out
[18:35.800 --> 18:39.400] of their business model, right? But that could definitely be weaponized
[18:39.400 --> 18:42.600] against people. Well, I talk about that in my book, in the chapter
[18:42.600 --> 18:46.200] on the internet. There are famous examples of people who have
[18:46.700 --> 18:54.400] committed suicide right on an internet live feed, and they've been manipulated
[18:54.400 --> 18:59.700] into doing it by other people who've encouraged them, said this
[18:59.700 --> 19:04.200] would be a sign of strength, this would be a sign that you're
[19:04.200 --> 19:08.600] not afraid to die if necessary. And there are cases where it actually
[19:08.600 --> 19:13.700] led to the suicide. One of the most grisly I have in my book is
[19:13.900 --> 19:17.900] about a person who was talked into pouring gasoline over themselves
[19:17.900 --> 19:23.500] and setting a match to it, all on an open internet feed. And while the
[19:23.500 --> 19:27.000] fire is burning, you can hear everybody in the background
[19:27.600 --> 19:32.400] cheering: we did it, we did it, we got him to do it. Wow. That's
[19:32.400 --> 19:36.600] amazing. That's amazing. There's something about the internet
[19:36.800 --> 19:44.200] that actually brings out sadistic, criminal, psychopathic
[19:44.200 --> 19:47.900] trends, and we don't know why. Is it the fact that you
[19:47.900 --> 19:51.600] can't necessarily be identified? It's something that is going to
[19:51.600 --> 19:55.500] be influencing, and has influenced, the internet greatly, and it
[19:55.500 --> 20:00.300] will continue to do so until we understand it. I think that's
[20:00.300 --> 20:02.600] one of the things that's so dangerous about what
[20:02.600 --> 20:05.400] we saw with lockdown and other aspects of it. There's an
[20:05.400 --> 20:09.400] atomization here, and in so many different ways the government
[20:09.400 --> 20:14.000] and tech companies are trying to make sure that we're
[20:14.000 --> 20:17.900] not in person with each other, you know, in many cases.
[21:19.000 --> 21:21.600] Like, for example, this interview.
[21:21.600 --> 21:24.900] We couldn't do this interview if one or both of us had to travel;
[21:24.900 --> 21:30.100] we're able to do this because we can do it over Zoom or whatever.
[21:30.100 --> 21:34.300] But just taking ordinary things that you would normally do in terms of
[21:34.300 --> 21:37.800] interacting with people, in school or in church or in your
[21:37.800 --> 21:40.900] community or whatever, taking that away and putting a screen
[21:40.900 --> 21:43.100] between the two of you, it really does change the way people
[21:43.100 --> 21:45.600] interact with each other. I remember Errol Morris, the film
[21:45.600 --> 21:49.900] director, was able to get people to say all kinds of things to
[21:49.900 --> 21:54.500] him. He got a murderer to confess. He got Robert
[21:54.500 --> 21:58.200] McNamara to confess about the false flag of the Vietnam War.
[21:58.200 --> 22:01.000] He got people to say all kinds of stuff because there was that
[22:01.000 --> 22:03.800] distance between him and them. He could have interviewed them
[22:03.800 --> 22:07.300] in person, but what he did was put up an Interrotron, which
[22:07.300 --> 22:10.400] is what he called it. It was basically a teleprompter that
[22:10.400 --> 22:13.200] he had set up so he could do two-way communication at the same
[22:13.200 --> 22:17.700] time, and once he had that distance, it completely
[22:17.700 --> 22:21.700] changed the dynamics, versus talking with somebody
[22:21.700 --> 22:23.600] person to person. And that's what we're talking about here,
[22:23.600 --> 22:26.700] isn't it? Yeah, we're talking about that, and of course there are
[22:27.200 --> 22:31.100] gradations of this, and it continues. Like, you're
[22:31.100 --> 22:33.300] interviewing me. We're discussing. I feel like it's a
[22:33.300 --> 22:37.600] discussion. If I were to say something that later I regretted,
[22:37.600 --> 22:39.900] I could probably say, oh, well, that wasn't me. That was
[22:39.900 --> 22:46.900] my avatar. Or my agent, right? I've got an AI agent that's out
[22:46.900 --> 22:52.600] there doing stuff. That's right. It's crazy. We also see,
[22:52.600 --> 22:56.500] though, as a doctor you're seeing this: people have noticed
[22:56.500 --> 22:59.400] actual physical changes that can be observed in people's
[22:59.400 --> 23:02.400] brains. I'm thinking of the story about the London taxi
[23:02.400 --> 23:05.900] drivers who would do the Knowledge, and they found
[23:05.900 --> 23:10.500] that as they memorized all these factual details and drew
[23:10.500 --> 23:14.800] on them all the time in order to take people through this very
[23:14.800 --> 23:17.800] complicated city with its complicated streets,
[23:17.800 --> 23:21.100] they had a particular part of the brain that was larger than the
[23:21.100 --> 23:24.800] typical person's. And then they found that once they stopped
[23:24.800 --> 23:26.600] doing that, it started to shrink again. And we're starting
[23:26.600 --> 23:30.000] to see that happening with people in a lot of different
[23:30.000 --> 23:33.400] areas of their lives. That kind of atrophy, and it's physically
[23:33.400 --> 23:37.000] observable, isn't it? Well, it is. You have to learn. You
[23:37.000 --> 23:41.400] have to use the things that you have learned to do. Like I
[23:41.400 --> 23:43.800] mentioned in my memory book, there's all kinds of memory
[23:43.800 --> 23:47.300] exercises that you could do. I do them every day and they're
[23:47.300 --> 23:51.100] very easy and they keep help you to continue with your
[23:51.500 --> 23:55.500] memory and keep it sharp. Give us some examples. I'm sure
[23:55.500 --> 23:57.500] everybody would love to know that. We'd all like to have a
[23:57.500 --> 23:59.900] better memory. What kind of what kind of things do we can
[23:59.900 --> 24:03.500] we do to exercise? Think about the fact that you never had to
[24:03.500 --> 24:09.000] learn pictures when you were an infant, a young child. A
[24:09.000 --> 24:11.900] picture was something you could just see. You may not know
[24:11.900 --> 24:14.100] what you're looking at, but you could see it without an
[24:14.200 --> 24:18.300] intermediary. Language is something that you have to hear
[24:18.300 --> 24:21.200] from other people. It's something that's sort of added
[24:21.200 --> 24:27.700] on to the brain. Okay. So as a result, the best way of
[24:27.700 --> 24:35.400] remembering something is to make an image for it. Okay. For
[24:35.400 --> 24:39.500] instance, I have a little dog called a Schipperke. A Schipperke
[24:39.500 --> 24:44.300] is a Belgian dog. He's a nice little fellow, but it was
[24:44.300 --> 24:46.600] embarrassing to me when walking the street. People say what
[24:46.600 --> 24:49.800] kind of a dog is that? And I couldn't come up with a name
[24:49.800 --> 24:53.300] because it was so complicated. And I thought, that's Schipperke,
[24:53.300 --> 24:56.900] and I didn't speak any Dutch or anything. So then I got this
[24:56.900 --> 25:01.100] image of a small boat with a large captain with a beard
[25:01.800 --> 25:07.100] holding a big key. So it was skipper key. And I remember
[25:07.100 --> 25:09.400] forever, so long as I have the picture. Once I have the
[25:09.400 --> 25:14.100] picture, it's easy to do. Another easy way to do it,
[25:14.100 --> 25:16.900] and you can do this all the time: I
[25:16.900 --> 25:21.300] was going upstairs before I came down to the office and I
[25:21.300 --> 25:26.200] wanted to get my wallet and I wanted to get my cell phone. So
[25:26.200 --> 25:30.400] I just had an image of a wallet in the form of a cell phone.
[25:30.400 --> 25:34.100] And I was walking up the stairs talking into the wallet cell
[25:34.100 --> 25:37.200] phone. So I got up and I knew I had these two elements to
[25:37.200 --> 25:40.600] get. It'd be very easy to get one and forget the other. So
[25:40.900 --> 25:44.600] you have these images all the time. And the quickest way, you
[25:44.600 --> 25:47.500] know, this is sort of off the topic of the book, but if you
[25:47.500 --> 25:55.900] want to have a powerful memory for a load of things, say
[25:55.900 --> 26:00.300] up to 10 things: get 10 areas that you are familiar with,
[26:00.300 --> 26:06.500] that you see every day. And then you can attach to those places
[26:07.000 --> 26:10.600] the image of the thing you're trying to remember. So if I'm trying to
[26:10.600 --> 26:19.200] remember a loaf of bread, milk, maybe batteries. I have a
[26:19.200 --> 26:23.200] regular way of doing that. I have, like, my library
[26:23.200 --> 26:29.300] that's near my home, the coffee shop, the liquor store, Georgetown
[26:29.300 --> 26:33.300] University Medical School where I went, Georgetown University,
[26:34.200 --> 26:37.800] Cafe Milano, which is a place in Washington, everybody
[26:37.900 --> 26:45.000] gathers, and then Key Bridge, the Iwo Jima Memorial, and Reagan
[26:45.000 --> 26:49.100] Airport. So for instance, for the loaf of bread,
[26:49.100 --> 26:51.300] I would look in the window of the library and instead of seeing
[26:51.300 --> 26:55.700] books, I see loaves of bread. And when I get down to
[26:55.700 --> 26:58.800] the liquor store, instead of it being filled with
[26:58.800 --> 27:02.500] liquor, it'll be milk bottles. That's how I get to it. So
[27:02.500 --> 27:05.900] I have those 10, so I can get 10 items together without any
[27:06.700 --> 27:07.700] problems at all.
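The ten-places trick Dr. Restak walks through here is the classical method of loci: keep a fixed, ordered list of familiar places and pair each item you need to remember with one of them. A minimal sketch, assuming a Python setting; the function name and the shopping list are hypothetical illustrations, though the places echo the ones he lists:

```python
# Sketch of the method of loci as described above: a fixed, ordered list of
# familiar places, each paired with one item to remember. The mnemonic work
# is the visualization (loaves of bread in the library window, milk bottles
# in the liquor store); the pairing just fixes the order and the capacity.
LOCI = [
    "library", "coffee shop", "liquor store",
    "Georgetown University Medical School", "Georgetown University",
    "Cafe Milano", "Key Bridge", "Iwo Jima Memorial", "Reagan Airport",
]

def build_memory_palace(items, loci=LOCI):
    """Pair each item with a locus, in order; the list of places caps capacity."""
    if len(items) > len(loci):
        raise ValueError("more items than loci; add more familiar places")
    return list(zip(loci, items))

if __name__ == "__main__":
    for place, item in build_memory_palace(["bread", "milk", "batteries"]):
        print(f"Picture {item} at the {place}")
```

Because the places are rehearsed in a fixed sequence, a forgotten item announces itself when its place comes up empty during the mental walk.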
[27:08.200 --> 27:10.900] That's great. Yeah, you know, it's interesting to talk about
[27:10.900 --> 27:14.300] the importance of visualization. It's one of the
[27:14.300 --> 27:18.000] things that I do in terms of preparing for the show. I have
[27:18.000 --> 27:21.700] a lot of articles that I go through. And it's really when I
[27:21.700 --> 27:24.200] highlight things or when I write them down, that's when I can
[27:24.200 --> 27:26.700] remember them. If I don't do that, if I were just to read
[27:26.700 --> 27:29.400] these things, I wouldn't remember them. But if I interact
[27:29.400 --> 27:32.300] with it and write it down, that helps me to remember it. And
[27:32.300 --> 27:34.800] so that is a kind of visualization there, I guess, as
[27:34.800 --> 27:39.700] well. It truly is interesting. And what you said earlier about
[27:39.700 --> 27:42.600] memory not being something that is stored in a place: as somebody
[27:42.600 --> 27:46.600] coming from a computer science background, that was a very
[27:46.600 --> 27:50.400] different idea. When you construct your memory, how do
[27:50.400 --> 27:55.200] you reconstruct that? I mean, that opens up a whole new area
[27:55.200 --> 27:58.300] of questions as well. In other words, every time somebody
[27:58.300 --> 28:01.700] brings up a subject, there isn't something that's
[28:01.700 --> 28:05.100] stored that you reference and then rebuild from.
[28:06.400 --> 28:09.500] Yeah, there's that. Plus there's the interconnections. Like,
[28:09.700 --> 28:12.400] you know, somebody listening to us might say, well, gee, this
[28:12.400 --> 28:15.400] is called the 21st century brain, but I haven't heard that
[28:15.400 --> 28:18.100] much about the brain. Well, let me just link that up so that
[28:18.100 --> 28:22.500] these things make sense. We have a new version, or I should
[28:22.500 --> 28:25.200] say a new understanding of the brain, called the connectomic
[28:25.200 --> 28:29.900] brain, in which there are all kinds of interactions among
[28:29.900 --> 28:33.500] parts of the brain that we're just
[28:33.500 --> 28:37.500] learning about. I use the metaphor of a bowl of
[28:37.500 --> 28:42.200] spaghetti. You pull out one of the strands of spaghetti and
[28:42.200 --> 28:44.900] you never have any idea what it's connected to, how many
[28:44.900 --> 28:49.600] other strands of spaghetti it's connected to. So
[28:49.700 --> 28:55.800] you think of the brain as being kind of set to make connections.
[28:55.800 --> 28:58.000] That's its natural processing.
[29:57.400 --> 30:00.500] So it gets back to these things that we
[30:00.500 --> 30:03.800] were talking about earlier. You know, global warming and
[30:03.800 --> 30:07.800] memory and surveillance and all that. How are we going to solve
[30:07.800 --> 30:12.800] all those? Well, somehow or other, those things are connected
[30:12.800 --> 30:16.700] with each other. That's the take-home message of this book.
[30:17.200 --> 30:22.400] And the basic goal is to try to figure out what it is that
[30:22.400 --> 30:26.200] connects these things, what it is that would allow us,
[30:27.100 --> 30:34.000] by solving one of them, to solve the others. And I mentioned
[30:34.000 --> 30:38.500] at the end of the book, experts so far haven't done it. So
[30:38.500 --> 30:43.800] it's useful, as Hayek said, to get ordinary people,
[30:43.900 --> 30:48.100] and when I say ordinary, I mean non-specialized people, to give
[30:48.100 --> 30:50.900] their ideas: gee, I wonder what would happen with such and such.
[30:51.900 --> 30:54.200] Take global warming. For a while there were,
[30:54.200 --> 30:57.300] in fact there still are, experiments going on on the effect
[30:57.300 --> 31:04.500] of sulfur, which could help the CO2 problem by
[31:04.500 --> 31:08.100] shooting sulfur up into the atmosphere. Of
[31:08.100 --> 31:12.700] course, the reason for that was the volcano back in the
[31:12.700 --> 31:17.900] 1980s, in which, after that volcano in Hawaii, it was noted
[31:17.900 --> 31:23.200] that the air was clearer and there was less pollution. So that's
[31:23.300 --> 31:25.900] something to think about. Is there some way of using that
[31:26.000 --> 31:33.100] particular sulfur experiment to decrease global warming? War,
[31:33.100 --> 31:36.100] for instance, we don't think of war as a cause of global
[31:36.100 --> 31:38.700] warming, but it is. Oh, yeah, it is.
[31:39.500 --> 31:44.100] We're warming up. Yeah, it does. But what's upsetting, with the
[31:44.100 --> 31:49.800] Ukraine war and the Gaza war, is the tremendous amount [of emissions]
[31:49.800 --> 31:54.700] that's going to overcome and exceed the benefit of any of these things,
[31:54.700 --> 32:00.600] like, you know, non-gasoline engines and things like
[32:00.600 --> 32:00.900] that.
[32:00.900 --> 32:03.800] Absolutely. Yeah, it's kind of like, you know, shooting up
[32:04.100 --> 32:06.900] rockets in order to put satellites up, you know, how many
[32:07.500 --> 32:11.700] how many cars and lifetime use of cars from people would that
[32:11.700 --> 32:14.200] be equivalent to and you start talking about all the missiles
[32:14.200 --> 32:16.800] that are being shot and then you get to the explosives as
[32:16.800 --> 32:22.000] well. It is really interesting how they focus on their
[32:22.000 --> 32:25.900] objectives and their ways to control it, through the manipulation
[32:25.900 --> 32:31.000] that's been going on for quite some time. And so yeah, it
[32:31.000 --> 32:35.100] is pretty amazing. And I guess that's my
[32:35.700 --> 32:38.000] take on this stuff. It really does look like science fiction,
[32:38.000 --> 32:41.000] and I'm almost inclined to write it off when I first see it,
[32:41.200 --> 32:43.500] but then DARPA is saying, well, we need to find some way that we
[32:43.500 --> 32:48.200] can, you know, erase memories in people and insert new
[32:48.200 --> 32:50.500] memories into them. And we're going back to Total Recall,
[32:50.500 --> 32:55.100] right? So it sounds like something from a Philip K. Dick
[32:55.100 --> 32:57.900] novel, but they're really working on that. And I guess one
[32:57.900 --> 33:00.200] of the most striking things we saw, we reported on a couple
[33:00.200 --> 33:04.300] of weeks ago, and it was a company that was bragging about
[33:04.300 --> 33:09.500] how they could read your mind more accurately and quickly
[33:09.500 --> 33:11.500] than their competitors, because there's a lot of different
[33:11.500 --> 33:15.800] companies that are doing this and how they could, it's called
[33:15.800 --> 33:20.200] Brain IT was the name of the company. And so they had a way
[33:20.200 --> 33:27.000] that they would do MRI, and they could essentially train it on
[33:27.000 --> 33:29.300] your brain in a much shorter period of time than the other
[33:29.300 --> 33:31.600] people and they could get much better results. Our producers
[33:31.600 --> 33:34.800] just pulled this up. So what they do is they show you an image
[33:35.300 --> 33:37.700] and you're looking at that image and then it's reading your
[33:37.700 --> 33:40.500] mind and reconstructing what you're looking at, which I
[33:40.500 --> 33:43.600] thought was absolutely amazing and terrifying at the same
[33:43.600 --> 33:47.100] time. How is this going to be used? I guess that's the real
[33:47.100 --> 33:49.600] issue. When we start talking about all these different
[33:49.600 --> 33:53.700] things, I think that really is the case: it's difficult
[33:53.700 --> 33:56.500] for people to understand just how far and how quickly the
[33:56.500 --> 34:00.200] technology has progressed, and then to ask, how do we
[34:00.200 --> 34:04.200] keep this from being used for bad purposes?
[34:05.300 --> 34:09.500] Well, that's specifically a 21st-century problem. Yes, because
[34:09.500 --> 34:13.300] all of these things have either originated in the 21st
[34:13.300 --> 34:17.800] century or they have in fact further developed and become
[34:18.000 --> 34:22.300] increasingly threatening. And bear in mind, we have
[34:22.300 --> 34:25.100] to solve these problems because they're not something that's
[34:25.100 --> 34:28.000] going to go away. And then the most important thing to remember
[34:28.000 --> 34:32.800] David is that all of these things harm the brain and the
[34:32.800 --> 34:37.600] brain is the thinking processor that's going to save us. It's
[34:37.600 --> 34:40.800] going to figure out what the solutions to
[34:40.800 --> 34:44.600] the problems are. So we know now that wildfire smoke, for
[34:44.600 --> 34:50.100] instance, creates dementia. It enhances the likelihood of
[34:50.100 --> 34:54.300] somebody coming down with it. So as the brain is affected
[34:54.300 --> 34:58.000] negatively, increasingly over longer and longer periods of
[34:58.000 --> 35:02.300] time, our ability to solve these problems is going to decrease.
[35:02.500 --> 35:05.700] So we've got to do it now. We've got to get serious about it.
[35:06.000 --> 35:09.300] And this business of people getting up and saying that global
[35:09.300 --> 35:14.000] warming is fiction and all that is really very, very
[35:14.000 --> 35:14.700] disturbing.
[35:15.900 --> 35:19.900] You know, the example that you gave earlier of the fact that
[35:19.900 --> 35:22.600] the Indian government was manipulating the temperature at
[35:22.600 --> 35:26.600] some of the stations there. That kind of works both ways. They
[35:26.600 --> 35:28.900] have put some of these temperature stations on the airport
[35:28.900 --> 35:33.300] tarmacs. And in the UK, for a lot of the temperature
[35:33.300 --> 35:35.800] stations they've got, they're just extrapolating
[35:35.800 --> 35:38.600] the data; they don't have real measurement stations
[35:38.600 --> 35:42.000] there. So it all really gets back, I think, to the scientific
[35:42.000 --> 35:45.300] method. And that's really where we have to hold people's feet
[35:45.300 --> 35:47.600] to the fire. When we're talking about something like that, we can
[35:47.600 --> 35:51.700] have an absolute standard of what truth is. And that truth is
[35:51.700 --> 35:55.200] being able to measure something accurately and
[35:55.200 --> 35:59.600] being able to reproduce it. And then I think a good yardstick
[35:59.600 --> 36:01.900] for that is when somebody is trying to hide their data.
[36:02.500 --> 36:06.000] That's the clue right there that they're not doing science
[36:06.000 --> 36:09.600] because if they're doing science and they've come to the
[36:09.600 --> 36:11.400] right conclusion, they don't have a problem with somebody
[36:11.400 --> 36:15.200] looking at their data. And so I've got a question here for
[36:15.200 --> 36:18.900] you from a person in the audience asking if you know about
[36:18.900 --> 36:22.700] doctors James Giordano and Charles Morgan and their work
[36:22.700 --> 36:25.400] with military. I'm not familiar with those names. I don't
[36:25.400 --> 36:26.800] know if you know anything about that or not.
[36:27.500 --> 36:32.400] Giordano sounds familiar. What particular thing are they asking
[36:32.400 --> 36:32.900] about?
[36:33.000 --> 36:34.700] I don't know. It just says their work with the military.
[36:34.700 --> 36:36.900] I guess it would have to do with something, but you haven't
[36:36.900 --> 36:38.300] heard of it. I'm not sure.
[36:38.900 --> 36:42.100] I couldn't say Giordano did this or did that.
[36:42.500 --> 36:45.700] Sure. I understand. Yeah, let's talk a little bit about the
[36:45.700 --> 36:50.100] things that we have been anxious about. And of course as
[36:50.100 --> 36:52.900] Christians, we have one answer to it. But you talk about how
[36:52.900 --> 36:56.300] this is something that has been, you know, around pretty much
[36:56.400 --> 37:00.900] all of our life. I grew up with anxiety about nuclear war,
[37:00.900 --> 37:04.800] for example. That was in everybody's television and that
[37:04.800 --> 37:09.600] was forefront of our mind, especially growing up in Florida
[37:09.600 --> 37:11.800] when the Cuban Missile Crisis was happening. They got us
[37:11.900 --> 37:14.000] really afraid of that when I was in elementary school, you
[37:14.000 --> 37:16.000] know. It's like there's not going to be enough time for you
[37:16.000 --> 37:18.800] to get home, you know, when the nuclear bombs start falling.
[37:18.800 --> 37:21.800] And so, I mean, there's all these different ways that you
[37:21.800 --> 37:25.900] can panic people. I guess part of it is how do we identify
[37:25.900 --> 37:30.400] the real problems and how do we deal with those problems?
[37:30.400 --> 37:34.200] Because there's always things that are competing for our
[37:34.200 --> 37:38.500] attention and our anxiety, many of which are not real. And
[37:38.500 --> 37:40.500] usually the things that you're really the most concerned
[37:40.500 --> 37:44.100] about don't happen. And it may be sometimes because you have
[37:44.100 --> 37:47.500] taken a precaution about it. What would you say about that
[37:47.500 --> 37:48.400] about anxiety?
[37:49.400 --> 37:51.900] You're starting to break up a little bit. Can you hear me
[37:51.900 --> 37:54.900] clearly? I hear you. Yes. Yes. Sorry about that. You talk
[37:54.900 --> 37:58.400] about breaking up a little bit. You're talking about traumatizing
[37:58.400 --> 38:02.400] a population. You know, what do I do to guard against that
[38:02.400 --> 38:05.400] type of thing? And of course, that's going to really escalate
[38:05.400 --> 38:09.400] with the ability of AI to create a narrative.
[38:09.400 --> 38:13.900] Yeah, well, let's talk about it as an avenue to get into that.
[38:13.900 --> 38:16.900] Let's go back to what you brought up about the atomic weapons
[38:16.900 --> 38:20.400] and the atomic war and the fears of the people that there's
[38:20.400 --> 38:24.400] going to be another atomic war. I mean, you know, this is not
[38:24.400 --> 38:28.400] unrealistic. There's even been a movie that's just come out
[38:28.400 --> 38:31.400] that's getting all kinds of attention, as you know, and it
[38:31.400 --> 38:35.400] has to do with, you know, what's going on in the world.
[38:36.400 --> 38:39.400] If you look at what's happening in Europe right now, there's all
[38:39.400 --> 38:43.400] kinds of suggestions that could lead to a nuclear war. I mean,
[38:43.400 --> 38:46.400] Ukraine now has announced that they're under no conditions
[38:46.400 --> 38:51.400] willing to give up any land. And Stalin is, I mean, Putin is
[38:51.400 --> 38:55.400] thinking what he can do to change that. Maybe he'll attack
[38:55.400 --> 38:59.400] another country. I mean, this is something that's happening
[38:59.400 --> 39:02.400] in Europe right now. There are a lot of
[39:02.400 --> 39:05.400] ideas about what he might do. I mean, this
[39:05.400 --> 39:09.400] is scary stuff. So what's happening in response is that the
[39:09.400 --> 39:13.400] government tries to show that, oh, we shouldn't worry
[39:13.400 --> 39:15.400] about it. We have things under control, but I don't think
[39:15.400 --> 39:17.400] things are under control.
[39:19.400 --> 39:22.400] And, you know, we've talked about problem after
[39:22.400 --> 39:26.400] problem. Your final chapter is new ways of
[39:26.400 --> 39:31.400] thinking. And I'd like to talk about that. One of the things
[39:31.400 --> 39:34.400] you say is Ockham was wrong. Ockham's razor that, you know,
[39:34.400 --> 39:37.400] people are familiar with. Tell us a little bit about that. Why
[39:37.400 --> 39:38.400] is Ockham wrong?
[39:39.400 --> 39:43.400] Well, because he says that, you know, entities are not to
[39:43.400 --> 39:46.400] be multiplied beyond necessity, meaning that we can always explain things best
[39:46.400 --> 39:50.400] by limiting ourselves to the minimum number of factors,
[39:50.400 --> 39:54.400] ideally one, one cause for every fact. That's not true. It's
[39:54.400 --> 39:57.400] certainly not true in the 21st century, where there's all
[39:57.400 --> 40:02.400] kinds of interactions between factors and causes. So that
[40:02.400 --> 40:06.400] Ockham was wrong on that basis. We have to think of an
[40:06.400 --> 40:09.400] interconnecting pool: just as in the brain there are interconnections
[40:09.400 --> 40:12.400] of neurons, there are interconnections among these problems. They're all
[40:12.400 --> 40:16.400] related. They're all related. All eight of them I talk about in
[40:16.400 --> 40:19.400] my book, they're all related. And if you can figure a way of
[40:19.400 --> 40:24.400] influencing one, you influence all the others. I mean, who
[40:24.400 --> 40:27.400] would think there'd be a connection between global warming
[40:27.400 --> 40:32.400] and the amount of artisanal cheese, for instance, high-end
[40:32.400 --> 40:36.400] cheese? Well, there is, because chickens don't lay as
[40:36.400 --> 40:39.400] many eggs, and there are all the various other things that
[40:39.400 --> 40:44.400] come in in terms of making cheese. I learned that the other
[40:44.400 --> 40:47.400] day. That was something that was a surprise to me.
[40:47.400 --> 40:49.400] You know, it's kind of interesting when you talk about
[40:49.400 --> 40:52.400] connections so much. There was a series that was, I think it was
[40:52.400 --> 40:55.400] on PBS. I think the guy's name was Burke. I can't remember his
[40:55.400 --> 40:57.400] first name. I'm not sure about the last name, but he had a
[40:57.400 --> 41:01.400] series called Connections. And I thought it was fascinating
[41:01.400 --> 41:05.400] because what he would do is he would take a whole series of
[41:05.400 --> 41:08.400] connections to show how a particular technology had
[41:08.400 --> 41:14.400] evolved. So he might go from the quill to the jet engine or
[41:14.400 --> 41:19.400] something like that. And it was a fascinating thread of things.
[41:19.400 --> 41:21.400] It's very much like what you're talking about.
[41:24.400 --> 41:27.400] It really is. And I did consult his work, actually.
[41:27.400 --> 41:30.400] Did you? When I was writing this book. Because he did that
[41:30.400 --> 41:33.400] Connections series. He did a book called The Day the Universe Changed
[41:33.400 --> 41:37.400] and all this. He also did a book called Circles, in which he
[41:37.400 --> 41:40.400] would start with one particular event that had occurred in
[41:40.400 --> 41:44.400] history. And if you go around the circle, you come back to
[41:44.400 --> 41:48.400] the beginning where it started, where this particular inventor
[41:48.400 --> 41:51.400] invented something. What led up to it? What was the circle
[41:51.400 --> 41:55.400] leading to that? So yes, we're talking about connections and
[41:55.400 --> 41:58.400] we're talking about the inability to understand things
[41:58.400 --> 42:03.400] without reference to supporting and accessory factors. We have
[42:03.400 --> 42:06.400] that going on all the time, denying things that are
[42:06.400 --> 42:10.400] happening. Of course, I think the fearful thing is that the
[42:10.400 --> 42:14.400] government is aiding in this denial. Because if you would
[42:14.400 --> 42:18.400] deny that there's a problem, then there's very little impetus
[42:18.400 --> 42:24.400] to try to solve it. If there ain't no problem, don't try to
[42:24.400 --> 42:30.400] solve it. They're throwing out their own chaos and uncertainty
[42:30.400 --> 42:34.400] and anxiety that's out there all the time, always, I guess. So
[42:34.400 --> 42:38.400] the question is, you're talking about volatility, uncertainty,
[42:38.400 --> 42:41.400] complexity and ambiguity. I mean, it sounds like a government
[42:41.400 --> 42:44.400] policy. I think they've got bureaucracies that specialize in
[42:44.400 --> 42:50.400] that. Yeah, well, actually, that's true. That's in your section
[42:50.400 --> 42:53.400] there about new ways of thinking. And so how do we incorporate
[42:53.400 --> 42:57.400] that into new ways of thinking that help us to solve this riddle?
[42:59.400 --> 43:04.400] Well, each of those factors is a factor that helps you to
[43:04.400 --> 43:08.400] understand things and to have more control. It doesn't
[43:08.400 --> 43:11.400] necessarily mean it helps you to link them together. That has
[43:11.400 --> 43:15.400] to be done by original thinking. You have to understand those
[43:15.400 --> 43:20.400] things. Things are volatile. You don't have a basic situation
[43:20.400 --> 43:23.400] that doesn't change. It changes all the time.
[44:24.400 --> 44:29.400] So the other thing that I want to emphasize the most is the
[44:29.400 --> 44:33.400] role of capitalism in all of this. I mean, there's all this,
[44:33.400 --> 44:38.400] like the private equity, the business of people having a point
[44:38.400 --> 44:43.400] of view that is going to advance them financially, and that
[44:43.400 --> 44:48.400] blinds them to the problems that are here. For instance, we
[44:48.400 --> 44:51.400] talked about global warming. Well, the rich people, the very
[44:51.400 --> 44:55.400] rich people, are buying multi-million dollar apartments
[44:55.400 --> 44:59.400] and condominiums, which have special air filters, which will
[44:59.400 --> 45:04.400] keep the wildfire smoke out and will try to keep the global
[45:04.400 --> 45:11.400] warming effect at bay with super-powered air conditioners.
[45:11.400 --> 45:16.400] Of course, they're building their own bunkers, too.
[45:16.400 --> 45:19.400] They're building things that are creating all kinds of chaos
[45:19.400 --> 45:23.400] and weapons of war and mass destruction. They're building
[45:23.400 --> 45:27.400] super bunkers in various places as well. So I think they're
[45:27.400 --> 45:31.400] somewhat pessimistic about what they're doing.
[45:31.400 --> 45:34.400] Well, it's basically the idea is that we don't care about the
[45:34.400 --> 45:37.400] ordinary person. We're going to survive. We're going to see to
[45:37.400 --> 45:41.400] our own survival. And in order to do that, we have to deny
[45:41.400 --> 45:45.400] certain things that are going on. We'll do so. Now, incidentally,
[45:45.400 --> 45:49.400] all of this is not conscious thinking. They don't necessarily
[45:49.400 --> 45:53.400] say, well, I'm going to deny global warming because it will be
[45:53.400 --> 45:56.400] to my advantage financially because all my investment is in
[45:56.400 --> 46:01.400] the oil and gas industry. They don't do it that way. They come
[46:01.400 --> 46:06.400] up with pseudo logic, things that seem to make sense to them.
[46:06.400 --> 46:10.400] But if they didn't have a financial thrust in the matter,
[46:10.400 --> 46:13.400] they would look at it quite differently.
[46:13.400 --> 46:16.400] That's right. We can always find a justification for what it is
[46:16.400 --> 46:19.400] that we really want. Everybody should understand that if you're
[46:19.400 --> 46:22.400] a parent this time of year at Christmas time, you can always
[46:22.400 --> 46:25.400] understand that people come up with a justification for what
[46:25.400 --> 46:28.400] they want. And that's as true of a government as it is of
[46:28.400 --> 46:31.400] corporations out there. And it's really dangerous when the two
[46:31.400 --> 46:34.400] of them connect with each other. I think that's one of the
[46:34.400 --> 46:37.400] things, you know, you talk about connections and the importance
[46:37.400 --> 46:40.400] of it and how we can try to connect these different factors
[46:40.400 --> 46:43.400] each of us individually. But I think it's the human connection
[46:43.400 --> 46:46.400] that is out there that is going to be essential for all of this.
[46:46.400 --> 46:50.400] It's going to be our collective work on all this. What do you
[46:50.400 --> 46:54.400] think about that? Would you agree with that?
[46:54.400 --> 46:57.400] Well, I'd agree with it. But there's so many things that are
[46:57.400 --> 47:01.400] taking place now that are causing the schisms and splitting
[47:01.400 --> 47:06.400] people into factions and belief systems and political points of
[47:06.400 --> 47:10.400] view. And that's very dangerous because then you can't get
[47:10.400 --> 47:14.400] together any kind of unity even in the face of an emergency.
[47:14.400 --> 47:18.400] Well, I think we've always had these, you know,
[47:18.400 --> 47:21.400] factions and things like that. You know, the founders of the
[47:21.400 --> 47:24.400] country warned about factions and political parties. But I think
[47:24.400 --> 47:27.400] what makes it unique is that when you're interacting with
[47:27.400 --> 47:31.400] people on a personal basis, you interact with them a little bit
[47:31.400 --> 47:35.400] differently than if you've got that separation between you that
[47:35.400 --> 47:38.400] technology is giving us now. Because now you're interacting
[47:38.400 --> 47:41.400] with something that's abstract. It's not with another person.
[47:41.400 --> 47:44.400] And there's also the body language that you're not picking up
[47:44.400 --> 47:47.400] on. But it makes it easier for you to be harder on people when
[47:47.400 --> 47:50.400] there's that distance there, I think. That's why I think, you
[47:50.400 --> 47:54.400] know, the personal connection I think is really vital to making
[47:54.400 --> 47:57.400] these connections and coming up with an understanding of what's
[47:57.400 --> 48:00.400] going on. We talk about the hidden factors that are out there,
[48:00.400 --> 48:03.400] seemingly unrelated topics. And, as you pointed out
[48:03.400 --> 48:07.400] earlier, just talking to ordinary people about what they
[48:07.400 --> 48:10.400] see in different things, I think that is the genius of the
[48:10.400 --> 48:15.400] collective free market out there that there's so many observers
[48:15.400 --> 48:19.400] who are looking at things and thinking about them. And it's
[48:19.400 --> 48:22.400] kind of their collective decision that is kind of guiding things
[48:22.400 --> 48:26.400] along, as opposed to having a central planner who's doing that.
[48:26.400 --> 48:30.400] What do you think about that? You've got in your final chapter,
[48:30.400 --> 48:34.400] A New Way of Thinking, you have what you call a sensible solution.
[48:34.400 --> 48:37.400] What does that really involve?
[48:37.400 --> 48:40.400] I'm sorry, I didn't hear what you said. What's the last part?
[48:40.400 --> 48:44.400] You have a sensible solution. What do you think a sensible solution
[48:44.400 --> 48:49.400] to the kind of stress and chaos and anxiety that we have,
[48:49.400 --> 48:52.400] manipulation that we have? What is the solution to that?
[48:52.400 --> 48:56.400] Well, I think Wikipedia is a good example of that. They have
[48:56.400 --> 49:01.400] people from all walks of life, all levels of education, free to
[49:01.400 --> 49:06.400] contribute to whatever topic they want. It may be
[49:06.400 --> 49:10.400] helpful. I mentioned earlier about the effect of global warming
[49:10.400 --> 49:15.400] on the making of cheese. There might be somebody who makes cheese
[49:15.400 --> 49:19.400] that's going to come up with some idea. You know, we don't know that.
[49:19.400 --> 49:23.400] We don't know that that might not be where some original idea
[49:23.400 --> 49:27.400] on what to do about global warming comes from. And they could put it on what I'd like to
[49:27.400 --> 49:31.400] see developed, and I hope it will be: a kind of Wikipedia where the
[49:31.400 --> 49:35.400] ordinary person can feel free to put forth their ideas about it.
[49:35.400 --> 49:39.400] Now, you say, well, we already have that. We have the internet.
[49:39.400 --> 49:42.400] No, we don't. The internet is a commercial situation.
[49:42.400 --> 49:46.400] It's all done for making money and grabbing attention and all that.
[49:46.400 --> 49:50.400] And there's no criticism of it. There's no peer review, if you will.
[49:50.400 --> 49:54.400] Whereas on Wikipedia, I mean, you know, people can write in and say,
[49:54.400 --> 49:58.400] well, that particular contribution is bonkers and then give an example
[49:58.400 --> 50:02.400] why it is or that was a very good idea. And after that, you begin to get
[50:02.400 --> 50:08.400] things coming together in unpredictable ways that may help us solve
[50:08.400 --> 50:12.400] these eight problems. You know, the problem is, it seems like whenever
[50:12.400 --> 50:16.400] you wind up having a forum or a place where things can be shared, and that's true
[50:16.400 --> 50:20.400] of the Internet, it's also true of Wikipedia, then you wind up with
[50:20.400 --> 50:25.400] gatekeepers who are there. And we saw this in spades throughout the
[50:25.400 --> 50:30.400] COVID stuff, that if somebody's got a different idea, rather than debate them,
[50:30.400 --> 50:35.400] the impetus is to silence them by the people who are in authority.
[50:35.400 --> 50:40.400] And so that really, I think, is the key thing. And I think as part of that,
[50:40.400 --> 50:51.400] we see a continuing rise in disdain for and deprivation of free speech.
[50:51.400 --> 50:54.400] People are not interested in the principle of free speech.
[50:54.400 --> 50:58.400] They don't want to have open debate. And I see this regardless of where people
[50:58.400 --> 51:03.400] are coming from on the political spectrum. There is a declining interest
[51:03.400 --> 51:08.400] in debate and thinking. Debate is critical to critical thinking.
[51:08.400 --> 51:13.400] And so the people who are in charge, the gatekeepers, whether it's Wikipedia
[51:13.400 --> 51:19.400] or the Internet or any other form of information, they are weighing in on that.
[51:19.400 --> 51:23.400] And they don't want things that they disagree with. And it might be because
[51:23.400 --> 51:27.400] they've got an agenda or it might be because they've just got a particular
[51:27.400 --> 51:31.400] prejudice about something. They want to make sure that the contrary views
[51:31.400 --> 51:36.400] don't get out there. That, I think, is the real key that's there.
[51:36.400 --> 51:41.400] And again, this is part of this atomization that we have of people,
[51:41.400 --> 51:46.400] feeding that tribalism in a way that we've never seen it before, using technology.
[51:46.400 --> 51:51.400] I would agree with everything you've just said, exactly. And I think we have to try
[51:51.400 --> 51:56.400] to get beyond that. But we get back again to this business of people having
[51:56.400 --> 52:01.400] their own personal financial point of view and position and pushing that,
[52:03.400 --> 52:08.400] basically because of the way they look upon it. So maybe we're talking about a
[52:08.400 --> 52:12.400] capitalism problem. We've got capitalism. That's what this country is all about.
[52:12.400 --> 52:16.400] But I mean certain parts of it now. We've gotten to the point where people
[52:16.400 --> 52:21.400] are unable to take another point of view if it's going to be financially harmful
[52:22.400 --> 52:27.400] and hurtful to them. Yeah. I think that, you know, when we start looking at the tech
[52:27.400 --> 52:31.400] companies, I don't think their capitalism would exist. I don't think they'd
[52:31.400 --> 52:36.400] have billions of dollars if they weren't unified with the government. So there's a
[52:36.400 --> 52:41.400] symbiosis there that the two of these entities feed off of each other.
[52:41.400 --> 52:46.400] And I think that's right. That nexus right there is the difficult thing.
[52:46.400 --> 52:51.400] And so I think, you know, when I think of capitalism, I don't like to refer to
[52:51.400 --> 52:55.400] capitalism anymore because I think of it as a partnership, a public-private
[52:55.400 --> 53:00.400] partnership, some kind of economic fascism where they are working together.
[53:00.400 --> 53:04.400] But I like to think of a free competitive market where the government doesn't have
[53:04.400 --> 53:09.400] any role except as some kind of a referee between two parties that have a conflict
[53:09.400 --> 53:13.400] or something. But yeah, that's the thing that's really driving this. You know,
[53:13.400 --> 53:16.400] many people, when they talk about AI, they say, well, you know, here's a couple of
[53:16.400 --> 53:20.400] different outcomes. Maybe this stuff really works the way it's supposed to work and
[53:20.400 --> 53:24.400] takes everybody's jobs and we wind up with a depression. Or maybe it doesn't work at
[53:24.400 --> 53:29.400] all, in which case the big AI stock bubble that we've got bursts and everybody loses
[53:29.400 --> 53:33.400] their job because of that. And so there's a third alternative, and that is that the
[53:33.400 --> 53:39.400] government keeps propping it up with public funds because it feeds their surveillance
[53:39.400 --> 53:44.400] and manipulation needs, their ability to surveil and to control us. And I really
[53:46.400 --> 53:50.400] think that that's where this is all going to head. You know, those
[53:50.400 --> 53:54.400] other two things may happen and they may be true, but I think there is a customer out
[53:54.400 --> 53:59.400] there for the AI stuff that is driving all this, one that has been putting out these
[53:59.400 --> 54:03.400] proposals for the longest time. And that's governments, governments around the world.
[54:03.400 --> 54:07.400] I mean, we look at the BRAIN Initiative that we had a few years ago. That was during the
[54:07.400 --> 54:12.400] Obama administration, but things like the brain-computer interfaces that Elon Musk and
[54:12.400 --> 54:17.400] many other tech companies are working on out there. There's Neuralink and a lot of
[54:17.400 --> 54:23.400] others doing that. That's being driven by the government wanting to connect into
[54:23.400 --> 54:27.400] our minds, hack into our minds really. And they've been funding that kind of stuff. So
[54:27.400 --> 54:33.400] how do we break that? Yeah. On the Musk side, he's doing it for money. I mean, obviously
[54:34.400 --> 54:39.400] to make money. That's right. So there's an unholy alliance, if you will, between someone
[54:39.400 --> 54:44.400] who can't see anything other than the dollar and, on the other side, a government that can't see
[54:44.400 --> 54:49.400] anything other than increasing power and surveillance over the population. Yeah, that's right.
[54:49.400 --> 54:54.400] Absolutely true. Well, it's a fascinating book. It's a fascinating take on this. And
[54:54.400 --> 55:00.400] of course, you've written many books on the brain. The memory one, very interesting. And
[55:00.400 --> 55:05.400] you do have sections about memory in this book as well. And people will be able to find
[55:05.400 --> 55:10.400] this on Amazon, which I guess is the best place they can find it, by looking for the title.
[55:10.400 --> 55:17.400] And it is, you know, it is something that I think we all need to think about: how
[55:17.400 --> 55:23.400] we're going to handle the effects that this technology is having on our brains in the
[55:23.400 --> 55:29.400] 21st century. And that is the title of the book, The 21st Century Brain by Richard Restak.
[55:29.400 --> 55:33.400] Thank you very much, Dr. Restak. Thank you. Appreciate you coming on.
[55:33.400 --> 55:35.400] Good day. I enjoyed it very much. Thank you.
[55:35.400 --> 55:38.400] A very interesting conversation. Thank you. Have a good day. Folks, we're going to take
[55:38.400 --> 55:41.400] a quick break and we will be right back.
[55:41.400 --> 55:59.400] The Common Man. They created Common Core to dumb down our children. They created Common
[55:59.400 --> 56:06.400] Pass to track and control us. Their commons project to make sure the commoners own nothing
[56:07.400 --> 56:14.400] in the communist future. They see the common man as simple, unsophisticated, ordinary.
[56:14.400 --> 56:22.400] But each of us has worth and dignity created in the image of God. That is what we have
[56:22.400 --> 56:28.400] in common. That is what they want to take away. Their most powerful weapons are isolation,
[56:28.400 --> 56:34.400] deception, intimidation. They desire to know everything about us while they hide everything
[56:34.400 --> 56:41.400] from us. It's time to turn that around and expose what they want to hide. Please share
[56:41.400 --> 56:47.400] the information and links you'll find at TheDavidKnightShow.com. Thank you for listening.
[56:47.400 --> 56:57.400] Thank you for sharing. If you can't support us financially, please keep us in your prayers.
[56:57.400 --> 56:59.400] TheDavidKnightShow.com.
[57:27.400 --> 57:31.400] It never ends with thousands of the hottest free-to-play social casino games, fastest
[57:31.400 --> 57:35.400] payouts, and the best promotions in the industry. No tricks or gimmicks. Owned and operated in
[57:35.400 --> 57:39.400] the USA. Moto Casino is a free-to-play social casino. No purchase necessary. 21 plus to
[57:39.400 --> 57:45.400] play. Void where prohibited. Sign up today for a generous welcome bonus. Moto Casino.
[57:45.400 --> 57:51.400] America's social casino. Download the Moto Casino app today. When it's time to scale
[57:51.400 --> 57:56.400] your business, it's time for Shopify. Get everything you need to grow the way you want,
[57:56.400 --> 58:04.400] like all the way. Stack more sales with the best converting checkout on the planet. Track
[58:04.400 --> 58:09.400] your cha-chings from every channel right in one spot and turn real-time reporting into
[58:09.400 --> 58:15.400] big-time opportunities. Take your business to a whole new level. Switch to Shopify. Start
[58:15.400 --> 58:17.400] your free trial today.