Mind Meets Machine: AI’s Role in Mental Health – Keith Kurlander & Dr. Will Van Derveer – HPP 142
The rise of artificial intelligence (AI) provokes both excitement and fear. Given its recent rapid growth, we believe it is important to be well-informed about AI’s role in society and mental health. In today’s episode, we assess the evolution of technology and its profound impact on our well-being. Our conversation unpacks how AI influences not only individual habits but also cultural norms.
We discuss the concept of attention sovereignty and express our concerns about social media and smartphones, particularly among younger demographics. Looking to the future, we explain how AI applications might exploit our attention and dopamine systems, leading to addictive behaviors and a reduced capacity for emotional regulation. If we are not mindful, the hijacking of our attention systems can erode the natural resilience built through human connection and real-life challenges. We address the need for educational initiatives to manage attention amid this overwhelming digital stimulation, advocating for intentional technology use to mitigate distractions and foster emotional regulation.
We also explore the promise of AI in revolutionizing mental health care and medicine, noting its emerging potential in areas such as diagnostics, personalized psychological support, and AI-enhanced therapy. As AI becomes more and more integrated into our lives, we emphasize the importance of pausing, reflecting, and recalibrating our focus so that we do not become overly dependent on AI interactions. This episode serves as a powerful reminder to remain self-aware about how we let technology shape our futures.
Show Notes:
Importance of Pausing and Reflecting in Life – 1:00
Reflecting on where attention goes is the first step to making conscious choices.
Technology in Mental Health – 6:01
Technology has drastically changed communication and access to information. The ability to filter overwhelming data streams can mitigate negative effects on mental well-being.
Attention Sovereignty and Cultural Impact – 12:01
Digital media platforms significantly influence individuals’ attention, leading to reflexive responses and loss of personal control.
Regaining Attention Control – 18:01
Attention control lies on a spectrum of capacity, and constant, mindless scrolling can have detrimental consequences for emotional regulation.
Risks and Benefits of AI in Mental Health – 24:01
While AI might support personalized therapy and enhance diagnostic accuracy, there is a risk of over-reliance on AI and diminishing personal resilience.
Mindful Technology Use and Personal Agency – 32:01
Increased self-awareness and intentional engagement with screen time can improve both mental and physical well-being.
Impact on Social and Family Dynamics – 38:01
There is growing concern about the potential for AI to replace human relationships and the adverse effects this could have on children and society. Maintaining genuine human connections can ensure healthy emotional regulation.
Confronting Challenges and Future Implications – 43:01
Potential positive impacts of AI on healthcare include increased lifespan and enhanced patient care. Critical questions need to be asked in order to find balanced engagement and personal agency.
Full Episode Transcript
Keith Kurlander [00:00:06]:
Thank you for joining us for the Higher Practice podcast. I’m Keith Kurlander with Dr. Will Van Derveer, and this is the podcast where we explore what it takes to achieve optimal mental health.
Dr. Will Van Derveer [00:00:17]:
Hey, Keith, good to see you.
Keith Kurlander [00:00:19]:
Hello again.
Dr. Will Van Derveer [00:00:21]:
So today we want to get into an emerging conversation. It’s been going on for a little while now, but it is becoming more front and center as we see the development of ChatGPT and artificial intelligence, and a lot of conversation about the ups and downs and pros and cons and where this is headed in terms of how it’s going to potentially impact us, not only personally, but also in terms of bigger themes of our culture and how we relate to one another.
Keith Kurlander [00:00:56]:
So because what’s happening in technology right now is such a hockey stick, particularly around artificial intelligence, this is an episode about what to be considering in this time period. There’s always a continued question: how do we stay well? How do we feel well? How do we act well? Right. So much change has been occurring in technology since the turn of the century, and it has absolutely impacted people’s ability to feel well, both positively and negatively. And the hockey stick is going to keep going right now with AI, in terms of how this impacts culture and our individual experience of mental wellness. So that’s what this is about: we’re trying to pay attention enough here together so that we can see what’s happening and don’t get influenced without realizing it. So let’s start there.
Dr. Will Van Derveer [00:01:52]:
I’m having this funny memory come up right now as you were speaking, of something that became the Internet, which I got to see unfold. When I was 19, I spent the summer at Stanford University as a research assistant to a psychology professor. And they were communicating with people at other universities through this messaging system. They didn’t call it email, I forget what they called it, but it became email. And at that time, I’m just reflecting on my life then, there were no cell phones, there was no Internet, there was no email, no texting. If you got to where you were going, nobody knew whether you had gotten there or not until you were there, and so on. It’s hard to even put your mind back in the late 80s and think about what life was like without what you were talking about, what happened and what continues to speed up. It seems like every week, every month, every year, it speeds up faster and faster.
Keith Kurlander [00:02:59]:
Yeah, well, let’s take a quick look at what’s happened, right? Because it’s hard to extract and really consider the way we think, the way we process, the context of our emotions. The context now is different than 20, 25, 30 years ago. And that’s true throughout all of human history: you have different time periods, and the way we conceptualize worldviews and mental health changes because the context changes. Right? But we had such a rapid change that started in the late 80s, early 90s with the Internet coming to form. The Internet completely changed the way we socialize, the way we interact with each other, and the way we do business. And access to information changed so greatly, so fast. That access to information has impacted us in both negative and positive ways, in terms of how our identities form and who we see ourselves to be. Now, with the amount of information that keeps speeding up, it can be overwhelming to know what to pay attention to. We’re these very interesting organisms; we’re a little different than other animals in that we have to know what to filter in and out. There’s so much complex information we receive that’s not just bioenergetic. There’s so much social information, information about norms, and messaging. So we have so much to filter. Right. And that sped up in a big way, starting with the Internet. Now there are positive sides that have come out of that, obviously. We have more access to information. We can educate ourselves more globally. People can get to educational materials faster. They can learn more about the world and what’s happening in other parts of it. And then there’s the downside, which is: how do you know what to filter? Us humans, we can go into addictive patterning and not filter at all, and consume things that are negative for our brain structure, getting into addictive cycles in our brain. That could be pornography, or it could just be addictive entertainment material that we’re pounding at our brains all day long on our phones.
Dr. Will Van Derveer [00:05:17]:
Or it could be doomscrolling, or getting hijacked by misinformation and misinformation panic.
Keith Kurlander [00:05:27]:
People who have large microphones that scare people. There was always that process where people had microphones, but there’s this new process where the individual can get access to so much. And that didn’t really take off until the cell phone, fully and intensely. That’s when it really skyrocketed, when we had cell phones that were now computers. People were able to do it all day long; you didn’t need to pull out your laptop or your desktop. From 2000 on, for the last 20 years, we’ve had this thing. And then we introduced new forms of media that were never on the planet before, like social media, where the individual puts out the media, versus a platform or organizing body. Right. Again, positive and negative, but it’s impacted our mental health. And we’re still in this phase. We’re going to talk about AI today, but we’re still in the phase of asking, well, how is this impacting how we see the world and how we’re behaving with each other? And again, we’ve talked about this before, but if you look at younger people, it doesn’t seem like it’s impacting them very well.
Dr. Will Van Derveer [00:06:40]:
I mean, one of the things that we never needed before, that we now need, is to pay attention to how we allocate our attention. There used to be so much less data coming in. And I’m going back, way back, before the turn of the century. If you go back a hundred years, the amount of data that was coming into our nervous systems was way, way less. Plus, we didn’t have these dopamine hooks, the things coming across our eyeballs and into our nervous systems, nearly as much. I mean, even sugar was hard to get a hundred years ago, if you think about dopamine.
Keith Kurlander [00:07:21]:
Yeah. You know, you had drugs or alcohol, but again, the choices were more limited.
Dr. Will Van Derveer [00:07:27]:
Right.
Keith Kurlander [00:07:27]:
And the impact of those choices was very noticeable when you were very addicted to them.
Dr. Will Van Derveer [00:07:31]:
Right. Maybe this is a pipe dream, but I think one of the future themes we’re going to need to see in education is how to steward your attention properly. I really resonate with the perspective that it used to be, oh, our time is our finite resource in our lives, but I really think our attention is our greatest asset. And when you think about children growing up on devices, on video, and all this stimulation, this hockey stick of stimulation, there are really not that many people thinking about how we train our children to actually have sovereignty over their own attention: who gets it, what gets it?
Keith Kurlander [00:08:14]:
Yeah, I mean, you’re speaking to a really important point, and again, this is just what we’re facing now. We’re going to talk about AI in a minute. But we’re not training our children how to work with their attention, because we’re not training ourselves how to confront the fact that we have attention-seeking devices in our hands all day long now. If you’re not consciously choosing where to place your attention, the device will choose it for you.
Dr. Will Van Derveer [00:08:47]:
Exactly.
Keith Kurlander [00:08:48]:
And the platform will choose for you where to place your attention. That is not being discussed enough. Right. And it’s not being addressed enough as a culture: we are more and more giving away our sovereignty and power over our own attention and allowing software to tell us where to put our attention, which will get us to AI in a little bit, because it’s related. But think about that consequence, the cause and effect of it. This is a global thing. Billions of people use social media, so this is a global issue, not a United States issue. We don’t know yet the consequence of having given up so much power to software that entrains our attention onto things that aren’t that important to us. They give us dopamine and they excite us, they excite our system, but they’re not meaningful. They’re not deeply meaningful.
Dr. Will Van Derveer [00:09:53]:
For example, I’d be really curious, if I could design a study and just speak into the microphone and have AI do a medical study for me right now, which maybe one day will happen, I would love to see the average capacity for controlling one’s attention compared to that capacity in people who do amazing things: Olympic athletes, business leaders, people who excel in their chosen domain of expertise. How do you get anything done if you’re not sovereign over your attention? Right.
Keith Kurlander [00:10:32]:
I think you get less done. That’s problem A. And the second thing is, I think it’s very dysregulating to the nervous system when you don’t have a focal point for your attention and it’s scattered across things that are not meaningful, just exciting. I think it’s very dysregulating to the nervous system. And again, we haven’t seen the full outcomes, but we know there’s a mental health issue that has been growing over the last couple of decades. We know that. And you have to start asking questions about what’s causing it. To me it’s pretty clear this is one of the factors, and the problem is that this factor isn’t being dealt with and it’s getting worse. I think as a mental health provider you also have to ask yourself, am I even asking about this? And again, I’m not saying I don’t do these things. I do, I definitely do. And I’m not saying this should all just go away. But if you’re not asking these questions, you’re not looking at what people are up against, right? They’re not just up against their history.
Dr. Will Van Derveer [00:11:42]:
Right.
Keith Kurlander [00:11:43]:
Like they’re up against their current reality right now. And it’s a serious, serious, complex reality right now, given that we’ve lost our power over our own attention. That’s a really big deal.
Dr. Will Van Derveer [00:11:56]:
Yeah. I mean if our goal in mental health care is to empower people and set them free from what binds them up, it’s hard to imagine accomplishing that without talking about this, I think.
Keith Kurlander [00:12:10]:
So again, this applies if you haven’t dealt with it yourself. I’ve dealt with it in spurts, where I start dealing with it and then I don’t. But I have enough self-awareness to know that when I start scrolling for too long on these platforms, I can notice things change inside of me. Sleep can change. My general inspiration and level of energy can start to deplete. I can feel more agitated. Most people can relate to this if they’ve scrolled for two or three hours and then all of a sudden you’re like, I don’t feel great right now; maybe that wasn’t a good idea. But you do that every day. Or maybe you’re up to three to five hours. Go look at your screen time on your phone. Go get honest. Just go look. Your phone will tell you. You might think, oh, I don’t scroll that much, and then you look at your screen time and realize, I was on Instagram five hours today. And then it’s like, well, what is that doing to you? What is it doing to your nervous system, and what is it doing to your mind and how you view yourself in the world? So yeah, I think as mental health providers we do need to be asking that question. And as individuals, if you care about your own mental health, if you care about living life in a superb way, you definitely have to ask: how is this stuff affecting me? Right.
Dr. Will Van Derveer [00:13:34]:
Right. Before we started recording, you and I were talking for a few minutes about this issue of emotional regulation, the role of emotions in mental health, and what it really means to have a healthy emotional system. Right. And inside the context of this conversation, we’re allowing an algorithm that is designed to stoke shorter attention spans and emotional dysregulation. I think about it with a metaphor: it’s kind of like saying, okay, I’m going to take on a challenge that fills every minute, every nook and cranny of my day. Every time I’m not on a focused, purposeful task to get something done, I’m going to be lifting weights in the gym. It’s the emotional equivalent of that: I’m taking on a challenge every moment of every day, outside of what I’m trying to get done. That’s a little bit of what the emotional cost of this habit of scrolling is.
Keith Kurlander [00:14:44]:
Right. Probably a healthy way to relate to this is to really consciously choose when you’re throwing the whole project out the window and saying, all right, right now I’m going to allow this thing to drive my attention, and for this amount of time. You’re choosing to do it as a space to let go. There’s probably a healthy version of that, where you’re like, okay, I’m on for this much time. I just want to let go. I don’t want to have to drive my attention anymore. It’s like, I need a break.
Dr. Will Van Derveer [00:15:25]:
Right, exactly. But if we’re doing it mindlessly, we’re doing it all the time. Going back to the weights metaphor: are you really going to get stronger by lifting weights constantly, all day, every day, mindlessly? Or is it actually more effective, if the goal is to be strong, to lift weights for an hour, three days a week?
Keith Kurlander [00:15:47]:
Yeah, obviously you’re going to get stronger with a program.
Dr. Will Van Derveer [00:15:51]:
Right.
Keith Kurlander [00:15:51]:
You know, with recovery, rather than just haphazardly lifting weights, hurting your body, and not recovering.
Dr. Will Van Derveer [00:16:02]:
Yeah, yeah.
Keith Kurlander [00:16:03]:
And so you brought up an important question about emotion in this whole thing.
Dr. Will Van Derveer [00:16:09]:
Right, right.
Keith Kurlander [00:16:11]:
Because there’s a huge interface happening here between technology and emotion, and there’s going to be a massive interface between AI and emotion coming. That’s very close. Right now we have this interface where, emotionally, you can use technology in a generative way, gain energy, and feel emotionally inspired. A lot of people do. They’re using it to meet people and make great connections, find information they need, build things, create new things, and innovate. That’s amazing, and that’s the miracle of our species, right? Innovation and collaboration. But then there’s this interface between technology and emotion where you can use technology in a hedonistic way, where your driver is: I need more pleasure, I need less pain. Right. You’re using it as an emotional tool to keep trying to rush yourself into moments of pleasure and joy and happiness, and away from sadness. And we know that doesn’t work in the long run.
Dr. Will Van Derveer [00:17:33]:
And that interface doesn’t work because we can’t create anything of much substance without sticking it out through the inevitable adversity that shows up when we’re trying to do something new and different.
Keith Kurlander [00:17:52]:
Yeah, it doesn’t work on that level, that’s right. It also doesn’t work on the level of just being dysregulated.
Dr. Will Van Derveer [00:18:00]:
Yeah.
Keith Kurlander [00:18:02]:
It’s not regulating. The homeostatic set point in the body is not constant high pleasure and low pain; that does not bring us to homeostasis. And so, yeah, we don’t generate as much in our lives when we live that way. Typically. Some people figure it out and still do.
Dr. Will Van Derveer [00:18:23]:
Wow.
Keith Kurlander [00:18:24]:
Not me, I’m not good at it. I’ve tried many times. Some people can. But imagine the impact you could have with your clients when you’re able to practice the most cutting-edge modality available today. Psychedelic therapy is the future of mental health care, and the Integrative Psychiatry Institute will empower you with the tools and knowledge you need to master this exciting modality. IPI’s comprehensive training and in-person experiential practicums will elevate you personally and professionally. Its in-depth curriculum is the gold standard certification in the field. When you join, you will step into a global community of thousands of innovative colleagues who are integrating psychedelic therapy into their practices. Visit psychiatryinstitute.com/apply, where you will find all the information you need about IPI’s training. And when you visit psychiatryinstitute.com/apply, you’ll also receive IPI’s free eBook, Getting Started with Psychedelic Therapy, so you can get the most up-to-date information immediately. Again, that’s psychiatryinstitute.com/apply to learn more about the training and to get your free ebook.
Dr. Will Van Derveer [00:19:34]:
Yeah, I mean, I think about some of the artists whose work I know and love, and the level of intensity of the emotional swings in what I know about their lives, some of them dying early by suicide or addiction or various problems, and yet an incredibly beautiful output of work. So that can happen too. But I think what you said is really the key piece: if we’re talking about mental health, then we have to be thinking, consciously or unconsciously, about homeostasis. Balance, resilience, how do we tolerate the challenges in our lives? Most people would not consciously choose this, but I think a lot of people are unconsciously choosing this path, where the MO of your life is: let’s avoid pain and go after more pleasure. Because this is what the lower part of the human nervous system does, the limbic system and the reptilian brain, the brainstem. It’s all driven toward that. It’s animalistic instinct. Right. You put your hand on the stove and you jump away. It protects you, but it also can own you if you’re not mindful about how you balance the front part of your brain.
Keith Kurlander [00:21:00]:
Yeah.
Dr. Will Van Derveer [00:21:00]:
The back part.
Keith Kurlander [00:21:01]:
Yeah. You should create a bumper sticker that says Instagram owns you. Well, Meta owns you. Yeah.
Dr. Will Van Derveer [00:21:09]:
It’s a brilliant technology where the creators of the content are the customers. Right. You don’t have to.
Keith Kurlander [00:21:16]:
That’s right.
Dr. Will Van Derveer [00:21:17]:
If you’re Instagram, you don’t have to invest any money whatsoever in creating content.
Keith Kurlander [00:21:23]:
That’s right. And you know, it’s a tool, like a hammer. It can be used in a million ways. I don’t go into conspiracy thinking about who created these things and whether they’re out to get us. It’s a tool, and it’s used in a lot of great ways, but it’s a tool we’re very vulnerable to. So let’s get into AI. Look, 10 years ago, in 2014, this conversation about social media wouldn’t have meant that much yet. It would have fallen on deaf ears. Even 10 years ago it would have started to mean something, but not really, right? Now it means a lot: okay, go pay attention, because the way you see the world and your mental health are being impacted. So pay attention if you care about your mental health, your friend’s mental health, your partner’s mental health, your clients’ mental health. But here’s the thing: 10 years from now we’re going to be having the same discussion about AI. And if you don’t watch for it over the next 10 years, you’re going to be sitting there in 10 years going, oh my God, I got even more hijacked. The level of hijacking that’s going to happen with AI, as well as the level of positive innovation, is going to be way bigger than what’s happening right now.
Dr. Will Van Derveer [00:22:45]:
What do you see as way bigger about it? What’s on your radar as most concerning?
Keith Kurlander [00:22:49]:
What’s on my radar, which is both a concern and an excitement, is that you will be able to completely replicate a human being in the form of software: emotions, personality, uniqueness, individuality, creativity, intelligence. You’re going to be able to replicate that. We’re not that far away. Ten years from now you’re probably going to see it, from what I’m reading. Maybe it’s 15 years from now, but it’s close. I’ll be online with you, except it’s not you. It’s an AI avatar of a completely made-up person who doesn’t exist in the world, acts exactly like a human being, and you cannot tell the difference. We’re not that far away, right? So that’s coming. So what concerns me, and also the innovation possibility, the positive thing, is huge. I mean, well, let me, let me.
Dr. Will Van Derveer [00:23:46]:
Just ask you a clarifying question before you go there. And I want to get back to that. But what came up for me is a question about fully replicating a human being. My experience as a human being includes, and maybe this is just me because I have a big trauma history and a big ACE score, and I come from decades of psychotherapy that I’ve been in, and the world that we’ve been swimming in. But my point is, the fully fledged human experience that I have often includes things, themes, emotions, memories, traumas, things coming up from the underground or the subconscious mind. And they come up triggered by current events. For example, a couple of days ago a friend of mine killed himself, and it brought things up for me, as it would. But what I get curious about is: is a fully fledged AI human going to be having that? If I’m an AI 10 years from now, talking with you, and you’re human, am I going to be seemingly randomly, occasionally, getting
Keith Kurlander [00:24:57]:
Thrown off by sourcing from a fake history?
Dr. Will Van Derveer [00:25:00]:
Yeah.
Keith Kurlander [00:25:01]:
Well, whether it’s 10 years or not, I don’t know, but the answer is, sure, you could choose an avatar that behaves like a human, one with some made-up history that it acts from in a complex way. I think software could do anything like that. I don’t see that as actually that difficult, honestly, for it to act in a spontaneous manner where the AI profile has some very, very extensive history that it’s working from. You could probably choose that avatar. Maybe you wouldn’t want to; maybe you’ve got enough of that with real humans. So I think the answer is yes. This is talked about in these AI circles: we can do anything like that, and it’s not that far away. But to me, the question then is, okay, let’s say you could be relating to an AI that is acting like a human and you would just never know. Right. If you put us on a screen with a non-human and a human, there will be a point where humans will fail the test.
Dr. Will Van Derveer [00:26:13]:
Right.
Keith Kurlander [00:26:13]:
The Turing test is not that far away. The concern for me is more like the concern about how we use social media than anything else. I mean, you could get scammed and all that, but that’s a different issue; I’m not talking about that, and it’s already happening. The concern for me is more that we’re very vulnerable as human beings to going into this more addictive cycle in ourselves, especially if it’s a culturally accepted addiction. We’ll choose AI relationships, and we’ll choose to use AI in ways that further this issue for us. That’s my concern. And the thing is, what we’re seeing in social media is still very primitive, actually. It’s still pretty simple: choose this video or that video, A or B, and it’s all based on whether we click or not. Very primitive, actually simple. I mean, it’s advanced and it’s simple. But when you start getting artificial intelligence more and more involved in it, now it’s really going to be able to relate to what gives you the most satisfaction, the most support, the least amount of challenge. And it’s just very easy to get addicted.
Dr. Will Van Derveer [00:27:34]:
We’re not talking about one of the possible futures that people talk about, like Ilya Sutskever talks about. You know, he was at OpenAI; he left over concerns about AI safety. We’re not talking about, okay, the AI is taking over. We’re talking.
Keith Kurlander [00:27:47]:
Yeah, we’re not talking about that. Which is a different concern.
Dr. Will Van Derveer [00:27:50]:
It’s a different concern.
Keith Kurlander [00:27:51]:
It’s not my thing. I’m not talking about that.
Dr. Will Van Derveer [00:27:53]:
Yeah, it’s a different concern. We’re talking about AI that’s sophisticated enough that you can’t tell the AI is AI, but still a more unsophisticated type of AI, not the kind where humans are working for AI.
Keith Kurlander [00:28:07]:
Yeah, we’re talking about a time period before that, at least, if that ever happens. That’s a whole different type of concern. We’re talking more about a sort of mental, emotional, spiritual issue for ourselves as humans: what happens when we start creating relationships with AI. We’ve already created a relationship with software; we’ve done it with social media. That is a relationship. It might not be at the same level, but it’s going to get closer when we get to AI. Right. There’s still a relationship happening where we’re watching people relate to each other, talking and speaking to us, saying things to us as the listener. There’s a relationship, but it’s not that advanced. When we have AI, it’s going to get advanced, and now we have an advanced relationship that feels almost like another person we’re relating to. And my concern is that many people will use that to remove challenge from relating.
Dr. Will Van Derveer [00:29:07]:
Right.
Keith Kurlander [00:29:08]:
Well, and that’s my concern.
Dr. Will Van Derveer [00:29:10]:
Let’s talk about that, because I agree with you. I’m very concerned about that. Let’s flesh that picture out for people who haven’t heard us talk about support and challenge, what love is, how values work for people, and so on. It turns into a territory of really existential, maybe philosophical questions: what matters to people, what happens when what matters to you gets challenged by another person, and what happens when someone supports what’s important to you. Obviously this has implications for tribalism, what kinds of friends we make, what kind of media we look at, basically what our preferences are. I don’t even think it’s fair to say everyday life, but what maybe used to happen more in everyday life is that you would encounter people who would challenge you a little bit here and there.
Keith Kurlander [00:30:01]:
I think that happens every day. I mean, people are still interfacing with each other quite a bit, so you’re getting triggered because you got challenged, or you didn’t like it, or someone mistreated you. You have to stretch right in that moment.
Dr. Will Van Derveer [00:30:14]:
Right, right, right. Yeah. And then again, because I’m a sports fan, I go into a sports metaphor with this stuff. It’s like you have a teacher or a coach who sees your potential and knows you’re not there yet, who is going to kick your ass and really be hard on you because they see your potential.
Keith Kurlander [00:30:36]:
Yeah. And they’re going to focus on where you’re limited.
Dr. Will Van Derveer [00:30:39]:
Yeah.
Keith Kurlander [00:30:40]:
To build you up.
Dr. Will Van Derveer [00:30:41]:
They’re going to keep putting their finger right on the thing that is most uncomfortable for you.
Keith Kurlander [00:30:46]:
Yeah. And that’s what happens in partnership. It happens in romantic partnership, it happens in parenting, it happens in life every day with coworkers, wherever it is. We challenge each other, and that’s not a bad thing. Sometimes we don’t challenge each other in a healthy way, but that’s a different conversation.
Dr. Will Van Derveer [00:31:05]:
So you could look at an actual human right now as a random challenge generator.
Keith Kurlander [00:31:10]:
Yeah. We’re challenge generators, and support generators, and we often don’t do a great job at it. Coming back to AI, it doesn’t mean you can’t go toward an AI that does that for you. But the issue is that we are more and more choosing to move away from that, I think, through these types of tools that just give us a break from it and give us what we want, which is the quickest dose of pleasure without much challenge. And the issue is, we could use AI for that in such a dramatic way, because now we can start actually leaving human relationships more and more, to get only the feedback that’s most satisfying to us and not unpleasant. I don’t think people want to admit that they’re going to do that, but that’s what we’re already doing. I do it too, and we’re going to keep doing it. We’re about to hit a hockey stick over the next decade where that can just go in a wild way, in a new way.
Dr. Will Van Derveer [00:32:15]:
Yeah, yeah. So rather than dealing with a random challenge generator in an actual human, you could be inside an algorithm that is learning what your preferences are and hitting you with more pleasure and less challenge. You could lose a lot of the resilience that we build up, in the absence of that challenge.
Keith Kurlander [00:32:38]:
Yeah, you’ll lose the healthy stress state. Not the unhealthy stress state that can happen if you’re over-challenged or under-challenged, but the healthy stress state where you’re expanding your bandwidth and your capacity for adaptation, innovation, and communication. If you start choosing something that’s over-supportive, and those opportunities to choose more and more over-supportive relating tools are going to increase with AI, I think we’re going to keep seeing that mental illness rate skyrocket. That’s what we’re going to see from it, I think.
Dr. Will Van Derveer [00:33:19]:
And there’s also this opportunity, right, for lonely people, people in remote areas, people with a ton of social anxiety, for example, of having a digital friend that has the skills to gently and therapeutically challenge you to grow and get out of your house, for example, and do the things.
Keith Kurlander [00:33:40]:
Exactly. I mean, that’s where it’s endless. The powerful generative innovation that’s coming is incredible. The AI therapist is coming soon. Let’s say you’re a therapist, a human therapist, and maybe you supplement with something there. The power of the algorithms to understand people is actually going to be pretty cool, to give your provider, say, some new ideas. And for a person who doesn’t want to work with a human provider, there are going to be some really good AI therapists out there. That’s real. They’re going to be good, really good, really able to algorithmically know what is too much challenge and too little challenge, what’s too much support and too little support, and keep adjusting that. Right. That’s amazing. I think we’re going to see some incredible stuff in the next decade around tools for PTSD. We’re going to see some really cool stuff in virtual reality eventually to help people. And I know there’s already been a good amount of research with VR, but the thing is, it’s still primitive. With AI now, it’s going to take off. So we’re going to see some cool stuff.
Dr. Will Van Derveer [00:34:52]:
I think adjacent to this idea of AI-enhanced therapy, or just straight-up AI therapy, is this incredible opportunity that we’re already seeing come true in medicine. For example, I saw this really cool study showing that when you train AI on, let’s say, a hundred thousand chest X-rays and then put a chest X-ray in front of that AI, it’s like putting the eyes of a radiologist who’s had a 40-year career looking at X-rays onto that X-ray. But then it’s limitless in terms of the scale at which you can use it. And they’re seeing things on these X-rays that a human eye doesn’t even see, let alone
Keith Kurlander [00:35:40]:
That’s right.
Dr. Will Van Derveer [00:35:41]:
One that’s been trained for 40 years. So the possibility is really exciting.
Keith Kurlander [00:35:44]:
I mean, it’s super exciting. I think we’re going to see incredible advances in healthcare in the next 20 to 30 years, miraculous advances. We’ll probably see the average lifespan get a lot longer. It basically doubled, a little more than doubled, over the last 130 years. And we’re probably going to see a nice jump again soon with where we’re
Dr. Will Van Derveer [00:36:09]:
Headed right now. As a psychiatrist, I think back on what some of the big challenges have been in providing effective care, and one of them is literally just getting together all the data that someone has. Right. So having an AI that can compile and tabulate and organize and synthesize data, genetics, images, blood work, physical exam, you name it, whatever kind of data there is, and put it all together and have it at your fingertips, is just absolutely incredible. It’s game-changing. So many medical problems or mistakes in care happen just from the absence of having access to that data.
Keith Kurlander [00:36:49]:
Absolutely. Yeah. I mean, it’s incredible. I was reading an article today about diagnosis in mental health with AI; we’re not far away from it helping us. I think it’s incredible. We need a lot of help, and the thing is, we’re going to learn more with AI about what root cause we’re trying to treat. I think that’s where it’s still a little tricky. There are a lot of root causes that we talk about on here, so many, from inside the body to outside the body and the mind. AI can help with that quite a bit, because they’re data points.
Dr. Will Van Derveer [00:37:26]:
Right. And then it can help on the other side of the equation from what we were talking about before, about dysregulation. It can actually train regulation with wearable biomarkers, right?
Keith Kurlander [00:37:36]:
That’s right. Yeah, totally. And that brings us back to what I think is the crux of this conversation, which is that you really have to be choosing what you’re using these tools for. And again, you don’t want to be a robot, like an AI robot, where you’re getting rigid about everything. But you do have to be mindful, or your attention is going to get hijacked and you’re not even in charge of it. Your emotions are going to get hijacked, and you’ll have no capacity to regulate your own emotions. And then, coming down the road, your relationships are going to get hijacked. That’s the next one. We’ve hijacked our minds, our emotions, our nervous systems, to some degree at this point, but we’re still doing okay. Though I think we are starting to see some issues, and we’ve seen it in children a lot. But once we start hijacking relationships, that’s a next-level impact.
Dr. Will Van Derveer [00:38:34]:
Once we do that, I think it expands. It’s a much bigger conversation than we have time for today, but it expands into this bigger conversation about the divide that we see in this country over how people are even perceiving reality. Right. This kind of descent into deeply polarized factions. For example, the view that elections are fraudulent, that American elections are deeply flawed, that the democratic process is not really happening in this country, is a view that a massive number of people already hold. And on the flip side, on the other side of the aisle, the view that election tampering could even be a problem of significance in this country is completely unacceptable; you can’t even imagine it. So it’s what you were saying about polarizing even further: here’s a set of beliefs that I subscribe to and my tribe believes in, and here’s a set over there that that tribe believes in, and there’s no capacity for influence between the two groups. Yeah, it’s kind of scary.
Keith Kurlander [00:39:55]:
Yeah. Well, we’re definitely in a delicate time, clearly. And we’re also in an amazing time. It’s both. We’re in a fragile moment in a new way. There have been many fragile moments in history, but this is a different type of fragile moment, and we have to ask hard questions if we want to live well as individuals. Society and cultures and government could ask those questions too, and they are; government is asking questions about how social media is impacting our culture. But at the end of the day, as an individual, you have to be asking the question if you care about knowing how you’re impacted. How am I impacted? How am I going to be impacted if I start relating to an AI bot a few years from now, when they’re awesome and hard to tell apart from humans, or five years from now, whenever it is? It’s this whole new thing, just like social media was a new thing. But if I don’t ask the questions and I start doing it, then all of a sudden I notice I don’t want to relate to my partner as much; I want to relate to this other thing. I tell myself I actually learn more about myself there, and I enjoy myself there more. And these are real things that are coming. These are not.
Dr. Will Van Derveer [00:41:17]:
Or your children even, right?
Keith Kurlander [00:41:19]:
Yeah. This is not science fiction, you know, this is coming. So right now we’re at a new tipping point with a new technology that’s going to change the way we live our lives. And then you get to choose: are you going to be in charge enough of yourself to use these things in ways that really make your life better and more fulfilling and more meaningful? And you can. I mean, these technologies have totally opened up our worlds.
Dr. Will Van Derveer [00:41:46]:
I mean, right.
Keith Kurlander [00:41:48]:
We couldn’t do what we do right now if it wasn’t for all these technologies. I mean, we use these technologies every day. Yeah.
Dr. Will Van Derveer [00:41:55]:
I mean, I think for me, the take-home, the crux, what it turns on for me, is having conversations like this that remind me that there is an opportunity to pause and reflect and recalibrate. Where am I giving my attention, and who’s got the reins? I’m using a horse analogy here. Who’s got the reins of my emotions?
Keith Kurlander [00:42:25]:
Yeah.
Dr. Will Van Derveer [00:42:25]:
You know, am I in charge, or am I handing it off to something else? Again, like you said, you can choose that, and it’s great if you have free will and you choose it, no problem. The problem is what happens to you when you’re not aware of the choices that you’re making. Yeah.
Keith Kurlander [00:42:44]:
I mean, everybody likes to get on the bucking bronco once in a while and see how long you can stay on there.
Dr. Will Van Derveer [00:42:51]:
That’s right.
Keith Kurlander [00:42:52]:
It’s fun. I would get on there and try it out.
Dr. Will Van Derveer [00:42:55]:
Right. You don’t live on there?
Keith Kurlander [00:42:57]:
No. You don’t want to live on there?
Dr. Will Van Derveer [00:43:00]:
No.
Keith Kurlander [00:43:00]:
You’re gonna have, like, a lot of TBIs if you live on there.
Dr. Will Van Derveer [00:43:04]:
Hurt yourself.
Keith Kurlander [00:43:05]:
Yeah.
Dr. Will Van Derveer [00:43:05]:
Yeah.
Keith Kurlander [00:43:06]:
Well, why don’t we wrap up there? Okay. All right. We’ll be back soon. We look forward to connecting with you again on the next episode of the Higher Practice podcast, where we explore what it takes to achieve optimal mental health.