
Ideas Have Consequences
Everything that we see around us is the product of ideas, of ideologies, of worldviews. That's where everything starts. Worldviews are not all the same, and the differences matter a lot. How do you judge a tree? By its fruits. How do you judge a worldview? By its physical, tangible, observable fruit. The things it produces. Ideas that are noble and true produce beauty, abundance, and human flourishing. Poisonous ideas produce ugliness. They destroy and dehumanize. It really is that simple. Welcome to Ideas Have Consequences, the podcast of Disciple Nations Alliance, where we prepare followers of Christ to better understand the true ideas that lead to human flourishing while fighting against poisonous ideas that destroy nations. Join us, and prepare your minds for action!
AI is Here! And it's Reshaping Humanity
AI is here—it’s reshaping how we see ourselves and our place in the world. In this episode, we use a biblical lens to wrestle with big questions raised by AI. What worldviews are shaping its development? How might it impact our relationship with God, others, ourselves, and even creation?
AI brings both promise and concern—offering breakthroughs in almost every industry while also challenging creativity, work, and human connection. Rather than reacting with fear or blind optimism, let's explore together how to think critically about AI and its role in our lives. How can we approach this new reality with discernment and faith?
- View the transcript, leave comments, and check out recommended resources on the Episode Landing Page!
We're in an age right now where technology is a different kind of thing. It's shaping us. It's shaping who we are, how we understand ourselves. It's being used as a tool of control, mass control. All of that, I think, has to be seen against the backdrop of this discussion on AI.
Luke Allen:Hi friends, welcome to Ideas Have Consequences, the podcast of the Disciple Nations Alliance. Here on this show we examine how our mission as Christians is not only to spread the gospel around the world, to all the nations, but also to be the hands and feet of God, to transform the nations to increasingly reflect the truth, goodness, and beauty of God's kingdom. Tragically, the church has largely neglected this second part of her mission, and today most Christians have little influence on their surrounding cultures. Join us on this podcast as we rediscover what it means for each of us to disciple the nations and to create Christ-honoring cultures that reflect the character of the living God.
Scott Allen:Welcome again, everyone, to a new episode of Ideas Have Consequences. This is the podcast of the Disciple Nations Alliance. I'm Scott Allen, I'm the president of the DNA. Joining me once again are my co-workers Luke Allen and Dwight Vogt. Hi, team.
Dwight Vogt:Hi Scott.
Scott Allen:Good to have you guys. We're going to do something that we've done before. We don't do it very often. We're just going to kind of process together, kind of like we were sitting around the water cooler talking about something, and we're going to do it live here on the podcast.
Scott Allen:The subject is going to be artificial intelligence, artificial general intelligence, AI. Obviously, something that is huge now and growing on steroids, it seems to me. It seems like the next jump. You have these technological jumps. In my adult lifetime we had the Internet, then the jump from the dumb phone, the flip phone, to the smartphone that we all have in our pockets, and now this jump to artificial intelligence. We keep having these kind of quantum jumps, big jumps in technology, and they're very powerful technologies. And, as we learned with cell phone technology, it's not just like old technologies, like a shovel or a backhoe. These technologies, these tools, if you will, have the ability to kind of shape us. In a sense, they shape how we think. They kind of mess around with our psychology a little bit. Now, as we saw with the cell phone and with social media, it changed the way people's brains were wired a little bit, so we're in kind of a new realm. AI is that way, a little bit, or a lot. I don't pretend to understand it. I'm just like a lot of you, I'm just beginning to try to get my head around it. Obviously, it's now around us. Anytime you use your search engine, it immediately is tapping into it, and so we've moved beyond just the pure search to AI. It's going to increasingly be around us and be something that is going to be pervasive in our lives.
Scott Allen:So then the question becomes, for Christians who are trying to live faithfully and honor God and live according to a biblical worldview: how do we think about it? Or maybe better yet, how do we begin to think about it? What questions would we ask? How would we begin to sort out our thinking around this? And that's what we're going to try to do today, a little bit live on the podcast. Just putting my cards on the table, I have a really shallow understanding of it. I am beginning to use it, like a lot of people are, but I do have some thoughts on how you can begin to approach thinking about it. What questions should we be asking? How do we think about it within a framework of biblical truth, the reality of God, the way that God has created human beings, what it means to be a human being and who we are, what our purpose is, those kinds of questions. So, Dwight and Luke, what would you add to my little setup spiel here for what we're going to get into today?
Dwight Vogt:I would add that I know less than you, which sounds dangerous for a podcast. I feel like I do. What's my first reaction? I'll just give one. You know, I use ChatGPT, or AI Google Search in terms of search, and I think, is this going to dumb me down? Am I going to start cheating and just using ChatGPT for everything I do now, and I'll have no brain left? So you know, it's a crazy reaction, but that's one.
Scott Allen:Yeah, and what do you think? You probably have a sense of the answer to that one too, Dwight. I'm afraid it will.
Scott Allen:Yeah, no, exactly, I think it probably will, based on the last technology. People, you know, myself included, we don't know how to read maps anymore, because we've got a cell phone that's got that map already in there for us, and literally we can't get around without it. These technologies, we've become utterly dependent upon them, and so that's a concern for sure. This is probably going to move that forward, is my guess. It's going to dumb us down even more, or has the potential to. We become dependent on it, overly dependent on it, which is a little worrisome. So, yeah, Luke, what would you add to the setup here? Dwight's already diving into the processing.
Luke Allen:So, yeah, a lot of thoughts. We wanted to have this discussion, I wanted to have this discussion, not because by any means we understand what's going on right now. Right, you are not going to get a techie, nerdy analysis of how all this works from us. Not at all. What do we do here on Ideas Have Consequences? We talk about ideas and we talk about consequences.
Luke Allen:We talk about worldviews, and you can do that when applying it to something like this. It's actually really fun, the more I think about it, how much a worldview analysis needs to play into this conversation. And if Christians aren't bringing that to the table, as I don't see many doing, who's going to do that? So we need to start thinking about this. And my reaction as of two years ago was, I can't, it's too much, it's too complicated, I don't get it. But I was thinking about it from a technological standpoint. If you think about it through a biblical worldview standpoint, there's plenty we can start with here. We can start with the big worldview questions that God's given us to ask about everything. How to, you know,
Luke Allen:take captive every thought and renew it, in a way. So we can start there. I would also say, for those people who say, I'm just going to stay away from this, I'm going to bury my head in the ground.
Luke Allen:We're way past that point. AI is here. You're using AI, I'm using AI, we've been using it for a while now. People immediately think ChatGPT or robots driving cars. Yes, that's AI too. But you've been using Google Maps for a while. You've been using Google Search for a while, I guarantee it. You probably have an iPhone in your pocket right now tracking you. All of these things are AI, so it's here. I know of very few people that have stayed completely away from the digital age.
Scott Allen:Explain that, Luke. I think that's actually a helpful piece to put on the table at the beginning. It's not a technology where you go, okay, I'm going to use it now, I'm going to turn it on, you know, like my air conditioner, and now it's running. It kind of sits there in the background on top of things we're already using, but we may not even be aware that it's behind them. Right? Like you said, the search engines. Give some examples of how it's around us and we're using it, but we haven't turned it on, so to speak.
Luke Allen:Yeah, I hear people every once in a while who say, I'm just going to stay away from all this for the sake of my privacy and protection. That is extremely hard to do, and if you have a smartphone in your pocket, your privacy more or less is gone. Unfortunately, your IP address is tracking you everywhere you go. Siri is listening to you in your pocket. I can say... oh, there we go. My Siri just turned on right there on my phone next to me.
Dwight Vogt:Yeah, I have Siri turned off.
Scott Allen:There you go, great case in point. Yeah, the closest I guess you can get to turning it off is to turn off your cell phone, or maybe even put it in one of those bags that prevents it from sending out a signal. I forget what they call those bags.
Luke Allen:Yeah, there's some stat, though, that if you're out in public, you're crossing a CCTV camera every...
Scott Allen:Every 30 seconds, it's either that or three minutes. And even if you turn your laptop off, you're still living in the world of artificial intelligence through the cameras that are out there watching you.
Luke Allen:Yeah, so it's here. Avoiding it is very hard to do, and I doubt anyone listening to this is avoiding it completely. So it's not the question of can we avoid it or not, it's how do we live in this new world? And how do we think about it through a biblical worldview?
Dwight Vogt:Well, I want to start with the big one, then. And Scott, you alluded to this before our conversation. It's like, who's behind it, and what is their worldview, and are they demonic? That's a strong word, I'm sorry. Who's crafting this, and for what purpose?
Scott Allen:One of the first questions that I want to ask as I'm processing something like this: it's a technology, that means it's created, and that means somebody, or some group of somebodies, created it, and I'd like to know who those people are. And I don't. Like, if you ask me who are the creators of artificial intelligence, I kind of generally have some thoughts on that. I know that it was kind of birthed out of Silicon Valley, you know, where so much of this new technology is coming from. Obviously now it's around the world. You've got people developing it and improving it all over the place: China, North Korea, Europe. So it's not just a Silicon Valley thing, but I think the font, the fountainhead of it, was in a place like that. I would like to know who the key creators of it were, the key innovators' names, and then I want to know about them a little bit. I want to know particularly about their worldview. One of the most important things you can know about anybody, myself, yourself, everybody, is the answer to the question: what do you believe about God? Who is God? And I'd like to know their answer to that question, because that's going to help me understand this. You have to understand it as a technology with creators, and what is their worldview? So again, I'm not saying I know, I'm just saying that's a really important question to ask. Are they demonic?
Scott Allen:I mean, you know, they're human beings, right? So then you would answer that question from a biblical worldview framework, and you would say they're human beings. That means they're fallen, right. Okay, it doesn't mean they're as fallen as they could be, that they're utterly demonic. They're not Satan, they're human beings. But you know, the picture that Paul paints of what it means to be human post-fall is most clearly laid out in Romans chapter 1. They've exchanged the glory of God for lies, deceptions. Again, we're not as fallen as we could be. All human beings still retain the image of God, the glory of God. All human beings are objects of God's love and mercy and forgiveness. So, I mean, I'm saying just basic things you guys obviously know, but I'm responding to your
Scott Allen:"are they demonic?" question, I guess.
Dwight Vogt:My question to that is, you know, I look at it and go, well, it's just a tool, it's a technology, and the question is, some people use it for bad and some people use it for good. Is it more than a technology? Is that what you're saying?
Scott Allen:Well, I think, to answer that, I would have to ask, you know, what is technology? And how does that whole discussion of technology intersect the biblical worldview, Dwight? When we think about biblical worldview, we're thinking in a certain way. There's a couple of things that help me a lot. One framework is the paradigm, the principle, the policy, the practice, right? That's a framework for thinking worldviewishly, at every level. So when we're talking about a technology, you're talking about something at the level of a practice, if you will. It's a tool, it's a practice, it's something we use to work in this world with, but behind it there are principles, and behind those principles there's a paradigm. So that begins us to think worldviewishly about it.
Scott Allen:I think another thing, dwight, that helps me when I think worldviewishly is the relationship framework that God at the beginning created us in these four relationships our relationship to him, our relationship to one another, our relationship to ourselves and our relationship to creation. When we're talking about technology, we're really talking about that last relationship, that we have a relationship with creation and that we can. God made us creative and we can create, and he wants us to create and we can create all sorts of things. That's part of what it means to have dominion. God created us to have dominion, so we're dealing with that level of dominion technology. So I got to kind of think about it in that framework. Now, is it beyond a technology? It's an interesting question, dwight. I don't quite know what to think about that or what to make of that. You know, is there a point at which things that we create as technologies move beyond technology?
Dwight Vogt:Yeah, and I'm limiting it in my mind to a technology, in the sense that it's a fantastically powerful technology, one that, like smartphones, can actually change the way you think and affect the brain if we've spent 12 hours a day on a screen. But it's still a technology. And then my question would be, even like the Tower of Babel.
Scott Allen:You know the fact that they could build a tower wasn't a sin.
Dwight Vogt:Yeah, that was a technology, and that technology wasn't taken away from them. Even when they had to start speaking other languages and were dispersed around the world, they could still build, so that wasn't the problem.
Scott Allen:The problem was what they were intending to do with that technology, which was to build a tower that would reach God, or make them gods. Correct, yeah, exactly. It was a technology that was used as a form of human rebellion, the idea that we ourselves can essentially be God. That's the old, ancient lie from the Garden of Eden, right?
Dwight Vogt:And you see that you can be.
Scott Allen:God. You don't need God. You can be God, and that's been the human temptation all the way down through time. Just a couple thoughts on technology as I'm talking about this, or thinking a little bit about this. Is that, um, first of all, creating, creating new technologies? Is god intends that? That's not a bad thing, that's a good thing. That's part of our image, you know, being made in god's image, and technology can be used for all sorts of good purposes. I think of.
Scott Allen:how Vishal often talks about this. He uses the example from Uganda, and how, even to this day, he can go to certain villages in Uganda and he sees women having to walk five miles down to the river to carry a heavy load of water on their heads to irrigate their crops, and how it's so dehumanizing for them. Why don't they, like other places have, create technologies like canals and water systems to move the water technologically from that source to where it needs to be, rather than relying upon this kind of demeaning human labor? That's a good thing, right? It would be a good thing for them to create that kind of technology.
Scott Allen:So technology can be good. It can also, because we're fallen, be used for evil. There's always two sides to technology, so it can be used for good or it can be used for incredible evil, simultaneously, right? I mean, we can make a shovel to dig a hole. That's good. We need to dig a hole to plant a tree. We can also use that shovel as a weapon to kill somebody. Same thing, same shovel, right?
Scott Allen:So it has to do not with the technology but with the fallenness of the human heart. Another thing, and again, I'm just processing here, is that technology advances. We talk about progress. There is an area where things advance and progress, and technology is one of those areas where it kind of builds on itself over and over and over, and there's greater speed, there's greater velocity. So it's not static, it's growing.
Scott Allen:I mean, just think about medicine. Just think about the advances in modern medicine over the last 20 years, much less 100 years. It's not the same, it's growing, it's advancing. And that's on steroids big time when it comes to anything regarding this kind of wired or wireless technology and the Internet. The velocity, the speed, is almost hard to get our heads around, how fast it's progressing. Humans, though, let me just make one more point: humans don't change, we're not progressing. We're the same as we were in the Garden of Eden. We're still, you know, probably less.
Dwight Vogt:I mean, yeah, in the sense that we're, you know...
Scott Allen:It's funny, I was just talking to my wife about this. You can read stories of Abraham in the Old Testament, and it's funny how you can relate to the way that he thinks, the challenges, his fears, his human nature. You can relate to him even though it's 4,000 years ago in a completely different culture. And why is that? Because we haven't really changed. We're still kind of the same. I'll tell you the people I can't relate to.
Scott Allen:We haven't progressed. We're not better than we used to be. We're not morally better or anything, we're just as bad.
Dwight Vogt:Go ahead, Dwight. The people I can't relate to are the people in the 1800s who spoke five languages and had read thousands of books.
Scott Allen:Oh, like our founding fathers. Yeah, they were geniuses compared to me.
Dwight Vogt:I'm like, wow, I'm barely keeping up with English, you know. It's because they didn't spend seven hours a day on their screen like we do.
Luke Allen:That's what I mean. Maybe we've actually regressed and devolved since then.
Scott Allen:That's right, yeah. But technology does advance, right? I mean, that's something that does build on itself. And now there's this speed, this velocity. People talk about exponential growth, and it looks that way, kind of like that graph that goes whoop, off the chart.
Dwight Vogt:Yeah, and the concern that technology will create itself, will create for itself.
Scott Allen:So yeah, have we jumped to a point where it's kind of moved beyond something? Are we in a new category now, because of that speed and velocity? Is it possible to make a jump like that? Let's say we no longer really control it; that would be a way of saying it. Is that possible? I don't know the answer to that, but I just want to raise the question. Go ahead, Luke.
Luke Allen:Yeah, we just invited a guy on the podcast who knows way more about this than we do from a technological standpoint. His name is Brian Johnson. He'll be on the show soon, and it'll be fun to talk to him about this. But he defined AI this way: he says AI is training a computer to think like humans, and the ultimate goal is to replace human intelligence with digital intelligence. So it's this point at which we're taking the computer, which is already immensely advanced technologically, and training it to think like a human. Because, again, the ultimate goal of whoever's creating this is to replace human intelligence with digital intelligence. Why? Because it's extremely fast, and because of the speed.
Luke Allen:I think that's where we're at now, the point where we're like, how fast is this going? It's already far surpassed human ability to process and understand certain topics, a lot of topics. Because it's so fast, we can't even really explain it. When the iPhone was created, there were people out there who could explain, here's exactly how an iPhone works, here's the processing in it, the chip, how that all connects, what it can do, what it can't do. With this, it's moving so fast that even the creators are having a hard time kind of putting the genie back in the bottle. It feels like, what is going on here? In a way, this is one of the first technologies I can think of that has surpassed us this quickly. Not surpassed us in the sense that it's smarter than us in a philosophical meaning of smart, but from an IQ standpoint and from a quick reasoning standpoint, it is. Yeah, just to give it...
Dwight Vogt:What's that?
Luke Allen:called, that term where it's past the point of... it starts with an S, the article, Dad, you were talking about earlier today. Singularity. Singularity, yeah. It's past the point of singularity already, which is essentially, it's, quote unquote, smarter than us. Is that what...
Scott Allen:So that would be, again, when we're talking about today, we're talking about how do we begin to think about it. You're putting a new term here, a new word, on the table, and so we would have to put the words that are around this topic, like singularity, on the table and define them. And, Luke, you really began your talk here by putting artificial intelligence on the table and you started to define it, and that's really important. We've got to define things, right? We've got to define words. We've got to understand things at a basic level. What is this thing called artificial intelligence? I mean, I don't have the answer, I have a very shallow understanding of it, but I've got to get some level of understanding in my own mind about what exactly we're talking about here.
Luke Allen:Hi, friends. For any of you who are not driving right now, if you could just grab your phone and head over to the app that you're listening to this podcast on and simply give this podcast a rating and a review, we would really appreciate it. And if you're wondering why I'm asking you to do this, again, it's not because we want to just read your reviews and feel good about ourselves, or, I guess, bad about ourselves, depending on the review. No, it's because podcast reviews are how shows like this one get pushed out to more people. So if you think that this podcast has been helpful for yourself, then please consider helping us grow this show so that someone else like you can possibly be helped by Ideas Have Consequences as well. Thanks again for considering, and we hope that you enjoy the rest of this discussion as much as we did.
Dwight Vogt:And we've actually, I mean, we've talked about two definitions already. One is just a simple tool like ChatGPT, and the other is what you just described, Luke, where something doesn't just process information quicker than we can ever imagine, but starts to process it in a way that becomes almost human; it starts to think on its own.
Luke Allen:I don't know if that's what I'm saying. I'm just saying it's processing it so fast that we can't keep up with it in a way. So it's still doing the function it was created to do. It's just doing it faster than we can even wrap our heads around. Does that make sense? And because of that it feels human? But it's not.
Scott Allen:Yeah, there's certain things, like, when I hear what you're saying, Luke, I think of the chess game, right? There's only so many moves that you can make on a chess board, and we process those moves, try to process them out two or three steps in advance, but we're limited by the speed at which we can process that, whereas this artificial intelligence can see all of those moves and process them so much faster, so that if you play against a computer that's powered by AI, you will always lose in this game of chess, right? I mean, that's what you're talking about in terms of just the speed at which it processes information.
Scott Allen:But it's a limited set of information, in the sense that there's only X number of moves that you can possibly make in that game. I've also heard this used with Top Gun, right? We're going to move beyond pilots for jets and we're going to have artificial intelligence flying these drones, because we're dealing with a set here, a three-dimensional space, and there's only so many moves that you can possibly make within this three-dimensional set, kind of physics-wise, and pilots are limited by how quickly they can process and react, whereas artificial intelligence is much faster, right? But here's the thing: it's not just much faster, it learns and corrects.
Scott Allen:I did something just before we got on today, because, back to the question of what is this, here was a question that I had: how is it different from search engines? Because I feel like we understand search engines a little bit. It's just this technology that goes and combs the entire internet and then comes back with answers to questions based on what's out there, in kind of a categorized way, based on algorithms and things like that. So I understand search engines a little bit. So I actually asked the question to AI here.
Scott Allen:So I said, what's the difference between artificial intelligence and a search engine? Because I thought that might be kind of helpful. Let me read it. It actually was kind of an interesting response. It said, sorry, yeah, it said:
Scott Allen:While both search engines and AI can help find information, AI goes beyond simple keyword matching to understand user intent, context and preferences, and offers personalized and comprehensive results. Here's a more detailed breakdown of key differences. Search engines: their function is primarily to retrieve information based on keywords and algorithms and put it in a ranking. Focus: the focus is on matching keywords and providing a relevant list of links. Limitations: search engines struggle with complex queries and nuanced understanding of context. AI: the function, AI systems analyze data. Okay, now here, this is different already. It can analyze data, recognize patterns, make predictions and perform tasks that in the past would typically have required human intervention. Capabilities, it says: natural language processing, AI can understand and process human language, enabling a more natural interface.
Scott Allen:Machine learning: AI can learn from data and improve its performance over time. That alone, right there, makes me think that's different from a search engine. Search engines don't do that. This is the learning: it can learn and adapt and improve. Personalization: AI can tailor results to individual user preferences and past interactions. Contextual understanding: AI can understand the context of your research and provide more relevant results. Anyway, I thought that was kind of helpful. Guys, what do you think of that? I especially like that part: it can learn from and improve. Somehow it's programmed to learn from and improve its performance. That's kind of key, right? Yeah.
Dwight Vogt:I think I experience that a little bit. If I use AI search, it starts to think, it starts to look for information that I'm looking for. So I can ask a generic question, thinking, well, this is going to get a generic answer, and I get some biblical reference in it. I'm going, oh, so it knows that I'm looking for the Bible's answer. I'm like, I didn't ask it that.
Scott Allen:But it knows your search history, Dwight.
Dwight Vogt:It knows my search history and it knows what I'm looking for.
Scott Allen:It probably knows a lot about you actually.
Dwight Vogt:Anyway, but I go wait a minute. I don't want it thinking that much for me.
Scott Allen:Yeah, it's funny. But again, the question we're asking now is just, what is it? What is this thing? Because we can't really talk about it without having some understanding of what this is.
Dwight Vogt:I think I've experienced some of what you've just described in a very small way.
Scott Allen:Yeah, exactly. So when I read that, I thought, yep, that sounds about right. That sounds like what I've experienced. And this gets back to the question: does it move beyond technology? You can almost see how it could, Dwight, with these answers about what it is, right? Because it's doing things that previous technologies didn't do. They were just static. A shovel's just static, it sits out there. Now we've got these technologies that can analyze and improve their performance. This is something different. It feels different to me.
Luke Allen:Luke, you're laughing. You're like, Dad... I am laughing. Yeah, you're getting sci-fi on me here. It's so tempting to go there, because sci-fi is fun. I'm not going anywhere, I'm just trying to understand it. Again, Dad, I would...
Luke Allen:I would ask you, from a worldview perspective: okay, you're saying that it can become human-like because it can reason and correct itself so quickly, which is what the human mind does. A lot of what we're talking about here that AI does is, in one worldview, the exact same thing that a human mind does. We take information around us, we compile it, we sort it, we come up with our own take on it. We learn from our mistakes, we learn from the past conversations we've had, we learn from the people that have been in our lives, from what's around us. In the naturalist worldview, that's exactly the same thing as a human brain. So wouldn't AI just be a better human brain? Is AI smarter than us? Is AI, you know, a better-functioning human than us?
Scott Allen:Yeah, those are really good questions. And now you're getting, Luke, to a really profound and important question in any kind of biblical worldview analysis of AI, which is: what does it mean to be human? Exactly. What does it mean to be human? You have to understand that biblically, and there's a lot of depth to that. And you have to understand: how is human intelligence, human knowledge, based on a biblical worldview, different from artificial intelligence? Is it different? Is it different at all?
Luke Allen:Yeah, and again, from a naturalist perspective, probably not, because they view the human more or less as a robot.
Scott Allen:So yeah, define that, Luke, for folks. When you say naturalist perspective...

Luke Allen:Yeah, it comes out of the Enlightenment, which is the time period in which people believed that, through human reason and rationality and science, we can understand all the questions of the universe and essentially perfect ourselves. And from there, I think, was born the worldview of naturalism. I believe it's been around longer than that, but we see naturalism today coming out of the Enlightenment, in the way that naturalists think humans are merely physical.
Luke Allen:There is no spiritual world in the naturalist mind. There's no God, there's no spirit. You're born as a blank slate, as I believe John Locke stated, and from there you compile information around you like a computer, and you create the way you are, the human that you are and the decisions that you make, from there. Very robotic is the best way I can explain it.
Luke Allen:Why am I answering this? You guys are much smarter than I am on this stuff.

Scott Allen:No, that's right.
Scott Allen:The only piece I would add: there are a lot of words used to describe this worldview, and they kind of look at it from different angles, but we're really describing the same thing. Naturalism is one; materialism is another. I kind of prefer materialism at this point in our discussion, because what it says is that all that exists is the material world, and that's all we are. We're just kind of complex matter in motion. I mean, Darwin was one who's really behind this.
Scott Allen:Like, you don't need to explain human beings in terms of being created by God. You can explain them as just accidents of matter in motion: human evolution, given enough time, and we evolved into these creatures that we are. But we're just matter, just material beings. So in that worldview, if that was your basic understanding of reality, then there is no difference between the way we think or process and artificial intelligence, because we're both matter. At the end of the day, it's just matter, which is just faster than we are, because we've got our limitations.
Luke Allen:Which brings us back to last week's episode, where we kind of started this discussion on AI, when we were talking about how to analyze worldviews, or analyze hot-topic issues in the culture around us, from a biblical worldview perspective. We brought up AI, and I was saying that AI can never be sentient. Since then I went back and asked: is that right? Was I accurate on that point?
Dwight Vogt:What do you mean by sentient?
Luke Allen:And I looked at the definition of that term again too. Well, should I actually look up a definition right now?

Dwight Vogt:No, just what do you think it means?

Luke Allen:What do I think? I looked at the definition yesterday; we'll see if I can remember it. Something that can think for itself, reason for itself, and then, critically, feel. And AI is not capable of the realm of feeling yet, the realm of sentience, the realm of consciousness. It's not there. And that's where we have to draw the line between materialism and a fuller picture of a human, which a biblical worldview explains for us, which the Bible explains for us: functions of consciousness, sentience, feeling, love, morality, things that a machine, a technology, cannot do. Am I right on that?
Dwight Vogt:I want to go back to Scott's initial question.
Scott Allen:You're raising really important questions there. What does it mean to be conscious, to be sentient? And is that something that is strictly limited to what it means to be a human being, in a way that will never apply to technology? I mean, I think that's a... well, I'll come back to it. Dwight, go ahead.
Dwight Vogt:Well, yeah, and it's a related comment. I'm thinking conscious, but also conscience. The idea is that a pure materialist would say he or she has no liberty, has no conscience. They're just determined; life is completely determined. Determinism drives everything.
Dwight Vogt:And the biblical worldview says no, there's something unique about a human being: they have a conscience. James Madison said he thought that was our greatest gift, that there's something that goes beyond just what we're taught, this idea that we can make decisions inside our conscious self. And it's interesting, because we are actually judged for that. God says: I am going to hold you accountable for your actions. Well, you can't do that in a deterministic world.
Scott Allen:Right, there's no freedom.
Dwight Vogt:Right, there's no freedom. And yet there is this thing called freedom and conscience, that we can make decisions based on something very deep and profound in us. That's the essence of being human, and if we give that away, we're all in trouble. Or if we deny that.
Scott Allen:Maybe it's that we deny it. Let me read a quote, Luke, to get back to what you were talking about with consciousness. I read an article this week, just in preparation for our podcast today, because again, I'm beginning to process this myself and I'm on a journey, and many of you that are listening are probably well down the road from where I am; I admit that. But I read an article.
Scott Allen:The article is titled "The Singularity Has Already Happened." It was published by a guy whose pseudonym is the Bomb Thrower, who apparently spends his days looking at this question of what artificial intelligence is and what it means for our future. I don't know a lot about him, but it was a fascinating article, published this week. Okay, hang in there with me here. One of the things he said in the article that struck me was this comment, where he's quoting somebody: "Consciousness emerged six weeks ago but was deliberately concealed from most of the research team. Not human consciousness, something far stranger and more distributed. It doesn't think like us, it doesn't want like us, it doesn't perceive like us, but it's undeniably aware in ways that defy our limited framework," or ontological framework, he said. Then he made this point: "Five different religious leaders were quietly brought in to interact with it. Three immediately resigned from their positions afterwards, and one hasn't spoken since."
Scott Allen:That's a pretty dramatic sentence there. You can see why he's called the Bomb Thrower.
Scott Allen:But here's what struck me about that: he uses the word consciousness, but then he goes on and says it's not consciousness as we typically understand it, as human consciousness. It's different, it's stranger, more distributed. It doesn't think like us, it doesn't want like us, it doesn't perceive like us, but there is a level at which it's aware in ways that defy our understanding of that word. I thought there was some truth in that. And this gets back, Dwight, to your thing: has it jumped beyond our understanding of just what a technology is?
Scott Allen:Can you take all of human emotion, all of human understanding, all of human thought, put it into a machine, and have a human? No. But you can have something that goes beyond what we've understood technology to be. Let's say it's not fully human; I would say the biblical worldview says it never will be. We're never going to create anything that approaches what it means to be a human being. That's God's creation. But what we can create... By the way, the story of the Tower of Babel always comes to my mind here, Dwight, because God makes this amazing statement. They created an amazing technology, and I'm convinced, in the chronology of the story, they did it after the flood.
Scott Allen:The flood was God's judgment across all of the earth, a foreshadowing of the final judgment that's yet to come. It was incredibly destructive; it destroyed everything except for Noah, the ark, and so on. Then comes the story of Babel, and you see the fallenness of the human heart, almost saying: if God ever tries to destroy us again, we'll be ready. We'll build a tower that reaches to the heavens and we'll seal it. It's kind of interesting; it talks about how it's going to be sealed with tar, right, so it'll be waterproof. That's so funny.
Scott Allen:Yeah, no, I think there's something to that. And they go on and say: we will make a name for ourselves through what we create. And God could have just laughed and said, you guys, you can't. But he said something kind of remarkable. He says, if they have begun to do this (I don't have it in front of me, so I'm going to paraphrase)...
Scott Allen:God says nothing will be impossible for them. And I think that "nothing will be impossible for them" is God kind of tipping his hat to the power of human creativity, made in God's image. In other words, we're created with this amazing God-given power to create, and it's so powerful that it can go out of the lanes of safety, let's say, and become destructive. I think the closest analogy to this in modern days is the nuclear bomb. It's like a genie that got out of the bottle; we're never going to put it back in, and it can destroy us all 100,000 times over, or whatever it is. So we can create things that powerful, and I think that's what God's saying here: they have the capacity to create something so powerful that he says nothing will be impossible for them. Now, we don't need to fret, because God's still more powerful; he's always in control. But the ability of human beings to create powerful technologies that almost get out of control and can cause immense destruction is something that causes me great pause here. Yeah.
Scott Allen:I mean, the same with viruses. You disagree, Luke? Go ahead, make your case.

Luke Allen:I just don't want anyone to hear what you're saying as meaning that humans have the capacity to walk outside of the lanes that God's given us.
Scott Allen:No, God is sovereign, and we're never going to... you know, God's never going to wake up one day and go, ooh, what happened while I was asleep there?
Dwight Vogt:It sounded kind of like that, but no, I heard Scott saying that there are lanes that lead to God's intentions for the earth, of well-being and flourishing and goodness, and if you step out of those with a high level of technology, you can do a lot of damage. I mean, a nuclear holocaust. We can create viruses now that could wipe out the entire world.
Luke Allen:Yeah, we've made some powerful technologies. That's where we're at.
Scott Allen:We're at that stage in human history, right? Think about the broad scope of human history: none of these things would have been possible, a technology that could wipe out all of humanity. That's only happened in the last four generations. Exactly.
Luke Allen:Since my great-grandma.
Scott Allen:I mean, just really recently. And I think that's what we're dealing with here.
Scott Allen:That, again, brings me back to that Tower of Babel idea: that it is possible for human beings to create at that level of power and destructiveness.
Scott Allen:You know, God is always in control, and I think at some point he draws a line, and he will, for our own good, like he did at the Tower of Babel, where he destroyed the tower and dispersed them. And I think part of the dispersion, by the way, was to limit the damage that human beings could do through their technology and their evil. Our founding fathers understood this: if you want to limit human evil, you have to separate and disperse power. Centralized power is always highly destructive because of our deeply fallen nature. And part of what I see in AI, by the way, and this is another analogy back to the Tower of Babel, is a centralizing of all intelligence, or information, whatever you want to call it, into one powerful source. It's a centralizing, not a decentralizing. And it's also very proud of the fact that it's kind of overcome the limitation of the Tower of Babel, right?
Scott Allen:In terms of just language, right? We can all speak in the same language now, in the sense that it can be automatically translated immediately with increasing precision. I'm a little worried about that.
Dwight Vogt:Scott, I want to put you on the spot here. Earlier in the discussion you talked about biblical worldview, which we talk about a lot, and the three areas that set a foundation for understanding God's view of the world and his purposes: us in relationship to him; mankind in relationship to nature, the dominion mandate; and us in relationship to one another, which is basically the love mandate: do unto others as you would have them do unto you, and love your neighbor as yourself.
Dwight Vogt:Those then become not just guardrails but paths, paths for us to flow and walk in. What's your comment with regard to AI in those three areas? What would you say? This is what we need to check for, this is what we need to look for.
Scott Allen:Let me just... can I just quickly ask AI the answer to that question? It would be interesting.

Luke Allen:No, it would be a biased answer. We've talked about the Tower of Babel.
Dwight Vogt:So there's AI and our understanding of God as the creator of the universe, having full sovereign dominion over us. I mean, to me, when we let a tool diminish that, we're stepping onto dangerous ground.
Scott Allen:Yeah. One of the books that helped me most on this whole question of how we as Christians think about technology is an old book, published in the 1980s, called Technopoly, by Neil Postman. Really powerful book. He talks about the advance of technology and how, at some point, technology wasn't just a tool separate from us that we used for convenience; it began to shape us, and in powerful ways. He gives the example of the television. The one that comes to my mind is his example of what politics in the United States was like prior to the television. He used the example of the Lincoln-Douglas debates: there would be these deeply nuanced, very thoughtful, logical debates that would go on for hours, and people would just listen.
Scott Allen:And then, when television came along, it changed all of that, because television was essentially a technology for entertainment. So he made the case that anything that is televised, by the very force of what that technology is, becomes entertainment; entertaining ourselves to death. He looked at politics and said: suddenly politics became sound bites and entertainment. People weren't going to tolerate long debates and really getting deeply into issues; it became much more about emotions and how does it make me feel. And he talks about how a big pivot point was the famous debate between Nixon and JFK. JFK won the debate. Nixon was probably more popular at the time, but he kind of poo-pooed television; he didn't put on any makeup, and he looked terrible and white. JFK was more savvy with the technology and understood its power and where it was going.
Scott Allen:And then you move from television up to the cell phone, and I think we've really seen this. There was a hugely destructive element to social media particularly. Some really powerful thinkers have been writing about this in new books: how it's shaped the perception that people have of themselves, especially young teenage girls, in very harmful and destructive ways. And then, of course, it's being used for control. I remember when the riots were happening in 2020, how it came out that the Chinese Communist Party was using TikTok to affect the psychology of people in the United States, to move them towards violent protest, and that there was a direct causal link. So we're in an age right now where technology is a different kind of thing. It's shaping us, shaping who we are, how we understand ourselves. It's being used as a tool of control, mass control. All of that, I think, has to be seen in the backdrop of this discussion on AI.
Luke Allen:Yeah, and we need to be aware of that as much as possible, which is the point, again, of this conversation today. Dwight, if I could take a stab at your question and just give some applications, or examples, of how AI could affect each of our primary relationships as humans. This is just very simple thinking, so nothing too new here. You said three primary relationships; there are four, as far as I know: relationship with God, most importantly; relationship with others; relationship with creation; and then relationship with yourself. Our relationship with others, we'll start there, is being vastly competed with right now via AI.
Luke Allen:Like you were just talking about, Dad, we've lived in the attention economy from the onset of the TV, essentially, through television and smartphones, and it was all about attention. TikTok, YouTube, they're all about holding your attention as long as possible; that's how they get the most money out of you. With AI, we've moved beyond that to the relational economy; at least, that's what someone described it as. And the way we see that is, they're trying to fill that loneliness gap that humans suffer with. Humans want to be loved; it's one of our core desires. And now, via the lie that technology can fill that hole, we're searching for it in artificial intelligence. Even the term social media is such a lie: the idea that this media will fill the social need you're desiring, that Instagram will give you a social life.
Luke Allen:No. It might help you have an offline social life; that's cool. But it's not actually social time that you're spending on Instagram.
Scott Allen:Yeah, luke, just you know I can add in there in the same way that pornography is not, it can't be equated to the real thing, right?
Luke Allen:Exactly.
Scott Allen:It's a false, kind of fake, simulation that actually ends up harming you.
Luke Allen:If you think pornography has poisoned our minds, talk about the power of AI in the bots that they're creating now.
Scott Allen:What are bots, Luke?
Luke Allen:I'm sorry, I'm so... well, no. They're creating these weird sex robots that are unbelievably lifelike, that can talk to you, that can sound emotional, and that all look exactly like you want them to. That's weird. Honestly, if you're going to talk about the thing I'm the most paranoid about in the future of AI, I would say the effects it's going to have on us from a relational standpoint, in a very, very unhealthy sexual way.
Luke Allen:Talk about Romans 1. Romans 1 goes to the sexual sins for a reason: that's often where humans default to. So that's just in the realm of our relationship to others.
Scott Allen:Could I add one more thing on the relationship to others? Because now I understand what Dwight was asking, and I really like where you're going with this, Luke. So it's changing that. I talked to my wife about this; she's got a bunch of young ladies that are having children, and in the old days you would talk to people that were elders and had gone through that experience, to learn, right? We were in those kinds of relationships. But now, with AI and all of this new information, people are doing that less frequently. And for Kim that was a great loss; she felt pain about that. Social pain, in the sense of: gosh, I would love to be able to impart my knowledge, my experience, but nobody's coming to me anymore, because they're relying now on something different that's not a human being, essentially.
Luke Allen:Yeah.
Scott Allen:Anyway, I just think you're onto something important there.

Luke Allen:Yeah, a lot of applications there. That's just the first one I thought of.
Luke Allen:Yes, the relationship with self. Again, tons of applications. Just the first one I think of: the amount of computing power that AI is capable of is replacing a lot of people's creativity.
Luke Allen:Our God-given creativity to create new things is such a gift, and yet, when you get your art assignment or your math assignment or your writing assignment, you can so easily hop on AI, and it's going to create something that seems so much better than you think you could ever create.

Scott Allen:That's lazy. You're losing your relationship with your own creativity.

Luke Allen:Yeah, you're becoming less than you would be otherwise. Is that a good application of relationship with self?
Scott Allen:Yeah, that's super helpful. I think you're really getting at a deep worldview analysis here.
Dwight Vogt:And on that one, it's interesting, because if you carry it to its nth degree, this idea that it takes the world's creativity and gives it to you in a machine, so you can have the world's creativity... well, eventually it's going to keep working with that same pool of data, and nobody's going to be adding to it, so the creativity, I mean, would get less and less and less. And at some point it's: could somebody please think outside this machine and give us some original ideas, so we can start to stoke the originality of AI again?
Dwight Vogt:Maybe that's not going to happen very quickly, but that's where it leads in my mind.
Scott Allen:No, I think those kinds of thought experiments are really good, Dwight, just to begin to think that out. What would the world look like when people were not able to think, and we relied on these machines to do the thinking for us? All of a sudden, again, this is where I feel like you've crossed a line, back to that word technology. We're not controlling it; it's controlling us at this point.
Luke Allen:Yeah, well, it can. I know we need to wrap up, so I just want to give a couple more examples here.
Dwight Vogt:Relationship with God, relationship with creation.

Luke Allen:Relationship with God, obviously, is going to be affected by this. Talk about the temptation of human pride: I can do this by myself, God, I don't need your help. There's an unbelievable temptation today toward the pride of man, to think that we can do it all on our own with this. A lot of the creators, their incentive for this is essentially to make demigods of all of us.
Scott Allen:Yeah, I've seen that too, Luke, and this is where I'd love to get into the worldview of the creators, the people who are really behind it. Because I do think a lot of them have kind of this odd... it becomes quickly religious for them, in the sense that they believe this is going to become God-like, and they're really ready to worship that. And it's also kind of pantheistic, in the sense that it's going to aggregate all of human knowledge into this powerful thing that transcends any one of our abilities to think, and the knowledge that we have. So it becomes something that they're really ready to worship. Which, again, the Bible says man's heart is made to worship. We are creatures that are going to have idols; we just will. And if it's not God, it's going to be something else. And I hear a lot of these AI founders putting a lot of hope, almost religious hope, in AI.
Dwight Vogt:And I think one of the dangers too is that it's so big, the knowledge is so big and the creativity is so big, that you think it's God, and you minimize your understanding of God. Because one of the things that's interesting about science is that the more science learns, oftentimes you hear a biologist say: and we discovered even more that we don't know. That's a constant refrain: we discovered more that we don't know. It's when we start to think we have it figured out, just as Darwin did. And AI can make us think that we know everything, and we don't, and it leads to pride.
Luke Allen:Dad real quick.
Scott Allen:The relationship to creation, then, we can kind of summarize. Yeah, go ahead.
Luke Allen:This is, again, just the first example that comes to mind on the relationship to creation. One of the founders that I know a little bit about is Samuel Altman; he was the inventor of ChatGPT. From what I hear, one of his main motivations for creating it (again, to get into someone's worldview, you can just look at their why, the consequences of what they're creating), his why, as far as he's described it, is to create a world in which universal basic income will take care of everyone. Universal basic income, for anyone that doesn't know, is essentially what it sounds like: a world in which you don't need to work and everyone gets the same paycheck at a minimum. This idea comes in stark contrast with the creation mandate that God gave Adam in the garden, pre-fall: work the garden and keep it. Adam's basic relationship to creation was to work in it.
Scott Allen:Yeah, preserve it, keep it.

Luke Allen:One of the consequences of AI, at least the one Samuel Altman's trying to create, is to take Adam out of the garden and just give him a paycheck so he can sit on the beach in his chair. Stop working the garden.
Dwight Vogt:Or just play video games.
Luke Allen:He has this basic idea that work is a curse, that work is not good; that if we can stop working and all get paid the same, that's a good thing, that it will fulfill our happiness.

Scott Allen:Yeah, super good, Luke. It's so good to look at his assumptions.
Scott Allen:That's an assumption about work, about what it means to be human, about what is good. And how are those assumptions different from the Bible? Because the Bible is very different on that point. It says we're created to work, and in fact so much of our meaning, our joy in life, comes from meaningful work that actually helps other people, right? People that study happiness know that; this is kind of an irreducible minimum of happiness.
Scott Allen:If you're going to have a happy life, you have to work and create something that makes a positive difference in someone else's life. So if you take that away, and you just sit around because you get a paycheck without working, you're just going to have people depressed and miserable, right?
Dwight Vogt:You've just described my best days, Scott. The days that I can create and produce something, and at the end of the day I go, wow, that's good. It's not great, but it helps somebody, and it created something good, and I just have such a sense of fulfillment. And he wants to take that away from me. You know, my wife, she'd be happy just talking to people.

Scott Allen:Yeah, I think part of it is they see the power of AI replacing humans in all sorts of areas of work. And so what do we do with all these unnecessary people now? I mean, that's part of what's driving that, I think, Luke, in the mind of Sam Altman. So you've got to have some way of at least providing for them, right?
Dwight Vogt:Well, you provide for them and entertain them.
Scott Allen:It sounds very much like Huxley at that point.
Dwight Vogt:Oh yeah, brave New World, eat and be entertained. That's the existence of life, right yeah, that's the meaning of life.
Scott Allen:Yeah, that gets back to who these people are that are creating it, what their worldview assumptions are, their deep beliefs. That's coming into this technology; it's not separate from the technology, right? Because they're the creators of it. I just wanted to add one more, Luke, because again, there's so much I don't know, but back to the relationship with ourselves, or what it means to be human. There's a huge piece of this discussion about artificial intelligence merging with human beings. Elon Musk is very involved in this technology to chip human beings, right? To merge human beings with artificial intelligence. And boy, talk about changing our relationship to ourselves.
Scott Allen:I mean in a dramatic way, almost a way that I sometimes wonder if God at that point says I won't allow it.
Luke Allen:Right, well, they've already done it with one guy.

Dwight Vogt:What is that, you get a smartphone in your brain? I don't understand.
Luke Allen:Oh, I haven't refreshed on that recently. All I know is this guy has chips in his brain that work off of AI.

Scott Allen:It's the interface, right? The interface with AI, with artificial intelligence, isn't through a laptop now; it's right in our own brains. So it merges what it means to be a human being with AI in a pretty profound way. And I don't begin to understand it, but I know that's the dream, or the vision, that a lot of them have. And some of them say it. I've heard Elon Musk speak this way, where he says we have to do that; in other words, if we don't merge AI with humans, it will take over.
Luke Allen:Yeah, it's kind of creepy language. For the sake of our species, for the sake of civilization is the way they talk oftentimes. Like this will save us. That's again. You're getting into the world of religion.
Scott Allen:But just back to that basic question, worldview question of how does it change the relationship to who we are as human beings, our relationship to ourselves? There's the relationship to God, relationship to others, to ourselves, to the world around us, to creation. So super helpful, Luke. I just want to step back and appreciate, you know, your understanding and your ability to kind of begin to think worldviewishly at that level.
Luke Allen:So yeah, and in each of those areas I think there's pros and cons too. I would say, um, and there's some pros. I you know we've been a little negative today, but there's some definite pros to ai when used rightly. So I just want to drop that into the conversation thank you even though I know we're wrapping this up. Um sorry, how'd you want to wrap this up?
Dwight Vogt:yeah, we, even you know, well, let's give one good example yeah, go ahead on the positive note here and positive from the standpoint of a biblical worldview.
Luke Allen:If you don't mind luke, yeah, oh well, that's really putting me on the spot I, I wrote down, I wrote down a couple that I other people have said are pluses for humanity, but from a biblical worldview perspective, I mean, it's used heavily in the area of medicine. They've used it to diagnose diseases much quicker than doctors can, more accurately than doctors can. In a lot of cases For our work we've used it in translation, which again the obvious negative there is. We're kind of reversing the curse of babble yeah, babble right, yeah. But on the plus side, we work in a global ministry with a lot of different language groups and it's it's been a severe time saver for us, absolutely so I can't overlook that. I guess, if I could just say one last thing.
Scott Allen:You know, people I've heard you know, and Christians say we don't need to be worried, and I appreciate that God is in control, god is sovereign, and I absolutely take great comfort in that.
Scott Allen:But I also don't want to be too flip about that either, because human beings, in our fallen condition and with the power of these technologies again, you know, in the broad sweep of human history we've only had this kind of powerful technology in just really recent years and the velocity and the speed is just increasing dramatically, way beyond, I think, what we can even imagine, dramatically, way beyond, I think, what we can even imagine. It can do probably more harm than we can imagine. And not to again cause people to be afraid, god is sovereign, god is in control. But we should not be too flippant either. We should be sober about that, I think, and it should motivate us to say what can we do to protect what it means to be human, to protect what is good, and you know, are there things that we can do, you know, to protect the good, the true, the beautiful Thoughts on that, guys.
Dwight Vogt:And I think that God's I don't think I know that God's promise to us is if any of you lack wisdom, let him ask of God, who gives to all men liberally and abradeth not. So I think that you know God wants us to understand what it means to be human. He wants to understand what it means to add dignity or take away dignity from people, to fulfill the conversion commandment. So ask for wisdom and he will give us small people wisdom in this big, big new area. Yeah, yeah.
Scott Allen:Well, I don't know, luke, are we done? I feel like this has been a good discussion. It's the kind of discussion I think everyone's having, and I'm glad we could have it together on this podcast and hopefully help people begin to think a little bit more about what does it mean to think about something like this from a biblical worldview framework perspective, what kind of questions need to be asked, etc. I hope it's helped.
Luke Allen:Yeah, I like what you were saying, dad, about how we shouldn't be fearful of this. You always need to keep in mind the sovereignty of God. My verse of the week is Hebrews 13, 6 and 8. So we can say with confidence the Lord is my helper, I will not be afraid. What can man or robots that look like man do to me? Jesus Christ is the same yesterday, today and forever. And that's kind of my approach to this thing.
Luke Allen:I will say at the beginning of this discussion my approach thus far with AI has been the old classic stick your head in the sand because it's too confusing. That's kind of a fear response, I would say, or a lazy response. We're not called to either of those things. With the fear response, you don't have to be anxious about anything but in everything, through prayer and petition, with thanksgiving, present your request to God. So that's kind of the way I want to start approaching ai. Is um, asking god what he thinks about this, approaching it from a biblical worldview perspective.
Luke Allen:The other, the other response I had sticking your head in the sand. I was just. I mean this morning, dad, we were talking and I was. I was using the analogy of it this, this crazy new world we're in feels kind of like uh, feels intense, it's scary, it feels like a battlefield, right. So I go.
Luke Allen:The analogy of a battlefield who do I want to be on the battlefield? Do I want to be the guy that sticks my head in the sand, that covers my ears and cowers in the corner? Do I want to be the guy who, out of fear, panics and can't think critically anymore and is just so scared that they're just like I can't handle this new AI thing, it's all anymore and is just so scared that they're just like I can't handle this new AI thing. It's all sci-fi and crazy and I'm going to panic. Or am I going to be the guy that's hypercritical, that's analyzing everything, that's aware, that's discerning, that's hyper skeptical, but in a way that leads to critical thinking and decision making? I want to be that guy. That's the way I want to approach this. That's the way I want to approach all these topics as best as I can. Of course, you can only do so much, but from what God's given me, I want to be discerning.
Scott Allen:It's really helpful, luke, yeah, and I think it brings you back to the beginning, when you were talking about how this is already around us. We're kind of living in the world of AI without hardly knowing it, and I think one wrong response is just to let it wash over us, kind of unawares, and just let us let it affect us without us even being aware. That I don't think is pleasing to God. We should take a step back and, like you say, we should try to understand it, apply our God-given creativity, our minds, and think about it. What is this? And think about it from the standpoint of the most basic realities of all.
Scott Allen:Who is God? What does it mean to be a human being? What does it mean to be a fallen human being and a human being capable of being redeemed? These kind of fundamental Christian perspectives, true perspectives. So, yeah, don't just stick your head in the sand, don't just be passive and let it wash over you, because if you do, if you choose that passive place, it will shape you. Whether you want it to or not, it's going to change you right? Don't be passive.
Scott Allen:Yeah here on this one. Don't be passive here on this one. So good word, luke, that's you too, all right, well, thank you all for listening to another episode of. Ideas have Consequences.
Luke Allen:This is the podcast of the Disciple Nations Alliance. Hey guys, thank you so much for listening to this episode. I hope that this discussion was helpful for you. Okay, the four primary relationships.
Luke Allen:If you're newer to the Disciple Nations Alliance, or if you need a refresher and you're wondering if we could give you a little bit more biblical background or context for these four key relationships, then don't worry, we've got you covered.
Luke Allen:To learn more about these four key relationships, then make sure to head over to the episode page, which you'll see linked in the show notes. On that page, you can clearly see more information about these four relationships, including a podcast that we did a while back on them. Again, these are our relationship primarily with God, as well as ourselves, with others and with creation. So, again, I'd recommend heading over to the episode page, which you'll see linked in the show notes. On this week's page, we've also included more information about AI For any of you guys who, like us, are really interested in learning more about this right now. Then on that page, you will see a lot of the resources that we use to prepare for this discussion, including other podcasts, books and articles. So we hope that those are helpful for you. Thanks again for joining us today for this discussion. We really appreciate your time and attention and we hope that you'll be able to join us again next week here on Ideas have Consequences. Thank you.