Audio Transcript

On Tuesday we talked about paying taxes to a government that funds abortions. Today the question is different, though somewhat related in its ethical entailments of proximity. The question comes to us from Adam in Sydney, Australia, who asks, “Pastor John, as a software developer, I worry my code could end up running a porn site or a war plane in the future. Do developers of new technologies have responsibilities over how an innovation may be reapplied for evil in the future? I think of examples like nuclear science and the atom bomb. More contemporary examples would be artificial intelligence and sexually submissive robotics.” What advice do you have for Adam?

Three factors come to mind for Adam to consider, which I hope will give some guidance.

1. Design for good — despite the potential for misuse.

The first one is that some things are so good and so right and so necessary that we should go ahead and do them or speak them or create them, even though we are relatively sure they are going to be misused.

And the example in the Bible that I think about is Paul’s doctrine of justification by faith. He says in Romans 3:28, “We hold that one is justified by faith apart from works of the law.” Now when he put that doctrine forward among people who are hostile, he knew what they would do with it. They would twist it and they would make it mean: “Oh, cool, I don’t have to do any works.”

Romans 3:8 says, “Why not do evil that good may come? — as some people slanderously charge us with saying? Their condemnation is just.” So Paul was being accused of saying, “Let’s do evil that good may come” because he taught that law and law-keeping are not the foundation of our justification before God. The same thing appears in Romans 5:20–6:2: “Where sin increased, grace abounded all the more. . . . What shall we say then? Are we to continue in sin that grace may abound? By no means! How can we who died to sin still live in it?” (Romans 6:1–2).

Paul knew (and so did Peter, who talks about grace being turned into licentiousness in 2 Peter) that when he preached the truth, the whole counsel of God, there were glorious aspects of it that would be twisted and misused to people’s very destruction, just like an atom bomb. In fact, my guess is that more people are in hell today because of the twisting of Paul’s teaching than because of the atom bomb. I think I could say with a pretty serious degree of confidence: the twisting of justification by faith into a license to sin has damned more souls than the atom bomb ever destroyed. And yet Paul went on teaching the truth.

So I think with regard to many inventions this is true — maybe all of them. Roads, wine, hunting guns, cars, smartphones, and any number of other things can all be used for good or turned for evil. And the potential for good, I would argue, is worth the effort. And those who design the good are not necessarily responsible for the evil. So that is my first thing for Adam to consider.

2. Weigh the outcomes for both good and evil.

A second thing to ponder is whether the code that he is working on — or whatever it is that we develop and bring about — is minimally useful for good and maximally useful for evil.

In other words, there is a certain proportionality that should figure in here. A tiny possibility of some good and a huge possibility of much evil should cause a Christian to put his creative efforts in another direction.

3. Consider ways to minimize misuse.

Finally, are there controls that can be attached to the invention that would limit its misuse, something like patent rights or the like? I don’t know the laws here or the possibilities, but the principle holds that if you bring something into being that could hurt others, it would be justifiable and loving to find ways of limiting the misuse of the invention.

Here is an analogy. Truett Cathy created the Chick-fil-A sandwich, and it’s a good sandwich, especially the spicy one. And he maintained such control over this sandwich that you can’t buy one on Sunday. Why did he do that? Well, he just said, “That’s a use I don’t want made of my sandwich. I don’t want my sandwich to be used that way, and I don’t want my stores to be used that way, and so I am going to control it.” That shows there are various ways to control how an invention is used, and I think Adam should seriously consider that.

So let me summarize my points:

  1. The potential for great good is worth the risk of misuse.
  2. If we determine there is small potential for good and great potential for evil, we should direct our energies in another way.
  3. We should investigate appropriate controls on what we create to channel it to the best use and minimize its misuse in the future.