Is there a line where you think science could go too far? To me, "playing God" just means bettering human life using science and technology.
I believe that as long as our advances help people and are ethical, it should be fine to better ourselves.
Yep, I agree with the above. There's no way to justify letting someone die or be hurt because you didn't want to "play God," just as there is no way to justify hurting someone for science. I'm still really undecided on most animal testing, though, because on the one hand I want to be a strict vegetarian, and on the other hand I want to do medical research and want people to live.
We are going too far when our science attempts to alter humanity or the world around us for the better without a comprehensive understanding of the consequences of those alterations. I think people are occasionally getting at something like this when they make the "playing God" criticism. But apart from this, I don't think there's really any other line we need to worry about crossing.
The playing-God behavior I despise is when some trauma renders a person comatose or vegetative, and family members (for whatever stupid or selfish reason) decide to leave said person hooked up indefinitely to some life support system. Let the resting rest in peace, for the love of Mike.
I disagree with the idea when it comes down to the betterment of human life, but I do agree that there are times when letting nature run its course is best, like in Malestrom's example.
Playing God isn't merely the (subjectively: excessive, objectively: extensive) use of technology; it's the enjoyment of the power you can hold over someone. For example, in Malestrom's case, it's when the family proceeds to feel morally righteous and professes the mercy of their actions. Playing God isn't being technologically advanced. Playing God is when progress becomes an abuse of power. Another example is drones. It's sadistic to send the message that government pilots are worth so much more than potentially hostile people that they're willing to massacre people without risking the precious lives of their own military.
I don't see why a person should not be kept alive IF they put that in their will or told their family beforehand. I agree about drones.
Two points:

1. I agree with this, BUT there is only so much we can comprehensively understand. There comes a time when we become so bogged down with the "what-ifs" that we never move forward. Inaction has an effect on our environment and our species just as action does. Both carry risks.

2. For the previous posters who reference "ethical" and "hurting someone": how do you define ethical? Broad ethics are pretty easy to agree upon, but when you get into the gray areas, everyone has their own line in the sand. Example: "hurting someone" - that was pretty easily agreed upon. Then stem cells came into the equation, and the dilemma became: A. Aborted tissue is from a human being, and therefore someone was hurt in order to obtain it; it cannot be used for research. B. Inaction because of that debate could hurt many fully developed, living, breathing humans who need the life-saving research in order to continue living themselves. How do you decide whose "line in the sand" is correct and ethical?

To the original question: I agree with the last line. Nature is an uncaring beast. Sure, it's beautiful and intricate, but it holds no value for your life or your loved ones. In theory, I would not try to elevate one form of life, or one individual within that form, above all others, but that's not reality. In reality, I do value my children more than other people. I do value humans more than other animals. I do support utilizing humanity's greatest asset for the benefit of my species, and therefore my family and myself. I have strong ethics, and my ethics lead me here. Of course there is a balance, but my default favors "playing God."
I'm not really sure what someone means by "playing God." We are "playing God" constantly, every day; it's just commonplace and accepted at this point. Every time you take an antibiotic to fight off an infection, you're tinkering with what nature intended - indeed, pretty much any use of medicine is going against the will of nature. Evolutionarily speaking, humans would probably be better off if those who were most prone to infections actually died of them, didn't breed, and thus didn't pass on such weaknesses to future generations. That way, humans hardier against such things would survive and reproduce in their stead. I don't want to live in a world where people with that mindset are dominant, so I'm perfectly happy with science "playing God."

Like all forms of knowledge and tools in this world, science is amoral. It can be used to help and to hurt. The goal of science should be the betterment of the human species as a whole. When it comes to things that should be prevented or discouraged, I find that I have rather loose limits - next to none, in fact. What I look for are victims or potential victims. The question I ask: who will be harmed by this, and why? At that point, my concern is more focused on minimizing harm and managing risk. Regulating something is quite different from banning it altogether.

I find that most people who are opposed to certain things fall into one of two camps: those who are opposed for reasons based on personal values (often rooted in religion), and those who are opposed due to a sense of squeamishness. As my values are usually in direct opposition to the first camp, we're never going to see eye to eye. The second group, however, is persuadable. So, for people who feel squeamish, I point out that it's normal to feel uncomfortable about the changes or implications that certain technology could create. It's normal to feel unsettled - even frightened - by the prospect.
There is an element of the unknown there. However, just because something seems scary doesn't make it bad or wrong. It's true that it may not be natural, and that may lead to a sense of unease. However, that perspective is odd when viewed in the proper context. Absolutely nothing about our lives is natural. Our species evolved to be nomadic hunter-gatherers. We abandoned what nature intended for us the moment we began agriculture and started building civilizations. All the benefits and luxuries that we enjoy today are a direct result of our species heading down that path, and I for one am grateful to our ancestors for taking those steps.

We don't often think of our modern lives as unnatural, because we're used to them. It's all we've ever known, and thus it's easy to take for granted. It feels natural to us, even though it isn't. Many of our problems today as a species are a direct result of the fact that we didn't evolve to live the lives we currently lead, and we have certain inherent limitations as a result. While we may live in a modern world, our brains are still largely those of our nomadic hunter-gatherer ancestors.

Just as the unnatural seems natural to us today, because it's all we've ever known, whatever scientific advances we make will come to seem natural and commonplace in the future. The sense of squeamishness fades away as the unfamiliar becomes familiar, and we as a species reap the benefits of the new knowledge and technology. When looking to the future, our goal should always be the betterment of the human species as a whole. We must strive not to create victims, and must therefore constantly evaluate who will be harmed by our actions. Then we must take steps to minimize that harm and any risk involved. When we do that, we can move forward with a clean conscience and with confidence that future generations will benefit from what we build and discover today.
I don't think it's inherently immoral to attempt to improve humans with things like genetic engineering and biotech. If you can improve lives, then you have a responsibility to do so, I suppose. However, I do recognise a significant problem, particularly with modification of the genetic code. We see it with artificial selection in animals; we make changes without fully considering the consequences, and end up with cows that are so large they can't hold their own weight, and dogs that can't breathe. In humans, we act to keep someone alive when they would have died without medicine (obviously a good thing), and this is already leading to an accumulation of harmful genes. In genetically modifying ourselves, we risk causing more problems than we solve.
*cough* E=mc^2. Einstein's intentions were good - sending spaceships into space is a nice thing - but once the-woman-who-split-the-atom discovered fission, the road to atomic bombs was short. There will always be people who seek and abuse power. Take quantum computers, which we're on the verge of building: it'll be really fun playing games in virtual reality and all, but some might use them for hacking, spreading viruses, and things like that. So if the question is between advancing technology or stopping it because it might bring harm, I say we just need to put effort into education so people won't want to be abusive in such a way.
There is no arbitrary line of understanding we should not cross; only immaturity of people who need to become wiser before crossing it.
If God is omnipotent, He'd put some sort of punishment upon those who "toiled in His domain," right? If criminals have free will, then so do scientists. All things can be abused. They can also be used to help others. There are a few lines I would never cross, though. > Making people completely immortal > Creating apocalypses or diseases > Messing around with things you don't know enough about > Making devices that don't really have a purpose (just a pet peeve)
Maybe you underestimate omnipotence. If He has every power possible, He would limit our abilities so that we could never infringe on His domain. But that's another matter entirely.