Q:
Is it contrary to the principles of faith to use doctors and medicine when you need healing?
A:
It Is God’s Will to Heal You
The first and most important truth you must know is this: God wants you well. Healing is part of His will for your life. If your faith is strong—firmly rooted in God’s Word and unshaken by symptoms or circumstances—you can receive healing by faith alone.
But that kind of unwavering faith doesn’t come from hearing just a few sermons. It comes through a personal, deep revelation of God’s healing power—something developed over time through prayer, study, and walking with God.
If you haven’t reached that level of faith yet, don’t be discouraged. In that case, a doctor can be your best ally. Medical help is not a sign of weak faith; it’s a tool God can use to aid your healing.
Still unsure whether to seek medical attention? Follow the wisdom of Colossians 3:15: “Let the peace of God rule in your hearts.” Let peace be your guide. If you feel fear or unrest at the thought of refusing medical help, then go ahead and see a doctor. But if you feel a strong, peaceful confidence that you can trust God completely for your healing, then stand in faith and receive it.
Either way, never let the enemy condemn you for your decision. It’s not his place. Healing is between you and God, and there’s no room for shame—only faith, peace, and God’s love guiding you forward.