I think they SHOULD quit. They're not needed anyway. Most people only THINK they need to see a doctor, when in reality they'd be better off without them. I can see going in to set a bone or something, but now I'm starting to think that even THAT can be done without them, and was in the past, so why can't we get back to that now? We should be healing ourselves. You're right, kotn.
If a doctor's treatment involves any type of non-natural solution, i.e. pharmaceuticals, then they are not in the business to heal anyone; they are in it for the money. And yes, truth, their education is massively corrupt, so sometimes they don't even know that they are harming people. Well, maybe not at first, but then the want for money comes into play and they just don't care anymore.
If you want to be a doctor, you should be in it to HELP people, not to make a business out of it by forcing them to come back!