When did doctoring start being about the money and not about the patient? When did it become the norm to have a visit with your doctor where he or she never actually touches any part of your body? I don’t know if it’s the part of the country that I’m in or if this is happening everywhere.
I was recently diagnosed with a huge mass in my uterus, which was discovered by my Korean masseuse (I dunno, is it politically incorrect to note that my masseuse was Korean? I'm less than a week post-op and I really ain't got time for that; I've got pain I'm trying to stay ahead of).
OK, maybe some doctors don't like to touch their patients for whatever reason, but how about the one who called in a prescription for something I didn't have? What if I was totally stupid and went ahead and used the prescription for the thing that I didn't have? And what about the doctor who would not call in my BP meds refill unless I made an appointment to see him? And he said this to me while I was standing… in front of him… in his office… on a Friday (care to guess when the next opening in his schedule was?).
Where have the good caring doctors gone?