kmich wrote:Alright, I will state outright that I have no intention of getting involved in the seemingly endless, pointless pissing contests between Col Sun and Marcus et al. So keep me out of it, please.
kmich wrote:In my world, medical practice is as much an art as it is a science. Any competent medical practitioner has to keep up on the peer-reviewed research to know what the hell he or she is doing and to stay up to date. A respect for science, the findings of solid research, and ongoing education is required for any practitioner. So I have reserved at least two, typically dull, days each month to review my journals to keep on top of stuff.
kmich wrote:But the fact that you know your science does not make you a good practitioner. I have known many an academic who knows the research and literature inside out but who pretty much sucks in their clinical and surgical practice. That is because every patient, every situation, every organ, blood vessel, artery, is different, and it typically takes many years of experience to know what to do in an instant of decision with those to save a life, a heart, a brain, or a limb.
Conversely, I have encountered many clinicians who have no clue how to perform research, mistaking clinical experience for research skill, but who charge ahead and do it anyway, wasting valuable research funds.
So when I read the latest medical research headline in the popular press, say that eating too many granola bars may result in one's left testicle bursting into flames and dropping off, I more or less dismiss it unless I read the original paper and see that the study was done right.
Not only do such clinicians not have any clue, they often don't have a clue that they don't have a clue.
So a statistician is not consulted in the design of the experiment [often a far more subtle problem than is appreciated, especially in the life sciences], the data are not collected in a consistent manner, various biases are introduced mid-study, the study is too small, etc.
Instead they think that if they input some numbers into a black box stats package and get an output with p < 0.05, then it's significant [the null hypothesis can be rejected with confidence].
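To make that last point concrete, here is a small simulation sketch (hypothetical code, not from any cited study) of two of the pitfalls above: an uncorrected multiple-comparisons problem on top of a study where the treatment truly does nothing. Each "study" compares two arms on one or several outcome measures with a crude two-sample z-test; checking several outcomes and declaring victory if any one crosses p < 0.05 inflates the false-positive rate far beyond the nominal 5%.

```python
import random

random.seed(0)

def looks_significant(xs, ys):
    """Crude two-sample z-test: True if |z| > 1.96, i.e. roughly p < 0.05."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    vy = sum((y - my) ** 2 for y in ys) / (n - 1)
    se = ((vx + vy) / n) ** 0.5
    return se > 0 and abs(mx - my) / se > 1.96

def null_study_hit_rate(n_studies, n_per_arm, n_outcomes):
    """Simulate studies where the 'treatment' truly does nothing, and
    count the fraction that still report significance on at least one
    of n_outcomes measures, with no correction for multiple comparisons."""
    hits = 0
    for _ in range(n_studies):
        for _ in range(n_outcomes):
            xs = [random.gauss(0, 1) for _ in range(n_per_arm)]
            ys = [random.gauss(0, 1) for _ in range(n_per_arm)]
            if looks_significant(xs, ys):
                hits += 1
                break
    return hits / n_studies

# One outcome: close to the nominal 5% false-positive rate.
print(null_study_hit_rate(2000, 30, 1))
# Ten uncorrected outcomes: the false-positive rate climbs toward
# 1 - 0.95**10, i.e. roughly 40% of null studies "find" an effect.
print(null_study_hit_rate(2000, 30, 10))
```

The black-box stats package dutifully reports p < 0.05 in all of these cases; only the design (how many outcomes were tested, how the analysis was chosen) tells you whether that number means anything.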
kmich wrote:Medical and surgical practice is an art based upon your experience and the quality of your clinical observations and reasoning. When I work with residents, I often don't give a sh-t about what research studies they cite. I want to know what they heard and understood from the examination of their patients and their review of their radiological and laboratory studies. I want to know what hypotheses and strategies they have devised from those experiences and reviews and why. Tell me what you think and why about this particular person and what needs to be done next, not what "blank blank et al" published in the latest Journal of Vascular Surgery, the NEJM, or whatever.
There's no substitute for practice and experience in mastering a particular skill, be it surgery or golf.
kmich wrote:After observing me in surgery, they sometimes ask why I did X or Y procedure with a patient, but, to me, it just seemed like the obvious thing to do, and I cannot cite an article to support what I did. How the patient does later proves me right or wrong, and fortunately for my patients, I am usually right. Some residents never seem to get that line of thinking. Oh well, I am sure they can find some academic post in a medical school somewhere where they can cite Duffus et al or whatever to their wowed medical students.
Well, research is necessary to advance clinical practice, which is, stating the obvious, why you spend time reviewing new papers. Conversely, feedback from clinical work is essential to determine which questions require further research.
Most medical students and residents are best suited for clinical work. A smaller number are better suited for research. A very small minority are good at both.