They have a lot of experience with people who are living with the consequences of past decisions--that's who they are. There's practically no one else--assuming they've been in the clinic for a good while, not just school--who has a better perspective on how people come to regret their own follies and foibles once some time passes and they actually have to live with the consequences of their indiscretions.
(Maybe hairdressers come close.)
It's really not hard to resist a doctor's entreaties to stop eating cookies as a diabetic if you've actually thought it through carefully and can make your case. "I would literally die for cookies," you might say. "And yeah, I am literally dying for cookies. But something's gonna get me eventually, right, doc? I'd rather it be this way. Yeah, I know, my vision's starting to go, but I don't regret it for a minute. Just help me keep my insulin levels up for as long as you can, okay?"
Ironically, you managed to pick an example where the doctor does in fact have every opportunity to develop an unusual degree of expertise--maybe not quite as deep as their expertise in the health of the human body, but it's not hard for it to be a close #2. That doesn't make them an incontrovertible guru, and doctors do tend to develop a bit of a god complex. But they're exceptionally well situated to advise people on how to live their lives.
----------
The whole non-overlapping magisteria thing, popularized by Stephen Jay Gould, was rubbish from the start. As Sam Harris has argued--notwithstanding his extraordinarily unconvincing attempts at engaging with philosophical questions--almost everything we do is absolutely strewn with statements and (possibly incorrect) assumptions about facts, and about which consequences will follow from which actions. Furthermore, although modern civilization has some wonderful perks, it's very new; we're not all that well adapted to it, which means that we're primed to make some very, very dumb mistakes that we could catch if we would listen to evidence. Science has an immense amount to tell us about what is <Harris-Voice>clearly</Harris-Voice> better than the alternative, and consistent with our desires and revealed preferences. So the proper magisterium of science in principle covers so much that it's hardly worth worrying about the exceptions.
As long as it's actually science, and not BS or wishful thinking onto which we tried to stick a label that says "science".
However, we do have a problem, and it isn't (mostly) with science, but rather with philosophy. (Well, and how human nature interacts with technology and such--that's actually most of it.)
The reason is that although the potential magisterium of science is "basically everything that anyone ever cares about at all", in reality there are oodles of questions for which we don't have adequate evidence to say very much--or we do, but we don't have adequate analysis to understand with confidence what the evidence is telling us.
This is something that is emphasized as part of the scientific method ("hold hypotheses tentatively", "base your beliefs on evidence", etc. etc.) over and over again. When we ask science about our lives, most answers should come back, "I don't know at all", or "well, not sure, but we've looked at this and it sorta suggests that, and..." or "on average--but we know individual variation is more important than the average--the answer seems under these limited conditions to be...".
The problem is that we aren't encouraging scientists to just discover things. "This seems like it is probably important. Please find out how it works, scientists! I don't care what the answer is; I just want to know what it is!"
You start to do that, and then we find out that the Earth isn't 6000 years old, and that humans aren't blank slates ready to be fashioned into perfect communists. And that's no good at all.
As if that weren't enough, in the U.S. especially, the political left has observed the long-running tension between the dearly held but evidentially unsupported beliefs of the political right and has tried, largely successfully, to engage in tribal capture of "science" as their thing.
Of course, tribal thinking is the complete antithesis of scientific thinking--but scientists are people and the left sounds oh-so-much-more-friendly than the right.
And then the left does stuff that goes way outside what is indicated by the evidence and proudly proclaims that "science says that...", and it's really hard to tell your own tribe, "No, wait, that is NOT what we can safely conclude; we can only say the much more modest this-that-complex-thing-caveat-condition-error-bars."
But people notice the lack of reliability, and blame "science" or "scientists" instead of "political messaging masquerading as science, supported by scientists for social but not scientific reasons".
And if this weren't enough, the postmodern and critical-theory philosophers have been waging a two-front war on science for decades (because science is tricky for philosophers): the postmodernists diminish the concept of objective reality, and the critical theorists diminish the concept of impartial study. Both make conducting science harder, and both train people to embrace a visceral anti-evidence instinct--and to consider themselves virtuous for having done so.
And, sure, scientists tend to be really smart, and really smart people tend to have big egos, and people with big egos tend to think they're right about lots of stuff, and to overconfidently say wrong stuff way outside their actual realm of expertise. But that hasn't changed. There's no reason for people to change their opinions of science now just because some scientists are still full of themselves, as they've been for hundreds of years. ("I have taken all knowledge to be my province."--yeah, sure, Bacon; even in the 1500s this was ridiculously impossible.)
So, meh. I don't think I agree with your thesis at all.
But I would, if you could provide some really solid evidence.