It is, but I don't think you've interpreted the scene behind the curtain quite right.
It's not that they have no faith in truth.
Part of it is tech hubris: the belief that you don't need to do things the right way, because you can invent a solution to anything. There is a genuine problem here, in that the easily accessible data isn't remotely reflective of the statistics of reality. It's possible to get data that mostly depicts reality, but the frequencies are distorted, and the learning methods almost all rely on those frequencies to build credible output. Statistical inference is the engine of machine learning, and the statistics are wrong. So, with tech-bro hubris, what's the answer? When you notice a problem, train the system to give the "right" answer even though the training set is wrong.
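To make the frequency point concrete, here's a minimal sketch in Python; the numbers and the reweighting knob are entirely made up for illustration and have nothing to do with Google's actual systems. A categorical model fit by maximum likelihood can only echo the frequencies in its training set, and the post-hoc "fix" is a hand-tuned multiplier that changes the output without repairing the data:

    from collections import Counter

    # Suppose reality is 70/30 between two categories, but the easily
    # scraped training set over-represents "b" (illustrative numbers only).
    reality = {"a": 0.7, "b": 0.3}
    training_set = ["a"] * 50 + ["b"] * 50  # distorted: 50/50, not 70/30

    # Maximum-likelihood fit: the model's "beliefs" are just the observed
    # frequencies, so the distortion is baked in.
    counts = Counter(training_set)
    total = sum(counts.values())
    model = {k: v / total for k, v in counts.items()}
    print(reality)  # {'a': 0.7, 'b': 0.3}
    print(model)    # {'a': 0.5, 'b': 0.5}

    # The hubris "fix": instead of gathering representative data, multiply
    # in weights until the output looks "right" to whoever tuned them.
    weights = {"a": 2.0, "b": 1.0}  # hypothetical knob, chosen by the fixer
    adjusted = {k: model[k] * weights[k] for k in model}
    norm = sum(adjusted.values())
    adjusted = {k: v / norm for k, v in adjusted.items()}
    print(adjusted)  # roughly {'a': 0.67, 'b': 0.33}

The adjusted output looks plausible here only because the knob happened to be tuned well; nothing about the underlying estimate improved, and a badly tuned knob produces confidently wrong output.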
Part of it is ideological myopia. Which "wrong" answers are worth fixing, and which aren't worth thinking about, is heavily colored by pretty far-left ideas rather than by concern for accuracy. Given how predictable the backlash was, the egg on Google's face is that training for what they deemed socially responsible racial diversity somehow got far more mindshare than training for basic factual accuracy. Of course they didn't want anything like this to happen. There's no way they could have had a solid test of anything like this and still released the product. They simply didn't think of testing it. They're in the business of producing realistic output, but were so ideologically captured that they didn't think to check whether their output was still realistic after they applied their fixes. It's not that they want us to believe the false output, or even that they don't care at all about the truth; it's that they were too fixated on what they perceived as the core problem to ask, "hey, what could possibly go wrong?"
This is the perennial progressive conceit: we came up with this idea, so what could possibly go wrong? (It's a common human conceit; the right wing does it plenty too, but the progressive-and-farther-left raise it to an art form of denial. The corresponding conservative blind spot is failing to acknowledge, when things are objectively and comparatively terrible, that copying literally almost anyone else would produce a superior outcome; progressives have that conceit too, just less of it. Everyone has conceits. Aren't we lovely creatures?)
So I don't think this is really an attack on truth in the same way that postmodernism is, or that "the 2020 election was stolen" is, or many other things are. Rather, it's the typical tech hubris where the highest praise is to "disrupt" something, plus the typical ideological blinders that keep one from asking tough questions like "in trying to fix that thing close to our hearts, did we screw everything else up?"