But there were a few projects where the student had gotten access to very sophisticated lab equipment and was doing gene sequencing or something along those lines. I wondered: did the student have any grasp of what the likely or right answer should be? Did the student blindly accept the numbers the very expensive "black box" spit out, and reach conclusions based on them? Perhaps the instrument is way out of calibration, perhaps its algorithms are faulty, or perhaps it is maliciously programmed to generate random numbers around the nominal answers – who would know?
That's why it's always important to figure out a way to check sensor-based data and results as close to the source as possible, before the fancy analysis, the color charts, and the razzle-dazzle have been added. You could be heading in a very wrong direction and not even suspect it.
This is not a new problem, nor is it one that only engineers face. In Einstein's Ph.D. dissertation "A New Determination of Molecular Dimensions" (one of five brilliant papers he published in 1905, including the best-known one on special relativity), he does a complex analysis of the motion of particles in liquids, diffusion, the kinetic theory of liquids, and more. I'll admit I can't follow his analysis, but I do know how the story ends: after all his equations and conclusions, he takes well-established data from other researchers on the diffusion coefficients of various solutions and puts it into his equations. The result is a value for Avogadro's number – a parameter not immediately related to the paper's subject – which is very close to the value that had been independently determined by many other techniques over the years. In other words, he was able to verify his intense and unique insight using basic data and an accepted chemistry number.
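The spirit of that check can be sketched in a few lines. The Stokes-Einstein relation ties a solute's measured diffusion coefficient to Avogadro's number; solving it for N_A turns ordinary diffusion data into a sanity check, just as the dissertation does. The input values below are rough, illustrative room-temperature figures for a small molecule diffusing in water, not Einstein's actual dataset, and the assumed molecular radius in particular is only an order-of-magnitude guess.

```python
import math

def avogadro_from_diffusion(D, eta, a, T):
    """Estimate Avogadro's number from the Stokes-Einstein relation.

    The relation models the solute as a sphere of radius a moving
    through a fluid of viscosity eta:
        D = R*T / (6*pi*eta*a*N_A)
    Solving for N_A lets measured diffusion data cross-check the theory.
    """
    R = 8.314  # gas constant, J/(mol*K)
    return R * T / (6 * math.pi * eta * a * D)

# Illustrative inputs (roughly sucrose in water at room temperature):
D = 5.2e-10    # diffusion coefficient, m^2/s
eta = 1.0e-3   # viscosity of water, Pa*s
a = 0.44e-9    # assumed effective molecular radius, m (rough guess)
T = 298.0      # temperature, K

N_A = avogadro_from_diffusion(D, eta, a, T)
print(f"Estimated N_A = {N_A:.2e}")
```

With sane inputs the estimate lands within a factor of a few of the accepted 6.022e23 per mole; a result orders of magnitude off would flag bad data or a bad model, which is exactly the kind of source-level check the column argues for.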
Next: And Einstein agrees