Anyone who has ever had a wristwatch of similar tech should know how hard it is to get anything like precision out of those things. It's a millimeter sized button with a millimeter depth of press and could easily need half a second of jabbing at it to get it to trigger. It's for measuring your mile times in minutes, not fractions of a second fall times.
Naturally, our data was total, utter crap. Any sensible analysis, treating the problem linearly, would have produced error bars wide enough to include zero and negative values. I dutifully crunched the numbers, determined that the acceleration due to gravity was something like 6.8 m/s^2, and turned it in.
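A quick back-of-envelope sketch supports the claim about the error bars. The drop height and timing uncertainty below are my assumed numbers (roughly matching a short drop and a half-second button jab), not the original data:

```python
import math

# Sketch: propagate a wristwatch-stopwatch timing error through g = 2h / t^2.
# h and sigma_t are assumed illustration values, not the original lab data.

h = 1.5          # assumed drop height, meters
sigma_t = 0.3    # assumed timing uncertainty, about the "half a second of jabbing"

t_true = math.sqrt(2 * h / 9.81)       # ideal fall time, about 0.55 s
g = 2 * h / t_true**2                  # recovers 9.81 m/s^2 exactly, by construction
sigma_g = g * 2 * sigma_t / t_true     # linear error propagation: sigma_g/g = 2*sigma_t/t

print(f"g = {g:.1f} +/- {sigma_g:.1f} m/s^2")
```

With these numbers the one-sigma interval is wider than g itself, so it does indeed reach zero and negative values, exactly as described.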
Naturally, I got a failing grade, because that's not particularly close, and no matter how many times you are solemnly assured otherwise, you are never graded on whether you did your best and honestly reported what you observed. From grade school on, you are graded on whether or not the grading authority likes the results you got. You might hope that there comes some point in your career where that stops being the case, but as near as I can tell, it literally never does. Right on up to professorships, this is how science really works.
The lesson is taught early and often. It often sort of baffles me when other people are baffled at how often this happens in science, because it more-or-less always happens. Science proceeds despite this, not because of it.
(But jerf, my teacher... Yes, you had a wonderful teacher who not only gave you an A for the equivalent but singled you out in class for your honesty and, I dunno, flunked everyone who claimed they got the supposed "correct" answer to three significant digits because that was impossible. There are a few shining lights in the field and I would never dream of denying that. Now tell me how that idealism worked out for you over the next several years.)
The closing sentence is also prescient; the author pivoted to CS, ultimately completing his doctorate at the University of Wisconsin–Madison.
This is both hilarious and more common than you might think. In my field of expertise (ultrafast condensed matter physics), lots of noisy garbage was rationalized through "curve-fitting", without presenting the (I assume horrifyingly skewed) residuals, or any other goodness-of-fit test.
Honestly, I'm kind of frustrated now, too: too much work in this area is closed-source. The research papers tell you everything except how to reproduce the work with minimal effort; it's like they're hiding something.
They also use Origin to plot and MS Word to write papers, both of which are non-free, which makes their work harder to collaborate on and reproduce.
> Ph.D. Computer Science, November 2004, University of Wisconsin, Madison
> M.S. Computer Science, May 2001, University of Wisconsin, Madison
> B.S. Physics, June 1999, Stanford University
[1] https://pages.cs.wisc.edu/~kovar/cv.html
EDIT:
Went on to work at ILM for 5 years and has been at Google for 14 [2]. My guy did indeed end up rolling in cash, haha.
(at most: https://web.archive.org/web/20001031193257/http://www.cs.wis...)
Last-Modified: Sun, 26 May 2002 22:33:04 GMT
(And the HTML code structure matches that era perfectly.)
https://web.archive.org/web/20250311184956/https://pages.cs....
Summary:
The author sets out to investigate the temperature-dependent resistivity of germanium, a classic topic in solid-state physics. However, they quickly become disillusioned with both the theory (which they find overly abstract and nonsensical) and the practice (plagued by faulty equipment, uncooperative materials, and lack of support). Their experimental setup involves a precariously mounted crystal, unreliable tools, and a leaking thermos of liquid nitrogen.
Despite their intense effort, the data collected fails to demonstrate the expected exponential relationship. In frustration, the author draws a curve "through the noise," hoping it will look convincing enough to pass. Ultimately, they conclude the project, and their choice to study physics, was a total waste of time, and regret not choosing computer science instead, where at least they'd be making money, if still unlucky in love.
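For context, the "expected exponential relationship" is the standard intrinsic-semiconductor law, resistivity roughly proportional to exp(Eg / (2*kB*T)). A minimal sketch, using textbook-style values for germanium rather than anything from the lab in question:

```python
import math

# Sketch of the trend the lab was supposed to show: for intrinsic germanium,
# rho(T) ~ exp(Eg / (2*kB*T)), ignoring the weaker mobility temperature dependence.
# Eg is an approximate textbook value, not a measured one.

KB = 8.617e-5   # Boltzmann constant, eV/K
EG = 0.67       # germanium band gap, eV (approximate room-temperature value)

def resistivity_ratio(t_cold, t_hot):
    """rho(t_cold) / rho(t_hot) from the exponential activation term alone."""
    return math.exp(EG / (2 * KB) * (1 / t_cold - 1 / t_hot))

# Cooling from 300 K to 250 K raises resistivity by roughly an order of
# magnitude, which is why noisy data can so thoroughly bury the trend.
print(resistivity_ratio(250, 300))
```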
Also, you don't need to solder wires to the sample. But if you want to measure the Hall resistance of a thin film of a semiconductor, you can solder a glob of indium onto each of the four corners of a 1 cm x 1 cm wafer, put it in a magnetic field, and then do basically the same measurement as a four-point probe, except with the contacts not in a line.
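That four-corner geometry is the van der Pauw configuration. As a hedged sketch (the resistance values below are made up for illustration, not real data), the sheet resistance follows from the two edge-pair resistances via the implicit van der Pauw equation:

```python
import math

# van der Pauw: given R_A and R_B (V/I measured across two opposite edge
# pairs of the four corner contacts), the sheet resistance R_s satisfies
#     exp(-pi*R_A/R_s) + exp(-pi*R_B/R_s) = 1.
# The equation is implicit, so we solve it numerically.

def sheet_resistance(r_a, r_b, tol=1e-12):
    """Solve the van der Pauw equation for R_s by bisection in log space."""
    f = lambda rs: math.exp(-math.pi * r_a / rs) + math.exp(-math.pi * r_b / rs) - 1.0
    lo, hi = 1e-9, 1e9              # f(lo) < 0 and f(hi) > 0 bracket the root
    while hi - lo > tol * hi:
        mid = math.sqrt(lo * hi)    # geometric midpoint suits the wide bracket
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return math.sqrt(lo * hi)

# For a symmetric sample, R_A == R_B == R gives R_s = pi*R/ln(2) exactly.
print(sheet_resistance(100.0, 100.0))  # approx. pi*100/ln(2), about 453 ohms/square
```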
Previously:
https://news.ycombinator.com/item?id=16360479
https://news.ycombinator.com/item?id=23494243
I'm currently writing my master's thesis in experimental quantum computing; the platform is similar to what Google published in December, just with fewer qubits. A lot of it comes down to how much money the lab can spend on the best equipment and how good your fabrication is.
You can have the best minds in experimental physics, but without the right equipment the grad students are just busy trying to make things work somehow, wasting months if not years.
I could never reproduce it well in the lab, because it's really not true. Take a heavy block shaped like a book. Orient it so that the spine is on the floor. There's a lot more friction when you slide it in one direction than in the transverse direction, yet the normal force is the same. Any kid knows this, and I feel dumb that it never occurred to me until someone pointed it out.
Guess that's the power pictures have over words.
404 error now, perhaps some admins took it down due to traffic?
The user's directory is still linked from the listing [0] though.
I wish universities were better equipped for what you pay. Where is all that money going anyways? Leaking like free electrons?
Should we tell him?
He got pissed off at me for questioning his authority. I told the class, "Uh, guys, why don't we all wait until [GTA's name] and I talk this out to proceed, unless y'all want to be replacing fuses in the multimeters." That REALLY pissed him off.
He was yelling. He told me I needed to talk to him in the hallway. I informed him that if I was wrong, this would be a great lesson for the class, and that, no, I would not be going somewhere to be yelled at in private; anything he had to say could be said right there. That really did it. He yelled more. I was laughing at his tantrum. He took me up to the lab lead (not the prof overseeing the class; I'm not 100% sure how this person fit into the hierarchy), intending to get me kicked out of the class for disrespect. He goes on to this guy about how I'm the worst, and I just stand there, smiling.
Finally, the lab lead had gotten tired of the secondhand yelling and asked for my side. He wasn't oblivious to the fact that I was standing there fiddling with my 12AX7 necklace while leaning on the longboard I'd burnt with high voltage; I oozed the hardware-hacker ethos very visibly. I responded simply: "He told the class to measure impedance, with an ohmmeter, while the circuit was live."
It was at that moment I learned it was this lab lead's role to repair equipment (or at least replace fuses) when things like this happen.
Watching that GTA have to tell the class "I was wrong" after he was yelling at me in front of everyone had to be the best.
---
Fast forward a year, and I got to deal with even more mind numbing stupidity: https://opguides.info/posts/whydidipay/#8---senior-spring-20
Honestly, physics is so full of pretension and hero worship. Even among seasoned lecturers there's a tendency to mythologise the progress of the art by making it sound like all the great results we rely on were birthed fully-formed by the giants who kindly lend us their divine shoulders.
Ironically, there's a kind of Gell-Mann amnesia here: working scientists know that most of your work will consist of stumbling down blind alleys in the dark and looking for needles under lampposts that aren't even near the haystack.
I'm reminded of an anecdote which I can't currently source, but as I remember it, Hilbert was trying to derive the Einstein field equations by a variational method. He correctly took the Ricci scalar R as the Lagrangian, but then neglected to multiply by the tensor density, sqrt(-g). This is kind of a rookie mistake, but it was made by one of history's greatest mathematical physicists.
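For reference, the variational principle in question is the Einstein–Hilbert action, and the sqrt(-g) density factor is exactly the piece the anecdote says was dropped:

```latex
S \;=\; \frac{1}{2\kappa}\int R\,\sqrt{-g}\,\mathrm{d}^4x
```

Varying this with respect to the metric yields the vacuum Einstein field equations; without the sqrt(-g) factor the integrand is not a scalar density and the variation comes out wrong.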
Anyway I love this article, it's a breath of fresh air and rightly beloved by undergrads.
(edit: for a counterpoint to this work please see another classic, "The physics is the life": http://i.imgur.com/eQuqp.png )
I’m afraid you’ll have to repeat the experiment.
One of the most devious analytic chemistry labs I had was the one where the spectroscope was ancient, its tray was less transparent and more milky white, and the fluid to analyze was some sort of expired flavored water. The attempt vs result chart looked exactly like that figure.
A really eye-opening experience in many ways.
I've just soldered a SOT-723 onto a SOT-23 adapter board; I can solder anything to anything.
Maybe the frustrations of undergrad lab work would be easier to swallow if they were better situated in historical context. This kind of result should give the experimenter some sympathy for the folks who originally made these discoveries, with less knowledge and worse equipment. But I don't think it's usually explained that way.
For all its flaws, Fahrenheit was based on some good ideas and firmly grounded in what you could easily measure in the 1720s. A brine solution and body heat are two things you can measure without risking burning or freezing the observer. Even the gradations were intentional: in the original scale, the reference temperatures mapped to 32 and 96, and since those are 64 units apart, you could mark the rest of the thermometer with a bit of string and some halving geometry. Marking a Celsius scale from 0 to 100 accurately? Hope you have a good pair of calipers to divide a range into five evenly-spaced divisions...
Nowadays, we have machines capable of doing proper calibration of such mundane temperature ranges to far higher accuracy than the needle or alcohol-mix can even show, but back then, when scientists had to craft their own thermometers? Ease of manufacture mattered a lot.
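The halving argument above can be made concrete. Since 96 - 32 = 64 = 2^6, six rounds of bisection with string-and-fold geometry yield every single degree mark, whereas no amount of halving a 0-100 span ever lands on most Celsius gradations. A small sketch of that arithmetic:

```python
# Sketch: why Fahrenheit's 32..96 references are easy to subdivide by halving.
# Six bisection rounds split a 64-unit span into 64 one-degree steps.

def bisection_marks(lo, hi, rounds):
    """All marks reachable from lo..hi by repeatedly halving adjacent gaps."""
    marks = {lo, hi}
    for _ in range(rounds):
        ordered = sorted(marks)
        for a, b in zip(ordered, ordered[1:]):
            marks.add((a + b) / 2)   # fold the string between two existing marks
    return sorted(marks)

# Fahrenheit: every integer degree from 32 to 96 appears after six halvings.
print(bisection_marks(32, 96, 6) == list(range(32, 97)))

# Celsius: halving 0..100 gives steps of 100/64 = 1.5625, so ordinary
# gradations like 20 degrees are never hit -- hence the calipers.
print(20 in bisection_marks(0, 100, 6))
```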
Cracks in the Nuclear Model: Surprising Evidence for Structure
Has science gone too far? :D
As an odd coincidence, I did the same experiment on a shoestring budget with substandard equipment as well. I too used a fancy computer algorithm to get a best fit. Except that I managed to get four significant figures in the result, an improvement over the (also outdated) textbook.
The author of the angry rant had a life-defining experience of overwhelming frustration.
The same scenario resulted in a positive life-defining experience for me.
It’s funny how unpredictably things pan out even in identical circumstances…