maitola

I love the ironic side of the article. Perhaps they should add the reason for it, from Fermi and von Neumann. When you are building a model of reality in physics, if something doesn't fit the experiments, you can't just add a parameter (or more), vary it, and fit the data. The model should ideally have zero parameters, or as few as possible, or, at an even deeper level, the parameters should emerge naturally from some simple assumptions. With four parameters you don't know whether you are really capturing a true aspect of reality or just fitting the data of some experiment.

elijahbenizzy

This is humorous (and well-written), but I think it's more than that.

I'm always making the joke (observation) that ML (AI) is just curve-fitting. Whether "just curve-fitting" is enough to produce something "intelligent" is, IMO, currently unanswered, largely due to differing viewpoints on the meaning of "intelligent".

In this case they're demonstrating some very clean, easy-to-understand curve-fitting, but it's really the same process -- come up with a target, optimize over a loss function, and hope that it generalizes (this one, obviously, does not. But the elephant is cute).

This raises the question von Neumann was asking -- why have so many parameters? Ironically (or maybe just interestingly), we've done a *lot* with a ton of parameters recently, answering it with "well, with a lot of parameters you can do cool things".
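The "come up with a target, optimize over a loss function" loop is easy to sketch concretely. Here's a minimal, made-up illustration (a two-parameter line fit by gradient descent, nothing from the paper), just to show the same skeleton that scales up to billions of parameters:

```python
import numpy as np

# Hypothetical illustration of the "target, loss, optimize" loop:
# fit y = a*x + b to noisy data by gradient descent on squared error.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)  # the target

a, b = 0.0, 0.0   # two parameters
lr = 0.5          # learning rate
for _ in range(2000):
    pred = a * x + b
    grad_a = 2 * np.mean((pred - y) * x)  # d(mean squared error)/da
    grad_b = 2 * np.mean(pred - y)        # d(mean squared error)/db
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 1), round(b, 1))  # close to the true 2.0 and 1.0
```

Whether the fit generalizes is a property of the data and the model class, not of this loop -- the loop is the same either way.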

EdwardCoffin

Freeman Dyson recounts the episode [1] that inspired this paper in his Web of Stories interviews (cued up to the fitting-an-elephant bit) [2]

Steuard

Sadly, the constant term (the average r_0) is never specified in the paper (it seems to be something in the neighborhood of 180?): getting that right is necessary to produce the image, and I can't see any way *not* to consider it a fifth necessary parameter. So I don't think they've genuinely accomplished their goal.

(Seriously, though, this was a lot of fun!)

lazamar

Lol. Loved it.

This was a lovely passage from Dyson’s Web of Stories interview, and it struck a chord with me, like it clearly did with the authors too.

It happened when Dyson took the preliminary results of his work on the pseudoscalar theory of pions to Fermi, and Fermi very quickly dismissed the whole thing. It was a shock to Dyson, but it freed him from wasting more time on it.

Fermi: When one does a theoretical calculation, either you have a clear physical picture in mind or a rigorous mathematical basis. You have neither. How many free parameters did you use for your fitting?

Dyson: 4

Fermi: You know, Johnny von Neumann always used to say, 'with four parameters I can fit an elephant, and with five I can make him wiggle his trunk'.

dheera

I wish there were more humor on arXiv.

If I could make a discovery in my own time without using company resources I would absolutely publish it in the most humorous way possible.

bertm

I have to plug Dr. Octave Levenspiel. Levenspiel was a professor emeritus when I did my undergrad. He did much of the work on industrial fluidized beds, among other things. The elephant curve discussions were a criticism of the complex multi-parameter fitting for heterogeneous catalysis of the time. https://levenspiel.com/elephants/

He tried for a while to get an aerodynamics paper published on the flight of dinosaurs. http://levenspiel.com/wp-content/uploads/2016/02/DinosaurW.p...

This intellectual curiosity reminded me a bit of Feynman and his plate spinning.

ggm

No love for D'Arcy Thompson and On Growth and Form? His parametric models for organisms were quite nice (if very simplistic)

lupire

IIUC:

A real-parameter Fourier series (r(theta) = sum_k r_k cos(k theta)) can only draw a "wiggly circle": a figure with one point on each radial ray from the origin.

A complex-parameter series (z(theta) = sum_k c_k e^(i k theta)) can draw more squiggly figures (epicycles) -- the pen can backtrack as the drawing arm rotates, since each term moves the point somewhere on a small circle around the point computed from the previous terms (and so on recursively).

Obligatory 3B1B https://m.youtube.com/watch?v=r6sGWTCMz2k

Since a complex parameter is 2 real parameters, we should compare the best 4-cosine curve to the best 2-complex-exponential curve.
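The epicycle picture is only a few lines of numpy. A small sketch with arbitrary illustration coefficients (not the paper's values), just to show the mechanics and the 2-real-numbers-per-complex-parameter accounting:

```python
import numpy as np

# Epicycles: each complex coefficient c_k is an arm of radius |c_k|
# rotating at integer frequency k; the pen traces the sum of the arms.
# These coefficients are arbitrary illustration values, not the paper's.
coeffs = {1: 1.0 + 0.5j, 2: 0.3 - 0.2j, 3: 0.1 + 0.1j, 4: -0.15j}

t = np.linspace(0, 2 * np.pi, 1000)
z = sum(c * np.exp(1j * k * t) for k, c in coeffs.items())
x, y = z.real, z.imag  # plot (x, y) to see the closed curve

# Each complex coefficient is two real numbers, so these "four
# parameters" carry eight real degrees of freedom.
print(2 * len(coeffs))  # 8
```

Since every frequency k is an integer, the curve closes up: z(0) equals z(2*pi), unlike the one-point-per-ray r(theta) form, which can never cross itself.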

xpe

One takeaway: don't count parameters. Count bits.

bee_rider

Ya know, in academic writing I tend to struggle with making it sound nice and formal. I try not to use the super-stilted academic style, but it is still always a struggle to walk the line between too loose and too jargony.

Maybe this sort of thing would be a really good tradition. Everyone must write a very silly article with some mathematical arguments in it. Then we can all go forward with the comfort of knowing that we aren't really at risk of breaking new ground in appearing unserious.

It is well written and very understandable!

aqme28

> It only satisfies a weaker condition, i.e., using four non-zero parameters instead of four parameters.

Why would that be a harder problem? In the case that you get a zero parameter, you could inflate it by some epsilon and the solution would basically be the same.
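A quick numerical check of that epsilon argument, using a made-up cosine-series curve (not the paper's): nudging a zero coefficient to 1e-9 moves the curve by at most 1e-9.

```python
import numpy as np

# Check: replacing a zero Fourier coefficient with a tiny nonzero one
# barely moves the curve. The coefficients are arbitrary examples.
theta = np.linspace(0, 2 * np.pi, 500)

def curve(coeffs):
    # r(theta) = sum_k coeffs[k] * cos(k * theta)
    return sum(c * np.cos(k * theta) for k, c in enumerate(coeffs))

exact = curve([1.0, 0.0, 0.5, 0.25])      # second coefficient is zero
inflated = curve([1.0, 1e-9, 0.5, 0.25])  # "inflated" by epsilon

print(np.max(np.abs(exact - inflated)))   # on the order of 1e-9
```

The perturbation is bounded by epsilon times max|cos| = epsilon, so the two solutions are visually identical, which is the point.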

foobarian

Reminds me of an old joke: "What is the difference between an elephant and an aspirin?" - "There isn't any, except the elephant is large, wrinkly and grey."

xpe

Another takeaway (not directly stated in the article, but implied): counting the information content of a model involves more than just the parameters; the structure of the model itself conveys information.
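The "count bits, not parameters" point is easy to make concrete: a single real-valued "parameter" can smuggle in as many bits as you like. A toy sketch (my own construction, not from the article) packing four 8-bit values into one integer:

```python
# One "parameter" can hold arbitrarily many bits: pack four 8-bit
# values into a single integer, then unpack them. Parameter count
# stays at one; information content is 32 bits.
values = [41, 7, 255, 0]

param = 0
for v in values:
    param = (param << 8) | v   # encode: shift in 8 bits per value

decoded = []
p = param
for _ in range(len(values)):
    decoded.append(p & 0xFF)   # decode: peel off the low 8 bits
    p >>= 8
decoded.reverse()

print(decoded == values)  # True
```

This is the same idea behind the "one parameter is always enough" constructions: the decoder's structure, not the parameter count, is doing the work.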

pietroppeter

Love how they misspelled Piantadosi as Paintadosi :)

classified

What is that horizontal bar above r0 in the last equation?

tagami

ATCG

boywitharupee

what's the purpose of this? is it one of those 'fun' problems to solve?
