Tuesday, September 22, 2009
Very often in experimental physics, we're interested in comparing data to a physical model that involves a number of unknown parameters, and we want to find the set of parameters that gives the best fit. Typically "best fit" means minimizing a "cost" function, often the sum of the squares of the deviations between the model and the data. The challenge is that many models are very complicated, with nonlinear dependences on the parameters. As a result, finding the optimal parameters can be very difficult: the cost function in parameter space can have many shallow local minima, for example. The cost function may also be extremely sensitive to some parameters (the "stiff" ones) and comparatively insensitive to others (the "sloppy" ones). In arXiv:0909.3884, James Sethna and Cornell colleagues take a look at this dilemma using the tools of differential geometry, and they propose an improvement to standard techniques based on geodesics on the relevant hypersurface in parameter space. This looks really cool (if mathematical!), though I wish they'd included an example of an actual minimization problem done with this method (instead of leaving it for an "in preparation" reference). Any prospect of real improvements in nonlinear fitting is exciting.
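To make the standard approach concrete, here is a minimal sketch of a Gauss-Newton least-squares fit (the family of methods, including Levenberg-Marquardt, that the geodesic idea aims to improve). The two-parameter exponential model and all the numbers are hypothetical illustrations, not from the paper:

```python
import math

# Hypothetical toy model: y = A * exp(-k * t), with noiseless synthetic data
A_true, k_true = 2.0, 0.5
ts = [0.2 * i for i in range(20)]
ys = [A_true * math.exp(-k_true * t) for t in ts]

def residuals(A, k):
    # Deviations between model and data
    return [A * math.exp(-k * t) - y for t, y in zip(ts, ys)]

def cost(A, k):
    # Sum of squared deviations -- the quantity being minimized
    return sum(r * r for r in residuals(A, k))

def gauss_newton(A, k, steps=50):
    for _ in range(steps):
        r = residuals(A, k)
        # Jacobian columns: d(residual)/dA and d(residual)/dk
        jA = [math.exp(-k * t) for t in ts]
        jk = [-A * t * math.exp(-k * t) for t in ts]
        # Normal equations (J^T J) delta = -J^T r, solved as a 2x2 system
        a11 = sum(x * x for x in jA)
        a12 = sum(x * y for x, y in zip(jA, jk))
        a22 = sum(x * x for x in jk)
        b1 = -sum(x * y for x, y in zip(jA, r))
        b2 = -sum(x * y for x, y in zip(jk, r))
        det = a11 * a22 - a12 * a12
        dA = (a22 * b1 - a12 * b2) / det
        dk = (a11 * b2 - a12 * b1) / det
        # Backtracking line search so the cost never increases
        step = 1.0
        while cost(A + step * dA, k + step * dk) > cost(A, k) and step > 1e-8:
            step *= 0.5
        A, k = A + step * dA, k + step * dk
    return A, k

A_fit, k_fit = gauss_newton(1.0, 1.0)  # start away from the true values
print(A_fit, k_fit)
```

For nearly linear, well-conditioned problems like this toy, the iteration converges quickly; the trouble the paper addresses arises when the parameter-space surface is highly curved and anisotropic (stiff and sloppy directions), where steps like these can stall or wander.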
Posted by Douglas Natelson at 9:03 PM