Monday, November 29, 2010

Writing exams.

Writing (or perhaps I should say "creating", for the benefit of UK/Canada/Australia/NZ grammarians) good exams is not a trivial task.  You want very much to test certain concepts, and you don't want the exam to measure things you consider comparatively unimportant.  For example, the first exam I ever took in college was in honors mechanics; out of a possible 30 points, the mean was a 9 (!), and I got a 6 (!!).  Apart from being a real wake-up call about how hard I would have to apply myself to succeed academically, that test was a classic example of an exam that did not do its job.  The reason the scores were so low is that the test was considerably too long for the time allotted.  Rather than measuring knowledge of mechanics or problem solving ability, the test largely measured people's speed of work - not an unimportant indicator (brilliant, well-prepared people do often work relatively quickly), but surely not what the instructor cared most about, since there usually isn't a need for raw speed in real physics or engineering.  

Ideally, the exam will have enough "dynamic range" that you can get a good idea of the spread of knowledge in the students.  If the test is too easy, you end up with a grade distribution that is very top-heavy, and you can't distinguish between the good and the excellent.  If the test is too difficult, the distribution is soul-crushingly bottom-heavy (leading to great angst among the students), and again you can't separate those who really don't know what's going on from those who just slipped up.  Along these lines, you also need the test to be comparatively straightforward to take (step-by-step multipart problems, where there are still paths forward even if one part is wrong) and to grade.
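To put some numbers on the dynamic range point, here's a toy simulation - a simple logistic item-response model that is entirely my own illustrative construction, not anything from real psychometrics practice.  It compares a too-easy exam with one whose item difficulties span the range of student abilities:

```python
import math
import random
import statistics

def simulate_exam(abilities, difficulties, seed=0):
    """Toy item-response model: P(correct) = logistic(ability - difficulty)."""
    rng = random.Random(seed)
    return [sum(1 for d in difficulties
                if rng.random() < 1 / (1 + math.exp(-(a - d))))
            for a in abilities]

rng = random.Random(42)
abilities = [rng.gauss(0, 1) for _ in range(300)]        # 300 students
too_easy = [-4.0] * 20                                   # 20 trivially easy items
well_matched = [-1.5 + 3.0 * k / 19 for k in range(20)]  # difficulties span the class

easy_scores = simulate_exam(abilities, too_easy)
matched_scores = simulate_exam(abilities, well_matched)

# The too-easy exam piles everyone up near the maximum score (20),
# compressing the spread; the well-matched exam separates the students.
print(f"too-easy exam:     mean {statistics.mean(easy_scores):.1f}, "
      f"spread {statistics.stdev(easy_scores):.1f}")
print(f"well-matched exam: mean {statistics.mean(matched_scores):.1f}, "
      f"spread {statistics.stdev(matched_scores):.1f}")
```

The too-easy exam yields scores bunched near the top with a much smaller standard deviation, which is exactly the "can't distinguish the good from the excellent" problem.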

Finally, in an ideal world, you'd actually like students to learn something from the test, not just have it act purely as a hurdle to be overcome.  This last goal is almost impossible to achieve in classes so large that multiple choice exams are the only real option.  This is where exam writing can be educational for the instructor as well, though - there's nothing quite like starting out to write a problem, only to realize partway through that the situation is more subtle than you'd first thought!  Ahh well.  Back to working on my test questions.

Thursday, November 18, 2010

Memristors - how fundamental, and how useful?

You may have heard about an electronic device called a memristor, a term originally coined by Leon Chua back in 1971, and billed as the "missing fourth fundamental circuit element".  It's worth taking a look at what that means, and whether memristors are fundamental in the physics sense that resistors, capacitors, and inductors are.  Note that this is an entirely separate question from whether such devices and their relatives are technologically useful! 

In a resistor, electronic current flows in phase with the voltage drop across the resistor (assuming the voltage is cycled in an ac fashion).  In the dc limit, current flows in steady state proportional to the voltage, and power is dissipated.  In a capacitor, in contrast, the flow of current builds up charge (in the usual parallel plate concept, charge on the plates) that leads to the formation of an electric field between conducting parts, and hence a voltage difference.  The current leads the voltage (current is proportional to the rate of change of the voltage); when a constant voltage is specified, the current decreases to zero once that voltage is achieved, and energy is stored in the electric field of the capacitor.  In an inductor, the voltage leads the current - the voltage across an inductor, through Faraday's law, is proportional to the rate at which the current is changing.  Note that in a standard inductor (usually drawn as a coil of wire), the magnetic flux through the inductor is proportional to the current (flux = L I, where L is the inductance).  That means that if a certain current is specified through the inductor, the voltage drops to zero (in the ideal, zero-resistance case), and there is energy stored in the magnetic field of the inductor.  Notice that there is a duality between the inductor and capacitor cases (current and voltage swapping roles; energy stored in either electric or magnetic field).
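These phase relationships fall out directly from the complex impedances Z_R = R, Z_C = 1/(jωC), and Z_L = jωL.  Here's a minimal Python sketch (the component values are arbitrary, chosen purely for illustration):

```python
import cmath
import math

omega = 2 * math.pi * 60.0   # 60 Hz ac drive, in rad/s (illustrative)
R, C, L = 100.0, 1e-6, 0.5   # ohms, farads, henries (illustrative values)

impedances = {
    "resistor":  R,                      # V and I in phase
    "capacitor": 1 / (1j * omega * C),   # current leads voltage by 90 degrees
    "inductor":  1j * omega * L,         # voltage leads current by 90 degrees
}

for name, Z in impedances.items():
    # For V = V0 exp(j*omega*t), I = V/Z, so the current's phase
    # relative to the voltage is -arg(Z).
    phase_deg = -math.degrees(cmath.phase(Z))
    print(f"{name:9s}: current phase relative to voltage = {phase_deg:+.0f} deg")
```

The +90/-90 degree results for the capacitor and inductor make the duality mentioned above explicit: swapping C for L flips the sign of the phase.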

Prof. Chua said that one could think of things a bit differently, and consider a circuit element where the magnetic flux (remember, in an inductor this would be proportional to the time integral of the voltage) is proportional to the charge that has passed through the device (the time integral of the current, rather than the current itself as in an inductor).  No one has actually made such a device, in terms of magnetic flux.  However, what people have made are any number of devices where the relationship between current and voltage depends on the past history of the current flow through the device.  One special case of this is the gadget marketed by HP as a memristor, consisting of two metal electrodes separated by a titanium oxide film.  In that particular example, at sufficiently high bias voltage, the flow of current through the device performs electrochemistry on the titanium oxide, either reducing it to titanium metal, or oxidizing it further, depending on the polarity of the flow.  The result is that the resistance (the proportionality between voltage and current; in the memristor language, the proportionality between the time integral of the voltage and the time integral of the current) depends on how much charge has flowed through the device.  Voila, a memristor.
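As a toy illustration of "resistance depends on how much charge has flowed" - this is not HP's actual device model, and every parameter value below is invented - one can simulate a charge-controlled memristor whose resistance interpolates linearly between two limits as charge passes through it:

```python
import math

# Toy charge-controlled memristor, loosely inspired by the TiO2 device.
# All parameter values are made up for illustration.
R_ON, R_OFF = 100.0, 16000.0   # limiting resistances, ohms
Q_MAX = 1e-4                   # charge needed to switch fully, coulombs

def memristance(q):
    """Resistance as a function of total charge passed (clipped linear map)."""
    x = min(max(q / Q_MAX, 0.0), 1.0)   # internal state variable in [0, 1]
    return R_OFF + (R_ON - R_OFF) * x

def drive(v_amp=1.0, freq=1.0, steps=20000, t_total=1.0):
    """Integrate dq/dt = V(t)/M(q) under a sinusoidal drive (Euler method)."""
    dt = t_total / steps
    q = 0.0
    for n in range(steps):
        v = v_amp * math.sin(2 * math.pi * freq * n * dt)
        q += (v / memristance(q)) * dt   # current i = v / M(q)
    return q

# Half a period of the drive, so v >= 0 throughout: charge flows one way
# and the resistance drops - the device "remembers" the charge that passed.
q_half = drive(t_total=0.5, steps=5000)
print(f"resistance fell from {memristance(0.0):.0f} "
      f"to {memristance(q_half):.0f} ohms")
```

Sweeping the voltage through full cycles with this kind of model traces out the pinched hysteresis loop in the I-V plane that is the usual experimental signature of memristive behavior.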

I would maintain that this is conceptually very different and less fundamental than the resistor, capacitor, or inductor elements.  The resistor is the simplest possible relationship between current and voltage; the capacitor and inductor have a dual relationship and each involve energy storage in electromagnetic fields.  The memristor does not have a deep connection to electromagnetism - it is one particular example of the general "mem" device, which has a complex electrical impedance that depends on the current/voltage history of the device.  Indeed, my friend Max di Ventra has, with a colleague, written a review of the general case, which can be said to include "memcapacitors" and "meminductors".  The various memgizmos are certainly fun to think about, and in their simplest implementation have great potential for certain applications, such as nonvolatile memory.

Monday, November 15, 2010

Great moments in consumer electronics

It's been an extremely busy time of the semester, and there appears to be no end in sight.  There will be more physics posts soon, but in the meantime, I have a question for those of you out there who have Nintendo Wii consoles.  (The Wii is a great example of micromachining technology, by the way, since the controller contains a 3-axis MEMS accelerometer, and the Wii Motion Plus also contains a micromachined gyroscope.)  Apparently, if there is a power glitch, it is necessary to "reset your AC adapter" in order to power on the console.  The AC adapter looks for all the world like an ordinary "brick" power supply, which I would think should contain a transformer, some diodes, capacitors, and probably voltage regulators.  Resetting it involves unplugging it from both ends (the Wii and the power strip), letting it sit for two solid minutes, and then plugging it back directly into a wall outlet (not a power strip).  What the heck did Nintendo put in this thing, and why does that procedure work, when plugging it back into a power strip does not?!  Does Nintendo rely on poorly conditioned power to keep the adapter happy?  Is this all some scheme so that they can make sure you're not trying to use a gray-market adapter?  This is so odd that it seemed like the only natural way to try to get to the bottom of it (without following my physicist's inclination of ripping the adapter apart) was to ask the internet.

Wednesday, November 10, 2010

Paul Barbara

I was shocked and saddened to learn of the death of Paul Barbara, a tremendous physical chemist and National Academy of Sciences member at the University of Texas.  Prof. Barbara's research focused largely on electron transfer and single-molecule spectroscopy, and I met him originally because of a mutual interest in organic semiconductors.  He was very smart, funny, and a class act all the way, happy to talk science with me even when I was a brand new assistant professor just getting into our field of mutual interest.  He will be missed.

Friday, November 05, 2010

Two cool videos, + science funding

Here are two extremely interesting videos related to physics topics.  Both combine two things I enjoy in life:  physics and coffee.  Here is a video made by scientists at the Institut Laue-Langevin, a neutron science laboratory in Grenoble funded by a consortium of European countries.  The scientists decided to use a neutron beam to image through a little espresso maker as it brews.  They did this partly for fun, and partly to demonstrate how neutrons may be used to examine materials - for example, one could use this sort of imaging to look for flaws or cracks in turbine blades.  The cross-section for interacting with neutrons varies quite strongly from element to element, giving good material contrast.  The aluminum housing for the espresso maker shows up as very light gray, while the water (and resulting espresso, which is still mostly water, even when my old friend Sven makes it) shows up as very dark.  This is because the hydrogen in the water has a relatively large cross-section for scattering neutrons, and also for capturing them (becoming deuterium).
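The aluminum/water contrast can be estimated with the Beer-Lambert law, I/I0 = exp(-Σt), where the macroscopic cross-section Σ is the sum over elements of (number density) × (microscopic cross-section).  The numbers below are rough thermal-neutron values from memory, for illustration only:

```python
import math

BARN = 1e-24  # cm^2

def transmission(materials, thickness_cm):
    """Beer-Lambert attenuation: I/I0 = exp(-Sigma * t), Sigma = sum n_i * sigma_i."""
    sigma_macro = sum(n * sig * BARN for n, sig in materials)  # cm^-1
    return math.exp(-sigma_macro * thickness_cm)

# Each entry: (number density in atoms/cm^3, total cross-section in barns).
# Values are approximate thermal-neutron figures, quoted from memory.
water = [(6.7e22, 82.0),    # H: huge incoherent scattering plus some capture
         (3.3e22, 4.2)]     # O: weak scattering
aluminum = [(6.0e22, 1.7)]  # Al: small scattering + capture

t_water = transmission(water, 1.0)
t_al = transmission(aluminum, 1.0)
print(f"1 cm of water:    {t_water:.3f} of the beam transmitted")
print(f"1 cm of aluminum: {t_al:.3f} of the beam transmitted")
```

Even a centimeter of water is nearly opaque to thermal neutrons, while the same thickness of aluminum passes most of the beam - hence the dark espresso inside a light-gray housing.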

The second video I saw thanks to Charles Day's blog.  To me, a former mechanical engineer, this is rather jaw-dropping.  Hod Lipson and his graduate students at Cornell have managed to leverage a great piece of physics called the jamming transition.  Many physics students are surprised to learn that some "simple" problems of classical statistical physics can exhibit complex phenomena and remain active subjects of research, even though they seem on the surface like they should have been solved by a 19th century French mathematician whose name started with L.  The jamming transition is one of these problems.  Take a bunch of dry grains (in this case, ground coffee).  When there is a bit of air mixed in with the grains, the grains can slide over and past each other relatively easily.  A latex balloon filled with this mixture is squishy.  If the air is removed, however, the grains jam up, and the grain-filled balloon becomes very hard (as if the effective viscosity of the blob of grains diverges).  The Cornell researchers have used this phenomenon to make a universal "gripper" for picking up objects.  Just watch the movie.  It's very impressive.

Finally, a tidbit about science funding.  Well, it didn't take long.  The Heritage Foundation (a US conservative think-tank) is already proposing cutting the research budgets of NSF, DOE, and NIST, as well as eliminating NSF support for K-12 education.  This isn't a surprise - they do this all the time - though it's interesting that they propose absolutely zero cuts to the Department of Defense (though they have no problem suggesting cuts for veterans' benefits).

Wednesday, November 03, 2010

Data and backups

I don't talk too much on here about the university service stuff that I do - frankly, much of it wouldn't be very interesting to most of my readers.  However, this year I'm chairing Rice University's Committee on Research, and we're discussing an issue that many of you may care about:  data management and preservation.   Generally, principal investigators are assumed to be "responsible custodians" of data taken during research.  Note that "data" can mean many things in this context - see here, for example.   US federal agencies that sponsor research typically expect that PIs will hold on to their data for several years following the conclusion of a project, and will make their data available if requested.  In fact, the university is legally responsible for ensuring that the data are retained.  There are many issues that crop up here, but the particular one on which I'd like some feedback is university storage of electronic data.  If you're at a university, does your institution provide electronic (or physical, for that matter) storage space for the retention of research data?  Do they charge the investigators for that storage?  What kind of storage is it, and is the transfer of data from a PI's lab, say, to that storage automated?  I'd be very interested in hearing either success stories about university or institutional data management, or, alternatively, horror stories.