Thursday, July 30, 2009

More musing about phase transitions

Everyone has seen phase transitions - water freezing and water boiling, for example. These are both examples of "first-order" phase transitions, meaning that there is some kind of "latent heat" associated with the transition. That is, it takes a certain amount of energy to convert 1 g of solid ice into 1 g of liquid water while the temperature remains constant. The heat energy is "latent" because as it goes into the material, it's not raising the temperature - instead it's changing the entropy, by making many more microscopic states available to the atoms than were available before. In our ice-water example, at 0 °C there are a certain number of microscopic states available to the water molecules in solid ice, including states where the molecules are slightly displaced from their equilibrium positions in the ice crystal and rattling around. In liquid water at the same temperature, there are many more possible microscopic states available, since the water molecules can, e.g., rotate all over the place, which they could not do in the solid state. (This kind of transition is "first order" because the entropy, which can be thought of as the first derivative of some thermodynamic potential, is discontinuous at the transition.) Because this kind of phase transition requires an input or output of energy to convert material between phases, there really aren't big fluctuations near the transition - you don't see pieces of ice bopping in and out of existence spontaneously inside a glass of ice water.
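
To put a number on the ice example, here's a quick back-of-the-envelope sketch of my own (the latent heat is the standard textbook value; the variable names are mine):

```python
# Entropy change when 1 g of ice melts at 0 °C (273.15 K).
# For a first-order transition at constant temperature, delta_S = L / T.

latent_heat = 334.0   # J/g, latent heat of fusion for water (textbook value)
temperature = 273.15  # K, melting point at atmospheric pressure

delta_S = latent_heat / temperature  # J/(g K)
print(f"Entropy change on melting: {delta_S:.3f} J/(g K)")
# ~1.22 J/(g K): the heat is "latent" because it raises S, not T.
```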

There are other kinds of phase transitions. A major class of great interest to physicists is that of "second-order" transitions. If one goes to high enough pressure and temperature, the liquid-gas transition becomes second order, right at the critical point where the distinction between liquid and gas vanishes. A second-order transition is continuous - that is, while there is a change in the collective properties of the system (e.g., in the ferro- to paramagnetic transition, you can think of the electron spins as many little compass needles; in the ferromagnetic phase the needles all point the same direction, while in the paramagnetic phase they don't), the entropy - and hence the number of microscopic states available - is continuous across the transition. However, the rate at which microstates become available with changes in energy is different on the two sides of the transition, which is why quantities like the heat capacity can jump. In second-order transitions, you can get big fluctuations in the order of the system near the transition. Understanding these fluctuations ("critical phenomena") was a major achievement of late 20th century theoretical physics.
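
If you want to see those fluctuations for yourself, here's a minimal numerical sketch (my own toy code, nothing rigorous): Metropolis Monte Carlo on a tiny 2D Ising lattice, where the spread in the magnetization is largest near the critical temperature.

```python
# Fluctuations near a second-order transition: Metropolis dynamics for a
# small 2D Ising model (units where J = k_B = 1, so T_c ~ 2.269).
import numpy as np

rng = np.random.default_rng(0)
L = 16  # lattice side; deliberately tiny, purely illustrative

def sweep(spins, T):
    """One Metropolis sweep: L*L attempted single-spin flips."""
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbors (periodic boundaries):
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

for T in (1.5, 2.27, 3.5):  # below, near, and above T_c
    spins = np.ones((L, L), dtype=int)
    for _ in range(500):  # equilibrate
        sweep(spins, T)
    samples = []
    for _ in range(500):  # measure
        sweep(spins, T)
        samples.append(spins.mean())
    samples = np.array(samples)
    print(f"T = {T:4.2f}: <|m|> = {np.abs(samples).mean():.2f}, "
          f"std(m) = {samples.std():.3f}")
# The standard deviation of m peaks near T = 2.27: whole patches of spins
# flicker back and forth, with no latent-heat barrier to pay.
```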

Here's an analogy to help with the distinction: as you ride a bicycle along a road, the horizontal distance you travel is analogous to increasing the energy available to one of our systems, and the height of the road corresponds to the number of microscopic states available to the system. If you pedal along and come to a vertical cliff, and the road continues on above your head somewhere, that's a bit like the 1st order transition. With a little bit of energy available, you can't easily go back and forth up and down the cliff face. On the other hand, if you are pedaling along and come to a change in the slope of the road, that's a bit like the 2nd order case. Now with a little bit of energy available, you can imagine rolling back and forth over that kink in the road. This analogy is far from perfect, but maybe it'll provide a little help in thinking about these distinctions. One challenge in trying to discuss this stuff with the lay public is that most people only have everyday experience with first-order transitions, and it's hard to explain the subtle distinction between 1st and 2nd order.

Wednesday, July 22, 2009

The Anacapa Society

Hat tip to Arjendu for pointing this out. The Anacapa Society is a national society that promotes and encourages research in computational and theoretical physics at primarily undergrad institutions. They've had a good relationship with the KITP at UCSB, and have just signed an agreement that gives them a real home at Amherst College. (I've had a soft spot for Amherst since back in the day when I was struggling to decide whether to do the tier-1 research route vs. the undergrad education trajectory.) The nice thing about promoting this kind of research is that, particularly on the computational side of things, well-prepared undergrads at smaller institutions can make real contributions to science without necessarily needing the expensive infrastructure required for some hardcore experimental areas.

Cute optics demo

This YouTube video is something that I'll have to remember for a future demo. It shows that cellophane tape makes (one-sided) frosted glass appear transparent. Quite striking! The reason this works is pretty straightforward from the physics perspective. Frosted glass looks whitish because its surface has been covered (by sandblasting or something analogous) with little irregularities that have a typical size scale comparable to the wavelengths of visible light. Because of the difference in index of refraction between glass and air, these little irregularities diffusely scatter light, and they do a pretty equitable job across the visible spectrum. (This is why clouds are white, too, by the way.) By coating the glass intimately with a polymer layer (with an index of refraction closer to the glass than that of the air), one is effectively smoothing out the irregularities to a large degree. As far as I know, this is essentially the same physics behind why wet fabrics often appear darker than dry fabrics. Some of the apparent lightness of the dry material is due to diffuse scattering by ~ wavelength-sized stray threads and fibers. A wetting liquid acts as an index-matching medium, effectively smoothing out those inhomogeneities and reducing that diffuse scattering.
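
To make the index-matching argument a bit more quantitative, here's a rough sketch using the normal-incidence Fresnel formula; the refractive indices are generic assumed values, not measurements of any actual tape:

```python
# Normal-incidence Fresnel reflectance between media 1 and 2:
#   R = ((n1 - n2) / (n1 + n2))**2

def reflectance(n1, n2):
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air, n_glass, n_adhesive = 1.00, 1.52, 1.47  # assumed textbook-ish values

print(f"glass/air:      R = {reflectance(n_glass, n_air):.4f}")
print(f"glass/adhesive: R = {reflectance(n_glass, n_adhesive):.6f}")
# The glass/adhesive mismatch scatters roughly two orders of magnitude less
# light per surface bump than glass/air, so the frosting seems to vanish.
```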

Tuesday, July 21, 2009

Phase transitions and "mean field theory"

One truly remarkable feature of statistical physics (and condensed matter physics in particular) is the emergence of phase transitions. When dealing with large numbers of particles one often finds that, as a function of some parameter like temperature or pressure, the whole collection of particles can undergo a change of state. For example, as liquid water is warmed through 100 °C at atmospheric pressure, it boils into a vapor phase of much lower density, even though it is still made up of the same water molecules as before. Understanding how and why phase transitions take place has kept many physicists occupied for a long time.

Of particular interest is understanding how microscopic interactions (e.g., polar attraction between individual water molecules) connect to the phase behavior. A classic toy model for this comes from magnetism. It's a comparatively simple statistical physics problem to understand how a single magnetic spin (in real life, something like one of the d electrons in iron) interacts with an external magnetic field. The energy of a magnetic moment is lowered if the magnetic moment aligns with a magnetic field - this is why it's energetically favorable for a compass needle to point north. So, one does the statistical physics problem of a single spin in a magnetic field, and there's a competition between this alignment energy on the one hand, and thermal fluctuations on the other. At large enough fields and low enough temperatures, the spin is highly likely to align with the field. Now, in a ferromagnet (think for now about a magnetic insulator, where the electrons aren't free to move around), there is some temperature, the Curie temperature, below which the spins spontaneously decide to align with each other, even without an external field. Going from the nonmagnetic to the aligned (ferromagnetic) state is a phase transition. A toy model for this is to go back to the single-spin treatment, and instead of thinking about the spin interacting with an externally applied magnetic field, say that the spin is interacting with an average (or "mean") magnetic field that is generated by its neighbors. This is an example of a "mean field theory", and it may be solved self-consistently to find, within this model, the Curie temperature and how the magnetization behaves near it.
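
For the curious, here's a minimal numerical sketch of that self-consistency loop (my toy implementation) for spin-1/2 Ising spins, in units where the Curie temperature is 1:

```python
# Mean-field ferromagnet: each spin feels an effective field proportional
# to the average magnetization m of its neighbors, which for spin-1/2
# Ising spins gives the self-consistency condition m = tanh(m * T_c / T).
import numpy as np

def mean_field_m(T, tol=1e-10):
    """Solve m = tanh(m / T) by fixed-point iteration (units with T_c = 1)."""
    m = 1.0  # start from the fully aligned state
    for _ in range(10000):
        m_new = np.tanh(m / T)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m_new

for T in (0.5, 0.9, 0.99, 1.01, 1.5):
    print(f"T/T_c = {T:4.2f}: m = {mean_field_m(T):.4f}")
# Below T_c a nonzero m sustains itself; above T_c the only solution is m = 0,
# and m collapses toward zero as T approaches T_c from below.
```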

Mean field theories are nice, but it is comparatively rare that real systems are well described in detail by mean field treatments. For example, in the magnetism example the magnetization (spontaneous alignment of the spins, in appropriate units) goes like (1 - T/T_C)^(1/2) at temperatures just below T_C. This is not the case for real ferromagnets - the exponent is different. Because of the nature of the approximations made in mean field theory, it is expected to be best in higher dimensionality (that is, when there are lots of neighbors!). Here's a question for experts: what real phase transitions are well described by mean field theory? I can only think of two examples: superconductivity (where the superconducting gap scales like (1 - T/T_C)^(1/2) near the transition, just as mean field theory predicts) and a transition between liquid crystal phases. Any others?
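
(For the record, that exponent of 1/2 falls right out of expanding the mean-field self-consistency condition m = tanh(m T_C/T) for small m - a standard exercise, sketched here in the same units as above:)

```latex
% Near T_C, m is small, so use tanh(x) \approx x - x^3/3 in m = tanh(m T_C / T):
\begin{aligned}
m &\approx \frac{T_C}{T}\,m \;-\; \frac{1}{3}\left(\frac{T_C}{T}\right)^{3} m^{3} \\
\Rightarrow\quad m^{2} &\approx 3\left(\frac{T}{T_C}\right)^{2}\left(1 - \frac{T}{T_C}\right)
\;\approx\; 3\left(1 - \frac{T}{T_C}\right), \\
\text{so}\quad m &\propto \left(1 - \frac{T}{T_C}\right)^{1/2} \quad (T \to T_C^{-}).
\end{aligned}
```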


Wednesday, July 15, 2009

The elevator message

I had a conversation today that made me think about the following. These days we're told countless times that it's essential for a scientist to have an "elevator message". That is, we need to be able to describe what we're doing in a pitch accessible to a lay person, ideally in something like a single sentence. Some people have a comparatively easy time of this. They can say "I'm trying to cure cancer", or "I'm trying to solve the energy crisis", and have that be a reasonable description of their overarching research goals. Condensed matter physicists in general often have trouble with this, and tend to fall back on things like "My work will eventually enable faster computers" or "...better sensors". I'm all in favor of brief, accessible descriptions of what scientists do, but there are times when I think the elevator message idea is misguided. Not every good research program can be summed up in one sentence.

In the case of my group, we are trying to understand the (electronic, magnetic, and optical) properties of matter on the smallest scales, with an eye toward eventually engineering these properties to do useful things. It's basic research. Sometimes we can test existing theoretical ideas or address long-standing questions; sometimes, because we're working in previously unexplored regimes, we find surprises, and that can be really fun. I know that this description is more sophisticated and therefore less pithy than "it'll give us faster computers". Still, I feel like this longer description does a much better job of capturing what we're actually doing. Our work is much more like puzzle-solving and exploring than it is a focused one-goal pursuit. I don't think that this means I lack vision, but I'm sure others would disagree.

On a separate note: Thanks, Arjendu, for pointing me to this: Microsoft Research's hosting of a series of lectures Feynman gave at Cornell in 1964. Very cool, even if I had to install MS's plug-in for the video.

Thursday, July 09, 2009

We need more papers like this.

Somehow I had missed this paper when it came out on the arxiv last November, but I came across it the other day while looking for something else in the literature. It's all about the challenges and hazards of trying to measure the magnetization of either tiny samples or those with extremely small magnetic responses. Some of the cautions are rather obvious (e.g., don't handle samples with steel tools, since even tiny amounts of steel contamination will give detectable magnetic signals), and others are much more subtle (e.g., magnetic signatures from Kapton tape (due to dust! I learned about this one firsthand a few years ago.) and from deformed plastic straws (commonly used as sample holders in a popular brand of magnetometer)). Papers like this are incredibly valuable, and usually hard to publish. Still, I much prefer this style - writing a substantive, cautionary paper that is informative and helpful - to the obvious alternative of writing aggressive comments in response to papers that look suspect to you. The paper is so good that I'm even willing to forgive them their choice of font.

Wednesday, July 08, 2009

Figures and permissions - Why, AAAS?

Perhaps someone out there can enlighten me. For review articles, if you want to reproduce a figure from someone's published work, you are required to get permission from the copyright holder (e.g., APS for Physical Review, ACS for Nano Letters, etc.). As far as I can tell, the professional societies (APS, ACS) are cool about this, and won't charge you for permission. Even Nature, a for-profit magazine, does not charge for this if all you're doing is using a figure here and there. However, Science, run by the non-profit AAAS, wants to charge $31.25 per figure for permission to reproduce that figure in a review article. Why is Science doing this? Is this some attempt to recoup publication costs? Anyone got an explanation?

arxiv failure

It would appear that the arxiv is having some issues. Bizarrely, this seems to affect cond-mat, but not (for example) astro-ph. In cond-mat, asking for "recent" papers points you to October 2008. Asking for "new" papers gets you things like:

New submissions for Wed, 8 Jul 09

Error with 0907.1092
Error with 0907.1096
Error with 0907.1111
Very odd. Hopefully this will be fixed soon. Come to think of it, this is the first problem like this I've seen in a decade of reading cond-mat.

Wednesday, July 01, 2009

This week in cond-mat

There have been a number of exciting (to me, anyway) papers on the arxiv this past week. One in particular, though, seems like a neat illustration of a physical principle that crops up a lot in condensed matter physics.

arxiv:0906.5206 - Tanda et al., Aharonov-Bohm Effect at liquid-nitrogen temperature: Frohlich superconducting quantum device

There are several examples in condensed matter physics of "special" (I'll explain what I mean in a second) electronic ground states that are "gapped", meaning that the lowest energy excited states for the many-electron system are separated from the ground state by an energy range where there are no allowed states. When I say that a ground state is special, I mean that it has some particular order parameter (or broken symmetry) that is distinct from that of the excited states. In this sense, a band insulator or semiconductor is not special - the many-body filled valence band states really don't have any different symmetries than the empty conduction band states. However, the superconducting ground state is special, with broken gauge symmetry (when compared to the normal metallic state) and a minimum energy (the gap energy) required to make any excitations (in this case, by breaking apart a Cooper pair). Fractional quantum Hall states are similarly gapped. The consequence of that energy gap is that the ground state can be very robust. In particular, the gap means that low energy (compared to the gap) inelastic processes cannot perturb the system, since there are no allowed final states around. This is one reason why it is possible to see macroscopic quantum effects in superconductors, as long as k_BT is small compared to the gap.
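
To get a feel for how strong that protection is, here's a rough estimate of my own (the gap value is a round number typical of a conventional superconductor like Nb, not anything from the paper):

```python
# Thermally activated excitations across a gap are suppressed by a
# Boltzmann factor exp(-Delta / k_B T).
import math

k_B = 8.617e-5   # eV/K
delta = 1.5e-3   # eV, an assumed round number for a conventional
                 # superconducting gap (roughly the Nb scale)

for T in (0.3, 1.5, 4.2):   # K
    print(f"T = {T:3.1f} K: exp(-Delta/k_B T) = "
          f"{math.exp(-delta / (k_B * T)):.2e}")
# At 0.3 K the suppression is astronomical; by 4.2 K it's only ~1e-2, and
# near T_c the gap itself closes, so this fixed-gap estimate is just a guide.
```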

The authors of this paper have decided to see whether such macroscopic quantum effects (detectable via quantum interference measurements analogous to the two-slit experiment) can survive in another gapped system. The distinction here is that the special state is something called a charge density wave (CDW), where the electronic density in a material (in this case tantalum trisulfide) spontaneously takes on a spatially periodic modulation. This gapped state kicks in at much higher temperatures than typical superconducting transitions. The authors have been able to measure quantum interference robustly in their device at liquid nitrogen temperatures, which is pretty impressive, and there is reason to believe that this could be extended to room temperature. The sample fabrication is very impressive, by the way. You can't just take a sheet of this stuff and punch a hole in it to make your ring-shaped interferometer. Instead, you have to actually curl a sheet up into a tube. Neat stuff, and quite surprising to me. I need to read up more about CDWs....
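
As a rough sense of scale for this kind of interference experiment, here's a back-of-the-envelope estimate of my own (assuming a 1-micron-diameter ring and an h/2e flux period, as for Cooper pairs; the actual device geometry and period are in the paper):

```python
# Aharonov-Bohm oscillations: one period per flux quantum through the ring,
# so the field period is delta_B = Phi_0 / A.
import math

h = 6.626e-34   # Planck constant, J s
e = 1.602e-19   # electron charge, C

phi_0 = h / (2 * e)            # flux quantum for charge 2e, Wb (assumed period)
radius = 0.5e-6                # m, assumed ring radius
area = math.pi * radius ** 2   # m^2

delta_B = phi_0 / area         # T per oscillation
print(f"Oscillation period: {delta_B * 1e3:.1f} mT")
# ~2.6 mT for these numbers - an easy scale for a lab magnet, which is part
# of what makes ring interferometers such clean probes of phase coherence.
```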