Wednesday, November 19, 2014

Frequently Asked Questions

[Image source: Stickypictures.]

My mom is a now-retired high school teacher. As a teenager I thought this was a great job and wanted to become a teacher myself. To practice, I made money giving homework help, but I quickly discovered I hated it for a simple reason: I don’t like to repeat myself. I really don’t like to repeat myself.

But if I thought spending two years repeating how to take square roots - to the same boy - was getting me as close to spontaneous brain implosion as I ever wanted to get, it still didn’t quite prepare me for the joys of parenthood. Only the twins would introduce me to the pleasure of hearing Jingle Bells for 5 hours in a row, and of re-reading the story about Clara and her Binky until the book mysteriously vanished, not to be seen again unless somebody bothers to clean behind the shoe rack. “I told you twice not to wash the hair dryer,” clearly wasn’t my most didactic moment. But my daughter just laughed when the fuse blew and the lights went off. Thanks for asking, we got a new dryer.

And so I often feel like I write this blog as an exercise in patience. Nobody, of course, bothers to search the blog archives where I have explained everything. Sometimes twice! But today I will try to be inspired by Ethan, who seems to have the patience of an angel, if a blue one, and answers the same basic questions over and over and over again. So here are answers to the questions I get most often. Once and for all, I hope...
  1. Is string theory testable?

    The all-time favorite. Yes, it is. There is really no doubt about it. The problem is that it is testable in principle, but at least so far nobody knows how to test it in practice. The energies (and energy densities) necessary for this are just too high. Some models that are inspired by string theory, notably string cosmology, are testable with existing experiments. That it is testable in principle is a very important point, because some variants of the multiverse aren’t even testable in principle, and then it is indeed highly questionable whether they are still science. Not so for string theory. And let me be clear that I mean here string theory as the candidate theory of everything including gravity. Testing string theory as a means to explain certain strongly coupled condensed matter systems is an entirely different thing.

  2. Do black holes exist?

    Yes. We have ample evidence that supermassive black holes exist in the centers of many galaxies and that solar-mass black holes are found throughout galaxies. The existence of black holes is today a generally accepted fact in the physics community. That black holes exist means, concretely, that we have observational evidence for objects dense enough to be black holes that do not have a hard surface, so they cannot be very dim stars. One can exclude this possibility because matter hitting the surface of a star would emit radiation, whereas the same does not happen when matter falls through a black hole horizon. This horizon does not have to be an eternal horizon. It is consistent with observation, and indeed generally believed, that the black hole horizon can eventually vanish, though this will not happen until hundreds of billions of years into the future. The defining property of a black hole is the horizon, not the singularity at its center, which is generally believed to not exist but for which we have no evidence one way or the other.

  3. Why quantize gravity?

    There is no known way to consistently couple the non-quantized theory of general relativity to the quantum field theories of the standard model. This only works in limiting cases. The most plausible way to resolve this tension is to quantize gravity too. It is in principle also possible that instead there is a way to couple quantum and classical theories that has so far been missed, or that the underlying theory is in some sense neither classical nor quantum, but this option is not favored by most researchers in the field today. Either way, the inconsistency in our existing theories is a very strong indication that the theories we have are incomplete. Research in quantum gravity basically searches for the completion of the existing theories. In the end this might or might not imply actually quantizing gravity, but Nature somehow knows how to combine general relativity with quantum field theory, and we don’t.

  4. Why is it so hard to quantize gravity?

    It isn’t. Gravity can be quantized pretty much the same way as the other interactions. It’s just that the theory one arrives at this way cannot be a fundamental theory because it breaks down at high energies. It is thus not the theory that we are looking for. Roughly speaking the reason this happens is that the gravitational equivalent of a particle’s charge is the particle’s energy. For the other known interactions the charge and the energy are distinct things. Not so for gravity.

  5. Is quantum gravity testable?

    Again, yes, it is definitely testable in principle; it’s just that the energy density necessary for strong quantum gravitational effects is too high for us to produce. Personally I am convinced that quantum gravity is also testable in practice, because indirect evidence can show up at much lower energy densities, but so far we do not have experimental evidence. There is a very active research area called quantum gravity phenomenology dedicated to finding this missing experimental evidence. You can check these two review papers to get an impression of what we are presently looking for.
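To get a sense of the numbers: in natural units Newton’s constant is an inverse energy squared, so the effective dimensionless coupling of gravity grows like (E/EPlanck)², and quantum gravitational effects only become strong near the Planck energy. Here is a quick back-of-the-envelope check (a sketch using standard CODATA values; the quadratic growth is the leading dimensional estimate, not a full calculation):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # J s
c    = 2.99792458e8      # m / s
G    = 6.67430e-11       # m^3 / (kg s^2)
J_per_GeV = 1.602176634e-10

# Planck energy: E_p = sqrt(hbar c^5 / G)
E_planck = math.sqrt(hbar * c**5 / G) / J_per_GeV  # in GeV
print(f"Planck energy: {E_planck:.3e} GeV")  # ~1.22e19 GeV

# Effective dimensionless gravitational coupling ~ (E / E_p)^2
for E in (1.0, 1e4, E_planck):   # 1 GeV, roughly LHC scale, Planck scale
    print(f"E = {E:.2e} GeV -> coupling ~ {(E / E_planck)**2:.2e}")
```

At collider energies of about 10 TeV the coupling comes out at roughly 10⁻³⁰, which is why nobody expects to produce strong quantum gravitational effects in the laboratory.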

Wednesday, November 12, 2014

The underappreciated value of boring truths

My primary reaction to any new idea on the arXiv is the conviction that it’s almost certainly wrong, and if I can’t figure out quickly why it’s wrong, I’ll ignore it because it’s most likely a waste of time. In other words, I exemplify the stereotypical reaction of scientists which Arthur C. Clarke summed up so nicely in his three stages of acceptance:
  1. “It’s crazy — don’t waste my time.”
  2. “It’s possible, but it’s not worth doing.”
  3. “I always said it was a good idea.”

Maybe I’m getting old and bold rather than wise and nice, but when it comes to quantum gravity phenomenology, craziness seems to thrive particularly well. My mother asked me the other day what I tell a journalist who wants a comment on somebody else’s work which I think is nonsense. I told her I normally say “It’s very implausible.” No, I’m not nice enough to bite my tongue if somebody asks for an opinion. And so, let me tell you that most of what gets published under the name of quantum gravity phenomenology is, well, very implausible.

But quantum gravity phenomenology is just an extreme example of a general tension that you find in theoretical physics. Suppose you ranked all unconfirmed theories on two scales, one the spectrum from exciting to boring, the other the spectrum from very implausible to likely correct. Then put a dot for each theory in a plane with these two scales as axes. You’d see that the two measures are strongly correlated: The nonsense is exciting, and the truth is boring, and most of what scientists work on falls on a diagonal from exciting nonsense to boring truths.

If you’d break this down by research area you’d also find that the more boring the truth, the more people work on nonsense. Wouldn’t you too? And that’s why there is so much exciting nonsense in quantum gravity phenomenology - because the truth is boring indeed.

Conservative wisdom says that quantum gravitational effects are tiny unless space-time curvature is very strong, which only happens in the early universe and inside black holes. This expectation comes from treating quantum gravity as an effective field theory and quantizing it perturbatively, i.e. when the fluctuations of space-time are small. The theory quantized this way does not make sense as a fundamental theory of gravity because it breaks down at high energies, but it should be fine for calculations in weak gravitational fields.

Most of the exciting ideas in quantum gravity phenomenology assume that this effective limit does not hold for one reason or the other. The most conservative way to be non-conservative is to allow the violation of certain symmetries that are leftover from a fundamental theory of quantum gravity which does not ultimately respect them. Violations of Lorentz-invariance, CPT invariance, space-time homogeneity, or unitarity are such cases that can be accommodated within the effective field theory framework, and that have received much attention as possible signatures of quantum gravity.

Other, more exotic proposals implicitly assume that the effective limit does not apply for unexplained reasons. It is known that effective field theories can fail under certain circumstances, but I can’t see how any of these cases plays a role in the weak-field limit of gravity. Then again, strong curvature is one of the known reasons for failure, and we do not understand what the curvature of space-time is microscopically. So sometimes, when I feel generous, I promote “implausible” to “far-fetched”.

John Donoghue is one of the few heroically pushing through calculations in the true-but-boring corner of quantum gravity phenomenology. In a recent paper, he and his coauthors calculated the quantum contributions to the bending of light in general relativity from 1-loop effects in perturbatively quantized gravity. From their result they define a semi-classical gravitational potential and derive the quantum corrections to Einstein’s classical test of General Relativity by light deflection.

They find a correction term that is suppressed by a factor ℏG/b² relative to the classical result, where b is the impact parameter and G is Newton’s constant. This is the typical result you’d expect on dimensional grounds. It’s a loop correction, so it must have an extra G in it, and it must have an inverse power of the impact parameter so it gets smaller with distance; thus G/b² is a first guess. Of course you don’t get tenure for guessing, and the actual calculation is quite nasty, see the paper for details.

In the paper the authors write “we conclude that the quantum effect is even tinier than the current precision in the measurement of light deflection”, which is an understatement if I have ever seen one. If you are generous and put in a black hole of mass M and a photon that just barely manages to avoid being swallowed, the quantum effect is smaller than the classical term by a factor (mp/M)², where mp is the Planck mass. For a solar-mass black hole this is roughly 76 orders of magnitude of suppression. (Though on such a close approach the approximation of small deflection doesn’t make sense any more.) If you have a Planck-mass black hole, the correction term is of order one – again, that’s what you’d expect.
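The size of this suppression is easy to reproduce. A minimal numerical check of the (mp/M)² factor for a solar-mass black hole (plain arithmetic with standard values for the Planck mass and the solar mass, not the 1-loop calculation from the paper):

```python
import math

m_planck = 2.176434e-8   # Planck mass in kg
M_sun    = 1.98892e30    # solar mass in kg

# Relative size of the quantum correction at closest approach
suppression = (m_planck / M_sun)**2
print(f"(m_p/M)^2 = {suppression:.2e}")                      # ~1.2e-76
print(f"orders of magnitude: {math.log10(suppression):.0f}")
```

For a Planck-mass black hole the same ratio is one, reproducing the order-one correction mentioned above.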

Yes, that is a very plausible result indeed. I would be happy to tell this any journalist, but unfortunately news items seem to be almost exclusively picked from the ever increasing selection of exciting nonsense.

I will admit that it is hard to communicate the relevance of rather technical calculations that don’t lead to stunning results, but please bear with me while I try. The reason this work is so important is that we have to face the bitter truth to find out whether that’s really all that there is or whether we indeed have reason to expect the truth isn’t as bitter as it said on the wrapping. You have to deal with a theory and its nasty details to figure out where it defies your expectations and where your guesses go wrong. And so, we will have to deal with effective quantum gravity to understand its limits. I always said it was a good idea. Even better that somebody else did the calculation so I can continue thinking about the exciting nonsense.

Bonus: True love.

Tuesday, November 11, 2014

And the winners are...

The pile of money whose value you have been guessing came out to be 68.22 Euro and 0.5 Deutsche Mark, the latter of which I didn't count. Hoping that I didn't miss anybody's guess, this means the three winning entries are:
  • Rbot: 72
  • Rami Kraft: 62
  • droid33: 58.20
Congratulations to the winners! Please send an email to hossi[at] with your postal address and I will get the books on their way.

Saturday, November 08, 2014

Make a guess, win a book.

The twins' piggy banks are full, so I've slaughtered them. Put in your guess of how much they've swallowed and you can win a (new) copy of Chad Orzel's book "How to Teach Quantum Physics to Your Dog". (No, I'm not getting paid for this, I have a copy I don't need and hope it will make somebody happy.) You can put in your guess until Monday, midnight, East Coast Time. I will only take into account guesses posted in the comments - do not send me an email. I am looking for the amount in Cent or Euro, not the number of coins. The winners will be announced Tuesday morning. Good luck!

Wednesday, November 05, 2014

The paradigm shift you didn’t notice

Inertia creeps.

Today, for the first time in human history a scientist has written this sentence – or so would be my summary of most science headlines I read these days. Not only do the media buy rotten fish, they actually try to resell them. The irony is though that the developments which really change the way we think and live happen so gradually you wouldn’t ever learn about them in these screaming headlines.

HIV infection for example still hasn’t been cured, but decades of hard work turned it from a fatal disease into a treatable one. You read about this in longwinded essays in the back pages where nobody looks, but not on the cover page and not in your news feed. The real change didn’t come about by this one baby who smiles on the photo and who was allegedly cured, as the boldface said, but by the hundreds of trials and papers and conferences in the background.

These slow changes also happen in physics. Quantum measurement is a decoherence process rather than collapse. This doesn’t break the ground but slowly moves it. It’s an interpretational shift that has spread through the community. Similarly, it is now generally accepted that most infinities in quantum field theory do not signal a breakdown of the theory but can be dealt with by suitable calculational methods.

For me the most remarkable shift that has taken place in physics in the last decades is the technical development and, with it, acceptance of renormalization group flow and effective field theories. If this sounds over your head, bear with me for I’m not going into the details, I just want to tell you why it matters.

You have certainly heard that some quantum field theories are sick and don’t make sense – they are said to be non-renormalizable. In such a theory the previously mentioned infinities cannot be removed, or they can only be removed at the expense of introducing infinitely many free parameters, which makes the theory useless. Half a century ago a theory with this disease was declared dead and went where theories go to die, into the history aisle.

Then it became increasingly clear that such non-renormalizable theories can be low-energy approximations to other theories that are healthy and renormalizable. The infinities are artifacts of the approximation and appear if one applies the approximation outside its regime of validity.

These approximations at low energies are said to be “effective” theories and they typically contain particles or degrees of freedom that are not fundamental, but instead “emergent”, which is to say they are good descriptions as long as you don’t probe them with too high energy. The theory that is good also at high energies is said to be the “UV completion” of the effective theory. (If you ever want to fake a physics PhD just say “in the IR” instead of “at low energy” and “UV” instead of “high energy”.)

A typical example of an effective theory is the nuclear force between neutrons and protons. These are not fundamental particles – we know that they are made of quarks and gluons. But for nuclear physics, at energies too small to probe the quark substructure, one can treat the neutrons and protons as particles in their own right. The interaction between them is then effectively mediated by a pion, a particle that is itself composed of a quark and an antiquark.

Fermi’s theory of beta-decay is a historically very important case because it brought out the origin of non-renormalizability. We know today that the weak interaction is mediated by massive gauge bosons, the W’s and the Z. But at energies so low that one cannot probe the production and subsequent decay of these gauge bosons, the weak interaction can be effectively described without them. When a neutron undergoes beta decay, it turns into a proton and emits an electron and an electron-antineutrino. If you do not take into account that this happens because one of the quark constituents emits a W-boson, then you are left with a four-fermion interaction with a coupling constant that depends on the mass of the W-boson. This theory is not renormalizable. Its UV completion is the standard model.

[Upper image: One of the neutron's quark constituents interacts via a gauge boson with an electron. Bottom image: If you neglect the quark substructure and the boson exchange, you get a four-fermion interaction with a coupling that depends on the mass of the boson and which is non-renormalizable.]
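The “coupling constant that depends on the mass of the W-boson” can be made explicit: at tree level the Fermi constant obeys GF/√2 = g²/(8 MW²), where g is the weak gauge coupling. A rough numerical sketch (the inputs α(MZ) ≈ 1/128, sin²θW ≈ 0.231 and MW ≈ 80.4 GeV are approximate textbook values, and this is only the tree-level matching, not a precision fit):

```python
import math

# Approximate electroweak parameters at the Z mass
alpha    = 1 / 128.0    # running fine-structure constant at M_Z
sin2_thW = 0.231        # weak mixing angle, sin^2(theta_W)
M_W      = 80.38        # W boson mass in GeV

# Weak coupling: g^2 = e^2 / sin^2(theta_W), with e^2 = 4 pi alpha
g2 = 4 * math.pi * alpha / sin2_thW

# Tree-level matching: G_F / sqrt(2) = g^2 / (8 M_W^2)
G_F = math.sqrt(2) * g2 / (8 * M_W**2)
print(f"G_F ~ {G_F:.3e} GeV^-2")   # measured: ~1.166e-5 GeV^-2
```

The estimate lands within a percent or so of the measured value, which is the sense in which Fermi’s four-fermion theory is the low-energy limit of the standard model.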

So now we live and work with the awareness that any quantum field theory is only one in a space of theories that can morph into each other, and that the expression of the theory changes with the energy scale at which we probe the physics. A non-renormalizable theory is perfectly fine in its regime of validity. And thus today these theories are not declared dead any longer, they are declared incomplete. A theory might have other shortcomings than being non-renormalizable, for example because it contains dimensionless constants much larger (or smaller) than one. Such a theory is called unnatural. In this case too you would now not simply discard the theory but look for its UV completion.

It is often said that physicists do not know how to quantize gravity. This isn’t true though. Gravity can be quantized just like the other interactions; the result is known as “perturbatively quantized gravity”. The problem is that the theory one gets this way is non-renormalizable, which is why it isn’t referred to as quantum gravity proper. The theory of quantum gravity that we do not know is the UV-completion of this non-renormalizable perturbative quantization. (It cannot be non-renormalizable in the same way as Fermi’s theory because gravity is a long-range interaction. We know that gravitons, if they have masses at all, have tiny masses.)

But our improved understanding of how quantum field theories at different energies belong together has done more than increase our acceptance of theories with problems. The effective field theory framework is the tool that binds together, at least theoretically, the different disciplines in physics and in the sciences. No longer are elementary particle physics and nuclear physics and atomic physics and molecular physics different, disconnected layers of reality. Even though we cannot (yet) derive most of the relations between the models used in these disciplines, we know that they are connected through the effective field theory framework. And many physicists believe that at high energies it all goes back to just one “theory of everything”. Don’t expect a big headline announcing its appearance though. The ground moves slowly.

Friday, October 31, 2014

String theory – it’s a girl thing

My first international physics conference was in Turkey. It was memorable not only because smoking was still allowed on the plane. The conference was attended by many of the local students, and almost all of them were women.

I went out one evening with the Turkish students, a group of ten with only one man, who sucked away on his waterpipe while one of the women read my future from tea leaves (she read that I was going to fly through the air in the near future). I asked the guy how come there were so few male students in this group. It’s because theoretical physics isn’t manly; it’s not considered a guy thing in Turkey, he said. Real men work outdoors or with heavy machinery, they drive, they swing tools, they hunt bears, they do men’s stuff. They don’t wipe blackboards or spend their day in the library.

I’m not sure how much of his explanation was sarcasm, but I find it odd indeed that theoretical physics is so man-dominated when it’s mostly scribbling on paper, trying to coordinate collaborations and meetings, and staring out of the window waiting for an insight. It seems mostly a historical accident that the majority of physicists today are male.

From the desk in my home office I have a view onto our downstairs neighbor’s garden. Every couple of weeks a man trims her trees and bushes. He has a key to the gate and normally comes when she is away. He uses the smoking break to tan his tattoos in her recliner and to scratch his chest hair. Then he pees on the roses. The most disturbing thing about his behavior though isn’t the peeing, it’s that he knows I’m watching. He has to cut the bushes from the outside too, facing the house, so he can see me scribbling away at my desk. He’ll stand there on his ladder and swing the chainsaw to greet me. He’s a real man, oh yeah.

After I finished high school, I went to the employment center which offered a skill- and interest-questionnaire, based on which one then was recommended a profession. I came out as landscape architect. It made sense – when asked, I said I would like to do something creative that allows me to spend time outdoors and that wouldn’t require many interpersonal skills. I also really like trees.

Then I went and studied math, because what the questionnaire didn’t take into account is that I get bored incredibly quickly. I wanted a job that wouldn’t run out of novelty any time soon. Math and theoretical physics sounded just right. I never spent much time thinking about gender stereotypes, it just wasn’t something I regarded as relevant. Yes, I knew the numbers, but I honestly didn’t care. Every once in a while I would notice how oddly my voice stood out, look around, and realize I was the only woman in the room, or one of a few. I still find it an unnatural and slightly creepy situation. But no, I never thought about gender stereotypes.

Now I’m a mother of two daughters, and I realized the other day I’ve gone pink-blind. Before I had children, I’d look at little girls thinking I’d never dress my daughters all in pink. But, needless to say, most of the twins’ wardrobe today is pink, because it’s either racing cars and soccer players on blue, or flowers and butterflies on pink. Unless you want to spend a ridiculous amount of money on designer clothes your kids will wear maybe once.

The internet is full of upset about girls’ toys that discourage an interest in engineering, unrealistic female body images, the objectification of women in ads and video games, and the lack of strong female characters in books and movies. The internet is full of sites encouraging women to accept their bodies, the bodies of mothers with the floppy bellies and the stretch marks, the bodies of real women with the big breasts and the small breasts and the freckles and the pimples – every inch of you is perfect from the bottom to the top. It’s full of Emma Watson and He for She. It’s full of high-pitched voices.

But it isn’t only women who are confronted with stereotypical gender roles and social pressure. Somebody I think must stand up and tell the boys it’s totally okay to become a string theorist, even though they don’t get to swing a chainsaw - let that somebody be me. Science is neither a boy thing nor a girl thing.

So this one is for the boys. Be what you want to be, rise like a phoenix, and witness me discovering the awesomeness of multiband compression. Happy Halloween :)

Monday, October 27, 2014

Einstein’s greatest legacy- How demons and angels advanced science

Einstein’s greatest legacy is not General Relativity, it’s not quantum entanglement, and it’s not slices of his brain either. It’s a word: Gedankenexperiment – German for “thought experiment”.

Einstein, like no other physicist before or after him, demonstrated how the power of human thought alone, used skillfully, can make up for the lack of real experiments. He showed that we little humans have the power to deduce the equations that govern the natural world by logical conclusion. Thought experiments are common in theoretical physics today. Physicists use them to examine the consequences of a theory beyond what is measurable with existing technology, but still within the realm of what is in principle measurable. A thought experiment pushes a theory to its limit and can thereby reveal inconsistencies or novel effects. The rules of the game are that a) only what is measurable is relevant and b) do not fool yourself. This isn’t as easy as it sounds.

The famous Einstein-Podolsky-Rosen experiment was such an exploration of the consequences of a theory, in this case quantum mechanics. In a seminal paper from 1935 the three physicists showed that the standard Copenhagen interpretation of quantum mechanics has a peculiar consequence: It allows for the existence of “entangled” particles.

Entangled particles have measurable properties, for example spin, that are correlated between two particles even though the value for each single particle is not determined as long as the particles have not been measured. You can know, for example, that if one particle has spin up the other one has spin down, or vice versa, but not know which is which. The consequence is that if one of these particles is measured, the state of the other one changes – instantaneously. The moment you measure one particle having spin up, the other one must have spin down, even though, according to the Copenhagen interpretation, it did not previously have any specific spin value.

Einstein believed this ‘spooky’ action at a distance to be nonsense, and decades of discussion followed. John Stewart Bell later quantified exactly how entangled particles are more strongly correlated than classical particles could ever be. According to Bell’s theorem, quantum entanglement can violate an inequality that bounds classical correlations.
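The violation itself takes only a few lines of arithmetic. For the singlet state, quantum mechanics predicts the correlation E(a,b) = −cos(a−b) between spin measurements along directions a and b; plugging the standard angles into the CHSH version of Bell’s inequality, which bounds classical correlations by |S| ≤ 2, gives the textbook violation (a sketch of the standard calculation, not a simulation of any particular experiment):

```python
import math

def E(a, b):
    """Quantum correlation of singlet-state spin measurements
    along directions at angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH measurement angles
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination; any local hidden-variable theory obeys |S| <= 2
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"|S| = {abs(S):.4f}")   # 2*sqrt(2) ~ 2.8284, violating the bound
```

The quantum prediction 2√2 is the maximum allowed by quantum mechanics, and it is this value that the real experiments confirm.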

When I was a student, tests of Bell’s theorem were still thought experiments. Today they are real experiments, and we know beyond doubt that quantum entanglement exists. It is at the basis of quantum information, quantum computation, and chances are all technologies of the coming generations will build upon Einstein, Podolsky and Rosen’s thought experiment.

Another famous thought experiment is Einstein’s elevator being pulled up by an angel. Einstein argued that inside the elevator one cannot tell, by any possible measurement, whether the elevator is at rest in a gravitational field or is being pulled up with constant acceleration. This principle of equivalence means that locally (in the elevator) the effects of gravitation are the same as those of acceleration in the absence of gravity. Converted into mathematical equations, it becomes the basis of General Relativity.

Einstein also liked to imagine chasing after photons and he seems to have spent a lot of time thinking about trains and mirrors and so on, but let us look at some other physicists’ thoughts.

Before Einstein and the advent of quantum mechanics, Laplace imagined an omniscient being able to measure the positions and velocities of all particles in the universe. He concluded, correctly, that based on Newtonian mechanics this being, named “Laplace’s demon”, would be able to predict the future perfectly for all times. Laplace did not know back then of Heisenberg’s uncertainty principle, and neither did he know of chaos, both of which spoil predictability. However, his thoughts on determinism were hugely influential and led to the idea of a clockwork universe, and to our understanding of science as a prediction tool in general.
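The chaos point is easy to illustrate: in a chaotic but perfectly deterministic system, two states that start arbitrarily close diverge exponentially, so any finite measurement precision defeats long-term prediction even for Laplace's demon. A toy demonstration with the logistic map (a standard chaos example, chosen here purely for illustration):

```python
# The logistic map x -> r*x*(1-x) at r = 4 is deterministic but chaotic.
r = 4.0
x, y = 0.3, 0.3 + 1e-10   # two initial states differing by one part in 10^10

max_gap = 0.0
for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))

# The tiny initial difference gets amplified to order one.
print(f"largest divergence within 60 steps: {max_gap:.3f}")
```

After a few dozen iterations the two trajectories bear no resemblance to each other, even though every step was computed exactly by the same rule.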

Laplace’s is not the only famous demon in physics. Maxwell also imagined a demon, one that was able to sort the particles of a gas into compartments depending on the particles’ velocities. The task of Maxwell’s demon was to open and close a door connecting two boxes that contain gas which initially has the same temperature on both sides. Every time a fast particle approaches from the right, the demon lets it through to the left. Every time a slow particle arrives from the right, the demon closes the door and keeps it on the right. This way the average energy of the particles, and thus the temperature, in the left box increases, while the entropy of the whole system decreases. Maxwell’s demon thus seemed to violate the second law of thermodynamics!
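The demon's sorting rule is simple enough to simulate: draw random particle speeds, let the demon pass the fast ones to the left and keep the slow ones on the right, and a temperature difference appears for free. A toy sketch (with obvious simplifications: speeds drawn uniformly, and temperature read off as the mean squared speed):

```python
import random

random.seed(42)

# Particles approaching the door, with random speeds; initially both
# boxes share the same speed distribution, i.e. the same temperature.
particles = [random.random() for _ in range(1000)]  # speeds in [0, 1)
threshold = 0.5

left  = [v for v in particles if v >= threshold]  # demon opens the door
right = [v for v in particles if v <  threshold]  # demon keeps the door shut

def temperature(speeds):
    # Temperature ~ mean kinetic energy ~ mean of v^2
    return sum(v * v for v in speeds) / len(speeds)

print(f"left  box 'temperature': {temperature(left):.3f}")
print(f"right box 'temperature': {temperature(right):.3f}")
```

Of course the toy model conveniently ignores the cost of the demon's own measuring and bookkeeping, which is exactly where the resolution of the puzzle hides.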

Maxwell’s demon gave headaches to physicists for many decades, until it was finally understood that the demon itself must increase its entropy, or use energy, while it measures, stores, and eventually erases information. Only a few years ago was Maxwell’s demon in fact realized in the laboratory.

A thought experiment that still gives headaches to theoretical physicists today is the black hole information loss paradox. If you combine general relativity and quantum field theory, each of which is an extremely well established theory, then you find that black holes evaporate. You also find, however, that this process is not reversible; it destroys information for good. This cannot happen in quantum field theory, and thus we face a logical inconsistency when combining the two theories. This cannot be how nature works, so we must be making a mistake. But which? There are many proposed solutions to the black hole information loss problem. Most of my colleagues believe that we need a quantum theory of gravity to resolve this problem, and that the inconsistency comes about by using general relativity in a regime where it should no longer be used. The thought experiments designed to resolve the problem typically use an imagined pair of observers, Bob and Alice, one of whom is unfortunate enough to have to jump into the black hole while the other one remains outside.

One of the presently most popular solution attempts is black hole complementarity. Proposed in 1993 by Susskind and Thorlacius, black hole complementarity rests on the main rules of the Gedankenexperiment: what matters is only what can be measured, and you should not fool yourself. One can avoid information loss in black holes by copying the information, letting one copy fall into the black hole while the other goes out. One copy remains with Bob, one goes with Alice. Copying quantum information, however, is itself inconsistent with quantum theory. Susskind and Thorlacius pointed out that this disagreement would not be measurable by either Bob or Alice, and thus no inconsistency could ever arise.

Black hole complementarity was proposed before the AdS/CFT duality was conjectured, and its popularity sparked when it was found that the non-locally doubled presence of information seemed to fit nicely with the duality that arose in string theory.

As of recently though, it has become clear that this solution has its own problems, because it seems to violate the equivalence principle. The observer who crosses the horizon should not be able to notice anything unusual there. It should be like sitting in that elevator being pulled by an angel. Alas, black hole complementarity seems to imply the presence of a “firewall” that would roast the unsuspecting observer in his elevator. Is this for real, or are we making a mistake again? Since the solution to this problem holds the promise of understanding the quantum nature of space and time, much effort has focused on solving it.

Yes, Einstein’s legacy of thought experiments weighs heavily on theoretical physicists today – maybe too heavily, for sometimes we forget that Einstein’s thoughts were based on real experiments. He had the Michelson-Morley experiments that disproved the aether, he had the perihelion precession of Mercury, he had the measurements of Planck’s radiation law. Thought alone only gets one so far. In the end, it is still data that decides whether a thought can become reality or must remain fantasy.

[Cartoon: Abstruse Goose, Missed Calling]

This post first appeared on "Starts with a Bang".

Tuesday, October 21, 2014

We talk too much.

Image Source: Loom Love.

If I had one word to explain human culture at the dawn of the 21st century it would be “viral”. Everybody, it seems, is either afraid of or trying to make something go viral. And as mother of two toddlers in Kindergarten, I am of course well qualified to comment on the issue of spreading diseases, like pinkeye, lice, goat memes, black hole firewalls, and other social infections.

Today’s disease is called rainbow loom. It spreads via wrist bands that you are supposed to crochet together from rubber rings. Our daughters are too young to crochet, but that doesn’t prevent them from dragging around piles of tiny rubber bands which they put on their fingers, toes, clothes, toys, bed posts, door knobs and pretty much everything else. I spend a significant amount of my waking hours picking up these rubber bands. The other day I found some in the cereal box. Sooner or later, we’ll accidentally eat one.

But most of the infections the kids bring home are words and ideas. Lately they have started calling me “little fart” or “old witch” and, leaving aside the possibility that this is my husband’s vocabulary when I am away, they probably trade these expressions at Kindergarten. I’ll give you two witches for one fart, deal? Lara, amusingly enough, sometimes confuses the words “ass” and “men” – “Arch” and “Mench” in German with her toddler’s lisp. You’re not supposed to laugh, you’re supposed to correct them. It’s “Arsch,” Lara, “SCH, not CH, Arsch.”

Man, as Aristotle put it, is a zoon politikon: she lives in communities, she is social, she shares, she spreads ideas and viruses. He does too. I pass through Frankfurt international airport on average once per week. Research shows that the more often you are exposed to a topic, the more important you think it is, regardless of the source. It’s the repeated exposure that does it. Once you have a word in your head marked as relevant, your brain keeps pushing it around and hands it back to you to look for further information. Have I said Ebola yet?

Yes, words and ideas, news and memes, go viral, spread, mutate and affect the way we think. And the more connected we are, the more we share, the more we become alike. We see the same things and talk about the same things. Because if you don’t talk about what everybody else talks about would you even listen to yourself?

Not so surprisingly then, it has become fashionable to declare the end of individualism also in science, pointing towards larger and larger collaborations, and increasing co-author networks, the need to share, and the success of sharing. According to this NYT headline, the “ERA OF BIG SCIENCE DIMINISHES ROLE OF LONELY GENIUS”. We can read there
“Born out of the complexity of modern technology, the era of the vast, big-budget research team came into its own with its scientific achievements of 1984.”
Yes, that’s right, this headline dates back 30 years.

The lonely genius of course has always been a myth. Science is and has always been a community enterprise. We’re standing on the shoulders of giants. Most of them are dead, ok, but we’re still standing, standing on these dead people’s shoulders, and we’re still talking and talking and talking. We’re all talking way too much. It’s hard not to have this impression after attending 5 conferences more or less in a row.

Collaboration is very en vogue today, or “trending” as we now say. Nature recently had an article about the measurement of the gravitational constant, G. Not a topic I care deeply about, but the article has an interesting quote:
“Until now, scientists measuring G have competed; everyone necessarily believes in their own value,” says Stephan Schlamminger, an experimental physicist at NIST. “A lot of these people have pretty big egos, so it may be difficult,” he says. “I think when people agree which experiment to do, everyone wants their idea put forward. But in the end it will be a compromise, and we are all adults so we can probably agree.” 
Working together could even be a stress reliever, says Jens Gundlach, an experimental physicist at the University of Washington in Seattle. Getting a result that differs from the literature is very uncomfortable, he says. “You think day and night, ‘Did I do everything right?’”
And here I was thinking that worrying day and night about whether you did everything right is the essence of science. But apparently that’s too much stress. It’s clearly better we all work together to make this stressful thinking somebody else’s problem. Can you have a look at my notes and find that missing sign?

The Chinese, as you have almost certainly read, are about to overtake the world, and in that effort they now reform their science research system. Nature magazine informs us that the idea of this reform is “to encourage scientists to collaborate on fewer, large problems, rather than to churn out marginal advances in disparate projects that can be used to seek multiple grants. “Teamwork is the key word,” says Mu-Ming Poo, director of the CAS Institute of Neuroscience in Shanghai.” Essentially, it seems, they’re giving out salary increases for scientists to think the same as their colleagues.

I’m a miserable cook. My mode of operation is taking whatever is in the fridge, throwing it into a pan with loads of butter, making sure it’s really dead, and then pouring salt over it. (So you don’t notice the rubber bands.) Yes, I’m a miserable cook. But I know one thing about cooking: if you cook it for too long or stir too much, all you get is mush. It’s the same with ideas. We’re better off with various individual approaches than one collaborative one. Too much systemic risk in putting all your eggs in the same journal.

The kids, they also bring home sand-bathed gummy bears that I am supposed to wash, their friend’s socks, and stacks of millimeter paper glued together because GLUE! Apparently some store donated cubic meters of this paper to the Kindergarten because nobody buys it anymore. I recall having to draw my error bars on this paper, always trying not to use an eraser because the grid would rub away with the pencil. Those were the days.

We speak about ideas going viral, but we never speak about what happens after this. We get immune. The first time I heard about the Stückelberg mechanism I thought it was the greatest thing ever. Now it’s on the daily increasing list of oh-yeah-this-thing. I’ve always liked the myth of the lonely genius. I have a new office mate. She is very quiet.

Wednesday, October 15, 2014

Siri's Song [music video]

After the iOS 8 update you can now use your iPhone entirely hands-free if the phone is plugged in and you speak the magic words "Hey Siri." I know this because last weekend my phone was on the charger next to my microphone as I was working on one of my pathetic vocal recordings, when suddenly Siri offered the following wisdom:
    "Our love is like two long shadows kissing without hope of reality."

I cursed, stopped the recording, and hit playback. And there was Siri's love confession over my carefully crafted drum-bass loop. It was painfully obvious that whoever processed these vocals knew, in contrast to me, what he or she was doing. They're professionally filtered, compressed and flawlessly de-essed. In short, they sound awesome, even after re-recording.

I then had a little conversation with my phone, inquiring what this shadow business was all about. Siri stubbornly refused to repeat her lyrical deepity, but had some other weird insights to offer.

Enjoy :)

PS: No, my lyrics of course do not contain the words "Hey Siri." I'm not sure what caught her attention, but I recommend you don't sing to your phone.

Monday, October 13, 2014

Does Loop Quantum Cosmology make the black hole information loss problem worse rather than better?

Image Source: Flickr.

Martin Bojowald is one of the originators of Loop Quantum Cosmology (LQC), a model for the universe that makes use of the quantization techniques of Loop Quantum Gravity (LQG). This description of cosmology takes into account effects of quantum gravity and has become very popular during the last decade because it allows making contact with observation.

The best known finding in LQC is that the Big Bang singularity, which one has in classical general relativity, is replaced by a bounce that takes place when the curvature becomes strong (reaches the Planckian regime). This in turn has consequences, for example, for the spectrum of primordial gravitational waves (which we still hope will at some point emerge out of the foreground dust).

Now rumors have reached me from various sources that Martin has lost faith that Loop Quantum Cosmology is a viable description of our universe, and indeed he recently put a paper out on the arXiv detailing the problem that he sees.
Information loss, made worse by quantum gravity
Martin Bojowald
Loop Quantum Cosmology, to be clear, was never claimed to be, strictly speaking, derived from Loop Quantum Gravity, though I have frequently noticed that the similarity of the names leads to confusion in the popular science literature. LQC deals with a symmetry-reduced version of LQG, but this symmetry reduction is done before the quantization. In practice this means that in LQC one first simplifies the universe by assuming it is homogeneous and isotropic, and then quantizes the remaining degrees of freedom. Whether or not this treatment leads to the same result that one would get by taking the fully quantized theory and looking for a solution that reproduces the right symmetries is controversial, and to my knowledge this question has never been satisfactorily settled.

Be that as it may, from my perspective and from that of most people working on the topic, LQC is a phenomenological model that is potentially testable and thus interesting in its own right, regardless of its connection to LQG.

It has become apparent during the last few years, however, that if one takes into account perturbations around the homogeneous and isotropic background in LQC, then one finds something peculiar: the space-time around the bounce loses its time-coordinate; it becomes Euclidean and is thus just space without time. We discussed this earlier here.

Now the time-coordinate in the space-time that we normally deal with plays a very important role: it allows us to set an initial condition at one moment in time, and then use the equations of motion to predict what will happen at later times. This so-called “forward evolution” is such a typical procedure for differential equations in physics that we often do not think about it very much. The relevant point to emphasize is that to determine what happens at some point in space-time, one does not have to set an initial condition on a space-time boundary around that point, which would necessitate knowing what happens at some moments in the future; it is sufficient to know what happened at some moment in the past.

This important property that allows us to set initial conditions in the past to predict the future is not something you get for free in any space-time background. Space-times that obey this property are called “globally hyperbolic”. (Anti-de Sitter space is probably the best-known example of a space-time that is not globally hyperbolic, hence the relevance of the boundary in that case.)
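To make the notion of forward evolution concrete, here is a minimal sketch in Python. The harmonic oscillator is my own toy stand-in for any equation of motion; nothing here refers to any particular space-time. The point is only that data at one moment, plus the equation, fixes the state at all later times:

```python
# Forward evolution: the state (x, v) at one moment, together with the
# equation of motion x'' = -x, determines the state at all later times.
# Toy example only -- any well-posed evolution equation would do.
import math

def deriv(state):
    """Equation of motion for the harmonic oscillator: x' = v, v' = -x."""
    x, v = state
    return (v, -x)

def rk4_step(state, dt):
    """One classical 4th-order Runge-Kutta step."""
    k1 = deriv(state)
    k2 = deriv((state[0] + 0.5 * dt * k1[0], state[1] + 0.5 * dt * k1[1]))
    k3 = deriv((state[0] + 0.5 * dt * k2[0], state[1] + 0.5 * dt * k2[1]))
    k4 = deriv((state[0] + dt * k3[0], state[1] + dt * k3[1]))
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def evolve(x0, v0, t_final, dt=1e-3):
    """Predict the state at t_final from the initial condition alone."""
    state, t = (x0, v0), 0.0
    while t < t_final:
        state = rk4_step(state, dt)
        t += dt
    return state

# Initial condition x(0)=1, v(0)=0 predicts x(t) = cos(t) at later times.
x, v = evolve(1.0, 0.0, math.pi)
```

Note that no data on any future boundary is needed; that is exactly the property that fails once a Euclidean region appears.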

In his new paper Martin now points out that if space-time has regions that are Euclidean then the initial value problem becomes problematic. It is then in fact no longer possible to predict the future from a past initial condition. For the case of the Big Bang singularity being replaced by a Euclidean regime, this does not matter so much because we would just set initial conditions after this regime has passed and move on from there. But not so with black holes.

The singularity inside black holes is then in LQC also replaced by a Euclidean regime. This regime only forms in the late stages of collapse and eventually vanishes after the black hole has evaporated. But the intermediate Euclidean region has the consequence that the outcome of the evaporation process depends on the boundary conditions surrounding the Euclidean region. With the intermediate Euclidean region, one can no longer predict the outcome of black hole evaporation from the initial conditions of the matter that formed the black hole.

In his paper Martin writes that this makes the black hole information loss considerably worse. The normal black hole information loss problem is that the process of black hole evaporation seems to be irreversible and thus in particular not unitary. The final state of the evaporation is always thermal radiation, regardless of what formed the black hole. Now with the Euclidean region the final state of the black hole evaporation depends on some boundary condition that is not even in principle predictable. We have thus gone from not unitary to not deterministic!

Martin likens this case to that of a naked singularity, a singular region that (in contrast to the normal black hole singularity which is hidden by the horizon) is in full causal contact with space-time. A singularity is where everything ends, but it is also where anything can start. The initial value problem in a space-time with a naked singularity is similarly ill-defined as that in a space-time region with a Euclidean core, Martin argues.

I find this property of black holes in LQC not as worrisome as Martin does. The comparison to a naked singularity is not a good one because the defining property of a singularity is that one cannot continue through it. One can however continue through the Euclidean region; it’s just that one needs additional constraints to know how. In fact I can see that what Martin thinks is a bug might be a feature for somebody else, for after all we know that time-evolution in quantum mechanics indeed seems to be non-deterministic.

But even leaving aside this admittedly far-fetched relation, the situation that additional information is necessary on some boundary to the future is not unlike that of the mysterious “stretched horizon” in black hole complementarity. Said stretched horizon somehow stores and later releases the information of what fell through it. If the LQC black hole is supposed to solve the black hole information problem, then the same must be happening on the boundary of the Euclidean region. And, yes, that is a teleological constraint. I do not see what theory could possibly lead to it, but I don’t see that it is not possible either.

In summary, I find this development more interesting than troublesome. In contrast to non-unitarity, having a Euclidean core is uncomfortable and certainly unintuitive, but not necessarily inconsistent. I am very curious to see what the community will make of this -- and I am sure we will hear more about it in the near future.

Wednesday, October 08, 2014

I can't forget [music video]

Update about my songwriting efforts:

I had some help with the audio mix, but the friendly savior of my high frequency mush prefers to go unnamed. Thanks to him though, you can now put the thing on your stereo and it will sound reasonably normal. I like to think that I have made some progress with the vocal recording and processing. I am not happy with the percussion in that piece, have to work on that. If you go through my last few videos you can basically hear which tutorial I read at which point. So far I believe I am making progress, but you be my judge!

As to the video, I spent some money on an inexpensive video camera, and it has made my video recording dramatically easier because it has an auto zoom. As a result, the new video is more dynamic than the previous ones, and it looks considerably better to me. It would have been even better had I not been wearing a blue shirt on the blue-screen day; some neurons failed me there.

I still haven't found a good way to deal with the problem that the video tends to go out of sync with the audio after exporting it. In fact I noticed that the severity of the problem depends on the player with which you watch the result, which I find particularly odd. And after uploading the thing to YouTube the audio again shifts oh-so-slightly. In the end, no matter what I do, it never quite fits.

And since I was asked a few times, yes I do have a soundcloud account, under the name "Funny Mommy". You can find all the tracks there. It's just that I am not in the mood to play the social network game on yet another platform, so I have a total of three followers or so, all of which are probably spam-bots. That's why I use YouTube. I am totally open to suggestions for other artist names :) And yeah, I am also on Ello, as @hossi, not that it seems to be good for anything.

Friday, October 03, 2014

Is the next supercollider a good investment?

The relevance of basic research is difficult to communicate to politicians who only care about their next term and don’t want to invest in what might take decades to pay off. But it is even more difficult to decide which research is the best to invest in, and how much it is worth, in numbers.

Whether a next supercollider is worth the billions of Euro that it will eat up is a very involved question. I find it partly annoying, partly disturbing, that many of my physics colleagues regard the answer as obvious. Clearly we need a new supercollider! To measure the details of this, and the decay channels of that, to get a cleaner signal of something and a better precision for whatever. And I am sure they will come up with an argument for why Susy, our invisible friend, is still just around the corner.

To me this superficial argumentation is just another way of demonstrating they don’t care about communicating the relevance of their research. Of course they want a next collider - they make their living writing papers about that.

The most common argument that I hear in favor of the next collider is that much more money is wasted on the war in Afghanistan (if you ask an American) or rebuilding the Greek economy (if you ask a German), and I am sure similar remarks are uttered worldwide. The logic here seems to be that a lot of money is wasted anyway, so what does it matter to spend some billions on a collider. Maybe this sounds convincing if you have a PhD in high energy physics, but I don’t know who else is supposed to buy this.

The next argument I keep hearing is that the World Wide Web was invented at CERN, which also hosts the LHC right now. If anything, this argument is even more stupid than the war-also-wastes-money argument. Yes, Tim Berners-Lee happened to work at CERN when he developed hypertext. The environment was certainly conducive to his invention, but the standard model of particle physics had otherwise very little to do with it. You could equally well argue we should build leaning towers to advance research on general relativity.

I just finished reading John Moffat’s book “Cracking the Particle Code of the Universe”. I can’t post the review here until it has appeared in print due to copyright issues, sorry, but by and large it’s a good book. No, he doesn’t use it to advertise his own theories. He mentions them of course, but most of the book is more generally dedicated to the history, achievements, and shortcomings of the standard model.

His argument for the relevance of particle colliders amounts to the following paragraph:
“As Guido Altarelli mused after my talk at CERN in 2008, can governments be persuaded to spend ever greater sums of money, amounting to many billions of dollars, on ever larger and higher energy accelerators than the LHC if they suspect that the new machines will also come up with nothing new beyond the Higgs boson? Of course, to put this in perspective, one should realize that the $9 billion spend on an accelerator would not run a contemporary war such as the Afghanistan war for more than five weeks. Rather than killing people, building and operating these large machines has practical and beneficial spinoffs for technology and for training scientists. Thus, even if the accelerators continued to find no new particles, they might still produce significant benefits for society. The Worldwide Web, after all, was invented at CERN.”

~ John Moffat, Cracking the Particle Code of the Universe, p. 78
Well, running a war also has practical and beneficial spinoffs for technology and training scientists. Sorry John, but that was disappointing. To be fair, the whole book itself makes a pretty good case for why understanding the laws of nature is important business. But what war doesn’t do for your country and what investing in basic research does is building a base for sustainable progress. Without new discoveries and fundamentally new insights, applied science must eventually run dry.

There is no doubt in my mind that society invests its billions well if it invests in theoretical physics. Whether that investment should go into particle colliders though is a different question. I don’t have a good answer to that, and I don’t see that the question is seriously being discussed. Is it a worthy cause?

Last year, Fermilab’s Symmetry Magazine ran a video contest on the topic “Why particle physics matters”. Ironically most of the answers have nothing to do with particle physics in particular: “could bring about a revolution,” “a wonderful model of successful international collaboration,” “explore the frontiers and boundaries of our universe,” “engages and sharpens the mind”, “captures the imagination of bright minds”. You could use literally the same arguments for cosmology, quantum information or high precision measurements. Indeed, I personally find the high precision frontier presently more promising than ramping up energy and luminosity.

I am happy of course if China goes ahead and builds the next supercollider. After all it’s not my taxes, and still better than spending money on diamond necklaces that your 16-year-old can show off on Facebook. I can’t quite shake the impression though that this plan is more the result of wanting to appear competitive than of a careful deliberation about return on investment.

Friday, September 26, 2014

Black holes declared non-existent again.

That's me. 

The news of the day is that Laura Mersini-Houghton has presumably shown that black holes don’t exist. The headlines refer to these two papers: arXiv:1406.1525 and arXiv:1409.1837.

The first is an analytical estimate, the second a numerical study of the same idea. Before I tell you what these papers are about, a disclaimer: I know Laura; we have met at various conferences, and I’ve found her to be very pleasant company. I read her new paper a while ago and was hoping I wouldn’t have to comment on this, but my inbox is full of people asking me what this is all about. So what can I do?

In their papers, Laura Mersini-Houghton and her collaborator Harald Pfeiffer have taken into account the backreaction from the emitted Hawking radiation on the collapsing mass which is normally neglected. They claim to have shown that the mass loss is so large that black holes never form to begin with.

To make sense of this, note that black hole radiation is produced by the dynamics of the background and not by the presence of a horizon. The horizon is why the final state misses information, but the particle creation itself does not necessitate a horizon. The radiation starts before horizon formation, which means that the mass that is left to form the black hole is actually less than the mass that initially collapsed.

Physicists have studied this problem back and forth for decades, and the majority view is that this mass loss from the radiation does not prevent horizon formation. This shouldn’t be much of a surprise because the temperature of the radiation is tiny, and it’s even tinier before horizon formation. You can look e.g. at this paper 0906.1768 and references [3-16] therein to get an impression of this discussion. Note though that this paper also mentions that it has been claimed every now and then that the backreaction prevents horizon formation, so it’s not like everyone agrees. Then again, this could be said about pretty much every topic.

Now what one does to estimate the backreaction is first come up with a time-dependent emission rate. This is already problematic because the normal Hawking radiation is only the late-time radiation and is time-independent. What is clear however is that the temperature before horizon formation is considerably smaller than the Hawking temperature, and it drops very quickly the farther the mass is from horizon formation. Incidentally, this drop was the topic of my master’s thesis. Since it’s not thermal equilibrium one actually shouldn’t speak of a temperature. In fact the energy spectrum isn’t quite thermal, but since we’re only concerned with the overall energy the spectral distribution doesn’t matter here.

The next problem is that you have to model some collapsing matter and take into account the backreaction during collapse. Quite often people use a collapsing shell for this (as I did in my master’s thesis). Shells however are pathological because if they are infinitely thin they must have an infinite energy density and are by themselves already quantum gravitational objects. If the shell isn’t infinitely thin, then the width isn’t constant during collapse. So either way, it’s a mess and you best do it numerically.

What you do next is take that approximate temperature, which now depends on some proper time in which the collapse proceeds. This temperature gives, via the Stefan-Boltzmann law, a rate for the mass loss with time. You integrate the mass loss over time and subtract the integral from the initial mass. Or at least that’s what I would have done. It is not what Mersini-Houghton and Pfeiffer have done though. What they seem to have done is the following.

Hawking radiation has a negative energy component. Normally negative energies are actually anti-particles with positive energies, but not so in black hole evaporation. The negative energy particles though only exist inside the horizon. Now in Laura’s paper, the negative energy particles exist inside the collapsing matter, but outside the horizon. Next, she doesn’t integrate the mass loss over time and subtract this from the initial mass, but instead integrates the negative energies over the inside of the mass and subtracts this integral from the initial mass. At least that is my reading of Equation IV.10 in 1406.1525, and equation 11e in 1409.1837 respectively. Note that there is no time-integration in these expressions, which puzzles me.

The main problem I have with this calculation is that the temperature that enters the mass-loss rate, for all I can see, is that of a black hole and not that of some matter which might be far from horizon crossing. In fact it looks to me like the total mass that is lost increases with increasing radius, which I think it shouldn’t. The more dispersed the mass, the smaller the gravitational tidal force, and the smaller the effect of particle production in curved backgrounds should be. That is as far as the analytical estimate is concerned. In the numerical study I am not sure what is being done because I can’t find the relevant equation, which is the dependence of the luminosity on the mass and radius.
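To spell out what I mean by the standard procedure (temperature, then luminosity via the Stefan-Boltzmann law, then time integration of the mass loss), here is the crudest possible toy version in Python. It assumes the late-time Hawking temperature T ∝ 1/M throughout, which, as said above, overestimates the pre-horizon emission; the units and the constant `alpha` are arbitrary choices of mine, not taken from either paper:

```python
# Toy version of the standard mass-loss estimate: T ~ 1/M (late-time
# Hawking temperature), Stefan-Boltzmann luminosity L ~ A*T^4 ~ 1/M^2,
# hence dM/dt = -alpha/M^2. Integrate over time and subtract from the
# initial mass. All units arbitrary; alpha is a placeholder constant.
def evaporate(m0, alpha=1.0, dt=1e-4):
    """Forward-Euler integration of dM/dt = -alpha/M^2.

    Analytically M(t) = (M0^3 - 3*alpha*t)^(1/3), so the lifetime is
    M0^3 / (3*alpha). Stops just before the final, ill-defined step."""
    m, t = m0, 0.0
    while m**3 > 3.0 * alpha * dt:
        m -= alpha / m**2 * dt   # mass lost in this time step
        t += dt
    return m, t

m_final, t_evap = evaporate(1.0)   # lifetime should come out near 1/3
```

The contrast with the papers is then easy to state: here the loss is an integral over time of an emission rate, not a spatial integral of negative energies at one instant.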

In summary, the recent papers by Mersini-Houghton and Pfeiffer contribute to a discussion that is decades old, and it is good to see the topic being taken up with the numerical power of today. I am skeptical that their treatment of the negative energy flux is consistent with the expected emission rate during collapse. Their results are surprising and in contradiction with many previously found results. It is thus too early to claim that it has been shown black holes don’t exist.

Monday, September 22, 2014

Discoveries that weren’t

This morning’s news, as anticipated, is that the new Planck data renders the BICEP2 measurement of relic gravitational waves inconclusive. It might still be there, the signal, but at least judging by the presently existing data analysis one can’t really say whether it is or isn’t.

Discoveries that vanish into dust, or rather “background” as the technical term has it, are of course nothing new in physics. In 1984, for example, the top quark was “discovered” with a mass of about 40 GeV:

Physicists may have found 'top' quark
By Robert C. Cowen, Staff writer of The Christian Science Monitor
MAY 9, 1984

“Particle physicists appear to be poised for a breakthrough in their quest for the underlying structure of matter. Puzzling phenomena have appeared at energies where present theory predicted there was little left to uncover. This indicates that researchers may have come across an unsuspected, and possibly rich, field in which to make new discoveries. Also, a team at the European Center for Nuclear Research (CERN) at Geneva may have found the long-sought 'top' quark. Protons, neutrons, and related particles are believed to be made up of combinations of more basic entities called quarks.”
This signal turned out to be a statistical fluctuation. The top quark wasn’t really discovered until 1995 with a mass of about 175 GeV. Tommaso tells the story.

The Higgs too was already discovered in 1984, at the Crystal Ball Experiment at DESY with a mass of about 9 GeV. It even made it into the NY Times:
Published: August 2, 1984

“A new subatomic particle whose properties apparently do not fit into any current theory has been discovered by an international team of 78 physicists at DESY, a research center near Hamburg, West Germany. The group has named the particle zeta […] As described yesterday to a conference at Stanford, the zeta particle has some, but not all, of the properties postulated for an important particle, called the Higgs particle, whose existence has yet to be confirmed.”
Also in 1984, Supersymmetry came and went in the UA1 experiment [ht JoAnne]:
“Experimental observation of events with large missing transverse energy accompanied by a jet or a photon (S) in p \bar p collisions at \sqrt{s} = 540 GeV
UA1 Collaboration

We report the observation of five events in which a missing transverse energy larger than 40 GeV is associated with a narrow hadronic jet and of two similar events with a neutral electromagnetic cluster (either one or more closely spaced photons). We cannot find an explanation for such events in terms of backgrounds or within the expectations of the Standard Model.”
And the year 1996 saw a quark substructure come and go. The New York Times reported:
Tiniest Nuclear Building Block May Not Be the Quark
Published: February 8, 1996

“Scientists at Fermilab's huge particle accelerator 30 miles west of Chicago reported yesterday that the quark, long thought to be the simplest building block of nuclear matter, may turn out to contain still smaller building blocks and an internal structure.”
Then there is the ominous pentaquark that comes and goes, the anisotropic universe [ht Ben], the left-handed universe [ht Ethan], and the infamous OPERA anomaly that was a loose cable - and these are only the best known ones. The BICEP2 story is remarkable primarily because the initial media reports, based on the collaboration’s own press releases, so vastly overstated the confidence of the results.

The evidence for relic gravitational waves is a discussion that will certainly continue for at least a decade or so. My prediction is that in the end, after loads of data analysis, they will find the signal just where they expected it. And that is really the main difference between the BICEP announcement and the superluminal OPERA neutrinos: in the case of the gravitational waves everybody thought the signal should be there. In the case of the superluminal neutrinos everybody thought it should not be there.

The OPERA collaboration was heavily criticized for making such a big announcement out of a result that was most likely wrong, and one can debate whether or not they did the right thing. But at least they amply warned everybody that the result was likely wrong.

Saturday, September 13, 2014

Is there a smallest length?

Good ideas start with a question. Great ideas start with a question that comes back to you. One such question that has haunted scientists and philosophers for thousands of years is whether there is a smallest unit of length, a shortest distance below which we cannot resolve structures. Can we look closer and ever closer into space, time, and matter? Or is there a limit, and if so, what is the limit?

I picture our distant ancestors sitting in their cave watching the world in amazement, wondering what the stones, the trees and they themselves are made of – and starving to death. Luckily, those smart enough to hunt down the occasional bear eventually gave rise to a human civilization sheltered enough from the harshness of life to let the survivors get back to watching and wondering what we are made of. Science and philosophy in earnest are only a few thousand years old, but the question whether there is a smallest unit has always been a driving force in our studies of the natural world.

The old Greeks invented atomism, the idea that there is an ultimate and smallest element of matter that everything is made of. Zeno’s famous paradoxes sought to shed light on the possibility of infinite divisibility. The question came back with the advent of quantum mechanics, with Heisenberg’s uncertainty principle that fundamentally limits the precision with which we can measure. It became only more pressing with the divergences in quantum field theory that arise from including arbitrarily short distances.

It was in fact Heisenberg who first suggested that the divergences in quantum field theory might be cured by the existence of a fundamentally minimal length, and he introduced it by making the position operators non-commuting among themselves. Just as the non-commutativity of momentum and position operators leads to an uncertainty principle, the non-commutativity of the position operators among themselves limits how well distances can be measured.
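Schematically, and in modern notation rather than Heisenberg’s original one, the idea can be summarized as:

```latex
% Non-commuting position operators, in analogy to [x,p] = i*hbar:
[\hat{x}_i, \hat{x}_j] = i\,\theta_{ij}
% which implies an uncertainty relation among the positions themselves:
\Delta x_i \,\Delta x_j \;\geq\; \tfrac{1}{2}\,\bigl|\langle\theta_{ij}\rangle\bigr|
```

Here θ_ij has dimensions of length squared and sets the scale of the minimal length; its precise form differs between models of non-commutative geometry.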

Heisenberg’s main worry, which the minimal length was supposed to deal with, was the non-renormalizability of Fermi’s theory of beta-decay. This theory however turned out to be only an approximation to the renormalizable electro-weak interaction, so he had to worry no more. Heisenberg’s idea was forgotten for some decades, then picked up again and eventually grew into the area of non-commutative geometries. Meanwhile, the problem of quantizing gravity appeared on stage and with it, again, non-renormalizability.

In the mid 1960s Mead reinvestigated Heisenberg’s microscope, the argument that led to the uncertainty principle, with (unquantized) gravity taken into account. He showed that gravity amplifies the uncertainty so that it becomes impossible to measure distances below the Planck length, about 10^-33 cm. Mead’s argument was forgotten, then rediscovered in the 1990s by string theorists who had noticed that using strings to prevent divergences by avoiding point-interactions also implies a finite resolution, though in a technically somewhat different way than Mead’s.
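The gist of the argument can be sketched as follows, with numerical factors of order one omitted: the photon used in the microscope resolves distances down to ħ/Δp, but its momentum also gravitationally disturbs the particle being measured, and the two contributions cannot both be made arbitrarily small:

```latex
\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} \;+\; \frac{G\,\Delta p}{c^{3}}
% minimizing the right-hand side over \Delta p gives a lower bound:
\Delta x \;\gtrsim\; \sqrt{\frac{\hbar G}{c^{3}}} \;=\; \ell_{\mathrm{Pl}} \;\approx\; 1.6\times 10^{-33}\,\mathrm{cm}
```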

Since then the idea that the Planck length may be a fundamental length beyond which there is nothing new to find, ever, appeared in other approaches towards quantum gravity, such as Loop Quantum Gravity or Asymptotically Safe Gravity. It has also been studied as an effective theory by modifying quantum field theory to include a minimal length from scratch, and often runs under the name “generalized uncertainty”.

One of the main difficulties with these theories is that a minimal length, if interpreted as the length of a ruler, is not invariant under Lorentz-transformations due to length contraction. This problem is easy to overcome in momentum space, where it is a maximal energy that has to be made Lorentz-invariant, because momentum space is not translationally invariant. In position space one either has to break Lorentz-invariance or deform it and give up locality, which has observable consequences, and not always desired ones. Personally, I think it is a mistake to interpret the minimal length as the length of a ruler (a component of a Lorentz-vector), and it should instead be interpreted as a Lorentz-invariant scalar to begin with, but opinions on that matter differ.
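To see the problem concretely: a ruler of rest length equal to the minimal length appears contracted in a boosted frame, whereas a scalar built from constants of nature is the same for all observers:

```latex
% Length contraction of a ruler under a boost with velocity v:
\ell' \;=\; \ell_{\mathrm{min}}\sqrt{1 - v^{2}/c^{2}} \;<\; \ell_{\mathrm{min}}
% A Lorentz-invariant scalar, by contrast, such as
\ell_{\mathrm{Pl}}^{2} \;=\; \frac{\hbar G}{c^{3}}
% is frame-independent by construction.
```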

The science and history of the minimal length has now been covered in a recent book by Amit Hagar:

Amit is a philosopher but he certainly knows his math and physics. Indeed, I suspect the book would be quite hard to understand for a reader without at least some background knowledge in math and physics. Amit has made a considerable effort to address the topic of a fundamental length from as many perspectives as possible, and he covers a lot of scientific history and philosophical considerations that I had not previously been aware of. The book is also noteworthy for including a chapter on quantum gravity phenomenology.

My only complaint about the book is its title because the question of discrete vs continuous is not the same as the question of finite vs infinite resolution. One can have a continuous structure and yet be unable to resolve it beyond some limit (this is the case when the limit makes itself noticeable as a blur rather than a discretization). On the other hand, one can have a discrete structure that does not prevent arbitrarily sharp resolution (which can happen when localization on a single base-point of the discrete structure is possible).
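A toy example for the first case (my illustration, not from the book): convolving a perfectly continuous profile f with a Gaussian of width ℓ washes out all structure below ℓ, yet the underlying space remains a continuum:

```latex
f_{\mathrm{obs}}(x) \;=\; \int \mathrm{d}y\; f(y)\,
\frac{1}{\sqrt{2\pi}\,\ell}\, e^{-(x-y)^{2}/(2\ell^{2})}
```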

(Amit’s book is admittedly quite pricey, so let me add that he said, should sales reach 500 copies, Cambridge University Press will put a considerably less expensive paperback version on offer. So tell your library to get a copy and let’s hope we make it to 500 so the book becomes affordable for more of the interested readers.)

Every once in a while I think that there maybe is no fundamentally smallest unit of length, that all these arguments for its existence are wrong. I like to think that we can look infinitely close into structures and will never find a final theory, turtles upon turtles, or that structures are ultimately self-similar and repeat. Alas, it is hard to make sense of the romantic idea of universes in universes in universes mathematically, not that I didn’t try, and so the minimal length keeps coming back to me.

Many if not most endeavors to find observational evidence for quantum gravity today look for manifestations of a minimal length in one way or the other, such as modifications of the dispersion relation, modifications of the commutation-relations, or Bekenstein’s tabletop search for quantum gravity. The properties of these theories are today a very active research area. We’ve come a long way, but we’re still out to answer the same questions that people asked themselves thousands of years ago.

This post first appeared on Starts With a Bang with the title "The Smallest Possible Scale in the Universe" on August 12, 2014.

Thursday, September 11, 2014

Experimental Search for Quantum Gravity – What is new?

Last week I was at SISSA in Trieste for the 2014 conference on “Experimental Search for Quantum Gravity”. I missed the first two days because of child care problems (Kindergarten closed during holiday season, the babysitter ill, the husband has to work), but Stefano Liberati did a great job with the summary talk the last day, so here is a community update.

The briefest of brief summaries is that we still have no experimental evidence for quantum gravity, but then you already knew this. During the last decade, the search for experimental evidence for quantum gravity has focused mostly on deviations from Lorentz-invariance and strong quantum gravity in the early universe that might have left imprints on the cosmological observables we measure today. The focus on these two topics is still present, but we now have some more variety which I think is a good development.

There is still lots of talk about gamma ray bursts and the constraints on deformations of Lorentz-invariance that can be derived from them. One has to distinguish these constraints on deformations from constraints on violations of Lorentz-invariance. In the latter case one has a preferred frame, in the former case not. Violations of Lorentz-invariance are very strongly constrained already. But to derive these constraints one makes use of an effective field theory approach, that is, one assumes that whatever quantum gravity looks like at high energies (close to the Planck scale), at small energies it must be describable by the quantum field theories of the Standard Model plus some additional, small terms.

Deformations of Lorentz-symmetry are said not to have an effective field theory limit and thus these constraints cannot be applied. I cautiously say “are said not to have” such a limit because I have never heard a good argument why such a limit shouldn’t exist. As far as I can tell it doesn’t exist just because nobody working on this wants it to exist. In any case, without this limit one cannot use the constraints on the additional interaction terms and has to look for other ways to test the model.

This is typically done by constraining the dispersion relation for free particles, which acquires small correction terms. These corrections to the dispersion relation affect the speed of massless particles, which now becomes energy-dependent. The effects of the deformation grow with long travel times and large energies, which is why highly energetic gamma ray bursts are so interesting. The deformation would make itself noticeable by either speeding up or slowing down the highly energetic photons, depending on the sign of a parameter.
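For a rough sense of the magnitudes involved, here is a back-of-the-envelope estimate of the arrival-time delay. All numbers are illustrative assumptions of mine, not measured values, and the linear form of the correction is just the simplest case:

```python
# Rough estimate of the arrival-time delay from a Planck-scale
# modified dispersion relation, v(E) ~ c * (1 - eta * E / E_Planck).
# Illustrative numbers only, not actual GRB data.

E_PLANCK_GEV = 1.22e19   # Planck energy in GeV

def time_delay(photon_energy_gev, travel_time_s, eta=1.0):
    """Delay (seconds) of a photon of the given energy relative to a
    low-energy photon emitted simultaneously, to first order."""
    return eta * (photon_energy_gev / E_PLANCK_GEV) * travel_time_s

# A 10 GeV photon from a burst with a light travel time of ~2.5e17 s
# (roughly 8 billion years):
delay = time_delay(10.0, 2.5e17)
print(f"{delay:.2f} s")  # roughly 0.2 s for eta = 1
```

The delay of a fraction of a second over billions of years is what makes these constraints possible at all: the tiny suppression by the Planck energy is compensated by the enormous travel time.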

Current constraints put the limits roughly at the Planck scale for the cases where the modification either slows down or speeds up the photons. Putting constraints on the case where the deformation is stochastic (sometimes speeding up, sometimes slowing down) is more difficult, and so far there haven’t been any good constraints on it. Jonathan Granot briefly flashed by some constraints on the stochastic case, but said he can’t spill the details yet, some collaboration issue. He and collaborators do however have a paper coming out within the next few months that I expect will push the stochastic case up to the Planck scale as well.

On the other hand we heard a talk by Giacomo Rosati who argues that to derive these bounds one uses the normal expansion of the Friedmann-Robertson-Walker metric, but that the propagation of particles in this background should be affected by the deformed theory as well, which weakens the constraints somewhat. Well, I can see the rationale behind the argument, but after 15 years the space-time picture that belongs to deformed Lorentz-invariance is still unclear, so this might or might not be the case. There were some other theory talks that try to get this space-time picture sorted out but they didn’t make a connection to phenomenology.

Jakub Mielczarek was at the meeting talking about the moment of silence in the early universe and how to connect this to phenomenology. In this model for the early universe space-time makes a phase-transition from a Euclidean regime to the present Lorentzian regime, and in principle one should be able to calculate the spectral index from this model, as well as other cosmological signatures. Alas, it’s not a simple calculation and progress is slow since there aren’t many people working on it.

Another possible observable from this phase-transition may be leftover defects in the space-time structure. Needless to say, I like that very much because I was talking about my model for space-time defects that basically is a parameterization of this possibility in general (slides here). It would be great if one could connect these parameters to some model about the underlying space-time structure.

The main message of my talk was that if you want to preserve Lorentz-invariance, as my model does, then you shouldn’t look at high energies, because that’s not a Lorentz-invariant statement to begin with. You should look instead at wave-functions sweeping over large world-volumes. This typically means low energies and large distances, which is not a regime that presently gets a lot of attention when it comes to quantum gravity phenomenology. I certainly hope this will change within the coming years because it seems promising to me. Well, more promising than the gamma ray bursts anyway.

We also heard Joao Magueijo in his no-bullshit style explaining that modified dispersion relations in the early universe can reproduce most achievements of inflation, notably the spectral index including the tilt and solving the horizon problem. This becomes possible because an energy-dependence in the speed of light together with redshift during expansion turns the energy-dependence into a time-dependence. If you haven’t read his book “Faster Than the Speed of Light”, I assure you you won’t regret it.

The idea of dimensional reduction is still popular but experimental consequences, if any, come through derived concepts such as a modified dispersion relation or early universe dynamics, again.

There was of course some discussion of the BICEP claim that they’ve found evidence for relic gravitational waves. Everybody who cared to express an opinion seemed to agree with me that this isn’t the purported evidence for quantum gravity that the press made out of it, even if the measurement was uncontroversial and statistically significant.

As we discussed in this earlier post, to begin with this doesn’t test quantum gravity at high energies but only the perturbative quantization of gravity, which for most of my colleagues isn’t really quantum gravity; it’s the high energy limit that we do not know how to deal with. And even the claim that it is evidence for perturbative quantization requires several additional assumptions that may just not be fulfilled, for example that there are no non-standard matter couplings and that space-time and the metric on it exist to begin with. This may just not be the case in a scenario with a phase-transition or with emergent gravity. I hope that next time the media picks up the topic they care to talk to somebody who actually works on quantum gravity phenomenology.

Then there was a member from the Planck collaboration whose name I forgot, who tried to say something about their analysis of the foreground effects from the galactic dust that BICEP might not have accurately accounted for. Unfortunately, their paper isn’t finished and he wasn’t really allowed to say all that much. So all I can tell you is that Planck is pretty much done with their analysis and the results are with the BICEP collaboration which I suppose is presently redoing their data fitting. Planck should have a paper out by the end of the month we’ve been told. I am guessing it will primarily say there’s lots of uncertainty and we can’t really tell whether the signal is there or isn’t, but look out for the paper.

There was also at the conference some discussion about the possibility to test quantum gravitational effects in massive quantum systems, as suggested for example by Igor Pikovski et al. This is a topic we previously discussed here, and I still think it is extremely implausible. The Pikovski et al paper is neither the first nor the last to have proposed this type of test, but it is arguably the one that got the most attention because they managed to get published in Nature Physics. These experiments are supposed to test basically the same deformation that the gamma ray bursts also test, just on the level of commutation relations in quantum mechanics rather than in the dispersion relation (the former leads to the latter, the opposite is not necessarily so).

The problem is that in this type of theory nobody really knows how to get from the one-particle case to the many-particle case, which is known as the ‘soccer-ball-problem’. If one naively just adds the energies of particles, one finds that the corrections blow up when one approaches the Planck mass, which is about 10^-5 grams. That doesn’t make a lot of sense - to begin with because we wouldn’t reproduce classical mechanics, but also because quantum gravitational effects shouldn’t scale with the energy but with the energy density. This means that the effects should get smaller for systems composed of many particles. In this case then, you cannot get good constraints on quantum gravitational effects in the proposed experiments. That doesn’t mean one shouldn’t do the experiment. This is new parameter space in quantum mechanics and one never knows what interesting things one might find there. I’m just saying don’t expect any quantum gravity there.
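The scaling argument can be illustrated with a toy calculation; the functions and numbers below are my own illustration, not anybody’s actual model:

```python
# Toy illustration of the 'soccer-ball problem': if first-order
# Planck-scale corrections ~ E/E_Planck are naively summed over the
# constituents of a composite system, the relative correction grows
# with the total energy and becomes of order 1 near the Planck mass
# (~1e-5 grams), contradicting everyday classical mechanics.
# Illustrative numbers only.

E_PLANCK_GEV = 1.22e19   # Planck energy in GeV
PROTON_MASS_GEV = 0.938  # rest energy of a proton in GeV

def naive_correction(n_particles, energy_per_particle_gev):
    # corrections add up: relative correction scales with total energy
    return n_particles * energy_per_particle_gev / E_PLANCK_GEV

def density_scaled_correction(n_particles, energy_per_particle_gev):
    # if the effect instead scales with the energy per constituent,
    # it stays Planck-suppressed no matter how large the system is
    return energy_per_particle_gev / E_PLANCK_GEV

# A dust grain of ~1.3e19 protons (roughly the Planck mass):
print(naive_correction(1.3e19, PROTON_MASS_GEV))          # order 1
print(density_scaled_correction(1.3e19, PROTON_MASS_GEV)) # ~8e-20
```

The naive summation predicts Planck-sized deviations for a speck of dust, which is plainly absurd, and that is exactly why constraints from composite quantum systems remain so hard to interpret.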

Also at the conference was Jonathan Miller, who I had been in contact with earlier about his paper in which he and his coauthor estimate whether the effect of gravitational bremsstrahlung on neutrino propagation is detectable (we discussed this here). It is an interesting proposal that I spent quite some time thinking about because they don’t make giant leaps of faith about the scaling of quantum gravitational effects. In this paper, it is plainly perturbatively quantized gravity.

However, after some thinking about this I came to the conclusion that while the cross-section that they estimate may be at the right order of magnitude for some cases (I am not too optimistic about the exact case that they discuss in the paper), the total probability for this to happen is still tiny. That is because, unlike the case of cross-sections measured at the LHC, for neutrinos scattering off a black hole one doesn’t have a high luminosity to bring up the chance of ever observing this. When I estimated the flux, the probability turned out to be too small to be observable by at least 30 orders of magnitude, i.e., what you typically expect for quantum gravity. Anyway, I had some interesting exchange with Jonathan who, needless to say, isn’t entirely convinced by my argument. So it’s not a settled story, and I’ll let you know what comes out of this.

Finally, I should mention that Carlo Rovelli and Francesca Vidotto talked about their Planck stars and the possible phenomenology these could lead to. We previously discussed their idea here. They are arguing basically that quantum gravitational effects can be such that a black hole (with an apparent horizon, not an event horizon) does not slowly evaporate until it reaches the Planck mass, but suddenly explodes at a mass still much higher than the Planck mass, thereby releasing its information. If that were possible, it would sneak around all the issues with firewalls and remnants and so on. It might also have observable consequences, for these explosions might be detectable. However, this idea is still very much in its infancy, and several people in the audience raised concerns similar to mine, namely whether this can work without violating locality and/or causality in the semi-classical limit. In any case, I am sure we will hear more about this in the near future.

All together I am relieved that the obsession with gamma ray bursts seems to be fading, though much of this fading is probably due to both Giovanni Amelino-Camelia and Lee Smolin not being present at this meeting ;)

This was the first time I visited SISSA since they moved to their new building, which is no longer located directly at the coast. It is however very nicely situated on a steep hill, surrounded by hiking paths through the forest. The new SISSA building used to be a hospital, like the buildings that house Nordita in Stockholm. I’ve been told my office at Nordita is in what used to be the tuberculosis sector, and if I’m stuck with a computation I can’t help but wonder how many people died at the exact spot my desk stands now. As to SISSA, I hope that the conference was on what was formerly the pregnancy ward, and that the meeting, in spirit, may give birth to novel ideas how to test quantum gravity.

Monday, September 08, 2014

Science changed my life – and yours too.

Can you name a book that made you rethink? A song that helped you through bad times? A movie that gave you a new perspective, new hope, an idea that changed your life or that of people around you? And was it worth the price of the book, the download fee, the movie ticket? If you think of the impact it has had, does it come as a number in your currency of choice?

Those of us working in basic research today are increasingly forced to justify our work by its social impact, its value for the society we live in. That is a fair demand, because scientists paid by tax money should keep in mind whom they are working for. But the impact that the funding agencies are after is expected to come in the form of applications, something that your neighbor will eventually be able to spend money on, to keep the economic wheels turning and the gears running.

It might take centuries for today’s basic research to result in technological applications, and predicting them is more difficult than doing the research itself. The whole point of doing basic research is that its impact is unpredictable. And so the pressure to justify what we are doing is often met with fantastic extrapolations of today’s research, potential gadgets that might come out of it, new materials, new technologies, new services. These justifications that we come up with ourselves are normally focused on material value, something that seems tangible to your national funding agency and your member of parliament who wants to be reelected.

But basic research has a long tail, and a soft one, that despite its softness has considerable impact that is often neglected. At our recent workshop for science writers, Raymond Laflamme gave us two great lectures on quantum information technology, the theory and the applications. Normally if somebody starts talking about qubits and gates, my brain switches off instantly, but amazingly enough listening to Laflamme made it sound almost comprehensible.

Here is the slide that he used to motivate the relevance of basic research (full pdf here):

Note how the arrows in the circle gradually get smaller. A good illustration of the high-risk, high-impact argument. Most of what we work on in basic research will never lead anywhere, but that which does changes our societies, rewards and refuels our curiosity, then initiates a new round in the circle.

Missing in this figure though is a direct link from understanding to social impact.

New scientific insights have historically had a major impact on the vision the thinkers of the day had for the ideal society and how it was supposed to work, and they still have. Knowledge about the workings of the universe has eroded the rationale behind monarchy, strong hierarchies in general, and the influence of the church, and has given rise to other forms of organization that we may call enlightened today, but that will seem archaic a thousand years from now.

The variational principle, made popular by Leibniz’s conclusion that we live in the “best of all possible worlds”, a world that must be “optimal” in some sense, has been hugely influential and eventually spun off the belief in self-organization, in the existence of an “invisible hand” that will lead societies to an optimal state, and that we had better not try to outsmart. This belief is still wide-spread among today’s liberals, even though it obviously raises the question whether what an unthinking universe optimizes is what humans want.

The ongoing exploration of nature on large and small scales has fundamentally altered the way in which we perceive ourselves as special, now that we know our solar system is but one among billions, many of which contain planets similar to our own. And the multiverse in all its multiple realizations is maybe the ultimate reduction of humanity to an accident, whereby it remains to be seen just how lucky this accident is.

That insights coming from fundamental research affect our societies long before, and in many ways besides, applications come along is documented vividly today by the Singularity believers, who talk about the coming of artificial intelligence surpassing our own intelligence the way Christians talk about the rapture. Unless you live in Silicon Valley it’s a fringe phenomenon, but it is vivid proof of just how much ideas affect us.

Other recent developments that have been influential way beyond the scientific niches where they originated are chaos, instability, tipping points, complexity and systemic risk. And it seems to me that the awareness that uncertainty is an integral part of scientific knowledge is slowly spreading.

The connection between understanding and social impact is one you are part of every time you read a popular science piece and update your views about the world, the planet we inhabit, our place on it, and its place in the vastness of the universe. It doesn’t seem to mean all that much, all these little people with their little blogs and their little discussions, but multiply it by some hundred millions. How we think about our being part of nature affects how we organize our living together and our societies.

Downloading a Wikipedia entry of 300 kb through your home wireless: 0.01 Euro. Knowing that the universe expands and will forever continue to expand: Priceless.